I am planning to use
tf.nn.sigmoid_cross_entropy_with_logits
to create N binary classification models. I want these N models to be fully independent and not share any weights. Can I achieve this using this function?
Yes, you can. This function just applies a sigmoid to the given logits and then computes the cross-entropy loss; it has no weights of its own.
If I understand your question correctly: if you have several models, each one needs its own loss function so it can be optimized separately. In that case you should call sigmoid_cross_entropy_with_logits once per model, pass in that model's own logits, and minimize each resulting loss with its own optimizer.
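Here is a minimal sketch of what that could look like (TF 2 eager style; the names N, num_features, and train_step, and the toy data, are all just illustrative, not from your question):

```python
import tensorflow as tf

N = 3             # number of independent binary models (illustrative)
num_features = 10

# Separate variables per model -- nothing is shared between them.
models = [
    {
        "w": tf.Variable(tf.random.normal([num_features, 1]), name=f"w_{i}"),
        "b": tf.Variable(tf.zeros([1]), name=f"b_{i}"),
    }
    for i in range(N)
]
# One optimizer per model, so each model is trained on its own loss.
optimizers = [tf.keras.optimizers.SGD(learning_rate=0.1) for _ in range(N)]

def train_step(i, x, y):
    """One optimization step for model i only."""
    m = models[i]
    with tf.GradientTape() as tape:
        logits = tf.matmul(x, m["w"]) + m["b"]
        # sigmoid_cross_entropy_with_logits is stateless: it only
        # combines the sigmoid with the cross-entropy computation.
        loss = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)
        )
    grads = tape.gradient(loss, [m["w"], m["b"]])
    optimizers[i].apply_gradients(zip(grads, [m["w"], m["b"]]))
    return loss

# Toy usage: train model 0 on random data.
x = tf.random.normal([32, num_features])
y = tf.cast(tf.random.uniform([32, 1]) > 0.5, tf.float32)
print(train_step(0, x, y))
```

Each call to train_step(i, ...) computes gradients only with respect to model i's variables, so the N models stay completely independent.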