
I am planning to use

tf.nn.sigmoid_cross_entropy_with_logits 

for creating N binary classification models. I want these N models to be independent binary models that do not share weights. Can I achieve that using this function?

Ravikrn
  • I don't think I understand your question clearly; could you add sample code for the rest of the model? Please note that the sigmoid layer does not have **weights**, it's just a function of its inputs. – Abhai Kollara Feb 01 '18 at 14:20

2 Answers


Yes, you can. This function just applies a sigmoid to the given logits and then computes the cross-entropy loss; it does not have any weights at all.
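For example, here is a minimal check (TF 2.x eager style; the values are just placeholders) showing that the op is a pure function of its inputs, i.e. a sigmoid followed by binary cross-entropy:

```python
import tensorflow as tf

logits = tf.constant([[1.2, -0.4], [0.3, 2.1]])
labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])

# The op itself: no variables involved, just a function of logits and labels.
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

# The same thing written out by hand: -y*log(p) - (1-y)*log(1-p), p = sigmoid(z)
p = tf.sigmoid(logits)
manual = -labels * tf.math.log(p) - (1.0 - labels) * tf.math.log(1.0 - p)

print(loss.numpy())
print(manual.numpy())  # matches up to numerical precision
```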

Dr. Snoopy

If I understand your question correctly: if you have several models, you need a separate loss function for each model so that each can be optimized independently. In that case you should have a separate sigmoid_cross_entropy_with_logits for each model; each model feeds its own logits into this function, and you then minimize that loss with an optimizer.
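As a rough sketch, assuming each of the N models is an independent single-layer binary classifier over the same input batch (names and shapes are illustrative, and this is TF 2.x eager style rather than the TF 1.x graph API of the original question):

```python
import tensorflow as tf

N = 3
num_features = 10

x = tf.random.normal([32, num_features])                       # shared input batch
y = [tf.cast(tf.random.uniform([32, 1]) > 0.5, tf.float32)     # one label set per model
     for _ in range(N)]

models = [tf.keras.layers.Dense(1) for _ in range(N)]           # separate weights
optimizers = [tf.keras.optimizers.Adam() for _ in range(N)]     # separate optimizers

for i in range(N):
    with tf.GradientTape() as tape:
        logits = models[i](x)
        # Each model gets its own loss, so the N models stay fully independent.
        loss = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(labels=y[i], logits=logits))
    grads = tape.gradient(loss, models[i].trainable_variables)
    optimizers[i].apply_gradients(zip(grads, models[i].trainable_variables))
```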

ofer-a
  • Yes, you are right. One more small clarification: if every one of my models is just a binary classifier, then it won't matter whether I use sigmoid_cross_entropy_with_logits or softmax_cross_entropy_with_logits for the loss function, or whether I use tf.nn.softmax or tf.nn.sigmoid. Am I correct? – Ravikrn Feb 01 '18 at 16:57
  • Yes, you are correct. If it is just binary classification, sigmoid is fine; see the sketch after these comments. – ofer-a Feb 01 '18 at 19:27
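A quick sketch of the equivalence mentioned in the comments (illustrative values): for a binary problem, a sigmoid over a single logit z gives the same loss as a softmax over the two logits [0, z] with one-hot labels.

```python
import tensorflow as tf

z = tf.constant([[0.7], [-1.3]])   # one logit per example
y = tf.constant([[1.0], [0.0]])    # binary labels

# Sigmoid formulation: one logit, one label per example.
sig_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=z)

# Softmax formulation: two logits [0, z] and one-hot labels [1-y, y].
two_logits = tf.concat([tf.zeros_like(z), z], axis=1)
two_labels = tf.concat([1.0 - y, y], axis=1)
soft_loss = tf.nn.softmax_cross_entropy_with_logits(labels=two_labels,
                                                    logits=two_logits)

print(sig_loss.numpy().ravel())    # matches soft_loss
print(soft_loss.numpy())
```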