
There is a related question here: how to convert logits to probability in binary classification in tensorflow?

However, this question asks something different. We often see people use tf.nn.sigmoid(logit) > 0.5 to predict the positive class (i.e., 1). Isn't the sigmoid computation wasted if we only need the prediction? (For training, of course, we need the sigmoid inside the loss function.)

When logit > 0, we always have tf.nn.sigmoid(logit) > 0.5, since the sigmoid is monotonically increasing and sigmoid(0) = 0.5. Thus, just by checking the sign of the logit, we know whether the prediction is the positive or the negative class.
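A minimal sketch of that equivalence in plain Python (using the same logistic function that tf.nn.sigmoid computes, so TensorFlow is not required to check it):

```python
import math

def sigmoid(x: float) -> float:
    # Numerically stable logistic function, 1 / (1 + exp(-x))
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def predict_via_sigmoid(logit: float) -> int:
    # The common pattern: squash to a probability, compare with 0.5
    return int(sigmoid(logit) > 0.5)

def predict_via_sign(logit: float) -> int:
    # Equivalent decision with no exp() at all
    return int(logit > 0)

# The two rules agree for any logit, including the boundary at 0
for logit in [-3.2, -0.001, 0.0, 0.001, 5.7]:
    assert predict_via_sigmoid(logit) == predict_via_sign(logit)
```

So at a 0.5 threshold the sigmoid is indeed redundant at inference time; the sign check gives the same class.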

Also, is the decision threshold always 0.5 for the test dataset?

thinkdeep
  • The decision threshold is usually 0.5 for the test dataset, but you can set it to whatever you want depending on your task. If your decision threshold is 0.5, you can of course check the sign of the logit for faster computation – Mr. For Example Jan 04 '21 at 02:26
  • Thanks. But is the decision threshold always 0.5? It seems that using the test dataset to optimize the decision threshold would leak information from the test dataset. – thinkdeep Jan 04 '21 at 18:43
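The leakage concern in the last comment is usually resolved by tuning the threshold on a held-out validation split, never on the test split. A hedged sketch (hypothetical data and a simple accuracy criterion; in practice you might optimize F1 or a cost-weighted metric instead):

```python
import numpy as np

def best_threshold(logits, labels, grid=np.linspace(0.05, 0.95, 19)):
    """Pick the probability threshold maximizing accuracy on a
    held-out *validation* split (never the test split)."""
    probs = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
    labels = np.asarray(labels)
    # Score every candidate threshold, keep the best one
    scored = [(np.mean((probs > t) == labels), t) for t in grid]
    return max(scored)[1]

# Hypothetical validation data where the model's logits are shifted,
# so a threshold below 0.5 separates the classes better
val_logits = [-2.0, -1.0, -0.5, 0.2, 0.4, 1.5]
val_labels = [0, 0, 1, 1, 1, 1]
t = best_threshold(val_logits, val_labels)
```

The chosen threshold is then applied, fixed, to the test set. Only when the threshold stays at exactly 0.5 does the sign-of-logit shortcut from the question apply.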

0 Answers