
My label looks like this:

label = [0, 1, 0, 0, 1, 0]

meaning that classes 1 and 4 are present in the matching input sample.

  1. How do I create one-hot encoded labels for a label like that?
  2. Which loss function is more appropriate for a case like this (sigmoid cross entropy, softmax cross entropy, or sparse softmax cross entropy)?
rodrigo-silveira
  • You can use either sigmoid cross entropy or softmax cross entropy with one-hot encoded labels; sparse softmax cross entropy only accepts index labels. – Lerner Zhang Dec 02 '18 at 04:29
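
A minimal sketch of the distinction in the comment, assuming TensorFlow (the loss names match the tf.nn API; the logit values are invented for illustration):

    import tensorflow as tf

    # Hypothetical raw model outputs (logits) for one sample with 6 classes.
    logits = tf.constant([[1.2, -0.5, 0.3, 0.8, 2.1, -1.0]])

    # softmax_cross_entropy_with_logits takes a one-hot (or probability) label...
    one_hot = tf.constant([[0., 0., 0., 0., 1., 0.]])
    dense_loss = tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=logits)

    # ...while sparse_softmax_cross_entropy_with_logits takes a bare class index.
    index = tf.constant([4])
    sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=index, logits=logits)

    # dense_loss equals sparse_loss here; neither form can express
    # two classes being present at once.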

1 Answer

  1. There is no good reason to create a one-hot encoded version of this label: it is already a multi-hot vector. And if you want to keep the label size fixed (6 in your case), you can't reduce it to a one-hot encoding, because one-hot can mark only a single class as present.

  2. For multi-label classification you cannot (more precisely, should not) use softmax as the output activation. Softmax is suited to cases where exactly one output can be true, since it forces the class probabilities to sum to 1. In your case, where several classes can be present at once, sigmoid cross entropy is the better choice; see the sketch below.
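
A minimal sketch of what that looks like in TensorFlow (assuming TensorFlow here; the logit values are made up for illustration):

    import tensorflow as tf

    # Hypothetical logits from a model with 6 output units and no final activation.
    logits = tf.constant([[-1.0, 2.0, -0.5, -0.3, 1.5, -2.0]])
    labels = tf.constant([[0., 1., 0., 0., 1., 0.]])  # multi-hot target

    # Sigmoid cross entropy treats each class as an independent yes/no decision.
    per_class = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
    loss = tf.reduce_mean(per_class)

    # At inference time, threshold each independent sigmoid probability.
    probs = tf.sigmoid(logits)
    predicted = tf.cast(probs > 0.5, tf.int32)  # [[0, 1, 0, 0, 1, 0]] for these logits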

layog