
I am working on a multi-label classification problem where some of the labels are imbalanced, and I want to set the decision threshold differently for each label. I was doing the following, but it applies the same threshold to all labels:

results_pred = model.predict(x_test_np, batch_size=1)
results_pred[results_pred >= 0.2] = 1
results_pred[results_pred < 0.2] = 0
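One way to apply a different threshold per label is to compare the prediction matrix against a vector of thresholds, one entry per label; NumPy broadcasting then does the column-wise comparison in a single step. The threshold values below are hypothetical placeholders, to be tuned per label (e.g. from precision-recall curves on a validation set):

```python
import numpy as np

# results_pred: (num_samples, num_labels) array of sigmoid outputs,
# e.g. what model.predict(x_test_np) would return for a 3-label model
results_pred = np.array([[0.15, 0.60, 0.05],
                         [0.30, 0.10, 0.45]])

# one threshold per label (hypothetical values -- tune on validation data)
label_thresholds = np.array([0.2, 0.5, 0.4])

# broadcasting compares each column against its own threshold
results_bin = (results_pred >= label_thresholds).astype(int)
print(results_bin)  # [[0 1 0], [1 0 1]]
```

This avoids mutating the prediction array in place, so the raw scores stay available if you want to re-tune the thresholds later.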

Also, during training, how do I set a different threshold per label for the precision and recall metrics? I currently do this:

model.compile(loss='binary_crossentropy',
              optimizer=opt,
              metrics=['accuracy',
                       tf.keras.metrics.Precision(thresholds=0.2),
                       tf.keras.metrics.Recall(thresholds=0.2)])
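`tf.keras.metrics.Precision` and `Recall` accept a `class_id` argument that restricts the metric to one output index, so one option is to build a separate metric instance per label, each with its own threshold. A minimal sketch, assuming a 3-label model and hypothetical threshold values (the metric names `precision_i`/`recall_i` are just my labels):

```python
import tensorflow as tf

# hypothetical per-label thresholds -- tune on validation data
label_thresholds = [0.2, 0.5, 0.4]

metrics = ['accuracy']
for i, t in enumerate(label_thresholds):
    # class_id restricts each metric to a single label (output column)
    metrics.append(tf.keras.metrics.Precision(thresholds=t, class_id=i,
                                              name=f'precision_{i}'))
    metrics.append(tf.keras.metrics.Recall(thresholds=t, class_id=i,
                                           name=f'recall_{i}'))

# toy 3-label model standing in for the original `model`
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=metrics)
```

During `fit()` this reports one precision/recall pair per label, each evaluated at its own threshold, which also makes the per-label imbalance visible in the training logs.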
  • It would probably be easier to use the `class_weights` parameter during `fit()` rather than setting different thresholds. – thushv89 Jun 17 '20 at 22:59
  • Please clarify if you are indeed in a multi-label setting (a sample can belong to more than one class simultaneously) or simply multi-class (a sample can belong to one and only one class). If it is multi-class, please edit your post accordingly. – desertnaut Jun 18 '20 at 00:53
  • Sorry for not making that clear; it is a multi-label setting. – Ahmad Ayyad Jun 18 '20 at 08:12
  • You did mention it; it's only that, in my experience at least, people *very* frequently use the term "multi-label" to actually mean "multi-class", and these two settings are completely different. Edited your tags accordingly. – desertnaut Jun 18 '20 at 15:36

0 Answers