I'm doing multiclass-multilabel classification: I have N_labels fully independent labels for each example, and each label can take one of N_classes mutually exclusive values. More concretely, each example is classified by an N_labels-dimensional vector whose components come from the set {0, 1, ..., N_classes - 1}.
For example, if N_labels = 5 and N_classes = 3, each example may be classified by tags such as:
[2, 1, 0, 0, 1], [0, 0, 2, 2, 1], [0, 0, 0, 0, 0]
In addition, the classes for each label are heavily imbalanced: about 90% of the examples in the training set belong to class 0. So I'd like to perform weighted softmax cross-entropy to compute the loss for each label (and average over labels afterwards).
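One way to obtain such per-label class weights is inverse class frequencies; a small illustration with made-up counts (these numbers are invented just to reflect the ~90% class-0 imbalance):

import numpy as np

# Hypothetical per-label class counts from the training set,
# shape [N_labels, N_classes]:
class_counts = np.array([[900, 60, 40],
                         [900, 50, 50],
                         [910, 45, 45],
                         [890, 70, 40],
                         [900, 55, 45]], dtype=np.float32)

freq = class_counts / class_counts.sum(axis=1, keepdims=True)  # [N_labels, N_classes]
my_weights = 1.0 / freq                               # rare classes get larger weights
my_weights /= my_weights.mean(axis=1, keepdims=True)  # keep the loss scale comparable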
Tried to use:
tf.losses.sparse_softmax_cross_entropy  # but it seems its weights argument weights whole examples/labels, not the classes within each label
tf.nn.softmax_cross_entropy_with_logits, tf.nn.softmax_cross_entropy_with_logits_v2  # no weighting option at all
tf.nn.weighted_cross_entropy_with_logits  # suitable only for binary classification
I'd like to find a compute_loss function that computes the loss in the following way:
loss = compute_loss(logits=my_logits, labels=my_labels, weights=my_weights)
where
my_logits is of shape [batch_size, N_labels, N_classes]
my_labels is of shape [batch_size, N_labels]
my_weights is of shape [N_labels, N_classes]
Note that each label may have different class weights, as in the sketch below.
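For concreteness, here is a rough sketch of what I have in mind (assuming TF 1.x; the idea is to scale each per-label cross-entropy term by the weight of its true class, looked up via a one-hot mask):

import tensorflow as tf

def compute_loss(logits, labels, weights):
    # logits:  [batch_size, N_labels, N_classes] (float)
    # labels:  [batch_size, N_labels]            (int class ids)
    # weights: [N_labels, N_classes]             (float, per-label class weights)

    # Unweighted cross-entropy per (example, label) -> [batch_size, N_labels];
    # this op already handles rank-3 logits, so no reshaping is needed.
    ce = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

    # Select the weight of the true class for every (example, label):
    # one_hot is [batch_size, N_labels, N_classes], weights broadcasts over the
    # batch dimension, and summing over classes gives [batch_size, N_labels].
    one_hot = tf.one_hot(labels, depth=tf.shape(logits)[-1])
    w = tf.reduce_sum(one_hot * tf.cast(weights, one_hot.dtype), axis=-1)

    # Scale each term by its class weight, then average over labels and batch.
    return tf.reduce_mean(ce * w)

With my_logits, my_labels and my_weights shaped as above, loss = compute_loss(logits=my_logits, labels=my_labels, weights=my_weights) would then give a single scalar.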