
I am trying to implement a custom binary cross entropy loss from a paper in PyTorch, but I have run into a problem. I am not sure how to get the target y for only the samples whose predicted label (sigmoid(input)) == 1. Can someone help me with that? Or is there a way to select the matching target values, like boolean/index selection on a NumPy array or a list? The modified cross entropy is:

```
loss1 = (t/(t+1)) * cross_entropy   (only when predicted label == 1)
loss2 = (1/(t+1)) * cross_entropy   (only when predicted label == 0)
total loss = loss1 + loss2
```

That is, the cross_entropy term in loss1 is log(P(predicted label == 1 | x_i, theta)) * target_y, taken only over the samples whose predicted label == 1.
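In NumPy-style pseudo-code, what I want to compute looks roughly like this (only a sketch of the intent, with made-up toy values; p stands for sigmoid(input), y for the target, and t for the constant from the paper):

```python
import numpy as np

# toy values just to illustrate the shapes involved
p = np.array([0.9, 0.2, 0.7, 0.4])   # sigmoid(input)
y = np.array([1.0, 0.0, 0.0, 1.0])   # targets
t = 3.0                              # constant from the paper

pred = p >= 0.5                                   # predicted labels
ce = -(y * np.log(p) + (1 - y) * np.log(1 - p))   # element-wise cross entropy
loss1 = (t / (t + 1)) * ce[pred].sum()            # only samples with predicted label == 1
loss2 = (1 / (t + 1)) * ce[~pred].sum()           # only samples with predicted label == 0
total_loss = loss1 + loss2
```

My question is how to express this kind of selection on PyTorch tensors inside a loss function.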
                                                    
Bertie

1 Answer


An easy way to implement this is to use the weight parameter of PyTorch's binary cross entropy loss (nn.BCELoss). It can be set up as follows:

```python
import torch
import torch.nn as nn

classWeightsPositive = 1 - (numberOfPositiveSamples / totalNumberOfSamples)
classWeightsNegative = 1 - classWeightsPositive
# nn.BCELoss takes the rescaling weight as a tensor, via the `weight` keyword (not `weights`)
weights = torch.tensor([classWeightsPositive, classWeightsNegative])
criterion = nn.BCELoss(weight=weights)
```

This makes sure that the losses are scaled according to the weights given to the loss function. The rest of the code remains the same, and the loss can be computed as follows:

```python
for i, (data, target) in enumerate(dataLoader):
    optimiser.zero_grad()                  # clear gradients from the previous step
    predictions = model(data)              # forward pass (sigmoid outputs for BCELoss)
    loss = criterion(predictions, target)  # weighted binary cross entropy
    loss.backward()                        # backpropagate
    optimiser.step()                       # update the parameters
```
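As a side note, if the model returns raw logits rather than sigmoid outputs, nn.BCEWithLogitsLoss exposes a pos_weight argument that rescales only the positive-class term. A minimal sketch, reusing the placeholder names from above and assuming the negatives-to-positives ratio is the weight you want:

```python
import torch
import torch.nn as nn

# pos_weight multiplies the positive term of the loss; negatives / positives is a common choice
numberOfNegativeSamples = totalNumberOfSamples - numberOfPositiveSamples
pos_weight = torch.tensor([numberOfNegativeSamples / numberOfPositiveSamples])
criterion = nn.BCEWithLogitsLoss(pos_weight=pos_weight)

# the model should now output raw logits (no final sigmoid); the loss applies sigmoid internally
loss = criterion(model(data), target)
```

pos_weight is broadcast against the target, so a single-element tensor is enough for binary classification.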
Azhan Mohammed