
Getting wrong output while calculating cross-entropy loss using PyTorch

Hi guys, I calculated the cross-entropy loss using PyTorch with Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]]) and label = torch.tensor([0, 0]). The output should be 0, but I got tensor(0.5514). Can anyone please explain why it comes out as 0.55 instead of 0? Code for reference:

1 Answer


Yes, you are getting the correct output.

import torch
Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]])
label = torch.tensor([0, 0])

print(torch.nn.functional.cross_entropy(Input,label))
# tensor(0.5514)

torch.nn.functional.cross_entropy combines log_softmax and nll_loss in a single function, so it is equivalent to:

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)

Code for reference:

print(torch.nn.functional.softmax(Input, 1).log())
# tensor([[-0.5514, -1.5514, -1.5514],
#         [-0.5514, -1.5514, -1.5514]])

print(torch.nn.functional.log_softmax(Input, 1))
# tensor([[-0.5514, -1.5514, -1.5514],
#         [-0.5514, -1.5514, -1.5514]])

print(torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label))
# tensor(0.5514)

Now, you can see that

torch.nn.functional.cross_entropy(Input, label) is equal to

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)
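
To see where 0.5514 comes from (and why it is not 0), here is a minimal sketch that reproduces the number by hand: softmax maps the logits [1, 0, 0] to probabilities, and the loss is the negative log of the probability assigned to the true class. The loss only approaches 0 when the true-class logit dominates, because cross_entropy expects raw logits, not probabilities.

import math
import torch

logits = [1.0, 0.0, 0.0]

# softmax: exp(x_i) / sum_j exp(x_j)
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]
print(probs)                  # ~[0.5761, 0.2119, 0.2119]

# cross entropy for true class 0 is -log(p_0)
print(-math.log(probs[0]))    # ~0.5514, same as tensor(0.5514)

# the loss only goes toward 0 when the true-class logit is much larger, e.g.:
big = torch.tensor([[100.0, 0.0, 0.0], [100.0, 0.0, 0.0]])
print(torch.nn.functional.cross_entropy(big, torch.tensor([0, 0])))  # ~tensor(0.)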

  • Thanks for your detailed explanation. One more thing: can we use softmax at the last layer if we use cross entropy for loss calculation? – Sukesh Ram Jan 30 '23 at 07:55
  • I think we can, but please double check, I am not sure. Also, if this solution answers your question please accept it as an answer. – Talha Tayyab Jan 30 '23 at 08:24
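
Regarding the follow-up comment about a softmax at the last layer: a quick sketch (relying only on the fact shown above, that F.cross_entropy applies log_softmax internally) suggests the model should output raw logits, since feeding already-softmaxed probabilities applies the normalisation twice and distorts the loss.

import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, -1.0, 0.5]])
target = torch.tensor([0])

# raw logits: cross_entropy applies log_softmax itself
print(F.cross_entropy(logits, target))                    # ~tensor(0.2413)

# probabilities from an extra softmax layer: log_softmax gets applied
# on top of them, so the loss value is distorted
print(F.cross_entropy(F.softmax(logits, dim=1), target))  # ~tensor(0.7017)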