conf_loss = cross_entropy_loss(conf_preds.view(-1, num_classes), conf_targets.view(-1))

The shapes of x and y are

X: torch.Size([69856, 40]),  Y: torch.Size([69856])

respectively. The author specifies the sizes as x: [N,D] and y: [N,], and my y is indeed [N]. I need to compute the difference, but I am getting an out-of-memory error. Can anyone help with the dimensions? The final shape after taking the difference should be [N,]. The expression I need to evaluate is print(log_sum_exp - x.gather(1, y.view(-1,1)))
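To illustrate what I think is happening (a tiny reproduction with made-up values, not my actual data): subtracting a [N,1] tensor from a [N] tensor broadcasts to [N,N], which at N = 69856 would be ~4.9 billion elements.

```python
import torch

N = 4
log_sum_exp = torch.randn(N)   # shape [N], like my log_sum_exp
gathered = torch.randn(N, 1)   # shape [N, 1], like x.gather(1, y.view(-1, 1))

# [N] - [N, 1] broadcasts: [N] is treated as [1, N], result is [N, N]
diff = log_sum_exp - gathered
print(diff.shape)  # torch.Size([4, 4])
```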

import torch

def cross_entropy_loss(x, y):
    '''Cross entropy loss w/o averaging across all samples.
    Args:
      x: (tensor) sized [N,D].
      y: (tensor) sized [N,].
    Return:
      (tensor) cross entropy loss, sized [N,].
    '''
    print("X:", x.shape)
    print("Y:", y.shape)
    xmax = x.data.max()

    log_sum_exp = torch.log(torch.sum(torch.exp(x - xmax), 1)) + xmax
    print(log_sum_exp.shape)                 # torch.Size([69856])
    print(x.gather(1, y.view(-1, 1)).shape)  # torch.Size([69856, 1])
    # print(log_sum_exp - x.gather(1, y.view(-1, 1)))  # this is where I run out of memory

    # return log_sum_exp - x.gather(1, y.view(-1, 1))
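For what it's worth, a minimal sketch of how I believe the shapes could be made to line up (my own small example, assuming it is acceptable to squeeze the gathered column back to [N] before subtracting):

```python
import torch

x = torch.randn(5, 3)           # [N, D] logits
y = torch.randint(0, 3, (5,))   # [N] class indices

xmax = x.max()
log_sum_exp = torch.log(torch.sum(torch.exp(x - xmax), 1)) + xmax  # [N]
picked = x.gather(1, y.view(-1, 1)).squeeze(1)                     # [N, 1] -> [N]
loss = log_sum_exp - picked                                        # elementwise, stays [N]
print(loss.shape)  # torch.Size([5])
```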
