
I am following a tutorial, and the softmax function crashes when I use it.

import torch
import torch.nn.functional as F

newSignals = [0.5, 0., 0., -0.7911, 0.7911]
newState = torch.Tensor(newSignals).float().unsqueeze(0)
probs = F.softmax(self.model(newState), dim=1)

self.model is a neural network (torch.nn.Module), which returns a Tensor like

tensor([[ 0.2699, -0.2176, 0.0333]], grad_fn=<AddmmBackward>)

So the line probs = F.softmax(self.model(newState), dim=1) crashes the program, but with dim=0 it works; however, that is not what I want.

Welcome to SO; don't worry about the English, native speakers can correct it if required. Have a look at [mcve] and update your post if you want an answer. Specify and provide inputs if possible, and explain what the expected output is. Also provide the traceback of the raised exception. This information is crucial for getting help. – jlandercy Dec 16 '18 at 09:22

1 Answer


Disclaimer: I am sorry, this should probably have been a comment, but I can't fit all of the below into a comment.

Are you sure this is the problem? The snippet below just worked for me.

import torch
a = torch.tensor([[0.2699, -0.2176, 0.0333]])
a.softmax(dim=1)  # softmax across the row's 3 entries
> tensor([[0.4161, 0.2555, 0.3284]])
a.softmax(dim=0)  # softmax down each column of the (1, 3) tensor
> tensor([[1., 1., 1.]])
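
Note that with dim=0 the softmax is taken down each column, and since your tensor has shape (1, 3), every column contains a single element, so every entry becomes 1. That is why dim=0 "works" but produces useless values. For completeness, here is a self-contained sketch of your whole pipeline; the linear layer is a hypothetical stand-in for your actual network, but F.softmax(..., dim=1) runs fine on its (1, 3) output:

import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(5, 3)  # hypothetical stand-in for the real network: 5 signals in, 3 outputs

newSignals = [0.5, 0., 0., -0.7911, 0.7911]
newState = torch.Tensor(newSignals).float().unsqueeze(0)  # shape (1, 5)

out = model(newState)          # shape (1, 3), like the tensor in the question
probs = F.softmax(out, dim=1)  # probabilities over the 3 outputs
print(probs, probs.sum())      # the row sums to 1

If this sketch runs but your original code still crashes, the error most likely originates inside self.model rather than in softmax itself, so please post the full traceback.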
Umang Gupta