I am following a tutorial, and `F.softmax` crashes when I call it:
```python
import torch
import torch.nn.functional as F

newSignals = [0.5, 0., 0., -0.7911, 0.7911]
newState = torch.Tensor(newSignals).float().unsqueeze(0)  # shape (1, 5)
probs = F.softmax(self.model(newState), dim=1)            # this line crashes
```
`self.model` is a neural network (a `torch.nn.Module`) that returns a tensor like:

```
tensor([[ 0.2699, -0.2176, 0.0333]], grad_fn=<AddmmBackward>)
```
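For reference, here is a minimal stand-in of my setup. The `DummyModel` below is hypothetical (the real architecture comes from the tutorial), but the input shape and the softmax call match my code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-in for the tutorial's network: the real architecture
# differs, but it also maps 5 input signals to 3 output scores.
class DummyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(5, 3)

    def forward(self, x):
        return self.fc(x)

model = DummyModel()
newSignals = [0.5, 0., 0., -0.7911, 0.7911]
newState = torch.Tensor(newSignals).float().unsqueeze(0)  # shape (1, 5)
out = model(newState)                                     # shape (1, 3)
probs = F.softmax(out, dim=1)                             # softmax over the 3 outputs
print(probs)
```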
So the line `probs = F.softmax(self.model(newState), dim=1)` crashes the program, while with `dim=0` it runs without error, but the result is not what I need.
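To illustrate why `dim=0` is not what I need on a `(1, 3)` output: it normalizes down each column, and each column holds a single element, so every probability collapses to `1.0`. A quick sketch:

```python
import torch
import torch.nn.functional as F

out = torch.tensor([[0.2699, -0.2176, 0.0333]])  # shape (1, 3), like the model output

# dim=0 normalizes each column of length 1, so every entry becomes 1.0.
print(F.softmax(out, dim=0))  # tensor([[1., 1., 1.]])

# dim=1 normalizes across the 3 outputs, giving a real distribution.
print(F.softmax(out, dim=1))  # ~ tensor([[0.4161, 0.2555, 0.3284]])
```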