
I am trying to build a simple "neural network" that consists only of elementwise multiplication by a weight vector. In this scenario my data is one-hot encoded: exactly one feature is "1" and all the rest are "0", and I am trying to predict the correct class with softmax and cross entropy loss. Here is my code (say I have 26 features and 26 classes):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self, n):
        super(Net, self).__init__()
        # one learnable weight per feature
        self.weights = nn.Parameter(torch.Tensor(n))

    def forward(self, x):
        # elementwise multiplication, then softmax over the feature dimension
        return F.softmax(x * self.weights, dim=-1)

net = Net(n=26)

Now I feed it the unsqueezed data tensor (say the batch size is one), which has shape (1, 1, 26), together with a labels tensor of shape (1, 26). When I pass the output to the loss function with loss = criterion(nn_outputs, labels), I receive the following error:

RuntimeError: Assertion `cur_target >= 0 && cur_target < n_classes' failed.  at c:\a\w\1\s\tmp_conda_3.7_110509\conda\conda-bld\pytorch_1544094576194\work\aten\src\thnn\generic/SpatialClassNLLCriterion.c:110
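For reference, this is roughly how I am setting things up; the criterion is nn.CrossEntropyLoss, and the concrete values, the hot index, and the label dtype below are placeholders I chose just to reproduce the shapes described above:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# one-hot input, unsqueezed to shape (1, 1, 26); index 3 is a placeholder
data = torch.zeros(1, 1, 26)
data[0, 0, 3] = 1.0

# one-hot encoded labels of shape (1, 26)
labels = torch.zeros(1, 26, dtype=torch.long)
labels[0, 3] = 1

nn_outputs = net(data)                 # net is the Net(n=26) defined above
loss = criterion(nn_outputs, labels)   # this call raises the error above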

Is there perhaps a simpler way to build and train this simple neural network without running into this error?
