import torch
import torch.nn as nn
from torch.utils.data import DataLoader

model = nnet(4, 2, 1)  # nnet is the model class defined elsewhere in my code
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

trainloader = DataLoader(dataset=dataset, batch_size=15)
val_loader = DataLoader(dataset=val_dataset, batch_size=150)
LOSS = []
accuracy = []
N_test = len(val_dataset)

def my_trainer(epochs):
    for epoch in range(epochs):
        for x,y in trainloader:
            z = model(x)
            loss = criterion(z, y)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
        LOSS.append(loss.item())  # records only the last batch's loss of this epoch
        # evaluate accuracy on the validation set
        correct = 0
        for x,y in val_loader:
            z = model(x)
            _, yhat = z.max(1)  # predicted class = index of the largest logit
            correct += (yhat == y).sum().item()
        acc = correct/N_test
        accuracy.append(acc)
    
my_trainer(5000)

Updated code: I should have been using CrossEntropyLoss instead of BCE, as it's a multiclass problem. This might have contributed to my error. Now I'm getting the error

"1D target tensor expected, multi-target not supported"

for the line loss = criterion(z, y), even though y is a 1-dimensional tensor for the target. Oh well, at least the accuracy problem is solved.
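
For reference, nn.CrossEntropyLoss expects the target to be a 1D tensor of class indices, and a target with an extra dimension (shape [N, 1] instead of [N]) raises exactly this error in the PyTorch releases current when this was asked. A minimal sketch with made-up shapes (batch of 15, 3 classes as for Iris) showing the failing and working cases:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(15, 3)                        # batch of 15 samples, 3 classes
bad_target = torch.zeros(15, 1, dtype=torch.long)  # shape [15, 1]: one extra dimension

# criterion(logits, bad_target)  # raises "1D target tensor expected, multi-target not supported"

good_target = bad_target.squeeze(1)                # shape [15]: plain class indices
loss = criterion(logits, good_target)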

    Is the indentation correct? If so, you're only computing it once per epoch, and perhaps you're just unlucky with the last batch? – Berriel Sep 11 '21 at 18:51
  • @berriel hey, you helped with my last pytorch question too! Albeit that was off my work account :) And yes, I've tried every indentation possible, have no idea why it doesn't want to work, even though my loss does go down perfectly – Robbie Meaney Sep 12 '21 at 09:30
  • Could you run your script for a single epoch and plot the loss and accuracy for each batch? – Ivan Sep 12 '21 at 09:55
  • Doing so gives me a good loss and an accuracy of 0 the whole way. I think I found some errors already; I will clarify in a comment. Appreciate the assistance – Robbie Meaney Sep 12 '21 at 10:34
  • @Ivan question edited to reflect last few hours of findings – Robbie Meaney Sep 12 '21 at 10:37
  • Ok, can you show the model's definition? Most specifically, are you applying an activation function on your model's output? Also, can you tell us the shapes of `x` and `y`? – Ivan Sep 12 '21 at 11:43
  • @Ivan sure, the model has 1 hidden layer, and the forward pass is x=sigmoid(self.linear1(x)) and then x=self.linear2(x), with no activation on the final one. It's the Iris data set, so there are 4 inputs and one output (see the sketch below these comments) – Robbie Meaney Sep 13 '21 at 13:14
  • Please edit your question with the relevant information. Thanks! – Ivan Sep 13 '21 at 13:15
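
Based on that comment description, a hypothetical reconstruction of the nnet class (the exact definition is not in the question, so the names and sizes here are assumptions): one hidden layer with a sigmoid activation and a plain linear output, which is what nn.CrossEntropyLoss expects, since it applies log-softmax internally. Note that for the three-class Iris data the output layer needs 3 units (one logit per class), whereas nnet(4, 2, 1) above constructs a single output unit.

import torch
import torch.nn as nn

# Hypothetical reconstruction based on the comments; names and sizes are assumed.
class nnet(nn.Module):
    def __init__(self, in_features, hidden, out_features):
        super().__init__()
        self.linear1 = nn.Linear(in_features, hidden)
        self.linear2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        x = torch.sigmoid(self.linear1(x))
        return self.linear2(x)  # raw logits, no activation on the output

# For the Iris data: 4 features in, 3 class logits out
model = nnet(4, 2, 3)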

0 Answers