I want to print the model's validation loss for each epoch. What is the right way to compute and print the validation loss?
Is it like this:
    criterion = nn.CrossEntropyLoss(reduction='mean')
    losses = 0.0
    for x, y in validation_loader:
        optimizer.zero_grad()
        out = model(x)
        loss = criterion(out, y)
        loss.backward()
        optimizer.step()
        losses += loss
    display_loss = losses / len(validation_loader)
    print(display_loss)
or like this:
    criterion = nn.CrossEntropyLoss(reduction='mean')
    losses = 0.0
    for x, y in validation_loader:
        optimizer.zero_grad()
        out = model(x)
        loss = criterion(out, y)
        loss.backward()
        optimizer.step()
        losses += loss
    display_loss = losses / len(validation_loader.dataset)
    print(display_loss)
or something else? Thank you.
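For reference, here is a minimal runnable sketch of a typical validation loop; the tiny `nn.Linear` model and random `TensorDataset` are stand-ins, just to make the snippet self-contained. Note that it deliberately omits `zero_grad()`, `backward()`, and `step()`, since validation should not update the weights, and it wraps the loop in `torch.no_grad()` so no computation graph is built:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in model and data (placeholders for your own model/loader).
model = nn.Linear(4, 3)
criterion = nn.CrossEntropyLoss(reduction='mean')

dataset = TensorDataset(torch.randn(20, 4), torch.randint(0, 3, (20,)))
validation_loader = DataLoader(dataset, batch_size=5)

model.eval()                       # disable dropout / use running batch-norm stats
losses = 0.0
with torch.no_grad():              # no gradients needed during validation
    for x, y in validation_loader:
        out = model(x)
        loss = criterion(out, y)   # mean loss over this batch
        losses += loss.item()      # .item() converts the 0-d tensor to a Python float

# Average of per-batch mean losses; this equals the true per-sample mean
# only when every batch has the same size.
display_loss = losses / len(validation_loader)
print(display_loss)
```

Since `reduction='mean'` already averages within each batch, dividing the accumulated per-batch means by `len(validation_loader)` (the number of batches) gives a per-batch average; dividing the same sum by `len(validation_loader.dataset)` would mix the two normalizations.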