
I came across this code online and I was wondering if I interpreted it correctly. Below is part of a gradient descent training loop; the full code is available at https://jovian.ml/aakashns/03-logistic-regression. My question is as follows: during the training phase, I believe the author is trying to minimize the loss for each batch by updating the parameters. However, how can we be sure that the total loss over all training samples is minimized if loss.backward() is only applied to the batch loss?

def fit(epochs, lr, model, train_loader, val_loader, opt_func=torch.optim.SGD):
    history = []
    optimizer = opt_func(model.parameters(), lr)

    for epoch in range(epochs):
        # Training Phase
        for batch in train_loader:
            loss = model.training_step(batch)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
        # Validation phase
        result = evaluate(model, val_loader)
        model.epoch_end(epoch, result)
        history.append(result)
    return history
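
To make my concern concrete, here is a toy sketch I put together (the linear model, synthetic data, batch split, and learning rate below are made up by me, not taken from the notebook). After the step on the second batch, the loss on the first batch can go back up:

import torch
import torch.nn as nn

torch.manual_seed(0)
X, y = torch.randn(64, 3), torch.randn(64, 1)    # synthetic data
batches = [(X[:32], y[:32]), (X[32:], y[32:])]   # two mini-batches

model = nn.Linear(3, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.5)
loss_fn = nn.MSELoss()

for step, (xb, yb) in enumerate(batches):
    loss = loss_fn(model(xb), yb)   # loss on the current batch only
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    with torch.no_grad():           # re-check every batch after this update
        print(step, [loss_fn(model(bx), by).item() for bx, by in batches])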
  • I think batches are just an alternative to going through the whole training set at once each epoch, because that way training completes significantly faster. – Nosrep Jun 01 '20 at 00:21
  • Yes, I understand that it's much faster to work with a smaller array. But with mini-batches you end up with a new set of parameters after every batch. In this piece of code (assuming only 1 epoch and 2 mini-batches), the parameters are updated based on loss.backward() of the first batch, then on loss.backward() of the second batch. The loss on the first batch might therefore get larger after the second batch has been trained on, so this code does not necessarily lead to a minimum total loss. How do we make sure we minimize the total loss over all samples? (A quick numerical check related to this is sketched after these comments.) – anyang_peng Jun 01 '20 at 01:54
  • You are missing fundamental theoretical background. The process, and the answers to the questions you have, are all covered (at least partially) in the relevant textbooks. – Xxxo Jun 02 '20 at 06:56
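
As a quick sanity check related to the concern above (again with toy data and a made-up linear model, not from the notebook): with equal-sized batches and a mean-reduced loss, the average of the per-batch gradients matches the gradient of the loss over the full dataset, so each batch step follows the direction of the total loss on average, even though any single step can increase the loss of another batch.

import torch
import torch.nn as nn

torch.manual_seed(0)
X, y = torch.randn(64, 3), torch.randn(64, 1)   # synthetic data
model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()                          # 'mean' reduction by default

def flat_grad(xb, yb):
    # gradient of the mean loss on (xb, yb), flattened into one vector
    model.zero_grad()
    loss_fn(model(xb), yb).backward()
    return torch.cat([p.grad.flatten().clone() for p in model.parameters()])

g_full = flat_grad(X, y)                                             # full-dataset gradient
g_avg = (flat_grad(X[:32], y[:32]) + flat_grad(X[32:], y[32:])) / 2  # mean of the two batch gradients
print(torch.allclose(g_full, g_avg, atol=1e-6))                      # expected: True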

0 Answers