with torch.no_grad():
    input = Variable(input).cuda()
    target = Variable(target).cuda(non_blocking=True)
    y = model(input)
# many things here
Does no_grad continue to have effect outside of the "with" scope?
No: torch.no_grad()
has no effect outside the "with" scope.
According to this answer from a moderator on the PyTorch forums:
with torch.no_grad():
    # No gradients in this block
    x = self.cnn(x)

# Gradients as usual outside of it
x = self.lstm(x)
That is the purpose of the with
statement in Python: the context manager passed to with
(here torch.no_grad()
) only has effect inside the with
block, not after it. See the Python docs on the with statement for complete details.
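You can verify the scoping yourself by checking the requires_grad flag of tensors computed inside and outside the block. A minimal sketch (the variable names are just for illustration):

```python
import torch

x = torch.ones(2, requires_grad=True)

with torch.no_grad():
    # Inside the block: autograd is disabled, so the result
    # is detached from the graph (requires_grad=False).
    y = x * 2

# Outside the block: autograd is back on, so the same
# operation produces a tensor that tracks gradients.
z = x * 2

print(y.requires_grad)  # False
print(z.requires_grad)  # True
```

Note that in modern PyTorch (0.4+) the Variable wrapper from the question is deprecated; plain tensors carry requires_grad directly, so the behavior above is all you need.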