I'm trying to understand the basic PyTorch autograd system:
import torch

x = torch.tensor(10., requires_grad=True)
print('tensor:', x)
x.backward()  # backward() called directly on the scalar leaf tensor
print('gradient:', x.grad)
output:
tensor: tensor(10., requires_grad=True)
gradient: tensor(1.)
Since x is a scalar constant and no function is applied to it, I expected the gradient to be 0. Why is the gradient 1. instead?
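For comparison, here is a minimal sketch of the case I had in mind, where a function is actually applied to x before calling backward() (my own example; the choice of y = x ** 2 is arbitrary):

import torch

# Comparison case: a function y = x**2 is applied to x,
# so dy/dx = 2*x = 20 at x = 10.
x = torch.tensor(10., requires_grad=True)
y = x ** 2
y.backward()
print('gradient:', x.grad)  # prints tensor(20.)

In that case the gradient matches the chain rule as I expect, which is why the first snippet's result of 1. confuses me.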