I am trying to do a simple thing: use autograd to get gradients and do gradient descent:
import tangent

a, b = 1.0, 1.0  # model parameters (need initial values for the code to run)

def model(x):
    return a * x + b

def loss(x, y):
    return (y - model(x)) ** 2.0
After computing the loss for an input-output pair, I want the gradients of the loss with respect to the parameters, so I can update them:
l = loss(1, 2)
# grad_a = gradient of loss wrt a?
# grad_b = gradient of loss wrt b?
a = a - grad_a
b = b - grad_b
But the library tutorials don't show how to obtain the gradient with respect to a or b, i.e. the parameters; neither autograd's tutorials nor tangent's do.
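From the autograd README I understand that grad(f) returns the gradient with respect to f's first argument, so presumably the fix is to pass the parameters in explicitly instead of closing over globals. Here is a minimal sketch of what I mean (the initial values, learning rate, and iteration count are placeholders I made up):

import autograd.numpy as np
from autograd import grad

def loss(params, x, y):
    a, b = params[0], params[1]
    return (y - (a * x + b)) ** 2.0

grad_loss = grad(loss)          # gradient of loss wrt params, the first argument

params = np.array([1.0, 1.0])   # initial a, b (placeholder values)
lr = 0.1                        # placeholder learning rate

for _ in range(100):
    params = params - lr * grad_loss(params, 1.0, 2.0)

tangent.grad also appears to accept a wrt tuple of argument indices, so the same explicit-parameter trick might work there too, but I have not been able to confirm this from the tutorials.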