I understand that as long as I define a computation inside a tf.GradientTape()
context, the tape will compute the gradient w.r.t. all the variables that the output of the computation depends on. However, I think I am not quite grasping the subtleties of the gradient tape, as the following code does not behave as I expect:
import tensorflow as tf

x = tf.Variable(2.)
loss_ = x**2 - 2*x + 1
with tf.GradientTape(persistent=True) as g:
    loss = loss_ * 1
print(g.gradient(loss, x))

Output: None
Why is the gradient w.r.t. x not computed?
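For contrast, here is a variant that does work for me: when the computation itself runs inside the tape, the op is recorded and the gradient d/dx (x² − 2x + 1) = 2x − 2 evaluates to 2.0 at x = 2:

```python
import tensorflow as tf

x = tf.Variable(2.)
with tf.GradientTape() as g:
    # the op on x now executes inside the tape context
    loss = x**2 - 2*x + 1
grad = g.gradient(loss, x)  # 2*x - 2 = 2.0 at x = 2
```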
It seems I am only able to compute gradients w.r.t. variables that are explicitly used within the context. For example, the following code also fails to compute the gradient:
import tensorflow as tf
tf.compat.v1.disable_eager_execution()

x = tf.Variable(2.)
t1 = x**2
t2 = -2*x
t3 = 1.
with tf.GradientTape(persistent=True) as g:
    loss = t1 + t2 + t3
print(g.gradient(loss, x))
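I also experimented with watching the precomputed tensor (this sketch is back in eager mode, without disable_eager_execution). The tape then treats the tensor as an input: I get d loss/d t = 1.0, but the gradient w.r.t. x is still None, presumably because x**2 - 2*x was never recorded by the tape:

```python
import tensorflow as tf

x = tf.Variable(2.)
t = x**2 - 2*x + 1           # computed before the tape exists
with tf.GradientTape(persistent=True) as g:
    g.watch(t)               # register the precomputed tensor as a tape input
    loss = t * 1             # only this multiplication is recorded
grad_t = g.gradient(loss, t)  # 1.0: loss = t * 1
grad_x = g.gradient(loss, x)  # None: ops on x were never recorded
```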