
Here is my example using TensorFlow 2.0:

```python
import tensorflow as tf

w = tf.Variable([[1.0]])

# Record loss_1 = w^2 on the first tape
with tf.GradientTape() as tape_1:
    loss_1 = w * w

# Record loss_2 = w^3 on the second tape
with tf.GradientTape() as tape_2:
    loss_2 = w * w * w

grad_1 = tape_1.gradient(loss_1, w)
grad_2 = tape_2.gradient(loss_2, w)
print(grad_1)
print(grad_2)
```

It returns:

```
tf.Tensor([[2.]], shape=(1, 1), dtype=float32)
tf.Tensor([[3.]], shape=(1, 1), dtype=float32)
```

These are the correct coefficients, but grad_2 should also indicate that the derivative is 3w^2, not just 3. How can I retrieve the w^2 part?

user1700890

1 Answer


The gradient results do not mean that. If you take your functions, f(w) = w^2 and g(w) = w^3, their respective derivatives with respect to w are f'(w) = 2w and g'(w) = 3w^2. What `gradient` gives you is the value of these derivative functions at the current value of w. So, since w is initialized to 1, you get f'(1) = 2 and g'(1) = 3. TensorFlow can, in a way, compute the symbolic derivative, but only as a sequence of TensorFlow operations, so it is not straightforward to extract a nice mathematical expression like 3w^2 from it. And with eager execution, as you are using, that graph is not even available: the operations are executed as needed and the intermediates are discarded.
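
To convince yourself that the w^2 part is really there, you can evaluate the gradient at a different point. A minimal sketch, reusing the setup from the question but with w initialized to 2.0 instead of 1.0:

```python
import tensorflow as tf

# Same example as in the question, but with w = 2.0 instead of 1.0
w = tf.Variable([[2.0]])
with tf.GradientTape() as tape:
    loss = w * w * w  # g(w) = w^3

grad = tape.gradient(loss, w)
print(grad)  # tf.Tensor([[12.]], ...) == 3 * 2^2, i.e. 3w^2 at w = 2
```

The result is 12 = 3 * 2^2, so the tape really is computing 3w^2; it just hands you the value at the current w rather than the formula.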

jdehesa
  • Discarding derivative computations sounds inefficient. Does it only happen in eager execution, or always? – user1700890 Nov 22 '19 at 19:44
  • 1
    @user1700890 To be honest I'm not exactly sure how it's done in eager execution, in graph mode the gradient operations are added to the graph and then evaluated on each request. With [`tf.GradientTape`](https://www.tensorflow.org/api_docs/python/tf/GradientTape) I _think_ it initially just tracks the performed operations and then when `gradients` is called it finds the dependency graph and does the backpropagation one step at a time, repeatedly looking up the gradient in the gradients registry, computing it and discarding the previous intermediate gradient. – jdehesa Nov 25 '19 at 10:55
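
As a small illustration of the "intermediates are discarded" point: by default a tape releases its recorded operations after a single `gradient` call, and `persistent=True` keeps them around for multiple queries. A minimal sketch, using the variable and losses from the question:

```python
import tensorflow as tf

# By default a tape can only be queried once; persistent=True keeps
# the recorded operations so several targets can be differentiated
# from the same recording.
w = tf.Variable([[1.0]])
with tf.GradientTape(persistent=True) as tape:
    loss_1 = w * w
    loss_2 = w * w * w

print(tape.gradient(loss_1, w))  # [[2.]] == 2w at w = 1
print(tape.gradient(loss_2, w))  # [[3.]] == 3w^2 at w = 1
del tape  # explicitly drop the tape's resources when done
```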