
Here is the source code for the GradientTape.gradient method: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/eager/backprop.py#L994-L1095. However, making changes to this source code does not affect the function's behavior, as shown below.

    flat_grad = imperative_grad.imperative_grad(
        self._tape,
        flat_targets,
        flat_sources,
        output_gradients=output_gradients,
        sources_raw=flat_sources_raw,
        unconnected_gradients=unconnected_gradients)

    if not self._persistent:
      # Keep track of watched variables before setting tape to None
      self._watched_variables = self._tape.watched_variables()
      self._tape = None

    grad = nest.pack_sequence_as(sources, flat_grad)
    return 0  # changed from the original `return grad`

Here I have changed the return statement. However, running the following

import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
  y = x**2
dy_dx = tape.gradient(y, x)
dy_dx.numpy()

still gives 6.0 as the output. What else needs to be changed?

1 Answer


You should never edit an installed package directly.
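As for why your edit had no effect: Python most likely imports TensorFlow from the installed copy in site-packages, not from the source checkout you edited, and an already-running interpreter will not pick up edits until it is restarted. A quick way to check which file is actually loaded (a minimal sketch using the standard inspect module):

import inspect

import tensorflow as tf

# Print the file Python actually loaded for GradientTape.gradient.
# If this path points somewhere other than the file you edited,
# your change was never imported in the first place.
print(inspect.getsourcefile(tf.GradientTape.gradient))

Even if you do locate the right file, patching it by hand is fragile and will be overwritten on the next upgrade.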

Since tf.GradientTape is a class, you can subclass it and override the .gradient method to return your desired value.

import tensorflow as tf
from tensorflow.python.ops.unconnected_gradients import UnconnectedGradients

class TestGradientTape(tf.GradientTape):
    def gradient(self,
                 target,
                 sources,
                 output_gradients=None,
                 unconnected_gradients=UnconnectedGradients.NONE):

        # Calls the original .gradient method from the parent class
        super().gradient(target, sources, output_gradients, unconnected_gradients)
        
        # Returns 0 instead of whatever the above method gave you
        return 0
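Note that the override still calls super().gradient(), so the tape is consumed and released exactly as in the stock implementation; only the return value is replaced.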

# Now you can use TestGradientTape instead of tf.GradientTape
x = tf.Variable(3.0)

with TestGradientTape() as tape:
  y = x**2

# And this will go through our overridden method from TestGradientTape
dy_dx = tape.gradient(y, x)
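Printing the result confirms that the overridden method was used:

print(dy_dx)  # prints 0 instead of the gradient tensor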