Is there a way to figure out whether an NDArray variable has a gradient attached / requires backprop, similar to the requires_grad attribute in PyTorch?
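
For reference, this is the PyTorch behaviour I mean, where requires_grad is set on a leaf tensor and propagates to intermediate results:

import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x ** 2                  # intermediate result
print(x.requires_grad)      # True
print(y.requires_grad)      # True -- propagated automatically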

I've tried checking whether x.grad is None, but that doesn't work for intermediate variables:

import mxnet as mx

x = mx.nd.array([1, 2])
x.attach_grad()            # mark x so its gradient is allocated
with mx.autograd.record():
    y = x ** 2             # intermediate variable
    z = 2 * y
z.backward()

>>> print(x.grad)
[4. 8.]
<NDArray 2 @cpu(0)>
>>> print(y.grad)
None

In this example, x ends up with a gradient but the intermediate variable y does not, even though y is part of the recorded graph and should also be trainable.

I've tried looking through the source code, but can't figure it out.
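
One related check that does exist, as far as I can tell, is mx.autograd.is_recording(), but it only reports the global recording state, not whether a particular array belongs to a recorded graph:

import mxnet as mx

x = mx.nd.array([1, 2])
x.attach_grad()

print(mx.autograd.is_recording())      # False: nothing is being recorded yet
with mx.autograd.record():
    y = x ** 2
    print(mx.autograd.is_recording())  # True, but this is a global flag,
                                       # not a per-array attribute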


0 Answers