
I'm trying to implement an L2 normalization layer for a convolutional neural network, and I'm stuck on the backward pass:

def forward(self, inputs):
    x, = inputs
    # L2 norm of each row, kept as a column vector so the division broadcasts
    self._norm = np.expand_dims(np.linalg.norm(x, ord=2, axis=1), axis=1)
    z = np.divide(x, self._norm)
    return z,

def backward(self, inputs, grad_outputs):
    x, = inputs
    gz, = grad_outputs
    gx = None # how to compute gradient here?
    return gx,

How do I calculate gx? My first guess was

gx = - gz * x / self._norm**2

But this seems to be wrong.
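One way to sanity-check a candidate gx is to compare it against a finite-difference estimate. A minimal standalone sketch (the free functions forward and numerical_gx, the test shapes, and eps are illustrative, not part of the layer above):

import numpy as np

def forward(x):
    # Same computation as the layer's forward pass: normalize each row to unit L2 norm.
    norm = np.linalg.norm(x, ord=2, axis=1, keepdims=True)
    return x / norm

def numerical_gx(x, gz, eps=1e-6):
    # Central-difference estimate of d(sum(gz * z)) / dx, one element at a time.
    gx = np.zeros_like(x)
    for idx in np.ndindex(x.shape):
        xp = x.copy(); xp[idx] += eps
        xm = x.copy(); xm[idx] -= eps
        gx[idx] = np.sum(gz * (forward(xp) - forward(xm))) / (2 * eps)
    return gx

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
gz = rng.standard_normal((4, 3))

print(numerical_gx(x, gz))
print(-gz * x / np.linalg.norm(x, axis=1, keepdims=True) ** 2)  # the guess above

If the analytic expression is right, the two printed arrays should agree to several decimal places.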

1 Answer


The right answer is

gx = np.divide(gz, self._norm)
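Note that this treats self._norm as a constant with respect to x. If the dependence of the norm on x is also backpropagated, the full Jacobian-vector product of z = x / ||x|| gains an extra projection term. A sketch of that variant, assuming the same forward pass and cached self._norm as in the question:

def backward(self, inputs, grad_outputs):
    x, = inputs
    gz, = grad_outputs
    z = x / self._norm
    # Full gradient of z = x / ||x||: gz / ||x|| minus the component of gz along z.
    gx = (gz - z * np.sum(gz * z, axis=1, keepdims=True)) / self._norm
    return gx,

The extra term subtracts the part of gz that points along z, accounting for how the norm itself changes when x changes.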