
MXNet and TensorFlow both claim to have an auto-differentiation feature.

In MXNet, I need to define the backward pass when creating a new op (like a loss function), but not in TensorFlow.

As I understand it, auto-differentiation means I don't need to write the backward pass myself. So, does MXNet actually have auto-differentiation?

Zehao Shi

1 Answer


Yes, MXNet has autograd.

Here is a tutorial: http://gluon.mxnet.io/chapter01_crashcourse/autograd.html

Indhu Bharathi