
Can anyone tell me how backpropagation is done in Keras? I read that it is really easy in Torch and complex in Caffe, but I can't find anything about doing it with Keras. I am implementing my own layers in Keras (I'm a very new beginner) and would like to know how to do the backward propagation.

Thank you in advance

Tassou

1 Answer


You simply don't. (Late edit: except when you are writing custom training loops, which is only for advanced uses.)

Keras does backpropagation automatically. There's absolutely nothing you need to do for it except train the model with one of the fit methods.
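To illustrate, a minimal sketch using `tf.keras` (the layer sizes and random data are just for demonstration): calling `fit` is the whole story, the gradient computation and weight updates happen inside it.

```python
import numpy as np
from tensorflow import keras

# A tiny model; nothing backpropagation-specific is written anywhere.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# Dummy data just to demonstrate training.
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

# Gradients are computed and weights updated inside fit().
model.fit(x, y, epochs=1, verbose=0)
```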

You just need to take care of a few things:

  • The variables you want updated by backpropagation (that means: the weights) must be defined in the custom layer with the self.add_weight() method inside the build method. See "Writing your own Keras layers" in the documentation.
  • All calculations you're doing must use basic operators such as +, -, *, / or backend functions. Since the backend wraps TensorFlow/Theano/CNTK, functions from those libraries are supported as well.
  • The functions you use must be differentiable (backpropagation will fail for non-differentiable operations, for instance functions that return constant results).

This is all you need to have the automatic backpropagation working properly.
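A custom layer following these rules might look like the sketch below (using `tf.keras`; the layer name `ScaledDense` and its shapes are illustrative). The weight is registered in `build` with `add_weight`, and `call` uses only differentiable backend operations, so gradients flow automatically.

```python
import tensorflow as tf
from tensorflow import keras

class ScaledDense(keras.layers.Layer):
    """Illustrative custom layer: a dense projection with no bias."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # Trainable variables must be registered with add_weight
        # so Keras updates them during backpropagation.
        self.kernel = self.add_weight(
            name="kernel",
            shape=(int(input_shape[-1]), self.units),
            initializer="glorot_uniform",
            trainable=True,
        )
        super().build(input_shape)

    def call(self, inputs):
        # Only differentiable ops here; no manual backward pass needed.
        return tf.matmul(inputs, self.kernel)
```

Nothing else is required: once such a layer is placed in a model and the model is compiled and fit, its kernel is trained like any built-in layer's weights.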

If your layer doesn't have trainable weights, you don't need a custom layer; create a Lambda layer instead (calculations only, no trainable weights).
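For example, a weight-free calculation can be wrapped in a Lambda layer like this (the particular function, squaring twice the input, is just an illustration):

```python
from tensorflow import keras
from tensorflow.keras import backend as K

# A fixed, weight-free calculation: (2x)^2, built from backend ops only.
model = keras.Sequential([
    keras.layers.Lambda(lambda x: K.square(2 * x), input_shape=(3,)),
])
```

Gradients still flow through a Lambda layer; it simply has no weights of its own to update.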

Daniel Möller
  • Thank you very much for this quick answer Daniel! Could you maybe show me how to call this function? – Tassou Nov 21 '17 at 15:54
  • 2
    What function? Backpropagation? It simply happens automatically when you `fit` your model to some data. The method for training is `fit` in a keras `model`. – Daniel Möller Nov 21 '17 at 15:56
  • @DanielMöller, I have a similar problem: I want to binarize (quantize) gradients at each backpropagation step. Is there any way I could achieve that in Keras? In TensorFlow it is possible, I believe. – Eliethesaiyan Dec 23 '18 at 11:16
  • 2
    You need a custom optimizer. I suggest you copy the code from keras source for SGD and update it. – Daniel Möller Dec 25 '18 at 20:10
  • 1
    Update for @Eliethesaiyan, you can also use eager mode on and create a custom training loop: https://www.tensorflow.org/tutorials/customization/custom_training_walkthrough – Daniel Möller Oct 10 '19 at 13:58
  • @DanielMöller , thanks, it seems to have been updated now – Eliethesaiyan Nov 08 '19 at 06:45
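The custom-training-loop route mentioned in the comments can be sketched as follows (assuming TF 2.x eager mode; binarizing gradients with `tf.sign` is one illustrative quantization, not the only choice):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.layers.Dense(1, input_shape=(2,))])
optimizer = keras.optimizers.SGD(learning_rate=0.01)
loss_fn = keras.losses.MeanSquaredError()

x = np.random.rand(16, 2).astype("float32")
y = np.random.rand(16, 1).astype("float32")

# One manual training step: record the forward pass, get the gradients,
# modify them, then apply them yourself.
with tf.GradientTape() as tape:
    pred = model(x, training=True)
    loss = loss_fn(y, pred)

grads = tape.gradient(loss, model.trainable_variables)
# Illustrative binarization: keep only the sign of each gradient.
binarized = [tf.sign(g) for g in grads]
optimizer.apply_gradients(zip(binarized, model.trainable_variables))
```

This gives full control over the gradients between computation and application, which is what gradient quantization needs.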