
I'm currently trying to set up an LSTM recurrent neural network with Keras (TensorFlow backend). I would like to use variational dropout with MC Dropout on it. I believe variational dropout is already implemented via the "recurrent_dropout" option of the LSTM layer, but I can't find any way to set a "training" flag to true, as one can for a classic Dropout layer.

Nosk

2 Answers


This is quite easy in Keras. First, define a backend function that takes both the model input and the learning phase:

import keras.backend as K

# Backend function: feed the model input plus the learning phase flag,
# and fetch the output of the final layer
f = K.function([model.layers[0].input, K.learning_phase()],
               [model.layers[-1].output])

For a functional API model with multiple inputs/outputs, concatenate the input list with the learning phase flag:

f = K.function(model.inputs + [K.learning_phase()],
               model.outputs)

You can then call this function as f([input, 1]), where the 1 tells Keras to enable the learning phase for that call, so dropout is applied. Calling the function multiple times and combining the predictions gives you an uncertainty estimate.
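For example, a minimal sampling loop might look like this (a sketch, assuming f from above, a prepared input batch x, and an arbitrary number of samples):

import numpy as np

n_samples = 100  # number of stochastic forward passes (arbitrary choice)
preds = np.stack([f([x, 1])[0] for _ in range(n_samples)])  # 1 = learning phase on

mean_pred = preds.mean(axis=0)   # predictive mean
uncertainty = preds.std(axis=0)  # per-output standard deviation as the uncertainty estimate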

Dr. Snoopy
  • Thank you for your answer. It is the workaround that Yarin Gal used for the BayesianRNN implementation. Does it work like the following example? I ran it and it seemed to work, but the Keras documentation is unclear about it... Example: inputs = keras.Input(shape=(10,)) x = keras.layers.LSTM(10, recurrent_dropout=0.5)(inputs, training=True) – Nosk Jan 24 '19 at 04:14
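For reference, a runnable version of the snippet in that comment might look like this (a sketch only: the shape (10, 1) is an illustrative timesteps/features choice, since an LSTM expects 3D input, and the Dense output layer is assumed):

import keras

# training=True keeps recurrent dropout active at inference time as well
inputs = keras.Input(shape=(10, 1))  # (timesteps, features)
x = keras.layers.LSTM(10, recurrent_dropout=0.5)(inputs, training=True)
outputs = keras.layers.Dense(1)(x)
model = keras.Model(inputs, outputs)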

The source code for "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning" (2015) is located at https://github.com/yaringal/DropoutUncertaintyExps/blob/master/net/net.py. It also uses Keras, and the code is quite easy to understand. The Dropout layers are called via the functional API rather than the Sequential API so that the training parameter can be passed. This is a different approach from Matias's suggestion:

# training=True keeps dropout active at prediction time
inter = Dropout(dropout_rate)(inter, training=True)
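In context, that line might sit in a model like the following (a sketch with assumed layer sizes and dropout rate, not the exact code from the repository):

from keras.layers import Input, Dense, Dropout
from keras.models import Model

dropout_rate = 0.5
inputs = Input(shape=(20,))
inter = Dense(50, activation='relu')(inputs)
# repeated predict() calls now give stochastic outputs that can be
# combined into mean and uncertainty estimates
inter = Dropout(dropout_rate)(inter, training=True)
outputs = Dense(1)(inter)
model = Model(inputs, outputs)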
Tom S