
I get the following error for the code snippet below:

You must feed a value for placeholder tensor 'bidirectional_1/keras_learning_phase' with dtype bool

If I add the dropout layer model.add(Dropout(dropout)) (commented out in the snippet below), it works. Does anyone know why? The backend is TensorFlow; the Keras version is 2.0.1.

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, Bidirectional, Merge  # Merge is the legacy (Keras 1) layer, still importable in Keras 2.0.x

def prep_model1(embedding_layer1, embedding_layer2, dropout=0.5):
    # Branch 1: embedding followed by a bidirectional LSTM
    model0 = Sequential()
    model0.add(embedding_layer1)
    model0.add(Bidirectional(LSTM(128, return_sequences=False, dropout=dropout)))

    # Branch 2: same structure with the second embedding
    model1 = Sequential()
    model1.add(embedding_layer2)
    model1.add(Bidirectional(LSTM(128, return_sequences=False, dropout=dropout)))

    # Concatenate the two branch outputs and classify
    model = Sequential()
    model.add(Merge([model0, model1], mode='concat', concat_axis=1))
    #model.add(Dropout(dropout))
    model.add(Dense(1, activation='sigmoid'))

    return model
wolfshow

1 Answer


Try to import K and set the learning phase before constructing your model:

from keras import backend as K

K.set_learning_phase(1)  # set the learning phase

From this issue
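
For example, with prep_model1 from the question, a minimal sketch of the fix in context (assuming Keras 2.0.x on the TensorFlow backend; the embedding-layer sizes and the random data below are placeholder assumptions, not from the question):

    import numpy as np
    from keras import backend as K
    from keras.layers import Embedding

    # Set the learning phase BEFORE any layers or models are built,
    # so the 'keras_learning_phase' placeholder is never left unfed.
    K.set_learning_phase(1)

    # Placeholder embedding layers standing in for embedding_layer1/2
    embedding_layer1 = Embedding(input_dim=1000, output_dim=64, input_length=20)
    embedding_layer2 = Embedding(input_dim=1000, output_dim=64, input_length=20)

    model = prep_model1(embedding_layer1, embedding_layer2, dropout=0.5)
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

    # Dummy data: one integer-encoded input per branch, plus binary labels
    X1 = np.random.randint(0, 1000, size=(32, 20))
    X2 = np.random.randint(0, 1000, size=(32, 20))
    y = np.random.randint(0, 2, size=(32, 1))
    model.fit([X1, X2], y, epochs=1, batch_size=8)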

holdenlee
vega
Important: use it before constructing the model. Also for BatchNorm. – ikamen Jan 27 '18 at 17:47

It is stated [here](https://github.com/tensorflow/tensorflow/issues/11336#issuecomment-315898065) that: _The "learning phase" is a flag which indicates training/inference. It is set to 1 when using e.g. `fit` and to 0 when using e.g. `predict`. `K.set_learning_phase(False)` sets the "learning phase" to be always 0, i.e. `fit` will have the model behave in inference mode (e.g. no dropout and BatchNorm behavior set to inference)._ – tuomastik Apr 11 '18 at 12:46
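
A quick way to see the behavior the second comment describes, as a minimal sketch (assuming Keras 2.0.x on the TensorFlow 1.x backend; the toy layer size and data are arbitrary):

    import numpy as np
    from keras import backend as K
    from keras.layers import Input, Dropout
    from keras.models import Model

    # Hard-wire the learning phase to "training" before building anything
    K.set_learning_phase(1)

    inp = Input(shape=(4,))
    out = Dropout(0.5)(inp)
    m = Model(inputs=inp, outputs=out)

    x = np.ones((1, 4))
    # Because the learning phase is fixed at 1, dropout stays active even in predict():
    # roughly half the entries are zeroed and the survivors are scaled by 1/(1 - 0.5) = 2.
    print(m.predict(x))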