
Does anyone know the default activation function used in the recurrent layers in Keras? https://keras.io/layers/recurrent/

It says the default activation function is linear, but what about the default recurrent activation function? Nothing is mentioned about that. Any help would be highly appreciated. Thanks in advance.

Kiran Baktha

2 Answers


Keras Recurrent is an abstract class for recurrent layers. In Keras 2.0 all default activations are linear for all implemented RNNs (LSTM, GRU, and SimpleRNN); a quick way to check the defaults on your own install is sketched after the list below. In previous versions you had:

  • linear for SimpleRNN,
  • tanh for LSTM and GRU.
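
If you want to verify what your own installed version actually uses, a minimal sketch along these lines (assuming a Keras 2.x install, where each layer serializes its activations by name in its config) prints the defaults:

    from keras.layers import SimpleRNN, LSTM, GRU

    # Build each recurrent layer with its defaults and read the serialized
    # config to see which activation (and recurrent_activation, for the
    # layers that have one) the installed Keras version actually uses.
    for layer_cls in (SimpleRNN, LSTM, GRU):
        cfg = layer_cls(units=4).get_config()
        print(layer_cls.__name__,
              "activation:", cfg["activation"],
              "| recurrent_activation:", cfg.get("recurrent_activation", "n/a"))
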
Marcin Możejko

https://github.com/keras-team/keras/blob/master/keras/layers/recurrent.py#L2081

It mentions tanh here for version 2.3.0 :-)

Raghav
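
As a side note, since the defaults discussed above differ between Keras versions, one way to avoid depending on them is to pass the activations explicitly. A small sketch (the 32-unit size is just a placeholder):

    from keras.layers import LSTM, GRU

    # Pass both activations explicitly so the layers behave the same
    # regardless of which defaults the installed Keras version ships with.
    lstm = LSTM(32, activation='tanh', recurrent_activation='sigmoid')
    gru = GRU(32, activation='tanh', recurrent_activation='sigmoid')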