
What is the difference between the Dropout layer and the dropout and recurrent_dropout parameters in Keras? Do they all serve the same purpose?

Example:

from tensorflow.keras.layers import Dropout, LSTM

model.add(Dropout(0.2))  # standalone layer
model.add(LSTM(100, dropout=0.2, recurrent_dropout=0.2))  # parameters
Nandini Matam

1 Answer


Yes, they serve the same purpose: randomly zeroing units during training to regularize the network. The dropout parameter applies the mask to the layer's inputs, before that layer's linear transformations (multiplication by the weights and addition of the bias). A standalone Dropout layer is applied to the output of the preceding layer, so it can also be placed before an activation layer.

recurrent_dropout has the same functionality but acts in a different direction: whereas the usual dropouts sit between input and output, recurrent_dropout is applied to the recurrent state passed between timesteps.
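To make the distinction concrete, here is a minimal NumPy sketch of the masking idea (an illustration of inverted dropout, not Keras's actual internals): one mask is applied to the inputs, as the Dropout layer or the dropout parameter would do, and a separate mask is applied to the recurrent state carried between timesteps, as recurrent_dropout would do.

```python
import numpy as np

def dropout_mask(shape, rate, rng):
    # Inverted dropout: zero each unit with probability `rate` and
    # scale the survivors by 1/(1-rate) so the expected value is unchanged.
    keep = (rng.random(shape) >= rate).astype(float)
    return keep / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones((4, 8))   # a batch of 4 samples with 8 input features
h = np.zeros((4, 8))  # the recurrent (hidden) state

# Dropout layer / `dropout` parameter: mask applied to the inputs x
x_mask = dropout_mask(x.shape, rate=0.2, rng=rng)
x_dropped = x * x_mask

# `recurrent_dropout`: a separate mask applied to the state h
# that is carried from one timestep to the next
h_mask = dropout_mask(h.shape, rate=0.2, rng=rng)
h_dropped = h * h_mask
```

With rate=0.2, surviving entries of each mask are scaled to 1/0.8 = 1.25 and dropped entries are 0, so the expected activation magnitude is preserved during training.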

newlearnershiv