
Whenever I see neural networks implemented in Torch's nn package, they only plug modules together. For example, there is a Sequencer stack made of LookupTable, SplitTable, FastLSTM, Linear, and LogSoftMax. Why don't people use activation functions in between, such as Tanh/Sigmoid/ReLU?
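For context, a stack like the one I mean would look roughly like this with the rnn package (a minimal sketch; the sizes and variable names are made up for illustration):

    require 'nn'
    require 'rnn'

    -- hypothetical sizes, purely for illustration
    local vocabSize, embedSize, hiddenSize, outputSize = 10000, 128, 256, 10

    local model = nn.Sequential()
    model:add(nn.LookupTable(vocabSize, embedSize))              -- word ids -> embeddings
    model:add(nn.SplitTable(1))                                  -- split the sequence into a table of time steps
    model:add(nn.Sequencer(nn.FastLSTM(embedSize, hiddenSize)))
    model:add(nn.Sequencer(nn.Linear(hiddenSize, outputSize)))
    model:add(nn.Sequencer(nn.LogSoftMax()))                     -- no explicit Tanh/Sigmoid/ReLU anywhere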

hypnoticpoisons

1 Answer


Do you have an example? Typically, ReLU or Tanh are used between layers.
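As a minimal sketch (the layer sizes are made up), this is where they usually go in a plain feed-forward stack:

    require 'nn'

    local inputSize, hiddenSize, outputSize = 100, 50, 10  -- hypothetical sizes

    local mlp = nn.Sequential()
    mlp:add(nn.Linear(inputSize, hiddenSize))
    mlp:add(nn.ReLU())                          -- nonlinearity between the two Linear layers
    mlp:add(nn.Linear(hiddenSize, outputSize))
    mlp:add(nn.LogSoftMax())

Without the nn.ReLU() (or nn.Tanh()) in the middle, the two Linear layers would collapse into a single linear transformation.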

You would not use them between table-manipulation modules and the like, because those are not 'real' neural network layers with parameters. Also note that an LSTM module such as FastLSTM already applies sigmoid and tanh nonlinearities internally in its gates, so no extra activation is needed after it.

Sander