
In the PyBrain LSTM layer, there are these buffers that are used to store values.

    'bufferlist': [('ingate', 20),
                   ('outgate', 20),
                   ('forgetgate', 20),
                   ('ingatex', 20),
                   ('outgatex', 20),
                   ('forgetgatex', 20),
                   ('state', 20),
                   ('ingateError', 20),
                   ('outgateError', 20),
                   ('forgetgateError', 20),
                   ('stateError', 20),
                   ('inputbuffer', 80),
                   ('inputerror', 80),
                   ('outputbuffer', 20),
                   ('outputerror', 20)]

Could anyone explain what these variables are for? I am trying to get the activation of an LSTM layer. Which variable should I use?


1 Answer


The activation is in 'outputbuffer'.

Regarding what the variables are for (your question is a little unclear), the easiest route is to read the original LSTM paper. If you mean their specific usage in the implementation rather than their role in the model, inspect the LSTMLayer source, which uses all of these variables.

Most variables are named exactly as in the paper. If you understand the concepts, it's quite straightforward. The only thing to add is that the gates ending in 'x' (ingatex, outgatex, forgetgatex) are the values calculated from the peephole connections, while (ingate, outgate, forgetgate) are the total gate values.
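To make the naming concrete, here is a minimal pure-Python sketch of one peephole-LSTM cell step using the same variable names. It follows the description above (the '...x' values hold the peephole contributions, the plain names hold the total gate values); the function name and weight parameters are illustrative, not PyBrain's actual implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative sketch of one LSTM cell step with peephole connections,
# mirroring the buffer naming discussed above (not PyBrain's real code).
def lstm_cell_step(net_in, net_ingate, net_forgetgate, net_outgate,
                   state, w_in_peep, w_forget_peep, w_out_peep):
    # "...x" values: contributions from the peephole connections
    ingatex = w_in_peep * state
    forgetgatex = w_forget_peep * state
    # total gate values: ordinary net input plus the peephole term
    ingate = sigmoid(net_ingate + ingatex)
    forgetgate = sigmoid(net_forgetgate + forgetgatex)
    # cell state update
    state = forgetgate * state + ingate * math.tanh(net_in)
    # the output gate peeks at the updated state
    outgatex = w_out_peep * state
    outgate = sigmoid(net_outgate + outgatex)
    output = outgate * math.tanh(state)
    return output, state
```
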

runDOSrun
    from http://stackoverflow.com/questions/12436311/activation-values-for-all-nodes-in-a-pybrain-network it is shown that the activation is stored in outputbuffer. Are you certain of your answer? – dnth Dec 06 '14 at 06:50
  • You're right. Thanks for the heads up! Edited the answer. – runDOSrun Dec 06 '14 at 13:52