Hi, I have a bidirectional LSTM layer:
from tensorflow.keras.layers import Layer, LSTM, Bidirectional

class BiDirLSTMInput(Layer):
    def __init__(self):
        super().__init__()
        self.bidir_lstm = Bidirectional(
            LSTM(32, return_sequences=True, return_state=True)
        )

    def call(self, input):
        # The Bidirectional wrapper returns the output sequence plus four state tensors
        o, h1, h2, c1, c2 = self.bidir_lstm(input)
        # Only h1 and h2 are returned; the other tensors are unused
        return [h1, h2]
As you can see, I am only consuming the hidden states from the LSTM (and not the cell states).
Is that the reason I am getting the following warning?
WARNING:tensorflow:Gradients do not exist for variables (backward layer):
- lstm_cell_2/kernel:0
- lstm_cell_2/recurrent_kernel:0
- lstm_cell_2/bias:0
Ignoring this doesn't seem right to me. How should I deal with this warning?
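For reference, here is a minimal sketch to inspect what the Bidirectional wrapper returns (the batch/timestep/feature sizes are just for illustration):

import tensorflow as tf

# Dummy input: (batch, timesteps, features)
x = tf.random.normal((2, 5, 8))

bidir = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(32, return_sequences=True, return_state=True)
)

# Per the Keras docs, the output sequence comes first, then the forward
# states, then the backward states:
# [output, forward_h, forward_c, backward_h, backward_c]
outputs = bidir(x)
for t in outputs:
    print(t.shape)
# (2, 5, 64), (2, 32), (2, 32), (2, 32), (2, 32)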