
I want to use tfa.seq2seq.BahdanauAttention with the functional API of tf.keras. I have looked at the example in tensorflow/nmt/attention_model.py, but I couldn't figure out how to adapt it to the functional API.

So I would like to use tfa.seq2seq.BahdanauAttention for a lipreading task, something like this:


    # Using the tf.keras functional API
    encoder_out = ...  # a tensor of shape (batch_size, time_steps, units)
    decoder_out = ...  # a tensor of shape (batch_size, time_steps, units)
    attn_out = attention_mechanism()(encoder_out, decoder_out)  # need help figuring this out
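
If it helps frame what I'm after, below is a rough sketch of what I mean using the built-in tf.keras.layers.AdditiveAttention layer (Keras's own additive, i.e. Bahdanau-style, attention) instead of the tfa class. The time steps and units are placeholder values I picked just for illustration, and the concatenate-then-project step is only my rough reading of what the nmt example does after attention. I would like the equivalent of this, but with tfa.seq2seq.BahdanauAttention:

    import tensorflow as tf

    time_steps, units = 75, 256  # placeholder values for illustration

    encoder_out = tf.keras.Input(shape=(time_steps, units), name="encoder_out")
    decoder_out = tf.keras.Input(shape=(time_steps, units), name="decoder_out")

    # AdditiveAttention takes [query, value]; here the decoder states are the
    # queries and the encoder states are the values (and keys).
    context = tf.keras.layers.AdditiveAttention()([decoder_out, encoder_out])

    # Concatenate the context vectors with the decoder states and project,
    # roughly as the nmt example does after computing attention.
    concat = tf.keras.layers.Concatenate(axis=-1)([context, decoder_out])
    attn_out = tf.keras.layers.Dense(units, activation="tanh")(concat)

    model = tf.keras.Model(inputs=[encoder_out, decoder_out], outputs=attn_out)
    model.summary()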

Thanks in advance.

Manideep
