
I've recently been working on some code written in TensorFlow 1.0.1 and I want to make it run on TensorFlow 2. I am not very familiar with seq2seq, and I don't know what the TF 2 equivalent of the following call is. Thank you very much.

# TF 1.x: prepare_attention() returns attention keys, values,
# a score function, and a construct function for the decoder.
(attention_keys,
 attention_values,
 attention_score_fn,
 attention_construct_fn) = tf.contrib.seq2seq.prepare_attention(
    attention_states=attention_states,
    attention_option="bahdanau",
    num_units=self.decoder_hidden_units,
)

2 Answers


As per this GitHub comment, tf.contrib.seq2seq.prepare_attention() was replaced by tf.contrib.seq2seq.DynamicAttentionWrapper.

As per this GitHub TensorFlow commit, DynamicAttentionWrapper has been renamed to AttentionWrapper.

So, in TF 1.15, the function equivalent to tf.contrib.seq2seq.prepare_attention() is tf.contrib.seq2seq.AttentionWrapper.

The equivalent of tf.contrib.seq2seq.AttentionWrapper in TensorFlow 2.x is tfa.seq2seq.AttentionWrapper.

Please refer to this TensorFlow documentation for more information.
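
In TF 2.x, the four values that prepare_attention() returned (keys, values, score function, construct function) are folded into an attention mechanism object plus the wrapper. Below is a minimal sketch of the equivalent construction, assuming TensorFlow Addons is installed; the tensor shapes and the decoder_hidden_units value are illustrative assumptions, not taken from the question.

import tensorflow as tf
import tensorflow_addons as tfa

decoder_hidden_units = 128                        # illustrative size
batch_size, max_time = 4, 10

# Encoder outputs, shape [batch, max_time, num_units], playing the
# role of attention_states in the question's TF 1.x snippet.
attention_states = tf.random.normal(
    [batch_size, max_time, decoder_hidden_units])

# BahdanauAttention replaces attention_option="bahdanau".
attention_mechanism = tfa.seq2seq.BahdanauAttention(
    units=decoder_hidden_units,
    memory=attention_states,
)

# AttentionWrapper bundles the keys/values/score/construct machinery
# that prepare_attention() used to return as four separate values.
decoder_cell = tfa.seq2seq.AttentionWrapper(
    tf.keras.layers.LSTMCell(decoder_hidden_units),
    attention_mechanism,
    attention_layer_size=decoder_hidden_units,
)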


First, install TensorFlow Addons using:

pip install tensorflow-addons

Then import it in your program:

import tensorflow_addons as tfa

And use:

tfa.seq2seq.AttentionWrapper
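
For example, here is a minimal sketch that wraps a Keras RNN cell and runs a single decoder step; all shapes and names are illustrative assumptions:

import tensorflow as tf
import tensorflow_addons as tfa

units, batch, time = 64, 2, 5
memory = tf.random.normal([batch, time, units])   # encoder outputs

attn = tfa.seq2seq.BahdanauAttention(units=units, memory=memory)
cell = tfa.seq2seq.AttentionWrapper(
    tf.keras.layers.GRUCell(units),
    attn,
    attention_layer_size=units,
)

# One decoder step: attends over `memory`, then updates the RNN cell.
state = cell.get_initial_state(batch_size=batch, dtype=tf.float32)
step_input = tf.random.normal([batch, units])
output, state = cell(step_input, state)
print(output.shape)                               # (2, 64)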