
I have a Keras model that trains fine when eager mode is on (TF 2.1.0). One of my features is a string that I need to map to its corresponding vocabulary index. However, with eager execution disabled, I cannot find a neat way to do this.

I was initially using tft.apply_vocabulary, which used to work fine but fails without eager execution. I also tried tf.lookup.StaticVocabularyTable:

table = tf.lookup.StaticVocabularyTable(
    tf.lookup.TextFileInitializer(
        'item_vocab.txt',
        key_dtype=tf.string, key_index=tf.lookup.TextFileIndex.WHOLE_LINE,
        value_dtype=tf.int64, value_index=tf.lookup.TextFileIndex.LINE_NUMBER),
    num_oov_buckets=1)
out = table.lookup(input_strings)

which (with eager mode off) fails with:

tensorflow.python.framework.errors_impl.FailedPreconditionError: Table not initialized. [[{{node transformer/hash_table_Lookup_1/hash_table_Lookup/LookupTableFindV2}}]]

I am able to run the table's initializer inside a tf.compat.v1.Session, but that feels like too much work for such a common task, and it is not TF 2.0 compatible.
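For reference, here is a minimal sketch of that session-based workaround. The vocab file contents ('apple', 'banana', 'cherry') and the input strings are made up for illustration; the key point is running tf.compat.v1.tables_initializer() before any lookup:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Hypothetical vocab file: one item per line, line number = vocab index.
with open('item_vocab.txt', 'w') as f:
    f.write('apple\nbanana\ncherry\n')

init = tf.lookup.TextFileInitializer(
    'item_vocab.txt',
    key_dtype=tf.string, key_index=tf.lookup.TextFileIndex.WHOLE_LINE,
    value_dtype=tf.int64, value_index=tf.lookup.TextFileIndex.LINE_NUMBER)
table = tf.lookup.StaticVocabularyTable(init, num_oov_buckets=1)

input_strings = tf.constant(['banana', 'durian'])  # 'durian' is out of vocab
ids = table.lookup(input_strings)

with tf.compat.v1.Session() as sess:
    # Without this, lookup fails with "Table not initialized".
    sess.run(tf.compat.v1.tables_initializer())
    print(sess.run(ids))  # [1 3]: 'banana' -> line 1, OOV -> bucket id 3
```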

So, how do you map strings to integer indexes from a vocab file without eager execution?

Why not eager?

I have the impression that graph-mode training has wider support (e.g. multi-GPU training) and better performance, so I am trying to make sure my code works with eager mode disabled, so that I can eventually turn it off once I am done developing. Is that a sensible goal?

Milad Shahidi
  • If you have an answer by now mind posting it? – figs_and_nuts Apr 26 '20 at 18:18
  • I couldn't do this outside eager mode. I'm still using apply_vocabulary. One possible alternative that I haven't tried yet is to use feature_columns and create an embedding feature column (instead of Keras embedding layer). That will take a string and return the corresponding embedding (as opposed to Keras embedding layer which expects an integer index as its input) – Milad Shahidi Apr 27 '20 at 19:59
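  • The feature-column alternative mentioned in the comment above is untried there; a minimal sketch of what it might look like in graph mode (vocab file contents and feature name 'item' are invented for illustration):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Hypothetical vocab file: one item per line.
with open('item_vocab.txt', 'w') as f:
    f.write('apple\nbanana\ncherry\n')

# Categorical column backed by the vocab file, with an embedding on top,
# so raw strings go in and embedding vectors come out.
cat_col = tf.feature_column.categorical_column_with_vocabulary_file(
    key='item', vocabulary_file='item_vocab.txt', num_oov_buckets=1)
emb_col = tf.feature_column.embedding_column(cat_col, dimension=4)

features = {'item': tf.constant([['banana'], ['durian']])}
dense = tf.compat.v1.feature_column.input_layer(features, [emb_col])

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())  # embedding weights
    sess.run(tf.compat.v1.tables_initializer())            # vocab lookup table
    print(sess.run(dense).shape)  # (2, 4)
```

Note that the lookup table still has to be initialized; the feature-column API just hides the string-to-index step rather than removing it.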

0 Answers