I got the following warning:
94: UserWarning: Converting sparse IndexedSlices to a dense Tensor with 1200012120 elements. This may consume a large amount of memory.
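As far as I can tell, the warning comes from TensorFlow densifying the sparse gradient of the embedding lookup. Here is a minimal sketch of the mechanism as I understand it, assuming TF 1.x graph mode (the version implied by the keras imports below) and hypothetical sizes chosen to roughly match the element count above:

import tensorflow as tf

# Hypothetical sizes: ~24M vocabulary rows x 50 dims ~= 1.2e9 elements,
# roughly the element count reported in the warning.
emb = tf.get_variable("emb", shape=[24000000, 50])
ids = tf.constant([1, 2, 3])
rows = tf.gather(emb, ids)           # what an Embedding layer does internally
loss = tf.reduce_sum(rows)
grad = tf.gradients(loss, [emb])[0]  # the gradient is a sparse tf.IndexedSlices
dense = tf.convert_to_tensor(grad)   # forcing it dense emits the UserWarning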
The warning is raised by the following code:
from wordbatch.extractors import WordSeq
import wordbatch
from keras.layers import Input, Embedding
...
wb = wordbatch.WordBatch(normalize_text, extractor=(WordSeq, {"seq_maxlen": MAX_NAME_SEQ}), procs=NUM_PROCESSOR)
wb.dictionary_freeze = True
full_df["ws_name"] = wb.fit_transform(full_df["name"])
...
name = Input(shape=[MAX_NAME_SEQ], name="name")  # WordSeq pads every sequence to MAX_NAME_SEQ
emb_name = Embedding(MAX_TEXT, 50)(name)
...
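The remaining elided lines build the rest of the model; roughly, the embedded sequence feeds a GRU and a dense head like this (a sketch with illustrative layer sizes, not the exact code):

from keras.layers import GRU, Dense
from keras.models import Model

# Illustrative continuation: a small GRU over the embedded name sequence,
# followed by a dense regression head.
rnn_name = GRU(16)(emb_name)
output = Dense(1)(rnn_name)
model = Model(inputs=[name], outputs=output)
model.compile(loss="mse", optimizer="adam")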
That is, I feed the WordSeq output (from WordBatch) into the Embedding layer of a GRU network. How should I modify the code so that the embedding gradient stays sparse instead of being converted to a dense tensor?