With pretrained embeddings, we can pass them as weights to Keras' Embedding layer. To use multiple pretrained embeddings, would specifying multiple Embedding layers be suitable? i.e.

from keras.models import Sequential
from keras.layers import Embedding

model = Sequential()

# Two frozen Embedding layers, each loaded with a different pretrained matrix
embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

model.add(embedding_layer1)
model.add(embedding_layer2)

Stacking them sequentially like this seems to merge them into a single representation, which is not what I am after.

dter

2 Answers

I have come across the same issue. Is it because the Keras Embedding layer internally uses some kind of object (let's call it x_object) that gets initialized in the Keras backend's global session K? That would explain why the second embedding layer throws an exception saying the x_object name already exists in the graph and cannot be added again.
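
If the cause really is a duplicate-name collision in the backend graph, one untested sketch of a workaround would be to give each layer an explicit, unique name. The name= values below are made up for illustration; the other variables come from the question.

from keras.layers import Embedding

# Hypothetical workaround, assuming the failure is a duplicate-name
# collision: give each Embedding layer an explicit, unique name.
embedding_layer1 = Embedding(len(word_index) + 1, EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             trainable=False, name='pretrained_emb_1')
embedding_layer2 = Embedding(len(word_index) + 1, EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             trainable=False, name='pretrained_emb_2')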

If you want to use multiple embedding layers in a model, the answer is in this thread: Multiple Embedding layers for Keras Sequential model.
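
In case the link goes stale, here is a minimal sketch of the idea from that thread: route the same input through both embeddings with the functional API and concatenate the results, instead of stacking them in a Sequential model. Variable names (word_index, EMBEDDING_DIM, embedding_matrix_1, etc.) are taken from the question.

from keras.layers import Input, Embedding, Concatenate
from keras.models import Model

# The same tokenized input feeds both frozen pretrained embeddings
# in parallel, rather than one Embedding feeding the other.
inputs = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
emb1 = Embedding(len(word_index) + 1, EMBEDDING_DIM,
                 weights=[embedding_matrix_1], trainable=False)(inputs)
emb2 = Embedding(len(word_index) + 1, EMBEDDING_DIM,
                 weights=[embedding_matrix_2], trainable=False)(inputs)
# Concatenate along the feature axis: (batch, seq_len, 2 * EMBEDDING_DIM)
merged = Concatenate()([emb1, emb2])
model = Model(inputs=inputs, outputs=merged)  # attach downstream layers here

Concatenation keeps the two embedding spaces intact side by side; summing or averaging them would force both matrices into a single shared space, which is exactly what the question wants to avoid.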

brain pinky