
I have seen code like the following:

from keras.layers import Embedding

embed_word = Embedding(params['word_voc_size'], params['embed_dim'], weights=[word_embed_matrix],
                       input_length=params['word_max_size'], trainable=False, mask_zero=True)

But when I look at the documentation on the Keras website ([https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/][1]), I don't see a weights argument:

keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)

So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?

My Keras version is 2.1.5. I hope someone can help me.


1 Answer

Keras' Embedding layer subclasses the Layer class (every Keras layer does this). The weights attribute is implemented in this base class, so every subclass allows setting it through a weights argument. This is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.

You can check the base layer implementation here (Ctrl + F for 'weight').
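
As a quick illustration of what this answer describes, here is a minimal sketch (assuming the standalone Keras 2.x API and a small made-up embedding matrix) showing that the weights keyword is accepted by Embedding even though its own signature never lists it, because the base Layer class consumes it and uses it as the layer's initial weights:

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

# Hypothetical pretrained embedding matrix: 10 words, 4 dimensions
word_embed_matrix = np.random.rand(10, 4)

model = Sequential()
# `weights` is not in the Embedding signature; the base Layer class picks it up
model.add(Embedding(input_dim=10, output_dim=4, weights=[word_embed_matrix],
                    input_length=5, trainable=False))

# The layer's weights now equal the matrix that was passed in
print(np.allclose(model.layers[0].get_weights()[0], word_embed_matrix))  # True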

    While this is technically true, this cannot be used to initialize the embedding weights anymore, see https://github.com/tensorflow/tensorflow/issues/14392 – Till Brychcy Feb 07 '20 at 10:35
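
For newer tf.keras versions where the weights argument no longer works (as described in the linked issue), a commonly suggested alternative is to pass the pretrained matrix through an initializer instead. A minimal sketch, assuming a tf.keras 2.x API and a hypothetical 10x4 matrix:

import numpy as np
import tensorflow as tf

word_embed_matrix = np.random.rand(10, 4)  # hypothetical pretrained matrix

# Initialize the embedding weights via a Constant initializer instead of `weights`
embed_word = tf.keras.layers.Embedding(
    input_dim=10, output_dim=4,
    embeddings_initializer=tf.keras.initializers.Constant(word_embed_matrix),
    trainable=False, mask_zero=True)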