
In a Keras Sequential model, one can set a layer's weights directly using the set_weights method:

model.layers[n].set_weights([your_weight])
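For reference, a minimal runnable sketch of that Sequential pattern (toy layer sizes and variable names of my own; a Dense layer stores two arrays, kernel and bias, and the shapes must match exactly):

```python
import numpy as np
from tensorflow import keras  # assuming TF 2.x-style Keras

model = keras.Sequential([keras.layers.Dense(3)])
model.build(input_shape=(None, 4))  # build the model so the weight arrays exist

# Dense holds [kernel, bias]; shapes must match the layer exactly
kernel = np.full((4, 3), 0.5, dtype="float32")
bias = np.zeros(3, dtype="float32")
model.layers[0].set_weights([kernel, bias])
```
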

However, I am facing a problem when trying to set the weights of a layer built with the functional API.

Here is the code snippet:

emb = Embedding(max_words, embedding_dim, input_length=maxlen)(merge_ip)
         #skipping some lines
         .
         .
emb.set_weights([some_weight_matrix])

This throws the error

AttributeError: 'Tensor' object has no attribute 'set_weights'

which I think is because emb is a Tensor object.

I am wondering how to set the weights properly in my model.

g_p

2 Answers


If you want to set the weights of an Embedding layer, you can pass them to the constructor like this:

from keras.layers import Embedding

embedding_layer = Embedding(len(word_index) + 1,
                            EMBEDDING_DIM,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)

https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html

You can then apply the layer to merge_ip:

x = embedding_layer(merge_ip)
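A runnable sketch of this constructor approach, with toy stand-ins for `len(word_index) + 1` and `EMBEDDING_DIM`. Note that newer Keras versions (Keras 3) no longer accept the `weights=` and `input_length=` constructor arguments shown above; passing an initializer callable that returns the matrix achieves the same effect and works in both old and new versions:

```python
import numpy as np
from tensorflow import keras

# Toy stand-ins for len(word_index) + 1 and EMBEDDING_DIM in the answer
vocab_size, embed_dim = 100, 8
embedding_matrix = np.random.rand(vocab_size, embed_dim).astype("float32")

# Same idea as weights=[embedding_matrix]: fix the initial values and freeze them
embedding_layer = keras.layers.Embedding(
    vocab_size,
    embed_dim,
    embeddings_initializer=lambda shape, dtype=None: embedding_matrix,
    trainable=False,
)

inp = keras.Input(shape=(None,), dtype="int32")
x = embedding_layer(inp)       # x is a tensor; the layer object owns the weights
model = keras.Model(inp, x)
```
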
MBT
Alternatively, keep a reference to the layer object and call set_weights on it, not on the tensor it returns:

embed_layer = Embedding(max_words, embedding_dim, input_length=maxlen)
emb = embed_layer(merge_ip)

embed_layer.set_weights([some_weight_matrix])
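A self-contained version of this pattern, with toy values standing in for the question's max_words, embedding_dim, and maxlen. Calling the layer on the input builds it, so set_weights then accepts a matrix of matching shape:

```python
import numpy as np
from tensorflow import keras

# Toy stand-ins for the question's max_words, embedding_dim, maxlen
max_words, embedding_dim, maxlen = 50, 4, 6

merge_ip = keras.Input(shape=(maxlen,), dtype="int32")
embed_layer = keras.layers.Embedding(max_words, embedding_dim)
emb = embed_layer(merge_ip)   # emb is a Tensor; the weights live on embed_layer

# The layer is built now, so a (max_words, embedding_dim) matrix is accepted
some_weight_matrix = np.random.rand(max_words, embedding_dim).astype("float32")
embed_layer.set_weights([some_weight_matrix])
```
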
Jie.Zhou