Good morning everyone. I'm trying to implement a model whose neural-network inputs are based on a trainable vocabulary matrix (each row of the matrix represents a word entry in the vocabulary). I'm using Keras (TensorFlow backend), and I was wondering whether it's possible to define a trainable variable (without adding a custom layer), such that this variable is trained along with the neural network, like a TensorFlow variable. Could you please give a short example of how I can do that? Thanks in advance.
Viewed 1,714 times
1 Answer
0
> The neural network's inputs are based on a trainable vocabulary matrix (each row in the matrix represents a word entry in the vocabulary)

This is the definition of a word embedding. There is already an embedding layer in Keras; you don't have to reimplement it.
You can find an easy example of how to use it here.
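A minimal sketch of using the built-in layer (the vocabulary size, embedding dimension, pooling layer, and output head below are illustrative assumptions, written against `tf.keras`):

```python
import numpy as np
from tensorflow import keras

# Hypothetical sizes: a 200-word vocabulary embedded into 32 dimensions.
model = keras.Sequential([
    keras.layers.Embedding(input_dim=200, output_dim=32),  # trainable by default
    keras.layers.GlobalAveragePooling1D(),                 # average the word vectors
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# One batch of 4 sequences of 50 word indices.
x = np.random.randint(0, 200, size=(4, 50))
print(model.predict(x).shape)  # (4, 1)
```

The embedding matrix is an ordinary weight of the model, so `fit` updates it like every other parameter.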

Ghilas BELHADJ
- Thanks for responding. The problem is not the embedding layer itself: I could use a pretrained embedding, but I have to update it while training the model. Also, different entries (embedded words) will be weighted during training. It's a bit complicated; anyway, I'll read the example you gave and then comment here. Thanks again. – Belkacem Thiziri Feb 07 '18 at 11:15
- With an Embedding layer, you can set the trained weights with `model.get_layer('embedding_layer_name').set_weights(trained_weights)`. Whether you want it to be trainable or not is set with `model.get_layer('embedding_layer_name').trainable = True # (or False)`. This must be done before `model.compile()`. – Daniel Möller Feb 07 '18 at 11:26
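The two calls described in this comment can be combined as follows (the layer name, the sizes, and the random stand-in for a real pretrained matrix are all assumptions of this sketch):

```python
import numpy as np
from tensorflow import keras

vocab_size, dim = 200, 32
pretrained = np.random.rand(vocab_size, dim)  # stand-in for real pretrained vectors

model = keras.Sequential([
    keras.layers.Embedding(vocab_size, dim, name="embedding_layer_name"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(1),
])
model.build(input_shape=(None, 50))  # build so the layer has weights to set

emb = model.get_layer("embedding_layer_name")
emb.set_weights([pretrained])   # load the pretrained vectors
emb.trainable = True            # keep updating them during training (False to freeze)
model.compile(optimizer="adam", loss="mse")  # compile AFTER setting trainable
```

Setting `trainable = False` instead would freeze the pretrained vectors while the rest of the model trains.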
- By default, `keras.layers.Embedding` is trainable (it is updated during the training phase). – Ghilas BELHADJ Feb 07 '18 at 11:26
- In fact, I've just read the example that you gave me, Ghilas. It's just part of my problem: after updating the embeddings, I have to create representations of some text (documents and queries in my case) based on these embeddings; these representations will then be fed as input to the NN model, so I don't really see how it can be done. [Here](https://stackoverflow.com/questions/46544329/keras-add-external-trainable-variable-to-graph) is an answer to a similar problem (adding trainable variables); could you please tell me whether it is applicable in my case, and how, if so? – Belkacem Thiziri Feb 07 '18 at 15:53
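The linked answer adds an external variable through a custom layer. One hedged alternative that stays inside stock Keras, matching the "weighted entries" idea from the question, is to store one trainable scalar per word in a second `Embedding(vocab_size, 1)` layer, so both the vectors and the per-word weights are trained without any custom variable (all names and sizes below are illustrative):

```python
import numpy as np
from tensorflow import keras

vocab_size, dim, seq_len = 200, 32, 50  # hypothetical sizes

inp = keras.Input(shape=(seq_len,))
word_vecs = keras.layers.Embedding(vocab_size, dim)(inp)  # (batch, 50, 32)
word_wts = keras.layers.Embedding(vocab_size, 1)(inp)     # one trainable scalar per word
weighted = keras.layers.Multiply()([word_vecs, word_wts]) # broadcasts to (batch, 50, 32)
model = keras.Model(inp, weighted)

x = np.random.randint(0, vocab_size, size=(2, seq_len))
print(model.predict(x).shape)  # (2, 50, 32)
```

Both embedding matrices are ordinary trainable weights, so the optimizer updates the per-word scalars together with the word vectors.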
- @BelkacemThiziri `I have to create representations ... based on these embeddings` Embeddings are already representations. You have to know in advance what kind of operation associates the document representation with the query representation. Say you just have to [concat](https://keras.io/layers/merge/#concatenate) or [add](https://keras.io/layers/merge/#add) them (if they have the same shape); all you have to do is use the Keras layer that is made for that. – Ghilas BELHADJ Feb 07 '18 at 16:13
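A sketch of the concat option from this comment, with a shared embedding for documents and queries (the lengths, sizes, and pooling choice are made-up assumptions):

```python
import numpy as np
from tensorflow import keras

vocab, dim, doc_len, q_len = 200, 32, 50, 10
shared_emb = keras.layers.Embedding(vocab, dim)  # one trainable matrix for both inputs

doc_in = keras.Input(shape=(doc_len,))
q_in = keras.Input(shape=(q_len,))
doc_vec = keras.layers.GlobalAveragePooling1D()(shared_emb(doc_in))  # (batch, 32)
q_vec = keras.layers.GlobalAveragePooling1D()(shared_emb(q_in))      # (batch, 32)

merged = keras.layers.Concatenate()([doc_vec, q_vec])  # (batch, 64)
score = keras.layers.Dense(1, activation="sigmoid")(merged)
model = keras.Model([doc_in, q_in], score)

x_doc = np.random.randint(0, vocab, size=(3, doc_len))
x_q = np.random.randint(0, vocab, size=(3, q_len))
print(model.predict([x_doc, x_q]).shape)  # (3, 1)
```

Using `Add()` instead of `Concatenate()` would require the two pooled representations to have the same shape, which they do here.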
- OK, thanks a lot for your help, @GhilasBELHADJ, that's it! Sorry for all these questions; I'm not really an expert on Keras. – Belkacem Thiziri Feb 08 '18 at 08:27
- Now I have another question: how can I read the embedding layer's output? I tried `e = Embedding(200, 32, input_length=50)`, which corresponds to the embedding of some text, then `d = keras.layers.Average(e[:16])`, which should correspond to the combination of the first 16 embedded words of my text, but it generates an error. I must be missing something! – Belkacem Thiziri Feb 08 '18 at 11:22
- You're trying to use Keras "à la NumPy". `e` is a [`Layer`](https://github.com/keras-team/keras/blob/master/keras/engine/topology.py#L191) and not an [`np.ndarray`](https://docs.scipy.org/doc/numpy-1.12.0/reference/generated/numpy.ndarray.html), so slicing is not allowed here. I think it can be done using a custom layer; see [this related issue](https://github.com/keras-team/keras/issues/890#issuecomment-220307384). – Ghilas BELHADJ Feb 08 '18 at 11:35
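A sketch of the custom-operation route suggested here: call the `Embedding` layer on an input tensor first, then slice and average the resulting tensor inside a `Lambda` (the 16-word window and the sizes mirror the earlier comment; using `tf.reduce_mean` for the average is an assumption of this sketch):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

inp = keras.Input(shape=(50,))
e = keras.layers.Embedding(200, 32)(inp)  # call the layer on a tensor; don't slice the layer
# Average the embeddings of the first 16 words of each sequence.
avg16 = keras.layers.Lambda(
    lambda t: tf.reduce_mean(t[:, :16, :], axis=1), output_shape=(32,))(e)
model = keras.Model(inp, avg16)

x = np.random.randint(0, 200, size=(2, 50))
print(model.predict(x).shape)  # (2, 32)
```

The slicing happens on the symbolic tensor `e` (the layer's output), not on the layer object itself, which is what the error below complains about.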
- I tried to follow the example, but it generates an error when I do `keras.layers.Average()(keras.layers.Lambda(lambda e: e[:,:,0:25]))`. The error is: `ValueError: Layer average_7 was called with an input that isn't a symbolic tensor. Received type: … . Full input: […]. All inputs to the layer should be tensors.` – Belkacem Thiziri Feb 08 '18 at 12:35
- I guess you forgot the `(x)` in the example I provided you, where `x` is your layer input. If you're new to Keras, I highly recommend you read [some tutorials](https://keras.io/getting-started/sequential-model-guide/) about it and experiment with some basic sequential models before trying to implement your own. – Ghilas BELHADJ Feb 08 '18 at 15:20
- Thanks @GhilasBELHADJ, you helped me a lot. Yes, I've just started working with Keras; I've already done some tutorials, but there are still a lot of things to learn. Thank you. – Belkacem Thiziri Feb 08 '18 at 15:29