
I am trying to implement an Embedding layer for text classification using a CNN.
Embedding layer
    with tf.device('/cpu:0'), tf.name_scope("embedding"):
        self.W = tf.Variable(
            tf.random_uniform([vocab_size, embedding_size], -1.0, 1.0),
            name="W")
        self.embedded_chars = tf.nn.embedding_lookup(self.W, self.inputTensor)
        self.embedded_chars_expanded = tf.expand_dims(self.embedded_chars, -1)

I couldn't understand how tf.nn.embedding_lookup works.

1 Answer


This function performs parallel lookups on the list of tensors in params.

It is a generalized form of tf.gather. The following example should make the working of tf.gather and tf.nn.embedding_lookup clear.

Suppose you have a rank-1 tensor containing six string values. Let's call it params.

PARAMS

    index  |  0 |  1 |  2 |  3 |  4 |  5
    values | a1 | a2 | a3 | a4 | a5 | a6

Let ids be another tensor of type int32 or int64, holding the indices to look up:

IDS

    [2, 3]

The function then returns the values at those indices in params as another tensor.

In the above case it returns [a3, a4].
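Here is a minimal runnable sketch of that example, assuming TensorFlow 1.x (matching your code); the string values a1..a6 are just the placeholders from the table above:

    import tensorflow as tf

    # The rank-1 params tensor from the table above
    params = tf.constant(["a1", "a2", "a3", "a4", "a5", "a6"])
    # The indices to look up
    ids = tf.constant([2, 3])

    # Both ops pick out the values at the given indices
    lookup = tf.nn.embedding_lookup(params, ids)
    gathered = tf.gather(params, ids)

    with tf.Session() as sess:
        print(sess.run(lookup))    # [b'a3' b'a4']
        print(sess.run(gathered))  # [b'a3' b'a4']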


So in your example, the rows of self.W at the indices held in self.inputTensor are extracted by the tf.nn.embedding_lookup function.
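As a sketch with hypothetical sizes (vocab_size, embedding_size, and the token ids below are made up for illustration): each row of W is the embedding vector of one vocabulary entry, so the lookup turns a batch of token-id sequences into a batch of embedding sequences, and expand_dims adds the channel dimension that tf.nn.conv2d expects:

    import tensorflow as tf

    vocab_size, embedding_size = 5000, 128   # hypothetical sizes
    # A batch of 2 sequences, each 4 token ids long (made-up ids)
    inputTensor = tf.constant([[4, 7, 7, 1],
                               [2, 0, 9, 9]])

    W = tf.Variable(
        tf.random_uniform([vocab_size, embedding_size], -1.0, 1.0), name="W")

    # One embedding row per token id: shape (2, 4, 128)
    embedded_chars = tf.nn.embedding_lookup(W, inputTensor)
    # Add a channels dimension for conv2d: shape (2, 4, 128, 1)
    embedded_chars_expanded = tf.expand_dims(embedded_chars, -1)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(tf.shape(embedded_chars_expanded)))  # [2 4 128 1]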
