
I want to train a label embedding myself (yes, a label embedding, like a word embedding, but the input is a one-hot vector of the label).

I found chainer.links.EmbedID and an example in the official documentation, but it looks like a weight matrix W must be passed in.

How do I train the embedding matrix W so that it can later be used to train another model?

I mean: how do I train an embedding vector representation of a word/label?
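For context on what such a layer computes: an embedding is just a learned matrix W, and multiplying a one-hot vector by W selects one row of it, which is why a layer like EmbedID can take integer IDs directly instead of one-hot vectors. A minimal NumPy sketch of that equivalence (sizes and names here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_labels, embed_dim = 5, 3
# W is the embedding matrix that would be learned during training.
W = rng.standard_normal((n_labels, embed_dim))

label_id = 2
one_hot = np.zeros(n_labels)
one_hot[label_id] = 1.0

# one-hot @ W picks out exactly one row of W,
# i.e. the same thing as indexing W[label_id] directly.
assert np.allclose(one_hot @ W, W[label_id])
```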

machen

1 Answer


You don't need to take two steps (train the embedding, then train another model); you can train the embedding end-to-end. Once you obtain the embed vector from the categorical value, you can connect it to a usual neural network and train on the loss as usual.

Word2vec is one official Chainer example which uses EmbedID:

corochann
  • Must the input follow some order? For instance, must the order of x match the word order of the article, or is there no such constraint when learning the embedding? If training in batches, does shuffling between batches break this order requirement? – machen Mar 15 '18 at 06:01
  • Sorry, I could not understand what you want to know. The ID assignment order for EmbedID can be arbitrary; for example, ID=1 and ID=2 being adjacent does not mean they are close. As for the order of training, I think it depends on what kind of task you are tackling. If you are dealing with an RNN/LSTM, the order within a batch (which word is fed to the network) matters. – corochann Mar 17 '18 at 04:47