
To my understanding, batch (vanilla) gradient descent makes one parameter update using all of the training data. Stochastic gradient descent (SGD) updates the parameters after each training sample, helping the model converge faster at the cost of high fluctuation in the loss function.


Batch (vanilla) gradient descent sets `batch_size=corpus_size`.

SGD sets `batch_size=1`.

And mini-batch gradient descent sets `batch_size=k`, where k is usually 32, 64, 128...
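The three regimes can be sketched with a toy linear-regression training loop in which `batch_size` alone selects between them (illustrative only; the function name, learning rate, and epoch count are my own choices, not from any library):

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.05, epochs=200, seed=0):
    """Fit w for a linear model y ~ X @ w by minimizing squared error,
    making one parameter update per batch of `batch_size` samples."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        order = rng.permutation(n)  # shuffle each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # gradient of mean squared error over this batch
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)
            w -= lr * grad
    return w

# Toy noise-free data generated by y = 3*x0 - 2*x1
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0])

w_batch = gradient_descent(X, y, batch_size=len(X))  # batch GD: one update per epoch
w_sgd   = gradient_descent(X, y, batch_size=1)       # SGD: one update per sample
w_mini  = gradient_descent(X, y, batch_size=32)      # mini-batch of k=32
```

All three recover roughly the same weights on this toy problem; they differ in how many updates they make per epoch and how noisy each update is.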

How does gensim apply SGD or mini-batch gradient descent? It seems that `batch_words` is the equivalent of `batch_size`, but I want to be sure.

Is setting `batch_words=1` in a gensim model equivalent to applying SGD?

Eric Kim

1 Answer


No, `batch_words` in gensim refers to the size of the work-chunks sent to worker threads.

The gensim Word2Vec class updates model parameters after each training micro-example of (context)->(target-word) (where context might be a single word, as in skip-gram, or the mean of several words, as in CBOW).

For example, you can review this optimized w2v_fast_sentence_sg_neg() Cython function for skip-gram with negative sampling, deep inside the Word2Vec training loop:

https://github.com/RaRe-Technologies/gensim/blob/460dc1cb9921817f71b40b412e11a6d413926472/gensim/models/word2vec_inner.pyx#L159

Observe that it considers exactly one target word (the word_index parameter) and one context word (word2_index), and updates both the word-vectors (aka 'projection layer', syn0) and the model's hidden-to-output weights (syn1neg) before it may be called again with the next single (context)->(target-word) pair.
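For illustration only, here is a rough pure-NumPy analogue of that per-pair update; the function name, the `negative_indices` argument, and the hyperparameters here are my own, and the real Cython code differs in many details (precomputed sigmoid table, in-loop negative sampling, etc.):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sg_neg_update(syn0, syn1neg, word_index, word2_index,
                  negative_indices, alpha):
    """One skip-gram negative-sampling update for a single
    (context word -> target word) pair: parameters change
    immediately, before the next pair is processed."""
    l1 = syn0[word2_index]            # input/context word vector
    neu1e = np.zeros_like(l1)         # accumulated error for l1
    # the true target gets label 1; sampled negatives get label 0
    pairs = [(word_index, 1.0)] + [(n, 0.0) for n in negative_indices]
    for target, label in pairs:
        l2 = syn1neg[target]
        g = (label - sigmoid(np.dot(l1, l2))) * alpha
        neu1e += g * l2
        syn1neg[target] += g * l1     # update output weights in place
    syn0[word2_index] += neu1e        # then update the context word vector

# Tiny demo: random vectors, update one (context=2 -> target=5) pair
rng = np.random.default_rng(0)
syn0 = rng.normal(scale=0.1, size=(10, 8))
syn1neg = rng.normal(scale=0.1, size=(10, 8))
score_before = float(syn0[2] @ syn1neg[5])
sg_neg_update(syn0, syn1neg, word_index=5, word2_index=2,
              negative_indices=[], alpha=0.025)
score_after = float(syn0[2] @ syn1neg[5])
```

After the update, the dot product between the context vector and the true target's output weights has increased, which is exactly the per-example "stochastic" step the question asks about.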

gojomo
  • ok, then my follow-up question is, how do I apply SGD or mini-batch gradient descent of size `k` with gensim models? – Eric Kim May 02 '19 at 22:45
  • I'm pretty sure the `gensim` `Word2Vec` approach (closely modeled after the original Mikolov/Google `word2vec.c`) could be fairly described as SGD already. But, it doesn't have any configurably-sized mini-batch option: you'd have to write that yourself. – gojomo May 03 '19 at 00:01