For questions about word embeddings, a language modelling technique in natural language processing. Questions can concern particular methods, such as Word2Vec, GloVe, FastText, etc., or word embeddings and their use in machine learning libraries in general.
Word embeddings are numeric representations of words in a predefined vector space, designed so that words with similar meanings have similar representations. They are used to capture the context of a word within a document, such as its semantic similarity or relation to other words.
Example frameworks: Word2Vec, GloVe, FastText.
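As a minimal sketch of the idea, the snippet below trains Word2Vec embeddings on a toy corpus using the gensim library (the corpus and parameter values are illustrative assumptions; real applications use much larger text collections):

    # Minimal sketch: training word embeddings with gensim's Word2Vec.
    from gensim.models import Word2Vec

    # Each sentence is a list of pre-tokenized words (toy corpus).
    sentences = [
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "sat", "on", "the", "rug"],
        ["cats", "and", "dogs", "are", "pets"],
    ]

    # vector_size sets the dimensionality of the embedding space (gensim >= 4.0).
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, seed=42)

    vec = model.wv["cat"]                        # the 50-dimensional vector for "cat"
    print(model.wv.most_similar("cat", topn=2))  # nearest words by cosine similarity

After training, words that appear in similar contexts end up close together in the vector space, which is what `most_similar` measures.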