I'm doing research on different methods of word embedding.
As far as I understand, Latent Semantic Analysis (LSA) is a way of reducing the dimensionality of a huge matrix built by counting words in documents (possibly normalized with things like tf-idf, but that's not the point here). So it is a one-time operation: after running SVD on your matrix, you have an embedding for every document. Is it possible to vectorize a new document without re-running the SVD on the whole matrix with all the other documents?
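To make the setup concrete, here is a minimal NumPy sketch of the pipeline I described: a term-document count matrix, a one-time truncated SVD, and document embeddings read off from the right singular vectors. The toy corpus, the rank `k = 2`, and the "fold-in" projection at the end (mapping a new count vector through `Σ_k⁻¹ U_kᵀ q`) are my own assumptions for illustration; the fold-in step is exactly what I'm asking whether is valid.

```python
import numpy as np

# Tiny made-up corpus for illustration.
corpus = [
    "cat sat on the mat",
    "dog sat on the log",
    "cats and dogs are pets",
]

# Build the vocabulary and the term-document count matrix A (terms x documents).
vocab = sorted({w for doc in corpus for w in doc.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(corpus)))
for j, doc in enumerate(corpus):
    for w in doc.split():
        A[index[w], j] += 1

# One-time SVD; keep only the top-k singular triplets.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Rows of V_k (= columns of Vt_k) are k-dimensional document embeddings.
doc_embeddings = Vt_k.T  # shape (n_docs, k)

# "Folding in" a new document without redoing the SVD:
# project its raw count vector q with q_hat = Σ_k^{-1} U_k^T q.
new_doc = "the cat and the dog"
q = np.zeros(len(vocab))
for w in new_doc.split():
    if w in index:  # out-of-vocabulary words are simply dropped
        q[index[w]] += 1
q_hat = (U_k.T @ q) / s_k  # k-dimensional embedding of the new document
```

A sanity check on this formula: folding in a column of `A` itself reproduces that document's existing embedding, since `U_kᵀ A = diag(s_k) Vt_k` by orthonormality of `U`.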