
I would like to clarify the relationship between latent Dirichlet allocation (LDA) and the generic task of document clustering.

LDA outputs, for each document, a vector of topic proportions. If my understanding is correct, this is not itself a document clustering. However, we can treat these topic proportions as a feature representation for each document, and then apply any established clustering method to the features produced by the LDA analysis.

Is my understanding correct? Thanks.


1 Answer

Yes, you can treat the output of LDA as features for your documents; this is exactly what Blei, Ng and Jordan did in the paper that introduced LDA. They did it for classification, but for clustering the procedure is the same.

(In machine learning terminology, this use of LDA is called dimensionality reduction because it reduces the feature space's number of dimensions from |V|, the vocabulary size, to some number k of topics selected by the user.)
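
For concreteness, here is a minimal sketch of that pipeline in Python. The library choice (scikit-learn) and all variable names are mine, not from the thread: fit LDA on word counts, take each document's topic proportions as its feature vector, and run k-means on those vectors.

```python
# Minimal sketch of the LDA-then-cluster pipeline described above.
# scikit-learn is an assumption here; the thread names no library.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.cluster import KMeans

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock markets fell sharply today",
    "investors worry about the market",
]

# Bag-of-words counts: one row per document, |V| columns.
counts = CountVectorizer().fit_transform(docs)

# Reduce from |V| dimensions to k topics (k chosen by the user).
k = 2
lda = LatentDirichletAllocation(n_components=k, random_state=0)
theta = lda.fit_transform(counts)  # per-document topic proportions

# Cluster the documents in the k-dimensional topic space.
labels = KMeans(n_clusters=2, random_state=0, n_init=10).fit_predict(theta)
print(labels)
```

Any clustering algorithm that accepts dense feature vectors would work in place of k-means here; the point is only that the k-dimensional topic proportions serve as the document representation.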

Fred Foo
  • But in their paper, they claimed to use the posterior Dirichlet parameter $\gamma(w)$, which differs from the topic proportions discussed here. I agree that the underlying idea is the same in terms of feature reduction, but my concern is why they chose to use $\gamma(w)$, which seems to me not to have as clear an interpretation as the topic proportions. I am very curious about their underlying reasons, but I did not find a clear explanation in the paper. – user785099 Jul 07 '11 at 14:55
    @user: I'm not too familiar with the LDA internals. I suggest you try clustering on the proportions, and if it doesn't work, ask over at [metaoptimize.com](http://metaoptimize.com/qa) what the reasons for this choice are. Post a link here if you do, I'm very interested. – Fred Foo Jul 07 '11 at 15:05
  • The posterior $\gamma(w)$ can be understood as the smoothed mixing proportions of the topics. – Shaohua Li Mar 01 '16 at 03:45