
I am studying the EM algorithm and Gaussian mixture models (GMMs) together.

I don't understand the EM algorithm as described on Wikipedia:

The EM algorithm is used to find the maximum likelihood parameters of a statistical model in cases where the equations cannot be solved directly. Typically these models involve latent variables in addition to unknown parameters and known data observations.

For now, I am curious about what the latent variables are in this context.

That is, either there are missing values among the data, or the model can be formulated more simply by assuming the existence of additional unobserved data points.

Regarding this sentence, would you give a simple example of missing or unobserved data?

Unfortunately, there is an example on Wikipedia, but it is hard for me to understand the concept from it, and I am particularly curious about the hidden data in the Gaussian mixture model.

I think that the means, covariances, and weighting factors (mixture weights) are the unknown parameters.

So what is the hidden data in the Gaussian mixture model?

Or is my idea wrong?
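
To make my question concrete, here is a small sketch of what I mean (this just uses scikit-learn's GaussianMixture as an assumed illustration, not something from my actual setup): it prints the fitted weights, means, and covariances, which I believe are the unknown parameters, and the per-point component probabilities, which I suspect are related to the "hidden data".

```python
# A minimal sketch, assuming scikit-learn is available; purely for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic data drawn from two Gaussians; with real data we would only
# observe X, not which component each point came from.
X = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(100, 1)),
    rng.normal(loc=3.0, scale=1.0, size=(100, 1)),
])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

# What I believe are the unknown parameters:
print("weights:", gmm.weights_)
print("means:", gmm.means_.ravel())
print("covariances:", gmm.covariances_.ravel())

# The per-point component probabilities (responsibilities), which I suspect
# correspond to the unobserved component label of each observation:
print("responsibilities of first 5 points:\n", gmm.predict_proba(X[:5]))
```

Is the hidden data here the (unobserved) component label of each point, which these responsibilities estimate?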

