I have already seen that the Gaussian mixture model is fitted using maximum likelihood estimation. Is there another way to estimate its parameters without using maximum likelihood estimation?
1 Answer
In Gaussian mixture models, parameter estimation is typically carried out with the Expectation-Maximization (EM) algorithm, which is itself a procedure for maximum likelihood estimation, so relying on maximum likelihood is both convenient and theoretically well founded.
For more information on the statistical background, take a look at chapters 2 and 3 of this reference:
McLachlan, Geoffrey J., Sharon X. Lee, and Suren I. Rathnayake. "Finite mixture models." Annual Review of Statistics and Its Application 6 (2019): 355-378.
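To make the role of EM concrete, here is a minimal sketch of maximum likelihood fitting of a two-component, one-dimensional GMM via EM. The synthetic data, the initial values, and the fixed iteration count are all illustrative choices, not part of the answer above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two Gaussian components (means -2 and 3, unit variance).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 200)])

# Initial guesses for the mixture weights, means, and variances.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    dens = w * gaussian_pdf(x[:, None], mu, var)   # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate the parameters from the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", w, "means:", mu, "variances:", var)
```

Each iteration is guaranteed not to decrease the likelihood, which is why EM is the standard workhorse here, but it only climbs to a local maximum from wherever it starts.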
Generally speaking, there are two main problems when fitting a GMM this way:
Convergence of the algorithm to the maximum likelihood solution is not guaranteed in a finite number of iterations.
Because the likelihood surface is multimodal and EM only reaches a local maximum, different runs (i.e., different initializations) can end up with different parameter estimates.
So you are facing two main problems: the first costs computing time, the second affects the robustness of the parameter estimates.
You can address the first problem by providing starting points computed with k-means (or, I would suggest, a fuzzy clustering), and the second with a frequentist approach: repeating the parameter estimation many times from different initializations and keeping the best solution, as in the sketch below.
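One way to apply both remedies in practice is scikit-learn's GaussianMixture, which can initialize from k-means and refit from several initializations, keeping the highest-likelihood solution. The data and parameter values below are illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 1, (300, 1)),
                    rng.normal(3, 1, (200, 1))])

# init_params='kmeans' derives the starting points from k-means;
# n_init=10 repeats the fit from 10 initializations and keeps the
# run with the highest likelihood, mitigating the local-optimum issue.
gmm = GaussianMixture(n_components=2, init_params='kmeans',
                      n_init=10, random_state=0).fit(X)
print("weights:", gmm.weights_, "means:", gmm.means_.ravel())
```

The trade-off is direct: k-means initialization reduces the iterations EM needs, while multiple restarts buy robustness at the cost of proportionally more computing time.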
