I am implementing expectation maximization (EM) in C++ to estimate the parameters of a Gaussian mixture model.
The EM is very slow to converge - is there a technique to speed up the convergence of the log-likelihood?
The Armadillo C++ library has a multi-threaded implementation of Expectation Maximization (EM) for Gaussian Mixture Models (GMM). It also seeds EM with a k-means initialization step, which typically reduces the number of EM iterations needed to converge. See the gmm_diag class for more information.