Abstract
In this brief, we propose a fast expectation conditional maximization (ECM) algorithm for maximum-likelihood (ML) estimation of mixtures of factor analyzers (MFA). Unlike existing expectation-maximization (EM) algorithms such as the EM algorithm of Ghahramani and Hinton (1996) and the alternating ECM (AECM) algorithm of McLachlan and Peel (2003), in which the missing data comprise the component-indicator vectors as well as the latent factors, the missing data in our ECM consist of the component-indicator vectors only. The novelty of our algorithm is that closed-form expressions are obtained explicitly in all conditional maximization (CM) steps, without resorting to numerical optimization methods. As revealed by experiments, our ECM converges substantially faster than EM and AECM, whether measured by central processing unit (CPU) time or by the number of iterations. Copyright © 2008 IEEE.
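As a point of reference for the model being fitted, the sketch below computes the E-step posterior component probabilities (responsibilities) for an MFA model, whose component covariances take the standard low-rank-plus-diagonal form Λ_j Λ_jᵀ + Ψ_j. This is a minimal illustration of the E-step over component indicators that all of the algorithms mentioned in the abstract share; it is not the paper's ECM, and the function name, argument shapes, and use of scipy are assumptions made here for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mfa_responsibilities(Y, pi, mu, Lambda, Psi):
    """Posterior component probabilities for a mixture of factor analyzers.

    Hypothetical helper for illustration only.
    Y      : (n, p) observations
    pi     : (g,)   mixing proportions
    mu     : (g, p) component means
    Lambda : (g, p, q) factor-loading matrices
    Psi    : (g, p) diagonal noise variances (uniquenesses)
    """
    n, _ = Y.shape
    g = len(pi)
    weighted = np.empty((n, g))
    for j in range(g):
        # Marginal covariance of component j: Lambda_j Lambda_j^T + diag(Psi_j)
        cov_j = Lambda[j] @ Lambda[j].T + np.diag(Psi[j])
        weighted[:, j] = pi[j] * multivariate_normal.pdf(Y, mean=mu[j], cov=cov_j)
    # Normalize rows to obtain tau_ij = P(component j | y_i)
    return weighted / weighted.sum(axis=1, keepdims=True)

# Toy usage with g = 2 components, p = 4 dimensions, q = 2 factors
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 4))
tau = mfa_responsibilities(Y, pi=np.array([0.5, 0.5]),
                           mu=rng.normal(size=(2, 4)),
                           Lambda=rng.normal(size=(2, 4, 2)),
                           Psi=np.ones((2, 4)))
```

In the EM and AECM algorithms cited above, these responsibilities are combined with conditional expectations of the latent factors, whereas the proposed ECM treats only the component indicators as missing data and updates the remaining parameters through closed-form CM steps (not shown here).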
Original language | English |
---|---|
Pages (from-to) | 1956-1961 |
Journal | IEEE Transactions on Neural Networks |
Volume | 19 |
Issue number | 11 |
Early online date | Sept 2008 |
DOIs | |
Publication status | Published - Nov 2008 |