Greedy learning of latent tree models for multidimensional clustering

Teng-Fei LIU, Nevin Lianwen ZHANG, Peixian CHEN, April Hua LIU, Kin Man POON, Yi WANG

Research output: Contribution to journal › Article › peer-review

24 Citations (Scopus)

Abstract

Real-world data are often multifaceted and can be meaningfully clustered in more than one way, and there is growing interest in obtaining multiple partitions of data. In previous work, we learnt from data a latent tree model (LTM) that contains multiple latent variables (Chen et al. 2012). Each latent variable represents a soft partition of the data, so multiple partitions are obtained. Through model selection, the LTM approach can automatically determine how many partitions there should be, which attributes define each partition, and how many clusters each partition should contain. It has been shown to yield rich and meaningful clustering results. However, our previous algorithm for learning LTMs, EAST, is efficient enough only for data sets with dozens of attributes. This paper proposes an algorithm called BI that can handle data sets with hundreds of attributes. We empirically compare BI with EAST and other, more efficient LTM learning algorithms, and show that BI outperforms its competitors on data sets with hundreds of attributes. In terms of clustering results, BI also compares favorably with alternative methods that are not based on LTMs. Copyright © 2013 The Author(s).
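
To make the core idea concrete, below is a minimal sketch, in plain Python with NumPy, of how a latent tree model yields multiple soft partitions: each latent variable induces its own posterior over clusters, so a model with several latent variables clusters the data along several dimensions at once. The tree structure and all parameter values here are invented for illustration only; this is not the paper's BI or EAST learning algorithm, just posterior inference in a tiny fixed LTM.

    import numpy as np

    # A toy latent tree model: binary latent variables Y1 - Y2 form the
    # tree backbone; Y1 has observed children A, B and Y2 has observed
    # children C, D (all binary). Each latent variable defines one soft
    # partition of the data. All parameters below are made up.
    p_y1 = np.array([0.6, 0.4])                 # P(Y1)
    p_y2_given_y1 = np.array([[0.8, 0.2],       # P(Y2 | Y1=0)
                              [0.3, 0.7]])      # P(Y2 | Y1=1)
    p_a_given_y1 = np.array([[0.9, 0.1],
                             [0.2, 0.8]])       # P(A | Y1)
    p_b_given_y1 = np.array([[0.7, 0.3],
                             [0.1, 0.9]])       # P(B | Y1)
    p_c_given_y2 = np.array([[0.85, 0.15],
                             [0.25, 0.75]])     # P(C | Y2)
    p_d_given_y2 = np.array([[0.6, 0.4],
                             [0.05, 0.95]])     # P(D | Y2)

    def joint(y1, y2, a, b, c, d):
        """Joint probability of one full configuration under the tree."""
        return (p_y1[y1] * p_y2_given_y1[y1, y2]
                * p_a_given_y1[y1, a] * p_b_given_y1[y1, b]
                * p_c_given_y2[y2, c] * p_d_given_y2[y2, d])

    def posteriors(a, b, c, d):
        """P(Y1 | data) and P(Y2 | data): two soft cluster memberships."""
        table = np.array([[joint(y1, y2, a, b, c, d)
                           for y2 in (0, 1)] for y1 in (0, 1)])
        table /= table.sum()                    # normalize over (Y1, Y2)
        return table.sum(axis=1), table.sum(axis=0)

    py1, py2 = posteriors(a=0, b=0, c=1, d=1)
    print("partition 1 (Y1):", py1)  # soft membership w.r.t. A, B
    print("partition 2 (Y2):", py2)  # soft membership w.r.t. C, D

Each data case thus receives two soft cluster assignments, one per latent variable, and each partition is defined by a different subset of attributes. Learning algorithms such as EAST and BI face the harder problem of finding the tree structure, the number of latent variables, and their cardinalities from data.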
Original language: English
Pages (from-to): 301-330
Journal: Machine Learning
Volume: 98
Issue number: 1-2
Early online date: Jun 2013
DOIs
Publication status: Published - 2015

Citation

Liu, T.-F., Zhang, N. L., Chen, P., Liu, A. H., Poon, L. K. M., & Wang, Y. (2015). Greedy learning of latent tree models for multidimensional clustering. Machine Learning, 98(1-2), 301-330.

Keywords

  • Model-based clustering
  • Multiple partitions
  • Latent tree models
