Truncated inverse-Lévy measure representation of the beta process

Junyi ZHANG, Angelos DASSIOS, Chong ZHONG, Qiufei YAO

Research output: Contribution to journal › Article › peer-review

Abstract

The beta process is a widely used nonparametric prior in Bayesian machine learning. While various inference schemes have been developed for the beta process and related models, the current state-of-the-art method relies heavily on the stick-breaking representation with decreasing atom weights, which is available only for a special hyperparameter setting. In this paper, we introduce the truncated inverse-Lévy measure representation (TILe-Rep), which extends the decreasing atom weights representation of the beta process to general hyperparameters. The TILe-Rep fills the gap between the two stick-breaking representations of Teh et al. (2007) and Paisley et al. (2010). Moreover, it has a lower truncation error than other sequential representations of the beta process and potentially leads to the posterior consistency property of Bayesian factor models. We demonstrate the usage of the TILe-Rep in the celebrated beta process factor analysis model and the beta process sparse factor model. Copyright © 2025 by the author(s).
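To illustrate the inverse-Lévy measure idea the abstract refers to, the sketch below draws the first K atom weights of a beta process BP(c, γ·B₀) in strictly decreasing order by inverting the Lévy tail mass at the arrival times of a unit-rate Poisson process. This is a minimal numerical sketch, not the authors' TILe-Rep algorithm: the restriction to c ≥ 1, the quadrature and bisection tolerances, and all function names (`levy_tail`, `inverse_tail`, `decreasing_atoms`) are assumptions made here for illustration.

```python
import random

def levy_tail(x, c, gamma, n=1000):
    """Tail mass nu([x, 1]) = gamma * c * int_x^1 w^{-1} (1 - w)^{c-1} dw,
    computed by composite Simpson's rule. Assumes c >= 1 so the integrand
    is finite on (0, 1]."""
    if x >= 1.0:
        return 0.0
    h = (1.0 - x) / n
    f = lambda w: (1.0 - w) ** (c - 1.0) / w
    s = f(x) + f(1.0)
    for i in range(1, n):
        s += (4.0 if i % 2 else 2.0) * f(x + i * h)
    return gamma * c * s * h / 3.0

def inverse_tail(t, c, gamma, tol=1e-10):
    """Solve levy_tail(x) = t for x in (0, 1) by bisection; the tail is
    strictly decreasing in x."""
    lo, hi = 1e-12, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if levy_tail(mid, c, gamma) > t:
            lo = mid  # tail still too large, so the root lies to the right
        else:
            hi = mid
    return 0.5 * (lo + hi)

def decreasing_atoms(K, c=2.0, gamma=1.0, seed=0):
    """First K atom weights of BP(c, gamma * B0), generated in strictly
    decreasing order by inverting the Levy tail at the Poisson arrival
    times T_1 < T_2 < ... (cumulative sums of Exp(1) variables)."""
    rng = random.Random(seed)
    t, weights = 0.0, []
    for _ in range(K):
        t += rng.expovariate(1.0)
        weights.append(inverse_tail(t, c, gamma))
    return weights

# The K largest atoms, e.g. as a truncated prior in a factor model.
weights = decreasing_atoms(5)
```

For c = 1 the tail has the closed form −γ·log(x), so the inversion reduces to exp(−t/γ), recovering the decreasing-weight special case attributed to Teh et al. (2007) in the abstract; for general c the tail must be inverted numerically, which is the regime the TILe-Rep addresses.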

Original language: English
Pages (from-to): 1720-1728
Journal: Proceedings of Machine Learning Research
Volume: 258
Publication status: Published - 2025

Citation

Zhang, J., Dassios, A., Zhong, C., & Yao, Q. (2025). Truncated inverse-Lévy measure representation of the beta process. Proceedings of Machine Learning Research, 258, 1720-1728. https://proceedings.mlr.press/v258/zhang25e.html
