Bayesian robust tensor completion via CP decomposition

Xiaohang WANG, Leung Ho Philip YU, Weidong YANG, Jun SU

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Real-world tensor data are inevitably incomplete and corrupted by noise. Some low-rank tensor factorization (LRTF) models add an L1 or L2 norm penalty to handle sparse or Gaussian noise, respectively. Real noise, however, is usually more complex. We propose a robust Bayesian tensor completion method, called MoG BTC-CP, which imputes missing data and removes complex noise simultaneously. The observed tensor is modeled as the sum of a low-rank tensor and noise: CP decomposition extracts the low-rank structure, and the noise is assumed to follow a Mixture of Gaussians (MoG) distribution. A full Bayesian framework together with a Gibbs sampling algorithm is designed to estimate the model. Extensive experiments on synthetic data and real-life applications show that MoG BTC-CP outperforms recently published leading tensor completion and denoising methods. Copyright © 2022 Elsevier B.V. All rights reserved.
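The generative model described in the abstract can be sketched as follows. This is an illustrative simulation only, not the authors' implementation: the dimensions, CP rank, mixture weights, noise scales, and observation rate are all hypothetical values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and CP rank (hypothetical, not from the paper)
I, J, K, R = 20, 20, 20, 3

# Low-rank tensor from a rank-R CP decomposition: X = sum_r a_r outer b_r outer c_r
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mixture-of-Gaussians noise: each entry first draws a mixture component,
# then Gaussian noise with that component's standard deviation
weights = np.array([0.7, 0.3])    # mixing proportions (assumed)
sigmas = np.array([0.05, 1.0])    # component std devs (assumed)
comp = rng.choice(len(weights), size=X.shape, p=weights)
E = rng.standard_normal(X.shape) * sigmas[comp]

# Observed tensor: low-rank part plus MoG noise, with entries missing at random
mask = rng.random(X.shape) < 0.8  # True = observed (80% observation rate, assumed)
Y = np.where(mask, X + E, np.nan)
```

A method like MoG BTC-CP would take the partially observed, noisy tensor `Y` (and the mask) as input and recover the low-rank tensor `X`, estimating the CP factors and the MoG noise parameters jointly within a Bayesian framework via Gibbs sampling.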

Original language: English
Pages (from-to): 121-128
Journal: Pattern Recognition Letters
Volume: 163
Early online date: Oct 2022
DOIs: 10.1016/j.patrec.2022.10.005
Publication status: Published - Nov 2022

Citation

Wang, X., Yu, P. L. H., Yang, W., & Su, J. (2022). Bayesian robust tensor completion via CP decomposition. Pattern Recognition Letters, 163, 121-128. https://doi.org/10.1016/j.patrec.2022.10.005
