Churn prediction via multimodal fusion learning: Integrating customer financial literacy, voice, and behavioral data

David Hason RUDD, Huan HUO, Md Rafiqul ISLAM, Guandong XU

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

In today's competitive landscape, businesses grapple with customer retention. Churn prediction models, although beneficial, often lack accuracy due to their reliance on a single data source. The intricate nature of human behavior and high-dimensional customer data further complicate these efforts. To address these concerns, this paper proposes a multimodal fusion learning model for identifying customer churn risk levels in financial service providers. Our multimodal approach integrates customer sentiments, financial literacy (FL) level, and financial behavioral data, enabling more accurate and bias-free churn prediction models. The proposed FL model utilizes a SMOGN-COREG supervised model to gauge customer FL levels from their financial data. The baseline churn model applies an ensemble artificial neural network and oversampling techniques to predict churn propensity in high-dimensional financial data. We also incorporate a speech emotion recognition model employing a pre-trained CNN-VGG16 to recognize customer emotions based on pitch, energy, and tone. To integrate these diverse features while retaining unique insights, we introduce late and hybrid fusion techniques that complement each other to boost coordinated multimodal co-learning. Robust metrics, including mean average precision and macro-averaged F1 score, were used to evaluate the proposed multimodal fusion model and validate the approach. Our novel approach demonstrates a marked improvement in churn prediction, achieving a test accuracy of 91.2%, a Mean Average Precision (MAP) score of 66, and a Macro-Averaged F1 score of 54 through the proposed hybrid fusion learning technique, compared with late fusion and baseline models. Furthermore, the analysis demonstrates a positive correlation between negative emotions, low FL scores, and high-risk customers. Copyright © 2023 IEEE.
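The distinction between the two fusion strategies the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, the random linear "heads" standing in for trained per-modality classifiers, and the 50/50 weighting of the joint and late-fused predictions are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality features for one customer: behavioral data,
# a financial-literacy (FL) score, and a speech-emotion embedding.
behavior = rng.normal(size=(1, 16))
fl_score = rng.normal(size=(1, 1))
emotion = rng.normal(size=(1, 8))


def unimodal_head(x, out_dim=3, seed=0):
    """Stand-in for a trained per-modality classifier: a random linear
    layer followed by a softmax over three churn-risk levels."""
    r = np.random.default_rng(seed)
    w = r.normal(size=(x.shape[1], out_dim))
    logits = x @ w
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)


# Late fusion: each modality is classified independently, and the
# predicted risk distributions are averaged.
modalities = [behavior, fl_score, emotion]
probs = [unimodal_head(m, seed=i) for i, m in enumerate(modalities)]
late = np.mean(probs, axis=0)

# Hybrid fusion: concatenate the raw modality features into a joint
# classifier as well, then combine its prediction with the late-fused one.
joint = unimodal_head(np.concatenate(modalities, axis=1), seed=42)
hybrid = 0.5 * joint + 0.5 * late

print("late fusion:", late.round(3))
print("hybrid fusion:", hybrid.round(3))
```

Late fusion preserves each modality's independent decision, while the hybrid variant additionally lets features interact before classification, which is consistent with the abstract's claim that the two techniques complement each other.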

Original language: English
Title of host publication: Proceedings of The 10th International Conference on Behavioural and Social Computing (BESC-2023)
Place of publication: USA
Publisher: IEEE
ISBN (Electronic): 9798350395884
DOI: https://doi.org/10.1109/BESC59560.2023.10386253
Publication status: Published - 2023

Citation

Rudd, D. H., Huo, H., Islam, M. R., & Xu, G. (2023). Churn prediction via multimodal fusion learning: Integrating customer financial literacy, voice, and behavioral data. In Proceedings of The 10th International Conference on Behavioural and Social Computing (BESC-2023). IEEE. https://doi.org/10.1109/BESC59560.2023.10386253

Keywords

  • Churn prediction
  • Multimodal learning
  • Feature fusion
  • Financial literacy
  • Speech emotion recognition
  • Customer behavior
