Abstract
Knowledge tracing aims to trace students' evolving knowledge states by predicting their future performance on concept-related exercises. Recently, graph-based models have been developed to incorporate the relationships between exercises into knowledge tracing, but they generally explore only a single type of relationship. In this article, we present a novel Dual Graph Ensemble learning method for Knowledge Tracing (DGEKT), which establishes a dual graph structure over students' learning interactions to capture heterogeneous exercise-concept associations and interaction transitions through hypergraph modeling and directed graph modeling, respectively. To combine the dual graph models, we introduce online knowledge distillation. This choice is motivated by the observation that, although a knowledge tracing model is designed to predict students' responses to exercises related to different concepts, it is optimized only for the prediction accuracy on the single exercise answered at each step. With online knowledge distillation, the dual graph models are adaptively combined into a stronger ensemble teacher model, whose predictions on all exercises serve as extra supervision and improve modeling ability. In the experiments, we compare DGEKT against eight knowledge tracing baselines on three benchmark datasets, and the results demonstrate that DGEKT achieves state-of-the-art performance.
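As a rough illustration of the online knowledge distillation idea described in the abstract, the sketch below combines two placeholder graph-based branches into an adaptively weighted ensemble teacher and distills the teacher's per-exercise predictions back into each branch. The class and function names, the gating mechanism, and the loss weighting are illustrative assumptions and not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualGraphDistiller(nn.Module):
    """Illustrative sketch: two graph-based branches plus an adaptive ensemble teacher."""

    def __init__(self, hyper_branch: nn.Module, trans_branch: nn.Module, hidden_dim: int):
        super().__init__()
        self.hyper_branch = hyper_branch          # hypergraph branch (exercise-concept associations)
        self.trans_branch = trans_branch          # directed-graph branch (interaction transitions)
        self.gate = nn.Linear(2 * hidden_dim, 2)  # adaptive weights over the two branches

    def forward(self, batch):
        # Each branch is assumed to return a hidden state [B, D] and logits over all exercises [B, E].
        h1, logits1 = self.hyper_branch(batch)
        h2, logits2 = self.trans_branch(batch)
        weights = torch.softmax(self.gate(torch.cat([h1, h2], dim=-1)), dim=-1)  # [B, 2]
        teacher_logits = weights[:, :1] * logits1 + weights[:, 1:] * logits2     # [B, E]
        return logits1, logits2, teacher_logits


def branch_loss(student_logits, teacher_logits, answered_ids, labels, tau=2.0, alpha=0.5):
    """Hard loss on the single answered exercise plus soft teacher supervision on all exercises."""
    # Supervised binary cross-entropy on the exercise actually answered at this step.
    answered_logit = student_logits.gather(1, answered_ids.unsqueeze(1)).squeeze(1)
    hard = F.binary_cross_entropy_with_logits(answered_logit, labels.float())
    # Soft targets: the (detached) ensemble teacher's probabilities on every exercise.
    soft_targets = torch.sigmoid(teacher_logits.detach() / tau)
    soft = F.binary_cross_entropy_with_logits(student_logits / tau, soft_targets)
    return hard + alpha * soft
```

In such a setup, each branch would be optimized with `branch_loss` against the shared ensemble teacher, so the two graph views regularize each other online without a separately pretrained teacher model.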
| Field | Value |
|---|---|
| Original language | English |
| Article number | 78 |
| Journal | ACM Transactions on Information Systems |
| Volume | 42 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Jan 2024 |