Traditionally, teachers evaluate students’ abilities via their total test scores. Recently, cognitive diagnostic models (CDMs) have begun to provide information about the presence or absence of students’ skills or misconceptions. Nevertheless, CDMs are typically applied to tests with multiple-choice (MC) items, which provide less diagnostic information than constructed-response (CR) items. This paper introduces new CDMs for tests with both MC and CR items, and illustrates how to use them to analyse MC and CR data and thus identify students’ skills and misconceptions in a mathematics domain. Analyses of real data, the responses of 497 sixth-grade students randomly selected from four Taiwanese primary schools to eight direct proportion items, were conducted to demonstrate the application of the new models. The results show that the new models can better determine students’ skills and misconceptions, in that they achieve higher inter-rater agreement rates than traditional CDMs. Copyright © 2016 Informa UK Limited, trading as Taylor & Francis Group.
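The abstract evaluates the new models by their inter-rater agreement rates, i.e. how often the model-inferred skill/misconception labels match human raters’ judgements. The paper does not spell out the exact statistic, so the following is only a minimal sketch of two common agreement measures, simple percent agreement and Cohen’s kappa, applied to made-up labels (all names and data here are illustrative, not from the study):

```python
# Illustrative sketch: agreement between model-assigned and rater-assigned
# labels. "S" (skill present) and "M" (misconception) are invented codes.
from collections import Counter


def percent_agreement(a, b):
    """Fraction of cases on which the two label sequences agree."""
    assert len(a) == len(b) and len(a) > 0
    return sum(x == y for x, y in zip(a, b)) / len(a)


def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    p_o = percent_agreement(a, b)                      # observed agreement
    count_a, count_b = Counter(a), Counter(b)
    # Chance agreement: probability both sources pick the same label at random.
    p_e = sum(count_a[k] * count_b.get(k, 0) for k in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)


# Made-up example: labels for 10 students from a model and from one rater.
model = ["S", "S", "M", "S", "M", "M", "S", "S", "M", "S"]
rater = ["S", "S", "M", "M", "M", "M", "S", "S", "S", "S"]
print(percent_agreement(model, rater))  # 0.8
print(round(cohens_kappa(model, rater), 3))  # 0.583
```

Simple percent agreement is easy to read but inflated by chance matches; kappa discounts the agreement expected if both sources labelled at random, which is why both are often reported together.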
|Journal|Educational Psychology: An International Journal of Experimental Educational Psychology|
|Early online date|Apr 2016|
|Publication status|Published - 2016|
Citation: Kuo, B.-C., Chen, C.-H., Yang, C.-W., & Mok, M. M. C. (2016). Cognitive diagnostic models for tests with multiple-choice and constructed-response items. Educational Psychology: An International Journal of Experimental Educational Psychology, 36(6), 1115-1133.
- Cognitive diagnosis
- Multiple-choice item
- Constructed-response item