Assessment of differential item functioning under cognitive diagnosis models: The DINA model example

Xiaomin LI, Wen Chung WANG

Research output: Contribution to journal › Article › peer-review

24 Citations (Scopus)

Abstract

The assessment of differential item functioning (DIF) is routinely conducted to ensure test fairness and validity. Although many DIF assessment methods have been developed in the context of classical test theory and item response theory, they are not applicable to cognitive diagnosis models (CDMs), as the underlying latent attributes of CDMs are multidimensional and binary. This study proposes a very general DIF assessment method in the CDM framework that is applicable to various CDMs, to more than two groups of examinees, and to multiple grouping variables, whether categorical or continuous and observed or latent. The parameters can be estimated with Markov chain Monte Carlo algorithms implemented in the freeware WinBUGS. Simulation results demonstrated good parameter recovery and advantages of the new method over the Wald method in DIF assessment. Copyright © 2015 by the National Council on Measurement in Education.
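As context for the abstract (and not the authors' exact formulation), the sketch below illustrates how DIF can arise in the DINA model: the probability of a correct response is (1 - s_j)^eta * g_j^(1 - eta), where eta = 1 only if the examinee has mastered every attribute the item requires, and an item shows DIF when its slip (s_j) or guessing (g_j) parameter differs across examinee groups with the same attribute profile. The function name and parameter values below are illustrative assumptions, written in Python.

import numpy as np

def dina_prob(alpha, q_row, slip, guess):
    """DINA item response probability for one examinee and one item.

    alpha : binary attribute-mastery vector for the examinee
    q_row : binary Q-matrix row for the item (required attributes)
    slip, guess : item slip and guessing parameters
    """
    # eta = 1 only if the examinee masters every attribute the item requires
    eta = int(np.all(alpha >= q_row))
    return (1 - slip) ** eta * guess ** (1 - eta)

# Hypothetical values: the item requires attributes 1 and 2, and its slip
# parameter differs between the reference and focal groups, producing DIF.
alpha_master = np.array([1, 1, 0])           # examinee masters the required attributes
q_row = np.array([1, 1, 0])

ref_params = {"slip": 0.10, "guess": 0.20}   # reference group
foc_params = {"slip": 0.25, "guess": 0.20}   # focal group (larger slip)

p_ref = dina_prob(alpha_master, q_row, **ref_params)
p_foc = dina_prob(alpha_master, q_row, **foc_params)
print(f"P(correct | mastery), reference: {p_ref:.2f}, focal: {p_foc:.2f}")
# Equal attribute profiles but unequal success probabilities indicate DIF.

In the article's approach, such group-specific item parameters are estimated within a Bayesian framework via MCMC (WinBUGS), rather than compared post hoc as in the Wald method.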
Original language: English
Pages (from-to): 28-54
Journal: Journal of Educational Measurement
Volume: 52
Issue number: 1
Early online date: Mar 2015
DOIs
Publication status: Published - 2015

Citation

Li, X., & Wang, W.-C. (2015). Assessment of differential item functioning under cognitive diagnosis models: The DINA model example. Journal of Educational Measurement, 52(1), 28-54.
