Item response theory models for wording effects in mixed-format scales

Wen-Chung WANG, Hui-Fang CHEN, Kuan-Yu JIN

Research output: Contribution to journal › Article

23 Citations (Scopus)

Abstract

Many scales contain both positively and negatively worded items. Reverse coding of negatively worded items might not be enough for them to function as positively worded items do. In this study, we commented on the drawbacks of existing approaches to wording effects in mixed-format scales and used bi-factor item response theory (IRT) models to test the assumption of reverse coding and to evaluate the magnitude of the wording effect. The parameters of the bi-factor IRT models can be estimated with existing computer programs. Two empirical examples, from the Programme for International Student Assessment and the Trends in International Mathematics and Science Study, were given to demonstrate the advantages of the bi-factor approach over traditional ones. It was found that the wording effect in these two data sets was substantial and that ignoring it resulted in overestimated test reliability and biased person measures. Copyright © 2014 The Author(s).
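As a rough illustration of the bi-factor idea described above (a sketch only; the notation and link function here are assumptions, not necessarily the parameterization used in the article), a bi-factor 2PL model for dichotomous responses with a secondary wording factor can be written as

P(X_{pj} = 1 \mid \theta_p, \gamma_p) = \frac{\exp(a_j\theta_p + c_j w_j \gamma_p + d_j)}{1 + \exp(a_j\theta_p + c_j w_j \gamma_p + d_j)},

where \theta_p is the target trait for person p, \gamma_p is a wording (method) factor assumed orthogonal to \theta_p, w_j equals 1 for negatively worded items and 0 otherwise, a_j and c_j are item slopes, and d_j is the item intercept. Under this kind of setup, testing the assumption of reverse coding amounts to testing whether the wording slopes c_j (or the variance of \gamma_p) are negligible; a nonzero wording factor implies that simply reverse-scoring negatively worded items is not sufficient.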
Original language: English
Pages (from-to): 157-178
Journal: Educational and Psychological Measurement
Volume: 75
Issue number: 1
Early online date: Apr 2014
Publication status: Published - 2015

Citation

Wang, W.-C., Chen, H.-F., & Jin, K.-Y. (2015). Item response theory models for wording effects in mixed-format scales. Educational and Psychological Measurement, 75(1), 157-178.

Keywords

  • Item response theory
  • Wording effects
  • Bi-factor models
  • Bayesian methods