A new item response model for rater effects in forced-choice ipsative tests

Xuelan QIU, Wen Chung WANG, Shungwon RO

Research output: Contribution to conference › Papers

Abstract

Assessment of noncognitive constructs (e.g., career interests) using forced-choice items in educational settings is appealing because it reduces the detrimental effects of response styles. However, such tests pose great challenges to psychometrics because they produce ipsative scores. In some applications, in addition to statement utilities and person latent traits, rater severity also plays a role in determining the item responses. In this study, a new item response model was proposed to account for rater effects in forced-choice ipsative tests and thereby ensure fairness. The results showed that all parameters were recovered fairly well under the true model, whereas ignoring rater effects led to biased estimates of both item and person parameters. Copyright © 2015 AERA.
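The abstract does not give the model's equations. As a rough illustration only (not the authors' specification), a Rasch-type pairwise-comparison sketch can show how a rater-severity term might enter a forced-choice response probability alongside person traits and statement utilities; all parameter names and the functional form below are assumptions for illustration:

```python
import math

def p_choose_first(theta_a, theta_b, util_a, util_b, severity):
    """Hypothetical Rasch-type pairwise model (NOT the authors' exact
    specification): the probability of endorsing statement A over B in a
    forced-choice pair depends on the ratee's latent traits (theta), the
    statements' utilities, and the rater's severity, which shifts the
    logit downward for more severe raters."""
    logit = (theta_a + util_a) - (theta_b + util_b) - severity
    return 1.0 / (1.0 + math.exp(-logit))

# One simulated ratee on one forced-choice pair, judged by two raters
theta = {"A": 0.8, "B": -0.3}   # latent traits tapped by each statement
util = {"A": 0.2, "B": 0.5}     # statement utilities (illustrative values)

lenient = p_choose_first(theta["A"], theta["B"], util["A"], util["B"], severity=-0.5)
severe = p_choose_first(theta["A"], theta["B"], util["A"], util["B"], severity=0.5)
```

Under this sketch, the same ratee receives a lower endorsement probability from the severe rater than from the lenient one, which is the kind of confound the abstract says biases item and person estimates when ignored.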
Original language: English
Publication status: Published - Apr 2015

Citation

Qiu, X.-L., Wang, W.-C., & Ro, S. (2015, April). A new item response model for rater effects in forced-choice ipsative tests. Paper presented at The 2015 American Educational Research Association Annual Meeting (AERA 2015): Toward justice: Culture, language, and heritage in education research and praxis, Sheraton Chicago, Chicago, Illinois.
