Assessment of noncognitive constructs (e.g., career interest) using forced-choice items in educational settings is appealing because it reduces the detrimental effects of response styles. However, such tests pose great challenges to psychometrics because they produce ipsative scores. In some applications, in addition to statement utility and the person's latent trait, rater severity also plays a role in determining item responses. In this study, a new item response model was proposed to account for rater effects in forced-choice ipsative tests so as to ensure fairness. The results showed that all parameters were recovered fairly well under the true model, whereas ignoring rater effects led to biased estimation of both item and person parameters. Copyright © 2015 AERA.
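To make the idea concrete, the following is a minimal illustrative sketch (not the authors' model) of how forced-choice responses could depend jointly on a person's latent trait, statement utilities, and a rater-severity shift. All parameter names and functional forms here are assumptions for illustration: the probability of endorsing statement A over statement B is modeled as a logistic function of the utility difference minus the rater's severity.

```python
import numpy as np

# Illustrative simulation of pairwise forced-choice data (assumed model):
#   P(choose A over B) = sigmoid((lam_A - lam_B)*theta + (delta_A - delta_B) - gamma_r)
# theta:  person latent trait
# lam:    statement loading on the trait (assumed)
# delta:  statement utility intercept (assumed)
# gamma:  rater severity -- severe raters endorse the positively keyed
#         statement A less often (assumed direction of the effect)

rng = np.random.default_rng(42)

n_persons, n_raters = 1000, 5
theta = rng.normal(size=n_persons)           # person latent traits
gamma = np.linspace(-1.0, 1.0, n_raters)     # rater severities, lenient -> severe
raters = rng.integers(n_raters, size=n_persons)  # rater assigned to each person

lam_a, lam_b = 1.2, 0.8        # statement loadings (hypothetical values)
delta_a, delta_b = 0.5, -0.5   # statement utility intercepts (hypothetical values)

def p_choose_a(theta_p, gamma_r):
    """Probability of endorsing statement A over B for one comparison."""
    z = (lam_a - lam_b) * theta_p + (delta_a - delta_b) - gamma_r
    return 1.0 / (1.0 + np.exp(-z))

probs = p_choose_a(theta, gamma[raters])
responses = rng.binomial(1, probs)  # 1 = statement A chosen, 0 = B chosen

# Endorsement rate of statement A by rater, from lenient to severe:
for r in range(n_raters):
    print(r, round(responses[raters == r].mean(), 2))
```

In such a simulation, a model fit without the `gamma` term would have to absorb rater severity into the person and statement parameters, which is one way to see why ignoring rater effects biases both sets of estimates.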
Publication status: Published - April 2015