A hierarchical IRT approach to item nonresponse for Likert-type scales

Chen Wei LIU, Wen Chung WANG

Research output: Contribution to conference › Paper


Item nonresponse (INR), such as “Don’t Know”, “Refusal”, “Hard to Say”, and “No Opinion”, occurs when a respondent does not give a substantive answer to a particular question. Treating INR as missing at random is common practice, but it can yield biased parameter estimates when the missingness is not in fact random. In this study, we classified responding processes into a hierarchy and proposed a new item response theory (IRT) model for INR, in which additional latent traits account for the hierarchical structure of responding processes. Simulation studies were conducted to evaluate parameter recovery when INR was ignorable or non-ignorable. The results showed that ignoring non-ignorable INR by fitting standard IRT models yielded severely biased parameter estimates, especially when the latent traits were highly correlated, whereas the new model yielded unbiased estimates regardless of whether the INR was ignorable. The new model was fit to real data from a citizenship survey on democratic politics. The results demonstrated the superiority and feasibility of the new model for INR in Likert-type scales.
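The abstract's core idea — a hierarchy of responding processes in which an extra latent trait governs whether a respondent answers at all, and that trait may correlate with the substantive trait — can be illustrated with a minimal data-generating sketch. This is not the authors' exact model: it is a simple two-stage "IRT tree" with a 2PL response stage and a graded-response substantive stage, with all parameter values, names, and the correlation value chosen purely for illustration.

```python
# Minimal sketch (not the authors' model): a two-stage hierarchy for INR.
# Stage 1: latent propensity eta decides respond vs. nonrespond (2PL).
# Stage 2: given a response, latent trait theta drives an ordered
# (graded-response-style) category. INR is coded -1. When rho > 0,
# nonresponse is informative about theta, i.e. non-ignorable.
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items, n_cats = 500, 10, 4

# Correlated latent traits: theta (substantive) and eta (propensity).
rho = 0.6  # illustrative; high correlation makes ignoring INR most harmful
cov = np.array([[1.0, rho], [rho, 1.0]])
theta, eta = rng.multivariate_normal([0.0, 0.0], cov, size=n_persons).T

# Illustrative item parameters.
a_resp = rng.uniform(0.8, 1.5, n_items)   # discrimination, response stage
b_resp = rng.normal(-1.0, 0.5, n_items)   # "difficulty" of responding
a_sub = rng.uniform(0.8, 1.5, n_items)    # discrimination, substantive stage
thresholds = np.sort(rng.normal(0.0, 1.0, (n_items, n_cats - 1)), axis=1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stage 1: does the respondent give a substantive answer?
p_respond = sigmoid(a_resp * (eta[:, None] - b_resp))
responds = rng.random((n_persons, n_items)) < p_respond

# Stage 2: graded response via cumulative probabilities P(X >= k),
# which are decreasing in k because thresholds are sorted ascending;
# the observed category is the number of thresholds "passed".
z = a_sub[None, :, None] * (theta[:, None, None] - thresholds[None, :, :])
p_cum = sigmoid(z)                                  # (persons, items, cats-1)
categories = (rng.random((n_persons, n_items, 1)) < p_cum).sum(axis=2)

data = np.where(responds, categories, -1)           # -1 marks INR
```

Under this sketch, respondents with many INR entries tend to have lower theta (because eta and theta are positively correlated), which is exactly the situation in which treating INR as missing at random biases standard IRT estimates.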
Original language: English
Publication status: Published - Jul 2015


Liu, C.-W., & Wang, W.-C. (2015, July). A hierarchical IRT approach to item nonresponse for Likert-type scales. Paper presented at the 2015 International Meeting of the Psychometric Society (IMPS), Beijing Normal University, Beijing, China.

