Although instruments for assessing students' computational thinking (CT) concepts in primary education have been developed, they have rarely been validated using item response theory (IRT). We consider IRT a rigorous validation tool and apply it to a CT concepts test for primary education involving 13,670 students. A two-parameter logistic model was chosen over other IRT models because it showed acceptable model fit and item fit. The discrimination parameters indicated that the instrument could effectively distinguish between students of different ability levels. Nominal response modelling in IRT was used to extract information from the students' responses: those with lower ability were found to consider only one of the conditions provided, to have no understanding of the repetition structure, and possibly to have difficulty associating a sprite with its corresponding code. Based on the ability estimates, we also found that students' ability in CT concepts increased with grade level and that boys generally performed slightly better than girls. These results suggest that the instrument can be used to examine students' learning achievements in CT concepts. Copyright © 2022 Elsevier Ltd. All rights reserved.
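The two-parameter logistic (2PL) model named in the abstract defines the probability of a correct response as a function of student ability and two item parameters. The sketch below is a minimal illustration of that standard formula, not the authors' actual estimation code; the function and parameter names follow conventional IRT notation (discrimination a, difficulty b, ability θ):

```python
import math

def p_correct_2pl(theta: float, a: float, b: float) -> float:
    """Probability of a correct response under the 2PL IRT model:
    P(theta) = 1 / (1 + exp(-a * (theta - b))).

    theta: student ability estimate
    a: item discrimination (steepness of the curve)
    b: item difficulty (ability at which P = 0.5)
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# When ability equals difficulty (theta == b), P is exactly 0.5.
# A larger discrimination a makes the curve steeper around b, so the
# item separates lower- and higher-ability students more sharply,
# which is what the reported discrimination parameters capture.
```

In practice such parameters are estimated from response data with dedicated IRT software (e.g. via marginal maximum likelihood) rather than computed directly; the formula above only shows how fitted parameters translate into response probabilities.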
Journal: Computers & Education
Early online date: Jun 2022
Publication status: Published - Oct 2022
Citation: Kong, S.-C., & Lai, M. (2022). Validating a computational thinking concepts test for primary education using item response theory: An analysis of students' responses. Computers & Education, 187. https://doi.org/10.1016/j.compedu.2022.104562
Keywords:
- Computational thinking concepts
- Item response theory
- Primary education