Effect of response format on cognitive reflection: Validating a two- and four-option multiple choice question version of the Cognitive Reflection Test

Sirota, M and Juanchich, M (2018) 'Effect of response format on cognitive reflection: Validating a two- and four-option multiple choice question version of the Cognitive Reflection Test.' Behavior Research Methods, 50 (6). 2511 - 2522. ISSN 1554-351X

Sirota_Manuscript.pdf - Accepted Version (1MB)

Abstract

© 2018, Psychonomic Society, Inc. The Cognitive Reflection Test (CRT), measuring intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making, and beliefs. Across studies, the response format of CRT items sometimes differs, based on the assumed construct equivalence of tests with open-ended versus multiple-choice items (the equivalence hypothesis). Evidence and theoretical reasons, however, suggest that the cognitive processes measured by these response formats and their associated performances might differ (the nonequivalence hypothesis). We tested the two hypotheses experimentally by assessing the performance in tests with different response formats and by comparing their predictive and construct validity. In a between-subjects experiment (n = 452), participants answered stem-equivalent CRT items in an open-ended, a two-option, or a four-option response format and then completed tasks on belief bias, denominator neglect, and paranormal beliefs (benchmark indicators of predictive validity), as well as on actively open-minded thinking and numeracy (benchmark indicators of construct validity). We found no significant differences between the three response formats in the numbers of correct responses, the numbers of intuitive responses (with the exception of the two-option version, which had a higher number than the other tests), and the correlational patterns of the indicators of predictive and construct validity. All three test versions were similarly reliable, but the multiple-choice formats were completed more quickly. We speculate that the specific nature of the CRT items helps build construct equivalence among the different response formats. We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons. Supplementary materials and data are available at https://osf.io/mzhyc/.

Item Type: Article
Subjects: B Philosophy. Psychology. Religion > BF Psychology
Divisions: Faculty of Science and Health > Psychology, Department of
Depositing User: Elements
Date Deposited: 23 Jul 2018 13:38
Last Modified: 27 Mar 2019 02:00
URI: http://repository.essex.ac.uk/id/eprint/21441