Belur, Jyoti and Tompson, Lisa and Thornton, Amy and Simon, Miranda (2018) Interrater Reliability in Systematic Review Methodology. Sociological Methods and Research, 50 (2). pp. 1-29. DOI https://doi.org/10.1177/0049124118799372
Abstract
A methodologically sound systematic review is characterized by transparency, replicability, and clear inclusion criteria. However, little attention has been paid to reporting the details of interrater reliability (IRR) when multiple coders are used to make decisions at various points in the screening and data extraction stages of a study. Prior research has noted the paucity of reported information on IRR, including the number of coders involved, the stages at which and how IRR tests were conducted, and how disagreements were resolved. This article examines and reflects on the human factors that affect decision-making in systematic reviews by reporting on three IRR tests, conducted at three different points in the screening process, for two distinct reviews. Results of the two studies are discussed in the context of interrater and intrarater reliability in terms of the accuracy, precision, and consistency of the coding behavior of multiple coders. Findings indicated that coding behavior changes both between and within individuals over time, emphasizing the importance of conducting regular and systematic interrater and intrarater reliability tests, especially when multiple coders are involved, to ensure consistency and clarity at the screening and coding stages. Implications for good practice when screening and coding for systematic reviews are discussed.
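The κ statistic listed in the keywords is commonly computed as Cohen's kappa, which corrects observed agreement between two coders for agreement expected by chance. A minimal sketch, assuming two coders making binary include/exclude screening decisions (the coder labels and data below are illustrative, not from the paper):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical decisions on the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two coders screening ten abstracts for inclusion.
a = ["in", "in", "out", "out", "in", "out", "in", "out", "out", "in"]
b = ["in", "out", "out", "out", "in", "out", "in", "in", "out", "in"]
print(cohens_kappa(a, b))  # → 0.6 (0.8 observed agreement, 0.5 expected)
```

The same function applied to one coder's decisions at two points in time gives a simple intrarater reliability check, matching the abstract's point that coding behavior can drift within individuals.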
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | interrater reliability, systematic review, screening, coding, κ statistic, precision, replicability |
| Divisions: | Faculty of Social Sciences; Faculty of Social Sciences > Government, Department of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 02 Oct 2019 11:49 |
| Last Modified: | 06 Jan 2022 14:05 |
| URI: | http://repository.essex.ac.uk/id/eprint/25489 |
Available files
Filename: Accepted version of paper July 2018.pdf