Inter-review agreement of risk-of-bias judgments varied in Cochrane reviews

Nadja Könsgen et al. J Clin Epidemiol. 2020 Apr;120:25-32.
doi: 10.1016/j.jclinepi.2019.12.016. Epub 2019 Dec 19.

Abstract

Objectives: The objective of the study was to measure the level of agreement between Cochrane reviews of overlapping randomized controlled trials (RCTs) regarding risk-of-bias (RoB) judgments.

Study design and setting: On November 5, 2017, the Cochrane Database of Systematic Reviews was searched for Cochrane reviews on tobacco. Reviews that included overlapping RCTs were included. RoB judgments were extracted from RoB tables using automated data scraping with manual verification and adjustments. Agreement between the reviews was calculated using Conger's generalized kappa coefficient (κ) and raw agreement (a).
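As a rough illustration of the two agreement statistics named above (this is not the authors' code, and the judgment data are invented for the example): raw agreement is the fraction of trials on which two reviews assign the same RoB judgment, and Conger's generalized kappa reduces to Cohen's kappa in the two-rater case, which can be sketched as:

```python
from collections import Counter

def raw_agreement(r1, r2):
    """Fraction of items on which two raters give the same judgment."""
    assert len(r1) == len(r2) and len(r1) > 0
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters: chance-corrected agreement.
    Conger's generalized kappa reduces to this when there are
    exactly two raters."""
    n = len(r1)
    p_o = raw_agreement(r1, r2)                  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    categories = set(r1) | set(r2)
    # Expected agreement by chance from each rater's marginal distribution
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical RoB judgments for six overlapping trials from two reviews
rev_a = ["low", "low", "high", "unclear", "low", "high"]
rev_b = ["low", "high", "high", "unclear", "low", "low"]

print(raw_agreement(rev_a, rev_b))               # 4/6 ≈ 0.667
print(round(cohens_kappa(rev_a, rev_b), 2))      # 0.45
```

For more than two overlapping reviews per trial, Conger's coefficient averages pairwise chance agreement across all rater pairs; the study's confidence intervals and handling of unbalanced overlap would require a dedicated implementation (e.g. an inter-rater agreement package) rather than this sketch.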

Results: We included 53 Cochrane reviews of 376 RCTs. For the RoB domain "random sequence generation," the level of agreement between the reviews was substantial, with κ = 0.63 (95% confidence interval: 0.56 to 0.71; a = 0.80). There was slight-to-moderate agreement between the reviews regarding the domains "allocation concealment": κ = 0.51 (0.41 to 0.61), a = 0.75; "blinding": κ = 0.19 (0.02 to 0.37), a = 0.52; "blinding of outcome assessment": κ = 0.43 (0.14 to 0.72), a = 0.67; and "incomplete outcome data": κ = 0.15 (-0.03 to 0.32), a = 0.64. For "blinding of participants and personnel" and "selective reporting," κ could not be calculated; the raw agreement was 0.40 and 0.42, respectively.

Conclusion: The level of agreement between Cochrane reviews regarding RoB judgments ranged from slight to substantial depending on the RoB domain. Further investigations regarding reasons for variation and interventions to improve agreement are needed.

Keywords: Cochrane reviews; Inter-review agreement; Kappa coefficient; Methodology; RCTs; Risk of bias.
