A bias in the evaluation of bias comparing randomized trials with nonexperimental studies

Abstract

In a recent BMJ article, the authors conducted a meta-analysis to compare estimated treatment effects from randomized trials with those derived from observational studies based on routinely collected data (RCD). They calculated a pooled relative odds ratio (ROR) of 1.31 (95% confidence interval [CI]: 1.03-1.65) and concluded that RCD studies systematically overestimated protective effects. However, their meta-analysis inverted results for some clinical questions to force all estimates from RCD studies to be below 1. We evaluated the statistical properties of this pooled ROR and found that the selective inversion rule employed in the original meta-analysis can positively bias the estimate of the ROR. We then repeated the random-effects meta-analysis using a different inversion rule and found an estimated ROR of 0.98 (0.78-1.23), indicating that the pooled ROR is highly dependent on the direction of comparisons. As an alternative to the ROR, we calculated the observed proportion of clinical questions where the RCD and trial CIs overlap, as well as the expected proportion assuming no systematic difference between the studies. For 8 of the 16 clinical questions (50%; 25% to 75%), the 50% CIs overlapped, compared with an expected overlap of 60% assuming no systematic difference between RCD studies and trials. Thus, there was little evidence of a systematic difference in effect estimates between RCD studies and randomized trials. Estimates of pooled RORs across distinct clinical questions are generally not interpretable and may be misleading.
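
To illustrate why a data-dependent inversion rule can bias the pooled ROR upward, the following minimal simulation sketch may help. It is not the authors' analysis: it assumes normally distributed log-odds-ratio errors with a common, arbitrary standard error, no true difference between RCD studies and trials, takes the ROR as the trial odds ratio divided by the RCD odds ratio, and uses a simple mean of log RORs in place of a random-effects meta-analysis; all parameter values below are illustrative assumptions.

import numpy as np

# Minimal simulation of the selective-inversion bias (illustrative assumptions only).
rng = np.random.default_rng(0)

n_questions = 16   # clinical questions per meta-analysis, as in the abstract
n_sims = 20_000    # number of simulated meta-analyses
se = 0.2           # assumed standard error of each estimated log odds ratio
tau = 0.3          # assumed spread of the true log odds ratios across questions

ror_selective = np.empty(n_sims)
ror_fixed = np.empty(n_sims)

for i in range(n_sims):
    # Both designs estimate the same true effect: no systematic difference.
    true_log_or = rng.normal(0.0, tau, n_questions)
    log_or_rcd = true_log_or + rng.normal(0.0, se, n_questions)
    log_or_trial = true_log_or + rng.normal(0.0, se, n_questions)

    # Selective inversion: flip each comparison so the RCD estimate is protective (OR < 1).
    flip = np.where(log_or_rcd > 0.0, -1.0, 1.0)
    log_ror_selective = flip * (log_or_trial - log_or_rcd)

    # Pre-specified direction: no data-dependent flipping.
    log_ror_fixed = log_or_trial - log_or_rcd

    # A simple mean of log RORs stands in for the random-effects pooling.
    ror_selective[i] = np.exp(log_ror_selective.mean())
    ror_fixed[i] = np.exp(log_ror_fixed.mean())

print(f"mean pooled ROR, selective inversion: {ror_selective.mean():.2f}")  # noticeably above 1
print(f"mean pooled ROR, fixed direction:     {ror_fixed.mean():.2f}")      # approximately 1

Under these assumptions, flipping each comparison so that the RCD estimate is protective produces a pooled ROR noticeably above 1 even though both designs estimate identical effects, whereas fixing the direction of comparison in advance yields a pooled ROR close to 1.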

Citation (APA)

Franklin, J. M., Dejene, S., Huybrechts, K. F., Wang, S. V., Kulldorff, M., & Rothman, K. J. (2017). A bias in the evaluation of bias comparing randomized trials with nonexperimental studies. Epidemiologic Methods, 6(1). https://doi.org/10.1515/em-2016-0018
