Ann Emerg Med. 1998 Sep;32(3 Pt 1):310-7. doi: 10.1016/s0196-0644(98)70006-x.

Who reviews the reviewers? Feasibility of using a fictitious manuscript to evaluate peer reviewer performance

W G Baxt et al. Ann Emerg Med. 1998 Sep.

Abstract

Study objective: To determine whether a fictitious manuscript into which purposeful errors were placed could be used as an instrument to evaluate peer reviewer performance.

Methods: To develop an approach for analyzing peer reviewer performance, an instrument for reviewer evaluation was created in the form of a fictitious manuscript into which deliberate errors were placed. The manuscript described a double-blind, placebo-controlled study purportedly demonstrating that intravenous propranolol reduced the pain of acute migraine headache. Ten major and 13 minor errors were placed in the manuscript. The work was distributed for review to all reviewers of Annals of Emergency Medicine.

Results: The manuscript was sent to 262 reviewers; 203 (78%) reviews were returned. One hundred ninety-nine reviewers recommended a disposition for the manuscript: 15 recommended acceptance, 117 rejection, and 67 revision. The 15 who recommended acceptance identified 17.3% (95% confidence interval [CI] 11.3% to 23.4%) of the major and 11.8% (CI 7.3% to 16.3%) of the minor errors. The 117 who recommended rejection identified 39.1% (CI 36.3% to 41.9%) of the major and 25.2% (CI 23.0% to 27.4%) of the minor errors. The 67 who recommended revision identified 29.6% (CI 26.1% to 33.1%) of the major and 22.0% (CI 19.3% to 24.8%) of the minor errors. The number of errors identified differed significantly across recommended dispositions. Sixty-eight percent of the reviewers did not realize that the conclusions of the work were not supported by the results.
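The abstract does not state how the 95% confidence intervals were computed. As an illustrative assumption only, and not the authors' stated method, a conventional large-sample interval around a group's mean detection percentage would take the form

    CI_95% = p_bar ± 1.96 × (s / sqrt(n))

where p_bar is the mean percentage of errors identified by reviewers in a given disposition group, s is the standard deviation of that percentage across the group, and n is the number of reviewers in the group (for example, n = 15 for the acceptance group).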

Conclusion: These data suggest that the use of a fictitious manuscript into which purposeful errors are placed may be a viable approach to evaluating reviewer performance. Peer reviewers in this study failed to identify two thirds of the major errors in such a manuscript.

Comment in

  • Frumkin K. On reviewing the reviewers. Ann Emerg Med. 1999 Mar;33(3):356. doi: 10.1016/s0196-0644(99)70379-3. PMID: 10036387. No abstract available.
