June 2011

No study’s perfect: a cross-disciplinary analysis of published errata

Mistakes in research are inevitable, and publishing corrections is vital to the integrity of the literature. Because errata rarely require a retraction, they are often considered a lesser concern. This perception may be mistaken, however, because the actual prevalence, nature and impact of errors across disciplines are unknown. Indeed, while several large studies have examined retractions, existing studies of errata are small, limited in scope and heterogeneous in methods and aims.

We will conduct the first large quantitative analysis of errata published across all disciplines. Errata will be retrieved and sampled from the more than 11,000 journals listed in the Essential Science Indicators database, which classifies journals into 22 broad disciplines. By combining quantitative and qualitative analyses, we will produce accurate data on the frequency of corrections issued in the various disciplines over the years, the types of errors that are most common and the impact of such corrections, and we will identify the characteristics of studies and journals most strongly associated with the publication of a correction.

These results will help answer important questions on the integrity of the literature and its preservation. They will point out strengths and weaknesses in the current publication system, and will draw attention to areas that might need improvement, hopefully stimulating new approaches to ensuring best editorial and research practices.

The project will be conducted by Dr Daniele Fanelli of the University of Edinburgh, with the help of a research assistant.

June 2010

What instructions and guidance do journals provide to their reviewers to assess submitted manuscripts? A survey with particular emphasis on the use of reporting guidelines

The project aims to survey journals’ instructions to reviewers of submitted manuscripts. The study will summarise whether and how journals use reporting guidelines in the peer review process, and will explore how effective editors have found reporting guidelines to be in improving manuscript quality.

The survey will provide an indication of the degree to which reporting guidelines are currently formally used in the peer review process. The study also aims to identify examples of good practice that may inform recommendations for consideration by other journals. If simple processes that journals have found helpful can be identified, more journals may consider adopting them, which could improve the quality of submissions and, ultimately, ease the role of peer reviewers.

This project will be undertaken by Allison Hirst, Research Fellow at the EQUATOR Network, with EQUATOR Steering Group colleagues Professor Doug Altman, Dr Iveta Simera, Dr David Moher, Dr John Hoey and Dr Kenneth F. Schulz. The EQUATOR Network is an international initiative set up to advance high-quality reporting of health research studies; it promotes good reporting practices, including the wider implementation of reporting guidelines.

The following publication has arisen from this project: Hirst A, Altman DG (2012). Are peer reviewers encouraged to use reporting guidelines? A survey of 116 health research journals. PLoS ONE 7(4): e35621. doi:10.1371/journal.pone.0035621.