In the news: April 2018 Digest
Included in the news this month are articles related to April's theme of "Complaints and appeals", together with authorship, code of ethics, retractions and preprints.
COMPLAINTS AND APPEALS
Institutional research misconduct reports need more credibility
A checklist is proposed for institutions to use when investigating allegations of misconduct by their faculty. Such investigations are inherently difficult because of potential self-interest on the part of the institution. Using the checklist to guide such reviews should help promote fairness, transparency and accountability
Keeping science honest
Two whistleblowers describe the personal impetus for calling out witnessed fraud, the toll it took on them and their recommendations for a standardised institutional process for investigations of these problems
Bioscience-scale automated detection of figure element re-use
Three researchers have developed a scalable automated system to detect image manipulation. After testing it on over 2 million figures from online biomedical journals, they had a three-person panel review the findings flagged as re-use. The study concluded that, in this data set, about 0.6% of articles would be considered fraudulent. The authors warn, however, that there are some legitimate reasons for re-use and that for now, even with the potential for scaled image evaluation, it is important to "tread carefully" when examining image re-use or manipulation
Harvard developing software to spot misused images in science
With additional funding from Elsevier, Harvard scientists are developing a machine-learning program to detect misused images. The program is open source and will be available for others to contribute to. The motive for image misuse is likely image or data mismanagement, rather than the rare case of fraud
Transparency in authors’ contributions and responsibilities to promote integrity in scientific publication
“The notion of authorship implies both credit and accountability”. Actions that denote authorship vary across disciplines, within disciplines, in different journals and in different research groups. These authors propose an actionable plan for developing journal policies that remove such ambiguity. While built on the current ICMJE statement on authorship, it offers language that makes it more generalizable. They recommend that journals define the role of the corresponding author and require the use of ORCID identifiers and the Contributor Roles Taxonomy (CRediT). The statement also includes recommendations for research institutions and societies
The National Academy of Sciences (NAS) maintains a website listing journals that have adopted some or all of these recommendations (http://www.nasonline.org/about-nas/Transparency_Author_Contributions.html)
CODE OF ETHICS
Code of ethics for researchers
The World Economic Forum Young Scientist Community has published a universal code of ethics for researchers. Key points include the ethical responsibilities to: engage with the public, pursue the truth, minimize harm, engage with decision makers, support diversity, be a mentor and be accountable
The full Code is available at http://widgets.weforum.org/coe/#principles
A new publishing approach—retract and replace—is having growing pains
Retract and replace is a designation proposed to allow a journal to correct the scientific record when an honest, or innocent, error needs to be corrected. However, the process is not well implemented: confusion results from retention of the same DOI across the retracted, corrected and replaced versions. As a result, the National Library of Medicine treats these as “retracted” articles
Want to tell if a paper has been retracted? Good luck
Researcher Caitlin Bakker was interviewed about a study she co-authored with Amy Riegelman. The authors looked at 144 papers that Retraction Watch had listed as retracted and found that only 10 were so identified across all platforms. None met all of the COPE recommendations
Preprints and citations: should non-peer reviewed material be included in article references?
In the biomedical field, the use of preprints raises concerns because the findings may bear on public health and treatment plans. How should preprints be cited, if at all? One alternative is to treat anything not peer reviewed as “personal communication”, although reputable preprint servers should point the reader to the most recent version of the paper. The NIH recommends a standard bibliography format that includes the DOI, the object type (preprint, protocol), the document version and the date of citation. Other resources recommend including a description of the source (preprint, blog, website) in the reference list. Alternatively, and with greater effort, non-peer-reviewed resources could be listed in a separate bibliography. As a whole, the publishing world needs to catch up with preprints as they expand beyond physics and the physical sciences
Comparing published scientific journal articles to their pre-print versions
The text contents of scientific papers that were originally posted on preprint servers varied little from their final published versions, suggesting the commercial publishers’ added value is limited
Peer review fails to prevent publication of paper with unsupported claims about peer review
A recent publication was critiqued vigorously by Vines: “It’s perhaps ironic that a paper finding no value in peer review is so flawed that its conclusions are untenable, yet its publication in a journal is itself an indictment of peer review”. Vines notes three immediate problems: (i) the authors didn’t test the hypothesis that “more vigorous peer review = more text changes”, so they cannot conclude that journal peer review is useless; (ii) many posted preprints may have been peer reviewed and rejected before being posted to a preprint server, perhaps after the authors had responded to the original reviews; and (iii) the peer review process may have prevented flawed articles from appearing at all, whether in journals or on preprint servers
Opinion: Is science really facing a reproducibility crisis, and do we need it to?
The author argues, and provides evidence, that questionable research practices result in a non-negligible but small number of published papers, with minimal impact on the quality of the published scientific literature. She argues that variation in the frequency of such practices across and within specialties should prevent broad strokes that condemn the quality of work in any one subject area. Furthermore, she argues that the “science in crisis” narrative is non-productive at best, and at worst may be deterring talented individuals from entering research careers
FBI indicts 9 Iranians in a massive scheme to target academic credentials and steal content
The FBI indicted nine Iranian citizens for illegally pirating intellectual content from academic institutions around the world with the intent to redistribute it. The author draws a parallel with Sci-Hub and speculates that it may be involved, although it is not named in the indictment.
Authors of premier medical textbook didn’t disclose $11 million in industry payments
Researchers led by Brian Piper evaluated financial disclosures, or more importantly the lack thereof, by authors of a very prestigious internal medicine textbook. Their work uses this textbook as an example of the differences between textbooks and journals with respect to declaring competing or conflicting interests. Between 2009 and 2013, the textbook’s authors received more than $11 million from makers of drugs and medical devices
AuthorAID launches new research collaboration to catalyse international research partnerships
INASP’s AuthorAID website has several functions, mostly aimed at helping researchers in the Global South connect with others for assistance with writing, developing research ideas, and other skills necessary for a productive academic life, and not just in the sciences. This new initiative provides a “Research Collaboration Space” to connect investigators
Data management made simple
A data management plan is described. Although it sounds self-evident by name, it covers data management both before and after a project and includes details such as naming conventions for files; plans for creating, sharing and archiving research data; and the software and storage devices to be used. Practical steps for accomplishing these required tasks are described
Read this month's COPE Digest newsletter for advice and resources to support your complaints and appeals policies and procedures, a new preprints discussion document, the case of the month, details of our Australia Seminar, the upcoming COPE Forum and more events.
Read April 2018 Digest: Complaints and Appeals