In the news: August


Research Ethics

The University of Central Lancashire joins the University of Cape Town in adopting a code of conduct to stamp out "ethics dumping", in which international researchers exploit the lack of awareness or regulation in some poorer countries to conduct work there. Examples include high-risk medical research studies in which participants' cultural requirements were not respected and participants were not compensated for harms.

Journal Management

Karen Stanwood, Editorial Director, Health Care Books and Journals at SLACK, describes the process and lessons learned when SLACK journals switched from one journal management program to another.


Elisabeth Bik, who reports having flagged over 1,000 papers for possible misconduct, provides a "how to" guide for others, including a template letter. She advises sticking to the facts, avoiding accusatory language, and weighing the potential downsides if you are identified by the researchers or others. She also describes different scenarios for raising concerns about published work: single papers, groups of papers, and papers from one's own institution.

Peer Review

IOP Publishing, which publishes journals in the physical sciences "and beyond", has partnered with Publons and ScholarOne to offer a scalable, transparent peer review workflow on a cross-publisher platform.

Predatory Science

Ruairi J Mackenzie writes about the experience of attending a predatory conference on a press pass. The article describes the similarities between these conferences and predatory publications, with shared hallmarks of disorganization, heterogeneity in topics, missing speakers, and rapid peer review for presentations. Both prey on young investigators who are under pressure to publish (or present) or perish. For Mackenzie, this amounts to fraud.


Researchers who fail to disclose all the relevant data, or who "spin" the data during presentations, commit scientific misconduct that is harder to detect or deal with, according to Terrence Stephenson in this op-ed in the BMJ.

Faculty promotion and tenure documents at some institutions seem to dissuade faculty from publishing in open access journals. These authors surveyed faculty for their perceptions of open access and found that, of the 234 respondents, about 75% understood that open access as a concept relates to readers' ability to access the content. The authors were somewhat discouraged that about 30% raised concerns about the cost to publish, low quality, and poor reputation of OA journals. They question how researchers who wish to publish in OA journals to make their work accessible will navigate what they seem to view as faulty promotion and tenure attitudes about OA publication.

Predatory Science

Simon Linacre of Cabells, the group that now publishes the list of predatory journals, myth-busts some perceptions of the list.

In an op-ed in the European Journal of Clinical Investigation, John PA Ioannidis (Editor in Chief) and Brett D Thombs describe ways that the Journal Impact Factor can be "gamed", and how to use some of the other metrics that Clarivate publishes to inform better (albeit still imperfect) decisions about where authors should submit their papers. Their conclusion is: "Given that JIF is so well‐documented to be flawed, JCR should stop reporting it and replace it by the more appropriate median citations per article, median citations per review and median citations per other type of article, also excluding journal self‐citations."


A Russian company brokers authorship of papers that have already been accepted but not yet published. The presumably legitimate authors submit a listing of their paper to the company, which offers additional authorship slots for a fee (more for a first-author slot than for middle-author slots). The authors then ask the editorial office to add an author who was "forgotten" on initial submission.

Two methods that European countries use for research funding decisions are compared. In the UK, best practice is considered to be panel evaluation and peer review, as the Research Excellence Framework developers did not feel that research quality can be assessed using quantitative indicators alone. Other countries instead use quantitative indicators of external funding, productivity, and citation impact within Web of Science. Reasons for this dichotomy, even among countries that accept that peer review has a place, include cost and a desire to separate direct institutional funding decisions from research evaluations.


This is a book review of "Research ethics in the real world" by Helen Kara, an academic textbook intended for an introductory class on research ethics. The author addresses research ethics in the social sciences, draws from non-European/Western paradigms, and distinguishes social science research ethical principles from those guiding medical research.


Jennifer A Byrne is a cancer researcher who is as committed to cleaning up the literature as she is to adding to it. She began her crusade after noting irregularities in the use of certain reagents in molecular genetic studies of cancer-related genes and was dismayed at the frequency of this sort of fraud. She has worked with Dr Cyril Labbé to develop an automated tool, Seek & Blastn, to assess nucleotide sequences in published papers. The tool is freely available at

In a May 2019 publication entitled "Data Communities: A New Model for Supporting STEM Data Sharing", Danielle Cooper and Rebecca Springer coined the term "data community" to describe a fluid and informal network of researchers who share and use a certain type of data (with the data community being distinct from a discipline, such as "genetics"). In a follow-up posting, Springer interviews investigator Dr Vance Lemmon, whose lab studies axon regeneration following injuries to the spinal cord. Dr Lemmon aims to increase research efficiency, reproducibility, and innovation.


COPE Council member Nancy Chescheir