
In the news: June Digest


Peer review

Peer review is the hallmark of scientific and scholarly publications. These authors explore five principles of good peer review and suggest ways journals can use them to underpin best practice.

In conjunction with the publication of this description of good peer review, Wiley has developed a Better Peer Review Self-Assessment tool to give journals and editors feedback on their peer review environment and practices. This work was discussed at a panel at the 6th World Congress on Research Integrity in Hong Kong in early June.

Post-publication peer review of PLOS articles is unusual, occurring for only 7.4% of papers, and it has been declining. About one-third of such comments are procedural (like copy-editing comments), and of the remainder, only about half discuss the content of the paper. The authors note that if post-publication peer review is to become more prevalent and more focused on content, academic culture will have to change.

Open access

Trying to understand the shifting publication landscape is difficult, important and sometimes full of surprises. Witness the identification of 152 previously open access journals that have "reverse flipped" to a closed-access model. The authors suggest some possible explanations for this countervailing phenomenon.

Two Scholarly Kitchen posts discuss "Transformative Agreements" as that term relates to agreements between publishers and libraries or library consortia. Hinchliffe gives this overarching definition: "a contract is a transformative agreement if it seeks to shift the contracted payment from a library or group of libraries to a publisher away from subscription-based reading and towards open access publishing" and then describes several different types.

A month later, she further explores how such transformative agreements are disrupting library consortia due to conflicts over how the costs of these agreements are distributed among their members.

On May 24, 2019, a conference on "Critical issues in open access and scholarly communications", targeting the publication of books, was held at Goldsmiths, University of London. Using the hashtag #OASC19, interested Twitter users can follow the live tweets from this meeting.

cOAlition S has released the revised Plan S document, which extends implementation to January 1, 2021. They have not changed the language about making the "necessary transition to full and immediate Open Access".

Data sharing

In order to simplify research data policies across journals and avoid confusion among researchers and journal staff, the Data Policy Standardization and Implementation Interest Group of the Research Data Alliance has proposed an evidence-based, 14-component framework for journal research data policies.

In further support of open science within the geosciences, more than 100 repositories, communities, societies, institutions, infrastructures, individuals and publishers have signed on to the Enabling FAIR Data Project's commitment to make research data "findable, accessible, interoperable and reusable" (FAIR). This group calls on the "entire scientific community" to implement these practices.

Among ~1000 Japanese researchers surveyed, 95% have shared their data. 75% were motivated to do so because they consider the discoverability of their data important; 50% wanted to advance their field of research, and 42% cited transparency. This infographic presents further details about how data are shared.

Research metrics

More than two dozen professionals from across the scholarly ecosystem met to explore a more nuanced set of indicators of a journal's performance across all of its functions. New indicators should be justified, contextualised, informed and responsible. Those interested should contact the organisers to participate in a second workshop in 2020. Unfortunately, after spending about 30 minutes trying to figure out how to do so, I can't provide a contact method.

The Hong Kong Manifesto for Assessing Researchers: Fostering Research Integrity argues cogently for moving past counting the number of publications and the Impact Factor of the journals they are published in as a measure of the quality of a researcher's output. The Manifesto proposes six criteria, providing the context, evidence and implementation process for each:
1. Contribution to societal needs
2. Responsible indicators that broadly reflect the contribution to the research enterprise
3. Research should be completely and transparently reported
4. Open research culture should be rewarded
5. Recognize and reward a broad range of research, such as innovation, replication, synthesis and meta-research
6. Include a range of contributions to advancing research, such as peer review.

Oh my: hyphens in titles impede citation counts, as studied in Scopus and Web of Science! Among the papers studied, there was a strong, statistically significant negative correlation between the Journal Impact Factor and the percentage of hyphenated paper titles published in the IEEE Transactions on Software Engineering.

The European University Association and Science Europe published a joint statement committing to work together on research assessment approaches that balance quantitative methods (counting publications, for instance) with qualitative ones that evaluate the merits of the work, and to recognize that research outputs are quite varied. They are also committing to finding better ways to reward and incentivise research quality. This sounds similar to the Hong Kong Manifesto goals.

The debate about the impact of social science has seemingly smouldered along for several years. A white paper from a working group to assess the metrics for social science impact and the concept of measuring this impact at all has been published.

Preprints


One large publishing group has clarified its stance on preprint services across its journals: "All Springer Nature journals will adopt a unified policy that encourages preprint sharing and provides further details on preprint licensing, citation and communications with the media."

medRxiv launched in May as a preprint server for clinical studies. It is the result of a partnership between BMJ, Yale University and Cold Spring Harbor Laboratory. Several steps are in place to minimise the risk of harm from publishing a clinical paper prior to peer review.

Citations of articles posted on preprint servers and later published in traditional journals sometimes go to the version on the preprint server. This author argues that this harms journals, by "stealing" citations and depriving them of public recognition for the work they do, and harms authors, who may have difficulty aggregating the public recognition of their work for career advancement. He gives examples from work first posted on preprint servers, but also from work that is on ResearchGate, with citations going to ResearchGate!

Retractions


A study of retractions of papers in chemistry and materials science takes a deep dive into various aspects of retractions in these fields, including timing, reasons, rates, and details of retraction notices. The author self-promoted with a nice tweet featuring a graphic from the paper and a link to the publication.

Interesting case studies of flawed scientific publishing are presented, including self-citation, assumed poor English, fraudulent peer review and authorship issues: a brief stroll through publication misconduct.

Research integrity

Research integrity includes, but is not limited to, assessment of possible research misconduct. This author argues that it is about creating systems to boost the quality, relevance and reliability of all research. She calls for researchers to recognize that there are ways to improve their work and that, just as in clinical medicine, the tools of quality improvement may be useful.

By keeping concerns about research misconduct private, institutions allow bad behaviour to be perpetuated. These authors suggest that misconduct investigations can be improved by using a checklist to strengthen investigations, using external peer review of investigatory reports, and then publishing the reports.

The mnemonic TRAGEDIES is described to capture nine ways research misconduct occurs. More importantly for this reader, the authors offer mechanisms and advice to assess the vibrancy and integrity of a research unit, and they encourage research ethics education for trainees.

According to a study of 80 cases investigated by the US Office of Research Integrity, the first author of a paper is 38% more likely to be responsible for identified misconduct than the other authors. Should he or she be held accountable? All of the authors? Just the senior author? Arguments for these different positions are presented here, illustrating that the answer isn't clear.

Supporting researchers

The Embassy of Good Science is a global platform to promote research integrity, focusing on researchers' daily practice. The developers hope that the Embassy will be the "go to" place for researchers to discuss hot topics, share knowledge and find guidance.

An analysis of gender and academic rank at universities in the UK showed that women make up 45% of all academic staff but only 20% of those with the rank of professor. In multivariable analysis, being a woman remained negatively associated with professor status (except where the timing of having children was decided with career considerations in mind), as was the percentage of time spent on teaching or related activities. Having children under the age of 18 years had a positive association, thought to be related to deferring childbearing until a certain rank had been achieved.

Nominations are open for the John Maddox Prize for Standing up for Science, awarded to an individual for public activity in one of the following areas: addressing misleading information about social or medical science; bringing sound evidence to bear in a public or policy debate; helping people make sense of complex scientific issues. The award is sponsored by Sense about Science.

The CWTS Leiden Ranking ranks universities using independent indicators rather than a composite score. In this year's edition, the Leiden Ranking has added measures of achievement in two areas of social justice: gender parity and open access. For gender parity, for instance, the rankings include the percentage of authors (of papers for which author gender could be assigned) who are male or female.

COPE Council member Nancy Chescheir


Read the June 2019 Digest newsletter with an introduction from Deborah Poff, COPE Chair, introducing some of the issues around predatory publishing raised at meetings and conferences attended in 2019 and in developing the third iteration of our Predatory Publishing Discussion Document. Read a summary from panel members on the Authorship and Transparency discussions at the World Conference on Research Integrity (#WCRI2019). The new and updated cases presented at the COPE Forum in May are now available online and include issues on authorship, dual submission, and post-publication corrections. Keep abreast of news & events in #PublicationEthics as collated by COPE Council members.