"Manels" and "manferences": two hashtag words describing panels or entire conferences with all-male or heavily male-skewed line-ups, used for about a decade on social media to call out these situations. A Nature analysis of conference panels in many fields shows that some have moved towards gender parity among invited speakers, but the tendency to slip backward after significant strides is real. The authors describe various societies' attempts to set gender quotas among invited speakers and some of the difficulties associated with those quotas, especially when quotas related to geographic diversity already exist. They also note the importance of allies: prominent men who are commonly invited to speak and who decline to participate in a #manel. Inclusion is important--but so is integration. Having women speakers as tokens to check a box does not really advance the goal of involving multiple viewpoints and areas of expertise in research meetings. As Angeline Pendergrass, an atmospheric scientist at the US National Center for Atmospheric Research in Boulder, Colorado, is quoted in the paper, it is not enough to invite women; you should "listen to what they say".
In 2013, PeerJ opened PeerJ Preprints only two months after launching its journal offering pre-publication peer review. In September 2019, PeerJ Preprints stopped accepting submissions, although the site will continue to operate and all existing content will remain available. The stated reasons for stopping are to concentrate on PeerJ's peer-reviewed products, the editorial team being "secure in the knowledge that having helped lead this approach", and the fact that many established preprint services now operate in the same content space. It is an interesting commentary on what continues to be an evolving story of change in the publishing world.
Ironically, in September, N. Penfold and J. Polka published a preprint on PeerJ Preprints about preprints in the biosciences. It is a thorough review of the technical and social issues influencing the adoption of preprints in the life sciences. They end their paper with something upon which we can all agree: "we should strive to ensure that research integrity is rewarded, discovery is accelerated, and the publication process is more inclusive and equitable."
Kai Kupferschmidt's "Tide of Lies" in Science recounts the stories of Yoshihiro Sato, Alison Avenell and Mark Bolland, spanning the globe from Japan to Scotland to New Zealand. Sato was a bone researcher in Japan whose prolific research output, with suspicious results, prompted Avenell, Bolland and others to investigate his work. Their findings, and their years-long pursuit of correcting the published record, have resulted to date in the retraction of 21 of 33 reports of Sato's clinical trials. But this is not just another report of research misconduct. Kupferschmidt illuminates the human toll on Avenell and Bolland in their pursuit of the truth, the collateral pain inflicted on a Sato co-author whose role in the misconduct is unclear, and ultimately what is portrayed as the likely suicide of Sato three years after the investigation began.
The recent COPE study of the concerns of editors who publish in the arts, social sciences and humanities was highlighted in a blog post entitled "From the Mouths of Editors" by Colleen Flaherty in "Inside Higher Ed". It is a quick summary to whet your appetite for this important research--find the full paper here:
From May 2018 to July 2019, the Chinese central government acted steadily to address identified research integrity problems. In May 2018, the Communist Party of China and the State Council directed the Ministry of Science and Technology to establish a body for dealing with research misconduct, including systems for investigating natural science researchers and protecting whistleblowers. This was followed in November 2018 by a list of punitive measures to address misconduct, released by 40 Chinese agencies. The government released guidelines for assessing scientists in June 2019, tilting the evaluation process from the number of papers published towards their importance. A month later, it released standards defining various forms of misconduct, such as plagiarism, fabrication, falsification, authorship abuses and duplicate or overlapping submissions.
Interviews with 21 journal editors from North America, across multiple disciplines, about text recycling (also known as "self-plagiarism") show a wide range of attitudes towards the practice, from tolerance in some circumstances to absolute intolerance. The authors surmise in the abstract that the editors at times have trouble aligning their beliefs and practices.
Elsevier's Jeroen Baas and Catriona Fennell, both analytics experts, found that about 1% of 55,000 reviewers in the publishing house's portfolio of journals have an unusually high rate of their own work being cited in papers they reviewed. This raises concerns about "coercive citation", and Elsevier is now working with journal editors to determine whether the citations were relevant to the articles.
Review Commons will launch in December 2019. Under its model, authors can post their work prior to submission to a traditional journal, where it will undergo "high-quality and in-depth peer review" based on the technical rigour of the work. The reviews can then be transferred, via bioRxiv, to a journal of the participating publishers, whose editors have committed to using the peer review already performed in their editorial decisions, with minimal additional expert review. One can learn more at their website: www.reviewcommons.org
The editorial team, Molly Cranston and Jonathan Threlfall, at F1000 report in an editorial blog that their team undertook the COPE audit to assess compliance with the COPE Core Practices and then audited their compliance with TOP (Transparency and Openness Promotion) guidelines provided by the Center for Open Science (COS). They report changes in their editorial practices, especially with respect to image manipulation, that resulted from these two audits.
The instructions for authors of 835 journals across disciplines were assessed, using machine-assisted techniques, for the inclusion of information on 19 topics relating to transparency in reporting and research integrity. Of the 19 topics, only conflicts of interest and peer review type were mentioned in more than half of the instructions for authors. Plagiarism was mentioned in 46%. The other 16 topics (such as data sharing, image manipulation and ethics review) were mentioned in 0-31% of instructions. The authors conclude that instructions for authors leave "much to be desired" in addressing topics of research integrity and transparency.
The future of research
The director of Wellcome, Jeremy Farrar, invites researchers and those who support research to complete a survey to help envision the future of research. The culture of research has helped to produce stunning discoveries, but at a cost: toxic cultures, hyper-competitiveness and poor leadership behaviour. The survey will help Wellcome develop a vision for the way forward. "Research is a collective endeavour, the culture is shaped and owned by all of us."
The Research on Research Institute is a consortium of funders, academic institutions and technologists from several countries that will analyze research systems, experimenting with decision and evaluation data, tools and frameworks in research. Wellcome, Digital Science, UKRI, the University of Sheffield and Universiteit Leiden will work together to help make research more strategic, open, diverse and inclusive. Their launch included the release of research funding landscape tools.
Open science and data sharing
Erik M. van Raaij draws attention to inappropriate reuse of empirical data in Operations and Supply Chain Management (OSCM) research, explains its implications, and suggests ways to promote research quality and integrity, providing recommendations on data reuse for authors, reviewers and editors. OSCM scholars are under high pressure to demonstrate research productivity, often measured by the number of journal articles published. One possible response to such pressure is to improve research efficiency: publishing more journal articles from each data collection effort. This is acceptable as long as each publication makes a sufficient contribution and authors ensure transparency in methods and consistency across publications.
Not surprisingly, given the expanding mandates from both funders and publishers regarding data sharing, the business models for providing data-sharing platforms, sometimes integrated into research and publishing workflows, are quite varied, and competition in this landscape is heating up. Rebecca Springer and Roger Schonfeld explore this in a Scholarly Kitchen post.
npj Urban Sustainability launched a pilot program in September 2019 that requires authors of accepted primary research papers to work with a Research Data Editor, who provides enhanced support to describe, share and link to the research data underpinning papers published in the journal. The goal is to promote transparency and reuse of research. Outputs from the pilot will include data sets deposited by authors in appropriate repositories, clear data availability statements, and metadata deposited in the journal's figshare repository.
The US National Science Foundation has awarded researchers at Texas Tech University a grant to develop a training program for researchers to learn to identify and avoid predatory publishers. The goal is for the program to be free and online.
Lisa Janicke Hinchliffe posted a provocative blog post on the Scholarly Kitchen exploring potential institutional responses to the Federal Trade Commission's ruling against the publishing group OMICS for "making deceptive claims regarding their academic journals and conferences". She considers the different ways this ruling may affect institutions that either fund or employ researchers.
Reproducibility in science
The US National Academies of Sciences, Engineering, and Medicine has published a Consensus Study Report entitled "Reproducibility and Replicability in Science". The PDF is downloadable for free at https://www.nap.edu/catalog/25303/reproducibility-and-replicability-in-science. In it, the authors make recommendations to stakeholders in research, including funders, researchers, academic institutions and journals, to improve reproducibility and replicability in science.