COPE Research Grant
Apply for a COPE research grant
COPE has funded the following projects:
Environmental scan of repositories of clinical research data: how far have we got with public disclosure of trial data?
There is growing interest in sharing research data, as data sharing is expected to accelerate research and increase accountability. Sharing the data underlying published research is an increasingly important consideration for peer-reviewed journals, and some journals require researchers to state their data sharing policy in published articles. However, the field of clinical trials has been slow to adopt a culture of data transparency.
Data repositories are vital for achieving broad data sharing, but in clinical research there is no widely used repository for raw data that covers a variety of medical disciplines. Furthermore, to our knowledge there is no broad agreement on the standards, best practices and essential features of clinical data repositories.
Dr Karmela Krleza-Jeric and BioMed Central will be seeking to address these gaps in knowledge. The project aims to produce comprehensive information on the features and practices of data repositories with interests in clinical data disclosure.
The methodology of the study will include a review of existing resources that catalogue information on data repositories (such as Databib), a literature review, an analysis of repository websites, and engagement with relevant stakeholders, such as interviews with repository managers.
We will aim to capture existing repositories' methods for the public disclosure of clinical data and for non-public forms of data sharing, including the unique and persistent identification systems used for datasets; the data licences, use agreements or other agreements employed by the repositories; and their sustainability (business) models. We aim to understand how repositories have addressed these issues and to summarize what are considered good practices.
The results and findings of this study will be made publicly available and used to inform a subsequent study that aims to develop a methodology for sharing data from clinical trials, as well as standards and guidelines for data repositories involved in the public disclosure of participant-level datasets from clinical trials.
We hope the study outcomes will encourage collaboration between repositories on areas of common interest and foster collaboration between journals, publishers and data repositories to help enhance the reliability and connectedness of the scientific literature.
Characterisation of trials published in medical journals to determine whether there are specific characteristics of trials that are designed primarily for the purpose of marketing and, if identified, what the prevalence and distribution of such trials are in the medical literature
There is increasing anecdotal evidence of publications describing trials that appear to serve marketing purposes rather than address a genuine clinical need. The publication of such trials has the potential to distort the medical literature. However, there has been no systematic attempt to characterise such trials. We will sample trial publications in the medical literature and attempt to define the characteristics of trials that appear to be primarily marketing-driven. If we can define these trials, we will then investigate their prevalence and assess where they are published.
All trials are marketing exercises to a degree; our purpose is to characterise the features of trials that are primarily driven by marketing rather than by a genuine clinical need. Such trials may threaten the integrity of the medical literature and require readers to be very cautious of the findings. Of course, some trials are both scientifically valid and marketing-driven, and even a well designed, well executed study may still need to be interpreted with extreme caution.
This is a collaborative project with the following researchers and editors: Sara Schroter and Fiona Godlee (BMJ), Ginny Barbour (PLoS Medicine), Richard Lehman, Druin Burch, Joe Ross, Carl Heneghan, David Moher and Doug Altman.
No study’s perfect: a cross-disciplinary analysis of published errata
Mistakes in research are inevitable, and publishing corrections is vital for the integrity of the literature. Errors that warrant an erratum rarely require a retraction, and errata are therefore considered a lesser concern. This perception might be wrong, however, because the actual prevalence, nature and impact of errors across disciplines are unknown. Indeed, while several large studies have looked at retractions, existing studies on errata are small, limited in scope and rather different in methods and aims.
We will conduct the first large quantitative analysis of errata published across all disciplines. These will be retrieved and sampled from the over 11,000 journals listed in the Essential Science Indicators database, which classifies journals into 22 broad disciplines. By combining quantitative and qualitative analyses, we will produce accurate data on the frequency of corrections issued in the various disciplines over the years, the types of errors that are most common and the impact of such corrections; we will also identify the study and journal characteristics most strongly associated with the publishing of a correction.
These results will help answer important questions on the integrity of the literature and its preservation. They will point out strengths and weaknesses in the current publication system, and will draw attention to areas that might need improvement, hopefully stimulating new approaches to ensuring best editorial and research practices.
The project will be conducted by Dr Daniele Fanelli of the University of Edinburgh, with the help of a research assistant.
CrossCheck guidance: an analysis of typical cases of plagiarism in different disciplines
Most plagiarism cannot be judged solely from the similarities discovered when using CrossCheck. Drawing on experience of cross-checking more than 2000 manuscripts per year from approximately 50 countries, this project aims to provide 3−5 typical cases of detected plagiarism in each of the three disciplines covered by the Journal of Zhejiang University-SCIENCE A/B/C (http://www.zju.edu.cn/jzus/) (JZUS-A: Applied Physics and Engineering; JZUS-B: Biomedicine and Biotechnology; JZUS-C: Computers and Electronics). The typical plagiarism cases will be compiled into a list or handbook classified by discipline. Editors can use it to learn how to deal with different kinds of plagiarism in different disciplines when using CrossCheck; for authors, the lists will serve as instruction on plagiarism and CrossCheck, from which they can learn how to avoid being accused of plagiarism.
This research project was proposed by Professor Yuehong (Helen) Zhang and Dr Xiaoyan Jia, and will be conducted by them and their team (editors: Hanfeng Lin, Ziyang Zhai, Xinxin Zhang, Meiqing Jin, Chunjie Zhang).
Partial results of this research were presented at the CrossRef 2011 Annual Meeting, USA, 15 November 2011 (download the presentation, PDF 745kb). The purpose of this survey was to investigate journal editors’ use of CrossCheck to detect plagiarism, and their attitudes to potential plagiarism once discovered. The following publication has arisen from this project: Helen Zhang, Xiaoyan Jia (2012). A survey on the use of CrossCheck for detecting plagiarism in journal articles. Learned Publishing 25:292–307 (doi:10.1087/20120408).
Several English papers arising from this project are listed below and can be downloaded from this site http://www.zju.edu.cn/jzus/editorpaper.php:
1. Zhang YH, Jia XY. A survey on the use of CrossCheck for detecting plagiarism in journal articles. Learned Publishing 2012;25(4):292–307. doi:10.1087/20120408
2. Zhang YH, McIntosh I. How to stop plagiarism: blacklist repeat offenders? Nature 2012;481:22. doi:10.1038/481021a
3. Zhang YH, Jia XY, Lin HF, Tan XF. Editorial: Be careful! Avoiding duplication: a case study. Journal of Zhejiang University-SCIENCE B (Biomedicine & Biotechnology) 2013;14(4):355–358. doi:10.1631/jzus.B1300078
4. Jia XY, Tan XF, Zhang YH. Replication of the methods section in biosciences papers: is it plagiarism? Scientometrics 2013. doi:10.1007/s11192-013-1033-5 http://link.springer.com/article/10.1007/s11192-013-1033-5
5. Zhang YH, Jia XY. Republication of conference papers in journals? Learned Publishing 2013;26(3):189–196. doi:10.1087/20130307
6. Zhang XX, Huo ZL, Zhang YH. Detecting and (not) dealing with plagiarism in an engineering paper: beyond CrossCheck—a case study. Science and Engineering Ethics 2013. doi:10.1007/s11948-013-9460-5 http://www.zju.edu.cn/jzus/download/editorpapers/Detectingand.pdf
What instructions and guidance do journals provide to their reviewers to assess submitted manuscripts? A survey with particular emphasis on the use of reporting guidelines
The project aims to survey journals’ instructions to reviewers of submitted manuscripts. The study will summarise whether and how journals use reporting guidelines in the peer review process, and will explore how effective editors have found reporting guidelines in improving manuscript quality.
The survey will provide an indication of the degree to which reporting guidelines are currently formally used in the peer review process. The study also hopes to identify examples of good practice which may inform recommendations for consideration by other journals. If simple processes which journals have found to be helpful can be identified, more journals may consider using them which may help to improve the quality of submissions to journals and, ultimately, ease the role of peer reviewers.
This project will be undertaken by Allison Hirst, Research Fellow at the EQUATOR Network, with EQUATOR Steering Group colleagues Professor Doug Altman, Dr Iveta Simera, Dr David Moher, Dr John Hoey and Dr Kenneth F. Schulz. The EQUATOR Network is an international initiative set up to advance high quality reporting of health research studies; it promotes good reporting practices including the wider implementation of reporting guidelines.
The following publication has arisen from this project: Hirst A, Altman DG (2012). Are peer reviewers encouraged to use reporting guidelines? A survey of 116 health research journals. PLoS ONE 7(4): e35621. doi:10.1371/journal.pone.0035621.
Prevalence and attitudes towards plagiarism in biomedical publishing
Plagiarism is a growing issue in scientific publishing. Information technology has immensely increased the accessibility of source literature, simultaneously making plagiarism easier than ever, but it has also enabled the development of plagiarism detection software tools. To detect and prevent plagiarism, we designed a research project to investigate two issues: the prevalence of plagiarism and attitudes towards plagiarism in the scientific community.
The prevalence of plagiarism in a biomedical journal will be measured using CrossCheck, eTBLAST, and WCopyFind plagiarism detection software. All papers submitted to the Croatian Medical Journal (CMJ, an international peer-reviewed open-access journal; http://www.cmj.hr) in 2009 and 2010 will be checked using plagiarism detection software, which will also search for possible sources and compare them with previously published papers and texts available in e-form on the Internet. All suspicious papers will be carefully analyzed to determine the extent and type of possible plagiarism.
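Tools of this kind typically work by comparing overlapping word sequences between a submitted manuscript and a corpus of previously published text. As a rough illustration only — not the actual algorithm of CrossCheck, eTBLAST or WCopyFind — a minimal word n-gram overlap measure might be sketched in Python as:

```python
def ngrams(text, n=3):
    """Return the set of n-word phrases (shingles) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets (0.0 to 1.0)."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    # Shared phrases divided by all distinct phrases in either document
    return len(a & b) / len(a | b)
```

A high score only flags a pair of texts for manual analysis; as the project design makes clear, judging the extent and type of possible plagiarism remains an editorial task.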
A questionnaire measuring attitudes towards plagiarism will be developed and validated. Attitudes will be measured in two cohorts: corresponding authors of papers submitted to the CMJ in 2009 (approx. 300 authors) and research fellows in biomedical sciences in Croatia (approx. 500 research fellows). By correlating the prevalence of plagiarism with attitudes towards plagiarism in the two study groups, we will attempt to identify cross-cultural differences and the reasoning behind such behavior. A better understanding of plagiarism will contribute to preventing and discouraging the practice among authors. The results will be used to create guidelines for editors on using available software to detect plagiarism in manuscripts before publication.
This is a research project proposed by Lidija Bilic-Zulle and collaborators from the Croatian Medical Journal and the Department of Medical Informatics at the Rijeka University School of Medicine in Croatia.
Authors’ awareness of publication ethics: An international survey
This is a collaborative project between Sara Schroter, Gary Bryan, Elizabeth Loder (BMJ); Jason Roberts (International Society of Managing and Technical Editors), Tim Houle (Wake Forest University, North Carolina), and Don Penzien (University of Mississippi, Jackson, MS).
Considerable time is expended by journals investigating ethical transgressions and misconduct that may be caused by ignorance rather than wilful deceit. Publication ethics are rarely taught. We are conducting an international survey to measure the level of awareness of key ethical issues in publishing amongst a large sample of currently active researchers submitting articles to several BMJ journals. We intend this study to be one of the largest undertaken, covering a variety of ethical issues: authorship, conflicts of interest, access to data, redundant publication, non-publication, dual submission, salami publishing, plagiarism, image manipulation, informed consent, fabrication, and falsification. We will use short vignettes rather than direct questions about awareness, to avoid leading questions. To reduce respondent burden while still covering all the important issues, we will randomise authors to one of a series of groups, which will determine the subset of vignettes they receive. We will use the results generated by the survey to help inform the content of a comprehensive and readily accessible resource for all authors, offering guidance on all aspects of the submission process.
Authorship research project
Although there is no universally agreed definition of authorship, authorship of research publications is a major requirement for academic advancement and a common cause of disputes among colleagues. Most research on authorship issues comes from biomedicine, where the authorship criteria are those of the International Committee of Medical Journal Editors. In order to make informed guidelines on authorship and set directions for future research, it is important to analyze authorship requirements across different research fields, as well as the current state of research into authorship. We are therefore conducting a systematic review of literature in major bibliographical databases from different scientific disciplines. We are also examining the definition of authorship and requirements for contribution disclosure in a sample of journals from different scientific disciplines. Finally, we plan to survey editors on their concepts of authorship across scientific disciplines, and use these findings as a tool for the analysis of policy issues around authorship. This project is the research effort of Ana Marusic and her collaborators from the Croatian Medical Journal and the Croatian Centre for Public Health, University of Split School of Medicine, Split, Croatia.
Literature appraisal for the systematic review of research on authorship is completed and we are currently working on the results synthesis and manuscript preparation. We also collected data on authorship policies in journals from different disciplines, as well as definitions and policies of different professional, academic or research organizations or associations. These results are now being written up in a second manuscript.
The results of this part of the research will be presented at the Second World Conference on Research Integrity in Singapore, 21-24 July 2010, where we plan to start the Delphi process for discussing the definitions of authorship across scientific disciplines.
Research on the authorship/publication practices of editors in their own journal has been completed and the manuscript has been accepted for publication. Details of the published article will follow.
One application received and rejected.
Retractions research project
The COPE Code of Conduct states that editors have a responsibility to ensure the reliability of the scientific record, implying that it will sometimes be necessary to retract an article, but COPE has never developed detailed guidance about this. Anecdotal evidence suggests that editors may be reluctant to retract articles because of concerns about litigation or uncertainty about the correct procedures. We are therefore examining retractions to understand journals’ current practices and any difficulties faced by editors. We are examining all retractions published on Medline in the last 10 years and categorising them according to the reasons for the retraction, who issued the retraction, and so on. We are also doing qualitative research (using semi-structured interviews with journal editors) to learn about their experience of retracting articles: what prompted the action, how editors decided what to do, and any barriers they faced in implementing their plans. We also plan to survey journal editors to gather further information about the retraction process. We plan to use these findings as the basis for developing practical guidelines about when articles should be retracted and how this should be carried out. This project is a collaboration between Liz Wager (freelance publication consultant) and Peter Williams of University College London.
This study has now been published in the Journal of Medical Ethics: ‘Why and how do journals retract articles? An analysis of Medline retractions 1988–2008’ was published online first on 12 April 2011.