Advancing Research Publication Practices Through F1000 Research

F1000 Research, led by Dr. Rebecca Lawrence, offers an alternative to traditional publishing built on immediate publication, open post-publication peer review, and collaborative initiatives around data sharing. The platform aims to accelerate the dissemination of findings while ensuring that the underlying data are shared and of adequate quality. With gold open access and a variety of content formats, F1000 Research aims to be a catalyst for open and impactful scholarly communication.


Presentation Transcript


  1. DATA PUBLISHING: PEER REVIEW, SHARED STANDARDS AND COLLABORATION
     Rebecca Lawrence, PhD, Publisher, F1000 Research
     rebecca.lawrence@f1000.com | http://f1000research.com

  2. WHAT ARE WE GOING TO TALK ABOUT
     - About F1000
     - About F1000 Research
     - Publication process
     - Importance of collaboration
     - Collaborative initiatives and F1000 Research
     - Challenges of data peer review
     - Approaches to data peer review
     - Challenges of post-publication peer review
     - Summary

  3. ABOUT F1000 (F1000.com)
     - Core service: from the founders of BioMed Central and the Current Opinion journals
     - Post-publication peer review by a Faculty of 10,000 experts
     - Faculty identify and evaluate the most important articles in biology and medicine
     - 1,500 new evaluations per month; >120,000 in total so far
     - NEW: F1000 Posters
     - F1000 Research (F1000R): plans announced at the end of January; launch later this year

  4. F1000R: WHAT ARE WE TRYING TO ACHIEVE
     An alternative to current scholarly publishing approaches, tackling four problems:
     - Speed: immediate publication
     - Peer review: open, post-publication peer review
     - Dissemination of findings: wide variety of formats
     - Sharing of primary data: sharing, publication and refereeing of datasets
     Other key features:
     - Gold Open Access, with Creative Commons CC-BY licences as default
     - Large (100+), very senior Advisory Panel (e.g. Sir Tim Hunt, Pippa Marrack, Steven Hyman, Alan Schechter, Janet Thornton)

  5. F1000R: OUR PUBLISHING PROCESS
     Traditional journal (submission to publication: MONTHS): a closed referee process in which the article is submitted, the author revises it, and it is then published; if rejected outright, it may have to go round this cycle multiple times.
     F1000 Research (submission to publication: DAYS): the article is submitted, passes an in-house sanity check, and is published immediately; open refereeing, author revision and user commenting then follow, with the underlying data held in a data repository.

  6. F1000R: AUTHOR INCENTIVE TO MAKE DATA USABLE
     Instead of a traditional article (one DOI) with supplementary data, the work is split into a data article (protocol plus datasets, with its own DOI) and an optional analysis/conclusions article (with its own DOI, possibly in another journal), with the datasets held in existing repositories (Dryad, FigShare etc.).
     Journal policies: journals and publishers that have confirmed that they would not treat publication of datasets with a DOI and associated protocol information as prior publication, if a more standard (analysis/conclusions) article based on the data were subsequently submitted to them: BMC journals, BMJ Group journals, Elsevier journals, The Lancet journals, Nature-titled journals, PLoS journals, RSC journals, SAGE journals, J Clin Invest, J Neurosci, New Engl J Med, Proc Natl Acad Sci, Science. http://f1000research.com/about/
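
To make the article/data split concrete, here is a minimal sketch, in Python, of how such a linked record might look; the DOIs, repository labels and field names are invented for illustration and are not F1000R's actual data model.

```python
# Hypothetical sketch of the data article / analysis article split described above.
# All DOIs, field names and repository labels are invented for illustration.

data_article = {
    "type": "data article",
    "doi": "10.9999/f1000r.data.example",        # made-up DOI
    "protocol": "Description of how the datasets were generated",
    "datasets": [
        {"repository": "Dryad", "doi": "10.9999/dryad.example"},
        {"repository": "FigShare", "doi": "10.9999/figshare.example"},
    ],
}

analysis_article = {
    "type": "analysis/conclusions article",       # optional; could be in another journal
    "doi": "10.9999/journal.analysis.example",    # made-up DOI
    "based_on": data_article["doi"],              # cites the data article rather than duplicating it
}

print(analysis_article["based_on"])               # -> 10.9999/f1000r.data.example
```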

  7. DATA PUBLICATION: MANY OUTSTANDING ISSUES
     Numerous outstanding issues need to be addressed:
     - Providing benefits even if someone else makes an important discovery from the data (data co-authorship)
     - The effort and time required to sort out the data and dig out the necessary metadata
     - Lack of formal recognition of data as a valuable output
     - Technical issues: formats, interoperability, mining tools
     - Where to store the data, how much to store, and for how long
     Numerous stakeholders are involved: funders, data centres, researchers, institutions (data management, administrators), publishers, learned societies.

  8. DATA PUBLICATION: IMPORTANCE OF COLLABORATION
     Some progress has been made:
     - Growing recognition of the value of data sharing/publication from all stakeholders
     - Each stakeholder group has made its own advancements
     But these issues will not be solved by working alone: we need to look at the whole ecosystem.
     Key areas that particularly require stakeholder collaboration (including cross-publisher):
     - Workflows involved in the data publication process: cross-linking between journals and data repositories; minimising replication of effort by authors
     - Format issues
     - Data repository accreditation
     - Peer review of datasets

  9. COLLABORATIVE INITIATIVES AND THE F1000R DATA ARTICLE
     F1000R is working with other publishers (an STM Data Group is planned) and many members of all the stakeholder groups on all aspects of the data article.
     1. Datasets
     - Issues of common/mineable formats (DCXL)
     - Deposit in relevant subject repositories where possible (BioDBCore); otherwise in a stable general data host (Dryad, FigShare, or an institutional data repository if permanent, e.g. Oxford DataBank)
     - What counts as an "approved repository", and what level of permanency guarantees are necessary?
     2. Protocol information
     - Enough for reuse; the ultimate aim is computer-mineable protocol information
     - MIBBI standards are too extreme, but some structure is needed (a sketch follows below)
     - Collaborating on ISA framework development and workflow tools with key groups at Oxford and Harvard Universities
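
As an illustration of "enough structure for reuse" without the full weight of a MIBBI-style checklist, here is a minimal, hypothetical protocol record, loosely in the spirit of the ISA (Investigation/Study/Assay) approach; the field names are invented and are not part of any official specification.

```python
# Hypothetical minimal protocol metadata: enough for a reader (and eventually a
# machine) to understand how the dataset was produced. Field names are illustrative.

protocol = {
    "study_title": "Example expression study",
    "organism": "Mus musculus",
    "sample_preparation": "Liver tissue; RNA extracted with a column-based kit",
    "assay": {
        "technology": "microarray",
        "platform": "Example array platform",    # placeholder, not a real product
    },
    "processing_steps": [
        "Background correction",
        "Quantile normalisation",
        "Log2 transformation",
    ],
    "raw_data_files": ["sample_01.cel", "sample_02.cel"],
}

# A reuser or referee can see at a glance what was done and in what order.
for step in protocol["processing_steps"]:
    print(step)
```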

  10. F1000R: SIMPLIFYING AND INCENTIVISING DATA SHARING
     Keep it quick and simple! Minimal effort:
     - Maximal reuse of experimental and institutional metadata capture
     - Smooth workflow between the article, institutional repositories and data centres
     Incentives to share data:
     - Show view/download statistics, which are often higher than researchers think
     - Provide impact measures to show value back to funders and institutions
     - Encourage data citation in main article references: an open letter (most major publishers interested in signing), Scopus/Web of Knowledge tracking, and agreement on a standard data citation approach (an illustrative sketch follows below)
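
One way to picture a "standard data citation approach" is a simple formatter that turns dataset metadata into a reference-list entry; the citation style and values below are generic illustrations, not an agreed standard.

```python
# Hypothetical formatter for a dataset citation in a main article's reference list.
# The citation style is illustrative only; no particular standard is implied.

def format_data_citation(authors, year, title, repository, doi):
    """Return a reference-list entry for a dataset."""
    return f"{authors} ({year}). {title} [Data set]. {repository}. https://doi.org/{doi}"

print(format_data_citation(
    authors="Lawrence R, et al.",
    year=2012,
    title="Example microarray dataset",
    repository="Dryad",
    doi="10.9999/dryad.example",                 # made-up DOI
))
```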

  11. CHALLENGES IN REFEREEING DATA
     1. The time required to view what are often many data files (e.g. J Neurosci)
     2. How do you know the data are OK, without repeating the experiment yourself and without analysing it yourself?

  12. ESSD JOURNAL APPROACH
     Earth System Science Data journal (Copernicus): http://www.earth-system-science-data.net/
     ESSD peer review ensures that the datasets are:
     - At least plausible and contain no detectable problems
     - Of sufficiently high quality, with their limitations clearly stated
     - Well annotated by standard metadata and available from a certified data centre/repository
     - Customary with regard to their format(s) and/or access protocol, and expected to be usable for the foreseeable future
     - Openly accessible (toll free)

  13. PENSOFT BIODIVERSITY DATA PUBLISHING APPROACH
     Pensoft guidelines for reviewers of data papers:
     - Scientific importance and uniqueness
     - Data stored in an appropriate repository?
     - Description of data access?
     - Complete and uniform recording of the data?
     - Accurate description of the data?
     - Use of applicable standards?
     - Possible sources of error appropriately addressed?
     - Methods to process and analyse the data documented well enough to enable replication?
     - Data plausible, given the protocols?
     - All claims substantiated by the underlying data?
     http://www.pensoft.net/J_FILES/Pensoft_Data_Publishing_Policies_and_Guidelines.pdf

  14. BMC RESEARCH NOTES APPROACH
     - Is the question posed original and well defined?
     - Are the data sound and well controlled?
     - Is the interpretation well balanced and supported by the data?
     - Are the methods appropriate and well described; are sufficient details provided to allow others to evaluate and/or replicate the work?
     - What are the strengths and weaknesses of the methods?

  15. F1000R DATA PEER REVIEW APPROACH
     Based on extensive discussion, peer review would focus on:
     - Is the method used appropriate for the scientific question being asked?
     - Has enough information been provided to be able to replicate the experiment?
     - Have appropriate controls been conducted, and the data presented?
     - Is the data in a usable format/structure?
     - Are stated data limitations and possible sources of error appropriately described?
     - Do the data look OK? (optional; e.g. microarray data)
     Our in-house sanity check will pick up whether:
     - The format and a suitable basic structure are adhered to
     - A standard basic protocol structure is adhered to
     - The data are stored in the most appropriate and stable location
     (An illustrative sketch of such a check follows below.)
     The ultimate referee: reuse!
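
Purely as an illustration, here is a sketch of what the automated part of such an in-house sanity check might look like; the required fields, accepted formats and repository list are invented for the example and do not describe F1000R's actual checks.

```python
# Hypothetical sanity check on a data submission: basic protocol structure,
# an accepted data format, and a named stable repository.
# All rules here are illustrative, not F1000R's real criteria.

REQUIRED_PROTOCOL_FIELDS = {"study_title", "sample_preparation", "assay", "processing_steps"}
ACCEPTED_FORMATS = {"csv", "tsv", "cel", "fastq"}      # example formats only
STABLE_REPOSITORIES = {"Dryad", "FigShare", "GEO"}     # example repositories only

def sanity_check(submission):
    """Return a list of problems found in a submission dict; empty means it passes."""
    problems = []

    missing = REQUIRED_PROTOCOL_FIELDS - submission.get("protocol", {}).keys()
    if missing:
        problems.append(f"Protocol is missing fields: {sorted(missing)}")

    for filename in submission.get("data_files", []):
        extension = filename.rsplit(".", 1)[-1].lower()
        if extension not in ACCEPTED_FORMATS:
            problems.append(f"Unrecognised data format: {filename}")

    if submission.get("repository") not in STABLE_REPOSITORIES:
        problems.append("Data not stored in a recognised stable repository")

    return problems

example = {
    "protocol": {"study_title": "Example study", "assay": "microarray"},
    "data_files": ["sample_01.cel", "notes.docx"],
    "repository": "personal website",
}
for problem in sanity_check(example):
    print(problem)
```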

  16. F1000R: A TWO-STAGE PEER REVIEW PROCESS
     FIRST: a rapid "seems OK" stamp
     SECOND: subsequent referee comments
     - All open
     - The focus is on whether the work is scientifically sound, not on novelty/interest etc.
     - Encourage author-referee discussion
     - Encourage author revision (versioning)
     - Clearly labelled, and separate from user commenting

  17. CHALLENGES OF POST-PUBLICATION REFEREEING
     - Referee incentives; author revision incentives
     - Clarity on referee status at any one time
     - Knowledge of referee status away from the site: CrossMark
     - Management of several versions; what and how to cite
     - A simple, universally recognisable system for overall referee status: Approved / Not approved

  18. F1000R: REFEREE STATUS DISPLAY (BETA)

  19. SUMMARY
     - There is now a general consensus that sharing and publishing data is good
     - Each stakeholder group has made some steps forward
     - We now need to work together to offer some real publishing options to expose data
     - We need to keep it simple:
       - Develop tools to minimise the additional effort required from the researcher
       - Develop common approaches to minimise the confusion and wasted effort of applying different formats for each publisher
     - Build in referee incentives to conduct data peer review
     - Develop a variety of metrics to show the value of submitting your data for peer review, and recognition by those that count
     Questions? rebecca.lawrence@f1000.com
