The FAIR Principles for Data Management and Stewardship

Embrace the FAIR principles - Findable, Accessible, Interoperable, Re-usable - for effective scientific data management and stewardship. Learn how annotations enhance data FAIRness and review the key attributes of each principle. The high-level guiding principles require data to carry globally unique and persistent identifiers, to be retrievable by standard protocols, and to be richly described with clear provenance and metadata. The kick-off meeting in Portland marked the start of a collaborative effort to explore technical challenges, converge on a definition of interoperability, and establish actionable use cases.


Presentation Transcript


  1. Making Annotations FAIR Force2017, October 26, 2017

  2. FAIR Annotations Maryann Martone

  3. The FAIR Guiding Principles for scientific data management and stewardship. High-level principles to make data Findable, Accessible, Interoperable, and Re-usable. Mark D. Wilkinson et al., The FAIR Guiding Principles for scientific data management and stewardship, Scientific Data (2016). DOI: 10.1038/sdata.2016.18

  4. FAIR and annotations? Annotations are data and should be FAIR. Annotations make data FAIR by adding searchable metadata and links.

  5. Findable
     F1. (meta)data are assigned a globally unique and persistent identifier
     F2. data are described with rich metadata
     F3. metadata clearly and explicitly include the identifier of the data it describes
     F4. (meta)data are registered or indexed in a searchable resource

  6. Accessible
     A1. (meta)data are retrievable by their identifier using a standardized communications protocol
     A1.1 the protocol is open, free, and universally implementable
     A1.2 the protocol allows for an authentication and authorization procedure, where necessary
     A2. metadata are accessible, even when the data are no longer available

  7. Interoperable
     I1. (meta)data use a formal, accessible, shared, and broadly applicable language for knowledge representation
     I2. (meta)data use vocabularies that follow FAIR principles
     I3. (meta)data include qualified references to other (meta)data

  8. Re-usable
     R1. meta(data) are richly described with a plurality of accurate and relevant attributes
     R1.1. (meta)data are released with a clear and accessible data usage license
     R1.2. (meta)data are associated with detailed provenance
     R1.3. (meta)data meet domain-relevant community standards
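
To make the mapping from these sub-principles to a single annotation concrete, the sketch below builds a minimal annotation record following the W3C Web Annotation Data Model as a Python dictionary. The property names come from the published anno.jsonld context; the identifier, creator IRI, and quoted text are illustrative placeholders, not values from the presentation.

    # Minimal W3C Web Annotation record sketched as a Python dict.
    # Property names follow the Web Annotation Data Model (anno.jsonld);
    # the identifier, creator, and quoted text are hypothetical placeholders.
    import json

    annotation = {
        "@context": "http://www.w3.org/ns/anno.jsonld",
        "id": "https://example.org/annotations/8a6e3b2c",          # F1: globally unique, persistent, resolvable identifier
        "type": "Annotation",
        "creator": "https://example.org/users/annotator-1",        # R1.2: provenance (who annotated)
        "created": "2017-10-26T09:00:00Z",                         # R1.2: provenance (when)
        "rights": "http://creativecommons.org/licenses/by/4.0/",   # R1.1: clear, accessible usage license
        "body": {
            "type": "TextualBody",                                 # F2/R1: searchable descriptive content
            "value": "Key definition of the FAIR principles.",
            "format": "text/plain",
        },
        "target": {
            "source": "https://doi.org/10.1038/sdata.2016.18",     # F3/I3: explicit, qualified reference to the annotated work
            "selector": {
                "type": "TextQuoteSelector",                       # I1: shared, formal vocabulary for anchoring to a fragment
                "exact": "FAIR Guiding Principles",
                "prefix": "The ",
                "suffix": " for scientific",
            },
        },
    }

    print(json.dumps(annotation, indent=2))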

  9. Kick-off meeting: Force2016 in Portland. 70+ attendees came together to:
     Explore technical opportunities and challenges
     Explore publishers' opportunities and challenges
     Converge on a definition of interoperability
     Determine use cases that are in scope
     Identify next steps

  10. Attributes of interoperable annotations
     Open but standard framework that allows/supports/enables enrichment by the community and global discovery to the extent possible
     Granular annotations for online elements (HTML text, images, data, PDF, EPUB, etc.)
     Discovery and linking of annotations across different content instances (HTML vs. PDF)
     Public, private, group, private group, and authoritative or branded conversations, and the ability to evolve permissions on selected annotations
     Selection by both content creators and users
     Common identification of private conversations
     Follow/notification
     Classifications and endorsements, including authoritative endorsement
     Identities and management of identities among systems
     Discovery and linking of annotations across multiple versions of content for scholarly research across multiple repositories, preprints, etc.
     Persistence as content changes/evolves to new versions

  11. FAIR annotations: some considerations
     F1. (meta)data are assigned a globally unique and persistent identifier: Web annotations are uniquely addressable and can therefore be issued a GUID such as a DOI. But as annotations are anchored to specific fragments inside an object, e.g., a span of text or a part of an image, how are these fragments identified?
     F2. data are described with rich metadata (defined by R1 below): A critical piece for science is to link annotation capability to standardized and rich metadata via community ontologies and data models.
     F3. metadata clearly and explicitly include the identifier of the data it describes: Annotations include an explicit reference to the DOI and URL of a document, but what about books? Book chapters?
     F4. (meta)data are registered or indexed in a searchable resource: Currently, individual systems provide structured and free-text search across all annotations made. What about annotations made by other platforms?
     https://docs.google.com/document/d/1UObmtnCL_Dw5_tQLJkWgfKBsm_4YpCXxiu3zj2Fsuv8/edit
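
The F4 point above, searching across the annotations made on a document, can be illustrated against one existing platform. The sketch below queries the public Hypothesis search API for annotations on a given URI; it assumes the https://api.hypothes.is/api/search endpoint and its uri and limit parameters behave as documented, and a genuinely cross-platform search would have to repeat the query against each service's own API.

    # Sketch: ask one annotation platform (Hypothesis) for public annotations on a
    # document URI, as a concrete reading of F4 ("indexed in a searchable resource").
    # Assumes the public search endpoint and its "uri"/"limit" query parameters;
    # other annotation platforms would each need their own query.
    import requests

    def search_annotations(doc_uri: str, limit: int = 20) -> list:
        resp = requests.get(
            "https://api.hypothes.is/api/search",
            params={"uri": doc_uri, "limit": limit},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("rows", [])

    if __name__ == "__main__":
        for row in search_annotations("https://doi.org/10.1038/sdata.2016.18"):
            print(row.get("created"), row.get("user"), (row.get("text") or "")[:80])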

  12. SciBot: machine-generated, human-curated annotations on the scientific literature
     Annotates Research Resource Identifiers (RRIDs) with additional information
     Curation: private annotations pushed to public
     Based on Hypothesis
     Annotates to the DOI: cross-platform annotation
     [Diagram: RRID, DOI, Crossref Event Data database]
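
SciBot's actual pipeline is not reproduced in the slides, but the machine-generated, human-curated pattern it describes can be sketched as: scan the text of a paper for RRIDs, then draft one private annotation per hit for a curator to approve before it is pushed public. The regular expression, payload fields, and example DOI below are simplified illustrations, not SciBot's real code; the resolver URL follows the common SciCrunch RRID resolver convention.

    # Sketch of the machine-generated, human-curated pattern described for SciBot:
    # find candidate RRIDs in a text and draft private annotation payloads for a
    # curator to review before publishing. Simplified illustration, not SciBot's code.
    import re

    RRID_PATTERN = re.compile(r"RRID:\s?([A-Za-z]+[_:][A-Za-z0-9_:-]+)")

    def draft_rrid_annotations(text: str, doc_uri: str) -> list:
        drafts = []
        for match in RRID_PATTERN.finditer(text):
            rrid = "RRID:" + match.group(1)
            drafts.append({
                "target_uri": doc_uri,                # document being annotated
                "exact": match.group(0),              # anchor text as it appears in the paper
                "tags": ["RRID", match.group(1)],     # searchable metadata
                "body": "Resolve at https://scicrunch.org/resolver/" + rrid,
                "visibility": "private",              # curator reviews, then pushes public
            })
        return drafts

    sample = "Antibodies were obtained from Abcam (RRID:AB_2138153)."
    print(draft_rrid_annotations(sample, "https://doi.org/10.1000/example"))  # placeholder DOI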

  13. Putting annotation FAIRness into practice. Francesca Di Donato

  14. Main challenges

  15. Main challenges. The majority of the challenges in reaching a functional European Open Science Cloud are social rather than technical. (*Realising the European Open Science Cloud: First report and recommendations of the Commission High Level Expert Group on the European Open Science Cloud)

  16. GO FAIR

  17. The GO FAIR initiative: 3 main processes
     GO-CHANGE: culture change, Open Science promotion, reward systems
     GO-TRAIN: education/training (MOOCs, SPOCs), wizards, certification
     GO-BUILD: technical implementation, FAIR data and services, technical infrastructure

  18. GO FAIR Implementation Network on annotation. Possible actions: communication, advocacy, training, building.

  19. Hi! Jennifer Lin, PhD Director of Product Management jlin@crossref.org @jenniferlin15 orcid.org/0000-0002-9680-2328

  20. FAIR is fair. Apply to annotations?
     Discussion is critical for validation & reproducibility of results
     Enable tracking of the evolution of scholarly claims through the lineage of expert discussion
     Make the full review history of published results transparent
     (Provide credit to contributors)

  21. Annotations as publication metadata https://support.crossref.org

  22. Publishers: how to deposit. Directly into the article's metadata as a standard part of the content registration process: as part of references AND/OR as part of relations assertions (structured metadata).

  23. But what if the publisher is not aware? Event Data collects activities surrounding publications with DOIs (nearly 100 million publications). Annotations are important events! The Crossref Event Data event stream contains Hypothesis annotations right now. Interested in integrating more data sources.
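
As one way to see these events in practice, the sketch below asks the Crossref Event Data API for Hypothesis-sourced events about a single DOI. It assumes the v1 events endpoint at https://api.eventdata.crossref.org/v1/events and its obj-id, source, and rows query parameters, and the message.events response shape; the Event Data documentation remains the authority on the current query interface.

    # Sketch: list Hypothesis-sourced events recorded for one DOI in Crossref Event Data.
    # Assumes the v1 events endpoint and its "obj-id"/"source"/"rows" parameters.
    import requests

    def hypothesis_events_for_doi(doi: str, rows: int = 100) -> list:
        resp = requests.get(
            "https://api.eventdata.crossref.org/v1/events",
            params={"obj-id": doi, "source": "hypothesis", "rows": rows},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("message", {}).get("events", [])

    for event in hypothesis_events_for_doi("10.1038/sdata.2016.18"):
        print(event.get("occurred_at"), event.get("subj_id"), event.get("relation_type_id"))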

  24. Event Data Crossref & DataCite APIs

  25. Making annotations fully FAIR
     Annotations are important scholarly contributions & can be considered a form of peer review
     Already being registered by some publishers, but support is insufficient
     NEW content type available dedicated to reviews (including pre-/post-publication annotations): register annotations (assign a DOI) as content in Crossref
     Through Event Data, track the online activity of annotations as autonomous/independent scholarly objects
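
Once annotations or reviews are registered as Crossref content with their own DOIs, they surface through the ordinary Crossref REST API. The sketch below assumes the type:peer-review filter on https://api.crossref.org/works and the REST API's usual message.items response shape; the exact type label applied to registered annotations may differ.

    # Sketch: list a few works registered under Crossref's review content type,
    # assuming the "type:peer-review" filter and the usual message.items response.
    import requests

    resp = requests.get(
        "https://api.crossref.org/works",
        params={"filter": "type:peer-review", "rows": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json()["message"]["items"]:
        print(item.get("DOI"), (item.get("title") or ["(no title)"])[0])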
