id: work_hqes4liwkjbipfnjiwg72ck2oe
author: Remzi Celebi
title: Towards FAIR protocols and workflows: the OpenPREDICT use case
date: 2020
pages: 29
extension: .pdf
mime: application/pdf
words: 12995
sentences: 1291
flesch: 52
summary: Workflows and protocols, reproducibility, Semantic Web, Research Object, FAIR data principles. Contributions include a FAIR version of the PREDICT workflow and new competency questions ... on the coverage of manual steps, different workflow abstraction levels, and versioning. The approach allows the workflow steps to be separated, enabling the reuse of instructions from protocols to concrete and executable workflow steps, and the links between these levels. prov:generated links a workflow activity (p-plan:Activity) to its output artefacts. Figure 2: OpenPREDICT Workflow (version 0.1) with manual and computational steps. The workflow consists of four steps: data preparation, feature ... Example input dataset: https://raw.githubusercontent.com/fair-workflows/openpredict/master/data/external/meshAnnotationsFromBioPorttalUsingOMIMDesc.txt. Datasets are described using HCLS (https://www.w3.org/TR/hcls-dataset/) and the FAIR Data Point specification; SIO and PROV are used to model input data and workflows (I1). The first version of the OpenPREDICT workflow is opredict:Plan_Main_Protocol_v01. OpenPREDICT's computational steps use datasets, as explained in 'FAIRified data ...'. Data and code are available on GitHub: https://github.com/fair-workflows/openpredict. A generic workflow for the data FAIRification process.
cache: ./cache/work_hqes4liwkjbipfnjiwg72ck2oe.pdf
txt: ./txt/work_hqes4liwkjbipfnjiwg72ck2oe.txt
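The summary above notes that prov:generated is used to link a workflow activity (p-plan:Activity) to its output artefacts. The following is a minimal sketch of that triple pattern in Python with rdflib; the EX namespace and the activity/artefact names are illustrative assumptions, not identifiers taken from the OpenPREDICT RDF.

from rdflib import Graph, Namespace
from rdflib.namespace import RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
PPLAN = Namespace("http://purl.org/net/p-plan#")
EX = Namespace("http://example.org/openpredict/")  # hypothetical base URI, not from the paper

g = Graph()
g.bind("prov", PROV)
g.bind("pplan", PPLAN)
g.bind("ex", EX)

# Hypothetical activity and output artefact URIs, for illustration only
activity = EX["Activity_DataPreparation"]
artefact = EX["Artefact_PreparedDataset"]

g.add((activity, RDF.type, PPLAN.Activity))   # declare the activity as a p-plan:Activity
g.add((activity, PROV.generated, artefact))   # prov:generated links the activity to its output

print(g.serialize(format="turtle"))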