21 changes: 10 additions & 11 deletions .copier-answers.yml
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
# Changes here will be overwritten by Copier
_commit: v0.4.1
_commit: v0.4.2
_src_path: https://github.com/linkml/linkml-project-copier
add_example: true
copyright_year: '2025'
@@ -9,15 +9,14 @@ gh_action_docs_preview: true
gh_action_pypi: true
github_org: nfdi-de
license: MIT
project_description: "This metadata schema is an Extension of the DCAT Application\
\ Profile for Providing Links to Use-case Specific Context. It allows to provide\
\ additional metadata regarding: which kind(s) of entity(s) or activity(s)\
\ were evaluated (the dcat:Dataset is about), which kind of activity generated\
\ the dcat:Dataset, which kind of instruments were used in the dataset generating\
\ activity, in which surrounding (e.g. a laboratory) and according to which\
\ plan the dataset generating activity took place, as well as regarding which\
\ kind(s) of qualitative and quantitative characteristic were attributed to\
\ the evaluated entity or evaluated activity and to the used instruments."
project_description: 'This metadata schema is an Extension of the DCAT Application
Profile for Providing Links to Use-case Specific Context. It allows to provide
additional metadata regarding: which kind(s) of entity(s) or activity(s) were
evaluated (the dcat:Dataset is about), which kind of activity generated the dcat:Dataset,
which kind of instruments were used in the dataset generating activity, in which
surrounding (e.g. a laboratory) and according to which plan the dataset generating
activity took place, as well as regarding which kind(s) of qualitative and quantitative
characteristic were attributed to the evaluated entity or evaluated activity and
to the used instruments.'
project_name: dcat-ap-plus
project_slug: dcat_ap_plus

9 changes: 7 additions & 2 deletions .github/dependabot.yml
@@ -1,9 +1,14 @@
# Please see the documentation for all configuration options:
# https://docs.github.com/en/code-security/dependabot/working-with-dependabot/dependabot-options-reference
# This configures updates via dependabot only for github actions.
# All updates are grouped into a single PR.
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates

version: 2
updates:
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: monthly
groups:
github-actions:
patterns:
- "*"
10 changes: 7 additions & 3 deletions .github/workflows/deploy-docs.yaml
@@ -2,7 +2,7 @@
name: Deploy docs
on: # yamllint disable-line rule:truthy
push:
tags:
branches: [main]
- 'v[0-9]+.[0-9]+.[0-9]+'
- 'v[0-9]+.[0-9]+.[0-9]+rc[0-9]'
branches:
@@ -18,6 +18,10 @@ jobs:
build-docs:
runs-on: ubuntu-latest

strategy:
matrix:
pyversion: ["3.13"]

# Grant GITHUB_TOKEN the permissions required to make a gh-pages deployment
permissions:
contents: write # to let mkdocs write the new docs
@@ -42,15 +46,15 @@ jobs:
- name: Install uv
uses: astral-sh/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}
enable-cache: true
cache-dependency-glob: "uv.lock"

# https://github.com/actions/setup-python
- name: Set up Python
uses: actions/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}

- name: Install just
run: |
2 changes: 1 addition & 1 deletion .github/workflows/main.yaml
@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
python-version: ["3.10", "3.11", "3.12", "3.13"]
fail-fast: false

steps:
8 changes: 6 additions & 2 deletions .github/workflows/pypi-publish.yaml
@@ -25,6 +25,10 @@ jobs:
name: Build Python 🐍 distributions 📦 for publishing
runs-on: ubuntu-latest

strategy:
matrix:
pyversion: ["3.13"]

steps:
# https://github.com/actions/checkout
- name: Check out repository
@@ -36,14 +40,14 @@
- name: Install uv
uses: astral-sh/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}
enable-cache: true

# https://github.com/actions/setup-python
- name: Set up Python
uses: actions/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}

- name: Install just
run: |
13 changes: 9 additions & 4 deletions .github/workflows/test_pages_build.yaml
@@ -16,7 +16,7 @@ permissions: {}

jobs:
run:
# Don't run for PRs from forks
# Don't run for PRs from forks which fails due to lack of permissions.
if: github.event.pull_request.head.repo.full_name == github.repository
# Grant GITHUB_TOKEN the permissions required to make a gh-pages deployment
permissions:
@@ -25,6 +25,11 @@ jobs:
id-token: write # allow to generate an OpenID Connect (OIDC) token
pull-requests: write # add comment on the PR with the preview URL
runs-on: ubuntu-latest

strategy:
matrix:
pyversion: ["3.13"]

steps:
# https://github.com/actions/checkout
- name: Checkout
@@ -33,16 +38,16 @@
fetch-depth: 0

# https://github.com/actions/setup-python
- name: Set up Python 3
- name: Set up Python
uses: actions/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}

# https://github.com/astral-sh/setup-uv
- name: Install uv
uses: astral-sh/[email protected]
with:
python-version: 3.13
python-version: ${{ matrix.pyversion }}
enable-cache: true
cache-dependency-glob: "uv.lock"

11 changes: 7 additions & 4 deletions .pre-commit-config.yaml
@@ -7,11 +7,13 @@ repos:
- id: check-toml
- id: check-yaml
- id: end-of-file-fixer
exclude: '^(.*\.svg)$'
- id: trailing-whitespace
exclude: '^(.*\.svg)$'
args: [--markdown-linebreak-ext=md]

- repo: https://github.com/adrienverge/yamllint.git
rev: v1.37.1
rev: v1.38.0
hooks:
- id: yamllint
args: [-c=.yamllint.yaml]
@@ -24,13 +26,14 @@ repos:
- tomli

- repo: https://github.com/crate-ci/typos
rev: v1.39.2
rev: v1.44.0
hooks:
- id: typos
exclude: '^(.*\.svg)$'

- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.6
rev: v0.15.4
hooks:
# Run the linter.
- id: ruff
@@ -40,6 +43,6 @@

- repo: https://github.com/astral-sh/uv-pre-commit
# uv version.
rev: 0.9.11
rev: 0.10.7
hooks:
- id: uv-lock
24 changes: 12 additions & 12 deletions README.md
@@ -1,18 +1,18 @@
[![DOI](https://zenodo.org/badge/1080296103.svg)](https://doi.org/10.5281/zenodo.17702369)
[![PyPI - Version](https://img.shields.io/pypi/v/dcat-ap-plus)](https://pypi.org/project/dcat-ap-plus)
[![Build and test](https://github.com/nfdi-de/dcat-ap-plus/actions/workflows/main.yaml/badge.svg)](https://github.com/nfdi-de/dcat-ap-plus/actions/workflows/main.yaml)
[![Copier Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/copier-org/copier/master/img/badge/badge-grayscale-inverted-border-teal.json)](https://github.com/linkml/linkml-project-copier)
[![Copier Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/copier-org/copier/master/img/badge/badge-grayscale-inverted-border-teal.json)](https://github.com/linkml/linkml-project-copier)

# DCAT Application Profile for Providing Links to Use-case Specific Context (DCAT-AP+)

The LinkML schema provided in this repository is an extension of the [DCAT Application Profile](https://semiceu.github.io/DCAT-AP/releases/3.0.0/), which allows to provide additional metadata for a `dcat:Dataset` in a very generic manner, such as:
* which kind(s) of entity(s) or activity(s) were evaluated,
* which kind of activity generated the `dcat:Dataset`,
* which kind of instruments were used in the dataset generating activity,
* in which surrounding (e.g. a laboratory) and according to which plan the dataset generating activity took place,
* which kind(s) of entity(s) or activity(s) were evaluated,
* which kind of activity generated the `dcat:Dataset`,
* which kind of instruments were used in the dataset generating activity,
* in which surrounding (e.g. a laboratory) and according to which plan the dataset generating activity took place,
* as well as which kind(s) of qualitative and quantitative characteristic(s) were attributed to the evaluated entity or evaluated activity and to the used instruments.

This extension is mainly based on the [Starting Point Terms of the Provenance Ontology (PROV-O)](https://www.w3.org/TR/prov-o/#description-starting-point-terms),
This extension is mainly based on the [Starting Point Terms of the Provenance Ontology (PROV-O)](https://www.w3.org/TR/prov-o/#description-starting-point-terms),
in that it makes the `prov:wasGeneratedBy` property of the `Dataset` class mandatory and specifies necessary properties for its expected range, the `prov:Activity` class.

The choice to use LinkML for extending DCAT-AP was based on the need to have different layers that cater to different domain-specific use cases. DCAT-AP+ serves as the basic layer for such extensions and is thus kept very generic. Being the basis of the [ChemDCAT-AP](
@@ -55,7 +55,7 @@ See also the documentation of the template: https://github.com/linkml/linkml-project-copier
Note: Environments with private PyPi repository may need extra configuration (example):

export UV_DEFAULT_INDEX=https://nexus.example.com/repository/pypi-all/simple
*
*

Copier

@@ -80,7 +80,7 @@ To regenerate the DCAT-AP LinkML representation as well as the PLUS extension ru

uv run python src/dcat_ap_shacl_2_linkml.py

### Test data validation and convertion
### Test data validation and conversion

Validate and test all: `just test`

@@ -106,15 +106,15 @@ To convert the test datasets of each DCAT-AP profile into a TTL graph run:
````

### Build GitHub pages docs locally

uv run mkdocs serve

rm -rf docs/elements/*.md && uv run gen-doc -d docs/elements src/dcat_ap_plus/schema/dcat_ap_plus.yaml

## Funding

This work was funded by the German Research Foundation (DFG) through the projects:
* "[NFDI4Cat](https://nfdi4cat.org/) - NFDI for Catalysis-Related Sciences" (DFG project no. [441926934](https://gepris.dfg.de/gepris/projekt/441926934)) and
This work was funded by the German Research Foundation (DFG) through the projects:
* "[NFDI4Cat](https://nfdi4cat.org/) - NFDI for Catalysis-Related Sciences" (DFG project no. [441926934](https://gepris.dfg.de/gepris/projekt/441926934)) and
* "[NFDI4Chem](https://nfdi4chem.de) - NFDI for Chemistry" (DFG project no. [441958208](https://gepris.dfg.de/gepris/projekt/441958208))"

within the National Research Data Infrastructure (NFDI) programme of the Joint Science Conference (GWK).
8 changes: 4 additions & 4 deletions README_pypkg.md
@@ -1,10 +1,10 @@
# dcat_ap_plus

An extension of the [DCAT Application Profile](https://semiceu.github.io/DCAT-AP/releases/3.0.0/), which allows to provide additional metadata for a `dcat:Dataset` in a very generic manner, such as:
* which kind(s) of entity(s) or activity(s) were evaluated,
* which kind of activity generated the `dcat:Dataset`,
* which kind of instruments were used in the dataset generating activity,
* in which surrounding (e.g. a laboratory) and according to which plan the dataset generating activity took place,
* which kind(s) of entity(s) or activity(s) were evaluated,
* which kind of activity generated the `dcat:Dataset`,
* which kind of instruments were used in the dataset generating activity,
* in which surrounding (e.g. a laboratory) and according to which plan the dataset generating activity took place,
* as well as which kind(s) of qualitative and quantitative characteristic(s) were attributed to the evaluated entity or evaluated activity and to the used instruments.

This package ships the LinkML-generated Python datamodel for the schema.
7 changes: 2 additions & 5 deletions config.public.mk
@@ -27,13 +27,10 @@ LINKML_GENERATORS_CONFIG_YAML=config.yaml
LINKML_GENERATORS_DOC_ARGS="--truncate-descriptions False --hierarchical-class-view --subfolder-type-separation --index-name overview"

## pass args to workaround genowl rdfs config bug (linkml#1453)
## (i.e. --no-type-objects --no-metaclasses --metadata-profile rdfs)
## (i.e. --no-type-objects --no-metaclasses --metadata-profile=rdfs)
# LINKML_GENERATORS_OWL_ARGS="--no-type-objects --no-metaclasses --metadata-profile=rdfs"
LINKML_GENERATORS_OWL_ARGS=

## pass args to trigger experimental java/typescript generation
LINKML_GENERATORS_JAVA_ARGS=
LINKML_GENERATORS_TYPESCRIPT_ARGS=

## pass args to pydantic generator which isn't supported by gen-project
## https://github.com/linkml/linkml/issues/2537
LINKML_GENERATORS_PYDANTIC_ARGS=
6 changes: 4 additions & 2 deletions config.yaml
@@ -23,8 +23,10 @@ generator_args:
mergeimports: true
owl:
mergeimports: true
metaclasses: true
type_objects: true
metaclasses: false
type_objects: false
# add_root_classes: true
# mixins_as_expressions: true
# throws 'Cannot handle metadata profile: rdfs'
# metadata_profile: rdfs
markdown:
2 changes: 1 addition & 1 deletion docs/how-to-extend.md
@@ -15,7 +15,7 @@ DCAT-AP+ is designed to be imported and specialized. [ChemDCAT-AP](https://githu
## What you MAY do

- **Subclass DCAT-AP+ classes** using `is_a` to create a new node shape with additional slots or stricter slot constraints. When you do so and also choose to map to a semantically narrower ontology term via `class_uri`, you should make sure to remain aligned with DCAT-AP+ for interoperability. See [Foundational principle: LinkML elements as SHACL shapes](design-patterns.md#foundational-principle-linkml-elements-as-shacl-shapes) for guidance. For example, ChemDCAT-AP defines `SubstanceSample` as `is_a: EvaluatedEntity` with `class_uri: SIO:001378`.
- **Create sub-slots** using `is_a` on slots for making stricter constraints. When you do so and also choose to map to a semantically narrower ontology term via `slot_uri`, you should make sure to remain aligned with DCAT-AP+ for interoperability. See [Foundational principle: LinkML elements as SHACL shapes](design-patterns.md#foundational-principle-linkml-elements-as-shacl-shapes) for guidance. For example, ChemDCAT-AP defines `used_catalyst` as `is_a: carried_out_by` with `slot_uri: RXNO:0000425`.
- **Create sub-slots** using `is_a` on slots for making stricter constraints. When you do so and also choose to map to a semantically narrower ontology term via `slot_uri`, you should make sure to remain aligned with DCAT-AP+ for interoperability. See [Foundational principle: LinkML elements as SHACL shapes](design-patterns.md#foundational-principle-linkml-elements-as-shacl-shapes) for guidance. For example, ChemDCAT-AP defines `used_catalyst` as `is_a: carried_out_by` with `slot_uri: RXNO:0000425`.
- **Add new slots** to your domain classes.
- **Constrain ranges** further (narrowing is allowed by DCAT-AP extension rules).
- **Provide domain-specific enums** to bind `rdf_type` or `type` to specific controlled vocabularies.
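The subclassing patterns above can be sketched in LinkML, using the ChemDCAT-AP names the text cites (the exact definitions in that schema may differ):

```yaml
# Illustrative sketch of the extension patterns described above.
classes:
  SubstanceSample:
    is_a: EvaluatedEntity     # subclass a DCAT-AP+ class via is_a ...
    class_uri: SIO:001378     # ... and map it to a narrower ontology term
slots:
  used_catalyst:
    is_a: carried_out_by      # sub-slot carrying stricter constraints
    slot_uri: RXNO:0000425
```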
6 changes: 3 additions & 3 deletions docs/index.md
@@ -18,8 +18,8 @@ description: "13C NMR spectral data for acetylsalicylic acid in CDCl3"
was_generated_by:
- id: ex:measurement-001
# an instance of the 13C NMR class from the Chemical Methods Ontology
rdf_type:
id: CHMO:0000595
rdf_type:
id: CHMO:0000595
title: "carbon-13 nuclear magnetic resonance spectroscopy"
evaluated_entity:
- id: ex:sample-001
@@ -37,7 +37,7 @@ was_generated_by:
- id: ex:spectrometer-001
# an instance of "JEOL ECX NMR spectrometer" class from OBI
rdf_type:
id: OBI:0000625
id: OBI:0000625
title: "JEOL ECX NMR spectrometer"
```

2 changes: 2 additions & 0 deletions docs/js/extra-loader.js
@@ -0,0 +1,2 @@
// See https://facelessuser.github.io/pymdown-extensions/extras/mermaid/#custom-loader
// And https://github.com/facelessuser/pymdown-extensions/blob/main/docs/src/js/material-extra-3rdparty.js
16 changes: 7 additions & 9 deletions docs/schema/README.md
@@ -1,23 +1,21 @@
# Merged model

This folder contains the DCAT-AP+ model serialized as:
* LinkML YAML,
* JSON schema,
* JSON-LD (including context)
* LinkML YAML,
* JSON schema,
* JSON-LD (including context)
* and SHACL shapes.

It also contains the prefix map in YAML and [dcat_ap_linkml.yaml](dcat_ap_linkml.yaml),
It also contains the prefix map in YAML and [dcat_ap_linkml.yaml](dcat_ap_linkml.yaml),
the direct translation of DCAT-AP into LinkML.

These files are auto-generated from the schema in the `src/dcat_ap_plus/schema` folder and version-stamped by
These files are auto-generated from the schema in the `src/dcat_ap_plus/schema` folder and version-stamped by
uv-dynamic-versioning.
For releases, these files are copied to gh-pages along with the rest of the documentation,
For releases, these files are copied to gh-pages along with the rest of the documentation,
making schema files with version information available on gh-pages. The w3id.org redirects link to these files.

A fully resolved version of the DCAT-AP+ model with imported models merged in is located in the
A fully resolved version of the DCAT-AP+ model with imported models merged in is located in the
subfolder [merged-yaml](merged-yaml).


Note, that the generated files are git-ignored.


2 changes: 1 addition & 1 deletion justfile
@@ -263,7 +263,7 @@ set allow-duplicate-recipes

# Overriding recipes from the root justfile by adding a recipe with the same
# name in an imported file is not possible until a known issue in just is fixed,
# https://github.com/casey/just/issues/2540
# https://github.com/casey/just/issues/2540 - So we need to override them here.

# Custom recipe for dcat-ap-plus to add project-specific artifacts to the distribution schema path
_add-artifacts: