From 76a870cbfe979aac5014275e91af9bead4aaef9d Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 16 Jan 2025 17:29:35 +1000 Subject: [PATCH 01/60] 7.1.3 prep (#3036) * 7.1.1 post release (#2953) * Fix Black formatting in ./admin/get_merged_prs.py (#2954) * build(deps-dev): bump ruff from 0.7.0 to 0.7.1 (#2955) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.7.0 to 0.7.1. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.7.0...0.7.1) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Ashley Sommer * Fix defined namespace warnings (#2964) * Fix defined namespace warnings Current docs-generation tests are polluted by lots of warnings that occur when Sphinx tries to read various parts of DefinedNamespace. * Fix tests that no longer need incorrect exceptions handled. * fix black formatting in test file * Undo typing changes, so this works on current pre-3.9 branch * better handling for any/all double-underscore properties * Don't include __slots__ in dir(). * test: earl test passing * Annotate Serializer.serialize and descendants (#2970) This patch aligns the type signatures on `Serializer` subclasses, including renaming the arbitrary-keywords dictionary to always be `**kwargs`. This is in part to prepare for the possibility of adding `*args` as a positional-argument delimiter. References: * https://github.com/RDFLib/rdflib/issues/1890#issuecomment-1144208982 Signed-off-by: Alex Nelson * build(deps): bump orjson from 3.10.10 to 3.10.11 (#2966) Bumps [orjson](https://github.com/ijl/orjson) from 3.10.10 to 3.10.11. 
- [Release notes](https://github.com/ijl/orjson/releases) - [Changelog](https://github.com/ijl/orjson/blob/master/CHANGELOG.md) - [Commits](https://github.com/ijl/orjson/compare/3.10.10...3.10.11) --- updated-dependencies: - dependency-name: orjson dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.7.1 to 0.7.2 (#2969) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.7.1 to 0.7.2. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.7.1...0.7.2) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.7.2 to 0.7.3 (#2979) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.7.2 to 0.7.3. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.7.2...0.7.3) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.7.3 to 0.8.0 (#2994) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.7.3 to 0.8.0. 
- [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.7.3...0.8.0) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps): bump orjson from 3.10.11 to 3.10.12 (#2991) Bumps [orjson](https://github.com/ijl/orjson) from 3.10.11 to 3.10.12. - [Release notes](https://github.com/ijl/orjson/releases) - [Changelog](https://github.com/ijl/orjson/blob/master/CHANGELOG.md) - [Commits](https://github.com/ijl/orjson/compare/3.10.11...3.10.12) --- updated-dependencies: - dependency-name: orjson dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * added Node as an exported name from the root package location. Updated linting commands section in the developer section to use ruff check. (#2981) * build(deps-dev): bump wheel from 0.45.0 to 0.45.1 (#2992) Bumps [wheel](https://github.com/pypa/wheel) from 0.45.0 to 0.45.1. - [Release notes](https://github.com/pypa/wheel/releases) - [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst) - [Commits](https://github.com/pypa/wheel/compare/0.45.0...0.45.1) --- updated-dependencies: - dependency-name: wheel dependency-type: direct:development update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car * feat: sort longturtle blank nodes (#2997) * feat: sort longturtle blank nodes in the object position by their cbd string * fix: https://github.com/RDFLib/rdflib/issues/2767 * build(deps-dev): bump pytest from 8.3.3 to 8.3.4 (#2999) Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.3 to 8.3.4. - [Release notes](https://github.com/pytest-dev/pytest/releases) - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst) - [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4) --- updated-dependencies: - dependency-name: pytest dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump poetry from 1.8.4 to 1.8.5 (#3001) Bumps [poetry](https://github.com/python-poetry/poetry) from 1.8.4 to 1.8.5. - [Release notes](https://github.com/python-poetry/poetry/releases) - [Changelog](https://github.com/python-poetry/poetry/blob/1.8.5/CHANGELOG.md) - [Commits](https://github.com/python-poetry/poetry/compare/1.8.4...1.8.5) --- updated-dependencies: - dependency-name: poetry dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.8.0 to 0.8.2 (#3003) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.0 to 0.8.2. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.8.0...0.8.2) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.8.2 to 0.8.3 (#3010) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.2 to 0.8.3. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.8.2...0.8.3) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps): bump berkeleydb from 18.1.11 to 18.1.12 (#3009) Bumps [berkeleydb](https://www.jcea.es/programacion/pybsddb.htm) from 18.1.11 to 18.1.12. --- updated-dependencies: - dependency-name: berkeleydb dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> # Conflicts: # poetry.lock * build(deps): bump orjson from 3.10.12 to 3.10.13 (#3018) Bumps [orjson](https://github.com/ijl/orjson) from 3.10.12 to 3.10.13. - [Release notes](https://github.com/ijl/orjson/releases) - [Changelog](https://github.com/ijl/orjson/blob/master/CHANGELOG.md) - [Commits](https://github.com/ijl/orjson/compare/3.10.12...3.10.13) --- updated-dependencies: - dependency-name: orjson dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * build(deps-dev): bump ruff from 0.8.4 to 0.8.6 (#3025) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.4 to 0.8.6. 
- [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.8.4...0.8.6) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> * feat: deterministic longturtle serialisation using RDF canonicalization + n-triples sort (#3008) * feat: use the RGDA1 canonicalization algorithm + lexical n-triples sort to produce deterministic longturtle serialisation * chore: normalise usage of format * chore: apply black * fix: double up of semicolons when subject is a blank node * fix: lint * jsonld: Do not merge nodes with different invalid URIs (#3011) When parsing JSON-LD with invalid URIs in the `@id`, the `generalized_rdf: True` option allows parsing these nodes as blank nodes instead of outright rejecting the document. However, all nodes with invalid URIs were mapped to the same blank node, resulting in incorrect data. For example, without this patch, the new test fails with: ``` AssertionError: Expected: @prefix schema: . schema:author [ schema:familyName "Doe" ; schema:givenName "Jane" ; schema:name "Jane Doe" ], [ schema:familyName "Doe" ; schema:givenName "John" ; schema:name "John Doe" ] . Got: @prefix schema: . schema:author <> . <> schema:familyName "Doe" ; schema:givenName "Jane", "John" ; schema:name "Jane Doe", "John Doe" . ``` * Fixed incorrect ASK behaviour for dataset with one element (#2989) * Pass base uri to serializer when writing to file. 
(#2977) Co-authored-by: Nicholas Car * Dataset documentation improvements (#3012) * example printout improvements * added BN graph creation * updated tests var names & added one subtest * typos & improved formatting * updated Graph & Dataset docco * typo fix * fix code-in-comment syntax * fix code-in-comment syntax 2 * fix code-in-comment syntax - ellipses * fix code-in-comment syntax - sort print loop output * blacked * ruff fixes * Poetry 2.0.0 pyproject.toml file * move to PEP621 (Poetry 2.0.0) pyproject.toml * require poetry 2.0.0 * require poetry 2.0.0 * add in requirement for poetry-plugin-export * change from --sync to sync command * further pyproject.toml format updates * add poetry plugin to requirements-poetry.in * fix pre-commit poetry version to 2.0.0 * remove testing artifact * update license to 2025 * add me to contributors * remove outdated --check arg * typo * test add back in precommit args * test remove precommit args * match ruff version to pre-commit autoupdate PR #3026; add back in --check * re-remove --check * add David to CONTRIBUTORS * ruff in pyproject.toml to match pre-commit * updates for David's comments * fix Dataset docco ReST formatting * remove ConjunctiveGraph example; add Dataset example; add JSON-LD serialization example * Add RDFLib Path to SHACL path utility and corresponding tests (#2990) * shacl path parser: Add additional test case * shacl utilities: Add new SHACL path building utility with corresponding tests --------- Co-authored-by: Nicholas Car # Conflicts: # rdflib/extras/shacl.py * fix: typing and import issues * fix: line length as int * fix: ruff version conflict * fix: berkeleydb pin to 18.1.10 for python 3.8 compatibility * 3a not 2a --------- Signed-off-by: dependabot[bot] Signed-off-by: Alex Nelson Co-authored-by: Nicholas Car Co-authored-by: Ashley Sommer Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Alex Nelson Co-authored-by: joecrowleygaia 
<142864129+joecrowleygaia@users.noreply.github.com> Co-authored-by: Val Lorentz Co-authored-by: jcbiddle <114963309+jcbiddle@users.noreply.github.com> Co-authored-by: Sander Van Dooren Co-authored-by: Nicholas Car Co-authored-by: Matt Goldberg <59745812+mgberg@users.noreply.github.com> --- .readthedocs.yaml | 2 +- admin/get_merged_prs.py | 6 +- devtools/requirements-poetry.in | 2 +- docker/latest/requirements.in | 4 +- docker/latest/requirements.txt | 6 +- docs/apidocs/examples.rst | 14 +- docs/developers.rst | 11 +- examples/datasets.py | 379 ++++++++++++++---- examples/jsonld_serialization.py | 29 +- poetry.lock | 184 +++++---- pyproject.toml | 6 +- rdflib/__init__.py | 3 +- rdflib/extras/shacl.py | 123 +++++- rdflib/graph.py | 133 ++++-- rdflib/namespace/__init__.py | 44 +- rdflib/plugins/parsers/jsonld.py | 9 +- rdflib/plugins/serializers/hext.py | 4 +- rdflib/plugins/serializers/jsonld.py | 4 +- rdflib/plugins/serializers/longturtle.py | 32 +- rdflib/plugins/serializers/nquads.py | 6 +- rdflib/plugins/serializers/nt.py | 4 +- rdflib/plugins/serializers/patch.py | 6 +- rdflib/plugins/serializers/rdfxml.py | 16 +- rdflib/plugins/serializers/trig.py | 6 +- rdflib/plugins/serializers/trix.py | 6 +- rdflib/plugins/serializers/turtle.py | 2 +- rdflib/plugins/sparql/parser.py | 2 +- rdflib/plugins/stores/auditable.py | 2 +- test/data/longturtle/longturtle-target.ttl | 72 ++++ test/jsonld/local-suite/manifest.jsonld | 11 + .../local-suite/toRdf-twoinvalidids-in.jsonld | 20 + .../local-suite/toRdf-twoinvalidids-out.nq | 10 + test/test_dataset/test_dataset.py | 88 ++-- test/test_extras/test_shacl_extras.py | 86 +++- test/test_namespace/test_definednamespace.py | 28 +- .../test_serializer_longturtle.py | 82 +--- .../test_serializer_longturtle_sort.py | 120 ++++++ test/test_sparql/test_dataset_exclusive.py | 10 + test_reports/rdflib_w3c_sparql10-HEAD.ttl | 8 +- 39 files changed, 1166 insertions(+), 414 deletions(-) create mode 100644 
test/data/longturtle/longturtle-target.ttl create mode 100644 test/jsonld/local-suite/toRdf-twoinvalidids-in.jsonld create mode 100644 test/jsonld/local-suite/toRdf-twoinvalidids-out.nq create mode 100644 test/test_serializers/test_serializer_longturtle_sort.py diff --git a/.readthedocs.yaml b/.readthedocs.yaml index f5becb937d..f737b9b003 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -1,7 +1,7 @@ --- # https://docs.readthedocs.io/en/stable/config-file/v2.html version: 2 -# NOTE: not builing epub because epub does not know how to handle .ico files +# NOTE: not building epub because epub does not know how to handle .ico files # which results in a warning which causes the build to fail due to # `sphinx.fail_on_warning` # https://github.com/sphinx-doc/sphinx/issues/10350 diff --git a/admin/get_merged_prs.py b/admin/get_merged_prs.py index 7e96d1d478..ddee02fb43 100644 --- a/admin/get_merged_prs.py +++ b/admin/get_merged_prs.py @@ -23,7 +23,11 @@ print(f"Getting {url}") with urllib.request.urlopen(url) as response: response_text = response.read() - link_headers = response.info()["link"].split(",") if response.info()["link"] is not None else None + link_headers = ( + response.info()["link"].split(",") + if response.info()["link"] is not None + else None + ) json_data = json.loads(response_text) ITEMS.extend(json_data["items"]) diff --git a/devtools/requirements-poetry.in b/devtools/requirements-poetry.in index 1bf7b707ab..7ba51be538 100644 --- a/devtools/requirements-poetry.in +++ b/devtools/requirements-poetry.in @@ -1,3 +1,3 @@ # Fixing this here as readthedocs can't use the compiled requirements-poetry.txt # due to conflicts. -poetry==1.8.4 +poetry==1.8.5 diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index 42fb39ae7d..cc344d2a6d 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,6 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. 
It # will be updated by dependabot when new releases are made. -rdflib==7.1.0 +rdflib==7.1.1 html5rdf==1.2.0 -# html5lib-modern is required to allow the Dockerfile to build on with pre-RDFLib-7.1.1 releases. -html5lib-modern==1.2.0 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index 570502462c..4357e6d526 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -5,12 +5,8 @@ # pip-compile docker/latest/requirements.in # html5rdf==1.2 - # via - # -r docker/latest/requirements.in - # rdflib -html5lib-modern==1.2 # via -r docker/latest/requirements.in pyparsing==3.0.9 # via rdflib -rdflib==7.1.0 +rdflib==7.1.1 # via -r docker/latest/requirements.in diff --git a/docs/apidocs/examples.rst b/docs/apidocs/examples.rst index 43b92c1373..a8c3429bd4 100644 --- a/docs/apidocs/examples.rst +++ b/docs/apidocs/examples.rst @@ -3,10 +3,18 @@ examples Package These examples all live in ``./examples`` in the source-distribution of RDFLib. -:mod:`~examples.conjunctive_graphs` Module ------------------------------------------- +:mod:`~examples.datasets` Module +-------------------------------- + +.. automodule:: examples.datasets + :members: + :undoc-members: + :show-inheritance: + +:mod:`~examples.jsonld_serialization` Module +-------------------------------------------- -.. automodule:: examples.conjunctive_graphs +.. automodule:: examples.jsonld_serialization :members: :undoc-members: :show-inheritance: diff --git a/docs/developers.rst b/docs/developers.rst index 7ca914fcaf..e3593711e9 100644 --- a/docs/developers.rst +++ b/docs/developers.rst @@ -231,20 +231,17 @@ our black.toml config file: poetry run black . -Check style and conventions with `flake8 `_: +Check style and conventions with `ruff `_: .. 
code-block:: bash - poetry run flake8 rdflib + poetry run ruff check -We also provide a `flakeheaven `_ -baseline that ignores existing flake8 errors and only reports on newly -introduced flake8 errors: +Any issues that are found can potentially be fixed automatically using: .. code-block:: bash - poetry run flakeheaven - + poetry run ruff check --fix Check types with `mypy `_: diff --git a/examples/datasets.py b/examples/datasets.py index d550775a16..eab3aa3845 100644 --- a/examples/datasets.py +++ b/examples/datasets.py @@ -1,13 +1,23 @@ """ -An RDFLib Dataset is a slight extension to ConjunctiveGraph: it uses simpler terminology -and has a few additional convenience methods, for example add() can be used to -add quads directly to a specific Graph within the Dataset. +This module contains a number of common tasks using the RDFLib Dataset class. -This example file shows how to declare a Dataset, add content to it, serialise it, query it -and remove things from it. +An RDFLib Dataset is an object that stores multiple Named Graphs - instances of RDFLib +Graph identified by IRI - within it and allows whole-of-dataset or single Graph use. + +Dataset extends Graph's Subject, Predicate, Object structure to include Graph - +archaically called Context - producing quads of s, p, o, g. + +There is an older implementation of a Dataset-like class in RDFLib < 7.x called +ConjunctiveGraph that is now deprecated. + +Sections in this module: + +1. Creating & Growing Datasets +2. Looping & Counting triples/quads in Datasets +3. Manipulating Graphs with Datasets """ -from rdflib import Dataset, Literal, Namespace, URIRef +from rdflib import Dataset, Graph, Literal, URIRef # Note regarding `mypy: ignore_errors=true`: # @@ -19,41 +29,48 @@ # mypy: ignore_errors=true -# -# Create & Add -# +####################################################################################### +# 1. 
Creating & Growing ####################################################################################### # Create an empty Dataset d = Dataset() + # Add a namespace prefix to it, just like for Graph -d.bind("ex", Namespace("http://example.com/")) +d.bind("ex", "http://example.com/") -# Declare a Graph URI to be used to identify a Graph -graph_1 = URIRef("http://example.com/graph-1") +# Declare a Graph identifier to be used to identify a Graph +# A string or a URIRef may be used, but it is safer to always use a URIRef for consistency +graph_1_id = URIRef("http://example.com/graph-1") -# Add an empty Graph, identified by graph_1, to the Dataset -d.graph(identifier=graph_1) +# Add an empty Graph, identified by graph_1_id, to the Dataset +d.graph(identifier=graph_1_id) -# Add two quads to Graph graph_1 in the Dataset +# Add two quads to the Dataset which are triples + graph ID +# These insert the triples into the Graph specified by the ID d.add( ( URIRef("http://example.com/subject-x"), URIRef("http://example.com/predicate-x"), Literal("Triple X"), - graph_1, + graph_1_id, ) ) + d.add( ( URIRef("http://example.com/subject-z"), URIRef("http://example.com/predicate-z"), Literal("Triple Z"), - graph_1, + graph_1_id, ) ) -# Add another quad to the Dataset to a non-existent Graph: -# the Graph is created automatically +# We now have 2 distinct quads in the Dataset so the Dataset has a length of 2 +assert len(d) == 2 + +# Add another quad to the Dataset specifying a non-existent Graph. +# The Graph is created automatically d.add( ( URIRef("http://example.com/subject-y"), @@ -63,8 +80,15 @@ ) ) -# printing the Dataset like this: print(d.serialize(format="trig")) -# produces a result like this: +assert len(d) == 3 + + +# You can print the Dataset like you do a Graph but you must specify a quads format like +# 'trig' or 'trix', not 'turtle', unless the default_union parameter is set to True, and +# then you can print the entire Dataset in triples. 
+# print(d.serialize(format="trig").strip()) + +# you should see something like this: """ @prefix ex: . @@ -78,85 +102,278 @@ ex:subject-y ex:predicate-y "Triple Y" . } """ -print("Printing Serialised Dataset:") -print("---") -print(d.serialize(format="trig")) -print("---") -print() -print() -# -# Use & Query -# -# print the length of the Dataset, i.e. the count of all triples in all Graphs -# we should get +# Print out one graph in the Dataset, using a standard Graph serialization format - longturtle +print(d.get_graph(URIRef("http://example.com/graph-2")).serialize(format="longturtle")) + +# you should see something like this: """ -3 +PREFIX ex: + +ex:subject-y + ex:predicate-y "Triple Y" ; +. """ -print("Printing Dataset Length:") -print("---") -print(len(d)) -print("---") -print() -print() -# Query one graph in the Dataset for all its triples -# we should get + +####################################################################################### +# 2. Looping & Counting +####################################################################################### + +# Loop through all quads in the dataset +for s, p, o, g in d.quads((None, None, None, None)): # type: ignore[arg-type] + print(f"{s}, {p}, {o}, {g}") + +# you should see something like this: """ -(rdflib.term.URIRef('http://example.com/subject-z'), rdflib.term.URIRef('http://example.com/predicate-z'), rdflib.term.Literal('Triple Z')) -(rdflib.term.URIRef('http://example.com/subject-x'), rdflib.term.URIRef('http://example.com/predicate-x'), rdflib.term.Literal('Triple X')) +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-1 +http://example.com/subject-x, http://example.com/predicate-x, Triple X, http://example.com/graph-1 +http://example.com/subject-y, http://example.com/predicate-y, Triple Y, http://example.com/graph-2 """ -print("Printing all triple from one Graph in the Dataset:") -print("---") -for triple in d.triples((None, None, None, graph_1)): # 
type: ignore[arg-type] - print(triple) -print("---") -print() -print() -# Query the union of all graphs in the dataset for all triples -# we should get nothing: +# Loop through all the quads in one Graph - just constrain the Graph field +for s, p, o, g in d.quads((None, None, None, graph_1_id)): # type: ignore[arg-type] + print(f"{s}, {p}, {o}, {g}") + +# you should see something like this: """ +http://example.com/subject-x, http://example.com/predicate-x, Triple X, http://example.com/graph-1 +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-1 """ -# A Dataset's default union graph does not exist by default (default_union property is False) -print("Attempt #1 to print all triples in the Dataset:") -print("---") -for triple in d.triples((None, None, None, None)): - print(triple) -print("---") -print() -print() -# Set the Dataset's default_union property to True and re-query +# Looping through triples in one Graph still works too +for s, p, o in d.triples((None, None, None, graph_1_id)): # type: ignore[arg-type] + print(f"{s}, {p}, {o}") + +# you should see something like this: +""" +http://example.com/subject-x, http://example.com/predicate-x, Triple X +http://example.com/subject-z, http://example.com/predicate-z, Triple Z +""" + +# Looping through triples across the whole Dataset will produce nothing +# unless we set the default_union parameter to True, since each triple is in a Named Graph + +# Setting the default_union parameter to True essentially presents all triples in all +# Graphs as a single Graph d.default_union = True -print("Attempt #2 to print all triples in the Dataset:") -print("---") -for triple in d.triples((None, None, None, None)): - print(triple) -print("---") -print() -print() +for s, p, o in d.triples((None, None, None)): + print(f"{s}, {p}, {o}") +# you should see something like this: +""" +http://example.com/subject-x, http://example.com/predicate-x, Triple X +http://example.com/subject-z, 
http://example.com/predicate-z, Triple Z +http://example.com/subject-y, http://example.com/predicate-y, Triple Y +""" -# -# Remove -# +# You can still loop through all quads now with the default_union parameter set to True +for s, p, o, g in d.quads((None, None, None)): + print(f"{s}, {p}, {o}, {g}") + +# you should see something like this: +""" +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-1 +http://example.com/subject-x, http://example.com/predicate-x, Triple X, http://example.com/graph-1 +http://example.com/subject-y, http://example.com/predicate-y, Triple Y, http://example.com/graph-2 +""" + +# Adding a triple that is already in graph-1 to graph-2 increases the number of distinct quads in +# the Dataset +d.add( + ( + URIRef("http://example.com/subject-z"), + URIRef("http://example.com/predicate-z"), + Literal("Triple Z"), + URIRef("http://example.com/graph-2"), + ) +) + +for s, p, o, g in d.quads((None, None, None, None)): + print(f"{s}, {p}, {o}, {g}") + +# you should see something like this, with the 'Z' triple in graph-1 and graph-2: +""" +http://example.com/subject-x, http://example.com/predicate-x, Triple X, http://example.com/graph-1 +http://example.com/subject-y, http://example.com/predicate-y, Triple Y, http://example.com/graph-2 +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-1 +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-2 +""" + +# but the 'length' of the Dataset is still only 3 as only distinct triples are counted +assert len(d) == 3 + + +# Looping through triples sees the 'Z' triple only once +for s, p, o in d.triples((None, None, None)): + print(f"{s}, {p}, {o}") + +# you should see something like this: +""" +http://example.com/subject-x, http://example.com/predicate-x, Triple X +http://example.com/subject-z, http://example.com/predicate-z, Triple Z +http://example.com/subject-y, http://example.com/predicate-y, 
Triple Y +""" + +####################################################################################### +# 3. Manipulating Graphs +####################################################################################### + +# List all the Graphs in the Dataset +for x in d.graphs(): + print(x) + +# this returns the graphs, something like: +""" + a rdfg:Graph;rdflib:storage [a rdflib:Store;rdfs:label 'Memory']. + a rdfg:Graph;rdflib:storage [a rdflib:Store;rdfs:label 'Memory']. + a rdfg:Graph;rdflib:storage [a rdflib:Store;rdfs:label 'Memory']. +""" + +# So try this +for x in d.graphs(): + print(x.identifier) + +# you should see something like this, noting the default, currently empty, graph: +""" +urn:x-rdflib:default +http://example.com/graph-2 +http://example.com/graph-1 +""" -# Remove Graph graph_1 from the Dataset -d.remove_graph(graph_1) +# To add to the default Graph, just add a triple, not a quad, to the Dataset directly +d.add( + ( + URIRef("http://example.com/subject-n"), + URIRef("http://example.com/predicate-n"), + Literal("Triple N"), + ) +) +for s, p, o, g in d.quads((None, None, None, None)): + print(f"{s}, {p}, {o}, {g}") + +# you should see something like this, noting the triple in the default Graph: +""" +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-1 +http://example.com/subject-z, http://example.com/predicate-z, Triple Z, http://example.com/graph-2 +http://example.com/subject-x, http://example.com/predicate-x, Triple X, http://example.com/graph-1 +http://example.com/subject-y, http://example.com/predicate-y, Triple Y, http://example.com/graph-2 +http://example.com/subject-n, http://example.com/predicate-n, Triple N, urn:x-rdflib:default +""" + +# Loop through triples per graph +for x in d.graphs(): + print(x.identifier) + for s, p, o in x.triples((None, None, None)): + print(f"\t{s}, {p}, {o}") -# printing the Dataset like this: print(d.serialize(format="trig")) -# now produces a result like this: 
+# you should see something like this: +""" +urn:x-rdflib:default + http://example.com/subject-n, http://example.com/predicate-n, Triple N +http://example.com/graph-1 + http://example.com/subject-x, http://example.com/predicate-x, Triple X + http://example.com/subject-z, http://example.com/predicate-z, Triple Z +http://example.com/graph-2 + http://example.com/subject-y, http://example.com/predicate-y, Triple Y + http://example.com/subject-z, http://example.com/predicate-z, Triple Z +""" +# The default_union parameter includes all triples in the Named Graphs and the Default Graph +for s, p, o in d.triples((None, None, None)): + print(f"{s}, {p}, {o}") + +# you should see something like this: +""" +http://example.com/subject-x, http://example.com/predicate-x, Triple X +http://example.com/subject-n, http://example.com/predicate-n, Triple N +http://example.com/subject-z, http://example.com/predicate-z, Triple Z +http://example.com/subject-y, http://example.com/predicate-y, Triple Y """ + +# To remove a graph +d.remove_graph(graph_1_id) + +# To remove the default graph +d.remove_graph(URIRef("urn:x-rdflib:default")) + +# print what's left - one graph, graph-2 +print(d.serialize(format="trig")) + +# you should see something like this: +""" +@prefix ex: . + ex:graph-2 { ex:subject-y ex:predicate-y "Triple Y" . + + ex:subject-z ex:predicate-z "Triple Z" . +} +""" + +# To add a Graph that already exists, you must give it an Identifier or else it will be assigned a Blank Node ID +g_with_id = Graph(identifier=URIRef("http://example.com/graph-3")) +g_with_id.bind("ex", "http://example.com/") + +# Add a distinct triple to the existing Graph, using Namespace IRI shortcuts +# g_with_id.bind("ex", "http://example.com/") +g_with_id.add( + ( + URIRef("http://example.com/subject-k"), + URIRef("http://example.com/predicate-k"), + Literal("Triple K"), + ) +) +d.add_graph(g_with_id) +print(d.serialize(format="trig")) + +# you should see something like this: +""" +@prefix ex: . 
+ +ex:graph-3 { + ex:subject-k ex:predicate-k "Triple K" . +} + +ex:graph-2 { + ex:subject-y ex:predicate-y "Triple Y" . + + ex:subject-z ex:predicate-z "Triple Z" . +} +""" + +# If you add a Graph with no specified identifier... +g_no_id = Graph() +g_no_id.bind("ex", "http://example.com/") + +g_no_id.add( + ( + URIRef("http://example.com/subject-l"), + URIRef("http://example.com/predicate-l"), + Literal("Triple L"), + ) +) +d.add_graph(g_no_id) + +# now when we print it, we will see a Graph with a Blank Node id: +print(d.serialize(format="trig")) + +# you should see something like this, but with a different Blank Node ID, as it is regenerated on each execution +""" +@prefix ex: <http://example.com/> . + +ex:graph-3 { + ex:subject-k ex:predicate-k "Triple K" . +} + +ex:graph-2 { + ex:subject-y ex:predicate-y "Triple Y" . + + ex:subject-z ex:predicate-z "Triple Z" . +} + +_:N9cc8b54c91724e31896da5ce41e0c937 { + ex:subject-l ex:predicate-l "Triple L" . } """ -print("Printing Serialised Dataset after graph_1 removal:") -print("---") -print(d.serialize(format="trig").strip()) -print("---") -print() -print() diff --git a/examples/jsonld_serialization.py b/examples/jsonld_serialization.py index 5bee1a6147..dd83d6a5d5 100644 --- a/examples/jsonld_serialization.py +++ b/examples/jsonld_serialization.py @@ -1,24 +1,35 @@ """ -JSON-LD is "A JSON-based Serialization for Linked Data" (https://www.w3.org/TR/json-ld/) that RDFLib implements for RDF serialization. +JSON-LD is "A JSON-based Serialization for Linked Data" (https://www.w3.org/TR/json-ld/) +that RDFLib implements for RDF serialization. -This file demonstrated some of the JSON-LD things you can do with RDFLib. Parsing & serializing so far. More to be added later. +This file demonstrates some of the JSON-LD things you can do with RDFLib. Parsing & +serializing so far. More to be added later. Parsing ------- -There are a number of "flavours" of JSON-LD - compact and verbose etc. RDFLib can parse all of these in a normal RDFLib way.
+ +There are a number of "flavours" of JSON-LD - compact and verbose etc. RDFLib can parse +all of these in a normal RDFLib way. Serialization ------------- -JSON-LD has a number of options for serialization - more than other RDF formats. For example, IRIs within JSON-LD can be compacted down to CURIES when a "context" statment is added to the JSON-LD data that maps identifiers - short codes - to IRIs and namespace IRIs like this: -# here the short code "dcterms" is mapped to the IRI http://purl.org/dc/terms/ and "schema" to https://schema.org/, as per RDFLib's in-build namespace prefixes +JSON-LD has a number of options for serialization - more than other RDF formats. For +example, IRIs within JSON-LD can be compacted down to CURIEs when a "context" statement +is added to the JSON-LD data that maps identifiers - short codes - to IRIs and namespace +IRIs like this: -"@context": { - "dct": "http://purl.org/dc/terms/", - "schema": "https://schema.org/" -} +.. code-block:: json + + "@context": { + "dcterms": "http://purl.org/dc/terms/", + "schema": "https://schema.org/" + } + +Here the short code "dcterms" is mapped to the IRI http://purl.org/dc/terms/ and +"schema" to https://schema.org/, as per RDFLib's built-in namespace prefixes. """ # import RDFLib and other things diff --git a/poetry.lock b/poetry.lock index c2d3eb8977..2072d2c5cb 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1,4 +1,4 @@ -# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand. +# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
[[package]] name = "alabaster" @@ -830,69 +830,86 @@ test = ["codecov (>=2.1)", "pytest (>=7.2)", "pytest-cov (>=4.0)"] [[package]] name = "orjson" -version = "3.10.10" +version = "3.10.13" description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy" optional = true python-versions = ">=3.8" files = [ - {file = "orjson-3.10.10-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:b788a579b113acf1c57e0a68e558be71d5d09aa67f62ca1f68e01117e550a998"}, - {file = "orjson-3.10.10-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:804b18e2b88022c8905bb79bd2cbe59c0cd014b9328f43da8d3b28441995cda4"}, - {file = "orjson-3.10.10-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9972572a1d042ec9ee421b6da69f7cc823da5962237563fa548ab17f152f0b9b"}, - {file = "orjson-3.10.10-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc6993ab1c2ae7dd0711161e303f1db69062955ac2668181bfdf2dd410e65258"}, - {file = "orjson-3.10.10-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d78e4cacced5781b01d9bc0f0cd8b70b906a0e109825cb41c1b03f9c41e4ce86"}, - {file = "orjson-3.10.10-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e6eb2598df518281ba0cbc30d24c5b06124ccf7e19169e883c14e0831217a0bc"}, - {file = "orjson-3.10.10-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:23776265c5215ec532de6238a52707048401a568f0fa0d938008e92a147fe2c7"}, - {file = "orjson-3.10.10-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8cc2a654c08755cef90b468ff17c102e2def0edd62898b2486767204a7f5cc9c"}, - {file = "orjson-3.10.10-cp310-none-win32.whl", hash = "sha256:081b3fc6a86d72efeb67c13d0ea7c030017bd95f9868b1e329a376edc456153b"}, - {file = "orjson-3.10.10-cp310-none-win_amd64.whl", hash = "sha256:ff38c5fb749347768a603be1fb8a31856458af839f31f064c5aa74aca5be9efe"}, - {file = 
"orjson-3.10.10-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:879e99486c0fbb256266c7c6a67ff84f46035e4f8749ac6317cc83dacd7f993a"}, - {file = "orjson-3.10.10-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:019481fa9ea5ff13b5d5d95e6fd5ab25ded0810c80b150c2c7b1cc8660b662a7"}, - {file = "orjson-3.10.10-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0dd57eff09894938b4c86d4b871a479260f9e156fa7f12f8cad4b39ea8028bb5"}, - {file = "orjson-3.10.10-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dbde6d70cd95ab4d11ea8ac5e738e30764e510fc54d777336eec09bb93b8576c"}, - {file = "orjson-3.10.10-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b2625cb37b8fb42e2147404e5ff7ef08712099197a9cd38895006d7053e69d6"}, - {file = "orjson-3.10.10-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dbf3c20c6a7db69df58672a0d5815647ecf78c8e62a4d9bd284e8621c1fe5ccb"}, - {file = "orjson-3.10.10-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:75c38f5647e02d423807d252ce4528bf6a95bd776af999cb1fb48867ed01d1f6"}, - {file = "orjson-3.10.10-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:23458d31fa50ec18e0ec4b0b4343730928296b11111df5f547c75913714116b2"}, - {file = "orjson-3.10.10-cp311-none-win32.whl", hash = "sha256:2787cd9dedc591c989f3facd7e3e86508eafdc9536a26ec277699c0aa63c685b"}, - {file = "orjson-3.10.10-cp311-none-win_amd64.whl", hash = "sha256:6514449d2c202a75183f807bc755167713297c69f1db57a89a1ef4a0170ee269"}, - {file = "orjson-3.10.10-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:8564f48f3620861f5ef1e080ce7cd122ee89d7d6dacf25fcae675ff63b4d6e05"}, - {file = "orjson-3.10.10-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5bf161a32b479034098c5b81f2608f09167ad2fa1c06abd4e527ea6bf4837a9"}, - {file = 
"orjson-3.10.10-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:68b65c93617bcafa7f04b74ae8bc2cc214bd5cb45168a953256ff83015c6747d"}, - {file = "orjson-3.10.10-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e8e28406f97fc2ea0c6150f4c1b6e8261453318930b334abc419214c82314f85"}, - {file = "orjson-3.10.10-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e4d0d9fe174cc7a5bdce2e6c378bcdb4c49b2bf522a8f996aa586020e1b96cee"}, - {file = "orjson-3.10.10-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b3be81c42f1242cbed03cbb3973501fcaa2675a0af638f8be494eaf37143d999"}, - {file = "orjson-3.10.10-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:65f9886d3bae65be026219c0a5f32dbbe91a9e6272f56d092ab22561ad0ea33b"}, - {file = "orjson-3.10.10-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:730ed5350147db7beb23ddaf072f490329e90a1d059711d364b49fe352ec987b"}, - {file = "orjson-3.10.10-cp312-none-win32.whl", hash = "sha256:a8f4bf5f1c85bea2170800020d53a8877812892697f9c2de73d576c9307a8a5f"}, - {file = "orjson-3.10.10-cp312-none-win_amd64.whl", hash = "sha256:384cd13579a1b4cd689d218e329f459eb9ddc504fa48c5a83ef4889db7fd7a4f"}, - {file = "orjson-3.10.10-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:44bffae68c291f94ff5a9b4149fe9d1bdd4cd0ff0fb575bcea8351d48db629a1"}, - {file = "orjson-3.10.10-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e27b4c6437315df3024f0835887127dac2a0a3ff643500ec27088d2588fa5ae1"}, - {file = "orjson-3.10.10-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bca84df16d6b49325a4084fd8b2fe2229cb415e15c46c529f868c3387bb1339d"}, - {file = "orjson-3.10.10-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c14ce70e8f39bd71f9f80423801b5d10bf93d1dceffdecd04df0f64d2c69bc01"}, - {file = "orjson-3.10.10-cp313-cp313-musllinux_1_2_x86_64.whl", hash = 
"sha256:24ac62336da9bda1bd93c0491eff0613003b48d3cb5d01470842e7b52a40d5b4"}, - {file = "orjson-3.10.10-cp313-none-win32.whl", hash = "sha256:eb0a42831372ec2b05acc9ee45af77bcaccbd91257345f93780a8e654efc75db"}, - {file = "orjson-3.10.10-cp313-none-win_amd64.whl", hash = "sha256:f0c4f37f8bf3f1075c6cc8dd8a9f843689a4b618628f8812d0a71e6968b95ffd"}, - {file = "orjson-3.10.10-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:829700cc18503efc0cf502d630f612884258020d98a317679cd2054af0259568"}, - {file = "orjson-3.10.10-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e0ceb5e0e8c4f010ac787d29ae6299846935044686509e2f0f06ed441c1ca949"}, - {file = "orjson-3.10.10-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0c25908eb86968613216f3db4d3003f1c45d78eb9046b71056ca327ff92bdbd4"}, - {file = "orjson-3.10.10-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:218cb0bc03340144b6328a9ff78f0932e642199ac184dd74b01ad691f42f93ff"}, - {file = "orjson-3.10.10-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e2277ec2cea3775640dc81ab5195bb5b2ada2fe0ea6eee4677474edc75ea6785"}, - {file = "orjson-3.10.10-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:848ea3b55ab5ccc9d7bbd420d69432628b691fba3ca8ae3148c35156cbd282aa"}, - {file = "orjson-3.10.10-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:e3e67b537ac0c835b25b5f7d40d83816abd2d3f4c0b0866ee981a045287a54f3"}, - {file = "orjson-3.10.10-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:7948cfb909353fce2135dcdbe4521a5e7e1159484e0bb024c1722f272488f2b8"}, - {file = "orjson-3.10.10-cp38-none-win32.whl", hash = "sha256:78bee66a988f1a333dc0b6257503d63553b1957889c17b2c4ed72385cd1b96ae"}, - {file = "orjson-3.10.10-cp38-none-win_amd64.whl", hash = "sha256:f1d647ca8d62afeb774340a343c7fc023efacfd3a39f70c798991063f0c681dd"}, - {file = 
"orjson-3.10.10-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:5a059afddbaa6dd733b5a2d76a90dbc8af790b993b1b5cb97a1176ca713b5df8"}, - {file = "orjson-3.10.10-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6f9b5c59f7e2a1a410f971c5ebc68f1995822837cd10905ee255f96074537ee6"}, - {file = "orjson-3.10.10-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d5ef198bafdef4aa9d49a4165ba53ffdc0a9e1c7b6f76178572ab33118afea25"}, - {file = "orjson-3.10.10-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aaf29ce0bb5d3320824ec3d1508652421000ba466abd63bdd52c64bcce9eb1fa"}, - {file = "orjson-3.10.10-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dddd5516bcc93e723d029c1633ae79c4417477b4f57dad9bfeeb6bc0315e654a"}, - {file = "orjson-3.10.10-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a12f2003695b10817f0fa8b8fca982ed7f5761dcb0d93cff4f2f9f6709903fd7"}, - {file = "orjson-3.10.10-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:672f9874a8a8fb9bb1b771331d31ba27f57702c8106cdbadad8bda5d10bc1019"}, - {file = "orjson-3.10.10-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:1dcbb0ca5fafb2b378b2c74419480ab2486326974826bbf6588f4dc62137570a"}, - {file = "orjson-3.10.10-cp39-none-win32.whl", hash = "sha256:d9bbd3a4b92256875cb058c3381b782649b9a3c68a4aa9a2fff020c2f9cfc1be"}, - {file = "orjson-3.10.10-cp39-none-win_amd64.whl", hash = "sha256:766f21487a53aee8524b97ca9582d5c6541b03ab6210fbaf10142ae2f3ced2aa"}, - {file = "orjson-3.10.10.tar.gz", hash = "sha256:37949383c4df7b4337ce82ee35b6d7471e55195efa7dcb45ab8226ceadb0fe3b"}, + {file = "orjson-3.10.13-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:1232c5e873a4d1638ef957c5564b4b0d6f2a6ab9e207a9b3de9de05a09d1d920"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:d26a0eca3035619fa366cbaf49af704c7cb1d4a0e6c79eced9f6a3f2437964b6"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d4b6acd7c9c829895e50d385a357d4b8c3fafc19c5989da2bae11783b0fd4977"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1884e53c6818686891cc6fc5a3a2540f2f35e8c76eac8dc3b40480fb59660b00"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a428afb5720f12892f64920acd2eeb4d996595bf168a26dd9190115dbf1130d"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba5b13b8739ce5b630c65cb1c85aedbd257bcc2b9c256b06ab2605209af75a2e"}, + {file = "orjson-3.10.13-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cab83e67f6aabda1b45882254b2598b48b80ecc112968fc6483fa6dae609e9f0"}, + {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:62c3cc00c7e776c71c6b7b9c48c5d2701d4c04e7d1d7cdee3572998ee6dc57cc"}, + {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:dc03db4922e75bbc870b03fc49734cefbd50fe975e0878327d200022210b82d8"}, + {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:22f1c9a30b43d14a041a6ea190d9eca8a6b80c4beb0e8b67602c82d30d6eec3e"}, + {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b42f56821c29e697c68d7d421410d7c1d8f064ae288b525af6a50cf99a4b1200"}, + {file = "orjson-3.10.13-cp310-cp310-win32.whl", hash = "sha256:0dbf3b97e52e093d7c3e93eb5eb5b31dc7535b33c2ad56872c83f0160f943487"}, + {file = "orjson-3.10.13-cp310-cp310-win_amd64.whl", hash = "sha256:46c249b4e934453be4ff2e518cd1adcd90467da7391c7a79eaf2fbb79c51e8c7"}, + {file = "orjson-3.10.13-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:a36c0d48d2f084c800763473020a12976996f1109e2fcb66cfea442fdf88047f"}, + {file = 
"orjson-3.10.13-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0065896f85d9497990731dfd4a9991a45b0a524baec42ef0a63c34630ee26fd6"}, + {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:92b4ec30d6025a9dcdfe0df77063cbce238c08d0404471ed7a79f309364a3d19"}, + {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a94542d12271c30044dadad1125ee060e7a2048b6c7034e432e116077e1d13d2"}, + {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3723e137772639af8adb68230f2aa4bcb27c48b3335b1b1e2d49328fed5e244c"}, + {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f00c7fb18843bad2ac42dc1ce6dd214a083c53f1e324a0fd1c8137c6436269b"}, + {file = "orjson-3.10.13-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0e2759d3172300b2f892dee85500b22fca5ac49e0c42cfff101aaf9c12ac9617"}, + {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ee948c6c01f6b337589c88f8e0bb11e78d32a15848b8b53d3f3b6fea48842c12"}, + {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:aa6fe68f0981fba0d4bf9cdc666d297a7cdba0f1b380dcd075a9a3dd5649a69e"}, + {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:dbcd7aad6bcff258f6896abfbc177d54d9b18149c4c561114f47ebfe74ae6bfd"}, + {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:2149e2fcd084c3fd584881c7f9d7f9e5ad1e2e006609d8b80649655e0d52cd02"}, + {file = "orjson-3.10.13-cp311-cp311-win32.whl", hash = "sha256:89367767ed27b33c25c026696507c76e3d01958406f51d3a2239fe9e91959df2"}, + {file = "orjson-3.10.13-cp311-cp311-win_amd64.whl", hash = "sha256:dca1d20f1af0daff511f6e26a27354a424f0b5cf00e04280279316df0f604a6f"}, + {file = "orjson-3.10.13-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = 
"sha256:a3614b00621c77f3f6487792238f9ed1dd8a42f2ec0e6540ee34c2d4e6db813a"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c976bad3996aa027cd3aef78aa57873f3c959b6c38719de9724b71bdc7bd14b"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f74d878d1efb97a930b8a9f9898890067707d683eb5c7e20730030ecb3fb930"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:33ef84f7e9513fb13b3999c2a64b9ca9c8143f3da9722fbf9c9ce51ce0d8076e"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd2bcde107221bb9c2fa0c4aaba735a537225104173d7e19cf73f70b3126c993"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:064b9dbb0217fd64a8d016a8929f2fae6f3312d55ab3036b00b1d17399ab2f3e"}, + {file = "orjson-3.10.13-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c0044b0b8c85a565e7c3ce0a72acc5d35cda60793edf871ed94711e712cb637d"}, + {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7184f608ad563032e398f311910bc536e62b9fbdca2041be889afcbc39500de8"}, + {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:d36f689e7e1b9b6fb39dbdebc16a6f07cbe994d3644fb1c22953020fc575935f"}, + {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:54433e421618cd5873e51c0e9d0b9fb35f7bf76eb31c8eab20b3595bb713cd3d"}, + {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e1ba0c5857dd743438acecc1cd0e1adf83f0a81fee558e32b2b36f89e40cee8b"}, + {file = "orjson-3.10.13-cp312-cp312-win32.whl", hash = "sha256:a42b9fe4b0114b51eb5cdf9887d8c94447bc59df6dbb9c5884434eab947888d8"}, + {file = "orjson-3.10.13-cp312-cp312-win_amd64.whl", hash = "sha256:3a7df63076435f39ec024bdfeb4c9767ebe7b49abc4949068d61cf4857fa6d6c"}, + {file = 
"orjson-3.10.13-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:2cdaf8b028a976ebab837a2c27b82810f7fc76ed9fb243755ba650cc83d07730"}, + {file = "orjson-3.10.13-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:48a946796e390cbb803e069472de37f192b7a80f4ac82e16d6eb9909d9e39d56"}, + {file = "orjson-3.10.13-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7d64f1db5ecbc21eb83097e5236d6ab7e86092c1cd4c216c02533332951afc"}, + {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:711878da48f89df194edd2ba603ad42e7afed74abcd2bac164685e7ec15f96de"}, + {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:cf16f06cb77ce8baf844bc222dbcb03838f61d0abda2c3341400c2b7604e436e"}, + {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:8257c3fb8dd7b0b446b5e87bf85a28e4071ac50f8c04b6ce2d38cb4abd7dff57"}, + {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d9c3a87abe6f849a4a7ac8a8a1dede6320a4303d5304006b90da7a3cd2b70d2c"}, + {file = "orjson-3.10.13-cp313-cp313-win32.whl", hash = "sha256:527afb6ddb0fa3fe02f5d9fba4920d9d95da58917826a9be93e0242da8abe94a"}, + {file = "orjson-3.10.13-cp313-cp313-win_amd64.whl", hash = "sha256:b5f7c298d4b935b222f52d6c7f2ba5eafb59d690d9a3840b7b5c5cda97f6ec5c"}, + {file = "orjson-3.10.13-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e49333d1038bc03a25fdfe11c86360df9b890354bfe04215f1f54d030f33c342"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:003721c72930dbb973f25c5d8e68d0f023d6ed138b14830cc94e57c6805a2eab"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:63664bf12addb318dc8f032160e0f5dc17eb8471c93601e8f5e0d07f95003784"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash 
= "sha256:6066729cf9552d70de297b56556d14b4f49c8f638803ee3c90fd212fa43cc6af"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8a1152e2761025c5d13b5e1908d4b1c57f3797ba662e485ae6f26e4e0c466388"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:69b21d91c5c5ef8a201036d207b1adf3aa596b930b6ca3c71484dd11386cf6c3"}, + {file = "orjson-3.10.13-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b12a63f48bb53dba8453d36ca2661f2330126d54e26c1661e550b32864b28ce3"}, + {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:a5a7624ab4d121c7e035708c8dd1f99c15ff155b69a1c0affc4d9d8b551281ba"}, + {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:0fee076134398d4e6cb827002468679ad402b22269510cf228301b787fdff5ae"}, + {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ae537fcf330b3947e82c6ae4271e092e6cf16b9bc2cef68b14ffd0df1fa8832a"}, + {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:f81b26c03f5fb5f0d0ee48d83cea4d7bc5e67e420d209cc1a990f5d1c62f9be0"}, + {file = "orjson-3.10.13-cp38-cp38-win32.whl", hash = "sha256:0bc858086088b39dc622bc8219e73d3f246fb2bce70a6104abd04b3a080a66a8"}, + {file = "orjson-3.10.13-cp38-cp38-win_amd64.whl", hash = "sha256:3ca6f17467ebbd763f8862f1d89384a5051b461bb0e41074f583a0ebd7120e8e"}, + {file = "orjson-3.10.13-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:4a11532cbfc2f5752c37e84863ef8435b68b0e6d459b329933294f65fa4bda1a"}, + {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c96d2fb80467d1d0dfc4d037b4e1c0f84f1fe6229aa7fea3f070083acef7f3d7"}, + {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dda4ba4d3e6f6c53b6b9c35266788053b61656a716a7fef5c884629c2a52e7aa"}, + {file = 
"orjson-3.10.13-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f998bbf300690be881772ee9c5281eb9c0044e295bcd4722504f5b5c6092ff"}, + {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dce1cc42ed75b585c0c4dc5eb53a90a34ccb493c09a10750d1a1f9b9eff2bd12"}, + {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03b0f29d485411e3c13d79604b740b14e4e5fb58811743f6f4f9693ee6480a8f"}, + {file = "orjson-3.10.13-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:233aae4474078d82f425134bb6a10fb2b3fc5a1a1b3420c6463ddd1b6a97eda8"}, + {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e384e330a67cf52b3597ee2646de63407da6f8fc9e9beec3eaaaef5514c7a1c9"}, + {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:4222881d0aab76224d7b003a8e5fdae4082e32c86768e0e8652de8afd6c4e2c1"}, + {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:e400436950ba42110a20c50c80dff4946c8e3ec09abc1c9cf5473467e83fd1c5"}, + {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f47c9e7d224b86ffb086059cdcf634f4b3f32480f9838864aa09022fe2617ce2"}, + {file = "orjson-3.10.13-cp39-cp39-win32.whl", hash = "sha256:a9ecea472f3eb653e1c0a3d68085f031f18fc501ea392b98dcca3e87c24f9ebe"}, + {file = "orjson-3.10.13-cp39-cp39-win_amd64.whl", hash = "sha256:5385935a73adce85cc7faac9d396683fd813566d3857fa95a0b521ef84a5b588"}, + {file = "orjson-3.10.13.tar.gz", hash = "sha256:eb9bfb14ab8f68d9d9492d4817ae497788a15fd7da72e14dfabc289c3bb088ec"}, ] [[package]] @@ -1034,13 +1051,13 @@ files = [ [[package]] name = "pytest" -version = "8.3.3" +version = "8.3.4" description = "pytest: simple powerful testing with Python" optional = false python-versions = ">=3.8" files = [ - {file = "pytest-8.3.3-py3-none-any.whl", hash = "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2"}, - {file = 
"pytest-8.3.3.tar.gz", hash = "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181"}, + {file = "pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6"}, + {file = "pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761"}, ] [package.dependencies] @@ -1108,6 +1125,7 @@ files = [ {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, + {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"}, {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, @@ -1165,29 +1183,29 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.7.0" +version = "0.8.6" description = "An extremely fast Python linter and code formatter, written in Rust." 
optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.7.0-py3-none-linux_armv6l.whl", hash = "sha256:0cdf20c2b6ff98e37df47b2b0bd3a34aaa155f59a11182c1303cce79be715628"}, - {file = "ruff-0.7.0-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:496494d350c7fdeb36ca4ef1c9f21d80d182423718782222c29b3e72b3512737"}, - {file = "ruff-0.7.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:214b88498684e20b6b2b8852c01d50f0651f3cc6118dfa113b4def9f14faaf06"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:630fce3fefe9844e91ea5bbf7ceadab4f9981f42b704fae011bb8efcaf5d84be"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:211d877674e9373d4bb0f1c80f97a0201c61bcd1e9d045b6e9726adc42c156aa"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:194d6c46c98c73949a106425ed40a576f52291c12bc21399eb8f13a0f7073495"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:82c2579b82b9973a110fab281860403b397c08c403de92de19568f32f7178598"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9af971fe85dcd5eaed8f585ddbc6bdbe8c217fb8fcf510ea6bca5bdfff56040e"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b641c7f16939b7d24b7bfc0be4102c56562a18281f84f635604e8a6989948914"}, - {file = "ruff-0.7.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d71672336e46b34e0c90a790afeac8a31954fd42872c1f6adaea1dff76fd44f9"}, - {file = "ruff-0.7.0-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:ab7d98c7eed355166f367597e513a6c82408df4181a937628dbec79abb2a1fe4"}, - {file = "ruff-0.7.0-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:1eb54986f770f49edb14f71d33312d79e00e629a57387382200b1ef12d6a4ef9"}, - {file = "ruff-0.7.0-py3-none-musllinux_1_2_i686.whl", hash = 
"sha256:dc452ba6f2bb9cf8726a84aa877061a2462afe9ae0ea1d411c53d226661c601d"}, - {file = "ruff-0.7.0-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:4b406c2dce5be9bad59f2de26139a86017a517e6bcd2688da515481c05a2cb11"}, - {file = "ruff-0.7.0-py3-none-win32.whl", hash = "sha256:f6c968509f767776f524a8430426539587d5ec5c662f6addb6aa25bc2e8195ec"}, - {file = "ruff-0.7.0-py3-none-win_amd64.whl", hash = "sha256:ff4aabfbaaba880e85d394603b9e75d32b0693152e16fa659a3064a85df7fce2"}, - {file = "ruff-0.7.0-py3-none-win_arm64.whl", hash = "sha256:10842f69c245e78d6adec7e1db0a7d9ddc2fff0621d730e61657b64fa36f207e"}, - {file = "ruff-0.7.0.tar.gz", hash = "sha256:47a86360cf62d9cd53ebfb0b5eb0e882193fc191c6d717e8bef4462bc3b9ea2b"}, + {file = "ruff-0.8.6-py3-none-linux_armv6l.whl", hash = "sha256:defed167955d42c68b407e8f2e6f56ba52520e790aba4ca707a9c88619e580e3"}, + {file = "ruff-0.8.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:54799ca3d67ae5e0b7a7ac234baa657a9c1784b48ec954a094da7c206e0365b1"}, + {file = "ruff-0.8.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:e88b8f6d901477c41559ba540beeb5a671e14cd29ebd5683903572f4b40a9807"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0509e8da430228236a18a677fcdb0c1f102dd26d5520f71f79b094963322ed25"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:91a7ddb221779871cf226100e677b5ea38c2d54e9e2c8ed847450ebbdf99b32d"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:248b1fb3f739d01d528cc50b35ee9c4812aa58cc5935998e776bf8ed5b251e75"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:bc3c083c50390cf69e7e1b5a5a7303898966be973664ec0c4a4acea82c1d4315"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52d587092ab8df308635762386f45f4638badb0866355b2b86760f6d3c076188"}, + {file = 
"ruff-0.8.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:61323159cf21bc3897674e5adb27cd9e7700bab6b84de40d7be28c3d46dc67cf"}, + {file = "ruff-0.8.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ae4478b1471fc0c44ed52a6fb787e641a2ac58b1c1f91763bafbc2faddc5117"}, + {file = "ruff-0.8.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0c000a471d519b3e6cfc9c6680025d923b4ca140ce3e4612d1a2ef58e11f11fe"}, + {file = "ruff-0.8.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:9257aa841e9e8d9b727423086f0fa9a86b6b420fbf4bf9e1465d1250ce8e4d8d"}, + {file = "ruff-0.8.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:45a56f61b24682f6f6709636949ae8cc82ae229d8d773b4c76c09ec83964a95a"}, + {file = "ruff-0.8.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:496dd38a53aa173481a7d8866bcd6451bd934d06976a2505028a50583e001b76"}, + {file = "ruff-0.8.6-py3-none-win32.whl", hash = "sha256:e169ea1b9eae61c99b257dc83b9ee6c76f89042752cb2d83486a7d6e48e8f764"}, + {file = "ruff-0.8.6-py3-none-win_amd64.whl", hash = "sha256:f1d70bef3d16fdc897ee290d7d20da3cbe4e26349f62e8a0274e7a3f4ce7a905"}, + {file = "ruff-0.8.6-py3-none-win_arm64.whl", hash = "sha256:7d7fc2377a04b6e04ffe588caad613d0c460eb2ecba4c0ccbbfe2bc973cbc162"}, + {file = "ruff-0.8.6.tar.gz", hash = "sha256:dcad24b81b62650b0eb8814f576fc65cfee8674772a6e24c9b747911801eeaa5"}, ] [[package]] @@ -1464,4 +1482,4 @@ orjson = ["orjson"] [metadata] lock-version = "2.0" python-versions = "^3.8.1" -content-hash = "71704ba175e33528872fab8121cb609041bd97b6a99f8f04022a26904941b27c" +content-hash = "3d9605c7f277f69e5c732d2edf25ed10fde6af31b791bb787229eb92be962af6" diff --git a/pyproject.toml b/pyproject.toml index 1e15fe5698..971e229d70 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.1.1" +version = "7.1.3a0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" 
authors = ["Daniel 'eikeon' Krech <eikeon@eikeon.com>"] @@ -69,7 +69,7 @@ sphinx-autodoc-typehints = ">=1.25.3,<=2.0.1" typing-extensions = "^4.5.0" [tool.poetry.group.lint.dependencies] -ruff = ">=0.0.286,<0.8.0" +ruff = ">=0.0.286,<0.10.0" [tool.poetry.extras] berkeleydb = ["berkeleydb"] @@ -166,7 +166,7 @@ ignore = [ ] [tool.black] -line-length = "88" +line-length = 88 target-version = ['py38'] required-version = "24.4.2" include = '\.pyi?$' diff --git a/rdflib/__init__.py b/rdflib/__init__.py index 0c40cd7a4d..843b614e42 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -59,6 +59,7 @@ "BNode", "IdentifiedNode", "Literal", + "Node", "Variable", "Namespace", "Dataset", @@ -195,7 +196,7 @@ XSD, Namespace, ) -from rdflib.term import BNode, IdentifiedNode, Literal, URIRef, Variable +from rdflib.term import BNode, IdentifiedNode, Literal, Node, URIRef, Variable from rdflib import plugin, query, util # isort:skip from rdflib.container import * # isort:skip # noqa: F403 diff --git a/rdflib/extras/shacl.py b/rdflib/extras/shacl.py index 30fdab07bc..1a5094ce32 100644 --- a/rdflib/extras/shacl.py +++ b/rdflib/extras/shacl.py @@ -4,18 +4,30 @@ from __future__ import annotations -from typing import Optional, Union +from typing import TYPE_CHECKING, Optional, Union -from rdflib import Graph, Literal, URIRef, paths +from rdflib import BNode, Graph, Literal, URIRef, paths +from rdflib.collection import Collection from rdflib.namespace import RDF, SH from rdflib.paths import Path from rdflib.term import Node +if TYPE_CHECKING: + from rdflib.term import IdentifiedNode + class SHACLPathError(Exception): pass +# Map the variable length path operators to the corresponding SHACL path predicates +_PATH_MOD_TO_PRED = { + paths.ZeroOrMore: SH.zeroOrMorePath, + paths.OneOrMore: SH.oneOrMorePath, + paths.ZeroOrOne: SH.zeroOrOnePath, +} + + # This implementation is roughly based on # pyshacl.helper.sparql_query_helper::SPARQLQueryHelper._shacl_path_to_sparql_path def parse_shacl_path( @@ -91,3 
+103,110 @@ def parse_shacl_path( raise SHACLPathError(f"Cannot parse {repr(path_identifier)} as a SHACL Path.") return path + + +def _build_path_component( + graph: Graph, path_component: URIRef | Path +) -> IdentifiedNode: + """ + Helper method that implements the recursive component of SHACL path + triple construction. + + :param graph: A :class:`~rdflib.graph.Graph` into which to insert triples + :param path_component: A :class:`~rdflib.term.URIRef` or + :class:`~rdflib.paths.Path` that is part of a path expression + :return: The :class:`~rdflib.term.IdentifiedNode` of the resource in the + graph that corresponds to the provided path_component + """ + # Literals or other types are not allowed + if not isinstance(path_component, (URIRef, Path)): + raise TypeError( + f"Objects of type {type(path_component)} are not valid " + + "components of a SHACL path." + ) + + # If the path component is a URI, return it + elif isinstance(path_component, URIRef): + return path_component + # Otherwise, the path component is represented as a blank node + bnode = BNode() + + # Handle Sequence Paths + if isinstance(path_component, paths.SequencePath): + # Sequence paths are a Collection directly with at least two items + if len(path_component.args) < 2: + raise SHACLPathError( + "A list of SHACL Sequence Paths must contain at least two path items." + ) + Collection( + graph, + bnode, + [_build_path_component(graph, arg) for arg in path_component.args], + ) + + # Handle Inverse Paths + elif isinstance(path_component, paths.InvPath): + graph.add( + (bnode, SH.inversePath, _build_path_component(graph, path_component.arg)) + ) + + # Handle Alternative Paths + elif isinstance(path_component, paths.AlternativePath): + # Alternative paths are a Collection but referenced by sh:alternativePath + # with at least two items + if len(path_component.args) < 2: + raise SHACLPathError( + "List of SHACL alternate paths must have at least two path items." 
+ ) + coll = Collection( + graph, + BNode(), + [_build_path_component(graph, arg) for arg in path_component.args], + ) + graph.add((bnode, SH.alternativePath, coll.uri)) + + # Handle Variable Length Paths + elif isinstance(path_component, paths.MulPath): + # Get the predicate corresponding to the path modifier + pred = _PATH_MOD_TO_PRED.get(path_component.mod) + if pred is None: + raise SHACLPathError(f"Unknown path modifier {path_component.mod}") + graph.add((bnode, pred, _build_path_component(graph, path_component.path))) + + # Return the blank node created for the provided path_component + return bnode + + +def build_shacl_path( + path: URIRef | Path, target_graph: Graph | None = None +) -> tuple[IdentifiedNode, Graph | None]: + """ + Build the SHACL Path triples for a path given by a :class:`~rdflib.term.URIRef` for + simple paths or a :class:`~rdflib.paths.Path` for complex paths. + + Returns an :class:`~rdflib.term.IdentifiedNode` for the path (which should be + the object of a triple with predicate sh:path) and the graph into which any + new triples were added. + + :param path: A :class:`~rdflib.term.URIRef` or a :class:`~rdflib.paths.Path` + :param target_graph: Optionally, a :class:`~rdflib.graph.Graph` into which to put + constructed triples. If not provided, a new graph will be created + :return: A (path_identifier, graph) tuple where: + - path_identifier: If path is a :class:`~rdflib.term.URIRef`, this is simply + the provided path. If path is a :class:`~rdflib.paths.Path`, this is + the :class:`~rdflib.term.BNode` corresponding to the root of the SHACL + path expression added to the graph. + - graph: None if path is a :class:`~rdflib.term.URIRef` (as no new triples + are constructed). If path is a :class:`~rdflib.paths.Path`, this is either the + target_graph provided or a new graph into which the path triples were added. + """ + # If a path is a URI, that's the whole path. No graph needs to be constructed. 
+ if isinstance(path, URIRef): + return path, None + + # Create a graph if one was not provided + if target_graph is None: + target_graph = Graph() + + # Recurse through the path to build the graph representation + return _build_path_component(target_graph, path), target_graph diff --git a/rdflib/graph.py b/rdflib/graph.py index 80ccc3fa8e..d74dd85cfa 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -411,11 +411,50 @@ class Graph(Node): - """An RDF Graph + """An RDF Graph: a Python object containing nodes and relations between them as + RDF 'triples'. - The constructor accepts one argument, the "store" - that will be used to store the graph data (see the "store" - package for stores currently shipped with rdflib). + This is the central RDFLib object class and Graph objects are almost always present + in all uses of RDFLib. + + The basic use is to create a Graph and iterate through or query its content, e.g.: + + >>> from rdflib import Graph, URIRef + >>> g = Graph() + + >>> g.add(( + ... URIRef("http://example.com/s1"), # subject + ... URIRef("http://example.com/p1"), # predicate + ... URIRef("http://example.com/o1"), # object + ... )) # doctest: +ELLIPSIS + <Graph identifier=... (<class 'rdflib.graph.Graph'>)> + + >>> g.add(( + ... URIRef("http://example.com/s2"), # subject + ... URIRef("http://example.com/p2"), # predicate + ... URIRef("http://example.com/o2"), # object + ... )) # doctest: +ELLIPSIS + <Graph identifier=... (<class 'rdflib.graph.Graph'>)> + + >>> for triple in sorted(g): # simple looping + ... print(triple) + (rdflib.term.URIRef('http://example.com/s1'), rdflib.term.URIRef('http://example.com/p1'), rdflib.term.URIRef('http://example.com/o1')) + (rdflib.term.URIRef('http://example.com/s2'), rdflib.term.URIRef('http://example.com/p2'), rdflib.term.URIRef('http://example.com/o2')) + + >>> # get the object of the triple with subject s1 and predicate p1 + >>> o = g.value( + ... subject=URIRef("http://example.com/s1"), + ... predicate=URIRef("http://example.com/p1") + ... 
) + + + The constructor accepts one argument, the "store" that will be used to store the + graph data with the default being the `Memory <rdflib.plugins.stores.memory.Memory>` + (in memory) Store. Other Stores that persist content to disk using various file + databases or Stores that use remote servers (SPARQL systems) are supported. See + the :doc:`rdflib.plugins.stores` package for Stores currently shipped with RDFLib. + Other Stores not shipped with RDFLib can be added, such as + `HDT <https://github.com/RDFLib/rdflib-hdt>`_. Stores can be context-aware or unaware. Unaware stores take up (some) less space but cannot support features that require @@ -423,14 +462,15 @@ class Graph(Node): provenance. Even if used with a context-aware store, Graph will only expose the quads which - belong to the default graph. To access the rest of the data, `ConjunctiveGraph` or - `Dataset` classes can be used instead. + belong to the default graph. To access the rest of the data, the + `Dataset` class can be used instead. The Graph constructor can take an identifier which identifies the Graph by name. If none is given, the graph is assigned a BNode for its identifier. - For more on named graphs, see: http://www.w3.org/2004/03/trix/ + For more on Named Graphs, see the RDFLib `Dataset` class and the TriG Specification, + https://www.w3.org/TR/trig/. 
""" context_aware: bool @@ -1090,10 +1130,10 @@ def transitiveClosure( # noqa: N802 function against the graph >>> from rdflib.collection import Collection - >>> g=Graph() - >>> a=BNode("foo") - >>> b=BNode("bar") - >>> c=BNode("baz") + >>> g = Graph() + >>> a = BNode("foo") + >>> b = BNode("bar") + >>> c = BNode("baz") >>> g.add((a,RDF.first,RDF.type)) # doctest: +ELLIPSIS <Graph identifier=... (<class 'rdflib.graph.Graph'>)> >>> g.add((a,RDF.rest,b)) # doctest: +ELLIPSIS @@ -1354,7 +1394,7 @@ def serialize( else: os_path = location with open(os_path, "wb") as stream: - serializer.serialize(stream, encoding=encoding, **args) + serializer.serialize(stream, base=base, encoding=encoding, **args) return self def print( @@ -2297,21 +2337,49 @@ def __reduce__(self) -> Tuple[Type[Graph], Tuple[Store, _ContextIdentifierType]] class Dataset(ConjunctiveGraph): """ - RDF 1.1 Dataset. Small extension to the Conjunctive Graph: - - the primary term is graphs in the datasets and not contexts with quads, - so there is a separate method to set/retrieve a graph in a dataset and - operate with graphs - - graphs cannot be identified with blank nodes - - added a method to directly add a single quad + An RDFLib Dataset is an object that stores multiple Named Graphs - instances of + RDFLib Graph identified by IRI - within it and allows whole-of-dataset or single + Graph use. + + RDFLib's Dataset class is based on the `RDF 1.2 'Dataset' definition + <https://www.w3.org/TR/rdf12-concepts/#section-dataset>`_: + + .. + + An RDF dataset is a collection of RDF graphs, and comprises: + + - Exactly one default graph, being an RDF graph. The default graph does not + have a name and MAY be empty. + - Zero or more named graphs. Each named graph is a pair consisting of an IRI or + a blank node (the graph name), and an RDF graph. Graph names are unique + within an RDF dataset. 
- Examples of usage: + Accordingly, a Dataset allows for `Graph` objects to be added to it with + :class:`rdflib.term.URIRef` or :class:`rdflib.term.BNode` identifiers and always + creates a default graph with the :class:`rdflib.term.URIRef` identifier + :code:`urn:x-rdflib:default`. + + Dataset extends Graph's Subject, Predicate, Object (s, p, o) 'triple' + structure to include a graph identifier - archaically called Context - producing + 'quads' of s, p, o, g. + + Triples, or quads, can be added to a Dataset. Triples, or quads, with the graph + identifier :code:`urn:x-rdflib:default` go into the default graph. + + .. note:: Dataset builds on the `ConjunctiveGraph` class but that class's direct + use is now deprecated (since RDFLib 7.x) and it should not be used. + `ConjunctiveGraph` will be removed from future RDFLib versions. + + Examples of usage; see also the examples/datasets.py file: >>> # Create a new Dataset >>> ds = Dataset() >>> # simple triples goes to default graph - >>> ds.add((URIRef("http://example.org/a"), - ... URIRef("http://www.example.org/b"), - ... Literal("foo"))) # doctest: +ELLIPSIS + >>> ds.add(( + ... URIRef("http://example.org/a"), + ... URIRef("http://www.example.org/b"), + ... Literal("foo") + ... )) # doctest: +ELLIPSIS <Graph identifier=... (<class 'rdflib.graph.Dataset'>)> >>> >>> # Create a graph in the dataset, if the graph name has already been @@ -2320,16 +2388,19 @@ class Dataset(ConjunctiveGraph): >>> g = ds.graph(URIRef("http://www.example.com/gr")) >>> >>> # add triples to the new graph as usual - >>> g.add( - ... (URIRef("http://example.org/x"), + >>> g.add(( + ... URIRef("http://example.org/x"), ... URIRef("http://example.org/y"), - ... Literal("bar")) ) # doctest: +ELLIPSIS + ... Literal("bar") + ... )) # doctest: +ELLIPSIS <Graph identifier=... (<class 'rdflib.graph.Graph'>)> >>> # alternatively: add a quad to the dataset -> goes to the graph - >>> ds.add( - ... (URIRef("http://example.org/x"), + >>> ds.add(( + ... URIRef("http://example.org/x"), ... URIRef("http://example.org/z"), - ... 
Literal("foo-bar"),g) ) # doctest: +ELLIPSIS + ... Literal("foo-bar"), + ... g + ... )) # doctest: +ELLIPSIS <Graph identifier=... (<class 'rdflib.graph.Dataset'>)> >>> >>> # querying triples return them all regardless of the graph @@ -2395,8 +2466,8 @@ class Dataset(ConjunctiveGraph): >>> >>> # graph names in the dataset can be queried: >>> for c in ds.graphs(): # doctest: +SKIP - ... print(c) # doctest: - DEFAULT + ... print(c.identifier) # doctest: + urn:x-rdflib:default http://www.example.com/gr >>> # A graph can be created without specifying a name; a skolemized genid >>> # is created on the fly @@ -2415,7 +2486,7 @@ class Dataset(ConjunctiveGraph): >>> >>> # a graph can also be removed from a dataset via ds.remove_graph(g) - .. versionadded:: 4.0 + .. versionadded:: 4.0 """ def __init__( diff --git a/rdflib/namespace/__init__.py b/rdflib/namespace/__init__.py index 4077b0be33..eb8e2eeed8 100644 --- a/rdflib/namespace/__init__.py +++ b/rdflib/namespace/__init__.py @@ -226,6 +226,7 @@ def __repr__(self) -> str: # considered part of __dir__ results. These should be all annotations on # `DefinedNamespaceMeta`. _DFNS_RESERVED_ATTRS: Set[str] = { + "__slots__", "_NS", "_warn", "_fail", @@ -244,6 +245,8 @@ def __repr__(self) -> str: class DefinedNamespaceMeta(type): """Utility metaclass for generating URIRefs with a common prefix.""" + __slots__: Tuple[str, ...] = tuple() + _NS: Namespace _warn: bool = True _fail: bool = False # True means mimic ClosedNamespace @@ -255,15 +258,11 @@ def __getitem__(cls, name: str, default=None) -> URIRef: name = str(name) if name in _DFNS_RESERVED_ATTRS: - raise AttributeError( - f"DefinedNamespace like object has no attribute {name!r}" + raise KeyError( + f"DefinedNamespace like object has no access item named {name!r}" ) elif name in _IGNORED_ATTR_LOOKUP: raise KeyError() - if str(name).startswith("__"): - # NOTE on type ignore: This seems to be a real bug, super() does not - # implement this method, it will fail if it is ever reached. 
- return super().__getitem__(name, default) # type: ignore[misc] # undefined in superclass if (cls._warn or cls._fail) and name not in cls: if cls._fail: raise AttributeError(f"term '{name}' not in namespace '{cls._NS}'") @@ -277,26 +276,39 @@ def __getitem__(cls, name: str, default=None) -> URIRef: def __getattr__(cls, name: str): if name in _IGNORED_ATTR_LOOKUP: raise AttributeError() + elif name in _DFNS_RESERVED_ATTRS: + raise AttributeError( + f"DefinedNamespace like object has no attribute {name!r}" + ) + elif name.startswith("__"): + return super(DefinedNamespaceMeta, cls).__getattribute__(name) return cls.__getitem__(name) def __repr__(cls) -> str: - return f"Namespace({str(cls._NS)!r})" + try: + ns_repr = repr(cls._NS) + except AttributeError: + ns_repr = "" + return f"Namespace({ns_repr})" def __str__(cls) -> str: - return str(cls._NS) + try: + return str(cls._NS) + except AttributeError: + return "" def __add__(cls, other: str) -> URIRef: return cls.__getitem__(other) def __contains__(cls, item: str) -> bool: """Determine whether a URI or an individual item belongs to this namespace""" + try: + this_ns = cls._NS + except AttributeError: + return False item_str = str(item) - if item_str.startswith("__"): - # NOTE on type ignore: This seems to be a real bug, super() does not - # implement this method, it will fail if it is ever reached. 
- return super().__contains__(item) # type: ignore[misc] # undefined in superclass - if item_str.startswith(str(cls._NS)): - item_str = item_str[len(str(cls._NS)) :] + if item_str.startswith(str(this_ns)): + item_str = item_str[len(str(this_ns)) :] return any( item_str in c.__annotations__ or item_str in c._extras @@ -313,7 +325,7 @@ def __dir__(cls) -> Iterable[str]: return values def as_jsonld_context(self, pfx: str) -> dict: # noqa: N804 - """Returns this DefinedNamespace as a a JSON-LD 'context' object""" + """Returns this DefinedNamespace as a JSON-LD 'context' object""" terms = {pfx: str(self._NS)} for key, term in self.__annotations__.items(): if issubclass(term, URIRef): @@ -328,6 +340,8 @@ class DefinedNamespace(metaclass=DefinedNamespaceMeta): Warnings are emitted if unknown members are referenced if _warn is True """ + __slots__: Tuple[str, ...] = tuple() + def __init__(self): raise TypeError("namespace may not be instantiated") diff --git a/rdflib/plugins/parsers/jsonld.py b/rdflib/plugins/parsers/jsonld.py index 295a971263..e103e7033a 100644 --- a/rdflib/plugins/parsers/jsonld.py +++ b/rdflib/plugins/parsers/jsonld.py @@ -34,6 +34,7 @@ # we should consider streaming the input to deal with arbitrarily large graphs. 
from __future__ import annotations +import secrets import warnings from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Union @@ -221,6 +222,7 @@ def __init__( if allow_lists_of_lists is not None else ALLOW_LISTS_OF_LISTS ) + self.invalid_uri_to_bnode: dict[str, BNode] = {} def parse(self, data: Any, context: Context, dataset: Graph) -> Graph: topcontext = False @@ -629,7 +631,12 @@ def _to_rdf_id(self, context: Context, id_val: str) -> Optional[IdentifiedNode]: uri = context.resolve(id_val) if not self.generalized_rdf and ":" not in uri: return None - return URIRef(uri) + node: IdentifiedNode = URIRef(uri) + if not str(node): + if id_val not in self.invalid_uri_to_bnode: + self.invalid_uri_to_bnode[id_val] = BNode(secrets.token_urlsafe(20)) + node = self.invalid_uri_to_bnode[id_val] + return node def _get_bnodeid(self, ref: str) -> Optional[str]: if not ref.startswith("_:"): diff --git a/rdflib/plugins/serializers/hext.py b/rdflib/plugins/serializers/hext.py index 9a8187c760..898308a092 100644 --- a/rdflib/plugins/serializers/hext.py +++ b/rdflib/plugins/serializers/hext.py @@ -77,8 +77,8 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = "utf-8", - **kwargs, - ): + **kwargs: Any, + ) -> None: if base is not None: warnings.warn( "base has no meaning for Hextuples serialization. " diff --git a/rdflib/plugins/serializers/jsonld.py b/rdflib/plugins/serializers/jsonld.py index 15f307edf2..0afe8305a8 100644 --- a/rdflib/plugins/serializers/jsonld.py +++ b/rdflib/plugins/serializers/jsonld.py @@ -64,8 +64,8 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = None, - **kwargs, - ): + **kwargs: Any, + ) -> None: # TODO: docstring w. 
args and return value encoding = encoding or "utf-8" if encoding not in ("utf-8", "utf-16"): diff --git a/rdflib/plugins/serializers/longturtle.py b/rdflib/plugins/serializers/longturtle.py index e886574f35..8de1e52a28 100644 --- a/rdflib/plugins/serializers/longturtle.py +++ b/rdflib/plugins/serializers/longturtle.py @@ -16,7 +16,13 @@ - Nicholas Car, 2023 """ +from __future__ import annotations + +from typing import IO, Any, Optional + +from rdflib.compare import to_canonical_graph from rdflib.exceptions import Error +from rdflib.graph import Graph from rdflib.namespace import RDF from rdflib.term import BNode, Literal, URIRef @@ -38,11 +44,20 @@ class LongTurtleSerializer(RecursiveSerializer): def __init__(self, store): self._ns_rewrite = {} - super(LongTurtleSerializer, self).__init__(store) + store = to_canonical_graph(store) + content = store.serialize(format="application/n-triples") + lines = content.split("\n") + lines.sort() + graph = Graph() + graph.parse( + data="\n".join(lines), format="application/n-triples", skolemize=True + ) + graph = graph.de_skolemize() + super(LongTurtleSerializer, self).__init__(graph) self.keywords = {RDF.type: "a"} self.reset() self.stream = None - self._spacious = _SPACIOUS_OUTPUT + self._spacious: bool = _SPACIOUS_OUTPUT def addNamespace(self, prefix, namespace): # Turtle does not support prefixes that start with _ @@ -74,7 +89,14 @@ def reset(self): self._started = False self._ns_rewrite = {} - def serialize(self, stream, base=None, encoding=None, spacious=None, **args): + def serialize( + self, + stream: IO[bytes], + base: Optional[str] = None, + encoding: Optional[str] = None, + spacious: Optional[bool] = None, + **kwargs: Any, + ) -> None: self.reset() self.stream = stream # if base is given here, use, if not and a base is set for the graph use that @@ -175,7 +197,7 @@ def s_squared(self, subject): return False self.write("\n" + self.indent() + "[]") self.predicateList(subject, newline=False) - self.write(" ;\n.") + 
self.write("\n.") return True def path(self, node, position, newline=False): @@ -292,6 +314,8 @@ def objectList(self, objects): if count > 1: if not isinstance(objects[0], BNode): self.write("\n" + self.indent(1)) + else: + self.write(" ") first_nl = True self.path(objects[0], OBJECT, newline=first_nl) for obj in objects[1:]: diff --git a/rdflib/plugins/serializers/nquads.py b/rdflib/plugins/serializers/nquads.py index 3c8d02ccc4..b74b9cab52 100644 --- a/rdflib/plugins/serializers/nquads.py +++ b/rdflib/plugins/serializers/nquads.py @@ -1,7 +1,7 @@ from __future__ import annotations import warnings -from typing import IO, Optional +from typing import IO, Any, Optional from rdflib.graph import ConjunctiveGraph, Graph from rdflib.plugins.serializers.nt import _quoteLiteral @@ -26,8 +26,8 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = None, - **args, - ): + **kwargs: Any, + ) -> None: if base is not None: warnings.warn("NQuadsSerializer does not support base.") if encoding is not None and encoding.lower() != self.encoding.lower(): diff --git a/rdflib/plugins/serializers/nt.py b/rdflib/plugins/serializers/nt.py index e87f949e34..1b0343b5ac 100644 --- a/rdflib/plugins/serializers/nt.py +++ b/rdflib/plugins/serializers/nt.py @@ -2,7 +2,7 @@ import codecs import warnings -from typing import IO, TYPE_CHECKING, Optional, Tuple, Union +from typing import IO, TYPE_CHECKING, Any, Optional, Tuple, Union from rdflib.graph import Graph from rdflib.serializer import Serializer @@ -33,7 +33,7 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = "utf-8", - **args, + **kwargs: Any, ) -> None: if base is not None: warnings.warn("NTSerializer does not support base.") diff --git a/rdflib/plugins/serializers/patch.py b/rdflib/plugins/serializers/patch.py index 3a5d372150..1bc5ff41f7 100644 --- a/rdflib/plugins/serializers/patch.py +++ b/rdflib/plugins/serializers/patch.py @@ -1,7 +1,7 @@ from __future__ 
import annotations import warnings -from typing import IO, Optional +from typing import IO, Any, Optional from uuid import uuid4 from rdflib import Dataset @@ -32,8 +32,8 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = None, - **kwargs, - ): + **kwargs: Any, + ) -> None: """ Serialize the store to the given stream. :param stream: The stream to serialize to. diff --git a/rdflib/plugins/serializers/rdfxml.py b/rdflib/plugins/serializers/rdfxml.py index d6a2f6abb6..8ae7d78cbe 100644 --- a/rdflib/plugins/serializers/rdfxml.py +++ b/rdflib/plugins/serializers/rdfxml.py @@ -1,7 +1,7 @@ from __future__ import annotations import xml.dom.minidom -from typing import IO, Dict, Generator, Optional, Set, Tuple +from typing import IO, Any, Dict, Generator, Optional, Set, Tuple from xml.sax.saxutils import escape, quoteattr from rdflib.collection import Collection @@ -47,7 +47,7 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = None, - **args, + **kwargs: Any, ) -> None: # if base is given here, use that, if not and a base is set for the graph use that if base is not None: @@ -66,8 +66,8 @@ def serialize( write(" None: self.__serialized: Dict[Identifier, int] = {} store = self.store @@ -185,7 +185,7 @@ def serialize( self.base = base elif store.base is not None: self.base = store.base - self.max_depth = args.get("max_depth", 3) + self.max_depth = kwargs.get("max_depth", 3) assert self.max_depth > 0, "max_depth must be greater than 0" self.nm = nm = store.namespace_manager @@ -205,8 +205,8 @@ def serialize( writer.push(RDFVOC.RDF) - if "xml_base" in args: - writer.attribute(XMLBASE, args["xml_base"]) + if "xml_base" in kwargs: + writer.attribute(XMLBASE, kwargs["xml_base"]) elif self.base: writer.attribute(XMLBASE, self.base) diff --git a/rdflib/plugins/serializers/trig.py b/rdflib/plugins/serializers/trig.py index 984f80c5ac..95b5e42c03 100644 --- a/rdflib/plugins/serializers/trig.py +++ 
b/rdflib/plugins/serializers/trig.py @@ -5,7 +5,7 @@ from __future__ import annotations -from typing import IO, TYPE_CHECKING, Dict, List, Optional, Tuple, Union +from typing import IO, TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union from rdflib.graph import ConjunctiveGraph, Graph from rdflib.plugins.serializers.turtle import TurtleSerializer @@ -67,8 +67,8 @@ def serialize( base: Optional[str] = None, encoding: Optional[str] = None, spacious: Optional[bool] = None, - **args, - ): + **kwargs: Any, + ) -> None: self.reset() self.stream = stream # if base is given here, use that, if not and a base is set for the graph use that diff --git a/rdflib/plugins/serializers/trix.py b/rdflib/plugins/serializers/trix.py index 008360e6b8..95730e8fbb 100644 --- a/rdflib/plugins/serializers/trix.py +++ b/rdflib/plugins/serializers/trix.py @@ -1,6 +1,6 @@ from __future__ import annotations -from typing import IO, Optional +from typing import IO, Any, Optional from rdflib.graph import ConjunctiveGraph, Graph from rdflib.namespace import Namespace @@ -28,8 +28,8 @@ def serialize( stream: IO[bytes], base: Optional[str] = None, encoding: Optional[str] = None, - **args, - ): + **kwargs: Any, + ) -> None: nm = self.store.namespace_manager self.writer = XMLWriter(stream, nm, encoding, extra_ns={"": TRIXNS}) diff --git a/rdflib/plugins/serializers/turtle.py b/rdflib/plugins/serializers/turtle.py index a26df04a6f..d1dfcf4a67 100644 --- a/rdflib/plugins/serializers/turtle.py +++ b/rdflib/plugins/serializers/turtle.py @@ -228,7 +228,7 @@ def serialize( base: Optional[str] = None, encoding: Optional[str] = None, spacious: Optional[bool] = None, - **args: Any, + **kwargs: Any, ) -> None: self.reset() self.stream = stream diff --git a/rdflib/plugins/sparql/parser.py b/rdflib/plugins/sparql/parser.py index 3ee230f53d..789d076102 100644 --- a/rdflib/plugins/sparql/parser.py +++ b/rdflib/plugins/sparql/parser.py @@ -1483,7 +1483,7 @@ def expandCollection(terms: ParseResults) -> 
List[List[Any]]: AskQuery = Comp( "AskQuery", Keyword("ASK") - + Param("datasetClause", ZeroOrMore(DatasetClause)) + + ZeroOrMore(ParamList("datasetClause", DatasetClause)) + WhereClause + SolutionModifier + ValuesClause, diff --git a/rdflib/plugins/stores/auditable.py b/rdflib/plugins/stores/auditable.py index 7a9748c698..b8fb534195 100644 --- a/rdflib/plugins/stores/auditable.py +++ b/rdflib/plugins/stores/auditable.py @@ -10,7 +10,7 @@ Calls to commit or rollback, flush the list of reverse operations This provides thread-safe atomicity and isolation (assuming concurrent operations occur with different store instances), but no durability (transactions are -persisted in memory and wont be available to reverse operations after the +persisted in memory and won't be available to reverse operations after the system fails): A and I out of ACID. """ diff --git a/test/data/longturtle/longturtle-target.ttl b/test/data/longturtle/longturtle-target.ttl new file mode 100644 index 0000000000..54cf23e9ff --- /dev/null +++ b/test/data/longturtle/longturtle-target.ttl @@ -0,0 +1,72 @@ +PREFIX geo: +PREFIX rdf: +PREFIX schema: +PREFIX xsd: + + + a schema:Person ; + schema:age 41 ; + schema:alternateName + [ + schema:name "Dr N.J. Car" ; + ] , + "N.J. Car" , + "Nick Car" ; + schema:name + [ + a ; + schema:hasPart + [ + a ; + schema:hasPart + [ + a ; + rdf:value "Car" ; + ] , + [ + a ; + rdf:value "Maxov" ; + ] ; + ] , + [ + a ; + rdf:value "Nicholas" ; + ] , + [ + a ; + rdf:value "John" ; + ] ; + ] ; + schema:worksFor ; +. + + + a schema:Organization ; + schema:location ; +. 
+ + + a schema:Place ; + schema:address + [ + a schema:PostalAddress ; + schema:addressCountry + [ + schema:identifier "au" ; + schema:name "Australia" ; + ] ; + schema:addressLocality "Shorncliffe" ; + schema:addressRegion "QLD" ; + schema:postalCode 4017 ; + schema:streetAddress ( + 72 + "Yundah" + "Street" + ) ; + ] ; + schema:geo + [ + schema:polygon "POLYGON((153.082403 -27.325801, 153.08241 -27.32582, 153.082943 -27.325612, 153.083010 -27.325742, 153.083543 -27.325521, 153.083456 -27.325365, 153.082403 -27.325801))"^^geo:wktLiteral ; + ] ; + schema:name "KurrawongAI HQ" ; +. diff --git a/test/jsonld/local-suite/manifest.jsonld b/test/jsonld/local-suite/manifest.jsonld index b32fd059ad..0150b44c77 100644 --- a/test/jsonld/local-suite/manifest.jsonld +++ b/test/jsonld/local-suite/manifest.jsonld @@ -27,6 +27,17 @@ "purpose": "Multiple @id aliases. Issue #2164", "input": "toRdf-twoimports-in.jsonld", "expect": "toRdf-twoimports-out.nq" + }, + { + "@id": "#toRdf-two-invalid-ids", + "@type": ["jld:PositiveEvaluationTest", "jld:ToRDFTest"], + "name": "Two invalid identifiers", + "purpose": "Multiple nodes with invalid @ids are not merged together.", + "option": { + "produceGeneralizedRdf": true + }, + "input": "toRdf-twoinvalidids-in.jsonld", + "expect": "toRdf-twoinvalidids-out.nq" } ] } diff --git a/test/jsonld/local-suite/toRdf-twoinvalidids-in.jsonld b/test/jsonld/local-suite/toRdf-twoinvalidids-in.jsonld new file mode 100644 index 0000000000..67f62927cc --- /dev/null +++ b/test/jsonld/local-suite/toRdf-twoinvalidids-in.jsonld @@ -0,0 +1,20 @@ +{ + "@id": "https://example.org/root-object", + "https://schema.org/author": [ + { + "@id": "https://example.org/ invalid url 1", + "https://schema.org/name": "Jane Doe" + }, + { + "@id": "https://example.org/ invalid url 1", + "https://schema.org/givenName": "Jane", + "https://schema.org/familyName": "Doe" + }, + { + "@id": "https://example.org/ invalid url 2", + "https://schema.org/name": "John Doe", + 
"https://schema.org/givenName": "John", + "https://schema.org/familyName": "Doe" + } + ] +} diff --git a/test/jsonld/local-suite/toRdf-twoinvalidids-out.nq b/test/jsonld/local-suite/toRdf-twoinvalidids-out.nq new file mode 100644 index 0000000000..c6550560c3 --- /dev/null +++ b/test/jsonld/local-suite/toRdf-twoinvalidids-out.nq @@ -0,0 +1,10 @@ + +<https://example.org/root-object> <https://schema.org/author> _:b1. +<https://example.org/root-object> <https://schema.org/author> _:b2. + +_:b1 <https://schema.org/name> "Jane Doe". +_:b1 <https://schema.org/givenName> "Jane". +_:b1 <https://schema.org/familyName> "Doe". +_:b2 <https://schema.org/name> "John Doe". +_:b2 <https://schema.org/givenName> "John". +_:b2 <https://schema.org/familyName> "Doe". diff --git a/test/test_dataset/test_dataset.py b/test/test_dataset/test_dataset.py index 19b9fe830a..9f9bc9c26c 100644 --- a/test/test_dataset/test_dataset.py +++ b/test/test_dataset/test_dataset.py @@ -5,11 +5,10 @@ import pytest -from rdflib import URIRef, plugin +from rdflib import BNode, Namespace, URIRef, plugin from rdflib.graph import DATASET_DEFAULT_GRAPH_ID, Dataset, Graph from rdflib.store import Store from test.data import CONTEXT1, LIKES, PIZZA, TAREK -from test.utils.namespace import EGSCHEME # Will also run SPARQLUpdateStore tests against local SPARQL1.1 endpoint if # available. This assumes SPARQL1.1 query/update endpoints running locally at @@ -58,9 +57,9 @@ def get_dataset(request): except ImportError: pytest.skip("Dependencies for store '%s' not available!" % store) - graph = Dataset(store=store) + d = Dataset(store=store) - if not graph.store.graph_aware: + if not d.store.graph_aware: return if store in ["SQLiteLSM", "LevelDB"]: @@ -75,31 +74,39 @@ def get_dataset(request): else: path = tempfile.mkdtemp() - graph.open(path, create=True if store != "SPARQLUpdateStore" else False) + d.open(path, create=True if store != "SPARQLUpdateStore" else False) if store == "SPARQLUpdateStore": try: - graph.store.update("CLEAR ALL") + d.graph() + d.add( + ( + URIRef("http://example.com/s"), + URIRef("http://example.com/p"), + URIRef("http://example.com/o"), + ) + ) + d.store.update("CLEAR ALL") except Exception as e: if "SPARQLStore does not support BNodes! 
" in str(e): pass else: raise Exception(e) - yield store, graph + yield store, d if store == "SPARQLUpdateStore": try: - graph.store.update("CLEAR ALL") + d.update("CLEAR ALL") except Exception as e: if "SPARQLStore does not support BNodes! " in str(e): pass else: raise Exception(e) - graph.close() + d.close() else: - graph.close() - graph.destroy(path) + d.close() + d.destroy(path) if os.path.isdir(path): shutil.rmtree(path) else: @@ -121,7 +128,7 @@ def test_graph_aware(get_dataset): # empty named graphs if store != "SPARQLUpdateStore": # added graph exists - assert set(x.identifier for x in dataset.contexts()) == set( + assert set(x.identifier for x in dataset.graphs()) == set( [CONTEXT1, DATASET_DEFAULT_GRAPH_ID] ) @@ -131,7 +138,7 @@ def test_graph_aware(get_dataset): g1.add((TAREK, LIKES, PIZZA)) # added graph still exists - assert set(x.identifier for x in dataset.contexts()) == set( + assert set(x.identifier for x in dataset.graphs()) == set( [CONTEXT1, DATASET_DEFAULT_GRAPH_ID] ) @@ -147,14 +154,14 @@ def test_graph_aware(get_dataset): # empty named graphs if store != "SPARQLUpdateStore": # graph still exists, although empty - assert set(x.identifier for x in dataset.contexts()) == set( + assert set(x.identifier for x in dataset.graphs()) == set( [CONTEXT1, DATASET_DEFAULT_GRAPH_ID] ) dataset.remove_graph(CONTEXT1) # graph is gone - assert set(x.identifier for x in dataset.contexts()) == set( + assert set(x.identifier for x in dataset.graphs()) == set( [DATASET_DEFAULT_GRAPH_ID] ) @@ -173,7 +180,7 @@ def test_default_graph(get_dataset): dataset.add((TAREK, LIKES, PIZZA)) assert len(dataset) == 1 # only default exists - assert list(dataset.contexts()) == [dataset.default_context] + assert list(dataset.graphs()) == [dataset.default_context] # removing default graph removes triples but not actual graph dataset.remove_graph(DATASET_DEFAULT_GRAPH_ID) @@ -181,7 +188,7 @@ def test_default_graph(get_dataset): assert len(dataset) == 0 # default still exists - 
assert set(dataset.contexts()) == set([dataset.default_context]) + assert set(dataset.graphs()) == set([dataset.default_context]) def test_not_union(get_dataset): @@ -193,11 +200,11 @@ def test_not_union(get_dataset): "its default graph as the union of the named graphs" ) - subgraph1 = dataset.graph(CONTEXT1) - subgraph1.add((TAREK, LIKES, PIZZA)) + g1 = dataset.graph(CONTEXT1) + g1.add((TAREK, LIKES, PIZZA)) assert list(dataset.objects(TAREK, None)) == [] - assert list(subgraph1.objects(TAREK, None)) == [PIZZA] + assert list(g1.objects(TAREK, None)) == [PIZZA] def test_iter(get_dataset): @@ -208,16 +215,16 @@ def test_iter(get_dataset): uri_c = URIRef("https://example.com/c") uri_d = URIRef("https://example.com/d") - d.graph(URIRef("https://example.com/subgraph1")) - d.add((uri_a, uri_b, uri_c, URIRef("https://example.com/subgraph1"))) + d.graph(URIRef("https://example.com/g1")) + d.add((uri_a, uri_b, uri_c, URIRef("https://example.com/g1"))) d.add( - (uri_a, uri_b, uri_c, URIRef("https://example.com/subgraph1")) + (uri_a, uri_b, uri_c, URIRef("https://example.com/g1")) ) # pointless addition: duplicates above d.graph(URIRef("https://example.com/g2")) d.add((uri_a, uri_b, uri_c, URIRef("https://example.com/g2"))) - d.add((uri_a, uri_b, uri_d, URIRef("https://example.com/subgraph1"))) + d.add((uri_a, uri_b, uri_d, URIRef("https://example.com/g1"))) # traditional iterator i_trad = 0 @@ -232,7 +239,7 @@ def test_iter(get_dataset): assert i_new == i_trad # both should be 3 -def test_subgraph_without_identifier() -> None: +def test_graph_without_identifier() -> None: """ Graphs with no identifies assigned are identified by Skolem IRIs with a prefix that is bound to `genid`. @@ -241,9 +248,9 @@ def test_subgraph_without_identifier() -> None: reviewed at some point. 
""" - dataset = Dataset() + d = Dataset() - nman = dataset.namespace_manager + nman = d.namespace_manager genid_prefix = URIRef("https://rdflib.github.io/.well-known/genid/rdflib/") @@ -253,15 +260,36 @@ def test_subgraph_without_identifier() -> None: is None ) - subgraph: Graph = dataset.graph() - subgraph.add((EGSCHEME["subject"], EGSCHEME["predicate"], EGSCHEME["object"])) + ex = Namespace("http://example.com/") + g1: Graph = d.graph() + g1.add((ex.subject, ex.predicate, ex.object)) namespaces = set(nman.namespaces()) assert next( (namespace for namespace in namespaces if namespace[0] == "genid"), None ) == ("genid", genid_prefix) - assert f"{subgraph.identifier}".startswith(genid_prefix) + assert f"{g1.identifier}".startswith(genid_prefix) + + # now add a preexisting graph with no identifier + # i.e. not one created within this Dataset object + g2 = Graph() + g2.add((ex.subject, ex.predicate, ex.object)) + d.add_graph(g2) + + iris = 0 + bns = 0 + others = 0 + for g in d.graphs(): + if type(g.identifier) is URIRef: + iris += 1 + elif type(g.identifier) is BNode: + bns += 1 + else: + others += 1 + assert iris == 2 + assert bns == 1 + assert others == 0 def test_not_deprecated(): diff --git a/test/test_extras/test_shacl_extras.py b/test/test_extras/test_shacl_extras.py index 417e75b68a..1144e9b9ef 100644 --- a/test/test_extras/test_shacl_extras.py +++ b/test/test_extras/test_shacl_extras.py @@ -4,8 +4,9 @@ import pytest -from rdflib import Graph, URIRef -from rdflib.extras.shacl import SHACLPathError, parse_shacl_path +from rdflib import Graph, Literal, URIRef, paths +from rdflib.compare import graph_diff +from rdflib.extras.shacl import SHACLPathError, build_shacl_path, parse_shacl_path from rdflib.namespace import SH, Namespace from rdflib.paths import Path @@ -109,7 +110,32 @@ def path_source_data(): ) ; ] ; . 
- ex:TestPropShape10 + ex:TestPropShape10a + sh:path ( + [ + sh:zeroOrMorePath [ + sh:inversePath ex:pred1 ; + ] ; + ] + [ + sh:alternativePath ( + [ + sh:zeroOrMorePath [ + sh:inversePath ex:pred1 ; + ] ; + ] + ex:pred1 + [ + sh:oneOrMorePath ex:pred2 ; + ] + [ + sh:zeroOrMorePath ex:pred3 ; + ] + ) ; + ] + ) ; + . + ex:TestPropShape10b sh:path ( [ sh:zeroOrMorePath [ @@ -192,7 +218,13 @@ def path_source_data(): ~EX.pred1 | EX.pred1 / EX.pred2 | EX.pred1 | EX.pred2 | EX.pred3, ), ( - EX.TestPropShape10, + EX.TestPropShape10a, + ~EX.pred1 + * "*" + / (~EX.pred1 * "*" | EX.pred1 | EX.pred2 * "+" | EX.pred3 * "*"), # type: ignore[operator] + ), + ( + EX.TestPropShape10b, ~EX.pred1 * "*" / (~EX.pred1 * "*" | EX.pred1 | EX.pred2 * "+" | EX.pred3 * "*"), # type: ignore[operator] @@ -216,3 +248,49 @@ def test_parse_shacl_path( parse_shacl_path(path_source_data, path_root) # type: ignore[arg-type] else: assert parse_shacl_path(path_source_data, path_root) == expected # type: ignore[arg-type] + + +@pytest.mark.parametrize( + ("resource", "path"), + ( + # Single SHACL Path + (EX.TestPropShape1, EX.pred1), + (EX.TestPropShape2a, EX.pred1 / EX.pred2 / EX.pred3), + (EX.TestPropShape3, ~EX.pred1), + (EX.TestPropShape4a, EX.pred1 | EX.pred2 | EX.pred3), + (EX.TestPropShape5, EX.pred1 * "*"), # type: ignore[operator] + (EX.TestPropShape6, EX.pred1 * "+"), # type: ignore[operator] + (EX.TestPropShape7, EX.pred1 * "?"), # type: ignore[operator] + # SHACL Path Combinations + (EX.TestPropShape8, ~EX.pred1 * "*"), + ( + EX.TestPropShape10a, + ~EX.pred1 + * "*" + / (~EX.pred1 * "*" | EX.pred1 | EX.pred2 * "+" | EX.pred3 * "*"), # type: ignore[operator] + ), + (TypeError, Literal("Not a valid path")), + (SHACLPathError, paths.SequencePath(SH.targetClass)), + (SHACLPathError, paths.AlternativePath(SH.targetClass)), + ), +) +def test_build_shacl_path( + path_source_data: Graph, resource: URIRef | type, path: Union[URIRef, Path] +): + if isinstance(resource, type): + with 
pytest.raises(resource): + build_shacl_path(path) + else: + expected_path_root = path_source_data.value(resource, SH.path) + actual_path_root, actual_path_graph = build_shacl_path(path) + if isinstance(expected_path_root, URIRef): + assert actual_path_root == expected_path_root + assert actual_path_graph is None + else: + assert isinstance(actual_path_graph, Graph) + expected_path_graph = path_source_data.cbd(expected_path_root) # type: ignore[arg-type] + in_both, in_first, in_second = graph_diff( + expected_path_graph, actual_path_graph + ) + assert len(in_first) == 0 + assert len(in_second) == 0 diff --git a/test/test_namespace/test_definednamespace.py b/test/test_namespace/test_definednamespace.py index ea8e129692..5860e8eb26 100644 --- a/test/test_namespace/test_definednamespace.py +++ b/test/test_namespace/test_definednamespace.py @@ -299,14 +299,9 @@ def test_repr(dfns: Type[DefinedNamespace]) -> None: ns_uri = f"{prefix}{dfns_info.suffix}" logging.debug("ns_uri = %s", ns_uri) - repr_str: Optional[str] = None - - with ExitStack() as xstack: - if dfns_info.suffix is None: - xstack.enter_context(pytest.raises(AttributeError)) - repr_str = f"{dfns_info.dfns!r}" + repr_str: str = f"{dfns_info.dfns!r}" if dfns_info.suffix is None: - assert repr_str is None + assert "" in repr_str else: assert repr_str is not None repro = eval(repr_str) @@ -368,20 +363,15 @@ def test_contains( dfns_info = get_dfns_info(dfns) if dfns_info.suffix is not None: logging.debug("dfns_info = %s", dfns_info) - if dfns_info.has_attrs is False: + if dfns_info.has_attrs is False or dfns_info.suffix is None: is_defined = False - does_contain: Optional[bool] = None - with ExitStack() as xstack: - if dfns_info.suffix is None: - xstack.enter_context(pytest.raises(AttributeError)) - does_contain = attr_name in dfns - if dfns_info.suffix is not None: - if is_defined: - assert does_contain is True - else: - assert does_contain is False + + does_contain: bool = attr_name in dfns + + if is_defined: + 
assert does_contain is True else: - assert does_contain is None + assert does_contain is False @pytest.mark.parametrize( diff --git a/test/test_serializers/test_serializer_longturtle.py b/test/test_serializers/test_serializer_longturtle.py index 847d506ab1..c1761b6dae 100644 --- a/test/test_serializers/test_serializer_longturtle.py +++ b/test/test_serializers/test_serializer_longturtle.py @@ -1,5 +1,5 @@ import difflib -from textwrap import dedent +from pathlib import Path from rdflib import Graph, Namespace from rdflib.namespace import GEO, SDO @@ -170,83 +170,11 @@ def test_longturtle(): output = g.serialize(format="longturtle") # fix the target - target = dedent( - """ PREFIX cn: - PREFIX ex: - PREFIX geo: - PREFIX rdf: - PREFIX sdo: - PREFIX xsd: + current_dir = Path.cwd() # Get the current directory + target_file_path = current_dir / "test/data/longturtle" / "longturtle-target.ttl" - ex:nicholas - a sdo:Person ; - sdo:age 41 ; - sdo:alternateName - [ - sdo:name "Dr N.J. Car" ; - ] , - "N.J. Car" , - "Nick Car" ; - sdo:name - [ - a cn:CompoundName ; - sdo:hasPart - [ - a cn:CompoundName ; - rdf:value "Nicholas" ; - ] , - [ - a cn:CompoundName ; - rdf:value "John" ; - ] , - [ - a cn:CompoundName ; - sdo:hasPart - [ - a cn:CompoundName ; - rdf:value "Car" ; - ] , - [ - a cn:CompoundName ; - rdf:value "Maxov" ; - ] ; - ] ; - ] ; - sdo:worksFor ; - . - - - a sdo:Organization ; - sdo:location ; - . - - - a sdo:Place ; - sdo:address - [ - a sdo:PostalAddress ; - sdo:addressCountry - [ - sdo:identifier "au" ; - sdo:name "Australia" ; - ] ; - sdo:addressLocality "Shorncliffe" ; - sdo:addressRegion "QLD" ; - sdo:postalCode 4017 ; - sdo:streetAddress ( - 72 - "Yundah" - "Street" - ) ; - ] ; - sdo:geo - [ - sdo:polygon "POLYGON((153.082403 -27.325801, 153.08241 -27.32582, 153.082943 -27.325612, 153.083010 -27.325742, 153.083543 -27.325521, 153.083456 -27.325365, 153.082403 -27.325801))"^^geo:wktLiteral ; - ] ; - sdo:name "KurrawongAI HQ" ; - . 
- """ - ) + with open(target_file_path, encoding="utf-8") as file: + target = file.read() # compare output to target # - any differences will produce output diff --git a/test/test_serializers/test_serializer_longturtle_sort.py b/test/test_serializers/test_serializer_longturtle_sort.py new file mode 100644 index 0000000000..0e397afaf2 --- /dev/null +++ b/test/test_serializers/test_serializer_longturtle_sort.py @@ -0,0 +1,120 @@ +#!/usr/bin/env python3 + +# Portions of this file contributed by NIST are governed by the +# following statement: +# +# This software was developed at the National Institute of Standards +# and Technology by employees of the Federal Government in the course +# of their official duties. Pursuant to Title 17 Section 105 of the +# United States Code, this software is not subject to copyright +# protection within the United States. NIST assumes no responsibility +# whatsoever for its use by other parties, and makes no guarantees, +# expressed or implied, about its quality, reliability, or any other +# characteristic. +# +# We would appreciate acknowledgement if the software is used. + +from __future__ import annotations + +import random +from collections import defaultdict +from typing import DefaultDict, List + +from rdflib import RDFS, BNode, Graph, Literal, Namespace, URIRef + +EX = Namespace("http://example.org/ex/") + + +def test_sort_semiblank_graph() -> None: + """ + This test reviews whether the output of the Turtle form is + consistent when involving repeated generates with blank nodes. + """ + + serialization_counter: DefaultDict[str, int] = defaultdict(int) + + first_graph_text: str = "" + + # Use a fixed sequence of once-but-no-longer random values for more + # consistent test results. 
+ nonrandom_shuffler = random.Random(1234) + for x in range(1, 10): + graph = Graph() + graph.bind("ex", EX) + graph.bind("rdfs", RDFS) + + graph.add((EX.A, RDFS.comment, Literal("Thing A"))) + graph.add((EX.B, RDFS.comment, Literal("Thing B"))) + graph.add((EX.C, RDFS.comment, Literal("Thing C"))) + + nodes: List[URIRef] = [EX.A, EX.B, EX.C, EX.B] + nonrandom_shuffler.shuffle(nodes) + for node in nodes: + # Instantiate one bnode per URIRef node. + graph.add((BNode(), RDFS.seeAlso, node)) + + nesteds: List[URIRef] = [EX.A, EX.B, EX.C] + nonrandom_shuffler.shuffle(nesteds) + for nested in nesteds: + # Instantiate a nested node reference. + outer_node = BNode() + inner_node = BNode() + graph.add((outer_node, EX.has, inner_node)) + graph.add((inner_node, RDFS.seeAlso, nested)) + + graph_text = graph.serialize(format="longturtle", sort=True) + if first_graph_text == "": + first_graph_text = graph_text + + serialization_counter[graph_text] += 1 + + expected_serialization = """\ +PREFIX ns1: +PREFIX rdfs: + +ns1:A + rdfs:comment "Thing A" ; +. + +ns1:C + rdfs:comment "Thing C" ; +. + +ns1:B + rdfs:comment "Thing B" ; +. + +[] ns1:has + [ + rdfs:seeAlso ns1:A ; + ] ; +. + +[] rdfs:seeAlso ns1:B ; +. + +[] ns1:has + [ + rdfs:seeAlso ns1:C ; + ] ; +. + +[] rdfs:seeAlso ns1:A ; +. + +[] rdfs:seeAlso ns1:C ; +. + +[] rdfs:seeAlso ns1:B ; +. + +[] ns1:has + [ + rdfs:seeAlso ns1:B ; + ] ; +. 
+ +""" + + assert expected_serialization.strip() == first_graph_text.strip() + assert 1 == len(serialization_counter) diff --git a/test/test_sparql/test_dataset_exclusive.py b/test/test_sparql/test_dataset_exclusive.py index 2ce23d52b2..d867623c2c 100644 --- a/test/test_sparql/test_dataset_exclusive.py +++ b/test/test_sparql/test_dataset_exclusive.py @@ -82,3 +82,13 @@ def test_from_and_from_named(): (None, URIRef("urn:s1"), URIRef("urn:p1"), URIRef("urn:o1")), (URIRef("urn:g2"), URIRef("urn:s2"), URIRef("urn:p2"), URIRef("urn:o2")), ] + + +def test_ask_from(): + query = """ + ASK + FROM + WHERE {?s ?p ?o} + """ + results = bool(dataset.query(query)) + assert results diff --git a/test_reports/rdflib_w3c_sparql10-HEAD.ttl b/test_reports/rdflib_w3c_sparql10-HEAD.ttl index 78997b01c4..b8369a94d3 100644 --- a/test_reports/rdflib_w3c_sparql10-HEAD.ttl +++ b/test_reports/rdflib_w3c_sparql10-HEAD.ttl @@ -1795,7 +1795,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -1859,7 +1859,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -1907,7 +1907,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -2787,7 +2787,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . 
From 4b0f58098adbdaa5a40f24cd33aa2721a705da0e Mon Sep 17 00:00:00 2001
From: Nicholas Car
Date: Sat, 18 Jan 2025 13:30:17 +1000
Subject: [PATCH 02/60] 7.1.3-pre-release

---
 CHANGELOG.md | 28 +++++++++++++++++++++++++++-
 CITATION.cff | 4 ++--
 LICENSE | 2 +-
 README.md | 9 ++++++---
 docs/conf.py | 2 +-
 pyproject.toml | 2 +-
 rdflib/__init__.py | 2 +-
 7 files changed, 39 insertions(+), 10 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index bb6d15999e..4014dc2d88 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,3 +1,30 @@
+## 2025-01-17 RELEASE 7.1.3
+
+A fix-up release that re-adds support for Python 3.8 after it was accidentally
+removed in Release 7.1.2.
+
+This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out
+typing changes that are not compatable
+with Python 3.8.
+
+Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0.
+
+Included are PRs such as _Defined Namespace warnings fix_, _sort longturtle
+blank nodes_, _deterministic longturtle serialisation_ and _Dataset documentation
+improvements_.
+
+For the full list of included PRs, see the preparatory PR:
+.
+
+## 2025-01-10 RELEASE 7.1.2
+
+A minor release that inadvertently removed support for Python 3.8. This release
+has now been deleted.
+
+All the improved features initially made available in this release that were
+compatible with Python 3.8 have been preserved in the 7.1.3 release. The main
+additions to 7.1.2 not preserved in 7.1.3 are updated type hints.
+ ## 2024-10-17 RELEASE 7.1.1 This minor release removes the dependency on some only Python packages, in particular @@ -31,7 +58,6 @@ Merged PRs: * 2024-10-23 - build(deps-dev): bump ruff from 0.6.9 to 0.7.0 [PR #2942](https://github.com/RDFLib/rdflib/pull/2942) - ## 2024-10-17 RELEASE 7.1.0 This minor release incorporates just over 100 substantive PRs - interesting diff --git a/CITATION.cff b/CITATION.cff index c403aa3833..0d37d305cb 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -69,7 +69,7 @@ authors: - family-names: "Stuart" given-names: "Veyndan" title: "RDFLib" -version: 7.1.1 -date-released: 2024-10-28 +version: 7.1.3 +date-released: 2024-01-18 url: "https://github.com/RDFLib/rdflib" doi: 10.5281/zenodo.6845245 diff --git a/LICENSE b/LICENSE index 6f2449678b..75e852b7b6 100644 --- a/LICENSE +++ b/LICENSE @@ -1,6 +1,6 @@ BSD 3-Clause License -Copyright (c) 2002-2024, RDFLib Team +Copyright (c) 2002-2025, RDFLib Team All rights reserved. Redistribution and use in source and binary forms, with or without diff --git a/README.md b/README.md index a5b9c9ff25..acee3697e5 100644 --- a/README.md +++ b/README.md @@ -43,8 +43,11 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases -* `main` branch in this repository is the unstable release -* `7.1.1` current stable release, bugfixes to 7.1.0 +* `main` branch in this repository is the current unstable release +* `7.1.3` current stable release, small improvements to 7.1.1 +* `7.1.2` previously deleted release +* `7.1.1` previous stable release + * see * `7.0.0` previous stable release, supports Python 3.8.1+ only. * see [Releases](https://github.com/RDFLib/rdflib/releases) * `6.x.y` supports Python 3.7+ only. 
Many improvements over 5.0.0 @@ -68,7 +71,7 @@ Some features of RDFLib require optional dependencies which may be installed usi Alternatively manually download the package from the Python Package Index (PyPI) at https://pypi.python.org/pypi/rdflib -The current version of RDFLib is 7.1.1, see the ``CHANGELOG.md`` file for what's new in this release. +The current version of RDFLib is 7.1.3, see the ``CHANGELOG.md`` file for what's new in this release. ### Installation of the current main branch (for developers) diff --git a/docs/conf.py b/docs/conf.py index 44b21a91b3..b3c4a373bd 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -81,7 +81,7 @@ # General information about the project. project = "rdflib" -copyright = "2009 - 2024, RDFLib Team" +copyright = "2002 - 2025, RDFLib Team" # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the diff --git a/pyproject.toml b/pyproject.toml index 971e229d70..9aebf19e1e 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.1.3a0" +version = "7.1.3" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] diff --git a/rdflib/__init__.py b/rdflib/__init__.py index 843b614e42..dcfe8df36a 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -52,7 +52,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2024-10-28" +__date__ = "2025-01-18" __all__ = [ "URIRef", From 006949a6c67f13947edf2a4c24cb6540b5256c7e Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Sat, 18 Jan 2025 14:32:32 +1000 Subject: [PATCH 03/60] small docco update (#3053) --- README.md | 11 +++++------ 1 file changed, 5 insertions(+), 6 deletions(-) diff --git a/README.md b/README.md index acee3697e5..69bd60d5a2 100644 --- a/README.md +++ 
b/README.md @@ -18,9 +18,10 @@ RDFLib RDFLib is a pure Python package for working with [RDF](http://www.w3.org/RDF/). RDFLib contains most things you need to work with RDF, including: -* parsers and serializers for RDF/XML, N3, NTriples, N-Quads, Turtle, TriX, Trig and JSON-LD +* parsers and serializers for RDF/XML, N3, NTriples, N-Quads, Turtle, TriX, Trig, JSON-LD and even HexTuples * a Graph interface which can be backed by any one of a number of Store implementations -* store implementations for in-memory, persistent on disk (Berkeley DB) and remote SPARQL endpoints +* Store implementations for in-memory, persistent on disk (Berkeley DB) and remote SPARQL endpoints + * additional Stores can be supplied via plugins * a SPARQL 1.1 implementation - supporting SPARQL 1.1 Queries and Update statements * SPARQL function extension mechanisms @@ -29,10 +30,8 @@ The RDFlib community maintains many RDF-related Python code repositories with di * [rdflib](https://github.com/RDFLib/rdflib) - the RDFLib core * [sparqlwrapper](https://github.com/RDFLib/sparqlwrapper) - a simple Python wrapper around a SPARQL service to remotely execute your queries -* [pyLODE](https://github.com/RDFLib/pyLODE) - An OWL ontology documentation tool using Python and templating, based on LODE. -* [pyrdfa3](https://github.com/RDFLib/pyrdfa3) - RDFa 1.1 distiller/parser library: can extract RDFa 1.1/1.0 from (X)HTML, SVG, or XML in general. -* [pymicrodata](https://github.com/RDFLib/pymicrodata) - A module to extract RDF from an HTML5 page annotated with microdata. -* [pySHACL](https://github.com/RDFLib/pySHACL) - A pure Python module which allows for the validation of RDF graphs against SHACL graphs. 
+* [pyLODE](https://github.com/RDFLib/pyLODE) - An OWL ontology documentation tool using Python and templating, based on LODE +* [pySHACL](https://github.com/RDFLib/pySHACL) - A pure Python module which allows for the validation of RDF graphs against SHACL graphs * [OWL-RL](https://github.com/RDFLib/OWL-RL) - A simple implementation of the OWL2 RL Profile which expands the graph with all possible triples that OWL RL defines. Please see the list for all packages/repositories here: From f4f3b731e4ac759eab1a5725cce710ff8c6577ca Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Sat, 18 Jan 2025 16:05:32 +1000 Subject: [PATCH 04/60] 7.1.3-post-release; some updated release info for devs (#3054) --- CHANGELOG.md | 2 +- README.md | 2 - docker/latest/requirements.in | 2 +- docker/latest/requirements.txt | 6 +- docs/developers.rst | 104 +++++++++++++++++++++++++-------- pyproject.toml | 2 +- 6 files changed, 87 insertions(+), 31 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index 4014dc2d88..a4e3b61ca6 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,7 +4,7 @@ A fix-up release that re-adds support for Python 3.8 after it was accidentally removed in Release 7.1.2. This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out -typing changes that are not compatable +typing changes that are not compatible with Python 3.8. Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0. diff --git a/README.md b/README.md index 69bd60d5a2..f2b106648d 100644 --- a/README.md +++ b/README.md @@ -70,8 +70,6 @@ Some features of RDFLib require optional dependencies which may be installed usi Alternatively manually download the package from the Python Package Index (PyPI) at https://pypi.python.org/pypi/rdflib -The current version of RDFLib is 7.1.3, see the ``CHANGELOG.md`` file for what's new in this release. 
- ### Installation of the current main branch (for developers) With *pip* you can also install rdflib from the git repository with one of the following options: diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index cc344d2a6d..fa2ecec495 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,4 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. It # will be updated by dependabot when new releases are made. -rdflib==7.1.1 +rdflib==7.1.3 html5rdf==1.2.0 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index 4357e6d526..95131d137e 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -1,12 +1,14 @@ # -# This file is autogenerated by pip-compile with Python 3.12 +# This file is autogenerated by pip-compile with Python 3.8 # by the following command: # # pip-compile docker/latest/requirements.in # html5rdf==1.2 # via -r docker/latest/requirements.in +isodate==0.7.2 + # via rdflib pyparsing==3.0.9 # via rdflib -rdflib==7.1.1 +rdflib==7.1.3 # via -r docker/latest/requirements.in diff --git a/docs/developers.rst b/docs/developers.rst index e3593711e9..909c5bab66 100644 --- a/docs/developers.rst +++ b/docs/developers.rst @@ -434,6 +434,8 @@ flag them as expecting to fail. Compatibility ------------- +RDFLib 8.x is likely to support only the Python versions in bugfix status at the time of its release, so perhaps 3.12+. + RDFlib 7.0.0 release and later only support Python 3.8.1 and newer. RDFlib 6.0.0 release and later only support Python 3.7 and newer. @@ -443,22 +445,46 @@ RDFLib 5.0.0 maintained compatibility with Python versions 2.7, 3.4, 3.5, 3.6, 3 Releasing --------- +These are the major steps for releasing new versions of RDFLib: + +#. Create a pre-release PR + + * that updates all the version numbers + * merge it with all tests passing + +#. Do the PyPI release +#. Do the GitHub release +#. 
Create a post-release PR + + * that updates all version numbers to next (alpha) release + * merge it with all tests passing + +#. Let the world know + + +1. Create a pre-release PR +~~~~~~~~~~~~~~~~~~~~~~~~~~ + Create a release-preparation pull request with the following changes: -* Updated version and date in ``CITATION.cff``. -* Updated copyright year in the ``LICENSE`` file. -* Updated copyright year in the ``docs/conf.py`` file. -* Updated main branch version and current version in the ``README.md`` file. -* Updated version in the ``pyproject.toml`` file. -* Updated ``__date__`` in the ``rdflib/__init__.py`` file. -* Accurate ``CHANGELOG.md`` entry for the release. +#. In ``pyproject.toml``, update the version number +#. In ``README.md``, update the *Versions & Releases* section +#. In ``rdflib/__init__.py``, update the ``__date__`` value +#. In ``docs/conf.py``, update copyright year +#. In ``CITATION.cff``, update the version and date +#. In ``LICENSE``, update the copyright year +#. In ``CHANGELOG.md``, write an entry for this release + #. You can use the tool ``admin/get_merged_prs.py`` to assist with compiling a log of PRs and commits since last release + +2. Do the PyPI release +~~~~~~~~~~~~~~~~~~~~~~ -Once the PR is merged, switch to the main branch, build the release and upload it to PyPI: +Once the pre-release PR is merged, switch to the main branch, build the release and upload it to PyPI: .. code-block:: bash # Clean up any previous builds - \rm -vf dist/* + rm -vf dist/* # Build artifacts poetry build @@ -487,24 +513,54 @@ Once the PR is merged, switch to the main branch, build the release and upload i ## poetry publish -u __token__ -p pypi- -Once this is done, create a release tag from `GitHub releases -`_. For a release of version -6.3.1 the tag should be ``6.3.1`` (without a "v" prefix), and the release title -should be "RDFLib 6.3.1". The release notes for the latest version be added to -the release description. 
The artifacts built with ``poetry build`` should be
-uploaded to the release as release artifacts.
+3. Do the GitHub release
+~~~~~~~~~~~~~~~~~~~~~~~~
 
-The resulting release will be available at https://github.com/RDFLib/rdflib/releases/tag/6.3.1
+Once this is done, tag the main branch with the version number of the release. For a release of version
+6.3.1 the tag should be ``6.3.1`` (without a "v" prefix):
+
+.. code-block:: bash
+
+    git tag 6.3.1
 
-Once this is done, announce the release at the following locations:
-* Twitter: Just make a tweet from your own account linking to the latest release.
-* RDFLib mailing list.
-* RDFLib Gitter / matrix.org chat room.
+Push this tag to GitHub:
+
+.. code-block:: bash
+
+    git push --tags
+
+
+Make a release from this tag at https://github.com/RDFLib/rdflib/releases/new
+
+The release title should be "{DATE} RELEASE {VERSION}". See previous releases at https://github.com/RDFLib/rdflib/releases
+
+The release notes should be just the same as the release info in ``CHANGELOG.md``, as authored in the first major step in this release process.
+
+The resulting release will be available at https://github.com/RDFLib/rdflib/releases/tag/6.3.1
+
+4. Create a post-release PR
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Once this is all done, create another post-release pull request with the
 following changes:
 
-* Set the just released version in ``docker/latest/requirements.in`` and run
-  ``task docker:prepare`` to update the ``docker/latest/requirements.txt`` file.
-* Set the version in the ``pyproject.toml`` file to the next minor release with
-  a ``a0`` suffix to indicate alpha 0.
+#. In ``pyproject.toml``, update to the next minor release alpha
+
+   * so a 6.3.1 release would have 6.3.2a0 as the next release alpha
+
+#. In ``docker/latest/requirements.in`` set the version to the just released version
+#. Use ``task docker:prepare`` to update ``docker/latest/requirements.txt``
+
+
+
+5.
Let the world know +~~~~~~~~~~~~~~~~~~~~~ + +Announce the release at the following locations: + +* RDFLib mailing list +* RDFLib Gitter / matrix.org chat room +* Twitter: Just make a tweet from your own account linking to the latest release +* related mailing lists + * Jena: users@jena.apache.org + * W3C (currently RDF-Star WG): public-rdf-star@w3.org diff --git a/pyproject.toml b/pyproject.toml index 9aebf19e1e..4aac771e62 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.1.3" +version = "7.1.4a" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] From 7d3666e49b6389c8b562296bb74d09ca1ffca845 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 24 Jan 2025 09:39:34 +1000 Subject: [PATCH 05/60] build(deps): bump orjson from 3.10.13 to 3.10.15 (#3055) Bumps [orjson](https://github.com/ijl/orjson) from 3.10.13 to 3.10.15. - [Release notes](https://github.com/ijl/orjson/releases) - [Changelog](https://github.com/ijl/orjson/blob/master/CHANGELOG.md) - [Commits](https://github.com/ijl/orjson/compare/3.10.13...3.10.15) --- updated-dependencies: - dependency-name: orjson dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- poetry.lock | 156 +++++++++++++++++++++++++++------------------------- 1 file changed, 80 insertions(+), 76 deletions(-) diff --git a/poetry.lock b/poetry.lock index 2072d2c5cb..c828d3a701 100644 --- a/poetry.lock +++ b/poetry.lock @@ -830,86 +830,90 @@ test = ["codecov (>=2.1)", "pytest (>=7.2)", "pytest-cov (>=4.0)"] [[package]] name = "orjson" -version = "3.10.13" +version = "3.10.15" description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy" optional = true python-versions = ">=3.8" files = [ - {file = "orjson-3.10.13-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:1232c5e873a4d1638ef957c5564b4b0d6f2a6ab9e207a9b3de9de05a09d1d920"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d26a0eca3035619fa366cbaf49af704c7cb1d4a0e6c79eced9f6a3f2437964b6"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d4b6acd7c9c829895e50d385a357d4b8c3fafc19c5989da2bae11783b0fd4977"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1884e53c6818686891cc6fc5a3a2540f2f35e8c76eac8dc3b40480fb59660b00"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a428afb5720f12892f64920acd2eeb4d996595bf168a26dd9190115dbf1130d"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba5b13b8739ce5b630c65cb1c85aedbd257bcc2b9c256b06ab2605209af75a2e"}, - {file = "orjson-3.10.13-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cab83e67f6aabda1b45882254b2598b48b80ecc112968fc6483fa6dae609e9f0"}, - {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_aarch64.whl", hash = 
"sha256:62c3cc00c7e776c71c6b7b9c48c5d2701d4c04e7d1d7cdee3572998ee6dc57cc"}, - {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:dc03db4922e75bbc870b03fc49734cefbd50fe975e0878327d200022210b82d8"}, - {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:22f1c9a30b43d14a041a6ea190d9eca8a6b80c4beb0e8b67602c82d30d6eec3e"}, - {file = "orjson-3.10.13-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b42f56821c29e697c68d7d421410d7c1d8f064ae288b525af6a50cf99a4b1200"}, - {file = "orjson-3.10.13-cp310-cp310-win32.whl", hash = "sha256:0dbf3b97e52e093d7c3e93eb5eb5b31dc7535b33c2ad56872c83f0160f943487"}, - {file = "orjson-3.10.13-cp310-cp310-win_amd64.whl", hash = "sha256:46c249b4e934453be4ff2e518cd1adcd90467da7391c7a79eaf2fbb79c51e8c7"}, - {file = "orjson-3.10.13-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:a36c0d48d2f084c800763473020a12976996f1109e2fcb66cfea442fdf88047f"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0065896f85d9497990731dfd4a9991a45b0a524baec42ef0a63c34630ee26fd6"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:92b4ec30d6025a9dcdfe0df77063cbce238c08d0404471ed7a79f309364a3d19"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a94542d12271c30044dadad1125ee060e7a2048b6c7034e432e116077e1d13d2"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3723e137772639af8adb68230f2aa4bcb27c48b3335b1b1e2d49328fed5e244c"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f00c7fb18843bad2ac42dc1ce6dd214a083c53f1e324a0fd1c8137c6436269b"}, - {file = "orjson-3.10.13-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0e2759d3172300b2f892dee85500b22fca5ac49e0c42cfff101aaf9c12ac9617"}, - {file 
= "orjson-3.10.13-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ee948c6c01f6b337589c88f8e0bb11e78d32a15848b8b53d3f3b6fea48842c12"}, - {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:aa6fe68f0981fba0d4bf9cdc666d297a7cdba0f1b380dcd075a9a3dd5649a69e"}, - {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:dbcd7aad6bcff258f6896abfbc177d54d9b18149c4c561114f47ebfe74ae6bfd"}, - {file = "orjson-3.10.13-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:2149e2fcd084c3fd584881c7f9d7f9e5ad1e2e006609d8b80649655e0d52cd02"}, - {file = "orjson-3.10.13-cp311-cp311-win32.whl", hash = "sha256:89367767ed27b33c25c026696507c76e3d01958406f51d3a2239fe9e91959df2"}, - {file = "orjson-3.10.13-cp311-cp311-win_amd64.whl", hash = "sha256:dca1d20f1af0daff511f6e26a27354a424f0b5cf00e04280279316df0f604a6f"}, - {file = "orjson-3.10.13-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:a3614b00621c77f3f6487792238f9ed1dd8a42f2ec0e6540ee34c2d4e6db813a"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c976bad3996aa027cd3aef78aa57873f3c959b6c38719de9724b71bdc7bd14b"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f74d878d1efb97a930b8a9f9898890067707d683eb5c7e20730030ecb3fb930"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:33ef84f7e9513fb13b3999c2a64b9ca9c8143f3da9722fbf9c9ce51ce0d8076e"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd2bcde107221bb9c2fa0c4aaba735a537225104173d7e19cf73f70b3126c993"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:064b9dbb0217fd64a8d016a8929f2fae6f3312d55ab3036b00b1d17399ab2f3e"}, - {file = "orjson-3.10.13-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:c0044b0b8c85a565e7c3ce0a72acc5d35cda60793edf871ed94711e712cb637d"}, - {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7184f608ad563032e398f311910bc536e62b9fbdca2041be889afcbc39500de8"}, - {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:d36f689e7e1b9b6fb39dbdebc16a6f07cbe994d3644fb1c22953020fc575935f"}, - {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:54433e421618cd5873e51c0e9d0b9fb35f7bf76eb31c8eab20b3595bb713cd3d"}, - {file = "orjson-3.10.13-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e1ba0c5857dd743438acecc1cd0e1adf83f0a81fee558e32b2b36f89e40cee8b"}, - {file = "orjson-3.10.13-cp312-cp312-win32.whl", hash = "sha256:a42b9fe4b0114b51eb5cdf9887d8c94447bc59df6dbb9c5884434eab947888d8"}, - {file = "orjson-3.10.13-cp312-cp312-win_amd64.whl", hash = "sha256:3a7df63076435f39ec024bdfeb4c9767ebe7b49abc4949068d61cf4857fa6d6c"}, - {file = "orjson-3.10.13-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:2cdaf8b028a976ebab837a2c27b82810f7fc76ed9fb243755ba650cc83d07730"}, - {file = "orjson-3.10.13-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:48a946796e390cbb803e069472de37f192b7a80f4ac82e16d6eb9909d9e39d56"}, - {file = "orjson-3.10.13-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7d64f1db5ecbc21eb83097e5236d6ab7e86092c1cd4c216c02533332951afc"}, - {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:711878da48f89df194edd2ba603ad42e7afed74abcd2bac164685e7ec15f96de"}, - {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:cf16f06cb77ce8baf844bc222dbcb03838f61d0abda2c3341400c2b7604e436e"}, - {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:8257c3fb8dd7b0b446b5e87bf85a28e4071ac50f8c04b6ce2d38cb4abd7dff57"}, - {file = "orjson-3.10.13-cp313-cp313-musllinux_1_2_x86_64.whl", hash = 
"sha256:d9c3a87abe6f849a4a7ac8a8a1dede6320a4303d5304006b90da7a3cd2b70d2c"}, - {file = "orjson-3.10.13-cp313-cp313-win32.whl", hash = "sha256:527afb6ddb0fa3fe02f5d9fba4920d9d95da58917826a9be93e0242da8abe94a"}, - {file = "orjson-3.10.13-cp313-cp313-win_amd64.whl", hash = "sha256:b5f7c298d4b935b222f52d6c7f2ba5eafb59d690d9a3840b7b5c5cda97f6ec5c"}, - {file = "orjson-3.10.13-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e49333d1038bc03a25fdfe11c86360df9b890354bfe04215f1f54d030f33c342"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:003721c72930dbb973f25c5d8e68d0f023d6ed138b14830cc94e57c6805a2eab"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:63664bf12addb318dc8f032160e0f5dc17eb8471c93601e8f5e0d07f95003784"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6066729cf9552d70de297b56556d14b4f49c8f638803ee3c90fd212fa43cc6af"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8a1152e2761025c5d13b5e1908d4b1c57f3797ba662e485ae6f26e4e0c466388"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:69b21d91c5c5ef8a201036d207b1adf3aa596b930b6ca3c71484dd11386cf6c3"}, - {file = "orjson-3.10.13-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b12a63f48bb53dba8453d36ca2661f2330126d54e26c1661e550b32864b28ce3"}, - {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:a5a7624ab4d121c7e035708c8dd1f99c15ff155b69a1c0affc4d9d8b551281ba"}, - {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:0fee076134398d4e6cb827002468679ad402b22269510cf228301b787fdff5ae"}, - {file = "orjson-3.10.13-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ae537fcf330b3947e82c6ae4271e092e6cf16b9bc2cef68b14ffd0df1fa8832a"}, - {file = 
"orjson-3.10.13-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:f81b26c03f5fb5f0d0ee48d83cea4d7bc5e67e420d209cc1a990f5d1c62f9be0"}, - {file = "orjson-3.10.13-cp38-cp38-win32.whl", hash = "sha256:0bc858086088b39dc622bc8219e73d3f246fb2bce70a6104abd04b3a080a66a8"}, - {file = "orjson-3.10.13-cp38-cp38-win_amd64.whl", hash = "sha256:3ca6f17467ebbd763f8862f1d89384a5051b461bb0e41074f583a0ebd7120e8e"}, - {file = "orjson-3.10.13-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:4a11532cbfc2f5752c37e84863ef8435b68b0e6d459b329933294f65fa4bda1a"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c96d2fb80467d1d0dfc4d037b4e1c0f84f1fe6229aa7fea3f070083acef7f3d7"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dda4ba4d3e6f6c53b6b9c35266788053b61656a716a7fef5c884629c2a52e7aa"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f998bbf300690be881772ee9c5281eb9c0044e295bcd4722504f5b5c6092ff"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dce1cc42ed75b585c0c4dc5eb53a90a34ccb493c09a10750d1a1f9b9eff2bd12"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03b0f29d485411e3c13d79604b740b14e4e5fb58811743f6f4f9693ee6480a8f"}, - {file = "orjson-3.10.13-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:233aae4474078d82f425134bb6a10fb2b3fc5a1a1b3420c6463ddd1b6a97eda8"}, - {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e384e330a67cf52b3597ee2646de63407da6f8fc9e9beec3eaaaef5514c7a1c9"}, - {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:4222881d0aab76224d7b003a8e5fdae4082e32c86768e0e8652de8afd6c4e2c1"}, - {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_i686.whl", hash = 
"sha256:e400436950ba42110a20c50c80dff4946c8e3ec09abc1c9cf5473467e83fd1c5"}, - {file = "orjson-3.10.13-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f47c9e7d224b86ffb086059cdcf634f4b3f32480f9838864aa09022fe2617ce2"}, - {file = "orjson-3.10.13-cp39-cp39-win32.whl", hash = "sha256:a9ecea472f3eb653e1c0a3d68085f031f18fc501ea392b98dcca3e87c24f9ebe"}, - {file = "orjson-3.10.13-cp39-cp39-win_amd64.whl", hash = "sha256:5385935a73adce85cc7faac9d396683fd813566d3857fa95a0b521ef84a5b588"}, - {file = "orjson-3.10.13.tar.gz", hash = "sha256:eb9bfb14ab8f68d9d9492d4817ae497788a15fd7da72e14dfabc289c3bb088ec"}, + {file = "orjson-3.10.15-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:552c883d03ad185f720d0c09583ebde257e41b9521b74ff40e08b7dec4559c04"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:616e3e8d438d02e4854f70bfdc03a6bcdb697358dbaa6bcd19cbe24d24ece1f8"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7c2c79fa308e6edb0ffab0a31fd75a7841bf2a79a20ef08a3c6e3b26814c8ca8"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cb85490aa6bf98abd20607ab5c8324c0acb48d6da7863a51be48505646c814"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:763dadac05e4e9d2bc14938a45a2d0560549561287d41c465d3c58aec818b164"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a330b9b4734f09a623f74a7490db713695e13b67c959713b78369f26b3dee6bf"}, + {file = "orjson-3.10.15-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a61a4622b7ff861f019974f73d8165be1bd9a0855e1cad18ee167acacabeb061"}, + {file = "orjson-3.10.15-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:acd271247691574416b3228db667b84775c497b245fa275c6ab90dc1ffbbd2b3"}, + {file = 
"orjson-3.10.15-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:e4759b109c37f635aa5c5cc93a1b26927bfde24b254bcc0e1149a9fada253d2d"}, + {file = "orjson-3.10.15-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9e992fd5cfb8b9f00bfad2fd7a05a4299db2bbe92e6440d9dd2fab27655b3182"}, + {file = "orjson-3.10.15-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f95fb363d79366af56c3f26b71df40b9a583b07bbaaf5b317407c4d58497852e"}, + {file = "orjson-3.10.15-cp310-cp310-win32.whl", hash = "sha256:f9875f5fea7492da8ec2444839dcc439b0ef298978f311103d0b7dfd775898ab"}, + {file = "orjson-3.10.15-cp310-cp310-win_amd64.whl", hash = "sha256:17085a6aa91e1cd70ca8533989a18b5433e15d29c574582f76f821737c8d5806"}, + {file = "orjson-3.10.15-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:c4cc83960ab79a4031f3119cc4b1a1c627a3dc09df125b27c4201dff2af7eaa6"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ddbeef2481d895ab8be5185f2432c334d6dec1f5d1933a9c83014d188e102cef"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9e590a0477b23ecd5b0ac865b1b907b01b3c5535f5e8a8f6ab0e503efb896334"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a6be38bd103d2fd9bdfa31c2720b23b5d47c6796bcb1d1b598e3924441b4298d"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ff4f6edb1578960ed628a3b998fa54d78d9bb3e2eb2cfc5c2a09732431c678d0"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b0482b21d0462eddd67e7fce10b89e0b6ac56570424662b685a0d6fccf581e13"}, + {file = "orjson-3.10.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bb5cc3527036ae3d98b65e37b7986a918955f85332c1ee07f9d3f82f3a6899b5"}, + {file = "orjson-3.10.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:d569c1c462912acdd119ccbf719cf7102ea2c67dd03b99edcb1a3048651ac96b"}, + {file = "orjson-3.10.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:1e6d33efab6b71d67f22bf2962895d3dc6f82a6273a965fab762e64fa90dc399"}, + {file = "orjson-3.10.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:c33be3795e299f565681d69852ac8c1bc5c84863c0b0030b2b3468843be90388"}, + {file = "orjson-3.10.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:eea80037b9fae5339b214f59308ef0589fc06dc870578b7cce6d71eb2096764c"}, + {file = "orjson-3.10.15-cp311-cp311-win32.whl", hash = "sha256:d5ac11b659fd798228a7adba3e37c010e0152b78b1982897020a8e019a94882e"}, + {file = "orjson-3.10.15-cp311-cp311-win_amd64.whl", hash = "sha256:cf45e0214c593660339ef63e875f32ddd5aa3b4adc15e662cdb80dc49e194f8e"}, + {file = "orjson-3.10.15-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:9d11c0714fc85bfcf36ada1179400862da3288fc785c30e8297844c867d7505a"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dba5a1e85d554e3897fa9fe6fbcff2ed32d55008973ec9a2b992bd9a65d2352d"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7723ad949a0ea502df656948ddd8b392780a5beaa4c3b5f97e525191b102fff0"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6fd9bc64421e9fe9bd88039e7ce8e58d4fead67ca88e3a4014b143cec7684fd4"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dadba0e7b6594216c214ef7894c4bd5f08d7c0135f4dd0145600be4fbcc16767"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b48f59114fe318f33bbaee8ebeda696d8ccc94c9e90bc27dbe72153094e26f41"}, + {file = "orjson-3.10.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:035fb83585e0f15e076759b6fedaf0abb460d1765b6a36f48018a52858443514"}, + {file 
= "orjson-3.10.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d13b7fe322d75bf84464b075eafd8e7dd9eae05649aa2a5354cfa32f43c59f17"}, + {file = "orjson-3.10.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:7066b74f9f259849629e0d04db6609db4cf5b973248f455ba5d3bd58a4daaa5b"}, + {file = "orjson-3.10.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:88dc3f65a026bd3175eb157fea994fca6ac7c4c8579fc5a86fc2114ad05705b7"}, + {file = "orjson-3.10.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b342567e5465bd99faa559507fe45e33fc76b9fb868a63f1642c6bc0735ad02a"}, + {file = "orjson-3.10.15-cp312-cp312-win32.whl", hash = "sha256:0a4f27ea5617828e6b58922fdbec67b0aa4bb844e2d363b9244c47fa2180e665"}, + {file = "orjson-3.10.15-cp312-cp312-win_amd64.whl", hash = "sha256:ef5b87e7aa9545ddadd2309efe6824bd3dd64ac101c15dae0f2f597911d46eaa"}, + {file = "orjson-3.10.15-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:bae0e6ec2b7ba6895198cd981b7cca95d1487d0147c8ed751e5632ad16f031a6"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f93ce145b2db1252dd86af37d4165b6faa83072b46e3995ecc95d4b2301b725a"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7c203f6f969210128af3acae0ef9ea6aab9782939f45f6fe02d05958fe761ef9"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8918719572d662e18b8af66aef699d8c21072e54b6c82a3f8f6404c1f5ccd5e0"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f71eae9651465dff70aa80db92586ad5b92df46a9373ee55252109bb6b703307"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e117eb299a35f2634e25ed120c37c641398826c2f5a3d3cc39f5993b96171b9e"}, + {file = "orjson-3.10.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:13242f12d295e83c2955756a574ddd6741c81e5b99f2bef8ed8d53e47a01e4b7"}, + {file = "orjson-3.10.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7946922ada8f3e0b7b958cc3eb22cfcf6c0df83d1fe5521b4a100103e3fa84c8"}, + {file = "orjson-3.10.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:b7155eb1623347f0f22c38c9abdd738b287e39b9982e1da227503387b81b34ca"}, + {file = "orjson-3.10.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:208beedfa807c922da4e81061dafa9c8489c6328934ca2a562efa707e049e561"}, + {file = "orjson-3.10.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eca81f83b1b8c07449e1d6ff7074e82e3fd6777e588f1a6632127f286a968825"}, + {file = "orjson-3.10.15-cp313-cp313-win32.whl", hash = "sha256:c03cd6eea1bd3b949d0d007c8d57049aa2b39bd49f58b4b2af571a5d3833d890"}, + {file = "orjson-3.10.15-cp313-cp313-win_amd64.whl", hash = "sha256:fd56a26a04f6ba5fb2045b0acc487a63162a958ed837648c5781e1fe3316cfbf"}, + {file = "orjson-3.10.15-cp38-cp38-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:5e8afd6200e12771467a1a44e5ad780614b86abb4b11862ec54861a82d677746"}, + {file = "orjson-3.10.15-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da9a18c500f19273e9e104cca8c1f0b40a6470bcccfc33afcc088045d0bf5ea6"}, + {file = "orjson-3.10.15-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb00b7bfbdf5d34a13180e4805d76b4567025da19a197645ca746fc2fb536586"}, + {file = "orjson-3.10.15-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:33aedc3d903378e257047fee506f11e0833146ca3e57a1a1fb0ddb789876c1e1"}, + {file = "orjson-3.10.15-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd0099ae6aed5eb1fc84c9eb72b95505a3df4267e6962eb93cdd5af03be71c98"}, + {file = "orjson-3.10.15-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7c864a80a2d467d7786274fce0e4f93ef2a7ca4ff31f7fc5634225aaa4e9e98c"}, + {file = 
"orjson-3.10.15-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c25774c9e88a3e0013d7d1a6c8056926b607a61edd423b50eb5c88fd7f2823ae"}, + {file = "orjson-3.10.15-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:e78c211d0074e783d824ce7bb85bf459f93a233eb67a5b5003498232ddfb0e8a"}, + {file = "orjson-3.10.15-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:43e17289ffdbbac8f39243916c893d2ae41a2ea1a9cbb060a56a4d75286351ae"}, + {file = "orjson-3.10.15-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:781d54657063f361e89714293c095f506c533582ee40a426cb6489c48a637b81"}, + {file = "orjson-3.10.15-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6875210307d36c94873f553786a808af2788e362bd0cf4c8e66d976791e7b528"}, + {file = "orjson-3.10.15-cp38-cp38-win32.whl", hash = "sha256:305b38b2b8f8083cc3d618927d7f424349afce5975b316d33075ef0f73576b60"}, + {file = "orjson-3.10.15-cp38-cp38-win_amd64.whl", hash = "sha256:5dd9ef1639878cc3efffed349543cbf9372bdbd79f478615a1c633fe4e4180d1"}, + {file = "orjson-3.10.15-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:ffe19f3e8d68111e8644d4f4e267a069ca427926855582ff01fc012496d19969"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d433bf32a363823863a96561a555227c18a522a8217a6f9400f00ddc70139ae2"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:da03392674f59a95d03fa5fb9fe3a160b0511ad84b7a3914699ea5a1b3a38da2"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3a63bb41559b05360ded9132032239e47983a39b151af1201f07ec9370715c82"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3766ac4702f8f795ff3fa067968e806b4344af257011858cc3d6d8721588b53f"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:7a1c73dcc8fadbd7c55802d9aa093b36878d34a3b3222c41052ce6b0fc65f8e8"}, + {file = "orjson-3.10.15-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b299383825eafe642cbab34be762ccff9fd3408d72726a6b2a4506d410a71ab3"}, + {file = "orjson-3.10.15-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:abc7abecdbf67a173ef1316036ebbf54ce400ef2300b4e26a7b843bd446c2480"}, + {file = "orjson-3.10.15-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:3614ea508d522a621384c1d6639016a5a2e4f027f3e4a1c93a51867615d28829"}, + {file = "orjson-3.10.15-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:295c70f9dc154307777ba30fe29ff15c1bcc9dfc5c48632f37d20a607e9ba85a"}, + {file = "orjson-3.10.15-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:63309e3ff924c62404923c80b9e2048c1f74ba4b615e7584584389ada50ed428"}, + {file = "orjson-3.10.15-cp39-cp39-win32.whl", hash = "sha256:a2f708c62d026fb5340788ba94a55c23df4e1869fec74be455e0b2f5363b8507"}, + {file = "orjson-3.10.15-cp39-cp39-win_amd64.whl", hash = "sha256:efcf6c735c3d22ef60c4aa27a5238f1a477df85e9b15f2142f9d669beb2d13fd"}, + {file = "orjson-3.10.15.tar.gz", hash = "sha256:05ca7fe452a2e9d8d9d706a2984c95b9c2ebc5db417ce0b7a49b91d50642a23e"}, ] [[package]] From 62c528d8deba3cc970ad01b845b8db18f50aa8c1 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 24 Jan 2025 09:41:20 +1000 Subject: [PATCH 06/60] build(deps-dev): bump ruff from 0.8.6 to 0.9.2 (#3047) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.6 to 0.9.2. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.8.6...0.9.2) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- poetry.lock | 38 +++++++++++++++++++------------------- 1 file changed, 19 insertions(+), 19 deletions(-) diff --git a/poetry.lock b/poetry.lock index c828d3a701..cb90926f63 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1187,29 +1187,29 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.8.6" +version = "0.9.2" description = "An extremely fast Python linter and code formatter, written in Rust." optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.8.6-py3-none-linux_armv6l.whl", hash = "sha256:defed167955d42c68b407e8f2e6f56ba52520e790aba4ca707a9c88619e580e3"}, - {file = "ruff-0.8.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:54799ca3d67ae5e0b7a7ac234baa657a9c1784b48ec954a094da7c206e0365b1"}, - {file = "ruff-0.8.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:e88b8f6d901477c41559ba540beeb5a671e14cd29ebd5683903572f4b40a9807"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0509e8da430228236a18a677fcdb0c1f102dd26d5520f71f79b094963322ed25"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:91a7ddb221779871cf226100e677b5ea38c2d54e9e2c8ed847450ebbdf99b32d"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:248b1fb3f739d01d528cc50b35ee9c4812aa58cc5935998e776bf8ed5b251e75"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:bc3c083c50390cf69e7e1b5a5a7303898966be973664ec0c4a4acea82c1d4315"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52d587092ab8df308635762386f45f4638badb0866355b2b86760f6d3c076188"}, - {file = "ruff-0.8.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:61323159cf21bc3897674e5adb27cd9e7700bab6b84de40d7be28c3d46dc67cf"}, - 
{file = "ruff-0.8.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ae4478b1471fc0c44ed52a6fb787e641a2ac58b1c1f91763bafbc2faddc5117"}, - {file = "ruff-0.8.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0c000a471d519b3e6cfc9c6680025d923b4ca140ce3e4612d1a2ef58e11f11fe"}, - {file = "ruff-0.8.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:9257aa841e9e8d9b727423086f0fa9a86b6b420fbf4bf9e1465d1250ce8e4d8d"}, - {file = "ruff-0.8.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:45a56f61b24682f6f6709636949ae8cc82ae229d8d773b4c76c09ec83964a95a"}, - {file = "ruff-0.8.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:496dd38a53aa173481a7d8866bcd6451bd934d06976a2505028a50583e001b76"}, - {file = "ruff-0.8.6-py3-none-win32.whl", hash = "sha256:e169ea1b9eae61c99b257dc83b9ee6c76f89042752cb2d83486a7d6e48e8f764"}, - {file = "ruff-0.8.6-py3-none-win_amd64.whl", hash = "sha256:f1d70bef3d16fdc897ee290d7d20da3cbe4e26349f62e8a0274e7a3f4ce7a905"}, - {file = "ruff-0.8.6-py3-none-win_arm64.whl", hash = "sha256:7d7fc2377a04b6e04ffe588caad613d0c460eb2ecba4c0ccbbfe2bc973cbc162"}, - {file = "ruff-0.8.6.tar.gz", hash = "sha256:dcad24b81b62650b0eb8814f576fc65cfee8674772a6e24c9b747911801eeaa5"}, + {file = "ruff-0.9.2-py3-none-linux_armv6l.whl", hash = "sha256:80605a039ba1454d002b32139e4970becf84b5fee3a3c3bf1c2af6f61a784347"}, + {file = "ruff-0.9.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b9aab82bb20afd5f596527045c01e6ae25a718ff1784cb92947bff1f83068b00"}, + {file = "ruff-0.9.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fbd337bac1cfa96be615f6efcd4bc4d077edbc127ef30e2b8ba2a27e18c054d4"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82b35259b0cbf8daa22a498018e300b9bb0174c2bbb7bcba593935158a78054d"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b6a9701d1e371bf41dca22015c3f89769da7576884d2add7317ec1ec8cb9c3c"}, + {file = 
"ruff-0.9.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9cc53e68b3c5ae41e8faf83a3b89f4a5d7b2cb666dff4b366bb86ed2a85b481f"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8efd9da7a1ee314b910da155ca7e8953094a7c10d0c0a39bfde3fcfd2a015684"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3292c5a22ea9a5f9a185e2d131dc7f98f8534a32fb6d2ee7b9944569239c648d"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a605fdcf6e8b2d39f9436d343d1f0ff70c365a1e681546de0104bef81ce88df"}, + {file = "ruff-0.9.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c547f7f256aa366834829a08375c297fa63386cbe5f1459efaf174086b564247"}, + {file = "ruff-0.9.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d18bba3d3353ed916e882521bc3e0af403949dbada344c20c16ea78f47af965e"}, + {file = "ruff-0.9.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b338edc4610142355ccf6b87bd356729b62bf1bc152a2fad5b0c7dc04af77bfe"}, + {file = "ruff-0.9.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:492a5e44ad9b22a0ea98cf72e40305cbdaf27fac0d927f8bc9e1df316dcc96eb"}, + {file = "ruff-0.9.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:af1e9e9fe7b1f767264d26b1075ac4ad831c7db976911fa362d09b2d0356426a"}, + {file = "ruff-0.9.2-py3-none-win32.whl", hash = "sha256:71cbe22e178c5da20e1514e1e01029c73dc09288a8028a5d3446e6bba87a5145"}, + {file = "ruff-0.9.2-py3-none-win_amd64.whl", hash = "sha256:c5e1d6abc798419cf46eed03f54f2e0c3adb1ad4b801119dedf23fcaf69b55b5"}, + {file = "ruff-0.9.2-py3-none-win_arm64.whl", hash = "sha256:a1b63fa24149918f8b37cef2ee6fff81f24f0d74b6f0bdc37bc3e1f2143e41c6"}, + {file = "ruff-0.9.2.tar.gz", hash = "sha256:b5eceb334d55fae5f316f783437392642ae18e16dcf4f1858d55d3c2a0f8f5d0"}, ] [[package]] From 47165fc33cab245452d7a34be547184612a760b5 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" 
<49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 20 Feb 2025 09:17:45 +1000 Subject: [PATCH 07/60] build(deps): bump html5rdf from 1.2.0 to 1.2.1 in /docker/latest (#3045) Bumps [html5rdf](https://github.com/RDFLib/html5rdf) from 1.2.0 to 1.2.1. - [Release notes](https://github.com/RDFLib/html5rdf/releases) - [Changelog](https://github.com/RDFLib/html5rdf/blob/main/CHANGES.rst) - [Commits](https://github.com/RDFLib/html5rdf/commits/v1.2.1) --- updated-dependencies: - dependency-name: html5rdf dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- docker/latest/requirements.in | 2 +- docker/latest/requirements.txt | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index fa2ecec495..f710c09d57 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,4 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. It # will be updated by dependabot when new releases are made. 
rdflib==7.1.3 -html5rdf==1.2.0 +html5rdf==1.2.1 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index 95131d137e..2a04883448 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -4,11 +4,11 @@ # # pip-compile docker/latest/requirements.in # -html5rdf==1.2 - # via -r docker/latest/requirements.in +html5rdf==1.2.1 + # via -r requirements.in isodate==0.7.2 # via rdflib pyparsing==3.0.9 # via rdflib rdflib==7.1.3 - # via -r docker/latest/requirements.in + # via -r requirements.in From 74afcb06a14f76bb447bf165f517310da7d40744 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 20 Feb 2025 09:17:50 +1000 Subject: [PATCH 08/60] build(deps): bump html5rdf from 1.2 to 1.2.1 (#3048) Bumps [html5rdf](https://github.com/RDFLib/html5rdf) from 1.2 to 1.2.1. - [Release notes](https://github.com/RDFLib/html5rdf/releases) - [Changelog](https://github.com/RDFLib/html5rdf/blob/main/CHANGES.rst) - [Commits](https://github.com/RDFLib/html5rdf/commits/v1.2.1) --- updated-dependencies: - dependency-name: html5rdf dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- poetry.lock | 12 +++--------- 1 file changed, 3 insertions(+), 9 deletions(-) diff --git a/poetry.lock b/poetry.lock index cb90926f63..fccc5d101d 100644 --- a/poetry.lock +++ b/poetry.lock @@ -340,21 +340,15 @@ test = ["pytest (>=6)"] [[package]] name = "html5rdf" -version = "1.2" +version = "1.2.1" description = "HTML parser based on the WHATWG HTML specification" optional = true python-versions = ">=3.8" files = [ - {file = "html5rdf-1.2-py2.py3-none-any.whl", hash = "sha256:08169aa52a98ee3a6d3456d83feb36211fb5edcbcf3e05f6d19e0136f581638c"}, - {file = "html5rdf-1.2.tar.gz", hash = "sha256:08378cbbbb63993ba7bb5eb1eac44bf9ca7b1a23dbee3d2afef5376597fb00a5"}, + {file = "html5rdf-1.2.1-py2.py3-none-any.whl", hash = "sha256:1f519121bc366af3e485310dc8041d2e86e5173c1a320fac3dc9d2604069b83e"}, + {file = "html5rdf-1.2.1.tar.gz", hash = "sha256:ace9b420ce52995bb4f05e7425eedf19e433c981dfe7a831ab391e2fa2e1a195"}, ] -[package.extras] -all = ["chardet (>=2.2.1)", "genshi (>=0.7.1)", "lxml (>=3.4.0)"] -chardet = ["chardet (>=2.2.1)"] -genshi = ["genshi (>=0.7.1)"] -lxml = ["lxml (>=3.4.0)"] - [[package]] name = "idna" version = "3.4" From cfd422236cd03933290757a8cacb214a40866bea Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 20 Feb 2025 09:18:24 +1000 Subject: [PATCH 09/60] build(deps-dev): bump wheel from 0.44.0 to 0.45.1 (#3051) Bumps [wheel](https://github.com/pypa/wheel) from 0.44.0 to 0.45.1. - [Release notes](https://github.com/pypa/wheel/releases) - [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst) - [Commits](https://github.com/pypa/wheel/compare/0.44.0...0.45.1) --- updated-dependencies: - dependency-name: wheel dependency-type: direct:development update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- poetry.lock | 8 ++++---- pyproject.toml | 2 +- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/poetry.lock b/poetry.lock index fccc5d101d..587c48bd2a 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1443,13 +1443,13 @@ zstd = ["zstandard (>=0.18.0)"] [[package]] name = "wheel" -version = "0.44.0" +version = "0.45.1" description = "A built-package format for Python" optional = false python-versions = ">=3.8" files = [ - {file = "wheel-0.44.0-py3-none-any.whl", hash = "sha256:2376a90c98cc337d18623527a97c31797bd02bad0033d41547043a1cbfbe448f"}, - {file = "wheel-0.44.0.tar.gz", hash = "sha256:a29c3f2817e95ab89aa4660681ad547c0e9547f20e75b0562fe7723c9a2a9d49"}, + {file = "wheel-0.45.1-py3-none-any.whl", hash = "sha256:708e7481cc80179af0e556bbf0cc00b8444c7321e2700b8d8580231d13017248"}, + {file = "wheel-0.45.1.tar.gz", hash = "sha256:661e1abd9198507b1409a20c02106d9670b2576e916d58f520316666abca6729"}, ] [package.extras] @@ -1480,4 +1480,4 @@ orjson = ["orjson"] [metadata] lock-version = "2.0" python-versions = "^3.8.1" -content-hash = "3d9605c7f277f69e5c732d2edf25ed10fde6af31b791bb787229eb92be962af6" +content-hash = "01e3ca79b8228cd40cbbc60cbb6690bbe0f6207ca070eee0a44259778c797172" diff --git a/pyproject.toml b/pyproject.toml index 4aac771e62..fecad7eaeb 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -59,7 +59,7 @@ pytest-cov = ">=4,<6" coverage = {version = "^7.0.1", extras = ["toml"]} types-setuptools = ">=68.0.0.3,<72.0.0.0" setuptools = ">=68,<72" -wheel = ">=0.42,<0.45" +wheel = ">=0.42,<0.46" [tool.poetry.group.docs.dependencies] sphinx = ">=7.1.2,<8" From 89103ec495253ad1a67328cb864514024cdddadd Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:01:39 +1000 Subject: [PATCH 10/60] build(deps-dev): bump ruff 0.9.2 - 0.9.6 + readthedocs conf (#3072) * 
build(deps-dev): bump ruff from 0.9.2 to 0.9.6 Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.2 to 0.9.6. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.2...0.9.6) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] * add readthedocs sphinx.configuration --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car --- .readthedocs.yaml | 1 + poetry.lock | 38 +++++++++++++++++++------------------- 2 files changed, 20 insertions(+), 19 deletions(-) diff --git a/.readthedocs.yaml b/.readthedocs.yaml index f737b9b003..d847956c19 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -27,4 +27,5 @@ build: - python -c "from rdflib import Graph; print(Graph)" sphinx: + configuration: docs/conf.py fail_on_warning: true diff --git a/poetry.lock b/poetry.lock index 587c48bd2a..070ad0f10a 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1181,29 +1181,29 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.9.2" +version = "0.9.6" description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.9.2-py3-none-linux_armv6l.whl", hash = "sha256:80605a039ba1454d002b32139e4970becf84b5fee3a3c3bf1c2af6f61a784347"}, - {file = "ruff-0.9.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b9aab82bb20afd5f596527045c01e6ae25a718ff1784cb92947bff1f83068b00"}, - {file = "ruff-0.9.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fbd337bac1cfa96be615f6efcd4bc4d077edbc127ef30e2b8ba2a27e18c054d4"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82b35259b0cbf8daa22a498018e300b9bb0174c2bbb7bcba593935158a78054d"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b6a9701d1e371bf41dca22015c3f89769da7576884d2add7317ec1ec8cb9c3c"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9cc53e68b3c5ae41e8faf83a3b89f4a5d7b2cb666dff4b366bb86ed2a85b481f"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8efd9da7a1ee314b910da155ca7e8953094a7c10d0c0a39bfde3fcfd2a015684"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3292c5a22ea9a5f9a185e2d131dc7f98f8534a32fb6d2ee7b9944569239c648d"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a605fdcf6e8b2d39f9436d343d1f0ff70c365a1e681546de0104bef81ce88df"}, - {file = "ruff-0.9.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c547f7f256aa366834829a08375c297fa63386cbe5f1459efaf174086b564247"}, - {file = "ruff-0.9.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d18bba3d3353ed916e882521bc3e0af403949dbada344c20c16ea78f47af965e"}, - {file = "ruff-0.9.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b338edc4610142355ccf6b87bd356729b62bf1bc152a2fad5b0c7dc04af77bfe"}, - {file = "ruff-0.9.2-py3-none-musllinux_1_2_i686.whl", hash = 
"sha256:492a5e44ad9b22a0ea98cf72e40305cbdaf27fac0d927f8bc9e1df316dcc96eb"}, - {file = "ruff-0.9.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:af1e9e9fe7b1f767264d26b1075ac4ad831c7db976911fa362d09b2d0356426a"}, - {file = "ruff-0.9.2-py3-none-win32.whl", hash = "sha256:71cbe22e178c5da20e1514e1e01029c73dc09288a8028a5d3446e6bba87a5145"}, - {file = "ruff-0.9.2-py3-none-win_amd64.whl", hash = "sha256:c5e1d6abc798419cf46eed03f54f2e0c3adb1ad4b801119dedf23fcaf69b55b5"}, - {file = "ruff-0.9.2-py3-none-win_arm64.whl", hash = "sha256:a1b63fa24149918f8b37cef2ee6fff81f24f0d74b6f0bdc37bc3e1f2143e41c6"}, - {file = "ruff-0.9.2.tar.gz", hash = "sha256:b5eceb334d55fae5f316f783437392642ae18e16dcf4f1858d55d3c2a0f8f5d0"}, + {file = "ruff-0.9.6-py3-none-linux_armv6l.whl", hash = "sha256:2f218f356dd2d995839f1941322ff021c72a492c470f0b26a34f844c29cdf5ba"}, + {file = "ruff-0.9.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b908ff4df65dad7b251c9968a2e4560836d8f5487c2f0cc238321ed951ea0504"}, + {file = "ruff-0.9.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b109c0ad2ececf42e75fa99dc4043ff72a357436bb171900714a9ea581ddef83"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1de4367cca3dac99bcbd15c161404e849bb0bfd543664db39232648dc00112dc"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac3ee4d7c2c92ddfdaedf0bf31b2b176fa7aa8950efc454628d477394d35638b"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5dc1edd1775270e6aa2386119aea692039781429f0be1e0949ea5884e011aa8e"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4a091729086dffa4bd070aa5dab7e39cc6b9d62eb2bef8f3d91172d30d599666"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d1bbc6808bf7b15796cef0815e1dfb796fbd383e7dbd4334709642649625e7c5"}, + {file = 
"ruff-0.9.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:589d1d9f25b5754ff230dce914a174a7c951a85a4e9270613a2b74231fdac2f5"}, + {file = "ruff-0.9.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc61dd5131742e21103fbbdcad683a8813be0e3c204472d520d9a5021ca8b217"}, + {file = "ruff-0.9.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:5e2d9126161d0357e5c8f30b0bd6168d2c3872372f14481136d13de9937f79b6"}, + {file = "ruff-0.9.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:68660eab1a8e65babb5229a1f97b46e3120923757a68b5413d8561f8a85d4897"}, + {file = "ruff-0.9.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c4cae6c4cc7b9b4017c71114115db0445b00a16de3bcde0946273e8392856f08"}, + {file = "ruff-0.9.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:19f505b643228b417c1111a2a536424ddde0db4ef9023b9e04a46ed8a1cb4656"}, + {file = "ruff-0.9.6-py3-none-win32.whl", hash = "sha256:194d8402bceef1b31164909540a597e0d913c0e4952015a5b40e28c146121b5d"}, + {file = "ruff-0.9.6-py3-none-win_amd64.whl", hash = "sha256:03482d5c09d90d4ee3f40d97578423698ad895c87314c4de39ed2af945633caa"}, + {file = "ruff-0.9.6-py3-none-win_arm64.whl", hash = "sha256:0e2bb706a2be7ddfea4a4af918562fdc1bcb16df255e5fa595bbd800ce322a5a"}, + {file = "ruff-0.9.6.tar.gz", hash = "sha256:81761592f72b620ec8fa1068a6fd00e98a5ebee342a3642efd84454f3031dca9"}, ] [[package]] From 282ed561424be1024395786edb48c78c7d700bed Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:14:57 +1000 Subject: [PATCH 11/60] build(deps): bump lxml from 5.3.0 to 5.3.1 (#3071) Bumps [lxml](https://github.com/lxml/lxml) from 5.3.0 to 5.3.1. 
- [Release notes](https://github.com/lxml/lxml/releases) - [Changelog](https://github.com/lxml/lxml/blob/master/CHANGES.txt) - [Commits](https://github.com/lxml/lxml/compare/lxml-5.3.0...lxml-5.3.1) --- updated-dependencies: - dependency-name: lxml dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car --- poetry.lock | 282 ++++++++++++++++++++++++++-------------------------- 1 file changed, 141 insertions(+), 141 deletions(-) diff --git a/poetry.lock b/poetry.lock index 070ad0f10a..fc6fb2ea2a 100644 --- a/poetry.lock +++ b/poetry.lock @@ -431,157 +431,157 @@ i18n = ["Babel (>=2.7)"] [[package]] name = "lxml" -version = "5.3.0" +version = "5.3.1" description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API." optional = true python-versions = ">=3.6" files = [ - {file = "lxml-5.3.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:dd36439be765e2dde7660212b5275641edbc813e7b24668831a5c8ac91180656"}, - {file = "lxml-5.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ae5fe5c4b525aa82b8076c1a59d642c17b6e8739ecf852522c6321852178119d"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:501d0d7e26b4d261fca8132854d845e4988097611ba2531408ec91cf3fd9d20a"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb66442c2546446944437df74379e9cf9e9db353e61301d1a0e26482f43f0dd8"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9e41506fec7a7f9405b14aa2d5c8abbb4dbbd09d88f9496958b6d00cb4d45330"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f7d4a670107d75dfe5ad080bed6c341d18c4442f9378c9f58e5851e86eb79965"}, - {file = 
"lxml-5.3.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:41ce1f1e2c7755abfc7e759dc34d7d05fd221723ff822947132dc934d122fe22"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:44264ecae91b30e5633013fb66f6ddd05c006d3e0e884f75ce0b4755b3e3847b"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_28_ppc64le.whl", hash = "sha256:3c174dc350d3ec52deb77f2faf05c439331d6ed5e702fc247ccb4e6b62d884b7"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_28_s390x.whl", hash = "sha256:2dfab5fa6a28a0b60a20638dc48e6343c02ea9933e3279ccb132f555a62323d8"}, - {file = "lxml-5.3.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:b1c8c20847b9f34e98080da785bb2336ea982e7f913eed5809e5a3c872900f32"}, - {file = "lxml-5.3.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2c86bf781b12ba417f64f3422cfc302523ac9cd1d8ae8c0f92a1c66e56ef2e86"}, - {file = "lxml-5.3.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:c162b216070f280fa7da844531169be0baf9ccb17263cf5a8bf876fcd3117fa5"}, - {file = "lxml-5.3.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:36aef61a1678cb778097b4a6eeae96a69875d51d1e8f4d4b491ab3cfb54b5a03"}, - {file = "lxml-5.3.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f65e5120863c2b266dbcc927b306c5b78e502c71edf3295dfcb9501ec96e5fc7"}, - {file = "lxml-5.3.0-cp310-cp310-win32.whl", hash = "sha256:ef0c1fe22171dd7c7c27147f2e9c3e86f8bdf473fed75f16b0c2e84a5030ce80"}, - {file = "lxml-5.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:052d99051e77a4f3e8482c65014cf6372e61b0a6f4fe9edb98503bb5364cfee3"}, - {file = "lxml-5.3.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:74bcb423462233bc5d6066e4e98b0264e7c1bed7541fff2f4e34fe6b21563c8b"}, - {file = "lxml-5.3.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a3d819eb6f9b8677f57f9664265d0a10dd6551d227afb4af2b9cd7bdc2ccbf18"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:5b8f5db71b28b8c404956ddf79575ea77aa8b1538e8b2ef9ec877945b3f46442"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c3406b63232fc7e9b8783ab0b765d7c59e7c59ff96759d8ef9632fca27c7ee4"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2ecdd78ab768f844c7a1d4a03595038c166b609f6395e25af9b0f3f26ae1230f"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:168f2dfcfdedf611eb285efac1516c8454c8c99caf271dccda8943576b67552e"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa617107a410245b8660028a7483b68e7914304a6d4882b5ff3d2d3eb5948d8c"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:69959bd3167b993e6e710b99051265654133a98f20cec1d9b493b931942e9c16"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_28_ppc64le.whl", hash = "sha256:bd96517ef76c8654446fc3db9242d019a1bb5fe8b751ba414765d59f99210b79"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_28_s390x.whl", hash = "sha256:ab6dd83b970dc97c2d10bc71aa925b84788c7c05de30241b9e96f9b6d9ea3080"}, - {file = "lxml-5.3.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:eec1bb8cdbba2925bedc887bc0609a80e599c75b12d87ae42ac23fd199445654"}, - {file = "lxml-5.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6a7095eeec6f89111d03dabfe5883a1fd54da319c94e0fb104ee8f23616b572d"}, - {file = "lxml-5.3.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:6f651ebd0b21ec65dfca93aa629610a0dbc13dbc13554f19b0113da2e61a4763"}, - {file = "lxml-5.3.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:f422a209d2455c56849442ae42f25dbaaba1c6c3f501d58761c619c7836642ec"}, - {file = "lxml-5.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:62f7fdb0d1ed2065451f086519865b4c90aa19aed51081979ecd05a21eb4d1be"}, - {file = "lxml-5.3.0-cp311-cp311-win32.whl", hash = 
"sha256:c6379f35350b655fd817cd0d6cbeef7f265f3ae5fedb1caae2eb442bbeae9ab9"}, - {file = "lxml-5.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:9c52100e2c2dbb0649b90467935c4b0de5528833c76a35ea1a2691ec9f1ee7a1"}, - {file = "lxml-5.3.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:e99f5507401436fdcc85036a2e7dc2e28d962550afe1cbfc07c40e454256a859"}, - {file = "lxml-5.3.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:384aacddf2e5813a36495233b64cb96b1949da72bef933918ba5c84e06af8f0e"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:874a216bf6afaf97c263b56371434e47e2c652d215788396f60477540298218f"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65ab5685d56914b9a2a34d67dd5488b83213d680b0c5d10b47f81da5a16b0b0e"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aac0bbd3e8dd2d9c45ceb82249e8bdd3ac99131a32b4d35c8af3cc9db1657179"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b369d3db3c22ed14c75ccd5af429086f166a19627e84a8fdade3f8f31426e52a"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c24037349665434f375645fa9d1f5304800cec574d0310f618490c871fd902b3"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:62d172f358f33a26d6b41b28c170c63886742f5b6772a42b59b4f0fa10526cb1"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_28_ppc64le.whl", hash = "sha256:c1f794c02903c2824fccce5b20c339a1a14b114e83b306ff11b597c5f71a1c8d"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_28_s390x.whl", hash = "sha256:5d6a6972b93c426ace71e0be9a6f4b2cfae9b1baed2eed2006076a746692288c"}, - {file = "lxml-5.3.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:3879cc6ce938ff4eb4900d901ed63555c778731a96365e53fadb36437a131a99"}, - {file = 
"lxml-5.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:74068c601baff6ff021c70f0935b0c7bc528baa8ea210c202e03757c68c5a4ff"}, - {file = "lxml-5.3.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:ecd4ad8453ac17bc7ba3868371bffb46f628161ad0eefbd0a855d2c8c32dd81a"}, - {file = "lxml-5.3.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7e2f58095acc211eb9d8b5771bf04df9ff37d6b87618d1cbf85f92399c98dae8"}, - {file = "lxml-5.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e63601ad5cd8f860aa99d109889b5ac34de571c7ee902d6812d5d9ddcc77fa7d"}, - {file = "lxml-5.3.0-cp312-cp312-win32.whl", hash = "sha256:17e8d968d04a37c50ad9c456a286b525d78c4a1c15dd53aa46c1d8e06bf6fa30"}, - {file = "lxml-5.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:c1a69e58a6bb2de65902051d57fde951febad631a20a64572677a1052690482f"}, - {file = "lxml-5.3.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8c72e9563347c7395910de6a3100a4840a75a6f60e05af5e58566868d5eb2d6a"}, - {file = "lxml-5.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e92ce66cd919d18d14b3856906a61d3f6b6a8500e0794142338da644260595cd"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d04f064bebdfef9240478f7a779e8c5dc32b8b7b0b2fc6a62e39b928d428e51"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5c2fb570d7823c2bbaf8b419ba6e5662137f8166e364a8b2b91051a1fb40ab8b"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0c120f43553ec759f8de1fee2f4794452b0946773299d44c36bfe18e83caf002"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:562e7494778a69086f0312ec9689f6b6ac1c6b65670ed7d0267e49f57ffa08c4"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:423b121f7e6fa514ba0c7918e56955a1d4470ed35faa03e3d9f0e3baa4c7e492"}, - {file = 
"lxml-5.3.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:c00f323cc00576df6165cc9d21a4c21285fa6b9989c5c39830c3903dc4303ef3"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_28_ppc64le.whl", hash = "sha256:1fdc9fae8dd4c763e8a31e7630afef517eab9f5d5d31a278df087f307bf601f4"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_28_s390x.whl", hash = "sha256:658f2aa69d31e09699705949b5fc4719cbecbd4a97f9656a232e7d6c7be1a367"}, - {file = "lxml-5.3.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:1473427aff3d66a3fa2199004c3e601e6c4500ab86696edffdbc84954c72d832"}, - {file = "lxml-5.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a87de7dd873bf9a792bf1e58b1c3887b9264036629a5bf2d2e6579fe8e73edff"}, - {file = "lxml-5.3.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:0d7b36afa46c97875303a94e8f3ad932bf78bace9e18e603f2085b652422edcd"}, - {file = "lxml-5.3.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:cf120cce539453ae086eacc0130a324e7026113510efa83ab42ef3fcfccac7fb"}, - {file = "lxml-5.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:df5c7333167b9674aa8ae1d4008fa4bc17a313cc490b2cca27838bbdcc6bb15b"}, - {file = "lxml-5.3.0-cp313-cp313-win32.whl", hash = "sha256:c802e1c2ed9f0c06a65bc4ed0189d000ada8049312cfeab6ca635e39c9608957"}, - {file = "lxml-5.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:406246b96d552e0503e17a1006fd27edac678b3fcc9f1be71a2f94b4ff61528d"}, - {file = "lxml-5.3.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:8f0de2d390af441fe8b2c12626d103540b5d850d585b18fcada58d972b74a74e"}, - {file = "lxml-5.3.0-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1afe0a8c353746e610bd9031a630a95bcfb1a720684c3f2b36c4710a0a96528f"}, - {file = "lxml-5.3.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:56b9861a71575f5795bde89256e7467ece3d339c9b43141dbdd54544566b3b94"}, - {file = "lxml-5.3.0-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = 
"sha256:9fb81d2824dff4f2e297a276297e9031f46d2682cafc484f49de182aa5e5df99"}, - {file = "lxml-5.3.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:2c226a06ecb8cdef28845ae976da407917542c5e6e75dcac7cc33eb04aaeb237"}, - {file = "lxml-5.3.0-cp36-cp36m-musllinux_1_2_x86_64.whl", hash = "sha256:7d3d1ca42870cdb6d0d29939630dbe48fa511c203724820fc0fd507b2fb46577"}, - {file = "lxml-5.3.0-cp36-cp36m-win32.whl", hash = "sha256:094cb601ba9f55296774c2d57ad68730daa0b13dc260e1f941b4d13678239e70"}, - {file = "lxml-5.3.0-cp36-cp36m-win_amd64.whl", hash = "sha256:eafa2c8658f4e560b098fe9fc54539f86528651f61849b22111a9b107d18910c"}, - {file = "lxml-5.3.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:cb83f8a875b3d9b458cada4f880fa498646874ba4011dc974e071a0a84a1b033"}, - {file = "lxml-5.3.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25f1b69d41656b05885aa185f5fdf822cb01a586d1b32739633679699f220391"}, - {file = "lxml-5.3.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23e0553b8055600b3bf4a00b255ec5c92e1e4aebf8c2c09334f8368e8bd174d6"}, - {file = "lxml-5.3.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ada35dd21dc6c039259596b358caab6b13f4db4d4a7f8665764d616daf9cc1d"}, - {file = "lxml-5.3.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:81b4e48da4c69313192d8c8d4311e5d818b8be1afe68ee20f6385d0e96fc9512"}, - {file = "lxml-5.3.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:2bc9fd5ca4729af796f9f59cd8ff160fe06a474da40aca03fcc79655ddee1a8b"}, - {file = "lxml-5.3.0-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:07da23d7ee08577760f0a71d67a861019103e4812c87e2fab26b039054594cc5"}, - {file = "lxml-5.3.0-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:ea2e2f6f801696ad7de8aec061044d6c8c0dd4037608c7cab38a9a4d316bfb11"}, - {file = "lxml-5.3.0-cp37-cp37m-win32.whl", hash = 
"sha256:5c54afdcbb0182d06836cc3d1be921e540be3ebdf8b8a51ee3ef987537455f84"}, - {file = "lxml-5.3.0-cp37-cp37m-win_amd64.whl", hash = "sha256:f2901429da1e645ce548bf9171784c0f74f0718c3f6150ce166be39e4dd66c3e"}, - {file = "lxml-5.3.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c56a1d43b2f9ee4786e4658c7903f05da35b923fb53c11025712562d5cc02753"}, - {file = "lxml-5.3.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ee8c39582d2652dcd516d1b879451500f8db3fe3607ce45d7c5957ab2596040"}, - {file = "lxml-5.3.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fdf3a3059611f7585a78ee10399a15566356116a4288380921a4b598d807a22"}, - {file = "lxml-5.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:146173654d79eb1fc97498b4280c1d3e1e5d58c398fa530905c9ea50ea849b22"}, - {file = "lxml-5.3.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:0a7056921edbdd7560746f4221dca89bb7a3fe457d3d74267995253f46343f15"}, - {file = "lxml-5.3.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:9e4b47ac0f5e749cfc618efdf4726269441014ae1d5583e047b452a32e221920"}, - {file = "lxml-5.3.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:f914c03e6a31deb632e2daa881fe198461f4d06e57ac3d0e05bbcab8eae01945"}, - {file = "lxml-5.3.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:213261f168c5e1d9b7535a67e68b1f59f92398dd17a56d934550837143f79c42"}, - {file = "lxml-5.3.0-cp38-cp38-win32.whl", hash = "sha256:218c1b2e17a710e363855594230f44060e2025b05c80d1f0661258142b2add2e"}, - {file = "lxml-5.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:315f9542011b2c4e1d280e4a20ddcca1761993dda3afc7a73b01235f8641e903"}, - {file = "lxml-5.3.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:1ffc23010330c2ab67fac02781df60998ca8fe759e8efde6f8b756a20599c5de"}, - {file = "lxml-5.3.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:2b3778cb38212f52fac9fe913017deea2fdf4eb1a4f8e4cfc6b009a13a6d3fcc"}, - {file 
= "lxml-5.3.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b0c7a688944891086ba192e21c5229dea54382f4836a209ff8d0a660fac06be"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:747a3d3e98e24597981ca0be0fd922aebd471fa99d0043a3842d00cdcad7ad6a"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86a6b24b19eaebc448dc56b87c4865527855145d851f9fc3891673ff97950540"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b11a5d918a6216e521c715b02749240fb07ae5a1fefd4b7bf12f833bc8b4fe70"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:68b87753c784d6acb8a25b05cb526c3406913c9d988d51f80adecc2b0775d6aa"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:109fa6fede314cc50eed29e6e56c540075e63d922455346f11e4d7a036d2b8cf"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_28_ppc64le.whl", hash = "sha256:02ced472497b8362c8e902ade23e3300479f4f43e45f4105c85ef43b8db85229"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_28_s390x.whl", hash = "sha256:6b038cc86b285e4f9fea2ba5ee76e89f21ed1ea898e287dc277a25884f3a7dfe"}, - {file = "lxml-5.3.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:7437237c6a66b7ca341e868cda48be24b8701862757426852c9b3186de1da8a2"}, - {file = "lxml-5.3.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7f41026c1d64043a36fda21d64c5026762d53a77043e73e94b71f0521939cc71"}, - {file = "lxml-5.3.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:482c2f67761868f0108b1743098640fbb2a28a8e15bf3f47ada9fa59d9fe08c3"}, - {file = "lxml-5.3.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:1483fd3358963cc5c1c9b122c80606a3a79ee0875bcac0204149fa09d6ff2727"}, - {file = "lxml-5.3.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:2dec2d1130a9cda5b904696cec33b2cfb451304ba9081eeda7f90f724097300a"}, - {file 
= "lxml-5.3.0-cp39-cp39-win32.whl", hash = "sha256:a0eabd0a81625049c5df745209dc7fcef6e2aea7793e5f003ba363610aa0a3ff"}, - {file = "lxml-5.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:89e043f1d9d341c52bf2af6d02e6adde62e0a46e6755d5eb60dc6e4f0b8aeca2"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7b1cd427cb0d5f7393c31b7496419da594fe600e6fdc4b105a54f82405e6626c"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:51806cfe0279e06ed8500ce19479d757db42a30fd509940b1701be9c86a5ff9a"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ee70d08fd60c9565ba8190f41a46a54096afa0eeb8f76bd66f2c25d3b1b83005"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:8dc2c0395bea8254d8daebc76dcf8eb3a95ec2a46fa6fae5eaccee366bfe02ce"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:6ba0d3dcac281aad8a0e5b14c7ed6f9fa89c8612b47939fc94f80b16e2e9bc83"}, - {file = "lxml-5.3.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:6e91cf736959057f7aac7adfc83481e03615a8e8dd5758aa1d95ea69e8931dba"}, - {file = "lxml-5.3.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:94d6c3782907b5e40e21cadf94b13b0842ac421192f26b84c45f13f3c9d5dc27"}, - {file = "lxml-5.3.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c300306673aa0f3ed5ed9372b21867690a17dba38c68c44b287437c362ce486b"}, - {file = "lxml-5.3.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:78d9b952e07aed35fe2e1a7ad26e929595412db48535921c5013edc8aa4a35ce"}, - {file = "lxml-5.3.0-pp37-pypy37_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:01220dca0d066d1349bd6a1726856a78f7929f3878f7e2ee83c296c69495309e"}, - {file = "lxml-5.3.0-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:2d9b8d9177afaef80c53c0a9e30fa252ff3036fb1c6494d427c066a4ce6a282f"}, - {file = 
"lxml-5.3.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:20094fc3f21ea0a8669dc4c61ed7fa8263bd37d97d93b90f28fc613371e7a875"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ace2c2326a319a0bb8a8b0e5b570c764962e95818de9f259ce814ee666603f19"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:92e67a0be1639c251d21e35fe74df6bcc40cba445c2cda7c4a967656733249e2"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd5350b55f9fecddc51385463a4f67a5da829bc741e38cf689f38ec9023f54ab"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:4c1fefd7e3d00921c44dc9ca80a775af49698bbfd92ea84498e56acffd4c5469"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:71a8dd38fbd2f2319136d4ae855a7078c69c9a38ae06e0c17c73fd70fc6caad8"}, - {file = "lxml-5.3.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:97acf1e1fd66ab53dacd2c35b319d7e548380c2e9e8c54525c6e76d21b1ae3b1"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:68934b242c51eb02907c5b81d138cb977b2129a0a75a8f8b60b01cb8586c7b21"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b710bc2b8292966b23a6a0121f7a6c51d45d2347edcc75f016ac123b8054d3f2"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18feb4b93302091b1541221196a2155aa296c363fd233814fa11e181adebc52f"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:3eb44520c4724c2e1a57c0af33a379eee41792595023f367ba3952a2d96c2aab"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:609251a0ca4770e5a8768ff902aa02bf636339c5a93f9349b48eb1f606f7f3e9"}, - {file = "lxml-5.3.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:516f491c834eb320d6c843156440fe7fc0d50b33e44387fcec5b02f0bc118a4c"}, - {file = 
"lxml-5.3.0.tar.gz", hash = "sha256:4e109ca30d1edec1ac60cdbe341905dc3b8f55b16855e03a54aaf59e51ec8c6f"}, + {file = "lxml-5.3.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a4058f16cee694577f7e4dd410263cd0ef75644b43802a689c2b3c2a7e69453b"}, + {file = "lxml-5.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:364de8f57d6eda0c16dcfb999af902da31396949efa0e583e12675d09709881b"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:528f3a0498a8edc69af0559bdcf8a9f5a8bf7c00051a6ef3141fdcf27017bbf5"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db4743e30d6f5f92b6d2b7c86b3ad250e0bad8dee4b7ad8a0c44bfb276af89a3"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:17b5d7f8acf809465086d498d62a981fa6a56d2718135bb0e4aa48c502055f5c"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:928e75a7200a4c09e6efc7482a1337919cc61fe1ba289f297827a5b76d8969c2"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a997b784a639e05b9d4053ef3b20c7e447ea80814a762f25b8ed5a89d261eac"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:7b82e67c5feb682dbb559c3e6b78355f234943053af61606af126df2183b9ef9"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_ppc64le.whl", hash = "sha256:f1de541a9893cf8a1b1db9bf0bf670a2decab42e3e82233d36a74eda7822b4c9"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_s390x.whl", hash = "sha256:de1fc314c3ad6bc2f6bd5b5a5b9357b8c6896333d27fdbb7049aea8bd5af2d79"}, + {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:7c0536bd9178f754b277a3e53f90f9c9454a3bd108b1531ffff720e082d824f2"}, + {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:68018c4c67d7e89951a91fbd371e2e34cd8cfc71f0bb43b5332db38497025d51"}, + {file = 
"lxml-5.3.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:aa826340a609d0c954ba52fd831f0fba2a4165659ab0ee1a15e4aac21f302406"}, + {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:796520afa499732191e39fc95b56a3b07f95256f2d22b1c26e217fb69a9db5b5"}, + {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3effe081b3135237da6e4c4530ff2a868d3f80be0bda027e118a5971285d42d0"}, + {file = "lxml-5.3.1-cp310-cp310-win32.whl", hash = "sha256:a22f66270bd6d0804b02cd49dae2b33d4341015545d17f8426f2c4e22f557a23"}, + {file = "lxml-5.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:0bcfadea3cdc68e678d2b20cb16a16716887dd00a881e16f7d806c2138b8ff0c"}, + {file = "lxml-5.3.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:e220f7b3e8656ab063d2eb0cd536fafef396829cafe04cb314e734f87649058f"}, + {file = "lxml-5.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0f2cfae0688fd01f7056a17367e3b84f37c545fb447d7282cf2c242b16262607"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:67d2f8ad9dcc3a9e826bdc7802ed541a44e124c29b7d95a679eeb58c1c14ade8"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db0c742aad702fd5d0c6611a73f9602f20aec2007c102630c06d7633d9c8f09a"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:198bb4b4dd888e8390afa4f170d4fa28467a7eaf857f1952589f16cfbb67af27"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d2a3e412ce1849be34b45922bfef03df32d1410a06d1cdeb793a343c2f1fd666"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2b8969dbc8d09d9cd2ae06362c3bad27d03f433252601ef658a49bd9f2b22d79"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:5be8f5e4044146a69c96077c7e08f0709c13a314aa5315981185c1f00235fe65"}, + {file = 
"lxml-5.3.1-cp311-cp311-manylinux_2_28_ppc64le.whl", hash = "sha256:133f3493253a00db2c870d3740bc458ebb7d937bd0a6a4f9328373e0db305709"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_s390x.whl", hash = "sha256:52d82b0d436edd6a1d22d94a344b9a58abd6c68c357ed44f22d4ba8179b37629"}, + {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:1b6f92e35e2658a5ed51c6634ceb5ddae32053182851d8cad2a5bc102a359b33"}, + {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:203b1d3eaebd34277be06a3eb880050f18a4e4d60861efba4fb946e31071a295"}, + {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:155e1a5693cf4b55af652f5c0f78ef36596c7f680ff3ec6eb4d7d85367259b2c"}, + {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:22ec2b3c191f43ed21f9545e9df94c37c6b49a5af0a874008ddc9132d49a2d9c"}, + {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7eda194dd46e40ec745bf76795a7cccb02a6a41f445ad49d3cf66518b0bd9cff"}, + {file = "lxml-5.3.1-cp311-cp311-win32.whl", hash = "sha256:fb7c61d4be18e930f75948705e9718618862e6fc2ed0d7159b2262be73f167a2"}, + {file = "lxml-5.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:c809eef167bf4a57af4b03007004896f5c60bd38dc3852fcd97a26eae3d4c9e6"}, + {file = "lxml-5.3.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:e69add9b6b7b08c60d7ff0152c7c9a6c45b4a71a919be5abde6f98f1ea16421c"}, + {file = "lxml-5.3.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:4e52e1b148867b01c05e21837586ee307a01e793b94072d7c7b91d2c2da02ffe"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a4b382e0e636ed54cd278791d93fe2c4f370772743f02bcbe431a160089025c9"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c2e49dc23a10a1296b04ca9db200c44d3eb32c8d8ec532e8c1fd24792276522a"}, + {file = 
"lxml-5.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4399b4226c4785575fb20998dc571bc48125dc92c367ce2602d0d70e0c455eb0"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5412500e0dc5481b1ee9cf6b38bb3b473f6e411eb62b83dc9b62699c3b7b79f7"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c93ed3c998ea8472be98fb55aed65b5198740bfceaec07b2eba551e55b7b9ae"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:63d57fc94eb0bbb4735e45517afc21ef262991d8758a8f2f05dd6e4174944519"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_ppc64le.whl", hash = "sha256:b450d7cabcd49aa7ab46a3c6aa3ac7e1593600a1a0605ba536ec0f1b99a04322"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_s390x.whl", hash = "sha256:4df0ec814b50275ad6a99bc82a38b59f90e10e47714ac9871e1b223895825468"}, + {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:d184f85ad2bb1f261eac55cddfcf62a70dee89982c978e92b9a74a1bfef2e367"}, + {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b725e70d15906d24615201e650d5b0388b08a5187a55f119f25874d0103f90dd"}, + {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:a31fa7536ec1fb7155a0cd3a4e3d956c835ad0a43e3610ca32384d01f079ea1c"}, + {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:3c3c8b55c7fc7b7e8877b9366568cc73d68b82da7fe33d8b98527b73857a225f"}, + {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:d61ec60945d694df806a9aec88e8f29a27293c6e424f8ff91c80416e3c617645"}, + {file = "lxml-5.3.1-cp312-cp312-win32.whl", hash = "sha256:f4eac0584cdc3285ef2e74eee1513a6001681fd9753b259e8159421ed28a72e5"}, + {file = "lxml-5.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:29bfc8d3d88e56ea0a27e7c4897b642706840247f59f4377d81be8f32aa0cfbf"}, + {file = "lxml-5.3.1-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:c093c7088b40d8266f57ed71d93112bd64c6724d31f0794c1e52cc4857c28e0e"}, + {file = "lxml-5.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b0884e3f22d87c30694e625b1e62e6f30d39782c806287450d9dc2fdf07692fd"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1637fa31ec682cd5760092adfabe86d9b718a75d43e65e211d5931809bc111e7"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a364e8e944d92dcbf33b6b494d4e0fb3499dcc3bd9485beb701aa4b4201fa414"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:779e851fd0e19795ccc8a9bb4d705d6baa0ef475329fe44a13cf1e962f18ff1e"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c4393600915c308e546dc7003d74371744234e8444a28622d76fe19b98fa59d1"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:673b9d8e780f455091200bba8534d5f4f465944cbdd61f31dc832d70e29064a5"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:2e4a570f6a99e96c457f7bec5ad459c9c420ee80b99eb04cbfcfe3fc18ec6423"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_ppc64le.whl", hash = "sha256:71f31eda4e370f46af42fc9f264fafa1b09f46ba07bdbee98f25689a04b81c20"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_s390x.whl", hash = "sha256:42978a68d3825eaac55399eb37a4d52012a205c0c6262199b8b44fcc6fd686e8"}, + {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:8b1942b3e4ed9ed551ed3083a2e6e0772de1e5e3aca872d955e2e86385fb7ff9"}, + {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:85c4f11be9cf08917ac2a5a8b6e1ef63b2f8e3799cec194417e76826e5f1de9c"}, + {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:231cf4d140b22a923b1d0a0a4e0b4f972e5893efcdec188934cc65888fd0227b"}, + {file = 
"lxml-5.3.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:5865b270b420eda7b68928d70bb517ccbe045e53b1a428129bb44372bf3d7dd5"}, + {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:dbf7bebc2275016cddf3c997bf8a0f7044160714c64a9b83975670a04e6d2252"}, + {file = "lxml-5.3.1-cp313-cp313-win32.whl", hash = "sha256:d0751528b97d2b19a388b302be2a0ee05817097bab46ff0ed76feeec24951f78"}, + {file = "lxml-5.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:91fb6a43d72b4f8863d21f347a9163eecbf36e76e2f51068d59cd004c506f332"}, + {file = "lxml-5.3.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:016b96c58e9a4528219bb563acf1aaaa8bc5452e7651004894a973f03b84ba81"}, + {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82a4bb10b0beef1434fb23a09f001ab5ca87895596b4581fd53f1e5145a8934a"}, + {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d68eeef7b4d08a25e51897dac29bcb62aba830e9ac6c4e3297ee7c6a0cf6439"}, + {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:f12582b8d3b4c6be1d298c49cb7ae64a3a73efaf4c2ab4e37db182e3545815ac"}, + {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:2df7ed5edeb6bd5590914cd61df76eb6cce9d590ed04ec7c183cf5509f73530d"}, + {file = "lxml-5.3.1-cp36-cp36m-musllinux_1_2_x86_64.whl", hash = "sha256:585c4dc429deebc4307187d2b71ebe914843185ae16a4d582ee030e6cfbb4d8a"}, + {file = "lxml-5.3.1-cp36-cp36m-win32.whl", hash = "sha256:06a20d607a86fccab2fc15a77aa445f2bdef7b49ec0520a842c5c5afd8381576"}, + {file = "lxml-5.3.1-cp36-cp36m-win_amd64.whl", hash = "sha256:057e30d0012439bc54ca427a83d458752ccda725c1c161cc283db07bcad43cf9"}, + {file = "lxml-5.3.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4867361c049761a56bd21de507cab2c2a608c55102311d142ade7dab67b34f32"}, + {file = 
"lxml-5.3.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3dddf0fb832486cc1ea71d189cb92eb887826e8deebe128884e15020bb6e3f61"}, + {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bcc211542f7af6f2dfb705f5f8b74e865592778e6cafdfd19c792c244ccce19"}, + {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaca5a812f050ab55426c32177091130b1e49329b3f002a32934cd0245571307"}, + {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:236610b77589faf462337b3305a1be91756c8abc5a45ff7ca8f245a71c5dab70"}, + {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:aed57b541b589fa05ac248f4cb1c46cbb432ab82cbd467d1c4f6a2bdc18aecf9"}, + {file = "lxml-5.3.1-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:75fa3d6946d317ffc7016a6fcc44f42db6d514b7fdb8b4b28cbe058303cb6e53"}, + {file = "lxml-5.3.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:96eef5b9f336f623ffc555ab47a775495e7e8846dde88de5f941e2906453a1ce"}, + {file = "lxml-5.3.1-cp37-cp37m-win32.whl", hash = "sha256:ef45f31aec9be01379fc6c10f1d9c677f032f2bac9383c827d44f620e8a88407"}, + {file = "lxml-5.3.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a0611da6b07dd3720f492db1b463a4d1175b096b49438761cc9f35f0d9eaaef5"}, + {file = "lxml-5.3.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b2aca14c235c7a08558fe0a4786a1a05873a01e86b474dfa8f6df49101853a4e"}, + {file = "lxml-5.3.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae82fce1d964f065c32c9517309f0c7be588772352d2f40b1574a214bd6e6098"}, + {file = "lxml-5.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7aae7a3d63b935babfdc6864b31196afd5145878ddd22f5200729006366bc4d5"}, + {file = "lxml-5.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:e8e0d177b1fe251c3b1b914ab64135475c5273c8cfd2857964b2e3bb0fe196a7"}, + {file = "lxml-5.3.1-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:6c4dd3bfd0c82400060896717dd261137398edb7e524527438c54a8c34f736bf"}, + {file = "lxml-5.3.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:f1208c1c67ec9e151d78aa3435aa9b08a488b53d9cfac9b699f15255a3461ef2"}, + {file = "lxml-5.3.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:c6aacf00d05b38a5069826e50ae72751cb5bc27bdc4d5746203988e429b385bb"}, + {file = "lxml-5.3.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5881aaa4bf3a2d086c5f20371d3a5856199a0d8ac72dd8d0dbd7a2ecfc26ab73"}, + {file = "lxml-5.3.1-cp38-cp38-win32.whl", hash = "sha256:45fbb70ccbc8683f2fb58bea89498a7274af1d9ec7995e9f4af5604e028233fc"}, + {file = "lxml-5.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:7512b4d0fc5339d5abbb14d1843f70499cab90d0b864f790e73f780f041615d7"}, + {file = "lxml-5.3.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5885bc586f1edb48e5d68e7a4b4757b5feb2a496b64f462b4d65950f5af3364f"}, + {file = "lxml-5.3.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1b92fe86e04f680b848fff594a908edfa72b31bfc3499ef7433790c11d4c8cd8"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a091026c3bf7519ab1e64655a3f52a59ad4a4e019a6f830c24d6430695b1cf6a"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8ffb141361108e864ab5f1813f66e4e1164181227f9b1f105b042729b6c15125"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3715cdf0dd31b836433af9ee9197af10e3df41d273c19bb249230043667a5dfd"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:88b72eb7222d918c967202024812c2bfb4048deeb69ca328363fb8e15254c549"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:aa59974880ab5ad8ef3afaa26f9bda148c5f39e06b11a8ada4660ecc9fb2feb3"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:3bb8149840daf2c3f97cebf00e4ed4a65a0baff888bf2605a8d0135ff5cf764e"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_ppc64le.whl", hash = "sha256:0d6b2fa86becfa81f0a0271ccb9eb127ad45fb597733a77b92e8a35e53414914"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_s390x.whl", hash = "sha256:136bf638d92848a939fd8f0e06fcf92d9f2e4b57969d94faae27c55f3d85c05b"}, + {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:89934f9f791566e54c1d92cdc8f8fd0009447a5ecdb1ec6b810d5f8c4955f6be"}, + {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:a8ade0363f776f87f982572c2860cc43c65ace208db49c76df0a21dde4ddd16e"}, + {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:bfbbab9316330cf81656fed435311386610f78b6c93cc5db4bebbce8dd146675"}, + {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:172d65f7c72a35a6879217bcdb4bb11bc88d55fb4879e7569f55616062d387c2"}, + {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:e3c623923967f3e5961d272718655946e5322b8d058e094764180cdee7bab1af"}, + {file = "lxml-5.3.1-cp39-cp39-win32.whl", hash = "sha256:ce0930a963ff593e8bb6fda49a503911accc67dee7e5445eec972668e672a0f0"}, + {file = "lxml-5.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:f7b64fcd670bca8800bc10ced36620c6bbb321e7bc1214b9c0c0df269c1dddc2"}, + {file = "lxml-5.3.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:afa578b6524ff85fb365f454cf61683771d0170470c48ad9d170c48075f86725"}, + {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:67f5e80adf0aafc7b5454f2c1cb0cde920c9b1f2cbd0485f07cc1d0497c35c5d"}, + {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dd0b80ac2d8f13ffc906123a6f20b459cb50a99222d0da492360512f3e50f84"}, + {file = 
"lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:422c179022ecdedbe58b0e242607198580804253da220e9454ffe848daa1cfd2"}, + {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:524ccfded8989a6595dbdda80d779fb977dbc9a7bc458864fc9a0c2fc15dc877"}, + {file = "lxml-5.3.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:48fd46bf7155def2e15287c6f2b133a2f78e2d22cdf55647269977b873c65499"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:05123fad495a429f123307ac6d8fd6f977b71e9a0b6d9aeeb8f80c017cb17131"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a243132767150a44e6a93cd1dde41010036e1cbc63cc3e9fe1712b277d926ce3"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c92ea6d9dd84a750b2bae72ff5e8cf5fdd13e58dda79c33e057862c29a8d5b50"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:2f1be45d4c15f237209bbf123a0e05b5d630c8717c42f59f31ea9eae2ad89394"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:a83d3adea1e0ee36dac34627f78ddd7f093bb9cfc0a8e97f1572a949b695cb98"}, + {file = "lxml-5.3.1-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:3edbb9c9130bac05d8c3fe150c51c337a471cc7fdb6d2a0a7d3a88e88a829314"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2f23cf50eccb3255b6e913188291af0150d89dab44137a69e14e4dcb7be981f1"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df7e5edac4778127f2bf452e0721a58a1cfa4d1d9eac63bdd650535eb8543615"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:094b28ed8a8a072b9e9e2113a81fda668d2053f2ca9f2d202c2c8c7c2d6516b1"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = 
"sha256:514fe78fc4b87e7a7601c92492210b20a1b0c6ab20e71e81307d9c2e377c64de"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:8fffc08de02071c37865a155e5ea5fce0282e1546fd5bde7f6149fcaa32558ac"}, + {file = "lxml-5.3.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:4b0d5cdba1b655d5b18042ac9c9ff50bda33568eb80feaaca4fc237b9c4fbfde"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:3031e4c16b59424e8d78522c69b062d301d951dc55ad8685736c3335a97fc270"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb659702a45136c743bc130760c6f137870d4df3a9e14386478b8a0511abcfca"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a11b16a33656ffc43c92a5343a28dc71eefe460bcc2a4923a96f292692709f6"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c5ae125276f254b01daa73e2c103363d3e99e3e10505686ac7d9d2442dd4627a"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c76722b5ed4a31ba103e0dc77ab869222ec36efe1a614e42e9bcea88a36186fe"}, + {file = "lxml-5.3.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:33e06717c00c788ab4e79bc4726ecc50c54b9bfb55355eae21473c145d83c2d2"}, + {file = "lxml-5.3.1.tar.gz", hash = "sha256:106b7b5d2977b339f1e97efe2778e2ab20e99994cbb0ec5e55771ed0795920c8"}, ] [package.extras] cssselect = ["cssselect (>=0.7)"] -html-clean = ["lxml-html-clean"] +html-clean = ["lxml_html_clean"] html5 = ["html5lib"] htmlsoup = ["BeautifulSoup4"] -source = ["Cython (>=3.0.11)"] +source = ["Cython (>=3.0.11,<3.1.0)"] [[package]] name = "lxml-stubs" From 62685d898c009cdc0dfa848b4d98b5442b01ae8f Mon Sep 17 00:00:00 2001 From: Nils Philippsen Date: Sat, 22 Mar 2025 04:25:51 +0100 Subject: [PATCH 12/60] Cope with Namespace annotations in Python 3.14 (#3084) The __annotations__ member can be incomplete, use the get_annotations() helper from annotationlib (Python 
>= 3.14) or inspect (Python >= 3.10) if available. Related: #3083 Signed-off-by: Nils Philippsen Co-authored-by: Nicholas Car --- rdflib/namespace/__init__.py | 22 +++++++++++++++++++--- test_reports/rdflib_w3c_sparql10-HEAD.ttl | 2 +- test_reports/rdflib_w3c_sparql11-HEAD.ttl | 20 ++++++++++---------- 3 files changed, 30 insertions(+), 14 deletions(-) diff --git a/rdflib/namespace/__init__.py b/rdflib/namespace/__init__.py index eb8e2eeed8..d37d6f0cc3 100644 --- a/rdflib/namespace/__init__.py +++ b/rdflib/namespace/__init__.py @@ -74,6 +74,22 @@ import logging import warnings + +try: + # Python >= 3.14 + from annotationlib import ( + get_annotations, # type: ignore[attr-defined,unused-ignore] + ) +except ImportError: # pragma: no cover + try: + # Python >= 3.10 + from inspect import get_annotations # type: ignore[attr-defined,unused-ignore] + except ImportError: + + def get_annotations(thing: Any) -> dict: + return thing.__annotations__ + + from functools import lru_cache from pathlib import Path from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Set, Tuple, Union @@ -310,7 +326,7 @@ def __contains__(cls, item: str) -> bool: if item_str.startswith(str(this_ns)): item_str = item_str[len(str(this_ns)) :] return any( - item_str in c.__annotations__ + item_str in get_annotations(c) or item_str in c._extras or (cls._underscore_num and item_str[0] == "_" and item_str[1:].isdigit()) for c in cls.mro() @@ -318,7 +334,7 @@ def __contains__(cls, item: str) -> bool: ) def __dir__(cls) -> Iterable[str]: - attrs = {str(x) for x in cls.__annotations__} + attrs = {str(x) for x in get_annotations(cls)} # Removing these as they should not be considered part of the namespace. 
attrs.difference_update(_DFNS_RESERVED_ATTRS) values = {cls[str(x)] for x in attrs} @@ -327,7 +343,7 @@ def __dir__(cls) -> Iterable[str]: def as_jsonld_context(self, pfx: str) -> dict: # noqa: N804 """Returns this DefinedNamespace as a JSON-LD 'context' object""" terms = {pfx: str(self._NS)} - for key, term in self.__annotations__.items(): + for key, term in get_annotations(self).items(): if issubclass(term, URIRef): terms[key] = f"{pfx}:{key}" diff --git a/test_reports/rdflib_w3c_sparql10-HEAD.ttl b/test_reports/rdflib_w3c_sparql10-HEAD.ttl index b8369a94d3..f3ac4255d6 100644 --- a/test_reports/rdflib_w3c_sparql10-HEAD.ttl +++ b/test_reports/rdflib_w3c_sparql10-HEAD.ttl @@ -1603,7 +1603,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . diff --git a/test_reports/rdflib_w3c_sparql11-HEAD.ttl b/test_reports/rdflib_w3c_sparql11-HEAD.ttl index 6140fa9143..6d498df8b6 100644 --- a/test_reports/rdflib_w3c_sparql11-HEAD.ttl +++ b/test_reports/rdflib_w3c_sparql11-HEAD.ttl @@ -691,7 +691,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -699,7 +699,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -707,7 +707,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -715,7 +715,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . 
@@ -723,7 +723,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -731,7 +731,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -1939,7 +1939,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -1947,7 +1947,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -2251,7 +2251,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . @@ -2259,7 +2259,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:passed ] ; + earl:outcome earl:failed ] ; earl:subject ; earl:test . From 9af9a40f93dd0a275a07500f0a0199cd0c3742c0 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:26:19 +1000 Subject: [PATCH 13/60] build(deps): bump dorny/test-reporter from 1 to 2 (#3081) Bumps [dorny/test-reporter](https://github.com/dorny/test-reporter) from 1 to 2. - [Release notes](https://github.com/dorny/test-reporter/releases) - [Changelog](https://github.com/dorny/test-reporter/blob/main/CHANGELOG.md) - [Commits](https://github.com/dorny/test-reporter/compare/v1...v2) --- updated-dependencies: - dependency-name: dorny/test-reporter dependency-type: direct:production update-type: version-update:semver-major ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- .github/workflows/test-report.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/test-report.yml b/.github/workflows/test-report.yml index ad6a82883d..386183ad12 100644 --- a/.github/workflows/test-report.yml +++ b/.github/workflows/test-report.yml @@ -11,14 +11,14 @@ jobs: checks: write statuses: write steps: - - uses: dorny/test-reporter@v1 + - uses: dorny/test-reporter@v2 with: artifact: /(.*)-mypy-junit-xml$/ name: mypy report path: "*.xml" reporter: java-junit fail-on-error: "false" - - uses: dorny/test-reporter@v1 + - uses: dorny/test-reporter@v2 with: artifact: /(.*)-pytest-junit-xml$/ name: pytest report From c1fc39e7ba2868bc1f1419071301fee4558d97be Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:26:50 +1000 Subject: [PATCH 14/60] build(deps-dev): bump pytest from 8.3.4 to 8.3.5 (#3079) Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.4 to 8.3.5. - [Release notes](https://github.com/pytest-dev/pytest/releases) - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst) - [Commits](https://github.com/pytest-dev/pytest/compare/8.3.4...8.3.5) --- updated-dependencies: - dependency-name: pytest dependency-type: direct:development update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- poetry.lock | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/poetry.lock b/poetry.lock index fc6fb2ea2a..0e30d858f3 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1049,13 +1049,13 @@ files = [ [[package]] name = "pytest" -version = "8.3.4" +version = "8.3.5" description = "pytest: simple powerful testing with Python" optional = false python-versions = ">=3.8" files = [ - {file = "pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6"}, - {file = "pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761"}, + {file = "pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820"}, + {file = "pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845"}, ] [package.dependencies] From e4d28f1c7aed5f6948058fa9eeadfddc21f515b7 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:27:08 +1000 Subject: [PATCH 15/60] build(deps): bump library/python in /docker/unstable (#3067) Bumps library/python from 3.12.7-slim to 3.13.2-slim. --- updated-dependencies: - dependency-name: library/python dependency-type: direct:production update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car --- docker/unstable/Dockerfile | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docker/unstable/Dockerfile b/docker/unstable/Dockerfile index 85564a955d..4a30091309 100644 --- a/docker/unstable/Dockerfile +++ b/docker/unstable/Dockerfile @@ -1,4 +1,4 @@ -FROM docker.io/library/python:3.12.7-slim@sha256:af4e85f1cac90dd3771e47292ea7c8a9830abfabbe4faa5c53f158854c2e819d +FROM docker.io/library/python:3.13.2-slim@sha256:ae9f9ac89467077ed1efefb6d9042132d28134ba201b2820227d46c9effd3174 # This file is generated from docker:unstable in Taskfile.yml COPY var/requirements.txt /var/tmp/build/ From 69b7d6c6a840df1586ffdc97548757ad8cd35019 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 13:27:19 +1000 Subject: [PATCH 16/60] build(deps): bump library/python in /docker/latest (#3066) Bumps library/python from 3.12.7-slim to 3.13.2-slim. --- updated-dependencies: - dependency-name: library/python dependency-type: direct:production update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car --- docker/latest/Dockerfile | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docker/latest/Dockerfile b/docker/latest/Dockerfile index fbaa974807..24a516c775 100644 --- a/docker/latest/Dockerfile +++ b/docker/latest/Dockerfile @@ -1,4 +1,4 @@ -FROM docker.io/library/python:3.12.7-slim@sha256:af4e85f1cac90dd3771e47292ea7c8a9830abfabbe4faa5c53f158854c2e819d +FROM docker.io/library/python:3.13.2-slim@sha256:ae9f9ac89467077ed1efefb6d9042132d28134ba201b2820227d46c9effd3174 COPY docker/latest/requirements.txt /var/tmp/build/ From b74c6574fd982b410aed1aa43853eed37504bf15 Mon Sep 17 00:00:00 2001 From: Alexandre Detiste Date: Sat, 22 Mar 2025 04:29:45 +0100 Subject: [PATCH 17/60] remove old hacks against 2to3 (#3076) This reverts e2fb491a3da80f9e01f3303b3df24881ab41eefa Co-authored-by: Nicholas Car --- rdflib/plugins/parsers/rdfxml.py | 9 +++------ rdflib/plugins/stores/berkeleydb.py | 15 +++++---------- 2 files changed, 8 insertions(+), 16 deletions(-) diff --git a/rdflib/plugins/parsers/rdfxml.py b/rdflib/plugins/parsers/rdfxml.py index 54fc69567b..e0f6e05fa7 100644 --- a/rdflib/plugins/parsers/rdfxml.py +++ b/rdflib/plugins/parsers/rdfxml.py @@ -298,8 +298,7 @@ def document_element_start( self, name: Tuple[str, str], qname, attrs: AttributesImpl ) -> None: if name[0] and URIRef("".join(name)) == RDFVOC.RDF: - # Cheap hack so 2to3 doesn't turn it into __next__ - next = getattr(self, "next") + next = self.next next.start = self.node_element_start next.end = self.node_element_end else: @@ -316,8 +315,7 @@ def node_element_start( current = self.current absolutize = self.absolutize - # Cheap hack so 2to3 doesn't turn it into __next__ - next = getattr(self, "next") + next = self.next next.start = self.property_element_start next.end = self.property_element_end @@ -410,8 +408,7 @@ def property_element_start( current = 
self.current absolutize = self.absolutize - # Cheap hack so 2to3 doesn't turn it into __next__ - next = getattr(self, "next") + next = self.next object: Optional[_ObjectType] = None current.data = None current.list = None diff --git a/rdflib/plugins/stores/berkeleydb.py b/rdflib/plugins/stores/berkeleydb.py index 12009787cd..872dc368ef 100644 --- a/rdflib/plugins/stores/berkeleydb.py +++ b/rdflib/plugins/stores/berkeleydb.py @@ -428,8 +428,7 @@ def remove( # type: ignore[override] cursor = index.cursor(txn=txn) try: cursor.set_range(key) - # Hack to stop 2to3 converting this to next(cursor) - current = getattr(cursor, "next")() + current = cursor.next() except db.DBNotFoundError: current = None cursor.close() @@ -506,8 +505,7 @@ def triples( cursor = index.cursor(txn=txn) try: cursor.set_range(key) - # Cheap hack so 2to3 doesn't convert to next(cursor) - current = getattr(cursor, "next")() + current = cursor.next() except db.DBNotFoundError: current = None cursor.close() @@ -539,8 +537,7 @@ def __len__(self, context: Optional[_ContextType] = None) -> int: key, value = current if key.startswith(prefix): count += 1 - # Hack to stop 2to3 converting this to next(cursor) - current = getattr(cursor, "next")() + current = cursor.next() else: break cursor.close() @@ -593,8 +590,7 @@ def namespaces(self) -> Generator[Tuple[str, URIRef], None, None]: while current: prefix, namespace = current results.append((prefix.decode("utf-8"), namespace.decode("utf-8"))) - # Hack to stop 2to3 converting this to next(cursor) - current = getattr(cursor, "next")() + current = cursor.next() cursor.close() for prefix, namespace in results: yield prefix, URIRef(namespace) @@ -637,8 +633,7 @@ def contexts( cursor = index.cursor() try: cursor.set_range(key) - # Hack to stop 2to3 converting this to next(cursor) - current = getattr(cursor, "next")() + current = cursor.next() except db.DBNotFoundError: current = None cursor.close() From c5600288ec6ae1b30442aa72677aca5baca9cc10 Mon Sep 17 00:00:00 2001 From:
Yaroslav Halchenko Date: Sat, 22 Mar 2025 01:26:00 -0400 Subject: [PATCH 18/60] Downgrade log message about plugin without override argument to debug from warning (#3063) The original issue - https://github.com/RDFLib/rdflib/issues/1880 - was worked around by providing a fallback call without "override" in - https://github.com/RDFLib/rdflib/pull/2018 As a result, any user with such a store (in our case just ) is flooded with warnings pointing to issue #1880 and, for all intents and purposes, AFAIK just needs to ignore them, since there is nothing the user can do to address it. That raises the question of why this is a warning at the user level and not a debug message for developers/troubleshooting etc. Hence I am lowering it to debug level to bring peace of mind to users of the library. Co-authored-by: Nicholas Car --- rdflib/namespace/__init__.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/rdflib/namespace/__init__.py b/rdflib/namespace/__init__.py index d37d6f0cc3..cd2946ad55 100644 --- a/rdflib/namespace/__init__.py +++ b/rdflib/namespace/__init__.py @@ -734,7 +734,7 @@ def _store_bind(self, prefix: str, namespace: URIRef, override: bool) -> None: return self.store.bind(prefix, namespace, override=override) except TypeError as error: if "override" in str(error): - logger.warning( + logger.debug( "caught a TypeError, " "retrying call to %s.bind without override, " "see https://github.com/RDFLib/rdflib/issues/1880 for more info", From 7d1f4c7359b26ee9f9fc7e32cd549df0e9888452 Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Sat, 22 Mar 2025 22:46:16 +1000 Subject: [PATCH 19/60] [pre-commit.ci] pre-commit autoupdate (#3056) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit updates: - [github.com/astral-sh/ruff-pre-commit: v0.5.4 → v0.11.0](https://github.com/astral-sh/ruff-pre-commit/compare/v0.5.4...v0.11.0) -
[github.com/psf/black-pre-commit-mirror: 24.4.2 → 25.1.0](https://github.com/psf/black-pre-commit-mirror/compare/24.4.2...25.1.0) - [github.com/python-poetry/poetry: 1.8.3 → 2.1.1](https://github.com/python-poetry/poetry/compare/1.8.3...2.1.1) Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Nicholas Car --- .pre-commit-config.yaml | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 8a74122ccf..a558a54112 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -8,20 +8,20 @@ ci: repos: - repo: https://github.com/astral-sh/ruff-pre-commit # WARNING: Ruff version should be the same as in `pyproject.toml` - rev: v0.5.4 + rev: v0.11.0 hooks: - id: ruff args: ["--fix"] - repo: https://github.com/psf/black-pre-commit-mirror # WARNING: Black version should be the same as in `pyproject.toml` - rev: "24.4.2" + rev: "25.1.0" hooks: - id: black pass_filenames: false require_serial: true args: ["."] - repo: https://github.com/python-poetry/poetry - rev: 1.8.3 + rev: 2.1.1 hooks: - id: poetry-check - id: poetry-lock From 8bbb30d04315e74e3e31e51cc886c6355a49637e Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Sun, 23 Mar 2025 22:43:03 +1000 Subject: [PATCH 20/60] Reduce warnings (#3085) * build(deps-dev): bump ruff from 0.9.2 to 0.9.6 Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.2 to 0.9.6. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.2...0.9.6) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] * add readthedocs sphinx.configuration * replace ConjunctiveGraph() with Dataset() in tests * tidy some notation * align black version * fix black & ruff * poetry --check -> poetry-check --lock * more CG -> Datasets * ruff * CG -> Dataset --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- .pre-commit-config.yaml | 5 +- rdflib/plugins/parsers/notation3.py | 4 +- test/test_graph/test_diff.py | 4 +- test/test_graph/test_graph_context.py | 76 +++++++++---------- test/test_graph/test_graph_formula.py | 6 +- test/test_graph/test_namespace_rebinding.py | 5 +- test/test_n3.py | 26 +++---- test/test_parsers/test_empty_xml_base.py | 6 +- test/test_parsers/test_nquads.py | 59 ++++++++------ test/test_parsers/test_parser_hext.py | 10 +-- test/test_parsers/test_trix_parse.py | 10 +-- test/test_serializers/test_serializer_hext.py | 4 +- test/test_serializers/test_serializer_trix.py | 12 +-- test/test_serializers/test_serializer_xml.py | 18 ++--- test/test_sparql/test_initbindings.py | 8 +- test/test_sparql/test_sparql.py | 10 +-- test/test_tools/test_chunk_serializer.py | 2 +- test/test_trig.py | 24 +++--- test/test_turtle_quoting.py | 4 +- test/test_util.py | 4 +- test_reports/rdflib_w3c_sparql10-HEAD.ttl | 2 +- test_reports/rdflib_w3c_sparql11-HEAD.ttl | 20 ++--- 22 files changed, 159 insertions(+), 160 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index a558a54112..098305df07 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -14,7 +14,7 @@ repos: args: ["--fix"] - repo: https://github.com/psf/black-pre-commit-mirror # WARNING: Black version should be the same as in `pyproject.toml` - rev: "25.1.0" + rev: "24.4.2" hooks: - id: black pass_filenames: false @@ -24,6 +24,5 @@ repos: rev: 2.1.1 hooks: - id: poetry-check - - id: poetry-lock # sadly `--no-update` does not work on pre-commit.ci - args: ["--check"] + args:
["--lock"] diff --git a/rdflib/plugins/parsers/notation3.py b/rdflib/plugins/parsers/notation3.py index da71405e04..4df892ee45 100755 --- a/rdflib/plugins/parsers/notation3.py +++ b/rdflib/plugins/parsers/notation3.py @@ -57,7 +57,7 @@ from rdflib.compat import long_type from rdflib.exceptions import ParserError -from rdflib.graph import ConjunctiveGraph, Graph, QuotedGraph +from rdflib.graph import Dataset, Graph, QuotedGraph from rdflib.term import ( _XSD_PFX, BNode, @@ -2047,7 +2047,7 @@ def parse( # type: ignore[override] elif not fa: raise ParserError("Cannot parse N3 into non-formula-aware store.") - conj_graph = ConjunctiveGraph(store=graph.store) + conj_graph = Dataset(store=graph.store) conj_graph.default_context = graph # TODO: CG __init__ should have a # default_context arg # TODO: update N3Processor so that it can use conj_graph as the sink diff --git a/test/test_graph/test_diff.py b/test/test_graph/test_diff.py index 696d8be898..a1d4f948e1 100644 --- a/test/test_graph/test_diff.py +++ b/test/test_graph/test_diff.py @@ -9,7 +9,7 @@ import rdflib from rdflib import Graph from rdflib.compare import graph_diff -from rdflib.graph import ConjunctiveGraph, Dataset +from rdflib.graph import Dataset from rdflib.namespace import FOAF, RDF from rdflib.term import BNode, Literal from test.utils import ( @@ -111,7 +111,7 @@ def as_element_set(self, value: _ElementSetTypeOrStr) -> _ElementSetType: if isinstance(value, str): graph = self.graph_type() graph.parse(data=value, format=self.format) - if isinstance(graph, ConjunctiveGraph): + if isinstance(graph, Dataset): return GraphHelper.quad_set(graph, BNodeHandling.COLLAPSE) else: return GraphHelper.triple_set(graph, BNodeHandling.COLLAPSE) diff --git a/test/test_graph/test_graph_context.py b/test/test_graph/test_graph_context.py index 7d0a90f7cd..8c1a8a4579 100644 --- a/test/test_graph/test_graph_context.py +++ b/test/test_graph/test_graph_context.py @@ -291,71 +291,63 @@ def test_triples(self): for c in [graph, 
self.graph.get_context(c1)]: # unbound subjects - asserte(set(c.subjects(likes, pizza)), set((michel, tarek))) - asserte(set(c.subjects(hates, pizza)), set((bob,))) - asserte(set(c.subjects(likes, cheese)), set([tarek, bob, michel])) + asserte(set(c.subjects(likes, pizza)), {michel, tarek}) + asserte(set(c.subjects(hates, pizza)), {bob}) + asserte(set(c.subjects(likes, cheese)), {tarek, bob, michel}) asserte(set(c.subjects(hates, cheese)), set()) # unbound objects - asserte(set(c.objects(michel, likes)), set([cheese, pizza])) - asserte(set(c.objects(tarek, likes)), set([cheese, pizza])) - asserte(set(c.objects(bob, hates)), set([michel, pizza])) - asserte(set(c.objects(bob, likes)), set([cheese])) + asserte(set(c.objects(michel, likes)), {cheese, pizza}) + asserte(set(c.objects(tarek, likes)), {cheese, pizza}) + asserte(set(c.objects(bob, hates)), {michel, pizza}) + asserte(set(c.objects(bob, likes)), {cheese}) # unbound predicates - asserte(set(c.predicates(michel, cheese)), set([likes])) - asserte(set(c.predicates(tarek, cheese)), set([likes])) - asserte(set(c.predicates(bob, pizza)), set([hates])) - asserte(set(c.predicates(bob, michel)), set([hates])) + asserte(set(c.predicates(michel, cheese)), {likes}) + asserte(set(c.predicates(tarek, cheese)), {likes}) + asserte(set(c.predicates(bob, pizza)), {hates}) + asserte(set(c.predicates(bob, michel)), {hates}) - asserte(set(c.subject_objects(hates)), set([(bob, pizza), (bob, michel)])) + asserte(set(c.subject_objects(hates)), {(bob, pizza), (bob, michel)}) asserte( set(c.subject_objects(likes)), - set( - [ - (tarek, cheese), - (michel, cheese), - (michel, pizza), - (bob, cheese), - (tarek, pizza), - ] - ), + { + (tarek, cheese), + (michel, cheese), + (michel, pizza), + (bob, cheese), + (tarek, pizza), + }, ) - asserte( - set(c.predicate_objects(michel)), set([(likes, cheese), (likes, pizza)]) - ) + asserte(set(c.predicate_objects(michel)), {(likes, cheese), (likes, pizza)}) asserte( set(c.predicate_objects(bob)), - 
set([(likes, cheese), (hates, pizza), (hates, michel)]), - ) - asserte( - set(c.predicate_objects(tarek)), set([(likes, cheese), (likes, pizza)]) + {(likes, cheese), (hates, pizza), (hates, michel)}, ) + asserte(set(c.predicate_objects(tarek)), {(likes, cheese), (likes, pizza)}) asserte( set(c.subject_predicates(pizza)), - set([(bob, hates), (tarek, likes), (michel, likes)]), + {(bob, hates), (tarek, likes), (michel, likes)}, ) asserte( set(c.subject_predicates(cheese)), - set([(bob, likes), (tarek, likes), (michel, likes)]), + {(bob, likes), (tarek, likes), (michel, likes)}, ) - asserte(set(c.subject_predicates(michel)), set([(bob, hates)])) + asserte(set(c.subject_predicates(michel)), {(bob, hates)}) asserte( set(c), - set( - [ - (bob, hates, michel), - (bob, likes, cheese), - (tarek, likes, pizza), - (michel, likes, pizza), - (michel, likes, cheese), - (bob, hates, pizza), - (tarek, likes, cheese), - ] - ), + { + (bob, hates, michel), + (bob, likes, cheese), + (tarek, likes, pizza), + (michel, likes, pizza), + (michel, likes, cheese), + (bob, hates, pizza), + (tarek, likes, cheese), + }, ) # remove stuff and make sure the graph is empty again diff --git a/test/test_graph/test_graph_formula.py b/test/test_graph/test_graph_formula.py index 0f77dc2941..eebc8385a9 100644 --- a/test/test_graph/test_graph_formula.py +++ b/test/test_graph/test_graph_formula.py @@ -4,7 +4,7 @@ import pytest from rdflib import RDF, RDFS, BNode, URIRef, Variable, plugin -from rdflib.graph import ConjunctiveGraph, QuotedGraph +from rdflib.graph import Dataset, QuotedGraph implies = URIRef("http://www.w3.org/2000/10/swap/log#implies") testN3 = """ @@ -21,7 +21,7 @@ def checkFormulaStore(store="default", configString=None): # noqa: N802, N803 try: - g = ConjunctiveGraph(store=store) + g = Dataset(store=store) except ImportError: pytest.skip("Dependencies for store '%s' not available!" 
% store) @@ -48,7 +48,7 @@ def checkFormulaStore(store="default", configString=None): # noqa: N802, N803 d = URIRef("http://test/d") v = Variable("y") - universe = ConjunctiveGraph(g.store) + universe = Dataset(g.store) # test formula as terms assert len(list(universe.triples((formulaA, implies, formulaB)))) == 1 diff --git a/test/test_graph/test_namespace_rebinding.py b/test/test_graph/test_namespace_rebinding.py index babac1b4ff..253fbedc22 100644 --- a/test/test_graph/test_namespace_rebinding.py +++ b/test/test_graph/test_namespace_rebinding.py @@ -1,6 +1,6 @@ import pytest -from rdflib import ConjunctiveGraph, Graph, Literal +from rdflib import Dataset, Graph, Literal from rdflib.namespace import OWL, Namespace, NamespaceManager from rdflib.plugins.stores.memory import Memory from rdflib.term import URIRef @@ -292,12 +292,11 @@ def test_multigraph_bindings(): assert list(g1.namespaces()) == [("friend-of-a-friend", foaf1_uri)] # Including newly-created objects that use the store - cg = ConjunctiveGraph(store=store) + cg = Dataset(store=store, default_union=True) cg.namespace_manager = NamespaceManager(cg, bind_namespaces="core") assert ("foaf", foaf1_uri) not in list(cg.namespaces()) assert ("friend-of-a-friend", foaf1_uri) in list(cg.namespaces()) - assert len(list(g1.namespaces())) == 6 assert len(list(g2.namespaces())) == 6 assert len(list(cg.namespaces())) == 6 diff --git a/test/test_n3.py b/test/test_n3.py index f3d7eeb07f..d2c362a39a 100644 --- a/test/test_n3.py +++ b/test/test_n3.py @@ -4,7 +4,7 @@ import pytest -from rdflib.graph import ConjunctiveGraph, Graph +from rdflib.graph import Dataset, Graph from rdflib.plugins.parsers.notation3 import BadSyntax, exponent_syntax from rdflib.term import Literal, URIRef from test import TEST_DIR @@ -44,7 +44,7 @@ n3:predicate :p; n3:object :y ] a log:Truth}. -# Needs more thought ... ideally, we have the implcit AND rules of +# Needs more thought ... 
ideally, we have the implicit AND rules of # juxtaposition (introduction and elimination) { @@ -125,9 +125,9 @@ def test_base_serialize(self): ) s = g.serialize(base="http://example.com/", format="n3", encoding="latin-1") assert b"" in s - g2 = ConjunctiveGraph() - g2.parse(data=s, publicID="http://example.com/", format="n3") - assert list(g) == list(g2) + g2 = Dataset() + g2.parse(data=s, format="n3") + assert list(g) == list(g2.triples((None, None, None))) def test_issue23(self): input = """ "this word is in \\u201Cquotes\\u201D".""" @@ -195,24 +195,24 @@ def test_dot_in_prefix(self): ) def test_model(self): - g = ConjunctiveGraph() + g = Dataset() g.parse(data=test_data, format="n3") i = 0 - for s, p, o in g: + for s, p, o, c in g: if isinstance(s, Graph): i += 1 assert i == 3 - assert len(list(g.contexts())) == 13 + assert len(list(g.graphs())) == 13 g.close() def test_quoted_serialization(self): - g = ConjunctiveGraph() + g = Dataset() g.parse(data=test_data, format="n3") g.serialize(format="n3") def test_parse(self): - g = ConjunctiveGraph() + g = Dataset() try: g.parse( "http://groups.csail.mit.edu/dig/2005/09/rein/examples/troop42-policy.n3", @@ -229,14 +229,14 @@ def test_single_quoted_literals(self): for data in test_data: # N3 doesn't accept single quotes around string literals - g = ConjunctiveGraph() + g = Dataset() with pytest.raises(BadSyntax): g.parse(data=data, format="n3") - g = ConjunctiveGraph() + g = Dataset() g.parse(data=data, format="turtle") assert len(g) == 1 - for _, _, o in g: + for _, _, o, c in g: assert o == Literal("o") def test_empty_prefix(self): diff --git a/test/test_parsers/test_empty_xml_base.py b/test/test_parsers/test_empty_xml_base.py index 0f3f186942..0225c4c639 100644 --- a/test/test_parsers/test_empty_xml_base.py +++ b/test/test_parsers/test_empty_xml_base.py @@ -5,7 +5,7 @@ and RDF/XML dependence on it """ -from rdflib.graph import ConjunctiveGraph +from rdflib.graph import Dataset from rdflib.namespace import FOAF, RDF 
from rdflib.term import URIRef @@ -36,7 +36,7 @@ class TestEmptyBase: def test_empty_base_ref(self): - self.graph = ConjunctiveGraph() + self.graph = Dataset() self.graph.parse(data=test_data, publicID=baseUri, format="xml") assert ( len(list(self.graph)) > 0 @@ -50,7 +50,7 @@ def test_empty_base_ref(self): class TestRelativeBase: def test_relative_base_ref(self): - self.graph = ConjunctiveGraph() + self.graph = Dataset() self.graph.parse(data=test_data2, publicID=baseUri2, format="xml") assert ( len(self.graph) > 0 diff --git a/test/test_parsers/test_nquads.py b/test/test_parsers/test_nquads.py index ad17b5aeec..ee14856ab6 100644 --- a/test/test_parsers/test_nquads.py +++ b/test/test_parsers/test_nquads.py @@ -1,6 +1,7 @@ import os -from rdflib import ConjunctiveGraph, Namespace, URIRef +from rdflib import Dataset, Literal, URIRef +from rdflib.namespace import FOAF from test.data import TEST_DATA_DIR TEST_BASE = os.path.join(TEST_DATA_DIR, "nquads.rdflib") @@ -8,7 +9,7 @@ class TestNQuadsParser: def _load_example(self): - g = ConjunctiveGraph() + g = Dataset() nq_path = os.path.relpath( os.path.join(TEST_DATA_DIR, "nquads.rdflib/example.nquads"), os.curdir ) @@ -21,9 +22,9 @@ def test_01_simple_open(self): assert len(g.store) == 449 def test_02_contexts(self): - # There should be 16 separate contexts + # There should be 17 separate contexts - 16 Named + default g = self._load_example() - assert len([x for x in g.store.contexts()]) == 16 + assert len([x for x in g.store.contexts()]) == 17 def test_03_get_value(self): # is the name of entity E10009 "Arco Publications"? 
@@ -36,11 +37,11 @@ def test_03_get_value(self): g = self._load_example() s = URIRef("http://bibliographica.org/entity/E10009") - FOAF = Namespace("http://xmlns.com/foaf/0.1/") # noqa: N806 - assert g.value(s, FOAF.name).eq("Arco Publications") + for s, p, o, c in list(g.quads((s, FOAF.name, None, None))): + assert o == Literal("Arco Publications") def test_context_is_optional(self): - g = ConjunctiveGraph() + g = Dataset() nq_path = os.path.relpath( os.path.join(TEST_DATA_DIR, "nquads.rdflib/test6.nq"), os.curdir ) @@ -49,7 +50,7 @@ def test_context_is_optional(self): assert len(g) > 0 def test_serialize(self): - g = ConjunctiveGraph() + g = Dataset() uri1 = URIRef("http://example.org/mygraph1") uri2 = URIRef("http://example.org/mygraph2") @@ -63,7 +64,7 @@ def test_serialize(self): s = g.serialize(format="nquads", encoding="utf-8") assert len([x for x in s.split(b"\n") if x.strip()]) == 2 - g2 = ConjunctiveGraph() + g2 = Dataset() g2.parse(data=s, format="nquads") assert len(g) == len(g2) @@ -89,8 +90,8 @@ def teardown_method(self, method): def test_parse_shared_bnode_context(self): bnode_ctx = dict() - g = ConjunctiveGraph() - h = ConjunctiveGraph() + g = Dataset() + h = Dataset() g.parse(self.data, format="nquads", bnode_context=bnode_ctx) self.data.seek(0) h.parse(self.data, format="nquads", bnode_context=bnode_ctx) @@ -98,7 +99,7 @@ def test_parse_shared_bnode_context(self): def test_parse_shared_bnode_context_same_graph(self): bnode_ctx = dict() - g = ConjunctiveGraph() + g = Dataset() g.parse(self.data_obnodes, format="nquads", bnode_context=bnode_ctx) o1 = set(g.objects()) self.data_obnodes.seek(0) @@ -107,27 +108,27 @@ def test_parse_shared_bnode_context_same_graph(self): assert o1 == o2 def test_parse_distinct_bnode_context(self): - g = ConjunctiveGraph() + g = Dataset() g.parse(self.data, format="nquads", bnode_context=dict()) - s1 = set(g.subjects()) + s1 = set([x for x, p, o, c in list(g.quads((None, None, None, None)))]) self.data.seek(0) 
g.parse(self.data, format="nquads", bnode_context=dict()) - s2 = set(g.subjects()) + s2 = set([x for x, p, o, c in list(g.quads((None, None, None, None)))]) assert set() != (s2 - s1) def test_parse_distinct_bnode_contexts_between_graphs(self): - g = ConjunctiveGraph() - h = ConjunctiveGraph() + g = Dataset() + h = Dataset() g.parse(self.data, format="nquads") - s1 = set(g.subjects()) + s1 = sorted(set([x for x, p, o, c in list(g.quads((None, None, None, None)))])) self.data.seek(0) h.parse(self.data, format="nquads") - s2 = set(h.subjects()) + s2 = sorted(set([x for x, p, o, c in list(h.quads((None, None, None, None)))])) assert s1 != s2 def test_parse_distinct_bnode_contexts_named_graphs(self): - g = ConjunctiveGraph() - h = ConjunctiveGraph() + g = Dataset() + h = Dataset() g.parse(self.data, format="nquads") self.data.seek(0) h.parse(self.data, format="nquads") @@ -135,9 +136,17 @@ def test_parse_distinct_bnode_contexts_named_graphs(self): def test_parse_shared_bnode_contexts_named_graphs(self): bnode_ctx = dict() - g = ConjunctiveGraph() - h = ConjunctiveGraph() - g.parse(self.data, format="nquads", bnode_context=bnode_ctx) + g = Dataset() + h = Dataset() + g.parse( + TEST_DATA_DIR / "nquads.rdflib/bnode_context.nquads", + format="nquads", + bnode_context=bnode_ctx, + ) self.data.seek(0) - h.parse(self.data, format="nquads", bnode_context=bnode_ctx) + h.parse( + TEST_DATA_DIR / "nquads.rdflib/bnode_context.nquads", + format="nquads", + bnode_context=bnode_ctx, + ) assert set(h.contexts()) == set(g.contexts()) diff --git a/test/test_parsers/test_parser_hext.py b/test/test_parsers/test_parser_hext.py index c71bd1a49a..17d19fa26a 100644 --- a/test/test_parsers/test_parser_hext.py +++ b/test/test_parsers/test_parser_hext.py @@ -1,6 +1,6 @@ from pathlib import Path -from rdflib import BNode, ConjunctiveGraph, Dataset, Literal, URIRef +from rdflib import BNode, Dataset, Literal, URIRef from rdflib.compare import isomorphic from rdflib.graph import 
DATASET_DEFAULT_GRAPH_ID from rdflib.namespace import XSD @@ -97,7 +97,7 @@ def test_small_string_cg(): ["http://example.com/s01", "http://example.com/op1", "http://example.com/o2", "globalId", "", ""] ["http://example.com/s01", "http://example.com/op2", "http://example.com/o3", "globalId", "", ""] """ - d = ConjunctiveGraph(identifier=DATASET_DEFAULT_GRAPH_ID) + d = Dataset() d.parse(data=s, format="hext") expected_graph_names = ( @@ -140,7 +140,7 @@ def test_small_file_multigraph(): def test_small_file_multigraph_cg(): - d = ConjunctiveGraph() + d = Dataset() assert len(d) == 0 d.parse( Path(__file__).parent.parent / "data/test_parser_hext_multigraph.ndjson", @@ -185,14 +185,14 @@ def test_roundtrip(): print(f"Test {tests}: {f}") if f.name not in files_to_skip.keys(): try: - cg = ConjunctiveGraph().parse(f, format="nt") + cg = Dataset().parse(f, format="nt") # print(cg.serialize(format="n3")) except Exception: print("Skipping: could not NT parse") skipped += 1 skip = True if not skip: - cg2 = ConjunctiveGraph() + cg2 = Dataset() cg2.parse( data=cg.serialize(format="hext"), format="hext", diff --git a/test/test_parsers/test_trix_parse.py b/test/test_parsers/test_trix_parse.py index e6f2ae91be..e48cf9b079 100644 --- a/test/test_parsers/test_trix_parse.py +++ b/test/test_parsers/test_trix_parse.py @@ -1,6 +1,6 @@ import os -from rdflib.graph import ConjunctiveGraph +from rdflib.graph import Dataset from test.data import TEST_DATA_DIR @@ -12,7 +12,7 @@ def teardown_method(self): pass def testAperture(self): # noqa: N802 - g = ConjunctiveGraph() + g = Dataset() trix_path = os.path.relpath( os.path.join(TEST_DATA_DIR, "suites", "trix/trix-aperture.trix"), os.curdir @@ -24,12 +24,12 @@ def testAperture(self): # noqa: N802 t = sum(map(len, g.contexts())) assert t == 24 - assert len(c) == 4 + assert len(c) == 5 # print "Parsed %d triples"%t def testSpec(self): # noqa: N802 - g = ConjunctiveGraph() + g = Dataset() trix_path = os.path.relpath( os.path.join(TEST_DATA_DIR, 
"suites", "trix/trix-nokia-example.trix"), @@ -40,7 +40,7 @@ def testSpec(self): # noqa: N802 # print "Parsed %d triples"%len(g) def testNG4j(self): # noqa: N802 - g = ConjunctiveGraph() + g = Dataset() trix_path = os.path.relpath( os.path.join(TEST_DATA_DIR, "suites", "trix/trix-ng4j-test-01.trix"), diff --git a/test/test_serializers/test_serializer_hext.py b/test/test_serializers/test_serializer_hext.py index 2b0577bc1b..de39e37dca 100644 --- a/test/test_serializers/test_serializer_hext.py +++ b/test/test_serializers/test_serializer_hext.py @@ -1,7 +1,7 @@ import json from pathlib import Path -from rdflib import ConjunctiveGraph, Dataset, Graph +from rdflib import Dataset, Graph def test_hext_graph(): @@ -90,7 +90,7 @@ def test_hext_graph(): def test_hext_cg(): """Tests ConjunctiveGraph data""" - d = ConjunctiveGraph() + d = Dataset() trig_data = """ PREFIX ex: PREFIX owl: diff --git a/test/test_serializers/test_serializer_trix.py b/test/test_serializers/test_serializer_trix.py index bdfc91c817..37bc5912f8 100644 --- a/test/test_serializers/test_serializer_trix.py +++ b/test/test_serializers/test_serializer_trix.py @@ -1,6 +1,6 @@ from io import BytesIO -from rdflib.graph import ConjunctiveGraph, Graph +from rdflib.graph import Dataset, Graph from rdflib.term import Literal, URIRef @@ -19,7 +19,7 @@ def test_serialize(): g2 = Graph(identifier=s2) g2.add((r2, label, Literal("label 3"))) - g = ConjunctiveGraph() + g = Dataset() for s, p, o in g1.triples((None, None, None)): g.addN([(s, p, o, g1)]) for s, p, o in g2.triples((None, None, None)): @@ -28,14 +28,14 @@ def test_serialize(): g.add((r3, label, Literal(4))) r = g.serialize(format="trix", encoding="utf-8") - g3 = ConjunctiveGraph() + g3 = Dataset() g3.parse(BytesIO(r), format="trix") for q in g3.quads((None, None, None)): # TODO: Fix once getGraph/getContext is in conjunctive graph - if isinstance(q[3].identifier, URIRef): - tg = Graph(store=g.store, identifier=q[3].identifier) + if isinstance(q[3], URIRef): 
+ tg = Graph(store=g.store, identifier=q[3]) else: # BNode, this is a bit ugly # we cannot match the bnode to the right graph automagically @@ -74,7 +74,7 @@ def test_issue_250(): """ - graph = ConjunctiveGraph() + graph = Dataset() graph.bind(None, "http://defaultnamespace") sg = graph.serialize(format="trix") assert 'xmlns="http://defaultnamespace"' not in sg, sg diff --git a/test/test_serializers/test_serializer_xml.py b/test/test_serializers/test_serializer_xml.py index eda0b3d43a..535b24c853 100644 --- a/test/test_serializers/test_serializer_xml.py +++ b/test/test_serializers/test_serializer_xml.py @@ -1,6 +1,6 @@ from io import BytesIO -from rdflib.graph import ConjunctiveGraph +from rdflib.graph import Dataset from rdflib.namespace import RDFS from rdflib.plugins.serializers.rdfxml import XMLSerializer from rdflib.term import BNode, URIRef @@ -10,7 +10,7 @@ class SerializerTestBase: repeats = 8 def setup_method(self): - graph = ConjunctiveGraph() + graph = Dataset(default_union=True) graph.parse(data=self.test_content, format=self.test_content_format) self.source_graph = graph @@ -40,13 +40,13 @@ def _assert_equal_graphs(g1, g2): def _mangled_copy(g): - "Makes a copy of the graph, replacing all bnodes with the bnode ``_blank``." 
- gcopy = ConjunctiveGraph() + """Makes a copy of the graph, replacing all bnodes with the bnode ``_blank``.""" + gcopy = Dataset() def isbnode(v): return isinstance(v, BNode) - for s, p, o in g: + for s, p, o, c in g: if isbnode(s): s = _blank if isbnode(p): @@ -67,8 +67,8 @@ def serialize(source_graph, make_serializer, get_value=True, extra_args={}): def serialize_and_load(source_graph, make_serializer): stream = serialize(source_graph, make_serializer, False) stream.seek(0) - reparsed_graph = ConjunctiveGraph() - reparsed_graph.parse(stream, publicID=None, format="xml") + reparsed_graph = Dataset(default_union=True) + reparsed_graph.parse(stream, format="xml") return reparsed_graph @@ -173,7 +173,7 @@ def test_result_fragments_with_base(self): '' term. g.add(TRIPLE + (rdflib.URIRef("http://example.com/foo."),)) @@ -81,7 +81,7 @@ def test_graph_uri_syntax(): def test_blank_graph_identifier(): - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() g.add(TRIPLE + (rdflib.BNode(),)) out = g.serialize(format="trig", encoding="latin-1") graph_label_line = out.splitlines()[-4] @@ -94,7 +94,7 @@ def test_graph_parsing(): data = """ . """ - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() g.parse(data=data, format="trig") assert len(list(g.contexts())) == 1 @@ -104,7 +104,7 @@ def test_graph_parsing(): { . } """ - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() g.parse(data=data, format="trig") assert len(list(g.contexts())) == 1 @@ -118,7 +118,7 @@ def test_graph_parsing(): . } """ - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() g.parse(data=data, format="trig") assert len(list(g.contexts())) == 2 @@ -133,7 +133,7 @@ def test_round_trips(): . } """ - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() for i in range(5): g.parse(data=data, format="trig") data = g.serialize(format="trig") @@ -154,7 +154,7 @@ def test_default_graph_serializes_without_name(): { . 
} """ - g = rdflib.ConjunctiveGraph() + g = rdflib.Dataset() g.parse(data=data, format="trig") data = g.serialize(format="trig", encoding="latin-1") @@ -174,7 +174,7 @@ def test_prefixes(): } """ - cg = rdflib.ConjunctiveGraph() + cg = rdflib.Dataset() cg.parse(data=data, format="trig") data = cg.serialize(format="trig", encoding="latin-1") diff --git a/test/test_turtle_quoting.py b/test/test_turtle_quoting.py index aa523f57b6..7cdd63a24b 100644 --- a/test/test_turtle_quoting.py +++ b/test/test_turtle_quoting.py @@ -12,7 +12,7 @@ import pytest -from rdflib.graph import ConjunctiveGraph, Graph +from rdflib.graph import Dataset, Graph from rdflib.plugins.parsers import ntriples from rdflib.term import Literal, URIRef from test.utils.namespace import EGDC @@ -147,7 +147,7 @@ def test_parse_correctness( data = f' "{quoted}" .' else: data = f' "{quoted}".' - graph = ConjunctiveGraph() + graph = Dataset(default_union=True) graph.parse(data=data, format=format) objs = list(graph.objects()) assert len(objs) == 1 diff --git a/test/test_util.py b/test/test_util.py index 63c0850336..8906974753 100644 --- a/test/test_util.py +++ b/test/test_util.py @@ -9,7 +9,7 @@ import pytest from rdflib import XSD, util -from rdflib.graph import ConjunctiveGraph, Graph, QuotedGraph +from rdflib.graph import Dataset, Graph, QuotedGraph from rdflib.namespace import RDF, RDFS from rdflib.term import BNode, IdentifiedNode, Literal, Node, URIRef from rdflib.util import _coalesce, _iri2uri, find_roots, get_tree @@ -262,7 +262,7 @@ def parse_n3(term_n3): "@prefix xsd: .\n" " %s.\n" % term_n3 ) - g = ConjunctiveGraph() + g = Dataset(default_union=True) g.parse(data=prepstr, format="n3") return [t for t in g.triples((None, None, None))][0][2] diff --git a/test_reports/rdflib_w3c_sparql10-HEAD.ttl b/test_reports/rdflib_w3c_sparql10-HEAD.ttl index f3ac4255d6..b8369a94d3 100644 --- a/test_reports/rdflib_w3c_sparql10-HEAD.ttl +++ b/test_reports/rdflib_w3c_sparql10-HEAD.ttl @@ -1603,7 +1603,7 @@ 
earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . diff --git a/test_reports/rdflib_w3c_sparql11-HEAD.ttl b/test_reports/rdflib_w3c_sparql11-HEAD.ttl index 6d498df8b6..6140fa9143 100644 --- a/test_reports/rdflib_w3c_sparql11-HEAD.ttl +++ b/test_reports/rdflib_w3c_sparql11-HEAD.ttl @@ -691,7 +691,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -699,7 +699,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -707,7 +707,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -715,7 +715,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -723,7 +723,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -731,7 +731,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -1939,7 +1939,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -1947,7 +1947,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . 
@@ -2251,7 +2251,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . @@ -2259,7 +2259,7 @@ earl:assertedBy ; earl:mode earl:automatic ; earl:result [ a earl:TestResult ; - earl:outcome earl:failed ] ; + earl:outcome earl:passed ] ; earl:subject ; earl:test . From 4cf2180f6f407e4a95044aeadf0b83efee73a3a1 Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Mon, 24 Mar 2025 14:16:33 +1000 Subject: [PATCH 21/60] Reduce warnings (#3087) * build(deps-dev): bump ruff from 0.9.2 to 0.9.6 Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.2 to 0.9.6. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.2...0.9.6) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:development update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot]

* add readthedocs sphinx.configuration
* replace ConjunctiveGraph() with Dataset() in tests
* tidy some notation
* align black version
* fix black & ruff
* poetry --check -> poetry-check --lock
* more CG -> Datasets
* ruff
* GC -> Dataset
* CG -> D
* ruff fixes

---------

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
 test/test_graph/test_aggregate_graphs.py           |  8 ++++----
 test/test_graph/test_graph_context.py              |  9 ++++++---
 test/test_issues/test_issue535.py                  |  4 ++--
 test/test_namespace/test_namespace.py              |  2 +-
 .../test_broken_parse_data_from_jena.py            |  2 +-
 test/test_serializers/test_prettyxml.py            | 16 ++++++++--------
 test/test_serializers/test_serializer.py           |  6 +++---
 .../test_store_sparqlupdatestore_mock.py           |  6 +++---
 8 files changed, 28 insertions(+), 25 deletions(-)

diff --git a/test/test_graph/test_aggregate_graphs.py b/test/test_graph/test_aggregate_graphs.py
index fd45c4a14d..6ede92fda8 100644
--- a/test/test_graph/test_aggregate_graphs.py
+++ b/test/test_graph/test_aggregate_graphs.py
@@ -1,7 +1,7 @@
 from io import StringIO
b/test/test_graph/test_graph_context.py index 8c1a8a4579..93c5f7d9b6 100644 --- a/test/test_graph/test_graph_context.py +++ b/test/test_graph/test_graph_context.py @@ -9,7 +9,7 @@ import pytest -from rdflib import BNode, ConjunctiveGraph, Graph, URIRef, plugin +from rdflib import BNode, Dataset, Graph, URIRef, plugin from rdflib.store import Store @@ -20,7 +20,7 @@ class ContextTestCase(unittest.TestCase): def setUp(self): try: - self.graph = ConjunctiveGraph(store=self.store) + self.graph = Dataset(store=self.store, default_union=True) except ImportError: pytest.skip("Dependencies for store '%s' not available!" % self.store) if self.store == "SQLite": @@ -337,8 +337,11 @@ def test_triples(self): ) asserte(set(c.subject_predicates(michel)), {(bob, hates)}) + d = set() + for x in c: + d.add(x[0:3]) asserte( - set(c), + set(d), { (bob, hates, michel), (bob, likes, cheese), diff --git a/test/test_issues/test_issue535.py b/test/test_issues/test_issue535.py index 03f75d6014..80a6396941 100644 --- a/test/test_issues/test_issue535.py +++ b/test/test_issues/test_issue535.py @@ -1,8 +1,8 @@ -from rdflib import ConjunctiveGraph, URIRef +from rdflib import Dataset, URIRef def test_nquads_default_graph(): - ds = ConjunctiveGraph() + ds = Dataset(default_union=True) data = """ . 
diff --git a/test/test_namespace/test_namespace.py b/test/test_namespace/test_namespace.py index 409d703f3a..60d6874245 100644 --- a/test/test_namespace/test_namespace.py +++ b/test/test_namespace/test_namespace.py @@ -232,7 +232,7 @@ def add_not_in_namespace(s): # a property name within the FOAF namespace assert FOAF.givenName == URIRef("http://xmlns.com/foaf/0.1/givenName") - # namescape can be used as str + # namespace can be used as str assert FOAF.givenName.startswith(FOAF) def test_contains_method(self): diff --git a/test/test_parsers/test_broken_parse_data_from_jena.py b/test/test_parsers/test_broken_parse_data_from_jena.py index 353593837c..da25154a7f 100644 --- a/test/test_parsers/test_broken_parse_data_from_jena.py +++ b/test/test_parsers/test_broken_parse_data_from_jena.py @@ -33,6 +33,6 @@ def xfail_broken_parse_data(request): @pytest.mark.parametrize("testfile", os.listdir(broken_parse_data)) @pytest.mark.usefixtures("xfail_broken_parse_data") def test_n3_serializer_roundtrip(testfile) -> None: - g1 = rdflib.ConjunctiveGraph() + g1 = rdflib.Dataset(default_union=True) g1.parse(os.path.join(broken_parse_data, testfile), format="n3") diff --git a/test/test_serializers/test_prettyxml.py b/test/test_serializers/test_prettyxml.py index 6c798e8255..aac19af50e 100644 --- a/test/test_serializers/test_prettyxml.py +++ b/test/test_serializers/test_prettyxml.py @@ -1,6 +1,6 @@ from io import BytesIO -from rdflib.graph import ConjunctiveGraph +from rdflib.graph import Dataset from rdflib.namespace import RDF, RDFS from rdflib.plugins.serializers.rdfxml import PrettyXMLSerializer from rdflib.term import BNode, Literal, URIRef @@ -10,7 +10,7 @@ class SerializerTestBase: repeats = 8 def setup_method(self): - graph = ConjunctiveGraph() + graph = Dataset() graph.parse(data=self.test_content, format=self.test_content_format) self.source_graph = graph @@ -41,12 +41,12 @@ def _assert_equal_graphs(g1, g2): def _mangled_copy(g): "Makes a copy of the graph, replacing all 
bnodes with the bnode ``_blank``." - gcopy = ConjunctiveGraph() + gcopy = Dataset() def isbnode(v): return isinstance(v, BNode) - for s, p, o in g: + for s, p, o, c in g: if isbnode(s): s = _blank if isbnode(p): @@ -67,7 +67,7 @@ def serialize(source_graph, make_serializer, get_value=True, extra_args={}): def serialize_and_load(source_graph, make_serializer): stream = serialize(source_graph, make_serializer, False) stream.seek(0) - reparsed_graph = ConjunctiveGraph() + reparsed_graph = Dataset() reparsed_graph.parse(stream, format="xml") return reparsed_graph @@ -170,7 +170,7 @@ def test_subclass_of_objects(self): def test_pretty_xmlliteral(self): # given: - g = ConjunctiveGraph() + g = Dataset() g.add( ( BNode(), @@ -191,7 +191,7 @@ def test_pretty_xmlliteral(self): def test_pretty_broken_xmlliteral(self): # given: - g = ConjunctiveGraph() + g = Dataset() g.add((BNode(), RDF.value, Literal("""

None: NS = Namespace("example:") # noqa: N806 - graph = ConjunctiveGraph() + graph = Dataset(default_union=True) graph.bind("eg", NS) nodes = [NS.subj, NS.pred, NS.obj, NS.graph] nodes[tuple_index] = RDF.type @@ -68,7 +68,7 @@ def test_rdf_type(format: str, tuple_index: int, is_keyword: bool) -> None: assert str(RDF) not in data else: assert str(RDF) in data - parsed_graph = ConjunctiveGraph() + parsed_graph = Dataset(default_union=True) parsed_graph.parse(data=data, format=format) GraphHelper.assert_triple_sets_equals(graph, parsed_graph) diff --git a/test/test_store/test_store_sparqlupdatestore_mock.py b/test/test_store/test_store_sparqlupdatestore_mock.py index 5d7e13eb84..054f746e19 100644 --- a/test/test_store/test_store_sparqlupdatestore_mock.py +++ b/test/test_store/test_store_sparqlupdatestore_mock.py @@ -1,6 +1,6 @@ from typing import ClassVar -from rdflib.graph import ConjunctiveGraph +from rdflib.graph import Dataset from rdflib.plugins.stores.sparqlstore import SPARQLUpdateStore from test.utils.http import MethodName, MockHTTPResponse from test.utils.httpservermock import ServedBaseHTTPServerMock @@ -33,7 +33,7 @@ def teardown_method(self): pass def test_graph_update(self): - graph = ConjunctiveGraph("SPARQLUpdateStore") + graph = Dataset("SPARQLUpdateStore") graph.open((self.query_endpoint, self.update_endpoint)) update_statement = ( f"INSERT DATA {{ {EGDO['subj']} {EGDO['pred']} {EGDO['obj']}. }}" @@ -58,7 +58,7 @@ def test_graph_update(self): assert "application/sparql-update" in req.headers.get("content-type") def test_update_encoding(self): - graph = ConjunctiveGraph("SPARQLUpdateStore") + graph = Dataset("SPARQLUpdateStore") graph.open((self.query_endpoint, self.update_endpoint)) update_statement = ( f"INSERT DATA {{ {EGDO['subj']} {EGDO['pred']} {EGDO['obj']}. 
}}" From 0c277e99ef6bc76453d997287adb16a129aa3ddb Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Mon, 24 Mar 2025 14:47:23 +1000 Subject: [PATCH 22/60] Allow multi subjects & objects in graph funcs (#3086) * subjects() * objects() & tests * blacked --- rdflib/graph.py | 71 ++++++++++++++---------- test/test_graph/test_graph_generators.py | 16 ++++++ 2 files changed, 57 insertions(+), 30 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index d74dd85cfa..857491a2e4 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -847,26 +847,32 @@ def set( def subjects( self, predicate: Union[None, Path, _PredicateType] = None, - object: Optional[_ObjectType] = None, + object: Optional[Union[_ObjectType, List[_ObjectType]]] = None, unique: bool = False, ) -> Generator[_SubjectType, None, None]: """A generator of (optionally unique) subjects with the given - predicate and object""" - if not unique: - for s, p, o in self.triples((None, predicate, object)): - yield s + predicate and object(s)""" + # if the object is a list of Nodes, yield results from subject() call for each + if isinstance(object, list): + for obj in object: + for s in self.subjects(predicate, obj, unique): + yield s else: - subs = set() - for s, p, o in self.triples((None, predicate, object)): - if s not in subs: + if not unique: + for s, p, o in self.triples((None, predicate, object)): yield s - try: - subs.add(s) - except MemoryError as e: - logger.error( - f"{e}. Consider not setting parameter 'unique' to True" - ) - raise + else: + subs = set() + for s, p, o in self.triples((None, predicate, object)): + if s not in subs: + yield s + try: + subs.add(s) + except MemoryError as e: + logger.error( + f"{e}. 
Consider not setting parameter 'unique' to True" + ) + raise def predicates( self, @@ -894,27 +900,32 @@ def predicates( def objects( self, - subject: Optional[_SubjectType] = None, + subject: Optional[Union[_SubjectType, List[_SubjectType]]] = None, predicate: Union[None, Path, _PredicateType] = None, unique: bool = False, ) -> Generator[_ObjectType, None, None]: """A generator of (optionally unique) objects with the given - subject and predicate""" - if not unique: - for s, p, o in self.triples((subject, predicate, None)): - yield o + subject(s) and predicate""" + if isinstance(subject, list): + for subj in subject: + for o in self.objects(subj, predicate, unique): + yield o else: - objs = set() - for s, p, o in self.triples((subject, predicate, None)): - if o not in objs: + if not unique: + for s, p, o in self.triples((subject, predicate, None)): yield o - try: - objs.add(o) - except MemoryError as e: - logger.error( - f"{e}. Consider not setting parameter 'unique' to True" - ) - raise + else: + objs = set() + for s, p, o in self.triples((subject, predicate, None)): + if o not in objs: + yield o + try: + objs.add(o) + except MemoryError as e: + logger.error( + f"{e}. 
Consider not setting parameter 'unique' to True" + ) + raise def subject_predicates( self, object: Optional[_ObjectType] = None, unique: bool = False diff --git a/test/test_graph/test_graph_generators.py b/test/test_graph/test_graph_generators.py index 0d89c9b7fb..bec7ccb4ce 100644 --- a/test/test_graph/test_graph_generators.py +++ b/test/test_graph/test_graph_generators.py @@ -75,3 +75,19 @@ def test_parse_berners_lee_card_into_graph(): assert len(list(graph.subjects(unique=True))) == no_of_unique_subjects assert len(list(graph.predicates(unique=True))) == no_of_unique_predicates assert len(list(graph.objects(unique=True))) == no_of_unique_objects + + +def test_subjects_multi(): + graph = Graph() + add_stuff(graph) + assert len([subj for subj in graph.subjects(LIKES, [CHEESE, PIZZA])]) == 5 + assert len([subj for subj in graph.subjects(LIKES, [])]) == 0 + assert len([subj for subj in graph.subjects(LIKES | HATES, [CHEESE, PIZZA])]) == 6 + + +def test_objects_multi(): + graph = Graph() + add_stuff(graph) + assert len([obj for obj in graph.objects([TAREK, BOB], LIKES)]) == 6 + assert len([obj for obj in graph.objects([], LIKES)]) == 0 + assert len([obj for obj in graph.objects([TAREK, BOB], LIKES | HATES)]) == 8 From d220ee3bcba10a7af6630c4faaa37ca9cee33554 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 25 Mar 2025 08:55:29 +1000 Subject: [PATCH 23/60] build(deps): bump library/python in /docker/unstable (#3089) Bumps library/python from `ae9f9ac` to `8f3aba4`. --- updated-dependencies: - dependency-name: library/python dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- docker/unstable/Dockerfile | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docker/unstable/Dockerfile b/docker/unstable/Dockerfile index 4a30091309..fc4c43c4de 100644 --- a/docker/unstable/Dockerfile +++ b/docker/unstable/Dockerfile @@ -1,4 +1,4 @@ -FROM docker.io/library/python:3.13.2-slim@sha256:ae9f9ac89467077ed1efefb6d9042132d28134ba201b2820227d46c9effd3174 +FROM docker.io/library/python:3.13.2-slim@sha256:8f3aba466a471c0ab903dbd7cb979abd4bda370b04789d25440cc90372b50e04 # This file is generated from docker:unstable in Taskfile.yml COPY var/requirements.txt /var/tmp/build/ From 721164cb7491cf964ea6d22bedb8159645ec5b7f Mon Sep 17 00:00:00 2001 From: Nicholas Car Date: Sat, 29 Mar 2025 12:19:04 +1000 Subject: [PATCH 24/60] 7.1.4 pre-release (#3098) --- CHANGELOG.md | 25 +++++++++++++++++++++++++ README.md | 3 ++- admin/get_merged_prs.py | 2 +- pyproject.toml | 2 +- rdflib/__init__.py | 2 +- 5 files changed, 30 insertions(+), 4 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index a4e3b61ca6..df3de6f5a0 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,3 +1,28 @@ +## 2025-03-29 RELEASE 7.1.4 + +A tidy-up release with no major updates over 7.1.3. This may be the last 7.x +release as we move to a version 8 with breaking changes to Dataset and a few +APIs. 
+ +Interesting PRs merged: + +* 2025-03-24 - remove old hacks against 2to3 + [PR #3095](https://github.com/RDFLib/rdflib/pull/3095) +* 2025-03-24 - Allow multi subjects & objects in graph funcs + [PR #3086](https://github.com/RDFLib/rdflib/pull/3086) +* 2025-03-24 - Reduce test warnings + [PR #3085](https://github.com/RDFLib/rdflib/pull/3085) +* 2025-03-22 - Downgrade log message about plugin + [PR #3063](https://github.com/RDFLib/rdflib/pull/3063) +* 2025-03-22 - remove old hacks against 2to3 + [PR #3076](https://github.com/RDFLib/rdflib/pull/3076) +* 2025-03-22 - Cope with Namespace annotations in Python 3.14 + [PR #3084](https://github.com/RDFLib/rdflib/pull/3084) +* 2025-01-18 - small docco update + [PR #3053](https://github.com/RDFLib/rdflib/pull/3053) + +... and lots of boring dependency bump PRs merged! + ## 2025-01-17 RELEASE 7.1.3 A fix-up release that re-adds support for Python 3.8 after it was accidentally diff --git a/README.md b/README.md index f2b106648d..33a3441f43 100644 --- a/README.md +++ b/README.md @@ -42,7 +42,8 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases -* `main` branch in this repository is the current unstable release +* `main` branch in this repository is the current unstable release - version 8 alpha +* `7.1.4` tidy-up release, possibly last 7.x release * `7.1.3` current stable release, small improvements to 7.1.1 * `7.1.2` previously deleted release * `7.1.1` previous stable release diff --git a/admin/get_merged_prs.py b/admin/get_merged_prs.py index ddee02fb43..4b049b4257 100644 --- a/admin/get_merged_prs.py +++ b/admin/get_merged_prs.py @@ -5,7 +5,7 @@ import urllib.request # https://api.github.com/search/issues?q=repo:rdflib/rdflib+is:pr+merged:%3E=2023-08-02&per_page=300&page=1 -LAST_RELEASE_DATE = "2024-10-17" +LAST_RELEASE_DATE = "2025-01-18" ISSUES_URL = "https://api.github.com/search/issues" ITEMS = [] PAGE = 1 diff --git a/pyproject.toml b/pyproject.toml index 
fecad7eaeb..ac074a7878 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.1.4a" +version = "7.1.4" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] diff --git a/rdflib/__init__.py b/rdflib/__init__.py index dcfe8df36a..051c5e3ca4 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -52,7 +52,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2025-01-18" +__date__ = "2025-03-29" __all__ = [ "URIRef", From c580c00dbe76b6c6fb201448e4e81887eb88b40b Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 18 Sep 2025 13:08:37 +1000 Subject: [PATCH 25/60] Merge 7-maintenance branch into 7.x (#3222) * 7.1.4 pre-release * fix namespace prefixes in longturtle serialization (#3134) Co-authored-by: Daan de Schepper * Fix failing webtest (#3194) * test: fix failing webtest Fixes https://github.com/RDFLib/rdflib/issues/3192 * Revert "remove old hacks against 2to3 (#3076)" (#3195) This reverts commit b74c6574fd982b410aed1aa43853eed37504bf15. * Specify `Optional` parameters in `Graph.triples_choices` (#3075) * Specify `Optional` parameters in `Graph.triples_choices` The two non-list parameters can be `None`, but this is not reflected in the type hint. Also introduces a type alias to simplify method signatures. 
* style: remove unused imports --------- Co-authored-by: Nicholas Car Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Co-authored-by: Edmond Chuc * feat: canonicalization with longturtle serializer now optional (#3197) * feat: canonicalization with longturtle serializer now optional Fixes https://github.com/RDFLib/rdflib/issues/3196 * docs: fix docs build error by removing py obj reference to canon --------- Co-authored-by: Nicholas Car Co-authored-by: Daan de Schepper Co-authored-by: Sigmund Lahn Co-authored-by: Nicholas Car --- rdflib/graph.py | 23 +++---- rdflib/plugins/parsers/rdfxml.py | 9 ++- rdflib/plugins/serializers/longturtle.py | 43 ++++++++++--- rdflib/plugins/stores/berkeleydb.py | 15 +++-- rdflib/plugins/stores/sparqlstore.py | 7 +- rdflib/store.py | 11 +--- test/data/longturtle/longturtle-target.ttl | 64 ++++++++++--------- test/test_graph/test_graph.py | 2 +- .../test_serializer_longturtle.py | 2 +- .../test_serializer_longturtle_sort.py | 30 ++++----- 10 files changed, 111 insertions(+), 95 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index 857491a2e4..f9e00f2f91 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -355,6 +355,11 @@ _TripleOrQuadSelectorType = Union["_TripleSelectorType", "_QuadSelectorType"] _TriplePathType = Tuple["_SubjectType", Path, "_ObjectType"] _TripleOrTriplePathType = Union["_TripleType", "_TriplePathType"] +_TripleChoiceType = Union[ + Tuple[List[_SubjectType], Optional[_PredicateType], Optional[_ObjectType]], + Tuple[Optional[_SubjectType], List[_PredicateType], Optional[_ObjectType]], + Tuple[Optional[_SubjectType], Optional[_PredicateType], List[_ObjectType]], +] _GraphT = TypeVar("_GraphT", bound="Graph") _ConjunctiveGraphT = TypeVar("_ConjunctiveGraphT", bound="ConjunctiveGraph") @@ -994,11 +999,7 @@ def predicate_objects( def triples_choices( self, - triple: Union[ - Tuple[List[_SubjectType], _PredicateType, _ObjectType], - Tuple[_SubjectType, List[_PredicateType], 
_ObjectType], - Tuple[_SubjectType, _PredicateType, List[_ObjectType]], - ], + triple: _TripleChoiceType, context: Optional[_ContextType] = None, ) -> Generator[_TripleType, None, None]: subject, predicate, object_ = triple @@ -2196,11 +2197,7 @@ def quads( def triples_choices( self, - triple: Union[ - Tuple[List[_SubjectType], _PredicateType, _ObjectType], - Tuple[_SubjectType, List[_PredicateType], _ObjectType], - Tuple[_SubjectType, _PredicateType, List[_ObjectType]], - ], + triple: _TripleChoiceType, context: Optional[_ContextType] = None, ) -> Generator[_TripleType, None, None]: """Iterate over all the triples in the entire conjunctive graph""" @@ -2946,11 +2943,7 @@ def __isub__(self: _GraphT, other: Iterable[_TripleType]) -> NoReturn: def triples_choices( self, - triple: Union[ - Tuple[List[_SubjectType], _PredicateType, _ObjectType], - Tuple[_SubjectType, List[_PredicateType], _ObjectType], - Tuple[_SubjectType, _PredicateType, List[_ObjectType]], - ], + triple: _TripleChoiceType, context: Optional[_ContextType] = None, ) -> Generator[_TripleType, None, None]: subject, predicate, object_ = triple diff --git a/rdflib/plugins/parsers/rdfxml.py b/rdflib/plugins/parsers/rdfxml.py index e0f6e05fa7..54fc69567b 100644 --- a/rdflib/plugins/parsers/rdfxml.py +++ b/rdflib/plugins/parsers/rdfxml.py @@ -298,7 +298,8 @@ def document_element_start( self, name: Tuple[str, str], qname, attrs: AttributesImpl ) -> None: if name[0] and URIRef("".join(name)) == RDFVOC.RDF: - next = self.next + # Cheap hack so 2to3 doesn't turn it into __next__ + next = getattr(self, "next") next.start = self.node_element_start next.end = self.node_element_end else: @@ -315,7 +316,8 @@ def node_element_start( current = self.current absolutize = self.absolutize - next = self.next + # Cheap hack so 2to3 doesn't turn it into __next__ + next = getattr(self, "next") next.start = self.property_element_start next.end = self.property_element_end @@ -408,7 +410,8 @@ def property_element_start( current = 
self.current absolutize = self.absolutize - next = self.next + # Cheap hack so 2to3 doesn't turn it into __next__ + next = getattr(self, "next") object: Optional[_ObjectType] = None current.data = None current.list = None diff --git a/rdflib/plugins/serializers/longturtle.py b/rdflib/plugins/serializers/longturtle.py index 8de1e52a28..2aaed36e6d 100644 --- a/rdflib/plugins/serializers/longturtle.py +++ b/rdflib/plugins/serializers/longturtle.py @@ -39,21 +39,20 @@ class LongTurtleSerializer(RecursiveSerializer): + """LongTurtle, a Turtle serialization format. + + When the optional parameter ``canon`` is set to :py:obj:`True`, the graph is canonicalized + before serialization. This normalizes blank node identifiers and allows for + deterministic serialization of the graph. Useful when consistent outputs are required. + """ + short_name = "longturtle" indentString = " " def __init__(self, store): self._ns_rewrite = {} - store = to_canonical_graph(store) - content = store.serialize(format="application/n-triples") - lines = content.split("\n") - lines.sort() - graph = Graph() - graph.parse( - data="\n".join(lines), format="application/n-triples", skolemize=True - ) - graph = graph.de_skolemize() - super(LongTurtleSerializer, self).__init__(graph) + self._canon = False + super(LongTurtleSerializer, self).__init__(store) self.keywords = {RDF.type: "a"} self.reset() self.stream = None @@ -83,11 +82,34 @@ def addNamespace(self, prefix, namespace): super(LongTurtleSerializer, self).addNamespace(prefix, namespace) return prefix + def canonize(self): + """Apply canonicalization to the store. + + This normalizes blank node identifiers and allows for deterministic + serialization of the graph. 
+ """ + if not self._canon: + return + + namespace_manager = self.store.namespace_manager + store = to_canonical_graph(self.store) + content = store.serialize(format="application/n-triples") + lines = content.split("\n") + lines.sort() + graph = Graph() + graph.parse( + data="\n".join(lines), format="application/n-triples", skolemize=True + ) + graph = graph.de_skolemize() + graph.namespace_manager = namespace_manager + self.store = graph + def reset(self): super(LongTurtleSerializer, self).reset() self._shortNames = {} self._started = False self._ns_rewrite = {} + self.canonize() def serialize( self, @@ -97,6 +119,7 @@ def serialize( spacious: Optional[bool] = None, **kwargs: Any, ) -> None: + self._canon = kwargs.get("canon", False) self.reset() self.stream = stream # if base is given here, use, if not and a base is set for the graph use that diff --git a/rdflib/plugins/stores/berkeleydb.py b/rdflib/plugins/stores/berkeleydb.py index 872dc368ef..12009787cd 100644 --- a/rdflib/plugins/stores/berkeleydb.py +++ b/rdflib/plugins/stores/berkeleydb.py @@ -428,7 +428,8 @@ def remove( # type: ignore[override] cursor = index.cursor(txn=txn) try: cursor.set_range(key) - current = cursor.next + # Hack to stop 2to3 converting this to next(cursor) + current = getattr(cursor, "next")() except db.DBNotFoundError: current = None cursor.close() @@ -505,7 +506,8 @@ def triples( cursor = index.cursor(txn=txn) try: cursor.set_range(key) - current = cursor.next + # Cheap hack so 2to3 doesn't convert to next(cursor) + current = getattr(cursor, "next")() except db.DBNotFoundError: current = None cursor.close() @@ -537,7 +539,8 @@ def __len__(self, context: Optional[_ContextType] = None) -> int: key, value = current if key.startswith(prefix): count += 1 - current = cursor.next + # Hack to stop 2to3 converting this to next(cursor) + current = getattr(cursor, "next")() else: break cursor.close() @@ -590,7 +593,8 @@ def namespaces(self) -> Generator[Tuple[str, URIRef], None, None]: while 
current: prefix, namespace = current results.append((prefix.decode("utf-8"), namespace.decode("utf-8"))) - current = cursor.next + # Hack to stop 2to3 converting this to next(cursor) + current = getattr(cursor, "next")() cursor.close() for prefix, namespace in results: yield prefix, URIRef(namespace) @@ -633,7 +637,8 @@ def contexts( cursor = index.cursor() try: cursor.set_range(key) - current = cursor.next + # Hack to stop 2to3 converting this to next(cursor) + current = getattr(cursor, "next")() except db.DBNotFoundError: current = None cursor.close() diff --git a/rdflib/plugins/stores/sparqlstore.py b/rdflib/plugins/stores/sparqlstore.py index f9827cf94c..e7a9723e8c 100644 --- a/rdflib/plugins/stores/sparqlstore.py +++ b/rdflib/plugins/stores/sparqlstore.py @@ -35,6 +35,7 @@ _TripleType, _ContextType, _QuadType, + _TripleChoiceType, _TriplePatternType, _SubjectType, _PredicateType, @@ -367,11 +368,7 @@ def triples( # type: ignore[override] def triples_choices( self, - _: Tuple[ - Union[_SubjectType, List[_SubjectType]], - Union[_PredicateType, List[_PredicateType]], - Union[_ObjectType, List[_ObjectType]], - ], + _: _TripleChoiceType, context: Optional[_ContextType] = None, ) -> Generator[ Tuple[ diff --git a/rdflib/store.py b/rdflib/store.py index 2ca03529ab..9cada631d7 100644 --- a/rdflib/store.py +++ b/rdflib/store.py @@ -36,7 +36,6 @@ Generator, Iterable, Iterator, - List, Mapping, Optional, Tuple, @@ -49,10 +48,8 @@ from rdflib.graph import ( Graph, _ContextType, - _ObjectType, - _PredicateType, _QuadType, - _SubjectType, + _TripleChoiceType, _TriplePatternType, _TripleType, ) @@ -281,11 +278,7 @@ def remove( def triples_choices( self, - triple: Union[ - Tuple[List[_SubjectType], _PredicateType, _ObjectType], - Tuple[_SubjectType, List[_PredicateType], _ObjectType], - Tuple[_SubjectType, _PredicateType, List[_ObjectType]], - ], + triple: _TripleChoiceType, context: Optional[_ContextType] = None, ) -> Generator[ Tuple[ diff --git 
a/test/data/longturtle/longturtle-target.ttl b/test/data/longturtle/longturtle-target.ttl index 54cf23e9ff..b9df06e751 100644 --- a/test/data/longturtle/longturtle-target.ttl +++ b/test/data/longturtle/longturtle-target.ttl @@ -1,72 +1,74 @@ +PREFIX cn: +PREFIX ex: PREFIX geo: PREFIX rdf: -PREFIX schema: +PREFIX sdo: PREFIX xsd: - - a schema:Person ; - schema:age 41 ; - schema:alternateName +ex:nicholas + a sdo:Person ; + sdo:age 41 ; + sdo:alternateName [ - schema:name "Dr N.J. Car" ; + sdo:name "Dr N.J. Car" ; ] , "N.J. Car" , "Nick Car" ; - schema:name + sdo:name [ - a ; - schema:hasPart + a cn:CompoundName ; + sdo:hasPart [ - a ; - schema:hasPart + a cn:CompoundName ; + sdo:hasPart [ - a ; + a cn:CompoundName ; rdf:value "Car" ; ] , [ - a ; + a cn:CompoundName ; rdf:value "Maxov" ; ] ; ] , [ - a ; + a cn:CompoundName ; rdf:value "Nicholas" ; ] , [ - a ; + a cn:CompoundName ; rdf:value "John" ; ] ; ] ; - schema:worksFor ; + sdo:worksFor ; . - a schema:Organization ; - schema:location ; + a sdo:Organization ; + sdo:location ; . 
- a schema:Place ; - schema:address + a sdo:Place ; + sdo:address [ - a schema:PostalAddress ; - schema:addressCountry + a sdo:PostalAddress ; + sdo:addressCountry [ - schema:identifier "au" ; - schema:name "Australia" ; + sdo:identifier "au" ; + sdo:name "Australia" ; ] ; - schema:addressLocality "Shorncliffe" ; - schema:addressRegion "QLD" ; - schema:postalCode 4017 ; - schema:streetAddress ( + sdo:addressLocality "Shorncliffe" ; + sdo:addressRegion "QLD" ; + sdo:postalCode 4017 ; + sdo:streetAddress ( 72 "Yundah" "Street" ) ; ] ; - schema:geo + sdo:geo [ - schema:polygon "POLYGON((153.082403 -27.325801, 153.08241 -27.32582, 153.082943 -27.325612, 153.083010 -27.325742, 153.083543 -27.325521, 153.083456 -27.325365, 153.082403 -27.325801))"^^geo:wktLiteral ; + sdo:polygon "POLYGON((153.082403 -27.325801, 153.08241 -27.32582, 153.082943 -27.325612, 153.083010 -27.325742, 153.083543 -27.325521, 153.083456 -27.325365, 153.082403 -27.325801))"^^geo:wktLiteral ; ] ; - schema:name "KurrawongAI HQ" ; + sdo:name "KurrawongAI HQ" ; . diff --git a/test/test_graph/test_graph.py b/test/test_graph/test_graph.py index 639aa710c7..0e8227042c 100644 --- a/test/test_graph/test_graph.py +++ b/test/test_graph/test_graph.py @@ -399,7 +399,7 @@ def test_guess_format_for_parse_http_text_plain(): assert len(graph) > 0 # A url that returns content-type text/html. 
- url = "https://github.com/RDFLib/rdflib/issues/2734" + url = "https://www.w3.org/TR/REC-rdf-syntax/" with pytest.raises(PluginException): graph = Graph().parse(url) diff --git a/test/test_serializers/test_serializer_longturtle.py b/test/test_serializers/test_serializer_longturtle.py index c1761b6dae..65821784ee 100644 --- a/test/test_serializers/test_serializer_longturtle.py +++ b/test/test_serializers/test_serializer_longturtle.py @@ -167,7 +167,7 @@ def test_longturtle(): g.bind("sdo", SDO) # run the long turtle serializer - output = g.serialize(format="longturtle") + output = g.serialize(format="longturtle", canon=True) # fix the target current_dir = Path.cwd() # Get the current directory diff --git a/test/test_serializers/test_serializer_longturtle_sort.py b/test/test_serializers/test_serializer_longturtle_sort.py index 0e397afaf2..044660e3ed 100644 --- a/test/test_serializers/test_serializer_longturtle_sort.py +++ b/test/test_serializers/test_serializer_longturtle_sort.py @@ -62,55 +62,55 @@ def test_sort_semiblank_graph() -> None: graph.add((outer_node, EX.has, inner_node)) graph.add((inner_node, RDFS.seeAlso, nested)) - graph_text = graph.serialize(format="longturtle", sort=True) + graph_text = graph.serialize(format="longturtle", canon=True) if first_graph_text == "": first_graph_text = graph_text serialization_counter[graph_text] += 1 expected_serialization = """\ -PREFIX ns1: +PREFIX ex: PREFIX rdfs: -ns1:A +ex:A rdfs:comment "Thing A" ; . -ns1:C +ex:C rdfs:comment "Thing C" ; . -ns1:B +ex:B rdfs:comment "Thing B" ; . -[] ns1:has +[] ex:has [ - rdfs:seeAlso ns1:A ; + rdfs:seeAlso ex:A ; ] ; . -[] rdfs:seeAlso ns1:B ; +[] rdfs:seeAlso ex:B ; . -[] ns1:has +[] ex:has [ - rdfs:seeAlso ns1:C ; + rdfs:seeAlso ex:C ; ] ; . -[] rdfs:seeAlso ns1:A ; +[] rdfs:seeAlso ex:A ; . -[] rdfs:seeAlso ns1:C ; +[] rdfs:seeAlso ex:C ; . -[] rdfs:seeAlso ns1:B ; +[] rdfs:seeAlso ex:B ; . -[] ns1:has +[] ex:has [ - rdfs:seeAlso ns1:B ; + rdfs:seeAlso ex:B ; ] ; . 
From 747b8d30e6ca12c12fd895f7ae44685a828ff33a Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 18 Sep 2025 14:18:51 +1000 Subject: [PATCH 26/60] [7.x] notation3.py: don't normalize float representation (#3221) * 7.1.4 pre-release * fix namespace prefixes in longturtle serialization (#3134) Co-authored-by: Daan de Schepper * Fix failing webtest (#3194) * test: fix failing webtest Fixes https://github.com/RDFLib/rdflib/issues/3192 * Revert "remove old hacks against 2to3 (#3076)" (#3195) This reverts commit b74c6574fd982b410aed1aa43853eed37504bf15. * Specify `Optional` parameters in `Graph.triples_choices` (#3075) * Specify `Optional` parameters in `Graph.triples_choices` The two non-list parameters can be `None`, but this is not reflected in the type hint. Also introduces a type alias to simplify method signatures. * style: remove unused imports --------- Co-authored-by: Nicholas Car Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Co-authored-by: Edmond Chuc * feat: canonicalization with longturtle serializer now optional (#3197) * feat: canonicalization with longturtle serializer now optional Fixes https://github.com/RDFLib/rdflib/issues/3196 * docs: fix docs build error by removing py obj reference to canon * added n3 test to check for internal float normalization made as a separate commit to illustrate the old broken behavior prior to the fix in the next commit * notation3.py: don't normalize float representation fix behavior of the n3 parser family to avoid normalizing raw float string representation which makes it impossible to roundtrip the exact original string representation of e.g. 
1e10 * test: refactor test_float_no_norm to use pytest parametrization * style: add noqa to sfloat class * chore: remove unused mypy type ignore * docs: fix sfloat reference * fix: sphinx nitpicky reference --------- Co-authored-by: Nicholas Car Co-authored-by: Daan de Schepper Co-authored-by: Sigmund Lahn Co-authored-by: Nicholas Car Co-authored-by: Tom Gillespie --- rdflib/plugins/parsers/notation3.py | 13 +++++++++---- test/test_n3.py | 17 +++++++++++++++++ 2 files changed, 26 insertions(+), 4 deletions(-) diff --git a/rdflib/plugins/parsers/notation3.py b/rdflib/plugins/parsers/notation3.py index 4df892ee45..e9c2d0f27f 100755 --- a/rdflib/plugins/parsers/notation3.py +++ b/rdflib/plugins/parsers/notation3.py @@ -83,6 +83,7 @@ "Formula", "RDFSink", "SinkParser", + "sfloat", ] from rdflib.parser import Parser @@ -380,6 +381,10 @@ def unicodeExpand(m: Match) -> str: langcode = re.compile(r"[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*") +class sfloat(str): # noqa: N801 + """don't normalize raw XSD.double string representation""" + + class SinkParser: def __init__( self, @@ -1528,7 +1533,7 @@ def nodeOrLiteral(self, argstr: str, i: int, res: MutableSequence[Any]) -> int: m = exponent_syntax.match(argstr, i) if m: j = m.end() - res.append(float(argstr[i:j])) + res.append(sfloat(argstr[i:j])) return j m = decimal_syntax.match(argstr, i) @@ -1911,7 +1916,7 @@ def makeStatement( def normalise( self, f: Optional[Formula], - n: Union[Tuple[int, str], bool, int, Decimal, float, _AnyT], + n: Union[Tuple[int, str], bool, int, Decimal, sfloat, _AnyT], ) -> Union[URIRef, Literal, BNode, _AnyT]: if isinstance(n, tuple): return URIRef(str(n[1])) @@ -1931,7 +1936,7 @@ def normalise( s = Literal(value, datatype=DECIMAL_DATATYPE) return s - if isinstance(n, float): + if isinstance(n, sfloat): s = Literal(str(n), datatype=DOUBLE_DATATYPE) return s @@ -1947,7 +1952,7 @@ def normalise( # f.universals[n] = f.newBlankNode() # return f.universals[n] # type error: Incompatible return value type (got 
"Union[int, _AnyT]", expected "Union[URIRef, Literal, BNode, _AnyT]") [return-value] - return n # type: ignore[return-value] + return n def intern(self, something: _AnyT) -> _AnyT: return something diff --git a/test/test_n3.py b/test/test_n3.py index d2c362a39a..40f8718681 100644 --- a/test/test_n3.py +++ b/test/test_n3.py @@ -251,6 +251,23 @@ def test_empty_prefix(self): g2 ), "Document with declared empty prefix must match default #" + @pytest.mark.parametrize( + "do_normalize_literal, expected_result", + [(True, {"1.0", "10000000000.0"}), (False, {"1e10", "1e0"})], + ) + def test_float_no_norm(self, do_normalize_literal, expected_result): + import rdflib + + original_normalize_literal = rdflib.NORMALIZE_LITERALS + try: + rdflib.NORMALIZE_LITERALS = do_normalize_literal + g1 = Graph() + g1.parse(data=":a :b 1e10, 1e0 .", format="n3") + values = set(str(o) for o in g1.objects()) + assert values == expected_result + finally: + rdflib.NORMALIZE_LITERALS = original_normalize_literal + class TestRegularExpressions: def test_exponents(self): From f276bd6cc70bfaf3719f7eb9d787e6d794b5a3dd Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 18 Sep 2025 15:30:20 +1000 Subject: [PATCH 27/60] Fix incorrect deskolemization of literals (#3127) (#3223) * Fix issue 3126 * [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --------- Co-authored-by: Daan de Schepper Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> --- rdflib/graph.py | 11 +++++------ test/test_issues/test_issue3126.py | 21 +++++++++++++++++++++ 2 files changed, 26 insertions(+), 6 deletions(-) create mode 100644 test/test_issues/test_issue3126.py diff --git a/rdflib/graph.py b/rdflib/graph.py index f9e00f2f91..3a84dcf246 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -1869,12 +1869,11 @@ def do_de_skolemize2(t: _TripleType) -> _TripleType: # type error: Argument 1 to 
"Genid" has incompatible type "Node"; expected "str" s = Genid(s).de_skolemize() # type: ignore[arg-type] - if RDFLibGenid._is_rdflib_skolem(o): - # type error: Argument 1 to "RDFLibGenid" has incompatible type "Node"; expected "str" - o = RDFLibGenid(o).de_skolemize() # type: ignore[arg-type] - elif Genid._is_external_skolem(o): - # type error: Argument 1 to "Genid" has incompatible type "Node"; expected "str" - o = Genid(o).de_skolemize() # type: ignore[arg-type] + if isinstance(o, URIRef): + if RDFLibGenid._is_rdflib_skolem(o): + o = RDFLibGenid(o).de_skolemize() + elif Genid._is_external_skolem(o): + o = Genid(o).de_skolemize() return s, p, o diff --git a/test/test_issues/test_issue3126.py b/test/test_issues/test_issue3126.py new file mode 100644 index 0000000000..321ff7d3e8 --- /dev/null +++ b/test/test_issues/test_issue3126.py @@ -0,0 +1,21 @@ +import pytest + +from rdflib import Graph + + +def test_skolem_de_skolem_roundtrip(): + """Test deskolemization should ignore literals. + + Issue: https://github.com/RDFLib/rdflib/issues/3126 + """ + + nt = ( + ' "http://example.com [some remark]" .' 
+ ) + + graph = Graph().parse(data=nt, format="nt").de_skolemize() + + try: + graph.de_skolemize() + except BaseException as ex: + pytest.fail(f"Unexpected error: {ex}") From 19a0ccf014e2661aa04c2efde925f83ca6b54409 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 19 Sep 2025 09:37:11 +1000 Subject: [PATCH 28/60] chore: prep 7.2.0 release (#3224) * chore: prep 7.2.0 release * chore: remove mention of charter * chore: remove scratch file --- CHANGELOG.md | 119 ++ CITATION.cff | 4 +- README.md | 1 + admin/get_merged_prs.py | 2 +- merged_prs.json | 3939 +++++++++++++++++++++++++++++++++++++++ pyproject.toml | 2 +- rdflib/__init__.py | 2 +- 7 files changed, 4064 insertions(+), 5 deletions(-) create mode 100644 merged_prs.json diff --git a/CHANGELOG.md b/CHANGELOG.md index df3de6f5a0..f8812660a9 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,3 +1,122 @@ +## 2025-09-18 RELEASE 7.2.0 + +This release contains a number of fixes and improvements to RDFLib. + +New features: +- Canonicalization is now optional in the longturtle serializer. When serializing with the `longturtle` format, set `canon=True` to enable canonicalization. 
+- General type hints enhancements + +Fixes: +- Fixed deskolemization of literals +- Fixed round-tripping of floats in N3 - floating point and scientific notation handling +- Fixed CI validation workflow, including regressions in tests +- Fixed `Literal.toPython` date conversion for gYear/gYearMonth +- Fixed namespace prefixes in longturtle serialization +- Fixed missing dot in RDF patch serialization + +PRs merged: + +* 2025-09-18 - [7.x] Fix incorrect deskolemization of literals + [PR #3223](https://github.com/RDFLib/rdflib/pull/3223) +* 2025-09-18 - Fix incorrect deskolemization of literals + [PR #3127](https://github.com/RDFLib/rdflib/pull/3127) +* 2025-09-18 - notation3.py: don't normalize float representation + [PR #3020](https://github.com/RDFLib/rdflib/pull/3020) +* 2025-09-18 - [7.x] notation3.py: don't normalize float representation + [PR #3221](https://github.com/RDFLib/rdflib/pull/3221) +* 2025-09-18 - Merge 7-maintenance branch into 7.x + [PR #3222](https://github.com/RDFLib/rdflib/pull/3222) +* 2025-09-17 - Allow lxml 6 + [PR #3219](https://github.com/RDFLib/rdflib/pull/3219) +* 2025-09-16 - ci: fix firejail command for poetry 2.1.0 + [PR #3218](https://github.com/RDFLib/rdflib/pull/3218) +* 2025-09-16 - chore: address dependabot security vulnerabilities + [PR #3210](https://github.com/RDFLib/rdflib/pull/3210) +* 2025-09-09 - Merge 7-maintenance changes into main + [PR #3202](https://github.com/RDFLib/rdflib/pull/3202) +* 2025-09-08 - feat: canonicalization with longturtle serializer now optional + [PR #3197](https://github.com/RDFLib/rdflib/pull/3197) +* 2025-09-03 - Specify `Optional` parameters in `Graph.triples_choices` + [PR #3075](https://github.com/RDFLib/rdflib/pull/3075) +* 2025-08-29 - Fix failing webtest + [PR #3194](https://github.com/RDFLib/rdflib/pull/3194) +* 2025-08-29 - Revert "remove old hacks against 2to3 (#3076)" + [PR #3195](https://github.com/RDFLib/rdflib/pull/3195) +* 2025-08-18 - Fix #3181 + [PR 
#3182](https://github.com/RDFLib/rdflib/pull/3182) +* 2025-08-18 - Fix contributing guide link in README.md + [PR #3158](https://github.com/RDFLib/rdflib/pull/3158) +* 2025-08-18 - Creation of an RDFLib Charter + [PR #3178](https://github.com/RDFLib/rdflib/pull/3178) +* 2025-08-11 - Feature: Add Tentris Plugin to docs + [PR #3177](https://github.com/RDFLib/rdflib/pull/3177) +* 2025-06-02 - Replacement for #3125 + [PR #3146](https://github.com/RDFLib/rdflib/pull/3146) +* 2025-06-01 - Cope with Namespace annotations in Python 3.14 + [PR #3132](https://github.com/RDFLib/rdflib/pull/3132) +* 2025-06-01 - replace PR 3109; improve plugins modules docs; change header colour t… + [PR #3145](https://github.com/RDFLib/rdflib/pull/3145) +* 2025-06-01 - Pr/3143 + [PR #3144](https://github.com/RDFLib/rdflib/pull/3144) +* 2025-05-31 - fix: remove Literal.toPython date conversion for gYear/gYearMonth + [PR #3115](https://github.com/RDFLib/rdflib/pull/3115) +* 2025-05-31 - fix: do not automatically generate header id in RDF patch generation and fix missing fullstop + [PR #3141](https://github.com/RDFLib/rdflib/pull/3141) +* 2025-05-31 - fix namespace prefixes in longturtle serialization + [PR #3106](https://github.com/RDFLib/rdflib/pull/3106) +* 2025-05-20 - [7.x] fix namespace prefixes in longturtle serialization + [PR #3134](https://github.com/RDFLib/rdflib/pull/3134) +* 2025-05-20 - List on docs the COTTAS store backend + [PR #3139](https://github.com/RDFLib/rdflib/pull/3139) +* 2025-03-29 - 7.1.4 pre-release + [PR #3098](https://github.com/RDFLib/rdflib/pull/3098) + + +* 2025-09-16 - build(deps): bump poetry from 2.0.0 to 2.1.4 in /devtools + [PR #3176](https://github.com/RDFLib/rdflib/pull/3176) +* 2025-09-16 - build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/latest + [PR #3217](https://github.com/RDFLib/rdflib/pull/3217) +* 2025-09-16 - build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/unstable + [PR 
#3216](https://github.com/RDFLib/rdflib/pull/3216) +* 2025-09-16 - build(deps-dev): bump ruff from 0.8.6 to 0.13.0 + [PR #3213](https://github.com/RDFLib/rdflib/pull/3213) +* 2025-09-16 - build(deps-dev): bump pip-tools from 7.4.1 to 7.5.0 + [PR #3211](https://github.com/RDFLib/rdflib/pull/3211) +* 2025-09-16 - build(deps-dev): bump mkdocstrings from 0.29.1 to 0.30.0 + [PR #3214](https://github.com/RDFLib/rdflib/pull/3214) +* 2025-09-12 - build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/unstable + [PR #3183](https://github.com/RDFLib/rdflib/pull/3183) +* 2025-09-12 - build(deps): bump actions/checkout from 4 to 5 + [PR #3184](https://github.com/RDFLib/rdflib/pull/3184) +* 2025-09-12 - build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/latest + [PR #3185](https://github.com/RDFLib/rdflib/pull/3185) +* 2025-09-12 - build(deps): bump actions/setup-java from 4 to 5 + [PR #3189](https://github.com/RDFLib/rdflib/pull/3189) +* 2025-09-12 - build(deps-dev): bump pytest-cov from 6.1.1 to 6.3.0 + [PR #3203](https://github.com/RDFLib/rdflib/pull/3203) +* 2025-09-12 - build(deps-dev): bump typing-extensions from 4.13.2 to 4.15.0 + [PR #3204](https://github.com/RDFLib/rdflib/pull/3204) +* 2025-09-12 - build(deps-dev): bump pytest from 8.3.5 to 8.4.2 + [PR #3205](https://github.com/RDFLib/rdflib/pull/3205) +* 2025-09-12 - build(deps-dev): bump mkdocs-include-markdown-plugin from 7.1.5 to 7.1.7 + [PR #3207](https://github.com/RDFLib/rdflib/pull/3207) +* 2025-09-12 - build(deps): bump actions/setup-python from 5 to 6 + [PR #3206](https://github.com/RDFLib/rdflib/pull/3206) +* 2025-09-12 - build(deps-dev): bump mkdocs-material from 9.6.14 to 9.6.19 + [PR #3208](https://github.com/RDFLib/rdflib/pull/3208) +* 2025-09-12 - build(deps-dev): bump coverage from 7.8.2 to 7.10.6 + [PR #3209](https://github.com/RDFLib/rdflib/pull/3209) +* 2025-05-31 - build(deps-dev): bump coverage from 7.7.1 to 7.8.2 + [PR 
#3142](https://github.com/RDFLib/rdflib/pull/3142) +* 2025-05-31 - build(deps-dev): bump typing-extensions from 4.13.0 to 4.13.2 + [PR #3121](https://github.com/RDFLib/rdflib/pull/3121) +* 2025-05-31 - build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/unstable + [PR #3118](https://github.com/RDFLib/rdflib/pull/3118) +* 2025-05-31 - build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/latest + [PR #3117](https://github.com/RDFLib/rdflib/pull/3117) +* 2025-05-27 - build(deps): bump rdflib from 7.1.2 to 7.1.4 in /docker/latest + [PR #3101](https://github.com/RDFLib/rdflib/pull/3101) + ## 2025-03-29 RELEASE 7.1.4 A tidy-up release with no major updates over 7.1.3. This may be the last 7.x diff --git a/CITATION.cff b/CITATION.cff index 0d37d305cb..8b36270572 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -69,7 +69,7 @@ authors: - family-names: "Stuart" given-names: "Veyndan" title: "RDFLib" -version: 7.1.3 -date-released: 2024-01-18 +version: 7.2.0 +date-released: 2025-09-18 url: "https://github.com/RDFLib/rdflib" doi: 10.5281/zenodo.6845245 diff --git a/README.md b/README.md index 33a3441f43..8ac6f8d3cd 100644 --- a/README.md +++ b/README.md @@ -43,6 +43,7 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases * `main` branch in this repository is the current unstable release - version 8 alpha +* `7.2.0` general fixes and usability improvements, see changelog for details * `7.1.4` tidy-up release, possibly last 7.x release * `7.1.3` current stable release, small improvements to 7.1.1 * `7.1.2` previously deleted release diff --git a/admin/get_merged_prs.py b/admin/get_merged_prs.py index 4b049b4257..c4b9044488 
100644 --- a/admin/get_merged_prs.py +++ b/admin/get_merged_prs.py @@ -5,7 +5,7 @@ import urllib.request # https://api.github.com/search/issues?q=repo:rdflib/rdflib+is:pr+merged:%3E=2023-08-02&per_page=300&page=1 -LAST_RELEASE_DATE = "2025-01-18" +LAST_RELEASE_DATE = "2025-03-29" ISSUES_URL = "https://api.github.com/search/issues" ITEMS = [] PAGE = 1 diff --git a/merged_prs.json b/merged_prs.json new file mode 100644 index 0000000000..3a59d972d4 --- /dev/null +++ b/merged_prs.json @@ -0,0 +1,3939 @@ +[ + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3223", + "id": 3428695947, + "node_id": "PR_kwDOADL-3s6pNAnb", + "number": 3223, + "title": "[7.x] Fix incorrect deskolemization of literals", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-09-18T05:18:09Z", + "updated_at": "2025-09-18T05:30:22Z", + "closed_at": "2025-09-18T05:30:20Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3223", + "html_url": "https://github.com/RDFLib/rdflib/pull/3223", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3223.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3223.patch", + "merged_at": "2025-09-18T05:30:20Z" + }, + "body": "This is the v7 version of PR https://github.com/RDFLib/rdflib/pull/3127.\r\n\r\n* Fix issue 3126\r\n\r\n* [pre-commit.ci] auto fixes from pre-commit.com hooks\r\n\r\nfor more information, see https://pre-commit.ci\r\n\r\n---------\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so 
maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3222", + "id": 3428459459, + "node_id": "PR_kwDOADL-3s6pMNyn", + "number": 3222, + "title": "Merge 7-maintenance branch into 7.x", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-09-18T03:00:50Z", + "updated_at": "2025-09-18T03:08:39Z", + "closed_at": "2025-09-18T03:08:37Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3222", + "html_url": "https://github.com/RDFLib/rdflib/pull/3222", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3222.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3222.patch", + "merged_at": "2025-09-18T03:08:37Z" + }, + "body": "# Summary of changes\r\n\r\nI am merging in the `7-maintenance` branch after reviewing the current set of v7 branches. The `7.x` branch has branch protection rules enabled and is intended to be long-lived for all future v7 related features and fixes.\r\n\r\n`7-maintenance` branch will be closed after this merge. 
All future v7 PRs should target `7.x` instead.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3221", + "id": 3428294912, + "node_id": "PR_kwDOADL-3s6pLqU1", + "number": 3221, + "title": "[7.x] notation3.py: don't normalize float representation", + "user": { + 
"login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 9291256166, + "node_id": "LA_kwDOADL-3s8AAAACKc1RZg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.x", + "name": "7.x", + "color": "95113B", + "default": false, + "description": "" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-09-18T01:31:07Z", + "updated_at": "2025-09-18T04:18:53Z", + "closed_at": "2025-09-18T04:18:51Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3221", + "html_url": "https://github.com/RDFLib/rdflib/pull/3221", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3221.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3221.patch", + "merged_at": "2025-09-18T04:18:51Z" + }, + "body": "# Summary of changes\r\n\r\nCode from PR 
https://github.com/RDFLib/rdflib/pull/3020 into v7.x.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3219", + "id": 3423885690, + "node_id": "PR_kwDOADL-3s6o8vSy", + "number": 3219, + "title": "Allow lxml 6", + "user": { + "login": "jhgit", + "id": 772518, + 
"node_id": "MDQ6VXNlcjc3MjUxOA==", + "avatar_url": "https://avatars.githubusercontent.com/u/772518?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/jhgit", + "html_url": "https://github.com/jhgit", + "followers_url": "https://api.github.com/users/jhgit/followers", + "following_url": "https://api.github.com/users/jhgit/following{/other_user}", + "gists_url": "https://api.github.com/users/jhgit/gists{/gist_id}", + "starred_url": "https://api.github.com/users/jhgit/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/jhgit/subscriptions", + "organizations_url": "https://api.github.com/users/jhgit/orgs", + "repos_url": "https://api.github.com/users/jhgit/repos", + "events_url": "https://api.github.com/users/jhgit/events{/privacy}", + "received_events_url": "https://api.github.com/users/jhgit/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-09-16T22:06:35Z", + "updated_at": "2025-09-17T01:52:22Z", + "closed_at": "2025-09-17T01:52:22Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3219", + "html_url": "https://github.com/RDFLib/rdflib/pull/3219", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3219.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3219.patch", + "merged_at": "2025-09-17T01:52:22Z" + }, + "body": "rdflib builds with lxml 6.0.1 - the current latest release.\r\n\r\nFixes #3220\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nlxml is at 6.0.1. rdflib 7.1.4 builds with that version of lxml. Update pypproject.toml accordingly. 
Tested locally with python 3.9 and 3.11.\r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes (same pytest tests pass or fail with lxml5 as lxml6).\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [x] Considered adding additional documentation. (didn't see any documentation that needed updating)\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3218", + "id": 3420409710, + "node_id": "PR_kwDOADL-3s6oxCzU", + "number": 3218, + "title": "ci: fix firejail command for poetry 2.1.0", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": 
"https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-09-16T05:06:40Z", + "updated_at": "2025-09-16T05:44:53Z", + "closed_at": "2025-09-16T05:44:52Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3218", + "html_url": "https://github.com/RDFLib/rdflib/pull/3218", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3218.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3218.patch", + "merged_at": "2025-09-16T05:44:52Z" + }, + "body": "# Summary of changes\r\n\r\nIncrementally bumping poetry from v2.0.0 to see which patch/minor version breaks the CI.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - 
[ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3217", + "id": 3416326911, + "node_id": "PR_kwDOADL-3s6ojScX", + "number": 3217, + "title": "build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/latest", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + 
"followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-15T05:14:09Z", + "updated_at": "2025-09-16T03:00:23Z", + "closed_at": "2025-09-16T02:59:34Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3217", + "html_url": "https://github.com/RDFLib/rdflib/pull/3217", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3217.diff", + "patch_url": 
"https://github.com/RDFLib/rdflib/pull/3217.patch", + "merged_at": "2025-09-16T02:59:34Z" + }, + "body": "Bumps library/python from `8220cce` to `58c30f5`.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.7-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3216", + "id": 3416322658, + "node_id": "PR_kwDOADL-3s6ojRf8", + "number": 3216, + "title": "build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/unstable", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-15T05:12:27Z", + "updated_at": "2025-09-16T02:47:32Z", + "closed_at": "2025-09-16T02:46:16Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3216", + "html_url": "https://github.com/RDFLib/rdflib/pull/3216", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3216.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3216.patch", + "merged_at": "2025-09-16T02:46:15Z" + }, + "body": "Bumps library/python from `8220cce` to `58c30f5`.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.7-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3214", + "id": 3416315331, + "node_id": "PR_kwDOADL-3s6ojP3g", + "number": 3214, + "title": "build(deps-dev): bump mkdocstrings from 0.29.1 to 0.30.0", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-09-15T05:09:22Z", + "updated_at": "2025-09-16T01:23:50Z", + "closed_at": "2025-09-16T01:23:49Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3214", + "html_url": "https://github.com/RDFLib/rdflib/pull/3214", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3214.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3214.patch", + "merged_at": "2025-09-16T01:23:49Z" + }, + "body": "Bumps [mkdocstrings](https://github.com/mkdocstrings/mkdocstrings) from 0.29.1 to 0.30.0.\n
\nRelease notes\nSourced from mkdocstrings's releases.\n\n0.30.0\n\n0.30.0 - 2025-07-23\nCompare with 0.29.1\n\nFeatures\n• Add data-skip-inventory boolean attribute for elements to skip registration in local inventory (f856160 by Bartosz S\u0142awecki). Issue-671, PR-774\n• Add I18N support (translations) (2b4ed54 by Nyuan Zhang). PR-645, Co-authored-by: Timoth\u00e9e Mazzucotelli dev@pawamoy.fr\n\nChangelog\nSourced from mkdocstrings's changelog.\n\n0.30.0 - 2025-07-23\nCompare with 0.29.1\n\nFeatures\n• Add data-skip-inventory boolean attribute for elements to skip registration in local inventory (f856160 by Bartosz S\u0142awecki). Issue-671, PR-774\n• Add I18N support (translations) (2b4ed54 by Nyuan Zhang). PR-645, Co-authored-by: Timoth\u00e9e Mazzucotelli dev@pawamoy.fr\n\nCommits\n• 2be445f chore: Prepare release 0.30.0\n• f856160 feat: Add data-skip-inventory boolean attribute for elements to skip regist...\n• 2b4ed54 feat: Add I18N support (translations)\n• 51f217f chore: Template upgrade\n• b1da3d0 ci: Ignore Ruff warnings\n• d5bf4e1 docs: Update link to YAML idiosyncrasies\n• See full diff in compare view\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocstrings&package-manager=pip&previous-version=0.29.1&new-version=0.30.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3213", + "id": 3416314480, + "node_id": "PR_kwDOADL-3s6ojPr0", + "number": 3213, + "title": "build(deps-dev): bump ruff from 0.8.6 to 0.13.0", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-09-15T05:09:01Z", + "updated_at": "2025-09-16T02:11:02Z", + "closed_at": "2025-09-16T02:10:33Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3213", + "html_url": "https://github.com/RDFLib/rdflib/pull/3213", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3213.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3213.patch", + "merged_at": "2025-09-16T02:10:32Z" + }, + "body": "Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.6 to 0.13.0.\n
\nRelease notes\nSourced from ruff's releases.\n\n0.13.0\n\nRelease Notes\nCheck out the blog post for a migration guide and overview of the changes!\n\nBreaking changes\n• Several rules can now add from __future__ import annotations automatically\nTC001, TC002, TC003, RUF013, and UP037 now add from __future__ import annotations as part of their fixes when the lint.future-annotations setting is enabled. This allows the rules to move more imports into TYPE_CHECKING blocks (TC001, TC002, and TC003), use PEP 604 union syntax on Python versions before 3.10 (RUF013), and unquote more annotations (UP037).\n• Full module paths are now used to verify first-party modules\nRuff now checks that the full path to a module exists on disk before categorizing it as a first-party import. This change makes first-party import detection more accurate, helping to avoid false positives on local directories with the same name as a third-party dependency, for example. See the FAQ section on import categorization for more details.\n• Deprecated rules must now be selected by exact rule code\nRuff will no longer activate deprecated rules selected by their group name or prefix. As noted below, the two remaining deprecated rules were also removed in this release, so this won't affect any current rules, but it will still affect any deprecations in the future.\n• The deprecated macOS configuration directory fallback has been removed\nRuff will no longer look for a user-level configuration file at ~/Library/Application Support/ruff/ruff.toml on macOS. This feature was deprecated in v0.5 in favor of using the XDG specification (usually resolving to ~/.config/ruff/ruff.toml), like on Linux. The fallback and accompanying deprecation warning have now been removed.\n\nRemoved Rules\nThe following rules have been removed:\n\nStabilization\nThe following rules have been stabilized and are no longer in preview:\nThe following behaviors have been stabilized:\n\n... (truncated)\n\nChangelog\nSourced from ruff's changelog.\n\n0.13.0\nCheck out the blog post for a migration guide and overview of the changes!\n\nBreaking changes\n• Several rules can now add from __future__ import annotations automatically\nTC001, TC002, TC003, RUF013, and UP037 now add from __future__ import annotations as part of their fixes when the lint.future-annotations setting is enabled. This allows the rules to move more imports into TYPE_CHECKING blocks (TC001, TC002, and TC003), use PEP 604 union syntax on Python versions before 3.10 (RUF013), and unquote more annotations (UP037).\n• Full module paths are now used to verify first-party modules\nRuff now checks that the full path to a module exists on disk before categorizing it as a first-party import. This change makes first-party import detection more accurate, helping to avoid false positives on local directories with the same name as a third-party dependency, for example. See the FAQ section on import categorization for more details.\n• Deprecated rules must now be selected by exact rule code\nRuff will no longer activate deprecated rules selected by their group name or prefix. As noted below, the two remaining deprecated rules were also removed in this release, so this won't affect any current rules, but it will still affect any deprecations in the future.\n• The deprecated macOS configuration directory fallback has been removed\nRuff will no longer look for a user-level configuration file at ~/Library/Application Support/ruff/ruff.toml on macOS. This feature was deprecated in v0.5 in favor of using the XDG specification (usually resolving to ~/.config/ruff/ruff.toml), like on Linux. The fallback and accompanying deprecation warning have now been removed.\n\nRemoved Rules\nThe following rules have been removed:\n\nStabilization\nThe following rules have been stabilized and are no longer in preview:\n\n... (truncated)\n\nCommits\n• a1fdd66 Bump 0.13.0 (#20336)\n• 8770b95 [ty] introduce DivergentType (#20312)\n• 65982a1 [ty] Use 'unknown' specialization for upper bound on Self (#20325)\n• 57d1f71 [ty] Simplify unions of enum literals and subtypes thereof (#20324)\n• 7a75702 Ignore deprecated rules unless selected by exact code (#20167)\n• 9ca632c Stabilize adding future import via config option (#20277)\n• 64fe7d3 [flake8-errmsg] Stabilize extending raw-string-in-exception (EM101) to ...\n• beeeb8d Stabilize the remaining Airflow rules (#20250)\n• b6fca52 [flake8-bugbear] Stabilize support for non-context-manager calls in `assert...\n• ac7f882 [flake8-commas] Stabilize support for trailing comma checks in type paramet...\n• Additional commits viewable in compare view\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.6&new-version=0.13.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3211", + "id": 3416312512, + "node_id": "PR_kwDOADL-3s6ojPQZ", + "number": 3211, + "title": "build(deps-dev): bump pip-tools from 7.4.1 to 7.5.0", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-15T05:08:08Z", + "updated_at": "2025-09-16T01:36:07Z", + "closed_at": "2025-09-16T01:35:09Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3211", + "html_url": "https://github.com/RDFLib/rdflib/pull/3211", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3211.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3211.patch", + "merged_at": "2025-09-16T01:35:09Z" + }, + "body": "Bumps [pip-tools](https://github.com/jazzband/pip-tools) from 7.4.1 to 7.5.0.\n
Release notes

Sourced from pip-tools's releases.

v7.5.0

2025-07-30

Bug fixes

  • Fixed the ordering of format controls to preserve underlying pip behavior -- by @sethmlarson. (PRs and issues: #2082)
  • Fixed NoCandidateFound exception to be compatible with pip >= 24.1 -- by @chrysle. (PRs and issues: #2083)
  • pip-compile now produces relative paths for editable dependencies -- by @macro1. (PRs and issues: #2087)
  • Fixed crash failures due to incompatibility with pip >= 25.1 -- by @gkreitz and @sirosen. (PRs and issues: #2176, #2178)

Features

  • pip-compile now treats package versions requested on the command line as constraints for the underlying pip usage. This applies to build deps in addition to normal package requirements. -- by @chrysle (PRs and issues: #2106)
  • pip-tools now tests on and officially supports Python 3.12 -- by @sirosen. (PRs and issues: #2188)
  • Requirements file paths in pip-compile output are now normalized to POSIX-style, even when pip-compile is run on Windows. This provides more consistent output across various platforms. -- by @sirosen (PRs and issues: #2195)
  • pip-tools now tests against and supports pip up to version 25.1 -- by @sirosen. (PRs and issues: #2195)

Removals and backward incompatible breaking changes

  • pip-compile will now relativize the requirements paths which are recorded in its output. Paths are made relative to the working directory. This provides more consistent results across pip versions.

... (truncated)
Changelog

Sourced from pip-tools's changelog.

v7.5.0

2025-07-30

Bug fixes

  • Fixed the ordering of format controls to preserve underlying pip behavior -- by {user}sethmlarson. (PRs and issues: {issue}2082)
  • Fixed NoCandidateFound exception to be compatible with pip >= 24.1 -- by {user}chrysle. (PRs and issues: {issue}2083)
  • pip-compile now produces relative paths for editable dependencies -- by {user}macro1. (PRs and issues: {issue}2087)
  • Fixed crash failures due to incompatibility with pip >= 25.1 -- by {user}gkreitz and {user}sirosen. (PRs and issues: {issue}2176, {issue}2178)

Features

  • pip-compile now treats package versions requested on the command line as constraints for the underlying pip usage. This applies to build deps in addition to normal package requirements. -- by {user}chrysle (PRs and issues: {issue}2106)
  • pip-tools now tests on and officially supports Python 3.12 -- by {user}sirosen. (PRs and issues: {issue}2188)
  • Requirements file paths in pip-compile output are now normalized to POSIX-style, even when pip-compile is run on Windows. This provides more consistent output across various platforms. -- by {user}sirosen (PRs and issues: {issue}2195)
  • pip-tools now tests against and supports pip up to version 25.1

... (truncated)
\n
Commits

  • debe5a4 Update changelog for version 7.5.0
  • 1c7d9fb Merge pull request #2210 from webknjaz/bugfixes/release-env-context-access
  • 96ed4d2 Merge pull request #2209 from webknjaz/maintenance/release-attestations-cleanup
  • a180dd9 📝 Link the PR #2209 change note to PR #2149
  • 7f9512a 📝 Link the PR #2210 change note to PR #2149
  • 396da33 Run the dist build job in PRs
  • 7b1c22c Fix accessing repo id in the release workflow
  • 05daad6 Drop release attestations for Jazzband upload
  • b4ddd75 Merge pull request #2203 from sirosen/use-towncrier
  • a136172 Add a run of 'changelog-draft' to QA CI jobs
  • Additional commits viewable in compare view
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pip-tools&package-manager=pip&previous-version=7.4.1&new-version=7.5.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot will merge this PR once CI passes on it, as requested by @edmondchuc.

---
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3210", + "id": 3415928342, + "node_id": "PR_kwDOADL-3s6oh7Eb", + "number": 3210, + "title": "chore: address dependabot security vulnerabilities", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": 
"public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-09-15T01:24:49Z", + "updated_at": "2025-09-16T03:18:21Z", + "closed_at": "2025-09-16T03:18:19Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3210", + "html_url": "https://github.com/RDFLib/rdflib/pull/3210", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3210.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3210.patch", + "merged_at": "2025-09-16T03:18:19Z" + }, + "body": "# Summary of changes\r\n\r\n- Upgrade [urllib3](https://pypi.org/project/urllib3/) to `2.5.0`\r\n - https://github.com/RDFLib/rdflib/security/dependabot/25\r\n - https://github.com/RDFLib/rdflib/security/dependabot/27\r\n- Upgrade [requests](https://pypi.org/project/requests/) to `2.32.5`\r\n - https://github.com/RDFLib/rdflib/security/dependabot/24\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep 
your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3209", + "id": 3392612128, + "node_id": "PR_kwDOADL-3s6nTQkG", + "number": 3209, + "title": "build(deps-dev): bump coverage from 7.8.2 to 7.10.6", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-09-08T05:03:55Z", + "updated_at": "2025-09-12T03:48:06Z", + "closed_at": "2025-09-12T03:48:04Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3209", + "html_url": "https://github.com/RDFLib/rdflib/pull/3209", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3209.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3209.patch", + "merged_at": "2025-09-12T03:48:04Z" + }, + "body": "Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.8.2 to 7.10.6.\n
Changelog

Sourced from coverage's changelog.

Version 7.10.6 — 2025-08-29

  • Fix: source directories were not properly communicated to subprocesses that ran in different directories, as reported in issue 1499_. This is now fixed.
  • Performance: Alex Gaynor continues fine-tuning <pull 2038_>_ the speed of combination, especially with many contexts.

.. _issue 1499: nedbat/coveragepy#1499
.. _pull 2038: nedbat/coveragepy#2038

.. _changes_7-10-5:

Version 7.10.5 — 2025-08-23

  • Big speed improvements for coverage combine: it's now about twice as fast! Huge thanks to Alex Gaynor for pull requests 2032 <pull 2032_>, 2033 <pull 2033_>, and 2034 <pull 2034_>_.

.. _pull 2032: nedbat/coveragepy#2032
.. _pull 2033: nedbat/coveragepy#2033
.. _pull 2034: nedbat/coveragepy#2034

.. _changes_7-10-4:

Version 7.10.4 — 2025-08-16

  • Added patch = fork for times when the built-in forking support is insufficient.
  • Fix: patch = execv also inherits the entire coverage configuration now.

.. _changes_7-10-3:

Version 7.10.3 — 2025-08-10

  • Fixes for patch = subprocess:
      • If subprocesses spawned yet more subprocesses simultaneously, some coverage could be missed. This is now fixed, closing issue 2024_.
      • If subprocesses were created in other directories, their data files were

... (truncated)
Commits

  • 88c55ff docs: sample HTML for 7.10.6
  • 01d8995 docs: prep for 7.10.6
  • 9b0c24f docs: thanks Alex #2038
  • 66d6910 fix: make source paths absolute where they exist. #1499
  • bb3382f build: no need for the combine/html times now
  • 9ea349a lab: warn_executed.py
  • 808c9b4 build: changing metacov.ini should trigger metacov
  • 384f5f2 build: oops, some 'if's are really line pragmas
  • a7224af perf: pre-compute the mapping between other_db.context and main.context (#2038)
  • 5c00c5b chore: bump the action-dependencies group with 3 updates (#2039)
  • Additional commits viewable in compare view
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coverage&package-manager=pip&previous-version=7.8.2&new-version=7.10.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

---
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3208", + "id": 3392611610, + "node_id": "PR_kwDOADL-3s6nTQc1", + "number": 3208, + "title": "build(deps-dev): bump mkdocs-material from 9.6.14 to 9.6.19", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-09-08T05:03:41Z", + "updated_at": "2025-09-12T04:00:23Z", + "closed_at": "2025-09-12T03:59:59Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3208", + "html_url": "https://github.com/RDFLib/rdflib/pull/3208", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3208.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3208.patch", + "merged_at": "2025-09-12T03:59:59Z" + }, + "body": "Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.6.14 to 9.6.19.\n
Release notes

Sourced from mkdocs-material's releases.

mkdocs-material-9.6.19

  • Added support for Python 3.14
  • Updated Bahasa Malaysia translations

mkdocs-material-9.6.18

  • Updated Azerbaijani translations
  • Fixed last compat issues with minijinja, now 100% compatible

mkdocs-material-9.6.17

  • Fixed #8396: Videos do not autoplay when inside a content tab
  • Fixed #8394: Stroke width not effective in Mermaid.js diagrams
  • Fixed disappearing version selector when hiding page title

mkdocs-material-9.6.16

  • Fixed #8349: Info plugin doesn't correctly detect virtualenv in some cases
  • Fixed #8334: Find-in-page detects matches in hidden search result list

mkdocs-material-9.6.15

  • Updated Mongolian translations
  • Improved semantic markup of "edit this page" button
  • Improved info plugin virtual environment resolution
  • Fixed #8291: Large font size setting throws off breakpoints in JavaScript
Changelog

Sourced from mkdocs-material's changelog.

mkdocs-material-9.6.19 (2025-09-07)

  • Added support for Python 3.14
  • Updated Bahasa Malaysia translations

mkdocs-material-9.6.18 (2025-08-22)

  • Updated Azerbaijani translations
  • Fixed last compat issues with [minijinja], now 100% compatible

mkdocs-material-9.6.17 (2025-08-15)

  • Fixed #8396: Videos do not autoplay when inside a content tab
  • Fixed #8394: Stroke width not effective in Mermaid.js diagrams
  • Fixed disappearing version selector when hiding page title

mkdocs-material-9.6.16 (2025-07-26)

  • Fixed #8349: Info plugin doesn't correctly detect virtualenv in some cases
  • Fixed #8334: Find-in-page detects matches in hidden search result list

mkdocs-material-9.6.15 (2025-07-01)

  • Updated Mongolian translations
  • Improved semantic markup of "edit this page" button
  • Improved info plugin virtual environment resolution
  • Fixed #8291: Large font size setting throws off breakpoints in JavaScript

mkdocs-material-9.6.14 (2025-05-13)

  • Fixed #8215: Social plugin crashes when CairoSVG is updated to 2.8

mkdocs-material-9.6.13 (2025-05-10)

  • Fixed #8204: Annotations showing list markers in print view
  • Fixed #8153: Improve style of cardinality symbols in Mermaid.js ER diagrams

mkdocs-material-9.6.12 (2025-04-17)

  • Fixed #8158: Flip footnote back reference icon for right-to-left languages

mkdocs-material-9.6.11 (2025-04-01)

  • Updated Docker image to latest Alpine Linux
  • Bump required Jinja version to 3.1
  • Fixed #8133: Jinja filter items not available (9.6.10 regression)
  • Fixed #8128: Search plugin not entirely disabled via enabled setting

mkdocs-material-9.6.10 (2025-03-30)

... (truncated)
Commits

  • 2fe55ee Prepare 9.6.19 release
  • c9d5303 Documentation
  • 3a0cea1 Bump actions/upload-pages-artifact from 3 to 4
  • 3026a57 Bump actions/checkout from 4 to 5
  • cb1fc6f Updated dependencies
  • 1f3c48e Fixed pillow version range
  • 13c9c77 Added pillow 11 to supported version range
  • 0d262ec Documentation
  • 97ae22f Updated Premium sponsors
  • ee6484e Updated Premium sponsors
  • Additional commits viewable in compare view
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocs-material&package-manager=pip&previous-version=9.6.14&new-version=9.6.19)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot will merge this PR once CI passes on it, as requested by @edmondchuc.

---
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3207", + "id": 3392609945, + "node_id": "PR_kwDOADL-3s6nTQFT", + "number": 3207, + "title": "build(deps-dev): bump mkdocs-include-markdown-plugin from 7.1.5 to 7.1.7", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-08T05:02:56Z", + "updated_at": "2025-09-12T04:23:57Z", + "closed_at": "2025-09-12T04:23:40Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3207", + "html_url": "https://github.com/RDFLib/rdflib/pull/3207", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3207.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3207.patch", + "merged_at": "2025-09-12T04:23:40Z" + }, + "body": "Bumps [mkdocs-include-markdown-plugin](https://github.com/mondeja/mkdocs-include-markdown-plugin) from 7.1.5 to 7.1.7.\n
\nRelease notes\n

Sourced from mkdocs-include-markdown-plugin's releases.

v7.1.7

Bug fixes
- Fix passing negative values to heading-offset argument of include-markdown directive.

v7.1.6

Bug fixes
- Fix internal anchor in included file incorrectly rewritten.
\nCommits\n\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocs-include-markdown-plugin&package-manager=pip&previous-version=7.1.5&new-version=7.1.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3206", + "id": 3392609909, + "node_id": "PR_kwDOADL-3s6nTQEx", + "number": 3206, + "title": "build(deps): bump actions/setup-python from 5 to 6", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4132956439, + "node_id": "LA_kwDOADL-3s72V-kX", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", + "name": "github_actions", + "color": "000000", + "default": false, + "description": "Pull requests that update GitHub Actions code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 4, + "created_at": "2025-09-08T05:02:55Z", + "updated_at": "2025-09-12T04:12:29Z", + "closed_at": "2025-09-12T04:11:27Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3206", + "html_url": "https://github.com/RDFLib/rdflib/pull/3206", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3206.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3206.patch", + "merged_at": "2025-09-12T04:11:26Z" + }, + "body": "Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.\n
\nRelease notes\n

Sourced from actions/setup-python's releases.

v6.0.0

What's Changed

Breaking Changes
Make sure your runner is on version v2.327.1 or later to ensure compatibility with this release. See Release Notes

Enhancements:

Bug fixes:

Dependency updates:

New Contributors

Full Changelog: https://github.com/actions/setup-python/compare/v5...v6.0.0

v5.6.0

What's Changed
Full Changelog: https://github.com/actions/setup-python/compare/v5...v5.6.0

v5.5.0

What's Changed

Enhancements:

Bug fixes:

... (truncated)
\n
\nCommits\n
- e797f83 Upgrade to node 24 (#1164)
- 3d1e2d2 Revert "Enhance cache-dependency-path handling to support files outside the w...
- 65b0712 Clarify pythonLocation behavior for PyPy and GraalPy in environment variables...
- 5b668cf Bump actions/checkout from 4 to 5 (#1181)
- f62a0e2 Change missing cache directory error to warning (#1182)
- 9322b3c Upgrade setuptools to 78.1.1 to fix path traversal vulnerability in PackageIn...
- fbeb884 Bump form-data to fix critical vulnerabilities #182 & #183 (#1163)
- 03bb615 Bump idna from 2.9 to 3.7 in /tests/data (#843)
- 36da51d Add version parsing from Pipfile (#1067)
- 3c6f142 update documentation (#1156)
- Additional commits viewable in compare view
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/setup-python&package-manager=github_actions&previous-version=5&new-version=6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3205", + "id": 3392609468, + "node_id": "PR_kwDOADL-3s6nTP-k", + "number": 3205, + "title": "build(deps-dev): bump pytest from 8.3.5 to 8.4.2", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-09-08T05:02:44Z", + "updated_at": "2025-09-12T04:35:58Z", + "closed_at": "2025-09-12T04:34:37Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3205", + "html_url": "https://github.com/RDFLib/rdflib/pull/3205", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3205.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3205.patch", + "merged_at": "2025-09-12T04:34:37Z" + }, + "body": "Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.5 to 8.4.2.\n
\nRelease notes\n

Sourced from pytest's releases.

8.4.2

pytest 8.4.2 (2025-09-03)

Bug fixes
- #13478: Fixed a crash when using console_output_style with times and a module is skipped.
- #13530: Fixed a crash when using pytest.approx and decimal.Decimal instances with the decimal.FloatOperation trap set.
- #13549: No longer evaluate type annotations in Python 3.14 when inspecting function signatures. This prevents crashes during module collection when modules do not explicitly use from __future__ import annotations and import types for annotations within an if TYPE_CHECKING: block.
- #13559: Added missing int and float variants to the Literal type annotation of the type parameter in pytest.Parser.addini.
- #13563: pytest.approx now only imports numpy if NumPy is already in sys.modules. This fixes unconditional import behavior introduced in 8.4.0.

Improved documentation
- #13577: Clarify that pytest_generate_tests is discovered in test modules/classes; other hooks must be in conftest.py or plugins.

Contributor-facing changes
- #13480: Self-testing: fixed a few test failures when run with -Wdefault or a similar override.
- #13547: Self-testing: corrected expected message for test_doctest_unexpected_exception in Python 3.14.
- #13684: Make pytest's own testsuite insensitive to the presence of the CI environment variable -- by ogrisel.

8.4.1

pytest 8.4.1 (2025-06-17)

Bug fixes
- #13461: Corrected _pytest.terminal.TerminalReporter.isatty to support being called as a method. Before it was just a boolean, which could break correct code when using -o log_cli=true.
- #13477: Reintroduced pytest.PytestReturnNotNoneWarning, which was removed by accident in pytest 8.4. This warning is raised when a test function returns a value other than None, which is often a mistake made by beginners. See return-not-none for more information.
- #13497: Fixed compatibility with Twisted 25+.

Improved documentation
- #13492: Fixed outdated warning about faulthandler not working on Windows.

8.4.0

pytest 8.4.0 (2025-06-02)

... (truncated)
\nCommits\n
- bfae422 Prepare release version 8.4.2
- 8990538 Fix passenv CI in tox ini and make tests insensitive to the presence of the C...
- ca676bf Merge pull request #13687 from pytest-dev/patchback/backports/8.4.x/e63f6e51c...
- 975a60a Merge pull request #13686 from pytest-dev/patchback/backports/8.4.x/12bde8af6...
- 7723ce8 Merge pull request #13683 from even-even/fix_Exeption_to_Exception_in_errorMe...
- b7f0568 Merge pull request #13685 from CoretexShadow/fix/docs-pytest-generate-tests
- 2c94c4a add missing colon (#13640) (#13641)
- c3d7684 Merge pull request #13606 from pytest-dev/patchback/backports/8.4.x/5f9938563...
- dc6e3be Merge pull request #13605 from The-Compiler/training-update-2025-07
- f87289c Fix crash with times output style and skipped module (#13573) (#13579)
- Additional commits viewable in compare view
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest&package-manager=pip&previous-version=8.3.5&new-version=8.4.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3204", + "id": 3392609060, + "node_id": "PR_kwDOADL-3s6nTP41", + "number": 3204, + "title": "build(deps-dev): bump typing-extensions from 4.13.2 to 4.15.0", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-08T05:02:32Z", + "updated_at": "2025-09-12T05:00:39Z", + "closed_at": "2025-09-12T04:59:56Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3204", + "html_url": "https://github.com/RDFLib/rdflib/pull/3204", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3204.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3204.patch", + "merged_at": "2025-09-12T04:59:56Z" + }, + "body": "Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.13.2 to 4.15.0.\n
\nRelease notes\n

Sourced from typing-extensions's releases.

4.15.0

No user-facing changes since 4.15.0rc1.

New features since 4.14.1:
- Add the @typing_extensions.disjoint_base decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add typing_extensions.type_repr, a backport of annotationlib.type_repr, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in typing_extensions.evaluate_forward_ref. Backport of CPython PR #137227 by Jelle Zijlstra.

4.15.0rc1
- Add the @typing_extensions.disjoint_base decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add typing_extensions.type_repr, a backport of annotationlib.type_repr, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in typing_extensions.evaluate_forward_ref. Backport of CPython PR #137227 by Jelle Zijlstra.

4.14.1

Release 4.14.1 (July 4, 2025)
- Fix usage of typing_extensions.TypedDict nested inside other types (e.g., typing.Type[typing_extensions.TypedDict]). This is not allowed by the type system but worked on older versions, so we maintain support.

4.14.0

This release adds several new features, including experimental support for inline typed dictionaries (PEP 764) and sentinels (PEP 661), and support for changes in Python 3.14. In addition, Python 3.8 is no longer supported.

Changes since 4.14.0rc1:
- Remove __or__ and __ror__ methods from typing_extensions.Sentinel on Python versions <3.10. PEP 604 was introduced in Python 3.10, and typing_extensions does not generally attempt to backport PEP-604 methods to prior versions.
- Further update typing_extensions.evaluate_forward_ref with changes in Python 3.14.

Changes included in 4.14.0rc1:
- Drop support for Python 3.8 (including PyPy-3.8). Patch by Victorien Plot.
- Do not attempt to re-export names that have been removed from typing, anticipating the removal of typing.no_type_check_decorator in Python 3.15. Patch by Jelle Zijlstra.
- Update typing_extensions.Format, typing_extensions.evaluate_forward_ref, and typing_extensions.TypedDict to align

... (truncated)
\n
\nChangelog\n

Sourced from typing-extensions's changelog.

Release 4.15.0 (August 25, 2025)

No user-facing changes since 4.15.0rc1.

Release 4.15.0rc1 (August 18, 2025)
- Add the @typing_extensions.disjoint_base decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add typing_extensions.type_repr, a backport of annotationlib.type_repr, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in typing_extensions.evaluate_forward_ref. Backport of CPython PR #137227 by Jelle Zijlstra.

Release 4.14.1 (July 4, 2025)
- Fix usage of typing_extensions.TypedDict nested inside other types (e.g., typing.Type[typing_extensions.TypedDict]). This is not allowed by the type system but worked on older versions, so we maintain support.

Release 4.14.0 (June 2, 2025)

Changes since 4.14.0rc1:
- Remove __or__ and __ror__ methods from typing_extensions.Sentinel on Python versions <3.10. PEP 604 was introduced in Python 3.10, and typing_extensions does not generally attempt to backport PEP-604 methods to prior versions.
- Further update typing_extensions.evaluate_forward_ref with changes in Python 3.14.

Release 4.14.0rc1 (May 24, 2025)
- Drop support for Python 3.8 (including PyPy-3.8). Patch by Victorien Plot.
- Do not attempt to re-export names that have been removed from typing, anticipating the removal of typing.no_type_check_decorator in Python 3.15. Patch by Jelle Zijlstra.
- Update typing_extensions.Format, typing_extensions.evaluate_forward_ref, and typing_extensions.TypedDict to align with changes in Python 3.14. Patches by Jelle Zijlstra.
- Fix tests for Python 3.14 and 3.15. Patches by Jelle Zijlstra.

New features:
- Add support for inline typed dictionaries (PEP 764). Patch by Victorien Plot.
- Add typing_extensions.Reader and typing_extensions.Writer. Patch by Sebastian Rittau.
- Add support for sentinels (PEP 661). Patch by Victorien Plot.
\nCommits\n\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typing-extensions&package-manager=pip&previous-version=4.13.2&new-version=4.15.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3203", + "id": 3392607684, + "node_id": "PR_kwDOADL-3s6nTPlp", + "number": 3203, + "title": "build(deps-dev): bump pytest-cov from 6.1.1 to 6.3.0", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-09-08T05:01:54Z", + "updated_at": "2025-09-12T05:12:59Z", + "closed_at": "2025-09-12T05:12:15Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3203", + "html_url": "https://github.com/RDFLib/rdflib/pull/3203", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3203.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3203.patch", + "merged_at": "2025-09-12T05:12:15Z" + }, + "body": "Bumps [pytest-cov](https://github.com/pytest-dev/pytest-cov) from 6.1.1 to 6.3.0.\n
\nChangelog\n

Sourced from pytest-cov's changelog.

6.3.0 (2025-09-06)

  • Added support for markdown reports. Contributed by Marcos Boger in [#712](https://github.com/pytest-dev/pytest-cov/pull/712) and [#714](https://github.com/pytest-dev/pytest-cov/pull/714).
  • Fixed some formatting issues in docs. Anonymous contribution in [#706](https://github.com/pytest-dev/pytest-cov/pull/706).

6.2.1 (2025-06-12)

  • Added a version requirement for pytest's pluggy dependency (1.2.0, released 2023-06-21) that has the required new-style hookwrapper API.
  • Removed deprecated license classifier (packaging).
  • Disabled coverage warnings in two more situations where they have no value:
    • "module-not-measured" in workers
    • "already-imported" in subprocesses

6.2.0 (2025-06-11)

  • The plugin now adds 3 rules in the filter warnings configuration to prevent common coverage warnings being raised as obscure errors::

        default:unclosed database in <sqlite3.Connection object at:ResourceWarning
        once::PytestCovWarning
        once::CoverageWarning

    This fixes most of the bad interactions that are occurring on pytest 8.4 with filterwarnings=error.

    The plugin will check if there are already matching rules for the 3 categories (ResourceWarning, PytestCovWarning, CoverageWarning) and message (unclosed database in <sqlite3.Connection object at) before adding the filters.

    This means you can have this in your pytest configuration for complete oblivion (not recommended, if that is not clear)::

        filterwarnings = [
            "error",
            "ignore:unclosed database in <sqlite3.Connection object at:ResourceWarning",
            "ignore::PytestCovWarning",
            "ignore::CoverageWarning",
        ]

\nCommits\n

  • a69d1ab Bump version: 6.2.1 \u2192 6.3.0
  • 475bf32 Update changelog.
  • 3834009 Add GitHub Actions example and fix example to not break with default markdown...
  • 0824728 Small phrasing adustments in Markdown docs
  • 474c1f4 Move markdown dest files check to StoreReport for earlier error and parser.er...
  • 7b21833 Default markdown-append to coverage.md and raise warning if both markdown opt...
  • 3a15312 Fix usage of Path.open() to write/append to files
  • 4b79449 Change output file cov-append.md in md-append example
  • 40e9e8e Add docs and update AUTHORS.rst
  • f5ca33a Add tests for markdown and markdown-append
  • Additional commits viewable in compare view
\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest-cov&package-manager=pip&previous-version=6.1.1&new-version=6.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3202", + "id": 3382370163, + "node_id": "PR_kwDOADL-3s6myRLC", + "number": 3202, + "title": "Merge 7-maintenance changes into main", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + 
"site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-09-04T06:37:03Z", + "updated_at": "2025-09-09T03:33:26Z", + "closed_at": "2025-09-09T03:33:25Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3202", + "html_url": "https://github.com/RDFLib/rdflib/pull/3202", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3202.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3202.patch", + "merged_at": "2025-09-09T03:33:25Z" + }, + "body": "# Summary of changes\r\n\r\nThis PR integrates the recent features and bug fixes from `7-maintenance` branch into `main`. This will be merged when the 7.2.0 version is released.\r\n\r\nMany merge conflicts were resolved and all tests and checks are passing.\r\n\r\nThis PR supersedes https://github.com/RDFLib/rdflib/pull/3199.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + 
"reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3197", + "id": 3378517619, + "node_id": "PR_kwDOADL-3s6mlSqI", + "number": 3197, + "title": "feat: canonicalization with longturtle serializer now optional", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + 
"user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-09-03T06:41:22Z", + "updated_at": "2025-09-08T01:31:53Z", + "closed_at": "2025-09-08T01:31:52Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3197", + "html_url": "https://github.com/RDFLib/rdflib/pull/3197", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3197.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3197.patch", + "merged_at": "2025-09-08T01:31:52Z" + }, + "body": "Fixes https://github.com/RDFLib/rdflib/issues/3196\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 
0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3195", + "id": 3365279463, + "node_id": "PR_kwDOADL-3s6l6NCn", + "number": 3195, + "title": "Revert \"remove old hacks against 2to3 (#3076)\"", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 7242799529, + "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", + "name": "7.1", + "color": "FC7848", + 
"default": false, + "description": "Issues planned to fix in v7.1" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-08-29T03:03:18Z", + "updated_at": "2025-08-29T03:50:11Z", + "closed_at": "2025-08-29T03:50:10Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3195", + "html_url": "https://github.com/RDFLib/rdflib/pull/3195", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3195.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3195.patch", + "merged_at": "2025-08-29T03:50:10Z" + }, + "body": "This reverts commit b74c6574fd982b410aed1aa43853eed37504bf15.\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nFixes https://github.com/RDFLib/rdflib/issues/3193\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 
0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3194", + "id": 3365240233, + "node_id": "PR_kwDOADL-3s6l6E02", + "number": 3194, + "title": "Fix failing webtest", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 7242799529, + "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", 
+ "name": "7.1", + "color": "FC7848", + "default": false, + "description": "Issues planned to fix in v7.1" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-08-29T02:35:40Z", + "updated_at": "2025-08-29T05:33:39Z", + "closed_at": "2025-08-29T05:33:36Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3194", + "html_url": "https://github.com/RDFLib/rdflib/pull/3194", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3194.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3194.patch", + "merged_at": "2025-08-29T05:33:36Z" + }, + "body": "\r\n\r\n# Summary of changes\r\n\r\nFixes https://github.com/RDFLib/rdflib/issues/3192\r\n\r\nNote: this is a cascading PR and includes https://github.com/RDFLib/rdflib/pull/3195 to ensure all fixes to tests are applied before merging into `7-maintenance` branch.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to 
date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3189", + "id": 3351175415, + "node_id": "PR_kwDOADL-3s6lLaSS", + "number": 3189, + "title": "build(deps): bump actions/setup-java from 4 to 5", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4132956439, + "node_id": "LA_kwDOADL-3s72V-kX", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", + "name": "github_actions", + "color": "000000", + "default": false, + "description": "Pull requests that update GitHub Actions code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-08-25T09:48:26Z", + "updated_at": "2025-09-12T05:24:57Z", + "closed_at": "2025-09-12T05:24:16Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3189", + "html_url": "https://github.com/RDFLib/rdflib/pull/3189", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3189.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3189.patch", + "merged_at": "2025-09-12T05:24:16Z" + }, + "body": "Bumps [actions/setup-java](https://github.com/actions/setup-java) from 4 to 5.\n
\nRelease notes\n

Sourced from actions/setup-java's releases.

v5.0.0

What's Changed

Breaking Changes

Make sure your runner is updated to this version or newer to use this release. v2.327.1 Release Notes

Dependency Upgrades

Bug Fixes

New Contributors

Full Changelog: https://github.com/actions/setup-java/compare/v4...v5.0.0

v4.7.1

What's Changed

Documentation changes

Dependency updates:

Full Changelog: https://github.com/actions/setup-java/compare/v4...v4.7.1

v4.7.0

What's Changed

... (truncated)

\nCommits\n

  • dded088 Bump actions/checkout from 4 to 5 (#896)
  • 0913e9a Upgrade to node 24 (#888)
  • e9343db Bumps form-data (#887)
  • ae2b61d Bump undici from 5.28.5 to 5.29.0 (#833)
  • c190c18 Bump eslint-plugin-jest from 27.9.0 to 29.0.1 (#730)
  • 67aec00 Fix: prevent default installation of JetBrains pre-releases (#859)
  • ebb356c Improve Error Handling for Setup-Java Action to Help Debug Intermittent Failu...
  • f4f1212 Update publish-immutable-actions.yml (#798)
  • See full diff in compare view
\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/setup-java&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3185", + "id": 3329618398, + "node_id": "PR_kwDOADL-3s6kDWYr", + "number": 3185, + "title": "build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/latest", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-08-18T07:56:51Z", + "updated_at": "2025-09-12T05:36:21Z", + "closed_at": "2025-09-12T05:35:57Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3185", + "html_url": "https://github.com/RDFLib/rdflib/pull/3185", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3185.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3185.patch", + "merged_at": "2025-09-12T05:35:57Z" + }, + "body": "Bumps library/python from 3.13.3-slim to 3.13.7-slim.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.3-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3184", + "id": 3329609992, + "node_id": "PR_kwDOADL-3s6kDUnS", + "number": 3184, + "title": "build(deps): bump actions/checkout from 4 to 5", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4132956439, + "node_id": "LA_kwDOADL-3s72V-kX", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", + "name": "github_actions", + "color": "000000", + "default": false, + "description": "Pull requests that update GitHub Actions code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-08-18T07:54:08Z", + "updated_at": "2025-09-12T05:46:47Z", + "closed_at": "2025-09-12T05:46:06Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3184", + "html_url": "https://github.com/RDFLib/rdflib/pull/3184", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3184.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3184.patch", + "merged_at": "2025-09-12T05:46:06Z" + }, + "body": "Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.\n
\nRelease notes\nSourced from actions/checkout's releases.\n\nv5.0.0\n\nWhat's Changed\n\n\u26a0\ufe0f Minimum Compatible Runner Version\n\nv2.327.1\nRelease Notes\n\nMake sure your runner is updated to this version or newer to use this release.\n\nFull Changelog: https://github.com/actions/checkout/compare/v4...v5.0.0\n\nv4.3.0\n\nWhat's Changed\n\nNew Contributors\n\nFull Changelog: https://github.com/actions/checkout/compare/v4...v4.3.0\n\nv4.2.2\n\nWhat's Changed\n\nFull Changelog: https://github.com/actions/checkout/compare/v4.2.1...v4.2.2\n\nv4.2.1\n\nWhat's Changed\n\nNew Contributors\n\nFull Changelog: https://github.com/actions/checkout/compare/v4.2.0...v4.2.1\n\n... (truncated)\n\nChangelog\nSourced from actions/checkout's changelog.\n\nChangelog\n\nV5.0.0\n\nV4.3.0\n\nv4.2.2\n\nv4.2.1\n\nv4.2.0\n\nv4.1.7\n\nv4.1.6\n\nv4.1.5\n\nv4.1.4\n\nv4.1.3\n\n... (truncated)\n\nCommits\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/checkout&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3183", + "id": 3329595517, + "node_id": "PR_kwDOADL-3s6kDRhT", + "number": 3183, + "title": "build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/unstable", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-08-18T07:49:17Z", + "updated_at": "2025-09-12T05:58:56Z", + "closed_at": "2025-09-12T05:57:58Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3183", + "html_url": "https://github.com/RDFLib/rdflib/pull/3183", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3183.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3183.patch", + "merged_at": "2025-09-12T05:57:58Z" + }, + "body": "Bumps library/python from 3.13.3-slim to 3.13.7-slim.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.3-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3182", + "id": 3328829189, + "node_id": "PR_kwDOADL-3s6kAyCB", + "number": 3182, + "title": "Fix #3181", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": "https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + 
}, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-08-18T00:52:09Z", + "updated_at": "2025-08-18T00:52:18Z", + "closed_at": "2025-08-18T00:52:17Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3182", + "html_url": "https://github.com/RDFLib/rdflib/pull/3182", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3182.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3182.patch", + "merged_at": "2025-08-18T00:52:17Z" + }, + "body": "README link fix", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3178", + "id": 3311960710, + "node_id": "PR_kwDOADL-3s6jJe79", + "number": 3178, + "title": "Creation of an RDFLib Charter", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": 
"https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-08-11T23:35:05Z", + "updated_at": "2025-08-18T00:43:02Z", + "closed_at": "2025-08-18T00:43:00Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3178", + "html_url": "https://github.com/RDFLib/rdflib/pull/3178", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3178.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3178.patch", + "merged_at": "2025-08-18T00:43:00Z" + }, + "body": "Edits of the Contributing guidelines to streamline their advice and to add a Charter that states RDFLib's community's principles.", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3177", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3177", + "id": 3309552816, + "node_id": "PR_kwDOADL-3s6jBhWX", + "number": 3177, + "title": "Feature: Add Tentris Plugin to docs", + "user": { + "login": "bigerl", + "id": 933146, + "node_id": "MDQ6VXNlcjkzMzE0Ng==", + "avatar_url": "https://avatars.githubusercontent.com/u/933146?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/bigerl", + "html_url": "https://github.com/bigerl", + "followers_url": "https://api.github.com/users/bigerl/followers", + "following_url": "https://api.github.com/users/bigerl/following{/other_user}", + "gists_url": "https://api.github.com/users/bigerl/gists{/gist_id}", + "starred_url": "https://api.github.com/users/bigerl/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/bigerl/subscriptions", + "organizations_url": "https://api.github.com/users/bigerl/orgs", + "repos_url": "https://api.github.com/users/bigerl/repos", + "events_url": "https://api.github.com/users/bigerl/events{/privacy}", + "received_events_url": "https://api.github.com/users/bigerl/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-08-11T11:09:53Z", + "updated_at": "2025-08-11T23:43:05Z", + "closed_at": "2025-08-11T23:43:05Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3177", 
+ "html_url": "https://github.com/RDFLib/rdflib/pull/3177", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3177.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3177.patch", + "merged_at": "2025-08-11T23:43:05Z" + }, + "body": "# Summary of changes\r\n\r\nAt Tentris, we developed a plugin that allows users to run their `rdflib.Graph` (1) with a native in-memory Tentris instance and (2) connect it to an Tentris SPARQL HTTP endpoint. \r\n\r\nI have added it to the list of Plugins. \r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change has a potential impact on users of this project:\r\n - [x] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n \r\n **Note: Some organization policy seems to prevent that. 
If anybody is aware how I can adjust that I am happy to change it.**\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3176", + "id": 3309030353, + "node_id": "PR_kwDOADL-3s6i_zhJ", + "number": 3176, + "title": "build(deps): bump poetry from 2.0.0 to 2.1.4 in /devtools", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 4, + "created_at": "2025-08-11T08:40:57Z", + "updated_at": "2025-09-16T06:01:05Z", + "closed_at": "2025-09-16T06:00:35Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3176", + "html_url": "https://github.com/RDFLib/rdflib/pull/3176", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3176.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3176.patch", + "merged_at": "2025-09-16T06:00:35Z" + }, + "body": "Bumps [poetry](https://github.com/python-poetry/poetry) from 2.0.0 to 2.1.4.\n
\nRelease notes\nSourced from poetry's releases.\n\n2.1.4\n\nChanged\n\n- Require virtualenv<20.33 to work around an issue where Poetry uses the wrong Python version (#10491).\n- Improve the error messages for the validation of the pyproject.toml file (#10471).\n\nFixed\n\n- Fix an issue where project plugins were installed even though poetry install was called with --no-plugins (#10405).\n- Fix an issue where dependency resolution failed for self-referential extras with duplicate dependencies (#10488).\n\nDocs\n\n- Clarify how to include files that were automatically excluded via VCS ignore settings (#10442).\n- Clarify the behavior of poetry add if no version constraint is explicitly specified (#10445).\n\n2.1.3\n\nChanged\n\n- Require importlib-metadata<8.7 for Python 3.9 because of a breaking change in importlib-metadata 8.7 (#10374).\n\nFixed\n\n- Fix an issue where re-locking failed for incomplete multiple-constraints dependencies with explicit sources (#10324).\n- Fix an issue where the --directory option did not work if a plugin, which accesses the poetry instance during its activation, was installed (#10352).\n- Fix an issue where poetry env activate -v printed additional information to stdout instead of stderr so that the output could not be used as designed (#10353).\n- Fix an issue where the original error was not printed if building a git dependency failed (#10366).\n- Fix an issue where wheels for the wrong platform were installed in rare cases (#10361).\n\npoetry-core (2.1.3)\n\n- Fix an issue where the union of specific inverse or partially inverse markers was not simplified (#858).\n- Fix an issue where optional dependencies defined in the project section were treated as non-optional when a source was defined for them in the tool.poetry section (#857).\n- Fix an issue where markers with === were not parsed correctly (#860).\n- Fix an issue where local versions with upper case letters caused an error (#859).\n- Fix an issue where extra markers with a value starting with \"in\" were not validated correctly (#862).\n\n2.1.2\n\nChanged\n\n- Improve performance of locking dependencies (#10275).\n\nFixed\n\n- Fix an issue where markers were not locked correctly (#10240).\n- Fix an issue where the result of poetry lock was not deterministic (#10276).\n- Fix an issue where poetry env activate returned the wrong command for tcsh (#10243).\n- Fix an issue where poetry env activate returned the wrong command for pwsh on Linux (#10256).\n\nDocs\n\n... (truncated)\n\nChangelog\nSourced from poetry's changelog.\n\n[2.1.4] - 2025-08-05\n\nChanged\n\n- Require virtualenv<20.33 to work around an issue where Poetry uses the wrong Python version (#10491).\n- Improve the error messages for the validation of the pyproject.toml file (#10471).\n\nFixed\n\n- Fix an issue where project plugins were installed even though poetry install was called with --no-plugins (#10405).\n- Fix an issue where dependency resolution failed for self-referential extras with duplicate dependencies (#10488).\n\nDocs\n\n- Clarify how to include files that were automatically excluded via VCS ignore settings (#10442).\n- Clarify the behavior of poetry add if no version constraint is explicitly specified (#10445).\n\n[2.1.3] - 2025-05-04\n\nChanged\n\n- Require importlib-metadata<8.7 for Python 3.9 because of a breaking change in importlib-metadata 8.7 (#10374).\n\nFixed\n\n- Fix an issue where re-locking failed for incomplete multiple-constraints dependencies with explicit sources (#10324).\n- Fix an issue where the --directory option did not work if a plugin, which accesses the poetry instance during its activation, was installed (#10352).\n- Fix an issue where poetry env activate -v printed additional information to stdout instead of stderr so that the output could not be used as designed (#10353).\n- Fix an issue where the original error was not printed if building a git dependency failed (#10366).\n- Fix an issue where wheels for the wrong platform were installed in rare cases (#10361).\n\npoetry-core (2.1.3)\n\n- Fix an issue where the union of specific inverse or partially inverse markers was not simplified (#858).\n- Fix an issue where optional dependencies defined in the project section were treated as non-optional when a source was defined for them in the tool.poetry section (#857).\n- Fix an issue where markers with === were not parsed correctly (#860).\n- Fix an issue where local versions with upper case letters caused an error (#859).\n- Fix an issue where extra markers with a value starting with \"in\" were not validated correctly (#862).\n\n[2.1.2] - 2025-03-29\n\nChanged\n\n- Improve performance of locking dependencies (#10275).\n\nFixed\n\n- Fix an issue where markers were not locked correctly (#10240).\n\n... (truncated)\n\nCommits\n\n- a8f0889 release: bump version to 2.1.4\n- 683fd83 fix: adjust virtualenv constraint in pyproject.toml to < 20.33.0 (#10491)\n- 501346e solver: fix dependency resolution for self-referential extras with duplicate ...\n- c9e8a4c fix deprecated parts in pyproject example in README (#10479)\n- 2855b2e Fix test_python_get_preferred_default for rc Python releases (#10478)\n- 9ee000a improve pyproject.toml validation error messages by replacing data with `to...\n- 6d6c2f1 docs: update unspecified version docs for add (#10445)\n- 5e58233 Documentation: Clarified negating VCS excluded files (#10442)\n- ac51717 fix: typo in dependency-specification.md (#10427)\n- c1220a7 Add missing tmp_venv mock to test_no_additional_output_in_verbose_mode (#10397)\n- Additional commits viewable in compare view\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=poetry&package-manager=pip&previous-version=2.0.0&new-version=2.1.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nYou can trigger a rebase of this PR by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
\n\n> **Note**\n> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3158", + "id": 3166176394, + "node_id": "PR_kwDOADL-3s6bj-5z", + "number": 3158, + "title": "Fix contributing guide link in README.md", + "user": { + "login": "rodrigosetti", + "id": 99732, + "node_id": "MDQ6VXNlcjk5NzMy", + "avatar_url": "https://avatars.githubusercontent.com/u/99732?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/rodrigosetti", + "html_url": "https://github.com/rodrigosetti", + "followers_url": "https://api.github.com/users/rodrigosetti/followers", + "following_url": "https://api.github.com/users/rodrigosetti/following{/other_user}", + "gists_url": "https://api.github.com/users/rodrigosetti/gists{/gist_id}", + "starred_url": "https://api.github.com/users/rodrigosetti/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/rodrigosetti/subscriptions", + "organizations_url": "https://api.github.com/users/rodrigosetti/orgs", + "repos_url": "https://api.github.com/users/rodrigosetti/repos", + "events_url": "https://api.github.com/users/rodrigosetti/events{/privacy}", + 
"received_events_url": "https://api.github.com/users/rodrigosetti/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-06-22T18:47:04Z", + "updated_at": "2025-08-18T03:28:21Z", + "closed_at": "2025-08-18T00:45:52Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3158", + "html_url": "https://github.com/RDFLib/rdflib/pull/3158", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3158.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3158.patch", + "merged_at": "2025-08-18T00:45:51Z" + }, + "body": "Previous link was broken", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3146", + "id": 3108065541, + "node_id": "PR_kwDOADL-3s6YhDye", + "number": 3146, + "title": "Replacement for #3125", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + 
"gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": "https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-06-01T23:59:34Z", + "updated_at": "2025-06-02T00:48:58Z", + "closed_at": "2025-06-02T00:48:56Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3146", + "html_url": "https://github.com/RDFLib/rdflib/pull/3146", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3146.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3146.patch", + "merged_at": "2025-06-02T00:48:56Z" + }, + "body": "This PR replaces #3125 since a bunch of conflicts from subsequent PRs needed merging into it.\r\n\r\nCloses #3125", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3146/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3145", + "id": 3106545238, + "node_id": "PR_kwDOADL-3s6Ycf5X", + "number": 3145, + "title": "replace PR 3109; improve plugins modules docs; change header colour t\u2026", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": "https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-06-01T04:12:57Z", + 
"updated_at": "2025-06-01T04:25:26Z", + "closed_at": "2025-06-01T04:25:24Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3145", + "html_url": "https://github.com/RDFLib/rdflib/pull/3145", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3145.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3145.patch", + "merged_at": "2025-06-01T04:25:24Z" + }, + "body": "Closes #3109", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3144", + "id": 3105339557, + "node_id": "PR_kwDOADL-3s6YYuTz", + "number": 3144, + "title": "Pr/3143", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": "https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": 
"https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-05-31T11:16:16Z", + "updated_at": "2025-06-01T02:23:59Z", + "closed_at": "2025-06-01T02:23:58Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3144", + "html_url": "https://github.com/RDFLib/rdflib/pull/3144", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3144.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3144.patch", + "merged_at": "2025-06-01T02:23:58Z" + }, + "body": "Replacement for #3143 with some black & mypy additions\r\n\r\nCloses #3143", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/comments", + "events_url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3142/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3142", + "id": 3090061335, + "node_id": "PR_kwDOADL-3s6XlAuc", + "number": 3142, + "title": "build(deps-dev): bump coverage from 7.7.1 to 7.8.2", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + 
"assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-05-26T05:35:59Z", + "updated_at": "2025-05-31T10:00:27Z", + "closed_at": "2025-05-31T10:00:25Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3142", + "html_url": "https://github.com/RDFLib/rdflib/pull/3142", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3142.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3142.patch", + "merged_at": "2025-05-31T10:00:25Z" + }, + "body": "Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.7.1 to 7.8.2.\n
Changelog

Sourced from coverage's changelog.

Version 7.8.2 — 2025-05-23

\n
  • Wheels are provided for Windows ARM64 on Python 3.11, 3.12, and 3.13. Thanks, Finn Womack <pull 1972_>_.

.. _issue 1971: nedbat/coveragepy#1971
.. _pull 1972: nedbat/coveragepy#1972


.. _changes_7-8-1:

Version 7.8.1 — 2025-05-21
  • A number of EncodingWarnings were fixed that could appear if you've enabled PYTHONWARNDEFAULTENCODING, fixing issue 1966. Thanks, Henry Schreiner <pull 1967_>.

  • Fixed a race condition when using sys.monitoring with free-threading Python, closing issue 1970_.

.. _issue 1966: nedbat/coveragepy#1966
.. _pull 1967: nedbat/coveragepy#1967
.. _issue 1970: nedbat/coveragepy#1970


.. _changes_7-8-0:

Version 7.8.0 — 2025-03-30
  • Added a new source_dirs setting for symmetry with the existing source_pkgs setting. It's preferable to the existing source setting, because you'll get a clear error when directories don't exist. Fixes issue 1942. Thanks, Jeremy Fleischman <pull 1943_>.

  • Fix: the PYTHONSAFEPATH environment variable new in Python 3.11 is properly supported, closing issue 1696. Thanks, Philipp A. <pull 1700_>. This works properly except for a detail when using the coverage command on Windows. There you can use python -m coverage instead if you need exact emulation.

.. _issue 1696: nedbat/coveragepy#1696
.. _pull 1700: nedbat/coveragepy#1700
.. _issue 1942: nedbat/coveragepy#1942
.. _pull 1943: nedbat/coveragepy#1943
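The source_dirs setting above is set through coverage.py's standard configuration files; a minimal sketch in pyproject.toml form (the "src" directory name is a placeholder for your project layout):

```toml
# Hypothetical project layout: measured code lives in ./src.
[tool.coverage.run]
# New in coverage 7.8.0: like `source`, but fails with a clear error
# when a listed directory does not exist.
source_dirs = ["src"]
```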


.. _changes_7-7-1:

Commits
  • 51ab2e5 build: have to keep expected dist counts in sync
  • be7bbf2 docs: sample HTML for 7.8.2
  • 3cee850 docs: prep for 7.8.2
  • 39bc6b0 docs: provide more details if the kit matrix is edited.
  • a608fb3 build: add support for Windows arm64 (#1972)
  • 2fe6225 build: run tox lint if actions have changed
  • 3d93a78 docs: docs need scriv for making github releases
  • 0c443a2 build: bump version to 7.8.2
  • ed98b87 docs: sample HTML for 7.8.1
  • b98bc9b docs: prep for 7.8.1
  • Additional commits viewable in compare view
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coverage&package-manager=pip&previous-version=7.7.1&new-version=7.8.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3141", + "id": 3089982425, + "node_id": "PR_kwDOADL-3s6Xkvpa", + "number": 3141, + "title": "fix: do not automatically generate header id in RDF patch generation and fix missing fullstop", + "user": { + "login": "recalcitrantsupplant", + "id": 10570038, + "node_id": "MDQ6VXNlcjEwNTcwMDM4", + "avatar_url": "https://avatars.githubusercontent.com/u/10570038?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/recalcitrantsupplant", + "html_url": "https://github.com/recalcitrantsupplant", + "followers_url": "https://api.github.com/users/recalcitrantsupplant/followers", + "following_url": "https://api.github.com/users/recalcitrantsupplant/following{/other_user}", + "gists_url": "https://api.github.com/users/recalcitrantsupplant/gists{/gist_id}", + "starred_url": "https://api.github.com/users/recalcitrantsupplant/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/recalcitrantsupplant/subscriptions", + "organizations_url": "https://api.github.com/users/recalcitrantsupplant/orgs", + "repos_url": "https://api.github.com/users/recalcitrantsupplant/repos", + "events_url": 
"https://api.github.com/users/recalcitrantsupplant/events{/privacy}", + "received_events_url": "https://api.github.com/users/recalcitrantsupplant/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-05-26T04:42:48Z", + "updated_at": "2025-05-31T09:48:58Z", + "closed_at": "2025-05-31T09:48:58Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3141", + "html_url": "https://github.com/RDFLib/rdflib/pull/3141", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3141.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3141.patch", + "merged_at": "2025-05-31T09:48:58Z" + }, + "body": "# Summary of changes\r\n\r\nFixes a bug in `PatchSerializer` where the `H prev` header line missed a trailing period. Also, `header_id` is now treated as optional; the `H id` line is only written if `header_id` is provided, removing the previous default UUID generation. This change is backwards compatible and primarily addresses a formatting issue and refines header generation.\r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for the same change.\r\n- [x] Checked that all tests and type checking passes. 
\r\n- If the change has a potential impact on users of this project:\r\n - [x] Added or updated tests that fail without the change.\r\n - [N/A] Updated relevant documentation to avoid inaccuracies.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3139", + "id": 3074399060, + "node_id": "PR_kwDOADL-3s6WwPrJ", + "number": 3139, + "title": "List on docs the COTTAS store backend", + "user": { + "login": "arenas-guerrero-julian", + "id": 18464038, + "node_id": "MDQ6VXNlcjE4NDY0MDM4", + "avatar_url": "https://avatars.githubusercontent.com/u/18464038?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/arenas-guerrero-julian", + "html_url": "https://github.com/arenas-guerrero-julian", + "followers_url": "https://api.github.com/users/arenas-guerrero-julian/followers", + "following_url": "https://api.github.com/users/arenas-guerrero-julian/following{/other_user}", + "gists_url": 
"https://api.github.com/users/arenas-guerrero-julian/gists{/gist_id}", + "starred_url": "https://api.github.com/users/arenas-guerrero-julian/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/arenas-guerrero-julian/subscriptions", + "organizations_url": "https://api.github.com/users/arenas-guerrero-julian/orgs", + "repos_url": "https://api.github.com/users/arenas-guerrero-julian/repos", + "events_url": "https://api.github.com/users/arenas-guerrero-julian/events{/privacy}", + "received_events_url": "https://api.github.com/users/arenas-guerrero-julian/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-05-19T16:25:00Z", + "updated_at": "2025-05-20T01:49:26Z", + "closed_at": "2025-05-20T01:49:26Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3139", + "html_url": "https://github.com/RDFLib/rdflib/pull/3139", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3139.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3139.patch", + "merged_at": "2025-05-20T01:49:26Z" + }, + "body": "# Summary of changes\r\n\r\nAdded [COTTAS](https://github.com/arenas-guerrero-julian/pycottas) store backend to the docs.\r\n\r\n# Checklist\r\n\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + 
"heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3134", + "id": 3044690733, + "node_id": "PR_kwDOADL-3s6VNJb_", + "number": 3134, + "title": "[7.x] fix namespace prefixes in longturtle serialization", + "user": { + "login": "edmondchuc", + "id": 37032744, + "node_id": "MDQ6VXNlcjM3MDMyNzQ0", + "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/edmondchuc", + "html_url": "https://github.com/edmondchuc", + "followers_url": "https://api.github.com/users/edmondchuc/followers", + "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", + "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", + "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", + "organizations_url": "https://api.github.com/users/edmondchuc/orgs", + "repos_url": "https://api.github.com/users/edmondchuc/repos", + "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", + "received_events_url": "https://api.github.com/users/edmondchuc/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + 
"created_at": "2025-05-07T05:00:00Z", + "updated_at": "2025-05-20T01:50:14Z", + "closed_at": "2025-05-20T01:50:13Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3134", + "html_url": "https://github.com/RDFLib/rdflib/pull/3134", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3134.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3134.patch", + "merged_at": "2025-05-20T01:50:13Z" + }, + "body": "7.x PR of https://github.com/RDFLib/rdflib/pull/3106.\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132", + 
"repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3132", + "id": 3039033122, + "node_id": "PR_kwDOADL-3s6U52DB", + "number": 3132, + "title": "Cope with Namespace annotations in Python 3.14", + "user": { + "login": "nphilipp", + "id": 820624, + "node_id": "MDQ6VXNlcjgyMDYyNA==", + "avatar_url": "https://avatars.githubusercontent.com/u/820624?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nphilipp", + "html_url": "https://github.com/nphilipp", + "followers_url": "https://api.github.com/users/nphilipp/followers", + "following_url": "https://api.github.com/users/nphilipp/following{/other_user}", + "gists_url": "https://api.github.com/users/nphilipp/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nphilipp/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nphilipp/subscriptions", + "organizations_url": "https://api.github.com/users/nphilipp/orgs", + "repos_url": "https://api.github.com/users/nphilipp/repos", + "events_url": "https://api.github.com/users/nphilipp/events{/privacy}", + "received_events_url": "https://api.github.com/users/nphilipp/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-05-05T08:57:17Z", + "updated_at": "2025-06-01T06:50:30Z", + "closed_at": "2025-06-01T06:50:30Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3132", + "html_url": 
"https://github.com/RDFLib/rdflib/pull/3132", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3132.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3132.patch", + "merged_at": "2025-06-01T06:50:30Z" + }, + "body": "I submitted this already in #3084 which got merged, but the change is missing from the main branch. So here we go again:\r\n\r\n-----------------------\r\n\r\nThe __annotations__ member can be incomplete, use the get_annotations() helper from annotationlib (Python >= 3.14) or inspect (Python >= 3.10) if available.\r\n\r\nRelated: #3083\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nThis fixes accessing Namespace annotations on Python 3.14, which makes `import rdflib` fail on this Python version. This should be backwards-compatible.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes. \u21d2 Other tests (sparql) fail on Python 3.14, see #3083 \r\n- If the change adds new features or changes the RDFLib public API: n/a\r\n- If the change has a potential impact on users of this project: n/a (covered by existing tests)\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127", + "repository_url": 
"https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3127", + "id": 3033489921, + "node_id": "PR_kwDOADL-3s6Uny5a", + "number": 3127, + "title": "Fix incorrect deskolemization of literals", + "user": { + "login": "ddeschepper", + "id": 1130183, + "node_id": "MDQ6VXNlcjExMzAxODM=", + "avatar_url": "https://avatars.githubusercontent.com/u/1130183?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/ddeschepper", + "html_url": "https://github.com/ddeschepper", + "followers_url": "https://api.github.com/users/ddeschepper/followers", + "following_url": "https://api.github.com/users/ddeschepper/following{/other_user}", + "gists_url": "https://api.github.com/users/ddeschepper/gists{/gist_id}", + "starred_url": "https://api.github.com/users/ddeschepper/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/ddeschepper/subscriptions", + "organizations_url": "https://api.github.com/users/ddeschepper/orgs", + "repos_url": "https://api.github.com/users/ddeschepper/repos", + "events_url": "https://api.github.com/users/ddeschepper/events{/privacy}", + "received_events_url": "https://api.github.com/users/ddeschepper/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-05-01T09:16:35Z", + "updated_at": "2025-09-18T05:02:33Z", + "closed_at": "2025-09-18T05:02:33Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3127", + 
"html_url": "https://github.com/RDFLib/rdflib/pull/3127", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3127.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3127.patch", + "merged_at": "2025-09-18T05:02:33Z" + }, + "body": "Fixes issue https://github.com/RDFLib/rdflib/issues/3126.\r\n\r\nGraph.de_skolemize() incorrectly tries to deskolemize literals, which fails in some edge cases. Limiting deskolemization of objects to `URIRef`s only fixes this behaviour.", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3121", + "id": 3023781553, + "node_id": "PR_kwDOADL-3s6UGzEk", + "number": 3121, + "title": "build(deps-dev): bump typing-extensions from 4.13.0 to 4.13.2", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-04-28T05:53:17Z", + "updated_at": "2025-05-31T09:46:31Z", + "closed_at": "2025-05-31T09:46:29Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3121", + "html_url": "https://github.com/RDFLib/rdflib/pull/3121", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3121.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3121.patch", + "merged_at": "2025-05-31T09:46:29Z" + }, + "body": "Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.13.0 to 4.13.2.\n
\nRelease notes\nSourced from typing-extensions's releases.\n\n4.13.2\n- Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu.\n- Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.\n\n4.13.1\nThis is a bugfix release fixing two edge cases that appear on old bugfix releases of CPython.\n\nBugfixes:\n- Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.\n- Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.\n\nChangelog\nSourced from typing-extensions's changelog.\n\nRelease 4.13.2 (April 10, 2025)\n- Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu.\n- Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.\n\nRelease 4.13.1 (April 3, 2025)\nBugfixes:\n- Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.\n- Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.\n\nCommits\n- 4525e9d Prepare release 4.13.2 (#583)\n- 88a0c20 Do not shadow user arguments in generated __new__ by @deprecated (#581)\n- 281d7b0 Add 3rd party tests for litestar (#578)\n- 8092c39 fix TypeAliasType union with typing.TypeAliasType (#575)\n- 45a8847 Prepare release 4.13.1 (#573)\n- f264e58 Move CI to \"ubuntu-latest\" (round 2) (#570)\n- 5ce0e69 Fix TypeError with evaluate_forward_ref on some 3.10 and 3.9 versions (#558)\n- 304f5cb Add SQLAlchemy to third-party daily tests (#561)\n- ebe2b94 Fix duplicated keywords for typing._ConcatenateGenericAlias in 3.10.2 (#557)\n- 9f93d6f Add intersphinx links for 3.13 typing features (#550)\n- See full diff in compare view\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typing-extensions&package-manager=pip&previous-version=4.13.0&new-version=4.13.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3118", + "id": 3007735442, + "node_id": "PR_kwDOADL-3s6TQzkU", + "number": 3118, + "title": "build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/unstable", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-04-21T06:55:35Z", + "updated_at": "2025-05-31T09:43:35Z", + "closed_at": "2025-05-31T09:43:28Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3118", + "html_url": "https://github.com/RDFLib/rdflib/pull/3118", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3118.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3118.patch", + "merged_at": "2025-05-31T09:43:28Z" + }, + "body": "Bumps library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4.\n\n\n[![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1&new-version=sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nYou can trigger a rebase of this PR by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
\n\n> **Note**\n> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3117", + "id": 3007648012, + "node_id": "PR_kwDOADL-3s6TQgTc", + "number": 3117, + "title": "build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/latest", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4545133062, + "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", + "name": "docker", + "color": "21ceff", + "default": false, + "description": "Pull requests that update Docker code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-04-21T06:06:30Z", + "updated_at": "2025-05-31T09:42:54Z", + "closed_at": "2025-05-31T09:42:52Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3117", + "html_url": "https://github.com/RDFLib/rdflib/pull/3117", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3117.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3117.patch", + "merged_at": "2025-05-31T09:42:52Z" + }, + "body": "Bumps library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4.\n\n\n[![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1&new-version=sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3115", + "id": 2999054664, + "node_id": "PR_kwDOADL-3s6Szxae", + "number": 3115, + "title": "fix: remove Literal.toPython date conversion for gYear/gYearMonth", + "user": { + "login": "lu-pl", + "id": 128675670, + "node_id": "U_kgDOB6tvVg", + "avatar_url": "https://avatars.githubusercontent.com/u/128675670?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/lu-pl", + "html_url": "https://github.com/lu-pl", + "followers_url": "https://api.github.com/users/lu-pl/followers", + "following_url": "https://api.github.com/users/lu-pl/following{/other_user}", + "gists_url": "https://api.github.com/users/lu-pl/gists{/gist_id}", + "starred_url": "https://api.github.com/users/lu-pl/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/lu-pl/subscriptions", + "organizations_url": "https://api.github.com/users/lu-pl/orgs", + "repos_url": "https://api.github.com/users/lu-pl/repos", + "events_url": "https://api.github.com/users/lu-pl/events{/privacy}", + "received_events_url": "https://api.github.com/users/lu-pl/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], 
+ "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 1, + "created_at": "2025-04-16T09:34:25Z", + "updated_at": "2025-05-31T09:51:02Z", + "closed_at": "2025-05-31T09:51:02Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3115", + "html_url": "https://github.com/RDFLib/rdflib/pull/3115", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3115.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3115.patch", + "merged_at": "2025-05-31T09:51:02Z" + }, + "body": "\r\n\r\n# Summary of changes\r\n\r\nIssue #3078 reports, that `rdflib.Literal.toPython`-casting of `xsd:gYear` and `xsd:gYearMonth` to datetime objects should not be possible, as there is no appropriate Python equivalence for those types. \r\n\r\nThe current implementation casts `xsd:gYear` and `xsd:gYearMonth` to datetime objects assuming January 1st for `xsd:gYear` and the 1st day of the given month for `xsd:gYearMonth`. This is plain wrong.\r\n\r\nThe change removes datetime casting for `xsd:gYear` and `xsd:gYearMonth` for `rdflib.Literal.toPython` and adapts the `rdflib.Literal` tests accordingly.\r\n\r\nNote that validation of `xsd:gYear` and `xsd:gYearMonth` is lost as a result, but could be easily implemented using regex checks. As I understand it, XSD types without an entry in the `rdflib.term.XSDToPython` mapping are never typed-checked though; at least for `xsd:gYear` and `xsd:gYearMonth` the xsd-type checks ran as part of `rdflib.xsd_datetime.parse_xsd_gyear` and `rdflib.xsd_datetime.parse_xsd_gyearmonth`. 
\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [x] Created an issue to discuss the change and get in-principle agreement.\r\n - [] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [x] Added or updated tests that fail without the change.\r\n - [x] Updated relevant documentation to avoid inaccuracies.\r\n - [x] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\nNote: I looked through the docs and couldn't find a place where `xsd:gYear` or `xsd:gYearMonth` casting was mentioned (apart from the generated references).", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3106", + "id": 2961384878, + "node_id": 
"PR_kwDOADL-3s6Q0v6d", + "number": 3106, + "title": "fix namespace prefixes in longturtle serialization", + "user": { + "login": "ddeschepper", + "id": 1130183, + "node_id": "MDQ6VXNlcjExMzAxODM=", + "avatar_url": "https://avatars.githubusercontent.com/u/1130183?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/ddeschepper", + "html_url": "https://github.com/ddeschepper", + "followers_url": "https://api.github.com/users/ddeschepper/followers", + "following_url": "https://api.github.com/users/ddeschepper/following{/other_user}", + "gists_url": "https://api.github.com/users/ddeschepper/gists{/gist_id}", + "starred_url": "https://api.github.com/users/ddeschepper/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/ddeschepper/subscriptions", + "organizations_url": "https://api.github.com/users/ddeschepper/orgs", + "repos_url": "https://api.github.com/users/ddeschepper/repos", + "events_url": "https://api.github.com/users/ddeschepper/events{/privacy}", + "received_events_url": "https://api.github.com/users/ddeschepper/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 3, + "created_at": "2025-03-31T19:28:56Z", + "updated_at": "2025-05-31T09:47:31Z", + "closed_at": "2025-05-31T09:47:31Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3106", + "html_url": "https://github.com/RDFLib/rdflib/pull/3106", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3106.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3106.patch", + "merged_at": "2025-05-31T09:47:31Z" + }, + "body": "\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\nSolves https://github.com/RDFLib/rdflib/issues/3105 by storing the namespace manager of the graph that 
is to be serialized temporarily and reapplying it to the graph that is actually serialized by the implementation of the longturtle serializer.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [x] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3101", + "id": 2957748383, + "node_id": "PR_kwDOADL-3s6QpJbk", + "number": 3101, + "title": "build(deps): bump rdflib from 7.1.2 to 7.1.4 in /docker/latest", + "user": { + "login": "dependabot[bot]", + "id": 49699333, + "node_id": "MDM6Qm90NDk2OTkzMzM=", + "avatar_url": 
"https://avatars.githubusercontent.com/in/29110?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/dependabot%5Bbot%5D", + "html_url": "https://github.com/apps/dependabot", + "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", + "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", + "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", + "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", + "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", + "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", + "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", + "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", + "type": "Bot", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 1999840232, + "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", + "name": "dependencies", + "color": "0366d6", + "default": false, + "description": "Pull requests that update a dependency file" + }, + { + "id": 4181259078, + "node_id": "LA_kwDOADL-3s75OPNG", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", + "name": "python", + "color": "2b67c6", + "default": false, + "description": "Pull requests that update Python code" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-03-29T04:48:39Z", + "updated_at": "2025-05-27T03:04:02Z", + "closed_at": "2025-05-27T03:04:01Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": 
"https://api.github.com/repos/RDFLib/rdflib/pulls/3101", + "html_url": "https://github.com/RDFLib/rdflib/pull/3101", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3101.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3101.patch", + "merged_at": "2025-05-27T03:04:01Z" + }, + "body": "Bumps [rdflib](https://github.com/RDFLib/rdflib) from 7.1.2 to 7.1.4.\n
\nRelease notes\n

Sourced from rdflib's releases.

\n
\n

2025-03-29 RELEASE 7.1.4

\n

A tidy-up release with no major updates over 7.1.3. This may be the last 7.x release as we move to a version 8 with breaking changes to Dataset and a few APIs.

\n

Interesting PRs merged:

\n\n

... and lots of boring dependency bump PRs merged!

\n

2025-01-18 RELEASE 7.1.3

\n

A fix-up release that re-adds support for Python 3.8 after it was accidentally\nremoved in Release 7.1.2.

\n

This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out\ntyping changes that are not compatible\nwith Python 3.8.

\n

Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0.

\n

Included are PRs such as Defined Namespace warnings fix, sort longturtle\nblank nodes, deterministic longturtle serialisation and Dataset documentation\nimprovements.

\n
\n
\n
\nChangelog\n

Sourced from rdflib's changelog.

\n
\n

2025-03-29 RELEASE 7.1.4

\n

A tidy-up release with no major updates over 7.1.3. This may be the last 7.x\nrelease as we move to a version 8 with breaking changes to Dataset and a few\nAPIs.

\n

Interesting PRs merged:

\n\n

... and lots of boring dependency bump PRs merged!

\n

2025-01-17 RELEASE 7.1.3

\n

A fix-up release that re-adds support for Python 3.8 after it was accidentally\nremoved in Release 7.1.2.

\n

This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out\ntyping changes that are not compatible\nwith Python 3.8.

\n

Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0.

\n

Included are PRs such as Defined Namespace warnings fix, sort longturtle\nblank nodes, deterministic longturtle serialisation and Dataset documentation\nimprovements.

\n

For the full list of included PRs, see the preparatory PR:\nRDFLib/rdflib#3036.

\n
\n
\n
\nCommits\n\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=rdflib&package-manager=pip&previous-version=7.1.2&new-version=7.1.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3098", + "id": 2957658627, + "node_id": "PR_kwDOADL-3s6Qo2lb", + "number": 3098, + "title": "7.1.4 pre-release", + "user": { + "login": "nicholascar", + "id": 7321872, + "node_id": "MDQ6VXNlcjczMjE4NzI=", + "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/nicholascar", + "html_url": "https://github.com/nicholascar", + "followers_url": "https://api.github.com/users/nicholascar/followers", + "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", + "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", + "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", + "organizations_url": "https://api.github.com/users/nicholascar/orgs", + "repos_url": "https://api.github.com/users/nicholascar/repos", + "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", + "received_events_url": "https://api.github.com/users/nicholascar/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": 
false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 0, + "created_at": "2025-03-29T02:09:03Z", + "updated_at": "2025-03-29T02:19:06Z", + "closed_at": "2025-03-29T02:19:05Z", + "author_association": "MEMBER", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3098", + "html_url": "https://github.com/RDFLib/rdflib/pull/3098", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3098.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3098.patch", + "merged_at": "2025-03-29T02:19:05Z" + }, + "body": "A tidy-up release with no major updates over 7.1.3. This may be the last 7.x release as we move to a version 8 with breaking changes to Dataset and a few APIs.\r\n\r\nFixed some small pre-commit issues too", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3075", + "id": 2869027589, + "node_id": "PR_kwDOADL-3s6MDLLC", + "number": 3075, + "title": "Specify `Optional` parameters in `Graph.triples_choices`", + "user": { + "login": "slahn", + "id": 3298124, + "node_id": "MDQ6VXNlcjMyOTgxMjQ=", + "avatar_url": 
"https://avatars.githubusercontent.com/u/3298124?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/slahn", + "html_url": "https://github.com/slahn", + "followers_url": "https://api.github.com/users/slahn/followers", + "following_url": "https://api.github.com/users/slahn/following{/other_user}", + "gists_url": "https://api.github.com/users/slahn/gists{/gist_id}", + "starred_url": "https://api.github.com/users/slahn/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/slahn/subscriptions", + "organizations_url": "https://api.github.com/users/slahn/orgs", + "repos_url": "https://api.github.com/users/slahn/repos", + "events_url": "https://api.github.com/users/slahn/events{/privacy}", + "received_events_url": "https://api.github.com/users/slahn/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [ + { + "id": 7242799529, + "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", + "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", + "name": "7.1", + "color": "FC7848", + "default": false, + "description": "Issues planned to fix in v7.1" + } + ], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 2, + "created_at": "2025-02-21T13:26:01Z", + "updated_at": "2025-09-03T04:59:58Z", + "closed_at": "2025-09-03T04:59:58Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3075", + "html_url": "https://github.com/RDFLib/rdflib/pull/3075", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3075.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3075.patch", + "merged_at": "2025-09-03T04:59:58Z" + }, + "body": "\r\n\r\n# Summary of changes\r\n\r\nChange the typing of `Graph.triples_choices` and `Store.triples_choises`\r\nto match the actual types allowed by the code.\r\n\r\nThe two non-list 
parameters can be `None`, but this is not reflected in\r\nthe type hint today.\r\n\r\nIntroduce a type alias to simplify method signatures, and update all\r\noverloads of `triples_choises` to use this alias.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [\u2713] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [\u2713] Checked that all tests and type checking passes.\r\n - Did not run webtests (`pytest -m \"not webtest\"`), since I could not get them working at all.\r\n `7266 passed, 61 skipped, 333 deselected, 330 xfailed, 36 xpassed, 6925 warnings`\r\n- [\u2713] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + }, + { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020", + "repository_url": "https://api.github.com/repos/RDFLib/rdflib", + "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/labels{/name}", + "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/comments", + "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/events", + "html_url": "https://github.com/RDFLib/rdflib/pull/3020", + "id": 2769455166, + "node_id": "PR_kwDOADL-3s6GwndC", + "number": 3020, + "title": "notation3.py: don't normalize float representation", + "user": { + "login": "tgbugs", + "id": 4299776, + "node_id": "MDQ6VXNlcjQyOTk3NzY=", + "avatar_url": 
"https://avatars.githubusercontent.com/u/4299776?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/tgbugs", + "html_url": "https://github.com/tgbugs", + "followers_url": "https://api.github.com/users/tgbugs/followers", + "following_url": "https://api.github.com/users/tgbugs/following{/other_user}", + "gists_url": "https://api.github.com/users/tgbugs/gists{/gist_id}", + "starred_url": "https://api.github.com/users/tgbugs/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/tgbugs/subscriptions", + "organizations_url": "https://api.github.com/users/tgbugs/orgs", + "repos_url": "https://api.github.com/users/tgbugs/repos", + "events_url": "https://api.github.com/users/tgbugs/events{/privacy}", + "received_events_url": "https://api.github.com/users/tgbugs/received_events", + "type": "User", + "user_view_type": "public", + "site_admin": false + }, + "labels": [], + "state": "closed", + "locked": false, + "assignee": null, + "assignees": [], + "milestone": null, + "comments": 5, + "created_at": "2025-01-05T21:21:48Z", + "updated_at": "2025-09-18T04:19:43Z", + "closed_at": "2025-09-18T04:19:43Z", + "author_association": "CONTRIBUTOR", + "type": null, + "active_lock_reason": null, + "draft": false, + "pull_request": { + "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3020", + "html_url": "https://github.com/RDFLib/rdflib/pull/3020", + "diff_url": "https://github.com/RDFLib/rdflib/pull/3020.diff", + "patch_url": "https://github.com/RDFLib/rdflib/pull/3020.patch", + "merged_at": "2025-09-18T04:19:43Z" + }, + "body": "fix behavior of the n3 parser family to avoid normalizing raw float string representation which makes it impossible to roundtrip the exact original string representation of e.g. 
`1e10`", + "reactions": { + "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/reactions", + "total_count": 0, + "+1": 0, + "-1": 0, + "laugh": 0, + "hooray": 0, + "confused": 0, + "heart": 0, + "rocket": 0, + "eyes": 0 + }, + "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/timeline", + "performed_via_github_app": null, + "state_reason": null, + "score": 1.0 + } +] \ No newline at end of file diff --git a/pyproject.toml b/pyproject.toml index ac074a7878..4cd5e9a108 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.1.4" +version = "7.2.0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] diff --git a/rdflib/__init__.py b/rdflib/__init__.py index 051c5e3ca4..953e2309b8 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -52,7 +52,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2025-03-29" +__date__ = "2025-09-18" __all__ = [ "URIRef", From 856c1af6b59f919d0d1230cb718c72f1415caf23 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 19 Sep 2025 12:23:52 +1000 Subject: [PATCH 29/60] chore: prep 7.2.1 release (#3225) * chore: prep 7.2.1 release * docs: add new step in releasing section * build: update poetry lockfile * docs: fix title underline too short --- CHANGELOG.md | 9 ++++++++- CITATION.cff | 4 ++-- README.md | 1 + docs/developers.rst | 9 ++++++++- poetry.lock | 4 ++-- pyproject.toml | 4 ++-- rdflib/__init__.py | 2 +- 7 files changed, 24 insertions(+), 9 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index f8812660a9..047828f0e7 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,4 +1,11 @@ -## 2025-09-18 RELEASE 7.2.0 +## 2025-09-19 RELEASE 7.2.1 + +A tiny clean up release. 
+ +Fixes: +- Previous RDFLib releases required all downstream projects to specify a Python version of <4.0.0. This release relaxes this requirement to allow Python 3.8.1 and later. + +## 2025-09-19 RELEASE 7.2.0 This release contains a number of fixes and improvements to RDFLib. diff --git a/CITATION.cff b/CITATION.cff index 8b36270572..ef4772952a 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -69,7 +69,7 @@ authors: - family-names: "Stuart" given-names: "Veyndan" title: "RDFLib" -version: 7.2.0 -date-released: 2025-09-18 +version: 7.2.1 +date-released: 2025-09-19 url: "https://github.com/RDFLib/rdflib" doi: 10.5281/zenodo.6845245 diff --git a/README.md b/README.md index 8ac6f8d3cd..8ddb4eb72f 100644 --- a/README.md +++ b/README.md @@ -43,6 +43,7 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases * `main` branch in this repository is the current unstable release - version 8 alpha +* `7.2.1` tiny clean-up release, relaxes Python version requirement * `7.2.0` general fixes and usability improvements, see changelog for details * `7.1.4` tidy-up release, possibly last 7.x release * `7.1.3` current stable release, small improvements to 7.1.1 diff --git a/docs/developers.rst b/docs/developers.rst index 909c5bab66..80cd01bbc7 100644 --- a/docs/developers.rst +++ b/docs/developers.rst @@ -551,9 +551,16 @@ Once this is all done, create another post-release pull request with the followi #. In ``docker/latest/requirements.in`` set the version to the just released version #. Use ``task docker:prepare`` to update ``docker/latest/requirements.txt`` +5. Port changes to the next major working branch +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +If maintaining multiple long-lived version branches, ensure changes from this release are ported to the next major working branch. -For instance, releasing a ``7.x`` would require merging in changes to ``main`` for version ``8.x``.
+ +This ensures general fixes and enhancements are ported over and maintained in the next major working branch. + +6. Let the world know ~~~~~~~~~~~~~~~~~~~~~ Announce the release at the following locations: diff --git a/poetry.lock b/poetry.lock index 0e30d858f3..8622bb56c4 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1479,5 +1479,5 @@ orjson = ["orjson"] [metadata] lock-version = "2.0" -python-versions = "^3.8.1" -content-hash = "01e3ca79b8228cd40cbbc60cbb6690bbe0f6207ca070eee0a44259778c797172" +python-versions = ">=3.8.1" +content-hash = "0ec27e1dca1f3b60dce28a2109a95eaada0bce2cbfbaee4d209d434a0d6e4086" diff --git a/pyproject.toml b/pyproject.toml index 4cd5e9a108..5cf476b105 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.2.0" +version = "7.2.1" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] @@ -38,7 +38,7 @@ rdfs2dot = 'rdflib.tools.rdfs2dot:main' rdfgraphisomorphism = 'rdflib.tools.graphisomorphism:main' [tool.poetry.dependencies] -python = "^3.8.1" +python = ">=3.8.1" isodate = {version=">=0.7.2,<1.0.0", python = "<3.11"} pyparsing = ">=2.1.0,<4" berkeleydb = {version = "^18.1.0", optional = true} diff --git a/rdflib/__init__.py b/rdflib/__init__.py index 953e2309b8..af6dfeefe8 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -52,7 +52,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2025-09-18" +__date__ = "2025-09-19" __all__ = [ "URIRef", From 28261ff682426f3cd3005fe17c06f0e642d4da64 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 19 Sep 2025 13:23:00 +1000 Subject: [PATCH 30/60] chore: 7.2.1 post release (#3226) * chore: bump rdflib version in docker image requirements.txt * chore: generate docker requirements.txt using pip-compile task Adding 
--no-strip-extras to pip-compile command to preserve existing behaviour in pip-tools v8 * chore: update pyproject.toml version with alpha suffix --- Taskfile.yml | 2 +- docker/latest/requirements.in | 2 +- docker/latest/requirements.txt | 12 +++++------- pyproject.toml | 2 +- 4 files changed, 8 insertions(+), 10 deletions(-) diff --git a/Taskfile.yml b/Taskfile.yml index 2b9582f5fa..1ae5947ec0 100644 --- a/Taskfile.yml +++ b/Taskfile.yml @@ -272,7 +272,7 @@ tasks: pip-compile: cmds: - - cmd: "{{.PIP_COMPILE}} --quiet --annotate --emit-options --resolver=backtracking {{.CLI_ARGS}}" + - cmd: "{{.PIP_COMPILE}} --quiet --annotate --emit-options --resolver=backtracking --no-strip-extras {{.CLI_ARGS}}" docker:prepare: cmds: diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index f710c09d57..acfcb2e191 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,4 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. It # will be updated by dependabot when new releases are made. 
-rdflib==7.1.3 +rdflib==7.2.1 html5rdf==1.2.1 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index 2a04883448..411657d7b6 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -1,14 +1,12 @@ # -# This file is autogenerated by pip-compile with Python 3.8 +# This file is autogenerated by pip-compile with Python 3.13 # by the following command: # -# pip-compile docker/latest/requirements.in +# pip-compile --cert=None --client-cert=None --index-url=None --pip-args=None docker/latest/requirements.in # html5rdf==1.2.1 - # via -r requirements.in -isodate==0.7.2 - # via rdflib + # via -r docker/latest/requirements.in pyparsing==3.0.9 # via rdflib -rdflib==7.1.3 - # via -r requirements.in +rdflib==7.2.1 + # via -r docker/latest/requirements.in diff --git a/pyproject.toml b/pyproject.toml index 5cf476b105..ff1665908b 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.2.1" +version = "7.3.0-a0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] From f358e8945eecf8b1250b36901e788f1a2d9761d6 Mon Sep 17 00:00:00 2001 From: Natanael Arndt Date: Fri, 26 Sep 2025 00:58:16 +0000 Subject: [PATCH 31/60] Run the example queries against the local fuseki (#3240) 0. skip the test if the sparql endpoints host is not available 1. prepare the local fuseki store with some data 2.
execute the example queries --- examples/sparqlstore_example.py | 73 +++++++++++++++++++++++++++------ test/test_examples.py | 10 ++--- 2 files changed, 63 insertions(+), 20 deletions(-) diff --git a/examples/sparqlstore_example.py b/examples/sparqlstore_example.py index 12fe43301a..bd1ca7bb1f 100644 --- a/examples/sparqlstore_example.py +++ b/examples/sparqlstore_example.py @@ -2,16 +2,53 @@ Simple examples showing how to use the SPARQLStore """ +import sys +from urllib.request import urlopen + from rdflib import Graph, Namespace, URIRef -from rdflib.plugins.stores.sparqlstore import SPARQLStore -from rdflib.term import Identifier +from rdflib.namespace import RDF, SKOS +from rdflib.plugins.stores.sparqlstore import SPARQLStore, SPARQLUpdateStore +from rdflib.term import Identifier, Literal + +# Shows examples of the usage of SPARQLStore and SPARQLUpdateStore against a local SPARQL 1.1 endpoint if +# available. This assumes SPARQL 1.1 query/update endpoints running locally at +# http://localhost:3030/db/ +#
+# +# For the tests here to run, you can for example start fuseki with: +# ./fuseki-server --mem --update /db + +# THIS WILL ADD DATA TO THE /db dataset + + +HOST = "http://localhost:3030" if __name__ == "__main__": + try: + assert len(urlopen(HOST).read()) > 0 + except Exception: + print(f"{HOST} is unavailable.") + sys.exit(126) + dbo = Namespace("http://dbpedia.org/ontology/") + dbr = Namespace("http://dbpedia.org/resource/") + + # EXAMPLE Update Store: + update_store = SPARQLUpdateStore( + query_endpoint="http://localhost:3030/db/sparql", + update_endpoint="http://localhost:3030/db/update", + ) + graph = Graph(store=update_store, identifier="http://dbpedia.org") + graph.add((dbr.Berlin, dbo.populationTotal, Literal(3))) + graph.add((dbr.Brisbane, dbo.populationTotal, Literal(2))) + graph.add((dbr["Category:Capitals_in_Europe"], RDF.type, SKOS.Concept)) + graph.add((dbr["Category:Holy_Grail"], RDF.type, SKOS.Concept)) + graph.add((dbr["Category:Hospital_ships_of_Japan"], RDF.type, SKOS.Concept)) - # EXAMPLE 1: using a Graph with the Store type string set to "SPARQLStore" + # EXAMPLE Store 1: using a Graph with the Store type string set to "SPARQLStore" graph = Graph("SPARQLStore", identifier="http://dbpedia.org") - graph.open("http://dbpedia.org/sparql") + graph.open("http://localhost:3030/db/sparql") pop = graph.value(URIRef("http://dbpedia.org/resource/Berlin"), dbo.populationTotal) assert isinstance(pop, Identifier) @@ -23,25 +60,35 @@ ) print() - # EXAMPLE 2: using a SPARQLStore object directly - st = SPARQLStore(query_endpoint="http://dbpedia.org/sparql") + # EXAMPLE Query 2: using a SPARQLStore object directly + st = SPARQLStore(query_endpoint="http://localhost:3030/db/sparql") for p in st.objects( URIRef("http://dbpedia.org/resource/Brisbane"), dbo.populationTotal ): assert isinstance(p, Identifier) - print( - "According to DBPedia, Brisbane has a population of " "{0}".format(int(p)) - ) + print("According to DBPedia, Brisbane has a population of 
{0}".format(int(p))) print() - # EXAMPLE 3: doing RDFlib triple navigation using SPARQLStore as a Graph() + # EXAMPLE Query 3: doing RDFlib triple navigation using SPARQLStore as a Graph() print("Triple navigation using SPARQLStore as a Graph():") graph = Graph("SPARQLStore", identifier="http://dbpedia.org") - graph.open("http://dbpedia.org/sparql") + graph.open("http://localhost:3030/db/sparql") + # we are asking DBPedia for 3 skos:Concept instances + count = 0 + + for s in graph.subjects(predicate=RDF.type, object=SKOS.Concept): + count += 1 + print(f"\t- {s}") + if count >= 3: + break + + # EXAMPLE Query 4: doing RDFlib triple navigation using a Graph() with a SPARQLStore backend + print("Triple navigation using a Graph() with a SPARQLStore backend:") + st = SPARQLStore(query_endpoint="http://localhost:3030/db/sparql") + graph = Graph(store=st) # we are asking DBPedia for 3 skos:Concept instances count = 0 - from rdflib.namespace import RDF, SKOS for s in graph.subjects(predicate=RDF.type, object=SKOS.Concept): count += 1 @@ -49,7 +96,7 @@ if count >= 3: break - # EXAMPLE 4: using a SPARQL endpoint that requires Basic HTTP authentication + # EXAMPLE Store 5: using a SPARQL endpoint that requires Basic HTTP authentication # NOTE: this example won't run since the endpoint isn't live (or real) sparql_store = SPARQLStore( query_endpoint="http://fake-sparql-endpoint.com/repository/x", diff --git a/test/test_examples.py b/test/test_examples.py index 39b967c5c6..0454d36ad6 100644 --- a/test/test_examples.py +++ b/test/test_examples.py @@ -39,11 +39,7 @@ def test_example(example_file: Path) -> None: try: result.check_returncode() - except subprocess.CalledProcessError: - if ( - example_file.stem == "sparqlstore_example" - and "http.client.RemoteDisconnected: Remote end closed connection without response" - in result.stderr.decode("utf-8") - ): - pytest.skip("this test uses dbpedia which is down sometimes") + except subprocess.CalledProcessError as process_error: + if 
process_error.returncode == 126: + pytest.skip("This test returned 126, indicating it should be skipped.") raise From 062b7a917b772213549419ed42a189ea5b2030e4 Mon Sep 17 00:00:00 2001 From: Natanael Arndt Date: Mon, 29 Sep 2025 03:49:49 +0000 Subject: [PATCH 32/60] Adjust the type hint for Graph open to reflect a SPARQLUpdateStore configuration (#3239) * Adjust the type hint for Graph open to reflect a SPARQLUpdateStore configuration Also adjust the type hint for ReadOnlyGraphAggregate and for further stores * fix doc string --------- Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> --- rdflib/graph.py | 6 ++++-- rdflib/plugins/stores/auditable.py | 6 ++++-- rdflib/plugins/stores/berkeleydb.py | 22 +++++++++++++++++++--- rdflib/plugins/stores/sparqlstore.py | 4 ++-- rdflib/store.py | 7 ++++--- 5 files changed, 33 insertions(+), 12 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index 3a84dcf246..b3d16c4819 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -562,7 +562,9 @@ def rollback(self: _GraphT) -> _GraphT: self.__store.rollback() return self - def open(self, configuration: str, create: bool = False) -> Optional[int]: + def open( + self, configuration: Union[str, tuple[str, str]], create: bool = False + ) -> Optional[int]: """Open the graph store Might be necessary for stores that require opening a connection to a @@ -2824,7 +2826,7 @@ def commit(self) -> NoReturn: def rollback(self) -> NoReturn: raise ModificationException() - def open(self, configuration: str, create: bool = False) -> None: + def open(self, configuration: str | tuple[str, str], create: bool = False) -> None: # TODO: is there a use case for this method?
for graph in self.graphs: # type error: Too many arguments for "open" of "Graph" diff --git a/rdflib/plugins/stores/auditable.py b/rdflib/plugins/stores/auditable.py index b8fb534195..ca2c7d79b0 100644 --- a/rdflib/plugins/stores/auditable.py +++ b/rdflib/plugins/stores/auditable.py @@ -18,7 +18,7 @@ from __future__ import annotations import threading -from typing import TYPE_CHECKING, Any, Generator, Iterator, List, Optional, Tuple +from typing import TYPE_CHECKING, Any, Generator, Iterator, List, Optional, Tuple, Union from rdflib.graph import ConjunctiveGraph, Graph from rdflib.store import Store @@ -62,7 +62,9 @@ def __init__(self, store: Store): ] = [] self.rollbackLock = threading.RLock() - def open(self, configuration: str, create: bool = True) -> Optional[int]: + def open( + self, configuration: Union[str, tuple[str, str]], create: bool = True + ) -> Optional[int]: return self.store.open(configuration, create) def close(self, commit_pending_transaction: bool = False) -> None: diff --git a/rdflib/plugins/stores/berkeleydb.py b/rdflib/plugins/stores/berkeleydb.py index 12009787cd..11195432f3 100644 --- a/rdflib/plugins/stores/berkeleydb.py +++ b/rdflib/plugins/stores/berkeleydb.py @@ -4,7 +4,17 @@ from os import mkdir from os.path import abspath, exists from threading import Thread -from typing import TYPE_CHECKING, Any, Callable, Dict, Generator, List, Optional, Tuple +from typing import ( + TYPE_CHECKING, + Any, + Callable, + Dict, + Generator, + List, + Optional, + Tuple, + Union, +) from urllib.request import pathname2url from rdflib.store import NO_STORE, VALID_STORE, Store @@ -127,10 +137,16 @@ def _init_db_environment( def is_open(self) -> bool: return self.__open - def open(self, path: str, create: bool = True) -> Optional[int]: + def open( + self, configuration: Union[str, tuple[str, str]], create: bool = True + ) -> Optional[int]: if not has_bsddb: return NO_STORE - homeDir = path # noqa: N806 + + if type(configuration) is str: + homeDir = 
configuration # noqa: N806 + else: + raise Exception("Invalid configuration provided") if self.__identifier is None: self.__identifier = URIRef(pathname2url(abspath(homeDir))) diff --git a/rdflib/plugins/stores/sparqlstore.py b/rdflib/plugins/stores/sparqlstore.py index e7a9723e8c..d63986b143 100644 --- a/rdflib/plugins/stores/sparqlstore.py +++ b/rdflib/plugins/stores/sparqlstore.py @@ -148,11 +148,11 @@ def __init__( self._queries = 0 # type error: Missing return statement - def open(self, configuration: str, create: bool = False) -> Optional[int]: # type: ignore[return] + def open(self, configuration: Union[str, tuple[str, str]], create: bool = False) -> Optional[int]: # type: ignore[return] """This method is included so that calls to this Store via Graph, e.g. Graph("SPARQLStore"), can set the required parameters """ - if type(configuration) == str: # noqa: E721 + if type(configuration) is str: self.query_endpoint = configuration else: raise Exception( diff --git a/rdflib/store.py b/rdflib/store.py index 9cada631d7..86dabf1854 100644 --- a/rdflib/store.py +++ b/rdflib/store.py @@ -205,9 +205,10 @@ def node_pickler(self) -> NodePickler: def create(self, configuration: str) -> None: self.dispatcher.dispatch(StoreCreatedEvent(configuration=configuration)) - def open(self, configuration: str, create: bool = False) -> Optional[int]: - """ - Opens the store specified by the configuration string. If + def open( + self, configuration: Union[str, tuple[str, str]], create: bool = False + ) -> Optional[int]: + """Opens the store specified by the configuration string or tuple. If create is True a store will be created if it does not already exist. If create is False and a store does not already exist an exception is raised. 
An exception is also raised if a store From 1964642d472c4b44de6aba1a6fe3be6b9d7a8153 Mon Sep 17 00:00:00 2001 From: Natanael Arndt Date: Tue, 30 Sep 2025 02:03:04 +0000 Subject: [PATCH 33/60] SPARQL result parsing (#2796) * Remove plugin registration for "application/sparql-results+xml; charset=UTF-8" Remove ResultParser plugin registration with the name "application/sparql-results+xml; charset=UTF-8" since it is never called in internal code. The only place where ResultParser plugins are requested is in "rdflib/query.py:276" ``` if format: plugin_key = format elif content_type: plugin_key = content_type.split(";", 1)[0] else: plugin_key = "xml" parser = plugin.get(plugin_key, ResultParser)() ``` In case of a content-type the charset is split off already. * Resolve TODO: Pull in these from the result implementation plugins for _response_mime_types in sparqlconnector * Register all Parser plugins as SPARQL ResultParser plugins * Avoid overwriting ResultParser plugins --------- Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> --- rdflib/plugin.py | 23 +++++---- rdflib/plugins/stores/sparqlconnector.py | 42 ++++++++++------ rdflib/plugins/stores/sparqlstore.py | 2 +- rdflib/util.py | 61 ++++++++++++++++-------- 4 files changed, 81 insertions(+), 47 deletions(-) diff --git a/rdflib/plugin.py b/rdflib/plugin.py index 23699e68d4..556b788040 100644 --- a/rdflib/plugin.py +++ b/rdflib/plugin.py @@ -579,18 +579,6 @@ def plugins( "rdflib.plugins.sparql.results.xmlresults", "XMLResultParser", ) -register( - "application/sparql-results+xml; charset=UTF-8", - ResultParser, - "rdflib.plugins.sparql.results.xmlresults", - "XMLResultParser", -) -register( - "application/rdf+xml", - ResultParser, - "rdflib.plugins.sparql.results.graph", - "GraphResultParser", -) register( "json", ResultParser, @@ -627,3 +615,14 @@ def plugins( "rdflib.plugins.sparql.results.tsvresults", "TSVResultParser", ) + +graph_parsers = {parser.name for parser in 
plugins(kind=Parser)} +result_parsers = {parser.name for parser in plugins(kind=ResultParser)} +graph_result_parsers = graph_parsers - result_parsers +for parser_name in graph_result_parsers: + register( + parser_name, + ResultParser, + "rdflib.plugins.sparql.results.graph", + "GraphResultParser", + ) diff --git a/rdflib/plugins/stores/sparqlconnector.py b/rdflib/plugins/stores/sparqlconnector.py index e2bb83909c..d4cb86c63e 100644 --- a/rdflib/plugins/stores/sparqlconnector.py +++ b/rdflib/plugins/stores/sparqlconnector.py @@ -9,8 +9,10 @@ from urllib.parse import urlencode from urllib.request import Request, urlopen -from rdflib.query import Result +from rdflib.plugin import plugins +from rdflib.query import Result, ResultParser from rdflib.term import BNode +from rdflib.util import FORMAT_MIMETYPE_MAP, RESPONSE_TABLE_FORMAT_MIMETYPE_MAP log = logging.getLogger(__name__) @@ -22,16 +24,6 @@ class SPARQLConnectorException(Exception): # noqa: N818 pass -# TODO: Pull in these from the result implementation plugins? 
-_response_mime_types = { - "xml": "application/sparql-results+xml, application/rdf+xml", - "json": "application/sparql-results+json", - "csv": "text/csv", - "tsv": "text/tab-separated-values", - "application/rdf+xml": "application/rdf+xml", -} - - class SPARQLConnector: """ this class deals with nitty gritty details of talking to a SPARQL server @@ -41,7 +33,7 @@ def __init__( self, query_endpoint: Optional[str] = None, update_endpoint: Optional[str] = None, - returnFormat: str = "xml", # noqa: N803 + returnFormat: Optional[str] = "xml", # noqa: N803 method: te.Literal["GET", "POST", "POST_FORM"] = "GET", auth: Optional[Tuple[str, str]] = None, **kwargs, @@ -95,7 +87,7 @@ def query( if default_graph is not None and type(default_graph) is not BNode: params["default-graph-uri"] = default_graph - headers = {"Accept": _response_mime_types[self.returnFormat]} + headers = {"Accept": self.response_mime_types()} args = copy.deepcopy(self.kwargs) @@ -170,7 +162,7 @@ def update( params["using-named-graph-uri"] = named_graph headers = { - "Accept": _response_mime_types[self.returnFormat], + "Accept": self.response_mime_types(), "Content-Type": "application/sparql-update; charset=UTF-8", } @@ -188,5 +180,27 @@ def update( ) ) + def response_mime_types(self) -> str: + """Construct a HTTP-Header Accept field to reflect the supported mime types. + + If the return_format parameter is set, the mime types are restricted to these accordingly. 
+ """ + sparql_format_mimetype_map = { + k: FORMAT_MIMETYPE_MAP.get(k, []) + + RESPONSE_TABLE_FORMAT_MIMETYPE_MAP.get(k, []) + for k in list(FORMAT_MIMETYPE_MAP.keys()) + + list(RESPONSE_TABLE_FORMAT_MIMETYPE_MAP.keys()) + } + + supported_formats = set() + for plugin in plugins(name=self.returnFormat, kind=ResultParser): + if "/" not in plugin.name: + supported_formats.update( + sparql_format_mimetype_map.get(plugin.name, []) + ) + else: + supported_formats.add(plugin.name) + return ", ".join(supported_formats) + __all__ = ["SPARQLConnector", "SPARQLConnectorException"] diff --git a/rdflib/plugins/stores/sparqlstore.py b/rdflib/plugins/stores/sparqlstore.py index d63986b143..2d3e259fe5 100644 --- a/rdflib/plugins/stores/sparqlstore.py +++ b/rdflib/plugins/stores/sparqlstore.py @@ -129,7 +129,7 @@ def __init__( sparql11: bool = True, context_aware: bool = True, node_to_sparql: _NodeToSparql = _node_to_sparql, - returnFormat: str = "xml", # noqa: N803 + returnFormat: Optional[str] = "xml", # noqa: N803 auth: Optional[Tuple[str, str]] = None, **sparqlconnector_kwargs, ): diff --git a/rdflib/util.py b/rdflib/util.py index ab594b5bef..96260fc208 100644 --- a/rdflib/util.py +++ b/rdflib/util.py @@ -74,6 +74,47 @@ _AnyT = TypeVar("_AnyT") +SUFFIX_FORMAT_MAP = { + "xml": "xml", + "rdf": "xml", + "owl": "xml", + "n3": "n3", + "ttl": "turtle", + "nt": "nt", + "trix": "trix", + "xhtml": "rdfa", + "html": "rdfa", + "svg": "rdfa", + "nq": "nquads", + "nquads": "nquads", + "trig": "trig", + "json": "json-ld", + "jsonld": "json-ld", + "json-ld": "json-ld", +} + + +FORMAT_MIMETYPE_MAP = { + "xml": ["application/rdf+xml"], + "n3": ["text/n3"], + "turtle": ["text/turtle"], + "nt": ["application/n-triples"], + "trix": ["application/trix"], + "rdfa": ["text/html", "application/xhtml+xml"], + "nquads": ["application/n-quads"], + "trig": ["application/trig"], + "json-ld": ["application/ld+json"], +} + + +RESPONSE_TABLE_FORMAT_MIMETYPE_MAP = { + "xml": ["application/sparql-results+xml"], 
+ "json": ["application/sparql-results+json"], + "csv": ["text/csv"], + "tsv": ["text/tab-separated-values"], +} + + def list2set(seq: Iterable[_HashableT]) -> List[_HashableT]: """ Return a new list without duplicates. @@ -331,26 +372,6 @@ def parse_date_time(val: str) -> int: return t -SUFFIX_FORMAT_MAP = { - "xml": "xml", - "rdf": "xml", - "owl": "xml", - "n3": "n3", - "ttl": "turtle", - "nt": "nt", - "trix": "trix", - "xhtml": "rdfa", - "html": "rdfa", - "svg": "rdfa", - "nq": "nquads", - "nquads": "nquads", - "trig": "trig", - "json": "json-ld", - "jsonld": "json-ld", - "json-ld": "json-ld", -} - - def guess_format(fpath: str, fmap: Optional[Dict[str, str]] = None) -> Optional[str]: """ Guess RDF serialization based on file suffix. Uses From 49afac8c15ce345ca797bffa5d6fdc6f738c42a5 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Tue, 30 Sep 2025 15:42:28 +1000 Subject: [PATCH 34/60] fix: skip prefix generation for predicates corresponding to base namespace (#3244) * skip prefix generation for predicates corresponding to base namespace * fix: skip prefix generation for predicates corresponding to the base namespace. This applies to both turtle and longturtle Also added fix to escape brackets in pnames to longturtle. 
--------- Co-authored-by: bulricht <108867473+bulricht@users.noreply.github.com> --- rdflib/plugins/serializers/longturtle.py | 19 ++++++++--- rdflib/plugins/serializers/turtle.py | 14 ++++++-- .../test_serializer_turtle_base_namespace.py | 34 +++++++++++++++++++ 3 files changed, 60 insertions(+), 7 deletions(-) create mode 100644 test/test_serializers/test_serializer_turtle_base_namespace.py diff --git a/rdflib/plugins/serializers/longturtle.py b/rdflib/plugins/serializers/longturtle.py index 2aaed36e6d..1e68e68495 100644 --- a/rdflib/plugins/serializers/longturtle.py +++ b/rdflib/plugins/serializers/longturtle.py @@ -22,7 +22,7 @@ from rdflib.compare import to_canonical_graph from rdflib.exceptions import Error -from rdflib.graph import Graph +from rdflib.graph import Graph, _TripleType from rdflib.namespace import RDF from rdflib.term import BNode, Literal, URIRef @@ -149,11 +149,20 @@ def serialize( self.base = None - def preprocessTriple(self, triple): + def preprocessTriple(self, triple: _TripleType) -> None: super(LongTurtleSerializer, self).preprocessTriple(triple) for i, node in enumerate(triple): - if node in self.keywords: - continue + if i == VERB: + if node in self.keywords: + # predicate is a keyword + continue + if ( + self.base is not None + and isinstance(node, URIRef) + and node.startswith(self.base) + ): + # predicate corresponds to base namespace + continue # Don't use generated prefixes for subjects and objects self.getQName(node, gen_prefix=(i == VERB)) if isinstance(node, Literal) and node.datatype: @@ -180,6 +189,8 @@ def getQName(self, uri, gen_prefix=True): prefix, namespace, local = parts + local = local.replace(r"(", r"\(").replace(r")", r"\)") + # QName cannot end with . 
if local.endswith("."): return None diff --git a/rdflib/plugins/serializers/turtle.py b/rdflib/plugins/serializers/turtle.py index d1dfcf4a67..abee2e43b4 100644 --- a/rdflib/plugins/serializers/turtle.py +++ b/rdflib/plugins/serializers/turtle.py @@ -263,9 +263,17 @@ def serialize( def preprocessTriple(self, triple: _TripleType) -> None: super(TurtleSerializer, self).preprocessTriple(triple) for i, node in enumerate(triple): - if i == VERB and node in self.keywords: - # predicate is a keyword - continue + if i == VERB: + if node in self.keywords: + # predicate is a keyword + continue + if ( + self.base is not None + and isinstance(node, URIRef) + and node.startswith(self.base) + ): + # predicate corresponds to base namespace + continue # Don't use generated prefixes for subjects and objects self.getQName(node, gen_prefix=(i == VERB)) if isinstance(node, Literal) and node.datatype: diff --git a/test/test_serializers/test_serializer_turtle_base_namespace.py b/test/test_serializers/test_serializer_turtle_base_namespace.py new file mode 100644 index 0000000000..f0971611d8 --- /dev/null +++ b/test/test_serializers/test_serializer_turtle_base_namespace.py @@ -0,0 +1,34 @@ +from rdflib import Graph, Namespace, URIRef + +# https://github.com/RDFLib/rdflib/issues/3237 +# Test that the ns1 prefix is not generated when the base is set. +mns = Namespace("http://my-namespace.net/") + + +def test_turtle(): + g = Graph(base="http://my-base.net/") + g.add((mns.foo, URIRef("http://my-base.net/my-predicate"), mns.bar)) + result = g.serialize(format="text/turtle") + assert ( + result + == """@base . + + . + +""" + ) + + +def test_longturtle(): + g = Graph(base="http://my-base.net/") + g.add((mns.foo, URIRef("http://my-base.net/my-predicate"), mns.bar)) + result = g.serialize(format="longturtle", canon=True) + assert ( + result + == """BASE + + + ; +. 
+""" + ), print(result) From 76642cc8bbc3402d7d72548e2d8e5834f5e88117 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Wed, 15 Oct 2025 15:45:06 +1000 Subject: [PATCH 35/60] fix(v7): remove Literal.toPython date conversion for gYear/gYearMonth (#3115) (#3258) * fix: remove Literal.toPython date conversion for gYear/gYearMonth (#3115) * fix: remove Literal.toPython casting for gYear and gYearMonth Issue #3078 reports, that rdflib.Literal.toPython casting of xsd:gYear and xsd:gYearMonth to datetime objects is not possible, as there is no appropriate Python equivalence for those types. The current implementation casts xsd:gYear and xsd:gYearMonth to datetime objects assuming January 1st for xsd:gYear and the 1st day of the given month for xsd:gYearMonth. This is plain wrong. The change removes casting to datetime objects in rdflib.Literal.toPython for xsd:gYear and xsd:gYearMonth. Closes #3078 . * test: adapt rdflib.Literal tests to gYear/gYearMonth toPython change --------- Co-authored-by: Nicholas Car * style: apply black formatting --------- Co-authored-by: Lukas Plank <128675670+lu-pl@users.noreply.github.com> Co-authored-by: Nicholas Car --- rdflib/term.py | 4 -- rdflib/xsd_datetime.py | 73 ------------------------------- test/test_literal/test_literal.py | 25 ++++++----- 3 files changed, 13 insertions(+), 89 deletions(-) diff --git a/rdflib/term.py b/rdflib/term.py index 3e397674b0..3e1b56ed6e 100644 --- a/rdflib/term.py +++ b/rdflib/term.py @@ -76,8 +76,6 @@ parse_time, parse_xsd_date, parse_xsd_duration, - parse_xsd_gyear, - parse_xsd_gyearmonth, ) if TYPE_CHECKING: @@ -2065,8 +2063,6 @@ def _castPythonToLiteral( # noqa: N802 None: None, # plain literals map directly to value space URIRef(_XSD_PFX + "time"): parse_time, URIRef(_XSD_PFX + "date"): parse_xsd_date, - URIRef(_XSD_PFX + "gYear"): parse_xsd_gyear, - URIRef(_XSD_PFX + "gYearMonth"): parse_xsd_gyearmonth, URIRef(_XSD_PFX + "dateTime"): parse_datetime, 
URIRef(_XSD_PFX + "duration"): parse_xsd_duration, URIRef(_XSD_PFX + "dayTimeDuration"): parse_xsd_duration, diff --git a/rdflib/xsd_datetime.py b/rdflib/xsd_datetime.py index bc3bebd67c..e05dd3c137 100644 --- a/rdflib/xsd_datetime.py +++ b/rdflib/xsd_datetime.py @@ -593,79 +593,6 @@ def parse_xsd_date(date_string: str): return parse_date(date_string if not minus else ("-" + date_string)) -def parse_xsd_gyear(gyear_string: str): - """ - XSD gYear has more features than ISO8601 dates, specifically - XSD allows timezones on a gYear, that must be stripped off. - """ - if gyear_string.endswith("Z") or gyear_string.endswith("z"): - gyear_string = gyear_string[:-1] - if gyear_string.startswith("-"): - gyear_string = gyear_string[1:] - minus = True - else: - minus = False - has_plus = gyear_string.rfind("+") - if has_plus > 0: - # Drop the +07:00 timezone part - gyear_string = gyear_string[:has_plus] - else: - split_parts = gyear_string.rsplit("-", 1) - if len(split_parts) > 1 and ":" in split_parts[-1]: - # Drop the -09:00 timezone part - gyear_string = split_parts[0] - if len(gyear_string) < 4: - raise ValueError("gYear string must be at least 4 numerals in length") - gyear_string = gyear_string.lstrip("0") # strip all leading zeros - try: - y = int(gyear_string if not minus else ("-" + gyear_string)) - except ValueError: - raise ValueError("gYear string must be a valid integer") - return date(y, 1, 1) - - -def parse_xsd_gyearmonth(gym_string: str): - """ - XSD gYearMonth has more features than ISO8601 dates, specifically - XSD allows timezones on a gYearMonth, that must be stripped off. 
- """ - if gym_string.endswith("Z") or gym_string.endswith("z"): - gym_string = gym_string[:-1] - if gym_string.startswith("-"): - gym_string = gym_string[1:] - minus = True - else: - minus = False - has_plus = gym_string.rfind("+") - if has_plus > 0: - # Drop the +07:00 timezone part - gym_string = gym_string[:has_plus] - else: - split_parts = gym_string.rsplit("-", 1) - if len(split_parts) > 1 and ":" in split_parts[-1]: - # Drop the -09:00 timezone part - gym_string = split_parts[0] - year_month_parts = gym_string.split("-", 1) - if len(year_month_parts) < 2: - raise ValueError("XSD gYearMonth string must contain one dash") - - if len(year_month_parts[0]) < 4: - raise ValueError("gYearMonth Year part must be at least 4 numerals in length") - elif len(year_month_parts[1]) < 2: - raise ValueError("gYearMonth Month part must be exactly 2 numerals in length") - year_string = year_month_parts[0].lstrip("0") # strip all leading zeros - month_string = year_month_parts[1].lstrip("0") # strip all leading zeros - try: - y = int(year_string if not minus else ("-" + year_string)) - except ValueError: - raise ValueError("gYearMonth Year part must be a valid integer") - try: - m = int(month_string) - except ValueError: - raise ValueError("gYearMonth Month part must be a valid integer") - return date(y, m, 1) - - # Parse XSD Datetime is the same as ISO8601 Datetime # It uses datetime.fromisoformat for python 3.11 and above # or isodate.parse_datetime for older versions diff --git a/test/test_literal/test_literal.py b/test/test_literal/test_literal.py index a28e67e12e..7e31fb0401 100644 --- a/test/test_literal/test_literal.py +++ b/test/test_literal/test_literal.py @@ -848,21 +848,22 @@ def unlexify(s: str) -> str: ("0000-00-00", XSD.date, None), ("NOT A VALID HEX STRING", XSD.hexBinary, None), ("NOT A VALID BASE64 STRING", XSD.base64Binary, None), + # xsd:gYear and xsd:gYearMonth also do not get converted + ("1921-05", XSD.gYearMonth, None), + ("0001-01", XSD.gYearMonth, 
None), + ("0001-12", XSD.gYearMonth, None), + ("2002-01", XSD.gYearMonth, None), + ("9999-01", XSD.gYearMonth, None), + ("9999-12", XSD.gYearMonth, None), + ("1921", XSD.gYear, None), + ("2000", XSD.gYear, None), + ("0001", XSD.gYear, None), + ("9999", XSD.gYear, None), + ("1982", XSD.gYear, None), + ("2002", XSD.gYear, None), # these literals get converted to python types ("1921-05-01", XSD.date, datetime.date), ("1921-05-01T00:00:00", XSD.dateTime, datetime.datetime), - ("1921-05", XSD.gYearMonth, datetime.date), - ("0001-01", XSD.gYearMonth, datetime.date), - ("0001-12", XSD.gYearMonth, datetime.date), - ("2002-01", XSD.gYearMonth, datetime.date), - ("9999-01", XSD.gYearMonth, datetime.date), - ("9999-12", XSD.gYearMonth, datetime.date), - ("1921", XSD.gYear, datetime.date), - ("2000", XSD.gYear, datetime.date), - ("0001", XSD.gYear, datetime.date), - ("9999", XSD.gYear, datetime.date), - ("1982", XSD.gYear, datetime.date), - ("2002", XSD.gYear, datetime.date), ("1921-05-01T00:00:00+00:30", XSD.dateTime, datetime.datetime), ("1921-05-01T00:00:00-00:30", XSD.dateTime, datetime.datetime), ("true", XSD.boolean, bool), From 070e7afd76da9b325951623c7e6dd576192a2758 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 16 Oct 2025 14:28:17 +1000 Subject: [PATCH 36/60] feat: allow adding graphs backed by different stores to the same dataset (#3259) * feat: allow adding graphs backed by different stores to the same dataset * chore: add comment for clarity * test: confirm adding a new graph, different backing store but same identifier appends to the dataset * test: update comments and test names --- rdflib/graph.py | 14 ++++- test/test_dataset/test_dataset_add.py | 88 +++++++++++++++++++++++++++ 2 files changed, 101 insertions(+), 1 deletion(-) create mode 100644 test/test_dataset/test_dataset_add.py diff --git a/rdflib/graph.py b/rdflib/graph.py index b3d16c4819..dc7dcdc096 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py 
@@ -2104,7 +2104,19 @@ def _graph( if not isinstance(c, Graph): return self.get_context(c) else: - return c + if isinstance(c, (Dataset, ConjunctiveGraph)): + # Preserve the old behaviour for datasets. + return c + else: + # Copy the graph triples so they're added to the store. + try: + _graph = self.get_graph(c.identifier) + assert _graph is not None + except IndexError: + _graph = self.get_context(c.identifier) + _graph.__iadd__(c) + # Return the graph with the same backing store. + return _graph def addN( # noqa: N802 self: _ConjunctiveGraphT, quads: Iterable[_QuadType] diff --git a/test/test_dataset/test_dataset_add.py b/test/test_dataset/test_dataset_add.py new file mode 100644 index 0000000000..e2b587613f --- /dev/null +++ b/test/test_dataset/test_dataset_add.py @@ -0,0 +1,88 @@ +from rdflib import RDF, RDFS, Dataset, Graph, URIRef + + +def test_behaviour_where_graph_is_created_via_dataset(): + """ + Test that the dataset store state is intact when graphs are created from the dataset. + """ + ds = Dataset(default_union=True) + graph_name = URIRef("urn:graph") + graph = ds.graph(graph_name) + + graph.add((URIRef("urn:class"), RDF.type, RDFS.Class)) + assert len(graph) == 1 + assert len(ds) == 1 + + ds.add_graph(graph) + assert any(g.identifier == graph_name for g in ds.graphs()) + assert len(list(ds.objects(None, None))) == 1 + + retrieved_graph = ds.get_graph(graph_name) + assert retrieved_graph.identifier == graph_name + assert isinstance(retrieved_graph, Graph) + + assert len(retrieved_graph) == 1 + assert len(ds) == 1 + + ds.remove_graph(retrieved_graph) + assert len(retrieved_graph) == 0 + assert len(ds) == 0 + + +def test_behaviour_where_graph_is_created_separately(): + """ + Test that the graphs created externally from the dataset are added to the dataset + store. 
+ """ + ds = Dataset(default_union=True) + graph_name = URIRef("urn:graph") + graph = Graph(identifier=graph_name) + + graph.add((URIRef("urn:class"), RDF.type, RDFS.Class)) + assert len(graph) == 1 + assert len(ds) == 0 + + ds.add_graph(graph) + assert any(g.identifier == graph_name for g in ds.graphs()) + + assert len(list(ds.objects(None, None))) == 1 + + retrieved_graph = ds.get_graph(graph_name) + assert retrieved_graph.identifier == graph_name + assert isinstance(retrieved_graph, Graph) + + assert len(retrieved_graph) == 1 + assert len(ds) == 1 + + ds.remove_graph(retrieved_graph) + assert len(retrieved_graph) == 0 + assert len(ds) == 0 + + +def test_adding_appends_to_dataset_graph(): + """ + Test that external graphs added to the dataset have their triples appended if + the graph identifier already exists. + """ + ds = Dataset(default_union=True) + graph_name = URIRef("urn:graph") + graph = Graph(identifier=graph_name) + + graph.add((URIRef("urn:class"), RDF.type, RDFS.Class)) + assert len(graph) == 1 + assert len(ds) == 0 + + # Make sure to use the returned graph object with the same dataset store. 
+ graph = ds.add_graph(graph) + assert len(ds) == 1 + + graph.add((URIRef("urn:instance"), RDF.type, URIRef("urn:class"))) + assert len(graph) == 2 + assert len(ds) == 2 + + # Another graph, same identifier + another_graph = Graph(identifier=graph_name) + another_graph.add((URIRef("urn:another_instance"), RDF.type, URIRef("urn:class"))) + graph = ds.add_graph(another_graph) + assert len(graph) == 3 + assert len(ds) == 3 From 5ee04666501ef244120f53ef0a2d55b6700ba545 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 16 Oct 2025 17:10:11 +1000 Subject: [PATCH 37/60] feat: set dataset's default serialize format to trig (#3260) --- rdflib/graph.py | 67 +++++++++++++++++++ .../test_dataset_default_serialize_format.py | 18 +++++ 2 files changed, 85 insertions(+) create mode 100644 test/test_dataset/test_dataset_default_serialize_format.py diff --git a/rdflib/graph.py b/rdflib/graph.py index dc7dcdc096..bc9ab05f2b 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -2671,6 +2671,73 @@ def __iter__( # type: ignore[override] """Iterates over all quads in the store""" return self.quads((None, None, None, None)) + @overload + def serialize( + self, + destination: None, + format: str, + base: Optional[str], + encoding: str, + **args: Any, + ) -> bytes: ... + + # no destination and non-None keyword encoding + @overload + def serialize( + self, + destination: None = ..., + format: str = ..., + base: Optional[str] = ..., + *, + encoding: str, + **args: Any, + ) -> bytes: ... + + # no destination and None encoding + @overload + def serialize( + self, + destination: None = ..., + format: str = ..., + base: Optional[str] = ..., + encoding: None = ..., + **args: Any, + ) -> str: ... + + # non-None destination + @overload + def serialize( + self, + destination: Union[str, pathlib.PurePath, IO[bytes]], + format: str = ..., + base: Optional[str] = ..., + encoding: Optional[str] = ..., + **args: Any, + ) -> Graph: ... 
+ + # fallback + @overload + def serialize( + self, + destination: Optional[Union[str, pathlib.PurePath, IO[bytes]]] = ..., + format: str = ..., + base: Optional[str] = ..., + encoding: Optional[str] = ..., + **args: Any, + ) -> Union[bytes, str, Graph]: ... + + def serialize( + self, + destination: Optional[Union[str, pathlib.PurePath, IO[bytes]]] = None, + format: str = "trig", + base: Optional[str] = None, + encoding: Optional[str] = None, + **args: Any, + ) -> Union[bytes, str, Graph]: + return super(Dataset, self).serialize( + destination=destination, format=format, base=base, encoding=encoding, **args + ) + class QuotedGraph(Graph): """ diff --git a/test/test_dataset/test_dataset_default_serialize_format.py b/test/test_dataset/test_dataset_default_serialize_format.py new file mode 100644 index 0000000000..ba12d0c706 --- /dev/null +++ b/test/test_dataset/test_dataset_default_serialize_format.py @@ -0,0 +1,18 @@ +from rdflib import Dataset +from test.data import TEST_DATA_DIR + + +def test_dataset_default_serialize_format_trig(): + ds = Dataset() + ds.parse(source=TEST_DATA_DIR / "nquads.rdflib" / "test5.nquads", format="nquads") + statements_count = len(ds) + assert statements_count + + # Previously, when the default format is 'turtle', given that the dataset has no + # statements in the default graph, the output of serialize() was empty. + output = ds.serialize().strip() + assert output != "" + + ds2 = Dataset() + ds2.parse(data=output, format="trig") + assert len(ds2) == statements_count From 32b6b88e9a267c3b517e0a36bf49c3f77535dca9 Mon Sep 17 00:00:00 2001 From: Focke Date: Thu, 16 Oct 2025 11:11:41 +0200 Subject: [PATCH 38/60] sparqls optionals clause can now bind variables. with test. 
issue 2957 (#3247) Co-authored-by: Richard Focke Fechner Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> --- rdflib/plugins/sparql/evaluate.py | 2 +- test/test_sparql/test_optional.py | 37 +++++++++++++++++++++++++++++++ 2 files changed, 38 insertions(+), 1 deletion(-) create mode 100644 test/test_sparql/test_optional.py diff --git a/rdflib/plugins/sparql/evaluate.py b/rdflib/plugins/sparql/evaluate.py index 82fe8034f6..363918179a 100644 --- a/rdflib/plugins/sparql/evaluate.py +++ b/rdflib/plugins/sparql/evaluate.py @@ -191,7 +191,7 @@ def evalLeftJoin( for b in evalPart(c, join.p2): if _ebv(join.expr, b.forget(ctx)): ok = True - yield b + yield b.merge(a) if not ok: # we've cheated, the ctx above may contain # vars bound outside our scope diff --git a/test/test_sparql/test_optional.py b/test/test_sparql/test_optional.py new file mode 100644 index 0000000000..75fd71a1a4 --- /dev/null +++ b/test/test_sparql/test_optional.py @@ -0,0 +1,37 @@ +from rdflib import Graph, Literal, URIRef, Variable + + +def test_binding_with_optional_clause() -> None: + """ + Optional clauses should bind variables if feasible. + + See https://github.com/RDFLib/rdflib/issues/2957 + """ + g = Graph().parse( + data=""" + prefix ex: + ex:document ex:subject "Nice cars" . + ex:someCar ex:type "Car" . + """ + ) + result = g.query( + """prefix ex: + select ?subject ?car + where { + $this ex:subject ?subject. + optional + { + # an offending subselect clause + select ?car + where { + ?car ex:type "Car". 
+ }
 + }""" + ) + assert len(result.bindings) == 1 + (first,) = result.bindings + assert first.get(Variable("car")) == URIRef("https://www.example.org/someCar") + assert first.get(Variable("subject")) == Literal( + "Nice cars" + ), "optional clause didn't bind" From 1a20cd94c24816860cff6372b64ecc7f7ffd0b91 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 17 Oct 2025 17:14:36 +1000 Subject: [PATCH 39/60] fix: dataset nquads serialization including RDFLib internal default graph identifier (#3262) * fix: dataset nquads serialization including RDFLib internal default graph identifier * chore: remove json-ld code --- rdflib/plugins/serializers/nquads.py | 7 ++-- .../test_nquads_default_graph.py | 39 +++++++++++++++++++ 2 files changed, 43 insertions(+), 3 deletions(-) create mode 100644 test/test_serializers/test_nquads_default_graph.py diff --git a/rdflib/plugins/serializers/nquads.py b/rdflib/plugins/serializers/nquads.py index b74b9cab52..d88454d1ed 100644 --- a/rdflib/plugins/serializers/nquads.py +++ b/rdflib/plugins/serializers/nquads.py @@ -3,7 +3,7 @@ import warnings from typing import IO, Any, Optional -from rdflib.graph import ConjunctiveGraph, Graph +from rdflib.graph import DATASET_DEFAULT_GRAPH_ID, ConjunctiveGraph, Graph from rdflib.plugins.serializers.nt import _quoteLiteral from rdflib.serializer import Serializer from rdflib.term import Literal @@ -45,17 +45,18 @@ def serialize( def _nq_row(triple, context): + graph_name = context.n3() if context and context != DATASET_DEFAULT_GRAPH_ID else "" if isinstance(triple[2], Literal): return "%s %s %s %s .\n" % ( triple[0].n3(), triple[1].n3(), _quoteLiteral(triple[2]), - context.n3(), + graph_name, ) else: return "%s %s %s %s .\n" % ( triple[0].n3(), triple[1].n3(), triple[2].n3(), - context.n3(), + graph_name, ) diff --git a/test/test_serializers/test_nquads_default_graph.py b/test/test_serializers/test_nquads_default_graph.py new file mode 100644 index 
0000000000..c1699d2353 --- /dev/null +++ b/test/test_serializers/test_nquads_default_graph.py @@ -0,0 +1,39 @@ +from rdflib import Dataset +from rdflib.compare import isomorphic + + +def test_nquads_default_graph(): + data = """ + @prefix rdf: . + @prefix rdfs: . + @prefix xsd: . + + { + "2012-04-09"^^xsd:date . + } + + { + a ; + "http://manu.sporny.org/about#manu" ; + "Gregg Kellogg" . + + a ; + "http://greggkellogg.net/foaf#me" ; + "Manu Sporny" . + } + """ + + ds = Dataset() + ds.parse(data=data, format="trig") + output = ds.serialize(format="nquads") + + # The internal RDFLib default graph identifier should not appear in the output. + assert "" not in output + + # Ensure dataset round-trip still works. + ds2 = Dataset() + ds2.parse(data=output, format="nquads") + for graph in ds.graphs(): + assert isomorphic(graph, ds2.graph(graph.identifier)), print( + f"{graph.identifier} not isomorphic" + ) From 0f3237d6765bcd98eafbf1282175bbb6eac9c1fa Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 17 Oct 2025 18:29:12 +1000 Subject: [PATCH 40/60] fix: Dataset.parse now returns Self (#3263) * fix: Dataset.parse now returns Self * chore: remove test that's not relevant for this PR --- rdflib/graph.py | 8 +++----- rdflib/plugins/parsers/nquads.py | 2 +- test/test_dataset/test_dataset_add.py | 13 +++++++++++++ 3 files changed, 17 insertions(+), 6 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index bc9ab05f2b..90907cde01 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -2346,8 +2346,7 @@ def parse( context = self.default_context context.parse(source, publicID=publicID, format=format, **args) - # TODO: FIXME: This should not return context, but self. - return context + return self def __reduce__(self) -> Tuple[Type[Graph], Tuple[Store, _ContextIdentifierType]]: return ConjunctiveGraph, (self.store, self.identifier) @@ -2615,11 +2614,10 @@ def parse( (i.e. :attr:`.Dataset.default_context`). 
""" - c = ConjunctiveGraph.parse( + ConjunctiveGraph.parse( self, source, publicID, format, location, file, data, **args ) - self.graph(c) - return c + return self def add_graph( self, g: Optional[Union[_ContextIdentifierType, _ContextType, str]] diff --git a/rdflib/plugins/parsers/nquads.py b/rdflib/plugins/parsers/nquads.py index 60b793b65c..cadcfa28a3 100644 --- a/rdflib/plugins/parsers/nquads.py +++ b/rdflib/plugins/parsers/nquads.py @@ -7,7 +7,7 @@ >>> g = ConjunctiveGraph() >>> data = open("test/data/nquads.rdflib/example.nquads", "rb") >>> g.parse(data, format="nquads") # doctest:+ELLIPSIS -)> +)> >>> assert len(g.store) == 449 >>> # There should be 16 separate contexts >>> assert len([x for x in g.store.contexts()]) == 16 diff --git a/test/test_dataset/test_dataset_add.py b/test/test_dataset/test_dataset_add.py index e2b587613f..a3197ae2e2 100644 --- a/test/test_dataset/test_dataset_add.py +++ b/test/test_dataset/test_dataset_add.py @@ -1,4 +1,5 @@ from rdflib import RDF, RDFS, Dataset, Graph, URIRef +from test.data import TEST_DATA_DIR def test_behaviour_where_graph_is_created_via_dataset(): @@ -86,3 +87,15 @@ def test_adding_appends_to_dataset_graph(): graph = ds.add_graph(another_graph) assert len(graph) == 3 assert len(ds) == 3 + + +def test_dataset_parse_return_value(): + """ + Test that the return value of ds.parse has the same reference as ds. 
+ """ + ds = Dataset() + return_value = ds.parse( + source=TEST_DATA_DIR / "nquads.rdflib/example.nquads", format="nquads" + ) + assert len(ds) + assert return_value is ds From d5774efffab6d28e3805a0beb740ad5c287c34aa Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 17 Oct 2025 18:32:44 +1000 Subject: [PATCH 41/60] chore: add deprecation notice to Dataset methods and attributes (#3264) * chore: add deprecation warning to Dataset.contexts() * chore: deprecate Dataset.default_context and introduce Dataset.default_graph * chore: deprecate Dataset.identifier * chore: replace usage of default_context with default_graph in Dataset --- rdflib/graph.py | 78 ++++++++++++++++--- .../test_dataset_deprec_notice.py | 37 +++++++++ 2 files changed, 104 insertions(+), 11 deletions(-) create mode 100644 test/test_dataset/test_dataset_deprec_notice.py diff --git a/rdflib/graph.py b/rdflib/graph.py index 90907cde01..5d739a9574 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -1969,7 +1969,7 @@ class ConjunctiveGraph(Graph): All queries are carried out against the union of all graphs. """ - default_context: _ContextType + _default_context: _ContextType def __init__( self, @@ -1991,10 +1991,18 @@ def __init__( ) self.context_aware = True self.default_union = True # Conjunctive! 
- self.default_context: _ContextType = Graph( + self._default_context: _ContextType = Graph( store=self.store, identifier=identifier or BNode(), base=default_graph_base ) + @property + def default_context(self): + return self._default_context + + @default_context.setter + def default_context(self, value): + self._default_context = value + def __str__(self) -> str: pattern = ( "[a rdflib:ConjunctiveGraph;rdflib:storage " @@ -2519,7 +2527,7 @@ def __init__( if not self.store.graph_aware: raise Exception("DataSet must be backed by a graph-aware store!") - self.default_context = Graph( + self._default_context = Graph( store=self.store, identifier=DATASET_DEFAULT_GRAPH_ID, base=default_graph_base, @@ -2527,6 +2535,41 @@ def __init__( self.default_union = default_union + @property + def default_context(self): + warnings.warn( + "Dataset.default_context is deprecated, use Dataset.default_graph instead.", + DeprecationWarning, + stacklevel=2, + ) + return self._default_context + + @default_context.setter + def default_context(self, value): + warnings.warn( + "Dataset.default_context is deprecated, use Dataset.default_graph instead.", + DeprecationWarning, + stacklevel=2, + ) + self._default_context = value + + @property + def default_graph(self): + return self._default_context + + @default_graph.setter + def default_graph(self, value): + self._default_context = value + + @property + def identifier(self): + warnings.warn( + "Dataset.identifier is deprecated and will be removed in future versions.", + DeprecationWarning, + stacklevel=2, + ) + return super(Dataset, self).identifier + def __str__(self) -> str: pattern = ( "[a rdflib:Dataset;rdflib:storage " "[a rdflib:Store;rdfs:label '%s']]" @@ -2539,14 +2582,14 @@ def __reduce__(self) -> Tuple[Type[Dataset], Tuple[Store, bool]]: # type: ignor return (type(self), (self.store, self.default_union)) def __getstate__(self) -> Tuple[Store, _ContextIdentifierType, _ContextType, bool]: - return self.store, self.identifier, 
self.default_context, self.default_union + return self.store, self.identifier, self.default_graph, self.default_union def __setstate__( self, state: Tuple[Store, _ContextIdentifierType, _ContextType, bool] ) -> None: # type error: Property "store" defined in "Graph" is read-only # type error: Property "identifier" defined in "Graph" is read-only - self.store, self.identifier, self.default_context, self.default_union = state # type: ignore[misc] + self.store, self.identifier, self.default_graph, self.default_union = state # type: ignore[misc] def graph( self, @@ -2590,7 +2633,7 @@ def parse( If the source is in a format that does not support named graphs its triples will be added to the default graph - (i.e. :attr:`.Dataset.default_context`). + (i.e. :attr:`.Dataset.default_graph`). .. caution:: @@ -2611,7 +2654,7 @@ def parse( the ``publicID`` parameter will also not be used as the name for the graph that the data is loaded into, and instead the triples from sources that do not support named graphs will be loaded into the default graph - (i.e. :attr:`.Dataset.default_context`). + (i.e. :attr:`.Dataset.default_graph`). 
""" ConjunctiveGraph.parse( @@ -2632,15 +2675,20 @@ def remove_graph( g = self.get_context(g) self.store.remove_graph(g) - if g is None or g == self.default_context: + if g is None or g == self.default_graph: # default graph cannot be removed # only triples deleted, so add it back in - self.store.add_graph(self.default_context) + self.store.add_graph(self.default_graph) return self def contexts( self, triple: Optional[_TripleType] = None ) -> Generator[_ContextType, None, None]: + warnings.warn( + "Dataset.contexts is deprecated, use Dataset.graphs instead.", + DeprecationWarning, + stacklevel=2, + ) default = False for c in super(Dataset, self).contexts(triple): default |= c.identifier == DATASET_DEFAULT_GRAPH_ID @@ -2648,7 +2696,15 @@ def contexts( if not default: yield self.graph(DATASET_DEFAULT_GRAPH_ID) - graphs = contexts + def graphs( + self, triple: Optional[_TripleType] = None + ) -> Generator[_ContextType, None, None]: + default = False + for c in super(Dataset, self).contexts(triple): + default |= c.identifier == DATASET_DEFAULT_GRAPH_ID + yield c + if not default: + yield self.graph(DATASET_DEFAULT_GRAPH_ID) # type error: Return type "Generator[Tuple[Node, Node, Node, Optional[Node]], None, None]" of "quads" incompatible with return type "Generator[Tuple[Node, Node, Node, Optional[Graph]], None, None]" in supertype "ConjunctiveGraph" def quads( # type: ignore[override] @@ -2656,7 +2712,7 @@ def quads( # type: ignore[override] ) -> Generator[_OptionalIdentifiedQuadType, None, None]: for s, p, o, c in super(Dataset, self).quads(quad): # type error: Item "None" of "Optional[Graph]" has no attribute "identifier" - if c.identifier == self.default_context: # type: ignore[union-attr] + if c.identifier == self.default_graph: # type: ignore[union-attr] yield s, p, o, None else: # type error: Item "None" of "Optional[Graph]" has no attribute "identifier" [union-attr] diff --git a/test/test_dataset/test_dataset_deprec_notice.py 
b/test/test_dataset/test_dataset_deprec_notice.py new file mode 100644 index 0000000000..068ddd3c25 --- /dev/null +++ b/test/test_dataset/test_dataset_deprec_notice.py @@ -0,0 +1,37 @@ +import pytest + +from rdflib import Dataset + + +def test_dataset_contexts_method(): + ds = Dataset() + with pytest.warns( + DeprecationWarning, + match="Dataset.contexts is deprecated, use Dataset.graphs instead.", + ): + # Call list() to consume the generator to emit the warning. + list(ds.contexts()) + + +def test_dataset_default_context_property(): + ds = Dataset() + with pytest.warns( + DeprecationWarning, + match="Dataset.default_context is deprecated, use Dataset.default_graph instead.", + ): + ds.default_context + + with pytest.warns( + DeprecationWarning, + match="Dataset.default_context is deprecated, use Dataset.default_graph instead.", + ): + ds.default_context = ds.graph() + + +def test_dataset_identifier_property(): + ds = Dataset() + with pytest.warns( + DeprecationWarning, + match="Dataset.identifier is deprecated and will be removed in future versions.", + ): + ds.identifier From 2796fee74cf22b2debf5bff019bd0a929af8cee3 Mon Sep 17 00:00:00 2001 From: Focke Date: Fri, 17 Oct 2025 14:01:46 +0200 Subject: [PATCH 42/60] patch for reevaluation in sparql modify between update loops. 
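The deprecation patch above moves the data to a private `_default_context` attribute, warns on the old property name, and exposes the new name silently. A stdlib-only sketch of that pattern (the `Dataset` class here is a simplified stand-in, not rdflib's):

```python
import warnings


class Dataset:
    """Hypothetical stand-in showing the deprecation pattern applied to
    Dataset.default_context: private storage, warning on the old name,
    silent access via the new name."""

    def __init__(self):
        self._default_context = object()

    @property
    def default_context(self):
        warnings.warn(
            "Dataset.default_context is deprecated, use Dataset.default_graph instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return self._default_context

    @property
    def default_graph(self):
        return self._default_context


ds = Dataset()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert ds.default_graph is ds._default_context  # new name: no warning
    _ = ds.default_context                          # old name: warns
assert len(caught) == 1
assert issubclass(caught[0].category, DeprecationWarning)
```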
with test (#3261) Co-authored-by: Richard Focke Fechner Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> --- rdflib/plugins/sparql/update.py | 2 +- test/test_sparql/test_update.py | 74 +++++++++++++++++++++++++++++++++ 2 files changed, 75 insertions(+), 1 deletion(-) diff --git a/rdflib/plugins/sparql/update.py b/rdflib/plugins/sparql/update.py index cd22a7520f..a690839358 100644 --- a/rdflib/plugins/sparql/update.py +++ b/rdflib/plugins/sparql/update.py @@ -181,7 +181,7 @@ def evalModify(ctx: QueryContext, u: CompValue) -> None: g = ctx.dataset.get_context(u.withClause) ctx = ctx.pushGraph(g) - for c in res: + for c in list(res): dg = ctx.graph if u.delete: # type error: Unsupported left operand type for - ("None") diff --git a/test/test_sparql/test_update.py b/test/test_sparql/test_update.py index 17c7967fae..ba1ddb3068 100644 --- a/test/test_sparql/test_update.py +++ b/test/test_sparql/test_update.py @@ -4,6 +4,7 @@ import pytest +from rdflib import Literal, Namespace, Variable from rdflib.graph import ConjunctiveGraph, Dataset, Graph from test.data import TEST_DATA_DIR from test.utils import GraphHelper @@ -90,3 +91,76 @@ def test_load_into_named( ) GraphHelper.assert_collection_graphs_equal(expected_graph, actual_graph) + + +def test_reevaluation_between_updates_modify() -> None: + """ + during an update the values should be bound once and then deleted and inserted + once per valid binding. + + See https://github.com/RDFLib/rdflib/issues/3246 + """ + ex = Namespace("http://example.com/") + + g = Graph() + g.bind("ex", ex) + + g.add((ex.foo, ex.value, Literal(1))) + g.add((ex.foo, ex.value, Literal(11))) + + g.add((ex.bar, ex.value, Literal(3))) + + g.update( + """ + DELETE { + ex:bar ex:value ?oldValue . + } + INSERT { + ex:bar ex:value ?newValue . + } + WHERE { + ex:foo ex:value ?instValue . + OPTIONAL { ex:bar ex:value ?oldValue . 
} + BIND(COALESCE(?oldValue, 0) + ?instValue AS ?newValue) + } + """ + ) + + result = g.query("SELECT ?x WHERE { ex:bar ex:value ?x }") + values = {b.get(Variable("x")) for b in result} # type: ignore + assert values == {Literal(4), Literal(14)} + + +def test_reevaluation_between_updates_insert() -> None: + """ + during an update the values should be bound once and then deleted and inserted + once per valid binding. + + See https://github.com/RDFLib/rdflib/issues/3246 + """ + ex = Namespace("http://example.com/") + + g = Graph() + g.bind("ex", ex) + + g.add((ex.foo, ex.value, Literal(1))) + g.add((ex.foo, ex.value, Literal(11))) + + g.add((ex.bar, ex.value, Literal(3))) + + g.update( + """ + INSERT { + ex:bar ex:value ?newValue . + } + WHERE { + ex:foo ex:value ?instValue . + OPTIONAL { ex:bar ex:value ?oldValue . } + BIND(COALESCE(?oldValue, 0) + ?instValue AS ?newValue) + } + """ + ) + + result = g.query("SELECT ?x WHERE { ex:bar ex:value ?x }") + values = {b.get(Variable("x")) for b in result} # type: ignore + assert values == {Literal(3), Literal(4), Literal(14)} From 01de9bbd3a295a07ef8d00dc1ee9591f8aaeb411 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 13:56:02 +1000 Subject: [PATCH 43/60] fix: SPARQL Update inserts into the default graph (#3265) * fix: the return type of Dataset.parse * fix: SPARQL Update inserts into the default graph does not create a new graph with a blank node label as the graph name * chore: add todo comment --- rdflib/graph.py | 2 +- rdflib/plugins/sparql/update.py | 5 +- test/test_sparql/test_update.py | 90 ++++++++++++++++++++++++++++++++- 3 files changed, 93 insertions(+), 4 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index 5d739a9574..7b7256d041 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -2623,7 +2623,7 @@ def parse( file: Optional[Union[BinaryIO, TextIO]] = None, data: Optional[Union[str, bytes]] = None, **args: Any, - ) -> Graph: + ) -> 
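The one-line `evalModify` fix (`for c in list(res)`) materializes all solution bindings before any deletes or inserts run, so mutations made by the update cannot feed back into the same pass. A stdlib-only sketch of the failure mode and the fix (the set and generator are hypothetical stand-ins for the graph and the solution iterator):

```python
def run(materialize: bool):
    graph = {1, 11}  # stand-in for the store being updated

    def solutions():
        # Lazy generator, like the solution iterator evalModify consumes.
        yield from graph

    bindings = list(solutions()) if materialize else solutions()
    seen = []
    for value in bindings:
        graph.add(value + 100)  # the update mutates the graph mid-loop
        seen.append(value)
    return sorted(seen)


# Patched behavior: bindings are snapshotted before mutation, so each
# original solution is applied exactly once.
assert run(materialize=True) == [1, 11]

# Unpatched behavior: lazy iteration over mutating state blows up
# (CPython raises RuntimeError when a set changes size during iteration).
try:
    run(materialize=False)
    raised = False
except RuntimeError:
    raised = True
assert raised
```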
Dataset: """ Parse an RDF source adding the resulting triples to the Graph. diff --git a/rdflib/plugins/sparql/update.py b/rdflib/plugins/sparql/update.py index a690839358..c9d36564c0 100644 --- a/rdflib/plugins/sparql/update.py +++ b/rdflib/plugins/sparql/update.py @@ -182,7 +182,10 @@ def evalModify(ctx: QueryContext, u: CompValue) -> None: ctx = ctx.pushGraph(g) for c in list(res): - dg = ctx.graph + # TODO: Make this more intentional and without the weird type checking logic + # once ConjunctiveGraph is removed and Dataset no longer inherits from + # Graph. + dg = ctx.graph if type(ctx.graph) is Graph else ctx.dataset.default_context if u.delete: # type error: Unsupported left operand type for - ("None") # type error: Unsupported operand types for - ("Graph" and "Generator[Tuple[Identifier, Identifier, Identifier], None, None]") diff --git a/test/test_sparql/test_update.py b/test/test_sparql/test_update.py index ba1ddb3068..4b16aa7de2 100644 --- a/test/test_sparql/test_update.py +++ b/test/test_sparql/test_update.py @@ -4,8 +4,9 @@ import pytest -from rdflib import Literal, Namespace, Variable -from rdflib.graph import ConjunctiveGraph, Dataset, Graph +from rdflib import Literal, Namespace, URIRef, Variable +from rdflib.compare import isomorphic +from rdflib.graph import DATASET_DEFAULT_GRAPH_ID, ConjunctiveGraph, Dataset, Graph from test.data import TEST_DATA_DIR from test.utils import GraphHelper from test.utils.graph import GraphSource @@ -164,3 +165,88 @@ def test_reevaluation_between_updates_insert() -> None: result = g.query("SELECT ?x WHERE { ex:bar ex:value ?x }") values = {b.get(Variable("x")) for b in result} # type: ignore assert values == {Literal(3), Literal(4), Literal(14)} + + +def test_inserts_in_named_graph(): + trig_data = """ + @prefix ex: . + @prefix rdf: . + + # Named graph 1 + ex:graph1 { + ex:person1 ex:name "Alice" ; + ex:age 30 . + } + + # Named graph 2 + ex:graph2 { + ex:person1 ex:worksFor ex:company1 . 
+ ex:company1 ex:industry "Technology" . + } + """ + ds = Dataset().parse(data=trig_data, format="trig") + ds.update( + """ + INSERT { + GRAPH { + ?s ?p ?o + } + + ?s ?p ?o + } + WHERE { + GRAPH ?g { + ?s ?p ?o + } + } + """ + ) + + expected_trig = """ + @prefix ex: . + @prefix xsd: . + + { + ex:person1 ex:age 30 ; + ex:name "Alice" ; + ex:worksFor ex:company1 . + + ex:company1 ex:industry "Technology" . + } + + { + ex:person1 ex:age 30 ; + ex:name "Alice" ; + ex:worksFor ex:company1 . + + ex:company1 ex:industry "Technology" . + } + + ex:graph1 { + ex:person1 ex:age 30 ; + ex:name "Alice" . + } + + ex:graph2 { + ex:person1 ex:worksFor ex:company1 . + + ex:company1 ex:industry "Technology" . + } + """ + expected_ds = Dataset().parse(data=expected_trig, format="trig") + + # There should be exactly 4 graphs, including the default graph. + # SPARQL Update inserts into the default graph should go into the default graph, + # not to a new graph with a blank node label. + # See https://github.com/RDFLib/rdflib/issues/3080 + expected_graph_names = [ + DATASET_DEFAULT_GRAPH_ID, + URIRef("urn:graph"), + URIRef("http://example.org/graph1"), + URIRef("http://example.org/graph2"), + ] + assert set(expected_graph_names) == set(graph.identifier for graph in ds.graphs()) + + for graph in ds.graphs(): + expected_graph = expected_ds.graph(graph.identifier) + assert isomorphic(graph, expected_graph) From 2a902e512123d1b9a8035fa1652aa3d0a0a918a4 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 13:56:39 +1000 Subject: [PATCH 44/60] fix: allow static type checkers to infer term's __new__ type (#3266) * fix: the return value type of term's __new__ * fix: import of Self for python versions lower than 3.11 * fix: import errors in CI due to reliance on typing_extensions. 
Remove explicit typing and let type checkers to infer instead --------- Co-authored-by: Nicholas Car --- rdflib/term.py | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/rdflib/term.py b/rdflib/term.py index 3e1b56ed6e..254bc9d62c 100644 --- a/rdflib/term.py +++ b/rdflib/term.py @@ -153,7 +153,7 @@ class Identifier(Node, str): # allow Identifiers to be Nodes in the Graph __slots__ = () - def __new__(cls, value: str) -> Identifier: + def __new__(cls, value: str): return str.__new__(cls, value) def eq(self, other: Any) -> bool: @@ -288,7 +288,7 @@ class URIRef(IdentifiedNode): __neg__: Callable[[URIRef], NegatedPath] __truediv__: Callable[[URIRef, Union[URIRef, Path]], SequencePath] - def __new__(cls, value: str, base: Optional[str] = None) -> URIRef: + def __new__(cls, value: str, base: Optional[str] = None): if base is not None: ends_in_hash = value.endswith("#") # type error: Argument "allow_fragments" to "urljoin" has incompatible type "int"; expected "bool" @@ -464,7 +464,7 @@ def __new__( value: Optional[str] = None, _sn_gen: Optional[Union[Callable[[], str], Generator]] = None, _prefix: str = _unique_id(), - ) -> BNode: + ): """ # only store implementations should pass in a value """ @@ -494,7 +494,7 @@ def __new__( # must be valid NCNames" _:[A-Za-z][A-Za-z0-9]* # http://www.w3.org/TR/2004/REC-rdf-testcases-20040210/#nodeID # type error: Incompatible return value type (got "Identifier", expected "BNode") - return Identifier.__new__(cls, value) # type: ignore[return-value] + return Identifier.__new__(cls, value) def n3(self, namespace_manager: Optional[NamespaceManager] = None) -> str: # note - for two strings, concat with + is faster than f"{x}{y}" @@ -631,7 +631,7 @@ def __new__( lang: Optional[str] = None, datatype: Optional[str] = None, normalize: Optional[bool] = None, - ) -> Literal: + ): if lang == "": lang = None # no empty lang-tags in RDF @@ -701,7 +701,7 @@ def __new__( lexical_or_value = 
_strip_and_collapse_whitespace(lexical_or_value) try: - inst: Literal = str.__new__(cls, lexical_or_value) + inst = str.__new__(cls, lexical_or_value) except UnicodeDecodeError: inst = str.__new__(cls, lexical_or_value, "utf-8") @@ -2242,7 +2242,7 @@ class Variable(Identifier): __slots__ = () - def __new__(cls, value: str) -> Variable: + def __new__(cls, value: str): if len(value) == 0: raise Exception("Attempted to create variable with empty string as name!") if value[0] == "?": From 72287e45ad1a9801a85d262f20657c23e8831209 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 14:10:32 +1000 Subject: [PATCH 45/60] fix: RecursiveSerializer- outputs undeclared prefix for predicates that contains the base as a substring (#3267) * fix: turtle, longturtle, and n3 serializers - outputs undeclared prefix when using base on a predicate that contains the base as a substring Fixes: https://github.com/RDFLib/rdflib/issues/3160 * chore: remove print * chore: formatting --------- Co-authored-by: Nicholas Car --- rdflib/plugins/serializers/longturtle.py | 2 ++ rdflib/plugins/serializers/turtle.py | 18 ++++++++++++ test/test_n3.py | 6 ++-- .../test_serializer_longturtle.py | 29 +++++++++++++++++++ .../test_serializer_turtle.py | 28 ++++++++++++++++++ 5 files changed, 81 insertions(+), 2 deletions(-) diff --git a/rdflib/plugins/serializers/longturtle.py b/rdflib/plugins/serializers/longturtle.py index 1e68e68495..1cb2fa7368 100644 --- a/rdflib/plugins/serializers/longturtle.py +++ b/rdflib/plugins/serializers/longturtle.py @@ -160,6 +160,8 @@ def preprocessTriple(self, triple: _TripleType) -> None: self.base is not None and isinstance(node, URIRef) and node.startswith(self.base) + and "#" not in node.replace(self.base, "") + and "/" not in node.replace(self.base, "") ): # predicate corresponds to base namespace continue diff --git a/rdflib/plugins/serializers/turtle.py b/rdflib/plugins/serializers/turtle.py index 
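The term-typing patch above drops the explicit return annotations from `__new__` so static checkers infer the result type from `cls`, which keeps user-defined subclasses typed as themselves. A simplified stand-in for the pattern (these classes mirror the names in the diff but are not rdflib's implementations):

```python
class Identifier(str):
    __slots__ = ()

    def __new__(cls, value: str):
        # No explicit "-> Identifier" annotation: checkers infer the return
        # type from cls, so subclass constructors are typed as the subclass.
        return str.__new__(cls, value)


class URIRef(Identifier):
    __slots__ = ()


class MyURIRef(URIRef):
    """A user-defined subclass; with the hardcoded annotation removed,
    MyURIRef(...) is inferred as MyURIRef rather than URIRef."""

    __slots__ = ()


u = MyURIRef("https://example.com/s")
assert type(u) is MyURIRef            # runtime type follows cls
assert isinstance(u, (URIRef, str))   # still a URIRef and a str
```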
abee2e43b4..a6d2b631ab 100644 --- a/rdflib/plugins/serializers/turtle.py +++ b/rdflib/plugins/serializers/turtle.py @@ -17,6 +17,8 @@ Optional, Sequence, Tuple, + TypeVar, + Union, ) from rdflib.exceptions import Error @@ -25,6 +27,8 @@ from rdflib.serializer import Serializer from rdflib.term import BNode, Literal, Node, URIRef +_StrT = TypeVar("_StrT", bound=str) + if TYPE_CHECKING: from rdflib.graph import _PredicateType, _SubjectType, _TripleType @@ -169,6 +173,18 @@ def write(self, text: str) -> None: # type error: Item "None" of "Optional[IO[bytes]]" has no attribute "write" self.stream.write(text.encode(self.encoding, "replace")) # type: ignore[union-attr] + def relativize(self, uri: _StrT) -> Union[_StrT, URIRef]: + base = self.base + if ( + base is not None + and uri.startswith(base) + and "#" not in uri.replace(base, "") + and "/" not in uri.replace(base, "") + ): + # type error: Incompatible types in assignment (expression has type "str", variable has type "Node") + uri = URIRef(uri.replace(base, "", 1)) # type: ignore[assignment] + return uri + SUBJECT = 0 VERB = 1 @@ -271,6 +287,8 @@ def preprocessTriple(self, triple: _TripleType) -> None: self.base is not None and isinstance(node, URIRef) and node.startswith(self.base) + and "#" not in node.replace(self.base, "") + and "/" not in node.replace(self.base, "") ): # predicate corresponds to base namespace continue diff --git a/test/test_n3.py b/test/test_n3.py index 40f8718681..c713f35711 100644 --- a/test/test_n3.py +++ b/test/test_n3.py @@ -123,8 +123,10 @@ def test_base_serialize(self): URIRef("http://example.com/people/Linda"), ) ) - s = g.serialize(base="http://example.com/", format="n3", encoding="latin-1") - assert b"" in s + s = g.serialize( + base="http://example.com/people/", format="n3", encoding="latin-1" + ) + assert b"" in s g2 = Dataset() g2.parse(data=s, format="n3") assert list(g) == list(g2.triples((None, None, None))) diff --git a/test/test_serializers/test_serializer_longturtle.py 
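The guard added to `relativize` and `preprocessTriple` only strips the base when the remainder is a simple local name, so a predicate like `https://example.com/p/predicate` is no longer mangled against base `https://example.com/` (which previously produced an undeclared prefix). A stdlib-only sketch of that guard, mirroring the condition in the diff:

```python
from typing import Optional


def relativize(uri: str, base: Optional[str]) -> str:
    # Mirror of TurtleSerializer.relativize after the patch: only strip the
    # base when the remainder contains no '#' or '/'.
    if (
        base is not None
        and uri.startswith(base)
        and "#" not in uri.replace(base, "")
        and "/" not in uri.replace(base, "")
    ):
        return uri.replace(base, "", 1)
    return uri


base = "https://example.com/"
# Simple local name: safe to relativize against the base.
assert relativize("https://example.com/subject", base) == "subject"
# Deeper path: kept absolute, so the serializer falls back to a declared
# prefix instead of emitting a bogus relative IRI.
assert relativize("https://example.com/p/predicate", base) == "https://example.com/p/predicate"
```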
b/test/test_serializers/test_serializer_longturtle.py index 65821784ee..b15400979a 100644 --- a/test/test_serializers/test_serializer_longturtle.py +++ b/test/test_serializers/test_serializer_longturtle.py @@ -1,5 +1,6 @@ import difflib from pathlib import Path +from textwrap import dedent from rdflib import Graph, Namespace from rdflib.namespace import GEO, SDO @@ -181,3 +182,31 @@ def test_longturtle(): diff = "\n".join(list(difflib.unified_diff(target.split("\n"), output.split("\n")))) assert not diff, diff + + +def test_longturtle_undeclared_prefix_when_using_base(): + """ + See https://github.com/RDFLib/rdflib/issues/3160 + """ + from rdflib import Graph, Literal, URIRef + + g = Graph() + g.add( + ( + URIRef("https://example.com/subject"), + URIRef("https://example.com/p/predicate"), + Literal("object"), + ) + ) + output = g.serialize(format="longturtle", base="https://example.com/") + expected = dedent( + """ + BASE + PREFIX ns1: + + + ns1:predicate "object" ; + . + """ + ) + assert output.strip() == expected.strip() diff --git a/test/test_serializers/test_serializer_turtle.py b/test/test_serializers/test_serializer_turtle.py index c91459829f..737701d261 100644 --- a/test/test_serializers/test_serializer_turtle.py +++ b/test/test_serializers/test_serializer_turtle.py @@ -1,3 +1,5 @@ +from textwrap import dedent + from rdflib import RDF, RDFS, BNode, Graph, Literal, Namespace, URIRef from rdflib.collection import Collection from rdflib.plugins.serializers.turtle import TurtleSerializer @@ -113,3 +115,29 @@ def test_turtle_namespace(): assert "GENO:0000385" in output assert "SERIAL:0167-6423" in output assert r"EX:name_with_\(parenthesis\)" in output + + +def test_turtle_undeclared_prefix_when_using_base(): + """ + See https://github.com/RDFLib/rdflib/issues/3160 + """ + from rdflib import Graph, Literal, URIRef + + g = Graph() + g.add( + ( + URIRef("https://example.com/subject"), + URIRef("https://example.com/p/predicate"), + Literal("object"), + ) + ) + 
output = g.serialize(format="turtle", base="https://example.com/") + expected = dedent( + """ + @base . + @prefix ns1: . + + ns1:predicate "object" . + """ + ) + assert output.strip() == expected.strip() From 634638fa31e450a2c7ff010f6c7a3dfcd4870383 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 14:31:17 +1000 Subject: [PATCH 46/60] feat: add Dataset __iadd__ support (#3268) fixes: https://github.com/RDFLib/rdflib/issues/3031 Co-authored-by: Nicholas Car --- rdflib/graph.py | 6 ++ test/test_dataset/test_dataset_add.py | 88 ++++++++++++++++++++++++++- test/test_sparql/test_initbindings.py | 5 +- 3 files changed, 96 insertions(+), 3 deletions(-) diff --git a/rdflib/graph.py b/rdflib/graph.py index 7b7256d041..bef31e2b6e 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -2591,6 +2591,12 @@ def __setstate__( # type error: Property "identifier" defined in "Graph" is read-only self.store, self.identifier, self.default_graph, self.default_union = state # type: ignore[misc] + def __iadd__(self: _DatasetT, other: Iterable[_QuadType]) -> _DatasetT: # type: ignore[override, misc] + """Add all quads in Dataset other to Dataset. 
+ BNode IDs are not changed.""" + self.addN((s, p, o, g) for s, p, o, g in other) + return self + def graph( self, identifier: Optional[Union[_ContextIdentifierType, _ContextType, str]] = None, diff --git a/test/test_dataset/test_dataset_add.py b/test/test_dataset/test_dataset_add.py index a3197ae2e2..3a56998ac3 100644 --- a/test/test_dataset/test_dataset_add.py +++ b/test/test_dataset/test_dataset_add.py @@ -1,4 +1,8 @@ -from rdflib import RDF, RDFS, Dataset, Graph, URIRef +from textwrap import dedent + +from rdflib import RDF, RDFS, Dataset, Graph, Literal, URIRef +from rdflib.compare import isomorphic +from rdflib.graph import DATASET_DEFAULT_GRAPH_ID from test.data import TEST_DATA_DIR @@ -99,3 +103,85 @@ def test_dataset_parse_return_value(): ) assert len(ds) assert return_value is ds + + +def test_dataset_iadd(): + ds = Dataset() + ds.add( + ( + URIRef("https://example.com/subject"), + URIRef("https://example.com/p/predicate"), + Literal("object"), + ) + ) + + ds2 = Dataset() + ds2.add( + ( + URIRef("https://example.com/subject"), + URIRef("https://example.com/p/predicate"), + Literal("object"), + URIRef("https://example.com/graph"), + ) + ) + + data = """ + { + "object2" . + } + + { + "Triple-Other" . + } + + { + "Triple Y" . + } + + """ + ds3 = Dataset().parse(data=data, format="trig") + + # Combine the datasets + ds += ds2 + ds3 + + expected_default_graph_data = dedent( + """ + @prefix ns2: . + @prefix ns3: . + ns2:subject2 ns3:predicate2 "object2" . + ns2:subject ns3:predicate "object" . + """ + ) + expected_default_graph = Graph(identifier=DATASET_DEFAULT_GRAPH_ID).parse( + data=expected_default_graph_data, format="turtle" + ) + + expected_graph1_data = dedent( + """ + @prefix ns2: . + @prefix ns3: . + ns2:subject ns3:predicate "object" . + ns2:subject-other ns3:predicate-other "Triple-Other" . 
+ """ + ) + expected_graph1 = Graph(identifier=URIRef("https://example.com/graph")).parse( + data=expected_graph1_data, format="turtle" + ) + + expected_graph2 = dedent( + """ + @prefix ns2: . + ns2:subject-y ns2:predicate-y "Triple Y" . + """ + ) + expected_graph2 = Graph(identifier=URIRef("https://example.com/graph2")).parse( + data=expected_graph2, format="turtle" + ) + + assert isomorphic(expected_default_graph, ds.default_graph) + assert isomorphic( + expected_graph1, ds.get_graph(URIRef("https://example.com/graph")) + ) + assert isomorphic( + expected_graph2, ds.get_graph(URIRef("https://example.com/graph2")) + ) diff --git a/test/test_sparql/test_initbindings.py b/test/test_sparql/test_initbindings.py index 55f5853a3c..58aa8d6177 100644 --- a/test/test_sparql/test_initbindings.py +++ b/test/test_sparql/test_initbindings.py @@ -1,4 +1,5 @@ from rdflib import Dataset, Literal, URIRef, Variable +from rdflib.graph import DATASET_DEFAULT_GRAPH_ID from rdflib.plugins.sparql import prepareQuery from test.utils.namespace import EGDC @@ -274,8 +275,8 @@ def test_prepare(): def test_data(): data = Dataset() data += [ - (URIRef("urn:a"), URIRef("urn:p"), Literal("a")), - (URIRef("urn:b"), URIRef("urn:p"), Literal("b")), + (URIRef("urn:a"), URIRef("urn:p"), Literal("a"), DATASET_DEFAULT_GRAPH_ID), + (URIRef("urn:b"), URIRef("urn:p"), Literal("b"), DATASET_DEFAULT_GRAPH_ID), ] a = set( From 3b67ea708580e60fa0e5ce1f3a9e190c66f84428 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 17:02:24 +1000 Subject: [PATCH 47/60] build: prep 7.3.0 (#3277) * build: prep 7.3.0 * revert noisy diff * chore: add back 7.3.0 changelog * style: format * chore: add summary to release --- CHANGELOG.md | 52 ++++++++++++++++++++++++++++++++++++++++++++ CITATION.cff | 4 ++-- README.md | 1 + admin/README.md | 8 +++++++ admin/pr_markdown.py | 28 ++++++++++++++++++++++++ pyproject.toml | 2 +- rdflib/__init__.py | 2 +- 7 files changed, 93 
insertions(+), 4 deletions(-) create mode 100644 admin/pr_markdown.py diff --git a/CHANGELOG.md b/CHANGELOG.md index 047828f0e7..f6addadc9e 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,3 +1,55 @@ +## 2025-10-24 RELEASE 7.3.0 + +This release delivers several important fixes and enhancements to RDFLib’s Dataset implementation, resolving long-standing issues and improving consistency across serialization and SPARQL operations. It also introduces new deprecation notices for certain Dataset methods and attributes, which will be removed in the next major release. In addition, this version includes a range of improvements to SPARQL result parsing, typing, and literal handling. + +### Features + +- Added `Dataset.__iadd__` support +- Dataset's default serialize format is now `trig` +- Datasets can now add graphs backed by different stores + +### Fixes and Improvements + +- Fixed an issue where the `RecursiveSerializer` would output undeclared prefixes for predicates that contained the base as a substring +- Prevented prefix generation for predicates corresponding to the base namespace +- SPARQL Update now correctly inserts into the default graph +- Dataset.parse now returns Self +- N-Quads serialization no longer includes the RDFLib internal default graph identifier +- Static type checkers can now infer the type of `Term.__new__` +- Removed automatic date conversion for gYear and gYearMonth literals +- Optional clauses in SPARQL queries can now bind variables +- Fixed reevaluation logic in SPARQL Update between update loops + +### Maintenance + +- Added deprecation notices to certain Dataset methods and attributes + - Use Dataset.graphs instead of Dataset.contexts method + - Use Dataset.default_graph instead of Dataset.default_context + - Deprecate Dataset.identifier entirely. 
+- Updated type hints for Graph.open() with SPARQLUpdateStore configuration +- SPARQL Result Parsing Improvements + - Simplified and modernized the SPARQL result parsing system: + - These changes maintain backward compatibility while making the SPARQL API more flexible and extensible. + +Pull requests merged: + +- feat: add Dataset `__iadd__` support by @edmondchuc in [#3268](https://api.github.com/repos/RDFLib/rdflib/pulls/3268) +- fix: RecursiveSerializer- outputs undeclared prefix for predicates that contains the base as a substring by @edmondchuc in [#3267](https://api.github.com/repos/RDFLib/rdflib/pulls/3267) +- fix: allow static type checkers to infer term's `__new__` type by @edmondchuc in [#3266](https://api.github.com/repos/RDFLib/rdflib/pulls/3266) +- fix: SPARQL Update inserts into the default graph by @edmondchuc in [#3265](https://api.github.com/repos/RDFLib/rdflib/pulls/3265) +- chore: add deprecation notice to Dataset methods and attributes by @edmondchuc in [#3264](https://api.github.com/repos/RDFLib/rdflib/pulls/3264) +- fix: Dataset.parse now returns Self by @edmondchuc in [#3263](https://api.github.com/repos/RDFLib/rdflib/pulls/3263) +- fix: dataset nquads serialization including RDFLib internal default graph identifier by @edmondchuc in [#3262](https://api.github.com/repos/RDFLib/rdflib/pulls/3262) +- patch for reevaluation in sparql modify between update loops. 
with test by @WhiteGobo in [#3261](https://api.github.com/repos/RDFLib/rdflib/pulls/3261) +- feat: change dataset's default serialize format to trig by @edmondchuc in [#3260](https://api.github.com/repos/RDFLib/rdflib/pulls/3260) +- feat: allow adding graphs backed by different stores to the same dataset by @edmondchuc in [#3259](https://api.github.com/repos/RDFLib/rdflib/pulls/3259) +- fix(v7): remove Literal.toPython date conversion for gYear/gYearMonth (#3115) by @edmondchuc in [#3258](https://api.github.com/repos/RDFLib/rdflib/pulls/3258) +- sparqls optionals clause can now bind variables. with test. issue 2957 by @WhiteGobo in [#3247](https://api.github.com/repos/RDFLib/rdflib/pulls/3247) +- fix: skip prefix generation for predicates corresponding to base namespace by @edmondchuc in [#3244](https://api.github.com/repos/RDFLib/rdflib/pulls/3244) +- Run the example queries agains the local fuseki by @white-gecko in [#3240](https://api.github.com/repos/RDFLib/rdflib/pulls/3240) +- Adjust the type hint for Graph open to reflect a SPARQLUpdateStore configuration by @white-gecko in [#3239](https://api.github.com/repos/RDFLib/rdflib/pulls/3239) +- SPARQL result parsing by @white-gecko in [#2796](https://api.github.com/repos/RDFLib/rdflib/pulls/2796) + ## 2025-09-19 RELEASE 7.2.1 A tiny clean up release. 
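The "Pull requests merged" list in the changelog above is generated by piping GitHub search-API JSON through the new `admin/pr_markdown.py` script added later in this patch. A minimal, self-contained sketch of that transformation follows; the payload below is an illustrative sample shaped like the API response, not real API output, which in practice arrives on stdin from `gh api`.

```python
import json
from dataclasses import dataclass


@dataclass
class PR:
    """One merged pull request, with the fields the script expects."""

    number: int
    title: str
    pull_request_merged_at: str
    pull_request_url: str
    username: str

    def __repr__(self) -> str:
        # Same Markdown shape as the changelog entries above.
        return f"{self.title} by @{self.username} in [#{self.number}]({self.pull_request_url})"


# Illustrative sample payload; real input is the JSON piped in via stdin.
raw = json.loads(
    '{"items": [{"number": 3268,'
    ' "title": "feat: add Dataset __iadd__ support",'
    ' "pull_request_merged_at": "2025-10-20T00:00:00Z",'
    ' "pull_request_url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3268",'
    ' "username": "edmondchuc"}]}'
)
for pr in (PR(**item) for item in raw["items"]):
    print(f"- {pr}")
```

Because the dataclass defines its own `__repr__`, `@dataclass` does not generate one, so each `print` emits a ready-made Markdown bullet.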
diff --git a/CITATION.cff b/CITATION.cff index ef4772952a..c111335f86 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -69,7 +69,7 @@ authors: - family-names: "Stuart" given-names: "Veyndan" title: "RDFLib" -version: 7.2.1 -date-released: 2025-09-19 +version: 7.3.0 +date-released: 2025-10-24 url: "https://github.com/RDFLib/rdflib" doi: 10.5281/zenodo.6845245 diff --git a/README.md b/README.md index 8ddb4eb72f..369b36a53a 100644 --- a/README.md +++ b/README.md @@ -43,6 +43,7 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases * `main` branch in this repository is the current unstable release - version 8 alpha +* `7.3.0` many fixes and usability improvements, particularly for the Dataset class. See changelog for details * `7.2.1` tiny clean up release, relaxes Python version requirement * `7.2.0` general fixes and usability improvements, see changelog for details * `7.1.4` tidy-up release, possibly last 7.x release diff --git a/admin/README.md b/admin/README.md index f09c39a748..c511b9bcec 100644 --- a/admin/README.md +++ b/admin/README.md @@ -3,3 +3,11 @@ Tools to assist with RDFlib releases, like extracting all merged PRs from GitHub since last release and printing them into Markdown lists. To make a release of RDFLib, see the [Developer's Guide](https://rdflib.readthedocs.io/en/latest/developers.html). + +## PR Changelog Summary + +An alternative to the `get_merged_prs.py` script is to use the GitHub CLI to get the list of PRs and pipe it into the `pr_markdown.py` script. The following command retrieves the list of PRs merged since the last release (`2025-09-19`) from a particular branch (`7.x`). 
+ +```bash +gh api '/search/issues?q=repo:rdflib/rdflib+is:pr+is:merged+base:7.x+merged:>2025-09-19&per_page=100' | jq '{total_count, incomplete_results, items: [.items[] | {number, title, pull_request_merged_at: .pull_request.merged_at, pull_request_url: .pull_request.url, username: .user.login}]}' | poetry run python admin/pr_markdown.py +``` diff --git a/admin/pr_markdown.py b/admin/pr_markdown.py new file mode 100644 index 0000000000..f9548c5ce5 --- /dev/null +++ b/admin/pr_markdown.py @@ -0,0 +1,28 @@ +import json +import sys +from dataclasses import dataclass + + +@dataclass +class PR: + number: int + title: str + pull_request_merged_at: str + pull_request_url: str + username: str + + def __repr__(self): + return f"{self.title} by @{self.username} in [#{self.number}]({self.pull_request_url})" + + +try: + json_data = json.load(sys.stdin) + prs = [PR(**pr) for pr in json_data["items"]] + for pr in prs: + print(f"- {pr}") +except json.JSONDecodeError as e: + print(f"Error parsing JSON: {e}") + sys.exit(1) +except Exception as e: + print(f"Error: {e}") + sys.exit(1) diff --git a/pyproject.toml b/pyproject.toml index ff1665908b..d280e1faf2 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.3.0-a0" +version = "7.3.0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] diff --git a/rdflib/__init__.py b/rdflib/__init__.py index af6dfeefe8..5e37ec92f3 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -52,7 +52,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2025-09-19" +__date__ = "2025-10-24" __all__ = [ "URIRef", From ec04ee9dac1fad0e46a55d0dda56e0bc96ce11ec Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Fri, 24 Oct 2025 19:30:22 +1000 Subject: [PATCH 48/60] chore: 7.3.0 post 
release (#3278) --- docker/latest/requirements.in | 2 +- docker/latest/requirements.txt | 2 +- pyproject.toml | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index acfcb2e191..4bbaa611e4 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,4 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. It # will be updated by dependabot when new releases are made. -rdflib==7.2.1 +rdflib==7.3.0 html5rdf==1.2.1 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index 411657d7b6..d490b232ad 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -8,5 +8,5 @@ html5rdf==1.2.1 # via -r docker/latest/requirements.in pyparsing==3.0.9 # via rdflib -rdflib==7.2.1 +rdflib==7.3.0 # via -r docker/latest/requirements.in diff --git a/pyproject.toml b/pyproject.toml index d280e1faf2..ce80422313 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.3.0" +version = "7.4.0-a0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] From 0ab817f86b5733c9a3b4ede7ef065b8d79e53fc5 Mon Sep 17 00:00:00 2001 From: Focke Date: Fri, 24 Oct 2025 12:04:13 +0200 Subject: [PATCH 49/60] added TypeError to test_roundtrip[test_other__service1], because different error on Python 3.13.8. 
(#3275) with input from @edmondchuc Co-authored-by: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> --- test/test_sparql/test_translate_algebra.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/test/test_sparql/test_translate_algebra.py b/test/test_sparql/test_translate_algebra.py index bd1871fd7e..46bacffa77 100644 --- a/test/test_sparql/test_translate_algebra.py +++ b/test/test_sparql/test_translate_algebra.py @@ -260,7 +260,7 @@ def _format_query(query: str) -> str: "Test if a nested service pattern is properly translated" "into the query text.", pytest.mark.xfail( - raises=RecursionError, + raises=(RecursionError, TypeError), reason="Fails with RecursionError inside parser.parseQuery", ), ) From 9b14b8f97994222b3744aa3c7710ecab240c9042 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Wed, 29 Oct 2025 22:08:11 +1000 Subject: [PATCH 50/60] feat: v7 mkdocs (#3287) * Pr/3143 (#3144) * Start migrating the documentation from .rst sphynx to .md material for mkdocs. Add mkdocs.yml with proper configuration, enable automated generation of the doc API from the docstring, and start converting a few pages (index, getting started, developers) for demo. * Add automated (opt-in) tests of all python codeblocks in the markdown docs using pytest-markdown-docs. 
Needed to comment 1 small test that seemingly should fail (AttributeError: DefinedNamespace like object has no attribute '_NS', indeed the DefinedNamespace class expect a _NS, so it makes sense it fails) but for some reason it was not properly failing when ran with regular pytest, but it fails with pytest-markdown-docs * convert all documentations pages to markdown, convert all docstrings to markdown with google style, updated config for mkdocs (readthedocs, tox, task) fixed https://github.com/RDFLib/rdflib/issues/3128 * delete files, dependencies and mentions related to sphinx * uncomment test previously commented for experimenting with markdown codeblock testing * update poetry lock * blacked again * ignore mypy errors --------- Co-authored-by: Vincent Emonet * build: set mkdocs versions for rdflib v7 compatibility * style: apply black formatting * chore: mypy comments * fix: mypy, ruff, and apply black formatting * build: only test mkdocs build on python 3.11 and above * build: update poetry lock * build: fix python version constraint and add comment explaining why the python constraint for mkdocs deps * build: update poetry lock * chore: fix mypy and apply formatting * build: set setuptools version constraints in tox.ini * build: set berkeleydb version * build: loosen berkeleydb constraint --------- Co-authored-by: Nicholas Car Co-authored-by: Vincent Emonet --- .github/dependabot.yml | 4 - .github/workflows/validate.yaml | 2 - .gitignore | 3 +- .readthedocs.yaml | 13 +- MANIFEST.in | 1 + README.md | 19 +- Taskfile.yml | 8 +- devtools/diffrtpy.py | 11 +- docs/CONTRIBUTING.md | 6 +- docs/_static/pyramid.css | 323 ---- docs/_themes/armstrong/LICENSE | 26 - docs/_themes/armstrong/README | 3 - docs/_themes/armstrong/layout.html | 48 - docs/_themes/armstrong/static/rtd.css_t | 784 -------- docs/_themes/armstrong/theme-old.conf | 65 - docs/_themes/armstrong/theme.conf | 65 - docs/apidocs/.gitignore | 2 - docs/apidocs/examples.rst | 141 -- docs/changelog.md | 3 +- 
docs/conf.py | 333 ---- docs/decisions.md | 35 + docs/decisions/20220826-default_branch.md | 30 + docs/decisions/20220826-default_branch.rst | 42 - docs/decisions/index.rst | 69 - docs/developers.md | 384 ++++ docs/developers.rst | 573 ------ docs/docs.md | 47 + docs/docs.rst | 55 - docs/gen_ref_pages.py | 62 + docs/gettingstarted.md | 144 ++ docs/gettingstarted.rst | 178 -- docs/includes/abbreviations.md | 31 + docs/index.md | 90 + docs/index.rst | 144 -- docs/intro_to_creating_rdf.md | 167 ++ docs/intro_to_creating_rdf.rst | 201 --- docs/intro_to_graphs.md | 101 ++ docs/intro_to_graphs.rst | 131 -- docs/intro_to_parsing.md | 134 ++ docs/intro_to_parsing.rst | 158 -- docs/intro_to_sparql.md | 159 ++ docs/intro_to_sparql.rst | 207 --- docs/merging.md | 39 + docs/merging.rst | 44 - docs/namespaces_and_bindings.md | 143 ++ docs/namespaces_and_bindings.rst | 156 -- docs/persistence.md | 60 + docs/persistence.rst | 81 - docs/persisting_n3_terms.md | 89 + docs/persisting_n3_terms.rst | 93 - docs/plugin_parsers.rst | 46 - docs/plugin_query_results.rst | 32 - docs/plugin_serializers.rst | 60 - docs/plugin_stores.rst | 70 - docs/plugins.md | 187 ++ docs/plugins.rst | 21 - docs/rdf_terms.md | 154 ++ docs/rdf_terms.rst | 230 --- docs/security_considerations.md | 78 + docs/security_considerations.rst | 114 -- docs/{type_hints.rst => type_hints.md} | 109 +- docs/upgrade4to5.md | 203 +++ docs/upgrade4to5.rst | 213 --- docs/upgrade5to6.md | 61 + docs/upgrade5to6.rst | 79 - docs/upgrade6to7.md | 36 + docs/upgrade6to7.rst | 50 - docs/utilities.md | 146 ++ docs/utilities.rst | 166 -- examples/__init__.py | 1 + examples/conjunctive_graphs.py | 2 +- examples/custom_datatype.py | 2 +- examples/custom_eval.py | 16 +- examples/foafpaths.py | 27 +- examples/prepared_query.py | 6 +- examples/resource_example.py | 6 +- examples/secure_with_audit.py | 26 +- examples/secure_with_urlopen.py | 11 +- examples/slice.py | 4 +- examples/smushing.py | 16 +- examples/sparql_query_example.py | 10 +- 
examples/sparql_update_example.py | 2 +- examples/transitive.py | 38 +- mkdocs.yml | 178 ++ poetry.lock | 1674 ++++++++++-------- pyproject.toml | 33 +- rdflib/__init__.py | 74 +- rdflib/_networking.py | 48 +- rdflib/_type_checking.py | 10 +- rdflib/collection.py | 23 +- rdflib/compare.py | 160 +- rdflib/container.py | 81 +- rdflib/events.py | 21 +- rdflib/extras/describer.py | 233 +-- rdflib/extras/external_graph_libs.py | 313 ++-- rdflib/extras/infixowl.py | 200 ++- rdflib/extras/shacl.py | 59 +- rdflib/graph.py | 1353 +++++++------- rdflib/namespace/_GEO.py | 28 +- rdflib/namespace/__init__.py | 127 +- rdflib/parser.py | 7 +- rdflib/paths.py | 77 +- rdflib/plugin.py | 36 +- rdflib/plugins/parsers/jsonld.py | 51 +- rdflib/plugins/parsers/notation3.py | 33 +- rdflib/plugins/parsers/nquads.py | 29 +- rdflib/plugins/parsers/ntriples.py | 52 +- rdflib/plugins/parsers/patch.py | 20 +- rdflib/plugins/parsers/rdfxml.py | 2 + rdflib/plugins/serializers/jsonld.py | 11 +- rdflib/plugins/serializers/longturtle.py | 4 +- rdflib/plugins/serializers/n3.py | 2 + rdflib/plugins/serializers/nquads.py | 2 + rdflib/plugins/serializers/nt.py | 11 +- rdflib/plugins/serializers/patch.py | 15 +- rdflib/plugins/serializers/rdfxml.py | 4 + rdflib/plugins/serializers/trig.py | 2 + rdflib/plugins/serializers/trix.py | 2 + rdflib/plugins/serializers/turtle.py | 13 +- rdflib/plugins/serializers/xmlwriter.py | 2 + rdflib/plugins/shared/jsonld/context.py | 8 +- rdflib/plugins/shared/jsonld/util.py | 19 +- rdflib/plugins/sparql/__init__.py | 5 +- rdflib/plugins/sparql/algebra.py | 38 +- rdflib/plugins/sparql/evaluate.py | 11 +- rdflib/plugins/sparql/operators.py | 44 +- rdflib/plugins/sparql/parserutils.py | 7 +- rdflib/plugins/sparql/processor.py | 37 +- rdflib/plugins/sparql/results/csvresults.py | 6 +- rdflib/plugins/sparql/results/jsonresults.py | 12 +- rdflib/plugins/sparql/results/tsvresults.py | 2 + rdflib/plugins/sparql/results/xmlresults.py | 4 + rdflib/plugins/sparql/sparql.py | 19 +- 
rdflib/plugins/sparql/update.py | 12 +- rdflib/plugins/stores/auditable.py | 2 + rdflib/plugins/stores/berkeleydb.py | 12 +- rdflib/plugins/stores/concurrent.py | 2 + rdflib/plugins/stores/memory.py | 18 +- rdflib/plugins/stores/sparqlstore.py | 121 +- rdflib/query.py | 49 +- rdflib/resource.py | 532 +++--- rdflib/serializer.py | 6 +- rdflib/store.py | 180 +- rdflib/term.py | 696 ++++---- rdflib/tools/chunk_serializer.py | 43 +- rdflib/tools/csv2rdf.py | 3 +- rdflib/tools/rdf2dot.py | 7 +- rdflib/tools/rdfpipe.py | 3 + rdflib/tools/rdfs2dot.py | 6 +- rdflib/util.py | 155 +- rdflib/void.py | 4 +- rdflib/xsd_datetime.py | 32 +- run_tests.py | 21 +- test/test_graph/test_graph.py | 6 +- test/test_graph/test_graph_store.py | 4 +- test/test_graph/test_namespace_rebinding.py | 6 +- test/test_misc/test_bnode_ncname.py | 2 +- test/test_misc/test_input_source.py | 74 +- test/test_namespace/test_namespacemanager.py | 26 +- test/test_serializers/test_prettyxml.py | 4 +- test/test_serializers/test_serializer_xml.py | 4 +- test/test_sparql/test_result.py | 2 +- test/test_sparql/test_sparql.py | 4 +- test/test_sparql/test_update.py | 4 +- test/test_store/test_store.py | 2 +- test/test_store/test_store_sparqlstore.py | 9 +- test/test_turtle_quoting.py | 4 +- test/utils/__init__.py | 10 +- test/utils/graph.py | 12 +- test/utils/httpfileserver.py | 13 +- test/utils/iri.py | 14 +- test/utils/outcome.py | 41 +- test/utils/test/__init__.py | 2 +- test/utils/test/test_outcome.py | 2 +- tox.ini | 6 +- 175 files changed, 6859 insertions(+), 8427 deletions(-) delete mode 100644 docs/_static/pyramid.css delete mode 100644 docs/_themes/armstrong/LICENSE delete mode 100644 docs/_themes/armstrong/README delete mode 100644 docs/_themes/armstrong/layout.html delete mode 100644 docs/_themes/armstrong/static/rtd.css_t delete mode 100644 docs/_themes/armstrong/theme-old.conf delete mode 100644 docs/_themes/armstrong/theme.conf delete mode 100644 docs/apidocs/.gitignore create mode 100644 
docs/decisions.md create mode 100644 docs/decisions/20220826-default_branch.md delete mode 100644 docs/decisions/20220826-default_branch.rst delete mode 100644 docs/decisions/index.rst create mode 100644 docs/developers.md create mode 100644 docs/docs.md delete mode 100644 docs/docs.rst create mode 100644 docs/gen_ref_pages.py create mode 100644 docs/gettingstarted.md delete mode 100644 docs/gettingstarted.rst create mode 100644 docs/includes/abbreviations.md create mode 100644 docs/index.md delete mode 100644 docs/index.rst create mode 100644 docs/intro_to_creating_rdf.md delete mode 100644 docs/intro_to_creating_rdf.rst create mode 100644 docs/intro_to_graphs.md delete mode 100644 docs/intro_to_graphs.rst create mode 100644 docs/intro_to_parsing.md delete mode 100644 docs/intro_to_parsing.rst create mode 100644 docs/intro_to_sparql.md delete mode 100644 docs/intro_to_sparql.rst create mode 100644 docs/merging.md delete mode 100644 docs/merging.rst create mode 100644 docs/namespaces_and_bindings.md delete mode 100644 docs/namespaces_and_bindings.rst create mode 100644 docs/persistence.md delete mode 100644 docs/persistence.rst create mode 100644 docs/persisting_n3_terms.md delete mode 100644 docs/persisting_n3_terms.rst delete mode 100644 docs/plugin_parsers.rst delete mode 100644 docs/plugin_query_results.rst delete mode 100644 docs/plugin_serializers.rst create mode 100644 docs/plugins.md delete mode 100644 docs/plugins.rst create mode 100644 docs/rdf_terms.md delete mode 100644 docs/rdf_terms.rst create mode 100644 docs/security_considerations.md delete mode 100644 docs/security_considerations.rst rename docs/{type_hints.rst => type_hints.md} (56%) create mode 100644 docs/upgrade4to5.md delete mode 100644 docs/upgrade4to5.rst create mode 100644 docs/upgrade5to6.md delete mode 100644 docs/upgrade5to6.rst create mode 100644 docs/upgrade6to7.md delete mode 100644 docs/upgrade6to7.rst create mode 100644 docs/utilities.md delete mode 100644 docs/utilities.rst create 
mode 100644 mkdocs.yml diff --git a/.github/dependabot.yml b/.github/dependabot.yml index fb915a9e8c..3b6ada218f 100644 --- a/.github/dependabot.yml +++ b/.github/dependabot.yml @@ -9,10 +9,6 @@ updates: # see https://github.com/dependabot/dependabot-core/pull/10194 versioning-strategy: auto ignore: - - dependency-name: sphinx - versions: - - 3.4.3 - - 3.5.2 # We only use setuptools for a couple of things in the test suite # There is no need to keep it bleeding-edge. There are too frequent # updates to setuptools, requires too much maintenance to keep it up to date. diff --git a/.github/workflows/validate.yaml b/.github/workflows/validate.yaml index b4318a959b..178a33d12a 100644 --- a/.github/workflows/validate.yaml +++ b/.github/workflows/validate.yaml @@ -34,7 +34,6 @@ jobs: - python-version: "3.8" os: ubuntu-latest extensive-tests: true - TOXENV_SUFFIX: "-docs" - python-version: "3.8" os: ubuntu-latest extensive-tests: true @@ -43,7 +42,6 @@ jobs: TOXENV_SUFFIX: "-min" - python-version: "3.9" os: ubuntu-latest - TOXENV_SUFFIX: "-docs" - python-version: "3.10" os: ubuntu-latest TOX_EXTRA_COMMAND: "- black --check --diff ./rdflib" diff --git a/.gitignore b/.gitignore index d42dc26fd3..b058f851d9 100644 --- a/.gitignore +++ b/.gitignore @@ -1,6 +1,5 @@ .flakeheaven_cache/ RDFLib.sublime-project -/docs/_build/ RDFLib.sublime-workspace coverage/ cov.xml @@ -8,6 +7,8 @@ cov.xml /.hgignore build/ /docs/draft/ +/docs/apidocs/ +/docs/_build/ *~ test_reports/*latest.ttl # PyCharm diff --git a/.readthedocs.yaml b/.readthedocs.yaml index d847956c19..96dcb371c3 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -9,12 +9,11 @@ formats: - htmlzip - pdf +# https://docs.readthedocs.com/platform/stable/intro/mkdocs.html build: - os: ubuntu-20.04 + os: "ubuntu-24.04" tools: - # Using 3.9 as earlier versions have trouble generating documentation for - # `@typing.overload`` with type aliases. 
- python: "3.9" + python: "3" jobs: post_create_environment: # Using requirements-poetry.in as requirements-poetry.txt has conflicts with @@ -24,8 +23,6 @@ build: - poetry export --only=main --only=docs --without-hashes -o requirements.txt - pip install --no-cache-dir -r requirements.txt - pip install . - - python -c "from rdflib import Graph; print(Graph)" -sphinx: - configuration: docs/conf.py - fail_on_warning: true +mkdocs: + configuration: mkdocs.yml diff --git a/MANIFEST.in b/MANIFEST.in index 1eeed9fe94..276b18a561 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -9,4 +9,5 @@ recursive-include examples *.py graft test graft docs prune docs/_build +prune site/ global-exclude *.pyc *$py.class diff --git a/README.md b/README.md index 369b36a53a..3cfb4e632f 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,7 @@ -![](docs/_static/RDFlib.png) +![](docs/_static/RDFlib.png) + +# RDFLib -RDFLib -====== [![Build Status](https://github.com/RDFLib/rdflib/actions/workflows/validate.yaml/badge.svg?branch=main)](https://github.com/RDFLib/rdflib/actions?query=branch%3Amain) [![Documentation Status](https://readthedocs.org/projects/rdflib/badge/?version=latest)](https://rdflib.readthedocs.io/en/latest/?badge=latest) [![Coveralls branch](https://img.shields.io/coveralls/RDFLib/rdflib/main.svg)](https://coveralls.io/r/RDFLib/rdflib?branch=main) @@ -30,8 +30,10 @@ The RDFlib community maintains many RDF-related Python code repositories with di * [rdflib](https://github.com/RDFLib/rdflib) - the RDFLib core * [sparqlwrapper](https://github.com/RDFLib/sparqlwrapper) - a simple Python wrapper around a SPARQL service to remotely execute your queries -* [pyLODE](https://github.com/RDFLib/pyLODE) - An OWL ontology documentation tool using Python and templating, based on LODE -* [pySHACL](https://github.com/RDFLib/pySHACL) - A pure Python module which allows for the validation of RDF graphs against SHACL graphs +* [pyLODE](https://github.com/RDFLib/pyLODE) - An OWL ontology documentation 
tool using Python and templating, based on LODE. +* [pyrdfa3](https://github.com/RDFLib/pyrdfa3) - RDFa 1.1 distiller/parser library: can extract RDFa 1.1/1.0 from (X)HTML, SVG, or XML in general. +* [pymicrodata](https://github.com/RDFLib/pymicrodata) - A module to extract RDF from an HTML5 page annotated with microdata. +* [pySHACL](https://github.com/RDFLib/pySHACL) - A pure Python module which allows for the validation of RDF graphs against SHACL graphs. * [OWL-RL](https://github.com/RDFLib/OWL-RL) - A simple implementation of the OWL2 RL Profile which expands the graph with all possible triples that OWL RL defines. Please see the list for all packages/repositories here: @@ -136,18 +138,21 @@ g.add(( Literal("Nick", datatype=XSD.string) )) ``` + The triple (in n-triples notation) ` "Nick"^^ .` is created where the property `FOAF.givenName` is the URI `` and `XSD.string` is the URI ``. You can bind namespaces to prefixes to shorten the URIs for RDF/XML, Turtle, N3, TriG, TriX & JSON-LD serializations: - ```python +```python g.bind("foaf", FOAF) g.bind("xsd", XSD) ``` + This will allow the n-triples triple above to be serialised like this: - ```python + +```python print(g.serialize(format="turtle")) ``` diff --git a/Taskfile.yml b/Taskfile.yml index 1ae5947ec0..735b634f7f 100644 --- a/Taskfile.yml +++ b/Taskfile.yml @@ -170,19 +170,19 @@ tasks: desc: Clean generated documentation cmds: - task: _rimraf - vars: { RIMRAF_TARGET: "docs/_build/" } + vars: { RIMRAF_TARGET: "site/" } docs: desc: Build documentation cmds: - echo "PYTHONPATH=${PYTHONPATH}" - - "{{.VENV_PYTHON}} -m sphinx.cmd.build -T -W -b html -d docs/_build/doctree docs docs/_build/html {{.CLI_ARGS}}" + - "{{.VENV_PYTHON}} -m mkdocs build {{.CLI_ARGS}}" docs:live-server: desc: Run a live server on generated docs cmds: - 'echo "NOTE: Docs must be built for this to work"' - - npx -p live-server live-server docs/_build/html/ {{.CLI_ARGS}} + - npx -p live-server live-server site/ {{.CLI_ARGS}} default: 
desc: Run validate @@ -356,7 +356,7 @@ tasks: cd var/test-sdist/rdflib-* poetry install poetry run mypy --show-error-context --show-error-codes -p rdflib - poetry run sphinx-build -T -W -b html -d docs/_build/doctree docs docs/_build/html + poetry run mkdocs build poetry run pytest test:no_internet: diff --git a/devtools/diffrtpy.py b/devtools/diffrtpy.py index 1d4b097222..01a1e43ae2 100755 --- a/devtools/diffrtpy.py +++ b/devtools/diffrtpy.py @@ -3,18 +3,19 @@ This is a tool that can be used with git difftool to generate a diff that ignores type hints and comments. -The name of this script, ``diffrtpy`` is short for "diff runtime python", as +The name of this script, `diffrtpy` is short for "diff runtime python", as this will only compare the parts of the python code that has a runtime impact. This is to make it easier to review PRs that contain type hints. To use this script -.. code-block:: bash - task run -- python -m pip install --upgrade strip-hints black python-minifier - PYLOGGING_LEVEL=INFO task run -- git difftool -y -x $(readlink -f devtools/diffrtpy.py) upstream/main | tee /var/tmp/compact.diff +```bash +task run -- python -m pip install --upgrade strip-hints black python-minifier +PYLOGGING_LEVEL=INFO task run -- git difftool -y -x $(readlink -f devtools/diffrtpy.py) upstream/main | tee /var/tmp/compact.diff +``` -Then attach ``/var/tmp/compact.diff`` to the PR. +Then attach `/var/tmp/compact.diff` to the PR. """ from __future__ import annotations diff --git a/docs/CONTRIBUTING.md b/docs/CONTRIBUTING.md index 03f06e1833..259d80c8ea 100644 --- a/docs/CONTRIBUTING.md +++ b/docs/CONTRIBUTING.md @@ -46,7 +46,7 @@ Some ways in which you can contribute to RDFLib are: ## Pull Requests Contributions that involve changes to the RDFLib repository have to be made with -pull requests and should follow the [RDFLib developers guide](./developers.rst). +pull requests and should follow the [RDFLib developers guide](./developers.md). 
For changes that add features or affect the public API of RDFLib, it is recommended to first open an issue to discuss the change before starting to work @@ -55,5 +55,5 @@ spending time on it. ## Code of Conduct -All contributions to the project should be consistent with the [code of -conduct](./CODE_OF_CONDUCT.md) adopted by RDFLib. +All contributions to the project should be consistent with the +[code of conduct](./CODE_OF_CONDUCT.md) adopted by RDFLib. diff --git a/docs/_static/pyramid.css b/docs/_static/pyramid.css deleted file mode 100644 index e238803a4e..0000000000 --- a/docs/_static/pyramid.css +++ /dev/null @@ -1,323 +0,0 @@ -/* - * pylons.css_t - * ~~~~~~~~~~~~ - * - * Sphinx stylesheet -- pylons theme. - * - * :copyright: Copyright 2007-2010 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. - * - */ - -@import url("basic.css"); - -/* -- page layout ----------------------------------------------------------- */ - -body { - font-family: "Nobile", sans-serif; - font-size: 100%; - background-color: #393939; - color: #ffffff; - margin: 0; - padding: 0; -} - -div.documentwrapper { - float: left; - width: 100%; -} - -div.bodywrapper { - margin: 0 0 0 230px; -} - -hr { - border: 1px solid #B1B4B6; -} - -div.document { - background-color: #eee; -} - -div.header { - width:100%; - background: #f4ad32 url(headerbg.png) repeat-x 0 top; - border-bottom: 2px solid #ffffff; -} - -div.logo { - text-align: center; - padding-top: 10px; -} - -div.body { - background-color: #ffffff; - color: #3E4349; - padding: 0 30px 30px 30px; - font-size: 1em; - border: 2px solid #ddd; - border-right-style: none; - overflow: auto; -} - -div.footer { - color: #ffffff; - width: 100%; - padding: 13px 0; - text-align: center; - font-size: 75%; - background: transparent; - clear:both; -} - -div.footer a { - color: #ffffff; - text-decoration: none; -} - -div.footer a:hover { - color: #e88f00; - text-decoration: underline; -} - -div.related { - line-height: 30px; - 
color: #373839; - font-size: 0.8em; - background-color: #eee; -} - -div.related a { - color: #1b61d6; -} - -div.related ul { - padding-left: 240px; -} - -div.sphinxsidebar { - font-size: 0.75em; - line-height: 1.5em; -} - -div.sphinxsidebarwrapper{ - padding: 10px 0; -} - -div.sphinxsidebar h3, -div.sphinxsidebar h4 { - font-family: "Neuton", sans-serif; - color: #373839; - font-size: 1.4em; - font-weight: normal; - margin: 0; - padding: 5px 10px; - border-bottom: 2px solid #ddd; -} - -div.sphinxsidebar h4{ - font-size: 1.3em; -} - -div.sphinxsidebar h3 a { - color: #000000; -} - - -div.sphinxsidebar p { - color: #888; - padding: 5px 20px; -} - -div.sphinxsidebar p.topless { -} - -div.sphinxsidebar ul { - margin: 10px 20px; - padding: 0; - color: #373839; -} - -div.sphinxsidebar a { - color: #444; -} - -div.sphinxsidebar input { - border: 1px solid #ccc; - font-family: sans-serif; - font-size: 1em; -} - -div.sphinxsidebar input[type=text]{ - margin-left: 20px; -} - -/* -- sidebars -------------------------------------------------------------- */ - -div.sidebar { - margin: 0 0 0.5em 1em; - border: 2px solid #c6d880; - background-color: #e6efc2; - width: 40%; - float: right; - border-right-style: none; - border-left-style: none; - padding: 10px 20px; -} - -p.sidebar-title { - font-weight: bold; -} - -/* -- body styles ----------------------------------------------------------- */ - -a, a .pre { - color: #1b61d6; - text-decoration: none; -} - -a:hover, a:hover .pre { - text-decoration: underline; -} - -div.body h1, -div.body h2, -div.body h3, -div.body h4, -div.body h5, -div.body h6 { - font-family: "Neuton", sans-serif; - background-color: #ffffff; - font-weight: normal; - color: #373839; - margin: 30px 0px 10px 0px; - padding: 5px 0; -} - -div.body h1 { border-top: 20px solid white; margin-top: 0; font-size: 200%; } -div.body h2 { font-size: 150%; background-color: #ffffff; } -div.body h3 { font-size: 120%; background-color: #ffffff; } -div.body h4 { font-size: 
110%; background-color: #ffffff; } -div.body h5 { font-size: 100%; background-color: #ffffff; } -div.body h6 { font-size: 100%; background-color: #ffffff; } - -a.headerlink { - color: #1b61d6; - font-size: 0.8em; - padding: 0 4px 0 4px; - text-decoration: none; -} - -a.headerlink:hover { - text-decoration: underline; -} - -div.body p, div.body dd, div.body li { - line-height: 1.5em; -} - -div.admonition p.admonition-title + p { - display: inline; -} - -div.highlight{ - background-color: white; -} - -div.note { - border: 2px solid #7a9eec; - border-right-style: none; - border-left-style: none; - padding: 10px 20px 10px 60px; - background: #e1ecfe url(dialog-note.png) no-repeat 10px 8px; -} - -div.seealso { - background: #fff6bf url(dialog-seealso.png) no-repeat 10px 8px; - border: 2px solid #ffd324; - border-left-style: none; - border-right-style: none; - padding: 10px 20px 10px 60px; -} - -div.topic { - background: #eeeeee; - border: 2px solid #C6C9CB; - padding: 10px 20px; - border-right-style: none; - border-left-style: none; -} - -div.warning { - background: #fbe3e4 url(dialog-warning.png) no-repeat 10px 8px; - border: 2px solid #fbc2c4; - border-right-style: none; - border-left-style: none; - padding: 10px 20px 10px 60px; -} - -p.admonition-title { - display: none; -} - -p.admonition-title:after { - content: ":"; -} - -pre { - padding: 10px; - background-color: #fafafa; - color: #222; - line-height: 1.2em; - border: 2px solid #C6C9CB; - font-size: 1.1em; - margin: 1.5em 0 1.5em 0; - border-right-style: none; - border-left-style: none; -} - -tt { - background-color: transparent; - color: #222; - font-size: 1.1em; - font-family: monospace; -} - -.viewcode-back { - font-family: "Nobile", sans-serif; -} - -div.viewcode-block:target { - background-color: #fff6bf; - border: 2px solid #ffd324; - border-left-style: none; - border-right-style: none; - padding: 10px 20px; -} - -table.highlighttable { - width: 100%; -} - -table.highlighttable td { - padding: 0; -} - -a 
em.std-term { - color: #007f00; -} - -a:hover em.std-term { - text-decoration: underline; -} - -.download { - font-family: "Nobile", sans-serif; - font-weight: normal; - font-style: normal; -} - -tt.xref { - font-weight: normal; - font-style: normal; -} \ No newline at end of file diff --git a/docs/_themes/armstrong/LICENSE b/docs/_themes/armstrong/LICENSE deleted file mode 100644 index 894aa018a1..0000000000 --- a/docs/_themes/armstrong/LICENSE +++ /dev/null @@ -1,26 +0,0 @@ -Copyright (c) 2011 Bay Citizen & Texas Tribune - -Original ReadTheDocs.org code -Copyright (c) 2010 Charles Leifer, Eric Holscher, Bobby Grace - -Permission is hereby granted, free of charge, to any person -obtaining a copy of this software and associated documentation -files (the "Software"), to deal in the Software without -restriction, including without limitation the rights to use, -copy, modify, merge, publish, distribute, sublicense, and/or sell -copies of the Software, and to permit persons to whom the -Software is furnished to do so, subject to the following -conditions: - -The above copyright notice and this permission notice shall be -included in all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, -EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES -OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND -NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT -HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, -WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING -FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR -OTHER DEALINGS IN THE SOFTWARE. - diff --git a/docs/_themes/armstrong/README b/docs/_themes/armstrong/README deleted file mode 100644 index 56ce661cd0..0000000000 --- a/docs/_themes/armstrong/README +++ /dev/null @@ -1,3 +0,0 @@ -This is the Armstrong Sphinx theme from https://github.com/armstrong/armstrong_sphinx - -Used under BSD license. 
diff --git a/docs/_themes/armstrong/layout.html b/docs/_themes/armstrong/layout.html deleted file mode 100644 index d7b8fbb142..0000000000 --- a/docs/_themes/armstrong/layout.html +++ /dev/null @@ -1,48 +0,0 @@ -{% extends "basic/layout.html" %} - -{% set script_files = script_files + [pathto("_static/searchtools.js", 1)] %} - -{% block htmltitle %} -{{ super() }} - - - -{% endblock %} - -{% block footer %} - - - -{% if theme_analytics_code %} - - -{% endif %} - -{% endblock %} diff --git a/docs/_themes/armstrong/static/rtd.css_t b/docs/_themes/armstrong/static/rtd.css_t deleted file mode 100644 index 489911a2fb..0000000000 --- a/docs/_themes/armstrong/static/rtd.css_t +++ /dev/null @@ -1,784 +0,0 @@ -/* - * rtd.css - * ~~~~~~~~~~~~~~~ - * - * Sphinx stylesheet -- sphinxdoc theme. Originally created by - * Armin Ronacher for Werkzeug. - * - * Customized for ReadTheDocs by Eric Pierce & Eric Holscher - * - * :copyright: Copyright 2007-2010 by the Sphinx team, see AUTHORS. - * :license: BSD, see LICENSE for details. 
- * - */ - -/* RTD colors - * light blue: {{ theme_light_color }} - * medium blue: {{ theme_medium_color }} - * dark blue: {{ theme_dark_color }} - * dark grey: {{ theme_grey_color }} - * - * medium blue hover: {{ theme_medium_color_hover }}; - * green highlight: {{ theme_green_highlight }} - * light blue (project bar): {{ theme_light_color }} - */ - -@import url("basic.css"); - -/* PAGE LAYOUT -------------------------------------------------------------- */ - -body { - font: 100%/1.5 "ff-meta-web-pro-1","ff-meta-web-pro-2",Arial,"Helvetica Neue",sans-serif; - text-align: center; - color: black; - background-color: {{ theme_background }}; - padding: 0; - margin: 0; -} - -div.document { - text-align: left; - background-color: {{ theme_light_color }}; -} - -div.bodywrapper { - background-color: {{ theme_white }}; - border-left: 1px solid {{ theme_lighter_gray }}; - border-bottom: 1px solid {{ theme_lighter_gray }}; - margin: 0 0 0 16em; -} - -div.body { - margin: 0; - padding: 0.5em 1.3em; - max-width: 55em; - min-width: 20em; -} - -div.related { - font-size: 1em; - background-color: {{ theme_background }}; -} - -div.documentwrapper { - float: left; - width: 100%; - background-color: {{ theme_light_color }}; -} - -p.logo { - padding-top: 30px; -} - -/* HEADINGS --------------------------------------------------------------- */ - -h1 { - margin: 0; - padding: 0.7em 0 0.3em 0; - font-size: 1.5em; - line-height: 1.15; - color: {{ theme_h1 }}; - clear: both; -} - -h2 { - margin: 2em 0 0.2em 0; - font-size: 1.35em; - padding: 0; - color: {{ theme_h2 }}; -} - -h3 { - margin: 1em 0 -0.3em 0; - font-size: 1.2em; - color: {{ theme_h3 }}; -} - -div.body h1 a, div.body h2 a, div.body h3 a, div.body h4 a, div.body h5 a, div.body h6 a { - color: black; -} - -h1 a.anchor, h2 a.anchor, h3 a.anchor, h4 a.anchor, h5 a.anchor, h6 a.anchor { - display: none; - margin: 0 0 0 0.3em; - padding: 0 0.2em 0 0.2em; - color: {{ theme_gray_a }} !important; -} - -h1:hover a.anchor, h2:hover 
a.anchor, h3:hover a.anchor, h4:hover a.anchor, -h5:hover a.anchor, h6:hover a.anchor { - display: inline; -} - -h1 a.anchor:hover, h2 a.anchor:hover, h3 a.anchor:hover, h4 a.anchor:hover, -h5 a.anchor:hover, h6 a.anchor:hover { - color: {{ theme_gray_7 }}; - background-color: {{ theme_dirty_white }}; -} - - -/* LINKS ------------------------------------------------------------------ */ - -/* Normal links get a pseudo-underline */ -a { - color: {{ theme_link_color }}; - text-decoration: none; - border-bottom: 1px solid {{ theme_link_color_decoration }}; -} - -/* Links in sidebar, TOC, index trees and tables have no underline */ -.sphinxsidebar a, -.toctree-wrapper a, -.indextable a, -#indices-and-tables a { - color: {{ theme_dark_gray }}; - text-decoration: none; - border-bottom: none; -} - -/* Most links get an underline-effect when hovered */ -a:hover, -div.toctree-wrapper a:hover, -.indextable a:hover, -#indices-and-tables a:hover { - color: {{ theme_black }}; - text-decoration: none; - border-bottom: 1px solid {{ theme_black }}; -} - -/* Footer links */ -div.footer a { - color: {{ theme_background_text_link }}; - text-decoration: none; - border: none; -} -div.footer a:hover { - color: {{ theme_medium_color_link_hover }}; - text-decoration: underline; - border: none; -} - -/* Permalink anchor (subtle grey with a red hover) */ -div.body a.headerlink { - color: {{ theme_lighter_gray }}; - font-size: 1em; - margin-left: 6px; - padding: 0 4px 0 4px; - text-decoration: none; - border: none; -} -div.body a.headerlink:hover { - color: {{ theme_negative_text }}; - border: none; -} - - -/* NAVIGATION BAR --------------------------------------------------------- */ - -div.related ul { - height: 2.5em; -} - -div.related ul li { - margin: 0; - padding: 0.65em 0; - float: left; - display: block; - color: {{ theme_background_link_half }}; /* For the >> separators */ - font-size: 0.8em; -} - -div.related ul li.right { - float: right; - margin-right: 5px; - color: transparent; 
/* Hide the | separators */ -} - -/* "Breadcrumb" links in nav bar */ -div.related ul li a { - order: none; - background-color: inherit; - font-weight: bold; - margin: 6px 0 6px 4px; - line-height: 1.75em; - color: {{ theme_background_link }}; - text-shadow: 0 1px rgba(0, 0, 0, 0.5); - padding: 0.4em 0.8em; - border: none; - border-radius: 3px; -} -/* previous / next / modules / index links look more like buttons */ -div.related ul li.right a { - margin: 0.375em 0; - background-color: {{ theme_medium_color_hover }}; - text-shadow: 0 1px rgba(0, 0, 0, 0.5); - border-radius: 3px; - -webkit-border-radius: 3px; - -moz-border-radius: 3px; -} -/* All navbar links light up as buttons when hovered */ -div.related ul li a:hover { - background-color: {{ theme_medium_color }}; - color: {{ theme_white }}; - text-decoration: none; - border-radius: 3px; - -webkit-border-radius: 3px; - -moz-border-radius: 3px; -} -/* Take extra precautions for tt within links */ -a tt, -div.related ul li a tt { - background: inherit !important; - color: inherit !important; -} - - -/* SIDEBAR ---------------------------------------------------------------- */ - -div.sphinxsidebarwrapper { - padding: 0; -} - -div.sphinxsidebar { - margin: 0; - margin-left: -100%; - float: left; - top: 3em; - left: 0; - padding: 0 1em; - width: 14em; - font-size: 1em; - text-align: left; - background-color: {{ theme_light_color }}; -} - -div.sphinxsidebar img { - max-width: 12em; -} - -div.sphinxsidebar h3, div.sphinxsidebar h4 { - margin: 1.2em 0 0.3em 0; - font-size: 1em; - padding: 0; - color: {{ theme_gray_2 }}; - font-family: "ff-meta-web-pro-1", "ff-meta-web-pro-2", "Arial", "Helvetica Neue", sans-serif; -} - -div.sphinxsidebar h3 a { - color: {{ theme_grey_color }}; -} - -div.sphinxsidebar ul, -div.sphinxsidebar p { - margin-top: 0; - padding-left: 0; - line-height: 130%; - background-color: {{ theme_light_color }}; -} - -/* No bullets for nested lists, but a little extra indentation */ -div.sphinxsidebar ul 
ul { - list-style-type: none; - margin-left: 1.5em; - padding: 0; -} - -/* A little top/bottom padding to prevent adjacent links' borders - * from overlapping each other */ -div.sphinxsidebar ul li { - padding: 1px 0; -} - -/* A little left-padding to make these align with the ULs */ -div.sphinxsidebar p.topless { - padding-left: 0 0 0 1em; -} - -/* Make these into hidden one-liners */ -div.sphinxsidebar ul li, -div.sphinxsidebar p.topless { - white-space: nowrap; - overflow: hidden; -} -/* ...which become visible when hovered */ -div.sphinxsidebar ul li:hover, -div.sphinxsidebar p.topless:hover { - overflow: visible; -} - -/* Search text box and "Go" button */ -#searchbox { - margin-top: 2em; - margin-bottom: 1em; - background: {{ theme_dirtier_white }}; - padding: 0.5em; - border-radius: 6px; - -moz-border-radius: 6px; - -webkit-border-radius: 6px; -} -#searchbox h3 { - margin-top: 0; -} - -/* Make search box and button abut and have a border */ -input, -div.sphinxsidebar input { - border: 1px solid {{ theme_gray_9 }}; - float: left; -} - -/* Search textbox */ -input[type="text"] { - margin: 0; - padding: 0 3px; - height: 20px; - width: 144px; - border-top-left-radius: 3px; - border-bottom-left-radius: 3px; - -moz-border-radius-topleft: 3px; - -moz-border-radius-bottomleft: 3px; - -webkit-border-top-left-radius: 3px; - -webkit-border-bottom-left-radius: 3px; -} -/* Search button */ -input[type="submit"] { - margin: 0 0 0 -1px; /* -1px prevents a double-border with textbox */ - height: 22px; - color: {{ theme_dark_gray }}; - background-color: {{ theme_light_color }}; - padding: 1px 4px; - font-weight: bold; - border-top-right-radius: 3px; - border-bottom-right-radius: 3px; - -moz-border-radius-topright: 3px; - -moz-border-radius-bottomright: 3px; - -webkit-border-top-right-radius: 3px; - -webkit-border-bottom-right-radius: 3px; -} -input[type="submit"]:hover { - color: {{ theme_white }}; - background-color: {{ theme_green_highlight }}; -} - -div.sphinxsidebar 
p.searchtip { - clear: both; - padding: 0.5em 0 0 0; - background: {{ theme_dirtier_white }}; - color: {{ theme_gray }}; - font-size: 0.9em; -} - -/* Sidebar links are unusual */ -div.sphinxsidebar li a, -div.sphinxsidebar p a { - background: {{ theme_light_color }}; /* In case links overlap main content */ - border-radius: 3px; - -moz-border-radius: 3px; - -webkit-border-radius: 3px; - border: 1px solid transparent; /* To prevent things jumping around on hover */ - padding: 0 5px 0 5px; -} -div.sphinxsidebar li a:hover, -div.sphinxsidebar p a:hover { - color: {{ theme_black }}; - text-decoration: none; - border: 1px solid {{ theme_light_gray }}; -} - -/* Tweak any link appearing in a heading */ -div.sphinxsidebar h3 a { -} - - - - -/* OTHER STUFF ------------------------------------------------------------ */ - -cite, code, tt { - font-family: 'Consolas', 'Deja Vu Sans Mono', - 'Bitstream Vera Sans Mono', monospace; - font-size: 0.95em; - letter-spacing: 0.01em; -} - -tt { - background-color: {{ theme_code_background }}; - color: {{ theme_dark_gray }}; -} - -tt.descname, tt.descclassname, tt.xref { - border: 0; -} - -hr { - border: 1px solid {{ theme_ruler }}; - margin: 2em; -} - -pre, #_fontwidthtest { - font-family: 'Consolas', 'Deja Vu Sans Mono', - 'Bitstream Vera Sans Mono', monospace; - margin: 1em 2em; - font-size: 0.95em; - letter-spacing: 0.015em; - line-height: 120%; - padding: 0.5em; - border: 1px solid {{ theme_lighter_gray }}; - background-color: {{ theme_code_background }}; - border-radius: 6px; - -moz-border-radius: 6px; - -webkit-border-radius: 6px; -} - -pre a { - color: inherit; - text-decoration: underline; -} - -td.linenos pre { - padding: 0.5em 0; -} - -div.quotebar { - background-color: {{ theme_almost_white }}; - max-width: 250px; - float: right; - padding: 2px 7px; - border: 1px solid {{ theme_lighter_gray }}; -} - -div.topic { - background-color: {{ theme_almost_white }}; -} - -table { - border-collapse: collapse; - margin: 0 -0.5em 0 0; 
-} - -table td, table th { - padding: 0.2em 0.5em 0.2em 0.5em; -} - - -/* ADMONITIONS AND WARNINGS ------------------------------------------------- */ - -/* Shared by admonitions, warnings and sidebars */ -div.admonition, -div.warning, -div.sidebar { - font-size: 0.9em; - margin: 2em; - padding: 0; - /* - border-radius: 6px; - -moz-border-radius: 6px; - -webkit-border-radius: 6px; - */ -} -div.admonition p, -div.warning p, -div.sidebar p { - margin: 0.5em 1em 0.5em 1em; - padding: 0; -} -div.admonition pre, -div.warning pre, -div.sidebar pre { - margin: 0.4em 1em 0.4em 1em; -} -div.admonition p.admonition-title, -div.warning p.admonition-title, -div.sidebar p.sidebar-title { - margin: 0; - padding: 0.1em 0 0.1em 0.5em; - color: white; - font-weight: bold; - font-size: 1.1em; - text-shadow: 0 1px rgba(0, 0, 0, 0.5); -} -div.admonition ul, div.admonition ol, -div.warning ul, div.warning ol, -div.sidebar ul, div.sidebar ol { - margin: 0.1em 0.5em 0.5em 3em; - padding: 0; -} - - -/* Admonitions and sidebars only */ -div.admonition, div.sidebar { - border: 1px solid {{ theme_positive_dark }}; - background-color: {{ theme_positive_light }}; -} -div.admonition p.admonition-title, -div.sidebar p.sidebar-title { - background-color: {{ theme_positive_medium }}; - border-bottom: 1px solid {{ theme_positive_dark }}; -} - - -/* Warnings only */ -div.warning { - border: 1px solid {{ theme_negative_dark }}; - background-color: {{ theme_negative_light }}; -} -div.warning p.admonition-title { - background-color: {{ theme_negative_medium }}; - border-bottom: 1px solid {{ theme_negative_dark }}; -} - - -/* Sidebars only */ -div.sidebar { - max-width: 200px; -} - - - -div.versioninfo { - margin: 1em 0 0 0; - border: 1px solid {{ theme_lighter_gray }}; - background-color: {{ theme_light_medium_color }}; - padding: 8px; - line-height: 1.3em; - font-size: 0.9em; -} - -.viewcode-back { - font-family: 'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', - 'Verdana', sans-serif; -} - 
-div.viewcode-block:target { - background-color: {{ theme_viewcode_bg }}; - border-top: 1px solid {{ theme_viewcode_border }}; - border-bottom: 1px solid {{ theme_viewcode_border }}; -} - -dl { - margin: 1em 0 2.5em 0; -} - -/* Highlight target when you click an internal link */ -dt:target { - background: {{ theme_highlight }}; -} -/* Don't highlight whole divs */ -div.highlight { - background: transparent; -} -/* But do highlight spans (so search results can be highlighted) */ -span.highlight { - background: {{ theme_highlight }}; -} - -div.footer { - background-color: {{ theme_background }}; - color: {{ theme_background_text }}; - padding: 0 2em 2em 2em; - clear: both; - font-size: 0.8em; - text-align: center; -} - -p { - margin: 0.8em 0 0.5em 0; -} - -.section p img { - margin: 1em 2em; -} - - -/* MOBILE LAYOUT -------------------------------------------------------------- */ - -@media screen and (max-width: 600px) { - - h1, h2, h3, h4, h5 { - position: relative; - } - - ul { - padding-left: 1.75em; - } - - div.bodywrapper a.headerlink, #indices-and-tables h1 a { - color: {{ theme_almost_dirty_white }}; - font-size: 80%; - float: right; - line-height: 1.8; - position: absolute; - right: -0.7em; - visibility: inherit; - } - - div.bodywrapper h1 a.headerlink, #indices-and-tables h1 a { - line-height: 1.5; - } - - pre { - font-size: 0.7em; - overflow: auto; - word-wrap: break-word; - white-space: pre-wrap; - } - - div.related ul { - height: 2.5em; - padding: 0; - text-align: left; - } - - div.related ul li { - clear: both; - color: {{ theme_dark_color }}; - padding: 0.2em 0; - } - - div.related ul li:last-child { - border-bottom: 1px dotted {{ theme_medium_color }}; - padding-bottom: 0.4em; - margin-bottom: 1em; - width: 100%; - } - - div.related ul li a { - color: {{ theme_dark_color }}; - padding-right: 0; - } - - div.related ul li a:hover { - background: inherit; - color: inherit; - } - - div.related ul li.right { - clear: none; - padding: 0.65em 0; - 
margin-bottom: 0.5em; - } - - div.related ul li.right a { - color: {{ theme_white }}; - padding-right: 0.8em; - } - - div.related ul li.right a:hover { - background-color: {{ theme_medium_color }}; - } - - div.body { - clear: both; - min-width: 0; - word-wrap: break-word; - } - - div.bodywrapper { - margin: 0 0 0 0; - } - - div.sphinxsidebar { - float: none; - margin: 0; - width: auto; - } - - div.sphinxsidebar input[type="text"] { - height: 2em; - line-height: 2em; - width: 70%; - } - - div.sphinxsidebar input[type="submit"] { - height: 2em; - margin-left: 0.5em; - width: 20%; - } - - div.sphinxsidebar p.searchtip { - background: inherit; - margin-bottom: 1em; - } - - div.sphinxsidebar ul li, div.sphinxsidebar p.topless { - white-space: normal; - } - - .bodywrapper img { - display: block; - margin-left: auto; - margin-right: auto; - max-width: 100%; - } - - div.documentwrapper { - float: none; - } - - div.admonition, div.warning, pre, blockquote { - margin-left: 0em; - margin-right: 0em; - } - - .body p img { - margin: 0; - } - - #searchbox { - background: transparent; - } - - .related:not(:first-child) li { - display: none; - } - - .related:not(:first-child) li.right { - display: block; - } - - div.footer { - padding: 1em; - } - - .rtd_doc_footer .badge { - float: none; - margin: 1em auto; - position: static; - } - - .rtd_doc_footer .badge.revsys-inline { - margin-right: auto; - margin-bottom: 2em; - } - - table.indextable { - display: block; - width: auto; - } - - .indextable tr { - display: block; - } - - .indextable td { - display: block; - padding: 0; - width: auto !important; - } - - .indextable td dt { - margin: 1em 0; - } - - ul.search { - margin-left: 0.25em; - } - - ul.search li div.context { - font-size: 90%; - line-height: 1.1; - margin-bottom: 1; - margin-left: 0; - } - -} diff --git a/docs/_themes/armstrong/theme-old.conf b/docs/_themes/armstrong/theme-old.conf deleted file mode 100644 index c77da3a193..0000000000 --- 
a/docs/_themes/armstrong/theme-old.conf +++ /dev/null @@ -1,65 +0,0 @@ -[theme] -inherit = default -stylesheet = rtd.css -pygment_style = default -show_sphinx = False - -[options] -show_rtd = True - -white = #ffffff -almost_white = #f8f8f8 -barely_white = #f2f2f2 -dirty_white = #eeeeee -almost_dirty_white = #e6e6e6 -dirtier_white = #DAC6AF -lighter_gray = #cccccc -gray_a = #aaaaaa -gray_9 = #999999 -light_gray = #888888 -gray_7 = #777777 -gray = #666666 -dark_gray = #444444 -gray_2 = #222222 -black = #111111 -light_color = #EDE4D8 -light_medium_color = #DDEAF0 -medium_color_link = #634320 -medium_color_link_hover = #261a0c -dark_color = rgba(160, 109, 52, 1.0) - -h1 = #1f3744 -h2 = #335C72 -h3 = #638fa6 - -link_color = #335C72 -link_color_decoration = #99AEB9 - -medium_color_hover = rgba(255, 255, 255, 0.25) -medium_color = rgba(255, 255, 255, 0.5) -green_highlight = #8ecc4c - - -positive_dark = rgba(51, 77, 0, 1.0) -positive_medium = rgba(102, 153, 0, 1.0) -positive_light = rgba(102, 153, 0, 0.1) - -negative_dark = rgba(51, 13, 0, 1.0) -negative_medium = rgba(204, 51, 0, 1.0) -negative_light = rgba(204, 51, 0, 0.1) -negative_text = #c60f0f - -ruler = #abc - -viewcode_bg = #f4debf -viewcode_border = #ac9 - -highlight = #ffe080 - -code_background = rgba(0, 0, 0, 0.075) - -background = rgba(135, 57, 34, 1.0) -background_link = rgba(212, 195, 172, 1.0) -background_link_half = rgba(212, 195, 172, 0.5) -background_text = rgba(212, 195, 172, 1.0) -background_text_link = rgba(171, 138, 93, 1.0) diff --git a/docs/_themes/armstrong/theme.conf b/docs/_themes/armstrong/theme.conf deleted file mode 100644 index 5930488d75..0000000000 --- a/docs/_themes/armstrong/theme.conf +++ /dev/null @@ -1,65 +0,0 @@ -[theme] -inherit = default -stylesheet = rtd.css -pygment_style = default -show_sphinx = False - -[options] -show_rtd = True - -white = #ffffff -almost_white = #f8f8f8 -barely_white = #f2f2f2 -dirty_white = #eeeeee -almost_dirty_white = #e6e6e6 -dirtier_white = #dddddd 
-lighter_gray = #cccccc -gray_a = #aaaaaa -gray_9 = #999999 -light_gray = #888888 -gray_7 = #777777 -gray = #666666 -dark_gray = #444444 -gray_2 = #222222 -black = #111111 -light_color = #e8ecef -light_medium_color = #DDEAF0 -medium_color = #8ca1af -medium_color_link = #86989b -medium_color_link_hover = #a6b8bb -dark_color = #465158 - -h1 = #000000 -h2 = #465158 -h3 = #6c818f - -link_color = #444444 -link_color_decoration = #CCCCCC - -medium_color_hover = #697983 -green_highlight = #8ecc4c - - -positive_dark = #609060 -positive_medium = #70a070 -positive_light = #e9ffe9 - -negative_dark = #900000 -negative_medium = #b04040 -negative_light = #ffe9e9 -negative_text = #c60f0f - -ruler = #abc - -viewcode_bg = #f4debf -viewcode_border = #ac9 - -highlight = #ffe080 - -code_background = #eeeeee - -background = #465158 -background_link = #ffffff -background_link_half = #ffffff -background_text = #eeeeee -background_text_link = #86989b diff --git a/docs/apidocs/.gitignore b/docs/apidocs/.gitignore deleted file mode 100644 index 89867378b3..0000000000 --- a/docs/apidocs/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -modules.rst -rdflib*.rst diff --git a/docs/apidocs/examples.rst b/docs/apidocs/examples.rst index a8c3429bd4..e69de29bb2 100644 --- a/docs/apidocs/examples.rst +++ b/docs/apidocs/examples.rst @@ -1,141 +0,0 @@ -examples Package -================ - -These examples all live in ``./examples`` in the source-distribution of RDFLib. - -:mod:`~examples.datasets` Module --------------------------------- - -.. automodule:: examples.datasets - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.jsonld_serialization` Module --------------------------------------------- - -.. automodule:: examples.jsonld_serialization - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.custom_datatype` Module ---------------------------------------- - -.. 
automodule:: examples.custom_datatype - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.custom_eval` Module ------------------------------------ - -.. automodule:: examples.custom_eval - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.foafpaths` Module ---------------------------------- - -.. automodule:: examples.foafpaths - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.prepared_query` Module --------------------------------------- - -.. automodule:: examples.prepared_query - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.resource_example` Module ----------------------------------------- - -.. automodule:: examples.resource_example - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.berkeleydb_example` Module ------------------------------------------- - -.. automodule:: examples.berkeleydb_example - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.slice` Module ------------------------------ - -.. automodule:: examples.slice - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.smushing` Module --------------------------------- - -.. automodule:: examples.smushing - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.sparql_query_example` Module --------------------------------------------- - -.. automodule:: examples.sparql_query_example - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.sparql_update_example` Module ---------------------------------------------- - -.. automodule:: examples.sparql_update_example - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.sparqlstore_example` Module -------------------------------------------- - -.. automodule:: examples.sparqlstore_example - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.swap_primer` Module ------------------------------------ - -.. 
automodule:: examples.swap_primer - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.transitive` Module ----------------------------------- - -.. automodule:: examples.transitive - :members: - :undoc-members: - :show-inheritance: - -:mod:`~examples.secure_with_audit` Module ------------------------------------------ - -.. automodule:: examples.secure_with_audit - :members: - :undoc-members: - :show-inheritance: - - -:mod:`~examples.secure_with_urlopen` Module -------------------------------------------- - -.. automodule:: examples.secure_with_urlopen - :members: - :undoc-members: - :show-inheritance: diff --git a/docs/changelog.md b/docs/changelog.md index 63ae71beb0..e40ac58a2a 100644 --- a/docs/changelog.md +++ b/docs/changelog.md @@ -1,4 +1,3 @@ # Changelog -```{include} ../CHANGELOG.md -``` +{% include "../CHANGELOG.md" %} diff --git a/docs/conf.py b/docs/conf.py index b3c4a373bd..e69de29bb2 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -1,333 +0,0 @@ -# rdflib documentation build configuration file, created by -# sphinx-quickstart on Fri May 15 15:03:54 2009. -# -# This file is execfile()d with the current directory set to its containing dir. -# -# Note that not all possible configuration values are present in this -# autogenerated file. -# -# All configuration values have a default; values that are commented out -# serve to show the default. -# https://www.sphinx-doc.org/en/master/usage/configuration.html -from __future__ import annotations - -import logging -import os -import re -import sys -from typing import Any, Dict - -import sphinx -import sphinx.application - -import rdflib - -# If extensions (or modules to document with autodoc) are in another directory, -# add these directories to sys.path here. If the directory is relative to the -# documentation root, use os.path.abspath to make it absolute, like shown here. 
-# sys.path.append(os.path.abspath("..")) -sys.path.append(os.path.abspath("..")) - -# -- General configuration ----------------------------------------------------- - -# Add any Sphinx extension module names here, as strings. They can be extensions -# coming with Sphinx (named 'sphinx.ext.*') or your custom ones. -# extensions = ['sphinx.ext.autodoc', 'sphinx.ext.todo', 'sphinx.ext.doctest'] -extensions = [ - "sphinxcontrib.apidoc", - "sphinx.ext.autodoc", - #'sphinx.ext.autosummary', - "sphinx_autodoc_typehints", - "sphinx.ext.doctest", - "sphinx.ext.intersphinx", - "sphinx.ext.todo", - "sphinx.ext.coverage", - "sphinx.ext.ifconfig", - "sphinx.ext.viewcode", - "myst_parser", - "sphinx.ext.autosectionlabel", -] - -# https://github.com/sphinx-contrib/apidoc/blob/master/README.rst#configuration -apidoc_module_dir = "../rdflib" -apidoc_output_dir = "apidocs" - -# https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html -autodoc_default_options = {"special-members": True} -autodoc_inherit_docstrings = True - -# https://github.com/tox-dev/sphinx-autodoc-typehints -always_document_param_types = True - -autosummary_generate = True - -autosectionlabel_prefix_document = True - -# Add any paths that contain templates here, relative to this directory. -templates_path = ["_templates"] - -# epydoc_mapping = { -# '/_static/api/': [r'rdflib\.'], -# } - -# The suffix of source filenames. -source_suffix = ".rst" - -# The encoding of source files. -source_encoding = "utf-8" - -# The master toctree document. -master_doc = "index" - -# General information about the project. -project = "rdflib" -copyright = "2002 - 2025, RDFLib Team" - -# The version info for the project you're documenting, acts as replacement for -# |version| and |release|, also used in various other places throughout the -# built documents. - - -# Find version. We have to do this because we can't import it in Python 3 until -# its been automatically converted in the setup process. 
-# UPDATE: This function is no longer used; once builds are confirmed to succeed, it -# can/should be removed. --JCL 2022-12-30 -def find_version(filename): - _version_re = re.compile(r'__version__ = "(.*)"') - for line in open(filename): - version_match = _version_re.match(line) - if version_match: - return version_match.group(1) - - -# The full version, including alpha/beta/rc tags. -release = rdflib.__version__ -# The short X.Y version. -version = re.sub("[0-9]+\\.[0-9]\\..*", "\1", release) - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# language = None - -# There are two options for replacing |today|: either, you set today to some -# non-false value, then it is used: -# today = '' -# Else, today_fmt is used as the format for a strftime call. -# today_fmt = '%B %d, %Y' - -# List of documents that shouldn't be included in the build. -# unused_docs = [] - -# List of directories, relative to source directory, that shouldn't be searched -# for source files. -exclude_trees = ["_build", "draft"] - -# The reST default role (used for this markup: `text`) to use for all documents. -default_role = "py:obj" - -# If true, '()' will be appended to :func: etc. cross-reference text. -add_function_parentheses = True - -# If true, the current module name will be prepended to all description -# unit titles (such as .. function::). -add_module_names = True - -# If true, sectionauthor and moduleauthor directives will be shown in the -# output. They are ignored by default. -# show_authors = False - -# The name of the Pygments (syntax highlighting) style to use. -pygments_style = "sphinx" - -# A list of ignored prefixes for module index sorting. -# modindex_common_prefix = [] - - -# -- Options for HTML output --------------------------------------------------- - -# The theme to use for HTML and HTML Help pages. Major themes that come with -# Sphinx are currently 'default' and 'sphinxdoc'. 
-html_theme = "armstrong" - - -# Theme options are theme-specific and customize the look and feel of a theme -# further. For a list of options available for each theme, see the -# documentation. -# html_theme_options = {} - -# Add any paths that contain custom themes here, relative to this directory. -html_theme_path = [ - "_themes", -] - -# The name for this set of Sphinx documents. If None, it defaults to -# " v documentation". -# html_title = None - -# A shorter title for the navigation bar. Default is the same as html_title. -# html_short_title = None - -# The name of an image file (relative to this directory) to place at the top -# of the sidebar. -# html_logo = None -html_logo = "_static/RDFlib.png" - -# The name of an image file (within the static path) to use as favicon of the -# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 -# pixels large. -html_favicon = "_static/RDFlib.ico" - -# Add any paths that contain custom static files (such as style sheets) here, -# relative to this directory. They are copied after the builtin static files, -# so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = ["_static"] - -# If not '', a 'Last updated on:' timestamp is inserted at every page bottom, -# using the given strftime format. -# html_last_updated_fmt = '%b %d, %Y' - -# If true, SmartyPants will be used to convert quotes and dashes to -# typographically correct entities. -# html_use_smartypants = True - -# Custom sidebar templates, maps document names to template names. -# html_sidebars = {} - -# Additional templates that should be rendered to pages, maps page names to -# template names. -# html_additional_pages = {} - -# If false, no module index is generated. -# html_use_modindex = True - -# If false, no index is generated. -# html_use_index = True - -# If true, the index is split into individual pages for each letter. 
-# html_split_index = False - -# If true, links to the reST sources are added to the pages. -# html_show_sourcelink = True - -# If true, an OpenSearch description file will be output, and all pages will -# contain a tag referring to it. The value of this option must be the -# base URL from which the finished HTML is served. -# html_use_opensearch = '' - -# If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml"). -# html_file_suffix = '' - -# Output file base name for HTML help builder. -htmlhelp_basename = "rdflibdoc" - - -# -- Options for LaTeX output -------------------------------------------------- - -# The paper size ('letter' or 'a4'). -# latex_paper_size = 'letter' - -# The font size ('10pt', '11pt' or '12pt'). -# latex_font_size = '10pt' - -# Grouping the document tree into LaTeX files. List of tuples -# (source start file, target name, title, author, documentclass [howto/manual]). -# latex_documents = [ -# ("index", "rdflib.tex", "rdflib Documentation", "RDFLib Team", "manual"), -# ] - -# The name of an image file (relative to this directory) to place at the top of -# the title page. -# latex_logo = None - -# For "manual" documents, if this is true, then toplevel headings are parts, -# not chapters. -# latex_use_parts = False - -# Additional stuff for the LaTeX preamble. -# latex_preamble = '' - -# Documents to append as an appendix to all manuals. -# latex_appendices = [] - -# If false, no module index is generated. -# latex_use_modindex = True - - -# Example configuration for intersphinx: refer to the Python standard library. 
-intersphinx_mapping = { - "python": ("https://docs.python.org/3.8", None), -} - -html_experimental_html5_writer = True - -needs_sphinx = "4.1.2" - -suppress_warnings = [ - # This is here to prevent: - # "WARNING: more than one target found for cross-reference" - "ref.python", - "autosectionlabel.*", -] - -sphinx_version = tuple(int(part) for part in sphinx.__version__.split(".")) - - -nitpicky = True - -nitpick_ignore = [ - ("py:class", "urllib.response.addinfourl"), - ("py:class", "importlib.metadata.EntryPoint"), - ("py:class", "xml.dom.minidom.Document"), - ("py:class", "xml.dom.minidom.DocumentFragment"), - ("py:class", "isodate.duration.Duration"), - ("py:class", "pyparsing.core.TokenConverter"), - ("py:class", "pyparsing.results.ParseResults"), - ("py:class", "pyparsing.core.ParserElement"), -] - - -def autodoc_skip_member_handler( - app: sphinx.application.Sphinx, - what: str, - name: str, - obj: Any, - skip: bool, - options: Dict[str, Any], -): - """ - This function will be called by Sphinx when it is deciding whether to skip a - member of a class or module. - """ - # https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#event-autodoc-skip-member - if ( - app.env.docname == "apidocs/rdflib" - and what == "module" - and type(obj).__name__.endswith("DefinedNamespaceMeta") - ): - # Don't document namespaces in the `rdflib` module, they will be - # documented in the `rdflib.namespace` module instead and Sphinx does - # not like when these are documented in two places. 
- #
- # An example of the WARNINGS that occur without this is:
- #
- # "WARNING: duplicate object description of rdflib.namespace._SDO.SDO,
- # other instance in apidocs/rdflib, use :noindex: for one of them"
- logging.info(
- "Skipping %s %s in %s, it will be documented in ",
- what,
- name,
- app.env.docname,
- )
- return True
- return None
-
-
-# https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#skipping-members
-def setup(app: sphinx.application.Sphinx) -> None:
- """
- Setup the Sphinx application.
- """
-
- # Register a autodoc-skip-member handler so that certain members can be
- # skipped.
- app.connect("autodoc-skip-member", autodoc_skip_member_handler)
diff --git a/docs/decisions.md b/docs/decisions.md
new file mode 100644
index 0000000000..9f68c4260c
--- /dev/null
+++ b/docs/decisions.md
@@ -0,0 +1,35 @@
+# Decision Records
+
+To ensure that significant changes to RDFLib are made with sufficient consultation, consideration, and planning, they should be preceded by a decision record that captures the particulars of the decision that led to the change.
+
+Decision records present the users and maintainers of RDFLib with an opportunity to review decisions before effort is expended to implement the decision in code, and it also makes it possible to review decisions without having to reconstruct them from the code changes that implement them.
+
+Whether a change is significant is hard to measure objectively, but some characteristics that may indicate that a change is significant include:
+
+* It will require changes to code that use RDFLib.
+* It cannot be reversed without requiring changes to code that use RDFLib.
+* It is onerous to reverse later.
+* It increases the maintenance burden of RDFLib.
+* It is very large.
+
+Some of these characteristics are not binary but measured in degrees, so some discretion is required when determining if an architectural decision record is appropriate.
+ +Decision records may also be used for changes that do not have any of the listed characteristics if a decision record would be otherwise helpful, for example to capture a decision to change the maintenance process of RDFLib. + +Changes not preceded by decision records won't be rejected solely on this basis even if they are deemed significant, and decision records may also be created retrospectively for changes. + +Decision records as described here are similar to the concept of [Architectural Decision Records](https://adr.github.io/), though it is slightly broader as it could include decisions which are not classified as architectural. + +## Creating a decision record + +Decision records should be added to the RDFLib repository in the `./docs/decisions/` directory with a name `{YYYYmmdd}-{title}.md`. + +The content of the decision record should succinctly describe the context of the decision, the decision itself, and the status of the decision. + +Decision records should preferably follow [Michael Nygard decision record template](https://github.com/joelparkerhenderson/architecture-decision-record/blob/main/templates/decision-record-template-by-michael-nygard/index.md) that he described in a [2011 article](https://cognitect.com/blog/2011/11/15/documenting-architecture-decisions.html) on documenting architecture decisions. + +For questions about decision records please reach out to the RDFLib maintainers and community using the options given in [further_help_and_contact]. + +## Decisions list + +- [Default branch](decisions/20220826-default_branch.md) diff --git a/docs/decisions/20220826-default_branch.md b/docs/decisions/20220826-default_branch.md new file mode 100644 index 0000000000..22443cfca8 --- /dev/null +++ b/docs/decisions/20220826-default_branch.md @@ -0,0 +1,30 @@ +# Default Branch Name + +!!! 
success "Status"
+    Accepted
+
+## Context
+
+In recent years, usage of the word `master` has become somewhat controversial [as noted by SFC][SFC-BNAMING], and consequently the default branch name of Git repos has become `main`, both in Git itself [according to SFC][SFC-BNAMING] and in Git hosting solutions such as GitHub [documentation][GH-BRANCHES].
+
+## Decision
+
+RDFLib's default branch will be renamed from `master` to `main`. This is primarily to stay in line with modern conventions and to adhere to the principle of least surprise.
+
+## Consequences
+
+Anticipated negative consequences:
+
+* Some links to old code will be broken.
+* Some people's workflow may break unexpectedly and need adjusting.
+* Any code and systems reliant on the old default branch name will fail.
+
+Anticipated positive consequences:
+
+* It will become a bit easier to work with RDFLib for developers that are used
+  to `main` as the default branch.
+
+## References
+
+[GH-BRANCHES]: https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/about-branches#about-the-default-branch "GitHub: About the default branch"
+[SFC-BNAMING]: https://sfconservancy.org/news/2020/jun/23/gitbranchname/ "Regarding Git and Branch Naming"
diff --git a/docs/decisions/20220826-default_branch.rst b/docs/decisions/20220826-default_branch.rst
deleted file mode 100644
index dfa4189faf..0000000000
--- a/docs/decisions/20220826-default_branch.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-Default Branch Name
-===========================
-
-.. admonition:: Status
-
-   Accepted
-
-Context
--------
-
-In recent years usage of the word ``master`` has become somewhat controversial
-[SFC-BNAMING]_ and consequently default branch name of Git repos has become
-``main``, both in Git itself [SFC-BNAMING]_ and in Git hosting solutions such as
-GitHub [GH-BRANCHES]_.
-
-Decision
--------
-
-RDFLib's
-default branch will be renamed from ``master`` to ``main``.
This is primarily to stay in line with modern conventions and to adhere to the principle of least surprise. - -Consequences ------------- - -Anticipated negative consequences: - -* Some links to old code will be broken. -* Some people's workflow may break unexpectedly and need adjusting. -* Any code and systems reliant on the old default branch name will fail. - -Anticipated positive consequences: - -* It will become a bit easier to work with RDFLib for developers that are used - to ``main`` as the default branch. - -References ----------- - -.. [GH-BRANCHES] `GitHub: About the default branch - `_ -.. [SFC-BNAMING] `Regarding Git and Branch Naming - `_ diff --git a/docs/decisions/index.rst b/docs/decisions/index.rst deleted file mode 100644 index 39d02ccc9a..0000000000 --- a/docs/decisions/index.rst +++ /dev/null @@ -1,69 +0,0 @@ -.. _decision_records: Decision Records - -Decision Records -================ - -To ensure that significant changes to RDFLib are made with sufficient consultation, -consideration and planning they should be preceded by a decision record that -captures the particulars of the decision that lead to the change. - -Decision records present the users and maintainers of RDFLib with an opportunity -to review decisions before effort is expended to implement the decision in code, -and it also makes it possible to review decisions without having to reconstruct -them from the code changes that implement them. - -Whether a change is significant is hard to measure objectively, but some -characteristics that may indicate that a change is significant include: - -* It will require changes to code that use RDFLib. -* It cannot be reversed without requiring changes to code that use - RDFLib. -* It is onerous to reverse later. -* It increases the maintenance burden of RDFLib. -* It is very large. 
- -Some of these characteristics are not binary but measured in degrees, so some -discretion is required when determining if an architectural decision record is -appropriate. - -Decision records may also be used for changes that do not have any of the listed -characteristics if a decision record would be otherwise helpful, for example to -capture a decision to change the maintenance process of RDFLib. - -Changes not preceded by decision records won't be rejected solely on this basis -even if they are deemed significant, and decision records may also be created -retrospectively for changes. - -Decision records as described here are similar to the concept of `Architectural -Decision Records `_, though it is slightly broader as it -could include decisions which are not classified as architectural. - -Creating a decision record --------------------------- - -Decision records should be added to the RDFLib repository in the -``./docs/decisions/`` directory with a name ``{YYYYmmdd}-{title}.rst``. - -The content of the decision record should succinctly describe the context of the -decision, the decision itself, and the status of the decision. - -Decision records should preferably follow `Michael Nygard decision record -template -`_ -that he described in a `2011 article -`_ -on documenting architecture decisions. - -For questions about decision records please reach out to the RDFLib maintainers -and community using the options given in :ref:`further_help_and_contact`. - - -Decision list -------------- - -.. toctree:: - :glob: - - 20*-* - - \ No newline at end of file diff --git a/docs/developers.md b/docs/developers.md new file mode 100644 index 0000000000..4687bbe038 --- /dev/null +++ b/docs/developers.md @@ -0,0 +1,384 @@ +# RDFLib developers guide + +## Introduction + +This document describes the process and conventions to follow when +developing RDFLib code. + +* Please be as Pythonic as possible ([PEP 8](https://www.python.org/dev/peps/pep-0008/)). 
+* Code should be formatted using [black](https://github.com/psf/black) and we use Black v23.1.0, with the black config in `pyproject.toml`. +* Code should also pass [flake8](https://flake8.pycqa.org/en/latest/) linting + and [mypy](http://mypy-lang.org/) type checking. +* You must supply tests for new code. +* RDFLib uses [Poetry](https://python-poetry.org/docs/master/) for dependency management and packaging. + +If you add a new cool feature, consider also adding an example in `./examples`. + +## Pull Requests Guidelines + +Contributions to RDFLib are made through pull requests (PRs). + +For changes that add features or affect the public API of RDFLib, it is recommended to first open an issue to discuss the change before starting to work on it. That way you can get feedback on the design of the feature before spending time on it. + +In general, maintainers will only merge PRs if the following conditions are met: + +* The PR has been sufficiently reviewed. + + Each PR should be reviewed and approved by at least two people other than the + author of the PR before it is merged and PRs will be processed faster if + they are easier to review and approve of. + + Reviews are open to everyone, but the weight assigned to any particular + review is at the discretion of maintainers. + +* Changes that have a runtime impact are covered by unit tests. + + There should either be existing tests that cover the changed code and + behaviour, or the PR should include tests. For more information about what is + considered adequate testing see the [Tests section](#tests). + +* Documentation that covers something that changed has been updated. + +* Type checks and unit tests that are part of our continuous integration workflow pass. + +In addition to these conditions, PRs that are easier to review and approve will be processed quicker. The primary factors that determine this are the scope and size of a PR. 
If there are few changes and the scope is limited, then there is less that a reviewer has to understand and less that they can disagree with. It is thus important to try to split up your changes into multiple independent PRs if possible. No PR is too small. + +For PRs that introduce breaking changes, it is even more critical that they are limited in size and scope, as they will likely have to be kept up to date with the `main` branch of this project for some time before they are merged. + +It is also critical that your PR is understandable both in what it does and why it does it, and how the change will impact the users of this project, for this reason, it is essential that your PR's description explains the nature of the PR, what the PR intends to do, why this is desirable, and how this will affect the users of this project. + +Please note that while we would like all PRs to follow the guidelines given here, we will not reject a PR just because it does not. + +## Maintenance Guidelines + +This section contains guidelines for maintaining RDFLib. RDFLib maintainers should try to follow these. These guidelines also serve as an indication to RDFLib users what they can expect. + +### Breaking changes + +Breaking changes to RDFLib's public API should be made incrementally, with small pull requests to the main branch that change as few things as possible. + +Breaking changes should be discussed first in an issue before work is started, as it is possible that the change is not necessary or that there is a better way to achieve the same goal, in which case the work on the PR would have been wasted. This will however not be strictly enforced, and no PR will be rejected solely on the basis that it was not discussed upfront. 
+ +RDFLib follows [semantic versioning](https://semver.org/spec/v2.0.0.html) and [trunk-based development](https://trunkbaseddevelopment.com/), so if any breaking changes were introduced into the main branch since the last release, then the next release will be a major release with an incremented major version. + +Releases of RDFLib will not as a rule be conditioned on specific features, so there may be new major releases that contain very few breaking changes, and there could be no minor or patch releases between two major releases. + +#### Rationale + +RDFLib has been around for more than a decade, and in this time both Python and RDF have evolved, and RDFLib's API also has to evolve to keep up with these changes and to make it easier for users to use. This will inevitably require breaking changes. + +There are more or less two ways to introduce breaking changes to RDFLib's public API: + +- Revolutionary: Create a new API from scratch and reimplement it, and when + ready, release a new version of RDFLib with the new API. +- Evolutionary: Incrementally improve the existing API with small changes and + release any breaking changes that were made at regular intervals. + +While the revolutionary approach seems appealing, it is also risky and time-consuming. + +The evolutionary approach puts a lot of strain on the users of RDFLib as they have to adapt to breaking changes more often, but the shortcomings of the RDFLib public API also put a lot of strain on the users of RDFLib. On the other hand, a major advantage of the evolutionary approach is that it is simple and achievable from a maintenance and contributor perspective. + +### Deprecating functionality + +To whatever extent possible, classes, functions, variables, or parameters that will be removed should be marked for deprecation in documentation, and if possible, should be changed to raise deprecation warnings if used. 
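+
+The deprecation-warning pattern described above can be sketched with Python's standard `warnings` module. The function name `legacy_parse` below is purely illustrative and not part of RDFLib's API:
+
+```python
+import warnings
+
+
+def legacy_parse(data: str) -> str:
+    """Hypothetical deprecated function kept for backwards compatibility."""
+    warnings.warn(
+        "legacy_parse() is deprecated; use parse() instead",
+        DeprecationWarning,
+        stacklevel=2,  # attribute the warning to the caller, not this frame
+    )
+    return data.strip()
+```
+
+Running Python with `-W error::DeprecationWarning` turns such calls into hard errors, which makes deprecated usage easy to find.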
+
+There is however no hard requirement that something may only be removed after a deprecation notice has been added, or only after a release was made with a deprecation notice.
+
+Consequently, functionality may be removed without it ever being marked as deprecated.
+
+#### Rationale
+
+Current resource limitations and the backlog of issues make it impractical to first release or incorporate deprecation notices before making quality of life changes.
+
+RDFLib uses semantic versioning and provides type hints, and these are the primary mechanisms for signalling breaking changes to our users.
+
+## Tests
+
+Any new functionality being added to RDFLib *must* have unit tests and should have doc tests supplied.
+
+Typically, you should add your functionality and new tests to a branch of RDFLib and run all tests locally and see them pass. There are currently close to 4,000 tests, with some expected failures and skipped tests. We won't merge pull requests unless the test suite completes successfully.
+
+Tests that you add should show how your new feature or bug fix is doing what you say it is doing: if you remove your enhancement, your new tests should fail!
+
+Finally, please consider adding both simple and more complex tests. It's good to see the basic functionality of your feature tested, and then also any tricky bits or edge cases.
+
+### Testing framework
+
+RDFLib uses the [pytest](https://docs.pytest.org/en/latest/) testing framework.
+
+### Running tests
+
+To run RDFLib's test suite with [pytest](https://docs.pytest.org/en/latest/):
+
+```bash
+poetry install
+poetry run pytest
+```
+
+Specific tests can be run by file name. For example:
+
+```bash
+poetry run pytest test/test_graph/test_graph.py
+```
+
+For more extensive tests, including tests for the [berkeleydb](https://www.oracle.com/database/technologies/related/berkeleydb.html) backend, install extra requirements before executing the tests.
+
+```bash
+poetry install --all-extras
+poetry run pytest
+```
+
+### Writing tests
+
+New tests should be written for [pytest](https://docs.pytest.org/en/latest/) instead of for python's built-in `unittest` module, as pytest provides advanced features such as parameterization and more flexibility in writing expected-failure tests than `unittest`.
+
+A primer on how to write tests for pytest can be found [here](https://docs.pytest.org/en/latest/getting-started.html#create-your-first-test).
+
+The existing tests that use `unittest` work well with pytest, but they should ideally be updated to the pytest test-style when they are touched.
+
+Tests should go into the `test/` directory, either into an existing test file with a name that is applicable to the test being written, or into a new test file with a name that is descriptive of the tests placed in it. Test files should be named `test_*.py` so that [pytest can discover them](https://docs.pytest.org/en/latest/explanation/goodpractices.html#conventions-for-python-test-discovery).
+
+## Running static checks
+
+Check formatting with [black](https://github.com/psf/black), making sure you use
+our black configuration in `pyproject.toml`:
+
+```bash
+poetry run black .
+```
+
+Check style and conventions with [ruff](https://docs.astral.sh/ruff/linter/):
+
+```bash
+poetry run ruff check
+```
+
+Any issues that are found can potentially be fixed automatically using:
+
+```bash
+poetry run ruff check --fix
+```
+
+Check types with [mypy](http://mypy-lang.org/):
+
+```bash
+poetry run mypy --show-error-context --show-error-codes
+```
+
+## pre-commit and pre-commit ci
+
+We have [pre-commit](https://pre-commit.com/) configured with [black](https://github.com/psf/black) for formatting code.
+
+Some useful commands for using pre-commit:
+
+```bash
+# Install pre-commit.
+pip install --user --upgrade pre-commit
+
+# Install pre-commit hooks, this will run pre-commit
+# every time you make a git commit.
+pre-commit install
+
+# Run pre-commit on changed files.
+pre-commit run
+
+# Run pre-commit on all files.
+pre-commit run --all-files
+```
+
+There are also two tox environments for pre-commit:
+
+```bash
+# run pre-commit on changed files.
+tox -e precommit
+
+# run pre-commit on all files.
+tox -e precommitall
+```
+
+There is no hard requirement for pull requests to be processed with pre-commit (or the underlying processors), however doing this makes for a less noisy codebase with cleaner history.
+
+We have enabled [https://pre-commit.ci/](https://pre-commit.ci/) and this can be used to automatically fix pull requests by commenting `pre-commit.ci autofix` on a pull request.
+
+## Using tox
+
+RDFLib has a [tox](https://tox.wiki/en/latest/index.html) config file that makes it easier to run validation on all supported python versions.
+
+```bash
+# Install tox.
+pip install tox
+
+# List the tox environments that run by default.
+tox -l
+
+# Run the default environments.
+tox
+
+# List all tox environments, including ones that don't run by default.
+tox -a
+
+# Run a specific environment.
+tox -e py39 # default environment with py39
+tox -e py311-extra # extra tests with py311
+
+# Override the test command.
+# the below command will run `pytest test/test_translate_algebra.py`
+# instead of the default pytest command.
+tox -e py39,py311 -- pytest test/test_translate_algebra.py
+```
+
+## `go-task` and `Taskfile.yml`
+
+A `Taskfile.yml` is provided for [go-task](https://taskfile.dev/#/) with various commands that facilitate development.
+
+Instructions for installing go-task can be seen in the [go-task installation guide](https://taskfile.dev/#/installation).
+
+Some useful commands for working with the tasks in the taskfile are given below:
+
+```bash
+# List available tasks.
+task -l
+
+# Configure the environment for development
+task configure
+
+# Run basic validation
+task validate
+
+# Build docs
+task docs
+
+# Run live-preview on the docs
+task docs:live-server
+
+# Run the py310 tox environment
+task tox -- -e py310
+```
+
+The [Taskfile usage documentation](https://taskfile.dev/#/usage) provides more information on how to work with taskfiles.
+
+## Development container
+
+To simplify the process of getting a working development environment for developing RDFLib, we provide a [Development Container](https://devcontainers.github.io/containers.dev/) (*devcontainer*) that is configured in [Docker Compose](https://docs.docker.com/compose/). This container can be used directly to run various commands, or it can be used with [editors that support Development Containers](https://devcontainers.github.io/containers.dev/supporting).
+
+!!! bug "Rootless docker"
+    The devcontainer is intended to run with a
+    [rootless docker](https://docs.docker.com/engine/security/rootless/)
+    daemon so it can edit files owned by the invoking user without
+    an involved configuration process.
+
+    Using a rootless docker daemon also has general security benefits.
+
+To use the development container directly:
+
+```bash
+# Build the devcontainer docker image.
+docker-compose build
+
+# Configure the system for development.
+docker-compose run --rm run task configure
+
+# Run the validate task inside the devtools container.
+docker-compose run --rm run task validate
+
+# Run extensive tests inside the devtools container.
+docker-compose run --rm run task EXTENSIVE=true test
+
+# To get a shell into the devcontainer docker image.
+docker-compose run --rm run bash
+```
+
+The devcontainer also works with [Podman Compose](https://github.com/containers/podman-compose).
+
+Details on how to use the development container with [VSCode](https://code.visualstudio.com/) can be found in the [Developing inside a Container](https://code.visualstudio.com/docs/remote/containers) page. With the VSCode [development container CLI](https://code.visualstudio.com/docs/remote/devcontainer-cli) installed, the following commands can be used to open the repository inside the development container:
+
+```bash
+# Inside the repository base directory
+cd ./rdflib/
+
+# Build the development container.
+devcontainer build .
+
+# Open the code inside the development container.
+devcontainer open .
+```
+
+## Writing documentation
+
+We use mkdocs for generating HTML docs, see [docs](docs.md).
+
+## Continuous Integration
+
+We use GitHub Actions for CI, see: [https://github.com/RDFLib/rdflib/actions](https://github.com/RDFLib/rdflib/actions)
+
+If you make a pull request to RDFLib on GitHub, GitHub Actions will automatically test your code and we will only merge code passing all tests.
+
+Please do *not* commit tests you know will fail, even if you're just pointing out a bug. If you commit such tests, flag them as expecting to fail.
+
+## Compatibility
+
+RDFlib 7.0.0 release and later only support Python 3.8.1 and newer.
+
+RDFlib 6.0.0 release and later only support Python 3.7 and newer.
+
+RDFLib 5.0.0 maintained compatibility with Python versions 2.7, 3.4, 3.5, 3.6, 3.7.
+
+## Releasing
+
+Create a release-preparation pull request with the following changes:
+
+* Updated version and date in `CITATION.cff`.
+* Updated copyright year in the `LICENSE` file.
+* Updated copyright year in the `docs/conf.py` file.
+* Updated main branch version and current version in the `README.md` file.
+* Updated version in the `pyproject.toml` file.
+* Updated `__date__` in the `rdflib/__init__.py` file.
+* Accurate `CHANGELOG.md` entry for the release.
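+
+One way to sanity-check the version bumps in the checklist above is a small script that compares the version strings across files. The helper below is a sketch with assumed file paths and regex patterns, not part of RDFLib's release tooling:
+
+```python
+import re
+from pathlib import Path
+
+
+def read_version(path: str, pattern: str) -> str:
+    """Return the first capture group of `pattern` found in the file at `path`."""
+    match = re.search(pattern, Path(path).read_text(), re.MULTILINE)
+    if match is None:
+        raise ValueError(f"no version string found in {path}")
+    return match.group(1)
+
+
+# Run from the repository root; both files should agree before tagging:
+# assert read_version("pyproject.toml", r'^version = "(.+)"') == \
+#     read_version("rdflib/__init__.py", r'^__version__ = "(.+)"')
+```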
+
+Once the PR is merged, switch to the main branch, build the release and upload it to PyPI:
+
+```bash
+# Clean up any previous builds
+rm -vf dist/*
+
+# Build artifacts
+poetry build
+
+# Verify package metadata
+bsdtar -xvf dist/rdflib-*.whl -O '*/METADATA' | view -
+bsdtar -xvf dist/rdflib-*.tar.gz -O '*/PKG-INFO' | view -
+
+# Check that the built wheel and sdist work correctly:
+## Ensure pipx is installed but not within RDFLib's environment
+pipx run --no-cache --spec "$(readlink -f dist/rdflib*.whl)" rdfpipe --version
+pipx run --no-cache --spec "$(readlink -f dist/rdflib*.whl)" rdfpipe https://github.com/RDFLib/rdflib/raw/main/test/data/defined_namespaces/rdfs.ttl
+pipx run --no-cache --spec "$(readlink -f dist/rdflib*.tar.gz)" rdfpipe --version
+pipx run --no-cache --spec "$(readlink -f dist/rdflib*.tar.gz)" rdfpipe https://github.com/RDFLib/rdflib/raw/main/test/data/defined_namespaces/rdfs.ttl
+
+# Dry run publishing
+poetry publish --repository=testpypi --dry-run
+poetry publish --dry-run
+
+# Publish to TestPyPI
+## ensure you are authed as per https://pypi.org/help/#apitoken and https://github.com/python-poetry/poetry/issues/6320
+poetry publish --repository=testpypi
+
+# Publish to PyPI
+poetry publish
+## poetry publish -u __token__ -p pypi-
+```
+
+Once this is done, create a release tag from [GitHub releases](https://github.com/RDFLib/rdflib/releases/new). For a release of version 6.3.1 the tag should be `6.3.1` (without a "v" prefix), and the release title should be "RDFLib 6.3.1". The release notes for the latest version should be added to the release description. The artifacts built with `poetry build` should be uploaded to the release as release artifacts.
+
+The resulting release will be available at https://github.com/RDFLib/rdflib/releases/tag/6.3.1
+
+Once this is done, announce the release at the following locations:
+
+* Twitter: Just make a tweet from your own account linking to the latest release.
+* RDFLib mailing list.
+* RDFLib Gitter / matrix.org chat room. + +Once this is all done, create another post-release pull request with the following changes: + +* Set the just released version in `docker/latest/requirements.in` and run `task docker:prepare` to update the `docker/latest/requirements.txt` file. +* Set the version in the `pyproject.toml` file to the next minor release with a `a0` suffix to indicate alpha 0. diff --git a/docs/developers.rst b/docs/developers.rst index 80cd01bbc7..e69de29bb2 100644 --- a/docs/developers.rst +++ b/docs/developers.rst @@ -1,573 +0,0 @@ -.. developers: - -RDFLib developers guide -======================= - -Introduction ------------- - -This document describes the process and conventions to follow when -developing RDFLib code. - -* Please be as Pythonic as possible (:pep:`8`). -* Code should be formatted using `black `_ and we use Black v23.1.0, with the black config in ``pyproject.toml``. -* Code should also pass `flake8 `_ linting - and `mypy `_ type checking. -* You must supply tests for new code. -* RDFLib uses `Poetry `_ for dependency management and packaging. - -If you add a new cool feature, consider also adding an example in ``./examples``. - -Pull Requests Guidelines ------------------------- - -Contributions to RDFLib are made through pull requests (PRs). - -For changes that add features or affect the public API of RDFLib, it -is recommended to first open an issue to discuss the change before starting to -work on it. That way you can get feedback on the design of the feature before -spending time on it. - -In general, maintainers will only merge PRs if the following conditions are -met: - -* The PR has been sufficiently reviewed. - - Each PR should be reviewed and approved by at least two people other than the - author of the PR before it is merged and PRs will be processed faster if - they are easier to review and approve of. 
- - Reviews are open to everyone, but the weight assigned to any particular - review is at the discretion of maintainers. - -* Changes that have a runtime impact are covered by unit tests. - - There should either be existing tests that cover the changed code and - behaviour, or the PR should include tests. For more information about what is - considered adequate testing see the :ref:`Tests section `. - -* Documentation that covers something that changed has been updated. - -* Type checks and unit tests that are part of our continuous integration - workflow pass. - -In addition to these conditions, PRs that are easier to review and approve will -be processed quicker. The primary factors that determine this are the scope and -size of a PR. If there are few changes and the scope is limited, then there is -less that a reviewer has to understand and less that they can disagree with. It -is thus important to try to split up your changes into multiple independent PRs -if possible. No PR is too small. - -For PRs that introduce breaking changes, it is even more critical that they are -limited in size and scope, as they will likely have to be kept up to date with -the ``main`` branch of this project for some time before they are merged. - -It is also critical that your PR is understandable both in what it does and why -it does it, and how the change will impact the users of this project, for this -reason, it is essential that your PR's description explains the nature of the -PR, what the PR intends to do, why this is desirable, and how this will affect -the users of this project. - -Please note that while we would like all PRs to follow the guidelines given -here, we will not reject a PR just because it does not. - -Maintenance Guidelines ----------------------- - -This section contains guidelines for maintaining RDFLib. RDFLib maintainers -should try to follow these. These guidelines also serve as an indication to -RDFLib users what they can expect. 
- -Breaking changes -~~~~~~~~~~~~~~~~ - -Breaking changes to RDFLib's public API should be made incrementally, with small -pull requests to the main branch that change as few things as possible. - -Breaking changes should be discussed first in an issue before work is started, -as it is possible that the change is not necessary or that there is a better way -to achieve the same goal, in which case the work on the PR would have been -wasted. This will however not be strictly enforced, and no PR will be rejected -solely on the basis that it was not discussed upfront. - -RDFLib follows `semantic versioning `_ and `trunk-based development -`_, so if any breaking changes were -introduced into the main branch since the last release, then the next release -will be a major release with an incremented major version. - -Releases of RDFLib will not as a rule be conditioned on specific features, so -there may be new major releases that contain very few breaking changes, and -there could be no minor or patch releases between two major releases. - -.. _breaking_changes_rationale: - -Rationale -^^^^^^^^^ - -RDFLib has been around for more than a decade, and in this time both Python and -RDF have evolved, and RDFLib's API also has to evolve to keep up with these -changes and to make it easier for users to use. This will inevitably require -breaking changes. - -There are more or less two ways to introduce breaking changes to RDFLib's public -API: - -- Revolutionary: Create a new API from scratch and reimplement it, and when - ready, release a new version of RDFLib with the new API. -- Evolutionary: Incrementally improve the existing API with small changes and - release any breaking changes that were made at regular intervals. - -While the revolutionary approach seems appealing, it is also risky and -time-consuming. 
-
-The evolutionary approach puts a lot of strain on the users of RDFLib as they
-have to adapt to breaking changes more often, but the shortcomings of the RDFLib
-public API also put a lot of strain on the users of RDFLib. On the other hand, a
-major advantage of the evolutionary approach is that it is simple and achievable
-from a maintenance and contributor perspective.
-
-Deprecating functionality
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
-To whatever extent possible, classes, functions, variables, or parameters that
-will be removed should be marked for deprecation in documentation, and if
-possible, should be changed to raise deprecation warnings if used.
-
-There is however no hard requirement that something may only be removed after a
-deprecation notice has been added, or only after a release was made with a
-deprecation notice.
-
-Consequently, functionality may be removed without it ever being marked as
-deprecated.
-
-.. _deprecation_rationale:
-
-Rationale
-^^^^^^^^^
-
-Current resource limitations and the backlog of issues make it impractical to
-first release or incorporate deprecation notices before making quality of life
-changes.
-
-RDFLib uses semantic versioning and provides type hints, and these are the
-primary mechanisms for signalling breaking changes to our users.
-
-.. _tests:
-
-Tests
------
-Any new functionality being added to RDFLib *must* have unit tests and
-should have doc tests supplied.
-
-Typically, you should add your functionality and new tests to a branch of
-RDFLib and run all tests locally and see them pass. There are currently
-close to 4,000 tests, with some expected failures and skipped tests.
-We won't merge pull requests unless the test suite completes successfully.
-
-Tests that you add should show how your new feature or bug fix is doing what
-you say it is doing: if you remove your enhancement, your new tests should fail!
-
-Finally, please consider adding simple and more complex tests.
It's good to see -the basic functionality of your feature tests and then also any tricky bits or -edge cases. - -Testing framework -~~~~~~~~~~~~~~~~~ -RDFLib uses the `pytest `_ testing framework. - -Running tests -~~~~~~~~~~~~~ - -To run RDFLib's test suite with `pytest `_: - -.. code-block:: console - - $ poetry install - $ poetry run pytest - -Specific tests can be run by file name. For example: - -.. code-block:: console - - $ poetry run pytest test/test_graph/test_graph.py - -For more extensive tests, including tests for the `berkleydb -`_ -backend, install extra requirements before -executing the tests. - -.. code-block:: console - - $ poetry install --all-extras - $ poetry run pytest - -Writing tests -~~~~~~~~~~~~~ - -New tests should be written for `pytest `_ -instead of for python's built-in `unittest` module as pytest provides advanced -features such as parameterization and more flexibility in writing expected -failure tests than `unittest`. - -A primer on how to write tests for pytest can be found `here -`_. - -The existing tests that use `unittest` work well with pytest, but they should -ideally be updated to the pytest test-style when they are touched. - -Test should go into the ``test/`` directory, either into an existing test file -with a name that is applicable to the test being written, or into a new test -file with a name that is descriptive of the tests placed in it. Test files -should be named ``test_*.py`` so that `pytest can discover them -`_. - -Running static checks ---------------------- - -Check formatting with `black `_, making sure you use -our black.toml config file: - -.. code-block:: bash - - poetry run black . - -Check style and conventions with `ruff `_: - -.. code-block:: bash - - poetry run ruff check - -Any issues that are found can potentially be fixed automatically using: - -.. code-block:: bash - - poetry run ruff check --fix - -Check types with `mypy `_: - -.. 
code-block:: bash
-
-    poetry run mypy --show-error-context --show-error-codes
-
-pre-commit and pre-commit ci
-----------------------------
-
-We have `pre-commit `_ configured with `black
-`_ for formatting code.
-
-Some useful commands for using pre-commit:
-
-.. code-block:: bash
-
-    # Install pre-commit.
-    pip install --user --upgrade pre-commit
-
-    # Install pre-commit hooks, this will run pre-commit
-    # every time you make a git commit.
-    pre-commit install
-
-    # Run pre-commit on changed files.
-    pre-commit run
-
-    # Run pre-commit on all files.
-    pre-commit run --all-files
-
-There are also two tox environments for pre-commit:
-
-.. code-block:: bash
-
-    # run pre-commit on changed files.
-    tox -e precommit
-
-    # run pre-commit on all files.
-    tox -e precommitall
-
-
-There is no hard requirement for pull requests to be processed with pre-commit (or the underlying processors), however doing this makes for a less noisy codebase with cleaner history.
-
-We have enabled `https://pre-commit.ci/ `_ and this can
-be used to automatically fix pull requests by commenting ``pre-commit.ci
-autofix`` on a pull request.
-
-Using tox
----------------------
-
-RDFLib has a `tox `_ config file that
-makes it easier to run validation on all supported Python versions.
-
-.. code-block:: bash
-
-    # Install tox.
-    pip install tox
-
-    # List the tox environments that run by default.
-    tox -l
-
-    # Run the default environments.
-    tox
-
-    # List all tox environments, including ones that don't run by default.
-    tox -a
-
-    # Run a specific environment.
-    tox -e py38  # default environment with py38
-    tox -e py39-extra  # extra tests with py39
-
-    # Override the test command.
-    # the below command will run `pytest test/test_translate_algebra.py`
-    # instead of the default pytest command.
-    tox -e py38,py39 -- pytest test/test_translate_algebra.py
-
-
-``go-task`` and ``Taskfile.yml``
---------------------------------
-
-A ``Taskfile.yml`` is provided for `go-task `_ with
-various commands that facilitate development.
-
-Instructions for installing go-task can be seen in the `go-task installation
-guide `_.
-
-Some useful commands for working with the taskfile are given below:
-
-.. code-block:: bash
-
-    # List available tasks.
-    task -l
-
-    # Configure the environment for development
-    task configure
-
-    # Run basic validation
-    task validate
-
-    # Build docs
-    task docs
-
-    # Run live-preview on the docs
-    task docs:live-server
-
-    # Run the py310 tox environment
-    task tox -- -e py310
-
-The `Taskfile usage documentation `_ provides
-more information on how to work with taskfiles.
-
-Development container
----------------------
-
-To simplify the process of getting a working development environment for
-rdflib, we provide a `Development Container
-`_ (*devcontainer*) that is
-configured in `Docker Compose `_. This
-container can be used directly to run various commands, or it can be used with
-`editors that support Development Containers
-`_.
-
-.. important::
-    The devcontainer is intended to run with a
-    `rootless docker `_
-    daemon so it can edit files owned by the invoking user without
-    an involved configuration process.
-
-    Using a rootless docker daemon also has general security benefits.
-
-To use the development container directly:
-
-.. code-block:: bash
-
-    # Build the devcontainer docker image.
-    docker-compose build
-
-    # Configure the system for development.
-    docker-compose run --rm run task configure
-
-    # Run the validate task inside the devtools container.
-    docker-compose run --rm run task validate
-
-    # Run extensive tests inside the devtools container.
-    docker-compose run --rm run task EXTENSIVE=true test
-
-    # To get a shell into the devcontainer docker image.
-    docker-compose run --rm run bash
-
-The devcontainer also works with `Podman Compose
-`_.
-
-Details on how to use the development container with `VSCode
-`_ can be found in the `Developing inside a
-Container `_ page. With
-the VSCode `development container CLI
-`_ installed the
-following command can be used to open the repository inside the development
-container:
-
-.. code-block:: bash
-
-    # Inside the repository base directory
-    cd ./rdflib/
-
-    # Build the development container.
-    devcontainer build .
-
-    # Open the code inside the development container.
-    devcontainer open .
-
-Writing documentation
----------------------
-
-We use Sphinx for generating HTML docs, see :ref:`docs`.
-
-Continuous Integration
-----------------------
-
-We use GitHub Actions for CI, see:
-
-    https://github.com/RDFLib/rdflib/actions
-
-If you make a pull request to RDFLib on GitHub, GitHub Actions will
-automatically test your code and we will only merge code passing all tests.
-
-Please do *not* commit tests you know will fail, even if you're just pointing out a bug. If you commit such tests,
-flag them as expecting to fail.
-
-Compatibility
--------------
-
-RDFLib 8.x is likely to support only the Python versions in bugfix status at the time of its release, so perhaps 3.12+.
-
-RDFLib 7.0.0 and later releases only support Python 3.8.1 and newer.
-
-RDFLib 6.0.0 and later releases only support Python 3.7 and newer.
-
-RDFLib 5.0.0 maintained compatibility with Python versions 2.7, 3.4, 3.5, 3.6, 3.7.
-
-Releasing
----------
-
-These are the major steps for releasing new versions of RDFLib:
-
-#. Create a pre-release PR
-
-   * that updates all the version numbers
-   * merge it with all tests passing
-
-#. Do the PyPI release
-#. Do the GitHub release
-#. Create a post-release PR
-
-   * that updates all version numbers to next (alpha) release
-   * merge it with all tests passing
-
-#. Let the world know
-
-
-1. 
Create a pre-release PR -~~~~~~~~~~~~~~~~~~~~~~~~~~ - -Create a release-preparation pull request with the following changes: - -#. In ``pyproject.toml``, update the version number -#. In ``README.md``, update the *Versions & Releases* section -#. In ``rdflib/__init__.py``, update the ``__date__`` value -#. In ``docs/conf.py``, update copyright year -#. In ``CITATION.cff``, update the version and date -#. In ``LICENSE``, update the copyright year -#. In ``CHANGELOG.md``, write an entry for this release - #. You can use the tool ``admin/get_merged_prs.py`` to assist with compiling a log of PRs and commits since last release - -2. Do the PyPI release -~~~~~~~~~~~~~~~~~~~~~~ - -Once the pre-release PR is merged, switch to the main branch, build the release and upload it to PyPI: - -.. code-block:: bash - - # Clean up any previous builds - rm -vf dist/* - - # Build artifacts - poetry build - - # Verify package metadata - bsdtar -xvf dist/rdflib-*.whl -O '*/METADATA' | view - - bsdtar -xvf dist/rdflib-*.tar.gz -O '*/PKG-INFO' | view - - - # Check that the built wheel and sdist works correctly: - ## Ensure pipx is installed but not within RDFLib's environment - pipx run --no-cache --spec "$(readlink -f dist/rdflib*.whl)" rdfpipe --version - pipx run --no-cache --spec "$(readlink -f dist/rdflib*.whl)" rdfpipe https://github.com/RDFLib/rdflib/raw/main/test/data/defined_namespaces/rdfs.ttl - pipx run --no-cache --spec "$(readlink -f dist/rdflib*.tar.gz)" rdfpipe --version - pipx run --no-cache --spec "$(readlink -f dist/rdflib*.tar.gz)" rdfpipe https://github.com/RDFLib/rdflib/raw/main/test/data/defined_namespaces/rdfs.ttl - - # Dry run publishing - poetry publish --repository=testpypi --dry-run - poetry publish --dry-run - - # Publish to TestPyPI - ## ensure you are authed as per https://pypi.org/help/#apitoken and https://github.com/python-poetry/poetry/issues/6320 - poetry publish --repository=testpypi - - # Publish to PyPI - poetry publish - ## poetry publish -u 
__token__ -p pypi-
-
-
-3. Do the GitHub release
-~~~~~~~~~~~~~~~~~~~~~~~~
-
-Once the PyPI release is done, tag the main branch with the version number of the release. For a release of version
-6.3.1 the tag should be ``6.3.1`` (without a "v" prefix):
-
-.. code-block:: bash
-
-    git tag 6.3.1
-
-
-Push this tag to GitHub:
-
-.. code-block:: bash
-
-    git push --tags
-
-
-Make a release from this tag at https://github.com/RDFLib/rdflib/releases/new
-
-The release title should be "{DATE} RELEASE {VERSION}". See previous releases at https://github.com/RDFLib/rdflib/releases
-
-The release notes should be just the same as the release info in ``CHANGELOG.md``, as authored in the first major step in this release process.
-
-The resulting release will be available at https://github.com/RDFLib/rdflib/releases/tag/6.3.1
-
-4. Create a post-release PR
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Once this is all done, create another post-release pull request with the following changes:
-
-#. In ``pyproject.toml``, update to the next minor release alpha
-
-   * so a 6.3.1 release would have 6.4.0a0 as the next release alpha
-
-#. In ``docker/latest/requirements.in`` set the version to the just released version
-#. Use ``task docker:prepare`` to update ``docker/latest/requirements.txt``
-
-5. Port changes to the next major working branch
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-If maintaining multiple long-lived version branches, ensure changes from this release are ported to the next major working branch.
-
-For instance, releasing a ``7.x`` would require merging in changes to ``main`` for version ``8.x``.
-
-This ensures general fixes and enhancements are ported over and maintained in the next major working branch.
-
-6. 
Let the world know
-~~~~~~~~~~~~~~~~~~~~~
-
-Announce the release at the following locations:
-
-* RDFLib mailing list
-* RDFLib Gitter / matrix.org chat room
-* Twitter: Just make a tweet from your own account linking to the latest release
-* related mailing lists
-  * Jena: users@jena.apache.org
-  * W3C (currently RDF-Star WG): public-rdf-star@w3.org
diff --git a/docs/docs.md b/docs/docs.md
new file mode 100644
index 0000000000..4ebe2e3792
--- /dev/null
+++ b/docs/docs.md
@@ -0,0 +1,47 @@
+# Writing RDFLib Documentation
+
+These docs are generated with [Material for MkDocs](https://squidfunk.github.io/mkdocs-material).
+
+- When writing doc-strings use Markdown and Google style.
+- API Docs are automatically generated with [`mkdocstrings`](https://mkdocstrings.github.io).
+- See the [supported admonitions here](https://squidfunk.github.io/mkdocs-material/reference/admonitions/#supported-types).
+
+## Building
+
+To build the documentation you can use `mkdocs` from within the poetry environment. To do this, run the following commands:
+
+```bash
+# Install poetry venv
+poetry install
+
+# Build the docs
+poetry run mkdocs build
+```
+
+Built HTML docs will be generated in `site/` and API documentation, generated as Markdown from doc-strings, will be placed in `docs/apidocs/`.
+
+There is also a [tox](https://tox.wiki/en/latest/) environment for building documentation:
+
+```bash
+tox -e docs
+```
+
+You can check the built documentation with:
+
+```bash
+npx -p live-server live-server site/
+```
+
+## Development
+
+Run the development server with auto-reload on change to code:
+
+```bash
+poetry run mkdocs serve
+```
+
+## Tables
+
+The tables in `plugin_*.rst` were generated with `plugintable.py`
diff --git a/docs/docs.rst b/docs/docs.rst
deleted file mode 100644
index 5ff9177557..0000000000
--- a/docs/docs.rst
+++ /dev/null
@@ -1,55 +0,0 @@
-.. 
_docs:
-
-================================
-Writing RDFLib Documentation
-================================
-
-
-These docs are generated with Sphinx.
-
-Sphinx makes it very easy to pull in doc-strings from modules,
-classes, methods, etc. When writing doc-strings, special reST fields
-can be used to annotate parameters, return-types, etc. This makes for
-pretty API docs. See `here `_
-for the Sphinx documentation about these fields.
-
-Building
---------
-
-To build the documentation you can use Sphinx from within the poetry environment. To do this, run the following commands:
-
-.. code-block:: bash
-
-    # Install poetry venv
-    poetry install
-
-    # Build the sphinx docs
-    poetry run sphinx-build -b html -d docs/_build/doctrees docs docs/_build/html
-
-Docs will be generated in :file:`docs/_build/html` and API documentation,
-generated from doc-strings, will be placed in :file:`docs/apidocs/`.
-
-There is also a `tox `_ environment for building
-documentation:
-
-.. code-block:: bash
-
-    tox -e docs
-
-API Docs
---------
-
-API Docs are automatically generated with ``sphinx-apidoc``:
-
-.. code-block:: bash
-
-    poetry run sphinx-apidoc -f -d 10 -o docs/apidocs/ rdflib examples
-
-Note that ``rdflib.rst`` was manually tweaked so as to not include all
-imports in ``rdflib/__init__.py``. 
- -Tables ------- - -The tables in ``plugin_*.rst`` were generated with ``plugintable.py`` diff --git a/docs/gen_ref_pages.py b/docs/gen_ref_pages.py new file mode 100644 index 0000000000..0abb530d0a --- /dev/null +++ b/docs/gen_ref_pages.py @@ -0,0 +1,62 @@ +"""Generate the code reference pages.""" + +import importlib +import pkgutil +from pathlib import Path + +import mkdocs_gen_files + + +def generate_module_docs(module_path, output_path, nav, indent=0): + """Generate documentation for a module and its submodules.""" + try: + module = importlib.import_module(module_path) + doc_path = Path(output_path) + # Collect submodule information for parent modules + submodules = [] + if hasattr(module, "__path__"): + for _, submodule_name, is_pkg in pkgutil.iter_modules(module.__path__): + submodules.append((submodule_name, is_pkg)) + + # Create a .md file for the current module + if not module_path == "rdflib": + with mkdocs_gen_files.open(doc_path, "w") as fd: + fd.write(f"::: {module_path}\n\n") + # namespace module page gets too big, so we disable source code display + if module_path.startswith("rdflib.namespace"): + fd.write(" options:\n") + fd.write(" show_source: false\n") + fd.write(" show_if_no_docstring: false\n\n") + + mkdocs_gen_files.set_edit_path( + doc_path, Path(f"../{module_path.replace('.', '/')}.py") + ) + # Add to navigation - convert path to tuple of parts for nav + # parts = tuple(doc_path.with_suffix("").parts) + # nav[parts] = doc_path.as_posix() + # Process submodules + if hasattr(module, "__path__"): + for _, submodule_name, is_pkg in pkgutil.iter_modules(module.__path__): + full_submodule_path = f"{module_path}.{submodule_name}" + # Create path for submodule documentation + generate_module_docs( + full_submodule_path, + Path(f"apidocs/{full_submodule_path}.md"), + nav, + indent + 4, + ) + except (ImportError, AttributeError) as e: + print(f"Error processing {module_path}: {e}") + + +# Creating navigation structure requires mkdocs-literate-nav +# 
nav = mkdocs_gen_files.Nav()
+nav = None
+
+# Generate all docs
+generate_module_docs("rdflib", Path("apidocs/_index.md"), nav)
+generate_module_docs("examples", Path("apidocs/examples.md"), nav)
+
+# # Write the navigation file for the literate-nav plugin
+# with mkdocs_gen_files.open("SUMMARY.md", "w") as nav_file:
+#     nav_file.writelines(nav.build_literate_nav())
diff --git a/docs/gettingstarted.md b/docs/gettingstarted.md
new file mode 100644
index 0000000000..1e8cfd4171
--- /dev/null
+++ b/docs/gettingstarted.md
@@ -0,0 +1,144 @@
+# Getting started with RDFLib
+
+## Installation
+
+RDFLib is open source and is maintained in a [GitHub](https://github.com/RDFLib/rdflib/) repository. RDFLib releases, current and previous, are listed on [PyPI](https://pypi.python.org/pypi/rdflib/).
+
+The best way to install RDFLib is to use `pip` (sudo as required):
+
+```bash
+pip install rdflib
+```
+
+If you want the latest code to run, clone the `main` branch of the GitHub repo and use that or you can `pip install` directly from GitHub:
+
+```bash
+pip install git+https://github.com/RDFLib/rdflib.git@main#egg=rdflib
+```
+
+## Support
+
+Usage support is available via questions tagged with `[rdflib]` on [StackOverflow](https://stackoverflow.com/questions/tagged/rdflib) and development support, notifications and detailed discussion through the rdflib-dev group (mailing list): [http://groups.google.com/group/rdflib-dev](http://groups.google.com/group/rdflib-dev)
+
+If you notice a bug or want to request an enhancement, please do so via our Issue Tracker on GitHub: [http://github.com/RDFLib/rdflib/issues](http://github.com/RDFLib/rdflib/issues)
+
+## How it all works
+
+*The package uses various Python idioms that offer an appropriate way to introduce RDF to a Python programmer who hasn't worked with RDF before.*
+
+The primary interface that RDFLib exposes for working with RDF is a [`Graph`][rdflib.graph.Graph]. 
+ +RDFLib graphs are un-sorted containers; they have ordinary Python `set` operations (e.g. [`add()`][rdflib.graph.Graph.add] to add a triple) plus methods that search triples and return them in arbitrary order. + +RDFLib graphs also redefine certain built-in Python methods in order to behave in a predictable way. They do this by [emulating container types](https://docs.python.org/3.8/reference/datamodel.html#emulating-container-types) and are best thought of as a set of 3-item tuples ("triples", in RDF-speak): + +```python +[ + (subject0, predicate0, object0), + (subject1, predicate1, object1), + # ... + (subjectN, predicateN, objectN), +] +``` + +## A tiny example + +```python +from rdflib import Graph + +# Create a Graph +g = Graph() + +# Parse in an RDF file hosted on the Internet +g.parse("http://www.w3.org/People/Berners-Lee/card") + +# Loop through each triple in the graph (subj, pred, obj) +for subj, pred, obj in g: + # Check if there is at least one triple in the Graph + if (subj, pred, obj) not in g: + raise Exception("It better be!") + +# Print the number of "triples" in the Graph +print(f"Graph g has {len(g)} statements.") +# Prints: Graph g has 86 statements. + +# Print out the entire Graph in the RDF Turtle format +print(g.serialize(format="turtle")) +``` + +Here a [`Graph`][rdflib.graph.Graph] is created and then an RDF file online, Tim Berners-Lee's social network details, is parsed into that graph. The `print()` statement uses the `len()` function to count the number of triples in the graph. + +## A more extensive example + +```python +from rdflib import Graph, Literal, RDF, URIRef +# rdflib knows about quite a few popular namespaces, like W3C ontologies, schema.org etc. +from rdflib.namespace import FOAF , XSD + +# Create a Graph +g = Graph() + +# Create an RDF URI node to use as the subject for multiple triples +donna = URIRef("http://example.org/donna") + +# Add triples using store's add() method. 
+g.add((donna, RDF.type, FOAF.Person))
+g.add((donna, FOAF.nick, Literal("donna", lang="en")))
+g.add((donna, FOAF.name, Literal("Donna Fales")))
+g.add((donna, FOAF.mbox, URIRef("mailto:donna@example.org")))
+
+# Add another person
+ed = URIRef("http://example.org/edward")
+
+# Add triples using store's add() method.
+g.add((ed, RDF.type, FOAF.Person))
+g.add((ed, FOAF.nick, Literal("ed", datatype=XSD.string)))
+g.add((ed, FOAF.name, Literal("Edward Scissorhands")))
+g.add((ed, FOAF.mbox, Literal("e.scissorhands@example.org", datatype=XSD.anyURI)))
+
+# Iterate over triples in store and print them out.
+print("--- printing raw triples ---")
+for s, p, o in g:
+    print((s, p, o))
+
+# For each foaf:Person in the store, print out their mbox property's value.
+print("--- printing mboxes ---")
+for person in g.subjects(RDF.type, FOAF.Person):
+    for mbox in g.objects(person, FOAF.mbox):
+        print(mbox)
+
+# Bind the FOAF namespace to a prefix for more readable output
+g.bind("foaf", FOAF)
+
+# print all the data in the Notation3 format
+print("--- printing mboxes ---")
+print(g.serialize(format='n3'))
+```
+
+## A SPARQL query example
+
+```python
+from rdflib import Graph
+
+# Create a Graph, parse in Internet data
+g = Graph().parse("http://www.w3.org/People/Berners-Lee/card")
+
+# Query the data in g using SPARQL
+# This query returns the 'name' of all `foaf:Person` instances
+q = """
+    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
+
+    SELECT ?name
+    WHERE {
+        ?p rdf:type foaf:Person .
+
+        ?p foaf:name ?name .
+    }
+"""
+
+# Apply the query to the graph and iterate through results
+for r in g.query(q):
+    print(r["name"])
+
+# prints: Timothy Berners-Lee
+```
diff --git a/docs/gettingstarted.rst b/docs/gettingstarted.rst
deleted file mode 100644
index b3ee9572fc..0000000000
--- a/docs/gettingstarted.rst
+++ /dev/null
@@ -1,178 +0,0 @@
-.. 
_gettingstarted: - -=============================== -Getting started with RDFLib -=============================== - -Installation -============ - -RDFLib is open source and is maintained in a -`GitHub `_ repository. RDFLib releases, current and previous, -are listed on `PyPi `_ - -The best way to install RDFLib is to use ``pip`` (sudo as required): - -.. code-block :: bash - - $ pip install rdflib - -If you want the latest code to run, clone the ``main`` branch of the GitHub repo and use that or you can ``pip install`` -directly from GitHub: - -.. code-block :: bash - - $ pip install git+https://github.com/RDFLib/rdflib.git@main#egg=rdflib - - -Support -======= -Usage support is available via questions tagged with ``[rdflib]`` on `StackOverflow `__ -and development support, notifications and detailed discussion through the rdflib-dev group (mailing list): - - http://groups.google.com/group/rdflib-dev - -If you notice an bug or want to request an enhancement, please do so via our Issue Tracker in Github: - - ``_ - -How it all works -================ -*The package uses various Python idioms -that offer an appropriate way to introduce RDF to a Python programmer -who hasn't worked with RDF before.* - -The primary interface that RDFLib exposes for working with RDF is a -:class:`~rdflib.graph.Graph`. - -RDFLib graphs are un-sorted containers; they have ordinary Python ``set`` -operations (e.g. :meth:`~rdflib.Graph.add` to add a triple) plus -methods that search triples and return them in arbitrary order. - -RDFLib graphs also redefine certain built-in Python methods in order -to behave in a predictable way. They do this by `emulating container types -`_ and -are best thought of as a set of 3-item tuples ("triples", in RDF-speak): - -.. code-block:: text - - [ - (subject0, predicate0, object0), - (subject1, predicate1, object1), - ... - (subjectN, predicateN, objectN) - ] - -A tiny example -============== - -.. 
code-block:: python - - from rdflib import Graph - - # Create a Graph - g = Graph() - - # Parse in an RDF file hosted on the Internet - g.parse("http://www.w3.org/People/Berners-Lee/card") - - # Loop through each triple in the graph (subj, pred, obj) - for subj, pred, obj in g: - # Check if there is at least one triple in the Graph - if (subj, pred, obj) not in g: - raise Exception("It better be!") - - # Print the number of "triples" in the Graph - print(f"Graph g has {len(g)} statements.") - # Prints: Graph g has 86 statements. - - # Print out the entire Graph in the RDF Turtle format - print(g.serialize(format="turtle")) - -Here a :class:`~rdflib.graph.Graph` is created and then an RDF file online, Tim Berners-Lee's social network details, is -parsed into that graph. The ``print()`` statement uses the ``len()`` function to count the number of triples in the -graph. - -A more extensive example -======================== - -.. code-block:: python - - from rdflib import Graph, Literal, RDF, URIRef - # rdflib knows about quite a few popular namespaces, like W3C ontologies, schema.org etc. - from rdflib.namespace import FOAF , XSD - - # Create a Graph - g = Graph() - - # Create an RDF URI node to use as the subject for multiple triples - donna = URIRef("http://example.org/donna") - - # Add triples using store's add() method. - g.add((donna, RDF.type, FOAF.Person)) - g.add((donna, FOAF.nick, Literal("donna", lang="en"))) - g.add((donna, FOAF.name, Literal("Donna Fales"))) - g.add((donna, FOAF.mbox, URIRef("mailto:donna@example.org"))) - - # Add another person - ed = URIRef("http://example.org/edward") - - # Add triples using store's add() method. - g.add((ed, RDF.type, FOAF.Person)) - g.add((ed, FOAF.nick, Literal("ed", datatype=XSD.string))) - g.add((ed, FOAF.name, Literal("Edward Scissorhands"))) - g.add((ed, FOAF.mbox, Literal("e.scissorhands@example.org", datatype=XSD.anyURI))) - - # Iterate over triples in store and print them out. 
- print("--- printing raw triples ---") - for s, p, o in g: - print((s, p, o)) - - # For each foaf:Person in the store, print out their mbox property's value. - print("--- printing mboxes ---") - for person in g.subjects(RDF.type, FOAF.Person): - for mbox in g.objects(person, FOAF.mbox): - print(mbox) - - # Bind the FOAF namespace to a prefix for more readable output - g.bind("foaf", FOAF) - - # print all the data in the Notation3 format - print("--- printing mboxes ---") - print(g.serialize(format='n3')) - - -A SPARQL query example -====================== - -.. code-block:: python - - from rdflib import Graph - - # Create a Graph, parse in Internet data - g = Graph().parse("http://www.w3.org/People/Berners-Lee/card") - - # Query the data in g using SPARQL - # This query returns the 'name' of all ``foaf:Person`` instances - q = """ - PREFIX foaf: - - SELECT ?name - WHERE { - ?p rdf:type foaf:Person . - - ?p foaf:name ?name . - } - """ - - # Apply the query to the graph and iterate through results - for r in g.query(q): - print(r["name"]) - - # prints: Timothy Berners-Lee - - - -More examples -============= -There are many more :doc:`examples ` in the :file:`examples` folder in the source distribution. 
diff --git a/docs/includes/abbreviations.md b/docs/includes/abbreviations.md new file mode 100644 index 0000000000..6cf8a7e15b --- /dev/null +++ b/docs/includes/abbreviations.md @@ -0,0 +1,31 @@ +*[HTML]: Hyper Text Markup Language +*[HTTP]: HyperText Transfer Protocol +*[HTTPS]: HyperText Transfer Protocol Secure +*[API]: Application Programming Interface +*[UI]: User Interface +*[CLI]: Command-Line Interface +*[PIP]: Pip Install Packages +*[PyPI]: Python Packaging Index +*[PyPA]: Python Packaging Authority +*[PEP]: Python Enhancement Proposal +*[RDF]: Resource Description Framework +*[N3]: Notation 3, an assertion and logic language which is a superset of RDF +*[TriX]: Triples in XML +*[TriG]: Triples in Graphs +*[RDFa]: Resource Description Framework in Attributes +*[JSON-LD]: JavaScript Object Notation - Linked Data +*[JSON]: JavaScript Object Notation +*[OWL]: Web Ontology Language +*[XML]: Extensible Markup Language +*[SPARQL]: SPARQL Protocol and RDF Query Language +*[URL]: Uniform Resource Locator +*[URI]: Uniform Resource Identifier +*[IRI]: Internationalized Resource Identifier +*[CSV]: Comma-Separated Value +*[TSV]: Tab-Separated Value +*[PSV]: Pipe-Separated Value +*[RegEx]: Regular Expression +*[OBO]: Open Biological and Biomedical Ontology +*[VSCode]: VisualStudio Code +*[PR]: Pull request +*[PRs]: Pull requests diff --git a/docs/index.md b/docs/index.md new file mode 100644 index 0000000000..f5e2c93ea9 --- /dev/null +++ b/docs/index.md @@ -0,0 +1,90 @@ +# RDFLib + +RDFLib is a pure Python package for working with [RDF](http://www.w3.org/RDF/). 
It contains: + +* **Parsers & Serializers** + * for RDF/XML, N3, NTriples, N-Quads, Turtle, TriG, TriX, JSON-LD, HexTuples, RDFa and Microdata + +* **Store implementations** + * memory stores + * persistent, on-disk stores, using databases such as BerkeleyDB + * remote SPARQL endpoints + +* **Graph interface** + * to a single graph + * or to multiple Named Graphs within a dataset + +* **SPARQL 1.1 implementation** + * both Queries and Updates are supported + +!!! warning "Security considerations" + RDFLib is designed to access arbitrary network and file resources; in some + cases these are directly requested resources, in other cases they are + indirectly referenced resources. + + If you are using RDFLib to process untrusted documents or queries you should + take measures to restrict file and network access. + + For information on available security measures, see the RDFLib + [Security Considerations](security_considerations.md) + documentation. + +## Getting started + +If you have never used RDFLib, the following will help get you started: + +* [Getting Started](gettingstarted.md) +* [Introduction to Parsing](intro_to_parsing.md) +* [Introduction to Creating RDF](intro_to_creating_rdf.md) +* [Introduction to Graphs](intro_to_graphs.md) +* [Introduction to SPARQL](intro_to_sparql.md) +* [Utilities](utilities.md) +* [Examples](apidocs/examples.md) + +## In depth + +If you are familiar with RDF and are looking for details on how RDFLib handles it, these are for you: + +* [RDF Terms](rdf_terms.md) +* [Namespaces and Bindings](namespaces_and_bindings.md) +* [Persistence](persistence.md) +* [Merging](merging.md) +* [Changelog](changelog.md) +* [Upgrade 6 to 7](upgrade6to7.md) +* [Upgrade 5 to 6](upgrade5to6.md) +* [Upgrade 4 to 5](upgrade4to5.md) +* [Security Considerations](security_considerations.md) + +## Versioning + +RDFLib follows [Semantic Versioning 2.0.0](https://semver.org/spec/v2.0.0.html), which can be summarized as follows: + +Given a version number
`MAJOR.MINOR.PATCH`, increment the: + +1. `MAJOR` version when you make incompatible API changes +2. `MINOR` version when you add functionality in a backwards-compatible manner +3. `PATCH` version when you make backwards-compatible bug fixes + +## For developers + +* [Developers guide](developers.md) +* [Documentation guide](docs.md) +* [Contributing guide](CONTRIBUTING.md) +* [Code of Conduct](CODE_OF_CONDUCT.md) +* [Persisting N3 Terms](persisting_n3_terms.md) +* [Type Hints](type_hints.md) +* [Decisions](decisions.md) + +## Source Code + +The rdflib source code is hosted on GitHub at [https://github.com/RDFLib/rdflib](https://github.com/RDFLib/rdflib) where you can lodge Issues and create Pull Requests to help improve this community project! + +The RDFlib organisation on GitHub at [https://github.com/RDFLib](https://github.com/RDFLib) maintains this package and a number of other RDF and RDFlib-related packages that you might also find useful. + +## Further help & Contact + +If you would like help with using RDFlib, rather than developing it, please post a question on StackOverflow using the tag `[rdflib]`. A list of existing `[rdflib]` tagged questions can be found [here](https://stackoverflow.com/questions/tagged/rdflib). + +You might also like to join RDFlib's [dev mailing list](https://groups.google.com/group/rdflib-dev) or use RDFLib's [GitHub discussions section](https://github.com/RDFLib/rdflib/discussions). + +The chat is available at [gitter](https://gitter.im/RDFLib/rdflib) or via matrix [#RDFLib_rdflib:gitter.im](https://matrix.to/#/#RDFLib_rdflib:gitter.im). diff --git a/docs/index.rst b/docs/index.rst deleted file mode 100644 index ad6e7c00d2..0000000000 --- a/docs/index.rst +++ /dev/null @@ -1,144 +0,0 @@ -.. rdflib documentation documentation main file - -================ -rdflib |release| -================ - -RDFLib is a pure Python package for working with `RDF `_.
It contains: - -* **Parsers & Serializers** - - * for RDF/XML, N3, NTriples, N-Quads, Turtle, TriX, JSON-LD, HexTuples, RDFa and Microdata - - -* **Store implementations** - - * memory stores - * persistent, on-disk stores, using databases such as BerkeleyDB - * remote SPARQL endpoints - -* **Graph interface** - - * to a single graph - * or to multiple Named Graphs within a dataset - -* **SPARQL 1.1 implementation** - - * both Queries and Updates are supported - -.. caution:: - - RDFLib is designed to access arbitrary network and file resources, in some - cases these are directly requested resources, in other cases they are - indirectly referenced resources. - - If you are using RDFLib to process untrusted documents or queries you should - take measures to restrict file and network access. - - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. - -Getting started ---------------- -If you have never used RDFLib, the following will help get you started: - -.. toctree:: - :maxdepth: 1 - - gettingstarted - intro_to_parsing - intro_to_creating_rdf - intro_to_graphs - intro_to_sparql - utilities - Examples - - -In depth --------- -If you are familiar with RDF and are looking for details on how RDFLib handles it, these are for you: - -.. toctree:: - :maxdepth: 1 - - rdf_terms - namespaces_and_bindings - persistence - merging - changelog - upgrade6to7 - upgrade5to6 - upgrade4to5 - security_considerations - - -Reference ---------- -The nitty-gritty details of everything. - -API reference: - -.. toctree:: - :maxdepth: 1 - - apidocs/modules - -.. toctree:: - :maxdepth: 2 - - plugins - -.. * :ref:`genindex` -.. * :ref:`modindex` - -Versioning ----------- -RDFLib follows `Semantic Versioning 2.0.0 `_, which can be summarized as follows: - - Given a version number ``MAJOR.MINOR.PATCH``, increment the: - - #. ``MAJOR`` version when you make incompatible API changes - #. 
``MINOR`` version when you add functionality in a backwards-compatible - manner - #. ``PATCH`` version when you make backwards-compatible bug fixes - -For developers -------------- -.. toctree:: - :maxdepth: 1 - - developers - CODE_OF_CONDUCT - docs - persisting_n3_terms - type_hints - CONTRIBUTING - decisions/index - -Source Code ----------- -The rdflib source code is hosted on GitHub at ``__ where you can lodge Issues and -create Pull Requests to help improve this community project! - -The RDFlib organisation on GitHub at ``__ maintains this package and a number of other RDF -and RDFlib-related packaged that you might also find useful. - - -.. _further_help_and_contact: - -Further help & Contact ---------------------- - -If you would like help with using RDFlib, rather than developing it, please post -a question on StackOverflow using the tag ``[rdflib]``. A list of existing -``[rdflib]`` tagged questions can be found -`here `_. - -You might also like to join RDFlib's `dev mailing list -`_ or use RDFLib's `GitHub -discussions section `_. - -The chat is available at `gitter `_ or via -matrix `#RDFLib_rdflib:gitter.im -`_. diff --git a/docs/intro_to_creating_rdf.md b/docs/intro_to_creating_rdf.md new file mode 100644 index 0000000000..9d4de9655b --- /dev/null +++ b/docs/intro_to_creating_rdf.md @@ -0,0 +1,167 @@ +# Creating RDF triples + +## Creating Nodes + +RDF data is a graph where the nodes are URI references, Blank Nodes or Literals. In RDFLib, these node types are represented by the classes [`URIRef`][rdflib.term.URIRef], [`BNode`][rdflib.term.BNode], and [`Literal`][rdflib.term.Literal]. `URIRefs` and `BNodes` can both be thought of as resources, such as a person, a company, a website, etc. + +* A `BNode` is a node where the exact URI is not known - usually a node with identity only in relation to other nodes. +* A `URIRef` is a node where the exact URI is known.
In addition to representing subjects and objects in RDF graphs, `URIRef`s are always used to represent properties/predicates +* `Literals` represent object values, such as a name, a date, a number, etc. The most common literal values are XML data types, e.g. string, int... but custom types can be declared too + +Nodes can be created by the constructors of the node classes: + +```python +from rdflib import URIRef, BNode, Literal + +bob = URIRef("http://example.org/people/Bob") +linda = BNode() # a GUID is generated + +name = Literal("Bob") # passing a string +age = Literal(24) # passing a python int +height = Literal(76.5) # passing a python float +``` + +Literals can be created from Python objects; this creates *data-typed literals*. For the details of the mapping see [RDF terms](rdf_terms.md). + +For creating many `URIRefs` in the same `namespace`, i.e. URIs with the same prefix, RDFLib has the [`Namespace`][rdflib.namespace.Namespace] class: + +```python +from rdflib import Namespace + +n = Namespace("http://example.org/people/") + +n.bob # == rdflib.term.URIRef("http://example.org/people/bob") +n.eve # == rdflib.term.URIRef("http://example.org/people/eve") +``` + +This is very useful for schemas where all properties and classes have the same URI prefix.
RDFLib defines Namespaces for some common RDF/OWL schemas, including most W3C ones: + +```python +from rdflib.namespace import CSVW, DC, DCAT, DCTERMS, DOAP, FOAF, ODRL2, ORG, OWL, \ + PROF, PROV, RDF, RDFS, SDO, SH, SKOS, SOSA, SSN, TIME, \ + VOID, XMLNS, XSD + +RDF.type +# == rdflib.term.URIRef("http://www.w3.org/1999/02/22-rdf-syntax-ns#type") + +FOAF.knows +# == rdflib.term.URIRef("http://xmlns.com/foaf/0.1/knows") + +PROF.isProfileOf +# == rdflib.term.URIRef("http://www.w3.org/ns/dx/prof/isProfileOf") + +SOSA.Sensor +# == rdflib.term.URIRef("http://www.w3.org/ns/sosa/Sensor") +``` + + +## Adding Triples to a graph + +We already saw in [intro_to_parsing](intro_to_parsing.md) how triples can be added from files and online locations with the [`parse()`][rdflib.graph.Graph.parse] function. + +Triples can also be added within Python code directly, using the [`add()`][rdflib.graph.Graph.add] function. + +[`add()`][rdflib.graph.Graph.add] takes a 3-tuple (a "triple") of RDFLib nodes. Using the nodes and namespaces we defined previously: + +```python +from rdflib import Graph, URIRef, Literal, BNode +from rdflib.namespace import FOAF, RDF + +g = Graph() +g.bind("foaf", FOAF) + +bob = URIRef("http://example.org/people/Bob") +linda = BNode() # a GUID is generated + +name = Literal("Bob") +age = Literal(24) + +g.add((bob, RDF.type, FOAF.Person)) +g.add((bob, FOAF.name, name)) +g.add((bob, FOAF.age, age)) +g.add((bob, FOAF.knows, linda)) +g.add((linda, RDF.type, FOAF.Person)) +g.add((linda, FOAF.name, Literal("Linda"))) + +print(g.serialize()) +``` + +outputs: + +```turtle +@prefix foaf: <http://xmlns.com/foaf/0.1/> . +@prefix xsd: <http://www.w3.org/2001/XMLSchema#> . + +<http://example.org/people/Bob> a foaf:Person ; + foaf:age 24 ; + foaf:knows [ a foaf:Person ; + foaf:name "Linda" ] ; + foaf:name "Bob" . +``` + +For some properties, only one value per resource makes sense (i.e. they are *functional properties*, or have a max-cardinality of 1).
The [`set()`][rdflib.graph.Graph.set] method is useful for this: + +```python +from rdflib import Graph, URIRef, Literal +from rdflib.namespace import FOAF + +g = Graph() +bob = URIRef("http://example.org/people/Bob") + +g.add((bob, FOAF.age, Literal(42))) +print(f"Bob is {g.value(bob, FOAF.age)}") +# prints: Bob is 42 + +g.set((bob, FOAF.age, Literal(43))) # replaces 42 set above +print(f"Bob is now {g.value(bob, FOAF.age)}") +# prints: Bob is now 43 +``` + + +[`value()`][rdflib.graph.Graph.value] is the matching query method. It will return a single value for a property, optionally raising an exception if there are more. + +You can also add triples by combining entire graphs, see [set operations on graphs](intro_to_graphs.md). + +## Removing Triples + +Similarly, triples can be removed with a call to [`remove()`][rdflib.graph.Graph.remove]. + +When removing, it is possible to leave parts of the triple unspecified (i.e. passing `None`); this will remove all matching triples: + +```python +g.remove((bob, None, None)) # remove all triples about bob +``` + + +## An example + +LiveJournal produces FOAF data for their users, but they seem to use `foaf:member_name` for a person's full name, even though `foaf:member_name` isn't in FOAF's namespace and perhaps they should have used `foaf:name`. + +To retrieve some LiveJournal data, add a `foaf:name` for every `foaf:member_name`, and then remove the `foaf:member_name` values so the data actually aligns with other FOAF data, we could do this: + +```python +from rdflib import Graph +from rdflib.namespace import FOAF + +g = Graph() +# get the data +g.parse("http://danbri.livejournal.com/data/foaf") + +# for every foaf:member_name, add foaf:name and remove foaf:member_name +for s, p, o in g.triples((None, FOAF['member_name'], None)): + g.add((s, FOAF['name'], o)) + g.remove((s, FOAF['member_name'], o)) +``` + +!!!
info "Foaf member name" + Since rdflib 5.0.0, using `foaf:member_name` is somewhat prevented in RDFlib since FOAF is declared as a [`ClosedNamespace`][rdflib.namespace.ClosedNamespace] class instance that has a closed set of members and `foaf:member_name` isn't one of them! If LiveJournal had used RDFlib 5.0.0, an error would have been raised for `foaf:member_name` when the triple was created. + + +## Creating Containers & Collections + +There are two convenience classes for RDF Containers & Collections which you can use instead of declaring each triple of a Containers or a Collections individually: + +* [`Container`][rdflib.container.Container] (also `Bag`, `Seq` & `Alt`) and +* [`Collection`][rdflib.collection.Collection] + +See their documentation for how. diff --git a/docs/intro_to_creating_rdf.rst b/docs/intro_to_creating_rdf.rst deleted file mode 100644 index 9409dfbe8a..0000000000 --- a/docs/intro_to_creating_rdf.rst +++ /dev/null @@ -1,201 +0,0 @@ -.. _intro_to_creating_rdf: - -==================== -Creating RDF triples -==================== - -Creating Nodes --------------- - -RDF data is a graph where the nodes are URI references, Blank Nodes or Literals. In RDFLib, these node types are -represented by the classes :class:`~rdflib.term.URIRef`, :class:`~rdflib.term.BNode`, and :class:`~rdflib.term.Literal`. -``URIRefs`` and ``BNodes`` can both be thought of as resources, such a person, a company, a website, etc. - -* A ``BNode`` is a node where the exact URI is not known - usually a node with identity only in relation to other nodes. -* A ``URIRef`` is a node where the exact URI is known. In addition to representing some subjects and predicates in RDF graphs, ``URIRef``\s are always used to represent properties/predicates -* ``Literals`` represent object values, such as a name, a date, a number, etc. The most common literal values are XML data types, e.g. string, int... 
but custom types can be declared too - -Nodes can be created by the constructors of the node classes: - -.. code-block:: python - - from rdflib import URIRef, BNode, Literal - - bob = URIRef("http://example.org/people/Bob") - linda = BNode() # a GUID is generated - - name = Literal("Bob") # passing a string - age = Literal(24) # passing a python int - height = Literal(76.5) # passing a python float - -Literals can be created from Python objects, this creates ``data-typed literals``. For the details on the mapping see -:ref:`rdflibliterals`. - -For creating many ``URIRefs`` in the same ``namespace``, i.e. URIs with the same prefix, RDFLib has the -:class:`rdflib.namespace.Namespace` class - -:: - - from rdflib import Namespace - - n = Namespace("http://example.org/people/") - - n.bob # == rdflib.term.URIRef("http://example.org/people/bob") - n.eve # == rdflib.term.URIRef("http://example.org/people/eve") - - -This is very useful for schemas where all properties and classes have the same URI prefix. RDFLib defines Namespaces for -some common RDF/OWL schemas, including most W3C ones: - -.. code-block:: python - - from rdflib.namespace import CSVW, DC, DCAT, DCTERMS, DOAP, FOAF, ODRL2, ORG, OWL, \ - PROF, PROV, RDF, RDFS, SDO, SH, SKOS, SOSA, SSN, TIME, \ - VOID, XMLNS, XSD - - RDF.type - # == rdflib.term.URIRef("http://www.w3.org/1999/02/22-rdf-syntax-ns#type") - - FOAF.knows - # == rdflib.term.URIRef("http://xmlns.com/foaf/0.1/knows") - - PROF.isProfileOf - # == rdflib.term.URIRef("http://www.w3.org/ns/dx/prof/isProfileOf") - - SOSA.Sensor - # == rdflib.term.URIRef("http://www.w3.org/ns/sosa/Sensor") - - -Adding Triples to a graph -------------------------- - -We already saw in :doc:`intro_to_parsing`, how triples can be added from files and online locations with with the -:meth:`~rdflib.graph.Graph.parse` function. - -Triples can also be added within Python code directly, using the :meth:`~rdflib.graph.Graph.add` function: - -.. 
automethod:: rdflib.graph.Graph.add - :noindex: - -:meth:`~rdflib.graph.Graph.add` takes a 3-tuple (a "triple") of RDFLib nodes. Using the nodes and -namespaces we defined previously: - -.. code-block:: python - - from rdflib import Graph, URIRef, Literal, BNode - from rdflib.namespace import FOAF, RDF - - g = Graph() - g.bind("foaf", FOAF) - - bob = URIRef("http://example.org/people/Bob") - linda = BNode() # a GUID is generated - - name = Literal("Bob") - age = Literal(24) - - g.add((bob, RDF.type, FOAF.Person)) - g.add((bob, FOAF.name, name)) - g.add((bob, FOAF.age, age)) - g.add((bob, FOAF.knows, linda)) - g.add((linda, RDF.type, FOAF.Person)) - g.add((linda, FOAF.name, Literal("Linda"))) - - print(g.serialize()) - - -outputs: - -.. code-block:: Turtle - - @prefix foaf: . - @prefix xsd: . - - a foaf:Person ; - foaf:age 24 ; - foaf:knows [ a foaf:Person ; - foaf:name "Linda" ] ; - foaf:name "Bob" . - -For some properties, only one value per resource makes sense (i.e they are *functional properties*, or have a -max-cardinality of 1). The :meth:`~rdflib.graph.Graph.set` method is useful for this: - -.. code-block:: python - - from rdflib import Graph, URIRef, Literal - from rdflib.namespace import FOAF - - g = Graph() - bob = URIRef("http://example.org/people/Bob") - - g.add((bob, FOAF.age, Literal(42))) - print(f"Bob is {g.value(bob, FOAF.age)}") - # prints: Bob is 42 - - g.set((bob, FOAF.age, Literal(43))) # replaces 42 set above - print(f"Bob is now {g.value(bob, FOAF.age)}") - # prints: Bob is now 43 - - -:meth:`rdflib.graph.Graph.value` is the matching query method. It will return a single value for a property, optionally -raising an exception if there are more. - -You can also add triples by combining entire graphs, see :ref:`graph-setops`. - - -Removing Triples ----------------- - -Similarly, triples can be removed by a call to :meth:`~rdflib.graph.Graph.remove`: - -.. 
automethod:: rdflib.graph.Graph.remove - :noindex: - -When removing, it is possible to leave parts of the triple unspecified (i.e. passing ``None``), this will remove all -matching triples: - -.. code-block:: python - - g.remove((bob, None, None)) # remove all triples about bob - - -An example ----------- - -LiveJournal produces FOAF data for their users, but they seem to use -``foaf:member_name`` for a person's full name but ``foaf:member_name`` -isn't in FOAF's namespace and perhaps they should have used ``foaf:name`` - -To retrieve some LiveJournal data, add a ``foaf:name`` for every -``foaf:member_name`` and then remove the ``foaf:member_name`` values to -ensure the data actually aligns with other FOAF data, we could do this: - -.. code-block:: python - - from rdflib import Graph - from rdflib.namespace import FOAF - - g = Graph() - # get the data - g.parse("http://danbri.livejournal.com/data/foaf") - - # for every foaf:member_name, add foaf:name and remove foaf:member_name - for s, p, o in g.triples((None, FOAF['member_name'], None)): - g.add((s, FOAF['name'], o)) - g.remove((s, FOAF['member_name'], o)) - -.. note:: Since rdflib 5.0.0, using ``foaf:member_name`` is somewhat prevented in RDFlib since FOAF is declared - as a :meth:`~rdflib.namespace.ClosedNamespace` class instance that has a closed set of members and - ``foaf:member_name`` isn't one of them! If LiveJournal had used RDFlib 5.0.0, an error would have been raised for - ``foaf:member_name`` when the triple was created. - - -Creating Containers & Collections ---------------------------------- -There are two convenience classes for RDF Containers & Collections which you can use instead of declaring each -triple of a Containers or a Collections individually: - - * :meth:`~rdflib.container.Container` (also ``Bag``, ``Seq`` & ``Alt``) and - * :meth:`~rdflib.collection.Collection` - -See their documentation for how. 
diff --git a/docs/intro_to_graphs.md b/docs/intro_to_graphs.md new file mode 100644 index 0000000000..115bb1e654 --- /dev/null +++ b/docs/intro_to_graphs.md @@ -0,0 +1,101 @@ +# Navigating Graphs + +An RDF Graph is a set of RDF triples, and we try to mirror exactly this in RDFLib. The Python [`Graph`][rdflib.graph.Graph] tries to emulate a container type. + +## Graphs as Iterators + +RDFLib graphs override [`__iter__()`][rdflib.graph.Graph.__iter__] in order to support iteration over the contained triples: + +```python +for s, p, o in someGraph: + if (s, p, o) not in someGraph: + raise Exception("Iterator / Container Protocols are Broken!!") +``` + +This loop iterates through all the subjects (s), predicates (p) & objects (o) in `someGraph`. + +## Contains check + +Graphs implement [`__contains__()`][rdflib.graph.Graph.__contains__], so you can check if a triple is in a graph with a `triple in graph` syntax: + +```python +from rdflib import Graph, URIRef +from rdflib.namespace import FOAF, RDF + +graph = Graph() # assume this graph has been loaded with FOAF data + +bob = URIRef("http://example.org/people/bob") +if (bob, RDF.type, FOAF.Person) in graph: + print("This graph knows that Bob is a person!") +``` + +Note that this triple does not have to be completely bound: + +```python +if (bob, None, None) in graph: + print("This graph contains triples about Bob!") +``` + +## Set Operations on RDFLib Graphs + +Graphs override several Python operators: [`__iadd__()`][rdflib.graph.Graph.__iadd__], [`__isub__()`][rdflib.graph.Graph.__isub__], etc. This supports addition, subtraction and other set-operations on Graphs: + +| operation | effect | |-----------|--------| | `G1 + G2` | return new graph with union (triples from both) | | `G1 += G2` | in place union / addition | | `G1 - G2` | return new graph with difference (triples in G1, not in G2) | | `G1 -= G2` | in place difference / subtraction | | `G1 & G2` | intersection (triples in both graphs) | | `G1 ^ G2` | xor (triples in either G1 or G2, but not in both) | + +!!!
warning + Set-operations on graphs assume Blank Nodes are shared between graphs. This may or may not be what you want. See [merging](merging.md) for details. + +## Basic Triple Matching + +Instead of iterating through all triples, RDFLib graphs support basic triple pattern matching with a [`triples()`][rdflib.graph.Graph.triples] function. This function is a generator of triples that match a pattern given by arguments, i.e. arguments restrict the triples that are returned. Terms that are `None` are treated as a wildcard. For example: + +```python +from rdflib import Graph, URIRef +from rdflib.namespace import FOAF, RDF + +g = Graph() +g.parse("some_foaf.ttl") +# find all subjects (s) of type (rdf:type) person (foaf:Person) +for s, p, o in g.triples((None, RDF.type, FOAF.Person)): + print(f"{s} is a person") + +# find all subjects of any type +for s, p, o in g.triples((None, RDF.type, None)): + print(f"{s} is a {o}") + +# create a graph +bobgraph = Graph() +# add all triples with subject 'bob' +bob = URIRef("http://example.org/people/bob") +bobgraph += g.triples((bob, None, None)) +``` + +If you are not interested in whole triples, you can get only the bits you want with the methods [`objects()`][rdflib.graph.Graph.objects], [`subjects()`][rdflib.graph.Graph.subjects], [`predicates()`][rdflib.graph.Graph.predicates], [`predicate_objects()`][rdflib.graph.Graph.predicate_objects], etc. Each takes parameters for the components of the triple to constrain: + +```python +for person in g.subjects(RDF.type, FOAF.Person): + print(f"{person} is a person") +``` + +Finally, for some properties, only one value per resource makes sense (i.e. they are *functional properties*, or have a max-cardinality of 1).
The [`value()`][rdflib.graph.Graph.value] method is useful for this, as it returns just a single node, not a generator: + +```python +# get any name of bob +name = g.value(bob, FOAF.name) +# get the one person that knows bob and raise an exception if more are found +person = g.value(predicate=FOAF.knows, object=bob, any=False) +``` + + +## Graph methods for accessing triples + +Here is a list of all convenience methods for querying Graphs: + +* [`triples()`][rdflib.graph.Graph.triples] +* [`value()`][rdflib.graph.Graph.value] +* [`subjects()`][rdflib.graph.Graph.subjects] +* [`objects()`][rdflib.graph.Graph.objects] +* [`predicates()`][rdflib.graph.Graph.predicates] +* [`subject_objects()`][rdflib.graph.Graph.subject_objects] +* [`subject_predicates()`][rdflib.graph.Graph.subject_predicates] +* [`predicate_objects()`][rdflib.graph.Graph.predicate_objects] diff --git a/docs/intro_to_graphs.rst b/docs/intro_to_graphs.rst deleted file mode 100644 index c061a3c7b2..0000000000 --- a/docs/intro_to_graphs.rst +++ /dev/null @@ -1,131 +0,0 @@ -.. _rdflib_graph: Navigating Graphs - -================= -Navigating Graphs -================= - -An RDF Graph is a set of RDF triples, and we try to mirror exactly this in RDFLib. The Python -:meth:`~rdflib.graph.Graph` tries to emulate a container type. - -Graphs as Iterators -------------------- - -RDFLib graphs override :meth:`~rdflib.graph.Graph.__iter__` in order to support iteration over the contained triples: - -.. code-block:: python - - for s, p, o in someGraph: - if not (s, p, o) in someGraph: - raise Exception("Iterator / Container Protocols are Broken!!") - -This loop iterates through all the subjects(s), predicates (p) & objects (o) in ``someGraph``. - -Contains check --------------- - -Graphs implement :meth:`~rdflib.graph.Graph.__contains__`, so you can check if a triple is in a graph with a -``triple in graph`` syntax: - -.. 
code-block:: python - - from rdflib import URIRef - from rdflib.namespace import RDF - - bob = URIRef("http://example.org/people/bob") - if (bob, RDF.type, FOAF.Person) in graph: - print("This graph knows that Bob is a person!") - -Note that this triple does not have to be completely bound: - -.. code-block:: python - - if (bob, None, None) in graph: - print("This graph contains triples about Bob!") - -.. _graph-setops: - -Set Operations on RDFLib Graphs -------------------------------- - -Graphs override several pythons operators: :meth:`~rdflib.graph.Graph.__iadd__`, :meth:`~rdflib.graph.Graph.__isub__`, -etc. This supports addition, subtraction and other set-operations on Graphs: - -============ ============================================================= -operation effect -============ ============================================================= -``G1 + G2`` return new graph with union (triples on both) -``G1 += G2`` in place union / addition -``G1 - G2`` return new graph with difference (triples in G1, not in G2) -``G1 -= G2`` in place difference / subtraction -``G1 & G2`` intersection (triples in both graphs) -``G1 ^ G2`` xor (triples in either G1 or G2, but not in both) -============ ============================================================= - -.. warning:: Set-operations on graphs assume Blank Nodes are shared between graphs. This may or may not be what you want. See :doc:`merging` for details. - -Basic Triple Matching ---------------------- - -Instead of iterating through all triples, RDFLib graphs support basic triple pattern matching with a -:meth:`~rdflib.graph.Graph.triples` function. This function is a generator of triples that match a pattern given by -arguments, i.e. arguments restrict the triples that are returned. Terms that are :data:`None` are treated as a wildcard. -For example: - -.. 
code-block:: python - - g.parse("some_foaf.ttl") - # find all subjects (s) of type (rdf:type) person (foaf:Person) - for s, p, o in g.triples((None, RDF.type, FOAF.Person)): - print(f"{s} is a person") - - # find all subjects of any type - for s, p, o in g.triples((None, RDF.type, None)): - print(f"{s} is a {o}") - - # create a graph - bobgraph = Graph() - # add all triples with subject 'bob' - bobgraph += g.triples((bob, None, None)) - -If you are not interested in whole triples, you can get only the bits you want with the methods -:meth:`~rdflib.graph.Graph.objects`, :meth:`~rdflib.graph.Graph.subjects`, :meth:`~rdflib.graph.Graph.predicates`, -:meth:`~rdflib.graph.Graph.predicate_objects`, etc. Each take parameters for the components of the triple to constraint: - -.. code-block:: python - - for person in g.subjects(RDF.type, FOAF.Person): - print("{} is a person".format(person)) - -Finally, for some properties, only one value per resource makes sense (i.e they are *functional properties*, or have a -max-cardinality of 1). The :meth:`~rdflib.graph.Graph.value` method is useful for this, as it returns just a single -node, not a generator: - -.. code-block:: python - - # get any name of bob - name = g.value(bob, FOAF.name) - # get the one person that knows bob and raise an exception if more are found - person = g.value(predicate=FOAF.knows, object=bob, any=False) - - -:class:`~rdflib.graph.Graph` methods for accessing triples ------------------------------------------------------------ - -Here is a list of all convenience methods for querying Graphs: - -.. automethod:: rdflib.graph.Graph.triples - :noindex: -.. automethod:: rdflib.graph.Graph.value - :noindex: -.. automethod:: rdflib.graph.Graph.subjects - :noindex: -.. automethod:: rdflib.graph.Graph.objects - :noindex: -.. automethod:: rdflib.graph.Graph.predicates - :noindex: -.. automethod:: rdflib.graph.Graph.subject_objects - :noindex: -.. automethod:: rdflib.graph.Graph.subject_predicates - :noindex: -.. 
automethod:: rdflib.graph.Graph.predicate_objects - :noindex: diff --git a/docs/intro_to_parsing.md b/docs/intro_to_parsing.md new file mode 100644 index 0000000000..92b672da7e --- /dev/null +++ b/docs/intro_to_parsing.md @@ -0,0 +1,134 @@ +# Loading and saving RDF + +## Reading RDF files + +RDF data can be represented using various syntaxes (`turtle`, `rdf/xml`, `n3`, `n-triples`, `trix`, `JSON-LD`, etc.). The simplest format is `ntriples`, which is a triple-per-line format. + +Create the file `demo.nt` in the current directory with these two lines in it: + +```turtle +<http://example.com/drewp> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://xmlns.com/foaf/0.1/Person> . +<http://example.com/drewp> <http://example.com/says> "Hello World" . +``` + +On line 1 this file says "drewp is a FOAF Person". On line 2 it says "drewp says 'Hello World'". + +RDFLib can guess what format the file is by the file ending (".nt" is commonly used for n-triples) so you can just use [`parse()`][rdflib.graph.Graph.parse] to read in the file. If the file had a non-standard RDF file ending, you could set the keyword-parameter `format` to specify either an Internet Media Type or the format name (a [list of available parsers][rdflib.plugins.parsers] is available). + +In an interactive python interpreter, try this: + +```python +from rdflib import Graph + +g = Graph() +g.parse("demo.nt") + +print(len(g)) +# prints: 2 + +import pprint +for stmt in g: + pprint.pprint(stmt) +# prints: +# (rdflib.term.URIRef('http://example.com/drewp'), +# rdflib.term.URIRef('http://example.com/says'), +# rdflib.term.Literal('Hello World')) +# (rdflib.term.URIRef('http://example.com/drewp'), +# rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#type'), +# rdflib.term.URIRef('http://xmlns.com/foaf/0.1/Person')) +``` + +The final lines show how RDFLib represents the two statements in the file: the statements themselves are just length-3 tuples ("triples") and the subjects, predicates, and objects of the triples are all rdflib types.
+ +## Reading remote RDF + +Reading graphs from the Internet is easy: + +```python +from rdflib import Graph + +g = Graph() +g.parse("http://www.w3.org/People/Berners-Lee/card") +print(len(g)) +# prints: 86 +``` + +[`parse()`][rdflib.Graph.parse] can process local files, remote data via a URL, as in this example, or RDF data in a string (using the `data` parameter). + +## Saving RDF + +To store a graph in a file, use the [`serialize()`][rdflib.Graph.serialize] function: + +```python +from rdflib import Graph + +g = Graph() +g.parse("http://www.w3.org/People/Berners-Lee/card") +g.serialize(destination="tbl.ttl") +``` + +This parses data from http://www.w3.org/People/Berners-Lee/card and stores it in a file `tbl.ttl` in this directory using the turtle format, which is the default RDF serialization (as of rdflib 6.0.0). + +To read the same data and to save it as an RDF/XML format string in the variable `v`, do this: + +```python +from rdflib import Graph + +g = Graph() +g.parse("http://www.w3.org/People/Berners-Lee/card") +v = g.serialize(format="xml") +``` + +The following table lists the RDF formats you can serialize data to with rdflib, out of the box, and the `format=KEYWORD` keyword used to reference them within `serialize()`: + +| RDF Format | Keyword | Notes | +|------------|---------|-------| +| Turtle | turtle, ttl or turtle2 | turtle2 is just turtle with more spacing & linebreaks | +| RDF/XML | xml or pretty-xml | Was the default format, rdflib < 6.0.0 | +| JSON-LD | json-ld | There are further options for compact syntax and other JSON-LD variants | +| N-Triples | ntriples, nt or nt11 | nt11 is exactly like nt, only utf8 encoded | +| Notation-3 | n3 | N3 is a superset of Turtle that also caters for rules and a few other things | +| Trig | trig | Turtle-like format for RDF triples + context (RDF quads) and thus multiple graphs | +| Trix | trix | RDF/XML-like format for RDF quads | +| N-Quads | nquads | N-Triples-like format for RDF quads | + +## Working with 
multi-graphs
+
+To read and query multi-graphs, that is RDF data that is context-aware, you need to use rdflib's [`Dataset`][rdflib.Dataset] class. This is an extension to [`Graph`][rdflib.Graph] that knows all about quads (triples + graph IDs).
+
+If you had this multi-graph data file (in the `trig` format, using the new-style `PREFIX` statement rather than the older `@prefix`):
+
+```turtle
+PREFIX eg: <http://example.com/person/>
+PREFIX foaf: <http://xmlns.com/foaf/0.1/>
+
+eg:graph-1 {
+    eg:drewp a foaf:Person .
+    eg:drewp eg:says "Hello World" .
+}
+
+eg:graph-2 {
+    eg:nick a foaf:Person .
+    eg:nick eg:says "Hi World" .
+}
+```
+
+You could parse the file and query it like this:
+
+```python
+from rdflib import Dataset
+from rdflib.namespace import RDF
+
+d = Dataset()
+d.parse("demo.trig")
+
+# the loop variable g (the graph ID) is distinct from the Dataset d
+for s, p, o, g in d.quads((None, RDF.type, None, None)):
+    print(s, g)
+```
+
+This will print out:
+
+```
+http://example.com/person/drewp http://example.com/person/graph-1
+http://example.com/person/nick http://example.com/person/graph-2
+```
diff --git a/docs/intro_to_parsing.rst b/docs/intro_to_parsing.rst
deleted file mode 100644
index 8b011c53f4..0000000000
--- a/docs/intro_to_parsing.rst
+++ /dev/null
@@ -1,158 +0,0 @@
-.. _intro_to_parsing:
-
-======================
-Loading and saving RDF
-======================
-
-Reading RDF files
------------------
-
-RDF data can be represented using various syntaxes (``turtle``, ``rdf/xml``, ``n3``, ``n-triples``,
-``trix``, ``JSON-LD``, etc.). The simplest format is
-``ntriples``, which is a triple-per-line format.
-
-Create the file :file:`demo.nt` in the current directory with these two lines in it:
-
-.. code-block:: Turtle
-
-    <http://example.com/drewp> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://xmlns.com/foaf/0.1/Person> .
-    <http://example.com/drewp> <http://example.com/says> "Hello World" .
-
-On line 1 this file says "drewp is a FOAF Person". On line 2 it says "drewp says 'Hello World'".
-
-RDFLib can guess what format the file is by the file ending (".nt" is commonly used for n-triples) so you can just use
-:meth:`~rdflib.graph.Graph.parse` to read in the file.
If the file had a non-standard RDF file ending, you could set the -keyword-parameter ``format`` to specify either an Internet Media Type or the format name (a :doc:`list of available -parsers ` is available). - -In an interactive python interpreter, try this: - -.. code-block:: python - - from rdflib import Graph - - g = Graph() - g.parse("demo.nt") - - print(len(g)) - # prints: 2 - - import pprint - for stmt in g: - pprint.pprint(stmt) - # prints: - # (rdflib.term.URIRef('http://example.com/drewp'), - # rdflib.term.URIRef('http://example.com/says'), - # rdflib.term.Literal('Hello World')) - # (rdflib.term.URIRef('http://example.com/drewp'), - # rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#type'), - # rdflib.term.URIRef('http://xmlns.com/foaf/0.1/Person')) - -The final lines show how RDFLib represents the two statements in the -file: the statements themselves are just length-3 tuples ("triples") and the -subjects, predicates, and objects of the triples are all rdflib types. - -Reading remote RDF ------------------- - -Reading graphs from the Internet is easy: - -.. code-block:: python - - from rdflib import Graph - - g = Graph() - g.parse("http://www.w3.org/People/Berners-Lee/card") - print(len(g)) - # prints: 86 - -:func:`rdflib.Graph.parse` can process local files, remote data via a URL, as in this example, or RDF data in a string -(using the ``data`` parameter). - - -Saving RDF ----------- - -To store a graph in a file, use the :func:`rdflib.Graph.serialize` function: - -.. code-block:: python - - from rdflib import Graph - - g = Graph() - g.parse("http://www.w3.org/People/Berners-Lee/card") - g.serialize(destination="tbl.ttl") - -This parses data from http://www.w3.org/People/Berners-Lee/card and stores it in a file ``tbl.ttl`` in this directory -using the turtle format, which is the default RDF serialization (as of rdflib 6.0.0). - -To read the same data and to save it as an RDF/XML format string in the variable ``v``, do this: - -.. 
code-block:: python - - from rdflib import Graph - - g = Graph() - g.parse("http://www.w3.org/People/Berners-Lee/card") - v = g.serialize(format="xml") - - -The following table lists the RDF formats you can serialize data to with rdflib, out of the box, and the ``format=KEYWORD`` keyword used to reference them within ``serialize()``: - -.. csv-table:: - :header: "RDF Format", "Keyword", "Notes" - - "Turtle", "turtle, ttl or turtle2", "turtle2 is just turtle with more spacing & linebreaks" - "RDF/XML", "xml or pretty-xml", "Was the default format, rdflib < 6.0.0" - "JSON-LD", "json-ld", "There are further options for compact syntax and other JSON-LD variants" - "N-Triples", "ntriples, nt or nt11", "nt11 is exactly like nt, only utf8 encoded" - "Notation-3","n3", "N3 is a superset of Turtle that also caters for rules and a few other things" - - "Trig", "trig", "Turtle-like format for RDF triples + context (RDF quads) and thus multiple graphs" - "Trix", "trix", "RDF/XML-like format for RDF quads" - "N-Quads", "nquads", "N-Triples-like format for RDF quads" - -Working with multi-graphs -------------------------- - -To read and query multi-graphs, that is RDF data that is context-aware, you need to use rdflib's -:class:`rdflib.Dataset` class. This an extension to :class:`rdflib.Graph` that -know all about quads (triples + graph IDs). - -If you had this multi-graph data file (in the ``trig`` format, using new-style ``PREFIX`` statement (not the older -``@prefix``): - -.. code-block:: Turtle - - PREFIX eg: - PREFIX foaf: - - eg:graph-1 { - eg:drewp a foaf:Person . - eg:drewp eg:says "Hello World" . - } - - eg:graph-2 { - eg:nick a foaf:Person . - eg:nick eg:says "Hi World" . - } - -You could parse the file and query it like this: - -.. code-block:: python - - from rdflib import Dataset - from rdflib.namespace import RDF - - g = Dataset() - g.parse("demo.trig") - - for s, p, o, g in g.quads((None, RDF.type, None, None)): - print(s, g) - -This will print out: - -.. 
code-block::
-
-   http://example.com/person/drewp http://example.com/person/graph-1
-   http://example.com/person/nick http://example.com/person/graph-2
diff --git a/docs/intro_to_sparql.md b/docs/intro_to_sparql.md
new file mode 100644
index 0000000000..f4cdf0ea67
--- /dev/null
+++ b/docs/intro_to_sparql.md
@@ -0,0 +1,159 @@
+# Querying with SPARQL
+
+## Run a Query
+
+RDFLib comes with an implementation of the [SPARQL 1.1 Query](http://www.w3.org/TR/sparql11-query/) and [SPARQL 1.1 Update](http://www.w3.org/TR/sparql11-update/) query languages.
+
+Queries can be evaluated against a graph with the [`query()`][rdflib.graph.Graph.query] method, and updates with [`update()`][rdflib.graph.Graph.update].
+
+The query method returns a [`Result`][rdflib.query.Result] instance. For SELECT queries, iterating over this returns [`ResultRow`][rdflib.query.ResultRow] instances, each containing a set of variable bindings. For `CONSTRUCT`/`DESCRIBE` queries, iterating over the result object gives the triples. For `ASK` queries, iterating yields the single boolean answer; alternatively, the result object can be evaluated in a boolean context (i.e. `bool(result)`).
+
+For example...
+
+```python
+import rdflib
+
+g = rdflib.Graph()
+g.parse("http://danbri.org/foaf.rdf#")
+
+knows_query = """
+PREFIX foaf: <http://xmlns.com/foaf/0.1/>
+SELECT DISTINCT ?aname ?bname
+WHERE {
+    ?a foaf:knows ?b .
+    ?a foaf:name ?aname .
+    ?b foaf:name ?bname .
+}"""
+
+qres = g.query(knows_query)
+for row in qres:
+    print(f"{row.aname} knows {row.bname}")
+```
+
+The results are tuples of values in the same order as your `SELECT` arguments. Alternatively, the values can be accessed by variable name, either as attributes, or as items, e.g. `row.b` and `row["b"]` are equivalent. The above, given the appropriate data, would print something like:
+
+```text
+Timothy Berners-Lee knows Edd Dumbill
+Timothy Berners-Lee knows Jennifer Golbeck
+Timothy Berners-Lee knows Nicholas Gibbins
+...
+```
+
+As an alternative to using `SPARQL`'s `PREFIX`, namespace bindings can be passed in with the `initNs` kwarg, see [namespaces_and_bindings](namespaces_and_bindings.md).
+
+Variables can also be pre-bound, using the `initBindings` kwarg which can pass in a `dict` of initial bindings. This is particularly useful for prepared queries, as described below.
+
+## Update Queries
+
+Update queries are performed just like reading queries but using the [`update()`][rdflib.graph.Graph.update] method. An example:
+
+```python
+from rdflib import Graph
+
+# Create a Graph, add in some test data
+g = Graph()
+g.parse(
+    data="""
+        <x:> a <c:> .
+        <y:> a <c:> .
+        """,
+    format="turtle"
+)
+
+# Select all the things (s) that are of type (rdf:type) c:
+qres = g.query("""SELECT ?s WHERE { ?s a <c:> }""")
+
+for row in qres:
+    print(f"{row.s}")
+# prints:
+# x:
+# y:
+
+# Add in a new triple using SPARQL UPDATE
+g.update("""INSERT DATA { <z:> a <c:> }""")
+
+# Select all the things (s) that are of type (rdf:type) c:
+qres = g.query("""SELECT ?s WHERE { ?s a <c:> }""")
+
+print("After update:")
+for row in qres:
+    print(f"{row.s}")
+# prints:
+# x:
+# y:
+# z:
+
+# Change the type of <y:> from <c:> to <d:>
+g.update("""
+         DELETE { <y:> a <c:> }
+         INSERT { <y:> a <d:> }
+         WHERE { <y:> a <c:> }
+         """)
+print("After second update:")
+qres = g.query("""SELECT ?s ?o WHERE { ?s a ?o }""")
+for row in qres:
+    print(f"{row.s} a {row.o}")
+# prints:
+# x: a c:
+# z: a c:
+# y: a d:
+```
+
+## Querying a Remote Service
+
+The `SERVICE` keyword of SPARQL 1.1 can send a query to a remote SPARQL endpoint.
+
+```python
+import rdflib
+
+g = rdflib.Graph()
+qres = g.query(
+    """
+    SELECT ?s
+    WHERE {
+      SERVICE <https://dbpedia.org/sparql> {
+        ?s a ?o .
+ } + } + LIMIT 3 + """ +) + +for row in qres: + print(row.s) +``` + +This example sends a query to [DBPedia](https://dbpedia.org/)'s SPARQL endpoint service so that it can run the query and then send back the result: + +```text + + + +``` + +## Prepared Queries + +RDFLib lets you *prepare* queries before execution, this saves re-parsing and translating the query into SPARQL Algebra each time. + +The method [`prepareQuery()`][rdflib.plugins.sparql.prepareQuery] takes a query as a string and will return a [`Query`][rdflib.plugins.sparql.sparql.Query] object. This can then be passed to the [`query()`][rdflib.graph.Graph.query] method. + +The `initBindings` kwarg can be used to pass in a `dict` of initial bindings: + +```python +q = prepareQuery( + "SELECT ?s WHERE { ?person foaf:knows ?s .}", + initNs = { "foaf": FOAF } +) + +g = rdflib.Graph() +g.parse("foaf.rdf") + +tim = rdflib.URIRef("http://www.w3.org/People/Berners-Lee/card#i") + +for row in g.query(q, initBindings={'person': tim}): + print(row) +``` + +## Custom Evaluation Functions + +For experts, it is possible to override how bits of SPARQL algebra are evaluated. By using the [setuptools entry-point](http://pythonhosted.org/distribute/setuptools.html#dynamic-discovery-of-services-and-plugins) `rdf.plugins.sparqleval`, or simply adding to an entry to [`CUSTOM_EVALS`][rdflib.plugins.sparql.CUSTOM_EVALS], a custom function can be registered. The function will be called for each algebra component and may raise `NotImplementedError` to indicate that this part should be handled by the default implementation. + +See [`examples/custom_eval.py`][examples.custom_eval] diff --git a/docs/intro_to_sparql.rst b/docs/intro_to_sparql.rst deleted file mode 100644 index f2cbf5a691..0000000000 --- a/docs/intro_to_sparql.rst +++ /dev/null @@ -1,207 +0,0 @@ -.. 
_intro_to_using_sparql: - -==================== -Querying with SPARQL -==================== - - -Run a Query -^^^^^^^^^^^ - -The RDFLib comes with an implementation of the `SPARQL 1.1 Query -`_ and `SPARQL 1.1 Update -`_ query languages. - -Queries can be evaluated against a graph with the -:meth:`rdflib.graph.Graph.query` method, and updates with -:meth:`rdflib.graph.Graph.update`. - -The query method returns a :class:`rdflib.query.Result` instance. For -SELECT queries, iterating over this returns -:class:`rdflib.query.ResultRow` instances, each containing a set of -variable bindings. For ``CONSTRUCT``/``DESCRIBE`` queries, iterating over the -result object gives the triples. For ``ASK`` queries, iterating will yield -the single boolean answer, or evaluating the result object in a -boolean-context (i.e. ``bool(result)``) - -For example... - -.. code-block:: python - - import rdflib - g = rdflib.Graph() - g.parse("http://danbri.org/foaf.rdf#") - - knows_query = """ - SELECT DISTINCT ?aname ?bname - WHERE { - ?a foaf:knows ?b . - ?a foaf:name ?aname . - ?b foaf:name ?bname . - }""" - - qres = g.query(knows_query) - for row in qres: - print(f"{row.aname} knows {row.bname}") - - - -The results are tuples of values in the same order as your ``SELECT`` -arguments. Alternatively, the values can be accessed by variable -name, either as attributes, or as items, e.g. ``row.b`` and ``row["b"]`` are -equivalent. The above, given the appropriate data, would print something like: - -.. code-block:: text - - Timothy Berners-Lee knows Edd Dumbill - Timothy Berners-Lee knows Jennifer Golbeck - Timothy Berners-Lee knows Nicholas Gibbins - ... - -As an alternative to using ``SPARQL``\s ``PREFIX``, namespace -bindings can be passed in with the ``initNs`` kwarg, see -:doc:`namespaces_and_bindings`. - -Variables can also be pre-bound, using the ``initBindings`` kwarg which can -pass in a ``dict`` of initial bindings. 
This is particularly -useful for prepared queries, as described below. - -Update Queries -^^^^^^^^^^^^^^ - -Update queries are performed just like reading queries but using the :meth:`rdflib.graph.Graph.update` method. An -example: - -.. code-block:: python - - from rdflib import Graph - - # Create a Graph, add in some test data - g = Graph() - g.parse( - data=""" - a . - a . - """, - format="turtle" - ) - - # Select all the things (s) that are of type (rdf:type) c: - qres = g.query("""SELECT ?s WHERE { ?s a }""") - - for row in qres: - print(f"{row.s}") - # prints: - # x: - # y: - - # Add in a new triple using SPARQL UPDATE - g.update("""INSERT DATA { a }""") - - # Select all the things (s) that are of type (rdf:type) c: - qres = g.query("""SELECT ?s WHERE { ?s a }""") - - print("After update:") - for row in qres: - print(f"{row.s}") - # prints: - # x: - # y: - # z: - - # Change type of from to - g.update(""" - DELETE { a } - INSERT { a } - WHERE { a } - """) - print("After second update:") - qres = g.query("""SELECT ?s ?o WHERE { ?s a ?o }""") - for row in qres: - print(f"{row.s} a {row.o}") - # prints: - # x: a c: - # z: a c: - # y: a d: - - - -Querying a Remote Service -^^^^^^^^^^^^^^^^^^^^^^^^^ - -The ``SERVICE`` keyword of SPARQL 1.1 can send a query to a remote SPARQL endpoint. - -.. code-block:: python - - import rdflib - - g = rdflib.Graph() - qres = g.query( - """ - SELECT ?s - WHERE { - SERVICE { - ?s a ?o . - } - } - LIMIT 3 - """ - ) - - for row in qres: - print(row.s) - - - -This example sends a query to `DBPedia `_'s SPARQL endpoint service so that it can run the query -and then send back the result: - -.. code-block:: text - - - - - -Prepared Queries -^^^^^^^^^^^^^^^^ - -RDFLib lets you *prepare* queries before execution, this saves -re-parsing and translating the query into SPARQL Algebra each time. 
- -The method :meth:`rdflib.plugins.sparql.prepareQuery` takes a query as -a string and will return a :class:`rdflib.plugins.sparql.sparql.Query` -object. This can then be passed to the -:meth:`rdflib.graph.Graph.query` method. - -The ``initBindings`` kwarg can be used to pass in a ``dict`` of -initial bindings: - -.. code-block:: python - - q = prepareQuery( - "SELECT ?s WHERE { ?person foaf:knows ?s .}", - initNs = { "foaf": FOAF } - ) - - g = rdflib.Graph() - g.parse("foaf.rdf") - - tim = rdflib.URIRef("http://www.w3.org/People/Berners-Lee/card#i") - - for row in g.query(q, initBindings={'person': tim}): - print(row) - - -Custom Evaluation Functions -^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -For experts, it is possible to override how bits of SPARQL algebra are -evaluated. By using the `setuptools entry-point -`_ -``rdf.plugins.sparqleval``, or simply adding to an entry to -:data:`rdflib.plugins.sparql.CUSTOM_EVALS`, a custom function can be -registered. The function will be called for each algebra component and -may raise ``NotImplementedError`` to indicate that this part should be -handled by the default implementation. - -See :file:`examples/custom_eval.py` diff --git a/docs/merging.md b/docs/merging.md new file mode 100644 index 0000000000..25a970bafd --- /dev/null +++ b/docs/merging.md @@ -0,0 +1,39 @@ +# Merging graphs + +Graphs share blank nodes only if they are derived from graphs described by documents or other structures (such as an RDF dataset) that explicitly provide for the sharing of blank nodes between different RDF graphs. Simply downloading a web document does not mean that the blank nodes in a resulting RDF graph are the same as the blank nodes coming from other downloads of the same document or from the same RDF source. + +RDF applications which manipulate concrete syntaxes for RDF which use blank node identifiers should take care to keep track of the identity of the blank nodes they identify. 
Blank node identifiers often have a local scope, so when RDF from different sources is combined, identifiers may have to be changed in order to avoid accidental conflation of distinct blank nodes.
+
+For example, two documents may both use the blank node identifier "_:x" to identify a blank node, but unless these documents are in a shared identifier scope or are derived from a common source, the occurrences of "_:x" in one document will identify a different blank node than the one in the graph described by the other document. When graphs are formed by combining RDF from multiple sources, it may be necessary to standardize apart the blank node identifiers by replacing them by others which do not occur in the other document(s).
+
+_(copied directly from <https://www.w3.org/TR/rdf11-mt/#shared-blank-nodes-unions-and-merges>)_
+
+In RDFLib, blank nodes are given unique IDs when parsing, so graph merging can be done by simply reading several files into the same graph:
+
+```python
+from rdflib import Graph
+
+graph = Graph()
+
+graph.parse(input1)
+graph.parse(input2)
+```
+
+`graph` now contains the merged graph of `input1` and `input2`.
+
+!!! warning "Blank Node Collision"
+    However, the set-theoretic graph operations in RDFLib are assumed to be performed in sub-graphs of some larger data-base (for instance, in the context of a [`Dataset`][rdflib.graph.Dataset]) and assume shared blank node IDs, and therefore do NOT do _correct_ merging, i.e.:
+
+    ```python
+    from rdflib import Graph
+
+    g1 = Graph()
+    g1.parse(input1)
+
+    g2 = Graph()
+    g2.parse(input2)
+
+    graph = g1 + g2
+    ```
+
+    may cause unwanted collisions of blank nodes in `graph`.
diff --git a/docs/merging.rst b/docs/merging.rst
deleted file mode 100644
index 1721d9206c..0000000000
--- a/docs/merging.rst
+++ /dev/null
@@ -1,44 +0,0 @@
-..
_merging_graphs: - -============== -Merging graphs -============== - - Graphs share blank nodes only if they are derived from graphs described by documents or other structures (such as an RDF dataset) that explicitly provide for the sharing of blank nodes between different RDF graphs. Simply downloading a web document does not mean that the blank nodes in a resulting RDF graph are the same as the blank nodes coming from other downloads of the same document or from the same RDF source. - -RDF applications which manipulate concrete syntaxes for RDF which use blank node identifiers should take care to keep track of the identity of the blank nodes they identify. Blank node identifiers often have a local scope, so when RDF from different sources is combined, identifiers may have to be changed in order to avoid accidental conflation of distinct blank nodes. - -For example, two documents may both use the blank node identifier "_:x" to identify a blank node, but unless these documents are in a shared identifier scope or are derived from a common source, the occurrences of "_:x" in one document will identify a different blank node than the one in the graph described by the other document. When graphs are formed by combining RDF from multiple sources, it may be necessary to standardize apart the blank node identifiers by replacing them by others which do not occur in the other document(s). - -*(copied directly from https://www.w3.org/TR/rdf11-mt/#shared-blank-nodes-unions-and-merges)* - - -In RDFLib, blank nodes are given unique IDs when parsing, so graph merging can be done by simply reading several files into the same graph:: - - from rdflib import Graph - - graph = Graph() - - graph.parse(input1) - graph.parse(input2) - -``graph`` now contains the merged graph of ``input1`` and ``input2``. - - -.. 
note:: However, the set-theoretic graph operations in RDFLib are assumed to be performed in sub-graphs of some larger data-base (for instance, in the context of a :class:`~rdflib.graph.Dataset`) and assume shared blank node IDs, and therefore do NOT do *correct* merging, i.e.::
-
-    from rdflib import Graph
-
-    g1 = Graph()
-    g1.parse(input1)
-
-    g2 = Graph()
-    g2.parse(input2)
-
-    graph = g1 + g2
-
-   May cause unwanted collisions of blank-nodes in
-   ``graph``.
-
-
-
diff --git a/docs/namespaces_and_bindings.md b/docs/namespaces_and_bindings.md
new file mode 100644
index 0000000000..8efea994b9
--- /dev/null
+++ b/docs/namespaces_and_bindings.md
@@ -0,0 +1,143 @@
+# Namespaces and Bindings
+
+RDFLib provides several short-cuts to working with many URIs in the same namespace.
+
+The [`rdflib.namespace`][rdflib.namespace] module defines the [`Namespace`][rdflib.namespace.Namespace] class which lets you easily create URIs in a namespace:
+
+```python
+from rdflib import Namespace
+
+EX = Namespace("http://example.org/")
+EX.Person  # a Python attribute of EX. This example is equivalent to rdflib.term.URIRef("http://example.org/Person")
+
+# use dict notation for things that are not valid Python identifiers, e.g.:
+EX['first%20name']  # equivalent to rdflib.term.URIRef("http://example.org/first%20name")
+```
+
+These two styles of namespace creation - object attribute and dict - are equivalent and are made available just to allow for valid RDF namespaces and URIs that are not valid Python identifiers. This isn't just for syntactic things like spaces, as per the example of `first%20name` above, but also for Python reserved words like `class` or `while`, so for the URI `http://example.org/class`, create it with `EX['class']`, not `EX.class`.
+
+## Common Namespaces
+
+The `namespace` module defines many common namespaces such as RDF, RDFS, OWL, FOAF, SKOS, PROF, etc. The list of the namespaces provided grows with user contributions to RDFLib.
+ +These Namespaces, and any others that users define, can also be associated with prefixes using the [`NamespaceManager`][rdflib.namespace.NamespaceManager], e.g. using `foaf` for `http://xmlns.com/foaf/0.1/`. + +Each RDFLib graph has a [`namespace_manager`][rdflib.graph.Graph.namespace_manager] that keeps a list of namespace to prefix mappings. The namespace manager is populated when reading in RDF, and these prefixes are used when serialising RDF, or when parsing SPARQL queries. Prefixes can be bound with the [`bind()`][rdflib.graph.Graph.bind] method: + +```python +from rdflib import Graph, Namespace +from rdflib.namespace import FOAF + +EX = Namespace("http://example.org/") + +g = Graph() +g.bind("foaf", FOAF) # bind an RDFLib-provided namespace to a prefix +g.bind("ex", EX) # bind a user-declared namespace to a prefix +``` + + +The [`bind()`][rdflib.graph.Graph.bind] method is actually supplied by the [`NamespaceManager`][rdflib.namespace.NamespaceManager] class - see next. + +## NamespaceManager + +Each RDFLib graph comes with a [`NamespaceManager`][rdflib.namespace.NamespaceManager] instance in the [`namespace_manager`][rdflib.graph.Graph.namespace_manager] field; you can use the [`bind()`][rdflib.namespace.NamespaceManager.bind] method of this instance to bind a prefix to a namespace URI, as above, however note that the [`NamespaceManager`][rdflib.namespace.NamespaceManager] automatically performs some bindings according to a selected strategy. 
+ +Namespace binding strategies are indicated with the `bind_namespaces` input parameter to [`NamespaceManager`][rdflib.namespace.NamespaceManager] instances and may be set via `Graph` also: + +```python +from rdflib import Graph +from rdflib.namespace import NamespaceManager + +g = Graph(bind_namespaces="rdflib") # bind via Graph + +g2 = Graph() +nm = NamespaceManager(g2, bind_namespaces="rdflib") # bind via NamespaceManager +``` + + +Valid strategies are: + +- core: + - binds several core RDF prefixes only + - owl, rdf, rdfs, xsd, xml from the NAMESPACE_PREFIXES_CORE object + - this is default +- rdflib: + - binds all the namespaces shipped with RDFLib as DefinedNamespace instances + - all the core namespaces and all the following: brick, csvw, dc, dcat + - dcmitype, dcterms, dcam, doap, foaf, geo, odrl, org, prof, prov, qb, sdo + - sh, skos, sosa, ssn, time, vann, void + - see the NAMESPACE_PREFIXES_RDFLIB object in [`rdflib.namespace`][rdflib.namespace] for up-to-date list +- none: + - binds no namespaces to prefixes + - note this is NOT default behaviour +- cc: + - using prefix bindings from prefix.cc which is a online prefixes database + - not implemented yet - this is aspirational + +### Re-binding + +Note that regardless of the strategy employed, prefixes for namespaces can be overwritten with users preferred prefixes, for example: + +```python +from rdflib import Graph +from rdflib.namespace import GEO # imports GeoSPARQL's namespace + +g = Graph(bind_namespaces="rdflib") # binds GeoSPARQL's namespace to prefix 'geo' + +g.bind('geosp', GEO, override=True) +``` + +[`NamespaceManager`][rdflib.namespace.NamespaceManager] also has a method to normalize a given url: + +```python +from rdflib.namespace import NamespaceManager + +nm = NamespaceManager(Graph()) +nm.normalizeUri(t) +``` + +For simple output, or simple serialisation, you often want a nice readable representation of a term. 
All RDFLib terms have a `.n3()` method, which will return a suitable N3 format and into which you can supply a NamespaceManager instance to provide prefixes, i.e. `.n3(namespace_manager=some_nm)`:
+
+```python
+>>> from rdflib import Graph, URIRef, Literal, BNode
+>>> from rdflib.namespace import FOAF, NamespaceManager
+
+>>> person = URIRef("http://xmlns.com/foaf/0.1/Person")
+>>> person.n3()
+'<http://xmlns.com/foaf/0.1/Person>'
+
+>>> g = Graph()
+>>> g.bind("foaf", FOAF)
+
+>>> person.n3(g.namespace_manager)
+'foaf:Person'
+
+>>> l = Literal(2)
+>>> l.n3()
+'"2"^^<http://www.w3.org/2001/XMLSchema#integer>'
+
+>>> l.n3(NamespaceManager(Graph(), bind_namespaces="core"))
+'"2"^^xsd:integer'
+```
+
+The namespace manager also has a useful method, `compute_qname` - `g.namespace_manager.compute_qname(x)` (or just `g.compute_qname(x)`) - which takes a URI and decomposes it into its parts:
+
+```python
+assert g.compute_qname(URIRef("http://foo/bar#baz")) == (
+    "ns2", URIRef("http://foo/bar#"), "baz"
+)
+```
+
+## Namespaces in SPARQL Queries
+
+The `initNs` argument supplied to [`query()`][rdflib.graph.Graph.query] is a dictionary of namespaces to be expanded in the query string. If you pass no `initNs` argument, the namespaces registered with the graph's namespace_manager are used:
+
+```python
+from rdflib.namespace import FOAF
+graph.query('SELECT * WHERE { ?p a foaf:Person }', initNs={'foaf': FOAF})
+```
+
+In order to use an empty prefix (e.g. `?a :knows ?b`), use a `PREFIX` directive with no prefix in the SPARQL query to set a default namespace:
+
+```sparql
+PREFIX : <http://example.org/>
+```
diff --git a/docs/namespaces_and_bindings.rst b/docs/namespaces_and_bindings.rst
deleted file mode 100644
index ef7458661c..0000000000
--- a/docs/namespaces_and_bindings.rst
+++ /dev/null
@@ -1,156 +0,0 @@
-.. _namespaces_and_bindings: Namespaces and Bindings
-
-=======================
-Namespaces and Bindings
-=======================
-
-RDFLib provides several short-cuts to working with many URIs in the same namespace.
- -The :mod:`rdflib.namespace` defines the :class:`rdflib.namespace.Namespace` class which lets you easily create URIs in a namespace:: - - from rdflib import Namespace - - EX = Namespace("http://example.org/") - EX.Person # a Python attribute for EX. This example is equivalent to rdflib.term.URIRef("http://example.org/Person") - - # use dict notation for things that are not valid Python identifiers, e.g.: - n['first%20name'] # as rdflib.term.URIRef("http://example.org/first%20name") - -These two styles of namespace creation - object attribute and dict - are equivalent and are made available just to allow for valid -RDF namespaces and URIs that are not valid Python identifiers. This isn't just for syntactic things like spaces, as per -the example of ``first%20name`` above, but also for Python reserved words like ``class`` or ``while``, so for the URI -``http://example.org/class``, create it with ``EX['class']``, not ``EX.class``. - -Common Namespaces ------------------ - -The ``namespace`` module defines many common namespaces such as RDF, RDFS, OWL, FOAF, SKOS, PROF, etc. The list of the -namespaces provided grows with user contributions to RDFLib. - -These Namespaces, and any others that users define, can also be associated with prefixes using the :class:`rdflib.namespace.NamespaceManager`, e.g. using ``foaf`` for ``http://xmlns.com/foaf/0.1/``. - -Each RDFLib graph has a :attr:`~rdflib.graph.Graph.namespace_manager` that keeps a list of namespace to prefix mappings. The namespace manager is populated when reading in RDF, and these prefixes are used when serialising RDF, or when parsing SPARQL queries. 
Prefixes can be bound with the :meth:`rdflib.graph.Graph.bind` method:: - - from rdflib import Graph, Namespace - from rdflib.namespace import FOAF - - EX = Namespace("http://example.org/") - - g = Graph() - g.bind("foaf", FOAF) # bind an RDFLib-provided namespace to a prefix - g.bind("ex", EX) # bind a user-declared namespace to a prefix - - -The :meth:`rdflib.graph.Graph.bind` method is actually supplied by the :class:`rdflib.namespace.NamespaceManager` class - see next. - -NamespaceManager ----------------- - -Each RDFLib graph comes with a :class:`rdflib.namespace.NamespaceManager` instance in the :attr:`~rdflib.graph.Graph.namespace_manager` field; you can use the :meth:`~rdflib.namespace.NamespaceManager.bind` method of this instance to bind a prefix to a namespace URI, -as above, however note that the :class:`~rdflib.namespace.NamespaceManager` automatically performs some bindings according to a selected strategy. - -Namespace binding strategies are indicated with the ``bind_namespaces`` input parameter to :class:`~rdflib.namespace.NamespaceManager` instances -and may be set via ``Graph`` also:: - - from rdflib import Graph - from rdflib.namespace import NamespaceManager - - g = Graph(bind_namespaces="rdflib") # bind via Graph - - g2 = Graph() - nm = NamespaceManager(g2, bind_namespaces="rdflib") # bind via NamespaceManager - - -Valid strategies are: - -* core: - * binds several core RDF prefixes only - * owl, rdf, rdfs, xsd, xml from the NAMESPACE_PREFIXES_CORE object - * this is default -* rdflib: - * binds all the namespaces shipped with RDFLib as DefinedNamespace instances - * all the core namespaces and all the following: brick, csvw, dc, dcat - * dcmitype, dcterms, dcam, doap, foaf, geo, odrl, org, prof, prov, qb, sdo - * sh, skos, sosa, ssn, time, vann, void - * see the NAMESPACE_PREFIXES_RDFLIB object in :class:`rdflib.namespace` for up-to-date list -* none: - * binds no namespaces to prefixes - * note this is NOT default behaviour -* cc: - * using 
prefix bindings from prefix.cc which is a online prefixes database - * not implemented yet - this is aspirational - -Re-binding -^^^^^^^^^^ - -Note that regardless of the strategy employed, prefixes for namespaces can be overwritten with users preferred prefixes, -for example:: - - from rdflib import Graph - from rdflib.namespace import GEO # imports GeoSPARQL's namespace - - g = Graph(bind_namespaces="rdflib") # binds GeoSPARQL's namespace to prefix 'geo' - - g.bind('geosp', GEO, override=True) - - - -:class:`~rdflib.namespace.NamespaceManager` also has a method to normalize a given url:: - - from rdflib.namespace import NamespaceManager - - nm = NamespaceManager(Graph()) - nm.normalizeUri(t) - - -For simple output, or simple serialisation, you often want a nice -readable representation of a term. All RDFLib terms have a -``.n3()`` method, which will return a suitable N3 format and into which you can supply a NamespaceManager instance -to provide prefixes, i.e. ``.n3(namespace_manager=some_nm)``:: - - >>> from rdflib import Graph, URIRef, Literal, BNode - >>> from rdflib.namespace import FOAF, NamespaceManager - - >>> person = URIRef("http://xmlns.com/foaf/0.1/Person") - >>> person.n3() - '' - - >>> g = Graph() - >>> g.bind("foaf", FOAF) - - >>> person.n3(g.namespace_manager) - 'foaf:Person' - - >>> l = Literal(2) - >>> l.n3() - '"2"^^' - - >>> l.n3(NamespaceManager(Graph(), bind_namespaces="core")) - '"2"^^xsd:integer' - -The namespace manage also has a useful method ``compute_qname`` -``g.namespace_manager.compute_qname(x)`` (or just ``g.compute_qname(x)``) which takes a URI and decomposes it into the parts:: - - self.assertEqual(g.compute_qname(URIRef("http://foo/bar#baz")), - ("ns2", URIRef("http://foo/bar#"), "baz")) - - - -Namespaces in SPARQL Queries ----------------------------- - -The ``initNs`` argument supplied to :meth:`~rdflib.graph.Graph.query` is a dictionary of namespaces to be expanded in the query string. 
-If you pass no ``initNs`` argument, the namespaces registered with the graph's namespace_manager are used:: - - from rdflib.namespace import FOAF - graph.query('SELECT * WHERE { ?p a foaf:Person }', initNs={'foaf': FOAF}) - - -In order to use an empty prefix (e.g. ``?a :knows ?b``), use a ``PREFIX`` directive with no prefix in the SPARQL query to set a default namespace: - -.. code-block:: sparql - - PREFIX : <http://xmlns.com/foaf/0.1/> - - - diff --git a/docs/persistence.md b/docs/persistence.md new file mode 100644 index 0000000000..aa81f66b8a --- /dev/null +++ b/docs/persistence.md @@ -0,0 +1,60 @@ +# Persistence + +RDFLib provides an [`abstracted Store API`][rdflib.store.Store] +for persistence of RDF and Notation 3. The [`Graph`][rdflib.graph.Graph] class works with instances of this API (as the first argument to its constructor) for triple-based management of an RDF store including: garbage collection, transaction management, update, pattern matching, removal, length, and database management ([`Graph.open()`][rdflib.graph.Graph.open] / [`Graph.close()`][rdflib.graph.Graph.close] / [`Graph.destroy()`][rdflib.graph.Graph.destroy]). + +Additional persistence mechanisms can be supported by implementing this API for a different store. + +## Stores currently shipped with core RDFLib + +* [`Memory`][rdflib.plugins.stores.memory.Memory] - not persistent!
+* [`BerkeleyDB`][rdflib.plugins.stores.berkeleydb.BerkeleyDB] - on disk persistence via Python's [berkeleydb package](https://pypi.org/project/berkeleydb/) +* [`SPARQLStore`][rdflib.plugins.stores.sparqlstore.SPARQLStore] - a read-only wrapper around a remote SPARQL Query endpoint +* [`SPARQLUpdateStore`][rdflib.plugins.stores.sparqlstore.SPARQLUpdateStore] - a read-write wrapper around a remote SPARQL query/update endpoint pair + +## Usage + +In most cases, passing the name of the store to the Graph constructor is enough: + +```python +from rdflib import Graph + +graph = Graph(store='BerkeleyDB') +``` + +Most stores offering on-disk persistence will need to be opened before reading or writing. When persisting a triplestore, rather than a ConjunctiveGraph quadstore, you need to specify an identifier with which you can open the graph: + +```python +graph = Graph('BerkeleyDB', identifier='mygraph') + +# first time create the store: +graph.open('/home/user/data/myRDFLibStore', create=True) + +# work with the graph: +data = """ +PREFIX : <https://example.org/> + +:a :b :c . +:d :e :f . +:d :g :h . +""" +graph.parse(data=data, format="ttl") + +# when done! +graph.close() +``` + +When done, [`close()`][rdflib.graph.Graph.close] must be called to free the resources associated with the store. + +## Additional store plugins + +More store implementations are available in RDFLib extension projects: + +* [rdflib-sqlalchemy](https://github.com/RDFLib/rdflib-sqlalchemy) – a store which supports a wide variety of RDBMS backends +* [rdflib-leveldb](https://github.com/RDFLib/rdflib-leveldb) – a store on top of Google's [LevelDB](https://code.google.com/p/leveldb/) key-value store +* [rdflib-kyotocabinet](https://github.com/RDFLib/rdflib-kyotocabinet) – a store on top of the [Kyoto Cabinet](http://fallabs.com/kyotocabinet/) key-value store + +## Example + +* [`examples.berkeleydb_example`][examples.berkeleydb_example] contains an example for using a BerkeleyDB store.
+* [`examples.sparqlstore_example`][examples.sparqlstore_example] contains an example for using a SPARQLStore. diff --git a/docs/persistence.rst b/docs/persistence.rst deleted file mode 100644 index ca7449ed5d..0000000000 --- a/docs/persistence.rst +++ /dev/null @@ -1,81 +0,0 @@ -.. _persistence: Persistence - -=========== -Persistence -=========== - -RDFLib provides an :class:`abstracted Store API ` -for persistence of RDF and Notation 3. The -:class:`~rdflib.graph.Graph` class works with instances of this API -(as the first argument to its constructor) for triple-based management -of an RDF store including: garbage collection, transaction management, -update, pattern matching, removal, length, and database management -(:meth:`~rdflib.graph.Graph.open` / :meth:`~rdflib.graph.Graph.close` -/ :meth:`~rdflib.graph.Graph.destroy`). - -Additional persistence mechanisms can be supported by implementing -this API for a different store. - -Stores currently shipped with core RDFLib -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -* :class:`Memory ` - not persistent! -* :class:`~rdflib.plugins.stores.berkeleydb.BerkeleyDB` - on disk persistence via Python's `berkeleydb package `_ -* :class:`~rdflib.plugins.stores.sparqlstore.SPARQLStore` - a read-only wrapper around a remote SPARQL Query endpoint -* :class:`~rdflib.plugins.stores.sparqlstore.SPARQLUpdateStore` - a read-write wrapper around a remote SPARQL query/update endpoint pair - -Usage -^^^^^ - -In most cases, passing the name of the store to the Graph constructor is enough: - -.. code-block:: python - - from rdflib import Graph - - graph = Graph(store='BerkeleyDB') - - -Most stores offering on-disk persistence will need to be opened before reading or writing. -When peristing a triplestore, rather than a ConjuntiveGraph quadstore, you need to specify -an identifier with which you can open the graph: - -.. 
code-block:: python - - graph = Graph('BerkeleyDB', identifier='mygraph') - - # first time create the store: - graph.open('/home/user/data/myRDFLibStore', create=True) - - # work with the graph: - data = """ - PREFIX : - - :a :b :c . - :d :e :f . - :d :g :h . - """ - graph.parse(data=data, format="ttl") - - # when done! - graph.close() - - - -When done, :meth:`~rdflib.graph.Graph.close` must be called to free the resources associated with the store. - - -Additional store plugins -^^^^^^^^^^^^^^^^^^^^^^^^ - -More store implementations are available in RDFLib extension projects: - - * `rdflib-sqlalchemy `_ – a store which supports a wide-variety of RDBMS backends, - * `rdflib-leveldb `_ – a store on top of Google's `LevelDB `_ key-value store. - * `rdflib-kyotocabinet `_ – a store on top of the `Kyoto Cabinet `_ key-value store. - -Example -^^^^^^^ - -* :mod:`examples.berkeleydb_example` contains an example for using a BerkeleyDB store. -* :mod:`examples.sparqlstore_example` contains an example for using a SPARQLStore. diff --git a/docs/persisting_n3_terms.md b/docs/persisting_n3_terms.md new file mode 100644 index 0000000000..5cf59dfdbd --- /dev/null +++ b/docs/persisting_n3_terms.md @@ -0,0 +1,89 @@ +# Persisting Notation 3 Terms + +## Using N3 Syntax for Persistence + +Blank Nodes, Literals, URI References, and Variables can be distinguished in persistence by relying on Notation 3 syntax convention. + +All URI References can be expanded and persisted as: + +```turtle +<..URI..> +``` + +All Literals can be expanded and persisted as: + +```turtle +"..value.."@lang or "..value.."^^dtype_uri +``` + +!!! abstract "Language tag" + `@lang` is a language tag and `^^dtype_uri` is the URI of a data type associated with the Literal + +Blank Nodes can be expanded and persisted as: + +```turtle +_:Id +``` + +!!! info "About skolemization" + Where Id is an identifier as determined by skolemization. 
Skolemization is a syntactic transformation routinely used in automatic inference systems in which existential variables are replaced by 'new' functions - function names not used elsewhere - applied to any enclosing universal variables. In RDF, Skolemization amounts to replacing every blank node in a graph by a 'new' name, i.e. a URI reference which is guaranteed to not occur anywhere else. In effect, it gives 'arbitrary' names to the anonymous entities whose existence was asserted by the use of blank nodes: the arbitrariness of the names ensures that nothing can be inferred that would not follow from the bare assertion of existence represented by the blank node. (Using a literal would not do. Literals are never 'new' in the required sense.) + +Variables can be persisted as they appear in their serialization `(?varName)` - since they only need be unique within their scope (the context of their associated statements) + +These syntactic conventions can facilitate term round-tripping. + +## Variables by Scope + +Would an interface be needed in order to facilitate a quick way to aggregate all the variables in a scope (given by a formula identifier)? An interface such as: + +```python +def variables(formula_identifier) +``` + +## The Need to Skolemize Formula Identifiers + +It would seem reasonable to assume that a formula-aware store would assign Blank Node identifiers as names of formulae that appear in an N3 serialization. So for instance, the following bit of N3: + +``` +{?x a :N3Programmer} => {?x :has :Migrane} +``` + +Could be interpreted as the assertion of the following statement: + +```turtle +_:a log:implies _:b +``` + +However, how are `_:a` and `_:b` distinguished from other Blank Nodes?
A formula-aware store would be expected to persist the first set of statements as quoted statements in a formula named `_:a` and the second set as quoted statements in a formula named `_:b`, but it would not be cost-effective for a serializer to have to query the store for all statements in a context named `_:a` in order to determine if `_:a` was associated with a formula (so that it could be serialized properly). + +## Relying on `log:Formula` Membership + +The store could rely on explicit `log:Formula` membership (via `rdf:type` statements) to model the distinction of Blank Nodes associated with formulae. However, would these statements be expected from an N3 parser or known implicitly by the store? i.e., would all such Blank Nodes match the following pattern: + +```turtle +?formula rdf:type log:Formula +``` + +## Relying on an Explicit Interface + +A formula-aware store could also support the persistence of this distinction by implementing a method that returns an iterator over all the formulae in the store: + +```python +def formulae(triple=None) +``` + +This function would return all the Blank Node identifiers assigned to formulae or just those that contain statements matching the given triple pattern and would be the way a serializer determines if a term refers to a formula (in order to properly serialize it). + +How much would such an interface reduce the need to model formulae terms as first-class objects (perhaps to be returned by the [`triples()`][rdflib.Graph.triples] function)? Would it be more useful for the [`Graph`][rdflib.Graph] (or the store itself) to return a Context object in place of a formula term (using the formulae interface to make this determination)? + +Conversely, would these interfaces (variables and formulae) be considered optimizations only since you have the distinction by the kinds of terms triples returns (which would be expanded to include variables and formulae)?
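The optional interfaces floated above (`variables()` and `formulae()`) can be sketched as an abstract mix-in. This is purely illustrative: the method names and signatures come from the questions in this section, not from the current `rdflib.store.Store` API, and identifiers are modelled as plain strings to keep the sketch self-contained.

```python
from typing import Iterator, Optional, Tuple

# Illustration only: in RDFLib proper these would be rdflib.term types,
# not plain strings.
Triple = Tuple[str, str, str]


class FormulaAwareStoreSketch:
    """Hypothetical optional interfaces for a formula-aware store."""

    def variables(self, formula_identifier: str) -> Iterator[str]:
        """Yield every variable scoped to the given formula."""
        raise NotImplementedError

    def formulae(self, triple: Optional[Triple] = None) -> Iterator[str]:
        """Yield the identifiers of all formulae in the store, or only
        of those containing statements that match the triple pattern."""
        raise NotImplementedError
```

A serializer could then call `formulae()` once up front instead of issuing a per-term query, which is the cost concern raised above.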
+ +## Persisting Formula Identifiers + +This is the most straightforward way to maintain this distinction - without relying on extra interfaces. Formula identifiers could be persisted distinctly from other terms by using the following notation: + +``` +{_:bnode} or {<.. URI ..>} +``` + +This would facilitate their persistence round-trip - same as the other terms that rely on N3 syntax to distinguish between each other. diff --git a/docs/persisting_n3_terms.rst b/docs/persisting_n3_terms.rst deleted file mode 100644 index 1138b4c3f7..0000000000 --- a/docs/persisting_n3_terms.rst +++ /dev/null @@ -1,93 +0,0 @@ -.. _persisting_n3_terms: - -=========================== -Persisting Notation 3 Terms -=========================== - -Using N3 Syntax for Persistence ------------------------------- -Blank Nodes, Literals, URI References, and Variables can be distinguished in persistence by relying on Notation 3 syntax convention. - -All URI References can be expanded and persisted as: - -.. code-block:: text - - <..URI..> - -All Literals can be expanded and persisted as: - -.. code-block:: text - - "..value.."@lang or "..value.."^^dtype_uri - -.. note:: ``@lang`` is a language tag and ``^^dtype_uri`` is the URI of a data type associated with the Literal - -Blank Nodes can be expanded and persisted as: - -.. code-block:: text - - _:Id - -.. note:: where Id is an identifier as determined by skolemization. Skolemization is a syntactic transformation routinely used in automatic inference systems in which existential variables are replaced by 'new' functions - function names not used elsewhere - applied to any enclosing universal variables. In RDF, Skolemization amounts to replacing every blank node in a graph by a 'new' name, i.e. a URI reference which is guaranteed to not occur anywhere else.
In effect, it gives 'arbitrary' names to the anonymous entities whose existence was asserted by the use of blank nodes: the arbitrariness of the names ensures that nothing can be inferred that would not follow from the bare assertion of existence represented by the blank node. (Using a literal would not do. Literals are never 'new' in the required sense.) - -Variables can be persisted as they appear in their serialization ``(?varName)`` - since they only need be unique within their scope (the context of their associated statements) - -These syntactic conventions can facilitate term round-tripping. - -Variables by Scope ------------------- -Would an interface be needed in order to facilitate a quick way to aggregate all the variables in a scope (given by a formula identifier)? An interface such as: - -.. code-block:: python - - def variables(formula_identifier) - -The Need to Skolemize Formula Identifiers ------------------------------------------ -It would seem reasonable to assume that a formula-aware store would assign Blank Node identifiers as names of formulae that appear in a N3 serialization. So for instance, the following bit of N3: - -.. code-block:: text - - {?x a :N3Programmer} => {?x :has :Migrane} - -Could be interpreted as the assertion of the following statement: - -.. code-block:: text - - _:a log:implies _:b - -However, how are ``_:a`` and ``_:b`` distinguished from other Blank Nodes? A formula-aware store would be expected to persist the first set of statements as quoted statements in a formula named ``_:a`` and the second set as quoted statements in a formula named ``_:b``, but it would not be cost-effective for a serializer to have to query the store for all statements in a context named ``_:a`` in order to determine if ``_:a`` was associated with a formula (so that it could be serialized properly). 
- -Relying on ``log:Formula`` Membership -------------------------------------- - -The store could rely on explicit ``log:Formula`` membership (via ``rdf:type`` statements) to model the distinction of Blank Nodes associated with formulae. However, would these statements be expected from an N3 parser or known implicitly by the store? i.e., would all such Blank Nodes match the following pattern: - -.. code-block:: text - - ?formula rdf:type log:Formula - -Relying on an Explicit Interface --------------------------------- -A formula-aware store could also support the persistence of this distinction by implementing a method that returns an iterator over all the formulae in the store: - -.. code-block:: python - - def formulae(triple=None) - -This function would return all the Blank Node identifiers assigned to formulae or just those that contain statements matching the given triple pattern and would be the way a serializer determines if a term refers to a formula (in order to properly serializer it). - -How much would such an interface reduce the need to model formulae terms as first class objects (perhaps to be returned by the :meth:`~rdflib.Graph.triples` function)? Would it be more useful for the :class:`~rdflib.Graph` (or the store itself) to return a Context object in place of a formula term (using the formulae interface to make this determination)? - -Conversely, would these interfaces (variables and formulae) be considered optimizations only since you have the distinction by the kinds of terms triples returns (which would be expanded to include variables and formulae)? - -Persisting Formula Identifiers ------------------------------- -This is the most straight forward way to maintain this distinction - without relying on extra interfaces. Formula identifiers could be persisted distinctly from other terms by using the following notation: - -.. code-block:: text - - {_:bnode} or {<.. 
URI ..>} - -This would facilitate their persistence round-trip - same as the other terms that rely on N3 syntax to distinguish between each other. - diff --git a/docs/plugin_parsers.rst b/docs/plugin_parsers.rst deleted file mode 100644 index 56cb5d1eb2..0000000000 --- a/docs/plugin_parsers.rst +++ /dev/null @@ -1,46 +0,0 @@ -.. _plugin_parsers: Plugin parsers - -============== -Plugin parsers -============== - -These serializers are available in default RDFLib, you can use them by -passing the name to graph's :meth:`~rdflib.graph.Graph.parse` method:: - - graph.parse(my_url, format='n3') - -The ``html`` parser will auto-detect RDFa, HTurtle or Microdata. - -It is also possible to pass a mime-type for the ``format`` parameter:: - - graph.parse(my_url, format='application/rdf+xml') - -If you are not sure what format your file will be, you can use :func:`rdflib.util.guess_format` which will guess based on the file extension. - -========= ==================================================================== -Name Class -========= ==================================================================== -json-ld :class:`~rdflib.plugins.parsers.jsonld.JsonLDParser` -hext :class:`~rdflib.plugins.parsers.hext.HextuplesParser` -n3 :class:`~rdflib.plugins.parsers.notation3.N3Parser` -nquads :class:`~rdflib.plugins.parsers.nquads.NQuadsParser` -patch :class:`~rdflib.plugins.parsers.patch.RDFPatchParser` -nt :class:`~rdflib.plugins.parsers.ntriples.NTParser` -trix :class:`~rdflib.plugins.parsers.trix.TriXParser` -turtle :class:`~rdflib.plugins.parsers.notation3.TurtleParser` -xml :class:`~rdflib.plugins.parsers.rdfxml.RDFXMLParser` -========= ==================================================================== - -Multi-graph IDs ---------------- -Note that for correct parsing of multi-graph data, e.g. 
Trig, HexT, etc., into a ``Dataset``, -as opposed to a context-unaware ``Graph``, you will need to set the ``publicID`` of the ``Dataset`` to the identifier of the ``default_context`` (default graph), for example:: - - d = Dataset() - d.parse( - data=""" ... """, - format="trig", - publicID=d.default_context.identifier - ) - -(from the file tests/test_serializer_hext.py) diff --git a/docs/plugin_query_results.rst b/docs/plugin_query_results.rst deleted file mode 100644 index f44c276877..0000000000 --- a/docs/plugin_query_results.rst +++ /dev/null @@ -1,32 +0,0 @@ -.. _plugin_query_results: Plugin query results - -==================== -Plugin query results -==================== - -Plugins for reading and writing of (SPARQL) :class:`~rdflib.query.Result` - pass ``name`` to either :meth:`~rdflib.query.Result.parse` or :meth:`~rdflib.query.Result.serialize` - - -Parsers -------- - -==== ==================================================================== -Name Class -==== ==================================================================== -csv :class:`~rdflib.plugins.sparql.results.csvresults.CSVResultParser` -json :class:`~rdflib.plugins.sparql.results.jsonresults.JSONResultParser` -tsv :class:`~rdflib.plugins.sparql.results.tsvresults.TSVResultParser` -xml :class:`~rdflib.plugins.sparql.results.xmlresults.XMLResultParser` -==== ==================================================================== - -Serializers ------------ - -==== ======================================================================== -Name Class -==== ======================================================================== -csv :class:`~rdflib.plugins.sparql.results.csvresults.CSVResultSerializer` -json :class:`~rdflib.plugins.sparql.results.jsonresults.JSONResultSerializer` -txt :class:`~rdflib.plugins.sparql.results.txtresults.TXTResultSerializer` -xml :class:`~rdflib.plugins.sparql.results.xmlresults.XMLResultSerializer` -==== 
======================================================================== diff --git a/docs/plugin_serializers.rst b/docs/plugin_serializers.rst deleted file mode 100644 index 3721bb9f80..0000000000 --- a/docs/plugin_serializers.rst +++ /dev/null @@ -1,60 +0,0 @@ -.. _plugin_serializers: Plugin serializers - -================== -Plugin serializers -================== - -These serializers are available in default RDFLib, you can use them by -passing the name to a graph's :meth:`~rdflib.graph.Graph.serialize` method:: - - print graph.serialize(format='n3') - -It is also possible to pass a mime-type for the ``format`` parameter:: - - graph.serialize(my_url, format='application/rdf+xml') - -========== =============================================================== -Name Class -========== =============================================================== -json-ld :class:`~rdflib.plugins.serializers.jsonld.JsonLDSerializer` -n3 :class:`~rdflib.plugins.serializers.n3.N3Serializer` -nquads :class:`~rdflib.plugins.serializers.nquads.NQuadsSerializer` -nt :class:`~rdflib.plugins.serializers.nt.NTSerializer` -hext :class:`~rdflib.plugins.serializers.hext.HextuplesSerializer` -patch :class:`~rdflib.plugins.serializers.patch.PatchSerializer` -pretty-xml :class:`~rdflib.plugins.serializers.rdfxml.PrettyXMLSerializer` -trig :class:`~rdflib.plugins.serializers.trig.TrigSerializer` -trix :class:`~rdflib.plugins.serializers.trix.TriXSerializer` -turtle :class:`~rdflib.plugins.serializers.turtle.TurtleSerializer` -longturtle :class:`~rdflib.plugins.serializers.longturtle.LongTurtleSerializer` -xml :class:`~rdflib.plugins.serializers.rdfxml.XMLSerializer` -========== =============================================================== - - -JSON-LD -------- -JSON-LD - 'json-ld' - has been incorporated into RDFLib since v6.0.0. - -RDF Patch ---------- - -The RDF Patch Serializer - 'patch' - uses the RDF Patch format defined at https://afs.github.io/rdf-patch/. 
It supports serializing context aware stores as either addition or deletion patches; and also supports serializing the difference between two context aware stores as a Patch of additions and deletions. - -HexTuples ---------- -The HexTuples Serializer - 'hext' - uses the HexTuples format defined at https://github.com/ontola/hextuples. - -For serialization of non-context-aware data sources, e.g. a single ``Graph``, the 'graph' field (6th variable in the -Hextuple) will be an empty string. - -For context-aware (multi-graph) serialization, the 'graph' field of the default graph will be an empty string and -the values for other graphs will be Blank Node IDs or IRIs. - -Longturtle ----------- -Longturtle is just the turtle format with newlines preferred over compactness - multiple nodes on the same line -to enhance the format's text file version control (think Git) friendliness - and more modern forms of prefix markers - -PREFIX instead of @prefix - to make it as similar to SPARQL as possible. - -Longturtle is Turtle 1.1 compliant and will work wherever ordinary turtle works, however some very old parsers don't -understand PREFIX, only @prefix... diff --git a/docs/plugin_stores.rst b/docs/plugin_stores.rst index 1a9fc506dd..e69de29bb2 100644 --- a/docs/plugin_stores.rst +++ b/docs/plugin_stores.rst @@ -1,70 +0,0 @@ -.. 
_plugin_stores: Plugin stores - -============= -Plugin stores -============= - -Built In --------- - -The following Stores are contained within the rdflib core package: - -================= ============================================================ -Name Class -================= ============================================================ -Auditable :class:`~rdflib.plugins.stores.auditable.AuditableStore` -Concurrent :class:`~rdflib.plugins.stores.concurrent.ConcurrentStore` -SimpleMemory :class:`~rdflib.plugins.stores.memory.SimpleMemory` -Memory :class:`~rdflib.plugins.stores.memory.Memory` -SPARQLStore :class:`~rdflib.plugins.stores.sparqlstore.SPARQLStore` -SPARQLUpdateStore :class:`~rdflib.plugins.stores.sparqlstore.SPARQLUpdateStore` -BerkeleyDB :class:`~rdflib.plugins.stores.berkeleydb.BerkeleyDB` -default :class:`~rdflib.plugins.stores.memory.Memory` -================= ============================================================ - -External --------- - -The following Stores are defined externally to rdflib's core package, so look to their documentation elsewhere for -specific details of use. - -================= ==================================================== ============================================================================================= -Name Repository Notes -================= ==================================================== ============================================================================================= -SQLAlchemy ``_ An SQLAlchemy-backed, formula-aware RDFLib Store. 
Tested dialects are: SQLite, MySQL & PostgreSQL -leveldb ``_ An adaptation of RDFLib BerkeleyDB Store’s key-value approach, using LevelDB as a back-end -Kyoto Cabinet ``_ An adaptation of RDFLib BerkeleyDB Store’s key-value approach, using Kyoto Cabinet as a back-end -HDT ``_ A Store back-end for rdflib to allow for reading and querying `HDT `_ documents -Oxigraph ``_ Works with the `Pyoxigraph `_ Python graph database library -================= ==================================================== ============================================================================================= - -*If you have, or know of a Store implementation and would like it listed here, please submit a Pull Request!* - -Use ---- - -You can use these stores like this: - -.. code-block:: python - - from rdflib import Graph - - # use the default memory Store - graph = Graph() - - # use the BerkeleyDB Store - graph = Graph(store="BerkeleyDB") - - -In some cases, you must explicitly *open* and *close* a store, for example: - -.. code-block:: python - - from rdflib import Graph - - # use the BerkeleyDB Store - graph = Graph(store="BerkeleyDB") - graph.open("/some/folder/location") - # do things ... - graph.close() - diff --git a/docs/plugins.md b/docs/plugins.md new file mode 100644 index 0000000000..a73eb725a8 --- /dev/null +++ b/docs/plugins.md @@ -0,0 +1,187 @@ +# Plugins + +![rdflib plugin "architecture"](_static/plugins-diagram.svg) + +Many parts of RDFLib are extensible with plugins, [see setuptools' 'Creating and discovering plugins'](https://packaging.python.org/guides/creating-and-discovering-plugins/). This page lists the plugins included in RDFLib core.
+ +* [`Parser Plugins`][rdflib.plugins.parsers] +* [`Serializer Plugins`][rdflib.plugins.serializers] +* [`Store Plugins`][rdflib.plugins.stores] +* [`Query Results Plugins`][rdflib.plugins.sparql.results] + +## Plugin stores + +### Built In + +The following Stores are contained within the rdflib core package: + +| Name | Class | +| --- | --- | +| Auditable | [`AuditableStore`][rdflib.plugins.stores.auditable.AuditableStore] | +| Concurrent | [`ConcurrentStore`][rdflib.plugins.stores.concurrent.ConcurrentStore] | +| SimpleMemory | [`SimpleMemory`][rdflib.plugins.stores.memory.SimpleMemory] | +| Memory | [`Memory`][rdflib.plugins.stores.memory.Memory] | +| SPARQLStore | [`SPARQLStore`][rdflib.plugins.stores.sparqlstore.SPARQLStore] | +| SPARQLUpdateStore | [`SPARQLUpdateStore`][rdflib.plugins.stores.sparqlstore.SPARQLUpdateStore] | +| BerkeleyDB | [`BerkeleyDB`][rdflib.plugins.stores.berkeleydb.BerkeleyDB] | +| default | [`Memory`][rdflib.plugins.stores.memory.Memory] | + +### External + +The following Stores are defined externally to rdflib's core package, so look to their documentation elsewhere for specific details of use. + +| Name | Repository | Notes | +| --- | --- | --- | +| SQLAlchemy | [github.com/RDFLib/rdflib-sqlalchemy](https://github.com/RDFLib/rdflib-sqlalchemy) | An SQLAlchemy-backed, formula-aware RDFLib Store. 
Tested dialects are: SQLite, MySQL & PostgreSQL | +| leveldb | [github.com/RDFLib/rdflib-leveldb](https://github.com/RDFLib/rdflib-leveldb) | An adaptation of RDFLib BerkeleyDB Store's key-value approach, using LevelDB as a back-end | +| Kyoto Cabinet | [github.com/RDFLib/rdflib-kyotocabinet](https://github.com/RDFLib/rdflib-kyotocabinet) | An adaptation of RDFLib BerkeleyDB Store's key-value approach, using Kyoto Cabinet as a back-end | +| HDT | [github.com/RDFLib/rdflib-hdt](https://github.com/RDFLib/rdflib-hdt) | A Store back-end for rdflib to allow for reading and querying [HDT](https://www.rdfhdt.org/) documents | +| Oxigraph | [github.com/oxigraph/oxrdflib](https://github.com/oxigraph/oxrdflib) | Works with the [Pyoxigraph](https://oxigraph.org/pyoxigraph) Python graph database library | +| pycottas | [github.com/arenas-guerrero-julian/pycottas](https://github.com/arenas-guerrero-julian/pycottas) | A Store backend for querying compressed [COTTAS](https://pycottas.readthedocs.io/#cottas-files) files | + +*If you have, or know of a Store implementation and would like it listed here, please submit a Pull Request!* + +### Use + +You can use these stores like this: + +```python +from rdflib import Graph + +# use the default memory Store +graph = Graph() + +# use the BerkeleyDB Store +graph = Graph(store="BerkeleyDB") +``` + +In some cases, you must explicitly *open* and *close* a store, for example: + +```python +from rdflib import Graph + +# use the BerkeleyDB Store +graph = Graph(store="BerkeleyDB") +graph.open("/some/folder/location") +# do things ... +graph.close() +``` + +## Plugin parsers + +These parsers are available in default RDFLib; you can use them by passing the name to a graph's [`parse()`][rdflib.graph.Graph.parse] method: + +```python +graph.parse(my_url, format='n3') +``` + +The `html` parser will auto-detect RDFa, HTurtle or Microdata.
+ +It is also possible to pass a mime-type for the `format` parameter: + +```python +graph.parse(my_url, format='application/rdf+xml') +``` + +If you are not sure what format your file will be, you can use [`guess_format()`][rdflib.util.guess_format] which will guess based on the file extension. + +| Name | Class | +|---------|---------------------------------------------------------------| +| json-ld | [`JsonLDParser`][rdflib.plugins.parsers.jsonld.JsonLDParser] | +| hext | [`HextuplesParser`][rdflib.plugins.parsers.hext.HextuplesParser] | +| n3 | [`N3Parser`][rdflib.plugins.parsers.notation3.N3Parser] | +| nquads | [`NQuadsParser`][rdflib.plugins.parsers.nquads.NQuadsParser] | +| patch | [`RDFPatchParser`][rdflib.plugins.parsers.patch.RDFPatchParser] | +| nt | [`NTParser`][rdflib.plugins.parsers.ntriples.NTParser] | +| trix | [`TriXParser`][rdflib.plugins.parsers.trix.TriXParser] | +| turtle | [`TurtleParser`][rdflib.plugins.parsers.notation3.TurtleParser] | +| xml | [`RDFXMLParser`][rdflib.plugins.parsers.rdfxml.RDFXMLParser] | + +### Multi-graph IDs + +Note that for correct parsing of multi-graph data, e.g. TriG, HexTuple, etc., into a `Dataset`, as opposed to a context-unaware `Graph`, you will need to set the `publicID` of the `Dataset` to the identifier of the `default_context` (default graph), for example: + +```python +d = Dataset() +d.parse( + data=""" ... 
""", + format="trig", + publicID=d.default_context.identifier +) +``` + +(from the file tests/test_serializer_hext.py) + +## Plugin serializers + +These serializers are available in default RDFLib; you can use them by +passing the name to a graph's [`serialize()`][rdflib.graph.Graph.serialize] method: + +```python +print(graph.serialize(format='n3')) +``` + +It is also possible to pass a mime-type for the `format` parameter: + +```python +graph.serialize(my_url, format='application/rdf+xml') +``` + +| Name | Class | +|------|-------| +| json-ld | [`JsonLDSerializer`][rdflib.plugins.serializers.jsonld.JsonLDSerializer] | +| n3 | [`N3Serializer`][rdflib.plugins.serializers.n3.N3Serializer] | +| nquads | [`NQuadsSerializer`][rdflib.plugins.serializers.nquads.NQuadsSerializer] | +| nt | [`NTSerializer`][rdflib.plugins.serializers.nt.NTSerializer] | +| hext | [`HextuplesSerializer`][rdflib.plugins.serializers.hext.HextuplesSerializer] | +| patch | [`PatchSerializer`][rdflib.plugins.serializers.patch.PatchSerializer] | +| pretty-xml | [`PrettyXMLSerializer`][rdflib.plugins.serializers.rdfxml.PrettyXMLSerializer] | +| trig | [`TrigSerializer`][rdflib.plugins.serializers.trig.TrigSerializer] | +| trix | [`TriXSerializer`][rdflib.plugins.serializers.trix.TriXSerializer] | +| turtle | [`TurtleSerializer`][rdflib.plugins.serializers.turtle.TurtleSerializer] | +| longturtle | [`LongTurtleSerializer`][rdflib.plugins.serializers.longturtle.LongTurtleSerializer] | +| xml | [`XMLSerializer`][rdflib.plugins.serializers.rdfxml.XMLSerializer] | + +### JSON-LD + +JSON-LD - 'json-ld' - has been incorporated into RDFLib since v6.0.0. + +### RDF Patch + +The RDF Patch Serializer - 'patch' - uses the RDF Patch format defined at https://afs.github.io/rdf-patch/. It supports serializing context aware stores as either addition or deletion patches; and also supports serializing the difference between two context aware stores as a Patch of additions and deletions.
+ +### HexTuples + +The HexTuples Serializer - 'hext' - uses the HexTuples format defined at https://github.com/ontola/hextuples. + +For serialization of non-context-aware data sources, e.g. a single `Graph`, the 'graph' field (the 6th variable in the Hextuple) will be an empty string. + +For context-aware (multi-graph) serialization, the 'graph' field of the default graph will be an empty string and the values for other graphs will be Blank Node IDs or IRIs. + +### Longturtle + +Longturtle is the Turtle format with newlines preferred over compactness - it avoids placing multiple nodes on the same line, which makes the format friendlier to text-based version control (think Git) - and with the more modern form of prefix marker - PREFIX instead of @prefix - to make it as similar to SPARQL as possible. + +Longturtle is Turtle 1.1 compliant and will work wherever ordinary Turtle works; however, some very old parsers don't understand PREFIX, only @prefix. + +## Plugin query results + +Plugins for reading and writing of (SPARQL) [`Result`][rdflib.query.Result] - pass `name` to either [`parse()`][rdflib.query.Result.parse] or [`serialize()`][rdflib.query.Result.serialize]. + +### Parsers + +| Name | Class | +| ---- | ----- | +| csv | [`CSVResultParser`][rdflib.plugins.sparql.results.csvresults.CSVResultParser] | +| json | [`JSONResultParser`][rdflib.plugins.sparql.results.jsonresults.JSONResultParser] | +| tsv | [`TSVResultParser`][rdflib.plugins.sparql.results.tsvresults.TSVResultParser] | +| xml | [`XMLResultParser`][rdflib.plugins.sparql.results.xmlresults.XMLResultParser] | + +### Serializers + +| Name | Class | +| ---- | ----- | +| csv | [`CSVResultSerializer`][rdflib.plugins.sparql.results.csvresults.CSVResultSerializer] | +| json | [`JSONResultSerializer`][rdflib.plugins.sparql.results.jsonresults.JSONResultSerializer] | +| txt | [`TXTResultSerializer`][rdflib.plugins.sparql.results.txtresults.TXTResultSerializer] | +| xml |
[`XMLResultSerializer`][rdflib.plugins.sparql.results.xmlresults.XMLResultSerializer] | diff --git a/docs/plugins.rst b/docs/plugins.rst deleted file mode 100644 index fd3ef5073c..0000000000 --- a/docs/plugins.rst +++ /dev/null @@ -1,21 +0,0 @@ - -Plugins -======= - -.. image:: /_static/plugins-diagram.* - :alt: rdflib plugin "architecture" - :width: 450px - :target: _static/plugins-diagram.svg - - -Many parts of RDFLib are extensible with plugins, `see setuptools' 'Creating and discovering plugins' `_. These pages list the plugins included in RDFLib core. - - - -.. toctree:: - :maxdepth: 1 - - plugin_parsers - plugin_serializers - plugin_stores - plugin_query_results diff --git a/docs/rdf_terms.md b/docs/rdf_terms.md new file mode 100644 index 0000000000..b1d29325b4 --- /dev/null +++ b/docs/rdf_terms.md @@ -0,0 +1,154 @@ +# RDF terms in rdflib + +Terms are the kinds of objects that can appear in a RDFLib's graph's triples. Those that are part of core RDF concepts are: `IRIs`, `Blank Node` and `Literal`, the latter consisting of a literal value and either a [datatype](https://www.w3.org/TR/xmlschema-2/#built-in-datatypes) or an [RFC 3066](https://tools.ietf.org/html/rfc3066) language tag. + +!!! info "Origins" + RDFLib's class for representing IRIs/URIs is called "URIRef" because, at the time it was implemented, that was what the then current RDF specification called URIs/IRIs. We preserve that class name but refer to the RDF object as "IRI". + +## Class hierarchy + +All terms in RDFLib are sub-classes of the [`Identifier`][rdflib.term.Identifier] class. A class diagram of the various terms is: + +![Term Class Hierarchy](_static/term_class_hierarchy.svg) + +Nodes are a subset of the Terms that underlying stores actually persist. + +The set of such Terms depends on whether or not the store is formula-aware. Stores that aren't formula-aware only persist those terms core to the RDF Model but those that are formula-aware also persist the N3 extensions. 
However, utility terms that only serve the purpose of matching nodes by term-patterns will probably only be terms and not nodes. + +## Python Classes + +The three main RDF objects - *IRI*, *Blank Node* and *Literal* - are represented in RDFLib by these three main Python classes: + +### URIRef + +An IRI (Internationalized Resource Identifier) is represented within RDFLib using the [`URIRef class`][rdflib.term.URIRef]. From [the RDF 1.1 specification's IRI section](https://www.w3.org/TR/rdf11-concepts/#section-IRIs): + +```python +>>> from rdflib import URIRef +>>> uri = URIRef() # doctest: +SKIP +Traceback (most recent call last): + File "<stdin>", line 1, in <module> +TypeError: __new__() missing 1 required positional argument: 'value' +>>> uri = URIRef('') +>>> uri +rdflib.term.URIRef('') +>>> uri = URIRef('http://example.com') +>>> uri +rdflib.term.URIRef('http://example.com') +>>> uri.n3() +'<http://example.com>' +``` + +### BNodes + +In RDF, a blank node (also called BNode) is a node in an RDF graph representing a resource for which an IRI or literal is not given. The resource represented by a blank node is also called an anonymous resource. According to the RDF standard, a blank node can only be used as subject or object in a triple, although in some syntaxes like Notation 3 it is acceptable to use a blank node as a predicate. If a blank node has a node ID (not all blank nodes are labelled in all RDF serializations), it is limited in scope to a particular serialization of the RDF graph, i.e. the node p1 in one graph does not represent the same node as a node named p1 in any other graph -- [wikipedia](http://en.wikipedia.org/wiki/Blank_node) + +See the [`BNode`][rdflib.term.BNode] class' documentation. + +```python +>>> from rdflib import BNode +>>> bn = BNode() +>>> bn # doctest: +SKIP +rdflib.term.BNode('AFwALAKU0') +>>> bn.n3() # doctest: +SKIP +'_:AFwALAKU0' +``` + +### Literals + +Literals are attribute values in RDF, for instance, a person's name, the date of birth, height, etc.
and are stored using simple data types, e.g. *string*, *double*, *dateTime* etc. This usually looks something like this: + +```python +name = Literal("Nicholas") # the name 'Nicholas', as a string + +age = Literal(39, datatype=XSD.integer) # the number 39, as an integer +``` + +A slightly special case is a *langString* which is a *string* with a language tag, e.g.: + +```python +name = Literal("Nicholas", lang="en") # the name 'Nicholas', as an English string +imie = Literal("Mikołaj", lang="pl") # the Polish version of the name 'Nicholas' +``` + +Special literal types are indicated by use of a custom IRI for a literal's `datatype` value; for example, the [GeoSPARQL RDF standard](https://opengeospatial.github.io/ogc-geosparql/geosparql11/spec.html#_geometry_serializations) invents a custom datatype, `geoJSONLiteral`, to indicate [GeoJSON geometry serializations](https://opengeospatial.github.io/ogc-geosparql/geosparql11/spec.html#_rdfs_datatype_geogeojsonliteral) like this: + +```python +GEO = Namespace("http://www.opengis.net/ont/geosparql#") + +geojson_geometry = Literal( +    '''{"type": "Point", "coordinates": [-83.38,33.95]}''', +    datatype=GEO.geoJSONLiteral +) +``` + +See the [`Literal`][rdflib.term.Literal] class' documentation, followed by notes on Literal from the [RDF 1.1 specification 'Literals' section](https://www.w3.org/TR/rdf11-concepts/#section-Graph-Literal). + +A literal in an RDF graph contains one or two named components. + +All literals have a lexical form being a Unicode string, which SHOULD be in Normal Form C. + +Plain literals have a lexical form and optionally a language tag as defined by [RFC 3066](https://tools.ietf.org/html/rfc3066), normalized to lowercase. An exception will be raised if illegal language-tags are passed to [\_\_new\_\_()][rdflib.term.Literal.__new__]. + +Typed literals have a lexical form and a datatype URI being an RDF URI reference. + +!!! abstract "Language vs.
locale" + When using the language tag, care must be taken not to confuse language with locale. The language tag relates only to human language text. Presentational issues should be addressed in end-user applications. + +!!! quote "Case sensitive" + The case normalization of language tags is part of the description of the abstract syntax, and consequently the abstract behaviour of RDF applications. It does not constrain an RDF implementation to actually normalize the case. Crucially, the result of comparing two language tags should not be sensitive to the case of the original input. -- [RDF Concepts and Abstract Syntax](http://www.w3.org/TR/rdf-concepts/#section-Graph-URIref) + +#### Common XSD datatypes + +Most simple literals such as *string* or *integer* have XML Schema (XSD) datatypes defined for them, see the figure below. Additionally, these XSD datatypes are listed in the [XSD Namespace class][rdflib.namespace.XSD] that ships with RDFLib, so many Python code editors will prompt you with autocomplete for them when using it. + +Remember, you don't *have* to use XSD datatypes and can always make up your own, as GeoSPARQL does, as described above. + +![datatype hierarchy](_static/datatype_hierarchy.png) + +#### Python conversions + +RDFLib Literals essentially behave like unicode characters with an XML Schema datatype or language attribute. + +The class provides a mechanism to both convert Python literals (and their built-ins such as time/date/datetime) into equivalent RDF Literals and (conversely) convert Literals to their Python equivalent. 
This mapping to and from Python literals is done as follows: + +| XML Datatype | Python type | +|--------------|-------------| +| None | None [^1] | +| xsd:time | time [^2] | +| xsd:date | date | +| xsd:dateTime | datetime | +| xsd:string | None | +| xsd:normalizedString | None | +| xsd:token | None | +| xsd:language | None | +| xsd:boolean | boolean | +| xsd:decimal | Decimal | +| xsd:integer | long | +| xsd:nonPositiveInteger | int | +| xsd:long | long | +| xsd:nonNegativeInteger | int | +| xsd:negativeInteger | int | +| xsd:int | long | +| xsd:unsignedLong | long | +| xsd:positiveInteger | int | +| xsd:short | int | +| xsd:unsignedInt | long | +| xsd:byte | int | +| xsd:unsignedShort | int | +| xsd:unsignedByte | int | +| xsd:float | float | +| xsd:double | float | +| xsd:base64Binary | base64 | +| xsd:anyURI | None | +| rdf:XMLLiteral | Document (xml.dom.minidom.Document) [^3] | +| rdf:HTML | DocumentFragment (xml.dom.minidom.DocumentFragment) | + +[^1]: plain literals map directly to value space +[^2]: Date, time and datetime literals are mapped to Python instances using the RDFLib xsd_datetime module, which is based on the [isodate](http://pypi.python.org/pypi/isodate/) package. +[^3]: this is a bit dirty - by accident the `html5lib` parser produces `DocumentFragments`, and the xml parser `Documents`, letting us use this to decide which datatype to use when round-tripping. + +An appropriate data-type and lexical representation can be found using `_castPythonToLiteral`, and the other direction with `_castLexicalToPython`. + +All this happens automatically when creating `Literal` objects by passing Python objects to the constructor, and you never have to do this manually. + +You can add custom data-types with [`bind()`][rdflib.term.bind]; see also the [`custom_datatype example`][examples.custom_datatype] diff --git a/docs/rdf_terms.rst b/docs/rdf_terms.rst deleted file mode 100644 index f83127da83..0000000000 --- a/docs/rdf_terms.rst +++ /dev/null @@ -1,230 +0,0 @@ -..
_rdf_terms: RDF terms in rdflib - -=================== -RDF terms in rdflib -=================== - -Terms are the kinds of objects that can appear in a RDFLib's graph's triples. -Those that are part of core RDF concepts are: ``IRIs``, ``Blank Node`` -and ``Literal``, the latter consisting of a literal value and either a `datatype `_ -or an :rfc:`3066` language tag. - -.. note:: RDFLib's class for representing IRIs/URIs is called "URIRef" because, at the time it was implemented, that was what the then current RDF specification called URIs/IRIs. We preserve that class name but refer to the RDF object as "IRI". - -Class hierarchy -=============== - -All terms in RDFLib are sub-classes of the :class:`rdflib.term.Identifier` class. A class diagram of the various terms is: - -.. _term_class_hierarchy: -.. figure:: /_static/term_class_hierarchy.svg - :alt: Term Class Hierarchy - - Term Class Hierarchy - - -Nodes are a subset of the Terms that underlying stores actually persist. - -The set of such Terms depends on whether or not the store is formula-aware. -Stores that aren't formula-aware only persist those terms core to the -RDF Model but those that are formula-aware also persist the N3 -extensions. However, utility terms that only serve the purpose of -matching nodes by term-patterns will probably only be terms and not nodes. - -Python Classes -============== - -The three main RDF objects - *IRI*, *Blank Node* and *Literal* are represented in RDFLib by these three main Python classes: - -URIRef ------- - -An IRI (Internationalized Resource Identifier) is represented within RDFLib using the URIRef class. From `the RDF 1.1 specification's IRI section `_: - -Here is the *URIRef* class' auto-built documentation: - -.. autoclass:: rdflib.term.URIRef - :noindex: - -.. 
code-block:: python - - >>> from rdflib import URIRef - >>> uri = URIRef() # doctest: +SKIP - Traceback (most recent call last): - File "", line 1, in - TypeError: __new__() missing 1 required positional argument: 'value' - >>> uri = URIRef('') - >>> uri - rdflib.term.URIRef('') - >>> uri = URIRef('http://example.com') - >>> uri - rdflib.term.URIRef('http://example.com') - >>> uri.n3() - '' - - -BNodes ------- - -In RDF, a blank node (also called BNode) is a node in an RDF graph representing a resource for which an IRI or literal is not given. The resource represented by a blank node is also called an anonymous resource. According to the RDF standard, a blank node can only be used as subject or object in a triple, although in some syntaxes like Notation 3 it is acceptable to use a blank node as a predicate. If a blank node has a node ID (not all blank nodes are labelled in all RDF serializations), it is limited in scope to a particular serialization of the RDF graph, i.e. the node p1 in one graph does not represent the same node as a node named p1 in any other graph -- `wikipedia`__ - - -.. __: http://en.wikipedia.org/wiki/Blank_node - -Here is the *BNode* class' auto-built documentation: - -.. autoclass:: rdflib.term.BNode - :noindex: - -.. code-block:: python - - >>> from rdflib import BNode - >>> bn = BNode() - >>> bn # doctest: +SKIP - rdflib.term.BNode('AFwALAKU0') - >>> bn.n3() # doctest: +SKIP - '_:AFwALAKU0' - - -.. _rdflibliterals: - -Literals --------- - -Literals are attribute values in RDF, for instance, a person's name, the date of birth, height, etc. -and are stored using simple data types, e.g. *string*, *double*, *dateTime* etc. This usually looks -something like this: - -.. code-block:: python - - name = Literal("Nicholas") # the name 'Nicholas', as a string - - age = Literal(39, datatype=XSD.integer) # the number 39, as an integer - - - -A slightly special case is a *langString* which is a *string* with a language tag, e.g.: - -.. 
code-block:: python - - name = Literal("Nicholas", lang="en") # the name 'Nicholas', as an English string - imie = Literal("Mikołaj", lang="pl") # the Polish version of the name 'Nicholas' - - -Special literal types indicated by use of a custom IRI for a literal's ``datatype`` value, -for example the `GeoSPARQL RDF standard `_ -invents a custom datatype, ``geoJSONLiteral`` to indicate `GeoJSON geometry serlializations `_ -like this: - -.. code-block:: python - - GEO = Namespace("http://www.opengis.net/ont/geosparql#") - - geojson_geometry = Literal( - '''{"type": "Point", "coordinates": [-83.38,33.95]}''', - datatype=GEO.geoJSONLiteral - - -Here is the ``Literal`` class' auto-built documentation, followed by notes on Literal from the `RDF 1.1 specification 'Literals' section `_. - -.. autoclass:: rdflib.term.Literal - :noindex: - -A literal in an RDF graph contains one or two named components. - -All literals have a lexical form being a Unicode string, which SHOULD be in Normal Form C. - -Plain literals have a lexical form and optionally a language tag as defined by :rfc:`3066`, normalized to lowercase. An exception will be raised if illegal language-tags are passed to :meth:`rdflib.term.Literal.__new__`. - -Typed literals have a lexical form and a datatype URI being an RDF URI reference. - -.. note:: When using the language tag, care must be taken not to confuse language with locale. The language tag relates only to human language text. Presentational issues should be addressed in end-user applications. - -.. note:: The case normalization of language tags is part of the description of the abstract syntax, and consequently the abstract behaviour of RDF applications. It does not constrain an RDF implementation to actually normalize the case. Crucially, the result of comparing two language tags should not be sensitive to the case of the original input. -- `RDF Concepts and Abstract Syntax`__ - - - -.. 
__: http://www.w3.org/TR/rdf-concepts/#section-Graph-URIref - -Common XSD datatypes -^^^^^^^^^^^^^^^^^^^^ - -Most simple literals such as *string* or *integer* have XML Schema (XSD) datatypes defined for them, see the figure -below. Additionally, these XSD datatypes are listed in the :class:`XSD Namespace class ` that -ships with RDFLib, so many Python code editors will prompt you with autocomplete for them when using it. - -Remember, you don't *have* to use XSD datatypes and can always make up your own, as GeoSPARQL does, as described above. - -.. image:: /_static/datatype_hierarchy.png - :alt: datatype hierarchy - :align: center - :width: 629 - :height: 717 - -Python conversions -^^^^^^^^^^^^^^^^^^ - -RDFLib Literals essentially behave like unicode characters with an XML Schema datatype or language attribute. - -The class provides a mechanism to both convert Python literals (and their built-ins such as time/date/datetime) -into equivalent RDF Literals and (conversely) convert Literals to their Python equivalent. This mapping to and -from Python literals is done as follows: - -====================== =========== -XML Datatype Python type -====================== =========== -None None [#f1]_ -xsd:time time [#f2]_ -xsd:date date -xsd:dateTime datetime -xsd:string None -xsd:normalizedString None -xsd:token None -xsd:language None -xsd:boolean boolean -xsd:decimal Decimal -xsd:integer long -xsd:nonPositiveInteger int -xsd:long long -xsd:nonNegativeInteger int -xsd:negativeInteger int -xsd:int long -xsd:unsignedLong long -xsd:positiveInteger int -xsd:short int -xsd:unsignedInt long -xsd:byte int -xsd:unsignedShort int -xsd:unsignedByte int -xsd:float float -xsd:double float -xsd:base64Binary :mod:`base64` -xsd:anyURI None -rdf:XMLLiteral :class:`xml.dom.minidom.Document` [#f3]_ -rdf:HTML :class:`xml.dom.minidom.DocumentFragment` -====================== =========== - -.. [#f1] plain literals map directly to value space - -.. 
[#f2] Date, time and datetime literals are mapped to Python - instances using the RDFlib xsd_datetime module, that is based - on the `isodate `_ - package). - -.. [#f3] this is a bit dirty - by accident the ``html5lib`` parser - produces ``DocumentFragments``, and the xml parser ``Documents``, - letting us use this to decide what datatype when round-tripping. - -An appropriate data-type and lexical representation can be found using: - -.. autofunction:: rdflib.term._castPythonToLiteral - -and the other direction with - -.. autofunction:: rdflib.term._castLexicalToPython - -All this happens automatically when creating ``Literal`` objects by passing Python objects to the constructor, -and you never have to do this manually. - -You can add custom data-types with :func:`rdflib.term.bind`, see also :mod:`examples.custom_datatype` - diff --git a/docs/security_considerations.md b/docs/security_considerations.md new file mode 100644 index 0000000000..811432570a --- /dev/null +++ b/docs/security_considerations.md @@ -0,0 +1,78 @@ +# Security Considerations + +RDFLib is designed to access arbitrary network and file resources, in some cases these are directly requested resources, in other cases they are indirectly referenced resources. + +An example of where indirect resources are accessed is JSON-LD processing, where network or file resources referenced by `@context` values will be loaded and processed. + +RDFLib also supports SPARQL, which has federated query capabilities that allow +queries to query arbitrary remote endpoints. + +If you are using RDFLib to process untrusted documents or queries, you should +take measures to restrict file and network access. 
+ +Some measures that can be taken to restrict file and network access are: + +* [Operating System Security Measures](#operating-system-security-measures) +* [Python Runtime Audit Hooks](#python-runtime-audit-hooks) +* [Custom URL Openers](#custom-url-openers) + +Of these, operating system security measures are recommended. The other measures work, but they are not as effective as operating system security measures, and even if they are used, they should be used in conjunction with operating system security measures. + +## Operating System Security Measures + +Most operating systems provide functionality that can be used to restrict network and file access of a process. + +Some examples of these include: + +* [Open Container Initiative (OCI) Containers](https://www.opencontainers.org/) (aka Docker containers). + + Most OCI runtimes provide mechanisms to restrict network and file + access of containers. For example, using Docker, you can limit your + container to only being able to access files explicitly mapped into + the container and only access the network through a firewall. For more + information, refer to the documentation of the tool you use to manage + your OCI containers: + + * [Kubernetes](https://kubernetes.io/docs/home/) + * [Docker](https://docs.docker.com/) + * [Podman](https://podman.io/) + +* [firejail](https://firejail.wordpress.com/) can be used to + sandbox a process on Linux and restrict its network and file access. + +* File and network access restrictions. + + Most operating systems provide a way to restrict operating system users to + only being able to access files and network resources that are explicitly + allowed. Applications that process untrusted input could be run as a user with + these restrictions in place. + +Many other measures are available, however, listing them is outside the scope of this document. + +Of the listed measures, OCI containers are recommended. 
In most cases, OCI containers are constrained by default and can't access the loopback interface and can only access files that are explicitly mapped into the container. + +## Python Runtime Audit Hooks + +From Python 3.8 onwards, Python provides a mechanism to install runtime audit hooks that can be used to limit access to files and network resources. + +The runtime audit hook system is described in more detail in [PEP 578 – Python Runtime Audit Hooks](https://peps.python.org/pep-0578/). + +Runtime audit hooks can be installed using the [sys.addaudithook](https://docs.python.org/3/library/sys.html#sys.addaudithook) function, and will then get called when audit events occur. The audit events raised by the Python runtime and standard library are described in Python's [audit events table](https://docs.python.org/3/library/audit_events.html). + +RDFLib uses `urllib.request.urlopen` for HTTP, HTTPS and other network access, and this function raises a `urllib.Request` audit event. For file access, RDFLib uses `open`, which raises an `open` audit event. + +Users of RDFLib can install audit hooks that react to these audit events and raise an exception when an attempt is made to access files or network resources that are not explicitly allowed. + +RDFLib's test suite includes tests which verify that audit hooks can block access to network and file resources. + +RDFLib also includes an example that shows how runtime audit hooks can be used to restrict network and file access in [`secure_with_audit`][examples.secure_with_audit]. + +## Custom URL Openers + +RDFLib uses the `urllib.request.urlopen` for HTTP, HTTPS and other network access. This function will use a `urllib.request.OpenerDirector` installed with `urllib.request.install_opener` to open the URLs. + +Users of RDFLib can install a custom URL opener that raises an exception when an attempt is made to access network resources that are not explicitly allowed. 
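A minimal sketch of such an opener, using a hypothetical allow-list (the handler class and host names below are invented for illustration and are not part of RDFLib):

```python
import urllib.request

ALLOWED_HOSTS = {"example.com"}  # hypothetical allow-list of permitted hosts


class AllowListHandler(urllib.request.BaseHandler):
    """Pre-processor that rejects requests to hosts not on the allow-list."""

    handler_order = 100  # run before the default handlers

    def allowlist_request(self, req: urllib.request.Request) -> urllib.request.Request:
        # Called by the OpenerDirector before any connection is attempted.
        if req.host not in ALLOWED_HOSTS:
            raise PermissionError(f"blocked network access to {req.host!r}")
        return req

    # Register the check for both HTTP and HTTPS requests.
    http_request = allowlist_request
    https_request = allowlist_request


# Install globally: all urllib.request.urlopen calls (including those made
# internally by RDFLib) now pass through the allow-list check first.
urllib.request.install_opener(urllib.request.build_opener(AllowListHandler()))
```

With this opener installed, a call such as `urllib.request.urlopen("http://blocked.invalid/")` raises `PermissionError` before any network connection is attempted.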
+ +RDFLib's test suite includes tests which verify that custom URL openers can be used to block access to network resources. + +RDFLib also includes an example that shows how a custom opener can be used to restrict network access in [`secure_with_urlopen`][examples.secure_with_urlopen]. diff --git a/docs/security_considerations.rst b/docs/security_considerations.rst deleted file mode 100644 index 77925a0f55..0000000000 --- a/docs/security_considerations.rst +++ /dev/null @@ -1,114 +0,0 @@ -.. _security_considerations: Security Considerations - -======================= -Security Considerations -======================= - -RDFLib is designed to access arbitrary network and file resources, in some cases -these are directly requested resources, in other cases they are indirectly -referenced resources. - -An example of where indirect resources are accessed is JSON-LD processing, where -network or file resources referenced by ``@context`` values will be loaded and -processed. - -RDFLib also supports SPARQL, which has federated query capabilities that allow -queries to query arbitrary remote endpoints. - -If you are using RDFLib to process untrusted documents or queries, you should -take measures to restrict file and network access. - -Some measures that can be taken to restrict file and network access are: - -* `Operating System Security Measures`_. -* `Python Runtime Audit Hooks`_. -* `Custom URL Openers`_. - -Of these, operating system security measures are recommended. The other -measures work, but they are not as effective as operating system security -measures, and even if they are used, they should be used in conjunction with -operating system security measures. - -Operating System Security Measures -================================== - -Most operating systems provide functionality that can be used to restrict -network and file access of a process. - -Some examples of these include: - -* `Open Container Initiative (OCI) Containers - `_ (aka Docker containers). 
- - Most OCI runtimes provide mechanisms to restrict network and file - access of containers. For example, using Docker, you can limit your - container to only being able to access files explicitly mapped into - the container and only access the network through a firewall. For more - information, refer to the documentation of the tool you use to manage - your OCI containers: - - * `Kubernetes `_ - * `Docker `_ - * `Podman `_ - -* `firejail `_ can be used to - sandbox a process on Linux and restrict its network and file access. - -* File and network access restrictions. - - Most operating systems provide a way to restrict operating system users to - only being able to access files and network resources that are explicitly - allowed. Applications that process untrusted input could be run as a user with - these restrictions in place. - -Many other measures are available, however, listing them is outside -the scope of this document. - -Of the listed measures, OCI containers are recommended. In most cases, OCI -containers are constrained by default and can't access the loopback interface -and can only access files that are explicitly mapped into the container. - -Python Runtime Audit Hooks -========================== - -From Python 3.8 onwards, Python provides a mechanism to install runtime audit -hooks that can be used to limit access to files and network resources. - -The runtime audit hook system is described in more detail in `PEP 578 – Python -Runtime Audit Hooks `_. - -Runtime audit hooks can be installed using the `sys.addaudithook -`_ function, and -will then get called when audit events occur. The audit events raised by the -Python runtime and standard library are described in Python's `audit events -table `_. - -RDFLib uses `urllib.request.urlopen` for HTTP, HTTPS and other network access, -and this function raises a ``urllib.Request`` audit event. For file access, -RDFLib uses `open`, which raises an ``open`` audit event. 
- -Users of RDFLib can install audit hooks that react to these audit events and -raise an exception when an attempt is made to access files or network resources -that are not explicitly allowed. - -RDFLib's test suite includes tests which verify that audit hooks can block -access to network and file resources. - -RDFLib also includes an example that shows how runtime audit hooks can be -used to restrict network and file access in :mod:`~examples.secure_with_audit`. - -Custom URL Openers -================== - -RDFLib uses the `urllib.request.urlopen` for HTTP, HTTPS and other network -access. This function will use a `urllib.request.OpenerDirector` installed with -`urllib.request.install_opener` to open the URLs. - -Users of RDFLib can install a custom URL opener that raises an exception when an -attempt is made to access network resources that are not explicitly allowed. - -RDFLib's test suite includes tests which verify that custom URL openers can be -used to block access to network resources. - -RDFLib also includes an example that shows how a custom opener can be used to -restrict network access in :mod:`~examples.secure_with_urlopen`. diff --git a/docs/type_hints.rst b/docs/type_hints.md similarity index 56% rename from docs/type_hints.rst rename to docs/type_hints.md index 31eed6ee79..b859526eb4 100644 --- a/docs/type_hints.rst +++ b/docs/type_hints.md @@ -1,29 +1,22 @@ -.. _type_hints: Type Hints +# Type Hints -========== -Type Hints -========== +This document provides some details about the type hints for RDFLib. More information about type hints can be found [here](https://docs.python.org/3/library/typing.html) -This document provides some details about the type hints for RDFLib. 
More information about type hints can be found `here `_ +## Rationale for Type Hints -Rationale for Type Hints -======================== - -Type hints are code annotations that describe the types of variables, function parameters and function return value types in a way that can be understood by humans, static type checkers like `mypy `_, code editors like VSCode, documentation generators like Sphinx, and other tools. +Type hints are code annotations that describe the types of variables, function parameters and function return value types in a way that can be understood by humans, static type checkers like [mypy](http://mypy-lang.org/), code editors like VSCode, documentation generators like mkdocstring, and other tools. Static type checkers can use type hints to detect certain classes of errors by inspection. Code editors and IDEs can use type hints to provide better auto-completion and documentation generators can use type hints to generate better documentation. These capabilities make it easier to develop a defect-free RDFLib and they also make it easier for users of RDFLib who can now use static type checkers to detect type errors in code that uses RDFLib. -Gradual Typing Process -====================== +## Gradual Typing Process -Type hints are being added to RDFLib through a process called `gradual typing `_. This process involves adding type hints to some parts of RDFLib while leaving the rest without type hints. Gradual typing is being applied to many, long-lived, Python code bases. +Type hints are being added to RDFLib through a process called [gradual typing](https://en.wikipedia.org/wiki/Gradual_typing). This process involves adding type hints to some parts of RDFLib while leaving the rest without type hints. Gradual typing is being applied to many, long-lived, Python code bases. This process is beneficial in that we can realize some of the benefits of type hints without requiring that the whole codebase have type hints. 
-Intended Type Hints -=================== +## Intended Type Hints The intent is to have type hints in place for all of RDFLib and to have these type hints be as accurate as possible. @@ -31,33 +24,32 @@ The accuracy of type hints is determined by both the standards that RDFLib aims There may be cases where some functionality of RDFLib may work perfectly well with values of types that are excluded by the type hints, but if these additional types violate the relevant standards we will consider the correct type hints to be those that exclude values of these types. -Public Type Aliases -=================== -In python, type hints are specified in annotations. Type hints are different from type aliases which are normal python variables that are not intended to provide runtime utility and are instead intended for use in static type checking. +## Public Type Aliases -For clarity, the following is an example of a function ``foo`` with type hints: +In python, type hints are specified in annotations. Type hints are different from type aliases which are normal python variables that are not intended to provide runtime utility and are instead intended for use in static type checking. -.. code-block:: python - - def foo(a: int) -> int: - return a + 1 +For clarity, the following is an example of a function `foo` with type hints: -In the function ``foo``, the input variable ``a`` is indicated to be of type ``int`` and the function is indicated to return an ``int``. +```python +def foo(a: int) -> int: + return a + 1 +``` -The following is an example of a type alias ``Bar``: +In the function `foo`, the input variable `a` is indicated to be of type `int` and the function is indicated to return an `int`. -.. 
code-block:: python
+The following is an example of a type alias `Bar`:

-    from typing import Tuple
+```python
+from typing import Tuple

-    Bar = Tuple[int, str]
+Bar = Tuple[int, str]
+```

-RDFLib will provide public type aliases under the ``rdflib.typing`` package, for example, ``rdflib.typing.Triple``, ``rdflib.typing.Quad``. Type aliases in the rest of RDFLib should be private (i.e. being with an underscore).
+RDFLib will provide public type aliases under the `rdflib.typing` package, for example, `rdflib.typing.Triple`, `rdflib.typing.Quad`. Type aliases in the rest of RDFLib should be private (i.e. begin with an underscore).

-Versioning, Compatibility and Stability
-=======================================
+## Versioning, Compatibility and Stability

-RDFLib attempts to adhere to `semver 2.0 `_ which is concerned with the public API of software.
+RDFLib attempts to adhere to [semver 2.0](https://semver.org/spec/v2.0.0.html) which is concerned with the public API of software.

 Ignoring type hints, the public API of RDFLib exists implicitly as a consequence of the code of RDFLib and the actual behaviour this entails, the relevant standards that RDFLib is trying to implement, and the documentation of RDFLib, with some interplay between all three of these.

 RDFLib's public API includes public type aliases, as these are normal python variables and not annotations.
@@ -70,18 +62,17 @@ Changes to type hints can broadly be classified as follow:

 **Type Declaration**
     Adding type hints to existing code that had no explicit type hints, for example, changing

-    .. code-block:: python
-
-        def foo(val):
-            return val + 1
-
-    to
+```python
+def foo(val):
+    return val + 1
+```

-    .. code-block:: python
-
-        def foo(val: int) -> int:
-            return val + 1
+to

+```python
+def foo(val: int) -> int:
+    return val + 1
+```

 **Type Refinement**
     Refining existing type hints to be narrower, for example, changing a type hint of `typing.Collection` to `typing.Sequence`.
@@ -89,33 +80,13 @@ Changes to type hints can broadly be classified as follow:

 **Type Corrections**
     Correcting existing type hints which contradict the behaviour of the code or relevant specifications, for example, changing `typing.Sequence` from `typing.Set`

-Given semver version components ``MAJOR.MINOR.PATCH``, RDFLib will attempt to constrain type hint changes as follow:
-
-.. list-table::
-   :widths: 1 1 1 1
-   :header-rows: 1
-
-   * - Version Component
-     - Type Declaration
-     - Type Refinement
-     - Type Corrections
-
-   * - MAJOR
-     - YES
-     - YES
-     - YES
-
-   * - MINOR
-     - YES
-     - YES
-     - YES
-
-   * - PATCH
-     - NO
-     - NO
-     - YES
-
-.. CAUTION::
-   A caveat worth nothing here is that code that passed type validation on one version of RDFLib can fail type validation on a later version of RDFLib that only differs in ``PATCH`` version component. This is as a consequence of potential *Type Corrections*.
+Given semver version components `MAJOR.MINOR.PATCH`, RDFLib will attempt to constrain type hint changes as follows:

+| Version Component | Type Declaration | Type Refinement | Type Corrections |
+|------------------|-----------------|----------------|-----------------|
+| MAJOR | YES | YES | YES |
+| MINOR | YES | YES | YES |
+| PATCH | NO | NO | YES |

+!!! caution "Type Corrections"
+    A caveat worth noting here is that code that passed type validation on one version of RDFLib can fail type validation on a later version of RDFLib that only differs in `PATCH` version component. This is a consequence of potential *Type Corrections*.
diff --git a/docs/upgrade4to5.md b/docs/upgrade4to5.md
new file mode 100644
index 0000000000..2d42c85d6a
--- /dev/null
+++ b/docs/upgrade4to5.md
@@ -0,0 +1,203 @@
+# Upgrading from RDFLib version 4.2.2 to 5.0.0
+
+RDFLib version 5.0.0 appeared over 3 years after the previous release, 4.2.2, and contains a large number of both enhancements and bug fixes. Fundamentally though, 5.0.0 is compatible with 4.2.2.
+
+## Major Changes
+
+### Literal Ordering
+
+Literal total ordering [PR #793](https://github.com/RDFLib/rdflib/pull/793) is implemented. That means all literals can now be compared to be greater than or less than any other literal. This is required for implementing some specific SPARQL features, but it is counter-intuitive to those who are expecting a TypeError when certain normally-incompatible types are compared. For example, comparing a `Literal(int(1), datatype=xsd:integer)` to `Literal(datetime.date(2020, 10, 1), datatype=xsd:date)` using a `>` or `<` operator in rdflib 4.2.2 and earlier would normally throw a TypeError; in rdflib 5.0.0 this operation now returns True or False according to the Literal Total Ordering rules outlined in [PR #793](https://github.com/RDFLib/rdflib/pull/793).
+
+### Removed RDF Parsers
+
+The RDFa and Microdata format RDF parsers were removed from rdflib. There are still other python libraries available to implement these parsers.
+
+## All Changes
+
+This list has been assembled from Pull Request and commit information.
+
+### General Bugs Fixed
+
+* Pr 451 redux
+  [PR #978](https://github.com/RDFLib/rdflib/pull/978)
+* NTriples fails to parse URIs with only a scheme
+  [ISSUE #920](https://github.com/RDFLib/rdflib/issues/920)
+  [PR #974](https://github.com/RDFLib/rdflib/pull/974)
+* cannot clone it on windows - Remove colons from test result files. Fix #901.
+ [ISSUE #901](https://github.com/RDFLib/rdflib/issues/901) + [PR #971](https://github.com/RDFLib/rdflib/pull/971) +* Add requirement for requests to setup.py + [PR #969](https://github.com/RDFLib/rdflib/pull/969) +* fixed URIRef including native unicode characters + [PR #961](https://github.com/RDFLib/rdflib/pull/961) +* DCTERMS.format not working + [ISSUE #932](https://github.com/RDFLib/rdflib/issues/932) +* infixowl.manchesterSyntax do not encode strings + [PR #906](https://github.com/RDFLib/rdflib/pull/906) +* Fix blank node label to not contain '_:' during parsing + [PR #886](https://github.com/RDFLib/rdflib/pull/886) +* rename new SPARQLWrapper to SPARQLConnector + [PR #872](https://github.com/RDFLib/rdflib/pull/872) +* Fix #859. Unquote and Uriquote Literal Datatype. + [PR #860](https://github.com/RDFLib/rdflib/pull/860) +* Parsing nquads + [ISSUE #786](https://github.com/RDFLib/rdflib/issues/786) +* ntriples spec allows for upper-cased lang tag, fixes #782 + [PR #784](https://github.com/RDFLib/rdflib/pull/784) +* Error parsing N-Triple file using RDFlib + [ISSUE #782](https://github.com/RDFLib/rdflib/issues/782) +* Adds escaped single quote to literal parser + [PR #736](https://github.com/RDFLib/rdflib/pull/736) +* N3 parse error on single quote within single quotes + [ISSUE #732](https://github.com/RDFLib/rdflib/issues/732) +* Fixed #725 + [PR #730](https://github.com/RDFLib/rdflib/pull/730) +* test for issue #725: canonicalization collapses BNodes + [PR #726](https://github.com/RDFLib/rdflib/pull/726) +* RGDA1 graph canonicalization sometimes still collapses distinct BNodes + [ISSUE #725](https://github.com/RDFLib/rdflib/issues/725) +* Accept header should use a q parameter + [PR #720](https://github.com/RDFLib/rdflib/pull/720) +* Added test for Issue #682 and fixed. 
+ [PR #718](https://github.com/RDFLib/rdflib/pull/718) +* Incompatibility with Python3: unichr + [ISSUE #687](https://github.com/RDFLib/rdflib/issues/687) +* namespace.py include colon in ALLOWED_NAME_CHARS + [PR #663](https://github.com/RDFLib/rdflib/pull/663) +* namespace.py fix compute_qname missing namespaces + [PR #649](https://github.com/RDFLib/rdflib/pull/649) +* RDFa parsing Error! `__init__()` got an unexpected keyword argument 'encoding' + [ISSUE #639](https://github.com/RDFLib/rdflib/issues/639) +* Bugfix: `term.Literal.__add__` + [PR #451](https://github.com/RDFLib/rdflib/pull/451) +* fixup of #443 + [PR #445](https://github.com/RDFLib/rdflib/pull/445) +* Microdata to rdf second edition bak + [PR #444](https://github.com/RDFLib/rdflib/pull/444) + +### Enhanced Features + +* Register additional serializer plugins for SPARQL mime types. + [PR #987](https://github.com/RDFLib/rdflib/pull/987) +* Pr 388 redux + [PR #979](https://github.com/RDFLib/rdflib/pull/979) +* Allows RDF terms introduced by JSON-LD 1.1 + [PR #970](https://github.com/RDFLib/rdflib/pull/970) +* make SPARQLConnector work with DBpedia + [PR #941](https://github.com/RDFLib/rdflib/pull/941) +* ClosedNamespace returns right exception for way of access + [PR #866](https://github.com/RDFLib/rdflib/pull/866) +* Not adding all namespaces for n3 serializer + [PR #832](https://github.com/RDFLib/rdflib/pull/832) +* Adds basic support of xsd:duration + [PR #808](https://github.com/RDFLib/rdflib/pull/808) +* Add possibility to set authority and basepath to skolemize graph + [PR #807](https://github.com/RDFLib/rdflib/pull/807) +* Change notation3 list realization to non-recursive function. + [PR #805](https://github.com/RDFLib/rdflib/pull/805) +* Suppress warning for not using custom encoding. 
+ [PR #800](https://github.com/RDFLib/rdflib/pull/800) +* Add support to parsing large xml inputs + [ISSUE #749](https://github.com/RDFLib/rdflib/issues/749) + [PR #750](https://github.com/RDFLib/rdflib/pull/750) +* improve hash efficiency by directly using str/unicode hash + [PR #746](https://github.com/RDFLib/rdflib/pull/746) +* Added the csvw prefix to the RDFa initial context. + [PR #594](https://github.com/RDFLib/rdflib/pull/594) +* syncing changes from pyMicrodata + [PR #587](https://github.com/RDFLib/rdflib/pull/587) +* Microdata parser: updated the parser to the latest version of the microdata->rdf note (published in December 2014) + [PR #443](https://github.com/RDFLib/rdflib/pull/443) +* Literal.toPython() support for xsd:hexBinary + [PR #388](https://github.com/RDFLib/rdflib/pull/388) + +### SPARQL Fixes + +* Total order patch patch + [PR #862](https://github.com/RDFLib/rdflib/pull/862) +* use <<= instead of deprecated << + [PR #861](https://github.com/RDFLib/rdflib/pull/861) +* Fix #847 + [PR #856](https://github.com/RDFLib/rdflib/pull/856) +* RDF Literal "1"^^xsd:boolean should _not_ coerce to True + [ISSUE #847](https://github.com/RDFLib/rdflib/issues/847) +* Makes NOW() return an UTC date + [PR #844](https://github.com/RDFLib/rdflib/pull/844) +* NOW() SPARQL should return an xsd:dateTime with a timezone + [ISSUE #843](https://github.com/RDFLib/rdflib/issues/843) +* fix property paths bug: issue #715 + [PR #822](https://github.com/RDFLib/rdflib/pull/822) + [ISSUE #715](https://github.com/RDFLib/rdflib/issues/715) +* MulPath: correct behaviour of n3() + [PR #820](https://github.com/RDFLib/rdflib/pull/820) +* Literal total ordering + [PR #793](https://github.com/RDFLib/rdflib/pull/793) +* Remove SPARQLWrapper dependency + [PR #744](https://github.com/RDFLib/rdflib/pull/744) +* made UNION faster by not preventing duplicates + [PR #741](https://github.com/RDFLib/rdflib/pull/741) +* added a hook to add custom functions to SPARQL + [PR 
#723](https://github.com/RDFLib/rdflib/pull/723) +* Issue714 + [PR #717](https://github.com/RDFLib/rdflib/pull/717) +* Use <<= instead of deprecated << in SPARQL parser + [PR #417](https://github.com/RDFLib/rdflib/pull/417) +* Custom FILTER function for SPARQL engine + [ISSUE #274](https://github.com/RDFLib/rdflib/issues/274) + +### Code Quality and Cleanups + +* a slightly opinionated autopep8 run + [PR #870](https://github.com/RDFLib/rdflib/pull/870) +* remove rdfa and microdata parsers from core RDFLib + [PR #828](https://github.com/RDFLib/rdflib/pull/828) +* ClosedNamespace KeyError -> AttributeError + [PR #827](https://github.com/RDFLib/rdflib/pull/827) +* typo in rdflib/plugins/sparql/update.py + [ISSUE #760](https://github.com/RDFLib/rdflib/issues/760) +* Fix logging in interactive mode + [PR #731](https://github.com/RDFLib/rdflib/pull/731) +* make namespace module flake8-compliant, change exceptions in that mod… + [PR #711](https://github.com/RDFLib/rdflib/pull/711) +* delete ez_setup.py? 
+ [ISSUE #669](https://github.com/RDFLib/rdflib/issues/669) +* code duplication issue between rdflib and pymicrodata + [ISSUE #582](https://github.com/RDFLib/rdflib/issues/582) +* Transition from 2to3 to use of six.py to be merged in 5.0.0-dev + [PR #519](https://github.com/RDFLib/rdflib/pull/519) +* sparqlstore drop deprecated methods and args + [PR #516](https://github.com/RDFLib/rdflib/pull/516) +* python3 code seems shockingly inefficient + [ISSUE #440](https://github.com/RDFLib/rdflib/issues/440) +* removed md5_term_hash, fixes #240 + [PR #439](https://github.com/RDFLib/rdflib/pull/439) + [ISSUE #240](https://github.com/RDFLib/rdflib/issues/240) + +### Testing + +* 3.7 for travis + [PR #864](https://github.com/RDFLib/rdflib/pull/864) +* Added trig unit tests to highlight some current parsing/serializing issues + [PR #431](https://github.com/RDFLib/rdflib/pull/431) + +### Documentation Fixes + +* Fix a doc string in the query module + [PR #976](https://github.com/RDFLib/rdflib/pull/976) +* setup.py: Make the license field use an SPDX identifier + [PR #789](https://github.com/RDFLib/rdflib/pull/789) +* Update README.md + [PR #764](https://github.com/RDFLib/rdflib/pull/764) +* Update namespaces_and_bindings.rst + [PR #757](https://github.com/RDFLib/rdflib/pull/757) +* DOC: README.md: rdflib-jsonld, https uris + [PR #712](https://github.com/RDFLib/rdflib/pull/712) +* make doctest support py2/py3 + [ISSUE #707](https://github.com/RDFLib/rdflib/issues/707) +* `pip install rdflib` (as per README.md) gets OSError on Mint 18.1 + [ISSUE #704](https://github.com/RDFLib/rdflib/issues/704) + [PR #717](https://github.com/RDFLib/rdflib/pull/717) +* Use <<= instead of deprecated << in SPARQL parser + [PR #417](https://github.com/RDFLib/rdflib/pull/417) +* Custom FILTER function for SPARQL engine + [ISSUE #274](https://github.com/RDFLib/rdflib/issues/274) diff --git a/docs/upgrade4to5.rst b/docs/upgrade4to5.rst deleted file mode 100644 index f6ae19a109..0000000000 --- 
a/docs/upgrade4to5.rst +++ /dev/null @@ -1,213 +0,0 @@ -.. _upgrade4to5: Upgrading from RDFLib version 4.2.2 to 5.0.0 - -============================================ -Upgrading 4.2.2 to 5.0.0 -============================================ - -RDFLib version 5.0.0 appeared over 3 years after the previous release, 4.2.2 and contains a large number of both enhancements and bug fixes. Fundamentally though, 5.0.0 is compatible with 4.2.2. - - -Major Changes -------------- - -Literal Ordering -^^^^^^^^^^^^^^^^ -Literal total ordering `PR #793 `_ is implemented. That means all literals can now be compared to be greater than or less than any other literal. -This is required for implementing some specific SPARQL features, but it is counter-intuitive to those who are expecting a TypeError when certain normally-incompatible types are compared. -For example, comparing a ``Literal(int(1), datatype=xsd:integer)`` to ``Literal(datetime.date(10,01,2020), datatype=xsd:date)`` using a ``>`` or ``<`` operator in rdflib 4.2.2 and earlier, would normally throw a TypeError, -however in rdflib 5.0.0 this operation now returns a True or False according to the Literal Total Ordering according the rules outlined in `PR #793 `_ - -Removed RDF Parsers -^^^^^^^^^^^^^^^^^^^ -The RDFa and Microdata format RDF parsers were removed from rdflib. There are still other python libraries available to implement these parsers. - -All Changes ------------ - -This list has been assembled from Pull Request and commit information. - -General Bugs Fixed: -^^^^^^^^^^^^^^^^^^^ -* Pr 451 redux - `PR #978 `_ -* NTriples fails to parse URIs with only a scheme - `ISSUE #920 `_ - `PR #974 `_ -* cannot clone it on windows - Remove colons from test result files. Fix #901. 
- `ISSUE #901 `_ - `PR #971 `_ -* Add requirement for requests to setup.py - `PR #969 `_ -* fixed URIRef including native unicode characters - `PR #961 `_ -* DCTERMS.format not working - `ISSUE #932 `_ -* infixowl.manchesterSyntax do not encode strings - `PR #906 `_ -* Fix blank node label to not contain '_:' during parsing - `PR #886 `_ -* rename new SPARQLWrapper to SPARQLConnector - `PR #872 `_ -* Fix #859. Unquote and Uriquote Literal Datatype. - `PR #860 `_ -* Parsing nquads - `ISSUE #786 `_ -* ntriples spec allows for upper-cased lang tag, fixes #782 - `PR #784 `_ -* Error parsing N-Triple file using RDFlib - `ISSUE #782 `_ -* Adds escaped single quote to literal parser - `PR #736 `_ -* N3 parse error on single quote within single quotes - `ISSUE #732 `_ -* Fixed #725 - `PR #730 `_ -* test for issue #725: canonicalization collapses BNodes - `PR #726 `_ -* RGDA1 graph canonicalization sometimes still collapses distinct BNodes - `ISSUE #725 `_ -* Accept header should use a q parameter - `PR #720 `_ -* Added test for Issue #682 and fixed. - `PR #718 `_ -* Incompatibility with Python3: unichr - `ISSUE #687 `_ -* namespace.py include colon in ALLOWED_NAME_CHARS - `PR #663 `_ -* namespace.py fix compute_qname missing namespaces - `PR #649 `_ -* RDFa parsing Error! ``__init__()`` got an unexpected keyword argument 'encoding' - `ISSUE #639 `_ -* Bugfix: ``term.Literal.__add__`` - `PR #451 `_ -* fixup of #443 - `PR #445 `_ -* Microdata to rdf second edition bak - `PR #444 `_ - -Enhanced Features: -^^^^^^^^^^^^^^^^^^ -* Register additional serializer plugins for SPARQL mime types. 
- `PR #987 `_ -* Pr 388 redux - `PR #979 `_ -* Allows RDF terms introduced by JSON-LD 1.1 - `PR #970 `_ -* make SPARQLConnector work with DBpedia - `PR #941 `_ -* ClosedNamespace returns right exception for way of access - `PR #866 `_ -* Not adding all namespaces for n3 serializer - `PR #832 `_ -* Adds basic support of xsd:duration - `PR #808 `_ -* Add possibility to set authority and basepath to skolemize graph - `PR #807 `_ -* Change notation3 list realization to non-recursive function. - `PR #805 `_ -* Suppress warning for not using custom encoding. - `PR #800 `_ -* Add support to parsing large xml inputs - `ISSUE #749 `_ - `PR #750 `_ -* improve hash efficiency by directly using str/unicode hash - `PR #746 `_ -* Added the csvw prefix to the RDFa initial context. - `PR #594 `_ -* syncing changes from pyMicrodata - `PR #587 `_ -* Microdata parser: updated the parser to the latest version of the microdata->rdf note (published in December 2014) - `PR #443 `_ -* Literal.toPython() support for xsd:hexBinary - `PR #388 `_ - -SPARQL Fixes: -^^^^^^^^^^^^^ -* Total order patch patch - `PR #862 `_ -* use <<= instead of deprecated << - `PR #861 `_ -* Fix #847 - `PR #856 `_ -* RDF Literal "1"^^xsd:boolean should _not_ coerce to True - `ISSUE #847 `_ -* Makes NOW() return an UTC date - `PR #844 `_ -* NOW() SPARQL should return an xsd:dateTime with a timezone - `ISSUE #843 `_ -* fix property paths bug: issue #715 - `PR #822 `_ - `ISSUE #715 `_ -* MulPath: correct behaviour of n3() - `PR #820 `_ -* Literal total ordering - `PR #793 `_ -* Remove SPARQLWrapper dependency - `PR #744 `_ -* made UNION faster by not preventing duplicates - `PR #741 `_ -* added a hook to add custom functions to SPARQL - `PR #723 `_ -* Issue714 - `PR #717 `_ -* Use <<= instead of deprecated << in SPARQL parser - `PR #417 `_ -* Custom FILTER function for SPARQL engine - `ISSUE #274 `_ - -Code Quality and Cleanups: -^^^^^^^^^^^^^^^^^^^^^^^^^^ -* a slightly opinionated autopep8 run - `PR #870 `_ -* 
remove rdfa and microdata parsers from core RDFLib - `PR #828 `_ -* ClosedNamespace KeyError -> AttributeError - `PR #827 `_ -* typo in rdflib/plugins/sparql/update.py - `ISSUE #760 `_ -* Fix logging in interactive mode - `PR #731 `_ -* make namespace module flake8-compliant, change exceptions in that mod… - `PR #711 `_ -* delete ez_setup.py? - `ISSUE #669 `_ -* code duplication issue between rdflib and pymicrodata - `ISSUE #582 `_ -* Transition from 2to3 to use of six.py to be merged in 5.0.0-dev - `PR #519 `_ -* sparqlstore drop deprecated methods and args - `PR #516 `_ -* python3 code seems shockingly inefficient - `ISSUE #440 `_ -* removed md5_term_hash, fixes #240 - `PR #439 `_ - `ISSUE #240 `_ - -Testing: -^^^^^^^^ -* 3.7 for travis - `PR #864 `_ -* Added trig unit tests to highlight some current parsing/serializing issues - `PR #431 `_ - -Documentation Fixes: -^^^^^^^^^^^^^^^^^^^^ -* Fix a doc string in the query module - `PR #976 `_ -* setup.py: Make the license field use an SPDX identifier - `PR #789 `_ -* Update README.md - `PR #764 `_ -* Update namespaces_and_bindings.rst - `PR #757 `_ -* DOC: README.md: rdflib-jsonld, https uris - `PR #712 `_ -* make doctest support py2/py3 - `ISSUE #707 `_ -* ``pip install rdflib`` (as per README.md) gets OSError on Mint 18.1 - `ISSUE #704 `_ - `PR #717 `_ -* Use <<= instead of deprecated << in SPARQL parser - `PR #417 `_ -* Custom FILTER function for SPARQL engine - `ISSUE #274 `_ diff --git a/docs/upgrade5to6.md b/docs/upgrade5to6.md new file mode 100644 index 0000000000..8ac59b2a55 --- /dev/null +++ b/docs/upgrade5to6.md @@ -0,0 +1,61 @@ +# Upgrading 5.0.0 to 6.0.0 + +6.0.0 fully adopts Python 3 practices and drops Python 2 support so it is neater, faster and generally more modern than 5.0.0. It also tidies up the [`Graph`][rdflib.graph.Graph] API (removing duplicate functions) so it does include a few breaking changes. 
Additionally, there is a long list of PRs merged into 6.0.0 adding a number of small fixes and features which are listed below.
+
+RDFLib version 5.0.0 was released in 2020, 3 years after the previous version (4.2.2), and 6.0.0 is fundamentally compatible with 5.0.0. If you need very long-term backwards-compatibility or Python 2 support, you need 5.0.0.
+
+## Major Changes
+
+The most notable changes in RDFLib 6.0.0 are:
+
+### Python 3.7+
+
+* The oldest version of python you can use to run RDFLib is now 3.7.
+* This is a big jump from RDFLib 5.0.0 that worked on python 2.7 and 3.5.
+* This change is to allow the library maintainers to adopt more modern development tools, newer language features, and avoid the need to support EOL versions of python in the future.
+
+### JSON-LD integration and JSON-LD 1.1
+
+* The json-ld serializer/parser plugin was by far the most commonly used RDFLib addon.
+* Last year we brought it under the RDFLib org on GitHub.
+* Now, for the 6.0.0 release, the JSON-LD serializer and parser are integrated into RDFLib core.
+* This includes experimental support for the JSON-LD v1.1 spec.
+* You no longer need to install the json-ld dependency separately.
+
+## All Changes
+
+This list has been assembled from Pull Request and commit information.
+
+### General Bugs Fixed
+
+* Pr 451 redux
+  [PR #978](https://github.com/RDFLib/rdflib/pull/978)
+
+### Enhanced Features
+
+* Register additional serializer plugins for SPARQL mime types.
+ [PR #987](https://github.com/RDFLib/rdflib/pull/987) + +### SPARQL Fixes + +* Total order patch patch + [PR #862](https://github.com/RDFLib/rdflib/pull/862) + +### Code Quality and Cleanups + +* a slightly opinionated autopep8 run + [PR #870](https://github.com/RDFLib/rdflib/pull/870) + +### Testing + +* 3.7 for travis + [PR #864](https://github.com/RDFLib/rdflib/pull/864) + +### Documentation Fixes + +* Fix a doc string in the query module + [PR #976](https://github.com/RDFLib/rdflib/pull/976) + +### Integrate JSON-LD into RDFLib + +[PR #1354](https://github.com/RDFLib/rdflib/pull/1354) diff --git a/docs/upgrade5to6.rst b/docs/upgrade5to6.rst deleted file mode 100644 index 7ffa7e68bc..0000000000 --- a/docs/upgrade5to6.rst +++ /dev/null @@ -1,79 +0,0 @@ -.. _upgrade4to5: Upgrading from RDFLib version 5.0.0 to 6.0.0 - -============================================ -Upgrading 5.0.0 to 6.0.0 -============================================ - -6.0.0 fully adopts Python 3 practices and drops Python 2 support so it is neater, faster and generally more modern than -5.0.0. It also tidies up the ``Graph`` API (removing duplicate functions) so it does include a few breaking changes. -Additionally, there is a long list of PRs merged into 6.0.0 adding a number of small fixes and features which are listed -below. - -RDFLib version 5.0.0 was released in 2020, 3 years after the previous version (4.2.2) and is fundamentally 5.0.0 -compatible with. If you need very long-term backwards-compatibility or Python 2 support, you need 5.0.0. - - -Major Changes -------------- - -The most notable changes in RDFLib 6.0.0 are: - -Python 3.7+ -^^^^^^^^^^^ -* The oldest version of python you can use to run RDFLib is now 3.7. -* This is a big jump from RDFLib 5.0.0 that worked on python 2.7 and 3.5. 
-* This change is to allow the library maintainers to adopt more modern development tools, - newer language features, and avoid the need to support EOL versions of python in he future - -JSON-LD integration and JSON-LD 1.1 -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -* The json-ld serializer/parser plugin was by far the most commonly used RDFLib addon. -* Last year we brought it under the RDFLib org in Github -* Now for 6.0.0 release the JSON-LD serializer and parser are integrated into RDFLib core -* This includes the experimental support for the JSON-LD v1.1 spec -* You no longer need to install the json-ld dependency separately. - - -All Changes ------------ - -This list has been assembled from Pull Request and commit information. - -General Bugs Fixed: -^^^^^^^^^^^^^^^^^^^ -* Pr 451 redux - `PR #978 `_ - - -Enhanced Features: -^^^^^^^^^^^^^^^^^^ -* Register additional serializer plugins for SPARQL mime types. - `PR #987 `_ - - -SPARQL Fixes: -^^^^^^^^^^^^^ -* Total order patch patch - `PR #862 `_ - - -Code Quality and Cleanups: -^^^^^^^^^^^^^^^^^^^^^^^^^^ -* a slightly opinionated autopep8 run - `PR #870 `_ - - -Testing: -^^^^^^^^ -* 3.7 for travis - `PR #864 `_ - - -Documentation Fixes: -^^^^^^^^^^^^^^^^^^^^ -* Fix a doc string in the query module - `PR #976 `_ - -Integrade JSON-LD into RDFLib: -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -`PR #1354 `_ diff --git a/docs/upgrade6to7.md b/docs/upgrade6to7.md new file mode 100644 index 0000000000..0cba20e225 --- /dev/null +++ b/docs/upgrade6to7.md @@ -0,0 +1,36 @@ +# Upgrading from version 6 to 7 + +## Python version + +RDFLib 7 requires Python 3.8.1 or later. + +## New behaviour for `publicID` in `parse` methods + +Before version 7, the `publicID` argument to the [`parse()`][rdflib.graph.ConjunctiveGraph.parse] and [`parse()`][rdflib.graph.Dataset.parse] methods was used as the name for the default graph, and triples from the default graph in a source were loaded into the graph +named `publicID`. 
+ +In version 7, the `publicID` argument is only used as the base URI for relative URI resolution as defined in [IETF RFC 3986](https://datatracker.ietf.org/doc/html/rfc3986#section-5.1.4). + +To accommodate this change, ensure that use of the `publicID` argument is consistent with the new behaviour. + +If you want to load triples from a format that does not support named graphs into a named graph, use the following code: + +```python +from rdflib import ConjunctiveGraph + +cg = ConjunctiveGraph() +cg.get_context("example:graph_name").parse("http://example.com/source.ttl", format="turtle") +``` + +If you want to move triples from the default graph into a named graph, use the following code: + +```python +from rdflib import ConjunctiveGraph + +cg = ConjunctiveGraph() +cg.parse("http://example.com/source.trig", format="trig") +destination_graph = cg.get_context("example:graph_name") +for triple in cg.default_context.triples((None, None, None)): + destination_graph.add(triple) + cg.default_context.remove(triple) +``` diff --git a/docs/upgrade6to7.rst b/docs/upgrade6to7.rst deleted file mode 100644 index d687634d5f..0000000000 --- a/docs/upgrade6to7.rst +++ /dev/null @@ -1,50 +0,0 @@ -.. _upgrade6to7: Upgrading from RDFLib version 6 to 7 - -============================================ -Upgrading from version 6 to 7 -============================================ - -Python version ----------------------------------------------------- - -RDFLib 7 requires Python 3.8.1 or later. - -New behaviour for ``publicID`` in ``parse`` methods. ----------------------------------------------------- - -Before version 7, the ``publicID`` argument to the -:meth:`rdflib.graph.ConjunctiveGraph.parse` and -:meth:`rdflib.graph.Dataset.parse` methods was used as the name for the default -graph, and triples from the default graph in a source were loaded into the graph -named ``publicID``. 
-
-In version 7, the ``publicID`` argument is only used as the base URI for relative
-URI resolution as defined in `IETF RFC 3986
-`_.
-
-To accommodate this change, ensure that use of the ``publicID`` argument is
-consistent with the new behaviour.
-
-If you want to load triples from a format that does not support named graphs
-into a named graph, use the following code:
-
-.. code-block:: python
-
-    from rdflib import ConjunctiveGraph
-
-    cg = ConjunctiveGraph()
-    cg.get_context("example:graph_name").parse("http://example.com/source.ttl", format="turtle")
-
-If you want to move triples from the default graph into a named graph, use the
-following code:
-
-.. code-block:: python
-
-    from rdflib import ConjunctiveGraph
-
-    cg = ConjunctiveGraph()
-    cg.parse("http://example.com/source.trig", format="trig")
-    destination_graph = cg.get_context("example:graph_name")
-    for triple in cg.default_context.triples((None, None, None)):
-        destination_graph.add(triple)
-        cg.default_context.remove(triple)
diff --git a/docs/utilities.md b/docs/utilities.md
new file mode 100644
index 0000000000..46d2813bae
--- /dev/null
+++ b/docs/utilities.md
@@ -0,0 +1,146 @@
+# Utilities & convenience functions
+
+For RDF programming, RDFLib and Python may not be the fastest tools, but we try hard to make them the easiest and most convenient to use and thus the *fastest* overall!
+
+This is a collection of hints and pointers for hassle-free RDF coding.
+
+## Functional properties
+
+Use [`value()`][rdflib.graph.Graph.value] and [`set()`][rdflib.graph.Graph.set] to work with *functional property* instances, i.e. properties that can only occur once for a resource.
+
+```python
+from rdflib import Graph, URIRef, Literal, BNode
+from rdflib.namespace import FOAF, RDF
+
+g = Graph()
+g.bind("foaf", FOAF)
+
+# Add demo data
+bob = URIRef("http://example.org/people/Bob")
+g.add((bob, RDF.type, FOAF.Person))
+g.add((bob, FOAF.name, Literal("Bob")))
+g.add((bob, FOAF.age, Literal(38)))
+
+# To get a single value, use 'value'
+print(g.value(bob, FOAF.age))
+# prints: 38
+
+# To change a single value, use 'set'
+g.set((bob, FOAF.age, Literal(39)))
+print(g.value(bob, FOAF.age))
+# prints: 39
+```
+
+## Slicing graphs
+
+Python allows slicing arrays with a `slice` object, a triple of `start`, `stop` and `step-size`:
+
+```python
+for i in range(20)[2:9:3]:
+    print(i)
+# prints:
+# 2, 5, 8
+```
+
+RDFLib graphs override `__getitem__` and we pervert the slice triple to be an RDF triple instead. This lets slice syntax be a shortcut for [`triples()`][rdflib.graph.Graph.triples], [`subject_predicates()`][rdflib.graph.Graph.subject_predicates], [`__contains__()`][rdflib.graph.Graph.__contains__], and other Graph query-methods:
+
+```python
+from rdflib import Graph, URIRef, Literal, BNode
+from rdflib.namespace import FOAF, RDF
+
+g = Graph()
+g.bind("foaf", FOAF)
+
+# Add demo data
+bob = URIRef("http://example.org/people/Bob")
+bill = URIRef("http://example.org/people/Bill")
+g.add((bob, RDF.type, FOAF.Person))
+g.add((bob, FOAF.name, Literal("Bob")))
+g.add((bob, FOAF.age, Literal(38)))
+g.add((bob, FOAF.knows, bill))
+
+print(g[:])
+# same as
+print(iter(g))
+
+print(g[bob])
+# same as
+print(g.predicate_objects(bob))
+
+print(g[bob: FOAF.knows])
+# same as
+print(g.objects(bob, FOAF.knows))
+
+print(g[bob: FOAF.knows: bill])
+# same as
+print((bob, FOAF.knows, bill) in g)
+
+print(g[:FOAF.knows])
+# same as
+print(g.subject_objects(FOAF.knows))
+```
+
+See [`examples.slice`][examples.slice] for a complete example.
+
+!!!
warning "Slicing Caution" + Slicing is convenient for run-once scripts for playing around + in the Python `REPL`, however since slicing returns + tuples of varying length depending on which parts of the + slice are bound, you should be careful using it in more + complicated programs. If you pass in variables, and they are + `None` or `False`, you may suddenly get a generator of + different length tuples back than you expect. + +## SPARQL Paths + +[SPARQL property paths](http://www.w3.org/TR/sparql11-property-paths/) are possible using overridden operators on URIRefs. See [`examples.foafpaths`][examples.foafpaths] and [`rdflib.paths`][rdflib.paths]. + +## Serializing a single term to N3 + +For simple output, or simple serialisation, you often want a nice +readable representation of a term. All terms (URIRef, Literal etc.) have a +`n3`, method, which will return a suitable N3 format: + +```python +from rdflib import Graph, URIRef, Literal +from rdflib.namespace import FOAF + +# A URIRef +person = URIRef("http://xmlns.com/foaf/0.1/Person") +print(person.n3()) +# prints: + +# Simplifying the output with a namespace prefix: +g = Graph() +g.bind("foaf", FOAF) + +print(person.n3(g.namespace_manager)) +# prints foaf:Person + +# A typed literal +l = Literal(2) +print(l.n3()) +# prints "2"^^ + +# Simplifying the output with a namespace prefix +# XSD is built in, so no need to bind() it! +l.n3(g.namespace_manager) +# prints: "2"^^xsd:integer +``` + +## Parsing data from a string + +You can parse data from a string with the `data` param: + +```python +from rdflib import Graph + +g = Graph().parse(data=" .") +for r in g.triples((None, None, None)): + print(r) +# prints: (rdflib.term.URIRef('a:'), rdflib.term.URIRef('p:'), rdflib.term.URIRef('p:')) +``` + +## Command Line tools + +RDFLib includes a handful of commandline tools, see [`rdflib.tools`][rdflib.tools]. 
diff --git a/docs/utilities.rst b/docs/utilities.rst deleted file mode 100644 index 381f9070bb..0000000000 --- a/docs/utilities.rst +++ /dev/null @@ -1,166 +0,0 @@ -Utilities & convenience functions -================================= - -For RDF programming, RDFLib and Python may not be the fastest tools, -but we try hard to make them the easiest and most convenient to use and thus the *fastest* overall! - -This is a collection of hints and pointers for hassle-free RDF coding. - -Functional properties ---------------------- - -Use :meth:`~rdflib.graph.Graph.value` and -:meth:`~rdflib.graph.Graph.set` to work with *functional -property* instances, i.e. properties than can only occur once for a resource. - -.. code-block:: python - - from rdflib import Graph, URIRef, Literal, BNode - from rdflib.namespace import FOAF, RDF - - g = Graph() - g.bind("foaf", FOAF) - - # Add demo data - bob = URIRef("http://example.org/people/Bob") - g.add((bob, RDF.type, FOAF.Person)) - g.add((bob, FOAF.name, Literal("Bob"))) - g.add((bob, FOAF.age, Literal(38))) - - # To get a single value, use 'value' - print(g.value(bob, FOAF.age)) - # prints: 38 - - # To change a single of value, use 'set' - g.set((bob, FOAF.age, Literal(39))) - print(g.value(bob, FOAF.age)) - # prints: 39 - - -Slicing graphs --------------- - -Python allows slicing arrays with a ``slice`` object, a triple of -``start``, ``stop`` and ``step-size``: - -.. code-block:: python - - for i in range(20)[2:9:3]: - print(i) - # prints: - # 2, 5, 8 - - -RDFLib graphs override ``__getitem__`` and we pervert the slice triple -to be a RDF triple instead. This lets slice syntax be a shortcut for -:meth:`~rdflib.graph.Graph.triples`, -:meth:`~rdflib.graph.Graph.subject_predicates`, -:meth:`~rdflib.graph.Graph.__contains__`, and other Graph query-methods: - -.. 
code-block:: python - - from rdflib import Graph, URIRef, Literal, BNode - from rdflib.namespace import FOAF, RDF - - g = Graph() - g.bind("foaf", FOAF) - - # Add demo data - bob = URIRef("http://example.org/people/Bob") - bill = URIRef("http://example.org/people/Bill") - g.add((bob, RDF.type, FOAF.Person)) - g.add((bob, FOAF.name, Literal("Bob"))) - g.add((bob, FOAF.age, Literal(38))) - g.add((bob, FOAF.knows, bill)) - - print(g[:]) - # same as - print(iter(g)) - - print(g[bob]) - # same as - print(g.predicate_objects(bob)) - - print(g[bob: FOAF.knows]) - # same as - print(g.objects(bob, FOAF.knows)) - - print(g[bob: FOAF.knows: bill]) - # same as - print((bob, FOAF.knows, bill) in g) - - print(g[:FOAF.knows]) - # same as - print(g.subject_objects(FOAF.knows)) - - -See :mod:`examples.slice` for a complete example. - -.. note:: Slicing is convenient for run-once scripts for playing around - in the Python ``REPL``, however since slicing returns - tuples of varying length depending on which parts of the - slice are bound, you should be careful using it in more - complicated programs. If you pass in variables, and they are - ``None`` or ``False``, you may suddenly get a generator of - different length tuples back than you expect. - -SPARQL Paths ------------- - -`SPARQL property paths -`_ are possible using -overridden operators on URIRefs. See :mod:`examples.foafpaths` and -:mod:`rdflib.paths`. - -Serializing a single term to N3 -------------------------------- - -For simple output, or simple serialisation, you often want a nice -readable representation of a term. All terms (URIRef, Literal etc.) have a -``n3``, method, which will return a suitable N3 format: - -.. 
code-block:: python - - from rdflib import Graph, URIRef, Literal - from rdflib.namespace import FOAF - - # A URIRef - person = URIRef("http://xmlns.com/foaf/0.1/Person") - print(person.n3()) - # prints: - - # Simplifying the output with a namespace prefix: - g = Graph() - g.bind("foaf", FOAF) - - print(person.n3(g.namespace_manager)) - # prints foaf:Person - - # A typed literal - l = Literal(2) - print(l.n3()) - # prints "2"^^ - - # Simplifying the output with a namespace prefix - # XSD is built in, so no need to bind() it! - l.n3(g.namespace_manager) - # prints: "2"^^xsd:integer - -Parsing data from a string --------------------------- - -You can parse data from a string with the ``data`` param: - -.. code-block:: python - - from rdflib import Graph - - g = Graph().parse(data=" .") - for r in g.triples((None, None, None)): - print(r) - # prints: (rdflib.term.URIRef('a:'), rdflib.term.URIRef('p:'), rdflib.term.URIRef('p:')) - -Command Line tools ------------------- - -RDFLib includes a handful of commandline tools, see :mod:`rdflib.tools`. diff --git a/examples/__init__.py b/examples/__init__.py index e69de29bb2..02b5360580 100644 --- a/examples/__init__.py +++ b/examples/__init__.py @@ -0,0 +1 @@ +"""These examples all live in `./examples` in the source-distribution of RDFLib.""" diff --git a/examples/conjunctive_graphs.py b/examples/conjunctive_graphs.py index 433a843f49..310ff3c44a 100644 --- a/examples/conjunctive_graphs.py +++ b/examples/conjunctive_graphs.py @@ -1,6 +1,6 @@ """ An RDFLib ConjunctiveGraph is an (unnamed) aggregation of all the Named Graphs -within a Store. The :meth:`~rdflib.graph.ConjunctiveGraph.get_context` +within a Store. The [`ConjunctiveGraph.get_context`][rdflib.graph.ConjunctiveGraph.get_context] method can be used to get a particular named graph for use, such as to add triples to, or the default graph can be used. 
diff --git a/examples/custom_datatype.py b/examples/custom_datatype.py index 46f2a5f23c..197578b96c 100644 --- a/examples/custom_datatype.py +++ b/examples/custom_datatype.py @@ -4,7 +4,7 @@ Mapping for integers, floats, dateTimes, etc. are already added, but you can also add your own. -This example shows how :meth:`rdflib.term.bind` lets you register new +This example shows how [`bind`][rdflib.term.bind] lets you register new mappings between literal datatypes and Python objects """ diff --git a/examples/custom_eval.py b/examples/custom_eval.py index 32c2686061..fc9649ff02 100644 --- a/examples/custom_eval.py +++ b/examples/custom_eval.py @@ -2,18 +2,20 @@ This example shows how a custom evaluation function can be added to handle certain SPARQL Algebra elements. -A custom function is added that adds ``rdfs:subClassOf`` "inference" when -asking for ``rdf:type`` triples. +A custom function is added that adds `rdfs:subClassOf` "inference" when +asking for `rdf:type` triples. Here the custom eval function is added manually, normally you would use setuptools and entry_points to do it: i.e. in your setup.py:: - entry_points = { - 'rdf.plugins.sparqleval': [ - 'myfunc = mypackage:MyFunction', - ], - } +```python +entry_points = { + 'rdf.plugins.sparqleval': [ + 'myfunc = mypackage:MyFunction', + ], +} +``` """ from pathlib import Path diff --git a/examples/foafpaths.py b/examples/foafpaths.py index db34fb3162..152b4deaa4 100644 --- a/examples/foafpaths.py +++ b/examples/foafpaths.py @@ -5,23 +5,20 @@ We overload some Python operators on URIRefs to allow creating path operators directly in Python. -============ ========================================= -Operator Path -============ ========================================= -``p1 / p2`` Path sequence -``p1 | p2`` Path alternative -``p1 * '*'`` chain of 0 or more p's -``p1 * '+'`` chain of 1 or more p's -``p1 * '?'`` 0 or 1 p -``~p1`` p1 inverted, i.e. (s p1 o) <=> (o ~p1 s) -``-p1`` NOT p1, i.e. 
any property but p1
-============ =========================================
-
-
-These can then be used in property position for ``s,p,o`` triple queries
+| Operator     | Path                                       |
+|--------------|--------------------------------------------|
+| `p1 / p2`    | Path sequence                              |
+| `p1 \| p2`   | Path alternative                           |
+| `p1 * '*'`   | Chain of 0 or more p's                     |
+| `p1 * '+'`   | Chain of 1 or more p's                     |
+| `p1 * '?'`   | 0 or 1 p                                   |
+| `~p1`        | p1 inverted, i.e. `(s p1 o)` ⇔ `(o ~p1 s)` |
+| `-p1`        | NOT p1, i.e. any property but p1           |
+
+These can then be used in property position for `s,p,o` triple queries
 for any graph method.
 
-See the docs for :mod:`rdflib.paths` for the details.
+See the docs for [`paths`][rdflib.paths] for the details.
 
 This example shows how to get the name of friends (i.e values two steps away
 x knows y, y name z) with a single query.
 """
diff --git a/examples/prepared_query.py b/examples/prepared_query.py
index 035c6137d0..a297bcbe9e 100644
--- a/examples/prepared_query.py
+++ b/examples/prepared_query.py
@@ -1,11 +1,11 @@
 """
 SPARQL Queries be prepared (i.e parsed and translated to SPARQL algebra)
-by the :meth:`rdflib.plugins.sparql.prepareQuery` method.
+by the [`prepareQuery`][rdflib.plugins.sparql.prepareQuery] method.
 
-``initNs`` can be used instead of PREFIX values.
+`initNs` can be used instead of PREFIX values.
 
 When executing, variables can be bound with the
-``initBindings`` keyword parameter.
+`initBindings` keyword parameter.
 """
 
 from pathlib import Path
diff --git a/examples/resource_example.py b/examples/resource_example.py
index da93042fa5..ecb7937de4 100644
--- a/examples/resource_example.py
+++ b/examples/resource_example.py
@@ -1,10 +1,10 @@
 """
-RDFLib has a :class:`~rdflib.resource.Resource` class, for a resource-centric API.
-The :class:`~rdflib.Graph` class also has a ``resource`` function that can be used
+RDFLib has a [`Resource`][rdflib.resource.Resource] class, for a resource-centric API.
+The [`Graph`][rdflib.Graph] class also has a `resource` function that can be used
 to create resources and manipulate them by quickly adding or querying for
 triples where this resource is the subject.
 
-This example shows g.resource() in action.
+This example shows `g.resource()` in action.
 """
 
 from rdflib import RDF, RDFS, Graph, Literal
diff --git a/examples/secure_with_audit.py b/examples/secure_with_audit.py
index 2bd4e28fb9..ba75d2b48e 100644
--- a/examples/secure_with_audit.py
+++ b/examples/secure_with_audit.py
@@ -1,10 +1,9 @@
 """
-This example demonstrates how to use `Python audit hooks
-<https://docs.python.org/3/library/sys.html#sys.addaudithook>`_ to block access
+This example demonstrates how to use [Python audit hooks](https://docs.python.org/3/library/sys.html#sys.addaudithook) to block access
 to files and URLs.
 
-It installs a audit hook with `sys.addaudithook
-<https://docs.python.org/3/library/sys.html#sys.addaudithook>`_ that blocks access to files and
-URLs that end with ``blocked.jsonld``.
+It installs an audit hook with [sys.addaudithook](https://docs.python.org/3/library/sys.html#sys.addaudithook) that blocks access to files and
+URLs that end with `blocked.jsonld`.
 
 The code in the example then verifies that the audit hook is blocking access
 to URLs and files as expected.
@@ -23,15 +22,20 @@ def audit_hook(name: str, args: Tuple[Any, ...]) -> None:
     """
     An audit hook that blocks access when an attempt is made to open a
-    file or URL that ends with ``blocked.jsonld``.
+    file or URL that ends with `blocked.jsonld`.
 
-    Details of the audit events can be seen in the `audit events
-    table <https://docs.python.org/3/library/audit_events.html>`_.
+    Details of the audit events can be seen in the
+    [audit events table](https://docs.python.org/3/library/audit_events.html).
 
-    :param name: The name of the audit event.
-    :param args: The arguments of the audit event.
-    :return: `None` if the audit hook does not block access.
-    :raises PermissionError: If the file or URL being accessed ends with ``blocked.jsonld``.
+    Args:
+        name: The name of the audit event.
+        args: The arguments of the audit event.
+ + Returns: + `None` if the audit hook does not block access. + + Raises: + PermissionError: If the file or URL being accessed ends with `blocked.jsonld`. """ if name == "urllib.Request" and args[0].endswith("blocked.jsonld"): raise PermissionError("Permission denied for URL") diff --git a/examples/secure_with_urlopen.py b/examples/secure_with_urlopen.py index 0055047964..8168e216bc 100644 --- a/examples/secure_with_urlopen.py +++ b/examples/secure_with_urlopen.py @@ -23,9 +23,14 @@ def http_open(self, req: Request) -> http.client.HTTPResponse: """ Block access to URLs that end with "blocked.jsonld". - :param req: The request to open. - :return: The response. - :raises PermissionError: If the URL ends with "blocked.jsonld". + Args: + req: The request to open. + + Returns: + The response. + + Raises: + PermissionError: If the URL ends with "blocked.jsonld". """ if req.get_full_url().endswith("blocked.jsonld"): raise PermissionError("Permission denied for URL") diff --git a/examples/slice.py b/examples/slice.py index 6994613e6f..82474e18bc 100644 --- a/examples/slice.py +++ b/examples/slice.py @@ -3,10 +3,10 @@ This is a short-hand for iterating over triples. -Combined with SPARQL paths (see ``foafpaths.py``) - quite complex queries +Combined with SPARQL paths (see `foafpaths.py`) - quite complex queries can be realised. -See :meth:`rdflib.graph.Graph.__getitem__` for details +See [`Graph.__getitem__`][rdflib.graph.Graph.__getitem__] for details """ from pathlib import Path diff --git a/examples/smushing.py b/examples/smushing.py index 88d68a5204..701993abb7 100644 --- a/examples/smushing.py +++ b/examples/smushing.py @@ -1,22 +1,22 @@ """ A FOAF smushing example. -Filter a graph by normalizing all ``foaf:Persons`` into URIs based on -their ``mbox_sha1sum``. +Filter a graph by normalizing all `foaf:Persons` into URIs based on +their `mbox_sha1sum`. 
-Suppose I get two `FOAF `_ documents each -talking about the same person (according to ``mbox_sha1sum``) but they -each used a :class:`rdflib.term.BNode` for the subject. For this demo +Suppose I get two [FOAF](http://xmlns.com/foaf/0.1) documents each +talking about the same person (according to `mbox_sha1sum`) but they +each used a [`BNode`][rdflib.term.BNode] for the subject. For this demo I've combined those two documents into one file: This filters a graph by changing every subject with a -``foaf:mbox_sha1sum`` into a new subject whose URI is based on the -``sha1sum``. This new graph might be easier to do some operations on. +`foaf:mbox_sha1sum` into a new subject whose URI is based on the +`sha1sum`. This new graph might be easier to do some operations on. An advantage of this approach over other methods for collapsing BNodes is that I can incrementally process new FOAF documents as they come in without having to access my ever-growing archive. Even if another -``65b983bb397fb71849da910996741752ace8369b`` document comes in next +`65b983bb397fb71849da910996741752ace8369b` document comes in next year, I would still give it the same stable subject URI that merges with my existing data. """ diff --git a/examples/sparql_query_example.py b/examples/sparql_query_example.py index 0e9fc225cf..29fef43c78 100644 --- a/examples/sparql_query_example.py +++ b/examples/sparql_query_example.py @@ -1,14 +1,14 @@ """ -SPARQL Query using :meth:`rdflib.graph.Graph.query` +SPARQL Query using [`Graph.query`][rdflib.graph.Graph.query] -The method returns a :class:`~rdflib.query.Result`, iterating over -this yields :class:`~rdflib.query.ResultRow` objects +The method returns a [`Result`][rdflib.query.Result], iterating over +this yields [`ResultRow`][rdflib.query.ResultRow] objects The variable bindings can be accessed as attributes of the row objects For variable names that are not valid python identifiers, dict access -(i.e. with ``row[var] / __getitem__``) is also possible. 
+(i.e. with `row[var] / __getitem__`) is also possible. -:attr:`~rdflib.query.Result.vars` contains the variables +[`Result.vars`][rdflib.query.Result.vars] contains the variables """ import logging diff --git a/examples/sparql_update_example.py b/examples/sparql_update_example.py index a99749962b..f5c02b3359 100644 --- a/examples/sparql_update_example.py +++ b/examples/sparql_update_example.py @@ -1,5 +1,5 @@ """ -SPARQL Update statements can be applied with :meth:`rdflib.graph.Graph.update` +SPARQL Update statements can be applied with [`Graph.update`][rdflib.graph.Graph.update] """ from pathlib import Path diff --git a/examples/transitive.py b/examples/transitive.py index 800cbc80c0..9c47089926 100644 --- a/examples/transitive.py +++ b/examples/transitive.py @@ -1,45 +1,45 @@ """ An example illustrating how to use the -:meth:`~rdflib.graph.Graph.transitive_subjects` and -:meth:`~rdflib.graph.Graph.transitive_objects` graph methods +[`Graph.transitive_subjects`][rdflib.graph.Graph.transitive_subjects] and +[`Graph.transitive_objects`][rdflib.graph.Graph.transitive_objects] graph methods -Formal definition -^^^^^^^^^^^^^^^^^^ +## Formal definition -The :meth:`~rdflib.graph.Graph.transitive_objects` method finds all + +The [`Graph.transitive_objects`][rdflib.graph.Graph.transitive_objects] method finds all nodes such that there is a path from subject to one of those nodes using only the predicate property in the triples. The -:meth:`~rdflib.graph.Graph.transitive_subjects` method is similar; it +[`Graph.transitive_subjects`][rdflib.graph.Graph.transitive_subjects] method is similar; it finds all nodes such that there is a path from the node to the object using only the predicate property. 
-Informal description, with an example -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +## Informal description, with an example -In brief, :meth:`~rdflib.graph.Graph.transitive_objects` walks forward +In brief, [`Graph.transitive_objects`][rdflib.graph.Graph.transitive_objects] walks forward in a graph using a particular property, and -:meth:`~rdflib.graph.Graph.transitive_subjects` walks backward. A good -example uses a property ``ex:parent``, the semantics of which are +[`Graph.transitive_subjects`][rdflib.graph.Graph.transitive_subjects] walks backward. A good +example uses a property `ex:parent`, the semantics of which are biological parentage. The -:meth:`~rdflib.graph.Graph.transitive_objects` method would get all +[`Graph.transitive_objects`][rdflib.graph.Graph.transitive_objects] method would get all the ancestors of a particular person (all nodes such that there is a parent path between the person and the object). The -:meth:`~rdflib.graph.Graph.transitive_subjects` method would get all +[`Graph.transitive_subjects`][rdflib.graph.Graph.transitive_subjects] method would get all the descendants of a particular person (all nodes such that there is a parent path between the node and the person). So, say that your URI is -``ex:person``. +`ex:person`. This example would get all of your (known) ancestors, and then get all the (known) descendants of your maternal grandmother. -.. warning:: The :meth:`~rdflib.graph.Graph.transitive_objects` method has the start node - as the *first* argument, but the :meth:`~rdflib.graph.Graph.transitive_subjects` +!!! warning "Important note on arguments" + + The [`Graph.transitive_objects`][rdflib.graph.Graph.transitive_objects] method has the start node + as the *first* argument, but the [`Graph.transitive_subjects`][rdflib.graph.Graph.transitive_subjects] method has the start node as the *second* argument. 
-User-defined transitive closures -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +## User-defined transitive closures -The method :meth:`~rdflib.graph.Graph.transitiveClosure` returns +The method [`Graph.transitiveClosure`][rdflib.graph.Graph.transitiveClosure] returns transtive closures of user-defined functions. """ diff --git a/mkdocs.yml b/mkdocs.yml new file mode 100644 index 0000000000..2aa212c2c8 --- /dev/null +++ b/mkdocs.yml @@ -0,0 +1,178 @@ +site_name: RDFLib +site_description: Python library for working with RDF, a simple yet powerful language for representing information. +site_author: RDFLib Team +site_url: https://rdflib.readthedocs.org +repo_name: RDFLib/rdflib +repo_url: https://github.com/RDFLib/rdflib +edit_uri: "edit/main/docs/" +copyright: Copyright © 2002 - 2025, RDFLib Team. + +# poetry run mkdocs serve -a localhost:8000 + +nav: + - Usage: + - Overview: index.md + - Getting started with RDFLib: gettingstarted.md + - Loading and saving RDF: intro_to_parsing.md + - Creating RDF triples: intro_to_creating_rdf.md + - Navigating Graphs: intro_to_graphs.md + - Querying with SPARQL: intro_to_sparql.md + - Utilities functions: utilities.md + + - In depth: + - Plugins: plugins.md + - RDF terms: rdf_terms.md + - Namespaces and Bindings: namespaces_and_bindings.md + - Persistence: persistence.md + - Merging graphs: merging.md + - Security considerations: security_considerations.md + + - Changes: + - Changelog: changelog.md + - Upgrading v6 to 7: upgrade6to7.md + - Upgrading v5 to 6: upgrade5to6.md + - Upgrading v4 to 5: upgrade4to5.md + + - API Reference: + - Examples: apidocs/examples.md + - Graph: apidocs/rdflib.graph.md + - Term: apidocs/rdflib.term.md + - Namespace: apidocs/rdflib.namespace.md + - Tools: apidocs/rdflib.tools.md + - Extras: apidocs/rdflib.extras.md + - Container: apidocs/rdflib.container.md + - Collection: apidocs/rdflib.collection.md + - Paths: apidocs/rdflib.paths.md + - Util: apidocs/rdflib.util.md + - Plugins: + - Parsers: 
apidocs/rdflib.plugins.parsers.md + - Serializers: apidocs/rdflib.plugins.serializers.md + - Stores: apidocs/rdflib.plugins.stores.md + - SPARQL: apidocs/rdflib.plugins.sparql.md + + - Development: + - Contributing guide: CONTRIBUTING.md + - Developers guide: developers.md + - Documentation guide: docs.md + - Type Hints: type_hints.md + - Persisting Notation 3 Terms: persisting_n3_terms.md + - Code of Conduct: CODE_OF_CONDUCT.md + - Decision Records: decisions.md + + +theme: + name: "material" + favicon: _static/RDFlib.png + logo: _static/RDFlib.png + language: en + # Choose color: https://squidfunk.github.io/mkdocs-material/setup/changing-the-colors/#primary-color + palette: + - media: "(prefers-color-scheme: light)" + primary: indigo + scheme: default + toggle: + icon: material/weather-night + name: Switch to dark mode + - media: "(prefers-color-scheme: dark)" + primary: indigo + scheme: slate + toggle: + icon: material/weather-sunny + name: Switch to light mode + features: + - navigation.indexes + - navigation.sections + - navigation.tabs + - navigation.top + - navigation.tracking + - navigation.footer + - content.code.copy + - content.code.annotate + - content.code.select + - content.tabs.link # Group tabs switch + - content.action.edit + - content.action.view + - search.highlight + - search.share + - search.suggest + - toc.follow + - content.tooltips + # - header.autohide + # - navigation.tabs.sticky + # - navigation.expand + # - navigation.instant + + +plugins: +- search +- autorefs +- include-markdown +- gen-files: + scripts: + - docs/gen_ref_pages.py +- mkdocstrings: + default_handler: python + handlers: + python: + # https://mkdocstrings.github.io/python/reference/api/#mkdocstrings_handlers.python.PythonInputOptions + options: + docstring_style: google + docstring_options: + ignore_init_summary: true + docstring_section_style: list + filters: ["!^_[^_]"] # Exclude names starting with a single underscore + heading_level: 1 + inherited_members: false # 
Disable inherited members to avoid duplicates + merge_init_into_class: true + parameter_headings: true + separate_signature: true + signature_crossrefs: true + summary: true + show_bases: true + show_root_heading: true + show_root_full_path: false + show_signature_annotations: true + show_source: true + show_symbol_type_heading: true + show_symbol_type_toc: true + show_overloads: false + show_if_no_docstring: true # Showing when no docstring increases build time + +watch: + - rdflib + - docs + + +# Supported admonititions: https://squidfunk.github.io/mkdocs-material/reference/admonitions/#supported-types +markdown_extensions: + - admonition + - pymdownx.highlight: + anchor_linenums: true + - pymdownx.inlinehilite + - pymdownx.snippets + - pymdownx.superfences + - pymdownx.details + - pymdownx.extra + - pymdownx.tabbed: + alternate_style: true + - pymdownx.tasklist: + custom_checkbox: true + - attr_list + - smarty + - abbr + - pymdownx.snippets: + auto_append: + - docs/includes/abbreviations.md + + +# extra_css: +# - _static/custom.css +# extra_javascript: +# - _static/fontawesome.min.js + +extra: + social: + - icon: fontawesome/brands/python + link: https://pypi.org/project/rdflib + - icon: fontawesome/brands/github + link: https://github.com/RDFLib diff --git a/poetry.lock b/poetry.lock index 8622bb56c4..82b2f19f31 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1,29 +1,39 @@ # This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand. 
[[package]] -name = "alabaster" -version = "0.7.13" -description = "A configurable sidebar-enabled Sphinx theme" +name = "babel" +version = "2.17.0" +description = "Internationalization utilities" optional = false -python-versions = ">=3.6" +python-versions = ">=3.8" files = [ - {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"}, - {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"}, + {file = "babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2"}, + {file = "babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d"}, ] +[package.dependencies] +pytz = {version = ">=2015.7", markers = "python_version < \"3.9\""} + +[package.extras] +dev = ["backports.zoneinfo", "freezegun (>=1.0,<2.0)", "jinja2 (>=3.0)", "pytest (>=6.0)", "pytest-cov", "pytz", "setuptools", "tzdata"] + [[package]] -name = "babel" -version = "2.12.1" -description = "Internationalization utilities" +name = "backrefs" +version = "5.7.post1" +description = "A wrapper around re and regex that adds additional back references." 
optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"}, - {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"}, + {file = "backrefs-5.7.post1-py310-none-any.whl", hash = "sha256:c5e3fd8fd185607a7cb1fefe878cfb09c34c0be3c18328f12c574245f1c0287e"}, + {file = "backrefs-5.7.post1-py311-none-any.whl", hash = "sha256:712ea7e494c5bf3291156e28954dd96d04dc44681d0e5c030adf2623d5606d51"}, + {file = "backrefs-5.7.post1-py312-none-any.whl", hash = "sha256:a6142201c8293e75bce7577ac29e1a9438c12e730d73a59efdd1b75528d1a6c5"}, + {file = "backrefs-5.7.post1-py38-none-any.whl", hash = "sha256:ec61b1ee0a4bfa24267f6b67d0f8c5ffdc8e0d7dc2f18a2685fd1d8d9187054a"}, + {file = "backrefs-5.7.post1-py39-none-any.whl", hash = "sha256:05c04af2bf752bb9a6c9dcebb2aff2fab372d3d9d311f2a138540e307756bd3a"}, + {file = "backrefs-5.7.post1.tar.gz", hash = "sha256:8b0f83b770332ee2f1c8244f4e03c77d127a0fa529328e6a0e77fa25bee99678"}, ] -[package.dependencies] -pytz = {version = ">=2015.7", markers = "python_version < \"3.9\""} +[package.extras] +extras = ["regex"] [[package]] name = "berkeleydb" @@ -81,6 +91,17 @@ d = ["aiohttp (>=3.7.4)", "aiohttp (>=3.7.4,!=3.9.0)"] jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"] uvloop = ["uvloop (>=0.15.2)"] +[[package]] +name = "bracex" +version = "2.6" +description = "Bash style brace expander." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "bracex-2.6-py3-none-any.whl", hash = "sha256:0b0049264e7340b3ec782b5cb99beb325f36c3782a32e36e876452fd49a09952"}, + {file = "bracex-2.6.tar.gz", hash = "sha256:98f1347cd77e22ee8d967a30ad4e310b233f7754dbf31ff3fceb76145ba47dc7"}, +] + [[package]] name = "build" version = "1.2.2.post1" @@ -108,108 +129,146 @@ virtualenv = ["virtualenv (>=20.0.35)"] [[package]] name = "certifi" -version = "2023.7.22" +version = "2025.10.5" description = "Python package for providing Mozilla's CA Bundle." optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" files = [ - {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"}, - {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"}, + {file = "certifi-2025.10.5-py3-none-any.whl", hash = "sha256:0f212c2744a9bb6de0c56639a6f68afe01ecd92d91f14ae897c4fe7bbeeef0de"}, + {file = "certifi-2025.10.5.tar.gz", hash = "sha256:47c09d31ccf2acf0be3f701ea53595ee7e0b8fa08801c6624be771df09ae7b43"}, ] [[package]] name = "charset-normalizer" -version = "3.2.0" +version = "3.4.4" description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
optional = false -python-versions = ">=3.7.0" -files = [ - {file = "charset-normalizer-3.2.0.tar.gz", hash = "sha256:3bb3d25a8e6c0aedd251753a79ae98a093c7e7b471faa3aa9a93a81431987ace"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b87549028f680ca955556e3bd57013ab47474c3124dc069faa0b6545b6c9710"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c70087bfee18a42b4040bb9ec1ca15a08242cf5867c58726530bdf3945672ed"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a103b3a7069b62f5d4890ae1b8f0597618f628b286b03d4bc9195230b154bfa9"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94aea8eff76ee6d1cdacb07dd2123a68283cb5569e0250feab1240058f53b623"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:db901e2ac34c931d73054d9797383d0f8009991e723dab15109740a63e7f902a"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0dac0ff919ba34d4df1b6131f59ce95b08b9065233446be7e459f95554c0dc8"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:193cbc708ea3aca45e7221ae58f0fd63f933753a9bfb498a3b474878f12caaad"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09393e1b2a9461950b1c9a45d5fd251dc7c6f228acab64da1c9c0165d9c7765c"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:baacc6aee0b2ef6f3d308e197b5d7a81c0e70b06beae1f1fcacffdbd124fe0e3"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bf420121d4c8dce6b889f0e8e4ec0ca34b7f40186203f06a946fa0276ba54029"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = 
"sha256:c04a46716adde8d927adb9457bbe39cf473e1e2c2f5d0a16ceb837e5d841ad4f"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:aaf63899c94de41fe3cf934601b0f7ccb6b428c6e4eeb80da72c58eab077b19a"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62e51710986674142526ab9f78663ca2b0726066ae26b78b22e0f5e571238dd"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-win32.whl", hash = "sha256:04e57ab9fbf9607b77f7d057974694b4f6b142da9ed4a199859d9d4d5c63fe96"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:48021783bdf96e3d6de03a6e39a1171ed5bd7e8bb93fc84cc649d11490f87cea"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4957669ef390f0e6719db3613ab3a7631e68424604a7b448f079bee145da6e09"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46fb8c61d794b78ec7134a715a3e564aafc8f6b5e338417cb19fe9f57a5a9bf2"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f779d3ad205f108d14e99bb3859aa7dd8e9c68874617c72354d7ecaec2a054ac"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25c229a6ba38a35ae6e25ca1264621cc25d4d38dca2942a7fce0b67a4efe918"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2efb1bd13885392adfda4614c33d3b68dee4921fd0ac1d3988f8cbb7d589e72a"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f30b48dd7fa1474554b0b0f3fdfdd4c13b5c737a3c6284d3cdc424ec0ffff3a"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:246de67b99b6851627d945db38147d1b209a899311b1305dd84916f2b88526c6"}, - {file = 
"charset_normalizer-3.2.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bd9b3b31adcb054116447ea22caa61a285d92e94d710aa5ec97992ff5eb7cf3"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c2f5e83493748286002f9369f3e6607c565a6a90425a3a1fef5ae32a36d749d"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3170c9399da12c9dc66366e9d14da8bf7147e1e9d9ea566067bbce7bb74bd9c2"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7a4826ad2bd6b07ca615c74ab91f32f6c96d08f6fcc3902ceeedaec8cdc3bcd6"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:3b1613dd5aee995ec6d4c69f00378bbd07614702a315a2cf6c1d21461fe17c23"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e608aafdb55eb9f255034709e20d5a83b6d60c054df0802fa9c9883d0a937aa"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-win32.whl", hash = "sha256:f2a1d0fd4242bd8643ce6f98927cf9c04540af6efa92323e9d3124f57727bfc1"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:681eb3d7e02e3c3655d1b16059fbfb605ac464c834a0c629048a30fad2b27489"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c57921cda3a80d0f2b8aec7e25c8aa14479ea92b5b51b6876d975d925a2ea346"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41b25eaa7d15909cf3ac4c96088c1f266a9a93ec44f87f1d13d4a0e86c81b982"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f058f6963fd82eb143c692cecdc89e075fa0828db2e5b291070485390b2f1c9c"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a7647ebdfb9682b7bb97e2a5e7cb6ae735b1c25008a70b906aecca294ee96cf4"}, - {file = 
"charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef9df1eefada2c09a5e7a40991b9fc6ac6ef20b1372abd48d2794a316dc0449"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e03b8895a6990c9ab2cdcd0f2fe44088ca1c65ae592b8f795c3294af00a461c3"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ee4006268ed33370957f55bf2e6f4d263eaf4dc3cfc473d1d90baff6ed36ce4a"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c4983bf937209c57240cff65906b18bb35e64ae872da6a0db937d7b4af845dd7"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:3bb7fda7260735efe66d5107fb7e6af6a7c04c7fce9b2514e04b7a74b06bf5dd"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:72814c01533f51d68702802d74f77ea026b5ec52793c791e2da806a3844a46c3"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:70c610f6cbe4b9fce272c407dd9d07e33e6bf7b4aa1b7ffb6f6ded8e634e3592"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-win32.whl", hash = "sha256:a401b4598e5d3f4a9a811f3daf42ee2291790c7f9d74b18d75d6e21dda98a1a1"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:c0b21078a4b56965e2b12f247467b234734491897e99c1d51cee628da9786959"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:95eb302ff792e12aba9a8b8f8474ab229a83c103d74a750ec0bd1c1eea32e669"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a100c6d595a7f316f1b6f01d20815d916e75ff98c27a01ae817439ea7726329"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6339d047dab2780cc6220f46306628e04d9750f02f983ddb37439ca47ced7149"}, - {file = 
"charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4b749b9cc6ee664a3300bb3a273c1ca8068c46be705b6c31cf5d276f8628a94"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a38856a971c602f98472050165cea2cdc97709240373041b69030be15047691f"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f87f746ee241d30d6ed93969de31e5ffd09a2961a051e60ae6bddde9ec3583aa"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f1b185a01fe560bc8ae5f619e924407efca2191b56ce749ec84982fc59a32a"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1c8a2f4c69e08e89632defbfabec2feb8a8d99edc9f89ce33c4b9e36ab63037"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f4ac36d8e2b4cc1aa71df3dd84ff8efbe3bfb97ac41242fbcfc053c67434f46"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a386ebe437176aab38c041de1260cd3ea459c6ce5263594399880bbc398225b2"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:ccd16eb18a849fd8dcb23e23380e2f0a354e8daa0c984b8a732d9cfaba3a776d"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:e6a5bf2cba5ae1bb80b154ed68a3cfa2fa00fde979a7f50d6598d3e17d9ac20c"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:45de3f87179c1823e6d9e32156fb14c1927fcc9aba21433f088fdfb555b77c10"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-win32.whl", hash = "sha256:1000fba1057b92a65daec275aec30586c3de2401ccdcd41f8a5c1e2c87078706"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:8b2c760cfc7042b27ebdb4a43a4453bd829a5742503599144d54a032c5dc7e9e"}, - {file = 
"charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:855eafa5d5a2034b4621c74925d89c5efef61418570e5ef9b37717d9c796419c"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:203f0c8871d5a7987be20c72442488a0b8cfd0f43b7973771640fc593f56321f"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e857a2232ba53ae940d3456f7533ce6ca98b81917d47adc3c7fd55dad8fab858"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e86d77b090dbddbe78867a0275cb4df08ea195e660f1f7f13435a4649e954e5"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4fb39a81950ec280984b3a44f5bd12819953dc5fa3a7e6fa7a80db5ee853952"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dee8e57f052ef5353cf608e0b4c871aee320dd1b87d351c28764fc0ca55f9f4"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8700f06d0ce6f128de3ccdbc1acaea1ee264d2caa9ca05daaf492fde7c2a7200"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1920d4ff15ce893210c1f0c0e9d19bfbecb7983c76b33f046c13a8ffbd570252"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c1c76a1743432b4b60ab3358c937a3fe1341c828ae6194108a94c69028247f22"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f7560358a6811e52e9c4d142d497f1a6e10103d3a6881f18d04dbce3729c0e2c"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c8063cf17b19661471ecbdb3df1c84f24ad2e389e326ccaf89e3fb2484d8dd7e"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = 
"sha256:cd6dbe0238f7743d0efe563ab46294f54f9bc8f4b9bcf57c3c666cc5bc9d1299"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1249cbbf3d3b04902ff081ffbb33ce3377fa6e4c7356f759f3cd076cc138d020"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-win32.whl", hash = "sha256:6c409c0deba34f147f77efaa67b8e4bb83d2f11c8806405f76397ae5b8c0d1c9"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:7095f6fbfaa55defb6b733cfeb14efaae7a29f0b59d8cf213be4e7ca0b857b80"}, - {file = "charset_normalizer-3.2.0-py3-none-any.whl", hash = "sha256:8e098148dd37b4ce3baca71fb394c81dc5d9c7728c95df695d2dca218edf40e6"}, +python-versions = ">=3.7" +files = [ + {file = "charset_normalizer-3.4.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e824f1492727fa856dd6eda4f7cee25f8518a12f3c4a56a74e8095695089cf6d"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4bd5d4137d500351a30687c2d3971758aac9a19208fc110ccb9d7188fbe709e8"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:027f6de494925c0ab2a55eab46ae5129951638a49a34d87f4c3eda90f696b4ad"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f820802628d2694cb7e56db99213f930856014862f3fd943d290ea8438d07ca8"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:798d75d81754988d2565bff1b97ba5a44411867c0cf32b77a7e8f8d84796b10d"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d1bb833febdff5c8927f922386db610b49db6e0d4f4ee29601d71e7c2694313"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:9cd98cdc06614a2f768d2b7286d66805f94c48cde050acdbbb7db2600ab3197e"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:077fbb858e903c73f6c9db43374fd213b0b6a778106bc7032446a8e8b5b38b93"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:244bfb999c71b35de57821b8ea746b24e863398194a4014e4c76adc2bbdfeff0"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:64b55f9dce520635f018f907ff1b0df1fdc31f2795a922fb49dd14fbcdf48c84"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:faa3a41b2b66b6e50f84ae4a68c64fcd0c44355741c6374813a800cd6695db9e"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6515f3182dbe4ea06ced2d9e8666d97b46ef4c75e326b79bb624110f122551db"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc00f04ed596e9dc0da42ed17ac5e596c6ccba999ba6bd92b0e0aef2f170f2d6"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-win32.whl", hash = "sha256:f34be2938726fc13801220747472850852fe6b1ea75869a048d6f896838c896f"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-win_amd64.whl", hash = "sha256:a61900df84c667873b292c3de315a786dd8dac506704dea57bc957bd31e22c7d"}, + {file = "charset_normalizer-3.4.4-cp310-cp310-win_arm64.whl", hash = "sha256:cead0978fc57397645f12578bfd2d5ea9138ea0fac82b2f63f7f7c6877986a69"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6e1fcf0720908f200cd21aa4e6750a48ff6ce4afe7ff5a79a90d5ed8a08296f8"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f819d5fe9234f9f82d75bdfa9aef3a3d72c4d24a6e57aeaebba32a704553aa0"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:a59cb51917aa591b1c4e6a43c132f0cdc3c76dbad6155df4e28ee626cc77a0a3"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8ef3c867360f88ac904fd3f5e1f902f13307af9052646963ee08ff4f131adafc"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d9e45d7faa48ee908174d8fe84854479ef838fc6a705c9315372eacbc2f02897"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:840c25fb618a231545cbab0564a799f101b63b9901f2569faecd6b222ac72381"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca5862d5b3928c4940729dacc329aa9102900382fea192fc5e52eb69d6093815"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9c7f57c3d666a53421049053eaacdd14bbd0a528e2186fcb2e672effd053bb0"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:277e970e750505ed74c832b4bf75dac7476262ee2a013f5574dd49075879e161"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:31fd66405eaf47bb62e8cd575dc621c56c668f27d46a61d975a249930dd5e2a4"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:0d3d8f15c07f86e9ff82319b3d9ef6f4bf907608f53fe9d92b28ea9ae3d1fd89"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9f7fcd74d410a36883701fafa2482a6af2ff5ba96b9a620e9e0721e28ead5569"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf3e58c7ec8a8bed6d66a75d7fb37b55e5015b03ceae72a8e7c74495551e224"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-win32.whl", hash = "sha256:eecbc200c7fd5ddb9a7f16c7decb07b566c29fa2161a16cf67b8d068bd21690a"}, + {file = 
"charset_normalizer-3.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:5ae497466c7901d54b639cf42d5b8c1b6a4fead55215500d2f486d34db48d016"}, + {file = "charset_normalizer-3.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:65e2befcd84bc6f37095f5961e68a6f077bf44946771354a28ad434c2cce0ae1"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc"}, + {file = 
"charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525"}, + {file = "charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14"}, + {file = "charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = 
"sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = 
"sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c"}, + {file = "charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ce8a0633f41a967713a59c4139d29110c07e826d131a316b50ce11b1d79b4f84"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaabd426fe94daf8fd157c32e571c85cb12e66692f15516a83a03264b08d06c3"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c4ef880e27901b6cc782f1b95f82da9313c0eb95c3af699103088fa0ac3ce9ac"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2aaba3b0819274cc41757a1da876f810a3e4d7b6eb25699253a4effef9e8e4af"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:778d2e08eda00f4256d7f672ca9fef386071c9202f5e4607920b86d7803387f2"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f155a433c2ec037d4e8df17d18922c3a0d9b3232a396690f17175d2946f0218d"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:a8bf8d0f749c5757af2142fe7903a9df1d2e8aa3841559b2bad34b08d0e2bcf3"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:194f08cbb32dc406d6e1aea671a68be0823673db2832b38405deba2fb0d88f63"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:6aee717dcfead04c6eb1ce3bd29ac1e22663cdea57f943c87d1eab9a025438d7"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:cd4b7ca9984e5e7985c12bc60a6f173f3c958eae74f3ef6624bb6b26e2abbae4"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_riscv64.whl", hash = "sha256:b7cf1017d601aa35e6bb650b6ad28652c9cd78ee6caff19f3c28d03e1c80acbf"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:e912091979546adf63357d7e2ccff9b44f026c075aeaf25a52d0e95ad2281074"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5cb4d72eea50c8868f5288b7f7f33ed276118325c1dfd3957089f6b519e1382a"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-win32.whl", hash = "sha256:837c2ce8c5a65a2035be9b3569c684358dfbf109fd3b6969630a87535495ceaa"}, + {file = "charset_normalizer-3.4.4-cp38-cp38-win_amd64.whl", hash = "sha256:44c2a8734b333e0578090c4cd6b16f275e07aa6614ca8715e6c038e865e70576"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a9768c477b9d7bd54bc0c86dbaebdec6f03306675526c9927c0e8a04e8f94af9"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1bee1e43c28aa63cb16e5c14e582580546b08e535299b8b6158a7c9c768a1f3d"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fd44c878ea55ba351104cb93cc85e74916eb8fa440ca7903e57575e97394f608"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:0f04b14ffe5fdc8c4933862d8306109a2c51e0704acfa35d51598eb45a1e89fc"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:cd09d08005f958f370f539f186d10aec3377d55b9eeb0d796025d4886119d76e"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4fe7859a4e3e8457458e2ff592f15ccb02f3da787fcd31e0183879c3ad4692a1"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fa09f53c465e532f4d3db095e0c55b615f010ad81803d383195b6b5ca6cbf5f3"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7fa17817dc5625de8a027cb8b26d9fefa3ea28c8253929b8d6649e705d2835b6"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:5947809c8a2417be3267efc979c47d76a079758166f7d43ef5ae8e9f92751f88"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:4902828217069c3c5c71094537a8e623f5d097858ac6ca8252f7b4d10b7560f1"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:7c308f7e26e4363d79df40ca5b2be1c6ba9f02bdbccfed5abddb7859a6ce72cf"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:2c9d3c380143a1fedbff95a312aa798578371eb29da42106a29019368a475318"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:cb01158d8b88ee68f15949894ccc6712278243d95f344770fa7593fa2d94410c"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-win32.whl", hash = "sha256:2677acec1a2f8ef614c6888b5b4ae4060cc184174a938ed4e8ef690e15d3e505"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-win_amd64.whl", hash = "sha256:f8e160feb2aed042cd657a72acc0b481212ed28b1b9a95c0cee1621b524e1966"}, + {file = "charset_normalizer-3.4.4-cp39-cp39-win_arm64.whl", hash = 
"sha256:b5d84d37db046c5ca74ee7bb47dd6cbc13f80665fdde3e8040bdd3fb015ecb50"}, + {file = "charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f"}, + {file = "charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a"}, ] [[package]] name = "click" -version = "8.1.7" +version = "8.1.8" description = "Composable command line interface toolkit" optional = false python-versions = ">=3.7" files = [ - {file = "click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28"}, - {file = "click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de"}, + {file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"}, + {file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"}, ] [package.dependencies] @@ -314,29 +373,52 @@ tomli = {version = "*", optional = true, markers = "python_full_version <= \"3.1 toml = ["tomli"] [[package]] -name = "docutils" -version = "0.20.1" -description = "Docutils -- Python Documentation Utilities" +name = "exceptiongroup" +version = "1.3.0" +description = "Backport of PEP 654 (exception groups)" optional = false python-versions = ">=3.7" files = [ - {file = "docutils-0.20.1-py3-none-any.whl", hash = "sha256:96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6"}, - {file = "docutils-0.20.1.tar.gz", hash = "sha256:f08a4e276c3a1583a86dce3e34aba3fe04d02bba2dd51ed16106244e8a923e3b"}, + {file = "exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10"}, + {file = "exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88"}, ] +[package.dependencies] +typing-extensions = {version = ">=4.6.0", markers = 
"python_version < \"3.13\""} + +[package.extras] +test = ["pytest (>=6)"] + [[package]] -name = "exceptiongroup" -version = "1.1.3" -description = "Backport of PEP 654 (exception groups)" +name = "ghp-import" +version = "2.1.0" +description = "Copy your docs directly to the gh-pages branch." optional = false -python-versions = ">=3.7" +python-versions = "*" files = [ - {file = "exceptiongroup-1.1.3-py3-none-any.whl", hash = "sha256:343280667a4585d195ca1cf9cef84a4e178c4b6cf2274caef9859782b567d5e3"}, - {file = "exceptiongroup-1.1.3.tar.gz", hash = "sha256:097acd85d473d75af5bb98e41b61ff7fe35efe6675e4f9370ec6ec5126d160e9"}, + {file = "ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343"}, + {file = "ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619"}, ] +[package.dependencies] +python-dateutil = ">=2.8.1" + [package.extras] -test = ["pytest (>=6)"] +dev = ["flake8", "markdown", "twine", "wheel"] + +[[package]] +name = "griffe" +version = "1.14.0" +description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "griffe-1.14.0-py3-none-any.whl", hash = "sha256:0e9d52832cccf0f7188cfe585ba962d2674b241c01916d780925df34873bceb0"}, + {file = "griffe-1.14.0.tar.gz", hash = "sha256:9d2a15c1eca966d68e00517de5d69dd1bc5c9f2335ef6c1775362ba5b8651a13"}, +] + +[package.dependencies] +colorama = ">=0.4" [[package]] name = "html5rdf" @@ -351,54 +433,50 @@ files = [ [[package]] name = "idna" -version = "3.4" +version = "3.11" description = "Internationalized Domain Names in Applications (IDNA)" optional = false -python-versions = ">=3.5" +python-versions = ">=3.8" files = [ - {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"}, - {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"}, + {file = "idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea"}, + {file = "idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902"}, ] -[[package]] -name = "imagesize" -version = "1.4.1" -description = "Getting image size from png/jpeg/jpeg2000/gif file" -optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" -files = [ - {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"}, - {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"}, -] +[package.extras] +all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] [[package]] name = "importlib-metadata" -version = "6.8.0" +version = "8.5.0" description = "Read metadata from Python packages" optional = false python-versions = ">=3.8" files = [ - {file = "importlib_metadata-6.8.0-py3-none-any.whl", hash = "sha256:3ebb78df84a805d7698245025b975d9d67053cd94c79245ba4b3eb694abe68bb"}, - {file = 
"importlib_metadata-6.8.0.tar.gz", hash = "sha256:dbace7892d8c0c4ac1ad096662232f831d4e64f4c4545bd53016a3e9d4654743"}, + {file = "importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b"}, + {file = "importlib_metadata-8.5.0.tar.gz", hash = "sha256:71522656f0abace1d072b9e5481a48f07c138e00f079c38c8f883823f9c26bd7"}, ] [package.dependencies] -zipp = ">=0.5" +zipp = ">=3.20" [package.extras] -docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] perf = ["ipython"] -testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)", "pytest-ruff"] +test = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "packaging", "pyfakefs", "pytest (>=6,!=8.1.*)", "pytest-perf (>=0.9.2)"] +type = ["pytest-mypy"] [[package]] name = "iniconfig" -version = "2.0.0" +version = "2.1.0" description = "brain-dead simple config-ini parsing" optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"}, - {file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"}, + {file = "iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760"}, + {file = "iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7"}, ] 
[[package]] @@ -414,13 +492,13 @@ files = [ [[package]] name = "jinja2" -version = "3.1.2" +version = "3.1.6" description = "A very fast and expressive template engine." optional = false python-versions = ">=3.7" files = [ - {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"}, - {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"}, + {file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"}, + {file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"}, ] [package.dependencies] @@ -431,149 +509,143 @@ i18n = ["Babel (>=2.7)"] [[package]] name = "lxml" -version = "5.3.1" +version = "5.4.0" description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API." optional = true python-versions = ">=3.6" files = [ - {file = "lxml-5.3.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a4058f16cee694577f7e4dd410263cd0ef75644b43802a689c2b3c2a7e69453b"}, - {file = "lxml-5.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:364de8f57d6eda0c16dcfb999af902da31396949efa0e583e12675d09709881b"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:528f3a0498a8edc69af0559bdcf8a9f5a8bf7c00051a6ef3141fdcf27017bbf5"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db4743e30d6f5f92b6d2b7c86b3ad250e0bad8dee4b7ad8a0c44bfb276af89a3"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:17b5d7f8acf809465086d498d62a981fa6a56d2718135bb0e4aa48c502055f5c"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:928e75a7200a4c09e6efc7482a1337919cc61fe1ba289f297827a5b76d8969c2"}, 
- {file = "lxml-5.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a997b784a639e05b9d4053ef3b20c7e447ea80814a762f25b8ed5a89d261eac"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:7b82e67c5feb682dbb559c3e6b78355f234943053af61606af126df2183b9ef9"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_ppc64le.whl", hash = "sha256:f1de541a9893cf8a1b1db9bf0bf670a2decab42e3e82233d36a74eda7822b4c9"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_s390x.whl", hash = "sha256:de1fc314c3ad6bc2f6bd5b5a5b9357b8c6896333d27fdbb7049aea8bd5af2d79"}, - {file = "lxml-5.3.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:7c0536bd9178f754b277a3e53f90f9c9454a3bd108b1531ffff720e082d824f2"}, - {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:68018c4c67d7e89951a91fbd371e2e34cd8cfc71f0bb43b5332db38497025d51"}, - {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:aa826340a609d0c954ba52fd831f0fba2a4165659ab0ee1a15e4aac21f302406"}, - {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:796520afa499732191e39fc95b56a3b07f95256f2d22b1c26e217fb69a9db5b5"}, - {file = "lxml-5.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3effe081b3135237da6e4c4530ff2a868d3f80be0bda027e118a5971285d42d0"}, - {file = "lxml-5.3.1-cp310-cp310-win32.whl", hash = "sha256:a22f66270bd6d0804b02cd49dae2b33d4341015545d17f8426f2c4e22f557a23"}, - {file = "lxml-5.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:0bcfadea3cdc68e678d2b20cb16a16716887dd00a881e16f7d806c2138b8ff0c"}, - {file = "lxml-5.3.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:e220f7b3e8656ab063d2eb0cd536fafef396829cafe04cb314e734f87649058f"}, - {file = "lxml-5.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0f2cfae0688fd01f7056a17367e3b84f37c545fb447d7282cf2c242b16262607"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:67d2f8ad9dcc3a9e826bdc7802ed541a44e124c29b7d95a679eeb58c1c14ade8"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db0c742aad702fd5d0c6611a73f9602f20aec2007c102630c06d7633d9c8f09a"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:198bb4b4dd888e8390afa4f170d4fa28467a7eaf857f1952589f16cfbb67af27"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d2a3e412ce1849be34b45922bfef03df32d1410a06d1cdeb793a343c2f1fd666"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2b8969dbc8d09d9cd2ae06362c3bad27d03f433252601ef658a49bd9f2b22d79"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:5be8f5e4044146a69c96077c7e08f0709c13a314aa5315981185c1f00235fe65"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_ppc64le.whl", hash = "sha256:133f3493253a00db2c870d3740bc458ebb7d937bd0a6a4f9328373e0db305709"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_s390x.whl", hash = "sha256:52d82b0d436edd6a1d22d94a344b9a58abd6c68c357ed44f22d4ba8179b37629"}, - {file = "lxml-5.3.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:1b6f92e35e2658a5ed51c6634ceb5ddae32053182851d8cad2a5bc102a359b33"}, - {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:203b1d3eaebd34277be06a3eb880050f18a4e4d60861efba4fb946e31071a295"}, - {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:155e1a5693cf4b55af652f5c0f78ef36596c7f680ff3ec6eb4d7d85367259b2c"}, - {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:22ec2b3c191f43ed21f9545e9df94c37c6b49a5af0a874008ddc9132d49a2d9c"}, - {file = "lxml-5.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7eda194dd46e40ec745bf76795a7cccb02a6a41f445ad49d3cf66518b0bd9cff"}, - {file = "lxml-5.3.1-cp311-cp311-win32.whl", hash = 
"sha256:fb7c61d4be18e930f75948705e9718618862e6fc2ed0d7159b2262be73f167a2"}, - {file = "lxml-5.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:c809eef167bf4a57af4b03007004896f5c60bd38dc3852fcd97a26eae3d4c9e6"}, - {file = "lxml-5.3.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:e69add9b6b7b08c60d7ff0152c7c9a6c45b4a71a919be5abde6f98f1ea16421c"}, - {file = "lxml-5.3.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:4e52e1b148867b01c05e21837586ee307a01e793b94072d7c7b91d2c2da02ffe"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a4b382e0e636ed54cd278791d93fe2c4f370772743f02bcbe431a160089025c9"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c2e49dc23a10a1296b04ca9db200c44d3eb32c8d8ec532e8c1fd24792276522a"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4399b4226c4785575fb20998dc571bc48125dc92c367ce2602d0d70e0c455eb0"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5412500e0dc5481b1ee9cf6b38bb3b473f6e411eb62b83dc9b62699c3b7b79f7"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c93ed3c998ea8472be98fb55aed65b5198740bfceaec07b2eba551e55b7b9ae"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:63d57fc94eb0bbb4735e45517afc21ef262991d8758a8f2f05dd6e4174944519"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_ppc64le.whl", hash = "sha256:b450d7cabcd49aa7ab46a3c6aa3ac7e1593600a1a0605ba536ec0f1b99a04322"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_s390x.whl", hash = "sha256:4df0ec814b50275ad6a99bc82a38b59f90e10e47714ac9871e1b223895825468"}, - {file = "lxml-5.3.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:d184f85ad2bb1f261eac55cddfcf62a70dee89982c978e92b9a74a1bfef2e367"}, - {file = 
"lxml-5.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b725e70d15906d24615201e650d5b0388b08a5187a55f119f25874d0103f90dd"}, - {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:a31fa7536ec1fb7155a0cd3a4e3d956c835ad0a43e3610ca32384d01f079ea1c"}, - {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:3c3c8b55c7fc7b7e8877b9366568cc73d68b82da7fe33d8b98527b73857a225f"}, - {file = "lxml-5.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:d61ec60945d694df806a9aec88e8f29a27293c6e424f8ff91c80416e3c617645"}, - {file = "lxml-5.3.1-cp312-cp312-win32.whl", hash = "sha256:f4eac0584cdc3285ef2e74eee1513a6001681fd9753b259e8159421ed28a72e5"}, - {file = "lxml-5.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:29bfc8d3d88e56ea0a27e7c4897b642706840247f59f4377d81be8f32aa0cfbf"}, - {file = "lxml-5.3.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c093c7088b40d8266f57ed71d93112bd64c6724d31f0794c1e52cc4857c28e0e"}, - {file = "lxml-5.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b0884e3f22d87c30694e625b1e62e6f30d39782c806287450d9dc2fdf07692fd"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1637fa31ec682cd5760092adfabe86d9b718a75d43e65e211d5931809bc111e7"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a364e8e944d92dcbf33b6b494d4e0fb3499dcc3bd9485beb701aa4b4201fa414"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:779e851fd0e19795ccc8a9bb4d705d6baa0ef475329fe44a13cf1e962f18ff1e"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c4393600915c308e546dc7003d74371744234e8444a28622d76fe19b98fa59d1"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:673b9d8e780f455091200bba8534d5f4f465944cbdd61f31dc832d70e29064a5"}, - {file = 
"lxml-5.3.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:2e4a570f6a99e96c457f7bec5ad459c9c420ee80b99eb04cbfcfe3fc18ec6423"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_ppc64le.whl", hash = "sha256:71f31eda4e370f46af42fc9f264fafa1b09f46ba07bdbee98f25689a04b81c20"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_s390x.whl", hash = "sha256:42978a68d3825eaac55399eb37a4d52012a205c0c6262199b8b44fcc6fd686e8"}, - {file = "lxml-5.3.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:8b1942b3e4ed9ed551ed3083a2e6e0772de1e5e3aca872d955e2e86385fb7ff9"}, - {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:85c4f11be9cf08917ac2a5a8b6e1ef63b2f8e3799cec194417e76826e5f1de9c"}, - {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:231cf4d140b22a923b1d0a0a4e0b4f972e5893efcdec188934cc65888fd0227b"}, - {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:5865b270b420eda7b68928d70bb517ccbe045e53b1a428129bb44372bf3d7dd5"}, - {file = "lxml-5.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:dbf7bebc2275016cddf3c997bf8a0f7044160714c64a9b83975670a04e6d2252"}, - {file = "lxml-5.3.1-cp313-cp313-win32.whl", hash = "sha256:d0751528b97d2b19a388b302be2a0ee05817097bab46ff0ed76feeec24951f78"}, - {file = "lxml-5.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:91fb6a43d72b4f8863d21f347a9163eecbf36e76e2f51068d59cd004c506f332"}, - {file = "lxml-5.3.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:016b96c58e9a4528219bb563acf1aaaa8bc5452e7651004894a973f03b84ba81"}, - {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82a4bb10b0beef1434fb23a09f001ab5ca87895596b4581fd53f1e5145a8934a"}, - {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d68eeef7b4d08a25e51897dac29bcb62aba830e9ac6c4e3297ee7c6a0cf6439"}, - {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = 
"sha256:f12582b8d3b4c6be1d298c49cb7ae64a3a73efaf4c2ab4e37db182e3545815ac"}, - {file = "lxml-5.3.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:2df7ed5edeb6bd5590914cd61df76eb6cce9d590ed04ec7c183cf5509f73530d"}, - {file = "lxml-5.3.1-cp36-cp36m-musllinux_1_2_x86_64.whl", hash = "sha256:585c4dc429deebc4307187d2b71ebe914843185ae16a4d582ee030e6cfbb4d8a"}, - {file = "lxml-5.3.1-cp36-cp36m-win32.whl", hash = "sha256:06a20d607a86fccab2fc15a77aa445f2bdef7b49ec0520a842c5c5afd8381576"}, - {file = "lxml-5.3.1-cp36-cp36m-win_amd64.whl", hash = "sha256:057e30d0012439bc54ca427a83d458752ccda725c1c161cc283db07bcad43cf9"}, - {file = "lxml-5.3.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4867361c049761a56bd21de507cab2c2a608c55102311d142ade7dab67b34f32"}, - {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3dddf0fb832486cc1ea71d189cb92eb887826e8deebe128884e15020bb6e3f61"}, - {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bcc211542f7af6f2dfb705f5f8b74e865592778e6cafdfd19c792c244ccce19"}, - {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaca5a812f050ab55426c32177091130b1e49329b3f002a32934cd0245571307"}, - {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:236610b77589faf462337b3305a1be91756c8abc5a45ff7ca8f245a71c5dab70"}, - {file = "lxml-5.3.1-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:aed57b541b589fa05ac248f4cb1c46cbb432ab82cbd467d1c4f6a2bdc18aecf9"}, - {file = "lxml-5.3.1-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:75fa3d6946d317ffc7016a6fcc44f42db6d514b7fdb8b4b28cbe058303cb6e53"}, - {file = "lxml-5.3.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:96eef5b9f336f623ffc555ab47a775495e7e8846dde88de5f941e2906453a1ce"}, - {file = "lxml-5.3.1-cp37-cp37m-win32.whl", hash = 
"sha256:ef45f31aec9be01379fc6c10f1d9c677f032f2bac9383c827d44f620e8a88407"}, - {file = "lxml-5.3.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a0611da6b07dd3720f492db1b463a4d1175b096b49438761cc9f35f0d9eaaef5"}, - {file = "lxml-5.3.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b2aca14c235c7a08558fe0a4786a1a05873a01e86b474dfa8f6df49101853a4e"}, - {file = "lxml-5.3.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae82fce1d964f065c32c9517309f0c7be588772352d2f40b1574a214bd6e6098"}, - {file = "lxml-5.3.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7aae7a3d63b935babfdc6864b31196afd5145878ddd22f5200729006366bc4d5"}, - {file = "lxml-5.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e8e0d177b1fe251c3b1b914ab64135475c5273c8cfd2857964b2e3bb0fe196a7"}, - {file = "lxml-5.3.1-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:6c4dd3bfd0c82400060896717dd261137398edb7e524527438c54a8c34f736bf"}, - {file = "lxml-5.3.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:f1208c1c67ec9e151d78aa3435aa9b08a488b53d9cfac9b699f15255a3461ef2"}, - {file = "lxml-5.3.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:c6aacf00d05b38a5069826e50ae72751cb5bc27bdc4d5746203988e429b385bb"}, - {file = "lxml-5.3.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5881aaa4bf3a2d086c5f20371d3a5856199a0d8ac72dd8d0dbd7a2ecfc26ab73"}, - {file = "lxml-5.3.1-cp38-cp38-win32.whl", hash = "sha256:45fbb70ccbc8683f2fb58bea89498a7274af1d9ec7995e9f4af5604e028233fc"}, - {file = "lxml-5.3.1-cp38-cp38-win_amd64.whl", hash = "sha256:7512b4d0fc5339d5abbb14d1843f70499cab90d0b864f790e73f780f041615d7"}, - {file = "lxml-5.3.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5885bc586f1edb48e5d68e7a4b4757b5feb2a496b64f462b4d65950f5af3364f"}, - {file = "lxml-5.3.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1b92fe86e04f680b848fff594a908edfa72b31bfc3499ef7433790c11d4c8cd8"}, - {file 
= "lxml-5.3.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a091026c3bf7519ab1e64655a3f52a59ad4a4e019a6f830c24d6430695b1cf6a"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8ffb141361108e864ab5f1813f66e4e1164181227f9b1f105b042729b6c15125"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3715cdf0dd31b836433af9ee9197af10e3df41d273c19bb249230043667a5dfd"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:88b72eb7222d918c967202024812c2bfb4048deeb69ca328363fb8e15254c549"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa59974880ab5ad8ef3afaa26f9bda148c5f39e06b11a8ada4660ecc9fb2feb3"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:3bb8149840daf2c3f97cebf00e4ed4a65a0baff888bf2605a8d0135ff5cf764e"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_ppc64le.whl", hash = "sha256:0d6b2fa86becfa81f0a0271ccb9eb127ad45fb597733a77b92e8a35e53414914"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_s390x.whl", hash = "sha256:136bf638d92848a939fd8f0e06fcf92d9f2e4b57969d94faae27c55f3d85c05b"}, - {file = "lxml-5.3.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:89934f9f791566e54c1d92cdc8f8fd0009447a5ecdb1ec6b810d5f8c4955f6be"}, - {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:a8ade0363f776f87f982572c2860cc43c65ace208db49c76df0a21dde4ddd16e"}, - {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:bfbbab9316330cf81656fed435311386610f78b6c93cc5db4bebbce8dd146675"}, - {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:172d65f7c72a35a6879217bcdb4bb11bc88d55fb4879e7569f55616062d387c2"}, - {file = "lxml-5.3.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:e3c623923967f3e5961d272718655946e5322b8d058e094764180cdee7bab1af"}, - {file 
= "lxml-5.3.1-cp39-cp39-win32.whl", hash = "sha256:ce0930a963ff593e8bb6fda49a503911accc67dee7e5445eec972668e672a0f0"}, - {file = "lxml-5.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:f7b64fcd670bca8800bc10ced36620c6bbb321e7bc1214b9c0c0df269c1dddc2"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:afa578b6524ff85fb365f454cf61683771d0170470c48ad9d170c48075f86725"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:67f5e80adf0aafc7b5454f2c1cb0cde920c9b1f2cbd0485f07cc1d0497c35c5d"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dd0b80ac2d8f13ffc906123a6f20b459cb50a99222d0da492360512f3e50f84"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:422c179022ecdedbe58b0e242607198580804253da220e9454ffe848daa1cfd2"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:524ccfded8989a6595dbdda80d779fb977dbc9a7bc458864fc9a0c2fc15dc877"}, - {file = "lxml-5.3.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:48fd46bf7155def2e15287c6f2b133a2f78e2d22cdf55647269977b873c65499"}, - {file = "lxml-5.3.1-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:05123fad495a429f123307ac6d8fd6f977b71e9a0b6d9aeeb8f80c017cb17131"}, - {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a243132767150a44e6a93cd1dde41010036e1cbc63cc3e9fe1712b277d926ce3"}, - {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c92ea6d9dd84a750b2bae72ff5e8cf5fdd13e58dda79c33e057862c29a8d5b50"}, - {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:2f1be45d4c15f237209bbf123a0e05b5d630c8717c42f59f31ea9eae2ad89394"}, - {file = "lxml-5.3.1-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:a83d3adea1e0ee36dac34627f78ddd7f093bb9cfc0a8e97f1572a949b695cb98"}, - {file = 
"lxml-5.3.1-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:3edbb9c9130bac05d8c3fe150c51c337a471cc7fdb6d2a0a7d3a88e88a829314"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2f23cf50eccb3255b6e913188291af0150d89dab44137a69e14e4dcb7be981f1"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df7e5edac4778127f2bf452e0721a58a1cfa4d1d9eac63bdd650535eb8543615"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:094b28ed8a8a072b9e9e2113a81fda668d2053f2ca9f2d202c2c8c7c2d6516b1"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:514fe78fc4b87e7a7601c92492210b20a1b0c6ab20e71e81307d9c2e377c64de"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:8fffc08de02071c37865a155e5ea5fce0282e1546fd5bde7f6149fcaa32558ac"}, - {file = "lxml-5.3.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:4b0d5cdba1b655d5b18042ac9c9ff50bda33568eb80feaaca4fc237b9c4fbfde"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:3031e4c16b59424e8d78522c69b062d301d951dc55ad8685736c3335a97fc270"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb659702a45136c743bc130760c6f137870d4df3a9e14386478b8a0511abcfca"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a11b16a33656ffc43c92a5343a28dc71eefe460bcc2a4923a96f292692709f6"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c5ae125276f254b01daa73e2c103363d3e99e3e10505686ac7d9d2442dd4627a"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c76722b5ed4a31ba103e0dc77ab869222ec36efe1a614e42e9bcea88a36186fe"}, - {file = "lxml-5.3.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:33e06717c00c788ab4e79bc4726ecc50c54b9bfb55355eae21473c145d83c2d2"}, - {file = 
"lxml-5.3.1.tar.gz", hash = "sha256:106b7b5d2977b339f1e97efe2778e2ab20e99994cbb0ec5e55771ed0795920c8"}, + {file = "lxml-5.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e7bc6df34d42322c5289e37e9971d6ed114e3776b45fa879f734bded9d1fea9c"}, + {file = "lxml-5.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6854f8bd8a1536f8a1d9a3655e6354faa6406621cf857dc27b681b69860645c7"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:696ea9e87442467819ac22394ca36cb3d01848dad1be6fac3fb612d3bd5a12cf"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ef80aeac414f33c24b3815ecd560cee272786c3adfa5f31316d8b349bfade28"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b9c2754cef6963f3408ab381ea55f47dabc6f78f4b8ebb0f0b25cf1ac1f7609"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7a62cc23d754bb449d63ff35334acc9f5c02e6dae830d78dab4dd12b78a524f4"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f82125bc7203c5ae8633a7d5d20bcfdff0ba33e436e4ab0abc026a53a8960b7"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:b67319b4aef1a6c56576ff544b67a2a6fbd7eaee485b241cabf53115e8908b8f"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_28_ppc64le.whl", hash = "sha256:a8ef956fce64c8551221f395ba21d0724fed6b9b6242ca4f2f7beb4ce2f41997"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_28_s390x.whl", hash = "sha256:0a01ce7d8479dce84fc03324e3b0c9c90b1ece9a9bb6a1b6c9025e7e4520e78c"}, + {file = "lxml-5.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:91505d3ddebf268bb1588eb0f63821f738d20e1e7f05d3c647a5ca900288760b"}, + {file = "lxml-5.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a3bcdde35d82ff385f4ede021df801b5c4a5bcdfb61ea87caabcebfc4945dc1b"}, + {file = 
"lxml-5.4.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:aea7c06667b987787c7d1f5e1dfcd70419b711cdb47d6b4bb4ad4b76777a0563"}, + {file = "lxml-5.4.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:a7fb111eef4d05909b82152721a59c1b14d0f365e2be4c742a473c5d7372f4f5"}, + {file = "lxml-5.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:43d549b876ce64aa18b2328faff70f5877f8c6dede415f80a2f799d31644d776"}, + {file = "lxml-5.4.0-cp310-cp310-win32.whl", hash = "sha256:75133890e40d229d6c5837b0312abbe5bac1c342452cf0e12523477cd3aa21e7"}, + {file = "lxml-5.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:de5b4e1088523e2b6f730d0509a9a813355b7f5659d70eb4f319c76beea2e250"}, + {file = "lxml-5.4.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:98a3912194c079ef37e716ed228ae0dcb960992100461b704aea4e93af6b0bb9"}, + {file = "lxml-5.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0ea0252b51d296a75f6118ed0d8696888e7403408ad42345d7dfd0d1e93309a7"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b92b69441d1bd39f4940f9eadfa417a25862242ca2c396b406f9272ef09cdcaa"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20e16c08254b9b6466526bc1828d9370ee6c0d60a4b64836bc3ac2917d1e16df"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7605c1c32c3d6e8c990dd28a0970a3cbbf1429d5b92279e37fda05fb0c92190e"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ecf4c4b83f1ab3d5a7ace10bafcb6f11df6156857a3c418244cef41ca9fa3e44"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0cef4feae82709eed352cd7e97ae062ef6ae9c7b5dbe3663f104cd2c0e8d94ba"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:df53330a3bff250f10472ce96a9af28628ff1f4efc51ccba351a8820bca2a8ba"}, + {file = 
"lxml-5.4.0-cp311-cp311-manylinux_2_28_ppc64le.whl", hash = "sha256:aefe1a7cb852fa61150fcb21a8c8fcea7b58c4cb11fbe59c97a0a4b31cae3c8c"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_28_s390x.whl", hash = "sha256:ef5a7178fcc73b7d8c07229e89f8eb45b2908a9238eb90dcfc46571ccf0383b8"}, + {file = "lxml-5.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:d2ed1b3cb9ff1c10e6e8b00941bb2e5bb568b307bfc6b17dffbbe8be5eecba86"}, + {file = "lxml-5.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:72ac9762a9f8ce74c9eed4a4e74306f2f18613a6b71fa065495a67ac227b3056"}, + {file = "lxml-5.4.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f5cb182f6396706dc6cc1896dd02b1c889d644c081b0cdec38747573db88a7d7"}, + {file = "lxml-5.4.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:3a3178b4873df8ef9457a4875703488eb1622632a9cee6d76464b60e90adbfcd"}, + {file = "lxml-5.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e094ec83694b59d263802ed03a8384594fcce477ce484b0cbcd0008a211ca751"}, + {file = "lxml-5.4.0-cp311-cp311-win32.whl", hash = "sha256:4329422de653cdb2b72afa39b0aa04252fca9071550044904b2e7036d9d97fe4"}, + {file = "lxml-5.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:fd3be6481ef54b8cfd0e1e953323b7aa9d9789b94842d0e5b142ef4bb7999539"}, + {file = "lxml-5.4.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:b5aff6f3e818e6bdbbb38e5967520f174b18f539c2b9de867b1e7fde6f8d95a4"}, + {file = "lxml-5.4.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:942a5d73f739ad7c452bf739a62a0f83e2578afd6b8e5406308731f4ce78b16d"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:460508a4b07364d6abf53acaa0a90b6d370fafde5693ef37602566613a9b0779"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:529024ab3a505fed78fe3cc5ddc079464e709f6c892733e3f5842007cec8ac6e"}, + {file = 
"lxml-5.4.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ca56ebc2c474e8f3d5761debfd9283b8b18c76c4fc0967b74aeafba1f5647f9"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a81e1196f0a5b4167a8dafe3a66aa67c4addac1b22dc47947abd5d5c7a3f24b5"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00b8686694423ddae324cf614e1b9659c2edb754de617703c3d29ff568448df5"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:c5681160758d3f6ac5b4fea370495c48aac0989d6a0f01bb9a72ad8ef5ab75c4"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_28_ppc64le.whl", hash = "sha256:2dc191e60425ad70e75a68c9fd90ab284df64d9cd410ba8d2b641c0c45bc006e"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_28_s390x.whl", hash = "sha256:67f779374c6b9753ae0a0195a892a1c234ce8416e4448fe1e9f34746482070a7"}, + {file = "lxml-5.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:79d5bfa9c1b455336f52343130b2067164040604e41f6dc4d8313867ed540079"}, + {file = "lxml-5.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3d3c30ba1c9b48c68489dc1829a6eede9873f52edca1dda900066542528d6b20"}, + {file = "lxml-5.4.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1af80c6316ae68aded77e91cd9d80648f7dd40406cef73df841aa3c36f6907c8"}, + {file = "lxml-5.4.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:4d885698f5019abe0de3d352caf9466d5de2baded00a06ef3f1216c1a58ae78f"}, + {file = "lxml-5.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:aea53d51859b6c64e7c51d522c03cc2c48b9b5d6172126854cc7f01aa11f52bc"}, + {file = "lxml-5.4.0-cp312-cp312-win32.whl", hash = "sha256:d90b729fd2732df28130c064aac9bb8aff14ba20baa4aee7bd0795ff1187545f"}, + {file = "lxml-5.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1dc4ca99e89c335a7ed47d38964abcb36c5910790f9bd106f2a8fa2ee0b909d2"}, + {file = "lxml-5.4.0-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:773e27b62920199c6197130632c18fb7ead3257fce1ffb7d286912e56ddb79e0"}, + {file = "lxml-5.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ce9c671845de9699904b1e9df95acfe8dfc183f2310f163cdaa91a3535af95de"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9454b8d8200ec99a224df8854786262b1bd6461f4280064c807303c642c05e76"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cccd007d5c95279e529c146d095f1d39ac05139de26c098166c4beb9374b0f4d"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0fce1294a0497edb034cb416ad3e77ecc89b313cff7adbee5334e4dc0d11f422"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:24974f774f3a78ac12b95e3a20ef0931795ff04dbb16db81a90c37f589819551"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:497cab4d8254c2a90bf988f162ace2ddbfdd806fce3bda3f581b9d24c852e03c"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:e794f698ae4c5084414efea0f5cc9f4ac562ec02d66e1484ff822ef97c2cadff"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_28_ppc64le.whl", hash = "sha256:2c62891b1ea3094bb12097822b3d44b93fc6c325f2043c4d2736a8ff09e65f60"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_28_s390x.whl", hash = "sha256:142accb3e4d1edae4b392bd165a9abdee8a3c432a2cca193df995bc3886249c8"}, + {file = "lxml-5.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:1a42b3a19346e5601d1b8296ff6ef3d76038058f311902edd574461e9c036982"}, + {file = "lxml-5.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4291d3c409a17febf817259cb37bc62cb7eb398bcc95c1356947e2871911ae61"}, + {file = "lxml-5.4.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4f5322cf38fe0e21c2d73901abf68e6329dc02a4994e483adbcf92b568a09a54"}, + {file = 
"lxml-5.4.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:0be91891bdb06ebe65122aa6bf3fc94489960cf7e03033c6f83a90863b23c58b"}, + {file = "lxml-5.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:15a665ad90054a3d4f397bc40f73948d48e36e4c09f9bcffc7d90c87410e478a"}, + {file = "lxml-5.4.0-cp313-cp313-win32.whl", hash = "sha256:d5663bc1b471c79f5c833cffbc9b87d7bf13f87e055a5c86c363ccd2348d7e82"}, + {file = "lxml-5.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:bcb7a1096b4b6b24ce1ac24d4942ad98f983cd3810f9711bcd0293f43a9d8b9f"}, + {file = "lxml-5.4.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:7be701c24e7f843e6788353c055d806e8bd8466b52907bafe5d13ec6a6dbaecd"}, + {file = "lxml-5.4.0-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fb54f7c6bafaa808f27166569b1511fc42701a7713858dddc08afdde9746849e"}, + {file = "lxml-5.4.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:97dac543661e84a284502e0cf8a67b5c711b0ad5fb661d1bd505c02f8cf716d7"}, + {file = "lxml-5.4.0-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:c70e93fba207106cb16bf852e421c37bbded92acd5964390aad07cb50d60f5cf"}, + {file = "lxml-5.4.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9c886b481aefdf818ad44846145f6eaf373a20d200b5ce1a5c8e1bc2d8745410"}, + {file = "lxml-5.4.0-cp36-cp36m-musllinux_1_2_x86_64.whl", hash = "sha256:fa0e294046de09acd6146be0ed6727d1f42ded4ce3ea1e9a19c11b6774eea27c"}, + {file = "lxml-5.4.0-cp36-cp36m-win32.whl", hash = "sha256:61c7bbf432f09ee44b1ccaa24896d21075e533cd01477966a5ff5a71d88b2f56"}, + {file = "lxml-5.4.0-cp36-cp36m-win_amd64.whl", hash = "sha256:7ce1a171ec325192c6a636b64c94418e71a1964f56d002cc28122fceff0b6121"}, + {file = "lxml-5.4.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:795f61bcaf8770e1b37eec24edf9771b307df3af74d1d6f27d812e15a9ff3872"}, + {file = 
"lxml-5.4.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:29f451a4b614a7b5b6c2e043d7b64a15bd8304d7e767055e8ab68387a8cacf4e"}, + {file = "lxml-5.4.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:891f7f991a68d20c75cb13c5c9142b2a3f9eb161f1f12a9489c82172d1f133c0"}, + {file = "lxml-5.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4aa412a82e460571fad592d0f93ce9935a20090029ba08eca05c614f99b0cc92"}, + {file = "lxml-5.4.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:ac7ba71f9561cd7d7b55e1ea5511543c0282e2b6450f122672a2694621d63b7e"}, + {file = "lxml-5.4.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:c5d32f5284012deaccd37da1e2cd42f081feaa76981f0eaa474351b68df813c5"}, + {file = "lxml-5.4.0-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:ce31158630a6ac85bddd6b830cffd46085ff90498b397bd0a259f59d27a12188"}, + {file = "lxml-5.4.0-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:31e63621e073e04697c1b2d23fcb89991790eef370ec37ce4d5d469f40924ed6"}, + {file = "lxml-5.4.0-cp37-cp37m-win32.whl", hash = "sha256:be2ba4c3c5b7900246a8f866580700ef0d538f2ca32535e991027bdaba944063"}, + {file = "lxml-5.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:09846782b1ef650b321484ad429217f5154da4d6e786636c38e434fa32e94e49"}, + {file = "lxml-5.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:eaf24066ad0b30917186420d51e2e3edf4b0e2ea68d8cd885b14dc8afdcf6556"}, + {file = "lxml-5.4.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2b31a3a77501d86d8ade128abb01082724c0dfd9524f542f2f07d693c9f1175f"}, + {file = "lxml-5.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e108352e203c7afd0eb91d782582f00a0b16a948d204d4dec8565024fafeea5"}, + {file = "lxml-5.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:a11a96c3b3f7551c8a8109aa65e8594e551d5a84c76bf950da33d0fb6dfafab7"}, + {file = "lxml-5.4.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:ca755eebf0d9e62d6cb013f1261e510317a41bf4650f22963474a663fdfe02aa"}, + {file = "lxml-5.4.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:4cd915c0fb1bed47b5e6d6edd424ac25856252f09120e3e8ba5154b6b921860e"}, + {file = "lxml-5.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:226046e386556a45ebc787871d6d2467b32c37ce76c2680f5c608e25823ffc84"}, + {file = "lxml-5.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:b108134b9667bcd71236c5a02aad5ddd073e372fb5d48ea74853e009fe38acb6"}, + {file = "lxml-5.4.0-cp38-cp38-win32.whl", hash = "sha256:1320091caa89805df7dcb9e908add28166113dcd062590668514dbd510798c88"}, + {file = "lxml-5.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:073eb6dcdf1f587d9b88c8c93528b57eccda40209cf9be549d469b942b41d70b"}, + {file = "lxml-5.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:bda3ea44c39eb74e2488297bb39d47186ed01342f0022c8ff407c250ac3f498e"}, + {file = "lxml-5.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9ceaf423b50ecfc23ca00b7f50b64baba85fb3fb91c53e2c9d00bc86150c7e40"}, + {file = "lxml-5.4.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:664cdc733bc87449fe781dbb1f309090966c11cc0c0cd7b84af956a02a8a4729"}, + {file = "lxml-5.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:67ed8a40665b84d161bae3181aa2763beea3747f748bca5874b4af4d75998f87"}, + {file = "lxml-5.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9b4a3bd174cc9cdaa1afbc4620c049038b441d6ba07629d89a83b408e54c35cd"}, + {file = "lxml-5.4.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:b0989737a3ba6cf2a16efb857fb0dfa20bc5c542737fddb6d893fde48be45433"}, + {file = "lxml-5.4.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = 
"sha256:dc0af80267edc68adf85f2a5d9be1cdf062f973db6790c1d065e45025fa26140"}, + {file = "lxml-5.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:639978bccb04c42677db43c79bdaa23785dc7f9b83bfd87570da8207872f1ce5"}, + {file = "lxml-5.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a99d86351f9c15e4a901fc56404b485b1462039db59288b203f8c629260a142"}, + {file = "lxml-5.4.0-cp39-cp39-win32.whl", hash = "sha256:3e6d5557989cdc3ebb5302bbdc42b439733a841891762ded9514e74f60319ad6"}, + {file = "lxml-5.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:a8c9b7f16b63e65bbba889acb436a1034a82d34fa09752d754f88d708eca80e1"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1b717b00a71b901b4667226bba282dd462c42ccf618ade12f9ba3674e1fabc55"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27a9ded0f0b52098ff89dd4c418325b987feed2ea5cc86e8860b0f844285d740"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b7ce10634113651d6f383aa712a194179dcd496bd8c41e191cec2099fa09de5"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:53370c26500d22b45182f98847243efb518d268374a9570409d2e2276232fd37"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c6364038c519dffdbe07e3cf42e6a7f8b90c275d4d1617a69bb59734c1a2d571"}, + {file = "lxml-5.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:b12cb6527599808ada9eb2cd6e0e7d3d8f13fe7bbb01c6311255a15ded4c7ab4"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:5f11a1526ebd0dee85e7b1e39e39a0cc0d9d03fb527f56d8457f6df48a10dc0c"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:48b4afaf38bf79109bb060d9016fad014a9a48fb244e11b94f74ae366a64d252"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:de6f6bb8a7840c7bf216fb83eec4e2f79f7325eca8858167b68708b929ab2172"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:5cca36a194a4eb4e2ed6be36923d3cffd03dcdf477515dea687185506583d4c9"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:b7c86884ad23d61b025989d99bfdd92a7351de956e01c61307cb87035960bcb1"}, + {file = "lxml-5.4.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:53d9469ab5460402c19553b56c3648746774ecd0681b1b27ea74d5d8a3ef5590"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:56dbdbab0551532bb26c19c914848d7251d73edb507c3079d6805fa8bba5b706"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14479c2ad1cb08b62bb941ba8e0e05938524ee3c3114644df905d2331c76cd57"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:32697d2ea994e0db19c1df9e40275ffe84973e4232b5c274f47e7c1ec9763cdd"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:24f6df5f24fc3385f622c0c9d63fe34604893bc1a5bdbb2dbf5870f85f9a404a"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:151d6c40bc9db11e960619d2bf2ec5829f0aaffb10b41dcf6ad2ce0f3c0b2325"}, + {file = "lxml-5.4.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:4025bf2884ac4370a3243c5aa8d66d3cb9e15d3ddd0af2d796eccc5f0244390e"}, + {file = "lxml-5.4.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:9459e6892f59ecea2e2584ee1058f5d8f629446eab52ba2305ae13a32a059530"}, + {file = "lxml-5.4.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47fb24cc0f052f0576ea382872b3fc7e1f7e3028e53299ea751839418ade92a6"}, + {file = "lxml-5.4.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:50441c9de951a153c698b9b99992e806b71c1f36d14b154592580ff4a9d0d877"}, + {file = 
"lxml-5.4.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:ab339536aa798b1e17750733663d272038bf28069761d5be57cb4a9b0137b4f8"}, + {file = "lxml-5.4.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:9776af1aad5a4b4a1317242ee2bea51da54b2a7b7b48674be736d463c999f37d"}, + {file = "lxml-5.4.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:63e7968ff83da2eb6fdda967483a7a023aa497d85ad8f05c3ad9b1f2e8c84987"}, + {file = "lxml-5.4.0.tar.gz", hash = "sha256:d12832e1dbea4be280b22fd0ea7c9b87f0d8fc51ba06e92dc62d52f804f78ebd"}, ] [package.extras] @@ -598,211 +670,351 @@ files = [ test = ["coverage[toml] (>=7.2.5)", "mypy (>=1.2.0)", "pytest (>=7.3.0)", "pytest-mypy-plugins (>=1.10.1)"] [[package]] -name = "markdown-it-py" -version = "3.0.0" -description = "Python port of markdown-it. Markdown parsing, done right!" +name = "markdown" +version = "3.7" +description = "Python implementation of John Gruber's Markdown." optional = false python-versions = ">=3.8" files = [ - {file = "markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb"}, - {file = "markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1"}, + {file = "Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803"}, + {file = "markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2"}, ] [package.dependencies] -mdurl = ">=0.1,<1.0" +importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} [package.extras] -benchmarking = ["psutil", "pytest", "pytest-benchmark"] -code-style = ["pre-commit (>=3.0,<4.0)"] -compare = ["commonmark (>=0.9,<1.0)", "markdown (>=3.4,<4.0)", "mistletoe (>=1.0,<2.0)", "mistune (>=2.0,<3.0)", "panflute (>=2.3,<3.0)"] -linkify = ["linkify-it-py (>=1,<3)"] -plugins = ["mdit-py-plugins"] -profiling = ["gprof2dot"] -rtd = ["jupyter_sphinx", 
"mdit-py-plugins", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx_book_theme"] -testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"] +docs = ["mdx-gh-links (>=0.2)", "mkdocs (>=1.5)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] +testing = ["coverage", "pyyaml"] [[package]] name = "markupsafe" -version = "2.1.3" +version = "2.1.5" description = "Safely add untrusted strings to HTML/XML markup." optional = false python-versions = ">=3.7" files = [ - {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = 
"sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, - {file = 
"MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, - {file = 
"MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = 
"sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = 
"sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"}, - {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"}, -] - -[[package]] -name = "mdit-py-plugins" -version = "0.4.0" -description = "Collection of plugins for markdown-it-py" + {file = "MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a17a92de5231666cfbe003f0e4b9b3a7ae3afb1ec2845aadc2bacc93ff85febc"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:72b6be590cc35924b02c78ef34b467da4ba07e4e0f0454a2c5907f473fc50ce5"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e61659ba32cf2cf1481e575d0462554625196a1f2fc06a1c777d3f48e8865d46"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2174c595a0d73a3080ca3257b40096db99799265e1c27cc5a610743acd86d62f"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae2ad8ae6ebee9d2d94b17fb62763125f3f374c25618198f40cbb8b525411900"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:075202fa5b72c86ad32dc7d0b56024ebdbcf2048c0ba09f1cde31bfdd57bcfff"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:598e3276b64aff0e7b3451b72e94fa3c238d452e7ddcd893c3ab324717456bad"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fce659a462a1be54d2ffcacea5e3ba2d74daa74f30f5f143fe0c58636e355fdd"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-win32.whl", hash = "sha256:d9fad5155d72433c921b782e58892377c44bd6252b5af2f67f16b194987338a4"}, + {file = "MarkupSafe-2.1.5-cp310-cp310-win_amd64.whl", hash = "sha256:bf50cd79a75d181c9181df03572cdce0fbb75cc353bc350712073108cba98de5"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:629ddd2ca402ae6dbedfceeba9c46d5f7b2a61d9749597d4307f943ef198fc1f"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5b7b716f97b52c5a14bffdf688f971b2d5ef4029127f1ad7a513973cfd818df2"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ec585f69cec0aa07d945b20805be741395e28ac1627333b1c5b0105962ffced"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b91c037585eba9095565a3556f611e3cbfaa42ca1e865f7b8015fe5c7336d5a5"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7502934a33b54030eaf1194c21c692a534196063db72176b0c4028e140f8f32c"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0e397ac966fdf721b2c528cf028494e86172b4feba51d65f81ffd65c63798f3f"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:c061bb86a71b42465156a3ee7bd58c8c2ceacdbeb95d05a99893e08b8467359a"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3a57fdd7ce31c7ff06cdfbf31dafa96cc533c21e443d57f5b1ecc6cdc668ec7f"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-win32.whl", hash = "sha256:397081c1a0bfb5124355710fe79478cdbeb39626492b15d399526ae53422b906"}, + {file = "MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl", hash = "sha256:2b7c57a4dfc4f16f7142221afe5ba4e093e09e728ca65c51f5620c9aaeb9a617"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8dec4936e9c3100156f8a2dc89c4b88d5c435175ff03413b443469c7c8c5f4d1"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:3c6b973f22eb18a789b1460b4b91bf04ae3f0c4234a0a6aa6b0a92f6f7b951d4"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac07bad82163452a6884fe8fa0963fb98c2346ba78d779ec06bd7a6262132aee"}, + {file = 
"MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f5dfb42c4604dddc8e4305050aa6deb084540643ed5804d7455b5df8fe16f5e5"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ea3d8a3d18833cf4304cd2fc9cbb1efe188ca9b5efef2bdac7adc20594a0e46b"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d050b3361367a06d752db6ead6e7edeb0009be66bc3bae0ee9d97fb326badc2a"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bec0a414d016ac1a18862a519e54b2fd0fc8bbfd6890376898a6c0891dd82e9f"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:58c98fee265677f63a4385256a6d7683ab1832f3ddd1e66fe948d5880c21a169"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-win32.whl", hash = "sha256:8590b4ae07a35970728874632fed7bd57b26b0102df2d2b233b6d9d82f6c62ad"}, + {file = "MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl", hash = "sha256:823b65d8706e32ad2df51ed89496147a42a2a6e01c13cfb6ffb8b1e92bc910bb"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c8b29db45f8fe46ad280a7294f5c3ec36dbac9491f2d1c17345be8e69cc5928f"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ec6a563cff360b50eed26f13adc43e61bc0c04d94b8be985e6fb24b81f6dcfdf"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a549b9c31bec33820e885335b451286e2969a2d9e24879f83fe904a5ce59d70a"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f11aa001c540f62c6166c7726f71f7573b52c68c31f014c25cc7901deea0b52"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:7b2e5a267c855eea6b4283940daa6e88a285f5f2a67f2220203786dfa59b37e9"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-musllinux_1_1_i686.whl", hash 
= "sha256:2d2d793e36e230fd32babe143b04cec8a8b3eb8a3122d2aceb4a371e6b09b8df"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ce409136744f6521e39fd8e2a24c53fa18ad67aa5bc7c2cf83645cce5b5c4e50"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-win32.whl", hash = "sha256:4096e9de5c6fdf43fb4f04c26fb114f61ef0bf2e5604b6ee3019d51b69e8c371"}, + {file = "MarkupSafe-2.1.5-cp37-cp37m-win_amd64.whl", hash = "sha256:4275d846e41ecefa46e2015117a9f491e57a71ddd59bbead77e904dc02b1bed2"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:656f7526c69fac7f600bd1f400991cc282b417d17539a1b228617081106feb4a"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:97cafb1f3cbcd3fd2b6fbfb99ae11cdb14deea0736fc2b0952ee177f2b813a46"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f3fbcb7ef1f16e48246f704ab79d79da8a46891e2da03f8783a5b6fa41a9532"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa9db3f79de01457b03d4f01b34cf91bc0048eb2c3846ff26f66687c2f6d16ab"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffee1f21e5ef0d712f9033568f8344d5da8cc2869dbd08d87c84656e6a2d2f68"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5dedb4db619ba5a2787a94d877bc8ffc0566f92a01c0ef214865e54ecc9ee5e0"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:30b600cf0a7ac9234b2638fbc0fb6158ba5bdcdf46aeb631ead21248b9affbc4"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8dd717634f5a044f860435c1d8c16a270ddf0ef8588d4887037c5028b859b0c3"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-win32.whl", hash = "sha256:daa4ee5a243f0f20d528d939d06670a298dd39b1ad5f8a72a4275124a7819eff"}, + {file = "MarkupSafe-2.1.5-cp38-cp38-win_amd64.whl", hash = 
"sha256:619bc166c4f2de5caa5a633b8b7326fbe98e0ccbfacabd87268a2b15ff73a029"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7a68b554d356a91cce1236aa7682dc01df0edba8d043fd1ce607c49dd3c1edcf"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:db0b55e0f3cc0be60c1f19efdde9a637c32740486004f20d1cff53c3c0ece4d2"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e53af139f8579a6d5f7b76549125f0d94d7e630761a2111bc431fd820e163b8"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17b950fccb810b3293638215058e432159d2b71005c74371d784862b7e4683f3"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c31f53cdae6ecfa91a77820e8b151dba54ab528ba65dfd235c80b086d68a465"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bff1b4290a66b490a2f4719358c0cdcd9bafb6b8f061e45c7a2460866bf50c2e"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc1667f8b83f48511b94671e0e441401371dfd0f0a795c7daa4a3cd1dde55bea"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5049256f536511ee3f7e1b3f87d1d1209d327e818e6ae1365e8653d7e3abb6a6"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-win32.whl", hash = "sha256:00e046b6dd71aa03a41079792f8473dc494d564611a8f89bbbd7cb93295ebdcf"}, + {file = "MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl", hash = "sha256:fa173ec60341d6bb97a89f5ea19c85c5643c1e7dedebc22f5181eb73573142c5"}, + {file = "MarkupSafe-2.1.5.tar.gz", hash = "sha256:d283d37a890ba4c1ae73ffadf8046435c76e7bc2247bbb63c00bd1a709c6544b"}, +] + +[[package]] +name = "mergedeep" +version = "1.3.4" +description = "A deep merge function for 🐍." 
+optional = false +python-versions = ">=3.6" +files = [ + {file = "mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307"}, + {file = "mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8"}, +] + +[[package]] +name = "mkdocs" +version = "1.6.1" +description = "Project documentation with Markdown." optional = false python-versions = ">=3.8" files = [ - {file = "mdit_py_plugins-0.4.0-py3-none-any.whl", hash = "sha256:b51b3bb70691f57f974e257e367107857a93b36f322a9e6d44ca5bf28ec2def9"}, - {file = "mdit_py_plugins-0.4.0.tar.gz", hash = "sha256:d8ab27e9aed6c38aa716819fedfde15ca275715955f8a185a8e1cf90fb1d2c1b"}, + {file = "mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e"}, + {file = "mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2"}, ] [package.dependencies] -markdown-it-py = ">=1.0.0,<4.0.0" +click = ">=7.0" +colorama = {version = ">=0.4", markers = "platform_system == \"Windows\""} +ghp-import = ">=1.0" +importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} +jinja2 = ">=2.11.1" +markdown = ">=3.3.6" +markupsafe = ">=2.0.1" +mergedeep = ">=1.3.4" +mkdocs-get-deps = ">=0.2.0" +packaging = ">=20.5" +pathspec = ">=0.11.1" +pyyaml = ">=5.1" +pyyaml-env-tag = ">=0.1" +watchdog = ">=2.0" [package.extras] -code-style = ["pre-commit"] -rtd = ["myst-parser", "sphinx-book-theme"] -testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"] +i18n = ["babel (>=2.9.0)"] +min-versions = ["babel (==2.9.0)", "click (==7.0)", "colorama (==0.4)", "ghp-import (==1.0)", "importlib-metadata (==4.4)", "jinja2 (==2.11.1)", "markdown (==3.3.6)", "markupsafe (==2.0.1)", "mergedeep (==1.3.4)", "mkdocs-get-deps (==0.2.0)", "packaging (==20.5)", "pathspec (==0.11.1)", "pyyaml (==5.1)", "pyyaml-env-tag (==0.1)", "watchdog (==2.0)"] + 
+[[package]] +name = "mkdocs-autorefs" +version = "1.4.3" +description = "Automatically link across pages in MkDocs." +optional = false +python-versions = ">=3.9" +files = [ + {file = "mkdocs_autorefs-1.4.3-py3-none-any.whl", hash = "sha256:469d85eb3114801d08e9cc55d102b3ba65917a869b893403b8987b601cf55dc9"}, + {file = "mkdocs_autorefs-1.4.3.tar.gz", hash = "sha256:beee715b254455c4aa93b6ef3c67579c399ca092259cc41b7d9342573ff1fc75"}, +] + +[package.dependencies] +Markdown = ">=3.3" +markupsafe = ">=2.0.1" +mkdocs = ">=1.1" [[package]] -name = "mdurl" -version = "0.1.2" -description = "Markdown URL utilities" +name = "mkdocs-gen-files" +version = "0.5.0" +description = "MkDocs plugin to programmatically generate documentation pages during the build" optional = false python-versions = ">=3.7" files = [ - {file = "mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8"}, - {file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"}, + {file = "mkdocs_gen_files-0.5.0-py3-none-any.whl", hash = "sha256:7ac060096f3f40bd19039e7277dd3050be9a453c8ac578645844d4d91d7978ea"}, + {file = "mkdocs_gen_files-0.5.0.tar.gz", hash = "sha256:4c7cf256b5d67062a788f6b1d035e157fc1a9498c2399be9af5257d4ff4d19bc"}, ] +[package.dependencies] +mkdocs = ">=1.0.3" + [[package]] -name = "mypy" -version = "1.11.2" -description = "Optional static typing for Python" +name = "mkdocs-get-deps" +version = "0.2.0" +description = "MkDocs extension that lists all dependencies according to a mkdocs.yml file" optional = false python-versions = ">=3.8" files = [ - {file = "mypy-1.11.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d42a6dd818ffce7be66cce644f1dff482f1d97c53ca70908dff0b9ddc120b77a"}, - {file = "mypy-1.11.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:801780c56d1cdb896eacd5619a83e427ce436d86a3bdf9112527f24a66618fef"}, - {file = 
"mypy-1.11.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41ea707d036a5307ac674ea172875f40c9d55c5394f888b168033177fce47383"}, - {file = "mypy-1.11.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6e658bd2d20565ea86da7d91331b0eed6d2eee22dc031579e6297f3e12c758c8"}, - {file = "mypy-1.11.2-cp310-cp310-win_amd64.whl", hash = "sha256:478db5f5036817fe45adb7332d927daa62417159d49783041338921dcf646fc7"}, - {file = "mypy-1.11.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:75746e06d5fa1e91bfd5432448d00d34593b52e7e91a187d981d08d1f33d4385"}, - {file = "mypy-1.11.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a976775ab2256aadc6add633d44f100a2517d2388906ec4f13231fafbb0eccca"}, - {file = "mypy-1.11.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cd953f221ac1379050a8a646585a29574488974f79d8082cedef62744f0a0104"}, - {file = "mypy-1.11.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:57555a7715c0a34421013144a33d280e73c08df70f3a18a552938587ce9274f4"}, - {file = "mypy-1.11.2-cp311-cp311-win_amd64.whl", hash = "sha256:36383a4fcbad95f2657642a07ba22ff797de26277158f1cc7bd234821468b1b6"}, - {file = "mypy-1.11.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:e8960dbbbf36906c5c0b7f4fbf2f0c7ffb20f4898e6a879fcf56a41a08b0d318"}, - {file = "mypy-1.11.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:06d26c277962f3fb50e13044674aa10553981ae514288cb7d0a738f495550b36"}, - {file = "mypy-1.11.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6e7184632d89d677973a14d00ae4d03214c8bc301ceefcdaf5c474866814c987"}, - {file = "mypy-1.11.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:3a66169b92452f72117e2da3a576087025449018afc2d8e9bfe5ffab865709ca"}, - {file = "mypy-1.11.2-cp312-cp312-win_amd64.whl", hash = "sha256:969ea3ef09617aff826885a22ece0ddef69d95852cdad2f60c8bb06bf1f71f70"}, - {file = 
"mypy-1.11.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:37c7fa6121c1cdfcaac97ce3d3b5588e847aa79b580c1e922bb5d5d2902df19b"}, - {file = "mypy-1.11.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4a8a53bc3ffbd161b5b2a4fff2f0f1e23a33b0168f1c0778ec70e1a3d66deb86"}, - {file = "mypy-1.11.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2ff93107f01968ed834f4256bc1fc4475e2fecf6c661260066a985b52741ddce"}, - {file = "mypy-1.11.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:edb91dded4df17eae4537668b23f0ff6baf3707683734b6a818d5b9d0c0c31a1"}, - {file = "mypy-1.11.2-cp38-cp38-win_amd64.whl", hash = "sha256:ee23de8530d99b6db0573c4ef4bd8f39a2a6f9b60655bf7a1357e585a3486f2b"}, - {file = "mypy-1.11.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:801ca29f43d5acce85f8e999b1e431fb479cb02d0e11deb7d2abb56bdaf24fd6"}, - {file = "mypy-1.11.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:af8d155170fcf87a2afb55b35dc1a0ac21df4431e7d96717621962e4b9192e70"}, - {file = "mypy-1.11.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f7821776e5c4286b6a13138cc935e2e9b6fde05e081bdebf5cdb2bb97c9df81d"}, - {file = "mypy-1.11.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:539c570477a96a4e6fb718b8d5c3e0c0eba1f485df13f86d2970c91f0673148d"}, - {file = "mypy-1.11.2-cp39-cp39-win_amd64.whl", hash = "sha256:3f14cd3d386ac4d05c5a39a51b84387403dadbd936e17cb35882134d4f8f0d24"}, - {file = "mypy-1.11.2-py3-none-any.whl", hash = "sha256:b499bc07dbdcd3de92b0a8b29fdf592c111276f6a12fe29c30f6c417dd546d12"}, - {file = "mypy-1.11.2.tar.gz", hash = "sha256:7f9993ad3e0ffdc95c2a14b66dee63729f021968bff8ad911867579c65d13a79"}, + {file = "mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134"}, + {file = "mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c"}, ] 
[package.dependencies] -mypy-extensions = ">=1.0.0" -tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} -typing-extensions = ">=4.6.0" +importlib-metadata = {version = ">=4.3", markers = "python_version < \"3.10\""} +mergedeep = ">=1.3.4" +platformdirs = ">=2.2.0" +pyyaml = ">=5.1" + +[[package]] +name = "mkdocs-include-markdown-plugin" +version = "7.2.0" +description = "Mkdocs Markdown includer plugin." +optional = false +python-versions = ">=3.9" +files = [ + {file = "mkdocs_include_markdown_plugin-7.2.0-py3-none-any.whl", hash = "sha256:d56cdaeb2d113fb66ed0fe4fb7af1da889926b0b9872032be24e19bbb09c9f5b"}, + {file = "mkdocs_include_markdown_plugin-7.2.0.tar.gz", hash = "sha256:4a67a91ade680dc0e15f608e5b6343bec03372ffa112c40a4254c1bfb10f42f3"}, +] + +[package.dependencies] +mkdocs = ">=1.4" +wcmatch = "*" [package.extras] -dmypy = ["psutil (>=4.0)"] -install-types = ["pip"] -mypyc = ["setuptools (>=50)"] -reports = ["lxml"] +cache = ["platformdirs"] [[package]] -name = "mypy-extensions" -version = "1.0.0" -description = "Type system extensions for programs checked with the mypy type checker." 
+name = "mkdocs-material" +version = "9.6.22" +description = "Documentation that simply works" optional = false -python-versions = ">=3.5" +python-versions = ">=3.8" files = [ - {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"}, - {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"}, + {file = "mkdocs_material-9.6.22-py3-none-any.whl", hash = "sha256:14ac5f72d38898b2f98ac75a5531aaca9366eaa427b0f49fc2ecf04d99b7ad84"}, + {file = "mkdocs_material-9.6.22.tar.gz", hash = "sha256:87c158b0642e1ada6da0cbd798a3389b0bc5516b90e5ece4a0fb939f00bacd1c"}, ] +[package.dependencies] +babel = ">=2.10,<3.0" +backrefs = ">=5.7.post1,<6.0" +colorama = ">=0.4,<1.0" +jinja2 = ">=3.1,<4.0" +markdown = ">=3.2,<4.0" +mkdocs = ">=1.6,<2.0" +mkdocs-material-extensions = ">=1.3,<2.0" +paginate = ">=0.5,<1.0" +pygments = ">=2.16,<3.0" +pymdown-extensions = ">=10.2,<11.0" +requests = ">=2.26,<3.0" + +[package.extras] +git = ["mkdocs-git-committers-plugin-2 (>=1.1,<3)", "mkdocs-git-revision-date-localized-plugin (>=1.2.4,<2.0)"] +imaging = ["cairosvg (>=2.6,<3.0)", "pillow (>=10.2,<12.0)"] +recommended = ["mkdocs-minify-plugin (>=0.7,<1.0)", "mkdocs-redirects (>=1.2,<2.0)", "mkdocs-rss-plugin (>=1.6,<2.0)"] + [[package]] -name = "myst-parser" -version = "3.0.1" -description = "An extended [CommonMark](https://spec.commonmark.org/) compliant parser," +name = "mkdocs-material-extensions" +version = "1.3.1" +description = "Extension pack for Python Markdown and MkDocs Material." 
optional = false python-versions = ">=3.8" files = [ - {file = "myst_parser-3.0.1-py3-none-any.whl", hash = "sha256:6457aaa33a5d474aca678b8ead9b3dc298e89c68e67012e73146ea6fd54babf1"}, - {file = "myst_parser-3.0.1.tar.gz", hash = "sha256:88f0cb406cb363b077d176b51c476f62d60604d68a8dcdf4832e080441301a87"}, + {file = "mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31"}, + {file = "mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443"}, +] + +[[package]] +name = "mkdocstrings" +version = "0.26.2" +description = "Automatic documentation from sources, for MkDocs." +optional = false +python-versions = ">=3.9" +files = [ + {file = "mkdocstrings-0.26.2-py3-none-any.whl", hash = "sha256:1248f3228464f3b8d1a15bd91249ce1701fe3104ac517a5f167a0e01ca850ba5"}, + {file = "mkdocstrings-0.26.2.tar.gz", hash = "sha256:34a8b50f1e6cfd29546c6c09fbe02154adfb0b361bb758834bf56aa284ba876e"}, ] [package.dependencies] -docutils = ">=0.18,<0.22" -jinja2 = "*" -markdown-it-py = ">=3.0,<4.0" -mdit-py-plugins = ">=0.4,<1.0" -pyyaml = "*" -sphinx = ">=6,<8" +click = ">=7.0" +Jinja2 = ">=2.11.1" +Markdown = ">=3.6" +MarkupSafe = ">=1.1" +mkdocs = ">=1.4" +mkdocs-autorefs = ">=1.2" +mkdocstrings-python = {version = ">=0.5.2", optional = true, markers = "extra == \"python\""} +platformdirs = ">=2.2" +pymdown-extensions = ">=6.3" + +[package.extras] +crystal = ["mkdocstrings-crystal (>=0.3.4)"] +python = ["mkdocstrings-python (>=0.5.2)"] +python-legacy = ["mkdocstrings-python-legacy (>=0.2.1)"] + +[[package]] +name = "mkdocstrings-python" +version = "1.13.0" +description = "A Python handler for mkdocstrings." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "mkdocstrings_python-1.13.0-py3-none-any.whl", hash = "sha256:b88bbb207bab4086434743849f8e796788b373bd32e7bfefbf8560ac45d88f97"}, + {file = "mkdocstrings_python-1.13.0.tar.gz", hash = "sha256:2dbd5757e8375b9720e81db16f52f1856bf59905428fd7ef88005d1370e2f64c"}, +] + +[package.dependencies] +griffe = ">=0.49" +mkdocs-autorefs = ">=1.2" +mkdocstrings = ">=0.26" + +[[package]] +name = "mypy" +version = "1.14.1" +description = "Optional static typing for Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "mypy-1.14.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:52686e37cf13d559f668aa398dd7ddf1f92c5d613e4f8cb262be2fb4fedb0fcb"}, + {file = "mypy-1.14.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1fb545ca340537d4b45d3eecdb3def05e913299ca72c290326be19b3804b39c0"}, + {file = "mypy-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:90716d8b2d1f4cd503309788e51366f07c56635a3309b0f6a32547eaaa36a64d"}, + {file = "mypy-1.14.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2ae753f5c9fef278bcf12e1a564351764f2a6da579d4a81347e1d5a15819997b"}, + {file = "mypy-1.14.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:e0fe0f5feaafcb04505bcf439e991c6d8f1bf8b15f12b05feeed96e9e7bf1427"}, + {file = "mypy-1.14.1-cp310-cp310-win_amd64.whl", hash = "sha256:7d54bd85b925e501c555a3227f3ec0cfc54ee8b6930bd6141ec872d1c572f81f"}, + {file = "mypy-1.14.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f995e511de847791c3b11ed90084a7a0aafdc074ab88c5a9711622fe4751138c"}, + {file = "mypy-1.14.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d64169ec3b8461311f8ce2fd2eb5d33e2d0f2c7b49116259c51d0d96edee48d1"}, + {file = "mypy-1.14.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:ba24549de7b89b6381b91fbc068d798192b1b5201987070319889e93038967a8"}, + {file = "mypy-1.14.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:183cf0a45457d28ff9d758730cd0210419ac27d4d3f285beda038c9083363b1f"}, + {file = "mypy-1.14.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f2a0ecc86378f45347f586e4163d1769dd81c5a223d577fe351f26b179e148b1"}, + {file = "mypy-1.14.1-cp311-cp311-win_amd64.whl", hash = "sha256:ad3301ebebec9e8ee7135d8e3109ca76c23752bac1e717bc84cd3836b4bf3eae"}, + {file = "mypy-1.14.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:30ff5ef8519bbc2e18b3b54521ec319513a26f1bba19a7582e7b1f58a6e69f14"}, + {file = "mypy-1.14.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:cb9f255c18052343c70234907e2e532bc7e55a62565d64536dbc7706a20b78b9"}, + {file = "mypy-1.14.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b4e3413e0bddea671012b063e27591b953d653209e7a4fa5e48759cda77ca11"}, + {file = "mypy-1.14.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:553c293b1fbdebb6c3c4030589dab9fafb6dfa768995a453d8a5d3b23784af2e"}, + {file = "mypy-1.14.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fad79bfe3b65fe6a1efaed97b445c3d37f7be9fdc348bdb2d7cac75579607c89"}, + {file = "mypy-1.14.1-cp312-cp312-win_amd64.whl", hash = "sha256:8fa2220e54d2946e94ab6dbb3ba0a992795bd68b16dc852db33028df2b00191b"}, + {file = "mypy-1.14.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:92c3ed5afb06c3a8e188cb5da4984cab9ec9a77ba956ee419c68a388b4595255"}, + {file = "mypy-1.14.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:dbec574648b3e25f43d23577309b16534431db4ddc09fda50841f1e34e64ed34"}, + {file = "mypy-1.14.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c6d94b16d62eb3e947281aa7347d78236688e21081f11de976376cf010eb31a"}, + {file = 
"mypy-1.14.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d4b19b03fdf54f3c5b2fa474c56b4c13c9dbfb9a2db4370ede7ec11a2c5927d9"}, + {file = "mypy-1.14.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:0c911fde686394753fff899c409fd4e16e9b294c24bfd5e1ea4675deae1ac6fd"}, + {file = "mypy-1.14.1-cp313-cp313-win_amd64.whl", hash = "sha256:8b21525cb51671219f5307be85f7e646a153e5acc656e5cebf64bfa076c50107"}, + {file = "mypy-1.14.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7084fb8f1128c76cd9cf68fe5971b37072598e7c31b2f9f95586b65c741a9d31"}, + {file = "mypy-1.14.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8f845a00b4f420f693f870eaee5f3e2692fa84cc8514496114649cfa8fd5e2c6"}, + {file = "mypy-1.14.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:44bf464499f0e3a2d14d58b54674dee25c031703b2ffc35064bd0df2e0fac319"}, + {file = "mypy-1.14.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c99f27732c0b7dc847adb21c9d47ce57eb48fa33a17bc6d7d5c5e9f9e7ae5bac"}, + {file = "mypy-1.14.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:bce23c7377b43602baa0bd22ea3265c49b9ff0b76eb315d6c34721af4cdf1d9b"}, + {file = "mypy-1.14.1-cp38-cp38-win_amd64.whl", hash = "sha256:8edc07eeade7ebc771ff9cf6b211b9a7d93687ff892150cb5692e4f4272b0837"}, + {file = "mypy-1.14.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3888a1816d69f7ab92092f785a462944b3ca16d7c470d564165fe703b0970c35"}, + {file = "mypy-1.14.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:46c756a444117c43ee984bd055db99e498bc613a70bbbc120272bd13ca579fbc"}, + {file = "mypy-1.14.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:27fc248022907e72abfd8e22ab1f10e903915ff69961174784a3900a8cba9ad9"}, + {file = "mypy-1.14.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:499d6a72fb7e5de92218db961f1a66d5f11783f9ae549d214617edab5d4dbdbb"}, + {file = "mypy-1.14.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:57961db9795eb566dc1d1b4e9139ebc4c6b0cb6e7254ecde69d1552bf7613f60"}, + {file = "mypy-1.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:07ba89fdcc9451f2ebb02853deb6aaaa3d2239a236669a63ab3801bbf923ef5c"}, + {file = "mypy-1.14.1-py3-none-any.whl", hash = "sha256:b66a60cc4073aeb8ae00057f9c1f64d49e90f918fbcef9a977eb121da8b8f1d1"}, + {file = "mypy-1.14.1.tar.gz", hash = "sha256:7ec88144fe9b510e8475ec2f5f251992690fcf89ccb4500b214b4226abcd32d6"}, +] + +[package.dependencies] +mypy_extensions = ">=1.0.0" +tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} +typing_extensions = ">=4.6.0" [package.extras] -code-style = ["pre-commit (>=3.0,<4.0)"] -linkify = ["linkify-it-py (>=2.0,<3.0)"] -rtd = ["ipython", "sphinx (>=7)", "sphinx-autodoc2 (>=0.5.0,<0.6.0)", "sphinx-book-theme (>=1.1,<2.0)", "sphinx-copybutton", "sphinx-design", "sphinx-pyscript", "sphinx-tippy (>=0.4.3)", "sphinx-togglebutton", "sphinxext-opengraph (>=0.9.0,<0.10.0)", "sphinxext-rediraffe (>=0.2.7,<0.3.0)"] -testing = ["beautifulsoup4", "coverage[toml]", "defusedxml", "pytest (>=8,<9)", "pytest-cov", "pytest-param-files (>=0.6.0,<0.7.0)", "pytest-regressions", "sphinx-pytest"] -testing-docutils = ["pygments", "pytest (>=8,<9)", "pytest-param-files (>=0.6.0,<0.7.0)"] +dmypy = ["psutil (>=4.0)"] +faster-cache = ["orjson"] +install-types = ["pip"] +mypyc = ["setuptools (>=50)"] +reports = ["lxml"] + +[[package]] +name = "mypy-extensions" +version = "1.1.0" +description = "Type system extensions for programs checked with the mypy type checker." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505"}, + {file = "mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558"}, +] [[package]] name = "networkx" @@ -912,57 +1124,61 @@ files = [ [[package]] name = "packaging" -version = "23.1" +version = "25.0" description = "Core utilities for Python packages" optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"}, - {file = "packaging-23.1.tar.gz", hash = "sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"}, + {file = "packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484"}, + {file = "packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f"}, ] [[package]] -name = "pathspec" -version = "0.11.2" -description = "Utility library for gitignore style pattern matching of file paths." 
+name = "paginate" +version = "0.5.7" +description = "Divides large result sets into pages for easier browsing" optional = false -python-versions = ">=3.7" +python-versions = "*" files = [ - {file = "pathspec-0.11.2-py3-none-any.whl", hash = "sha256:1d6ed233af05e679efb96b1851550ea95bbb64b7c490b0f5aa52996c11e92a20"}, - {file = "pathspec-0.11.2.tar.gz", hash = "sha256:e0d8d0ac2f12da61956eb2306b69f9469b42f4deb0f3cb6ed47b9cce9996ced3"}, + {file = "paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591"}, + {file = "paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945"}, ] +[package.extras] +dev = ["pytest", "tox"] +lint = ["black"] + [[package]] -name = "pbr" -version = "5.11.1" -description = "Python Build Reasonableness" +name = "pathspec" +version = "0.12.1" +description = "Utility library for gitignore style pattern matching of file paths." optional = false -python-versions = ">=2.6" +python-versions = ">=3.8" files = [ - {file = "pbr-5.11.1-py2.py3-none-any.whl", hash = "sha256:567f09558bae2b3ab53cb3c1e2e33e726ff3338e7bae3db5dc954b3a44eef12b"}, - {file = "pbr-5.11.1.tar.gz", hash = "sha256:aefc51675b0b533d56bb5fd1c8c6c0522fe31896679882e1c4c63d5e4a0fccb3"}, + {file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"}, + {file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"}, ] [[package]] name = "pip" -version = "24.2" +version = "25.0.1" description = "The PyPA recommended tool for installing Python packages." 
optional = false python-versions = ">=3.8" files = [ - {file = "pip-24.2-py3-none-any.whl", hash = "sha256:2cd581cf58ab7fcfca4ce8efa6dcacd0de5bf8d0a3eb9ec927e07405f4d9e2a2"}, - {file = "pip-24.2.tar.gz", hash = "sha256:5b5e490b5e9cb275c879595064adce9ebd31b854e3e803740b72f9ccf34a45b8"}, + {file = "pip-25.0.1-py3-none-any.whl", hash = "sha256:c46efd13b6aa8279f33f2864459c8ce587ea6a1a59ee20de055868d8f7688f7f"}, + {file = "pip-25.0.1.tar.gz", hash = "sha256:88f96547ea48b940a3a385494e181e29fb8637898f88d88737c5049780f196ea"}, ] [[package]] name = "pip-tools" -version = "7.4.1" +version = "7.5.1" description = "pip-tools keeps your pinned dependencies fresh." optional = false python-versions = ">=3.8" files = [ - {file = "pip-tools-7.4.1.tar.gz", hash = "sha256:864826f5073864450e24dbeeb85ce3920cdfb09848a3d69ebf537b521f14bcc9"}, - {file = "pip_tools-7.4.1-py3-none-any.whl", hash = "sha256:4c690e5fbae2f21e87843e89c26191f0d9454f362d8acdbd695716493ec8b3a9"}, + {file = "pip_tools-7.5.1-py3-none-any.whl", hash = "sha256:f5ff803823529edc0e6e40c86b1aa7da7266fb1078093c8beea4e5b77877036a"}, + {file = "pip_tools-7.5.1.tar.gz", hash = "sha256:a051a94794ba52df9acad2d7c9b0b09ae001617db458a543f8287fea7b89c2cf"}, ] [package.dependencies] @@ -980,18 +1196,19 @@ testing = ["flit_core (>=2,<4)", "poetry_core (>=1.0.0)", "pytest (>=7.2.0)", "p [[package]] name = "platformdirs" -version = "3.10.0" -description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." +version = "4.3.6" +description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." 
optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "platformdirs-3.10.0-py3-none-any.whl", hash = "sha256:d7c24979f292f916dc9cbf8648319032f551ea8c49a4c9bf2fb556a02070ec1d"}, - {file = "platformdirs-3.10.0.tar.gz", hash = "sha256:b45696dab2d7cc691a3226759c0d3b00c47c8b6e293d96f6436f733303f77f6d"}, + {file = "platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb"}, + {file = "platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907"}, ] [package.extras] -docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.1)", "sphinx-autodoc-typehints (>=1.24)"] -test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.4)", "pytest-cov (>=4.1)", "pytest-mock (>=3.11.1)"] +docs = ["furo (>=2024.8.6)", "proselint (>=0.14)", "sphinx (>=8.0.2)", "sphinx-autodoc-typehints (>=2.4)"] +test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.3.2)", "pytest-cov (>=5)", "pytest-mock (>=3.14)"] +type = ["mypy (>=1.11.2)"] [[package]] name = "pluggy" @@ -1010,17 +1227,35 @@ testing = ["pytest", "pytest-benchmark"] [[package]] name = "pygments" -version = "2.16.1" +version = "2.19.2" description = "Pygments is a syntax highlighting package written in Python." 
optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"}, - {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"}, + {file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"}, + {file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"}, ] [package.extras] -plugins = ["importlib-metadata"] +windows-terminal = ["colorama (>=0.4.6)"] + +[[package]] +name = "pymdown-extensions" +version = "10.15" +description = "Extension pack for Python Markdown." +optional = false +python-versions = ">=3.8" +files = [ + {file = "pymdown_extensions-10.15-py3-none-any.whl", hash = "sha256:46e99bb272612b0de3b7e7caf6da8dd5f4ca5212c0b273feb9304e236c484e5f"}, + {file = "pymdown_extensions-10.15.tar.gz", hash = "sha256:0e5994e32155f4b03504f939e501b981d306daf7ec2aa1cd2eb6bd300784f8f7"}, +] + +[package.dependencies] +markdown = ">=3.6" +pyyaml = "*" + +[package.extras] +extra = ["pygments (>=2.19.1)"] [[package]] name = "pyparsing" @@ -1087,91 +1322,141 @@ pytest = ">=4.6" [package.extras] testing = ["fields", "hunter", "process-tests", "pytest-xdist", "virtualenv"] +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + [[package]] name = "pytz" -version = "2023.3" +version = "2025.2" 
description = "World timezone definitions, modern and historical" optional = false python-versions = "*" files = [ - {file = "pytz-2023.3-py2.py3-none-any.whl", hash = "sha256:a151b3abb88eda1d4e34a9814df37de2a80e301e68ba0fd856fb9b46bfbbbffb"}, - {file = "pytz-2023.3.tar.gz", hash = "sha256:1d8ce29db189191fb55338ee6d0387d82ab59f3d00eac103412d64e0ebd0c588"}, + {file = "pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00"}, + {file = "pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3"}, ] [[package]] name = "pyyaml" -version = "6.0.1" +version = "6.0.3" description = "YAML parser and emitter for Python" optional = false +python-versions = ">=3.8" +files = [ + {file = "PyYAML-6.0.3-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:c2514fceb77bc5e7a2f7adfaa1feb2fb311607c9cb518dbc378688ec73d8292f"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9c57bb8c96f6d1808c030b1687b9b5fb476abaa47f0db9c0101f5e9f394e97f4"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efd7b85f94a6f21e4932043973a7ba2613b059c4a000551892ac9f1d11f5baf3"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:22ba7cfcad58ef3ecddc7ed1db3409af68d023b7f940da23c6c2a1890976eda6"}, + {file = "PyYAML-6.0.3-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6344df0d5755a2c9a276d4473ae6b90647e216ab4757f8426893b5dd2ac3f369"}, + {file = "PyYAML-6.0.3-cp38-cp38-win32.whl", hash = "sha256:3ff07ec89bae51176c0549bc4c63aa6202991da2d9a6129d7aef7f1407d3f295"}, + {file = "PyYAML-6.0.3-cp38-cp38-win_amd64.whl", hash = "sha256:5cf4e27da7e3fbed4d6c3d8e797387aaad68102272f8f9752883bc32d61cb87b"}, + {file = "pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = 
"sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b"}, + {file = "pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b"}, + {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0"}, + {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69"}, + {file = "pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e"}, + {file = "pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c"}, + {file = "pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e"}, + {file = "pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d"}, + {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a"}, + {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4"}, + {file = "pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b"}, + {file = "pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf"}, + {file = "pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196"}, + {file = "pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc"}, + {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e"}, + {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea"}, + {file = 
"pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5"}, + {file = "pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b"}, + {file = "pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd"}, + {file = "pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8"}, + {file = "pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6"}, + {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6"}, + {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be"}, + {file = "pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26"}, + {file = "pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c"}, + {file = "pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb"}, + {file = 
"pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac"}, + {file = "pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5"}, + {file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764"}, + {file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35"}, + {file = "pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac"}, + {file = "pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3"}, + {file = "pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3"}, + {file = "pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", 
hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c"}, + {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065"}, + {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65"}, + {file = "pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9"}, + {file = "pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b"}, + {file = "pyyaml-6.0.3-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:b865addae83924361678b652338317d1bd7e79b1f4596f96b96c77a5a34b34da"}, + {file = "pyyaml-6.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c3355370a2c156cffb25e876646f149d5d68f5e0a3ce86a5084dd0b64a994917"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3c5677e12444c15717b902a5798264fa7909e41153cdf9ef7ad571b704a63dd9"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5ed875a24292240029e4483f9d4a4b8a1ae08843b9c54f43fcc11e404532a8a5"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0150219816b6a1fa26fb4699fb7daa9caf09eb1999f3b70fb6e786805e80375a"}, + {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:fa160448684b4e94d80416c0fa4aac48967a969efe22931448d853ada8baf926"}, + {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:27c0abcb4a5dac13684a37f76e701e054692a9b2d3064b70f5e4eb54810553d7"}, + {file = 
"pyyaml-6.0.3-cp39-cp39-win32.whl", hash = "sha256:1ebe39cb5fc479422b83de611d14e2c0d3bb2a18bbcb01f229ab3cfbd8fee7a0"}, + {file = "pyyaml-6.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:2e71d11abed7344e42a8849600193d15b6def118602c4c176f748e4583246007"}, + {file = "pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f"}, +] + +[[package]] +name = "pyyaml-env-tag" +version = "0.1" +description = "A custom YAML tag for referencing environment variables in YAML files. " +optional = false python-versions = ">=3.6" files = [ - {file = "PyYAML-6.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d858aa552c999bc8a8d57426ed01e40bef403cd8ccdd0fc5f6f04a00414cac2a"}, - {file = "PyYAML-6.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fd66fc5d0da6d9815ba2cebeb4205f95818ff4b79c3ebe268e75d961704af52f"}, - {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, - {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, - {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, - {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, - {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, - {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, - {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, - {file = "PyYAML-6.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:f003ed9ad21d6a4713f0a9b5a7a0a79e08dd0f221aff4525a2be4c346ee60aab"}, - {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, - {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, - {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, - {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, - {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, - {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, - {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"}, - {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, - {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, - {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, - {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = 
"sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, - {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, - {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, - {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, - {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afd7e57eddb1a54f0f1a974bc4391af8bcce0b444685d936840f125cf046d5bd"}, - {file = "PyYAML-6.0.1-cp36-cp36m-win32.whl", hash = "sha256:fca0e3a251908a499833aa292323f32437106001d436eca0e6e7833256674585"}, - {file = "PyYAML-6.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f22ac1c3cac4dbc50079e965eba2c1058622631e526bd9afd45fedd49ba781fa"}, - {file = "PyYAML-6.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b1275ad35a5d18c62a7220633c913e1b42d44b46ee12554e5fd39c70a243d6a3"}, - {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18aeb1bf9a78867dc38b259769503436b7c72f7a1f1f4c93ff9a17de54319b27"}, - {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:596106435fa6ad000c2991a98fa58eeb8656ef2325d7e158344fb33864ed87e3"}, - {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:baa90d3f661d43131ca170712d903e6295d1f7a0f595074f151c0aed377c9b9c"}, - {file = "PyYAML-6.0.1-cp37-cp37m-win32.whl", hash = "sha256:9046c58c4395dff28dd494285c82ba00b546adfc7ef001486fbf0324bc174fba"}, - {file = "PyYAML-6.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:4fb147e7a67ef577a588a0e2c17b6db51dda102c71de36f8549b6816a96e1867"}, - {file = "PyYAML-6.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = 
"sha256:1d4c7e777c441b20e32f52bd377e0c409713e8bb1386e1099c2415f26e479595"}, - {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, - {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, - {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, - {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, - {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, - {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, - {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, - {file = "PyYAML-6.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c8098ddcc2a85b61647b2590f825f3db38891662cfc2fc776415143f599bb859"}, - {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, - {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, - {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, - {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, - {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = 
"sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, - {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, - {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, + {file = "pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069"}, + {file = "pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb"}, ] +[package.dependencies] +pyyaml = "*" + [[package]] name = "requests" -version = "2.31.0" +version = "2.32.4" description = "Python HTTP for Humans." optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"}, - {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"}, + {file = "requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c"}, + {file = "requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422"}, ] [package.dependencies] certifi = ">=2017.4.17" -charset-normalizer = ">=2,<4" +charset_normalizer = ">=2,<4" idna = ">=2.5,<4" urllib3 = ">=1.21.1,<3" @@ -1181,29 +1466,29 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.9.6" +version = "0.9.10" description = "An extremely fast Python linter and code formatter, written in Rust." 
optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.9.6-py3-none-linux_armv6l.whl", hash = "sha256:2f218f356dd2d995839f1941322ff021c72a492c470f0b26a34f844c29cdf5ba"}, - {file = "ruff-0.9.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b908ff4df65dad7b251c9968a2e4560836d8f5487c2f0cc238321ed951ea0504"}, - {file = "ruff-0.9.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b109c0ad2ececf42e75fa99dc4043ff72a357436bb171900714a9ea581ddef83"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1de4367cca3dac99bcbd15c161404e849bb0bfd543664db39232648dc00112dc"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac3ee4d7c2c92ddfdaedf0bf31b2b176fa7aa8950efc454628d477394d35638b"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5dc1edd1775270e6aa2386119aea692039781429f0be1e0949ea5884e011aa8e"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4a091729086dffa4bd070aa5dab7e39cc6b9d62eb2bef8f3d91172d30d599666"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d1bbc6808bf7b15796cef0815e1dfb796fbd383e7dbd4334709642649625e7c5"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:589d1d9f25b5754ff230dce914a174a7c951a85a4e9270613a2b74231fdac2f5"}, - {file = "ruff-0.9.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc61dd5131742e21103fbbdcad683a8813be0e3c204472d520d9a5021ca8b217"}, - {file = "ruff-0.9.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:5e2d9126161d0357e5c8f30b0bd6168d2c3872372f14481136d13de9937f79b6"}, - {file = "ruff-0.9.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:68660eab1a8e65babb5229a1f97b46e3120923757a68b5413d8561f8a85d4897"}, - {file = "ruff-0.9.6-py3-none-musllinux_1_2_i686.whl", hash = 
"sha256:c4cae6c4cc7b9b4017c71114115db0445b00a16de3bcde0946273e8392856f08"}, - {file = "ruff-0.9.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:19f505b643228b417c1111a2a536424ddde0db4ef9023b9e04a46ed8a1cb4656"}, - {file = "ruff-0.9.6-py3-none-win32.whl", hash = "sha256:194d8402bceef1b31164909540a597e0d913c0e4952015a5b40e28c146121b5d"}, - {file = "ruff-0.9.6-py3-none-win_amd64.whl", hash = "sha256:03482d5c09d90d4ee3f40d97578423698ad895c87314c4de39ed2af945633caa"}, - {file = "ruff-0.9.6-py3-none-win_arm64.whl", hash = "sha256:0e2bb706a2be7ddfea4a4af918562fdc1bcb16df255e5fa595bbd800ce322a5a"}, - {file = "ruff-0.9.6.tar.gz", hash = "sha256:81761592f72b620ec8fa1068a6fd00e98a5ebee342a3642efd84454f3031dca9"}, + {file = "ruff-0.9.10-py3-none-linux_armv6l.whl", hash = "sha256:eb4d25532cfd9fe461acc83498361ec2e2252795b4f40b17e80692814329e42d"}, + {file = "ruff-0.9.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:188a6638dab1aa9bb6228a7302387b2c9954e455fb25d6b4470cb0641d16759d"}, + {file = "ruff-0.9.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:5284dcac6b9dbc2fcb71fdfc26a217b2ca4ede6ccd57476f52a587451ebe450d"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47678f39fa2a3da62724851107f438c8229a3470f533894b5568a39b40029c0c"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:99713a6e2766b7a17147b309e8c915b32b07a25c9efd12ada79f217c9c778b3e"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:524ee184d92f7c7304aa568e2db20f50c32d1d0caa235d8ddf10497566ea1a12"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:df92aeac30af821f9acf819fc01b4afc3dfb829d2782884f8739fb52a8119a16"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de42e4edc296f520bb84954eb992a07a0ec5a02fecb834498415908469854a52"}, + {file = 
"ruff-0.9.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d257f95b65806104b6b1ffca0ea53f4ef98454036df65b1eda3693534813ecd1"}, + {file = "ruff-0.9.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b60dec7201c0b10d6d11be00e8f2dbb6f40ef1828ee75ed739923799513db24c"}, + {file = "ruff-0.9.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d838b60007da7a39c046fcdd317293d10b845001f38bcb55ba766c3875b01e43"}, + {file = "ruff-0.9.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ccaf903108b899beb8e09a63ffae5869057ab649c1e9231c05ae354ebc62066c"}, + {file = "ruff-0.9.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:f9567d135265d46e59d62dc60c0bfad10e9a6822e231f5b24032dba5a55be6b5"}, + {file = "ruff-0.9.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5f202f0d93738c28a89f8ed9eaba01b7be339e5d8d642c994347eaa81c6d75b8"}, + {file = "ruff-0.9.10-py3-none-win32.whl", hash = "sha256:bfb834e87c916521ce46b1788fbb8484966e5113c02df216680102e9eb960029"}, + {file = "ruff-0.9.10-py3-none-win_amd64.whl", hash = "sha256:f2160eeef3031bf4b17df74e307d4c5fb689a6f3a26a2de3f7ef4044e3c484f1"}, + {file = "ruff-0.9.10-py3-none-win_arm64.whl", hash = "sha256:5fd804c0327a5e5ea26615550e706942f348b197d5475ff34c19733aee4b2e69"}, + {file = "ruff-0.9.10.tar.gz", hash = "sha256:9bacb735d7bada9cfb0f2c227d3658fc443d90a727b47f206fb33f52f3c0eac7"}, ] [[package]] @@ -1223,223 +1508,166 @@ doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments- test = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "importlib-metadata", "ini2toml[lite] (>=0.14)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "jaraco.test", "mypy (==1.11.*)", "packaging (>=23.2)", "pip (>=19.1)", "pyproject-hooks (!=1.1)", "pytest (>=6,!=8.1.*)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-home (>=0.5)", "pytest-mypy", "pytest-perf", "pytest-ruff (<0.4)", "pytest-ruff (>=0.2.1)", "pytest-ruff 
(>=0.3.2)", "pytest-subprocess", "pytest-timeout", "pytest-xdist (>=3)", "tomli", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"] [[package]] -name = "snowballstemmer" -version = "2.2.0" -description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms." +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" optional = false -python-versions = "*" +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" files = [ - {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"}, - {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"}, + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, ] [[package]] -name = "sphinx" -version = "7.1.2" -description = "Python documentation generator" +name = "tomli" +version = "2.3.0" +description = "A lil' TOML parser" optional = false python-versions = ">=3.8" files = [ - {file = "sphinx-7.1.2-py3-none-any.whl", hash = "sha256:d170a81825b2fcacb6dfd5a0d7f578a053e45d3f2b153fecc948c37344eb4cbe"}, - {file = "sphinx-7.1.2.tar.gz", hash = "sha256:780f4d32f1d7d1126576e0e5ecc19dc32ab76cd24e950228dcf7b1f6d3d9e22f"}, + {file = "tomli-2.3.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:88bd15eb972f3664f5ed4b57c1634a97153b4bac4479dcb6a495f41921eb7f45"}, + {file = "tomli-2.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:883b1c0d6398a6a9d29b508c331fa56adbcdff647f6ace4dfca0f50e90dfd0ba"}, + {file = "tomli-2.3.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d1381caf13ab9f300e30dd8feadb3de072aeb86f1d34a8569453ff32a7dea4bf"}, + {file = 
"tomli-2.3.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a0e285d2649b78c0d9027570d4da3425bdb49830a6156121360b3f8511ea3441"}, + {file = "tomli-2.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0a154a9ae14bfcf5d8917a59b51ffd5a3ac1fd149b71b47a3a104ca4edcfa845"}, + {file = "tomli-2.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:74bf8464ff93e413514fefd2be591c3b0b23231a77f901db1eb30d6f712fc42c"}, + {file = "tomli-2.3.0-cp311-cp311-win32.whl", hash = "sha256:00b5f5d95bbfc7d12f91ad8c593a1659b6387b43f054104cda404be6bda62456"}, + {file = "tomli-2.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:4dc4ce8483a5d429ab602f111a93a6ab1ed425eae3122032db7e9acf449451be"}, + {file = "tomli-2.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d7d86942e56ded512a594786a5ba0a5e521d02529b3826e7761a05138341a2ac"}, + {file = "tomli-2.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:73ee0b47d4dad1c5e996e3cd33b8a76a50167ae5f96a2607cbe8cc773506ab22"}, + {file = "tomli-2.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:792262b94d5d0a466afb5bc63c7daa9d75520110971ee269152083270998316f"}, + {file = "tomli-2.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4f195fe57ecceac95a66a75ac24d9d5fbc98ef0962e09b2eddec5d39375aae52"}, + {file = "tomli-2.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e31d432427dcbf4d86958c184b9bfd1e96b5b71f8eb17e6d02531f434fd335b8"}, + {file = "tomli-2.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7b0882799624980785240ab732537fcfc372601015c00f7fc367c55308c186f6"}, + {file = "tomli-2.3.0-cp312-cp312-win32.whl", hash = "sha256:ff72b71b5d10d22ecb084d345fc26f42b5143c5533db5e2eaba7d2d335358876"}, + {file = "tomli-2.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:1cb4ed918939151a03f33d4242ccd0aa5f11b3547d0cf30f7c74a408a5b99878"}, + {file = 
"tomli-2.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5192f562738228945d7b13d4930baffda67b69425a7f0da96d360b0a3888136b"}, + {file = "tomli-2.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:be71c93a63d738597996be9528f4abe628d1adf5e6eb11607bc8fe1a510b5dae"}, + {file = "tomli-2.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4665508bcbac83a31ff8ab08f424b665200c0e1e645d2bd9ab3d3e557b6185b"}, + {file = "tomli-2.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4021923f97266babc6ccab9f5068642a0095faa0a51a246a6a02fccbb3514eaf"}, + {file = "tomli-2.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4ea38c40145a357d513bffad0ed869f13c1773716cf71ccaa83b0fa0cc4e42f"}, + {file = "tomli-2.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad805ea85eda330dbad64c7ea7a4556259665bdf9d2672f5dccc740eb9d3ca05"}, + {file = "tomli-2.3.0-cp313-cp313-win32.whl", hash = "sha256:97d5eec30149fd3294270e889b4234023f2c69747e555a27bd708828353ab606"}, + {file = "tomli-2.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0c95ca56fbe89e065c6ead5b593ee64b84a26fca063b5d71a1122bf26e533999"}, + {file = "tomli-2.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:cebc6fe843e0733ee827a282aca4999b596241195f43b4cc371d64fc6639da9e"}, + {file = "tomli-2.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4c2ef0244c75aba9355561272009d934953817c49f47d768070c3c94355c2aa3"}, + {file = "tomli-2.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c22a8bf253bacc0cf11f35ad9808b6cb75ada2631c2d97c971122583b129afbc"}, + {file = "tomli-2.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0eea8cc5c5e9f89c9b90c4896a8deefc74f518db5927d0e0e8d4a80953d774d0"}, + {file = "tomli-2.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = 
"sha256:b74a0e59ec5d15127acdabd75ea17726ac4c5178ae51b85bfe39c4f8a278e879"}, + {file = "tomli-2.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:b5870b50c9db823c595983571d1296a6ff3e1b88f734a4c8f6fc6188397de005"}, + {file = "tomli-2.3.0-cp314-cp314-win32.whl", hash = "sha256:feb0dacc61170ed7ab602d3d972a58f14ee3ee60494292d384649a3dc38ef463"}, + {file = "tomli-2.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:b273fcbd7fc64dc3600c098e39136522650c49bca95df2d11cf3b626422392c8"}, + {file = "tomli-2.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:940d56ee0410fa17ee1f12b817b37a4d4e4dc4d27340863cc67236c74f582e77"}, + {file = "tomli-2.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f85209946d1fe94416debbb88d00eb92ce9cd5266775424ff81bc959e001acaf"}, + {file = "tomli-2.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a56212bdcce682e56b0aaf79e869ba5d15a6163f88d5451cbde388d48b13f530"}, + {file = "tomli-2.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c5f3ffd1e098dfc032d4d3af5c0ac64f6d286d98bc148698356847b80fa4de1b"}, + {file = "tomli-2.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5e01decd096b1530d97d5d85cb4dff4af2d8347bd35686654a004f8dea20fc67"}, + {file = "tomli-2.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8a35dd0e643bb2610f156cca8db95d213a90015c11fee76c946aa62b7ae7e02f"}, + {file = "tomli-2.3.0-cp314-cp314t-win32.whl", hash = "sha256:a1f7f282fe248311650081faafa5f4732bdbfef5d45fe3f2e702fbc6f2d496e0"}, + {file = "tomli-2.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:70a251f8d4ba2d9ac2542eecf008b3c8a9fc5c3f9f02c56a9d7952612be2fdba"}, + {file = "tomli-2.3.0-py3-none-any.whl", hash = "sha256:e95b1af3c5b07d9e643909b5abbec77cd9f1217e6d0bca72b0234736b9fb1f1b"}, + {file = "tomli-2.3.0.tar.gz", hash = "sha256:64be704a875d2a59753d80ee8a533c3fe183e3f06807ff7dc2232938ccb01549"}, ] -[package.dependencies] -alabaster = 
">=0.7,<0.8" -babel = ">=2.9" -colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""} -docutils = ">=0.18.1,<0.21" -imagesize = ">=1.3" -importlib-metadata = {version = ">=4.8", markers = "python_version < \"3.10\""} -Jinja2 = ">=3.0" -packaging = ">=21.0" -Pygments = ">=2.13" -requests = ">=2.25.0" -snowballstemmer = ">=2.0" -sphinxcontrib-applehelp = "*" -sphinxcontrib-devhelp = "*" -sphinxcontrib-htmlhelp = ">=2.0.0" -sphinxcontrib-jsmath = "*" -sphinxcontrib-qthelp = "*" -sphinxcontrib-serializinghtml = ">=1.1.5" - -[package.extras] -docs = ["sphinxcontrib-websupport"] -lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"] -test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"] - [[package]] -name = "sphinx-autodoc-typehints" -version = "2.0.1" -description = "Type hints (PEP 484) support for the Sphinx autodoc extension" +name = "types-setuptools" +version = "71.1.0.20240818" +description = "Typing stubs for setuptools" optional = false python-versions = ">=3.8" files = [ - {file = "sphinx_autodoc_typehints-2.0.1-py3-none-any.whl", hash = "sha256:f73ae89b43a799e587e39266672c1075b2ef783aeb382d3ebed77c38a3fc0149"}, - {file = "sphinx_autodoc_typehints-2.0.1.tar.gz", hash = "sha256:60ed1e3b2c970acc0aa6e877be42d48029a9faec7378a17838716cacd8c10b12"}, + {file = "types-setuptools-71.1.0.20240818.tar.gz", hash = "sha256:f62eaffaa39774462c65fbb49368c4dc1d91a90a28371cb14e1af090ff0e41e3"}, + {file = "types_setuptools-71.1.0.20240818-py3-none-any.whl", hash = "sha256:c4f95302f88369ac0ac46c67ddbfc70c6c4dbbb184d9fed356244217a2934025"}, ] -[package.dependencies] -sphinx = ">=7.1.2" - -[package.extras] -docs = ["furo (>=2024.1.29)"] -numpy = ["nptyping (>=2.5)"] -testing = ["covdefaults (>=2.3)", "coverage (>=7.4.2)", "diff-cover (>=8.0.3)", "pytest (>=8.0.1)", "pytest-cov (>=4.1)", "sphobjinv (>=2.3.1)", "typing-extensions (>=4.9)"] - [[package]] -name = "sphinxcontrib-apidoc" 
-version = "0.5.0" -description = "A Sphinx extension for running 'sphinx-apidoc' on each build" +name = "typing-extensions" +version = "4.13.2" +description = "Backported and Experimental Type Hints for Python 3.8+" optional = false python-versions = ">=3.8" files = [ - {file = "sphinxcontrib-apidoc-0.5.0.tar.gz", hash = "sha256:65efcd92212a5f823715fb95ee098b458a6bb09a5ee617d9ed3dead97177cd55"}, - {file = "sphinxcontrib_apidoc-0.5.0-py3-none-any.whl", hash = "sha256:c671d644d6dc468be91b813dcddf74d87893bff74fe8f1b8b01b69408f0fb776"}, + {file = "typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c"}, + {file = "typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef"}, ] -[package.dependencies] -pbr = "*" -Sphinx = ">=5.0.0" - [[package]] -name = "sphinxcontrib-applehelp" -version = "1.0.4" -description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books" +name = "urllib3" +version = "2.2.3" +description = "HTTP library with thread-safe connection pooling, file post, and more." optional = false python-versions = ">=3.8" files = [ - {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"}, - {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - -[[package]] -name = "sphinxcontrib-devhelp" -version = "1.0.2" -description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document." 
-optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"}, - {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"}, + {file = "urllib3-2.2.3-py3-none-any.whl", hash = "sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac"}, + {file = "urllib3-2.2.3.tar.gz", hash = "sha256:e7d814a81dad81e6caf2ec9fdedb284ecc9c73076b62654547cc64ccdcae26e9"}, ] [package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] +brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"] +h2 = ["h2 (>=4,<5)"] +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] +zstd = ["zstandard (>=0.18.0)"] [[package]] -name = "sphinxcontrib-htmlhelp" -version = "2.0.1" -description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files" +name = "watchdog" +version = "4.0.2" +description = "Filesystem events monitoring" optional = false python-versions = ">=3.8" files = [ - {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"}, - {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["html5lib", "pytest"] - -[[package]] -name = "sphinxcontrib-jsmath" -version = "1.0.1" -description = "A sphinx extension which renders display math in HTML via JavaScript" -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"}, - {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"}, -] - -[package.extras] -test = 
["flake8", "mypy", "pytest"] - -[[package]] -name = "sphinxcontrib-qthelp" -version = "1.0.3" -description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document." -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"}, - {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - -[[package]] -name = "sphinxcontrib-serializinghtml" -version = "1.1.5" -description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)." -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"}, - {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"}, + {file = "watchdog-4.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ede7f010f2239b97cc79e6cb3c249e72962404ae3865860855d5cbe708b0fd22"}, + {file = "watchdog-4.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a2cffa171445b0efa0726c561eca9a27d00a1f2b83846dbd5a4f639c4f8ca8e1"}, + {file = "watchdog-4.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c50f148b31b03fbadd6d0b5980e38b558046b127dc483e5e4505fcef250f9503"}, + {file = "watchdog-4.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7c7d4bf585ad501c5f6c980e7be9c4f15604c7cc150e942d82083b31a7548930"}, + {file = "watchdog-4.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:914285126ad0b6eb2258bbbcb7b288d9dfd655ae88fa28945be05a7b475a800b"}, + {file = "watchdog-4.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:984306dc4720da5498b16fc037b36ac443816125a3705dfde4fd90652d8028ef"}, + {file = "watchdog-4.0.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:1cdcfd8142f604630deef34722d695fb455d04ab7cfe9963055df1fc69e6727a"}, + {file = "watchdog-4.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d7ab624ff2f663f98cd03c8b7eedc09375a911794dfea6bf2a359fcc266bff29"}, + {file = "watchdog-4.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:132937547a716027bd5714383dfc40dc66c26769f1ce8a72a859d6a48f371f3a"}, + {file = "watchdog-4.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:cd67c7df93eb58f360c43802acc945fa8da70c675b6fa37a241e17ca698ca49b"}, + {file = "watchdog-4.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bcfd02377be80ef3b6bc4ce481ef3959640458d6feaae0bd43dd90a43da90a7d"}, + {file = "watchdog-4.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:980b71510f59c884d684b3663d46e7a14b457c9611c481e5cef08f4dd022eed7"}, + {file = "watchdog-4.0.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:aa160781cafff2719b663c8a506156e9289d111d80f3387cf3af49cedee1f040"}, + {file = "watchdog-4.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f6ee8dedd255087bc7fe82adf046f0b75479b989185fb0bdf9a98b612170eac7"}, + {file = "watchdog-4.0.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:0b4359067d30d5b864e09c8597b112fe0a0a59321a0f331498b013fb097406b4"}, + {file = "watchdog-4.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:770eef5372f146997638d737c9a3c597a3b41037cfbc5c41538fc27c09c3a3f9"}, + {file = "watchdog-4.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eeea812f38536a0aa859972d50c76e37f4456474b02bd93674d1947cf1e39578"}, + {file = "watchdog-4.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b2c45f6e1e57ebb4687690c05bc3a2c1fb6ab260550c4290b8abb1335e0fd08b"}, + {file = "watchdog-4.0.2-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:10b6683df70d340ac3279eff0b2766813f00f35a1d37515d2c99959ada8f05fa"}, + {file = 
"watchdog-4.0.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:f7c739888c20f99824f7aa9d31ac8a97353e22d0c0e54703a547a218f6637eb3"}, + {file = "watchdog-4.0.2-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:c100d09ac72a8a08ddbf0629ddfa0b8ee41740f9051429baa8e31bb903ad7508"}, + {file = "watchdog-4.0.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:f5315a8c8dd6dd9425b974515081fc0aadca1d1d61e078d2246509fd756141ee"}, + {file = "watchdog-4.0.2-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:2d468028a77b42cc685ed694a7a550a8d1771bb05193ba7b24006b8241a571a1"}, + {file = "watchdog-4.0.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:f15edcae3830ff20e55d1f4e743e92970c847bcddc8b7509bcd172aa04de506e"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_aarch64.whl", hash = "sha256:936acba76d636f70db8f3c66e76aa6cb5136a936fc2a5088b9ce1c7a3508fc83"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_armv7l.whl", hash = "sha256:e252f8ca942a870f38cf785aef420285431311652d871409a64e2a0a52a2174c"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_i686.whl", hash = "sha256:0e83619a2d5d436a7e58a1aea957a3c1ccbf9782c43c0b4fed80580e5e4acd1a"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_ppc64.whl", hash = "sha256:88456d65f207b39f1981bf772e473799fcdc10801062c36fd5ad9f9d1d463a73"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:32be97f3b75693a93c683787a87a0dc8db98bb84701539954eef991fb35f5fbc"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_s390x.whl", hash = "sha256:c82253cfc9be68e3e49282831afad2c1f6593af80c0daf1287f6a92657986757"}, + {file = "watchdog-4.0.2-py3-none-manylinux2014_x86_64.whl", hash = "sha256:c0b14488bd336c5b1845cee83d3e631a1f8b4e9c5091ec539406e4a324f882d8"}, + {file = "watchdog-4.0.2-py3-none-win32.whl", hash = "sha256:0d8a7e523ef03757a5aa29f591437d64d0d894635f8a50f370fe37f913ce4e19"}, + {file = "watchdog-4.0.2-py3-none-win_amd64.whl", hash = 
"sha256:c344453ef3bf875a535b0488e3ad28e341adbd5a9ffb0f7d62cefacc8824ef2b"}, + {file = "watchdog-4.0.2-py3-none-win_ia64.whl", hash = "sha256:baececaa8edff42cd16558a639a9b0ddf425f93d892e8392a56bf904f5eff22c"}, + {file = "watchdog-4.0.2.tar.gz", hash = "sha256:b4dfbb6c49221be4535623ea4474a4d6ee0a9cef4a80b20c28db4d858b64e270"}, ] [package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] +watchmedo = ["PyYAML (>=3.10)"] [[package]] -name = "tomli" -version = "2.0.1" -description = "A lil' TOML parser" -optional = false -python-versions = ">=3.7" -files = [ - {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"}, - {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"}, -] - -[[package]] -name = "types-setuptools" -version = "71.1.0.20240723" -description = "Typing stubs for setuptools" -optional = false -python-versions = ">=3.8" -files = [ - {file = "types-setuptools-71.1.0.20240723.tar.gz", hash = "sha256:8a9349038c7e22d88e6c5d9c6705b347b22930424114a452c1712899e85131ff"}, - {file = "types_setuptools-71.1.0.20240723-py3-none-any.whl", hash = "sha256:ac9fc263f59d1e02bca49cb7270a12c47ab80b3b911fb4d92f1fecf978bfe88a"}, -] - -[[package]] -name = "typing-extensions" -version = "4.12.2" -description = "Backported and Experimental Type Hints for Python 3.8+" -optional = false -python-versions = ">=3.8" -files = [ - {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"}, - {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"}, -] - -[[package]] -name = "urllib3" -version = "2.0.4" -description = "HTTP library with thread-safe connection pooling, file post, and more." +name = "wcmatch" +version = "10.1" +description = "Wildcard/glob file name matcher." 
optional = false -python-versions = ">=3.7" +python-versions = ">=3.9" files = [ - {file = "urllib3-2.0.4-py3-none-any.whl", hash = "sha256:de7df1803967d2c2a98e4b11bb7d6bd9210474c46e8a0401514e3a42a75ebde4"}, - {file = "urllib3-2.0.4.tar.gz", hash = "sha256:8d22f86aae8ef5e410d4f539fde9ce6b2113a001bb4d189e0aed70642d602b11"}, + {file = "wcmatch-10.1-py3-none-any.whl", hash = "sha256:5848ace7dbb0476e5e55ab63c6bbd529745089343427caa5537f230cc01beb8a"}, + {file = "wcmatch-10.1.tar.gz", hash = "sha256:f11f94208c8c8484a16f4f48638a85d771d9513f4ab3f37595978801cb9465af"}, ] -[package.extras] -brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"] -secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"] -socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] -zstd = ["zstandard (>=0.18.0)"] +[package.dependencies] +bracex = ">=2.1.1" [[package]] name = "wheel" @@ -1457,18 +1685,22 @@ test = ["pytest (>=6.0.0)", "setuptools (>=65)"] [[package]] name = "zipp" -version = "3.16.2" +version = "3.20.2" description = "Backport of pathlib-compatible object wrapper for zip files" optional = false python-versions = ">=3.8" files = [ - {file = "zipp-3.16.2-py3-none-any.whl", hash = "sha256:679e51dd4403591b2d6838a48de3d283f3d188412a9782faadf845f298736ba0"}, - {file = "zipp-3.16.2.tar.gz", hash = "sha256:ebc15946aa78bd63458992fc81ec3b6f7b1e92d51c35e6de1c3804e73b799147"}, + {file = "zipp-3.20.2-py3-none-any.whl", hash = "sha256:a817ac80d6cf4b23bf7f2828b7cabf326f15a001bea8b1f9b49631780ba28350"}, + {file = "zipp-3.20.2.tar.gz", hash = "sha256:bc9eb26f4506fda01b81bcde0ca78103b6e62f991b381fec825435c836edbc29"}, ] [package.extras] -docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] -testing = ["big-O", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", 
"pytest-ignore-flaky", "pytest-mypy (>=0.9.1)", "pytest-ruff"] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] +test = ["big-O", "importlib-resources", "jaraco.functools", "jaraco.itertools", "jaraco.test", "more-itertools", "pytest (>=6,!=8.1.*)", "pytest-ignore-flaky"] +type = ["pytest-mypy"] [extras] berkeleydb = ["berkeleydb"] @@ -1480,4 +1712,4 @@ orjson = ["orjson"] [metadata] lock-version = "2.0" python-versions = ">=3.8.1" -content-hash = "0ec27e1dca1f3b60dce28a2109a95eaada0bce2cbfbaee4d209d434a0d6e4086" +content-hash = "35ce70402138519fd62cf5ec9901125cab2c36fc6965ef13c66020637c8c8bd4" diff --git a/pyproject.toml b/pyproject.toml index ce80422313..474854a9fd 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -15,6 +15,8 @@ classifiers=[ "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.11", + "Programming Language :: Python :: 3.12", + "Programming Language :: Python :: 3.13", "License :: OSI Approved :: BSD License", "Topic :: Software Development :: Libraries :: Python Modules", "Operating System :: OS Independent", @@ -62,11 +64,14 @@ setuptools = ">=68,<72" wheel = ">=0.42,<0.46" [tool.poetry.group.docs.dependencies] -sphinx = ">=7.1.2,<8" -myst-parser = ">=2,<4" -sphinxcontrib-apidoc = ">=0.3,<0.6" -sphinx-autodoc-typehints = ">=1.25.3,<=2.0.1" -typing-extensions = "^4.5.0" +typing-extensions = "^4.11.0" +mkdocs = ">=1.6.1" +mkdocs-material = ">=9.6.12" +# Set python to 3.11 or greater as that's what's used in the validate workflow to test doc builds. +mkdocstrings = {version = "^0.26.2", extras = ["python"], python = ">=3.11"} +mkdocs-gen-files = "^0.5.0" +# Set python to 3.11 or greater as that's what's used in the validate workflow to test doc builds. 
+mkdocs-include-markdown-plugin = {version = "^7.2.0", python = ">=3.11"} [tool.poetry.group.lint.dependencies] ruff = ">=0.0.286,<0.10.0" @@ -182,7 +187,7 @@ exclude = ''' | \.venv | \.var | \.github - | _build + | site | htmlcov | benchmarks | test_reports @@ -197,13 +202,13 @@ exclude = ''' [tool.pytest.ini_options] addopts = [ - "--doctest-modules", - "--ignore=admin", - "--ignore=devtools", - "--ignore=rdflib/extras/external_graph_libs.py", - "--ignore-glob=docs/*.py", - "--doctest-glob=docs/*.rst", - "--strict-markers", + "--doctest-modules", + "--ignore=admin", + "--ignore=devtools", + "--ignore=rdflib/extras/external_graph_libs.py", + "--ignore-glob=docs/*.py", + "--ignore-glob=site/*", + "--strict-markers", ] filterwarnings = [ # The below warning is a consequence of how pytest doctest detects mocks and how DefinedNamespace behaves when an undefined attribute is being accessed. @@ -237,7 +242,7 @@ skip = [ '.venv', '.var', '.github', - '_build', + 'site', 'htmlcov', 'benchmarks', 'test_reports', diff --git a/rdflib/__init__.py b/rdflib/__init__.py index 5e37ec92f3..c538272bd0 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -7,40 +7,42 @@ rdflib package. The primary interface `rdflib` exposes to work with RDF is -`rdflib.graph.Graph`. +[`rdflib.graph.Graph`][rdflib.graph.Graph]. A tiny example: - >>> from rdflib import Graph, URIRef, Literal - - >>> g = Graph() - >>> result = g.parse("http://www.w3.org/2000/10/swap/test/meet/blue.rdf") - - >>> print("graph has %s statements." % len(g)) - graph has 4 statements. - >>> - >>> for s, p, o in g: - ... if (s, p, o) not in g: - ... raise Exception("It better be!") - - >>> s = g.serialize(format='nt') - >>> - >>> sorted(g) == [ - ... (URIRef("http://meetings.example.com/cal#m1"), - ... URIRef("http://www.example.org/meeting_organization#homePage"), - ... URIRef("http://meetings.example.com/m1/hp")), - ... (URIRef("http://www.example.org/people#fred"), - ... 
URIRef("http://www.example.org/meeting_organization#attending"), - ... URIRef("http://meetings.example.com/cal#m1")), - ... (URIRef("http://www.example.org/people#fred"), - ... URIRef("http://www.example.org/personal_details#GivenName"), - ... Literal("Fred")), - ... (URIRef("http://www.example.org/people#fred"), - ... URIRef("http://www.example.org/personal_details#hasEmail"), - ... URIRef("mailto:fred@example.com")) - ... ] - True - +```python +>>> from rdflib import Graph, URIRef, Literal + +>>> g = Graph() +>>> result = g.parse("http://www.w3.org/2000/10/swap/test/meet/blue.rdf") + +>>> print("graph has %s statements." % len(g)) +graph has 4 statements. +>>> +>>> for s, p, o in g: +... if (s, p, o) not in g: +... raise Exception("It better be!") + +>>> s = g.serialize(format='nt') +>>> +>>> sorted(g) == [ +... (URIRef("http://meetings.example.com/cal#m1"), +... URIRef("http://www.example.org/meeting_organization#homePage"), +... URIRef("http://meetings.example.com/m1/hp")), +... (URIRef("http://www.example.org/people#fred"), +... URIRef("http://www.example.org/meeting_organization#attending"), +... URIRef("http://meetings.example.com/cal#m1")), +... (URIRef("http://www.example.org/people#fred"), +... URIRef("http://www.example.org/personal_details#GivenName"), +... Literal("Fred")), +... (URIRef("http://www.example.org/people#fred"), +... URIRef("http://www.example.org/personal_details#hasEmail"), +... URIRef("mailto:fred@example.com")) +... ] +True + +``` """ import logging @@ -133,10 +135,13 @@ For example: +```python >>> from rdflib import Literal,XSD >>> Literal("01", datatype=XSD.int) rdflib.term.Literal("1", datatype=rdflib.term.URIRef("http://www.w3.org/2001/XMLSchema#integer")) +``` + This flag may be changed at any time, but will only affect literals created after that time, previously created literals will remain (un)normalized. 
@@ -145,14 +150,13 @@ DAWG_LITERAL_COLLATION = False -""" -DAWG_LITERAL_COLLATION determines how literals are ordered or compared +"""DAWG_LITERAL_COLLATION determines how literals are ordered or compared to each other. In SPARQL, applying the >,<,>=,<= operators to literals of incompatible data-types is an error, i.e: -Literal(2)>Literal('cake') is neither true nor false, but an error. +`Literal(2)>Literal('cake')` is neither true nor false, but an error. This is a problem in PY3, where lists of Literals of incompatible types can no longer be sorted. @@ -162,7 +166,7 @@ datatype URI In particular, this determines how the rich comparison operators for -Literal work, eq, __neq__, __lt__, etc. +Literal work, eq, `__neq__`, `__lt__`, etc. """ diff --git a/rdflib/_networking.py b/rdflib/_networking.py index 311096a891..2788fe0aeb 100644 --- a/rdflib/_networking.py +++ b/rdflib/_networking.py @@ -11,23 +11,27 @@ def _make_redirect_request(request: Request, http_error: HTTPError) -> Request: - """ - Create a new request object for a redirected request. - - The logic is based on `urllib.request.HTTPRedirectHandler` from `this commit _`. - - :param request: The original request that resulted in the redirect. - :param http_error: The response to the original request that indicates a - redirect should occur and contains the new location. - :return: A new request object to the location indicated by the response. - :raises HTTPError: the supplied ``http_error`` if the redirect request - cannot be created. - :raises ValueError: If the response code is `None`. - :raises ValueError: If the response does not contain a ``Location`` header - or the ``Location`` header is not a string. - :raises HTTPError: If the scheme of the new location is not ``http``, - ``https``, or ``ftp``. - :raises HTTPError: If there are too many redirects or a redirect loop. + """Create a new request object for a redirected request. 
+ + The logic is based on [HTTPRedirectHandler](https://github.com/python/cpython/blob/b58bc8c2a9a316891a5ea1a0487aebfc86c2793a/Lib/urllib/request.py#L641-L751) from urllib.request. + + Args: + request: The original request that resulted in the redirect. + http_error: The response to the original request that indicates a + redirect should occur and contains the new location. + + Returns: + A new request object to the location indicated by the response. + + Raises: + HTTPError: the supplied `http_error` if the redirect request + cannot be created. + ValueError: If the response code is None. + ValueError: If the response does not contain a `Location` header + or the `Location` header is not a string. + HTTPError: If the scheme of the new location is not `http`, + `https`, or `ftp`. + HTTPError: If there are too many redirects or a redirect loop. """ new_url = http_error.headers.get("Location") if new_url is None: @@ -92,15 +96,17 @@ def _make_redirect_request(request: Request, http_error: HTTPError) -> Request: def _urlopen(request: Request) -> addinfourl: - """ - This is a shim for `urlopen` that handles HTTP redirects with status code + """This is a shim for `urlopen` that handles HTTP redirects with status code 308 (Permanent Redirect). This function should be removed once all supported versions of Python handles the 308 HTTP status code. - :param request: The request to open. - :return: The response to the request. + Args: + request: The request to open. + + Returns: + The response to the request. """ try: return urlopen(request) diff --git a/rdflib/_type_checking.py b/rdflib/_type_checking.py index 1bbeda1349..2ead0eabd7 100644 --- a/rdflib/_type_checking.py +++ b/rdflib/_type_checking.py @@ -3,13 +3,13 @@ as it would otherwise introduce a runtime dependency on `typing_extensions` for older python versions which is not desirable. 
-This was made mainly to accommodate ``sphinx-autodoc-typehints`` which cannot +This was made mainly to accommodate `sphinx-autodoc-typehints` which cannot recognize type aliases from imported files if the type aliases are defined -inside ``if TYPE_CHECKING:``. So instead of placing the type aliases in normal -modules inside ``TYPE_CHECKING`` guards they are in this file which should only -be imported inside ``TYPE_CHECKING`` guards. +inside `if TYPE_CHECKING:`. So instead of placing the type aliases in normal +modules inside `TYPE_CHECKING` guards they are in this file which should only +be imported inside `TYPE_CHECKING` guards. -.. important:: +!!! info "Internal use only" Things inside this module are not for use outside of RDFLib and this module is not part the the RDFLib public API. """ diff --git a/rdflib/collection.py b/rdflib/collection.py index ed0a48ff93..dd63c4ed21 100644 --- a/rdflib/collection.py +++ b/rdflib/collection.py @@ -12,10 +12,9 @@ class Collection: - """ - See "Emulating container types": - https://docs.python.org/reference/datamodel.html#emulating-container-types + """See "Emulating container types": + ```python >>> from rdflib.term import Literal >>> from rdflib.graph import Graph >>> from pprint import pprint @@ -40,7 +39,6 @@ class Collection: ['"1"^^', '"2"^^', '"3"^^'] - >>> Literal(1) in c True >>> len(c) @@ -50,8 +48,9 @@ class Collection: >>> c.index(Literal(2)) == 1 True - The collection is immutable if ``uri`` is the empty list - (``http://www.w3.org/1999/02/22-rdf-syntax-ns#nil``). + ``` + + The collection is immutable if `uri` is the empty list (`http://www.w3.org/1999/02/22-rdf-syntax-ns#nil`). 
""" def __init__(self, graph: Graph, uri: Node, seq: List[Node] = []): @@ -62,6 +61,7 @@ def __init__(self, graph: Graph, uri: Node, seq: List[Node] = []): def n3(self) -> str: """ + ```python >>> from rdflib.term import Literal >>> from rdflib.graph import Graph >>> listname = BNode() @@ -83,8 +83,10 @@ def n3(self) -> str: >>> c = Collection(g, listname) >>> print(c.n3()) #doctest: +NORMALIZE_WHITESPACE ( "1"^^ - "2"^^ - "3"^^ ) + "2"^^ + "3"^^ ) + + ``` """ return "( %s )" % (" ".join([i.n3() for i in self])) @@ -147,6 +149,7 @@ def __setitem__(self, key: int, value: Node) -> None: def __delitem__(self, key: int) -> None: """ + ```python >>> from rdflib.namespace import RDF, RDFS >>> from rdflib import Graph >>> from pprint import pformat @@ -187,6 +190,7 @@ def __delitem__(self, key: int) -> None: >>> len(g) 4 + ``` """ self[key] # to raise any potential key exceptions graph = self.graph @@ -223,6 +227,7 @@ def _end(self) -> Node: def append(self, item: Node) -> Collection: """ + ```python >>> from rdflib.term import Literal >>> from rdflib.graph import Graph >>> listname = BNode() @@ -233,8 +238,8 @@ def append(self, item: Node) -> Collection: >>> len([i for i in links if (i, RDF.rest, RDF.nil) in g]) 1 + ``` """ - end = self._end() if end == RDF.nil: raise ValueError("Cannot append to empty list") diff --git a/rdflib/compare.py b/rdflib/compare.py index afc2c40b50..77d99d533b 100644 --- a/rdflib/compare.py +++ b/rdflib/compare.py @@ -7,70 +7,84 @@ Warning: the time to canonicalize bnodes may increase exponentially on degenerate larger graphs. Use with care! -Example of comparing two graphs:: - - >>> g1 = Graph().parse(format='n3', data=''' - ... @prefix : . - ... :rel - ... , - ... [ :label "Same" ], - ... , - ... [ :label "A" ] . - ... ''') - >>> g2 = Graph().parse(format='n3', data=''' - ... @prefix : . - ... :rel - ... , - ... [ :label "Same" ], - ... , - ... [ :label "B" ] . - ... 
''') - >>> - >>> iso1 = to_isomorphic(g1) - >>> iso2 = to_isomorphic(g2) - -These are not isomorphic:: - - >>> iso1 == iso2 - False - -Diff the two graphs:: - - >>> in_both, in_first, in_second = graph_diff(iso1, iso2) - -Present in both:: - - >>> def dump_nt_sorted(g): - ... for l in sorted(g.serialize(format='nt').splitlines()): - ... if l: print(l.decode('ascii')) - - >>> dump_nt_sorted(in_both) #doctest: +SKIP - - . - - _:cbcaabaaba17fecbc304a64f8edee4335e . - _:cbcaabaaba17fecbc304a64f8edee4335e - "Same" . - -Only in first:: - - >>> dump_nt_sorted(in_first) #doctest: +SKIP - - . - - _:cb124e4c6da0579f810c0ffe4eff485bd9 . - _:cb124e4c6da0579f810c0ffe4eff485bd9 - "A" . - -Only in second:: - - >>> dump_nt_sorted(in_second) #doctest: +SKIP - - . - - _:cb558f30e21ddfc05ca53108348338ade8 . - _:cb558f30e21ddfc05ca53108348338ade8 - "B" . +Example of comparing two graphs: + +```python +>>> g1 = Graph().parse(format='n3', data=''' +... @prefix : . +... :rel +... , +... [ :label "Same" ], +... , +... [ :label "A" ] . +... ''') +>>> g2 = Graph().parse(format='n3', data=''' +... @prefix : . +... :rel +... , +... [ :label "Same" ], +... , +... [ :label "B" ] . +... ''') +>>> +>>> iso1 = to_isomorphic(g1) +>>> iso2 = to_isomorphic(g2) + +``` + +These are not isomorphic + +```python +>>> iso1 == iso2 +False + +``` + +Diff the two graphs: + +```python +>>> in_both, in_first, in_second = graph_diff(iso1, iso2) + +``` + +Present in both: + +```python +>>> def dump_nt_sorted(g): +... for l in sorted(g.serialize(format='nt').splitlines()): +... if l: print(l.decode('ascii')) +>>> dump_nt_sorted(in_both) #doctest: +SKIP + + . + + _:cbcaabaaba17fecbc304a64f8edee4335e . +_:cbcaabaaba17fecbc304a64f8edee4335e + "Same" . +``` + +Only in first: + +```python +>>> dump_nt_sorted(in_first) #doctest: +SKIP + + . + + _:cb124e4c6da0579f810c0ffe4eff485bd9 . +_:cb124e4c6da0579f810c0ffe4eff485bd9 + "A" . +``` + +Only in second: + +```python +>>> dump_nt_sorted(in_second) #doctest: +SKIP + + . 
+ + _:cb558f30e21ddfc05ca53108348338ade8 . +_:cb558f30e21ddfc05ca53108348338ade8 + "B" . +``` """ from __future__ import annotations @@ -194,7 +208,7 @@ def graph_digest(self, stats=None): def internal_hash(self, stats=None): """ - This is defined instead of __hash__ to avoid a circular recursion + This is defined instead of `__hash__` to avoid a circular recursion scenario with the Memory store for rdflib which requires a hash lookup in order to return a generator of triples. """ @@ -451,6 +465,7 @@ def _traces( experimental = self._experimental_path(coloring_copy) experimental_score = set([c.key() for c in experimental]) if last_coloring: + # type error: Statement is unreachable generator = self._create_generator( # type: ignore[unreachable] [last_coloring, experimental], generator ) @@ -461,13 +476,13 @@ def _traces( best_experimental_score = experimental_score elif best_score > color_score: # type: ignore[unreachable] # prune this branch. - if stats is not None: + if stats is not None and isinstance(stats["prunings"], int): stats["prunings"] += 1 elif experimental_score != best_experimental_score: best.append(refined_coloring) else: # prune this branch. - if stats is not None: + if stats is not None and isinstance(stats["prunings"], int): stats["prunings"] += 1 discrete: List[List[Color]] = [x for x in best if self._discrete(x)] if len(discrete) == 0: @@ -547,8 +562,8 @@ def isomorphic(graph1: Graph, graph2: Graph) -> bool: Uses an algorithm to compute unique hashes which takes bnodes into account. - Examples:: - + Example: + ```python >>> g1 = Graph().parse(format='n3', data=''' ... @prefix : . ... :rel . @@ -563,7 +578,6 @@ def isomorphic(graph1: Graph, graph2: Graph) -> bool: ... ''') >>> isomorphic(g1, g2) True - >>> g3 = Graph().parse(format='n3', data=''' ... @prefix : . ... :rel . @@ -572,6 +586,8 @@ def isomorphic(graph1: Graph, graph2: Graph) -> bool: ... 
''') >>> isomorphic(g1, g3) False + + ``` """ gd1 = _TripleCanonicalizer(graph1).to_hash() gd2 = _TripleCanonicalizer(graph2).to_hash() @@ -610,10 +626,10 @@ def similar(g1: Graph, g2: Graph): Checks if the two graphs are "similar", by comparing sorted triples where all bnodes have been replaced by a singular mock bnode (the - ``_MOCK_BNODE``). + `_MOCK_BNODE`). This is a much cheaper, but less reliable, alternative to the comparison - algorithm in ``isomorphic``. + algorithm in `isomorphic`. """ return all(t1 == t2 for (t1, t2) in _squashed_graphs_triples(g1, g2)) diff --git a/rdflib/container.py b/rdflib/container.py index 6ee92848b0..cbfd2cac5b 100644 --- a/rdflib/container.py +++ b/rdflib/container.py @@ -8,50 +8,53 @@ class Container: - """A class for constructing RDF containers, as per https://www.w3.org/TR/rdf11-mt/#rdf-containers - - Basic usage, creating a ``Bag`` and adding to it:: - - >>> from rdflib import Graph, BNode, Literal, Bag - >>> g = Graph() - >>> b = Bag(g, BNode(), [Literal("One"), Literal("Two"), Literal("Three")]) - >>> print(g.serialize(format="turtle")) - @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> . - - [] a rdf:Bag ; - rdf:_1 "One" ; - rdf:_2 "Two" ; - rdf:_3 "Three" . - - - - >>> # print out an item using an index reference - >>> print(b[2]) - Two - - >>> # add a new item - >>> b.append(Literal("Hello")) # doctest: +ELLIPSIS - <rdflib.container.Bag object at ...> - >>> print(g.serialize(format="turtle")) - @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> . - - [] a rdf:Bag ; - rdf:_1 "One" ; - rdf:_2 "Two" ; - rdf:_3 "Three" ; - rdf:_4 "Hello" . - - - + """A class for constructing RDF containers, as per <https://www.w3.org/TR/rdf11-mt/#rdf-containers> + + Basic usage, creating a `Bag` and adding to it: + + ```python + >>> from rdflib import Graph, BNode, Literal, Bag + >>> g = Graph() + >>> b = Bag(g, BNode(), [Literal("One"), Literal("Two"), Literal("Three")]) + >>> print(g.serialize(format="turtle")) + @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> . + + [] a rdf:Bag ; + rdf:_1 "One" ; + rdf:_2 "Two" ; + rdf:_3 "Three" .
+ + + + >>> # print out an item using an index reference + >>> print(b[2]) + Two + + >>> # add a new item + >>> b.append(Literal("Hello")) # doctest: +ELLIPSIS + <rdflib.container.Bag object at ...> + >>> print(g.serialize(format="turtle")) + @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> . + + [] a rdf:Bag ; + rdf:_1 "One" ; + rdf:_2 "Two" ; + rdf:_3 "Three" ; + rdf:_4 "Hello" . + + + + ``` """ def __init__(self, graph, uri, seq=[], rtype="Bag"): """Creates a Container - :param graph: a Graph instance - :param uri: URI or Blank Node of the Container - :param seq: the elements of the Container - :param rtype: the type of Container, one of "Bag", "Seq" or "Alt" + Args: + graph: a Graph instance + uri: URI or Blank Node of the Container + seq: the elements of the Container + rtype: the type of Container, one of "Bag", "Seq" or "Alt" """ self.graph = graph diff --git a/rdflib/events.py b/rdflib/events.py index 61f3454b64..b22d7bc3ab 100644 --- a/rdflib/events.py +++ b/rdflib/events.py @@ -6,21 +6,30 @@ Create a dispatcher: - >>> d = Dispatcher() +```python +>>> d = Dispatcher() + +``` Now create a handler for the event and subscribe it to the dispatcher to handle Event events.
A handler is a simple function or method that accepts the event as an argument: - >>> def handler1(event): print(repr(event)) - >>> d.subscribe(Event, handler1) # doctest: +ELLIPSIS - <rdflib.events.Dispatcher object at ...> +```python +>>> def handler1(event): print(repr(event)) +>>> d.subscribe(Event, handler1) # doctest: +ELLIPSIS +<rdflib.events.Dispatcher object at ...> + +``` Now dispatch a new event into the dispatcher, and see handler1 get fired: - >>> d.dispatch(Event(foo='bar', data='yours', used_by='the event handlers')) - <rdflib.events.Event ['data', 'foo', 'used_by']> +```python +>>> d.dispatch(Event(foo='bar', data='yours', used_by='the event handlers')) +<rdflib.events.Event ['data', 'foo', 'used_by']> + +``` """ from __future__ import annotations diff --git a/rdflib/extras/describer.py b/rdflib/extras/describer.py index f0df706756..27780baf90 100644 --- a/rdflib/extras/describer.py +++ b/rdflib/extras/describer.py @@ -5,101 +5,104 @@ The `Describer.rel` and `Describer.rev` methods return a context manager which sets the current about to the referenced resource for the context scope (for use with the -``with`` statement). - -Full example in the ``to_rdf`` method below:: - - >>> import datetime - >>> from rdflib.graph import Graph - >>> from rdflib.namespace import Namespace, RDFS, FOAF - >>> - >>> ORG_URI = "http://example.org/" - >>> - >>> CV = Namespace("http://purl.org/captsolo/resume-rdf/0.2/cv#") - >>> - >>> class Person: - ... def __init__(self): - ... self.first_name = "Some" - ... self.last_name = "Body" - ... self.username = "some1" - ... self.presentation = "Just a Python & RDF hacker." - ... self.image = "/images/persons/" + self.username + ".jpg" - ... self.site = "http://example.net/" - ... self.start_date = datetime.date(2009, 9, 4) - ... def get_full_name(self): - ... return " ".join([self.first_name, self.last_name]) - ... def get_absolute_url(self): - ... return "/persons/" + self.username - ... def get_thumbnail_url(self): - ... return self.image.replace('.jpg', '-thumb.jpg') - ... - ... def to_rdf(self): - ... graph = Graph() - ... graph.bind('foaf', FOAF) - ... graph.bind('cv', CV) - ...
lang = 'en' - ... d = Describer(graph, base=ORG_URI) - ... d.about(self.get_absolute_url()+'#person') - ... d.rdftype(FOAF.Person) - ... d.value(FOAF.name, self.get_full_name()) - ... d.value(FOAF.givenName, self.first_name) - ... d.value(FOAF.familyName, self.last_name) - ... d.rel(FOAF.homepage, self.site) - ... d.value(RDFS.comment, self.presentation, lang=lang) - ... with d.rel(FOAF.depiction, self.image): - ... d.rdftype(FOAF.Image) - ... d.rel(FOAF.thumbnail, self.get_thumbnail_url()) - ... with d.rev(CV.aboutPerson): - ... d.rdftype(CV.CV) - ... with d.rel(CV.hasWorkHistory): - ... d.value(CV.startDate, self.start_date) - ... d.rel(CV.employedIn, ORG_URI+"#company") - ... return graph - ... - >>> person_graph = Person().to_rdf() - >>> expected = Graph().parse(data=''' - ... - ... - ... Some Body - ... Some - ... Body - ... - ... - ... - ... - ... - ... - ... Just a Python & RDF hacker. - ... - ... - ... - ... - ... - ... - ... - ... - ... 2009-09-04 - ... - ... - ... - ... - ... - ... ''', format="xml") - >>> - >>> from rdflib.compare import isomorphic - >>> isomorphic(person_graph, expected) #doctest: +SKIP - True +`with` statement). + +Full example in the `to_rdf` method below: + +```python +>>> import datetime +>>> from rdflib.graph import Graph +>>> from rdflib.namespace import Namespace, RDFS, FOAF + +>>> ORG_URI = "http://example.org/" + +>>> CV = Namespace("http://purl.org/captsolo/resume-rdf/0.2/cv#") + +>>> class Person: +... def __init__(self): +... self.first_name = "Some" +... self.last_name = "Body" +... self.username = "some1" +... self.presentation = "Just a Python & RDF hacker." +... self.image = "/images/persons/" + self.username + ".jpg" +... self.site = "http://example.net/" +... self.start_date = datetime.date(2009, 9, 4) +... def get_full_name(self): +... return " ".join([self.first_name, self.last_name]) +... def get_absolute_url(self): +... return "/persons/" + self.username +... def get_thumbnail_url(self): +... 
return self.image.replace('.jpg', '-thumb.jpg') +... +... def to_rdf(self): +... graph = Graph() +... graph.bind('foaf', FOAF) +... graph.bind('cv', CV) +... lang = 'en' +... d = Describer(graph, base=ORG_URI) +... d.about(self.get_absolute_url()+'#person') +... d.rdftype(FOAF.Person) +... d.value(FOAF.name, self.get_full_name()) +... d.value(FOAF.givenName, self.first_name) +... d.value(FOAF.familyName, self.last_name) +... d.rel(FOAF.homepage, self.site) +... d.value(RDFS.comment, self.presentation, lang=lang) +... with d.rel(FOAF.depiction, self.image): +... d.rdftype(FOAF.Image) +... d.rel(FOAF.thumbnail, self.get_thumbnail_url()) +... with d.rev(CV.aboutPerson): +... d.rdftype(CV.CV) +... with d.rel(CV.hasWorkHistory): +... d.value(CV.startDate, self.start_date) +... d.rel(CV.employedIn, ORG_URI+"#company") +... return graph +... +>>> person_graph = Person().to_rdf() +>>> expected = Graph().parse(data=''' +... +... +... Some Body +... Some +... Body +... +... +... +... +... +... +... Just a Python & RDF hacker. +... +... +... +... +... +... +... +... +... 2009-09-04 +... +... +... +... +... +... ''', format="xml") + +>>> from rdflib.compare import isomorphic +>>> isomorphic(person_graph, expected) #doctest: +SKIP +True + +``` """ from contextlib import contextmanager @@ -121,10 +124,10 @@ def __init__(self, graph=None, about=None, base=None): def about(self, subject, **kws): """ Sets the current subject. Will convert the given object into an - ``URIRef`` if it's not an ``Identifier``. - - Usage:: + `URIRef` if it's not an `Identifier`. + Example: + ```python >>> d = Describer() >>> d._current() #doctest: +ELLIPSIS rdflib.term.BNode(...) 
@@ -132,6 +135,7 @@ def about(self, subject, **kws): >>> d._current() rdflib.term.URIRef('http://example.org/') + ``` """ kws.setdefault("base", self.base) subject = cast_identifier(subject, **kws) @@ -143,10 +147,10 @@ def about(self, subject, **kws): def value(self, p, v, **kws): """ Set a literal value for the given property. Will cast the value to an - ``Literal`` if a plain literal is given. - - Usage:: + `Literal` if a plain literal is given. + Example: + ```python >>> from rdflib import URIRef >>> from rdflib.namespace import RDF, RDFS >>> d = Describer(about="http://example.org/") @@ -154,20 +158,21 @@ def value(self, p, v, **kws): >>> d.graph.value(URIRef('http://example.org/'), RDFS.label) rdflib.term.Literal('Example') + ``` """ v = cast_value(v, **kws) self.graph.add((self._current(), p, v)) def rel(self, p, o=None, **kws): """Set an object for the given property. Will convert the given object - into an ``URIRef`` if it's not an ``Identifier``. If none is given, a - new ``BNode`` is used. + into an `URIRef` if it's not an `Identifier`. If none is given, a + new `BNode` is used. - Returns a context manager for use in a ``with`` block, within which the + Returns a context manager for use in a `with` block, within which the given object is used as current subject. - Usage:: - + Example: + ```python >>> from rdflib import URIRef >>> from rdflib.namespace import RDF, RDFS >>> d = Describer(about="/", base="http://example.org/") @@ -183,6 +188,7 @@ def rel(self, p, o=None, **kws): >>> d.graph.value(URIRef('http://example.org/more'), RDFS.label) rdflib.term.Literal('More') + ``` """ kws.setdefault("base", self.base) @@ -193,12 +199,12 @@ def rel(self, p, o=None, **kws): def rev(self, p, s=None, **kws): """ - Same as ``rel``, but uses current subject as *object* of the relation. + Same as `rel`, but uses current subject as *object* of the relation. The given resource is still used as subject in the returned context manager. 
- Usage:: - + Example: + ```python >>> from rdflib import URIRef >>> from rdflib.namespace import RDF, RDFS >>> d = Describer(about="http://example.org/") @@ -210,6 +216,7 @@ def rev(self, p, s=None, **kws): >>> d.graph.value(URIRef('http://example.net/'), RDFS.label) rdflib.term.Literal('Net') + ``` """ kws.setdefault("base", self.base) p = cast_identifier(p) @@ -218,11 +225,10 @@ def rev(self, p, s=None, **kws): return self._subject_stack(s) def rdftype(self, t): - """ - Shorthand for setting rdf:type of the current subject. - - Usage:: + """Shorthand for setting rdf:type of the current subject. + Example: + ```python >>> from rdflib import URIRef >>> from rdflib.namespace import RDF, RDFS >>> d = Describer(about="http://example.org/") @@ -231,6 +237,7 @@ def rdftype(self, t): ... RDF.type, RDFS.Resource) in d.graph True + ``` """ self.graph.add((self._current(), RDF.type, t)) diff --git a/rdflib/extras/external_graph_libs.py b/rdflib/extras/external_graph_libs.py index 42469778e9..cd7acbb0b8 100644 --- a/rdflib/extras/external_graph_libs.py +++ b/rdflib/extras/external_graph_libs.py @@ -1,12 +1,13 @@ """Convert (to and) from rdflib graphs to other well known graph libraries. Currently the following libraries are supported: + - networkx: MultiDiGraph, DiGraph, Graph - graph_tool: Graph Doctests in this file are all skipped, as we can't run them conditionally if networkx or graph_tool are available and they would err otherwise. -see ../../test/test_extras_external_graph_libs.py for conditional tests +see `../../test/test_extras_external_graph_libs.py` for conditional tests """ from __future__ import annotations @@ -37,16 +38,16 @@ def _rdflib_to_networkx_graph( Modifies nxgraph in-place! - Arguments: + Args: graph: an rdflib.Graph. nxgraph: a networkx.Graph/DiGraph/MultiDigraph. calc_weights: If True adds a 'weight' attribute to each edge according to the count of s,p,o triples between s and o, which is meaningful for Graph/DiGraph. 
edge_attrs: Callable to construct edge data from s, p, o. - 'triples' attribute is handled specially to be merged. - 'weight' should not be generated if calc_weights==True. - (see invokers below!) + 'triples' attribute is handled specially to be merged. + 'weight' should not be generated if calc_weights==True. + (see invokers below!) transform_s: Callable to transform node generated from s. transform_o: Callable to transform node generated from o. """ @@ -81,44 +82,46 @@ def rdflib_to_networkx_multidigraph( The subjects and objects are the later nodes of the MultiDiGraph. The predicates are used as edge keys (to identify multi-edges). - :Parameters: - - - graph: a rdflib.Graph. - - edge_attrs: Callable to construct later edge_attributes. It receives + Args: + graph: a rdflib.Graph. + edge_attrs: Callable to construct later edge_attributes. It receives 3 variables (s, p, o) and should construct a dictionary that is passed to networkx's add_edge(s, o, \*\*attrs) function. By default this will include setting the MultiDiGraph key=p here. If you don't want to be able to re-identify the edge later on, you - can set this to ``lambda s, p, o: {}``. In this case MultiDiGraph's + can set this to `lambda s, p, o: {}`. In this case MultiDiGraph's default (increasing ints) will be used. Returns: networkx.MultiDiGraph - >>> from rdflib import Graph, URIRef, Literal - >>> g = Graph() - >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') - >>> p, q = URIRef('p'), URIRef('q') - >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] - >>> for t in edges: - ... g.add(t) - ... 
- >>> mdg = rdflib_to_networkx_multidigraph(g) - >>> len(mdg.edges()) - 4 - >>> mdg.has_edge(a, b) - True - >>> mdg.has_edge(a, b, key=p) - True - >>> mdg.has_edge(a, b, key=q) - True - - >>> mdg = rdflib_to_networkx_multidigraph(g, edge_attrs=lambda s,p,o: {}) - >>> mdg.has_edge(a, b, key=0) - True - >>> mdg.has_edge(a, b, key=1) - True + Example: + ```python + >>> from rdflib import Graph, URIRef, Literal + >>> g = Graph() + >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') + >>> p, q = URIRef('p'), URIRef('q') + >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] + >>> for t in edges: + ... g.add(t) + ... + >>> mdg = rdflib_to_networkx_multidigraph(g) + >>> len(mdg.edges()) + 4 + >>> mdg.has_edge(a, b) + True + >>> mdg.has_edge(a, b, key=p) + True + >>> mdg.has_edge(a, b, key=q) + True + + >>> mdg = rdflib_to_networkx_multidigraph(g, edge_attrs=lambda s,p,o: {}) + >>> mdg.has_edge(a, b, key=0) + True + >>> mdg.has_edge(a, b, key=1) + True + ``` """ import networkx as nx @@ -140,11 +143,10 @@ def rdflib_to_networkx_digraph( all triples between s and o. Also by default calculates the edge weight as the length of triples. - :Parameters: - - - ``graph``: a rdflib.Graph. - - ``calc_weights``: If true calculate multi-graph edge-count as edge 'weight' - - ``edge_attrs``: Callable to construct later edge_attributes. It receives + Args: + graph: a rdflib.Graph. + calc_weights: If true calculate multi-graph edge-count as edge 'weight' + edge_attrs: Callable to construct later edge_attributes. It receives 3 variables (s, p, o) and should construct a dictionary that is passed to networkx's add_edge(s, o, \*\*attrs) function. @@ -152,36 +154,38 @@ def rdflib_to_networkx_digraph( which is treated specially by us to be merged. Other attributes of multi-edges will only contain the attributes of the first edge. If you don't want the 'triples' attribute for tracking, set this to - ``lambda s, p, o: {}``. + `lambda s, p, o: {}`. 
Returns: networkx.DiGraph - >>> from rdflib import Graph, URIRef, Literal - >>> g = Graph() - >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') - >>> p, q = URIRef('p'), URIRef('q') - >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] - >>> for t in edges: - ... g.add(t) - ... - >>> dg = rdflib_to_networkx_digraph(g) - >>> dg[a][b]['weight'] - 2 - >>> sorted(dg[a][b]['triples']) == [(a, p, b), (a, q, b)] - True - >>> len(dg.edges()) - 3 - >>> dg.size() - 3 - >>> dg.size(weight='weight') - 4.0 - - >>> dg = rdflib_to_networkx_graph(g, False, edge_attrs=lambda s,p,o:{}) - >>> 'weight' in dg[a][b] - False - >>> 'triples' in dg[a][b] - False - + Example: + ```python + >>> from rdflib import Graph, URIRef, Literal + >>> g = Graph() + >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') + >>> p, q = URIRef('p'), URIRef('q') + >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] + >>> for t in edges: + ... g.add(t) + ... + >>> dg = rdflib_to_networkx_digraph(g) + >>> dg[a][b]['weight'] + 2 + >>> sorted(dg[a][b]['triples']) == [(a, p, b), (a, q, b)] + True + >>> len(dg.edges()) + 3 + >>> dg.size() + 3 + >>> dg.size(weight='weight') + 4.0 + + >>> dg = rdflib_to_networkx_graph(g, False, edge_attrs=lambda s,p,o:{}) + >>> 'weight' in dg[a][b] + False + >>> 'triples' in dg[a][b] + False + ``` """ import networkx as nx @@ -198,53 +202,54 @@ def rdflib_to_networkx_graph( ): r"""Converts the given graph into a networkx.Graph. - As an rdflib.Graph() can contain multiple directed edges between nodes, by - default adds the a 'triples' attribute to the single DiGraph edge with a - list of triples between s and o in graph. - Also by default calculates the edge weight as the len(triples). - - :Parameters: + As an [`rdflib.Graph()`][rdflib.Graph] can contain multiple directed edges between nodes, by + default adds a 'triples' attribute to the single DiGraph edge with a list of triples between s and o in graph.
+ Also by default calculates the edge weight as the `len(triples)`. - - graph: a rdflib.Graph. - - calc_weights: If true calculate multi-graph edge-count as edge 'weight' - - edge_attrs: Callable to construct later edge_attributes. It receives - 3 variables (s, p, o) and should construct a dictionary that is - passed to networkx's add_edge(s, o, \*\*attrs) function. + Args: + graph: a rdflib.Graph. + calc_weights: If true calculate multi-graph edge-count as edge 'weight' + edge_attrs: Callable to construct later edge_attributes. It receives + 3 variables (s, p, o) and should construct a dictionary that is + passed to networkx's add_edge(s, o, \*\*attrs) function. - By default this will include setting the 'triples' attribute here, - which is treated specially by us to be merged. Other attributes of - multi-edges will only contain the attributes of the first edge. - If you don't want the 'triples' attribute for tracking, set this to - ``lambda s, p, o: {}``. + By default this will include setting the 'triples' attribute here, + which is treated specially by us to be merged. Other attributes of + multi-edges will only contain the attributes of the first edge. + If you don't want the 'triples' attribute for tracking, set this to + `lambda s, p, o: {}`. Returns: networkx.Graph - >>> from rdflib import Graph, URIRef, Literal - >>> g = Graph() - >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') - >>> p, q = URIRef('p'), URIRef('q') - >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] - >>> for t in edges: - ... g.add(t) - ... 
- >>> ug = rdflib_to_networkx_graph(g) - >>> ug[a][b]['weight'] - 3 - >>> sorted(ug[a][b]['triples']) == [(a, p, b), (a, q, b), (b, p, a)] - True - >>> len(ug.edges()) - 2 - >>> ug.size() - 2 - >>> ug.size(weight='weight') - 4.0 - - >>> ug = rdflib_to_networkx_graph(g, False, edge_attrs=lambda s,p,o:{}) - >>> 'weight' in ug[a][b] - False - >>> 'triples' in ug[a][b] - False + Example: + ```python + >>> from rdflib import Graph, URIRef, Literal + >>> g = Graph() + >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') + >>> p, q = URIRef('p'), URIRef('q') + >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] + >>> for t in edges: + ... g.add(t) + ... + >>> ug = rdflib_to_networkx_graph(g) + >>> ug[a][b]['weight'] + 3 + >>> sorted(ug[a][b]['triples']) == [(a, p, b), (a, q, b), (b, p, a)] + True + >>> len(ug.edges()) + 2 + >>> ug.size() + 2 + >>> ug.size(weight='weight') + 4.0 + + >>> ug = rdflib_to_networkx_graph(g, False, edge_attrs=lambda s,p,o:{}) + >>> 'weight' in ug[a][b] + False + >>> 'triples' in ug[a][b] + False + ``` """ import networkx as nx @@ -266,56 +271,58 @@ def rdflib_to_graphtool( The subjects and objects are the later vertices of the Graph. The predicates become edges. - :Parameters: - - graph: a rdflib.Graph. - - v_prop_names: a list of names for the vertex properties. The default is set - to ['term'] (see transform_s, transform_o below). - - e_prop_names: a list of names for the edge properties. - - transform_s: callable with s, p, o input. Should return a dictionary - containing a value for each name in v_prop_names. By default is set - to {'term': s} which in combination with v_prop_names = ['term'] - adds s as 'term' property to the generated vertex for s. - - transform_p: similar to transform_s, but wrt. e_prop_names. By default - returns {'term': p} which adds p as a property to the generated - edge between the vertex for s and the vertex for o. - - transform_o: similar to transform_s. + Args: + graph: a rdflib.Graph. 
+ v_prop_names: a list of names for the vertex properties. The default is set + to ['term'] (see transform_s, transform_o below). + e_prop_names: a list of names for the edge properties. + transform_s: callable with s, p, o input. Should return a dictionary + containing a value for each name in v_prop_names. By default is set + to {'term': s} which in combination with v_prop_names = ['term'] + adds s as 'term' property to the generated vertex for s. + transform_p: similar to transform_s, but wrt. e_prop_names. By default + returns {'term': p} which adds p as a property to the generated + edge between the vertex for s and the vertex for o. + transform_o: similar to transform_s. Returns: graph_tool.Graph() - >>> from rdflib import Graph, URIRef, Literal - >>> g = Graph() - >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') - >>> p, q = URIRef('p'), URIRef('q') - >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] - >>> for t in edges: - ... g.add(t) - ... - >>> mdg = rdflib_to_graphtool(g) - >>> len(list(mdg.edges())) - 4 - >>> from graph_tool import util as gt_util - >>> vpterm = mdg.vertex_properties['term'] - >>> va = gt_util.find_vertex(mdg, vpterm, a)[0] - >>> vb = gt_util.find_vertex(mdg, vpterm, b)[0] - >>> vl = gt_util.find_vertex(mdg, vpterm, l)[0] - >>> (va, vb) in [(e.source(), e.target()) for e in list(mdg.edges())] - True - >>> epterm = mdg.edge_properties['term'] - >>> len(list(gt_util.find_edge(mdg, epterm, p))) == 3 - True - >>> len(list(gt_util.find_edge(mdg, epterm, q))) == 1 - True - - >>> mdg = rdflib_to_graphtool( - ... g, - ... e_prop_names=[str('name')], - ... 
transform_p=lambda s, p, o: {str('name'): unicode(p)}) - >>> epterm = mdg.edge_properties['name'] - >>> len(list(gt_util.find_edge(mdg, epterm, unicode(p)))) == 3 - True - >>> len(list(gt_util.find_edge(mdg, epterm, unicode(q)))) == 1 - True - + Example: + ```python + >>> from rdflib import Graph, URIRef, Literal + >>> g = Graph() + >>> a, b, l = URIRef('a'), URIRef('b'), Literal('l') + >>> p, q = URIRef('p'), URIRef('q') + >>> edges = [(a, p, b), (a, q, b), (b, p, a), (b, p, l)] + >>> for t in edges: + ... g.add(t) + ... + >>> mdg = rdflib_to_graphtool(g) + >>> len(list(mdg.edges())) + 4 + >>> from graph_tool import util as gt_util + >>> vpterm = mdg.vertex_properties['term'] + >>> va = gt_util.find_vertex(mdg, vpterm, a)[0] + >>> vb = gt_util.find_vertex(mdg, vpterm, b)[0] + >>> vl = gt_util.find_vertex(mdg, vpterm, l)[0] + >>> (va, vb) in [(e.source(), e.target()) for e in list(mdg.edges())] + True + >>> epterm = mdg.edge_properties['term'] + >>> len(list(gt_util.find_edge(mdg, epterm, p))) == 3 + True + >>> len(list(gt_util.find_edge(mdg, epterm, q))) == 1 + True + + >>> mdg = rdflib_to_graphtool( + ... g, + ... e_prop_names=[str('name')], + ... transform_p=lambda s, p, o: {str('name'): unicode(p)}) + >>> epterm = mdg.edge_properties['name'] + >>> len(list(gt_util.find_edge(mdg, epterm, unicode(p)))) == 3 + True + >>> len(list(gt_util.find_edge(mdg, epterm, unicode(q)))) == 1 + True + ``` """ # pytype error: Can't find module 'graph_tool'. 
import graph_tool as gt # pytype: disable=import-error diff --git a/rdflib/extras/infixowl.py b/rdflib/extras/infixowl.py index b80fb0c16e..bb60de3834 100644 --- a/rdflib/extras/infixowl.py +++ b/rdflib/extras/infixowl.py @@ -1,58 +1,75 @@ """RDFLib Python binding for OWL Abstract Syntax -OWL Constructor DL Syntax Manchester OWL Syntax Example -==================================================================================== -intersectionOf C ∩ D C AND D Human AND Male -unionOf C ∪ D C OR D Man OR Woman -complementOf ¬ C NOT C NOT Male -oneOf {a} ∪ {b}... {a b ...} {England Italy Spain} -someValuesFrom ∃ R C R SOME C hasColleague SOME Professor -allValuesFrom ∀ R C R ONLY C hasColleague ONLY Professor -minCardinality ≥ N R R MIN 3 hasColleague MIN 3 -maxCardinality ≤ N R R MAX 3 hasColleague MAX 3 -cardinality = N R R EXACTLY 3 hasColleague EXACTLY 3 -hasValue ∃ R {a} R VALUE a hasColleague VALUE Matthew - -see: http://www.w3.org/TR/owl-semantics/syntax.html - http://owl-workshop.man.ac.uk/acceptedLong/submission_9.pdf +| OWL Constructor | DL Syntax | Manchester OWL Syntax | Example | +|------------------|---------------|------------------------|----------------------------------| +| `intersectionOf` | C ∩ D | C AND D | Human AND Male | +| `unionOf` | C ∪ D | C OR D | Man OR Woman | +| `complementOf` | ¬C | NOT C | NOT Male | +| `oneOf` | {a} ∪ {b}... 
| {a b ...} | {England Italy Spain} | +| `someValuesFrom` | ∃ R C | R SOME C | hasColleague SOME Professor | +| `allValuesFrom` | ∀ R C | R ONLY C | hasColleague ONLY Professor | +| `minCardinality` | ≥ N R | R MIN 3 | hasColleague MIN 3 | +| `maxCardinality` | ≤ N R | R MAX 3 | hasColleague MAX 3 | +| `cardinality` | = N R | R EXACTLY 3 | hasColleague EXACTLY 3 | +| `hasValue` | ∃ R.{a} | R VALUE a | hasColleague VALUE Matthew | + +See: +- http://www.w3.org/TR/owl-semantics/syntax.html +- http://owl-workshop.man.ac.uk/acceptedLong/submission_9.pdf 3.2.3 Axioms for complete classes without using owl:equivalentClass Named class description of type 2 (with owl:oneOf) or type 4-6 (with owl:intersectionOf, owl:unionOf or owl:complementOf -Uses Manchester Syntax for __repr__ +Uses Manchester Syntax for `__repr__` +```python >>> exNs = Namespace("http://example.com/") >>> g = Graph() >>> g.bind("ex", exNs, override=False) +``` + Now we have an empty graph, we can construct OWL classes in it using the Python classes defined in this module +```python >>> a = Class(exNs.Opera, graph=g) +``` + Now we can assert rdfs:subClassOf and owl:equivalentClass relationships (in the underlying graph) with other classes using the 'subClassOf' and 'equivalentClass' descriptors which can be set to a list of objects for the corresponding predicates. +```python >>> a.subClassOf = [exNs.MusicalWork] +``` + We can then access the rdfs:subClassOf relationships +```python >>> print(list(a.subClassOf)) [Class: ex:MusicalWork ] +``` + This can also be used against already populated graphs: +```python >>> owlGraph = Graph().parse(str(OWL)) >>> list(Class(OWL.Class, graph=owlGraph).subClassOf) [Class: rdfs:Class ] +``` + Operators are also available. 
For instance we can add ex:Opera to the extension of the ex:CreativeWork class via the '+=' operator +```python >>> a Class: ex:Opera SubClassOf: ex:MusicalWork >>> b = Class(exNs.CreativeWork, graph=g) @@ -60,29 +77,41 @@ >>> print(sorted(a.subClassOf, key=lambda c:c.identifier)) [Class: ex:CreativeWork , Class: ex:MusicalWork ] +``` + And we can then remove it from the extension as well +```python >>> b -= a >>> a Class: ex:Opera SubClassOf: ex:MusicalWork +``` + Boolean class constructions can also be created with Python operators. For example, The | operator can be used to construct a class consisting of a owl:unionOf the operands: +```python >>> c = a | b | Class(exNs.Work, graph=g) >>> c ( ex:Opera OR ex:CreativeWork OR ex:Work ) +``` + Boolean class expressions can also be operated as lists (using python list operators) +```python >>> del c[c.index(Class(exNs.Work, graph=g))] >>> c ( ex:Opera OR ex:CreativeWork ) +``` + The '&' operator can be used to construct class intersection: +```python >>> woman = Class(exNs.Female, graph=g) & Class(exNs.Human, graph=g) >>> woman.identifier = exNs.Woman >>> woman @@ -90,27 +119,34 @@ >>> len(woman) 2 +``` + Enumerated classes can also be manipulated +```python >>> contList = [Class(exNs.Africa, graph=g), Class(exNs.NorthAmerica, graph=g)] >>> EnumeratedClass(members=contList, graph=g) { ex:Africa ex:NorthAmerica } +``` + owl:Restrictions can also be instantiated: +```python >>> Restriction(exNs.hasParent, graph=g, allValuesFrom=exNs.Human) ( ex:hasParent ONLY ex:Human ) -Restrictions can also be created using Manchester OWL syntax in 'colloquial' -Python +``` + +Restrictions can also be created using Manchester OWL syntax in 'colloquial' Python + +```python >>> exNs.hasParent @ some @ Class(exNs.Physician, graph=g) ( ex:hasParent SOME ex:Physician ) - >>> Property(exNs.hasParent, graph=g) @ max @ Literal(1) ( ex:hasParent MAX 1 ) - >>> print(g.serialize(format='pretty-xml')) # doctest: +SKIP - +``` """ from __future__ 
import annotations @@ -134,7 +170,6 @@ Python has the wonderful "in" operator and it would be nice to have additional infix operator like this. This recipe shows how (almost) arbitrary infix operators can be defined. - """ __all__ = [ @@ -368,7 +403,6 @@ def _remover(inst): class Individual: """ A typed individual, the base class of the InfixOWL classes. - """ factoryGraph = Graph() # noqa: N815 @@ -422,6 +456,7 @@ def replace(self, other): causing all triples that refer to it to be changed and then delete the individual. + ```python >>> g = Graph() >>> b = Individual(OWL.Restriction, g) >>> b.type = RDFS.Resource @@ -430,6 +465,8 @@ def replace(self, other): >>> del b.type >>> len(list(b.type)) 0 + + ``` """ for s, p, _o in self.graph.triples((None, None, self.identifier)): self.graph.add((s, p, classOrIdentifier(other))) @@ -452,6 +489,7 @@ def _set_type(self, kind: Union[Individual, Identifier, Iterable[_ObjectType]]): @TermDeletionHelper(RDF.type) def _delete_type(self): """ + ```python >>> g = Graph() >>> b = Individual(OWL.Restriction, g) >>> b.type = RDFS.Resource @@ -460,6 +498,8 @@ def _delete_type(self): >>> del b.type >>> len(list(b.type)) 0 + + ``` """ pass # pragma: no cover @@ -522,15 +562,13 @@ def _delete_sameAs(self): # noqa: N802 class AnnotatableTerms(Individual): - """ - Terms in an OWL ontology with rdfs:label and rdfs:comment - + """Terms in an OWL ontology with rdfs:label and rdfs:comment - ## Interface with ATTEMPTO (http://attempto.ifi.uzh.ch/site) + Interface with ATTEMPTO (http://attempto.ifi.uzh.ch/site) - ### Verbalisation of OWL entity IRIS + ## Verbalisation of OWL entity IRIS - #### How are OWL entity IRIs verbalized? + ### How are OWL entity IRIs verbalized? The OWL verbalizer maps OWL entity IRIs to ACE content words such that @@ -565,34 +603,33 @@ class AnnotatableTerms(Individual): It is possible to specify the mapping of IRIs to surface forms using the following annotation properties: - .. 
code-block:: none - - http://attempto.ifi.uzh.ch/ace_lexicon#PN_sg - http://attempto.ifi.uzh.ch/ace_lexicon#CN_sg - http://attempto.ifi.uzh.ch/ace_lexicon#CN_pl - http://attempto.ifi.uzh.ch/ace_lexicon#TV_sg - http://attempto.ifi.uzh.ch/ace_lexicon#TV_pl - http://attempto.ifi.uzh.ch/ace_lexicon#TV_vbg + ``` + http://attempto.ifi.uzh.ch/ace_lexicon#PN_sg + http://attempto.ifi.uzh.ch/ace_lexicon#CN_sg + http://attempto.ifi.uzh.ch/ace_lexicon#CN_pl + http://attempto.ifi.uzh.ch/ace_lexicon#TV_sg + http://attempto.ifi.uzh.ch/ace_lexicon#TV_pl + http://attempto.ifi.uzh.ch/ace_lexicon#TV_vbg + ``` For example, the following axioms state that if the IRI "#man" is used as a plural common noun, then the wordform men must be used by the verbalizer. If, however, it is used as a singular transitive verb, then mans must be used. - .. code-block:: none - - - - #man - men - - - - - #man - mans - - + ```xml + + + #man + men + + + + + #man + mans + + ``` """ def __init__( @@ -702,6 +739,7 @@ def _set_label(self, label): @TermDeletionHelper(RDFS.label) def _delete_label(self): """ + ```python >>> g = Graph() >>> b = Individual(OWL.Restriction,g) >>> b.label = Literal('boo') @@ -710,6 +748,8 @@ def _delete_label(self): >>> del b.label >>> len(list(b.label)) 0 + + ``` """ pass # pragma: no cover @@ -869,6 +909,7 @@ def DeepClassClear(class_to_prune): # noqa: N802 Recursively clear the given class, continuing where any related class is an anonymous class + ```python >>> EX = Namespace("http://example.com/") >>> g = Graph() >>> g.bind("ex", EX, override=False) @@ -905,6 +946,8 @@ def DeepClassClear(class_to_prune): # noqa: N802 >>> otherClass.delete() >>> list(g.triples((otherClass.identifier, None, None))) [] + + ``` """ def deepClearIfBNode(_class): # noqa: N802 @@ -933,8 +976,8 @@ def deepClearIfBNode(_class): # noqa: N802 class MalformedClass(ValueError): # noqa: N818 """ - .. deprecated:: TODO-NEXT-VERSION - This class will be removed in version ``7.0.0``. + !!! 
warning "Deprecated" + This class will be removed in version `7.0.0`. """ pass @@ -983,19 +1026,20 @@ def CastClass(c, graph=None): # noqa: N802 class Class(AnnotatableTerms): - """ - 'General form' for classes: + """'General form' for classes: The Manchester Syntax (supported in Protege) is used as the basis for the form of this class See: http://owl-workshop.man.ac.uk/acceptedLong/submission_9.pdf: + ``` [Annotation] ‘Class:’ classID {Annotation ( (‘SubClassOf:’ ClassExpression) | (‘EquivalentTo’ ClassExpression) | (’DisjointWith’ ClassExpression)) } + ``` Appropriate excerpts from OWL Reference: @@ -1009,7 +1053,6 @@ class Class(AnnotatableTerms): "..An owl:complementOf property links a class to precisely one class description." - """ def _serialize(self, graph): @@ -1152,6 +1195,7 @@ def __and__(self, other): Chaining 3 intersections + ```python >>> exNs = Namespace("http://example.com/") >>> g = Graph() >>> g.bind("ex", exNs, override=False) @@ -1165,6 +1209,8 @@ def __and__(self, other): True >>> isinstance(youngWoman.identifier, BNode) True + + ``` """ return BooleanClass( operator=OWL.intersectionOf, members=[self, other], graph=self.graph @@ -1261,6 +1307,7 @@ def _get_parents(self): computed attributes that returns a generator over taxonomic 'parents' by disjunction, conjunction, and subsumption + ```python >>> from rdflib.util import first >>> exNs = Namespace('http://example.com/') >>> g = Graph() @@ -1281,6 +1328,7 @@ def _get_parents(self): >>> list(father.parents) [Class: ex:Parent , Class: ex:Male ] + ``` """ for parent in itertools.chain(self.subClassOf, self.equivalentClass): yield parent @@ -1494,14 +1542,14 @@ def __iadd__(self, other): class EnumeratedClass(OWLRDFListProxy, Class): - """ - Class for owl:oneOf forms: + """Class for owl:oneOf forms: OWL Abstract Syntax is used axiom ::= 'EnumeratedClass(' classID ['Deprecated'] { annotation } { individualID } ')' + ```python >>> exNs = Namespace("http://example.com/") >>> g = Graph() >>> 
g.bind("ex", exNs, override=False) @@ -1525,6 +1573,8 @@ class EnumeratedClass(OWLRDFListProxy, Class): owl:oneOf ( ex:chime ex:uche ex:ejike ) . + + ``` """ _operator = OWL.oneOf @@ -1564,6 +1614,7 @@ def serialize(self, graph): class BooleanClassExtentHelper: """ + ```python >>> testGraph = Graph() >>> Individual.factoryGraph = testGraph >>> EX = Namespace("http://example.com/") @@ -1579,6 +1630,8 @@ class BooleanClassExtentHelper: >>> for c in BooleanClass.getUnions(): ... print(c) #doctest: +SKIP ( ex:Fire OR ex:Water ) + + ``` """ def __init__(self, operator): @@ -1605,7 +1658,6 @@ class BooleanClass(OWLRDFListProxy, Class): See: http://www.w3.org/TR/owl-ref/#Boolean owl:complementOf is an attribute of Class, however - """ @BooleanClassExtentHelper(OWL.intersectionOf) @@ -1672,6 +1724,7 @@ def changeOperator(self, newOperator): # noqa: N802, N803 Converts a unionOf / intersectionOf class expression into one that instead uses the given operator + ```python >>> testGraph = Graph() >>> Individual.factoryGraph = testGraph >>> EX = Namespace("http://example.com/") @@ -1690,6 +1743,7 @@ def changeOperator(self, newOperator): # noqa: N802, N803 ... print(e) # doctest: +SKIP The new operator is already being used! + ``` """ assert newOperator != self._operator, "The new operator is already being used!" 
self.graph.remove((self.identifier, self._operator, self._rdfList.uri)) @@ -1720,20 +1774,20 @@ def AllDifferent(members): # noqa: N802 TODO: implement this function DisjointClasses(' description description { description } ')' - """ pass # pragma: no cover class Restriction(Class): """ + ``` restriction ::= 'restriction(' datavaluedPropertyID dataRestrictionComponent { dataRestrictionComponent } ')' | 'restriction(' individualvaluedPropertyID individualRestrictionComponent { individualRestrictionComponent } ')' - + ``` """ restrictionKinds = [ # noqa: N815 @@ -1810,6 +1864,7 @@ def __init__( def serialize(self, graph): """ + ```python >>> g1 = Graph() >>> g2 = Graph() >>> EX = Namespace("http://example.com/") @@ -1829,6 +1884,8 @@ def serialize(self, graph): ... ) #doctest: +NORMALIZE_WHITESPACE +SKIP [rdflib.term.URIRef( 'http://www.w3.org/2002/07/owl#DatatypeProperty')] + + ``` """ Property(self.onProperty, graph=self.graph, baseType=None).serialize(graph) for s, p, o in self.graph.triples((self.identifier, None, None)): @@ -2061,6 +2118,7 @@ def __repr__(self): class Property(AnnotatableTerms): """ + ``` axiom ::= 'DatatypeProperty(' datavaluedPropertyID ['Deprecated'] { annotation } { 'super(' datavaluedPropertyID ')'} ['Functional'] @@ -2073,25 +2131,21 @@ class Property(AnnotatableTerms): 'Functional' 'InverseFunctional' | 'Transitive' ] { 'domain(' description ')' } { 'range(' description ')' } ') - + ``` """ def setupVerbAnnotations(self, verb_annotations): # noqa: N802 - """ - - OWL properties map to ACE transitive verbs (TV) + """OWL properties map to ACE transitive verbs (TV) There are 6 morphological categories that determine the surface form of an IRI: - singular form of a transitive verb (e.g. mans) - plural form of a transitive verb (e.g. man) - past participle form a transitive verb (e.g. 
manned)
-
-    http://attempto.ifi.uzh.ch/ace_lexicon#TV_sg
-    http://attempto.ifi.uzh.ch/ace_lexicon#TV_pl
-    http://attempto.ifi.uzh.ch/ace_lexicon#TV_vbg
-
+        - singular form of a transitive verb (e.g. mans)
+        - plural form of a transitive verb (e.g. man)
+        - past participle form of a transitive verb (e.g. manned)
+        - http://attempto.ifi.uzh.ch/ace_lexicon#TV_sg
+        - http://attempto.ifi.uzh.ch/ace_lexicon#TV_pl
+        - http://attempto.ifi.uzh.ch/ace_lexicon#TV_vbg
         """
         if isinstance(verb_annotations, tuple):
diff --git a/rdflib/extras/shacl.py b/rdflib/extras/shacl.py
index 1a5094ce32..925d6fee41 100644
--- a/rdflib/extras/shacl.py
+++ b/rdflib/extras/shacl.py
@@ -36,12 +36,15 @@ def parse_shacl_path(
 ) -> Union[URIRef, Path]:
     """
     Parse a valid SHACL path (e.g. the object of a triple with predicate sh:path)
-    from a :class:`~rdflib.graph.Graph` as a :class:`~rdflib.term.URIRef` if the path
-    is simply a predicate or a :class:`~rdflib.paths.Path` otherwise.
+    from a [`Graph`][rdflib.graph.Graph] as a [`URIRef`][rdflib.term.URIRef] if the path
+    is simply a predicate or a [`Path`][rdflib.paths.Path] otherwise.

-    :param shapes_graph: A :class:`~rdflib.graph.Graph` containing the path to be parsed
-    :param path_identifier: A :class:`~rdflib.term.Node` of the path
-    :return: A :class:`~rdflib.term.URIRef` or a :class:`~rdflib.paths.Path`
+    Args:
+        shapes_graph: A [`Graph`][rdflib.graph.Graph] containing the path to be parsed
+        path_identifier: A [`Node`][rdflib.term.Node] of the path
+
+    Returns:
+        A [`URIRef`][rdflib.term.URIRef] or a [`Path`][rdflib.paths.Path]
     """

     path: Optional[Union[URIRef, Path]] = None
@@ -112,11 +115,14 @@ def _build_path_component(
     Helper method that implements the recursive component of SHACL path
     triple construction.
- :param graph: A :class:`~rdflib.graph.Graph` into which to insert triples - :param graph_component: A :class:`~rdflib.term.URIRef` or - :class:`~rdflib.paths.Path` that is part of a path expression - :return: The :class:`~rdflib.term.IdentifiedNode of the resource in the - graph that corresponds to the provided path_component + Args: + graph: A [`Graph`][rdflib.graph.Graph] into which to insert triples + graph_component: A [`URIRef`][rdflib.term.URIRef] or + [`Path`][rdflib.paths.Path] that is part of a path expression + + Returns: + The [`IdentifiedNode`][rdflib.term.IdentifiedNode] of the resource in the + graph that corresponds to the provided path_component """ # Literals or other types are not allowed if not isinstance(path_component, (URIRef, Path)): @@ -181,24 +187,27 @@ def build_shacl_path( path: URIRef | Path, target_graph: Graph | None = None ) -> tuple[IdentifiedNode, Graph | None]: """ - Build the SHACL Path triples for a path given by a :class:`~rdflib.term.URIRef` for - simple paths or a :class:`~rdflib.paths.Path` for complex paths. + Build the SHACL Path triples for a path given by a [`URIRef`][rdflib.term.URIRef] for + simple paths or a [`Path`][rdflib.paths.Path] for complex paths. - Returns an :class:`~rdflib.term.IdentifiedNode` for the path (which should be - the object of a triple with predicate sh:path) and the graph into which any + Returns an [`IdentifiedNode`][rdflib.term.IdentifiedNode] for the path (which should be + the object of a triple with predicate `sh:path`) and the graph into which any new triples were added. - :param path: A :class:`~rdflib.term.URIRef` or a :class:`~rdflib.paths.Path` - :param target_graph: Optionally, a :class:`~rdflib.graph.Graph` into which to put - constructed triples. If not provided, a new graph will be created - :return: A (path_identifier, graph) tuple where: - - path_identifier: If path is a :class:`~rdflib.term.URIRef`, this is simply - the provided path. 
If path is a :class:`~rdflib.paths.Path`, this is - the :class:`~rdflib.term.BNode` corresponding to the root of the SHACL - path expression added to the graph. - - graph: None if path is a :class:`~rdflib.term.URIRef` (as no new triples - are constructed). If path is a :class:`~rdflib.paths.Path`, this is either the - target_graph provided or a new graph into which the path triples were added. + Args: + path: A [`URIRef`][rdflib.term.URIRef] or a [`Path`][rdflib.paths.Path] + target_graph: Optionally, a [`Graph`][rdflib.graph.Graph] into which to put + constructed triples. If not provided, a new graph will be created + + Returns: + A (path_identifier, graph) tuple where: + - path_identifier: If path is a [`URIRef`][rdflib.term.URIRef], this is simply + the provided path. If path is a [`Path`][rdflib.paths.Path], this is + the [`BNode`][rdflib.term.BNode] corresponding to the root of the SHACL + path expression added to the graph. + - graph: None if path is a [`URIRef`][rdflib.term.URIRef] (as no new triples + are constructed). If path is a [`Path`][rdflib.paths.Path], this is either the + target_graph provided or a new graph into which the path triples were added. """ # If a path is a URI, that's the whole path. No graph needs to be constructed. if isinstance(path, URIRef): diff --git a/rdflib/graph.py b/rdflib/graph.py index bef31e2b6e..20d6e88400 100644 --- a/rdflib/graph.py +++ b/rdflib/graph.py @@ -1,40 +1,36 @@ -"""\ - +""" RDFLib defines the following kinds of Graphs: -* :class:`~rdflib.graph.Graph` -* :class:`~rdflib.graph.QuotedGraph` -* :class:`~rdflib.graph.ConjunctiveGraph` -* :class:`~rdflib.graph.Dataset` +* [`Graph`][rdflib.graph.Graph] +* [`QuotedGraph`][rdflib.graph.QuotedGraph] +* [`ConjunctiveGraph`][rdflib.graph.ConjunctiveGraph] +* [`Dataset`][rdflib.graph.Dataset] -Graph ------ +## Graph -An RDF graph is a set of RDF triples. Graphs support the python ``in`` +An RDF graph is a set of RDF triples. 
Graphs support the python `in` operator, as well as iteration and some operations like union, difference and intersection. -see :class:`~rdflib.graph.Graph` +See [`Graph`][rdflib.graph.Graph] -Conjunctive Graph ------------------ +## Conjunctive Graph -.. warning:: - ConjunctiveGraph is deprecated, use :class:`~rdflib.graph.Dataset` instead. +!!! warning "Deprecation notice" + `ConjunctiveGraph` is deprecated, use [`Dataset`][rdflib.graph.Dataset] instead. A Conjunctive Graph is the most relevant collection of graphs that are -considered to be the boundary for closed world assumptions. This +considered to be the boundary for closed world assumptions. This boundary is equivalent to that of the store instance (which is itself uniquely identified and distinct from other instances of -:class:`~rdflib.store.Store` that signify other Conjunctive Graphs). It is +[`Store`][rdflib.store.Store] that signify other Conjunctive Graphs). It is equivalent to all the named graphs within it and associated with a -``_default_`` graph which is automatically assigned a -:class:`~rdflib.term.BNode` for an identifier - if one isn't given. +`_default_` graph which is automatically assigned a +[`BNode`][rdflib.term.BNode] for an identifier - if one isn't given. -see :class:`~rdflib.graph.ConjunctiveGraph` +See [`ConjunctiveGraph`][rdflib.graph.ConjunctiveGraph] -Quoted graph ------------- +## Quoted graph The notion of an RDF graph [14] is extended to include the concept of a formula node. A formula node may occur wherever any other kind of @@ -48,10 +44,9 @@ This is intended to map the idea of "{ N3-expression }" that is used by N3 into an RDF graph upon which RDF semantics is defined. -see :class:`~rdflib.graph.QuotedGraph` +See [`QuotedGraph`][rdflib.graph.QuotedGraph] -Dataset -------- +## Dataset The RDF 1.1 Dataset, a small extension to the Conjunctive Graph. The primary term is "graphs in the datasets" and not "contexts with quads" @@ -62,73 +57,84 @@ at creation time). 
This implementation includes a convenience method to directly add a
single quad to a dataset graph.

-see :class:`~rdflib.graph.Dataset`
+See [`Dataset`][rdflib.graph.Dataset]

-Working with graphs
-===================
+## Working with graphs

 Instantiating Graphs with default store (Memory) and default identifier
 (a BNode):

-    >>> g = Graph()
-    >>> g.store.__class__
-    <class 'rdflib.plugins.stores.memory.Memory'>
-    >>> g.identifier.__class__
-    <class 'rdflib.term.BNode'>
+```python
+>>> g = Graph()
+>>> g.store.__class__
+<class 'rdflib.plugins.stores.memory.Memory'>
+>>> g.identifier.__class__
+<class 'rdflib.term.BNode'>
+
+```

 Instantiating Graphs with a Memory store and an identifier -
 <https://rdflib.github.io>:

-    >>> g = Graph('Memory', URIRef("https://rdflib.github.io"))
-    >>> g.identifier
-    rdflib.term.URIRef('https://rdflib.github.io')
-    >>> str(g) # doctest: +NORMALIZE_WHITESPACE
-    "<https://rdflib.github.io> a rdfg:Graph;rdflib:storage
-     [a rdflib:Store;rdfs:label 'Memory']."
+```python
+>>> g = Graph('Memory', URIRef("https://rdflib.github.io"))
+>>> g.identifier
+rdflib.term.URIRef('https://rdflib.github.io')
+>>> str(g) # doctest: +NORMALIZE_WHITESPACE
+"<https://rdflib.github.io> a rdfg:Graph;rdflib:storage
+ [a rdflib:Store;rdfs:label 'Memory']."
+
+```

 Creating a ConjunctiveGraph - The top level container for all named Graphs
 in a "database":

-    >>> g = ConjunctiveGraph()
-    >>> str(g.default_context)
-    "[a rdfg:Graph;rdflib:storage [a rdflib:Store;rdfs:label 'Memory']]."
+```python
+>>> g = ConjunctiveGraph()
+>>> str(g.default_context)
+"[a rdfg:Graph;rdflib:storage [a rdflib:Store;rdfs:label 'Memory']]."
+
+```

 Adding / removing reified triples to Graph and iterating over it directly or
 via triple pattern:

-    >>> g = Graph()
-    >>> statementId = BNode()
-    >>> print(len(g))
-    0
-    >>> g.add((statementId, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
-    >>> g.add((statementId, RDF.subject,
-    ...
URIRef("https://rdflib.github.io/store/ConjunctiveGraph"))) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
-    >>> g.add((statementId, RDF.predicate, namespace.RDFS.label)) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
-    >>> g.add((statementId, RDF.object, Literal("Conjunctive Graph"))) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
-    >>> print(len(g))
-    4
-    >>> for s, p, o in g:
-    ...     print(type(s))
-    ...
-    <class 'rdflib.term.BNode'>
-    <class 'rdflib.term.BNode'>
-    <class 'rdflib.term.BNode'>
-    <class 'rdflib.term.BNode'>
-    >>> for s, p, o in g.triples((None, RDF.object, None)):
-    ...     print(o)
-    ...
-    Conjunctive Graph
-    >>> g.remove((statementId, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
-    >>> print(len(g))
-    3
+```python
+>>> g = Graph()
+>>> statementId = BNode()
+>>> print(len(g))
+0
+>>> g.add((statementId, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> g.add((statementId, RDF.subject,
+...    URIRef("https://rdflib.github.io/store/ConjunctiveGraph"))) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> g.add((statementId, RDF.predicate, namespace.RDFS.label)) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> g.add((statementId, RDF.object, Literal("Conjunctive Graph"))) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> print(len(g))
+4
+>>> for s, p, o in g:
+...     print(type(s))
+...
+<class 'rdflib.term.BNode'>
+<class 'rdflib.term.BNode'>
+<class 'rdflib.term.BNode'>
+<class 'rdflib.term.BNode'>
+>>> for s, p, o in g.triples((None, RDF.object, None)):
+...     print(o)
+...
+Conjunctive Graph
+>>> g.remove((statementId, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> print(len(g))
+3
+
+```

-``None`` terms in calls to :meth:`~rdflib.graph.Graph.triples` can be
+`None` terms in calls to [`triples()`][rdflib.graph.Graph.triples] can be
 thought of as "open variables".

 Graph support set-theoretic operators, you can add/subtract graphs, as
@@ -138,113 +144,126 @@
 well as intersect them.

 Note that BNode IDs are kept when doing set-theoretic operations, this
 may or may not be what you want. Two named graphs within the same
 application probably want share BNode IDs, two graphs with data from
-different sources probably not. If your BNode IDs are all generated
+different sources probably not.  If your BNode IDs are all generated
 by RDFLib they are UUIDs and unique.
- >>> g1 = Graph() - >>> g2 = Graph() - >>> u = URIRef("http://example.com/foo") - >>> g1.add([u, namespace.RDFS.label, Literal("foo")]) # doctest: +ELLIPSIS - )> - >>> g1.add([u, namespace.RDFS.label, Literal("bar")]) # doctest: +ELLIPSIS - )> - >>> g2.add([u, namespace.RDFS.label, Literal("foo")]) # doctest: +ELLIPSIS - )> - >>> g2.add([u, namespace.RDFS.label, Literal("bing")]) # doctest: +ELLIPSIS - )> - >>> len(g1 + g2) # adds bing as label - 3 - >>> len(g1 - g2) # removes foo - 1 - >>> len(g1 * g2) # only foo - 1 - >>> g1 += g2 # now g1 contains everything - +```python +>>> g1 = Graph() +>>> g2 = Graph() +>>> u = URIRef("http://example.com/foo") +>>> g1.add([u, namespace.RDFS.label, Literal("foo")]) # doctest: +ELLIPSIS +)> +>>> g1.add([u, namespace.RDFS.label, Literal("bar")]) # doctest: +ELLIPSIS +)> +>>> g2.add([u, namespace.RDFS.label, Literal("foo")]) # doctest: +ELLIPSIS +)> +>>> g2.add([u, namespace.RDFS.label, Literal("bing")]) # doctest: +ELLIPSIS +)> +>>> len(g1 + g2) # adds bing as label +3 +>>> len(g1 - g2) # removes foo +1 +>>> len(g1 * g2) # only foo +1 +>>> g1 += g2 # now g1 contains everything + +``` Graph Aggregation - ConjunctiveGraphs and ReadOnlyGraphAggregate within the same store: - >>> store = plugin.get("Memory", Store)() - >>> g1 = Graph(store) - >>> g2 = Graph(store) - >>> g3 = Graph(store) - >>> stmt1 = BNode() - >>> stmt2 = BNode() - >>> stmt3 = BNode() - >>> g1.add((stmt1, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS - )> - >>> g1.add((stmt1, RDF.subject, - ... URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS - )> - >>> g1.add((stmt1, RDF.predicate, namespace.RDFS.label)) # doctest: +ELLIPSIS - )> - >>> g1.add((stmt1, RDF.object, Literal('Conjunctive Graph'))) # doctest: +ELLIPSIS - )> - >>> g2.add((stmt2, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS - )> - >>> g2.add((stmt2, RDF.subject, - ... 
URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS - )> - >>> g2.add((stmt2, RDF.predicate, RDF.type)) # doctest: +ELLIPSIS - )> - >>> g2.add((stmt2, RDF.object, namespace.RDFS.Class)) # doctest: +ELLIPSIS - )> - >>> g3.add((stmt3, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS - )> - >>> g3.add((stmt3, RDF.subject, - ... URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS - )> - >>> g3.add((stmt3, RDF.predicate, namespace.RDFS.comment)) # doctest: +ELLIPSIS - )> - >>> g3.add((stmt3, RDF.object, Literal( - ... 'The top-level aggregate graph - The sum ' + - ... 'of all named graphs within a Store'))) # doctest: +ELLIPSIS - )> - >>> len(list(ConjunctiveGraph(store).subjects(RDF.type, RDF.Statement))) - 3 - >>> len(list(ReadOnlyGraphAggregate([g1,g2]).subjects( - ... RDF.type, RDF.Statement))) - 2 - -ConjunctiveGraphs have a :meth:`~rdflib.graph.ConjunctiveGraph.quads` method +```python +>>> store = plugin.get("Memory", Store)() +>>> g1 = Graph(store) +>>> g2 = Graph(store) +>>> g3 = Graph(store) +>>> stmt1 = BNode() +>>> stmt2 = BNode() +>>> stmt3 = BNode() +>>> g1.add((stmt1, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS +)> +>>> g1.add((stmt1, RDF.subject, +... URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS +)> +>>> g1.add((stmt1, RDF.predicate, namespace.RDFS.label)) # doctest: +ELLIPSIS +)> +>>> g1.add((stmt1, RDF.object, Literal('Conjunctive Graph'))) # doctest: +ELLIPSIS +)> +>>> g2.add((stmt2, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS +)> +>>> g2.add((stmt2, RDF.subject, +... URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS +)> +>>> g2.add((stmt2, RDF.predicate, RDF.type)) # doctest: +ELLIPSIS +)> +>>> g2.add((stmt2, RDF.object, namespace.RDFS.Class)) # doctest: +ELLIPSIS +)> +>>> g3.add((stmt3, RDF.type, RDF.Statement)) # doctest: +ELLIPSIS +)> +>>> g3.add((stmt3, RDF.subject, +... 
URIRef('https://rdflib.github.io/store/ConjunctiveGraph'))) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> g3.add((stmt3, RDF.predicate, namespace.RDFS.comment)) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> g3.add((stmt3, RDF.object, Literal(
+...     'The top-level aggregate graph - The sum ' +
+...     'of all named graphs within a Store'))) # doctest: +ELLIPSIS
+<Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+>>> len(list(ConjunctiveGraph(store).subjects(RDF.type, RDF.Statement)))
+3
+>>> len(list(ReadOnlyGraphAggregate([g1,g2]).subjects(
+...     RDF.type, RDF.Statement)))
+2
+
+```

-ConjunctiveGraphs have a :meth:`~rdflib.graph.ConjunctiveGraph.quads` method
+ConjunctiveGraphs have a [`quads()`][rdflib.graph.ConjunctiveGraph.quads] method
 which returns quads instead of triples, where the fourth item is the Graph
 (or subclass thereof) instance in which the triple was asserted:

-    >>> uniqueGraphNames = set(
-    ...     [graph.identifier for s, p, o, graph in ConjunctiveGraph(store
-    ...     ).quads((None, RDF.predicate, None))])
-    >>> len(uniqueGraphNames)
-    3
-    >>> unionGraph = ReadOnlyGraphAggregate([g1, g2])
-    >>> uniqueGraphNames = set(
-    ...     [graph.identifier for s, p, o, graph in unionGraph.quads(
-    ...     (None, RDF.predicate, None))])
-    >>> len(uniqueGraphNames)
-    2
-
-Parsing N3 from a string
-
-    >>> g2 = Graph()
-    >>> src = '''
-    ... @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
-    ... @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
-    ... [ a rdf:Statement ;
-    ...   rdf:subject <https://rdflib.github.io/store/ConjunctiveGraph> ;
-    ...   rdf:predicate rdfs:label;
-    ...   rdf:object "Conjunctive Graph" ] .
-    ... '''
-    >>> g2 = g2.parse(data=src, format="n3")
-    >>> print(len(g2))
-    4
+```python
+>>> uniqueGraphNames = set(
+...     [graph.identifier for s, p, o, graph in ConjunctiveGraph(store
+...     ).quads((None, RDF.predicate, None))])
+>>> len(uniqueGraphNames)
+3
+>>> unionGraph = ReadOnlyGraphAggregate([g1, g2])
+>>> uniqueGraphNames = set(
+...     [graph.identifier for s, p, o, graph in unionGraph.quads(
+...     (None, RDF.predicate, None))])
+>>> len(uniqueGraphNames)
+2
+
+```

+Parsing N3 from a string:
+
+```python
+>>> g2 = Graph()
+>>> src = '''
+... @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
+... @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
+... [ a rdf:Statement ;
+...   rdf:subject <https://rdflib.github.io/store/ConjunctiveGraph> ;
+...
rdf:predicate rdfs:label; +... rdf:object "Conjunctive Graph" ] . +... ''' +>>> g2 = g2.parse(data=src, format="n3") +>>> print(len(g2)) +4 + +``` Using Namespace class: - >>> RDFLib = Namespace("https://rdflib.github.io/") - >>> RDFLib.ConjunctiveGraph - rdflib.term.URIRef('https://rdflib.github.io/ConjunctiveGraph') - >>> RDFLib["Graph"] - rdflib.term.URIRef('https://rdflib.github.io/Graph') +```python +>>> RDFLib = Namespace("https://rdflib.github.io/") +>>> RDFLib.ConjunctiveGraph +rdflib.term.URIRef('https://rdflib.github.io/ConjunctiveGraph') +>>> RDFLib["Graph"] +rdflib.term.URIRef('https://rdflib.github.io/Graph') +``` """ from __future__ import annotations @@ -415,67 +434,72 @@ _TCArgT = TypeVar("_TCArgT") +# Graph is a node because technically a formula-aware graph +# take a Graph as subject or object, but we usually use QuotedGraph for that. class Graph(Node): """An RDF Graph: a Python object containing nodes and relations between them as RDF 'triples'. This is the central RDFLib object class and Graph objects are almost always present - it all uses of RDFLib. + in all uses of RDFLib. - The basic use is to create a Graph and iterate through or query its content, e.g.: + Example: + The basic use is to create a Graph and iterate through or query its content: - >>> from rdflib import Graph, URIRef - >>> g = Graph() - - >>> g.add(( - ... URIRef("http://example.com/s1"), # subject - ... URIRef("http://example.com/p1"), # predicate - ... URIRef("http://example.com/o1"), # object - ... )) # doctest: +ELLIPSIS - )> + ```python + >>> from rdflib import Graph, URIRef + >>> g = Graph() + >>> g.add(( + ... URIRef("http://example.com/s1"), # subject + ... URIRef("http://example.com/p1"), # predicate + ... URIRef("http://example.com/o1"), # object + ... )) # doctest: +ELLIPSIS + )> - >>> g.add(( - ... URIRef("http://example.com/s2"), # subject - ... URIRef("http://example.com/p2"), # predicate - ... URIRef("http://example.com/o2"), # object - ... 
)) # doctest: +ELLIPSIS
-    <Graph identifier=... (<class 'rdflib.graph.Graph'>)>
+        >>> g.add((
+        ...     URIRef("http://example.com/s2"),  # subject
+        ...     URIRef("http://example.com/p2"),  # predicate
+        ...     URIRef("http://example.com/o2"),  # object
+        ... )) # doctest: +ELLIPSIS
+        <Graph identifier=... (<class 'rdflib.graph.Graph'>)>

-    >>> for triple in sorted(g): # simple looping
-    ...     print(triple)
-    (rdflib.term.URIRef('http://example.com/s1'), rdflib.term.URIRef('http://example.com/p1'), rdflib.term.URIRef('http://example.com/o1'))
-    (rdflib.term.URIRef('http://example.com/s2'), rdflib.term.URIRef('http://example.com/p2'), rdflib.term.URIRef('http://example.com/o2'))
-
-    >>> # get the object of the triple with subject s1 and predicate p1
-    >>> o = g.value(
-    ...     subject=URIRef("http://example.com/s1"),
-    ...     predicate=URIRef("http://example.com/p1")
-    ... )
-
-
-    The constructor accepts one argument, the "store" that will be used to store the
-    graph data with the default being the `Memory <rdflib.plugins.stores.memory.Memory>`
-    (in memory) Store. Other Stores that persist content to disk using various file
-    databases or Stores that use remote servers (SPARQL systems) are supported. See
-    the :doc:`rdflib.plugins.stores` package for Stores currently shipped with RDFLib.
-    Other Stores not shipped with RDFLib can be added, such as
-    `HDT <https://github.com/rdflib/rdflib-hdt/>`_.
-
-    Stores can be context-aware or unaware. Unaware stores take up
-    (some) less space but cannot support features that require
-    context, such as true merging/demerging of sub-graphs and
-    provenance.
-
-    Even if used with a context-aware store, Graph will only expose the quads which
-    belong to the default graph. To access the rest of the data the
-    `Dataset` class can be used instead.
-
-    The Graph constructor can take an identifier which identifies the Graph
-    by name. If none is given, the graph is assigned a BNode for its
-    identifier.
-
-    For more on Named Graphs, see the RDFLib `Dataset` class and the TriG Specification,
-    https://www.w3.org/TR/trig/.
+        >>> for triple in sorted(g): # simple looping
+        ...
print(triple)
+        (rdflib.term.URIRef('http://example.com/s1'), rdflib.term.URIRef('http://example.com/p1'), rdflib.term.URIRef('http://example.com/o1'))
+        (rdflib.term.URIRef('http://example.com/s2'), rdflib.term.URIRef('http://example.com/p2'), rdflib.term.URIRef('http://example.com/o2'))
+
+        >>> # get the object of the triple with subject s1 and predicate p1
+        >>> o = g.value(
+        ...     subject=URIRef("http://example.com/s1"),
+        ...     predicate=URIRef("http://example.com/p1")
+        ... )
+
+        ```
+
+    !!! info "Graph stores"
+        The constructor accepts one argument, the "store" that will be used to store the
+        graph data with the default being the [`Memory`][rdflib.plugins.stores.memory.Memory]
+        (in memory) Store. Other Stores that persist content to disk using various file
+        databases or Stores that use remote servers (SPARQL systems) are supported. See
+        the `rdflib.plugins.stores` package for Stores currently shipped with RDFLib.
+        Other Stores not shipped with RDFLib can be added, such as
+        [HDT](https://github.com/rdflib/rdflib-hdt/).
+
+        Stores can be context-aware or unaware. Unaware stores take up
+        (some) less space but cannot support features that require
+        context, such as true merging/demerging of sub-graphs and
+        provenance.
+
+        Even if used with a context-aware store, Graph will only expose the quads which
+        belong to the default graph. To access the rest of the data the
+        `Dataset` class can be used instead.
+
+        The Graph constructor can take an identifier which identifies the Graph
+        by name. If none is given, the graph is assigned a BNode for its
+        identifier.
+
+        For more on Named Graphs, see the RDFLib `Dataset` class and the TriG Specification,
+        <https://www.w3.org/TR/trig/>.
""" context_aware: bool @@ -547,7 +571,7 @@ def toPython(self: _GraphT) -> _GraphT: # noqa: N802 return self def destroy(self: _GraphT, configuration: str) -> _GraphT: - """Destroy the store identified by ``configuration`` if supported""" + """Destroy the store identified by `configuration` if supported""" self.__store.destroy(configuration) return self @@ -581,7 +605,14 @@ def close(self, commit_pending_transaction: bool = False) -> None: return self.__store.close(commit_pending_transaction=commit_pending_transaction) def add(self: _GraphT, triple: _TripleType) -> _GraphT: - """Add a triple with self as context""" + """Add a triple with self as context. + + Args: + triple: The triple to add to the graph. + + Returns: + The graph instance. + """ s, p, o = triple assert isinstance(s, Node), "Subject %s must be an rdflib term" % (s,) assert isinstance(p, Node), "Predicate %s must be an rdflib term" % (p,) @@ -632,10 +663,17 @@ def triples( self, triple: _TripleSelectorType, ) -> Generator[_TripleOrTriplePathType, None, None]: - """Generator over the triple store + """Generator over the triple store. - Returns triples that match the given triple pattern. If triple pattern + Returns triples that match the given triple pattern. If the triple pattern does not provide a context, all contexts will be searched. + + Args: + triple: A triple pattern where each component can be a specific value or None + as a wildcard. The predicate can also be a path expression. + + Yields: + Triples matching the given pattern. 
""" s, p, o = triple if isinstance(p, Path): @@ -652,6 +690,7 @@ def __getitem__(self, item): A generator over matches is returned, the returned tuples include only the parts not given + ```python >>> import rdflib >>> g = rdflib.Graph() >>> g.add((rdflib.URIRef("urn:bob"), namespace.RDFS.label, rdflib.Literal("Bob"))) # doctest: +ELLIPSIS @@ -666,25 +705,17 @@ def __getitem__(self, item): >>> list(g[::rdflib.Literal("Bob")]) # all triples with bob as object [(rdflib.term.URIRef('urn:bob'), rdflib.term.URIRef('http://www.w3.org/2000/01/rdf-schema#label'))] + ``` + Combined with SPARQL paths, more complex queries can be written concisely: - Name of all Bobs friends: - - g[bob : FOAF.knows/FOAF.name ] - - Some label for Bob: - - g[bob : DC.title|FOAF.name|RDFS.label] - - All friends and friends of friends of Bob - - g[bob : FOAF.knows * "+"] - - etc. - - .. versionadded:: 4.0 + - Name of all Bobs friends: `g[bob : FOAF.knows/FOAF.name ]` + - Some label for Bob: `g[bob : DC.title|FOAF.name|RDFS.label]` + - All friends and friends of friends of Bob: `g[bob : FOAF.knows * "+"]` + - etc. + !!! 
example "New in version 4.0" """ if isinstance(item, slice): @@ -692,20 +723,30 @@ def __getitem__(self, item): if s is None and p is None and o is None: return self.triples((s, p, o)) elif s is None and p is None: - return self.subject_predicates(o) + # type error: Argument 1 to "subject_predicates" of "Graph" has incompatible type "Union[int, Any]"; expected "Optional[Node]" + return self.subject_predicates(o) # type: ignore[arg-type] elif s is None and o is None: - return self.subject_objects(p) + # type error: Argument 1 to "subject_objects" of "Graph" has incompatible type "Union[int, Any]"; expected "Union[Path, Node, None]" + return self.subject_objects(p) # type: ignore[arg-type] elif p is None and o is None: - return self.predicate_objects(s) + # type error: Argument 1 to "predicate_objects" of "Graph" has incompatible type "Union[int, Any]"; expected "Optional[Node]" + return self.predicate_objects(s) # type: ignore[arg-type] elif s is None: - return self.subjects(p, o) + # type error: Argument 1 to "subjects" of "Graph" has incompatible type "Union[int, Any]"; expected "Union[Path, Node, None]" + # Argument 2 to "subjects" of "Graph" has incompatible type "Union[int, Any]"; expected "Union[Node, List[Node], None]" + return self.subjects(p, o) # type: ignore[arg-type] elif p is None: - return self.predicates(s, o) + # type error: Argument 1 to "predicates" of "Graph" has incompatible type "Union[int, Any]"; expected "Optional[Node]" + # Argument 2 to "predicates" of "Graph" has incompatible type "Union[int, Any]"; expected "Optional[Node]" + return self.predicates(s, o) # type: ignore[arg-type] elif o is None: - return self.objects(s, p) + # type error: Argument 1 to "objects" of "Graph" has incompatible type "Union[int, Any]"; expected "Union[Node, List[Node], None]" + # Argument 2 to "objects" of "Graph" has incompatible type "Union[int, Any]"; expected "Union[Path, Node, None]" + return self.objects(s, p) # type: ignore[arg-type] else: + # type error: 
Unsupported operand types for in ("Tuple[Union[int, Any], Union[int, Any], Union[int, Any]]" and "Graph") # all given - return (s, p, o) in self + return (s, p, o) in self # type: ignore[operator] elif isinstance(item, (Path, Node)): # type error: Argument 1 to "predicate_objects" of "Graph" has incompatible type "Union[Path, Node]"; expected "Optional[Node]" @@ -717,20 +758,34 @@ def __getitem__(self, item): ) def __len__(self) -> int: - """Returns the number of triples in the graph + """Returns the number of triples in the graph. If context is specified then the number of triples in the context is returned instead. + + Returns: + The number of triples in the graph. """ # type error: Unexpected keyword argument "context" for "__len__" of "Store" return self.__store.__len__(context=self) # type: ignore[call-arg] def __iter__(self) -> Generator[_TripleType, None, None]: - """Iterates over all triples in the store""" + """Iterates over all triples in the store. + + Returns: + A generator yielding all triples in the store. + """ return self.triples((None, None, None)) def __contains__(self, triple: _TripleSelectorType) -> bool: - """Support for 'triple in graph' syntax""" + """Support for 'triple in graph' syntax. + + Args: + triple: The triple pattern to check for. + + Returns: + True if the triple pattern exists in the graph, False otherwise. + """ for triple in self.triples(triple): return True return False @@ -858,7 +913,13 @@ def subjects( unique: bool = False, ) -> Generator[_SubjectType, None, None]: """A generator of (optionally unique) subjects with the given - predicate and object(s)""" + predicate and object(s) + + Args: + predicate: A specific predicate to match or None to match any predicate. + object: A specific object or list of objects to match or None to match any object. + unique: If True, only yield unique subjects. 
+ """ # if the object is a list of Nodes, yield results from subject() call for each if isinstance(object, list): for obj in object: @@ -887,8 +948,16 @@ def predicates( object: Optional[_ObjectType] = None, unique: bool = False, ) -> Generator[_PredicateType, None, None]: - """A generator of (optionally unique) predicates with the given - subject and object""" + """Generate predicates with the given subject and object. + + Args: + subject: A specific subject to match or None to match any subject. + object: A specific object to match or None to match any object. + unique: If True, only yield unique predicates. + + Yields: + Predicates matching the given subject and object. + """ if not unique: for s, p, o in self.triples((subject, None, object)): yield p @@ -912,7 +981,16 @@ def objects( unique: bool = False, ) -> Generator[_ObjectType, None, None]: """A generator of (optionally unique) objects with the given - subject(s) and predicate""" + subject(s) and predicate + + Args: + subject: A specific subject or a list of subjects to match or None to match any subject. + predicate: A specific predicate to match or None to match any predicate. + unique: If True, only yield unique objects. + + Yields: + Objects matching the given subject and predicate. 
+ """ if isinstance(subject, list): for subj in subject: for o in self.objects(subj, predicate, unique): @@ -1068,12 +1146,12 @@ def value( It is one of those situations that occur a lot, hence this 'macro' like utility - Parameters: - - - subject, predicate, object: exactly one must be None - - default: value to be returned if no values found - - any: if True, return any value in the case there is more than one, - else, raise UniquenessError + Args: + subject: Subject of the triple pattern, exactly one of subject, predicate, object must be None + predicate: Predicate of the triple pattern, exactly one of subject, predicate, object must be None + object: Object of the triple pattern, exactly one of subject, predicate, object must be None + default: Value to be returned if no values found + any: If True, return any value in the case there is more than one, else, raise UniquenessError """ retval = default @@ -1120,7 +1198,8 @@ def value( def items(self, list: Node) -> Generator[Node, None, None]: """Generator over all items in the resource specified by list - list is an RDF collection. + Args: + list: An RDF collection. """ chain = set([list]) while list: @@ -1139,51 +1218,49 @@ def transitiveClosure( # noqa: N802 arg: _TCArgT, seen: Optional[Dict[_TCArgT, int]] = None, ): - """ - Generates transitive closure of a user-defined - function against the graph - - >>> from rdflib.collection import Collection - >>> g = Graph() - >>> a = BNode("foo") - >>> b = BNode("bar") - >>> c = BNode("baz") - >>> g.add((a,RDF.first,RDF.type)) # doctest: +ELLIPSIS - )> - >>> g.add((a,RDF.rest,b)) # doctest: +ELLIPSIS - )> - >>> g.add((b,RDF.first,namespace.RDFS.label)) # doctest: +ELLIPSIS - )> - >>> g.add((b,RDF.rest,c)) # doctest: +ELLIPSIS - )> - >>> g.add((c,RDF.first,namespace.RDFS.comment)) # doctest: +ELLIPSIS - )> - >>> g.add((c,RDF.rest,RDF.nil)) # doctest: +ELLIPSIS - )> - >>> def topList(node,g): - ... for s in g.subjects(RDF.rest, node): - ... 
yield s - >>> def reverseList(node,g): - ... for f in g.objects(node, RDF.first): - ... print(f) - ... for s in g.subjects(RDF.rest, node): - ... yield s - - >>> [rt for rt in g.transitiveClosure( - ... topList,RDF.nil)] # doctest: +NORMALIZE_WHITESPACE - [rdflib.term.BNode('baz'), - rdflib.term.BNode('bar'), - rdflib.term.BNode('foo')] - - >>> [rt for rt in g.transitiveClosure( - ... reverseList,RDF.nil)] # doctest: +NORMALIZE_WHITESPACE - http://www.w3.org/2000/01/rdf-schema#comment - http://www.w3.org/2000/01/rdf-schema#label - http://www.w3.org/1999/02/22-rdf-syntax-ns#type - [rdflib.term.BNode('baz'), - rdflib.term.BNode('bar'), - rdflib.term.BNode('foo')] - + """Generates transitive closure of a user-defined function against the graph + + ```python + from rdflib.collection import Collection + g = Graph() + a = BNode("foo") + b = BNode("bar") + c = BNode("baz") + g.add((a,RDF.first,RDF.type)) + g.add((a,RDF.rest,b)) + g.add((b,RDF.first,namespace.RDFS.label)) + g.add((b,RDF.rest,c)) + g.add((c,RDF.first,namespace.RDFS.comment)) + g.add((c,RDF.rest,RDF.nil)) + def topList(node,g): + for s in g.subjects(RDF.rest, node): + yield s + def reverseList(node,g): + for f in g.objects(node, RDF.first): + print(f) + for s in g.subjects(RDF.rest, node): + yield s + + [rt for rt in g.transitiveClosure( + topList,RDF.nil)] + # [rdflib.term.BNode('baz'), + # rdflib.term.BNode('bar'), + # rdflib.term.BNode('foo')] + + [rt for rt in g.transitiveClosure( + reverseList,RDF.nil)] + # http://www.w3.org/2000/01/rdf-schema#comment + # http://www.w3.org/2000/01/rdf-schema#label + # http://www.w3.org/1999/02/22-rdf-syntax-ns#type + # [rdflib.term.BNode('baz'), + # rdflib.term.BNode('bar'), + # rdflib.term.BNode('foo')] + ``` + + Args: + func: A function that generates a sequence of nodes + arg: The starting node + seen: A dict of visited nodes """ if seen is None: seen = {} @@ -1204,7 +1281,12 @@ def transitive_objects( """Transitively generate objects for the ``predicate`` 
relationship Generated objects belong to the depth first transitive closure of the - ``predicate`` relationship starting at ``subject``. + `predicate` relationship starting at `subject`. + + Args: + subject: The subject to start the transitive closure from + predicate: The predicate to follow + remember: A dict of visited nodes """ if remember is None: remember = {} @@ -1225,7 +1307,12 @@ def transitive_subjects( """Transitively generate subjects for the ``predicate`` relationship Generated subjects belong to the depth first transitive closure of the - ``predicate`` relationship starting at ``object``. + `predicate` relationship starting at `object`. + + Args: + predicate: The predicate to follow + object: The object to start the transitive closure from + remember: A dict of visited nodes """ if remember is None: remember = {} @@ -1257,8 +1344,16 @@ def bind( if replace, replace any existing prefix with the new namespace - for example: graph.bind("foaf", "http://xmlns.com/foaf/0.1/") + Args: + prefix: The prefix to bind + namespace: The namespace to bind the prefix to + override: If True, override any existing prefix binding + replace: If True, replace any existing namespace binding + Example: + ```python + graph.bind("foaf", "http://xmlns.com/foaf/0.1/") + ``` """ # TODO FIXME: This method's behaviour should be simplified and made # more robust. If the method cannot do what it is asked it should raise @@ -1272,7 +1367,11 @@ def bind( ) def namespaces(self) -> Generator[Tuple[str, URIRef], None, None]: - """Generator over all the prefix, namespace tuples""" + """Generator over all the prefix, namespace tuples + + Returns: + Generator yielding prefix, namespace tuples + """ for prefix, namespace in self.namespace_manager.namespaces(): # noqa: F402 yield prefix, namespace @@ -1344,36 +1443,26 @@ def serialize( encoding: Optional[str] = None, **args: Any, ) -> Union[bytes, str, _GraphT]: - """ - Serialize the graph. 
- - :param destination: - The destination to serialize the graph to. This can be a path as a - :class:`str` or :class:`~pathlib.PurePath` object, or it can be a - :class:`~typing.IO` ``[bytes]`` like object. If this parameter is not - supplied the serialized graph will be returned. - :param format: - The format that the output should be written in. This value - references a :class:`~rdflib.serializer.Serializer` plugin. Format - support can be extended with plugins, but ``"xml"``, ``"n3"``, - ``"turtle"``, ``"nt"``, ``"pretty-xml"``, ``"trix"``, ``"trig"``, - ``"nquads"``, ``"json-ld"`` and ``"hext"`` are built in. Defaults to - ``"turtle"``. - :param base: - The base IRI for formats that support it. For the turtle format this - will be used as the ``@base`` directive. - :param encoding: Encoding of output. - :param args: - Additional arguments to pass to the - :class:`~rdflib.serializer.Serializer` that will be used. - :return: The serialized graph if ``destination`` is `None`. The - serialized graph is returned as `str` if no encoding is specified, - and as `bytes` if an encoding is specified. - :rtype: :class:`bytes` if ``destination`` is `None` and ``encoding`` is not `None`. - :rtype: :class:`str` if ``destination`` is `None` and ``encoding`` is `None`. - :return: ``self`` (i.e. the :class:`~rdflib.graph.Graph` instance) if - ``destination`` is not `None`. - :rtype: :class:`~rdflib.graph.Graph` if ``destination`` is not `None`. + """Serialize the graph. + + Args: + destination: The destination to serialize the graph to. This can be a path as a + string or pathlib.PurePath object, or it can be an IO[bytes] like object. + If this parameter is not supplied the serialized graph will be returned. + format: The format that the output should be written in. This value + references a Serializer plugin. Format support can be extended with plugins, + but "xml", "n3", "turtle", "nt", "pretty-xml", "trix", "trig", "nquads", + "json-ld" and "hext" are built in. 
Defaults to "turtle". + base: The base IRI for formats that support it. For the turtle format this + will be used as the @base directive. + encoding: Encoding of output. + args: Additional arguments to pass to the Serializer that will be used. + + Returns: + The serialized graph if `destination` is None. The serialized graph is returned + as str if no encoding is specified, and as bytes if an encoding is specified. + + self (i.e. the Graph instance) if `destination` is not None. """ # if base is not given as attribute use the base set for the graph @@ -1435,87 +1524,89 @@ def parse( data: Optional[Union[str, bytes]] = None, **args: Any, ) -> Graph: - """ - Parse an RDF source adding the resulting triples to the Graph. + """Parse an RDF source adding the resulting triples to the Graph. The source is specified using one of source, location, file or data. - .. caution:: - - This method can access directly or indirectly requested network or - file resources, for example, when parsing JSON-LD documents with - ``@context`` directives that point to a network location. - - When processing untrusted or potentially malicious documents, - measures should be taken to restrict network and file access. - - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. - - :param source: An `xml.sax.xmlreader.InputSource`, file-like object, - `pathlib.Path` like object, or string. In the case of a string the string - is the location of the source. - :param location: A string indicating the relative or absolute URL of the - source. `Graph`'s absolutize method is used if a relative location - is specified. - :param file: A file-like object. - :param data: A string containing the data to be parsed. - :param format: Used if format can not be determined from source, e.g. - file extension or Media Type. Defaults to text/turtle. 
Format - support can be extended with plugins, but "xml", "n3" (use for - turtle), "nt" & "trix" are built in. - :param publicID: the logical URI to use as the document base. If None - specified the document location is used (at least in the case where - there is a document location). This is used as the base URI when - resolving relative URIs in the source document, as defined in `IETF - RFC 3986 - `_, - given the source document does not define a base URI. - :return: ``self``, i.e. the :class:`~rdflib.graph.Graph` instance. - - Examples: - - >>> my_data = ''' - ... - ... - ... Example - ... This is really just an example. - ... - ... - ... ''' - >>> import os, tempfile - >>> fd, file_name = tempfile.mkstemp() - >>> f = os.fdopen(fd, "w") - >>> dummy = f.write(my_data) # Returns num bytes written - >>> f.close() - - >>> g = Graph() - >>> result = g.parse(data=my_data, format="application/rdf+xml") - >>> len(g) - 2 - - >>> g = Graph() - >>> result = g.parse(location=file_name, format="application/rdf+xml") - >>> len(g) - 2 - - >>> g = Graph() - >>> with open(file_name, "r") as f: - ... result = g.parse(f, format="application/rdf+xml") - >>> len(g) - 2 - - >>> os.remove(file_name) - - >>> # default turtle parsing - >>> result = g.parse(data=" .") - >>> len(g) - 3 - + Args: + source: An `xml.sax.xmlreader.InputSource`, file-like object, + `pathlib.Path` like object, or string. In the case of a string the string + is the location of the source. + publicID: The logical URI to use as the document base. If None + specified the document location is used (at least in the case where + there is a document location). This is used as the base URI when + resolving relative URIs in the source document, as defined in `IETF + RFC 3986 `_, + given the source document does not define a base URI. + format: Used if format can not be determined from source, e.g. + file extension or Media Type. Defaults to text/turtle. 
Format + support can be extended with plugins, but "xml", "n3" (use for + turtle), "nt" & "trix" are built in. + location: A string indicating the relative or absolute URL of the + source. `Graph`'s absolutize method is used if a relative location + is specified. + file: A file-like object. + data: A string containing the data to be parsed. + args: Additional arguments to pass to the parser. + + Returns: + self, i.e. the Graph instance. + + Example: + ```python + >>> my_data = ''' + ... + ... + ... Example + ... This is really just an example. + ... + ... + ... ''' + >>> import os, tempfile + >>> fd, file_name = tempfile.mkstemp() + >>> f = os.fdopen(fd, "w") + >>> dummy = f.write(my_data) # Returns num bytes written + >>> f.close() + + >>> g = Graph() + >>> result = g.parse(data=my_data, format="application/rdf+xml") + >>> len(g) + 2 + + >>> g = Graph() + >>> result = g.parse(location=file_name, format="application/rdf+xml") + >>> len(g) + 2 + + >>> g = Graph() + >>> with open(file_name, "r") as f: + ... result = g.parse(f, format="application/rdf+xml") + >>> len(g) + 2 + + >>> os.remove(file_name) + + >>> # default turtle parsing + >>> result = g.parse(data=" .") + >>> len(g) + 3 + + ``` + + !!! warning "Caution" + This method can access directly or indirectly requested network or + file resources, for example, when parsing JSON-LD documents with + `@context` directives that point to a network location. + + When processing untrusted or potentially malicious documents, + measures should be taken to restrict network and file access. + + For information on available security measures, see the RDFLib + [Security Considerations](../security_considerations.md) + documentation. """ source = create_input_source( @@ -1581,28 +1672,31 @@ def query( """ Query this graph. 
- A type of 'prepared queries' can be realised by providing initial
- variable bindings with initBindings
-
- Initial namespaces are used to resolve prefixes used in the query, if
- none are given, the namespaces from the graph's namespace manager are
- used.
-
- .. caution::
-
- This method can access indirectly requested network endpoints, for
- example, query processing will attempt to access network endpoints
- specified in ``SERVICE`` directives.
-
- When processing untrusted or potentially malicious queries, measures
- should be taken to restrict network and file access.
-
- For information on available security measures, see the RDFLib
- :doc:`Security Considerations </security_considerations>`
- documentation.
-
- :returntype: :class:`~rdflib.query.Result`
-
+ Args:
+ query_object: The query string or object to execute.
+ processor: The query processor to use. Default is "sparql".
+ result: The result format to use. Default is "sparql".
+ initNs: Initial namespaces to use for resolving prefixes in the query.
+ If none are given, the namespaces from the graph's namespace manager are used.
+ initBindings: Initial variable bindings to use. A type of 'prepared queries'
+ can be realized by providing these bindings.
+ use_store_provided: Whether to use the store's query method if available.
+ kwargs: Additional arguments to pass to the query processor.
+
+ Returns:
+ A [`rdflib.query.Result`][rdflib.query.Result] instance.
+
+ !!! warning "Caution"
+ This method can access indirectly requested network endpoints, for
+ example, query processing will attempt to access network endpoints
+ specified in `SERVICE` directives.
+
+ When processing untrusted or potentially malicious queries, measures
+ should be taken to restrict network and file access.
+
+ For information on available security measures, see the RDFLib
+ [Security Considerations](../security_considerations.md)
+ documentation.
""" initBindings = initBindings or {} # noqa: N806 @@ -1643,21 +1737,27 @@ def update( use_store_provided: bool = True, **kwargs: Any, ) -> None: - """ - Update this graph with the given update query. - - .. caution:: - - This method can access indirectly requested network endpoints, for - example, query processing will attempt to access network endpoints - specified in ``SERVICE`` directives. - - When processing untrusted or potentially malicious queries, measures - should be taken to restrict network and file access. - - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. + """Update this graph with the given update query. + + Args: + update_object: The update query string or object to execute. + processor: The update processor to use. Default is "sparql". + initNs: Initial namespaces to use for resolving prefixes in the query. + If none are given, the namespaces from the graph's namespace manager are used. + initBindings: Initial variable bindings to use. + use_store_provided: Whether to use the store's update method if available. + kwargs: Additional arguments to pass to the update processor. + + !!! warning "Caution" + This method can access indirectly requested network endpoints, for + example, query processing will attempt to access network endpoints + specified in `SERVICE` directives. + + When processing untrusted or potentially malicious queries, measures + should be taken to restrict network and file access. + + For information on available security measures, see the RDFLib + Security Considerations documentation. """ initBindings = initBindings or {} # noqa: N806 initNs = initNs or dict(self.namespaces()) # noqa: N806 @@ -1700,11 +1800,20 @@ def __reduce__(self) -> Tuple[Type[Graph], Tuple[Store, _ContextIdentifierType]] ) def isomorphic(self, other: Graph) -> bool: - """ - does a very basic check if these graphs are the same + """Check if this graph is isomorphic to another graph. 
+ + Performs a basic check if these graphs are the same. If no BNodes are involved, this is accurate. - See rdflib.compare for a correct implementation of isomorphism checks + Args: + other: The graph to compare with. + + Returns: + True if the graphs are isomorphic, False otherwise. + + Note: + This is only an approximation. See rdflib.compare for a correct + implementation of isomorphism checks. """ # TODO: this is only an approximation. if len(self) != len(other): @@ -1721,14 +1830,18 @@ def isomorphic(self, other: Graph) -> bool: return True def connected(self) -> bool: - """Check if the Graph is connected + """Check if the Graph is connected. The Graph is considered undirectional. - Performs a search on the Graph, starting from a random node. Then - iteratively goes depth-first through the triplets where the node is - subject and object. Return True if all nodes have been visited and - False if it cannot continue and there are still unvisited nodes left. + Returns: + True if all nodes have been visited and there are no unvisited nodes left, + False otherwise. + + Note: + Performs a search on the Graph, starting from a random node. Then + iteratively goes depth-first through the triplets where the node is + subject and object. """ all_nodes = list(self.all_nodes()) discovered = [] @@ -1763,14 +1876,16 @@ def all_nodes(self) -> Set[Node]: return res def collection(self, identifier: _SubjectType) -> Collection: - """Create a new ``Collection`` instance. + """Create a new `Collection` instance. - Parameters: + Args: + identifier: A URIRef or BNode instance. - - ``identifier``: a URIRef or BNode instance. - - Example:: + Returns: + A new Collection instance. 
+ Example: + ```python >>> graph = Graph() >>> uri = URIRef("http://example.org/resource") >>> collection = graph.collection(uri) @@ -1778,6 +1893,8 @@ def collection(self, identifier: _SubjectType) -> Collection: >>> assert collection.uri is uri >>> assert collection.graph is graph >>> collection += [ Literal(1), Literal(2) ] + + ``` """ return Collection(self, identifier) @@ -1785,12 +1902,14 @@ def collection(self, identifier: _SubjectType) -> Collection: def resource(self, identifier: Union[Node, str]) -> Resource: """Create a new ``Resource`` instance. - Parameters: - - - ``identifier``: a URIRef or BNode instance. + Args: + identifier: A URIRef or BNode instance. - Example:: + Returns: + A new Resource instance. + Example: + ```python >>> graph = Graph() >>> uri = URIRef("http://example.org/resource") >>> resource = graph.resource(uri) @@ -1798,6 +1917,7 @@ def resource(self, identifier: Union[Node, str]) -> Resource: >>> assert resource.identifier is uri >>> assert resource.graph is graph + ``` """ if not isinstance(identifier, Node): identifier = URIRef(identifier) @@ -1892,34 +2012,39 @@ def do_de_skolemize2(t: _TripleType) -> _TripleType: def cbd( self, resource: _SubjectType, *, target_graph: Optional[Graph] = None ) -> Graph: - """Retrieves the Concise Bounded Description of a Resource from a Graph + """Retrieves the Concise Bounded Description of a Resource from a Graph. - Concise Bounded Description (CBD) is defined in [1] as: + Args: + resource: A URIRef object, the Resource to query for. + target_graph: Optionally, a graph to add the CBD to; otherwise, + a new graph is created for the CBD. - Given a particular node (the starting node) in a particular RDF graph (the source graph), a subgraph of that - particular graph, taken to comprise a concise bounded description of the resource denoted by the starting node, - can be identified as follows: + Returns: + A Graph, subgraph of self if no graph was provided otherwise the provided graph. - 1. 
Include in the subgraph all statements in the source graph where the subject of the statement is the - starting node; + Note: + Concise Bounded Description (CBD) is defined as: - 2. Recursively, for all statements identified in the subgraph thus far having a blank node object, include - in the subgraph all statements in the source graph where the subject of the statement is the blank node - in question and which are not already included in the subgraph. + Given a particular node (the starting node) in a particular RDF graph (the source graph), + a subgraph of that particular graph, taken to comprise a concise bounded description of + the resource denoted by the starting node, can be identified as follows: - 3. Recursively, for all statements included in the subgraph thus far, for all reifications of each statement - in the source graph, include the concise bounded description beginning from the rdf:Statement node of - each reification. + 1. Include in the subgraph all statements in the source graph where the subject of the + statement is the starting node; - This results in a subgraph where the object nodes are either URI references, literals, or blank nodes not - serving as the subject of any statement in the graph. + 2. Recursively, for all statements identified in the subgraph thus far having a blank + node object, include in the subgraph all statements in the source graph where the + subject of the statement is the blank node in question and which are not already + included in the subgraph. - [1] https://www.w3.org/Submission/CBD/ + 3. Recursively, for all statements included in the subgraph thus far, for all + reifications of each statement in the source graph, include the concise bounded + description beginning from the rdf:Statement node of each reification. 
- :param resource: a URIRef object, of the Resource for queried for
- :param target_graph: Optionally, a graph to add the CBD to; otherwise, a new graph is created for the CBD
- :return: a Graph, subgraph of self if no graph was provided otherwise the provided graph
+ This results in a subgraph where the object nodes are either URI references, literals,
+ or blank nodes not serving as the subject of any statement in the graph.
+
+ See: <https://www.w3.org/Submission/CBD/>
"""
if target_graph is None:
subgraph = Graph()
@@ -1956,11 +2081,11 @@ class ConjunctiveGraph(Graph):
"""A ConjunctiveGraph is an (unnamed) aggregation of all the named
graphs in a store.

- .. warning::
- ConjunctiveGraph is deprecated, use :class:`~rdflib.graph.Dataset` instead.
+ !!! warning "Deprecation notice"
+ ConjunctiveGraph is deprecated, use [`rdflib.graph.Dataset`][rdflib.graph.Dataset] instead.

- It has a ``default`` graph, whose name is associated with the
- graph throughout its life. :meth:`__init__` can take an identifier
+ It has a `default` graph, whose name is associated with the
+ graph throughout its life. The constructor can take an identifier
to use as the name of this default graph or it will assign a BNode.

@@ -2084,8 +2209,7 @@ def add(
self: _ConjunctiveGraphT,
triple_or_quad: _TripleOrOptionalQuadType,
) -> _ConjunctiveGraphT:
- """
- Add a triple or quad to the store.
+ """Add a triple or quad to the store.
if a triple is given it is added to the default context """ @@ -2138,13 +2262,10 @@ def addN( # noqa: N802 # type error: Argument 1 of "remove" is incompatible with supertype "Graph"; supertype defines the argument type as "Tuple[Optional[Node], Optional[Node], Optional[Node]]" def remove(self: _ConjunctiveGraphT, triple_or_quad: _TripleOrOptionalQuadType) -> _ConjunctiveGraphT: # type: ignore[override] - """ - Removes a triple or quads + """Removes a triple or quads if a triple is given it is removed from all contexts - a quad is removed from the given context only - """ s, p, o, c = self._spoc(triple_or_quad) @@ -2177,8 +2298,7 @@ def triples( triple_or_quad: _TripleOrQuadSelectorType, context: Optional[_ContextType] = None, ) -> Generator[_TripleOrTriplePathType, None, None]: - """ - Iterate over all the triples in the entire conjunctive graph + """Iterate over all the triples in the entire conjunctive graph For legacy reasons, this can take the context to query either as a fourth element of the quad, or as the explicit context @@ -2298,41 +2418,46 @@ def parse( data: Optional[Union[str, bytes]] = None, **args: Any, ) -> Graph: - """ - Parse source adding the resulting triples to its own context (sub graph + """Parse source adding the resulting triples to its own context (sub graph of this graph). - See :meth:`rdflib.graph.Graph.parse` for documentation on arguments. - - If the source is in a format that does not support named graphs its triples - will be added to the default graph - (i.e. :attr:`ConjunctiveGraph.default_context`). - - :Returns: - - The graph into which the source was parsed. In the case of n3 it returns - the root context. - - .. caution:: - - This method can access directly or indirectly requested network or - file resources, for example, when parsing JSON-LD documents with - ``@context`` directives that point to a network location. 
- - When processing untrusted or potentially malicious documents, - measures should be taken to restrict network and file access. - - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. - - *Changed in 7.0*: The ``publicID`` argument is no longer used as the - identifier (i.e. name) of the default graph as was the case before - version 7.0. In the case of sources that do not support named graphs, - the ``publicID`` parameter will also not be used as the name for the - graph that the data is loaded into, and instead the triples from sources - that do not support named graphs will be loaded into the default graph - (i.e. :attr:`ConjunctiveGraph.default_context`). + See [`rdflib.graph.Graph.parse`][rdflib.graph.Graph.parse] for documentation on arguments. + + Args: + source: The source to parse + publicID: The public ID of the source + format: The format of the source + location: The location of the source + file: The file object to parse + data: The data to parse + **args: Additional arguments + + Returns: + The graph into which the source was parsed. In the case of n3 it returns + the root context. + + Note: + If the source is in a format that does not support named graphs its triples + will be added to the default graph (i.e. ConjunctiveGraph.default_context). + + !!! warning "Caution" + This method can access directly or indirectly requested network or + file resources, for example, when parsing JSON-LD documents with + `@context` directives that point to a network location. + + When processing untrusted or potentially malicious documents, + measures should be taken to restrict network and file access. + + For information on available security measures, see the RDFLib + Security Considerations documentation. + + !!! example "Changed in 7.0" + The `publicID` argument is no longer used as the identifier (i.e. name) + of the default graph as was the case before version 7.0. 
In the case of
+ sources that do not support named graphs, the `publicID` parameter will
+ also not be used as the name for the graph that the data is loaded into,
+ and instead the triples from sources that do not support named graphs will
+ be loaded into the default graph (i.e. ConjunctiveGraph.default_context).
"""

source = create_input_source(
@@ -2369,23 +2494,20 @@ class Dataset(ConjunctiveGraph):
RDFLib Graph identified by IRI - within it and allows whole-of-dataset or
single Graph use.

- RDFLib's Dataset class is based on the `RDF 1.2. 'Dataset' definition
- <https://www.w3.org/TR/rdf12-datasets/>`_:
-
- ..
+ RDFLib's Dataset class is based on the [RDF 1.2. 'Dataset' definition](https://www.w3.org/TR/rdf12-datasets/):

- An RDF dataset is a collection of RDF graphs, and comprises:
+ An RDF dataset is a collection of RDF graphs, and comprises:

- - Exactly one default graph, being an RDF graph. The default graph does not
- have a name and MAY be empty.
- - Zero or more named graphs. Each named graph is a pair consisting of an IRI or
- a blank node (the graph name), and an RDF graph. Graph names are unique
- within an RDF dataset.
+ - Exactly one default graph, being an RDF graph. The default graph does not
+ have a name and MAY be empty.
+ - Zero or more named graphs. Each named graph is a pair consisting of an IRI or
+ a blank node (the graph name), and an RDF graph. Graph names are unique
+ within an RDF dataset.

Accordingly, a Dataset allows for `Graph` objects to be added to it with
- :class:`rdflib.term.URIRef` or :class:`rdflib.term.BNode` identifiers and always
- creats a default graph with the :class:`rdflib.term.URIRef` identifier
- :code:`urn:x-rdflib:default`.
+ [`URIRef`][rdflib.term.URIRef] or [`BNode`][rdflib.term.BNode] identifiers and always
+ creates a default graph with the [`URIRef`][rdflib.term.URIRef] identifier
+ `urn:x-rdflib:default`.
Dataset extends Graph's Subject, Predicate, Object (s, p, o) 'triple' structure to include a graph identifier - archaically called Context - producing @@ -2394,12 +2516,14 @@ class Dataset(ConjunctiveGraph): Triples, or quads, can be added to a Dataset. Triples, or quads with the graph identifer :code:`urn:x-rdflib:default` go into the default graph. - .. note:: Dataset builds on the `ConjunctiveGraph` class but that class's direct + !!! warning "Deprecation notice" + Dataset builds on the `ConjunctiveGraph` class but that class's direct use is now deprecated (since RDFLib 7.x) and it should not be used. `ConjunctiveGraph` will be removed from future RDFLib versions. - Examples of usage and see also the examples/datast.py file: + Examples of usage and see also the `examples/datast.py` file: + ```python >>> # Create a new Dataset >>> ds = Dataset() >>> # simple triples goes to default graph @@ -2409,12 +2533,12 @@ class Dataset(ConjunctiveGraph): ... Literal("foo") ... )) # doctest: +ELLIPSIS )> - >>> + >>> # Create a graph in the dataset, if the graph name has already been >>> # used, the corresponding graph will be returned >>> # (ie, the Dataset keeps track of the constituent graphs) >>> g = ds.graph(URIRef("http://www.example.com/gr")) - >>> + >>> # add triples to the new graph as usual >>> g.add(( ... URIRef("http://example.org/x"), @@ -2430,7 +2554,7 @@ class Dataset(ConjunctiveGraph): ... g ... )) # doctest: +ELLIPSIS )> - >>> + >>> # querying triples return them all regardless of the graph >>> for t in ds.triples((None,None,None)): # doctest: +SKIP ... 
print(t) # doctest: +NORMALIZE_WHITESPACE @@ -2443,7 +2567,7 @@ class Dataset(ConjunctiveGraph): (rdflib.term.URIRef("http://example.org/x"), rdflib.term.URIRef("http://example.org/y"), rdflib.term.Literal("bar")) - >>> + >>> # querying quads() return quads; the fourth argument can be unrestricted >>> # (None) or restricted to a graph >>> for q in ds.quads((None, None, None, None)): # doctest: +SKIP @@ -2460,7 +2584,7 @@ class Dataset(ConjunctiveGraph): rdflib.term.URIRef("http://example.org/z"), rdflib.term.Literal("foo-bar"), rdflib.term.URIRef("http://www.example.com/gr")) - >>> + >>> # unrestricted looping is equivalent to iterating over the entire Dataset >>> for q in ds: # doctest: +SKIP ... print(q) # doctest: +NORMALIZE_WHITESPACE @@ -2476,7 +2600,7 @@ class Dataset(ConjunctiveGraph): rdflib.term.URIRef("http://example.org/z"), rdflib.term.Literal("foo-bar"), rdflib.term.URIRef("http://www.example.com/gr")) - >>> + >>> # resticting iteration to a graph: >>> for q in ds.quads((None, None, None, g)): # doctest: +SKIP ... print(q) # doctest: +NORMALIZE_WHITESPACE @@ -2491,7 +2615,7 @@ class Dataset(ConjunctiveGraph): >>> # Note that in the call above - >>> # ds.quads((None,None,None,"http://www.example.com/gr")) >>> # would have been accepted, too - >>> + >>> # graph names in the dataset can be queried: >>> for c in ds.graphs(): # doctest: +SKIP ... print(c.identifier) # doctest: @@ -2511,10 +2635,11 @@ class Dataset(ConjunctiveGraph): ... print(c) # doctest: +NORMALIZE_WHITESPACE DEFAULT http://www.example.com/gr - >>> + >>> # a graph can also be removed from a dataset via ds.remove_graph(g) + ``` - ... versionadded:: 4.0 + !!! 
example "New in version 4.0" """ def __init__( @@ -2526,7 +2651,7 @@ def __init__( super(Dataset, self).__init__(store=store, identifier=None) if not self.store.graph_aware: - raise Exception("DataSet must be backed by a graph-aware store!") + raise Exception("Dataset must be backed by a graph-aware store!") self._default_context = Graph( store=self.store, identifier=DATASET_DEFAULT_GRAPH_ID, @@ -2630,37 +2755,47 @@ def parse( data: Optional[Union[str, bytes]] = None, **args: Any, ) -> Dataset: - """ - Parse an RDF source adding the resulting triples to the Graph. + """Parse an RDF source adding the resulting triples to the Graph. - See :meth:`rdflib.graph.Graph.parse` for documentation on arguments. + Args: + source: The source to parse. See rdflib.graph.Graph.parse for details. + publicID: The public ID of the source. + format: The format of the source. + location: The location of the source. + file: The file object to parse. + data: The data to parse. + **args: Additional arguments. - The source is specified using one of source, location, file or data. + Returns: + The graph that the source was parsed into. + + Note: + The source is specified using one of source, location, file or data. If the source is in a format that does not support named graphs its triples will be added to the default graph (i.e. :attr:`.Dataset.default_graph`). - - .. caution:: - - This method can access directly or indirectly requested network or - file resources, for example, when parsing JSON-LD documents with - ``@context`` directives that point to a network location. - - When processing untrusted or potentially malicious documents, - measures should be taken to restrict network and file access. - - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. - - *Changed in 7.0*: The ``publicID`` argument is no longer used as the - identifier (i.e. name) of the default graph as was the case before - version 7.0. 
In the case of sources that do not support named graphs, - the ``publicID`` parameter will also not be used as the name for the - graph that the data is loaded into, and instead the triples from sources - that do not support named graphs will be loaded into the default graph - (i.e. :attr:`.Dataset.default_graph`). + If the source is in a format that does not support named graphs its triples + will be added to the default graph (i.e. Dataset.default_graph). + + !!! warning "Caution" + This method can access directly or indirectly requested network or + file resources, for example, when parsing JSON-LD documents with + `@context` directives that point to a network location. + + When processing untrusted or potentially malicious documents, + measures should be taken to restrict network and file access. + + For information on available security measures, see the RDFLib + Security Considerations documentation. + + !!! example "Changed in 7.0" + The `publicID` argument is no longer used as the identifier (i.e. name) + of the default graph as was the case before version 7.0. In the case of + sources that do not support named graphs, the `publicID` parameter will + also not be used as the name for the graph that the data is loaded into, + and instead the triples from sources that do not support named graphs will + be loaded into the default graph (i.e. Dataset.default_graph). """ ConjunctiveGraph.parse( @@ -2867,20 +3002,15 @@ class Seq: returned corresponding to the Seq content. It is based on the natural ordering of the predicate names _1, _2, _3, etc, which is the 'implementation' of a sequence in RDF terms. - """ - - def __init__(self, graph: Graph, subject: _SubjectType): - """Parameters: - - graph: - the graph containing the Seq - - - subject: - the subject of a Seq. Note that the init does not + Args: + graph: the graph containing the Seq + subject:the subject of a Seq. 
Note that the init does not check whether this is a Seq, this is done in whoever creates this instance! - """ + """ + def __init__(self, graph: Graph, subject: _SubjectType): self._list: List[Tuple[int, _ObjectType]] _list = self._list = list() LI_INDEX = URIRef(str(RDF) + "_") # noqa: N806 @@ -3156,22 +3286,20 @@ def _assertnode(*terms: Any) -> bool: class BatchAddGraph: - """ - Wrapper around graph that turns batches of calls to Graph's add + """Wrapper around graph that turns batches of calls to Graph's add (and optionally, addN) into calls to batched calls to addN`. - :Parameters: - - - graph: The graph to wrap - - batch_size: The maximum number of triples to buffer before passing to - Graph's addN - - batch_addn: If True, then even calls to `addN` will be batched according to - batch_size - - graph: The wrapped graph - count: The number of triples buffered since initialization or the last call to reset - batch: The current buffer of triples - + Args: + graph: The graph to wrap + batch_size: The maximum number of triples to buffer before passing to + Graph's addN + batch_addn: If True, then even calls to `addN` will be batched according to + batch_size + + Attributes: + graph: The wrapped graph + count: The number of triples buffered since initialization or the last call to reset + batch: The current buffer of triples """ def __init__(self, graph: Graph, batch_size: int = 1000, batch_addn: bool = False): @@ -3198,10 +3326,13 @@ def add( _QuadType, ], ) -> BatchAddGraph: - """ - Add a triple to the buffer + """Add a triple to the buffer. 
+ + Args: + triple_or_quad: The triple or quad to add - :param triple: The triple to add + Returns: + The BatchAddGraph instance """ if len(self.batch) >= self.__batch_size: self.graph.addN(self.batch) diff --git a/rdflib/namespace/_GEO.py b/rdflib/namespace/_GEO.py index d7168d64c0..2542c1e4f0 100644 --- a/rdflib/namespace/_GEO.py +++ b/rdflib/namespace/_GEO.py @@ -9,20 +9,20 @@ class GEO(DefinedNamespace): Generated from: http://schemas.opengis.net/geosparql/1.0/geosparql_vocab_all.rdf Date: 2021-12-27 17:38:15.101187 - .. code-block:: Turtle - - dc:creator "Open Geospatial Consortium"^^xsd:string - dc:date "2012-04-30"^^xsd:date - dc:source - "OGC GeoSPARQL – A Geographic Query Language for RDF Data OGC 11-052r5"^^xsd:string - rdfs:seeAlso - - - owl:imports dc: - - - - owl:versionInfo "OGC GeoSPARQL 1.0"^^xsd:string + ```turtle + dc:creator "Open Geospatial Consortium"^^xsd:string + dc:date "2012-04-30"^^xsd:date + dc:source + "OGC GeoSPARQL – A Geographic Query Language for RDF Data OGC 11-052r5"^^xsd:string + rdfs:seeAlso + + + owl:imports dc: + + + + owl:versionInfo "OGC GeoSPARQL 1.0"^^xsd:string + ``` """ # http://www.w3.org/2000/01/rdf-schema#Datatype diff --git a/rdflib/namespace/__init__.py b/rdflib/namespace/__init__.py index cd2946ad55..e3d3e88cdc 100644 --- a/rdflib/namespace/__init__.py +++ b/rdflib/namespace/__init__.py @@ -1,37 +1,34 @@ """ -=================== -Namespace Utilities -=================== +# Namespace Utilities RDFLib provides mechanisms for managing Namespaces. -In particular, there is a :class:`~rdflib.namespace.Namespace` class +In particular, there is a [`Namespace`][rdflib.namespace.Namespace] class that takes as its argument the base URI of the namespace. -.. 
code-block:: pycon +```python +>>> from rdflib.namespace import Namespace +>>> RDFS = Namespace("http://www.w3.org/1999/02/22-rdf-syntax-ns#") - >>> from rdflib.namespace import Namespace - >>> RDFS = Namespace("http://www.w3.org/1999/02/22-rdf-syntax-ns#") +``` Fully qualified URIs in the namespace can be constructed either by attribute or by dictionary access on Namespace instances: -.. code-block:: pycon - - >>> RDFS.seeAlso - rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#seeAlso') - >>> RDFS['seeAlso'] - rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#seeAlso') +```python +>>> RDFS.seeAlso +rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#seeAlso') +>>> RDFS['seeAlso'] +rdflib.term.URIRef('http://www.w3.org/1999/02/22-rdf-syntax-ns#seeAlso') +``` -Automatic handling of unknown predicates ------------------------------------------ +## Automatic handling of unknown predicates As a programming convenience, a namespace binding is automatically -created when :class:`rdflib.term.URIRef` predicates are added to the graph. +created when [`URIRef`][rdflib.term.URIRef] predicates are added to the graph. -Importable namespaces ------------------------ +## Importable namespaces The following namespaces are available by directly importing from rdflib: @@ -63,11 +60,12 @@ * WGS * XSD -.. code-block:: pycon +```python +>>> from rdflib.namespace import RDFS +>>> RDFS.seeAlso +rdflib.term.URIRef('http://www.w3.org/2000/01/rdf-schema#seeAlso') - >>> from rdflib.namespace import RDFS - >>> RDFS.seeAlso - rdflib.term.URIRef('http://www.w3.org/2000/01/rdf-schema#seeAlso') +``` """ from __future__ import annotations @@ -144,9 +142,9 @@ def get_annotations(thing: Any) -> dict: class Namespace(str): - """ - Utility class for quickly generating URIRefs with a common prefix + """Utility class for quickly generating URIRefs with a common prefix. 
+ ```python >>> from rdflib.namespace import Namespace >>> n = Namespace("http://example.org/") >>> n.Person # as attribute @@ -158,6 +156,8 @@ class Namespace(str): >>> n2 = Namespace("http://example2.org/") >>> n.Person in n2 False + + ``` """ def __new__(cls, value: Union[str, bytes]) -> Namespace: @@ -191,6 +191,7 @@ def __repr__(self) -> str: def __contains__(self, ref: str) -> bool: # type: ignore[override] """Allows to check if a URI is within (starts with) this Namespace. + ```python >>> from rdflib import URIRef >>> namespace = Namespace('http://example.org/') >>> uri = URIRef('http://example.org/foo') @@ -202,20 +203,24 @@ def __contains__(self, ref: str) -> bool: # type: ignore[override] >>> obj = URIRef('http://not.example.org/bar') >>> obj in namespace False + + ``` """ return ref.startswith(self) # test namespace membership with "ref in ns" syntax class URIPattern(str): - """ - Utility class for creating URIs according to some pattern + """Utility class for creating URIs according to some pattern. + This supports either new style formatting with .format - or old-style with % operator + or old-style with % operator. + ```python >>> u=URIPattern("http://example.org/%s/%d/resource") >>> u%('books', 12345) rdflib.term.URIRef('http://example.org/books/12345/resource') + ``` """ def __new__(cls, value: Union[str, bytes]) -> URIPattern: @@ -351,9 +356,9 @@ def as_jsonld_context(self, pfx: str) -> dict: # noqa: N804 class DefinedNamespace(metaclass=DefinedNamespaceMeta): - """ - A Namespace with an enumerated list of members. - Warnings are emitted if unknown members are referenced if _warn is True + """A Namespace with an enumerated list of members. + + Warnings are emitted if unknown members are referenced if _warn is True. """ __slots__: Tuple[str, ...] = tuple() @@ -445,30 +450,29 @@ class NamespaceManager: * using prefix bindings from prefix.cc which is a online prefixes database * not implemented yet - this is aspirational - .. attention:: + !!! 
warning "Breaking changes" - The namespaces bound for specific values of ``bind_namespaces`` + The namespaces bound for specific values of `bind_namespaces` constitute part of RDFLib's public interface, so changes to them should only be additive within the same minor version. Removing values, or removing namespaces that are bound by default, constitutes a breaking change. - See the - Sample usage - - .. code-block:: pycon - - >>> import rdflib - >>> from rdflib import Graph - >>> from rdflib.namespace import Namespace, NamespaceManager - >>> EX = Namespace('http://example.com/') - >>> namespace_manager = NamespaceManager(Graph()) - >>> namespace_manager.bind('ex', EX, override=False) - >>> g = Graph() - >>> g.namespace_manager = namespace_manager - >>> all_ns = [n for n in g.namespace_manager.namespaces()] - >>> assert ('ex', rdflib.term.URIRef('http://example.com/')) in all_ns - >>> + See the sample usage + + ```python + >>> import rdflib + >>> from rdflib import Graph + >>> from rdflib.namespace import Namespace, NamespaceManager + >>> EX = Namespace('http://example.com/') + >>> namespace_manager = NamespaceManager(Graph()) + >>> namespace_manager.bind('ex', EX, override=False) + >>> g = Graph() + >>> g.namespace_manager = namespace_manager + >>> all_ns = [n for n in g.namespace_manager.namespaces()] + >>> assert ('ex', rdflib.term.URIRef('http://example.com/')) in all_ns + + ``` """ def __init__(self, graph: Graph, bind_namespaces: _NamespaceSetString = "rdflib"): @@ -540,24 +544,28 @@ def curie(self, uri: str, generate: bool = True) -> str: Result is guaranteed to contain a colon separating the prefix from the name, even if the prefix is an empty string. - .. warning:: - - When ``generate`` is `True` (which is the default) and there is no + !!! warning "Side-effect" + When `generate` is `True` (which is the default) and there is no matching namespace for the URI in the namespace manager then a new - namespace will be added with prefix ``ns{index}``. 
+ namespace will be added with prefix `ns{index}`. - Thus, when ``generate`` is `True`, this function is not a pure + Thus, when `generate` is `True`, this function is not a pure function because of this side-effect. This default behaviour is chosen so that this function operates similarly to `NamespaceManager.qname`. - :param uri: URI to generate CURIE for. - :param generate: Whether to add a prefix for the namespace if one doesn't - already exist. Default: `True`. - :return: CURIE for the URI. - :raises KeyError: If generate is `False` and the namespace doesn't already have - a prefix. + Args: + uri: URI to generate CURIE for. + generate: Whether to add a prefix for the namespace if one doesn't + already exist. Default: `True`. + + Returns: + CURIE for the URI + + Raises: + KeyError: If generate is `False` and the namespace doesn't already have + a prefix. """ prefix, namespace, name = self.compute_qname(uri, generate=generate) return ":".join((prefix, name)) @@ -756,7 +764,6 @@ def bind( bound to another prefix. If replace, replace any existing prefix with the new namespace - """ namespace = URIRef(str(namespace)) diff --git a/rdflib/parser.py b/rdflib/parser.py index 1c652ca21c..79bcc6b964 100644 --- a/rdflib/parser.py +++ b/rdflib/parser.py @@ -1,5 +1,4 @@ -""" -Parser plugin interface. +"""Parser plugin interface. This module defines the parser plugin interface and contains other related parser support code. @@ -7,7 +6,6 @@ The module is mainly useful for those wanting to write a parser that can plugin to rdflib. If you are wanting to invoke a parser you likely want to do so through the Graph class parse method. 
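The contract this parser plugin interface describes — a parser reads from a source and feeds triples into a sink — can be sketched in plain Python. This is an illustrative, stdlib-only sketch: `CommaTriplesParser` and the list-based sink are hypothetical stand-ins, not rdflib's actual `Parser`/`Graph` classes.

```python
from abc import ABC, abstractmethod
from typing import Any


class Parser(ABC):
    """Shape of a parser plugin: read a source, emit triples into a sink."""

    @abstractmethod
    def parse(self, source: Any, sink: Any, **kwargs: Any) -> None: ...


class CommaTriplesParser(Parser):
    """Toy parser for lines of 's,p,o' text (illustration only)."""

    def parse(self, source: str, sink: list, **kwargs: Any) -> None:
        for line in source.splitlines():
            if line.strip():
                s, p, o = (part.strip() for part in line.split(",", 2))
                sink.append((s, p, o))


sink: list = []
CommaTriplesParser().parse("a,knows,b\nb,knows,c", sink)
# sink now holds the two parsed triples
```

In rdflib itself the sink is a `Graph` and callers normally go through `Graph.parse`, which looks the parser up by format name rather than instantiating it directly.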
- """ from __future__ import annotations @@ -132,8 +130,7 @@ def _init(self): name = "string" elif isinstance(self.wrapped, TextIOWrapper): inner = self.wrapped.buffer - # type error: TextIOWrapper.buffer cannot be a BytesIOWrapper - if isinstance(inner, BytesIOWrapper): # type: ignore[unreachable] + if isinstance(inner, BytesIOWrapper): raise Exception( "BytesIOWrapper cannot be wrapped in TextIOWrapper, " "then wrapped in another BytesIOWrapper" diff --git a/rdflib/paths.py b/rdflib/paths.py index 3692bad45b..d8f6d12521 100644 --- a/rdflib/paths.py +++ b/rdflib/paths.py @@ -1,50 +1,23 @@ r""" - This module implements the SPARQL 1.1 Property path operators, as defined in: - -http://www.w3.org/TR/sparql11-query/#propertypaths +[http://www.w3.org/TR/sparql11-query/#propertypaths](http://www.w3.org/TR/sparql11-query/#propertypaths) In SPARQL the syntax is as follows: -+--------------------+-------------------------------------------------+ -|Syntax | Matches | -+====================+=================================================+ -|iri | An IRI. A path of length one. | -+--------------------+-------------------------------------------------+ -|^elt | Inverse path (object to subject). | -+--------------------+-------------------------------------------------+ -|elt1 / elt2 | A sequence path of elt1 followed by elt2. | -+--------------------+-------------------------------------------------+ -|elt1 | elt2 | A alternative path of elt1 or elt2 | -| | (all possibilities are tried). | -+--------------------+-------------------------------------------------+ -|elt* | A path that connects the subject and object | -| | of the path by zero or more matches of elt. | -+--------------------+-------------------------------------------------+ -|elt+ | A path that connects the subject and object | -| | of the path by one or more matches of elt. | -+--------------------+-------------------------------------------------+ -|elt? 
| A path that connects the subject and object | -| | of the path by zero or one matches of elt. | -+--------------------+-------------------------------------------------+ -|!iri or | Negated property set. An IRI which is not one of| -|!(iri\ :sub:`1`\ \| | iri\ :sub:`1`...iri\ :sub:`n`. | -|... \|iri\ :sub:`n`)| !iri is short for !(iri). | -+--------------------+-------------------------------------------------+ -|!^iri or | Negated property set where the excluded matches | -|!(^iri\ :sub:`1`\ \|| are based on reversed path. That is, not one of | -|...\|^iri\ :sub:`n`)| iri\ :sub:`1`...iri\ :sub:`n` as reverse paths. | -| | !^iri is short for !(^iri). | -+--------------------+-------------------------------------------------+ -|!(iri\ :sub:`1`\ \| | A combination of forward and reverse | -|...\|iri\ :sub:`j`\ | properties in a negated property set. | -|\|^iri\ :sub:`j+1`\ | | -|\|... \|^iri\ | | -|:sub:`n`)| | | -+--------------------+-------------------------------------------------+ -|(elt) | A group path elt, brackets control precedence. | -+--------------------+-------------------------------------------------+ +| Syntax | Matches | +|---------------------|-------------------------------------------------------------------------| +| `iri` | An IRI. A path of length one. | +| `^elt` | Inverse path (object to subject). | +| `elt1 / elt2` | A sequence path of `elt1` followed by `elt2`. | +| `elt1 \| elt2` | An alternative path of `elt1` or `elt2` (all possibilities are tried). | +| `elt*` | A path that connects subject and object by zero or more matches of `elt`.| +| `elt+` | A path that connects subject and object by one or more matches of `elt`.| +| `elt?` | A path that connects subject and object by zero or one matches of `elt`.| +| `!iri` or
`!(iri1 \| ... \| irin)` | Negated property set. An IRI not among `iri1` to `irin`. `!iri` is short for `!(iri)`. |
+| `!^iri` or `!(^iri1 \| ... \| ^irin)` | Negated reverse property set. Excludes `^iri1` to `^irin` as reverse paths.
`!^iri` is short for `!(^iri)`. | +| `!(iri1 \| ... \| irij \| ^irij+1 \| ... \| ^irin)` | A combination of forward and reverse properties in a negated property set. | +| `(elt)` | A grouped path `elt`, where parentheses control precedence. | This module is used internally by the SPARQL engine, but the property paths can also be used to query RDFLib Graphs directly. @@ -52,6 +25,7 @@ Where possible the SPARQL syntax is mapped to Python operators, and property path objects can be constructed from existing URIRefs. +```python >>> from rdflib import Graph, Namespace >>> from rdflib.namespace import FOAF @@ -64,16 +38,22 @@ >>> FOAF.name|FOAF.givenName Path(http://xmlns.com/foaf/0.1/name | http://xmlns.com/foaf/0.1/givenName) +``` + Modifiers (?, \*, +) are done using \* (the multiplication operator) and the strings '\*', '?', '+', also defined as constants in this file. +```python >>> FOAF.knows*OneOrMore Path(http://xmlns.com/foaf/0.1/knows+) +``` + The path objects can also be used with the normal graph methods. First some example data: +```python >>> g=Graph() >>> g=g.parse(data=''' @@ -90,19 +70,28 @@ >>> e = Namespace('ex:') +``` + Graph contains: +```python >>> (e.a, e.p1/e.p2, e.e) in g True +``` + Graph generator functions, triples, subjects, objects, etc. 
: +```python >>> list(g.objects(e.c, (e.p3*OneOrMore)/e.p2)) # doctest: +NORMALIZE_WHITESPACE [rdflib.term.URIRef('ex:j'), rdflib.term.URIRef('ex:g'), rdflib.term.URIRef('ex:f')] +``` + A more complete set of tests: +```python >>> list(eval_path(g, (None, e.p1/e.p2, None)))==[(e.a, e.e)] True >>> list(eval_path(g, (e.a, e.p1|e.p2, None)))==[(e.a,e.c), (e.a,e.f)] @@ -168,8 +157,11 @@ >>> list(eval_path(g, (e.c, (e.p2|e.p3)*ZeroOrMore, e.j))) [(rdflib.term.URIRef('ex:c'), rdflib.term.URIRef('ex:j'))] +``` + No vars specified: +```python >>> sorted(list(eval_path(g, (None, e.p3*OneOrMore, None)))) #doctest: +NORMALIZE_WHITESPACE [(rdflib.term.URIRef('ex:c'), rdflib.term.URIRef('ex:a')), (rdflib.term.URIRef('ex:c'), rdflib.term.URIRef('ex:g')), @@ -178,6 +170,7 @@ (rdflib.term.URIRef('ex:g'), rdflib.term.URIRef('ex:h')), (rdflib.term.URIRef('ex:h'), rdflib.term.URIRef('ex:a'))] +``` """ from __future__ import annotations @@ -223,6 +216,8 @@ def _n3( @total_ordering class Path(ABC): + """Base class for all property paths.""" + __or__: Callable[[Path, Union[URIRef, Path]], AlternativePath] __invert__: Callable[[Path], InvPath] __neg__: Callable[[Path], NegatedPath] diff --git a/rdflib/plugin.py b/rdflib/plugin.py index 556b788040..d0ad7247d7 100644 --- a/rdflib/plugin.py +++ b/rdflib/plugin.py @@ -1,28 +1,26 @@ -""" -Plugin support for rdf. +"""Plugin support for rdf. There are a number of plugin points for rdf: parser, serializer, store, query processor, and query result. Plugins can be registered either through setuptools entry_points or by calling rdf.plugin.register directly. -If you have a package that uses a setuptools based setup.py you can add the -following to your setup:: - - entry_points = { - 'rdf.plugins.parser': [ - 'nt = rdf.plugins.parsers.ntriples:NTParser', - ], - 'rdf.plugins.serializer': [ - 'nt = rdf.plugins.serializers.NTSerializer:NTSerializer', - ], - } - -See the `setuptools dynamic discovery of services and plugins`__ for more -information. 
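The register/lookup mechanism that sits behind these entry points can be sketched stdlib-only: a dict keyed by `(kind, name)` holding a dotted location that is imported on first use. `register` and `get` here are illustrative rather than rdflib's actual signatures, and `json.JSONDecoder` is just a convenient importable stand-in for a real plugin class.

```python
from typing import Any, Dict, Tuple

_plugins: Dict[Tuple[str, str], Tuple[str, str]] = {}


def register(name: str, kind: str, module_path: str, class_name: str) -> None:
    # Store only the dotted location; import lazily on first lookup.
    _plugins[(kind, name)] = (module_path, class_name)


def get(name: str, kind: str) -> Any:
    try:
        module_path, class_name = _plugins[(kind, name)]
    except KeyError:
        raise KeyError(f"No plugin registered for {name!r} as {kind!r}")
    module = __import__(module_path, fromlist=[class_name])
    return getattr(module, class_name)


# Mirroring the 'nt = module:Class' shape of the entry_points example:
register("nt", "parser", "json", "JSONDecoder")  # stand-in module/class
decoder_cls = get("nt", "parser")
```

Deferring the import until `get` is called is the design point: registering every available parser and serializer stays cheap even when most are never used.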
- -.. __: http://peak.telecommunity.com/DevCenter/setuptools#dynamic-discovery-of-services-and-plugins - +If you have a package that uses a setuptools based `setup.py` you can add the +following to your setup: + +```python +entry_points = { + 'rdf.plugins.parser': [ + 'nt = rdf.plugins.parsers.ntriples:NTParser', + ], + 'rdf.plugins.serializer': [ + 'nt = rdf.plugins.serializers.NTSerializer:NTSerializer', + ], + } +``` + +See the [setuptools dynamic discovery of services and plugins](http://peak.telecommunity.com/DevCenter/setuptools#dynamic-discovery-of-services-and-plugins) +for moreinformation. """ from __future__ import annotations diff --git a/rdflib/plugins/parsers/jsonld.py b/rdflib/plugins/parsers/jsonld.py index e103e7033a..fe9503b546 100644 --- a/rdflib/plugins/parsers/jsonld.py +++ b/rdflib/plugins/parsers/jsonld.py @@ -1,10 +1,8 @@ """ -This parser will interpret a JSON-LD document as an RDF Graph. See: - - http://json-ld.org/ - -Example usage:: +This parser will interpret a JSON-LD document as an RDF Graph. See http://json-ld.org/ +Example: + ```python >>> from rdflib import Graph, URIRef, Literal >>> test_json = ''' ... { @@ -26,6 +24,7 @@ ... Literal("Someone's Homepage", lang='en'))] True + ``` """ # From: https://github.com/RDFLib/rdflib-jsonld/blob/feature/json-ld-1.1/rdflib_jsonld/parser.py @@ -104,31 +103,23 @@ def parse( """Parse JSON-LD from a source document. The source document can be JSON or HTML with embedded JSON script - elements (type attribute = "application/ld+json"). To process as HTML - ``source.content_type`` must be set to "text/html" or - "application/xhtml+xml". 
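Extracting embedded JSON-LD from HTML, as described above for `application/ld+json` script elements, can be sketched with the stdlib `html.parser`. `LDScriptExtractor` is a hypothetical helper for illustration, not part of rdflib's JSON-LD parser.

```python
import json
from html.parser import HTMLParser


class LDScriptExtractor(HTMLParser):
    """Collect the text of <script type="application/ld+json"> elements."""

    def __init__(self) -> None:
        super().__init__()
        self._in_ld = False
        self.scripts: list = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_ld = True
            self.scripts.append("")

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_ld = False

    def handle_data(self, data):
        if self._in_ld:
            self.scripts[-1] += data


html = (
    '<html><head><script type="application/ld+json">'
    '{"@id": "http://example.org/a"}'
    "</script></head></html>"
)
extractor = LDScriptExtractor()
extractor.feed(html)
docs = [json.loads(s) for s in extractor.scripts]
```

Collecting all matching scripts, then choosing the first (or all, or one by `id`), matches the behaviour the `extract_all_scripts` option controls.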
- - :param source: InputSource with JSON-formatted data (JSON or HTML) - - :param sink: Graph to receive the parsed triples - - :param version: parse as JSON-LD version, defaults to 1.1 - - :param encoding: character encoding of the JSON (should be "utf-8" - or "utf-16"), defaults to "utf-8" - - :param base: JSON-LD `Base IRI `_, defaults to None - - :param context: JSON-LD `Context `_, defaults to None - - :param generalized_rdf: parse as `Generalized RDF `_, defaults to False - - :param extract_all_scripts: if source is an HTML document then extract - all script elements, defaults to False (extract only the first - script element). This is ignored if ``source.system_id`` contains - a fragment identifier, in which case only the script element with - matching id attribute is extracted. - + elements (type attribute = `application/ld+json`). To process as HTML + `source.content_type` must be set to "text/html" or + `application/xhtml+xml. + + Args: + source: InputSource with JSON-formatted data (JSON or HTML) + sink: Graph to receive the parsed triples + version: parse as JSON-LD version, defaults to 1.1 + skolemize: whether to skolemize blank nodes, defaults to False + encoding: character encoding of the JSON (should be "utf-8" + base: JSON-LD [Base IRI](https://www.w3.org/TR/json-ld/#base-iri), defaults to None + context: JSON-LD [Context](https://www.w3.org/TR/json-ld/#the-context), defaults to None + generalized_rdf: parse as [Generalized RDF](https://www.w3.org/TR/json-ld/#relationship-to-rdf), defaults to False + extract_all_scripts: if source is an HTML document then extract + script element). This is ignored if `source.system_id` contains + a fragment identifier, in which case only the script element with + matching id attribute is extracted. 
""" if encoding not in ("utf-8", "utf-16"): warnings.warn( diff --git a/rdflib/plugins/parsers/notation3.py b/rdflib/plugins/parsers/notation3.py index e9c2d0f27f..ae184e8557 100755 --- a/rdflib/plugins/parsers/notation3.py +++ b/rdflib/plugins/parsers/notation3.py @@ -95,18 +95,18 @@ def splitFragP(uriref: str, punc: int = 0) -> Tuple[str, str]: - """split a URI reference before the fragment + """Split a URI reference before the fragment - Punctuation is kept. - - e.g. + Punctuation is kept. e.g. + ```python >>> splitFragP("abc#def") ('abc', '#def') >>> splitFragP("abcdef") ('abcdef', '') + ``` """ i = uriref.rfind("#") @@ -124,15 +124,19 @@ def join(here: str, there: str) -> str: (non-ascii characters are supported/doctested; haven't checked the details of the IRI spec though) - ``here`` is assumed to be absolute. - ``there`` is URI reference. + `here` is assumed to be absolute. + `there` is URI reference. + ```python >>> join('http://example/x/y/z', '../abc') 'http://example/x/abc' + ``` + Raise ValueError if there uses relative path syntax but here has no hierarchical path. + ```python >>> join('mid:foo@example', '../foo') # doctest: +NORMALIZE_WHITESPACE Traceback (most recent call last): raise ValueError(here) @@ -145,13 +149,18 @@ def join(here: str, there: str) -> str: >>> join('mid:foo@example', '#foo') 'mid:foo@example#foo' + ``` + We grok IRIs + ```python >>> len('Andr\\xe9') 5 >>> join('http://example.org/', '#Andr\\xe9') 'http://example.org/#Andr\\xe9' + + ``` """ # assert(here.find("#") < 0), \ @@ -224,7 +233,6 @@ def base() -> str: this yield the URI of the file. If we had a reliable way of getting a computer name, we should put it in the hostname just to prevent ambiguity - """ # return "file://" + hostname + os.getcwd() + "/" return "file://" + _fixslash(os.getcwd()) + "/" @@ -546,7 +554,7 @@ def tok(self, tok: str, argstr: str, i: int, colon: bool = False) -> int: we must not be at end of file. 
if colon, then keyword followed by colon is ok - (@prefix: is ok, rdf:type shortcut a must be followed by ws) + (`@prefix:` is ok, rdf:type shortcut a must be followed by ws) """ assert tok[0] not in _notNameChars # not for punctuation @@ -1977,9 +1985,11 @@ def hexify(ustr: str) -> bytes: """Use URL encoding to return an ASCII string corresponding to the given UTF8 string + ```python >>> hexify("http://example/a b") b'http://example/a%20b' + ``` """ # s1=ustr.encode('utf-8') s = "" @@ -1993,8 +2003,7 @@ def hexify(ustr: str) -> bytes: class TurtleParser(Parser): - """ - An RDFLib parser for Turtle + """An RDFLib parser for Turtle See http://www.w3.org/TR/turtle/ """ @@ -2029,11 +2038,9 @@ def parse( class N3Parser(TurtleParser): - """ - An RDFLib parser for Notation3 + """An RDFLib parser for Notation3 See http://www.w3.org/DesignIssues/Notation3.html - """ def __init__(self): diff --git a/rdflib/plugins/parsers/nquads.py b/rdflib/plugins/parsers/nquads.py index cadcfa28a3..75dfd366c0 100644 --- a/rdflib/plugins/parsers/nquads.py +++ b/rdflib/plugins/parsers/nquads.py @@ -3,6 +3,7 @@ graphs that can be used and queried. The store that backs the graph *must* be able to handle contexts. +```python >>> from rdflib import ConjunctiveGraph, URIRef, Namespace >>> g = ConjunctiveGraph() >>> data = open("test/data/nquads.rdflib/example.nquads", "rb") @@ -21,6 +22,8 @@ >>> s = URIRef("http://bibliographica.org/entity/E10009") >>> FOAF = Namespace("http://xmlns.com/foaf/0.1/") >>> assert(g.value(s, FOAF.name).eq("Arco Publications")) + +``` """ from __future__ import annotations @@ -52,16 +55,22 @@ def parse( # type: ignore[override] skolemize: bool = False, **kwargs: Any, ): - """ - Parse inputsource as an N-Quads file. 
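The role of the shared `bnode_context` dict can be illustrated with a stdlib-only sketch: interning labels through the same dict yields the same node across parses, while a fresh dict yields a distinct one. `intern_bnode` is a hypothetical helper, not rdflib's API.

```python
import itertools

_counter = itertools.count()


def intern_bnode(label: str, bnode_context: dict) -> str:
    """Map an N-Triples label like 'a' (from '_:a') to a stable node id,
    reusing the id when the same context dict has seen the label before."""
    if label not in bnode_context:
        bnode_context[label] = f"N{next(_counter)}"
    return bnode_context[label]


shared: dict = {}
first = intern_bnode("a", shared)   # first parse "instance"
second = intern_bnode("a", shared)  # second parse, same context
assert first == second              # same blank node across parses
assert intern_bnode("a", {}) != first  # fresh context -> distinct node
```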
- - :type inputsource: `rdflib.parser.InputSource` - :param inputsource: the source of N-Quads-formatted data - :type sink: `rdflib.graph.Graph` - :param sink: where to send parsed triples - :type bnode_context: `dict`, optional - :param bnode_context: a dict mapping blank node identifiers to `~rdflib.term.BNode` instances. - See `.W3CNTriplesParser.parse` + """Parse inputsource as an N-Quads file. + + Args: + inputsource: The source of N-Quads-formatted data. + sink: The graph where parsed quads will be stored. + bnode_context: Optional dictionary mapping blank node identifiers to + [`BNode`][rdflib.term.BNode] instances. + See `.W3CNTriplesParser.parse` for more details. + skolemize: Whether to skolemize blank nodes. + + Returns: + The Dataset containing the parsed quads. + + Raises: + AssertionError: If the sink store is not context-aware. + ParseError: If the input is not a file-like object or contains invalid lines. """ assert ( sink.store.context_aware diff --git a/rdflib/plugins/parsers/ntriples.py b/rdflib/plugins/parsers/ntriples.py index 933e99f3fb..b330256492 100644 --- a/rdflib/plugins/parsers/ntriples.py +++ b/rdflib/plugins/parsers/ntriples.py @@ -1,4 +1,4 @@ -"""\ +""" N-Triples Parser License: GPL 2, W3C, BSD, or MIT Author: Sean B. Palmer, inamidst.com @@ -126,14 +126,17 @@ def uriquote(uri: str) -> str: class W3CNTriplesParser: """An N-Triples Parser. + This is a legacy-style Triples parser for NTriples provided by W3C - Usage:: - p = W3CNTriplesParser(sink=MySink()) - sink = p.parse(f) # file; use parsestring for a string + Example: + ```python + p = W3CNTriplesParser(sink=MySink()) + sink = p.parse(f) # file; use parsestring for a string + ``` To define a context in which blank node identifiers refer to the same blank node - across instances of NTriplesParser, pass the same dict as ``bnode_context`` to each + across instances of NTriplesParser, pass the same dict as `bnode_context` to each instance. 
By default, a new blank node context is created for each instance of `W3CNTriplesParser`. """ @@ -168,16 +171,18 @@ def parse( bnode_context: Optional[_BNodeContextType] = None, skolemize: bool = False, ) -> Union[DummySink, NTGraphSink]: - """ - Parse f as an N-Triples file. - - :type f: :term:`file object` - :param f: the N-Triples source - :type bnode_context: `dict`, optional - :param bnode_context: a dict mapping blank node identifiers (e.g., ``a`` in ``_:a``) - to `~rdflib.term.BNode` instances. An empty dict can be - passed in to define a distinct context for a given call to - `parse`. + """Parse f as an N-Triples file. + + Args: + f: The N-Triples source + bnode_context: A dict mapping blank node identifiers (e.g., `a` in `_:a`) + to [`BNode`][rdflib.term.BNode] instances. An empty dict can be + passed in to define a distinct context for a given call to + `parse`. + skolemize: Whether to skolemize blank nodes + + Returns: + The sink containing the parsed triples """ if not hasattr(f, "read"): @@ -352,22 +357,21 @@ def triple(self, s: _SubjectType, p: _PredicateType, o: _ObjectType) -> None: class NTParser(Parser): - """parser for the ntriples format, often stored with the .nt extension + """Parser for the N-Triples format, often stored with the .nt extension. - See http://www.w3.org/TR/rdf-testcases/#ntriples""" + See http://www.w3.org/TR/rdf-testcases/#ntriples + """ __slots__ = () @classmethod def parse(cls, source: InputSource, sink: Graph, **kwargs: Any) -> None: - """ - Parse the NT format + """Parse the NT format. 
- :type source: `rdflib.parser.InputSource` - :param source: the source of NT-formatted data - :type sink: `rdflib.graph.Graph` - :param sink: where to send parsed triples - :param kwargs: Additional arguments to pass to `.W3CNTriplesParser.parse` + Args: + source: The source of NT-formatted data + sink: Where to send parsed triples + **kwargs: Additional arguments to pass to `W3CNTriplesParser.parse` """ f: Union[TextIO, IO[bytes], codecs.StreamReader] f = source.getCharacterStream() diff --git a/rdflib/plugins/parsers/patch.py b/rdflib/plugins/parsers/patch.py index 5e8f12d1ff..4489c733d1 100644 --- a/rdflib/plugins/parsers/patch.py +++ b/rdflib/plugins/parsers/patch.py @@ -22,8 +22,7 @@ class Operation(Enum): - """ - Enum of RDF Patch operations. + """Enum of RDF Patch operations. Operations: - `AddTripleOrQuad` (A): Adds a triple or quad. @@ -55,16 +54,13 @@ def parse( # type: ignore[override] skolemize: bool = False, **kwargs: Any, ) -> Dataset: - """ - Parse inputsource as an RDF Patch file. - - :type inputsource: `rdflib.parser.InputSource` - :param inputsource: the source of RDF Patch formatted data - :type sink: `rdflib.graph.Dataset` - :param sink: where to send parsed data - :type bnode_context: `dict`, optional - :param bnode_context: a dict mapping blank node identifiers to `~rdflib.term.BNode` instances. - See `.W3CNTriplesParser.parse` + """Parse inputsource as an RDF Patch file. + + Args: + inputsource: the source of RDF Patch formatted data + sink: where to send parsed data + bnode_context: a dict mapping blank node identifiers to [`BNode`][rdflib.term.BNode] + instances. See `.W3CNTriplesParser.parse` """ assert sink.store.context_aware, ( "RDFPatchParser must be given" " a context aware store." 
diff --git a/rdflib/plugins/parsers/rdfxml.py b/rdflib/plugins/parsers/rdfxml.py index 54fc69567b..9ec39d04e3 100644 --- a/rdflib/plugins/parsers/rdfxml.py +++ b/rdflib/plugins/parsers/rdfxml.py @@ -638,6 +638,8 @@ def create_parser(target: InputSource, store: Graph) -> xmlreader.XMLReader: class RDFXMLParser(Parser): + """An RDF/XML parser.""" + def __init__(self): pass diff --git a/rdflib/plugins/serializers/jsonld.py b/rdflib/plugins/serializers/jsonld.py index 0afe8305a8..14001d8a85 100644 --- a/rdflib/plugins/serializers/jsonld.py +++ b/rdflib/plugins/serializers/jsonld.py @@ -1,10 +1,8 @@ """ -This serialiser will output an RDF Graph as a JSON-LD formatted document. See: - - http://json-ld.org/ - -Example usage:: +This serialiser will output an RDF Graph as a JSON-LD formatted document. See http://json-ld.org/ +Example: + ```python >>> from rdflib import Graph >>> testrdf = ''' ... @prefix dc: . @@ -27,6 +25,7 @@ } ] + ``` """ # From: https://github.com/RDFLib/rdflib-jsonld/blob/feature/json-ld-1.1/rdflib_jsonld/serializer.py @@ -56,6 +55,8 @@ class JsonLDSerializer(Serializer): + """JSON-LD RDF graph serializer.""" + def __init__(self, store: Graph): super(JsonLDSerializer, self).__init__(store) diff --git a/rdflib/plugins/serializers/longturtle.py b/rdflib/plugins/serializers/longturtle.py index 1cb2fa7368..3800c40dd7 100644 --- a/rdflib/plugins/serializers/longturtle.py +++ b/rdflib/plugins/serializers/longturtle.py @@ -1,6 +1,6 @@ """ LongTurtle RDF graph serializer for RDFLib. -See for syntax specification. +See http://www.w3.org/TeamSubmission/turtle/ for syntax specification. This variant, longturtle as opposed to just turtle, makes some small format changes to turtle - the original turtle serializer. It: @@ -41,7 +41,7 @@ class LongTurtleSerializer(RecursiveSerializer): """LongTurtle, a Turtle serialization format. 
- When the optional parameter ``canon`` is set to :py:obj:`True`, the graph is canonicalized + When the optional parameter `canon` is set to `True`, the graph is canonicalized before serialization. This normalizes blank node identifiers and allows for deterministic serialization of the graph. Useful when consistent outputs are required. """ diff --git a/rdflib/plugins/serializers/n3.py b/rdflib/plugins/serializers/n3.py index d8036bba06..627bbe19ca 100644 --- a/rdflib/plugins/serializers/n3.py +++ b/rdflib/plugins/serializers/n3.py @@ -12,6 +12,8 @@ class N3Serializer(TurtleSerializer): + """Notation 3 (N3) RDF graph serializer.""" + short_name = "n3" def __init__(self, store: Graph, parent=None): diff --git a/rdflib/plugins/serializers/nquads.py b/rdflib/plugins/serializers/nquads.py index d88454d1ed..c7c68f66ea 100644 --- a/rdflib/plugins/serializers/nquads.py +++ b/rdflib/plugins/serializers/nquads.py @@ -12,6 +12,8 @@ class NQuadsSerializer(Serializer): + """NQuads RDF graph serializer.""" + def __init__(self, store: Graph): if not store.context_aware: raise Exception( diff --git a/rdflib/plugins/serializers/nt.py b/rdflib/plugins/serializers/nt.py index 1b0343b5ac..57fdb0c606 100644 --- a/rdflib/plugins/serializers/nt.py +++ b/rdflib/plugins/serializers/nt.py @@ -21,9 +21,7 @@ class NTSerializer(Serializer): - """ - Serializes RDF graphs to NTriples format. - """ + """Serializes RDF graphs to NTriples format.""" def __init__(self, store: Graph): Serializer.__init__(self, store) @@ -48,8 +46,7 @@ def serialize( class NT11Serializer(NTSerializer): - """ - Serializes RDF graphs to RDF 1.1 NTriples format. + """Serializes RDF graphs to RDF 1.1 NTriples format. Exactly like nt - only utf8 encoded. 
""" @@ -70,9 +67,7 @@ def _nt_row(triple: _TripleType) -> str: def _quoteLiteral(l_: Literal) -> str: # noqa: N802 - """ - a simpler version of term.Literal.n3() - """ + """A simpler version of term.Literal.n3()""" encoded = _quote_encode(l_) diff --git a/rdflib/plugins/serializers/patch.py b/rdflib/plugins/serializers/patch.py index 1bc5ff41f7..c69f9c1b26 100644 --- a/rdflib/plugins/serializers/patch.py +++ b/rdflib/plugins/serializers/patch.py @@ -34,13 +34,16 @@ def serialize( encoding: Optional[str] = None, **kwargs: Any, ) -> None: - """ - Serialize the store to the given stream. - :param stream: The stream to serialize to. - :param base: The base URI to use for the serialization. - :param encoding: The encoding to use for the serialization. - :param kwargs: Additional keyword arguments. + """Serialize the store to the given stream. + + Args: + stream: The stream to serialize to. + base: The base URI to use for the serialization. + encoding: The encoding to use for the serialization. + kwargs: Additional keyword arguments. + Supported keyword arguments: + - operation: The operation to perform. Either 'add' or 'remove'. - target: The target Dataset to compare against. NB: Only one of 'operation' or 'target' should be provided. 
diff --git a/rdflib/plugins/serializers/rdfxml.py b/rdflib/plugins/serializers/rdfxml.py index 8ae7d78cbe..baeafe8acc 100644 --- a/rdflib/plugins/serializers/rdfxml.py +++ b/rdflib/plugins/serializers/rdfxml.py @@ -19,6 +19,8 @@ class XMLSerializer(Serializer): + """RDF/XML RDF graph serializer.""" + def __init__(self, store: Graph): super(XMLSerializer, self).__init__(store) @@ -167,6 +169,8 @@ def fix(val: str) -> str: class PrettyXMLSerializer(Serializer): + """Pretty RDF/XML RDF graph serializer.""" + def __init__(self, store: Graph, max_depth=3): super(PrettyXMLSerializer, self).__init__(store) self.forceRDFAbout: Set[URIRef] = set() diff --git a/rdflib/plugins/serializers/trig.py b/rdflib/plugins/serializers/trig.py index 95b5e42c03..25f8d2a128 100644 --- a/rdflib/plugins/serializers/trig.py +++ b/rdflib/plugins/serializers/trig.py @@ -18,6 +18,8 @@ class TrigSerializer(TurtleSerializer): + """TriG RDF graph serializer.""" + short_name = "trig" indentString = 4 * " " diff --git a/rdflib/plugins/serializers/trix.py b/rdflib/plugins/serializers/trix.py index 95730e8fbb..430b7e97f3 100644 --- a/rdflib/plugins/serializers/trix.py +++ b/rdflib/plugins/serializers/trix.py @@ -16,6 +16,8 @@ class TriXSerializer(Serializer): + """TriX RDF graph serializer.""" + def __init__(self, store: Graph): super(TriXSerializer, self).__init__(store) if not store.context_aware: diff --git a/rdflib/plugins/serializers/turtle.py b/rdflib/plugins/serializers/turtle.py index a6d2b631ab..9a77debb9b 100644 --- a/rdflib/plugins/serializers/turtle.py +++ b/rdflib/plugins/serializers/turtle.py @@ -36,6 +36,8 @@ class RecursiveSerializer(Serializer): + """Base class for recursive serializers.""" + topClasses = [RDFS.Class] predicateOrder = [RDF.type, RDFS.label] maxDepth = 10 @@ -76,7 +78,8 @@ def orderSubjects(self) -> List[_SubjectType]: for classURI in self.topClasses: members = list(self.store.subjects(RDF.type, classURI)) - members.sort() + # type error: All overload variants of 
"sort" of "list" require at least one argument + members.sort() # type: ignore[call-overload] subjects.extend(members) for member in members: @@ -143,7 +146,8 @@ def sortProperties( Sort the lists of values. Return a sorted list of properties.""" # Sort object lists for prop, objects in properties.items(): - objects.sort() + # type error: All overload variants of "sort" of "list" require at least one argument + objects.sort() # type: ignore[call-overload] # Make sorted list of properties propList: List[_PredicateType] = [] @@ -153,7 +157,8 @@ def sortProperties( propList.append(prop) seen[prop] = True props = list(properties.keys()) - props.sort() + # type error: All overload variants of "sort" of "list" require at least one argument + props.sort() # type: ignore[call-overload] for prop in props: if prop not in seen: propList.append(prop) @@ -195,6 +200,8 @@ def relativize(self, uri: _StrT) -> Union[_StrT, URIRef]: class TurtleSerializer(RecursiveSerializer): + """Turtle RDF graph serializer.""" + short_name = "turtle" indentString = " " diff --git a/rdflib/plugins/serializers/xmlwriter.py b/rdflib/plugins/serializers/xmlwriter.py index 8c00521ad5..78cf0193e5 100644 --- a/rdflib/plugins/serializers/xmlwriter.py +++ b/rdflib/plugins/serializers/xmlwriter.py @@ -16,6 +16,8 @@ class XMLWriter: + """A simple XML writer that writes to a stream.""" + def __init__( self, stream: IO[bytes], diff --git a/rdflib/plugins/shared/jsonld/context.py b/rdflib/plugins/shared/jsonld/context.py index e6b668878f..86af71ade3 100644 --- a/rdflib/plugins/shared/jsonld/context.py +++ b/rdflib/plugins/shared/jsonld/context.py @@ -1,8 +1,5 @@ """ -Implementation of the JSON-LD Context structure. See: - - http://json-ld.org/ - +Implementation of the JSON-LD Context structure. 
See: http://json-ld.org/ """ # https://github.com/RDFLib/rdflib-jsonld/blob/feature/json-ld-1.1/rdflib_jsonld/context.py @@ -657,7 +654,8 @@ def to_dict(self) -> Dict[str, Any]: Returns a dictionary representation of the context that can be serialized to JSON. - :return: a dictionary representation of the context. + Returns: + a dictionary representation of the context. """ r = {v: k for (k, v) in self._prefixes.items()} r.update({term.name: self._term_dict(term) for term in self._lookup.values()}) diff --git a/rdflib/plugins/shared/jsonld/util.py b/rdflib/plugins/shared/jsonld/util.py index 097a90b70d..47583e9457 100644 --- a/rdflib/plugins/shared/jsonld/util.py +++ b/rdflib/plugins/shared/jsonld/util.py @@ -50,15 +50,15 @@ def source_to_json( """Extract JSON from a source document. The source document can be JSON or HTML with embedded JSON script elements (type attribute = "application/ld+json"). - To process as HTML ``source.content_type`` must be set to "text/html" or "application/xhtml+xml". + To process as HTML `source.content_type` must be set to "text/html" or "application/xhtml+xml". 
- :param source: the input source document (JSON or HTML) + Args: + source: the input source document (JSON or HTML) + fragment_id: if source is an HTML document then extract only the script element with matching id attribute, defaults to None + extract_all_scripts: if source is an HTML document then extract all script elements (unless fragment_id is provided), defaults to False (extract only the first script element) - :param fragment_id: if source is an HTML document then extract only the script element with matching id attribute, defaults to None - - :param extract_all_scripts: if source is an HTML document then extract all script elements (unless fragment_id is provided), defaults to False (extract only the first script element) - - :return: Tuple with the extracted JSON document and value of the HTML base element + Returns: + Tuple with the extracted JSON document and value of the HTML base element """ if isinstance(source, PythonInputSource): @@ -208,6 +208,7 @@ def split_iri(iri: str) -> Tuple[str, Optional[str]]: def norm_url(base: str, url: str) -> str: """ + ```python >>> norm_url('http://example.org/', '/one') 'http://example.org/one' >>> norm_url('http://example.org/', '/one#') @@ -220,6 +221,8 @@ def norm_url(base: str, url: str) -> str: 'http://example.net/one' >>> norm_url('http://example.org/', 'http://example.org//one') 'http://example.org//one' + + ``` """ if "://" in url: return url @@ -253,7 +256,7 @@ def norm_url(base: str, url: str) -> str: # type error: Missing return statement def context_from_urlinputsource(source: URLInputSource) -> Optional[str]: # type: ignore[return] """ - Please note that JSON-LD documents served with the application/ld+json media type + Please note that JSON-LD documents served with the `application/ld+json` media type MUST have all context information, including references to external contexts, within the body of the document. 
Contexts linked via a http://www.w3.org/ns/json-ld#context HTTP Link Header MUST be diff --git a/rdflib/plugins/sparql/__init__.py b/rdflib/plugins/sparql/__init__.py index 0ab7f80bf7..c59dd13f4a 100644 --- a/rdflib/plugins/sparql/__init__.py +++ b/rdflib/plugins/sparql/__init__.py @@ -1,7 +1,6 @@ -""" -SPARQL implementation for RDFLib +"""SPARQL implementation for RDFLib -.. versionadded:: 4.0 +!!! example "New in version 4.0" """ from importlib.metadata import entry_points diff --git a/rdflib/plugins/sparql/algebra.py b/rdflib/plugins/sparql/algebra.py index 5cb22d2650..568f9dff1a 100644 --- a/rdflib/plugins/sparql/algebra.py +++ b/rdflib/plugins/sparql/algebra.py @@ -2,7 +2,6 @@ Converting the 'parse-tree' output of pyparsing to a SPARQL Algebra expression http://www.w3.org/TR/sparql11-query/#sparqlQuery - """ from __future__ import annotations @@ -277,9 +276,7 @@ def _c(n): def collectAndRemoveFilters(parts: List[CompValue]) -> Optional[Expr]: - """ - - FILTER expressions apply to the whole group graph pattern in which + """FILTER expressions apply to the whole group graph pattern in which they appear. 
http://www.w3.org/TR/sparql11-query/#sparqlCollectFilters @@ -405,8 +402,7 @@ def _traverse( visitPre: Callable[[Any], Any] = lambda n: None, visitPost: Callable[[Any], Any] = lambda n: None, ): - """ - Traverse a parse-tree, visit each node + """Traverse a parse-tree, visit each node if visit functions return a value, replace current node """ @@ -433,7 +429,7 @@ def _traverse( return e -def _traverseAgg(e, visitor: Callable[[Any, Any], Any] = lambda n, v: None): +def _traverseAgg(e: Any, visitor: Callable[[Any, Any], Any] = lambda n, v: None): """ Traverse a parse-tree, visit each node @@ -628,7 +624,6 @@ def translateValues( def translate(q: CompValue) -> Tuple[Optional[CompValue], List[Variable]]: """ http://www.w3.org/TR/sparql11-query/#convertSolMod - """ _traverse(q, _simplifyFilters) @@ -768,7 +763,7 @@ def translate(q: CompValue) -> Tuple[Optional[CompValue], List[Variable]]: def _find_first_child_projections(M: CompValue) -> Iterable[CompValue]: """ Recursively find the first child instance of a Projection operation in each of - the branches of the query execution plan/tree. + the branches of the query execution plan/tree. """ for child_op in M.values(): @@ -955,17 +950,16 @@ class ExpressionNotCoveredException(Exception): # noqa: N818 class _AlgebraTranslator: - """ - Translator of a Query's algebra to its equivalent SPARQL (string). + """Translator of a Query's algebra to its equivalent SPARQL (string). Coded as a class to support storage of state during the translation process, without use of a file. Anticipated Usage: - .. 
code-block:: python - - translated_query = _AlgebraTranslator(query).translateAlgebra() + ```python + translated_query = _AlgebraTranslator(query).translateAlgebra() + ``` An external convenience function which wraps the above call, `translateAlgebra`, is supplied, so this class does not need to be @@ -1023,12 +1017,7 @@ def convert_node_arg( ) def sparql_query_text(self, node): - """ - https://www.w3.org/TR/sparql11-query/#sparqlSyntax - - :param node: - :return: - """ + """""" if isinstance(node, CompValue): # 18.2 Query Forms @@ -1655,9 +1644,12 @@ def translateAlgebra(query_algebra: Query) -> str: """ Translates a SPARQL 1.1 algebra tree into the corresponding query string. - :param query_algebra: An algebra returned by `translateQuery`. - :return: The query form generated from the SPARQL 1.1 algebra tree for - SELECT queries. + Args: + query_algebra: An algebra returned by `translateQuery`. + + Returns: + The query form generated from the SPARQL 1.1 algebra tree for + SELECT queries. """ query_from_algebra = _AlgebraTranslator( query_algebra=query_algebra diff --git a/rdflib/plugins/sparql/evaluate.py b/rdflib/plugins/sparql/evaluate.py index 363918179a..1bdfb33518 100644 --- a/rdflib/plugins/sparql/evaluate.py +++ b/rdflib/plugins/sparql/evaluate.py @@ -6,12 +6,11 @@ evalPart is called on each level and will delegate to the right method -A rdflib.plugins.sparql.sparql.QueryContext is passed along, keeping +A `rdflib.plugins.sparql.sparql.QueryContext` is passed along, keeping information needed for evaluation A list of dicts (solution mappings) is returned, apart from GroupBy which may also return a dict of list of dicts - """ from __future__ import annotations @@ -657,19 +656,19 @@ def evalQuery( initBindings: Optional[Mapping[str, Identifier]] = None, base: Optional[str] = None, ) -> Mapping[Any, Any]: - """ + """Evaluate a SPARQL query against a graph. - .. caution:: + !!! 
warning "Caution" This method can access indirectly requested network endpoints, for example, query processing will attempt to access network endpoints - specified in ``SERVICE`` directives. + specified in `SERVICE` directives. When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access. For information on available security measures, see the RDFLib - :doc:`Security Considerations ` + [Security Considerations](../security_considerations.md) documentation. """ main = query.algebra diff --git a/rdflib/plugins/sparql/operators.py b/rdflib/plugins/sparql/operators.py index e4d19f6646..fffd383eba 100644 --- a/rdflib/plugins/sparql/operators.py +++ b/rdflib/plugins/sparql/operators.py @@ -3,7 +3,6 @@ They get bound as instances-methods to the CompValue objects from parserutils using setEvalFn - """ from __future__ import annotations @@ -481,8 +480,11 @@ def Builtin_TIMEZONE(e: Expr, ctx) -> Literal: """ http://www.w3.org/TR/sparql11-query/#func-timezone - :returns: the timezone part of arg as an xsd:dayTimeDuration. - :raises: an error if there is no timezone. + Returns: + The timezone part of arg as an xsd:dayTimeDuration. + + Raises: + An error if there is no timezone. """ dt = datetime(e.arg) if not dt.tzinfo: @@ -538,8 +540,7 @@ def Builtin_UCASE(e: Expr, ctx) -> Literal: def Builtin_LANG(e: Expr, ctx) -> Literal: - """ - http://www.w3.org/TR/sparql11-query/#func-lang + """http://www.w3.org/TR/sparql11-query/#func-lang Returns the language tag of ltrl, if it has one. It returns "" if ltrl has no language tag. Note that the RDF data model does not include literals @@ -598,8 +599,7 @@ def Builtin_EXISTS(e: Expr, ctx: FrozenBindings) -> Literal: def register_custom_function( uri: URIRef, func: _CustomFunction, override: bool = False, raw: bool = False ) -> None: - """ - Register a custom SPARQL function. + """Register a custom SPARQL function. 
By default, the function will be passed the RDF terms in the argument list. If raw is True, the function will be passed an Expression and a Context. @@ -1081,7 +1081,6 @@ def numeric(expr: Literal) -> Any: def dateTimeObjects(expr: Literal) -> Any: """ return a dataTime/date/time/duration/dayTimeDuration/yearMonthDuration python objects from a literal - """ return expr.toPython() @@ -1096,7 +1095,6 @@ def isCompatibleDateTimeDatatype( # type: ignore[return] """ Returns a boolean indicating if first object is compatible with operation(+/-) over second object. - """ if dt1 == XSD.date: if dt2 == XSD.yearMonthDuration: @@ -1132,7 +1130,6 @@ def calculateDuration( ) -> Literal: """ returns the duration Literal between two datetime - """ date1 = obj1 date2 = obj2 @@ -1180,8 +1177,7 @@ def EBV(rt: Union[Identifier, SPARQLError, Expr]) -> Union[bool, NoReturn]: ... def EBV(rt: Union[Identifier, SPARQLError, Expr]) -> bool: - """ - Effective Boolean Value (EBV) + """Effective Boolean Value (EBV) * If the argument is a typed literal with a datatype of xsd:boolean, the EBV is the value of that argument. @@ -1192,7 +1188,6 @@ def EBV(rt: Union[Identifier, SPARQLError, Expr]) -> bool: derived from a numeric type, the EBV is false if the operand value is NaN or is numerically equal to zero; otherwise the EBV is true. * All other arguments, including unbound arguments, produce a type error. - """ if isinstance(rt, Literal): @@ -1226,28 +1221,27 @@ def EBV(rt: Union[Identifier, SPARQLError, Expr]) -> bool: def _lang_range_check(range: Literal, lang: Literal) -> bool: """ Implementation of the extended filtering algorithm, as defined in point - 3.3.2, of U{RFC 4647}, on + 3.3.2, of [RFC 4647](http://www.rfc-editor.org/rfc/rfc4647.txt), on matching language ranges and language tags. - Needed to handle the C{rdf:PlainLiteral} datatype. - @param range: language range - @param lang: language tag - @rtype: boolean + Needed to handle the `rdf:PlainLiteral` datatype. 
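The EBV rules listed above translate almost directly into code. A minimal sketch over plain Python values (rdflib's real `EBV` operates on `Literal` terms and raises `SPARQLTypeError`; the names here are stand-ins):

```python
import math


def ebv(value) -> bool:
    """Effective Boolean Value for plain Python stand-ins of literals."""
    if isinstance(value, bool):
        return value                      # xsd:boolean: the value itself
    if isinstance(value, str):
        return len(value) > 0             # plain/xsd:string: non-empty is true
    if isinstance(value, (int, float)):
        if isinstance(value, float) and math.isnan(value):
            return False                  # NaN is false
        return value != 0                 # numerically zero is false
    raise TypeError("EBV type error: %r" % (value,))
```

The `bool` check must come before the numeric check, since `bool` is a subclass of `int` in Python.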
- @author: U{Ivan Herman} + Args: + range: language range + lang: language tag - Taken from `RDFClosure/RestrictedDatatype.py`__ - - .. __:http://dev.w3.org/2004/PythonLib-IH/RDFClosure/RestrictedDatatype.py + Author: [Ivan Herman](http://www.w3.org/People/Ivan/) + Taken from [`RDFClosure/RestrictedDatatype.py`](http://dev.w3.org/2004/PythonLib-IH/RDFClosure/RestrictedDatatype.py) """ def _match(r: str, l_: str) -> bool: """ Matching of a range and language item: either range is a wildcard or the two are equal - @param r: language range item - @param l_: language tag item - @rtype: boolean + + Args: + r: language range item + l_: language tag item """ return r == "*" or r == l_ diff --git a/rdflib/plugins/sparql/parserutils.py b/rdflib/plugins/sparql/parserutils.py index 7b85eb6590..43af286cae 100644 --- a/rdflib/plugins/sparql/parserutils.py +++ b/rdflib/plugins/sparql/parserutils.py @@ -21,8 +21,6 @@ Comp lets you set an evalFn that is bound to the eval method of the resulting CompValue - - """ from __future__ import annotations @@ -60,8 +58,7 @@ def value( variables: bool = False, errors: bool = False, ) -> Any: - """ - utility function for evaluating something... + """Utility function for evaluating something... 
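The extended filtering algorithm from RFC 4647 section 3.3.2 referenced above can be sketched in a few lines. This is a simplified reading of the spec (the `_match` shown in the diff only compares single range/tag items; the walk over subtags happens in the enclosing function):

```python
def lang_range_match(range_: str, tag: str) -> bool:
    """Extended filtering per RFC 4647 section 3.3.2 (sketch)."""
    rs = range_.lower().split("-")
    ts = tag.lower().split("-")
    # The first subtag must match exactly unless the range starts with "*".
    if rs[0] != "*" and rs[0] != ts[0]:
        return False
    i, j = 1, 1
    while i < len(rs):
        if rs[i] == "*":
            i += 1                 # "*" matches zero or more subtags
        elif j >= len(ts):
            return False           # range has subtags left, tag does not
        elif rs[i] == ts[j]:
            i += 1
            j += 1                 # subtags match, advance both
        elif len(ts[j]) == 1:
            return False           # a singleton subtag must match exactly
        else:
            j += 1                 # skip a non-matching tag subtag
    return True
```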
Variables will be looked up in the context Normally, non-bound vars is an error, @@ -69,7 +66,6 @@ def value( Normally, an error raises the error, set errors=True to return error - """ if isinstance(val, Expr): @@ -154,7 +150,6 @@ class CompValue(OrderedDict): The result of parsing a Comp Any included Params are available as Dict keys or as attributes - """ def __init__(self, name: str, **values): diff --git a/rdflib/plugins/sparql/processor.py b/rdflib/plugins/sparql/processor.py index de97d80bd3..976598ca54 100644 --- a/rdflib/plugins/sparql/processor.py +++ b/rdflib/plugins/sparql/processor.py @@ -2,7 +2,6 @@ Code for tying SPARQL Engine into RDFLib These should be automatically registered with RDFLib - """ from __future__ import annotations @@ -86,18 +85,18 @@ def update( initNs: Optional[Mapping[str, Any]] = None, ) -> None: """ - .. caution:: + !!! warning "Caution" - This method can access indirectly requested network endpoints, for - example, query processing will attempt to access network endpoints - specified in ``SERVICE`` directives. + This method can access indirectly requested network endpoints, for + example, query processing will attempt to access network endpoints + specified in `SERVICE` directives. - When processing untrusted or potentially malicious queries, measures - should be taken to restrict network and file access. + When processing untrusted or potentially malicious queries, measures + should be taken to restrict network and file access. - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. + For information on available security measures, see the RDFLib + [Security Considerations](../security_considerations.md) + documentation. """ if isinstance(strOrQuery, str): @@ -127,18 +126,18 @@ def query( # type: ignore[override] namespaces. The given base is used to resolve relative URIs in the query and will be overridden by any BASE given in the query. - .. caution:: + !!! 
warning "Caution" - This method can access indirectly requested network endpoints, for - example, query processing will attempt to access network endpoints - specified in ``SERVICE`` directives. + This method can access indirectly requested network endpoints, for + example, query processing will attempt to access network endpoints + specified in `SERVICE` directives. - When processing untrusted or potentially malicious queries, measures - should be taken to restrict network and file access. + When processing untrusted or potentially malicious queries, measures + should be taken to restrict network and file access. - For information on available security measures, see the RDFLib - :doc:`Security Considerations ` - documentation. + For information on available security measures, see the RDFLib + [Security Considerations](../security_considerations.md) + documentation. """ if isinstance(strOrQuery, str): diff --git a/rdflib/plugins/sparql/results/csvresults.py b/rdflib/plugins/sparql/results/csvresults.py index 32b3e42121..c75da0232c 100644 --- a/rdflib/plugins/sparql/results/csvresults.py +++ b/rdflib/plugins/sparql/results/csvresults.py @@ -1,10 +1,8 @@ """ - This module implements a parser and serializer for the CSV SPARQL result formats http://www.w3.org/TR/sparql11-results-csv-tsv/ - """ from __future__ import annotations @@ -20,6 +18,8 @@ class CSVResultParser(ResultParser): + """Parses SPARQL CSV results into a Result object.""" + def __init__(self): self.delim = "," @@ -62,6 +62,8 @@ def convertTerm(self, t: str) -> Optional[Union[BNode, URIRef, Literal]]: class CSVResultSerializer(ResultSerializer): + """Serializes SPARQL results into CSV format.""" + def __init__(self, result: SPARQLResult): ResultSerializer.__init__(self, result) diff --git a/rdflib/plugins/sparql/results/jsonresults.py b/rdflib/plugins/sparql/results/jsonresults.py index cfc2dc1e10..20f08405b1 100644 --- a/rdflib/plugins/sparql/results/jsonresults.py +++ 
b/rdflib/plugins/sparql/results/jsonresults.py @@ -6,7 +6,6 @@ http://projects.bigasterisk.com/sparqlhttp/ Authors: Drew Perttula, Gunnar Aastrand Grimnes - """ from __future__ import annotations @@ -27,6 +26,8 @@ class JSONResultParser(ResultParser): + """Parses SPARQL JSON results into a Result object.""" + # type error: Signature of "parse" incompatible with supertype "ResultParser" def parse(self, source: IO, content_type: Optional[str] = None) -> Result: # type: ignore[override] inp = source.read() @@ -43,6 +44,8 @@ def parse(self, source: IO, content_type: Optional[str] = None) -> Result: # ty class JSONResultSerializer(ResultSerializer): + """Serializes SPARQL results to JSON format.""" + def __init__(self, result: Result): ResultSerializer.__init__(self, result) @@ -125,8 +128,11 @@ def parseJsonTerm(d: Dict[str, str]) -> Identifier: """rdflib object (Literal, URIRef, BNode) for the given json-format dict. input is like: - { 'type': 'uri', 'value': 'http://famegame.com/2006/01/username' } - { 'type': 'literal', 'value': 'drewp' } + + ```json + { 'type': 'uri', 'value': 'http://famegame.com/2006/01/username' } + { 'type': 'literal', 'value': 'drewp' } + ``` """ t = d["type"] diff --git a/rdflib/plugins/sparql/results/tsvresults.py b/rdflib/plugins/sparql/results/tsvresults.py index 54b516d0dd..14741718a5 100644 --- a/rdflib/plugins/sparql/results/tsvresults.py +++ b/rdflib/plugins/sparql/results/tsvresults.py @@ -64,6 +64,8 @@ class TSVResultParser(ResultParser): + """Parses SPARQL TSV results into a Result object.""" + # type error: Signature of "parse" incompatible with supertype "ResultParser" [override] def parse(self, source: IO, content_type: typing.Optional[str] = None) -> Result: # type: ignore[override] if isinstance(source.read(0), bytes): diff --git a/rdflib/plugins/sparql/results/xmlresults.py b/rdflib/plugins/sparql/results/xmlresults.py index 3cc6b2c38d..b02c171a05 100644 --- a/rdflib/plugins/sparql/results/xmlresults.py +++ 
b/rdflib/plugins/sparql/results/xmlresults.py @@ -48,6 +48,8 @@ class XMLResultParser(ResultParser): + """A Parser for SPARQL results in XML.""" + # TODO FIXME: content_type should be a keyword only arg. def parse(self, source: IO, content_type: Optional[str] = None) -> Result: # type: ignore[override] return XMLResult(source) @@ -153,6 +155,8 @@ def parseTerm(element: xml_etree.Element) -> Union[URIRef, Literal, BNode]: class XMLResultSerializer(ResultSerializer): + """Serializes SPARQL results into XML format.""" + def __init__(self, result: Result): ResultSerializer.__init__(self, result) diff --git a/rdflib/plugins/sparql/sparql.py b/rdflib/plugins/sparql/sparql.py index 8249a0ee81..518d8fcd87 100644 --- a/rdflib/plugins/sparql/sparql.py +++ b/rdflib/plugins/sparql/sparql.py @@ -337,15 +337,16 @@ def load( """ Load data from the source into the query context's. - :param source: The source to load from. - :param default: If `True`, triples from the source will be added - to the default graph, otherwise it will be loaded into a - graph with ``source`` URI as its name. - :param into: The name of the graph to load the data into. If - `None`, the source URI will be used as as the name of the - graph. - :param kwargs: Keyword arguments to pass to - :meth:`rdflib.graph.Graph.parse`. + Args: + source: The source to load from. + default: If `True`, triples from the source will be added + to the default graph, otherwise it will be loaded into a + graph with `source` URI as its name. + into: The name of the graph to load the data into. If + `None`, the source URI will be used as as the name of the + graph. + **kwargs: Keyword arguments to pass to + [`parse`][rdflib.graph.Graph.parse]. 
""" def _load(graph, source): diff --git a/rdflib/plugins/sparql/update.py b/rdflib/plugins/sparql/update.py index c9d36564c0..5ec1e283e2 100644 --- a/rdflib/plugins/sparql/update.py +++ b/rdflib/plugins/sparql/update.py @@ -1,7 +1,5 @@ """ - Code for carrying out Update Operations - """ from __future__ import annotations @@ -288,9 +286,7 @@ def evalUpdate( update: Update, initBindings: Optional[Mapping[str, Identifier]] = None, ) -> None: - """ - - http://www.w3.org/TR/sparql11-update/#updateLanguage + """http://www.w3.org/TR/sparql11-update/#updateLanguage 'A request is a sequence of operations [...] Implementations MUST ensure that operations of a single request are executed in a @@ -305,17 +301,17 @@ def evalUpdate( This will return None on success and raise Exceptions on error - .. caution:: + !!! warning "Security Considerations" This method can access indirectly requested network endpoints, for example, query processing will attempt to access network endpoints - specified in ``SERVICE`` directives. + specified in `SERVICE` directives. When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access. For information on available security measures, see the RDFLib - :doc:`Security Considerations ` + [Security Considerations](../security_considerations.md) documentation. 
""" diff --git a/rdflib/plugins/stores/auditable.py b/rdflib/plugins/stores/auditable.py index ca2c7d79b0..05ca2f6489 100644 --- a/rdflib/plugins/stores/auditable.py +++ b/rdflib/plugins/stores/auditable.py @@ -44,6 +44,8 @@ class AuditableStore(Store): + """A store that logs destructive operations (add/remove) in reverse order.""" + def __init__(self, store: Store): self.store = store self.context_aware = store.context_aware diff --git a/rdflib/plugins/stores/berkeleydb.py b/rdflib/plugins/stores/berkeleydb.py index 11195432f3..693dfd220a 100644 --- a/rdflib/plugins/stores/berkeleydb.py +++ b/rdflib/plugins/stores/berkeleydb.py @@ -71,25 +71,23 @@ def bb(u: str) -> bytes: class BerkeleyDB(Store): - """\ - A store that allows for on-disk persistent using BerkeleyDB, a fast - key/value DB. + """A store that allows for on-disk persistent using BerkeleyDB, a fast key/value DB. This store implementation used to be known, previous to rdflib 6.0.0 as 'Sleepycat' due to that being the then name of the Python wrapper for BerkeleyDB. This store allows for quads as well as triples. See examples of use - in both the `examples.berkeleydb_example` and ``test/test_store/test_store_berkeleydb.py`` + in both the `examples.berkeleydb_example` and `test/test_store/test_store_berkeleydb.py` files. **NOTE on installation**: To use this store, you must have BerkeleyDB installed on your system - separately to Python (``brew install berkeley-db`` on a Mac) and also have - the BerkeleyDB Python wrapper installed (``pip install berkeleydb``). + separately to Python (`brew install berkeley-db` on a Mac) and also have + the BerkeleyDB Python wrapper installed (`pip install berkeleydb`). 
You may need to install BerkeleyDB Python wrapper like this: - ``YES_I_HAVE_THE_RIGHT_TO_USE_THIS_BERKELEY_DB_VERSION=1 pip install berkeleydb`` + `YES_I_HAVE_THE_RIGHT_TO_USE_THIS_BERKELEY_DB_VERSION=1 pip install berkeleydb` """ context_aware = True diff --git a/rdflib/plugins/stores/concurrent.py b/rdflib/plugins/stores/concurrent.py index 2d050954b3..4203dd1ac7 100644 --- a/rdflib/plugins/stores/concurrent.py +++ b/rdflib/plugins/stores/concurrent.py @@ -21,6 +21,8 @@ def __next__(self): class ConcurrentStore: + """A store that allows concurrent reads and writes.""" + def __init__(self, store): self.store = store diff --git a/rdflib/plugins/stores/memory.py b/rdflib/plugins/stores/memory.py index 7dc7c25ac8..bd73e0d10e 100644 --- a/rdflib/plugins/stores/memory.py +++ b/rdflib/plugins/stores/memory.py @@ -40,12 +40,11 @@ class SimpleMemory(Store): - """\ - A fast naive in memory implementation of a triple store. + """A fast naive in memory implementation of a triple store. This triple store uses nested dictionaries to store triples. Each - triple is stored in two such indices as follows spo[s][p][o] = 1 and - pos[p][o][s] = 1. + triple is stored in two such indices as follows `spo[s][p][o]` = 1 and + `pos[p][o][s]` = 1. Authors: Michel Pelletier, Daniel Krech, Stefan Niederhauser """ @@ -82,9 +81,7 @@ def add( context: _ContextType, quoted: bool = False, ) -> None: - """\ - Add a triple to the store of triples. - """ + """Add a triple to the store of triples.""" # add dictionary entries for spo[s][p][p] = 1 and pos[p][o][s] # = 1, creating the nested dictionaries where they do not yet # exits. @@ -270,8 +267,7 @@ def update( class Memory(Store): - """\ - An in memory implementation of a triple store. + """An in memory implementation of a triple store. 
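The two nested-dictionary indices described for `SimpleMemory` above can be sketched in isolation. This is a toy stand-in showing the `spo[s][p][o] = 1` / `pos[p][o][s] = 1` layout, not the store's actual code:

```python
# Toy version of the two indices: spo answers "what does this subject
# say?", while pos answers "which subjects have this (p, o) pair?".
spo: dict = {}
pos: dict = {}

def add(s, p, o):
    spo.setdefault(s, {}).setdefault(p, {})[o] = 1
    pos.setdefault(p, {}).setdefault(o, {})[s] = 1

add("urn:bob", "urn:knows", "urn:alice")

objects = list(spo["urn:bob"]["urn:knows"])     # objects for (s, p)
subjects = list(pos["urn:knows"]["urn:alice"])  # subjects for (p, o)
```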
Same as SimpleMemory above, but is Context-aware, Graph-aware, and Formula-aware Authors: Ashley Sommer @@ -320,9 +316,7 @@ def add( context: _ContextType, quoted: bool = False, ) -> None: - """\ - Add a triple to the store of triples. - """ + """Add a triple to the store of triples.""" # add dictionary entries for spo[s][p][p] = 1 and pos[p][o][s] # = 1, creating the nested dictionaries where they do not yet # exits. diff --git a/rdflib/plugins/stores/sparqlstore.py b/rdflib/plugins/stores/sparqlstore.py index 2d3e259fe5..4d72f2c8d2 100644 --- a/rdflib/plugins/stores/sparqlstore.py +++ b/rdflib/plugins/stores/sparqlstore.py @@ -1,7 +1,6 @@ """ This is an RDFLib store around Ivan Herman et al.'s SPARQL service wrapper. This was first done in layer-cake, and then ported to RDFLib - """ from __future__ import annotations @@ -79,20 +78,23 @@ class SPARQLStore(SPARQLConnector, Store): motivated by the SPARQL 1.1. Fuseki/TDB has a flag for specifying that the default graph - is the union of all graphs (``tdb:unionDefaultGraph`` in the Fuseki config). + is the union of all graphs (`tdb:unionDefaultGraph` in the Fuseki config). + + !!! warning "Blank nodes" - .. warning:: By default the SPARQL Store does not support blank-nodes! + By default the SPARQL Store does not support blank-nodes! - As blank-nodes act as variables in SPARQL queries, - there is no way to query for a particular blank node without - using non-standard SPARQL extensions. + As blank-nodes act as variables in SPARQL queries, + there is no way to query for a particular blank node without + using non-standard SPARQL extensions. - See http://www.w3.org/TR/sparql11-query/#BGPsparqlBNodes + See http://www.w3.org/TR/sparql11-query/#BGPsparqlBNodes - You can make use of such extensions through the ``node_to_sparql`` + You can make use of such extensions through the `node_to_sparql` argument.
For example if you want to transform BNode('0001') into "", you can use a function like this: + ```python >>> def my_bnode_ext(node): ... if isinstance(node, BNode): ... return '' % node @@ -100,10 +102,12 @@ class SPARQLStore(SPARQLConnector, Store): >>> store = SPARQLStore('http://dbpedia.org/sparql', ... node_to_sparql=my_bnode_ext) + ``` + You can request a particular result serialization with the - ``returnFormat`` parameter. This is a string that must have a - matching plugin registered. Built in is support for ``xml``, - ``json``, ``csv``, ``tsv`` and ``application/rdf+xml``. + `returnFormat` parameter. This is a string that must have a + matching plugin registered. Built in is support for `xml`, + `json`, `csv`, `tsv` and `application/rdf+xml`. The underlying SPARQLConnector uses the urllib library. Any extra kwargs passed to the SPARQLStore connector are passed to @@ -112,10 +116,12 @@ Form example: + ```python >>> store = SPARQLStore('...my endpoint ...', auth=('user','pass')) - will use HTTP basic auth. + ``` + will use HTTP basic auth. """ formula_aware = False @@ -269,22 +275,22 @@ def triples( # type: ignore[override] * OFFSET: an integer to enable paging of results * ORDERBY: an instance of Variable('s'), Variable('o') or Variable('p') or, by default, the first 'None' from the given triple - .. warning:: + !!! warning "Limit and offset" - Using LIMIT or OFFSET automatically include ORDERBY otherwise this is because the results are retrieved in a not deterministic way (depends on the walking path on the graph) - Using OFFSET without defining LIMIT will discard the first OFFSET - 1 results - ..
code-block:: python - - a_graph.LIMIT = limit - a_graph.OFFSET = offset - triple_generator = a_graph.triples(mytriple): - # do something - # Removes LIMIT and OFFSET if not required for the next triple() calls - del a_graph.LIMIT - del a_graph.OFFSET + ```python + a_graph.LIMIT = limit + a_graph.OFFSET = offset + triple_generator = a_graph.triples(mytriple): + # do something + # Removes LIMIT and OFFSET if not required for the next triple() calls + del a_graph.LIMIT + del a_graph.OFFSET + ``` """ s, p, o = spo @@ -413,8 +419,8 @@ def contexts( # type: ignore[override] self, triple: Optional[_TripleType] = None ) -> Generator[_ContextIdentifierType, None, None]: """ - Iterates over results to "SELECT ?NAME { GRAPH ?NAME { ?s ?p ?o } }" - or "SELECT ?NAME { GRAPH ?NAME {} }" if triple is `None`. + Iterates over results to `SELECT ?NAME { GRAPH ?NAME { ?s ?p ?o } }` + or `SELECT ?NAME { GRAPH ?NAME {} }` if triple is `None`. Returns instances of this store with the SPARQL wrapper object updated via addNamedGraph(?NAME). @@ -546,8 +552,7 @@ class SPARQLUpdateStore(SPARQLStore): For Graph objects, everything works as expected. - See the :class:`SPARQLStore` base class for more information. - + See the [`SPARQLStore`][rdflib.plugins.stores.sparqlstore.SPARQLStore] base class for more information. """ where_pattern = re.compile(r"""(?PWHERE\s*\{)""", re.IGNORECASE) @@ -618,13 +623,12 @@ def __init__( **kwds, ): """ - :param autocommit if set, the store will commit after every - writing operations. If False, we only make queries on the - server once commit is called. - - :param dirty_reads if set, we do not commit before reading. So you - cannot read what you wrote before manually calling commit. - + Args: + autocommit: if set, the store will commit after every + writing operation. If False, we only make queries on the + server once commit is called. + dirty_reads: if set, we do not commit before reading. So you + cannot read what you wrote before manually calling commit.
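The autocommit/dirty_reads semantics described above amount to a simple edit buffer: writes queue up, `commit()` flushes them, and reads flush first unless dirty reads are allowed. A stand-alone sketch with a hypothetical `TxBuffer` class, not the store's implementation:

```python
class TxBuffer:
    """Toy model of the documented transactional behaviour."""

    def __init__(self, autocommit: bool = True, dirty_reads: bool = False):
        self.autocommit = autocommit
        self.dirty_reads = dirty_reads
        self.pending: list = []   # edits not yet sent to the endpoint
        self.applied: list = []   # stand-in for the remote store state

    def write(self, op: str) -> None:
        self.pending.append(op)
        if self.autocommit:
            self.commit()

    def read(self) -> list:
        if not self.dirty_reads:
            self.commit()  # reads see all prior writes
        return list(self.applied)

    def commit(self) -> None:
        self.applied.extend(self.pending)
        self.pending.clear()

buffered = TxBuffer(autocommit=False, dirty_reads=True)
buffered.write("INSERT DATA { ... }")
stale = buffered.read()   # dirty read: the edit is not yet committed
buffered.commit()
fresh = buffered.read()   # the committed edit is now visible
```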
""" SPARQLStore.__init__( @@ -671,12 +675,12 @@ def __len__(self, *args: Any, **kwargs: Any) -> int: def open( self, configuration: Union[str, Tuple[str, str]], create: bool = False ) -> None: - """ - sets the endpoint URLs for this SPARQLStore + """Sets the endpoint URLs for this `SPARQLStore` - :param configuration: either a tuple of (query_endpoint, update_endpoint), - or a string with the endpoint which is configured as query and update endpoint - :param create: if True an exception is thrown. + Args: + configuration: either a tuple of (query_endpoint, update_endpoint), + or a string with the endpoint which is configured as query and update endpoint + create: if True an exception is thrown. """ if create: @@ -697,7 +701,7 @@ def _transaction(self) -> List[str]: # Transactional interfaces def commit(self) -> None: - """add(), addN(), and remove() are transactional to reduce overhead of many small edits. + """`add()`, `addN()`, and `remove()` are transactional to reduce overhead of many small edits. Read and update() calls will automatically commit any outstanding edits. This should behave as expected most of the time, except that alternating writes and reads can degenerate to the original call-per-triple situation that originally existed. @@ -810,9 +814,8 @@ def update( # type: ignore[override] queryGraph: Optional[str] = None, # noqa: N803 DEBUG: bool = False, # noqa: N803 ): - """ - Perform a SPARQL Update Query against the endpoint, - INSERT, LOAD, DELETE etc. + """Perform a SPARQL Update Query against the endpoint, INSERT, LOAD, DELETE etc. + Setting initNs adds PREFIX declarations to the beginning of the update. Setting initBindings adds inline VALUEs to the beginning of every WHERE clause. By the SPARQL grammar, all @@ -822,25 +825,24 @@ def update( # type: ignore[override] substring 'WHERE {' which does not denote a WHERE clause, e.g. if it is part of a literal. - .. admonition:: Context-aware query rewriting + !!! 
info "Context-aware query rewriting" - **When:** If context-awareness is enabled and the graph is not the default graph of the store. - - **Why:** To ensure consistency with the :class:`~rdflib.plugins.stores.memory.Memory` store. - The graph must accept "local" SPARQL requests (requests with no GRAPH keyword) - as if it was the default graph. + - **Why:** To ensure consistency with the [`Memory`][rdflib.plugins.stores.memory.Memory] store. + The graph must accept "local" SPARQL requests (requests with no GRAPH keyword) + as if it was the default graph. - **What is done:** These "local" queries are rewritten by this store. - The content of each block of a SPARQL Update operation is wrapped in a GRAPH block - except if the block is empty. - This basically causes INSERT, INSERT DATA, DELETE, DELETE DATA and WHERE to operate - only on the context. - - **Example:** ``"INSERT DATA { }"`` is converted into - ``"INSERT DATA { GRAPH { } }"``. + The content of each block of a SPARQL Update operation is wrapped in a GRAPH block + except if the block is empty. + This basically causes INSERT, INSERT DATA, DELETE, DELETE DATA and WHERE to operate + only on the context. + - **Example:** `"INSERT DATA { }"` is converted into + `"INSERT DATA { GRAPH { } }"`. - **Warning:** Queries are presumed to be "local" but this assumption is **not checked**. - For instance, if the query already contains GRAPH blocks, the latter will be wrapped in new GRAPH blocks. + For instance, if the query already contains GRAPH blocks, the latter will be wrapped in new GRAPH blocks. - **Warning:** A simplified grammar is used that should tolerate - extensions of the SPARQL grammar. Still, the process may fail in - uncommon situations and produce invalid output. - + extensions of the SPARQL grammar. Still, the process may fail in + uncommon situations and produce invalid output. 
""" if not self.update_endpoint: raise Exception("Update endpoint is not set!") @@ -874,12 +876,11 @@ def update( # type: ignore[override] self.commit() def _insert_named_graph(self, query: str, query_graph: str) -> str: - """ - Inserts GRAPH {} into blocks of SPARQL Update operations + """Inserts GRAPH {} into blocks of SPARQL Update operations - For instance, "INSERT DATA { }" + For instance, `INSERT DATA { }` is converted into - "INSERT DATA { GRAPH { } }" + `INSERT DATA { GRAPH { } }` """ if isinstance(query_graph, Node): query_graph = self.node_to_sparql(query_graph) diff --git a/rdflib/query.py b/rdflib/query.py index b3a0c43cea..33f1cfcbc0 100644 --- a/rdflib/query.py +++ b/rdflib/query.py @@ -49,7 +49,6 @@ class Processor: This module is useful for those wanting to write a query processor that can plugin to rdf. If you are wanting to execute a query you likely want to do so through the Graph class query method. - """ def __init__(self, graph: Graph): @@ -67,16 +66,14 @@ def query( # type: ignore[empty-body] class UpdateProcessor: - """ - Update plugin interface. + """Update plugin interface. This module is useful for those wanting to write an update processor that can plugin to rdflib. If you are wanting to execute an update statement you likely want to do so through the Graph class update method. - .. versionadded:: 4.0 - + !!! 
example "New in version 4.0" """ def __init__(self, graph: Graph): @@ -96,12 +93,7 @@ class ResultException(Exception): # noqa: N818 class EncodeOnlyUnicode: - """ - This is a crappy work-around for - http://bugs.python.org/issue11649 - - - """ + """This is a crappy work-around for http://bugs.python.org/issue11649""" def __init__(self, stream: BinaryIO): self.__stream = stream @@ -117,10 +109,9 @@ def __getattr__(self, name: str) -> Any: class ResultRow(Tuple[rdflib.term.Identifier, ...]): - """ - a single result row - allows accessing bindings as attributes or with [] + """A single result row allows accessing bindings as attributes or with [] + ```python >>> from rdflib import URIRef, Variable >>> rr=ResultRow({ Variable('a'): URIRef('urn:cake') }, [Variable('a')]) @@ -148,8 +139,9 @@ class ResultRow(Tuple[rdflib.term.Identifier, ...]): >>> rr[Variable('a')] rdflib.term.URIRef('urn:cake') - .. versionadded:: 4.0 + ``` + !!! example "New in version 4.0" """ labels: Mapping[str, int] @@ -215,8 +207,7 @@ class Result: If the type is "CONSTRUCT" or "DESCRIBE" iterating will yield the triples. - len(result) also works. - + `len(result)` also works. """ def __init__(self, type_: str): @@ -226,6 +217,7 @@ def __init__(self, type_: str): self.type = type_ #: variables contained in the result. self.vars: Optional[List[Variable]] = None + """a list of variables contained in the result""" self._bindings: MutableSequence[Mapping[Variable, Identifier]] = None # type: ignore[assignment] self._genbindings: Optional[Iterator[Mapping[Variable, Identifier]]] = None self.askAnswer: Optional[bool] = None @@ -264,6 +256,7 @@ def parse( content_type: Optional[str] = None, **kwargs: Any, ) -> Result: + """Parse a query result from a source.""" from rdflib import plugin if format: @@ -290,18 +283,20 @@ def serialize( """ Serialize the query result. - The :code:`format` argument determines the Serializer class to use. + The `format` argument determines the Serializer class to use. 
+ + - csv: [`CSVResultSerializer`][rdflib.plugins.sparql.results.csvresults.CSVResultSerializer] + - json: [`JSONResultSerializer`][rdflib.plugins.sparql.results.jsonresults.JSONResultSerializer] + - txt: [`TXTResultSerializer`][rdflib.plugins.sparql.results.txtresults.TXTResultSerializer] + - xml: [`XMLResultSerializer`][rdflib.plugins.sparql.results.xmlresults.XMLResultSerializer] - - csv: :class:`~rdflib.plugins.sparql.results.csvresults.CSVResultSerializer` - - json: :class:`~rdflib.plugins.sparql.results.jsonresults.JSONResultSerializer` - - txt: :class:`~rdflib.plugins.sparql.results.txtresults.TXTResultSerializer` - - xml: :class:`~rdflib.plugins.sparql.results.xmlresults.XMLResultSerializer` + Args: + destination: Path of file output or BufferedIOBase object to write the output to. + encoding: Encoding of output. + format: One of ['csv', 'json', 'txt', xml'] - :param destination: Path of file output or BufferedIOBase object to write the output to. - :param encoding: Encoding of output. - :param format: One of ['csv', 'json', 'txt', xml'] - :param args: - :return: bytes + Returns: + bytes """ if self.type in ("CONSTRUCT", "DESCRIBE"): # type error: Item "None" of "Optional[Graph]" has no attribute "serialize" diff --git a/rdflib/resource.py b/rdflib/resource.py index 48c4710f6a..b69af6015f 100644 --- a/rdflib/resource.py +++ b/rdflib/resource.py @@ -1,8 +1,8 @@ """ -The :class:`~rdflib.resource.Resource` class wraps a -:class:`~rdflib.graph.Graph` -and a resource reference (i.e. a :class:`rdflib.term.URIRef` or -:class:`rdflib.term.BNode`) to support a resource-oriented way of +The [`Resource`][rdflib.resource.Resource] class wraps a +[`Graph`][rdflib.graph.Graph] +and a resource reference (i.e. a [`URIRef`][rdflib.term.URIRef] or +[`BNode`][rdflib.term.BNode]) to support a resource-oriented way of working with a graph. 
It contains methods directly corresponding to those methods of the Graph @@ -12,278 +12,342 @@ oriented" style, as compared to the triple orientation of the Graph API. Resulting generators are also wrapped so that any resource reference values -(:class:`rdflib.term.URIRef` and :class:`rdflib.term.BNode`) are in turn +([`URIRef`][rdflib.term.URIRef] and [`BNode`][rdflib.term.BNode]) are in turn wrapped as Resources. (Note that this behaviour differs from the corresponding -methods in :class:`~rdflib.graph.Graph`, where no such conversion takes place.) +methods in [`Graph`][rdflib.graph.Graph], where no such conversion takes place.) -Basic Usage Scenario --------------------- +## Basic Usage Scenario -Start by importing things we need and define some namespaces:: +Start by importing things we need and define some namespaces: - >>> from rdflib import * - >>> FOAF = Namespace("http://xmlns.com/foaf/0.1/") - >>> CV = Namespace("http://purl.org/captsolo/resume-rdf/0.2/cv#") +```python +>>> from rdflib import * +>>> FOAF = Namespace("http://xmlns.com/foaf/0.1/") +>>> CV = Namespace("http://purl.org/captsolo/resume-rdf/0.2/cv#") -Load some RDF data:: +``` - >>> graph = Graph().parse(format='n3', data=''' - ... @prefix rdfs: . - ... @prefix xsd: . - ... @prefix foaf: . - ... @prefix cv: . - ... - ... @base . - ... - ... a foaf:Person; - ... rdfs:comment "Just a Python & RDF hacker."@en; - ... foaf:depiction ; - ... foaf:homepage ; - ... foaf:name "Some Body" . - ... - ... a foaf:Image; - ... rdfs:label "some 1"@en; - ... rdfs:comment "Just an image"@en; - ... foaf:thumbnail . - ... - ... a foaf:Image . - ... - ... [] a cv:CV; - ... cv:aboutPerson ; - ... cv:hasWorkHistory [ cv:employedIn ; - ... cv:startDate "2009-09-04"^^xsd:date ] . - ... ''') +Load some RDF data: -Create a Resource:: +```python +>>> graph = Graph().parse(format='n3', data=''' +... @prefix rdfs: . +... @prefix xsd: . +... @prefix foaf: . +... @prefix cv: . +... +... @base . +... +... a foaf:Person; +... 
rdfs:comment "Just a Python & RDF hacker."@en; +... foaf:depiction ; +... foaf:homepage ; +... foaf:name "Some Body" . +... +... a foaf:Image; +... rdfs:label "some 1"@en; +... rdfs:comment "Just an image"@en; +... foaf:thumbnail . +... +... a foaf:Image . +... +... [] a cv:CV; +... cv:aboutPerson ; +... cv:hasWorkHistory [ cv:employedIn ; +... cv:startDate "2009-09-04"^^xsd:date ] . +... ''') - >>> person = Resource( - ... graph, URIRef("http://example.org/person/some1#self")) +``` -Retrieve some basic facts:: +Create a Resource: - >>> person.identifier - rdflib.term.URIRef('http://example.org/person/some1#self') +```python +>>> person = Resource( +... graph, URIRef("http://example.org/person/some1#self")) - >>> person.value(FOAF.name) - rdflib.term.Literal('Some Body') +``` - >>> person.value(RDFS.comment) - rdflib.term.Literal('Just a Python & RDF hacker.', lang='en') +Retrieve some basic facts: -Resources can be sliced (like graphs, but the subject is fixed):: +```python +>>> person.identifier +rdflib.term.URIRef('http://example.org/person/some1#self') - >>> for name in person[FOAF.name]: - ... print(name) - Some Body - >>> person[FOAF.name : Literal("Some Body")] - True +>>> person.value(FOAF.name) +rdflib.term.Literal('Some Body') -Resources as unicode are represented by their identifiers as unicode:: +>>> person.value(RDFS.comment) +rdflib.term.Literal('Just a Python & RDF hacker.', lang='en') - >>> %(unicode)s(person) #doctest: +SKIP - 'Resource(http://example.org/person/some1#self' +``` + +Resources can be sliced (like graphs, but the subject is fixed): + +```python +>>> for name in person[FOAF.name]: +... print(name) +Some Body +>>> person[FOAF.name : Literal("Some Body")] +True + +``` + +Resources as unicode are represented by their identifiers as unicode: + +```python +>>> %(unicode)s(person) #doctest: +SKIP +'Resource(http://example.org/person/some1#self' + +``` Resource references are also Resources, so you can easily get e.g. 
a qname -for the type of a resource, like:: - - >>> person.value(RDF.type).qname() - 'foaf:Person' - -Or for the predicates of a resource:: - - >>> sorted( - ... p.qname() for p in person.predicates() - ... ) #doctest: +NORMALIZE_WHITESPACE +SKIP - ['foaf:depiction', 'foaf:homepage', - 'foaf:name', 'rdf:type', 'rdfs:comment'] - -Follow relations and get more data from their Resources as well:: - - >>> for pic in person.objects(FOAF.depiction): - ... print(pic.identifier) - ... print(pic.value(RDF.type).qname()) - ... print(pic.value(FOAF.thumbnail).identifier) - http://example.org/images/person/some1.jpg - foaf:Image - http://example.org/images/person/some1-thumb.jpg - - >>> for cv in person.subjects(CV.aboutPerson): - ... work = list(cv.objects(CV.hasWorkHistory))[0] - ... print(work.value(CV.employedIn).identifier) - ... print(work.value(CV.startDate)) - http://example.org/#company - 2009-09-04 - -It's just as easy to work with the predicates of a resource:: - - >>> for s, p in person.subject_predicates(): - ... print(s.value(RDF.type).qname()) - ... print(p.qname()) - ... for s, o in p.subject_objects(): - ... print(s.value(RDF.type).qname()) - ... print(o.value(RDF.type).qname()) - cv:CV - cv:aboutPerson - cv:CV - foaf:Person - -This is useful for e.g. inspection:: - - >>> thumb_ref = URIRef("http://example.org/images/person/some1-thumb.jpg") - >>> thumb = Resource(graph, thumb_ref) - >>> for p, o in thumb.predicate_objects(): - ... print(p.qname()) - ... print(o.qname()) - rdf:type - foaf:Image - - -Schema Example --------------- - -With this artificial schema data:: - - >>> graph = Graph().parse(format='n3', data=''' - ... @prefix rdf: . - ... @prefix rdfs: . - ... @prefix owl: . - ... @prefix v: . - ... - ... v:Artifact a owl:Class . - ... - ... v:Document a owl:Class; - ... rdfs:subClassOf v:Artifact . - ... - ... v:Paper a owl:Class; - ... rdfs:subClassOf v:Document . - ... - ... v:Choice owl:oneOf (v:One v:Other) . - ... - ... 
v:Stuff a rdf:Seq; rdf:_1 v:One; rdf:_2 v:Other . - ... - ... ''') - -From this class:: - - >>> artifact = Resource(graph, URIRef("http://example.org/def/v#Artifact")) - -we can get at subclasses:: - - >>> subclasses = list(artifact.transitive_subjects(RDFS.subClassOf)) - >>> [c.qname() for c in subclasses] - ['v:Artifact', 'v:Document', 'v:Paper'] - -and superclasses from the last subclass:: - - >>> [c.qname() for c in subclasses[-1].transitive_objects(RDFS.subClassOf)] - ['v:Paper', 'v:Document', 'v:Artifact'] - -Get items from the Choice:: - - >>> choice = Resource(graph, URIRef("http://example.org/def/v#Choice")) - >>> [it.qname() for it in choice.value(OWL.oneOf).items()] - ['v:One', 'v:Other'] +for the type of a resource, like: + +```python +>>> person.value(RDF.type).qname() +'foaf:Person' + +``` + +Or for the predicates of a resource: + +```python +>>> sorted( +... p.qname() for p in person.predicates() +... ) #doctest: +NORMALIZE_WHITESPACE +SKIP +['foaf:depiction', 'foaf:homepage', + 'foaf:name', 'rdf:type', 'rdfs:comment'] + +``` + +Follow relations and get more data from their Resources as well: + +```python +>>> for pic in person.objects(FOAF.depiction): +... print(pic.identifier) +... print(pic.value(RDF.type).qname()) +... print(pic.value(FOAF.thumbnail).identifier) +http://example.org/images/person/some1.jpg +foaf:Image +http://example.org/images/person/some1-thumb.jpg + +``` + +```python +>>> for cv in person.subjects(CV.aboutPerson): +... work = list(cv.objects(CV.hasWorkHistory))[0] +... print(work.value(CV.employedIn).identifier) +... print(work.value(CV.startDate)) +http://example.org/#company +2009-09-04 + +``` + +It's just as easy to work with the predicates of a resource: + +```python +>>> for s, p in person.subject_predicates(): +... print(s.value(RDF.type).qname()) +... print(p.qname()) +... for s, o in p.subject_objects(): +... print(s.value(RDF.type).qname()) +... 
print(o.value(RDF.type).qname()) +cv:CV +cv:aboutPerson +cv:CV +foaf:Person + +``` + +This is useful for e.g. inspection: + +```python +>>> thumb_ref = URIRef("http://example.org/images/person/some1-thumb.jpg") +>>> thumb = Resource(graph, thumb_ref) +>>> for p, o in thumb.predicate_objects(): +... print(p.qname()) +... print(o.qname()) +rdf:type +foaf:Image + +``` + +## Schema Example + +With this artificial schema data: + +```python +>>> graph = Graph().parse(format='n3', data=''' +... @prefix rdf: . +... @prefix rdfs: . +... @prefix owl: . +... @prefix v: . +... +... v:Artifact a owl:Class . +... +... v:Document a owl:Class; +... rdfs:subClassOf v:Artifact . +... +... v:Paper a owl:Class; +... rdfs:subClassOf v:Document . +... +... v:Choice owl:oneOf (v:One v:Other) . +... +... v:Stuff a rdf:Seq; rdf:_1 v:One; rdf:_2 v:Other . +... +... ''') + +``` + +From this class: + +```python +>>> artifact = Resource(graph, URIRef("http://example.org/def/v#Artifact")) + +``` + +we can get at subclasses: + +```python +>>> subclasses = list(artifact.transitive_subjects(RDFS.subClassOf)) +>>> [c.qname() for c in subclasses] +['v:Artifact', 'v:Document', 'v:Paper'] + +``` + +and superclasses from the last subclass: + +```python +>>> [c.qname() for c in subclasses[-1].transitive_objects(RDFS.subClassOf)] +['v:Paper', 'v:Document', 'v:Artifact'] + +``` + +Get items from the Choice: + +```python +>>> choice = Resource(graph, URIRef("http://example.org/def/v#Choice")) +>>> [it.qname() for it in choice.value(OWL.oneOf).items()] +['v:One', 'v:Other'] + +``` On add, other resources are auto-unboxed: - >>> paper = Resource(graph, URIRef("http://example.org/def/v#Paper")) - >>> paper.add(RDFS.subClassOf, artifact) - >>> artifact in paper.objects(RDFS.subClassOf) # checks Resource instance - True - >>> (paper._identifier, RDFS.subClassOf, artifact._identifier) in graph - True +```python +>>> paper = Resource(graph, URIRef("http://example.org/def/v#Paper")) +>>> paper.add(RDFS.subClassOf, 
artifact) +>>> artifact in paper.objects(RDFS.subClassOf) # checks Resource instance +True +>>> (paper._identifier, RDFS.subClassOf, artifact._identifier) in graph +True + +``` + +## Technical Details -Technical Details ------------------ +Comparison is based on graph and identifier: -Comparison is based on graph and identifier:: +```python +>>> g1 = Graph() +>>> t1 = Resource(g1, URIRef("http://example.org/thing")) +>>> t2 = Resource(g1, URIRef("http://example.org/thing")) +>>> t3 = Resource(g1, URIRef("http://example.org/other")) +>>> t4 = Resource(Graph(), URIRef("http://example.org/other")) - >>> g1 = Graph() - >>> t1 = Resource(g1, URIRef("http://example.org/thing")) - >>> t2 = Resource(g1, URIRef("http://example.org/thing")) - >>> t3 = Resource(g1, URIRef("http://example.org/other")) - >>> t4 = Resource(Graph(), URIRef("http://example.org/other")) +>>> t1 is t2 +False - >>> t1 is t2 - False +>>> t1 == t2 +True +>>> t1 != t2 +False - >>> t1 == t2 - True - >>> t1 != t2 - False +>>> t1 == t3 +False +>>> t1 != t3 +True - >>> t1 == t3 - False - >>> t1 != t3 - True +>>> t3 != t4 +True - >>> t3 != t4 - True +>>> t3 < t1 and t1 > t3 +True +>>> t1 >= t1 and t1 >= t3 +True +>>> t1 <= t1 and t3 <= t1 +True - >>> t3 < t1 and t1 > t3 - True - >>> t1 >= t1 and t1 >= t3 - True - >>> t1 <= t1 and t3 <= t1 - True +>>> t1 < t1 or t1 < t3 or t3 > t1 or t3 > t3 +False - >>> t1 < t1 or t1 < t3 or t3 > t1 or t3 > t3 - False +``` -Hash is computed from graph and identifier:: +Hash is computed from graph and identifier: - >>> g1 = Graph() - >>> t1 = Resource(g1, URIRef("http://example.org/thing")) +```python +>>> g1 = Graph() +>>> t1 = Resource(g1, URIRef("http://example.org/thing")) - >>> hash(t1) == hash(Resource(g1, URIRef("http://example.org/thing"))) - True +>>> hash(t1) == hash(Resource(g1, URIRef("http://example.org/thing"))) +True - >>> hash(t1) == hash(Resource(Graph(), t1.identifier)) - False - >>> hash(t1) == hash(Resource(Graph(), URIRef("http://example.org/thing"))) - 
False +>>> hash(t1) == hash(Resource(Graph(), t1.identifier)) +False +>>> hash(t1) == hash(Resource(Graph(), URIRef("http://example.org/thing"))) +False + +``` The Resource class is suitable as a base class for mapper toolkits. For example, consider this utility for accessing RDF properties via qname-like -attributes:: - - >>> class Item(Resource): - ... - ... def __getattr__(self, p): - ... return list(self.objects(self._to_ref(*p.split('_', 1)))) - ... - ... def _to_ref(self, pfx, name): - ... return URIRef(self._graph.store.namespace(pfx) + name) - -It works as follows:: - - >>> graph = Graph().parse(format='n3', data=''' - ... @prefix rdfs: . - ... @prefix foaf: . - ... - ... @base . - ... - ... foaf:name "Some Body"; - ... foaf:depiction . - ... rdfs:comment "Just an image"@en . - ... ''') - - >>> person = Item(graph, URIRef("http://example.org/person/some1#self")) - - >>> print(person.foaf_name[0]) - Some Body +attributes: + +```python +>>> class Item(Resource): +... +... def __getattr__(self, p): +... return list(self.objects(self._to_ref(*p.split('_', 1)))) +... +... def _to_ref(self, pfx, name): +... return URIRef(self._graph.store.namespace(pfx) + name) + +``` + +It works as follows: + +```python +>>> graph = Graph().parse(format='n3', data=''' +... @prefix rdfs: . +... @prefix foaf: . +... +... @base . +... +... foaf:name "Some Body"; +... foaf:depiction . +... rdfs:comment "Just an image"@en . +... ''') + +>>> person = Item(graph, URIRef("http://example.org/person/some1#self")) + +>>> print(person.foaf_name[0]) +Some Body + +``` The mechanism for wrapping references as resources cooperates with subclasses. 
-Therefore, accessing referenced resources automatically creates new ``Item`` -objects:: +Therefore, accessing referenced resources automatically creates new `Item` +objects: - >>> isinstance(person.foaf_depiction[0], Item) - True +```python +>>> isinstance(person.foaf_depiction[0], Item) +True - >>> print(person.foaf_depiction[0].rdfs_comment[0]) - Just an image +>>> print(person.foaf_depiction[0].rdfs_comment[0]) +Just an image +``` """ from rdflib.namespace import RDF @@ -294,6 +358,8 @@ class Resource: + """A Resource is a wrapper for a graph and a resource identifier.""" + def __init__(self, graph, subject): self._graph = graph self._identifier = subject diff --git a/rdflib/serializer.py b/rdflib/serializer.py index 6f1230d590..f7f4e250bd 100644 --- a/rdflib/serializer.py +++ b/rdflib/serializer.py @@ -1,13 +1,11 @@ -""" -Serializer plugin interface. +"""Serializer plugin interface. This module is useful for those wanting to write a serializer that can plugin to rdflib. If you are wanting to invoke a serializer you likely want to do so through the Graph class serialize method. TODO: info for how to write a serializer that can plugin to rdflib. -See also rdflib.plugin - +See also [`rdflib.plugin`][rdflib.plugin] """ from __future__ import annotations diff --git a/rdflib/store.py b/rdflib/store.py index 86dabf1854..397577e5c9 100644 --- a/rdflib/store.py +++ b/rdflib/store.py @@ -1,12 +1,6 @@ -""" -============ -rdflib.store -============ - -Types of store --------------- +"""## Types of store -``Context-aware``: An RDF store capable of storing statements within contexts +`Context-aware`: An RDF store capable of storing statements within contexts is considered context-aware. Essentially, such a store is able to partition the RDF model it represents into individual, named, and addressable sub-graphs. 
@@ -14,15 +8,13 @@ Relevant Notation3 reference regarding formulae, quoted statements, and such: http://www.w3.org/DesignIssues/Notation3.html -``Formula-aware``: An RDF store capable of distinguishing between statements +`Formula-aware`: An RDF store capable of distinguishing between statements that are asserted and statements that are quoted is considered formula-aware. -``Transaction-capable``: capable of providing transactional integrity to the +`Transaction-capable`: capable of providing transactional integrity to the RDF operations performed on it. -``Graph-aware``: capable of keeping track of empty graphs. - ------- +`Graph-aware`: capable of keeping track of empty graphs. """ from __future__ import annotations @@ -79,34 +71,30 @@ class StoreCreatedEvent(Event): - """ - This event is fired when the Store is created, it has the following - attribute: - - - ``configuration``: string used to create the store + """This event is fired when the Store is created. + Attributes: + configuration: String used to create the store """ class TripleAddedEvent(Event): - """ - This event is fired when a triple is added, it has the following - attributes: + """This event is fired when a triple is added. - - the ``triple`` added to the graph - - the ``context`` of the triple, if any - - the ``graph`` to which the triple was added + Attributes: + triple: The triple added to the graph. + context: The context of the triple, if any. + graph: The graph to which the triple was added. """ class TripleRemovedEvent(Event): - """ - This event is fired when a triple is removed, it has the following - attributes: + """This event is fired when a triple is removed. - - the ``triple`` removed from the graph - - the ``context`` of the triple, if any - - the ``graph`` from which the triple was removed + Attributes: + triple: The triple removed from the graph. + context: The context of the triple, if any. + graph: The graph from which the triple was removed. 
""" @@ -174,10 +162,12 @@ def __init__( configuration: Optional[str] = None, identifier: Optional[Identifier] = None, ): - """ - identifier: URIRef of the Store. Defaults to CWD - configuration: string containing information open can use to - connect to datastore. + """Initialize the Store. + + Args: + identifier: URIRef of the Store. Defaults to CWD + configuration: String containing information open can use to + connect to datastore. """ self.__node_pickler: Optional[NodePickler] = None self.dispatcher = Dispatcher() @@ -208,33 +198,38 @@ def create(self, configuration: str) -> None: def open( self, configuration: Union[str, tuple[str, str]], create: bool = False ) -> Optional[int]: - """Opens the store specified by the configuration string or tuple. If - create is True a store will be created if it does not already - exist. If create is False and a store does not already exist - an exception is raised. An exception is also raised if a store - exists, but there is insufficient permissions to open the - store. This should return one of: - VALID_STORE, CORRUPTED_STORE, or NO_STORE + """Opens the store specified by the configuration string. + + Args: + configuration: Store configuration string + create: If True, a store will be created if it doesn't exist. + If False and the store doesn't exist, an exception is raised. + + Returns: + One of: VALID_STORE, CORRUPTED_STORE, or NO_STORE + + Raises: + Exception: If there are insufficient permissions to open the store. """ return UNKNOWN def close(self, commit_pending_transaction: bool = False) -> None: - """ - This closes the database connection. The commit_pending_transaction - parameter specifies whether to commit all pending transactions before - closing (if the store is transactional). + """Closes the database connection. + + Args: + commit_pending_transaction: Whether to commit all pending + transactions before closing (if the store is transactional). 
""" def destroy(self, configuration: str) -> None: - """ - This destroys the instance of the store identified by the - configuration string. + """Destroys the instance of the store. + + Args: + configuration: The configuration string identifying the store instance. """ def gc(self) -> None: - """ - Allows the store to perform any needed garbage collection - """ + """Allows the store to perform any needed garbage collection.""" pass # RDF APIs @@ -244,22 +239,32 @@ def add( context: _ContextType, quoted: bool = False, ) -> None: - """ - Adds the given statement to a specific context or to the model. The - quoted argument is interpreted by formula-aware stores to indicate - this statement is quoted/hypothetical It should be an error to not - specify a context and have the quoted argument be True. It should also - be an error for the quoted argument to be True when the store is not - formula-aware. + """Adds the given statement to a specific context or to the model. + + Args: + triple: The triple to add + context: The context to add the triple to + quoted: If True, indicates this statement is quoted/hypothetical + (for formula-aware stores) + + Note: + It should be an error to not specify a context and have the quoted + argument be True. It should also be an error for the quoted argument + to be True when the store is not formula-aware. """ self.dispatcher.dispatch(TripleAddedEvent(triple=triple, context=context)) def addN(self, quads: Iterable[_QuadType]) -> None: # noqa: N802 - """ - Adds each item in the list of statements to a specific context. The - quoted argument is interpreted by formula-aware stores to indicate this - statement is quoted/hypothetical. Note that the default implementation - is a redirect to add + """Adds each item in the list of statements to a specific context. + + The quoted argument is interpreted by formula-aware stores to indicate this + statement is quoted/hypothetical. + + Note: + The default implementation is a redirect to add. 
+
+        Args:
+            quads: An iterable of quads to add
 """ for s, p, o, c in quads: assert c is not None, "Context associated with %s %s %s is None!" % ( @@ -349,9 +354,10 @@ def triples( # type: ignore[return] for example, REGEXTerm, URIRef, Literal, BNode, Variable, Graph, QuotedGraph, Date? DateRange? - :param context: A conjunctive query can be indicated by either - providing a value of None, or a specific context can be - queries by passing a Graph instance (if store is context aware). + Args: + context: A conjunctive query can be indicated by either + providing a value of None, or a specific context can be + queried by passing a Graph instance (if store is context aware). """ subject, predicate, object = triple_pattern @@ -365,7 +371,8 @@ def __len__(self, context: Optional[_ContextType] = None) -> int: # type: ignor otherwise it should return the number of statements in the formula or context given. - :param context: a graph instance to query or None + Args: + context: a graph instance to query or None """ # type error: Missing return statement @@ -390,17 +397,15 @@ def query( queryGraph: str, # noqa: N803 **kwargs: Any, ) -> Result: - """ - If stores provide their own SPARQL implementation, override this. + """If stores provide their own SPARQL implementation, override this. - queryGraph is None, a URIRef or '__UNION__' + queryGraph is None, a URIRef or `__UNION__` If None the graph is specified in the query-string/object If URIRef it specifies the graph to query, - If '__UNION__' the union of all named graphs should be queried + If `__UNION__` the union of all named graphs should be queried (This is used by ConjunctiveGraph.) Values other than None obviously only make sense for context-aware stores. - """ raise NotImplementedError @@ -413,18 +418,15 @@ def update( queryGraph: str, # noqa: N803 **kwargs: Any, ) -> None: - """ - If stores provide their own (SPARQL) Update implementation, - override this.
+ """If stores provide their own (SPARQL) Update implementation, override this. - queryGraph is None, a URIRef or '__UNION__' + queryGraph is None, a URIRef or `__UNION__` If None the graph is specified in the query-string/object If URIRef it specifies the graph to query, - If '__UNION__' the union of all named graphs should be queried + If `__UNION__` the union of all named graphs should be queried (This is used by ConjunctiveGraphs Values other than None obviously only makes sense for context-aware stores.) - """ raise NotImplementedError @@ -432,8 +434,13 @@ def update( # Optional Namespace methods def bind(self, prefix: str, namespace: URIRef, override: bool = True) -> None: - """ - :param override: rebind, even if the given namespace is already bound to another prefix. + """Bind a namespace to a prefix. + + Args: + prefix: The prefix to bind the namespace to. + namespace: The URIRef of the namespace to bind. + override: If True, rebind even if the given namespace is already bound + to another prefix """ def prefix(self, namespace: URIRef) -> Optional[str]: @@ -461,18 +468,19 @@ def rollback(self) -> None: # Optional graph methods def add_graph(self, graph: Graph) -> None: - """ - Add a graph to the store, no effect if the graph already + """Add a graph to the store, no effect if the graph already exists. - :param graph: a Graph instance + + Args: + graph: a Graph instance """ raise Exception("Graph method called on non-graph_aware store") def remove_graph(self, graph: Graph) -> None: - """ - Remove a graph from the store, this should also remove all + """Remove a graph from the store, this should also remove all triples in the graph - :param graphid: a Graph instance + Args: + graphid: a Graph instance """ raise Exception("Graph method called on non-graph_aware store") diff --git a/rdflib/term.py b/rdflib/term.py index 254bc9d62c..d9f0646175 100644 --- a/rdflib/term.py +++ b/rdflib/term.py @@ -3,14 +3,14 @@ objects that can appear in a quoted/asserted triple. 
This includes those that are core to RDF: -* :class:`Blank Nodes ` -* :class:`URI References ` -* :class:`Literals ` (which consist of a literal value,datatype and language tag) +* [Blank Nodes][rdflib.term.BNode] - Blank Nodes +* [URI References][rdflib.term.URIRef] - URI References +* [Literals][rdflib.term.Literal] - Literals (which consist of a literal value, datatype and language tag) Those that extend the RDF model into N3: -* :class:`Formulae ` -* :class:`Universal Quantifications (Variables) ` +* [`QuotedGraph`][rdflib.graph.QuotedGraph] - Formulae +* [`Variable`][rdflib.term.Variable] - Universal Quantifications (Variables) And those that are primarily for matching against 'Nodes' in the underlying Graph: @@ -18,7 +18,6 @@ * REGEX Expressions * Date Ranges * Numerical Ranges - """ from __future__ import annotations @@ -135,9 +134,7 @@ def _is_valid_unicode(value: Union[str, bytes]) -> bool: class Node(abc.ABC): - """ - A Node in the Graph. - """ + """A Node in the Graph.""" __slots__ = () @@ -146,10 +143,8 @@ def n3(self, namespace_manager: Optional[NamespaceManager] = None) -> str: ... class Identifier(Node, str): # allow Identifiers to be Nodes in the Graph - """ - See http://www.w3.org/2002/07/rdf-identifer-terminology/ - regarding choice of terminology. - """ + """See http://www.w3.org/2002/07/rdf-identifer-terminology/ + regarding choice of terminology.""" __slots__ = () @@ -170,9 +165,9 @@ def __ne__(self, other: Any) -> bool: return not self.__eq__(other) def __eq__(self, other: Any) -> bool: - """ - Equality for Nodes. + """Equality for Nodes. + ```python >>> BNode("foo")==None False >>> BNode("foo")==URIRef("foo") @@ -187,6 +182,8 @@ def __eq__(self, other: Any) -> bool: True >>> Variable('a')!=Variable('a') False + + ``` """ if type(self) is type(other): @@ -195,15 +192,13 @@ def __eq__(self, other: Any) -> bool: return False def __gt__(self, other: Any) -> bool: - """ - This implements ordering for Nodes, + """This implements ordering for Nodes. 
This tries to implement this: http://www.w3.org/TR/sparql11-query/#modOrderBy Variables are not included in the SPARQL list, but they are greater than BNodes and smaller than everything else - """ if other is None: return True # everything bigger than None @@ -269,10 +264,10 @@ def toPython(self) -> str: # noqa: N802 class URIRef(IdentifiedNode): - """ - RDF 1.1's IRI Section https://www.w3.org/TR/rdf11-concepts/#section-IRIs + """[RDF 1.1's IRI Section](https://www.w3.org/TR/rdf11-concepts/#section-IRIs) - .. note:: Documentation on RDF outside of RDFLib uses the term IRI or URI whereas this class is called URIRef. This is because it was made when the first version of the RDF specification was current, and it used the term *URIRef*, see `RDF 1.0 URIRef `_ + !!! info "Terminology" + Documentation on RDF outside of RDFLib uses the term IRI or URI whereas this class is called URIRef. This is because it was made when the first version of the RDF specification was current, and it used the term *URIRef*, see [RDF 1.0 URIRef](http://www.w3.org/TR/rdf-concepts/#section-Graph-URIref) An IRI (Internationalized Resource Identifier) within an RDF graph is a Unicode string that conforms to the syntax defined in RFC 3987. 
@@ -310,13 +305,12 @@ def __new__(cls, value: str, base: Optional[str] = None): return rt def n3(self, namespace_manager: Optional[NamespaceManager] = None) -> str: - """ - This will do a limited check for valid URIs, + """This will do a limited check for valid URIs, essentially just making sure that the string includes no illegal - characters (``<, >, ", {, }, |, \\, `, ^``) + characters (`<, >, ", {, }, |, \\, `, ^`) - :param namespace_manager: if not None, will be used to make up - a prefixed name + Args: + namespace_manager: if not None, will be used to make up a prefixed name """ if not _is_valid_uri(self): @@ -338,13 +332,15 @@ def defrag(self) -> URIRef: @property def fragment(self) -> str: - """ - Return the URL Fragment + """Return the URL Fragment + ```python >>> URIRef("http://example.com/some/path/#some-fragment").fragment 'some-fragment' >>> URIRef("http://example.com/some/path/").fragment '' + + ``` """ return urlparse(self).fragment @@ -374,7 +370,7 @@ def de_skolemize(self) -> BNode: This function accepts only rdflib type skolemization, to provide a round-tripping within the system. - .. versionadded:: 4.0 + Added in version 4.0 """ if isinstance(self, RDFLibGenid): parsed_uri = urlparse(f"{self}") @@ -451,7 +447,7 @@ class BNode(IdentifiedNode): --- - RDFLib's ``BNode`` class makes unique IDs for all the Blank Nodes in a Graph but you + RDFLib's `BNode` class makes unique IDs for all the Blank Nodes in a Graph but you should *never* expect, or reply on, BNodes' IDs to match across graphs, or even for multiple copies of the same graph, if they are regenerated from some non-RDFLib source, such as loading from RDF data. @@ -516,7 +512,7 @@ def skolemize( """Create a URIRef "skolem" representation of the BNode, in accordance with http://www.w3.org/TR/rdf11-concepts/#section-skolemization - .. 
versionadded:: 4.0 + Added in version 4.0 """ if authority is None: authority = _SKOLEM_DEFAULT_AUTHORITY @@ -537,41 +533,52 @@ class Literal(Identifier): * a lexical form, being a Unicode string, which SHOULD be in Normal Form C * a datatype IRI, being an IRI identifying a datatype that determines how the lexical form maps to a literal value, and - * if and only if the datatype IRI is ``http://www.w3.org/1999/02/22-rdf-syntax-ns#langString``, a non-empty language tag. The language tag MUST be well-formed according to section 2.2.9 of `Tags for identifying languages `_. + * if and only if the datatype IRI is `http://www.w3.org/1999/02/22-rdf-syntax-ns#langString`, a non-empty language tag. The language tag MUST be well-formed according to section 2.2.9 of `Tags for identifying languages `_. A literal is a language-tagged string if the third element is present. Lexical representations of language tags MAY be converted to lower case. The value space of language tags is always in lower case. --- For valid XSD datatypes, the lexical form is optionally normalized - at construction time. Default behaviour is set by rdflib.NORMALIZE_LITERALS - and can be overridden by the normalize parameter to __new__ + at construction time. 
Default behaviour is set by `rdflib.NORMALIZE_LITERALS` + and can be overridden by the normalize parameter to `__new__` Equality and hashing of Literals are done based on the lexical form, i.e.: + ```python >>> from rdflib.namespace import XSD - >>> Literal('01') != Literal('1') # clear - strings differ True + ``` + but with data-type they get normalized: + ```python >>> Literal('01', datatype=XSD.integer) != Literal('1', datatype=XSD.integer) False + ``` + unless disabled: + ```python >>> Literal('01', datatype=XSD.integer, normalize=False) != Literal('1', datatype=XSD.integer) True + ``` Value based comparison is possible: + ```python >>> Literal('01', datatype=XSD.integer).eq(Literal('1', datatype=XSD.float)) True + ``` + The eq method also provides limited support for basic python types: + ```python >>> Literal(1).eq(1) # fine - int compatible with xsd:integer True >>> Literal('a').eq('b') # fine - str compatible with plain-lit @@ -581,6 +588,8 @@ class Literal(Identifier): >>> Literal('a').eq(1) # not fine, int incompatible with plain-lit NotImplemented + ``` + Greater-than/less-than ordering comparisons are also done in value space, when compatible datatypes are used. Incompatible datatypes are ordered by DT, or by lang-tag. For other nodes the ordering @@ -589,6 +598,7 @@ class Literal(Identifier): Any comparison with non-rdflib Node are "NotImplemented" In PY3 this is an error. 
+ ```python >>> from rdflib import Literal, XSD >>> lit2006 = Literal('2006-01-01',datatype=XSD.date) >>> lit2006.toPython() @@ -610,12 +620,15 @@ class Literal(Identifier): >>> Literal(1) > URIRef('foo') # by node-type True + ``` + The > < operators will eat this NotImplemented and throw a TypeError (py3k): + ```python >>> Literal(1).__gt__(2.0) NotImplemented - + ``` """ _value: Any @@ -632,6 +645,7 @@ def __new__( datatype: Optional[str] = None, normalize: Optional[bool] = None, ): + """Create a new Literal instance.""" if lang == "": lang = None # no empty lang-tags in RDF @@ -716,14 +730,21 @@ def normalize(self) -> Literal: """ Returns a new literal with a normalised lexical representation of this literal + + ```python >>> from rdflib import XSD >>> Literal("01", datatype=XSD.integer, normalize=False).normalize() rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + ``` + Illegal lexical forms for the datatype given are simply passed on + + ```python >>> Literal("a", datatype=XSD.integer, normalize=False) rdflib.term.Literal('a', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + ``` """ if self.value is not None: @@ -774,6 +795,7 @@ def __setstate__(self, arg: Tuple[Any, Dict[str, Any]]) -> None: def __add__(self, val: Any) -> Literal: """ + ```python >>> from rdflib.namespace import XSD >>> Literal(1) + 1 rdflib.term.Literal('2', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) @@ -790,6 +812,8 @@ def __add__(self, val: Any) -> Literal: >>> b = Literal('P122DT15H58M', datatype=XSD.duration) >>> (a + b) rdflib.term.Literal('2006-11-01T12:50:00', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#dateTime')) + + ``` """ # if no val is supplied, return this Literal @@ -877,36 +901,45 @@ def __add__(self, val: Any) -> Literal: return Literal(s, self.language, datatype=new_datatype) def __sub__(self, val: Any) -> Literal: - """ - >>> from rdflib.namespace 
import XSD - >>> Literal(2) - 1 - rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - >>> Literal(1.1) - 1.0 - rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#double')) - >>> Literal(1.1) - 1 - rdflib.term.Literal('0.1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#decimal')) - >>> Literal(1.1, datatype=XSD.float) - Literal(1.0, datatype=XSD.float) - rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#float')) - >>> Literal("1.1") - 1.0 # doctest: +IGNORE_EXCEPTION_DETAIL - Traceback (most recent call last): - ... - TypeError: Not a number; rdflib.term.Literal('1.1') - >>> Literal(1.1, datatype=XSD.integer) - Literal(1.0, datatype=XSD.integer) - rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + """Implements subtraction between Literals or between a Literal and a Python object. 
- # Handling dateTime/date/time based operations in Literals - >>> a = Literal('2006-01-01T20:50:00', datatype=XSD.dateTime) - >>> b = Literal('2006-02-01T20:50:00', datatype=XSD.dateTime) - >>> (b - a) - rdflib.term.Literal('P31D', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) - >>> from rdflib.namespace import XSD - >>> a = Literal('2006-07-01T20:52:00', datatype=XSD.dateTime) - >>> b = Literal('2006-11-01T12:50:00', datatype=XSD.dateTime) - >>> (a - b) - rdflib.term.Literal('-P122DT15H58M', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) - >>> (b - a) - rdflib.term.Literal('P122DT15H58M', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) + Example: + ```python + from rdflib.namespace import XSD + + # Basic numeric subtraction + Literal(2) - 1 + # rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + Literal(1.1) - 1.0 + # rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#double')) + + Literal(1.1) - 1 + # rdflib.term.Literal('0.1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#decimal')) + + Literal(1.1, datatype=XSD.float) - Literal(1.0, datatype=XSD.float) + # rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#float')) + + # This will raise a TypeError + Literal("1.1") - 1.0 + # TypeError: Not a number; rdflib.term.Literal('1.1') + + Literal(1.1, datatype=XSD.integer) - Literal(1.0, datatype=XSD.integer) + # rdflib.term.Literal('0.10000000000000009', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Handling dateTime/date/time based operations in Literals + a = Literal('2006-01-01T20:50:00', datatype=XSD.dateTime) + b = Literal('2006-02-01T20:50:00', datatype=XSD.dateTime) + (b - a) + # rdflib.term.Literal('P31D', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) 
+ + a = Literal('2006-07-01T20:52:00', datatype=XSD.dateTime) + b = Literal('2006-11-01T12:50:00', datatype=XSD.dateTime) + (a - b) + # rdflib.term.Literal('-P122DT15H58M', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) + (b - a) + # rdflib.term.Literal('P122DT15H58M', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#duration')) + ``` """ # if no val is supplied, return this Literal if val is None: @@ -981,29 +1014,36 @@ def __sub__(self, val: Any) -> Literal: ) def __bool__(self) -> bool: - """ - Is the Literal "True" - This is used for if statements, bool(literal), etc. + """Determines the truth value of the Literal. + + Used for if statements, bool(literal), etc. """ if self.value is not None: return bool(self.value) return len(self) != 0 def __neg__(self) -> Literal: - """ - >>> (- Literal(1)) - rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - >>> (- Literal(10.5)) - rdflib.term.Literal('-10.5', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#double')) - >>> from rdflib.namespace import XSD - >>> (- Literal("1", datatype=XSD.integer)) - rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - >>> (- Literal("1")) - Traceback (most recent call last): - File "", line 1, in - TypeError: Not a number; rdflib.term.Literal('1') - >>> + """Implements unary negation for Literals with numeric values. 
+ + Example: + ```python + # Negating an integer Literal + -Literal(1) + # rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Negating a float Literal + -Literal(10.5) + # rdflib.term.Literal('-10.5', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#double')) + + # Using a string with a datatype + from rdflib.namespace import XSD + -Literal("1", datatype=XSD.integer) + # rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # This will raise a TypeError + -Literal("1") + # TypeError: Not a number; rdflib.term.Literal('1') + ``` """ if isinstance(self.value, (int, long_type, float)): @@ -1012,19 +1052,27 @@ def __neg__(self) -> Literal: raise TypeError(f"Not a number; {self!r}") def __pos__(self) -> Literal: - """ - >>> (+ Literal(1)) - rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - >>> (+ Literal(-1)) - rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - >>> from rdflib.namespace import XSD - >>> (+ Literal("-1", datatype=XSD.integer)) - rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - >>> (+ Literal("1")) - Traceback (most recent call last): - File "", line 1, in - TypeError: Not a number; rdflib.term.Literal('1') + """Implements unary plus operation for Literals with numeric values. 
+ + Example: + ```python + # Applying unary plus to an integer Literal + +Literal(1) + # rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Applying unary plus to a negative integer Literal + +Literal(-1) + # rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Using a string with a datatype + from rdflib.namespace import XSD + +Literal("-1", datatype=XSD.integer) + # rdflib.term.Literal('-1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # This will raise a TypeError + +Literal("1") + # TypeError: Not a number; rdflib.term.Literal('1') + ``` """ if isinstance(self.value, (int, long_type, float)): return Literal(self.value.__pos__()) @@ -1032,18 +1080,23 @@ def __pos__(self) -> Literal: raise TypeError(f"Not a number; {self!r}") def __abs__(self) -> Literal: - """ - >>> abs(Literal(-1)) - rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - >>> from rdflib.namespace import XSD - >>> abs( Literal("-1", datatype=XSD.integer)) - rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - >>> abs(Literal("1")) - Traceback (most recent call last): - File "", line 1, in - TypeError: Not a number; rdflib.term.Literal('1') + """Implements absolute value operation for Literals with numeric values. 
+ + Example: + ```python + # Absolute value of a negative integer Literal + abs(Literal(-1)) + # rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Using a string with a datatype + from rdflib.namespace import XSD + abs(Literal("-1", datatype=XSD.integer)) + # rdflib.term.Literal('1', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # This will raise a TypeError + abs(Literal("1")) + # TypeError: Not a number; rdflib.term.Literal('1') + ``` """ if isinstance(self.value, (int, long_type, float)): return Literal(self.value.__abs__()) @@ -1051,20 +1104,23 @@ def __abs__(self) -> Literal: raise TypeError(f"Not a number; {self!r}") def __invert__(self) -> Literal: - """ - >>> ~(Literal(-1)) - rdflib.term.Literal('0', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - >>> from rdflib.namespace import XSD - >>> ~( Literal("-1", datatype=XSD.integer)) - rdflib.term.Literal('0', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) - - Not working: - - >>> ~(Literal("1")) - Traceback (most recent call last): - File "", line 1, in - TypeError: Not a number; rdflib.term.Literal('1') + """Implements bitwise NOT operation for Literals with numeric values. 
+ + Example: + ```python + # Bitwise NOT of a negative integer Literal + ~(Literal(-1)) + # rdflib.term.Literal('0', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # Using a string with a datatype + from rdflib.namespace import XSD + ~(Literal("-1", datatype=XSD.integer)) + # rdflib.term.Literal('0', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#integer')) + + # This will raise a TypeError + ~(Literal("1")) + # TypeError: Not a number; rdflib.term.Literal('1') + ``` """ if isinstance(self.value, (int, long_type, float)): # type error: Unsupported operand type for ~ ("float") @@ -1073,40 +1129,52 @@ def __invert__(self) -> Literal: raise TypeError(f"Not a number; {self!r}") def __gt__(self, other: Any) -> bool: - """ + """Implements the greater-than comparison for Literals. - This implements ordering for Literals, - the other comparison methods delegate here + This is the base method for ordering comparisons - other comparison methods delegate here. - This tries to implement this: - http://www.w3.org/TR/sparql11-query/#modOrderBy + Implements the ordering rules described in http://www.w3.org/TR/sparql11-query/#modOrderBy - In short, Literals with compatible data-types are ordered in value - space, i.e. - >>> from rdflib import XSD + In summary: + 1. Literals with compatible data-types are ordered in value space + 2. Incompatible datatypes are ordered by their datatype URIs + 3. Literals with language tags are ordered by their language tags + 4. Plain literals come before xsd:string literals + 5. 
In the node order: None < BNode < URIRef < Literal - >>> Literal(1) > Literal(2) # int/int - False - >>> Literal(2.0) > Literal(1) # double/int - True - >>> from decimal import Decimal - >>> Literal(Decimal("3.3")) > Literal(2.0) # decimal/double - True - >>> Literal(Decimal("3.3")) < Literal(4.0) # decimal/double - True - >>> Literal('b') > Literal('a') # plain lit/plain lit - True - >>> Literal('b') > Literal('a', datatype=XSD.string) # plain lit/xsd:str - True + Example: + ```python + from rdflib import XSD + from decimal import Decimal - Incompatible datatype mismatches ordered by DT + # Comparing numeric literals in value space + Literal(1) > Literal(2) # int/int + # False - >>> Literal(1) > Literal("2") # int>string - False + Literal(2.0) > Literal(1) # double/int + # True - Langtagged literals by lang tag - >>> Literal("a", lang="en") > Literal("a", lang="fr") - False + Literal(Decimal("3.3")) > Literal(2.0) # decimal/double + # True + + Literal(Decimal("3.3")) < Literal(4.0) # decimal/double + # True + + # Comparing string literals + Literal('b') > Literal('a') # plain lit/plain lit + # True + + Literal('b') > Literal('a', datatype=XSD.string) # plain lit/xsd:str + # True + + # Incompatible datatypes ordered by DT + Literal(1) > Literal("2") # int>string + # False + + # Langtagged literals ordered by lang tag + Literal("a", lang="en") > Literal("a", lang="fr") + # False + ``` """ if other is None: return True # Everything is greater than None @@ -1186,11 +1254,14 @@ def __lt__(self, other: Any) -> bool: return NotImplemented def __le__(self, other: Any) -> bool: - """ - >>> from rdflib.namespace import XSD - >>> Literal('2007-01-01T10:00:00', datatype=XSD.dateTime - ... ) <= Literal('2007-01-01T10:00:00', datatype=XSD.dateTime) - True + """Less than or equal operator for Literals. 
+ + Example: + ```python + from rdflib.namespace import XSD + Literal('2007-01-01T10:00:00', datatype=XSD.dateTime) <= Literal('2007-01-01T10:00:00', datatype=XSD.dateTime) + # True + ``` """ r = self.__lt__(other) if r: @@ -1210,10 +1281,7 @@ def __ge__(self, other: Any) -> bool: return NotImplemented def _comparable_to(self, other: Any) -> bool: - """ - Helper method to decide which things are meaningful to - rich-compare with this literal - """ + """Helper method to decide which things are meaningful to rich-compare with this literal.""" if isinstance(other, Literal): if self.datatype is not None and other.datatype is not None: # two datatyped literals @@ -1243,33 +1311,34 @@ def _comparable_to(self, other: Any) -> bool: # Subclass: def __hash__(self) -> int # NOTE for type ignore: This can possibly be fixed by changing how __hash__ is implemented in Identifier def __hash__(self) -> int: # type: ignore[override] - """ - >>> from rdflib.namespace import XSD - >>> a = {Literal('1', datatype=XSD.integer):'one'} - >>> Literal('1', datatype=XSD.double) in a - False - - - "Called for the key object for dictionary operations, - and by the built-in function hash(). Should return - a 32-bit integer usable as a hash value for - dictionary operations. The only required property - is that objects which compare equal have the same - hash value; it is advised to somehow mix together - (e.g., using exclusive or) the hash values for the - components of the object that also play a part in - comparison of objects." -- 3.4.1 Basic customization (Python) - - "Two literals are equal if and only if all of the following hold: - * The strings of the two lexical forms compare equal, character by - character. - * Either both or neither have language tags. - * The language tags, if any, compare equal. - * Either both or neither have datatype URIs. - * The two datatype URIs, if any, compare equal, character by - character." 
- -- 6.5.1 Literal Equality (RDF: Concepts and Abstract Syntax) - + """Hash function for Literals to enable their use as dictionary keys. + + Example: + ```python + from rdflib.namespace import XSD + a = {Literal('1', datatype=XSD.integer):'one'} + Literal('1', datatype=XSD.double) in a + # False + ``` + + Notes: + "Called for the key object for dictionary operations, + and by the built-in function hash(). Should return + a 32-bit integer usable as a hash value for + dictionary operations. The only required property + is that objects which compare equal have the same + hash value; it is advised to somehow mix together + (e.g., using exclusive or) the hash values for the + components of the object that also play a part in + comparison of objects." -- 3.4.1 Basic customization (Python) + + "Two literals are equal if and only if all of the following hold: + * The strings of the two lexical forms compare equal, character by character. + * Either both or neither have language tags. + * The language tags, if any, compare equal. + * Either both or neither have datatype URIs. + * The two datatype URIs, if any, compare equal, character by character." + -- 6.5.1 Literal Equality (RDF: Concepts and Abstract Syntax) """ # don't use super()... for efficiency reasons, see Identifier.__hash__ res = str.__hash__(self) @@ -1281,40 +1350,47 @@ def __hash__(self) -> int: # type: ignore[override] return res def __eq__(self, other: Any) -> bool: - """ - Literals are only equal to other literals. - - "Two literals are equal if and only if all of the following hold: - * The strings of the two lexical forms compare equal, character by character. - * Either both or neither have language tags. - * The language tags, if any, compare equal. - * Either both or neither have datatype URIs. - * The two datatype URIs, if any, compare equal, character by character." - -- 6.5.1 Literal Equality (RDF: Concepts and Abstract Syntax) + """Equality operator for Literals. 
- >>> Literal("1", datatype=URIRef("foo")) == Literal("1", datatype=URIRef("foo")) - True - >>> Literal("1", datatype=URIRef("foo")) == Literal("1", datatype=URIRef("foo2")) - False - - >>> Literal("1", datatype=URIRef("foo")) == Literal("2", datatype=URIRef("foo")) - False - >>> Literal("1", datatype=URIRef("foo")) == "asdf" - False - >>> from rdflib import XSD - >>> Literal('2007-01-01', datatype=XSD.date) == Literal('2007-01-01', datatype=XSD.date) - True - >>> Literal('2007-01-01', datatype=XSD.date) == date(2007, 1, 1) - False - >>> Literal("one", lang="en") == Literal("one", lang="en") - True - >>> Literal("hast", lang='en') == Literal("hast", lang='de') - False - >>> Literal("1", datatype=XSD.integer) == Literal(1) - True - >>> Literal("1", datatype=XSD.integer) == Literal("01", datatype=XSD.integer) - True + Literals are only equal to other literals. + Notes: + "Two literals are equal if and only if all of the following hold: + * The strings of the two lexical forms compare equal, character by character. + * Either both or neither have language tags. + * The language tags, if any, compare equal. + * Either both or neither have datatype URIs. + * The two datatype URIs, if any, compare equal, character by character." 
+ -- 6.5.1 Literal Equality (RDF: Concepts and Abstract Syntax) + + Example: + ```python + Literal("1", datatype=URIRef("foo")) == Literal("1", datatype=URIRef("foo")) + # True + Literal("1", datatype=URIRef("foo")) == Literal("1", datatype=URIRef("foo2")) + # False + + Literal("1", datatype=URIRef("foo")) == Literal("2", datatype=URIRef("foo")) + # False + Literal("1", datatype=URIRef("foo")) == "asdf" + # False + + from rdflib import XSD + Literal('2007-01-01', datatype=XSD.date) == Literal('2007-01-01', datatype=XSD.date) + # True + Literal('2007-01-01', datatype=XSD.date) == date(2007, 1, 1) + # False + + Literal("one", lang="en") == Literal("one", lang="en") + # True + Literal("hast", lang='en') == Literal("hast", lang='de') + # False + + Literal("1", datatype=XSD.integer) == Literal(1) + # True + Literal("1", datatype=XSD.integer) == Literal("01", datatype=XSD.integer) + # True + ``` """ if self is other: return True @@ -1332,26 +1408,20 @@ def __eq__(self, other: Any) -> bool: return False def eq(self, other: Any) -> bool: - """ - Compare the value of this literal with something else - - Either, with the value of another literal - comparisons are then done in literal "value space", - and according to the rules of XSD subtype-substitution/type-promotion - - OR, with a python object: + """Compare the value of this literal with something else. - basestring objects can be compared with plain-literals, - or those with datatype xsd:string + This comparison can be done in two ways: - bool objects with xsd:boolean + 1. With the value of another literal - comparisons are then done in literal "value space" + according to the rules of XSD subtype-substitution/type-promotion - a int, long or float with numeric xsd types - - isodate date,time,datetime objects with xsd:date,xsd:time or xsd:datetime - - Any other operations returns NotImplemented + 2. 
With a Python object:
+           * string objects can be compared with plain-literals or those with datatype xsd:string
+           * bool objects with xsd:boolean
+           * int, long or float with numeric xsd types
+           * date, time, datetime objects with xsd:date, xsd:time, xsd:datetime
+        Any other operation returns NotImplemented.
         """
         if isinstance(other, Literal):
             # Fast path for comparing numeric literals
@@ -1459,57 +1529,71 @@ def neq(self, other: Any) -> bool:
         return not self.eq(other)
 
     def n3(self, namespace_manager: Optional[NamespaceManager] = None) -> str:
-        r'''
-        Returns a representation in the N3 format.
+        r'''Returns a representation in the N3 format.
 
-        Examples::
+        ```python
+        >>> Literal("foo").n3()
+        '"foo"'
 
-        >>> Literal("foo").n3()
-        '"foo"'
+        ```
 
-        Strings with newlines or triple-quotes::
+        Strings with newlines or triple-quotes:
 
-        >>> Literal("foo\nbar").n3()
-        '"""foo\nbar"""'
+        ```python
+        >>> Literal("foo\nbar").n3()
+        '"""foo\nbar"""'
+        >>> Literal("''\'").n3()
+        '"\'\'\'"'
+        >>> Literal('"""').n3()
+        '"\\"\\"\\""'
 
-        >>> Literal("''\'").n3()
-        '"\'\'\'"'
+        ```
 
-        >>> Literal('"""').n3()
-        '"\\"\\"\\""'
+        Language:
 
-        Language::
+        ```python
+        >>> Literal("hello", lang="en").n3()
+        '"hello"@en'
 
-        >>> Literal("hello", lang="en").n3()
-        '"hello"@en'
+        ```
 
-        Datatypes::
+        Datatypes:
 
-        >>> Literal(1).n3()
-        '"1"^^<http://www.w3.org/2001/XMLSchema#integer>'
+        ```python
+        >>> Literal(1).n3()
+        '"1"^^<http://www.w3.org/2001/XMLSchema#integer>'
+        >>> Literal(1.0).n3()
+        '"1.0"^^<http://www.w3.org/2001/XMLSchema#double>'
+        >>> Literal(True).n3()
+        '"true"^^<http://www.w3.org/2001/XMLSchema#boolean>'
 
-        >>> Literal(1.0).n3()
-        '"1.0"^^<http://www.w3.org/2001/XMLSchema#double>'
+        ```
 
-        >>> Literal(True).n3()
-        '"true"^^<http://www.w3.org/2001/XMLSchema#boolean>'
+        Datatype and language isn't allowed (datatype takes precedence):
 
-        Datatype and language isn't allowed (datatype takes precedence)::
+        ```python
+        >>> Literal(1, lang="en").n3()
+        '"1"^^<http://www.w3.org/2001/XMLSchema#integer>'
 
-        >>> Literal(1, lang="en").n3()
-        '"1"^^<http://www.w3.org/2001/XMLSchema#integer>'
+        ```
 
-        Custom datatype::
+        Custom datatype:
 
-        >>> footype = URIRef("http://example.org/ns#foo")
-        >>> Literal("1", datatype=footype).n3()
-        '"1"^^<http://example.org/ns#foo>'
+        ```python
+        >>> footype = URIRef("http://example.org/ns#foo")
+        >>> Literal("1", datatype=footype).n3()
+        '"1"^^<http://example.org/ns#foo>'
+
+        ```
 
         Passing a namespace-manager will use it to abbreviate datatype URIs:
 
-        >>> from rdflib import Graph
-        >>> Literal(1).n3(Graph().namespace_manager)
-        '"1"^^xsd:integer'
+        ```python
+        >>> from rdflib import Graph
+        >>> Literal(1).n3(Graph().namespace_manager)
+        '"1"^^xsd:integer'
+
+        ```
         '''
         if namespace_manager:
             return self._literal_n3(qname_callback=namespace_manager.normalizeUri)
@@ -1521,8 +1605,16 @@ def _literal_n3(
         self,
         use_plain: bool = False,
         qname_callback: Optional[Callable[[URIRef], Optional[str]]] = None,
     ) -> str:
-        """
-        Using plain literal (shorthand) output::
+        """Internal method for N3 serialization with more options.
+
+        Args:
+            use_plain: Whether to use plain literal (shorthand) output
+            qname_callback: Function to convert URIs to prefixed names
+
+        Example:
+            Using plain literal (shorthand) output:
+
+            ```python
 
         >>> from rdflib.namespace import XSD
 
         >>> Literal(1)._literal_n3(use_plain=True)
@@ -1537,8 +1629,7 @@ def _literal_n3(
         >>> Literal(1.0, datatype=XSD.float)._literal_n3(use_plain=True)
         '"1.0"^^<http://www.w3.org/2001/XMLSchema#float>'
 
-        >>> Literal("foo", datatype=XSD.string)._literal_n3(
-        ...     use_plain=True)
+        >>> Literal("foo", datatype=XSD.string)._literal_n3(use_plain=True)
         '"foo"^^<http://www.w3.org/2001/XMLSchema#string>'
 
         >>> Literal(True)._literal_n3(use_plain=True)
@@ -1550,20 +1641,26 @@ def _literal_n3(
         >>> Literal(1.91)._literal_n3(use_plain=True)
         '1.91e+00'
 
+            ```
+
         Only limited precision available for floats:
 
+            ```python
         >>> Literal(0.123456789)._literal_n3(use_plain=True)
         '1.234568e-01'
 
-        >>> Literal('0.123456789',
-        ...     datatype=XSD.decimal)._literal_n3(use_plain=True)
+        >>> Literal('0.123456789', datatype=XSD.decimal)._literal_n3(use_plain=True)
         '0.123456789'
 
-        Using callback for datatype QNames::
+            ```
+
+            Using callback for datatype QNames:
 
-        >>> Literal(1)._literal_n3(
-        ... 
qname_callback=lambda uri: "xsd:integer") + ```python + >>> Literal(1)._literal_n3(qname_callback=lambda uri: "xsd:integer") '"1"^^xsd:integer' + ``` """ if use_plain and self.datatype in _PLAIN_LITERAL_TYPES: if self.value is not None: @@ -1685,12 +1782,16 @@ def _parseXML(xmlstring: str) -> xml.dom.minidom.Document: # noqa: N802 def _parse_html(lexical_form: str) -> xml.dom.minidom.DocumentFragment: """ Parse the lexical form of an HTML literal into a document fragment - using the ``dom`` from html5rdf tree builder. + using the `dom` from html5rdf tree builder. - :param lexical_form: The lexical form of the HTML literal. - :return: A document fragment representing the HTML literal. - :raises: `html5rdf.html5parser.ParseError` if the lexical form is - not valid HTML. + Args: + lexical_form: The lexical form of the HTML literal. + + Returns: + A document fragment representing the HTML literal. + + Raises: + html5rdf.html5parser.ParseError: If the lexical form is not valid HTML. """ parser = html5rdf.HTMLParser( tree=html5rdf.treebuilders.getTreeBuilder("dom"), strict=True @@ -1709,8 +1810,11 @@ def _write_html(value: xml.dom.minidom.DocumentFragment) -> bytes: Serialize a document fragment representing an HTML literal into its lexical form. - :param value: A document fragment representing an HTML literal. - :return: The lexical form of the HTML literal. + Args: + value: A document fragment representing an HTML literal. + + Returns: + The lexical form of the HTML literal. 
""" result = html5rdf.serialize(value, tree="dom") return result @@ -2123,9 +2227,7 @@ def _castPythonToLiteral( # noqa: N802 def _reset_bindings() -> None: - """ - Reset lexical<->value space binding for `Literal` - """ + """Reset lexical<->value space binding for `Literal`.""" _toPythonMapping.clear() _toPythonMapping.update(XSDToPython) @@ -2139,9 +2241,10 @@ def _reset_bindings() -> None: def _castLexicalToPython( # noqa: N802 lexical: Union[str, bytes], datatype: Optional[URIRef] ) -> Any: - """ - Map a lexical form to the value-space for the given datatype - :returns: a python object for the value or ``None`` + """Map a lexical form to the value-space for the given datatype. + + Returns: + A python object for the value or `None` """ try: conv_func = _toPythonMapping[datatype] @@ -2179,9 +2282,7 @@ def _castLexicalToPython( # noqa: N802 def _normalise_XSD_STRING(lexical_or_value: _AnyT) -> _AnyT: # noqa: N802 - """ - Replaces \t, \n, \r (#x9 (tab), #xA (linefeed), and #xD (carriage return)) with space without any whitespace collapsing - """ + """Replaces \\t, \\n, \\r (#x9 (tab), #xA (linefeed), and #xD (carriage return)) with space without any whitespace collapsing.""" if isinstance(lexical_or_value, str): # type error: Incompatible return value type (got "str", expected "_AnyT") [return-value] # NOTE for type ignore: this is an issue with mypy: https://github.com/python/mypy/issues/10003 @@ -2208,16 +2309,15 @@ def bind( """ register a new datatype<->pythontype binding - :param constructor: an optional function for converting lexical forms - into a Python instances, if not given the pythontype - is used directly - - :param lexicalizer: an optional function for converting python objects to - lexical form, if not given object.__str__ is used - - :param datatype_specific: makes the lexicalizer function be accessible - from the pair (pythontype, datatype) if set to True - or from the pythontype otherwise. 
False by default
+    Args:
+        constructor: An optional function for converting lexical forms
+            into Python instances; if not given, the pythontype
+            is used directly
+        lexicalizer: An optional function for converting python objects to
+            lexical form, if not given object.__str__ is used
+        datatype_specific: Makes the lexicalizer function be accessible
+            from the pair (pythontype, datatype) if set to True
+            or from the pythontype otherwise. False by default
     """
     if datatype_specific and datatype is None:
         raise Exception("No datatype given for a datatype-specific binding")
diff --git a/rdflib/tools/chunk_serializer.py b/rdflib/tools/chunk_serializer.py
index e5a6155b1c..e7881497d7 100644
--- a/rdflib/tools/chunk_serializer.py
+++ b/rdflib/tools/chunk_serializer.py
@@ -31,34 +31,21 @@ def serialize_in_chunks(
     output_dir: Optional[Path] = None,
     write_prefixes: bool = False,
 ) -> None:
-    """
-    Serializes a given Graph into a series of n-triples with a given length.
-
-    :param g:
-        The graph to serialize.
-
-    :param max_file_size_kb:
-        Maximum size per NT file in kB (1,000 bytes)
-        Equivalent to ~6,000 triples, depending on Literal sizes.
-
-    :param max_triples:
-        Maximum size per NT file in triples
-        Equivalent to lines in file.
-
-        If both this parameter and max_file_size_kb are set, max_file_size_kb will be used.
-
-    :param file_name_stem:
-        Prefix of each file name.
-        e.g. "chunk" = chunk_000001.nt, chunk_000002.nt...
-
-    :param output_dir:
-        The directory you want the files to be written to.
-
-    :param write_prefixes:
-        The first file created is a Turtle file containing original graph prefixes.
-
-
-    See ``../test/test_tools/test_chunk_serializer.py`` for examples of this in use.
+    """Serializes a given Graph into a series of n-triples with a given length.
+
+    Args:
+        g: The graph to serialize.
+        max_file_size_kb: Maximum size per NT file in kB (1,000 bytes)
+            Equivalent to ~6,000 triples, depending on Literal sizes. 
+ max_triples: Maximum size per NT file in triples + Equivalent to lines in file. + If both this parameter and max_file_size_kb are set, max_file_size_kb will be used. + file_name_stem: Prefix of each file name. + e.g. "chunk" = chunk_000001.nt, chunk_000002.nt... + output_dir: The directory you want the files to be written to. + write_prefixes: The first file created is a Turtle file containing original graph prefixes. + + See `../test/test_tools/test_chunk_serializer.py` for examples of this in use. """ if output_dir is None: diff --git a/rdflib/tools/csv2rdf.py b/rdflib/tools/csv2rdf.py index d518e809f6..2cf4f74059 100644 --- a/rdflib/tools/csv2rdf.py +++ b/rdflib/tools/csv2rdf.py @@ -3,8 +3,7 @@ See also https://github.com/RDFLib/pyTARQL in the RDFlib family of tools -try: ``csv2rdf --help`` - +try: `csv2rdf --help` """ from __future__ import annotations diff --git a/rdflib/tools/rdf2dot.py b/rdflib/tools/rdf2dot.py index 5f78f4076e..3373497811 100644 --- a/rdflib/tools/rdf2dot.py +++ b/rdflib/tools/rdf2dot.py @@ -3,10 +3,9 @@ You can draw the graph of an RDF file directly: -.. code-block: bash - - rdf2dot my_rdf_file.rdf | dot -Tpng | display - +```bash +rdf2dot my_rdf_file.rdf | dot -Tpng | display +``` """ from __future__ import annotations diff --git a/rdflib/tools/rdfpipe.py b/rdflib/tools/rdfpipe.py index 118cd8b989..58a0c4c90e 100644 --- a/rdflib/tools/rdfpipe.py +++ b/rdflib/tools/rdfpipe.py @@ -63,6 +63,7 @@ def parse_and_serialize( def _format_and_kws(fmt): """ + ```python >>> _format_and_kws("fmt") ('fmt', {}) >>> _format_and_kws("fmt:+a") @@ -75,6 +76,8 @@ def _format_and_kws(fmt): ('fmt', {'c': 'd'}) >>> _format_and_kws("fmt:a=b:c") ('fmt', {'a': 'b:c'}) + + ``` """ fmt, kws = fmt, {} if fmt and ":" in fmt: diff --git a/rdflib/tools/rdfs2dot.py b/rdflib/tools/rdfs2dot.py index 8368c9319f..92acf7793d 100644 --- a/rdflib/tools/rdfs2dot.py +++ b/rdflib/tools/rdfs2dot.py @@ -4,9 +4,9 @@ You can draw the graph of an RDFS file directly: -.. 
code-block: bash
-
-    rdf2dot my_rdfs_file.rdf | dot -Tpng | display
+```bash
+rdfs2dot my_rdfs_file.rdf | dot -Tpng | display
+```
 """
 
 from __future__ import annotations
diff --git a/rdflib/util.py b/rdflib/util.py
index 96260fc208..39459d8e92 100644
--- a/rdflib/util.py
+++ b/rdflib/util.py
@@ -158,15 +158,15 @@ def to_term(
 ) -> Optional[rdflib.term.Identifier]:
     """
     Creates and returns an Identifier of type corresponding
-    to the pattern of the given positional argument string ``s``:
+    to the pattern of the given positional argument string `s`:
 
-    '' returns the ``default`` keyword argument value or ``None``
+    '' returns the `default` keyword argument value or `None`
 
-    '<s>' returns ``URIRef(s)`` (i.e. without angle brackets)
+    '<s>' returns `URIRef(s)` (i.e. without angle brackets)
 
-    '"s"' returns ``Literal(s)`` (i.e. without doublequotes)
+    '"s"' returns `Literal(s)` (i.e. without doublequotes)
 
-    '_s' returns ``BNode(s)`` (i.e. without leading underscore)
+    '_s' returns `BNode(s)` (i.e. without leading underscore)
 
     """
     if not s:
@@ -188,33 +188,34 @@ def from_n3(
     backend: Optional[str] = None,
     nsm: Optional[rdflib.namespace.NamespaceManager] = None,
 ) -> Optional[Union[rdflib.term.Node, str]]:
-    r'''
-    Creates the Identifier corresponding to the given n3 string.
-
-    >>> from rdflib.term import URIRef, Literal
-    >>> from rdflib.namespace import NamespaceManager
-    >>> from_n3('<http://ex.com/foo>') == URIRef('http://ex.com/foo')
-    True
-    >>> from_n3('"foo"@de') == Literal('foo', lang='de')
-    True
-    >>> from_n3('"""multi\nline\nstring"""@en') == Literal(
-    ...     'multi\nline\nstring', lang='en')
-    True
-    >>> from_n3('42') == Literal(42)
-    True
-    >>> from_n3(Literal(42).n3()) == Literal(42)
-    True
-    >>> from_n3('"42"^^xsd:integer') == Literal(42)
-    True
-    >>> from rdflib import RDFS
-    >>> from_n3('rdfs:label') == RDFS['label']
-    True
-    >>> nsm = NamespaceManager(rdflib.graph.Graph())
-    >>> nsm.bind('dbpedia', 'http://dbpedia.org/resource/')
-    >>> berlin = URIRef('http://dbpedia.org/resource/Berlin')
-    >>> from_n3('dbpedia:Berlin', nsm=nsm) == berlin
-    True
-
+    r'''Creates the Identifier corresponding to the given n3 string.
+
+    ```python
+    >>> from rdflib.term import URIRef, Literal
+    >>> from rdflib.namespace import NamespaceManager
+    >>> from_n3('<http://ex.com/foo>') == URIRef('http://ex.com/foo')
+    True
+    >>> from_n3('"foo"@de') == Literal('foo', lang='de')
+    True
+    >>> from_n3('"""multi\nline\nstring"""@en') == Literal(
+    ...     'multi\nline\nstring', lang='en')
+    True
+    >>> from_n3('42') == Literal(42)
+    True
+    >>> from_n3(Literal(42).n3()) == Literal(42)
+    True
+    >>> from_n3('"42"^^xsd:integer') == Literal(42)
+    True
+    >>> from rdflib import RDFS
+    >>> from_n3('rdfs:label') == RDFS['label']
+    True
+    >>> nsm = NamespaceManager(rdflib.graph.Graph())
+    >>> nsm.bind('dbpedia', 'http://dbpedia.org/resource/')
+    >>> berlin = URIRef('http://dbpedia.org/resource/Berlin')
+    >>> from_n3('dbpedia:Berlin', nsm=nsm) == berlin
+    True
+
+    ```
     '''
     if not s:
         return default
@@ -296,6 +297,7 @@ def from_n3(
 def date_time(t=None, local_time_zone=False):
     """http://www.w3.org/TR/NOTE-datetime ex: 1997-07-16T19:20:30Z
 
+    ```python
     >>> date_time(1126482850)
     '2005-09-11T23:54:10Z'
 
@@ -308,6 +310,8 @@ def date_time(t=None, local_time_zone=False):
 
     >>> date_time(0)
     '1970-01-01T00:00:00Z'
+
+    ```
     """
     if t is None:
         t = time()
@@ -331,6 +335,7 @@ def date_time(t=None, local_time_zone=False):
 
 def parse_date_time(val: str) -> int:
     """always returns seconds in UTC
 
+    ```python
     # tests are written like this to make any errors easier to understand
     >>> 
parse_date_time('2005-09-11T23:54:10Z') - 1126482850.0 0.0 @@ -345,6 +350,8 @@ def parse_date_time(val: str) -> int: 0.0 >>> parse_date_time("2005-09-05T10:42:00") - 1125916920.0 0.0 + + ``` """ if "T" not in val: @@ -375,8 +382,10 @@ def parse_date_time(val: str) -> int: def guess_format(fpath: str, fmap: Optional[Dict[str, str]] = None) -> Optional[str]: """ Guess RDF serialization based on file suffix. Uses - ``SUFFIX_FORMAT_MAP`` unless ``fmap`` is provided. Examples: + `SUFFIX_FORMAT_MAP` unless `fmap` is provided. + Example: + ```python >>> guess_format('path/to/file.rdf') 'xml' >>> guess_format('path/to/file.owl') @@ -392,15 +401,20 @@ def guess_format(fpath: str, fmap: Optional[Dict[str, str]] = None) -> Optional[ >>> guess_format('path/to/file.xhtml', {'xhtml': 'grddl'}) 'grddl' - This also works with just the suffixes, with or without leading dot, and - regardless of letter case:: + ``` + + This also works with just the suffixes, with or without leading dot, and + regardless of letter case: + ```python >>> guess_format('.rdf') 'xml' >>> guess_format('rdf') 'xml' >>> guess_format('RDF') 'xml' + + ``` """ fmap = fmap or SUFFIX_FORMAT_MAP return fmap.get(_get_ext(fpath)) or fmap.get(fpath.lower()) @@ -409,8 +423,10 @@ def guess_format(fpath: str, fmap: Optional[Dict[str, str]] = None) -> Optional[ def _get_ext(fpath: str, lower: bool = True) -> str: """ Gets the file extension from a file(path); stripped of leading '.' and in - lower case. Examples: + lower case. + Example: + ```python >>> _get_ext("path/to/file.txt") 'txt' >>> _get_ext("OTHER.PDF") @@ -419,6 +435,8 @@ def _get_ext(fpath: str, lower: bool = True) -> str: '' >>> _get_ext(".rdf") 'rdf' + + ``` """ ext = splitext(fpath)[-1] if ext == "" and fpath.startswith("."): @@ -435,15 +453,13 @@ def find_roots( prop: rdflib.term.URIRef, roots: Optional[Set[rdflib.term.Node]] = None, ) -> Set[rdflib.term.Node]: - """ - Find the roots in some sort of transitive hierarchy. 
+ """Find the roots in some sort of transitive hierarchy. find_roots(graph, rdflib.RDFS.subClassOf) will return a set of all roots of the sub-class hierarchy Assumes triple of the form (child, prop, parent), i.e. the direction of - RDFS.subClassOf or SKOS.broader - + `RDFS.subClassOf` or `SKOS.broader` """ non_roots: Set[rdflib.term.Node] = set() @@ -473,16 +489,19 @@ def get_tree( i.e. - get_tree(graph, - rdflib.URIRef("http://xmlns.com/foaf/0.1/Person"), - rdflib.RDFS.subClassOf) + ```python + get_tree( + graph, + rdflib.URIRef("http://xmlns.com/foaf/0.1/Person"), + rdflib.RDFS.subClassOf, + ) + ``` will return the structure for the subClassTree below person. dir='down' assumes triple of the form (child, prop, parent), i.e. the direction of RDFS.subClassOf or SKOS.broader Any other dir traverses in the other direction - """ if done is None: @@ -522,19 +541,21 @@ def _coalesce( ) -> Optional[_AnyT]: """ This is a null coalescing function, it will return the first non-`None` - argument passed to it, otherwise it will return ``default`` which is `None` + argument passed to it, otherwise it will return `default` which is `None` by default. - For more info regarding the rationale of this function see deferred `PEP 505 - `_. + For more info regarding the rationale of this function see deferred + [PEP 505](https://peps.python.org/pep-0505/). - :param args: Values to consider as candidates to return, the first arg that - is not `None` will be returned. If no argument is passed this function - will return None. - :param default: The default value to return if none of the args are not - `None`. - :return: The first ``args`` that is not `None`, otherwise the value of - ``default`` if there are no ``args`` or if all ``args`` are `None`. + Args: + *args: Values to consider as candidates to return, the first arg that + is not `None` will be returned. If no argument is passed this function + will return None. 
+        default: The default value to return if all of the args are `None`.
+
+    Returns:
+        The first `args` that is not `None`, otherwise the value of
+        `default` if there are no `args` or if all `args` are `None`.
     """
     for arg in args:
         if arg is not None:
@@ -544,13 +565,13 @@ def _coalesce(
 
 _RFC3986_SUBDELIMS = "!$&'()*+,;="
 """
-``sub-delims`` production from `RFC 3986, section 2.2
-<https://www.rfc-editor.org/rfc/rfc3986.html#section-2.2>`_.
+`sub-delims` production from
+[RFC 3986, section 2.2](https://www.rfc-editor.org/rfc/rfc3986.html#section-2.2).
 """
 
 _RFC3986_PCHAR_NU = "%" + _RFC3986_SUBDELIMS + ":@"
 """
-The non-unreserved characters in the ``pchar`` production from RFC 3986.
+The non-unreserved characters in the `pchar` production from RFC 3986.
 """
 
 _QUERY_SAFE_CHARS = _RFC3986_PCHAR_NU + "/?"
@@ -558,10 +579,10 @@ def _coalesce(
 The non-unreserved characters that are safe to use in in the query and
 fragment components.
 
-.. code-block::
-
-    pchar = unreserved / pct-encoded / sub-delims / ":" / "@" query
-    = *( pchar / "/" / "?" ) fragment = *( pchar / "/" / "?" )
+```
+pchar    = unreserved / pct-encoded / sub-delims / ":" / "@"
+query    = *( pchar / "/" / "?" )
+fragment = *( pchar / "/" / "?" )
+```
 """
 
 _USERNAME_SAFE_CHARS = _RFC3986_SUBDELIMS + "%"
@@ -569,9 +590,9 @@ def _coalesce(
 The non-unreserved characters that are safe to use in the username and
 password components.
 
-.. code-block::
-
-    userinfo = *( unreserved / pct-encoded / sub-delims / ":" )
+```
+userinfo = *( unreserved / pct-encoded / sub-delims / ":" )
+```
 
 ":" is excluded as this is only used for the username and password
 components, and they are treated separately.
@@ -581,7 +602,6 @@ def _coalesce(
 """
 The non-unreserved characters that are safe to use in the path component.
-
 This is based on various path-related productions from RFC 3986. 
""" @@ -590,10 +610,13 @@ def _iri2uri(iri: str) -> str: """ Prior art: - * `iri_to_uri from Werkzeug `_ + - [iri_to_uri from Werkzeug](https://github.com/pallets/werkzeug/blob/92c6380248c7272ee668e1f8bbd80447027ccce2/src/werkzeug/urls.py#L926-L931) + ```python >>> _iri2uri("https://dbpedia.org/resource/Almería") 'https://dbpedia.org/resource/Almer%C3%ADa' + + ``` """ # https://datatracker.ietf.org/doc/html/rfc3986 # https://datatracker.ietf.org/doc/html/rfc3305 diff --git a/rdflib/void.py b/rdflib/void.py index fac16d0102..97b705d8a1 100644 --- a/rdflib/void.py +++ b/rdflib/void.py @@ -14,8 +14,7 @@ def generateVoID( # noqa: N802 res: Optional[Graph] = None, distinctForPartitions: bool = True, # noqa: N803 ): - """ - Returns a new graph with a VoID description of the passed dataset + """Returns a new graph with a VoID description of the passed dataset For more info on Vocabulary of Interlinked Datasets (VoID), see: http://vocab.deri.ie/void @@ -30,7 +29,6 @@ def generateVoID( # noqa: N802 the distinctForPartitions parameter controls whether distinctSubjects/objects are tracked for each class/propertyPartition this requires more memory again - """ typeMap: Dict[_SubjectType, Set[_SubjectType]] = ( # noqa: N806 diff --git a/rdflib/xsd_datetime.py b/rdflib/xsd_datetime.py index e05dd3c137..e7abfd1d0f 100644 --- a/rdflib/xsd_datetime.py +++ b/rdflib/xsd_datetime.py @@ -1,5 +1,5 @@ """ -Large parts of this module are taken from the ``isodate`` package. +Large parts of this module are taken from the `isodate` package. https://pypi.org/project/isodate/ Modifications are made to isodate features to allow compatibility with XSD dates and durations that are not necessarily valid ISO8601 strings. @@ -53,10 +53,7 @@ def fquotmod( val: Decimal, low: Union[Decimal, int], high: Union[Decimal, int] ) -> Tuple[int, Decimal]: - """ - A divmod function with boundaries. - - """ + """A divmod function with boundaries.""" # assumes that all the maths is done with Decimals. 
# divmod for Decimal uses truncate instead of floor as builtin
     # divmod, so we have to do it manually here.
@@ -87,8 +84,7 @@ def max_days_in_month(year: int, month: int) -> int:
 
 
 class Duration:
-    """
-    A class which represents a duration.
+    """A class which represents a duration.
 
     The difference to datetime.timedelta is, that this class handles also
     differences given in years and months.
@@ -186,8 +182,7 @@ def __hash__(self):
         return hash((self.tdelta, self.months, self.years))
 
     def __neg__(self):
-        """
-        A simple unary minus.
+        """A simple unary minus.
 
         Returns a new Duration instance with all it's negated.
         """
@@ -344,8 +339,7 @@ def __ne__(self, other):
         return True
 
     def totimedelta(self, start=None, end=None):
-        """
-        Convert this duration into a timedelta object.
+        """Convert this duration into a timedelta object.
 
         This method requires a start datetime or end datetime, but raises
         an exception if both are given.
@@ -376,19 +370,19 @@ def totimedelta(self, start=None, end=None):
 def parse_xsd_duration(
     dur_string: str, as_timedelta_if_possible: bool = True
 ) -> Union[Duration, timedelta]:
-    """
-    Parses an ISO 8601 durations into datetime.timedelta or Duration objects.
+    """Parses an ISO 8601 duration into datetime.timedelta or Duration objects.
 
     If the ISO date string does not contain years or months, a timedelta
     instance is returned, else a Duration instance is returned. 
The following duration formats are supported:
-      -``PnnW``                  duration in weeks
-      -``PnnYnnMnnDTnnHnnMnnS``  complete duration specification
-      -``PYYYYMMDDThhmmss``      basic alternative complete date format
-      -``PYYYY-MM-DDThh:mm:ss``  extended alternative complete date format
-      -``PYYYYDDDThhmmss``       basic alternative ordinal date format
-      -``PYYYY-DDDThh:mm:ss``    extended alternative ordinal date format
+
+    - `PnnW` duration in weeks
+    - `PnnYnnMnnDTnnHnnMnnS` complete duration specification
+    - `PYYYYMMDDThhmmss` basic alternative complete date format
+    - `PYYYY-MM-DDThh:mm:ss` extended alternative complete date format
+    - `PYYYYDDDThhmmss` basic alternative ordinal date format
+    - `PYYYY-DDDThh:mm:ss` extended alternative ordinal date format
 
     The '-' is optional.
diff --git a/run_tests.py b/run_tests.py
index c3ef4acd4b..41a9bb5bee 100755
--- a/run_tests.py
+++ b/run_tests.py
@@ -1,27 +1,28 @@
 """
-Testing with pytest
-=================
+# Testing with pytest
 
 This test runner uses pytest for test discovery and running. It uses the
 argument spec of pytest, but with some options pre-set. To begin with,
 make sure you have pytest installed, e.g.:
 
-    $ pip install pytest
+```bash
+poetry add pytest
+```
 
 To run the tests, use:
 
-    $ ./run_tests.py
+```bash
+./run_tests.py
+```
 
-For more details check <https://rdflib.readthedocs.io/en/stable/developers.html>.
+For more details check https://rdflib.readthedocs.io/en/stable/developers.html.
 
-Coverage
-========
+## Coverage
 
-If ``pytest-cov`` is placed in $PYTHONPATH, it can be used to create coverage
+If `pytest-cov` is placed in $PYTHONPATH, it can be used to create coverage
 information if the "--cov" option is supplied.
 
-See <https://github.com/pytest-dev/pytest-cov> for details.
-
+See https://github.com/pytest-dev/pytest-cov for details.
 """
 
 import json
diff --git a/test/test_graph/test_graph.py b/test/test_graph/test_graph.py
index 0e8227042c..9e8b880183 100644
--- a/test/test_graph/test_graph.py
+++ b/test/test_graph/test_graph.py
@@ -22,7 +22,7 @@
 
 def test_property_store() -> None:
     """
-    The ``store`` property works correctly. 
+ The `store` property works correctly. """ graph = Graph() assert isinstance(graph.store, Store) @@ -38,7 +38,7 @@ def test_property_identifier_default() -> None: def test_property_identifier() -> None: """ - The ``identifier`` property works correctly. + The `identifier` property works correctly. """ id = URIRef("example:a") graph = Graph(identifier=id) @@ -47,7 +47,7 @@ def test_property_identifier() -> None: def test_property_namespace_manager() -> None: """ - The ``namespace_manager`` property works correctly. + The `namespace_manager` property works correctly. """ graph = Graph() # check repeats as property is a signleton diff --git a/test/test_graph/test_graph_store.py b/test/test_graph/test_graph_store.py index 9a33977027..07d042fd3f 100644 --- a/test/test_graph/test_graph_store.py +++ b/test/test_graph/test_graph_store.py @@ -235,7 +235,7 @@ def test_query_query_graph( query_graph: Union[str, Callable[[Graph], str]], ) -> None: """ - The `Graph.query` method passes the correct ``queryGraph`` argument + The `Graph.query` method passes the correct `queryGraph` argument to stores that have implemented a `Store.query` method. """ @@ -293,7 +293,7 @@ def test_update_query_graph( query_graph: Union[str, Callable[[Graph], str]], ) -> None: """ - The `Graph.update` method passes the correct ``queryGraph`` argument + The `Graph.update` method passes the correct `queryGraph` argument to stores that have implemented a `Store.update` method. 
""" diff --git a/test/test_graph/test_namespace_rebinding.py b/test/test_graph/test_namespace_rebinding.py index 253fbedc22..9aaf072511 100644 --- a/test/test_graph/test_namespace_rebinding.py +++ b/test/test_graph/test_namespace_rebinding.py @@ -238,12 +238,10 @@ def test_parse_rebinds_prefix(): def test_automatic_handling_of_unknown_predicates(): # AUTOMATIC HANDLING OF UNKNOWN PREDICATES - """ - Automatic handling of unknown predicates - ----------------------------------------- + """Automatic handling of unknown predicates As a programming convenience, a namespace binding is automatically - created when :class:`rdflib.term.URIRef` predicates are added to the graph. + created when [`URIRef`][rdflib.term.URIRef] predicates are added to the graph. """ g = Graph(bind_namespaces="none") diff --git a/test/test_misc/test_bnode_ncname.py b/test/test_misc/test_bnode_ncname.py index cc6f3cf7cc..e2cb95a0d0 100644 --- a/test/test_misc/test_bnode_ncname.py +++ b/test/test_misc/test_bnode_ncname.py @@ -14,7 +14,7 @@ def is_ncname(value): From the `W3C RDF Syntax doc `_ - "The value is a function of the value of the ``identifier`` accessor. + "The value is a function of the value of the `identifier` accessor. The string value begins with "_:" and the entire value MUST match the `N-Triples nodeID `_ production". diff --git a/test/test_misc/test_input_source.py b/test/test_misc/test_input_source.py index ce01cdaf9e..200fdc5b37 100644 --- a/test/test_misc/test_input_source.py +++ b/test/test_misc/test_input_source.py @@ -70,7 +70,7 @@ def test_too_many_arguments(): class SourceParam(enum.Enum): """ - Indicates what kind of paramter should be passed as ``source`` to create_input_source(). + Indicates what kind of paramter should be passed as `source` to create_input_source(). 
""" BINARY_IO = enum.auto() @@ -84,11 +84,13 @@ class SourceParam(enum.Enum): @contextmanager def from_path(self, path: Path) -> Generator[SourceParamType, None, None]: """ - Yields a value of the type indicated by the enum value which provides the data from the file at ``path``. + Yields a value of the type indicated by the enum value which provides the data from the file at `path`. + Args: + path: Path to the file to read. - :param path: Path to the file to read. - :return: A context manager which yields a value of the type indicated by the enum value. + Returns: + A context manager which yields a value of the type indicated by the enum value. """ if self is SourceParam.BINARY_IO: yield path.open("rb") @@ -110,7 +112,7 @@ def from_path(self, path: Path) -> Generator[SourceParamType, None, None]: class LocationParam(enum.Enum): """ - Indicates what kind of paramter should be passed as ``location`` to create_input_source(). + Indicates what kind of paramter should be passed as `location` to create_input_source(). """ FILE_URI = enum.auto() @@ -121,10 +123,13 @@ def from_path( self, path: Optional[Path], url: Optional[str] ) -> Generator[str, None, None]: """ - Yields a value of the type indicated by the enum value which provides the data from the file at ``path``. + Yields a value of the type indicated by the enum value which provides the data from the file at `path`. - :param path: Path to the file to read. - :return: A context manager which yields a value of the type indicated by the enum value. + Args: + path: Path to the file to read. + + Returns: + A context manager which yields a value of the type indicated by the enum value. """ if self is LocationParam.FILE_URI: assert path is not None @@ -138,7 +143,7 @@ def from_path( class FileParam(enum.Enum): """ - Indicates what kind of paramter should be passed as ``file`` to create_input_source(). + Indicates what kind of paramter should be passed as `file` to create_input_source(). 
""" BINARY_IO = enum.auto() @@ -147,10 +152,13 @@ class FileParam(enum.Enum): @contextmanager def from_path(self, path: Path) -> Generator[Union[BinaryIO, TextIO], None, None]: """ - Yields a value of the type indicated by the enum value which provides the data from the file at ``path``. + Yields a value of the type indicated by the enum value which provides the data from the file at `path`. + + Args: + path: Path to the file to read. - :param path: Path to the file to read. - :return: A context manager which yields a value of the type indicated by the enum value. + Returns: + A context manager which yields a value of the type indicated by the enum value. """ if self is FileParam.BINARY_IO: yield path.open("rb") @@ -162,7 +170,7 @@ def from_path(self, path: Path) -> Generator[Union[BinaryIO, TextIO], None, None class DataParam(enum.Enum): """ - Indicates what kind of paramter should be passed as ``data`` to create_input_source(). + Indicates what kind of paramter should be passed as `data` to create_input_source(). """ STRING = enum.auto() @@ -172,10 +180,13 @@ class DataParam(enum.Enum): @contextmanager def from_path(self, path: Path) -> Generator[Union[bytes, str, dict], None, None]: """ - Yields a value of the type indicated by the enum value which provides the data from the file at ``path``. + Yields a value of the type indicated by the enum value which provides the data from the file at `path`. - :param path: Path to the file to read. - :return: A context manager which yields a value of the type indicated by the enum value. + Args: + path: Path to the file to read. + + Returns: + A context manager which yields a value of the type indicated by the enum value. """ if self is DataParam.STRING: yield path.read_text(encoding="utf-8") @@ -271,9 +282,10 @@ class InputSourceChecker: """ Checker for input source objects. - :param type: Expected type of input source. - :param stream_check: What kind of stream check to perform. 
- :param encoding: Expected encoding of input source. If ``None``, then the encoding is not checked. If it has a value (i.e. an instance of :class:`Holder`), then the encoding is expected to match ``encoding.value``. + Args: + type: Expected type of input source. + stream_check: What kind of stream check to perform. + encoding: Expected encoding of input source. If `None`, then the encoding is not checked. If it has a value (i.e. an instance of `Holder`), then the encoding is expected to match `encoding.value`. """ type: Type[InputSource] @@ -290,7 +302,7 @@ def check( input_source: InputSource, ) -> None: """ - Check that ``input_source`` matches expectations. + Check that `input_source` matches expectations. """ logging.debug( "input_source = %s / %s, self.type = %s", @@ -345,8 +357,11 @@ def type_from_param( """ Return the type of input source that should be created for the given parameter. - :param param: The parameter that will be passed to :func:`create_input_source`. - :return: Type of input source that should be created for the given parameter. + Args: + param: The parameter that will be passed to `create_input_source`. + + Returns: + Type of input source that should be created for the given parameter. """ if param in ( SourceParam.PATH, @@ -380,14 +395,14 @@ def type_from_param( Union[ExceptionChecker, InputSourceChecker], ] """ -Type alias for the tuple representation of :class:`CreateInputSourceTestParams`. +Type alias for the tuple representation of `CreateInputSourceTestParams`. """ @dataclass class CreateInputSourceTestParams: """ - Parameters for :func:`create_input_source`. + Parameters for `create_input_source`. 
""" input_path: Path @@ -550,7 +565,7 @@ def make_params( SourceParam.BINARY_IO, FileParam.BINARY_IO, ): - # This should maybe be ``None`` instead of ``Holder(None)``, but as + # This should maybe be `None` instead of `Holder(None)`, but as # there is no ecoding supplied it is probably safe to assert that no # encoding is associated with it. expected_encoding = Holder(None) @@ -580,10 +595,11 @@ def test_create_input_source( A given set of parameters results in an input source matching specified invariants. - :param test_params: The parameters to use for the test. This specifies what - parameters should be passed to func:`create_input_source` and what - invariants the resulting input source should match. - :param http_file_server: The HTTP file server to use for the test. + Args: + test_params: The parameters to use for the test. This specifies what + parameters should be passed to `create_input_source` and what + invariants the resulting input source should match. + http_file_server: The HTTP file server to use for the test. """ logging.debug("test_params = %s", test_params) input_path = test_params.input_path diff --git a/test/test_namespace/test_namespacemanager.py b/test/test_namespace/test_namespacemanager.py index 4ca182ae28..1596176590 100644 --- a/test/test_namespace/test_namespacemanager.py +++ b/test/test_namespace/test_namespacemanager.py @@ -370,16 +370,16 @@ def test_compute_qname( store_prefixes: Optional[Mapping[str, Namespace]], expected_result: OutcomePrimitive[Tuple[str, URIRef, str]], ) -> None: - """ - :param uri: argument to compute_qname() - :param generate: argument to compute_qname() - :param bind_namespaces: argument to Graph() - - :param manager_prefixes: additional namespaces to bind on NamespaceManager. - :param graph_prefixes: additional namespaces to bind on Graph. - :param store_prefixes: additional namespaces to bind on Store. - - :param expected_result: Expected result tuple or exception. 
+ """Test the compute_qname method of NamespaceManager. + + Args: + uri: argument to compute_qname() + generate: argument to compute_qname() + bind_namespaces: argument to Graph() + manager_prefixes: additional namespaces to bind on NamespaceManager. + graph_prefixes: additional namespaces to bind on Graph. + store_prefixes: additional namespaces to bind on Store. + expected_result: Expected result tuple or exception. """ graph = Graph(bind_namespaces=bind_namespaces) if graph_prefixes is not None: @@ -548,10 +548,10 @@ def test_generate_curie( expected_result: OutcomePrimitive[str], ) -> None: """ - .. note:: + !!! warning "Side effects" - This is using the function scoped nsm fixture because curie has side - effects and will modify the namespace manager. + This test uses a function-scoped fixture because the curie() method + has side effects that modify the namespace manager state. """ nsm = test_nsm_function checker = OutcomeChecker[str].from_primitive(expected_result) diff --git a/test/test_serializers/test_prettyxml.py b/test/test_serializers/test_prettyxml.py index aac19af50e..f5ba46abbe 100644 --- a/test/test_serializers/test_prettyxml.py +++ b/test/test_serializers/test_prettyxml.py @@ -19,7 +19,7 @@ def test_serialize_and_reparse(self): _assert_equal_graphs(self.source_graph, reparsed_graph) def test_multiple(self): - """Repeats ``test_serialize`` ``self.repeats`` times, to reduce sucess based on in-memory ordering.""" + """Repeats `test_serialize` `self.repeats` times, to reduce sucess based on in-memory ordering.""" for i in range(self.repeats): self.test_serialize_and_reparse() @@ -40,7 +40,7 @@ def _assert_equal_graphs(g1, g2): def _mangled_copy(g): - "Makes a copy of the graph, replacing all bnodes with the bnode ``_blank``." + "Makes a copy of the graph, replacing all bnodes with the bnode `_blank`." 
gcopy = Dataset() def isbnode(v): diff --git a/test/test_serializers/test_serializer_xml.py b/test/test_serializers/test_serializer_xml.py index 535b24c853..d546ebbbf1 100644 --- a/test/test_serializers/test_serializer_xml.py +++ b/test/test_serializers/test_serializer_xml.py @@ -19,7 +19,7 @@ def test_serialize_and_reparse(self): _assert_equal_graphs(self.source_graph, reparsed_graph) def test_multiple(self): - """Repeats ``test_serialize`` ``self.repeats`` times, to reduce sucess based on in-memory ordering.""" + """Repeats `test_serialize` `self.repeats` times, to reduce success based on in-memory ordering.""" for i in range(self.repeats): self.test_serialize_and_reparse() @@ -40,7 +40,7 @@ def _assert_equal_graphs(g1, g2): def _mangled_copy(g): - """Makes a copy of the graph, replacing all bnodes with the bnode ``_blank``.""" + """Makes a copy of the graph, replacing all bnodes with the bnode `_blank`.""" gcopy = Dataset() def isbnode(v): diff --git a/test/test_sparql/test_result.py b/test/test_sparql/test_result.py index 9f7defc0cd..80bcc69137 100644 --- a/test/test_sparql/test_result.py +++ b/test/test_sparql/test_result.py @@ -356,7 +356,7 @@ def test_serialize_to_strdest( name_prefix: str, ) -> None: """ - Various ways of specifying the destination argument of ``Result.serialize`` + Various ways of specifying the destination argument of `Result.serialize` as a string works correctly. """ format_info = ResultFormat.JSON.info diff --git a/test/test_sparql/test_sparql.py b/test/test_sparql/test_sparql.py index 74d7d9b72a..540874cc5a 100644 --- a/test/test_sparql/test_sparql.py +++ b/test/test_sparql/test_sparql.py @@ -281,7 +281,7 @@ def test_txtresult(): def test_property_bindings(rdfs_graph: Graph) -> None: """ - The ``bindings`` property of a `rdflib.query.Result` result works as expected. + The `bindings` property of a `rdflib.query.Result` result works as expected. 
""" result = rdfs_graph.query( """ @@ -416,7 +416,7 @@ def test_custom_eval_exception( result_consumer: Callable[[Result], None], exception_type: Type[Exception] ) -> None: """ - Exception raised from a ``CUSTOM_EVALS`` function during the execution of a + Exception raised from a `CUSTOM_EVALS` function during the execution of a query propagates to the caller. """ custom_function_uri = EGDC["function"] diff --git a/test/test_sparql/test_update.py b/test/test_sparql/test_update.py index 4b16aa7de2..ce3b43f66f 100644 --- a/test/test_sparql/test_update.py +++ b/test/test_sparql/test_update.py @@ -28,7 +28,7 @@ def test_load_into_default( graph_factory: Callable[[], Graph], source: GraphSource ) -> None: """ - Evaluation of ``LOAD `` into default graph works correctly. + Evaluation of `LOAD ` into default graph works correctly. """ expected_graph = graph_factory() @@ -70,7 +70,7 @@ def test_load_into_named( graph_factory: Callable[[], ConjunctiveGraph], source: GraphSource ) -> None: """ - Evaluation of ``LOAD INTO GRAPH `` works correctly. + Evaluation of `LOAD INTO GRAPH ` works correctly. """ expected_graph = graph_factory() diff --git a/test/test_store/test_store.py b/test/test_store/test_store.py index 60c013b1b1..671067c05f 100644 --- a/test/test_store/test_store.py +++ b/test/test_store/test_store.py @@ -22,7 +22,7 @@ def test_namespaces_via_manager() -> None: def test_propery_node_pickler() -> None: """ - The ``node_pickler`` property of a `rdflib.store.Store` works correctly. + The `node_pickler` property of a `rdflib.store.Store` works correctly. """ store = Store() assert isinstance(store.node_pickler, NodePickler) diff --git a/test/test_store/test_store_sparqlstore.py b/test/test_store/test_store_sparqlstore.py index 325f3b651f..1a26c1d289 100644 --- a/test/test_store/test_store_sparqlstore.py +++ b/test/test_store/test_store_sparqlstore.py @@ -19,12 +19,11 @@ class TestSPARQLStoreGraph: - """ - Tests for ``rdflib.Graph(store="SPARQLStore")``. 
+ """SPARQLStore Graph Tests - .. note:: - This is a pytest based test class to be used for new tests instead of - the older `unittest.TestCase` based classes. + !!! info "New Test Framework" + This is a pytest based test class that replaces the older + `unittest.TestCase` based classes for testing SPARQLStore functionality. """ @pytest.mark.parametrize( diff --git a/test/test_turtle_quoting.py b/test/test_turtle_quoting.py index 7cdd63a24b..650670acc0 100644 --- a/test/test_turtle_quoting.py +++ b/test/test_turtle_quoting.py @@ -73,7 +73,7 @@ def add_pair(escape: str, unescaped: str) -> None: def ntriples_unquote_validate(input: str) -> str: """ - This function wraps `ntriples.unquote` in a way that ensures that `ntriples.validate` is always ``True`` when it runs. + This function wraps `ntriples.unquote` in a way that ensures that `ntriples.validate` is always `True` when it runs. """ old_validate = ntriples.validate try: @@ -85,7 +85,7 @@ def ntriples_unquote_validate(input: str) -> str: def ntriples_unquote(input: str) -> str: """ - This function wraps `ntriples.unquote` in a way that ensures that `ntriples.validate` is always ``False`` when it runs. + This function wraps `ntriples.unquote` in a way that ensures that `ntriples.validate` is always `False` when it runs. """ old_validate = ntriples.validate try: diff --git a/test/utils/__init__.py b/test/utils/__init__.py index cdcedda9c7..84b6beab39 100644 --- a/test/utils/__init__.py +++ b/test/utils/__init__.py @@ -2,7 +2,7 @@ This module contains test utilities. The tests for test utilities should be placed inside `test.utils.test` -(``test/utils/tests/``). +(`test/utils/tests/`). """ from __future__ import annotations @@ -493,9 +493,11 @@ def idfns(*idfns: Callable[[Any], Optional[str]]) -> Callable[[Any], Optional[st Returns an ID function which will try each of the provided ID functions in order. - :param idfns: The ID functions to try. 
- :return: An ID function which will try each of the provided ID - functions. + Args: + idfns: The ID functions to try. + + Returns: + An ID function which will try each of the provided ID functions. """ def _idfns(value: Any) -> Optional[str]: diff --git a/test/utils/graph.py b/test/utils/graph.py index 4ea1cfbd44..d3715014ca 100644 --- a/test/utils/graph.py +++ b/test/utils/graph.py @@ -85,12 +85,14 @@ def load( @classmethod def idfn(cls, val: Any) -> Optional[str]: - """ - ID function for GraphSource objects. + """ID function for GraphSource objects. + + Args: + val: The value to try to generate an identifier for. - :param val: The value to try to generate and identifier for. - :return: A string identifying the given value if the value is a - `GraphSource`, otherwise return `None`. + Returns: + A string identifying the given value if the value is a + `GraphSource`, otherwise return `None`. """ if isinstance(val, cls): try: diff --git a/test/utils/httpfileserver.py b/test/utils/httpfileserver.py index 7dee493ab7..774b042258 100644 --- a/test/utils/httpfileserver.py +++ b/test/utils/httpfileserver.py @@ -71,12 +71,13 @@ class HTTPFileInfo: """ Information about a file served by the HTTPFileServerRequestHandler. - :param request_url: The URL that should be requested to get the file. - :param effective_url: The URL that the file will be served from after - redirects. - :param redirects: A sequence of redirects that will be given to the client - if it uses the ``request_url``. This sequence will terminate in the - ``effective_url``. + Args: + request_url: The URL that should be requested to get the file. + effective_url: The URL that the file will be served from after + redirects. + redirects: A sequence of redirects that will be given to the client + if it uses the `request_url`. This sequence will terminate in the + `effective_url`. 
""" # request_url: str diff --git a/test/utils/iri.py b/test/utils/iri.py index 4259e8762a..1502c0a365 100644 --- a/test/utils/iri.py +++ b/test/utils/iri.py @@ -29,12 +29,14 @@ def file_uri_to_path( """ This function returns a pathlib.PurePath object for the supplied file URI. - :param str file_uri: The file URI ... - :param class path_class: The type of path in the file_uri. By default it uses - the system specific path pathlib.PurePath, to force a specific type of path - pass pathlib.PureWindowsPath or pathlib.PurePosixPath - :returns: the pathlib.PurePath object - :rtype: pathlib.PurePath + Args: + file_uri: The file URI ... + path_class: The type of path in the file_uri. By default it uses + the system specific path pathlib.PurePath, to force a specific type of path + pass pathlib.PureWindowsPath or pathlib.PurePosixPath + + Returns: + The pathlib.PurePath object """ is_windows_path = isinstance(path_class(), PureWindowsPath) file_uri_parsed = urlparse(file_uri) diff --git a/test/utils/outcome.py b/test/utils/outcome.py index 82a96138cb..7f1210ecec 100644 --- a/test/utils/outcome.py +++ b/test/utils/outcome.py @@ -52,11 +52,11 @@ def check(self, actual: AnyT) -> None: This should run inside the checker's context. - :param outcome: The actual outcome of the test. - :raises AssertionError: If the outcome does not match the - expectation. - :raises RuntimeError: If this method is called when no outcome - is expected. + Raises: + AssertionError: If the outcome does not match the + expectation. + RuntimeError: If this method is called when no outcome + is expected. """ ... @@ -68,11 +68,13 @@ def context(self) -> Generator[Optional[ExceptionInfo[Exception]], None, None]: This is necessary for checking exception outcomes. - :return: A context manager that yields the exception info for - any exceptions that were raised in this context. 
- :raises AssertionError: If the test does not raise an exception - when one is expected, or if the exception does not match the - expectation. + Returns: + A context manager that yields the exception info for + any exceptions that were raised in this context. + Raises: + AssertionError: If the test does not raise an exception + when one is expected, or if the exception does not match the + expectation. """ ... @@ -156,7 +158,8 @@ class ValueChecker(NoExceptionChecker[AnyT]): """ Validates that the outcome is a specific value. - :param value: The expected value. + Args: + value: The expected value. """ expected: AnyT @@ -170,8 +173,9 @@ class CallableChecker(NoExceptionChecker[AnyT]): """ Validates the outcome with a callable. - :param callable: The callable that will be called with the outcome - to validate it. + Args: + callable: The callable that will be called with the outcome + to validate it. """ callable: Callable[[AnyT], None] @@ -185,11 +189,12 @@ class ExceptionChecker(OutcomeChecker[AnyT]): """ Validates that the outcome is a specific exception. - :param type: The expected exception type. - :param match: A regular expression or string that the exception - message must match. - :param attributes: A dictionary of attributes that the exception - must have and their expected values. + Args: + type: The expected exception type. + match: A regular expression or string that the exception + message must match. + attributes: A dictionary of attributes that the exception + must have and their expected values. """ type: Type[Exception] diff --git a/test/utils/test/__init__.py b/test/utils/test/__init__.py index 4034ca863e..a6bcecc33d 100644 --- a/test/utils/test/__init__.py +++ b/test/utils/test/__init__.py @@ -1,3 +1,3 @@ """ -This module contains tests for test utility modules inside `test.utils` (i.e. ``test/utils/``). +This module contains tests for test utility modules inside `test.utils` (i.e. `test/utils/`). 
""" diff --git a/test/utils/test/test_outcome.py b/test/utils/test/test_outcome.py index 8299bcaca3..f8819e829f 100644 --- a/test/utils/test/test_outcome.py +++ b/test/utils/test/test_outcome.py @@ -61,7 +61,7 @@ def test_checker( ) -> None: """ Given the action, the checker raises the expected exception, or does - not raise anything if ``expected_exception`` is None. + not raise anything if `expected_exception` is None. """ with ExitStack() as xstack: if expected_exception is not None: diff --git a/tox.ini b/tox.ini index 9ec80d516d..c96bb2a6d7 100644 --- a/tox.ini +++ b/tox.ini @@ -7,6 +7,7 @@ envlist = toxworkdir={env:TOX_WORK_DIR:{tox_root}/.tox} [testenv] +deps = setuptools>=68,<72 passenv = DBUS_SESSION_BUS_ADDRESS # This is needed for keyring acccess on Linux. allowlist_externals = poetry @@ -29,7 +30,7 @@ commands = {env:TOX_EXTRA_COMMAND:} {env:TOX_MYPY_COMMAND:poetry run python -m mypy --show-error-context --show-error-codes --junit-xml=test_reports/{env:TOX_JUNIT_XML_PREFIX:}mypy-junit.xml} {posargs:poetry run {env:TOX_TEST_HARNESS:} pytest -ra --tb=native {env:TOX_PYTEST_ARGS:--junit-xml=test_reports/{env:TOX_JUNIT_XML_PREFIX:}pytest-junit.xml --cov --cov-report=} {env:TOX_PYTEST_EXTRA_ARGS:}} - docs: poetry run sphinx-build -T -W -b html -d {envdir}/doctree docs docs/_build/html + docs: poetry run mkdocs build [testenv:covreport] skip_install = true @@ -62,13 +63,14 @@ commands_pre = poetry install --only=main --only=docs --extras=html poetry env info commands = - poetry run sphinx-build -T -W -b html -d {envdir}/doctree docs docs/_build/html + poetry run mkdocs build [testenv:py38-extensive-min] base = void deps = pytest==7.* pytest-cov==4.* + setuptools>=68,<72 setenv = BERKELEYDB_DIR = /usr COVERAGE_FILE = {env:COVERAGE_FILE:{toxinidir}/.coverage.{envname}} From 8685a85503c1385e5d34cc6f37b1d3700fb260cf Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 30 Oct 2025 12:47:51 +1000 Subject: [PATCH 
51/60] fix: set changed size when iterating the store's graphs (#3281) * fix: set changed size when iterating the store's graphs * fix: materialise the iterator with list() so it doesn't change * chore: revert calling to store directly --------- Co-authored-by: Nicholas Car --- rdflib/plugins/stores/memory.py | 2 +- test/test_store/test_store_memorystore.py | 17 +++++++++++++++++ 2 files changed, 18 insertions(+), 1 deletion(-) diff --git a/rdflib/plugins/stores/memory.py b/rdflib/plugins/stores/memory.py index bd73e0d10e..a474c5caf2 100644 --- a/rdflib/plugins/stores/memory.py +++ b/rdflib/plugins/stores/memory.py @@ -556,7 +556,7 @@ def contexts( self, triple: Optional[_TripleType] = None ) -> Generator[_ContextType, None, None]: if triple is None or triple == (None, None, None): - return (context for context in self.__all_contexts) + return (context for context in list(self.__all_contexts)) subj, pred, obj = triple try: diff --git a/test/test_store/test_store_memorystore.py b/test/test_store/test_store_memorystore.py index 905dc58b6d..df51e3bf70 100644 --- a/test/test_store/test_store_memorystore.py +++ b/test/test_store/test_store_memorystore.py @@ -29,3 +29,20 @@ def test_memory_store(get_graph): g.remove(triple1) assert len(g) == 1 assert len(g.serialize()) > 0 + + +def test_sparql_bindings_creating_new_graph(): + # Test for https://github.com/RDFLib/rdflib/issues/3102 + dataset = rdflib.Dataset() + # Create a graph + dataset.graph(identifier="urn:example:graph") + sparql_query = """ + SELECT * + WHERE { + GRAPH ?g_1 { } + GRAPH ?g_2 { } + }""" + + # Ensure it doesn't raise RuntimeError: Set changed size during iteration + results = dataset.query(sparql_query) + assert len(results) == 1 From 6a04c63a6d38c1fca1ef3a0f4b7464219650b12c Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 30 Oct 2025 16:22:18 +1000 Subject: [PATCH 52/60] chore(v7): upgrade ci and dev/test deps (#3288) * build: add pyproject 
classifier for python 3.14 * test: add test matrix for python 3.12, 3.13 and 3.14 * test: update matrix * test: update tox.ini with new python versions * style: align black formatting with python version 3.8 * test: mark sparql10 test as xfail as it's failing in python 3.14 * test: mark sparql10 test as xfail as it's failing in python 3.14 * chore: update ruff, mypy, and black to the latest supported versions * test: xfail test marked only when greater than or equal to python 3.14 * style: formatting * test: mark more tests as xfail --- .github/workflows/validate.yaml | 2 +- .pre-commit-config.yaml | 4 +- poetry.lock | 87 ++++++++++++----------- pyproject.toml | 11 +-- test/test_graph/test_graph_http.py | 2 +- test/test_store/test_store_sparqlstore.py | 4 +- test/test_w3c_spec/test_sparql10_w3c.py | 5 ++ test/test_w3c_spec/test_sparql11_w3c.py | 41 +++++++++++ test/utils/httpfileserver.py | 2 +- tox.ini | 10 +-- 10 files changed, 108 insertions(+), 60 deletions(-) diff --git a/.github/workflows/validate.yaml b/.github/workflows/validate.yaml index 178a33d12a..838a26ef3d 100644 --- a/.github/workflows/validate.yaml +++ b/.github/workflows/validate.yaml @@ -25,7 +25,7 @@ jobs: strategy: fail-fast: false matrix: - python-version: ["3.8", "3.9", "3.10", "3.11"] + python-version: ["3.8", "3.9", "3.10", "3.11", "3.12", "3.13", "3.14"] os: [ubuntu-latest, macos-latest, windows-latest] # This is used for injecting additional tests for a specific python # version and OS. 
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 098305df07..d4cbfc2ce2 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -8,13 +8,13 @@ ci: repos: - repo: https://github.com/astral-sh/ruff-pre-commit # WARNING: Ruff version should be the same as in `pyproject.toml` - rev: v0.11.0 + rev: v0.14.2 hooks: - id: ruff args: ["--fix"] - repo: https://github.com/psf/black-pre-commit-mirror # WARNING: Black version should be the same as in `pyproject.toml` - rev: "24.4.2" + rev: "24.8.0" hooks: - id: black pass_filenames: false diff --git a/poetry.lock b/poetry.lock index 82b2f19f31..e0c5133d94 100644 --- a/poetry.lock +++ b/poetry.lock @@ -47,33 +47,33 @@ files = [ [[package]] name = "black" -version = "24.4.2" +version = "24.8.0" description = "The uncompromising code formatter." optional = false python-versions = ">=3.8" files = [ - {file = "black-24.4.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:dd1b5a14e417189db4c7b64a6540f31730713d173f0b63e55fabd52d61d8fdce"}, - {file = "black-24.4.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8e537d281831ad0e71007dcdcbe50a71470b978c453fa41ce77186bbe0ed6021"}, - {file = "black-24.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eaea3008c281f1038edb473c1aa8ed8143a5535ff18f978a318f10302b254063"}, - {file = "black-24.4.2-cp310-cp310-win_amd64.whl", hash = "sha256:7768a0dbf16a39aa5e9a3ded568bb545c8c2727396d063bbaf847df05b08cd96"}, - {file = "black-24.4.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:257d724c2c9b1660f353b36c802ccece186a30accc7742c176d29c146df6e474"}, - {file = "black-24.4.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bdde6f877a18f24844e381d45e9947a49e97933573ac9d4345399be37621e26c"}, - {file = "black-24.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e151054aa00bad1f4e1f04919542885f89f5f7d086b8a59e5000e6c616896ffb"}, - {file = "black-24.4.2-cp311-cp311-win_amd64.whl", hash = 
"sha256:7e122b1c4fb252fd85df3ca93578732b4749d9be076593076ef4d07a0233c3e1"}, - {file = "black-24.4.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:accf49e151c8ed2c0cdc528691838afd217c50412534e876a19270fea1e28e2d"}, - {file = "black-24.4.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:88c57dc656038f1ab9f92b3eb5335ee9b021412feaa46330d5eba4e51fe49b04"}, - {file = "black-24.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:be8bef99eb46d5021bf053114442914baeb3649a89dc5f3a555c88737e5e98fc"}, - {file = "black-24.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:415e686e87dbbe6f4cd5ef0fbf764af7b89f9057b97c908742b6008cc554b9c0"}, - {file = "black-24.4.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:bf10f7310db693bb62692609b397e8d67257c55f949abde4c67f9cc574492cc7"}, - {file = "black-24.4.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:98e123f1d5cfd42f886624d84464f7756f60ff6eab89ae845210631714f6db94"}, - {file = "black-24.4.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:48a85f2cb5e6799a9ef05347b476cce6c182d6c71ee36925a6c194d074336ef8"}, - {file = "black-24.4.2-cp38-cp38-win_amd64.whl", hash = "sha256:b1530ae42e9d6d5b670a34db49a94115a64596bc77710b1d05e9801e62ca0a7c"}, - {file = "black-24.4.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:37aae07b029fa0174d39daf02748b379399b909652a806e5708199bd93899da1"}, - {file = "black-24.4.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:da33a1a5e49c4122ccdfd56cd021ff1ebc4a1ec4e2d01594fef9b6f267a9e741"}, - {file = "black-24.4.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ef703f83fc32e131e9bcc0a5094cfe85599e7109f896fe8bc96cc402f3eb4b6e"}, - {file = "black-24.4.2-cp39-cp39-win_amd64.whl", hash = "sha256:b9176b9832e84308818a99a561e90aa479e73c523b3f77afd07913380ae2eab7"}, - {file = "black-24.4.2-py3-none-any.whl", hash = "sha256:d36ed1124bb81b32f8614555b34cc4259c3fbc7eec17870e8ff8ded335b58d8c"}, - {file = "black-24.4.2.tar.gz", hash = 
"sha256:c872b53057f000085da66a19c55d68f6f8ddcac2642392ad3a355878406fbd4d"}, + {file = "black-24.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:09cdeb74d494ec023ded657f7092ba518e8cf78fa8386155e4a03fdcc44679e6"}, + {file = "black-24.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:81c6742da39f33b08e791da38410f32e27d632260e599df7245cccee2064afeb"}, + {file = "black-24.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:707a1ca89221bc8a1a64fb5e15ef39cd755633daa672a9db7498d1c19de66a42"}, + {file = "black-24.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:d6417535d99c37cee4091a2f24eb2b6d5ec42b144d50f1f2e436d9fe1916fe1a"}, + {file = "black-24.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fb6e2c0b86bbd43dee042e48059c9ad7830abd5c94b0bc518c0eeec57c3eddc1"}, + {file = "black-24.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:837fd281f1908d0076844bc2b801ad2d369c78c45cf800cad7b61686051041af"}, + {file = "black-24.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62e8730977f0b77998029da7971fa896ceefa2c4c4933fcd593fa599ecbf97a4"}, + {file = "black-24.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:72901b4913cbac8972ad911dc4098d5753704d1f3c56e44ae8dce99eecb0e3af"}, + {file = "black-24.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7c046c1d1eeb7aea9335da62472481d3bbf3fd986e093cffd35f4385c94ae368"}, + {file = "black-24.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:649f6d84ccbae73ab767e206772cc2d7a393a001070a4c814a546afd0d423aed"}, + {file = "black-24.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b59b250fdba5f9a9cd9d0ece6e6d993d91ce877d121d161e4698af3eb9c1018"}, + {file = "black-24.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:6e55d30d44bed36593c3163b9bc63bf58b3b30e4611e4d88a0c3c239930ed5b2"}, + {file = "black-24.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = 
"sha256:505289f17ceda596658ae81b61ebbe2d9b25aa78067035184ed0a9d855d18afd"}, + {file = "black-24.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b19c9ad992c7883ad84c9b22aaa73562a16b819c1d8db7a1a1a49fb7ec13c7d2"}, + {file = "black-24.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1f13f7f386f86f8121d76599114bb8c17b69d962137fc70efe56137727c7047e"}, + {file = "black-24.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:f490dbd59680d809ca31efdae20e634f3fae27fba3ce0ba3208333b713bc3920"}, + {file = "black-24.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eab4dd44ce80dea27dc69db40dab62d4ca96112f87996bca68cd75639aeb2e4c"}, + {file = "black-24.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3c4285573d4897a7610054af5a890bde7c65cb466040c5f0c8b732812d7f0e5e"}, + {file = "black-24.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e84e33b37be070ba135176c123ae52a51f82306def9f7d063ee302ecab2cf47"}, + {file = "black-24.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:73bbf84ed136e45d451a260c6b73ed674652f90a2b3211d6a35e78054563a9bb"}, + {file = "black-24.8.0-py3-none-any.whl", hash = "sha256:972085c618ee94f402da1af548a4f218c754ea7e5dc70acb168bfaca4c2542ed"}, + {file = "black-24.8.0.tar.gz", hash = "sha256:2500945420b6784c38b9ee885af039f5e7471ef284ab03fa35ecdde4688cd83f"}, ] [package.dependencies] @@ -1466,29 +1466,30 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.9.10" +version = "0.14.2" description = "An extremely fast Python linter and code formatter, written in Rust." 
optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.9.10-py3-none-linux_armv6l.whl", hash = "sha256:eb4d25532cfd9fe461acc83498361ec2e2252795b4f40b17e80692814329e42d"}, - {file = "ruff-0.9.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:188a6638dab1aa9bb6228a7302387b2c9954e455fb25d6b4470cb0641d16759d"}, - {file = "ruff-0.9.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:5284dcac6b9dbc2fcb71fdfc26a217b2ca4ede6ccd57476f52a587451ebe450d"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47678f39fa2a3da62724851107f438c8229a3470f533894b5568a39b40029c0c"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:99713a6e2766b7a17147b309e8c915b32b07a25c9efd12ada79f217c9c778b3e"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:524ee184d92f7c7304aa568e2db20f50c32d1d0caa235d8ddf10497566ea1a12"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:df92aeac30af821f9acf819fc01b4afc3dfb829d2782884f8739fb52a8119a16"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de42e4edc296f520bb84954eb992a07a0ec5a02fecb834498415908469854a52"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d257f95b65806104b6b1ffca0ea53f4ef98454036df65b1eda3693534813ecd1"}, - {file = "ruff-0.9.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b60dec7201c0b10d6d11be00e8f2dbb6f40ef1828ee75ed739923799513db24c"}, - {file = "ruff-0.9.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d838b60007da7a39c046fcdd317293d10b845001f38bcb55ba766c3875b01e43"}, - {file = "ruff-0.9.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ccaf903108b899beb8e09a63ffae5869057ab649c1e9231c05ae354ebc62066c"}, - {file = "ruff-0.9.10-py3-none-musllinux_1_2_i686.whl", hash = 
"sha256:f9567d135265d46e59d62dc60c0bfad10e9a6822e231f5b24032dba5a55be6b5"}, - {file = "ruff-0.9.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5f202f0d93738c28a89f8ed9eaba01b7be339e5d8d642c994347eaa81c6d75b8"}, - {file = "ruff-0.9.10-py3-none-win32.whl", hash = "sha256:bfb834e87c916521ce46b1788fbb8484966e5113c02df216680102e9eb960029"}, - {file = "ruff-0.9.10-py3-none-win_amd64.whl", hash = "sha256:f2160eeef3031bf4b17df74e307d4c5fb689a6f3a26a2de3f7ef4044e3c484f1"}, - {file = "ruff-0.9.10-py3-none-win_arm64.whl", hash = "sha256:5fd804c0327a5e5ea26615550e706942f348b197d5475ff34c19733aee4b2e69"}, - {file = "ruff-0.9.10.tar.gz", hash = "sha256:9bacb735d7bada9cfb0f2c227d3658fc443d90a727b47f206fb33f52f3c0eac7"}, + {file = "ruff-0.14.2-py3-none-linux_armv6l.whl", hash = "sha256:7cbe4e593505bdec5884c2d0a4d791a90301bc23e49a6b1eb642dd85ef9c64f1"}, + {file = "ruff-0.14.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:8d54b561729cee92f8d89c316ad7a3f9705533f5903b042399b6ae0ddfc62e11"}, + {file = "ruff-0.14.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:5c8753dfa44ebb2cde10ce5b4d2ef55a41fb9d9b16732a2c5df64620dbda44a3"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d0bbeffb8d9f4fccf7b5198d566d0bad99a9cb622f1fc3467af96cb8773c9e3"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7047f0c5a713a401e43a88d36843d9c83a19c584e63d664474675620aaa634a8"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3bf8d2f9aa1602599217d82e8e0af7fd33e5878c4d98f37906b7c93f46f9a839"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:1c505b389e19c57a317cf4b42db824e2fca96ffb3d86766c1c9f8b96d32048a7"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a307fc45ebd887b3f26b36d9326bb70bf69b01561950cdcc6c0bdf7bb8e0f7cc"}, + {file = 
"ruff-0.14.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:61ae91a32c853172f832c2f40bd05fd69f491db7289fb85a9b941ebdd549781a"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1967e40286f63ee23c615e8e7e98098dedc7301568bd88991f6e544d8ae096"}, + {file = "ruff-0.14.2-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:2877f02119cdebf52a632d743a2e302dea422bfae152ebe2f193d3285a3a65df"}, + {file = "ruff-0.14.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e681c5bc777de5af898decdcb6ba3321d0d466f4cb43c3e7cc2c3b4e7b843a05"}, + {file = "ruff-0.14.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e21be42d72e224736f0c992cdb9959a2fa53c7e943b97ef5d081e13170e3ffc5"}, + {file = "ruff-0.14.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:b8264016f6f209fac16262882dbebf3f8be1629777cf0f37e7aff071b3e9b92e"}, + {file = "ruff-0.14.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5ca36b4cb4db3067a3b24444463ceea5565ea78b95fe9a07ca7cb7fd16948770"}, + {file = "ruff-0.14.2-py3-none-win32.whl", hash = "sha256:41775927d287685e08f48d8eb3f765625ab0b7042cc9377e20e64f4eb0056ee9"}, + {file = "ruff-0.14.2-py3-none-win_amd64.whl", hash = "sha256:0df3424aa5c3c08b34ed8ce099df1021e3adaca6e90229273496b839e5a7e1af"}, + {file = "ruff-0.14.2-py3-none-win_arm64.whl", hash = "sha256:ea9d635e83ba21569fbacda7e78afbfeb94911c9434aff06192d9bc23fd5495a"}, + {file = "ruff-0.14.2.tar.gz", hash = "sha256:98da787668f239313d9c902ca7c523fe11b8ec3f39345553a51b25abc4629c96"}, ] [[package]] @@ -1712,4 +1713,4 @@ orjson = ["orjson"] [metadata] lock-version = "2.0" python-versions = ">=3.8.1" -content-hash = "35ce70402138519fd62cf5ec9901125cab2c36fc6965ef13c66020637c8c8bd4" +content-hash = "5113fb643b174ebedd081933dc427c4cad3fe80266c3d81c6be6291ce2ea2620" diff --git a/pyproject.toml b/pyproject.toml index 474854a9fd..5e3691fc28 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -17,6 +17,7 @@ classifiers=[ "Programming Language :: Python 
:: 3.11", "Programming Language :: Python :: 3.12", "Programming Language :: Python :: 3.13", + "Programming Language :: Python :: 3.14", "License :: OSI Approved :: BSD License", "Topic :: Software Development :: Libraries :: Python Modules", "Operating System :: OS Independent", @@ -50,8 +51,8 @@ lxml = {version = ">=4.3,<6.0", optional = true} orjson = {version = ">=3.9.14,<4", optional = true} [tool.poetry.group.dev.dependencies] -black = "24.4.2" -mypy = "^1.1.0" +black = "24.8.0" +mypy = "1.14.1" lxml-stubs = ">=0.4,<0.6" pip-tools = "^7.4.1" @@ -74,7 +75,7 @@ mkdocs-gen-files = "^0.5.0" mkdocs-include-markdown-plugin = {version = "^7.2.0", python = ">=3.11"} [tool.poetry.group.lint.dependencies] -ruff = ">=0.0.286,<0.10.0" +ruff = "0.14.2" [tool.poetry.extras] berkeleydb = ["berkeleydb"] @@ -173,7 +174,7 @@ ignore = [ [tool.black] line-length = 88 target-version = ['py38'] -required-version = "24.4.2" +required-version = "24.8.0" include = '\.pyi?$' exclude = ''' ( @@ -228,7 +229,7 @@ log_cli_date_format = "%Y-%m-%dT%H:%M:%S" [tool.isort] profile = "black" -py_version = 37 +py_version = 38 line_length = 88 src_paths= ["rdflib", "test", "devtools", "examples"] supported_extensions = ["pyw", "pyi", "py"] diff --git a/test/test_graph/test_graph_http.py b/test/test_graph/test_graph_http.py index 11eebe38b1..d4bb7a61d3 100644 --- a/test/test_graph/test_graph_http.py +++ b/test/test_graph/test_graph_http.py @@ -66,7 +66,7 @@ class ContentNegotiationHandler(BaseHTTPRequestHandler): - def do_GET(self): # noqa: N802 + def do_GET(self): self.send_response(200, "OK") # fun fun fun parsing accept header. 
diff --git a/test/test_store/test_store_sparqlstore.py b/test/test_store/test_store_sparqlstore.py index 1a26c1d289..51bee1ec14 100644 --- a/test/test_store/test_store_sparqlstore.py +++ b/test/test_store/test_store_sparqlstore.py @@ -440,7 +440,7 @@ def test_query(self): class SPARQL11ProtocolStoreMock(BaseHTTPRequestHandler): - def do_POST(self): # noqa: N802 + def do_POST(self): """ If the body should be analysed as well, just use: ``` @@ -477,7 +477,7 @@ def do_POST(self): # noqa: N802 self.end_headers() return - def do_GET(self): # noqa: N802 + def do_GET(self): # Process an HTTP GET request and return a response with an HTTP 200 status. self.send_response(200, "OK") self.end_headers() diff --git a/test/test_w3c_spec/test_sparql10_w3c.py b/test/test_w3c_spec/test_sparql10_w3c.py index 3f33ca0060..d11ff8924c 100644 --- a/test/test_w3c_spec/test_sparql10_w3c.py +++ b/test/test_w3c_spec/test_sparql10_w3c.py @@ -2,6 +2,7 @@ Runs the SPARQL 1.0 test suite from. """ +import sys from contextlib import ExitStack from typing import Generator @@ -100,6 +101,10 @@ f"{REMOTE_BASE_IRI}syntax-sparql4/syn-bad-37.rq": pytest.mark.xfail( reason="Accepts invalid query." ), + f"{REMOTE_BASE_IRI}solution-seq/manifest#slice-3": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), } diff --git a/test/test_w3c_spec/test_sparql11_w3c.py b/test/test_w3c_spec/test_sparql11_w3c.py index f68227470d..41fef9d08d 100644 --- a/test/test_w3c_spec/test_sparql11_w3c.py +++ b/test/test_w3c_spec/test_sparql11_w3c.py @@ -2,6 +2,7 @@ Runs the SPARQL 1.1 test suite from. """ +import sys from contextlib import ExitStack from typing import Generator @@ -238,6 +239,46 @@ f"{REMOTE_BASE_IRI}syntax-update-1/manifest#test_54": pytest.mark.xfail( reason="Parses sucessfully instead of failing." 
), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#csv01": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#tsv01": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#csv02": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#tsv02": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#csv03": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}csv-tsv-res/manifest#tsv03": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}functions/manifest#plus-1": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}functions/manifest#plus-2": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}json-res/manifest#jsonres01": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), + f"{REMOTE_BASE_IRI}json-res/manifest#jsonres02": pytest.mark.xfail( + condition=sys.version_info >= (3, 14), + 
reason="Python 3.14 raises a TypeError when evaluating NotImplemented as a boolean value.", + ), } diff --git a/test/utils/httpfileserver.py b/test/utils/httpfileserver.py index 774b042258..5449e5e595 100644 --- a/test/utils/httpfileserver.py +++ b/test/utils/httpfileserver.py @@ -185,7 +185,7 @@ def make_handler(self) -> Type[BaseHTTPRequestHandler]: class Handler(BaseHTTPRequestHandler): server: HTTPFileServer - def do_GET(self) -> None: # noqa: N802 + def do_GET(self) -> None: parsed_path = urlparse(self.path) path_query = parse_qs(parsed_path.query) body = None diff --git a/tox.ini b/tox.ini index c96bb2a6d7..87f3224530 100644 --- a/tox.ini +++ b/tox.ini @@ -3,7 +3,7 @@ [tox] minversion = 4.0.0 envlist = - lint,py3{8,9,10,11},covreport,docs,precommit + lint,py3{8,9,10,11,12,13,14},covreport,docs,precommit toxworkdir={env:TOX_WORK_DIR:{tox_root}/.tox} [testenv] @@ -19,9 +19,9 @@ setenv = extensive: POETRY_ARGS_extensive = --extras=berkeleydb --extras=networkx --extras=html --extras=orjson lxml: POETRY_ARGS_lxml = --extras=lxml commands_pre = - py3{8,9,10,11}: python -c 'import os; print("\n".join(f"{key}={value}" for key, value in os.environ.items()))' - py3{8,9,10,11}: poetry check --lock - py3{8,9,10,11}: poetry install --no-root --only=main --only=dev --only=lint --only=tests {env:POETRY_ARGS_docs:} {env:POETRY_ARGS_extensive:} {env:POETRY_ARGS_lxml:} {env:POETRY_ARGS:} --sync + py3{8,9,10,11,12,13,14}: python -c 'import os; print("\n".join(f"{key}={value}" for key, value in os.environ.items()))' + py3{8,9,10,11,12,13,14}: poetry check --lock + py3{8,9,10,11,12,13,14}: poetry install --no-root --only=main --only=dev --only=lint --only=tests {env:POETRY_ARGS_docs:} {env:POETRY_ARGS_extensive:} {env:POETRY_ARGS_lxml:} {env:POETRY_ARGS:} --sync commands = min: python -c 'import sys; print("min qualifier not supported on this environment"); sys.exit(1);' poetry config --list @@ -35,7 +35,7 @@ commands = [testenv:covreport] skip_install = true parallel_show_output 
= true -depends = py3{8,9,10,11}{-extensive,}{-docs,} +depends = py3{8,9,10,11,12,13,14}{-extensive,}{-docs,} setenv = COVERAGE_FILE= commands_pre = From c2c94e50d696d4d890bb5fbd40feb3678ed331d5 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 30 Oct 2025 20:38:09 +1000 Subject: [PATCH 53/60] chore: prep 7.4.0 release (#3289) * chore: remove remnant rst files * chore: prep 7.4.0 release --- CHANGELOG.md | 45 +++++++++++++++++++++++++-------------- CITATION.cff | 4 ++-- README.md | 1 + admin/README.md | 2 +- docs/apidocs/examples.rst | 0 docs/conf.py | 0 docs/developers.md | 14 ++++++------ docs/developers.rst | 0 docs/plugin_stores.rst | 0 pyproject.toml | 2 +- rdflib/__init__.py | 2 +- 11 files changed, 42 insertions(+), 28 deletions(-) delete mode 100644 docs/apidocs/examples.rst delete mode 100644 docs/conf.py delete mode 100644 docs/developers.rst delete mode 100644 docs/plugin_stores.rst diff --git a/CHANGELOG.md b/CHANGELOG.md index f6addadc9e..77ca9acdad 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,3 +1,16 @@ +## 2025-10-30 RELEASE 7.4.0 + +This release addresses a couple of bugs and improves the testing matrix by adding python 3.12, 3.13 and 3.14 to the test matrix. + +This is also the first RDFLib release to use MkDocs for documentation. 
+ +Pull requests merged: + +- chore(v7): upgrade ci and dev/test deps by @edmondchuc in [#3288](https://github.com/RDFLib/rdflib/pull/3288) +- feat: v7 mkdocs by @edmondchuc in [#3287](https://github.com/RDFLib/rdflib/pull/3287) +- fix: set changed size when iterating the store's graphs by @edmondchuc in [#3281](https://github.com/RDFLib/rdflib/pull/3281) +- lazy fix of issue pytest failure on python 3.13.8 - Removing of xfail raise restriction by @WhiteGobo in [#3275](https://github.com/RDFLib/rdflib/pull/3275) + ## 2025-10-24 RELEASE 7.3.0 This release delivers several important fixes and enhancements to RDFLib’s Dataset implementation, resolving long-standing issues and improving consistency across serialization and SPARQL operations. It also introduces new deprecation notices for certain Dataset methods and attributes, which will be removed in the next major release. In addition, this version includes a range of improvements to SPARQL result parsing, typing, and literal handling. @@ -33,22 +46,22 @@ This release delivers several important fixes and enhancements to RDFLib’s Dat Pull requests merged: -- feat: add Dataset `__iadd__` support by @edmondchuc in [#3268](https://api.github.com/repos/RDFLib/rdflib/pulls/3268) -- fix: RecursiveSerializer- outputs undeclared prefix for predicates that contains the base as a substring by @edmondchuc in [#3267](https://api.github.com/repos/RDFLib/rdflib/pulls/3267) -- fix: allow static type checkers to infer term's `__new__` type by @edmondchuc in [#3266](https://api.github.com/repos/RDFLib/rdflib/pulls/3266) -- fix: SPARQL Update inserts into the default graph by @edmondchuc in [#3265](https://api.github.com/repos/RDFLib/rdflib/pulls/3265) -- chore: add deprecation notice to Dataset methods and attributes by @edmondchuc in [#3264](https://api.github.com/repos/RDFLib/rdflib/pulls/3264) -- fix: Dataset.parse now returns Self by @edmondchuc in [#3263](https://api.github.com/repos/RDFLib/rdflib/pulls/3263) -- fix: dataset 
nquads serialization including RDFLib internal default graph identifier by @edmondchuc in [#3262](https://api.github.com/repos/RDFLib/rdflib/pulls/3262) -- patch for reevaluation in sparql modify between update loops. with test by @WhiteGobo in [#3261](https://api.github.com/repos/RDFLib/rdflib/pulls/3261) -- feat: change dataset's default serialize format to trig by @edmondchuc in [#3260](https://api.github.com/repos/RDFLib/rdflib/pulls/3260) -- feat: allow adding graphs backed by different stores to the same dataset by @edmondchuc in [#3259](https://api.github.com/repos/RDFLib/rdflib/pulls/3259) -- fix(v7): remove Literal.toPython date conversion for gYear/gYearMonth (#3115) by @edmondchuc in [#3258](https://api.github.com/repos/RDFLib/rdflib/pulls/3258) -- sparqls optionals clause can now bind variables. with test. issue 2957 by @WhiteGobo in [#3247](https://api.github.com/repos/RDFLib/rdflib/pulls/3247) -- fix: skip prefix generation for predicates corresponding to base namespace by @edmondchuc in [#3244](https://api.github.com/repos/RDFLib/rdflib/pulls/3244) -- Run the example queries agains the local fuseki by @white-gecko in [#3240](https://api.github.com/repos/RDFLib/rdflib/pulls/3240) -- Adjust the type hint for Graph open to reflect a SPARQLUpdateStore configuration by @white-gecko in [#3239](https://api.github.com/repos/RDFLib/rdflib/pulls/3239) -- SPARQL result parsing by @white-gecko in [#2796](https://api.github.com/repos/RDFLib/rdflib/pulls/2796) +- feat: add Dataset `__iadd__` support by @edmondchuc in [#3268](https://github.com/RDFLib/rdflib/pull/3268) +- fix: RecursiveSerializer- outputs undeclared prefix for predicates that contains the base as a substring by @edmondchuc in [#3267](https://github.com/RDFLib/rdflib/pull/3267) +- fix: allow static type checkers to infer term's `__new__` type by @edmondchuc in [#3266](https://github.com/RDFLib/rdflib/pull/3266) +- fix: SPARQL Update inserts into the default graph by @edmondchuc in 
[#3265](https://github.com/RDFLib/rdflib/pull/3265) +- chore: add deprecation notice to Dataset methods and attributes by @edmondchuc in [#3264](https://github.com/RDFLib/rdflib/pull/3264) +- fix: Dataset.parse now returns Self by @edmondchuc in [#3263](https://github.com/RDFLib/rdflib/pull/3263) +- fix: dataset nquads serialization including RDFLib internal default graph identifier by @edmondchuc in [#3262](https://github.com/RDFLib/rdflib/pull/3262) +- patch for reevaluation in sparql modify between update loops. with test by @WhiteGobo in [#3261](https://github.com/RDFLib/rdflib/pull/3261) +- feat: change dataset's default serialize format to trig by @edmondchuc in [#3260](https://github.com/RDFLib/rdflib/pull/3260) +- feat: allow adding graphs backed by different stores to the same dataset by @edmondchuc in [#3259](https://github.com/RDFLib/rdflib/pull/3259) +- fix(v7): remove Literal.toPython date conversion for gYear/gYearMonth (#3115) by @edmondchuc in [#3258](https://github.com/RDFLib/rdflib/pull/3258) +- sparqls optionals clause can now bind variables. with test. 
issue 2957 by @WhiteGobo in [#3247](https://github.com/RDFLib/rdflib/pull/3247) +- fix: skip prefix generation for predicates corresponding to base namespace by @edmondchuc in [#3244](https://github.com/RDFLib/rdflib/pull/3244) +- Run the example queries agains the local fuseki by @white-gecko in [#3240](https://github.com/RDFLib/rdflib/pull/3240) +- Adjust the type hint for Graph open to reflect a SPARQLUpdateStore configuration by @white-gecko in [#3239](https://github.com/RDFLib/rdflib/pull/3239) +- SPARQL result parsing by @white-gecko in [#2796](https://github.com/RDFLib/rdflib/pull/2796) ## 2025-09-19 RELEASE 7.2.1 diff --git a/CITATION.cff b/CITATION.cff index c111335f86..7cf2c00b05 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -69,7 +69,7 @@ authors: - family-names: "Stuart" given-names: "Veyndan" title: "RDFLib" -version: 7.3.0 -date-released: 2025-10-24 +version: 7.4.0 +date-released: 2025-10-30 url: "https://github.com/RDFLib/rdflib" doi: 10.5281/zenodo.6845245 diff --git a/README.md b/README.md index 3cfb4e632f..4d63ccec4f 100644 --- a/README.md +++ b/README.md @@ -45,6 +45,7 @@ Help with maintenance of all of the RDFLib family of packages is always welcome ## Versions & Releases * `main` branch in this repository is the current unstable release - version 8 alpha +* `7.4.0` a few small fixes, add test matrix for active python versions, and move v7 documentation to MkDocs * `7.3.0` many fixes and usability improvements, particularly for the Dataset class. 
See changelog for details * `7.2.1` tiny clean up release, relaxes Python version requirement * `7.2.0` general fixes and usability improvements, see changelog for details diff --git a/admin/README.md b/admin/README.md index c511b9bcec..a4d12b0d4c 100644 --- a/admin/README.md +++ b/admin/README.md @@ -9,5 +9,5 @@ To make a release of RDFLib, see the [Developer's Guide](https://rdflib.readthed An alternative to the `get_merged_prs.py` script is to use the GitHub CL to get the list of PRs and pipe it into the `pr_markdown.py` script. The following command retrieves the list of PRs merged since the last release (`2025-09-19`) from a particular branch (`7.x`). ```bash -gh api '/search/issues?q=repo:rdflib/rdflib+is:pr+is:merged+base:7.x+merged:>2025-09-19&per_page=100' | jq '{total_count, incomplete_results, items: [.items[] | {number, title, pull_request_merged_at: .pull_request.merged_at, pull_request_url: .pull_request.url, username: .user.login}]}' | poetry run python admin/pr_markdown.py +gh api '/search/issues?q=repo:rdflib/rdflib+is:pr+is:merged+base:7.x+merged:>2025-10-24&per_page=100' | jq '{total_count, incomplete_results, items: [.items[] | {number, title, pull_request_merged_at: .pull_request.merged_at, pull_request_url: .pull_request.html_url, username: .user.login}]}' | poetry run python admin/pr_markdown.py ``` diff --git a/docs/apidocs/examples.rst b/docs/apidocs/examples.rst deleted file mode 100644 index e69de29bb2..0000000000 diff --git a/docs/conf.py b/docs/conf.py deleted file mode 100644 index e69de29bb2..0000000000 diff --git a/docs/developers.md b/docs/developers.md index 4687bbe038..3f00b62026 100644 --- a/docs/developers.md +++ b/docs/developers.md @@ -327,13 +327,13 @@ RDFLib 5.0.0 maintained compatibility with Python versions 2.7, 3.4, 3.5, 3.6, 3 Create a release-preparation pull request with the following changes: -* Updated version and date in `CITATION.cff`. -* Updated copyright year in the `LICENSE` file. 
-* Updated copyright year in the `docs/conf.py` file. -* Updated main branch version and current version in the `README.md` file. -* Updated version in the `pyproject.toml` file. -* Updated `__date__` in the `rdflib/__init__.py` file. -* Accurate `CHANGELOG.md` entry for the release. +* Updated version and date in [`CITATION.cff`](../CITATION.cff). +* Updated copyright year in the [`LICENSE`](../LICENSE) file. +* Updated copyright year in the [`mkdocs.yml`](../mkdocs.yml) file. +* Updated main branch version and current version in the [`README.md`](../README.md) file. +* Updated version in the [`pyproject.toml`](../pyproject.toml) file. +* Updated `__date__` in the [`rdflib/__init__.py`](../rdflib/__init__.py) file. +* Updated [`CHANGELOG.md`](../CHANGELOG.md) entry for the release with admin tools as described in [`admin/README.md`](../admin/README.md). Once the PR is merged, switch to the main branch, build the release and upload it to PyPI: diff --git a/docs/developers.rst b/docs/developers.rst deleted file mode 100644 index e69de29bb2..0000000000 diff --git a/docs/plugin_stores.rst b/docs/plugin_stores.rst deleted file mode 100644 index e69de29bb2..0000000000 diff --git a/pyproject.toml b/pyproject.toml index 5e3691fc28..3b9b8635d5 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.4.0-a0" +version = "7.4.0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] diff --git a/rdflib/__init__.py b/rdflib/__init__.py index c538272bd0..16fbb9bcd6 100644 --- a/rdflib/__init__.py +++ b/rdflib/__init__.py @@ -54,7 +54,7 @@ __docformat__ = "restructuredtext en" __version__: str = _DISTRIBUTION_METADATA["Version"] -__date__ = "2025-10-24" +__date__ = "2025-10-30" __all__ = [ "URIRef", From 3c9d9da3a3ea1bd783f551b5b7e4ae1113e660db Mon Sep 17 00:00:00 2001 From: Edmond Chuc 
<37032744+edmondchuc@users.noreply.github.com> Date: Fri, 31 Oct 2025 00:22:21 +1000 Subject: [PATCH 54/60] chore: post steps for 7.4.0 release (#3291) --- docker/latest/requirements.in | 2 +- docker/latest/requirements.txt | 2 +- pyproject.toml | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/docker/latest/requirements.in b/docker/latest/requirements.in index 4bbaa611e4..10ff547f3e 100644 --- a/docker/latest/requirements.in +++ b/docker/latest/requirements.in @@ -1,4 +1,4 @@ # This file is used for building a docker image of the latest rdflib release. It # will be updated by dependabot when new releases are made. -rdflib==7.3.0 +rdflib==7.4.0 html5rdf==1.2.1 diff --git a/docker/latest/requirements.txt b/docker/latest/requirements.txt index d490b232ad..2268779dcd 100644 --- a/docker/latest/requirements.txt +++ b/docker/latest/requirements.txt @@ -8,5 +8,5 @@ html5rdf==1.2.1 # via -r docker/latest/requirements.in pyparsing==3.0.9 # via rdflib -rdflib==7.3.0 +rdflib==7.4.0 # via -r docker/latest/requirements.in diff --git a/pyproject.toml b/pyproject.toml index 3b9b8635d5..9ccea14b32 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "rdflib" -version = "7.4.0" +version = "7.5.0-a0" description = """RDFLib is a Python library for working with RDF, \ a simple yet powerful language for representing information.""" authors = ["Daniel 'eikeon' Krech "] From 92ad5094f0864e52a4eeb38668cb32d101f35884 Mon Sep 17 00:00:00 2001 From: Edmond Chuc <37032744+edmondchuc@users.noreply.github.com> Date: Thu, 6 Nov 2025 22:40:51 +1000 Subject: [PATCH 55/60] test: fix dbpedia webtests (#3304) --- test/test_sparql/test_service.py | 114 +++++++++++++++++-------------- 1 file changed, 64 insertions(+), 50 deletions(-) diff --git a/test/test_sparql/test_service.py b/test/test_sparql/test_service.py index 83ac8388fd..55be56f7ac 100644 --- a/test/test_sparql/test_service.py +++ b/test/test_sparql/test_service.py @@ -18,17 +18,18 @@ 
@pytest.mark.webtest def test_service(): g = Graph() - q = """select ?sameAs ?dbpComment - where - { service - { select ?dbpHypernym ?dbpComment - where - { - - ?sameAs ; - ?dbpComment . - - } } } limit 2""" + q = """ + SELECT ?p ?o + WHERE { + SERVICE { + SELECT ?p ?o + WHERE { + ?p ?o + } + } + } + LIMIT 2 + """ try: results = helper.query_with_retry(g, q) except (RemoteDisconnected, IncompleteRead): @@ -44,19 +45,23 @@ def test_service(): @pytest.mark.webtest def test_service_with_bind(): g = Graph() - q = """select ?sameAs ?dbpComment ?subject - where - { bind ( as ?subject) - service - { select ?sameAs ?dbpComment ?subject - where - { - - ?sameAs ; - ?dbpComment ; - ?subject . - - } } } limit 2""" + q = """ + PREFIX dbp: + SELECT ?sameAs ?dbpComment ?subject + WHERE { + BIND( AS ?subject) + SERVICE { + SELECT ?sameAs ?dbpComment ?subject + WHERE { + + ?sameAs ; + dbp:caption ?dbpComment ; + ?subject . + } + } + } + LIMIT 2 + """ try: results = helper.query_with_retry(g, q) except (RemoteDisconnected, IncompleteRead): @@ -80,27 +85,30 @@ def test_service_with_bound_solutions(): """ ) q = """ - SELECT ?sameAs ?dbpComment ?subject WHERE { - [] - ?sameAs ; - ?subject . - - SERVICE { - SELECT ?sameAs ?dbpComment ?subject WHERE { - + PREFIX dbp: + SELECT ?sameAs ?dbpComment ?subject + WHERE { + [] ?sameAs ; - ?dbpComment ; ?subject . + + SERVICE { + SELECT ?sameAs ?dbpComment ?subject + WHERE { + + ?sameAs ; + dbp:caption ?dbpComment ; + ?subject . 
+ } + } } - } - } - LIMIT 2 + LIMIT 2 """ try: results = helper.query_with_retry(g, q) except (RemoteDisconnected, IncompleteRead): pytest.skip("this test uses dbpedia which is down sometimes") - assert len(results) == 2 + assert len(results) == 1 for r in results: assert len(r) == 3 @@ -109,19 +117,25 @@ def test_service_with_bound_solutions(): @pytest.mark.webtest def test_service_with_values(): g = Graph() - q = """select ?sameAs ?dbpComment ?subject - where - { values (?sameAs ?subject) {( ) ( )} - service - { select ?sameAs ?dbpComment ?subject - where - { - - ?sameAs ; - ?dbpComment ; - ?subject . - - } } } limit 2""" + q = """ + PREFIX dbp: + SELECT ?sameAs ?dbpComment ?subject + WHERE { + VALUES (?sameAs ?subject) { + ( ) ( ) + } + SERVICE { + SELECT ?sameAs ?dbpComment ?subject + WHERE { + + ?sameAs ; + dbp:caption ?dbpComment ; + ?subject . + } + } + } + LIMIT 2 + """ try: results = helper.query_with_retry(g, q) except (RemoteDisconnected, IncompleteRead): From 02862f17a5cda8f49a6de393f516b5ce91fbd158 Mon Sep 17 00:00:00 2001 From: Edmond Chuc Date: Wed, 12 Nov 2025 14:43:47 +1000 Subject: [PATCH 56/60] fix: These markdown docs were added back in when moving to MkDocs in v7.4.0. This removes them again and also removes them from the mkdocs.yml config. --- docs/persisting_n3_terms.md | 89 ----------------------------------- docs/type_hints.md | 92 ------------------------------------- mkdocs.yml | 2 - 3 files changed, 183 deletions(-) delete mode 100644 docs/persisting_n3_terms.md delete mode 100644 docs/type_hints.md diff --git a/docs/persisting_n3_terms.md b/docs/persisting_n3_terms.md deleted file mode 100644 index 5cf59dfdbd..0000000000 --- a/docs/persisting_n3_terms.md +++ /dev/null @@ -1,89 +0,0 @@ -# Persisting Notation 3 Terms - -## Using N3 Syntax for Persistence - -Blank Nodes, Literals, URI References, and Variables can be distinguished in persistence by relying on Notation 3 syntax convention. 
- -All URI References can be expanded and persisted as: - -```turtle -<..URI..> -``` - -All Literals can be expanded and persisted as: - -```turtle -"..value.."@lang or "..value.."^^dtype_uri -``` - -!!! abstract "Language tag" - `@lang` is a language tag and `^^dtype_uri` is the URI of a data type associated with the Literal - -Blank Nodes can be expanded and persisted as: - -```turtle -_:Id -``` - -!!! info "About skolemization" - Where Id is an identifier as determined by skolemization. Skolemization is a syntactic transformation routinely used in automatic inference systems in which existential variables are replaced by 'new' functions - function names not used elsewhere - applied to any enclosing universal variables. In RDF, Skolemization amounts to replacing every blank node in a graph by a 'new' name, i.e. a URI reference which is guaranteed to not occur anywhere else. In effect, it gives 'arbitrary' names to the anonymous entities whose existence was asserted by the use of blank nodes: the arbitrariness of the names ensures that nothing can be inferred that would not follow from the bare assertion of existence represented by the blank node. (Using a literal would not do. Literals are never 'new' in the required sense.) - -Variables can be persisted as they appear in their serialization `(?varName)` - since they only need be unique within their scope (the context of their associated statements) - -These syntactic conventions can facilitate term round-tripping. - -## Variables by Scope - -Would an interface be needed in order to facilitate a quick way to aggregate all the variables in a scope (given by a formula identifier)? An interface such as: - -```python -def variables(formula_identifier) -``` - -## The Need to Skolemize Formula Identifiers - -It would seem reasonable to assume that a formula-aware store would assign Blank Node identifiers as names of formulae that appear in a N3 serialization. 
So for instance, the following bit of N3: - -``` -{?x a :N3Programmer} => {?x :has :Migrane} -``` - -Could be interpreted as the assertion of the following statement: - -```turtle -_:a log:implies _:b -``` - -However, how are `_:a` and `_:b` distinguished from other Blank Nodes? A formula-aware store would be expected to persist the first set of statements as quoted statements in a formula named `_:a` and the second set as quoted statements in a formula named `_:b`, but it would not be cost-effective for a serializer to have to query the store for all statements in a context named `_:a` in order to determine if `_:a` was associated with a formula (so that it could be serialized properly). - -## Relying on `log:Formula` Membership - -The store could rely on explicit `log:Formula` membership (via `rdf:type` statements) to model the distinction of Blank Nodes associated with formulae. However, would these statements be expected from an N3 parser or known implicitly by the store? i.e., would all such Blank Nodes match the following pattern: - -```turtle -?formula rdf:type log:Formula -``` - -## Relying on an Explicit Interface - -A formula-aware store could also support the persistence of this distinction by implementing a method that returns an iterator over all the formulae in the store: - -```python -def formulae(triple=None) -``` - -This function would return all the Blank Node identifiers assigned to formulae, or just those that contain statements matching the given triple pattern, and would be the way a serializer determines if a term refers to a formula (in order to properly serialize it). - -How much would such an interface reduce the need to model formulae terms as first-class objects (perhaps to be returned by the [`triples()`][rdflib.Graph.triples] function)? Would it be more useful for the [`Graph`][rdflib.Graph] (or the store itself) to return a Context object in place of a formula term (using the formulae interface to make this determination)?
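The proposed `formulae(triple=None)` interface can be made concrete with a toy sketch. Everything here is hypothetical (the class name, the plain-string terms); it only illustrates the contract the document describes: enumerate formula identifiers, optionally filtered by a quoted-triple pattern, with `None` as a wildcard.

```python
# Hypothetical sketch: a tiny in-memory quoted-statement index that could
# back the proposed formulae(triple=None) interface.
class ToyFormulaStore:
    def __init__(self):
        # formula identifier -> set of (s, p, o) quoted triples
        self._formulas = {}

    def add_quoted(self, formula_id, triple):
        self._formulas.setdefault(formula_id, set()).add(triple)

    def formulae(self, triple=None):
        # Yield identifiers of all formulae, or only those containing a
        # statement matching the (s, p, o) pattern (None = wildcard).
        for fid, triples in self._formulas.items():
            if triple is None:
                yield fid
                continue
            s, p, o = triple
            if any((s is None or s == ts)
                   and (p is None or p == tp)
                   and (o is None or o == to)
                   for ts, tp, to in triples):
                yield fid


store = ToyFormulaStore()
store.add_quoted("_:a", ("?x", "rdf:type", ":N3Programmer"))
store.add_quoted("_:b", ("?x", ":has", ":Migrane"))
print(sorted(store.formulae()))                      # ['_:a', '_:b']
print(sorted(store.formulae((None, ":has", None))))  # ['_:b']
```

With such an index a serializer can test "is `_:a` a formula?" in one call, rather than querying for all statements in context `_:a`, which is the cost-effectiveness concern raised above.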
- -Conversely, would these interfaces (variables and formulae) be considered optimizations only since you have the distinction by the kinds of terms triples returns (which would be expanded to include variables and formulae)? - -## Persisting Formula Identifiers - -This is the most straight forward way to maintain this distinction - without relying on extra interfaces. Formula identifiers could be persisted distinctly from other terms by using the following notation: - -``` -{_:bnode} or {<.. URI ..>} -``` - -This would facilitate their persistence round-trip - same as the other terms that rely on N3 syntax to distinguish between each other. diff --git a/docs/type_hints.md b/docs/type_hints.md deleted file mode 100644 index b859526eb4..0000000000 --- a/docs/type_hints.md +++ /dev/null @@ -1,92 +0,0 @@ -# Type Hints - -This document provides some details about the type hints for RDFLib. More information about type hints can be found [here](https://docs.python.org/3/library/typing.html) - -## Rationale for Type Hints - -Type hints are code annotations that describe the types of variables, function parameters and function return value types in a way that can be understood by humans, static type checkers like [mypy](http://mypy-lang.org/), code editors like VSCode, documentation generators like mkdocstring, and other tools. - -Static type checkers can use type hints to detect certain classes of errors by inspection. Code editors and IDEs can use type hints to provide better auto-completion and documentation generators can use type hints to generate better documentation. - -These capabilities make it easier to develop a defect-free RDFLib and they also make it easier for users of RDFLib who can now use static type checkers to detect type errors in code that uses RDFLib. - -## Gradual Typing Process - -Type hints are being added to RDFLib through a process called [gradual typing](https://en.wikipedia.org/wiki/Gradual_typing). 
This process involves adding type hints to some parts of RDFLib while leaving the rest without type hints. Gradual typing is being applied to many, long-lived, Python code bases. - -This process is beneficial in that we can realize some of the benefits of type hints without requiring that the whole codebase have type hints. - -## Intended Type Hints - -The intent is to have type hints in place for all of RDFLib and to have these type hints be as accurate as possible. - -The accuracy of type hints is determined by both the standards that RDFLib aims to conform to, like RDF 1.1, and the deliberate choices that are made when implementing RDFLib. For example, given that the RDF 1.1 specification stipulates that the subject of an RDF triple cannot be a literal, all functions that accept an *RDF term* to be used as the subject of a triple should have type hints which excludes values that are literals. - -There may be cases where some functionality of RDFLib may work perfectly well with values of types that are excluded by the type hints, but if these additional types violate the relevant standards we will consider the correct type hints to be those that exclude values of these types. - -## Public Type Aliases - -In python, type hints are specified in annotations. Type hints are different from type aliases which are normal python variables that are not intended to provide runtime utility and are instead intended for use in static type checking. - -For clarity, the following is an example of a function `foo` with type hints: - -```python -def foo(a: int) -> int: - return a + 1 -``` - -In the function `foo`, the input variable `a` is indicated to be of type `int` and the function is indicated to return an `int`. - -The following is an example of a type alias `Bar`: - -```python -from typing import Tuple - -Bar = Tuple[int, str] -``` - -RDFLib will provide public type aliases under the `rdflib.typing` package, for example, `rdflib.typing.Triple`, `rdflib.typing.Quad`. 
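The aliases mentioned above can be pictured with a small self-contained sketch. The `Node = str` stand-in is an assumption for illustration only; rdflib's real term classes are richer, and the actual `rdflib.typing` definitions may differ.

```python
# Hypothetical shapes for Triple/Quad-style aliases (illustration only,
# not the actual rdflib.typing definitions).
from typing import Tuple

Node = str  # stand-in for rdflib's term classes
Triple = Tuple[Node, Node, Node]
Quad = Tuple[Node, Node, Node, Node]


def subject(t: Triple) -> Node:
    # A static checker verifies here that t is a 3-tuple of Node.
    return t[0]


t: Triple = ("ex:s", "ex:p", "ex:o")
print(subject(t))  # ex:s
```

Because the alias is a normal variable, `mypy` would flag `subject(("ex:s", "ex:p"))` as an error at check time, while at runtime the alias adds no cost.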
Type aliases in the rest of RDFLib should be private (i.e. begin with an underscore). - -## Versioning, Compatibility and Stability - -RDFLib attempts to adhere to [semver 2.0](https://semver.org/spec/v2.0.0.html) which is concerned with the public API of software. - -Ignoring type hints, the public API of RDFLib exists implicitly as a consequence of the code of RDFLib and the actual behaviour this entails, the relevant standards that RDFLib is trying to implement, and the documentation of RDFLib, with some interplay between all three of these. RDFLib's public API includes public type aliases, as these are normal python variables and not annotations. - -Type hints attempt to formally document RDFLib's implicitly-defined public API in a machine-readable fashion as accurately and correctly as possible within the framework outlined earlier in this document. - -Type hints do not affect the runtime API or behaviour of RDFLib. In this way they are somewhat outside of the scope of semver; however, they still have an impact on the users of RDFLib, even if this impact is not at runtime, but during development. This necessitates some clarity as to what users of RDFLib should expect regarding type hints in RDFLib releases. - -Changes to type hints can broadly be classified as follows: - -**Type Declaration** - Adding type hints to existing code that had no explicit type hints, for example, changing - -```python -def foo(val): - return val + 1 -``` - -to - -```python -def foo(val: int) -> int: - return val + 1 -``` - 
- -**Type Corrections** - Correcting existing type hints which contradict the behaviour of the code or relevant specifications, for example, changing `typing.Sequence` from `typing.Set` - -Given semver version components `MAJOR.MINOR.PATCH`, RDFLib will attempt to constrain type hint changes as follow: - -| Version Component | Type Declaration | Type Refinement | Type Corrections | -|------------------|-----------------|----------------|-----------------| -| MAJOR | YES | YES | YES | -| MINOR | YES | YES | YES | -| PATCH | NO | NO | YES | - -!!! caution "Type Corrections" - A caveat worth nothing here is that code that passed type validation on one version of RDFLib can fail type validation on a later version of RDFLib that only differs in `PATCH` version component. This is as a consequence of potential *Type Corrections*. diff --git a/mkdocs.yml b/mkdocs.yml index ec321cfedd..8cd9ff6312 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -56,8 +56,6 @@ nav: - Contributing guide: CONTRIBUTING.md - Developers guide: developers.md - Documentation guide: docs.md - - Type Hints: type_hints.md - - Persisting Notation 3 Terms: persisting_n3_terms.md - Code of Conduct: CODE_OF_CONDUCT.md - Decision Records: decisions.md From e9898a5793c970a14672e85d6a93903d24c9a159 Mon Sep 17 00:00:00 2001 From: Edmond Chuc Date: Wed, 12 Nov 2025 14:52:17 +1000 Subject: [PATCH 57/60] build: fix poetry.lock having two conflicting click entries --- poetry.lock | 16 ---------------- 1 file changed, 16 deletions(-) diff --git a/poetry.lock b/poetry.lock index 775588af75..c19e18c612 100644 --- a/poetry.lock +++ b/poetry.lock @@ -279,22 +279,6 @@ files = [ [package.dependencies] colorama = {version = "*", markers = "platform_system == \"Windows\""} -[[package]] -name = "click" -version = "8.3.0" -description = "Composable command line interface toolkit" -optional = false -python-versions = ">=3.10" -groups = ["dev", "docs"] -markers = "python_version >= \"3.11\"" -files = [ - {file = 
"click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc"}, - {file = "click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4"}, -] - -[package.dependencies] -colorama = {version = "*", markers = "platform_system == \"Windows\""} - [[package]] name = "colorama" version = "0.4.6" From 8d7868006037d82f4f67bb0b6461c12c64dfb2ee Mon Sep 17 00:00:00 2001 From: Edmond Chuc Date: Wed, 12 Nov 2025 14:56:13 +1000 Subject: [PATCH 58/60] build: revert back to main's poetry.lock file as the newly generated one still has duplicate package entries for different versions (click, griffe, etc) --- poetry.lock | 1091 +++++++++++++++++++-------------------------------- 1 file changed, 393 insertions(+), 698 deletions(-) diff --git a/poetry.lock b/poetry.lock index c19e18c612..0b6b5649c1 100644 --- a/poetry.lock +++ b/poetry.lock @@ -17,19 +17,18 @@ dev = ["backports.zoneinfo ; python_version < \"3.9\"", "freezegun (>=1.0,<2.0)" [[package]] name = "backrefs" -version = "6.0.1" +version = "5.8" description = "A wrapper around re and regex that adds additional back references." 
optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "backrefs-6.0.1-py310-none-any.whl", hash = "sha256:78a69e21b71d739b625b52b5adbf7eb1716fb4cf0a39833826f59546f321cb99"}, - {file = "backrefs-6.0.1-py311-none-any.whl", hash = "sha256:6ba76d616ccb02479a3a098ad1f46d92225f280d7bdce7583bc62897f32d946c"}, - {file = "backrefs-6.0.1-py312-none-any.whl", hash = "sha256:2f440f79f5ef5b9083fd366a09a976690044eca0ea0e59ac0508c3630e0ebc7c"}, - {file = "backrefs-6.0.1-py313-none-any.whl", hash = "sha256:62ea7e9b286808576f35b2d28a0daa09b85ae2fc71b82a951d35729b0138e66b"}, - {file = "backrefs-6.0.1-py314-none-any.whl", hash = "sha256:3ba0d943178d24a3721c5d915734767fa93f3bde1d317c4ef9e0f33b21b9c302"}, - {file = "backrefs-6.0.1-py39-none-any.whl", hash = "sha256:b1a61b29c35cc72cfb54886164b626fbe64cab74e9d8dcac125155bd3acdb023"}, - {file = "backrefs-6.0.1.tar.gz", hash = "sha256:54f8453c9ae38417a83c06d23745c634138c8da622d87a12cb3eef9ba66dd466"}, + {file = "backrefs-5.8-py310-none-any.whl", hash = "sha256:c67f6638a34a5b8730812f5101376f9d41dc38c43f1fdc35cb54700f6ed4465d"}, + {file = "backrefs-5.8-py311-none-any.whl", hash = "sha256:2e1c15e4af0e12e45c8701bd5da0902d326b2e200cafcd25e49d9f06d44bb61b"}, + {file = "backrefs-5.8-py312-none-any.whl", hash = "sha256:bbef7169a33811080d67cdf1538c8289f76f0942ff971222a16034da88a73486"}, + {file = "backrefs-5.8-py313-none-any.whl", hash = "sha256:e3a63b073867dbefd0536425f43db618578528e3896fb77be7141328642a1585"}, + {file = "backrefs-5.8-py39-none-any.whl", hash = "sha256:a66851e4533fb5b371aa0628e1fee1af05135616b86140c9d787a2ffdf4b8fdc"}, + {file = "backrefs-5.8.tar.gz", hash = "sha256:2cab642a205ce966af3dd4b38ee36009b31fa9502a35fd61d59ccc116e40a6bd"}, ] [package.extras] @@ -37,14 +36,14 @@ extras = ["regex"] [[package]] name = "berkeleydb" -version = "18.1.15" +version = "18.1.14" description = "Python bindings for Oracle Berkeley DB" optional = true python-versions = "*" groups = ["main"] markers = "extra == 
\"berkeleydb\"" files = [ - {file = "berkeleydb-18.1.15.tar.gz", hash = "sha256:7afa53143d754c6bb2c85656c1325ebae518adcfcd1b59e13cc2abb88ddf758e"}, + {file = "berkeleydb-18.1.14.tar.gz", hash = "sha256:8c260282f57ebd5b9c3ce53da0eb75be5957addb303e3190935b716448f32f7d"}, ] [[package]] @@ -96,26 +95,26 @@ uvloop = ["uvloop (>=0.15.2)"] [[package]] name = "bracex" -version = "2.6" +version = "2.5.post1" description = "Bash style brace expander." optional = false -python-versions = ">=3.9" +python-versions = ">=3.8" groups = ["docs"] files = [ - {file = "bracex-2.6-py3-none-any.whl", hash = "sha256:0b0049264e7340b3ec782b5cb99beb325f36c3782a32e36e876452fd49a09952"}, - {file = "bracex-2.6.tar.gz", hash = "sha256:98f1347cd77e22ee8d967a30ad4e310b233f7754dbf31ff3fceb76145ba47dc7"}, + {file = "bracex-2.5.post1-py3-none-any.whl", hash = "sha256:13e5732fec27828d6af308628285ad358047cec36801598368cb28bc631dbaf6"}, + {file = "bracex-2.5.post1.tar.gz", hash = "sha256:12c50952415bfa773d2d9ccb8e79651b8cdb1f31a42f6091b804f6ba2b4a66b6"}, ] [[package]] name = "build" -version = "1.3.0" +version = "1.2.2.post1" description = "A simple, correct Python build frontend" optional = false -python-versions = ">=3.9" +python-versions = ">=3.8" groups = ["dev"] files = [ - {file = "build-1.3.0-py3-none-any.whl", hash = "sha256:7145f0b5061ba90a1500d60bd1b13ca0a8a4cebdd0cc16ed8adf1c0e739f43b4"}, - {file = "build-1.3.0.tar.gz", hash = "sha256:698edd0ea270bde950f53aed21f3a0135672206f3911e0176261a31e0e07b397"}, + {file = "build-1.2.2.post1-py3-none-any.whl", hash = "sha256:1d61c0887fa860c01971625baae8bdd338e517b836a2f70dd1f7aa3a6b2fc5b5"}, + {file = "build-1.2.2.post1.tar.gz", hash = "sha256:b36993e92ca9375a219c99e606a122ff365a760a2d4bba0caa09bd5278b608b7"}, ] [package.dependencies] @@ -126,142 +125,124 @@ pyproject_hooks = "*" tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} [package.extras] +docs = ["furo (>=2023.08.17)", "sphinx (>=7.0,<8.0)", "sphinx-argparse-cli (>=1.5)", 
"sphinx-autodoc-typehints (>=1.10)", "sphinx-issues (>=3.0.0)"] +test = ["build[uv,virtualenv]", "filelock (>=3)", "pytest (>=6.2.4)", "pytest-cov (>=2.12)", "pytest-mock (>=2)", "pytest-rerunfailures (>=9.1)", "pytest-xdist (>=1.34)", "setuptools (>=42.0.0) ; python_version < \"3.10\"", "setuptools (>=56.0.0) ; python_version == \"3.10\"", "setuptools (>=56.0.0) ; python_version == \"3.11\"", "setuptools (>=67.8.0) ; python_version >= \"3.12\"", "wheel (>=0.36.0)"] +typing = ["build[uv]", "importlib-metadata (>=5.1)", "mypy (>=1.9.0,<1.10.0)", "tomli", "typing-extensions (>=3.7.4.3)"] uv = ["uv (>=0.1.18)"] -virtualenv = ["virtualenv (>=20.11) ; python_version < \"3.10\"", "virtualenv (>=20.17) ; python_version >= \"3.10\" and python_version < \"3.14\"", "virtualenv (>=20.31) ; python_version >= \"3.14\""] +virtualenv = ["virtualenv (>=20.0.35)"] [[package]] name = "certifi" -version = "2025.11.12" +version = "2025.4.26" description = "Python package for providing Mozilla's CA Bundle." optional = false -python-versions = ">=3.7" +python-versions = ">=3.6" groups = ["docs"] files = [ - {file = "certifi-2025.11.12-py3-none-any.whl", hash = "sha256:97de8790030bbd5c2d96b7ec782fc2f7820ef8dba6db909ccf95449f2d062d4b"}, - {file = "certifi-2025.11.12.tar.gz", hash = "sha256:d8ab5478f2ecd78af242878415affce761ca6bc54a22a27e026d7c25357c3316"}, + {file = "certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3"}, + {file = "certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6"}, ] [[package]] name = "charset-normalizer" -version = "3.4.4" +version = "3.4.2" description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
optional = false python-versions = ">=3.7" groups = ["docs"] files = [ - {file = "charset_normalizer-3.4.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e824f1492727fa856dd6eda4f7cee25f8518a12f3c4a56a74e8095695089cf6d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4bd5d4137d500351a30687c2d3971758aac9a19208fc110ccb9d7188fbe709e8"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:027f6de494925c0ab2a55eab46ae5129951638a49a34d87f4c3eda90f696b4ad"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f820802628d2694cb7e56db99213f930856014862f3fd943d290ea8438d07ca8"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:798d75d81754988d2565bff1b97ba5a44411867c0cf32b77a7e8f8d84796b10d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d1bb833febdff5c8927f922386db610b49db6e0d4f4ee29601d71e7c2694313"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:9cd98cdc06614a2f768d2b7286d66805f94c48cde050acdbbb7db2600ab3197e"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:077fbb858e903c73f6c9db43374fd213b0b6a778106bc7032446a8e8b5b38b93"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:244bfb999c71b35de57821b8ea746b24e863398194a4014e4c76adc2bbdfeff0"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:64b55f9dce520635f018f907ff1b0df1fdc31f2795a922fb49dd14fbcdf48c84"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_riscv64.whl", hash = 
"sha256:faa3a41b2b66b6e50f84ae4a68c64fcd0c44355741c6374813a800cd6695db9e"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6515f3182dbe4ea06ced2d9e8666d97b46ef4c75e326b79bb624110f122551db"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc00f04ed596e9dc0da42ed17ac5e596c6ccba999ba6bd92b0e0aef2f170f2d6"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win32.whl", hash = "sha256:f34be2938726fc13801220747472850852fe6b1ea75869a048d6f896838c896f"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win_amd64.whl", hash = "sha256:a61900df84c667873b292c3de315a786dd8dac506704dea57bc957bd31e22c7d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win_arm64.whl", hash = "sha256:cead0978fc57397645f12578bfd2d5ea9138ea0fac82b2f63f7f7c6877986a69"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6e1fcf0720908f200cd21aa4e6750a48ff6ce4afe7ff5a79a90d5ed8a08296f8"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f819d5fe9234f9f82d75bdfa9aef3a3d72c4d24a6e57aeaebba32a704553aa0"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a59cb51917aa591b1c4e6a43c132f0cdc3c76dbad6155df4e28ee626cc77a0a3"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8ef3c867360f88ac904fd3f5e1f902f13307af9052646963ee08ff4f131adafc"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d9e45d7faa48ee908174d8fe84854479ef838fc6a705c9315372eacbc2f02897"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:840c25fb618a231545cbab0564a799f101b63b9901f2569faecd6b222ac72381"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca5862d5b3928c4940729dacc329aa9102900382fea192fc5e52eb69d6093815"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9c7f57c3d666a53421049053eaacdd14bbd0a528e2186fcb2e672effd053bb0"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:277e970e750505ed74c832b4bf75dac7476262ee2a013f5574dd49075879e161"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:31fd66405eaf47bb62e8cd575dc621c56c668f27d46a61d975a249930dd5e2a4"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:0d3d8f15c07f86e9ff82319b3d9ef6f4bf907608f53fe9d92b28ea9ae3d1fd89"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9f7fcd74d410a36883701fafa2482a6af2ff5ba96b9a620e9e0721e28ead5569"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf3e58c7ec8a8bed6d66a75d7fb37b55e5015b03ceae72a8e7c74495551e224"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win32.whl", hash = "sha256:eecbc200c7fd5ddb9a7f16c7decb07b566c29fa2161a16cf67b8d068bd21690a"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:5ae497466c7901d54b639cf42d5b8c1b6a4fead55215500d2f486d34db48d016"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:65e2befcd84bc6f37095f5961e68a6f077bf44946771354a28ad434c2cce0ae1"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash 
= "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = 
"sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ce8a0633f41a967713a59c4139d29110c07e826d131a316b50ce11b1d79b4f84"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaabd426fe94daf8fd157c32e571c85cb12e66692f15516a83a03264b08d06c3"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c4ef880e27901b6cc782f1b95f82da9313c0eb95c3af699103088fa0ac3ce9ac"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2aaba3b0819274cc41757a1da876f810a3e4d7b6eb25699253a4effef9e8e4af"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:778d2e08eda00f4256d7f672ca9fef386071c9202f5e4607920b86d7803387f2"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f155a433c2ec037d4e8df17d18922c3a0d9b3232a396690f17175d2946f0218d"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a8bf8d0f749c5757af2142fe7903a9df1d2e8aa3841559b2bad34b08d0e2bcf3"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:194f08cbb32dc406d6e1aea671a68be0823673db2832b38405deba2fb0d88f63"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:6aee717dcfead04c6eb1ce3bd29ac1e22663cdea57f943c87d1eab9a025438d7"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:cd4b7ca9984e5e7985c12bc60a6f173f3c958eae74f3ef6624bb6b26e2abbae4"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_riscv64.whl", hash = 
"sha256:b7cf1017d601aa35e6bb650b6ad28652c9cd78ee6caff19f3c28d03e1c80acbf"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:e912091979546adf63357d7e2ccff9b44f026c075aeaf25a52d0e95ad2281074"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5cb4d72eea50c8868f5288b7f7f33ed276118325c1dfd3957089f6b519e1382a"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-win32.whl", hash = "sha256:837c2ce8c5a65a2035be9b3569c684358dfbf109fd3b6969630a87535495ceaa"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-win_amd64.whl", hash = "sha256:44c2a8734b333e0578090c4cd6b16f275e07aa6614ca8715e6c038e865e70576"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a9768c477b9d7bd54bc0c86dbaebdec6f03306675526c9927c0e8a04e8f94af9"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1bee1e43c28aa63cb16e5c14e582580546b08e535299b8b6158a7c9c768a1f3d"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fd44c878ea55ba351104cb93cc85e74916eb8fa440ca7903e57575e97394f608"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0f04b14ffe5fdc8c4933862d8306109a2c51e0704acfa35d51598eb45a1e89fc"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:cd09d08005f958f370f539f186d10aec3377d55b9eeb0d796025d4886119d76e"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4fe7859a4e3e8457458e2ff592f15ccb02f3da787fcd31e0183879c3ad4692a1"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:fa09f53c465e532f4d3db095e0c55b615f010ad81803d383195b6b5ca6cbf5f3"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7fa17817dc5625de8a027cb8b26d9fefa3ea28c8253929b8d6649e705d2835b6"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:5947809c8a2417be3267efc979c47d76a079758166f7d43ef5ae8e9f92751f88"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:4902828217069c3c5c71094537a8e623f5d097858ac6ca8252f7b4d10b7560f1"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:7c308f7e26e4363d79df40ca5b2be1c6ba9f02bdbccfed5abddb7859a6ce72cf"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:2c9d3c380143a1fedbff95a312aa798578371eb29da42106a29019368a475318"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:cb01158d8b88ee68f15949894ccc6712278243d95f344770fa7593fa2d94410c"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win32.whl", hash = "sha256:2677acec1a2f8ef614c6888b5b4ae4060cc184174a938ed4e8ef690e15d3e505"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win_amd64.whl", hash = "sha256:f8e160feb2aed042cd657a72acc0b481212ed28b1b9a95c0cee1621b524e1966"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win_arm64.whl", hash = "sha256:b5d84d37db046c5ca74ee7bb47dd6cbc13f80665fdde3e8040bdd3fb015ecb50"}, - {file = "charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f"}, - {file = "charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7c48ed483eb946e6c04ccbe02c6b4d1d48e51944b6db70f697e089c193404941"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:b2d318c11350e10662026ad0eb71bb51c7812fc8590825304ae0bdd4ac283acd"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9cbfacf36cb0ec2897ce0ebc5d08ca44213af24265bd56eca54bee7923c48fd6"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:18dd2e350387c87dabe711b86f83c9c78af772c748904d372ade190b5c7c9d4d"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8075c35cd58273fee266c58c0c9b670947c19df5fb98e7b66710e04ad4e9ff86"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5bf4545e3b962767e5c06fe1738f951f77d27967cb2caa64c28be7c4563e162c"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7a6ab32f7210554a96cd9e33abe3ddd86732beeafc7a28e9955cdf22ffadbab0"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:b33de11b92e9f75a2b545d6e9b6f37e398d86c3e9e9653c4864eb7e89c5773ef"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:8755483f3c00d6c9a77f490c17e6ab0c8729e39e6390328e42521ef175380ae6"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:68a328e5f55ec37c57f19ebb1fdc56a248db2e3e9ad769919a58672958e8f366"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:21b2899062867b0e1fde9b724f8aecb1af14f2778d69aacd1a5a1853a597a5db"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-win32.whl", hash = "sha256:e8082b26888e2f8b36a042a58307d5b917ef2b1cacab921ad3323ef91901c71a"}, + {file = "charset_normalizer-3.4.2-cp310-cp310-win_amd64.whl", hash = "sha256:f69a27e45c43520f5487f27627059b64aaf160415589230992cec34c5e18a509"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:be1e352acbe3c78727a16a455126d9ff83ea2dfdcbc83148d2982305a04714c2"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa88ca0b1932e93f2d961bf3addbb2db902198dca337d88c89e1559e066e7645"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d524ba3f1581b35c03cb42beebab4a13e6cdad7b36246bd22541fa585a56cccd"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28a1005facc94196e1fb3e82a3d442a9d9110b8434fc1ded7a24a2983c9888d8"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fdb20a30fe1175ecabed17cbf7812f7b804b8a315a25f24678bcdf120a90077f"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0f5d9ed7f254402c9e7d35d2f5972c9bbea9040e99cd2861bd77dc68263277c7"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:efd387a49825780ff861998cd959767800d54f8308936b21025326de4b5a42b9"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f0aa37f3c979cf2546b73e8222bbfa3dc07a641585340179d768068e3455e544"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e70e990b2137b29dc5564715de1e12701815dacc1d056308e2b17e9095372a82"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0c8c57f84ccfc871a48a47321cfa49ae1df56cd1d965a09abe84066f6853b9c0"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6b66f92b17849b85cad91259efc341dce9c1af48e2173bf38a85c6329f1033e5"}, + {file = "charset_normalizer-3.4.2-cp311-cp311-win32.whl", hash = "sha256:daac4765328a919a805fa5e2720f3e94767abd632ae410a9062dff5412bae65a"}, + {file = 
"charset_normalizer-3.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53efc7c7cee4c1e70661e2e112ca46a575f90ed9ae3fef200f2a25e954f4b28"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c"}, + {file = "charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff"}, + {file = 
"charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7"}, + {file = "charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1cad5f45b3146325bb38d6855642f6fd609c3f7cad4dbaf75549bf3b904d3184"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b2680962a4848b3c4f155dc2ee64505a9c57186d0d56b43123b17ca3de18f0fa"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:36b31da18b8890a76ec181c3cf44326bf2c48e36d393ca1b72b3f484113ea344"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f4074c5a429281bf056ddd4c5d3b740ebca4d43ffffe2ef4bf4d2d05114299da"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c9e36a97bee9b86ef9a1cf7bb96747eb7a15c2f22bdb5b516434b00f2a599f02"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:1b1bde144d98e446b056ef98e59c256e9294f6b74d7af6846bf5ffdafd687a7d"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:915f3849a011c1f593ab99092f3cecfcb4d65d8feb4a64cf1bf2d22074dc0ec4"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-musllinux_1_2_ppc64le.whl", hash = "sha256:fb707f3e15060adf5b7ada797624a6c6e0138e2a26baa089df64c68ee98e040f"}, + 
{file = "charset_normalizer-3.4.2-cp37-cp37m-musllinux_1_2_s390x.whl", hash = "sha256:25a23ea5c7edc53e0f29bae2c44fcb5a1aa10591aae107f2a2b2583a9c5cbc64"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:770cab594ecf99ae64c236bc9ee3439c3f46be49796e265ce0cc8bc17b10294f"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-win32.whl", hash = "sha256:6a0289e4589e8bdfef02a80478f1dfcb14f0ab696b5a00e1f4b8a14a307a3c58"}, + {file = "charset_normalizer-3.4.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6fc1f5b51fa4cecaa18f2bd7a003f3dd039dd615cd69a2afd6d3b19aed6775f2"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:76af085e67e56c8816c3ccf256ebd136def2ed9654525348cfa744b6802b69eb"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e45ba65510e2647721e35323d6ef54c7974959f6081b58d4ef5d87c60c84919a"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:046595208aae0120559a67693ecc65dd75d46f7bf687f159127046628178dc45"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75d10d37a47afee94919c4fab4c22b9bc2a8bf7d4f46f87363bcf0573f3ff4f5"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6333b3aa5a12c26b2a4d4e7335a28f1475e0e5e17d69d55141ee3cab736f66d1"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e8323a9b031aa0393768b87f04b4164a40037fb2a3c11ac06a03ffecd3618027"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:24498ba8ed6c2e0b56d4acbf83f2d989720a93b41d712ebd4f4979660db4417b"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:844da2b5728b5ce0e32d863af26f32b5ce61bc4273a9c720a9f3aa9df73b1455"}, + {file 
= "charset_normalizer-3.4.2-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:65c981bdbd3f57670af8b59777cbfae75364b483fa8a9f420f08094531d54a01"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:3c21d4fca343c805a52c0c78edc01e3477f6dd1ad7c47653241cf2a206d4fc58"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:dc7039885fa1baf9be153a0626e337aa7ec8bf96b0128605fb0d77788ddc1681"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-win32.whl", hash = "sha256:8272b73e1c5603666618805fe821edba66892e2870058c94c53147602eab29c7"}, + {file = "charset_normalizer-3.4.2-cp38-cp38-win_amd64.whl", hash = "sha256:70f7172939fdf8790425ba31915bfbe8335030f05b9913d7ae00a87d4395620a"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:005fa3432484527f9732ebd315da8da8001593e2cf46a3d817669f062c3d9ed4"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e92fca20c46e9f5e1bb485887d074918b13543b1c2a1185e69bb8d17ab6236a7"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:50bf98d5e563b83cc29471fa114366e6806bc06bc7a25fd59641e41445327836"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:721c76e84fe669be19c5791da68232ca2e05ba5185575086e384352e2c309597"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:82d8fd25b7f4675d0c47cf95b594d4e7b158aca33b76aa63d07186e13c0e0ab7"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b3daeac64d5b371dea99714f08ffc2c208522ec6b06fbc7866a450dd446f5c0f"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:dccab8d5fa1ef9bfba0590ecf4d46df048d18ffe3eec01eeb73a42e0d9e7a8ba"}, + {file = 
"charset_normalizer-3.4.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:aaf27faa992bfee0264dc1f03f4c75e9fcdda66a519db6b957a3f826e285cf12"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:eb30abc20df9ab0814b5a2524f23d75dcf83cde762c161917a2b4b7b55b1e518"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:c72fbbe68c6f32f251bdc08b8611c7b3060612236e960ef848e0a517ddbe76c5"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:982bb1e8b4ffda883b3d0a521e23abcd6fd17418f6d2c4118d257a10199c0ce3"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-win32.whl", hash = "sha256:43e0933a0eff183ee85833f341ec567c0980dae57c464d8a508e1b2ceb336471"}, + {file = "charset_normalizer-3.4.2-cp39-cp39-win_amd64.whl", hash = "sha256:d11b54acf878eef558599658b0ffca78138c8c3655cf4f3a4a673c437e67732e"}, + {file = "charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0"}, + {file = "charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63"}, ] [[package]] @@ -412,112 +393,6 @@ tomli = {version = "*", optional = true, markers = "python_full_version <= \"3.1 [package.extras] toml = ["tomli ; python_full_version <= \"3.11.0a6\""] -[[package]] -name = "coverage" -version = "7.11.3" -description = "Code coverage measurement for Python" -optional = false -python-versions = ">=3.10" -groups = ["tests"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "coverage-7.11.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0c986537abca9b064510f3fd104ba33e98d3036608c7f2f5537f869bc10e1ee5"}, - {file = "coverage-7.11.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:28c5251b3ab1d23e66f1130ca0c419747edfbcb4690de19467cd616861507af7"}, - {file = "coverage-7.11.3-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = 
"sha256:4f2bb4ee8dd40f9b2a80bb4adb2aecece9480ba1fa60d9382e8c8e0bd558e2eb"}, - {file = "coverage-7.11.3-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:e5f4bfac975a2138215a38bda599ef00162e4143541cf7dd186da10a7f8e69f1"}, - {file = "coverage-7.11.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8f4cbfff5cf01fa07464439a8510affc9df281535f41a1f5312fbd2b59b4ab5c"}, - {file = "coverage-7.11.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:31663572f20bf3406d7ac00d6981c7bbbcec302539d26b5ac596ca499664de31"}, - {file = "coverage-7.11.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:9799bd6a910961cb666196b8583ed0ee125fa225c6fdee2cbf00232b861f29d2"}, - {file = "coverage-7.11.3-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:097acc18bedf2c6e3144eaf09b5f6034926c3c9bb9e10574ffd0942717232507"}, - {file = "coverage-7.11.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:6f033dec603eea88204589175782290a038b436105a8f3637a81c4359df27832"}, - {file = "coverage-7.11.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:dd9ca2d44ed8018c90efb72f237a2a140325a4c3339971364d758e78b175f58e"}, - {file = "coverage-7.11.3-cp310-cp310-win32.whl", hash = "sha256:900580bc99c145e2561ea91a2d207e639171870d8a18756eb57db944a017d4bb"}, - {file = "coverage-7.11.3-cp310-cp310-win_amd64.whl", hash = "sha256:c8be5bfcdc7832011b2652db29ed7672ce9d353dd19bce5272ca33dbcf60aaa8"}, - {file = "coverage-7.11.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:200bb89fd2a8a07780eafcdff6463104dec459f3c838d980455cfa84f5e5e6e1"}, - {file = "coverage-7.11.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8d264402fc179776d43e557e1ca4a7d953020d3ee95f7ec19cc2c9d769277f06"}, - {file = "coverage-7.11.3-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:385977d94fc155f8731c895accdfcc3dd0d9dd9ef90d102969df95d3c637ab80"}, - {file = 
"coverage-7.11.3-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0542ddf6107adbd2592f29da9f59f5d9cff7947b5bb4f734805085c327dcffaa"}, - {file = "coverage-7.11.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d60bf4d7f886989ddf80e121a7f4d140d9eac91f1d2385ce8eb6bda93d563297"}, - {file = "coverage-7.11.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0a3b6e32457535df0d41d2d895da46434706dd85dbaf53fbc0d3bd7d914b362"}, - {file = "coverage-7.11.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:876a3ee7fd2613eb79602e4cdb39deb6b28c186e76124c3f29e580099ec21a87"}, - {file = "coverage-7.11.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:a730cd0824e8083989f304e97b3f884189efb48e2151e07f57e9e138ab104200"}, - {file = "coverage-7.11.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:b5cd111d3ab7390be0c07ad839235d5ad54d2ca497b5f5db86896098a77180a4"}, - {file = "coverage-7.11.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:074e6a5cd38e06671580b4d872c1a67955d4e69639e4b04e87fc03b494c1f060"}, - {file = "coverage-7.11.3-cp311-cp311-win32.whl", hash = "sha256:86d27d2dd7c7c5a44710565933c7dc9cd70e65ef97142e260d16d555667deef7"}, - {file = "coverage-7.11.3-cp311-cp311-win_amd64.whl", hash = "sha256:ca90ef33a152205fb6f2f0c1f3e55c50df4ef049bb0940ebba666edd4cdebc55"}, - {file = "coverage-7.11.3-cp311-cp311-win_arm64.whl", hash = "sha256:56f909a40d68947ef726ce6a34eb38f0ed241ffbe55c5007c64e616663bcbafc"}, - {file = "coverage-7.11.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5b771b59ac0dfb7f139f70c85b42717ef400a6790abb6475ebac1ecee8de782f"}, - {file = "coverage-7.11.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:603c4414125fc9ae9000f17912dcfd3d3eb677d4e360b85206539240c96ea76e"}, - {file = "coverage-7.11.3-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = 
"sha256:77ffb3b7704eb7b9b3298a01fe4509cef70117a52d50bcba29cffc5f53dd326a"}, - {file = "coverage-7.11.3-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4d4ca49f5ba432b0755ebb0fc3a56be944a19a16bb33802264bbc7311622c0d1"}, - {file = "coverage-7.11.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:05fd3fb6edff0c98874d752013588836f458261e5eba587afe4c547bba544afd"}, - {file = "coverage-7.11.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0e920567f8c3a3ce68ae5a42cf7c2dc4bb6cc389f18bff2235dd8c03fa405de5"}, - {file = "coverage-7.11.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4bec8c7160688bd5a34e65c82984b25409563134d63285d8943d0599efbc448e"}, - {file = "coverage-7.11.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:adb9b7b42c802bd8cb3927de8c1c26368ce50c8fdaa83a9d8551384d77537044"}, - {file = "coverage-7.11.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:c8f563b245b4ddb591e99f28e3cd140b85f114b38b7f95b2e42542f0603eb7d7"}, - {file = "coverage-7.11.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e2a96fdc7643c9517a317553aca13b5cae9bad9a5f32f4654ce247ae4d321405"}, - {file = "coverage-7.11.3-cp312-cp312-win32.whl", hash = "sha256:e8feeb5e8705835f0622af0fe7ff8d5cb388948454647086494d6c41ec142c2e"}, - {file = "coverage-7.11.3-cp312-cp312-win_amd64.whl", hash = "sha256:abb903ffe46bd319d99979cdba350ae7016759bb69f47882242f7b93f3356055"}, - {file = "coverage-7.11.3-cp312-cp312-win_arm64.whl", hash = "sha256:1451464fd855d9bd000c19b71bb7dafea9ab815741fb0bd9e813d9b671462d6f"}, - {file = "coverage-7.11.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:84b892e968164b7a0498ddc5746cdf4e985700b902128421bb5cec1080a6ee36"}, - {file = "coverage-7.11.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f761dbcf45e9416ec4698e1a7649248005f0064ce3523a47402d1bff4af2779e"}, - {file = 
"coverage-7.11.3-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:1410bac9e98afd9623f53876fae7d8a5db9f5a0ac1c9e7c5188463cb4b3212e2"}, - {file = "coverage-7.11.3-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:004cdcea3457c0ea3233622cd3464c1e32ebba9b41578421097402bee6461b63"}, - {file = "coverage-7.11.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8f067ada2c333609b52835ca4d4868645d3b63ac04fb2b9a658c55bba7f667d3"}, - {file = "coverage-7.11.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:07bc7745c945a6d95676953e86ba7cebb9f11de7773951c387f4c07dc76d03f5"}, - {file = "coverage-7.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:8bba7e4743e37484ae17d5c3b8eb1ce78b564cb91b7ace2e2182b25f0f764cb5"}, - {file = "coverage-7.11.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:fbffc22d80d86fbe456af9abb17f7a7766e7b2101f7edaacc3535501691563f7"}, - {file = "coverage-7.11.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:0dba4da36730e384669e05b765a2c49f39514dd3012fcc0398dd66fba8d746d5"}, - {file = "coverage-7.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ae12fe90b00b71a71b69f513773310782ce01d5f58d2ceb2b7c595ab9d222094"}, - {file = "coverage-7.11.3-cp313-cp313-win32.whl", hash = "sha256:12d821de7408292530b0d241468b698bce18dd12ecaf45316149f53877885f8c"}, - {file = "coverage-7.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:6bb599052a974bb6cedfa114f9778fedfad66854107cf81397ec87cb9b8fbcf2"}, - {file = "coverage-7.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:bb9d7efdb063903b3fdf77caec7b77c3066885068bdc0d44bc1b0c171033f944"}, - {file = "coverage-7.11.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:fb58da65e3339b3dbe266b607bb936efb983d86b00b03eb04c4ad5b442c58428"}, - {file = "coverage-7.11.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = 
"sha256:8d16bbe566e16a71d123cd66382c1315fcd520c7573652a8074a8fe281b38c6a"}, - {file = "coverage-7.11.3-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:a8258f10059b5ac837232c589a350a2df4a96406d6d5f2a09ec587cbdd539655"}, - {file = "coverage-7.11.3-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4c5627429f7fbff4f4131cfdd6abd530734ef7761116811a707b88b7e205afd7"}, - {file = "coverage-7.11.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:465695268414e149bab754c54b0c45c8ceda73dd4a5c3ba255500da13984b16d"}, - {file = "coverage-7.11.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:4ebcddfcdfb4c614233cff6e9a3967a09484114a8b2e4f2c7a62dc83676ba13f"}, - {file = "coverage-7.11.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:13b2066303a1c1833c654d2af0455bb009b6e1727b3883c9964bc5c2f643c1d0"}, - {file = "coverage-7.11.3-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:d8750dd20362a1b80e3cf84f58013d4672f89663aee457ea59336df50fab6739"}, - {file = "coverage-7.11.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ab6212e62ea0e1006531a2234e209607f360d98d18d532c2fa8e403c1afbdd71"}, - {file = "coverage-7.11.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b17c2b5e0b9bb7702449200f93e2d04cb04b1414c41424c08aa1e5d352da76"}, - {file = "coverage-7.11.3-cp313-cp313t-win32.whl", hash = "sha256:426559f105f644b69290ea414e154a0d320c3ad8a2bb75e62884731f69cf8e2c"}, - {file = "coverage-7.11.3-cp313-cp313t-win_amd64.whl", hash = "sha256:90a96fcd824564eae6137ec2563bd061d49a32944858d4bdbae5c00fb10e76ac"}, - {file = "coverage-7.11.3-cp313-cp313t-win_arm64.whl", hash = "sha256:1e33d0bebf895c7a0905fcfaff2b07ab900885fc78bba2a12291a2cfbab014cc"}, - {file = "coverage-7.11.3-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:fdc5255eb4815babcdf236fa1a806ccb546724c8a9b129fd1ea4a5448a0bf07c"}, - {file = 
"coverage-7.11.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:fe3425dc6021f906c6325d3c415e048e7cdb955505a94f1eb774dafc779ba203"}, - {file = "coverage-7.11.3-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:4ca5f876bf41b24378ee67c41d688155f0e54cdc720de8ef9ad6544005899240"}, - {file = "coverage-7.11.3-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:9061a3e3c92b27fd8036dafa26f25d95695b6aa2e4514ab16a254f297e664f83"}, - {file = "coverage-7.11.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:abcea3b5f0dc44e1d01c27090bc32ce6ffb7aa665f884f1890710454113ea902"}, - {file = "coverage-7.11.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:68c4eb92997dbaaf839ea13527be463178ac0ddd37a7ac636b8bc11a51af2428"}, - {file = "coverage-7.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:149eccc85d48c8f06547534068c41d69a1a35322deaa4d69ba1561e2e9127e75"}, - {file = "coverage-7.11.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:08c0bcf932e47795c49f0406054824b9d45671362dfc4269e0bc6e4bff010704"}, - {file = "coverage-7.11.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:39764c6167c82d68a2d8c97c33dba45ec0ad9172570860e12191416f4f8e6e1b"}, - {file = "coverage-7.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3224c7baf34e923ffc78cb45e793925539d640d42c96646db62dbd61bbcfa131"}, - {file = "coverage-7.11.3-cp314-cp314-win32.whl", hash = "sha256:c713c1c528284d636cd37723b0b4c35c11190da6f932794e145fc40f8210a14a"}, - {file = "coverage-7.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:c381a252317f63ca0179d2c7918e83b99a4ff3101e1b24849b999a00f9cd4f86"}, - {file = "coverage-7.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:3e33a968672be1394eded257ec10d4acbb9af2ae263ba05a99ff901bb863557e"}, - {file = "coverage-7.11.3-cp314-cp314t-macosx_10_15_x86_64.whl", hash = 
"sha256:f9c96a29c6d65bd36a91f5634fef800212dff69dacdb44345c4c9783943ab0df"}, - {file = "coverage-7.11.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2ec27a7a991d229213c8070d31e3ecf44d005d96a9edc30c78eaeafaa421c001"}, - {file = "coverage-7.11.3-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:72c8b494bd20ae1c58528b97c4a67d5cfeafcb3845c73542875ecd43924296de"}, - {file = "coverage-7.11.3-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:60ca149a446da255d56c2a7a813b51a80d9497a62250532598d249b3cdb1a926"}, - {file = "coverage-7.11.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eb5069074db19a534de3859c43eec78e962d6d119f637c41c8e028c5ab3f59dd"}, - {file = "coverage-7.11.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac5d5329c9c942bbe6295f4251b135d860ed9f86acd912d418dce186de7c19ac"}, - {file = "coverage-7.11.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e22539b676fafba17f0a90ac725f029a309eb6e483f364c86dcadee060429d46"}, - {file = "coverage-7.11.3-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:2376e8a9c889016f25472c452389e98bc6e54a19570b107e27cde9d47f387b64"}, - {file = "coverage-7.11.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:4234914b8c67238a3c4af2bba648dc716aa029ca44d01f3d51536d44ac16854f"}, - {file = "coverage-7.11.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f0b4101e2b3c6c352ff1f70b3a6fcc7c17c1ab1a91ccb7a33013cb0782af9820"}, - {file = "coverage-7.11.3-cp314-cp314t-win32.whl", hash = "sha256:305716afb19133762e8cf62745c46c4853ad6f9eeba54a593e373289e24ea237"}, - {file = "coverage-7.11.3-cp314-cp314t-win_amd64.whl", hash = "sha256:9245bd392572b9f799261c4c9e7216bafc9405537d0f4ce3ad93afe081a12dc9"}, - {file = "coverage-7.11.3-cp314-cp314t-win_arm64.whl", hash = "sha256:9a1d577c20b4334e5e814c3d5fe07fa4a8c3ae42a601945e8d7940bab811d0bd"}, - {file = 
"coverage-7.11.3-py3-none-any.whl", hash = "sha256:351511ae28e2509c8d8cae5311577ea7dd511ab8e746ffc8814a0896c3d33fbe"}, - {file = "coverage-7.11.3.tar.gz", hash = "sha256:0f59387f5e6edbbffec2281affb71cdc85e0776c1745150a3ab9b6c1d016106b"}, -] - -[package.extras] -toml = ["tomli ; python_full_version <= \"3.11.0a6\""] - [[package]] name = "exceptiongroup" version = "1.3.0" @@ -557,38 +432,19 @@ dev = ["flake8", "markdown", "twine", "wheel"] [[package]] name = "griffe" -version = "1.14.0" +version = "1.7.3" description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API." optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "griffe-1.14.0-py3-none-any.whl", hash = "sha256:0e9d52832cccf0f7188cfe585ba962d2674b241c01916d780925df34873bceb0"}, - {file = "griffe-1.14.0.tar.gz", hash = "sha256:9d2a15c1eca966d68e00517de5d69dd1bc5c9f2335ef6c1775362ba5b8651a13"}, + {file = "griffe-1.7.3-py3-none-any.whl", hash = "sha256:c6b3ee30c2f0f17f30bcdef5068d6ab7a2a4f1b8bf1a3e74b56fffd21e1c5f75"}, + {file = "griffe-1.7.3.tar.gz", hash = "sha256:52ee893c6a3a968b639ace8015bec9d36594961e156e23315c8e8e51401fa50b"}, ] [package.dependencies] colorama = ">=0.4" -[[package]] -name = "griffe" -version = "1.15.0" -description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API." 
-optional = false -python-versions = ">=3.10" -groups = ["docs"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "griffe-1.15.0-py3-none-any.whl", hash = "sha256:6f6762661949411031f5fcda9593f586e6ce8340f0ba88921a0f2ef7a81eb9a3"}, - {file = "griffe-1.15.0.tar.gz", hash = "sha256:7726e3afd6f298fbc3696e67958803e7ac843c1cfe59734b6251a40cdbfb5eea"}, -] - -[package.dependencies] -colorama = ">=0.4" - -[package.extras] -pypi = ["pip (>=24.0)", "platformdirs (>=4.2)", "wheel (>=0.42)"] - [[package]] name = "html5rdf" version = "1.2.1" @@ -604,14 +460,14 @@ files = [ [[package]] name = "idna" -version = "3.11" +version = "3.10" description = "Internationalized Domain Names in Applications (IDNA)" optional = false -python-versions = ">=3.8" +python-versions = ">=3.6" groups = ["docs"] files = [ - {file = "idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea"}, - {file = "idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902"}, + {file = "idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"}, + {file = "idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"}, ] [package.extras] @@ -654,19 +510,6 @@ files = [ {file = "iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7"}, ] -[[package]] -name = "iniconfig" -version = "2.3.0" -description = "brain-dead simple config-ini parsing" -optional = false -python-versions = ">=3.10" -groups = ["tests"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12"}, - {file = "iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730"}, -] - [[package]] name = "isodate" version = "0.7.2" @@ -872,14 +715,14 @@ test 
= ["coverage[toml] (>=7.2.5)", "mypy (>=1.2.0)", "pytest (>=7.3.0)", "pytes [[package]] name = "markdown" -version = "3.9" +version = "3.8" description = "Python implementation of John Gruber's Markdown." optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "markdown-3.9-py3-none-any.whl", hash = "sha256:9f4d91ed810864ea88a6f32c07ba8bee1346c0cc1f6b1f9f6c822f2a9667d280"}, - {file = "markdown-3.9.tar.gz", hash = "sha256:d2900fe1782bd33bdbbd56859defef70c2e78fc46668f8eb9df3128138f2cb6a"}, + {file = "markdown-3.8-py3-none-any.whl", hash = "sha256:794a929b79c5af141ef5ab0f2f642d0f7b1872981250230e72682346f7cc90dc"}, + {file = "markdown-3.8.tar.gz", hash = "sha256:7df81e63f0df5c4b24b7d156eb81e4690595239b7d70937d0409f1b0de319c6f"}, ] [package.dependencies] @@ -889,120 +732,75 @@ importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} docs = ["mdx_gh_links (>=0.2)", "mkdocs (>=1.6)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] testing = ["coverage", "pyyaml"] -[[package]] -name = "markdown" -version = "3.10" -description = "Python implementation of John Gruber's Markdown." -optional = false -python-versions = ">=3.10" -groups = ["docs"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "markdown-3.10-py3-none-any.whl", hash = "sha256:b5b99d6951e2e4948d939255596523444c0e677c669700b1d17aa4a8a464cb7c"}, - {file = "markdown-3.10.tar.gz", hash = "sha256:37062d4f2aa4b2b6b32aefb80faa300f82cc790cb949a35b8caede34f2b68c0e"}, -] - -[package.extras] -docs = ["mdx_gh_links (>=0.2)", "mkdocs (>=1.6)", "mkdocs-gen-files", "mkdocs-literate-nav", "mkdocs-nature (>=0.6)", "mkdocs-section-index", "mkdocstrings[python]"] -testing = ["coverage", "pyyaml"] - [[package]] name = "markupsafe" -version = "3.0.3" +version = "3.0.2" description = "Safely add untrusted strings to HTML/XML markup." 
optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "markupsafe-3.0.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559"}, - {file = "markupsafe-3.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419"}, - {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695"}, - {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591"}, - {file = "markupsafe-3.0.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c"}, - {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f"}, - {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6"}, - {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1"}, - {file = "markupsafe-3.0.3-cp310-cp310-win32.whl", hash = "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa"}, - {file = "markupsafe-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8"}, - {file = "markupsafe-3.0.3-cp310-cp310-win_arm64.whl", hash = "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1"}, - {file = "markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad"}, - {file = 
"markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a"}, - {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50"}, - {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf"}, - {file = "markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f"}, - {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a"}, - {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115"}, - {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a"}, - {file = "markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19"}, - {file = "markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01"}, - {file = "markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c"}, - {file = "markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e"}, - {file = "markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce"}, - {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d"}, - {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d"}, - {file = "markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a"}, - {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b"}, - {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f"}, - {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b"}, - {file = "markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d"}, - {file = "markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c"}, - {file = "markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f"}, - {file = "markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795"}, - {file = "markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219"}, - {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6"}, - {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676"}, - 
{file = "markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9"}, - {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1"}, - {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc"}, - {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12"}, - {file = "markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed"}, - {file = "markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5"}, - {file = "markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485"}, - {file = "markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73"}, - {file = "markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37"}, - {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19"}, - {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025"}, - {file = "markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6"}, - {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f"}, - {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb"}, - {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009"}, - {file = "markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354"}, - {file = "markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218"}, - {file = "markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287"}, - {file = "markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe"}, - {file = "markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026"}, - {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737"}, - {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97"}, - {file = "markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d"}, - {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda"}, - {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf"}, - {file = 
"markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe"}, - {file = "markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9"}, - {file = "markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581"}, - {file = "markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4"}, - {file = "markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab"}, - {file = "markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175"}, - {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634"}, - {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50"}, - {file = "markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e"}, - {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5"}, - {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523"}, - {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc"}, - {file = "markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = 
"sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d"}, - {file = "markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9"}, - {file = "markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa"}, - {file = "markupsafe-3.0.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:15d939a21d546304880945ca1ecb8a039db6b4dc49b2c5a400387cdae6a62e26"}, - {file = "markupsafe-3.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f71a396b3bf33ecaa1626c255855702aca4d3d9fea5e051b41ac59a9c1c41edc"}, - {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0f4b68347f8c5eab4a13419215bdfd7f8c9b19f2b25520968adfad23eb0ce60c"}, - {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e8fc20152abba6b83724d7ff268c249fa196d8259ff481f3b1476383f8f24e42"}, - {file = "markupsafe-3.0.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:949b8d66bc381ee8b007cd945914c721d9aba8e27f71959d750a46f7c282b20b"}, - {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:3537e01efc9d4dccdf77221fb1cb3b8e1a38d5428920e0657ce299b20324d758"}, - {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:591ae9f2a647529ca990bc681daebdd52c8791ff06c2bfa05b65163e28102ef2"}, - {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:a320721ab5a1aba0a233739394eb907f8c8da5c98c9181d1161e77a0c8e36f2d"}, - {file = "markupsafe-3.0.3-cp39-cp39-win32.whl", hash = "sha256:df2449253ef108a379b8b5d6b43f4b1a8e81a061d6537becd5582fba5f9196d7"}, - {file = "markupsafe-3.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:7c3fb7d25180895632e5d3148dbdc29ea38ccb7fd210aa27acbd1201a1902c6e"}, - {file = "markupsafe-3.0.3-cp39-cp39-win_arm64.whl", hash = 
"sha256:38664109c14ffc9e7437e86b4dceb442b0096dfe3541d7864d9cbe1da4cf36c8"}, - {file = "markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8"}, + 
{file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = 
"sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", 
hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:eaa0a10b7f72326f1372a713e73c3f739b524b3af41feb43e4921cb529f5929a"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48032821bbdf20f5799ff537c7ac3d1fba0ba032cfc06194faffa8cda8b560ff"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a9d3f5f0901fdec14d8d2f66ef7d035f2157240a433441719ac9a3fba440b13"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88b49a3b9ff31e19998750c38e030fc7bb937398b1f78cfa599aaef92d693144"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfad01eed2c2e0c01fd0ecd2ef42c492f7f93902e39a42fc9ee1692961443a29"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:1225beacc926f536dc82e45f8a4d68502949dc67eea90eab715dea3a21c1b5f0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3169b1eefae027567d1ce6ee7cae382c57fe26e82775f460f0b2778beaad66c0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:eb7972a85c54febfb25b5c4b4f3af4dcc731994c7da0d8a0b4a6eb0640e1d178"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win32.whl", hash = "sha256:8c4e8c3ce11e1f92f6536ff07154f9d49677ebaaafc32db9db4620bc11ed480f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:6e296a513ca3d94054c2c881cc913116e90fd030ad1c656b3869762b754f5f8a"}, + {file = "markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0"}, ] [[package]] @@ -1051,14 +849,14 @@ min-versions = ["babel (==2.9.0)", "click (==7.0)", "colorama (==0.4) ; platform [[package]] name = "mkdocs-autorefs" -version = "1.4.3" +version = "1.4.2" description = "Automatically link across pages in 
MkDocs." optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "mkdocs_autorefs-1.4.3-py3-none-any.whl", hash = "sha256:469d85eb3114801d08e9cc55d102b3ba65917a869b893403b8987b601cf55dc9"}, - {file = "mkdocs_autorefs-1.4.3.tar.gz", hash = "sha256:beee715b254455c4aa93b6ef3c67579c399ca092259cc41b7d9342573ff1fc75"}, + {file = "mkdocs_autorefs-1.4.2-py3-none-any.whl", hash = "sha256:83d6d777b66ec3c372a1aad4ae0cf77c243ba5bcda5bf0c6b8a2c5e7a3d89f13"}, + {file = "mkdocs_autorefs-1.4.2.tar.gz", hash = "sha256:e2ebe1abd2b67d597ed19378c0fff84d73d1dbce411fce7a7cc6f161888b6749"}, ] [package.dependencies] @@ -1120,28 +918,28 @@ cache = ["platformdirs"] [[package]] name = "mkdocs-material" -version = "9.7.0" +version = "9.6.23" description = "Documentation that simply works" optional = false python-versions = ">=3.8" groups = ["docs"] files = [ - {file = "mkdocs_material-9.7.0-py3-none-any.whl", hash = "sha256:da2866ea53601125ff5baa8aa06404c6e07af3c5ce3d5de95e3b52b80b442887"}, - {file = "mkdocs_material-9.7.0.tar.gz", hash = "sha256:602b359844e906ee402b7ed9640340cf8a474420d02d8891451733b6b02314ec"}, + {file = "mkdocs_material-9.6.23-py3-none-any.whl", hash = "sha256:3bf3f1d82d269f3a14ed6897bfc3a844cc05e1dc38045386691b91d7e6945332"}, + {file = "mkdocs_material-9.6.23.tar.gz", hash = "sha256:62ebc9cdbe90e1ae4f4e9b16a6aa5c69b93474c7b9e79ebc0b11b87f9f055e00"}, ] [package.dependencies] -babel = ">=2.10" -backrefs = ">=5.7.post1" -colorama = ">=0.4" -jinja2 = ">=3.1" -markdown = ">=3.2" -mkdocs = ">=1.6" -mkdocs-material-extensions = ">=1.3" -paginate = ">=0.5" -pygments = ">=2.16" -pymdown-extensions = ">=10.2" -requests = ">=2.26" +babel = ">=2.10,<3.0" +backrefs = ">=5.7.post1,<6.0" +colorama = ">=0.4,<1.0" +jinja2 = ">=3.1,<4.0" +markdown = ">=3.2,<4.0" +mkdocs = ">=1.6,<2.0" +mkdocs-material-extensions = ">=1.3,<2.0" +paginate = ">=0.5,<1.0" +pygments = ">=2.16,<3.0" +pymdown-extensions = ">=10.2,<11.0" +requests = ">=2.26,<3.0" [package.extras] git = 
["mkdocs-git-committers-plugin-2 (>=1.1,<3)", "mkdocs-git-revision-date-localized-plugin (>=1.2.4,<2.0)"] @@ -1189,40 +987,22 @@ python-legacy = ["mkdocstrings-python-legacy (>=0.2.1)"] [[package]] name = "mkdocstrings-python" -version = "1.18.2" +version = "1.16.11" description = "A Python handler for mkdocstrings." optional = false python-versions = ">=3.9" groups = ["docs"] files = [ - {file = "mkdocstrings_python-1.18.2-py3-none-any.whl", hash = "sha256:944fe6deb8f08f33fa936d538233c4036e9f53e840994f6146e8e94eb71b600d"}, - {file = "mkdocstrings_python-1.18.2.tar.gz", hash = "sha256:4ad536920a07b6336f50d4c6d5603316fafb1172c5c882370cbbc954770ad323"}, + {file = "mkdocstrings_python-1.16.11-py3-none-any.whl", hash = "sha256:25d96cc9c1f9c272ea1bd8222c900b5f852bf46c984003e9c7c56eaa4696190f"}, + {file = "mkdocstrings_python-1.16.11.tar.gz", hash = "sha256:935f95efa887f99178e4a7becaaa1286fb35adafffd669b04fd611d97c00e5ce"}, ] [package.dependencies] -griffe = ">=1.13" +griffe = ">=1.6.2" mkdocs-autorefs = ">=1.4" -mkdocstrings = ">=0.30" +mkdocstrings = ">=0.28.3" typing-extensions = {version = ">=4.0", markers = "python_version < \"3.11\""} -[[package]] -name = "mkdocstrings-python" -version = "1.19.0" -description = "A Python handler for mkdocstrings." -optional = false -python-versions = ">=3.10" -groups = ["docs"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "mkdocstrings_python-1.19.0-py3-none-any.whl", hash = "sha256:395c1032af8f005234170575cc0c5d4d20980846623b623b35594281be4a3059"}, - {file = "mkdocstrings_python-1.19.0.tar.gz", hash = "sha256:917aac66cf121243c11db5b89f66b0ded6c53ec0de5318ff5e22424eb2f2e57c"}, -] - -[package.dependencies] -griffe = ">=1.13" -mkdocs-autorefs = ">=1.4" -mkdocstrings = ">=0.30" - [[package]] name = "mypy" version = "1.18.2" @@ -1316,124 +1096,87 @@ doc = ["nb2plots (>=0.7)", "nbconvert (<7.9)", "numpydoc (>=1.6)", "pillow (>=9. 
extra = ["lxml (>=4.6)", "pydot (>=1.4.2)", "pygraphviz (>=1.11)", "sympy (>=1.10)"] test = ["pytest (>=7.2)", "pytest-cov (>=4.0)"] -[[package]] -name = "networkx" -version = "3.5" -description = "Python package for creating and manipulating graphs and networks" -optional = true -python-versions = ">=3.11" -groups = ["main"] -markers = "python_version >= \"3.11\" and extra == \"networkx\"" -files = [ - {file = "networkx-3.5-py3-none-any.whl", hash = "sha256:0030d386a9a06dee3565298b4a734b68589749a544acbb6c412dc9e2489ec6ec"}, - {file = "networkx-3.5.tar.gz", hash = "sha256:d4c6f9cf81f52d69230866796b82afbccdec3db7ae4fbd1b65ea750feed50037"}, -] - -[package.extras] -default = ["matplotlib (>=3.8)", "numpy (>=1.25)", "pandas (>=2.0)", "scipy (>=1.11.2)"] -developer = ["mypy (>=1.15)", "pre-commit (>=4.1)"] -doc = ["intersphinx-registry", "myst-nb (>=1.1)", "numpydoc (>=1.8.0)", "pillow (>=10)", "pydata-sphinx-theme (>=0.16)", "sphinx (>=8.0)", "sphinx-gallery (>=0.18)", "texext (>=0.6.7)"] -example = ["cairocffi (>=1.7)", "contextily (>=1.6)", "igraph (>=0.11)", "momepy (>=0.7.2)", "osmnx (>=2.0.0)", "scikit-learn (>=1.5)", "seaborn (>=0.13)"] -extra = ["lxml (>=4.6)", "pydot (>=3.0.1)", "pygraphviz (>=1.14)", "sympy (>=1.10)"] -test = ["pytest (>=7.2)", "pytest-cov (>=4.0)", "pytest-xdist (>=3.0)"] -test-extras = ["pytest-mpl", "pytest-randomly"] - [[package]] name = "orjson" -version = "3.11.4" +version = "3.10.18" description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy" optional = true python-versions = ">=3.9" groups = ["main"] markers = "extra == \"orjson\"" files = [ - {file = "orjson-3.11.4-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e3aa2118a3ece0d25489cbe48498de8a5d580e42e8d9979f65bf47900a15aba1"}, - {file = "orjson-3.11.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a69ab657a4e6733133a3dca82768f2f8b884043714e8d2b9ba9f52b6efef5c44"}, - {file 
= "orjson-3.11.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3740bffd9816fc0326ddc406098a3a8f387e42223f5f455f2a02a9f834ead80c"}, - {file = "orjson-3.11.4-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:65fd2f5730b1bf7f350c6dc896173d3460d235c4be007af73986d7cd9a2acd23"}, - {file = "orjson-3.11.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fdc3ae730541086158d549c97852e2eea6820665d4faf0f41bf99df41bc11ea"}, - {file = "orjson-3.11.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e10b4d65901da88845516ce9f7f9736f9638d19a1d483b3883dc0182e6e5edba"}, - {file = "orjson-3.11.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb6a03a678085f64b97f9d4a9ae69376ce91a3a9e9b56a82b1580d8e1d501aff"}, - {file = "orjson-3.11.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2c82e4f0b1c712477317434761fbc28b044c838b6b1240d895607441412371ac"}, - {file = "orjson-3.11.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:d58c166a18f44cc9e2bad03a327dc2d1a3d2e85b847133cfbafd6bfc6719bd79"}, - {file = "orjson-3.11.4-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:94f206766bf1ea30e1382e4890f763bd1eefddc580e08fec1ccdc20ddd95c827"}, - {file = "orjson-3.11.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:41bf25fb39a34cf8edb4398818523277ee7096689db352036a9e8437f2f3ee6b"}, - {file = "orjson-3.11.4-cp310-cp310-win32.whl", hash = "sha256:fa9627eba4e82f99ca6d29bc967f09aba446ee2b5a1ea728949ede73d313f5d3"}, - {file = "orjson-3.11.4-cp310-cp310-win_amd64.whl", hash = "sha256:23ef7abc7fca96632d8174ac115e668c1e931b8fe4dde586e92a500bf1914dcc"}, - {file = "orjson-3.11.4-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:5e59d23cd93ada23ec59a96f215139753fbfe3a4d989549bcb390f8c00370b39"}, - {file = "orjson-3.11.4-cp311-cp311-macosx_15_0_arm64.whl", hash = 
"sha256:5c3aedecfc1beb988c27c79d52ebefab93b6c3921dbec361167e6559aba2d36d"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da9e5301f1c2caa2a9a4a303480d79c9ad73560b2e7761de742ab39fe59d9175"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8873812c164a90a79f65368f8f96817e59e35d0cc02786a5356f0e2abed78040"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5d7feb0741ebb15204e748f26c9638e6665a5fa93c37a2c73d64f1669b0ddc63"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:01ee5487fefee21e6910da4c2ee9eef005bee568a0879834df86f888d2ffbdd9"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d40d46f348c0321df01507f92b95a377240c4ec31985225a6668f10e2676f9a"}, - {file = "orjson-3.11.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95713e5fc8af84d8edc75b785d2386f653b63d62b16d681687746734b4dfc0be"}, - {file = "orjson-3.11.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ad73ede24f9083614d6c4ca9a85fe70e33be7bf047ec586ee2363bc7418fe4d7"}, - {file = "orjson-3.11.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:842289889de515421f3f224ef9c1f1efb199a32d76d8d2ca2706fa8afe749549"}, - {file = "orjson-3.11.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:3b2427ed5791619851c52a1261b45c233930977e7de8cf36de05636c708fa905"}, - {file = "orjson-3.11.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3c36e524af1d29982e9b190573677ea02781456b2e537d5840e4538a5ec41907"}, - {file = "orjson-3.11.4-cp311-cp311-win32.whl", hash = "sha256:87255b88756eab4a68ec61837ca754e5d10fa8bc47dc57f75cedfeaec358d54c"}, - {file = "orjson-3.11.4-cp311-cp311-win_amd64.whl", hash = "sha256:e2d5d5d798aba9a0e1fede8d853fa899ce2cb930ec0857365f700dffc2c7af6a"}, - {file = "orjson-3.11.4-cp311-cp311-win_arm64.whl", 
hash = "sha256:6bb6bb41b14c95d4f2702bce9975fda4516f1db48e500102fc4d8119032ff045"}, - {file = "orjson-3.11.4-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:d4371de39319d05d3f482f372720b841c841b52f5385bd99c61ed69d55d9ab50"}, - {file = "orjson-3.11.4-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:e41fd3b3cac850eaae78232f37325ed7d7436e11c471246b87b2cd294ec94853"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:600e0e9ca042878c7fdf189cf1b028fe2c1418cc9195f6cb9824eb6ed99cb938"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7bbf9b333f1568ef5da42bc96e18bf30fd7f8d54e9ae066d711056add508e415"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4806363144bb6e7297b8e95870e78d30a649fdc4e23fc84daa80c8ebd366ce44"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad355e8308493f527d41154e9053b86a5be892b3b359a5c6d5d95cda23601cb2"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c8a7517482667fb9f0ff1b2f16fe5829296ed7a655d04d68cd9711a4d8a4e708"}, - {file = "orjson-3.11.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:97eb5942c7395a171cbfecc4ef6701fc3c403e762194683772df4c54cfbb2210"}, - {file = "orjson-3.11.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:149d95d5e018bdd822e3f38c103b1a7c91f88d38a88aada5c4e9b3a73a244241"}, - {file = "orjson-3.11.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:624f3951181eb46fc47dea3d221554e98784c823e7069edb5dbd0dc826ac909b"}, - {file = "orjson-3.11.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:03bfa548cf35e3f8b3a96c4e8e41f753c686ff3d8e182ce275b1751deddab58c"}, - {file = "orjson-3.11.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:525021896afef44a68148f6ed8a8bf8375553d6066c7f48537657f64823565b9"}, - {file = "orjson-3.11.4-cp312-cp312-win32.whl", hash = "sha256:b58430396687ce0f7d9eeb3dd47761ca7d8fda8e9eb92b3077a7a353a75efefa"}, - {file = "orjson-3.11.4-cp312-cp312-win_amd64.whl", hash = "sha256:c6dbf422894e1e3c80a177133c0dda260f81428f9de16d61041949f6a2e5c140"}, - {file = "orjson-3.11.4-cp312-cp312-win_arm64.whl", hash = "sha256:d38d2bc06d6415852224fcc9c0bfa834c25431e466dc319f0edd56cca81aa96e"}, - {file = "orjson-3.11.4-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:2d6737d0e616a6e053c8b4acc9eccea6b6cce078533666f32d140e4f85002534"}, - {file = "orjson-3.11.4-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:afb14052690aa328cc118a8e09f07c651d301a72e44920b887c519b313d892ff"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38aa9e65c591febb1b0aed8da4d469eba239d434c218562df179885c94e1a3ad"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f2cf4dfaf9163b0728d061bebc1e08631875c51cd30bf47cb9e3293bfbd7dcd5"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:89216ff3dfdde0e4070932e126320a1752c9d9a758d6a32ec54b3b9334991a6a"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9daa26ca8e97fae0ce8aa5d80606ef8f7914e9b129b6b5df9104266f764ce436"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c8b2769dc31883c44a9cd126560327767f848eb95f99c36c9932f51090bfce9"}, - {file = "orjson-3.11.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1469d254b9884f984026bd9b0fa5bbab477a4bfe558bba6848086f6d43eb5e73"}, - {file = "orjson-3.11.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:68e44722541983614e37117209a194e8c3ad07838ccb3127d96863c95ec7f1e0"}, - {file = 
"orjson-3.11.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:8e7805fda9672c12be2f22ae124dcd7b03928d6c197544fe12174b86553f3196"}, - {file = "orjson-3.11.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:04b69c14615fb4434ab867bf6f38b2d649f6f300af30a6705397e895f7aec67a"}, - {file = "orjson-3.11.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:639c3735b8ae7f970066930e58cf0ed39a852d417c24acd4a25fc0b3da3c39a6"}, - {file = "orjson-3.11.4-cp313-cp313-win32.whl", hash = "sha256:6c13879c0d2964335491463302a6ca5ad98105fc5db3565499dcb80b1b4bd839"}, - {file = "orjson-3.11.4-cp313-cp313-win_amd64.whl", hash = "sha256:09bf242a4af98732db9f9a1ec57ca2604848e16f132e3f72edfd3c5c96de009a"}, - {file = "orjson-3.11.4-cp313-cp313-win_arm64.whl", hash = "sha256:a85f0adf63319d6c1ba06fb0dbf997fced64a01179cf17939a6caca662bf92de"}, - {file = "orjson-3.11.4-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:42d43a1f552be1a112af0b21c10a5f553983c2a0938d2bbb8ecd8bc9fb572803"}, - {file = "orjson-3.11.4-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:26a20f3fbc6c7ff2cb8e89c4c5897762c9d88cf37330c6a117312365d6781d54"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6e3f20be9048941c7ffa8fc523ccbd17f82e24df1549d1d1fe9317712d19938e"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aac364c758dc87a52e68e349924d7e4ded348dedff553889e4d9f22f74785316"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d5c54a6d76e3d741dcc3f2707f8eeb9ba2a791d3adbf18f900219b62942803b1"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f28485bdca8617b79d44627f5fb04336897041dfd9fa66d383a49d09d86798bc"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:bfc2a484cad3585e4ba61985a6062a4c2ed5c7925db6d39f1fa267c9d166487f"}, - {file = "orjson-3.11.4-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e34dbd508cb91c54f9c9788923daca129fe5b55c5b4eebe713bf5ed3791280cf"}, - {file = "orjson-3.11.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b13c478fa413d4b4ee606ec8e11c3b2e52683a640b006bb586b3041c2ca5f606"}, - {file = "orjson-3.11.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:724ca721ecc8a831b319dcd72cfa370cc380db0bf94537f08f7edd0a7d4e1780"}, - {file = "orjson-3.11.4-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:977c393f2e44845ce1b540e19a786e9643221b3323dae190668a98672d43fb23"}, - {file = "orjson-3.11.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1e539e382cf46edec157ad66b0b0872a90d829a6b71f17cb633d6c160a223155"}, - {file = "orjson-3.11.4-cp314-cp314-win32.whl", hash = "sha256:d63076d625babab9db5e7836118bdfa086e60f37d8a174194ae720161eb12394"}, - {file = "orjson-3.11.4-cp314-cp314-win_amd64.whl", hash = "sha256:0a54d6635fa3aaa438ae32e8570b9f0de36f3f6562c308d2a2a452e8b0592db1"}, - {file = "orjson-3.11.4-cp314-cp314-win_arm64.whl", hash = "sha256:78b999999039db3cf58f6d230f524f04f75f129ba3d1ca2ed121f8657e575d3d"}, - {file = "orjson-3.11.4-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:405261b0a8c62bcbd8e2931c26fdc08714faf7025f45531541e2b29e544b545b"}, - {file = "orjson-3.11.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:af02ff34059ee9199a3546f123a6ab4c86caf1708c79042caf0820dc290a6d4f"}, - {file = "orjson-3.11.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0b2eba969ea4203c177c7b38b36c69519e6067ee68c34dc37081fac74c796e10"}, - {file = "orjson-3.11.4-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0baa0ea43cfa5b008a28d3c07705cf3ada40e5d347f0f44994a64b1b7b4b5350"}, - {file = 
"orjson-3.11.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:80fd082f5dcc0e94657c144f1b2a3a6479c44ad50be216cf0c244e567f5eae19"}, - {file = "orjson-3.11.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1e3704d35e47d5bee811fb1cbd8599f0b4009b14d451c4c57be5a7e25eb89a13"}, - {file = "orjson-3.11.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:caa447f2b5356779d914658519c874cf3b7629e99e63391ed519c28c8aea4919"}, - {file = "orjson-3.11.4-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:bba5118143373a86f91dadb8df41d9457498226698ebdf8e11cbb54d5b0e802d"}, - {file = "orjson-3.11.4-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:622463ab81d19ef3e06868b576551587de8e4d518892d1afab71e0fbc1f9cffc"}, - {file = "orjson-3.11.4-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3e0a700c4b82144b72946b6629968df9762552ee1344bfdb767fecdd634fbd5a"}, - {file = "orjson-3.11.4-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:6e18a5c15e764e5f3fc569b47872450b4bcea24f2a6354c0a0e95ad21045d5a9"}, - {file = "orjson-3.11.4-cp39-cp39-win32.whl", hash = "sha256:fb1c37c71cad991ef4d89c7a634b5ffb4447dbd7ae3ae13e8f5ee7f1775e7ab1"}, - {file = "orjson-3.11.4-cp39-cp39-win_amd64.whl", hash = "sha256:e2985ce8b8c42d00492d0ed79f2bd2b6460d00f2fa671dfde4bf2e02f49bf5c6"}, - {file = "orjson-3.11.4.tar.gz", hash = "sha256:39485f4ab4c9b30a3943cfe99e1a213c4776fb69e8abd68f66b83d5a0b0fdc6d"}, + {file = "orjson-3.10.18-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:a45e5d68066b408e4bc383b6e4ef05e717c65219a9e1390abc6155a520cac402"}, + {file = "orjson-3.10.18-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be3b9b143e8b9db05368b13b04c84d37544ec85bb97237b3a923f076265ec89c"}, + {file = "orjson-3.10.18-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9b0aa09745e2c9b3bf779b096fa71d1cc2d801a604ef6dd79c8b1bfef52b2f92"}, + {file = 
"orjson-3.10.18-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:53a245c104d2792e65c8d225158f2b8262749ffe64bc7755b00024757d957a13"}, + {file = "orjson-3.10.18-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f9495ab2611b7f8a0a8a505bcb0f0cbdb5469caafe17b0e404c3c746f9900469"}, + {file = "orjson-3.10.18-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:73be1cbcebadeabdbc468f82b087df435843c809cd079a565fb16f0f3b23238f"}, + {file = "orjson-3.10.18-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fe8936ee2679e38903df158037a2f1c108129dee218975122e37847fb1d4ac68"}, + {file = "orjson-3.10.18-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7115fcbc8525c74e4c2b608129bef740198e9a120ae46184dac7683191042056"}, + {file = "orjson-3.10.18-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:771474ad34c66bc4d1c01f645f150048030694ea5b2709b87d3bda273ffe505d"}, + {file = "orjson-3.10.18-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:7c14047dbbea52886dd87169f21939af5d55143dad22d10db6a7514f058156a8"}, + {file = "orjson-3.10.18-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:641481b73baec8db14fdf58f8967e52dc8bda1f2aba3aa5f5c1b07ed6df50b7f"}, + {file = "orjson-3.10.18-cp310-cp310-win32.whl", hash = "sha256:607eb3ae0909d47280c1fc657c4284c34b785bae371d007595633f4b1a2bbe06"}, + {file = "orjson-3.10.18-cp310-cp310-win_amd64.whl", hash = "sha256:8770432524ce0eca50b7efc2a9a5f486ee0113a5fbb4231526d414e6254eba92"}, + {file = "orjson-3.10.18-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e0a183ac3b8e40471e8d843105da6fbe7c070faab023be3b08188ee3f85719b8"}, + {file = "orjson-3.10.18-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:5ef7c164d9174362f85238d0cd4afdeeb89d9e523e4651add6a5d458d6f7d42d"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:afd14c5d99cdc7bf93f22b12ec3b294931518aa019e2a147e8aa2f31fd3240f7"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7b672502323b6cd133c4af6b79e3bea36bad2d16bca6c1f645903fce83909a7a"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:51f8c63be6e070ec894c629186b1c0fe798662b8687f3d9fdfa5e401c6bd7679"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3f9478ade5313d724e0495d167083c6f3be0dd2f1c9c8a38db9a9e912cdaf947"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:187aefa562300a9d382b4b4eb9694806e5848b0cedf52037bb5c228c61bb66d4"}, + {file = "orjson-3.10.18-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9da552683bc9da222379c7a01779bddd0ad39dd699dd6300abaf43eadee38334"}, + {file = "orjson-3.10.18-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e450885f7b47a0231979d9c49b567ed1c4e9f69240804621be87c40bc9d3cf17"}, + {file = "orjson-3.10.18-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:5e3c9cc2ba324187cd06287ca24f65528f16dfc80add48dc99fa6c836bb3137e"}, + {file = "orjson-3.10.18-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:50ce016233ac4bfd843ac5471e232b865271d7d9d44cf9d33773bcd883ce442b"}, + {file = "orjson-3.10.18-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b3ceff74a8f7ffde0b2785ca749fc4e80e4315c0fd887561144059fb1c138aa7"}, + {file = "orjson-3.10.18-cp311-cp311-win32.whl", hash = "sha256:fdba703c722bd868c04702cac4cb8c6b8ff137af2623bc0ddb3b3e6a2c8996c1"}, + {file = "orjson-3.10.18-cp311-cp311-win_amd64.whl", hash = "sha256:c28082933c71ff4bc6ccc82a454a2bffcef6e1d7379756ca567c772e4fb3278a"}, + {file = "orjson-3.10.18-cp311-cp311-win_arm64.whl", hash = "sha256:a6c7c391beaedd3fa63206e5c2b7b554196f14debf1ec9deb54b5d279b1b46f5"}, + {file = 
"orjson-3.10.18-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:50c15557afb7f6d63bc6d6348e0337a880a04eaa9cd7c9d569bcb4e760a24753"}, + {file = "orjson-3.10.18-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:356b076f1662c9813d5fa56db7d63ccceef4c271b1fb3dd522aca291375fcf17"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:559eb40a70a7494cd5beab2d73657262a74a2c59aff2068fdba8f0424ec5b39d"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f3c29eb9a81e2fbc6fd7ddcfba3e101ba92eaff455b8d602bf7511088bbc0eae"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6612787e5b0756a171c7d81ba245ef63a3533a637c335aa7fcb8e665f4a0966f"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ac6bd7be0dcab5b702c9d43d25e70eb456dfd2e119d512447468f6405b4a69c"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9f72f100cee8dde70100406d5c1abba515a7df926d4ed81e20a9730c062fe9ad"}, + {file = "orjson-3.10.18-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9dca85398d6d093dd41dc0983cbf54ab8e6afd1c547b6b8a311643917fbf4e0c"}, + {file = "orjson-3.10.18-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:22748de2a07fcc8781a70edb887abf801bb6142e6236123ff93d12d92db3d406"}, + {file = "orjson-3.10.18-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:3a83c9954a4107b9acd10291b7f12a6b29e35e8d43a414799906ea10e75438e6"}, + {file = "orjson-3.10.18-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:303565c67a6c7b1f194c94632a4a39918e067bd6176a48bec697393865ce4f06"}, + {file = "orjson-3.10.18-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:86314fdb5053a2f5a5d881f03fca0219bfdf832912aa88d18676a5175c6916b5"}, + {file = "orjson-3.10.18-cp312-cp312-win32.whl", hash = 
"sha256:187ec33bbec58c76dbd4066340067d9ece6e10067bb0cc074a21ae3300caa84e"}, + {file = "orjson-3.10.18-cp312-cp312-win_amd64.whl", hash = "sha256:f9f94cf6d3f9cd720d641f8399e390e7411487e493962213390d1ae45c7814fc"}, + {file = "orjson-3.10.18-cp312-cp312-win_arm64.whl", hash = "sha256:3d600be83fe4514944500fa8c2a0a77099025ec6482e8087d7659e891f23058a"}, + {file = "orjson-3.10.18-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:69c34b9441b863175cc6a01f2935de994025e773f814412030f269da4f7be147"}, + {file = "orjson-3.10.18-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:1ebeda919725f9dbdb269f59bc94f861afbe2a27dce5608cdba2d92772364d1c"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5adf5f4eed520a4959d29ea80192fa626ab9a20b2ea13f8f6dc58644f6927103"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7592bb48a214e18cd670974f289520f12b7aed1fa0b2e2616b8ed9e069e08595"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f872bef9f042734110642b7a11937440797ace8c87527de25e0c53558b579ccc"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0315317601149c244cb3ecef246ef5861a64824ccbcb8018d32c66a60a84ffbc"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e0da26957e77e9e55a6c2ce2e7182a36a6f6b180ab7189315cb0995ec362e049"}, + {file = "orjson-3.10.18-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb70d489bc79b7519e5803e2cc4c72343c9dc1154258adf2f8925d0b60da7c58"}, + {file = "orjson-3.10.18-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e9e86a6af31b92299b00736c89caf63816f70a4001e750bda179e15564d7a034"}, + {file = "orjson-3.10.18-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:c382a5c0b5931a5fc5405053d36c1ce3fd561694738626c77ae0b1dfc0242ca1"}, 
+ {file = "orjson-3.10.18-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:8e4b2ae732431127171b875cb2668f883e1234711d3c147ffd69fe5be51a8012"}, + {file = "orjson-3.10.18-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2d808e34ddb24fc29a4d4041dcfafbae13e129c93509b847b14432717d94b44f"}, + {file = "orjson-3.10.18-cp313-cp313-win32.whl", hash = "sha256:ad8eacbb5d904d5591f27dee4031e2c1db43d559edb8f91778efd642d70e6bea"}, + {file = "orjson-3.10.18-cp313-cp313-win_amd64.whl", hash = "sha256:aed411bcb68bf62e85588f2a7e03a6082cc42e5a2796e06e72a962d7c6310b52"}, + {file = "orjson-3.10.18-cp313-cp313-win_arm64.whl", hash = "sha256:f54c1385a0e6aba2f15a40d703b858bedad36ded0491e55d35d905b2c34a4cc3"}, + {file = "orjson-3.10.18-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:c95fae14225edfd699454e84f61c3dd938df6629a00c6ce15e704f57b58433bb"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5232d85f177f98e0cefabb48b5e7f60cff6f3f0365f9c60631fecd73849b2a82"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2783e121cafedf0d85c148c248a20470018b4ffd34494a68e125e7d5857655d1"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e54ee3722caf3db09c91f442441e78f916046aa58d16b93af8a91500b7bbf273"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2daf7e5379b61380808c24f6fc182b7719301739e4271c3ec88f2984a2d61f89"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7f39b371af3add20b25338f4b29a8d6e79a8c7ed0e9dd49e008228a065d07781"}, + {file = "orjson-3.10.18-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2b819ed34c01d88c6bec290e6842966f8e9ff84b7694632e88341363440d4cc0"}, + {file = "orjson-3.10.18-cp39-cp39-musllinux_1_2_aarch64.whl", hash = 
"sha256:2f6c57debaef0b1aa13092822cbd3698a1fb0209a9ea013a969f4efa36bdea57"}, + {file = "orjson-3.10.18-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:755b6d61ffdb1ffa1e768330190132e21343757c9aa2308c67257cc81a1a6f5a"}, + {file = "orjson-3.10.18-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:ce8d0a875a85b4c8579eab5ac535fb4b2a50937267482be402627ca7e7570ee3"}, + {file = "orjson-3.10.18-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:57b5d0673cbd26781bebc2bf86f99dd19bd5a9cb55f71cc4f66419f6b50f3d77"}, + {file = "orjson-3.10.18-cp39-cp39-win32.whl", hash = "sha256:951775d8b49d1d16ca8818b1f20c4965cae9157e7b562a2ae34d3967b8f21c8e"}, + {file = "orjson-3.10.18-cp39-cp39-win_amd64.whl", hash = "sha256:fdd9d68f83f0bc4406610b1ac68bdcded8c5ee58605cc69e643a06f4d075f429"}, + {file = "orjson-3.10.18.tar.gz", hash = "sha256:e8da3947d92123eda795b68228cafe2724815621fe35e8e320a9e9593a4bcd53"}, ] [[package]] @@ -1478,14 +1221,14 @@ files = [ [[package]] name = "pip" -version = "25.3" +version = "25.1.1" description = "The PyPA recommended tool for installing Python packages." optional = false python-versions = ">=3.9" groups = ["dev"] files = [ - {file = "pip-25.3-py3-none-any.whl", hash = "sha256:9655943313a94722b7774661c21049070f6bbb0a1516bf02f7c8d5d9201514cd"}, - {file = "pip-25.3.tar.gz", hash = "sha256:8d0538dbbd7babbd207f261ed969c65de439f6bc9e5dbd3b3b9a77f25d95f343"}, + {file = "pip-25.1.1-py3-none-any.whl", hash = "sha256:2913a38a2abf4ea6b64ab507bd9e967f3b53dc1ede74b01b0931e1ce548751af"}, + {file = "pip-25.1.1.tar.gz", hash = "sha256:3de45d411d308d5054c2168185d8da7f9a2cd753dbac8acbfa88a8909ecd9077"}, ] [[package]] @@ -1515,14 +1258,14 @@ testing = ["flit_core (>=2,<4)", "poetry_core (>=1.0.0)", "pytest (>=7.2.0)", "p [[package]] name = "platformdirs" -version = "4.4.0" +version = "4.3.8" description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." 
optional = false python-versions = ">=3.9" groups = ["dev", "docs"] files = [ - {file = "platformdirs-4.4.0-py3-none-any.whl", hash = "sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85"}, - {file = "platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf"}, + {file = "platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4"}, + {file = "platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc"}, ] [package.extras] @@ -1530,24 +1273,6 @@ docs = ["furo (>=2024.8.6)", "proselint (>=0.14)", "sphinx (>=8.1.3)", "sphinx-a test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.3.4)", "pytest-cov (>=6)", "pytest-mock (>=3.14)"] type = ["mypy (>=1.14.1)"] -[[package]] -name = "platformdirs" -version = "4.5.0" -description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." -optional = false -python-versions = ">=3.10" -groups = ["dev", "docs"] -markers = "python_version >= \"3.11\"" -files = [ - {file = "platformdirs-4.5.0-py3-none-any.whl", hash = "sha256:e578a81bb873cbb89a41fcc904c7ef523cc18284b7e3b3ccf06aca1403b7ebd3"}, - {file = "platformdirs-4.5.0.tar.gz", hash = "sha256:70ddccdd7c99fc5942e9fc25636a8b34d04c24b335100223152c2803e4063312"}, -] - -[package.extras] -docs = ["furo (>=2025.9.25)", "proselint (>=0.14)", "sphinx (>=8.2.3)", "sphinx-autodoc-typehints (>=3.2)"] -test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.4.2)", "pytest-cov (>=7)", "pytest-mock (>=3.15.1)"] -type = ["mypy (>=1.18.2)"] - [[package]] name = "pluggy" version = "1.6.0" @@ -1566,14 +1291,14 @@ testing = ["coverage", "pytest", "pytest-benchmark"] [[package]] name = "pygments" -version = "2.19.2" +version = "2.19.1" description = "Pygments is a syntax highlighting package written in Python." 
optional = false python-versions = ">=3.8" groups = ["docs", "tests"] files = [ - {file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"}, - {file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"}, + {file = "pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c"}, + {file = "pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f"}, ] [package.extras] @@ -1581,14 +1306,14 @@ windows-terminal = ["colorama (>=0.4.6)"] [[package]] name = "pymdown-extensions" -version = "10.17.1" +version = "10.15" description = "Extension pack for Python Markdown." optional = false -python-versions = ">=3.9" +python-versions = ">=3.8" groups = ["docs"] files = [ - {file = "pymdown_extensions-10.17.1-py3-none-any.whl", hash = "sha256:1f160209c82eecbb5d8a0d8f89a4d9bd6bdcbde9a8537761844cfc57ad5cd8a6"}, - {file = "pymdown_extensions-10.17.1.tar.gz", hash = "sha256:60d05fe55e7fb5a1e4740fc575facad20dc6ee3a748e8d3d36ba44142e75ce03"}, + {file = "pymdown_extensions-10.15-py3-none-any.whl", hash = "sha256:46e99bb272612b0de3b7e7caf6da8dd5f4ca5212c0b273feb9304e236c484e5f"}, + {file = "pymdown_extensions-10.15.tar.gz", hash = "sha256:0e5994e32155f4b03504f939e501b981d306daf7ec2aa1cd2eb6bd300784f8f7"}, ] [package.dependencies] @@ -1686,85 +1411,65 @@ six = ">=1.5" [[package]] name = "pyyaml" -version = "6.0.3" +version = "6.0.2" description = "YAML parser and emitter for Python" optional = false python-versions = ">=3.8" groups = ["docs"] files = [ - {file = "PyYAML-6.0.3-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:c2514fceb77bc5e7a2f7adfaa1feb2fb311607c9cb518dbc378688ec73d8292f"}, - {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:9c57bb8c96f6d1808c030b1687b9b5fb476abaa47f0db9c0101f5e9f394e97f4"}, - {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efd7b85f94a6f21e4932043973a7ba2613b059c4a000551892ac9f1d11f5baf3"}, - {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:22ba7cfcad58ef3ecddc7ed1db3409af68d023b7f940da23c6c2a1890976eda6"}, - {file = "PyYAML-6.0.3-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6344df0d5755a2c9a276d4473ae6b90647e216ab4757f8426893b5dd2ac3f369"}, - {file = "PyYAML-6.0.3-cp38-cp38-win32.whl", hash = "sha256:3ff07ec89bae51176c0549bc4c63aa6202991da2d9a6129d7aef7f1407d3f295"}, - {file = "PyYAML-6.0.3-cp38-cp38-win_amd64.whl", hash = "sha256:5cf4e27da7e3fbed4d6c3d8e797387aaad68102272f8f9752883bc32d61cb87b"}, - {file = "pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b"}, - {file = "pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956"}, - {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8"}, - {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198"}, - {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b"}, - {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0"}, - {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = 
"sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69"}, - {file = "pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e"}, - {file = "pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c"}, - {file = "pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e"}, - {file = "pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824"}, - {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c"}, - {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00"}, - {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d"}, - {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a"}, - {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4"}, - {file = "pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b"}, - {file = "pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf"}, - {file = "pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196"}, - {file = "pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0"}, - {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28"}, - {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c"}, - {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc"}, - {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e"}, - {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea"}, - {file = "pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5"}, - {file = "pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b"}, - {file = "pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd"}, - {file = "pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8"}, - {file = "pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1"}, - {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c"}, - {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5"}, - {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6"}, - {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6"}, - {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be"}, - {file = "pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26"}, - {file = "pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c"}, - {file = "pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb"}, - {file = "pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac"}, - {file = "pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310"}, - {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7"}, - {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788"}, - {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5"}, - {file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764"}, - {file = 
"pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35"}, - {file = "pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac"}, - {file = "pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3"}, - {file = "pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3"}, - {file = "pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba"}, - {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c"}, - {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702"}, - {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c"}, - {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065"}, - {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65"}, - {file = "pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9"}, - {file = "pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b"}, - {file = "pyyaml-6.0.3-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:b865addae83924361678b652338317d1bd7e79b1f4596f96b96c77a5a34b34da"}, - {file = 
"pyyaml-6.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c3355370a2c156cffb25e876646f149d5d68f5e0a3ce86a5084dd0b64a994917"}, - {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3c5677e12444c15717b902a5798264fa7909e41153cdf9ef7ad571b704a63dd9"}, - {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5ed875a24292240029e4483f9d4a4b8a1ae08843b9c54f43fcc11e404532a8a5"}, - {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0150219816b6a1fa26fb4699fb7daa9caf09eb1999f3b70fb6e786805e80375a"}, - {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:fa160448684b4e94d80416c0fa4aac48967a969efe22931448d853ada8baf926"}, - {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:27c0abcb4a5dac13684a37f76e701e054692a9b2d3064b70f5e4eb54810553d7"}, - {file = "pyyaml-6.0.3-cp39-cp39-win32.whl", hash = "sha256:1ebe39cb5fc479422b83de611d14e2c0d3bb2a18bbcb01f229ab3cfbd8fee7a0"}, - {file = "pyyaml-6.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:2e71d11abed7344e42a8849600193d15b6def118602c4c176f748e4583246007"}, - {file = "pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f"}, + {file = "PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086"}, + {file = "PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b"}, + {file = 
"PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68"}, + {file = "PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99"}, + {file = "PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e"}, + {file = "PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5"}, + {file = "PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = 
"sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b"}, + {file = "PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4"}, + {file = "PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652"}, + {file = "PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183"}, + {file = "PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563"}, + {file = "PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083"}, + {file = "PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706"}, + {file = "PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a"}, + {file = "PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff"}, + {file = "PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d"}, + {file 
= "PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725"}, + {file = "PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631"}, + {file = "PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8"}, + {file = "pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e"}, ] [[package]] @@ -1864,54 +1569,44 @@ files = [ [[package]] name = "tomli" -version = "2.3.0" +version = "2.2.1" description = "A lil' TOML parser" optional = false python-versions = ">=3.8" groups = ["dev", "tests"] files = [ - {file = "tomli-2.3.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:88bd15eb972f3664f5ed4b57c1634a97153b4bac4479dcb6a495f41921eb7f45"}, - {file = "tomli-2.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:883b1c0d6398a6a9d29b508c331fa56adbcdff647f6ace4dfca0f50e90dfd0ba"}, - {file = "tomli-2.3.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:d1381caf13ab9f300e30dd8feadb3de072aeb86f1d34a8569453ff32a7dea4bf"}, - {file = "tomli-2.3.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a0e285d2649b78c0d9027570d4da3425bdb49830a6156121360b3f8511ea3441"}, - {file = "tomli-2.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0a154a9ae14bfcf5d8917a59b51ffd5a3ac1fd149b71b47a3a104ca4edcfa845"}, - {file = "tomli-2.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:74bf8464ff93e413514fefd2be591c3b0b23231a77f901db1eb30d6f712fc42c"}, - {file = "tomli-2.3.0-cp311-cp311-win32.whl", hash = "sha256:00b5f5d95bbfc7d12f91ad8c593a1659b6387b43f054104cda404be6bda62456"}, - {file = "tomli-2.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:4dc4ce8483a5d429ab602f111a93a6ab1ed425eae3122032db7e9acf449451be"}, - {file = "tomli-2.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d7d86942e56ded512a594786a5ba0a5e521d02529b3826e7761a05138341a2ac"}, - {file = "tomli-2.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:73ee0b47d4dad1c5e996e3cd33b8a76a50167ae5f96a2607cbe8cc773506ab22"}, - {file = "tomli-2.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:792262b94d5d0a466afb5bc63c7daa9d75520110971ee269152083270998316f"}, - {file = "tomli-2.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4f195fe57ecceac95a66a75ac24d9d5fbc98ef0962e09b2eddec5d39375aae52"}, - {file = "tomli-2.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e31d432427dcbf4d86958c184b9bfd1e96b5b71f8eb17e6d02531f434fd335b8"}, - {file = "tomli-2.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7b0882799624980785240ab732537fcfc372601015c00f7fc367c55308c186f6"}, - {file = "tomli-2.3.0-cp312-cp312-win32.whl", hash = "sha256:ff72b71b5d10d22ecb084d345fc26f42b5143c5533db5e2eaba7d2d335358876"}, - {file = "tomli-2.3.0-cp312-cp312-win_amd64.whl", hash = 
"sha256:1cb4ed918939151a03f33d4242ccd0aa5f11b3547d0cf30f7c74a408a5b99878"}, - {file = "tomli-2.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5192f562738228945d7b13d4930baffda67b69425a7f0da96d360b0a3888136b"}, - {file = "tomli-2.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:be71c93a63d738597996be9528f4abe628d1adf5e6eb11607bc8fe1a510b5dae"}, - {file = "tomli-2.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4665508bcbac83a31ff8ab08f424b665200c0e1e645d2bd9ab3d3e557b6185b"}, - {file = "tomli-2.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4021923f97266babc6ccab9f5068642a0095faa0a51a246a6a02fccbb3514eaf"}, - {file = "tomli-2.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4ea38c40145a357d513bffad0ed869f13c1773716cf71ccaa83b0fa0cc4e42f"}, - {file = "tomli-2.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad805ea85eda330dbad64c7ea7a4556259665bdf9d2672f5dccc740eb9d3ca05"}, - {file = "tomli-2.3.0-cp313-cp313-win32.whl", hash = "sha256:97d5eec30149fd3294270e889b4234023f2c69747e555a27bd708828353ab606"}, - {file = "tomli-2.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0c95ca56fbe89e065c6ead5b593ee64b84a26fca063b5d71a1122bf26e533999"}, - {file = "tomli-2.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:cebc6fe843e0733ee827a282aca4999b596241195f43b4cc371d64fc6639da9e"}, - {file = "tomli-2.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4c2ef0244c75aba9355561272009d934953817c49f47d768070c3c94355c2aa3"}, - {file = "tomli-2.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c22a8bf253bacc0cf11f35ad9808b6cb75ada2631c2d97c971122583b129afbc"}, - {file = "tomli-2.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0eea8cc5c5e9f89c9b90c4896a8deefc74f518db5927d0e0e8d4a80953d774d0"}, - {file = 
"tomli-2.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b74a0e59ec5d15127acdabd75ea17726ac4c5178ae51b85bfe39c4f8a278e879"}, - {file = "tomli-2.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:b5870b50c9db823c595983571d1296a6ff3e1b88f734a4c8f6fc6188397de005"}, - {file = "tomli-2.3.0-cp314-cp314-win32.whl", hash = "sha256:feb0dacc61170ed7ab602d3d972a58f14ee3ee60494292d384649a3dc38ef463"}, - {file = "tomli-2.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:b273fcbd7fc64dc3600c098e39136522650c49bca95df2d11cf3b626422392c8"}, - {file = "tomli-2.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:940d56ee0410fa17ee1f12b817b37a4d4e4dc4d27340863cc67236c74f582e77"}, - {file = "tomli-2.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f85209946d1fe94416debbb88d00eb92ce9cd5266775424ff81bc959e001acaf"}, - {file = "tomli-2.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a56212bdcce682e56b0aaf79e869ba5d15a6163f88d5451cbde388d48b13f530"}, - {file = "tomli-2.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c5f3ffd1e098dfc032d4d3af5c0ac64f6d286d98bc148698356847b80fa4de1b"}, - {file = "tomli-2.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5e01decd096b1530d97d5d85cb4dff4af2d8347bd35686654a004f8dea20fc67"}, - {file = "tomli-2.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8a35dd0e643bb2610f156cca8db95d213a90015c11fee76c946aa62b7ae7e02f"}, - {file = "tomli-2.3.0-cp314-cp314t-win32.whl", hash = "sha256:a1f7f282fe248311650081faafa5f4732bdbfef5d45fe3f2e702fbc6f2d496e0"}, - {file = "tomli-2.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:70a251f8d4ba2d9ac2542eecf008b3c8a9fc5c3f9f02c56a9d7952612be2fdba"}, - {file = "tomli-2.3.0-py3-none-any.whl", hash = "sha256:e95b1af3c5b07d9e643909b5abbec77cd9f1217e6d0bca72b0234736b9fb1f1b"}, - {file = "tomli-2.3.0.tar.gz", hash = 
"sha256:64be704a875d2a59753d80ee8a533c3fe183e3f06807ff7dc2232938ccb01549"}, + {file = "tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249"}, + {file = "tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6"}, + {file = "tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a"}, + {file = "tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee"}, + {file = "tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e"}, + {file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4"}, + {file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106"}, + {file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8"}, + {file = "tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff"}, + {file = "tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b"}, + {file = "tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea"}, + {file = "tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8"}, + {file = "tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192"}, + {file = "tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222"}, + {file = "tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77"}, + {file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6"}, + {file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd"}, + {file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e"}, + {file = "tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98"}, + {file = "tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4"}, + {file = "tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7"}, + {file = "tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c"}, + {file = "tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13"}, + {file = "tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281"}, + {file = "tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272"}, + {file = 
"tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140"}, + {file = "tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2"}, + {file = "tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744"}, + {file = "tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec"}, + {file = "tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69"}, + {file = "tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc"}, + {file = "tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff"}, ] markers = {dev = "python_version < \"3.11\"", tests = "python_full_version <= \"3.11.0a6\""} @@ -2002,14 +1697,14 @@ watchmedo = ["PyYAML (>=3.10)"] [[package]] name = "wcmatch" -version = "10.1" +version = "10.0" description = "Wildcard/glob file name matcher." 
optional = false -python-versions = ">=3.9" +python-versions = ">=3.8" groups = ["docs"] files = [ - {file = "wcmatch-10.1-py3-none-any.whl", hash = "sha256:5848ace7dbb0476e5e55ab63c6bbd529745089343427caa5537f230cc01beb8a"}, - {file = "wcmatch-10.1.tar.gz", hash = "sha256:f11f94208c8c8484a16f4f48638a85d771d9513f4ab3f37595978801cb9465af"}, + {file = "wcmatch-10.0-py3-none-any.whl", hash = "sha256:0dd927072d03c0a6527a20d2e6ad5ba8d0380e60870c383bc533b71744df7b7a"}, + {file = "wcmatch-10.0.tar.gz", hash = "sha256:e72f0de09bba6a04e0de70937b0cf06e55f36f37b3deb422dfaf854b867b840a"}, ] [package.dependencies] @@ -2032,14 +1727,14 @@ test = ["pytest (>=6.0.0)", "setuptools (>=65)"] [[package]] name = "zipp" -version = "3.23.0" +version = "3.22.0" description = "Backport of pathlib-compatible object wrapper for zip files" optional = false python-versions = ">=3.9" groups = ["dev", "docs"] files = [ - {file = "zipp-3.23.0-py3-none-any.whl", hash = "sha256:071652d6115ed432f5ce1d34c336c0adfd6a884660d1e9712a256d3d3bd4b14e"}, - {file = "zipp-3.23.0.tar.gz", hash = "sha256:a07157588a12518c9d4034df3fbbee09c814741a33ff63c05fa29d26a2404166"}, + {file = "zipp-3.22.0-py3-none-any.whl", hash = "sha256:fe208f65f2aca48b81f9e6fd8cf7b8b32c26375266b009b413d45306b6148343"}, + {file = "zipp-3.22.0.tar.gz", hash = "sha256:dd2f28c3ce4bc67507bfd3781d21b7bb2be31103b51a4553ad7d90b84e57ace5"}, ] markers = {dev = "python_full_version < \"3.10.2\"", docs = "python_version == \"3.9\""} @@ -2048,7 +1743,7 @@ check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1) ; sys_platform != \" cover = ["pytest-cov"] doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] enabler = ["pytest-enabler (>=2.2)"] -test = ["big-O", "jaraco.functools", "jaraco.itertools", "jaraco.test", "more_itertools", "pytest (>=6,!=8.1.*)", "pytest-ignore-flaky"] +test = ["big-O", "importlib_resources ; python_version < \"3.9\"", "jaraco.functools", 
"jaraco.itertools", "jaraco.test", "more_itertools", "pytest (>=6,!=8.1.*)", "pytest-ignore-flaky"] type = ["pytest-mypy"] [extras] From 310d1a92018f29892185d760f5939fa63becddf2 Mon Sep 17 00:00:00 2001 From: Edmond Chuc Date: Wed, 12 Nov 2025 15:11:29 +1000 Subject: [PATCH 59/60] chore: remove unecessary type checking imports --- rdflib/extras/shacl.py | 6 ------ 1 file changed, 6 deletions(-) diff --git a/rdflib/extras/shacl.py b/rdflib/extras/shacl.py index 3683a40920..1330a16ac4 100644 --- a/rdflib/extras/shacl.py +++ b/rdflib/extras/shacl.py @@ -15,12 +15,6 @@ from rdflib.graph import _ObjectType from rdflib.term import IdentifiedNode -if TYPE_CHECKING: - from rdflib.term import IdentifiedNode - -if TYPE_CHECKING: - from rdflib.term import IdentifiedNode - class SHACLPathError(Exception): pass From 5a946fba5ac97f36ccd017b58a4b211fbf94a3c0 Mon Sep 17 00:00:00 2001 From: Edmond Chuc Date: Wed, 12 Nov 2025 15:16:45 +1000 Subject: [PATCH 60/60] chore: delete merged_prs.json --- merged_prs.json | 3939 ----------------------------------------------- 1 file changed, 3939 deletions(-) delete mode 100644 merged_prs.json diff --git a/merged_prs.json b/merged_prs.json deleted file mode 100644 index 3a59d972d4..0000000000 --- a/merged_prs.json +++ /dev/null @@ -1,3939 +0,0 @@ -[ - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3223", - "id": 3428695947, - "node_id": "PR_kwDOADL-3s6pNAnb", - "number": 3223, - "title": "[7.x] Fix incorrect deskolemization of literals", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": 
"https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-09-18T05:18:09Z", - "updated_at": "2025-09-18T05:30:22Z", - "closed_at": "2025-09-18T05:30:20Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3223", - "html_url": "https://github.com/RDFLib/rdflib/pull/3223", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3223.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3223.patch", - "merged_at": "2025-09-18T05:30:20Z" - }, - "body": "This is the v7 version of PR https://github.com/RDFLib/rdflib/pull/3127.\r\n\r\n* Fix issue 3126\r\n\r\n* [pre-commit.ci] auto fixes from pre-commit.com hooks\r\n\r\nfor more information, see https://pre-commit.ci\r\n\r\n---------\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests 
for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3223/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3222", - "id": 3428459459, - "node_id": "PR_kwDOADL-3s6pMNyn", - "number": 3222, - "title": "Merge 7-maintenance branch into 7.x", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": 
"", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-09-18T03:00:50Z", - "updated_at": "2025-09-18T03:08:39Z", - "closed_at": "2025-09-18T03:08:37Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3222", - "html_url": "https://github.com/RDFLib/rdflib/pull/3222", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3222.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3222.patch", - "merged_at": "2025-09-18T03:08:37Z" - }, - "body": "# Summary of changes\r\n\r\nI am merging in the `7-maintenance` branch after reviewing the current set of v7 branches. The `7.x` branch has branch protection rules enabled and is intended to be long-lived for all future v7 related features and fixes.\r\n\r\n`7-maintenance` branch will be closed after this merge. 
All future v7 PRs should target `7.x` instead.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3222/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3221", - "id": 3428294912, - "node_id": "PR_kwDOADL-3s6pLqU1", - "number": 3221, - "title": "[7.x] notation3.py: don't normalize float representation", - "user": { - 
"login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 9291256166, - "node_id": "LA_kwDOADL-3s8AAAACKc1RZg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.x", - "name": "7.x", - "color": "95113B", - "default": false, - "description": "" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-09-18T01:31:07Z", - "updated_at": "2025-09-18T04:18:53Z", - "closed_at": "2025-09-18T04:18:51Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3221", - "html_url": "https://github.com/RDFLib/rdflib/pull/3221", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3221.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3221.patch", - "merged_at": "2025-09-18T04:18:51Z" - }, - "body": "# Summary of changes\r\n\r\nCode from PR 
https://github.com/RDFLib/rdflib/pull/3020 into v7.x.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3221/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3219", - "id": 3423885690, - "node_id": "PR_kwDOADL-3s6o8vSy", - "number": 3219, - "title": "Allow lxml 6", - "user": { - "login": "jhgit", - "id": 772518, - 
"node_id": "MDQ6VXNlcjc3MjUxOA==", - "avatar_url": "https://avatars.githubusercontent.com/u/772518?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/jhgit", - "html_url": "https://github.com/jhgit", - "followers_url": "https://api.github.com/users/jhgit/followers", - "following_url": "https://api.github.com/users/jhgit/following{/other_user}", - "gists_url": "https://api.github.com/users/jhgit/gists{/gist_id}", - "starred_url": "https://api.github.com/users/jhgit/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/jhgit/subscriptions", - "organizations_url": "https://api.github.com/users/jhgit/orgs", - "repos_url": "https://api.github.com/users/jhgit/repos", - "events_url": "https://api.github.com/users/jhgit/events{/privacy}", - "received_events_url": "https://api.github.com/users/jhgit/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-09-16T22:06:35Z", - "updated_at": "2025-09-17T01:52:22Z", - "closed_at": "2025-09-17T01:52:22Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3219", - "html_url": "https://github.com/RDFLib/rdflib/pull/3219", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3219.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3219.patch", - "merged_at": "2025-09-17T01:52:22Z" - }, - "body": "rdflib builds with lxml 6.0.1 - the current latest release.\r\n\r\nFixes #3220\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nlxml is at 6.0.1. rdflib 7.1.4 builds with that version of lxml. Update pypproject.toml accordingly. 
Tested locally with python 3.9 and 3.11.\r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes (same pytest tests pass or fail with lxml5 as lxml6).\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [x] Considered adding additional documentation. (didn't see any documentation that needed updating)\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3219/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3218", - "id": 3420409710, - "node_id": "PR_kwDOADL-3s6oxCzU", - "number": 3218, - "title": "ci: fix firejail command for poetry 2.1.0", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": 
"https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-09-16T05:06:40Z", - "updated_at": "2025-09-16T05:44:53Z", - "closed_at": "2025-09-16T05:44:52Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3218", - "html_url": "https://github.com/RDFLib/rdflib/pull/3218", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3218.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3218.patch", - "merged_at": "2025-09-16T05:44:52Z" - }, - "body": "# Summary of changes\r\n\r\nIncrementally bumping poetry from v2.0.0 to see which patch/minor version breaks the CI.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - 
[ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3218/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3217", - "id": 3416326911, - "node_id": "PR_kwDOADL-3s6ojScX", - "number": 3217, - "title": "build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/latest", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - 
"followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-15T05:14:09Z", - "updated_at": "2025-09-16T03:00:23Z", - "closed_at": "2025-09-16T02:59:34Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3217", - "html_url": "https://github.com/RDFLib/rdflib/pull/3217", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3217.diff", - "patch_url": 
"https://github.com/RDFLib/rdflib/pull/3217.patch", - "merged_at": "2025-09-16T02:59:34Z" - }, - "body": "Bumps library/python from `8220cce` to `58c30f5`.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.7-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3217/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3216", - "id": 3416322658, - "node_id": "PR_kwDOADL-3s6ojRf8", - "number": 3216, - "title": "build(deps): bump library/python from `8220cce` to `58c30f5` in /docker/unstable", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-15T05:12:27Z", - "updated_at": "2025-09-16T02:47:32Z", - "closed_at": "2025-09-16T02:46:16Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3216", - "html_url": "https://github.com/RDFLib/rdflib/pull/3216", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3216.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3216.patch", - "merged_at": "2025-09-16T02:46:15Z" - }, - "body": "Bumps library/python from `8220cce` to `58c30f5`.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.7-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3216/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3214", - "id": 3416315331, - "node_id": "PR_kwDOADL-3s6ojP3g", - "number": 3214, - "title": "build(deps-dev): bump mkdocstrings from 0.29.1 to 0.30.0", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-09-15T05:09:22Z", - "updated_at": "2025-09-16T01:23:50Z", - "closed_at": "2025-09-16T01:23:49Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3214", - "html_url": "https://github.com/RDFLib/rdflib/pull/3214", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3214.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3214.patch", - "merged_at": "2025-09-16T01:23:49Z" - }, - "body": "Bumps [mkdocstrings](https://github.com/mkdocstrings/mkdocstrings) from 0.29.1 to 0.30.0.\n
\n
\nChangelog\n

Sourced from mkdocstrings's changelog.

\n
\n

0.30.0 - 2025-07-23

\n

Compare with 0.29.1

\n

Features

\n
    \n
  • Add data-skip-inventory boolean attribute for elements to skip registration in local inventory (f856160 by Bartosz S\u0142awecki). Issue-671, PR-774
  • \n
  • Add I18N support (translations) (2b4ed54 by Nyuan Zhang). PR-645, Co-authored-by: Timoth\u00e9e Mazzucotelli dev@pawamoy.fr
  • \n
\n
\n
\n
\nCommits\n
    \n
  • 2be445f chore: Prepare release 0.30.0
  • \n
  • f856160 feat: Add data-skip-inventory boolean attribute for elements to skip regist...
  • \n
  • 2b4ed54 feat: Add I18N support (translations)
  • \n
  • 51f217f chore: Template upgrade
  • \n
  • b1da3d0 ci: Ignore Ruff warnings
  • \n
  • d5bf4e1 docs: Update link to YAML idiosyncrasies
  • \n
  • See full diff in compare view
  • \n
\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocstrings&package-manager=pip&previous-version=0.29.1&new-version=0.30.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3214/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3213", - "id": 3416314480, - "node_id": "PR_kwDOADL-3s6ojPr0", - "number": 3213, - "title": "build(deps-dev): bump ruff from 0.8.6 to 0.13.0", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-09-15T05:09:01Z", - "updated_at": "2025-09-16T02:11:02Z", - "closed_at": "2025-09-16T02:10:33Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3213", - "html_url": "https://github.com/RDFLib/rdflib/pull/3213", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3213.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3213.patch", - "merged_at": "2025-09-16T02:10:32Z" - }, - "body": "Bumps [ruff](https://github.com/astral-sh/ruff) from 0.8.6 to 0.13.0.\n
\nRelease notes\n

Sourced from ruff's releases.

\n
\n

0.13.0

\n

Release Notes

\n

Check out the blog post for a migration guide and overview of the changes!

\n

Breaking changes

\n
    \n
  • \n

    Several rules can now add from __future__ import annotations automatically

    \n

    TC001, TC002, TC003, RUF013, and UP037 now add from __future__ import annotations as part of their fixes when the lint.future-annotations setting is enabled. This allows the rules to move more imports into TYPE_CHECKING blocks (TC001, TC002, and TC003), use PEP 604 union syntax on Python versions before 3.10 (RUF013), and unquote more annotations (UP037).

    \n
  • \n
  • \n

    Full module paths are now used to verify first-party modules

    \n

    Ruff now checks that the full path to a module exists on disk before categorizing it as a first-party import. This change makes first-party import detection more accurate, helping to avoid false positives on local directories with the same name as a third-party dependency, for example. See the FAQ section on import categorization for more details.

    \n
  • \n
  • \n

    Deprecated rules must now be selected by exact rule code

    \n

    Ruff will no longer activate deprecated rules selected by their group name or prefix. As noted below, the two remaining deprecated rules were also removed in this release, so this won't affect any current rules, but it will still affect any deprecations in the future.

    \n
  • \n
  • \n

    The deprecated macOS configuration directory fallback has been removed

    \n

    Ruff will no longer look for a user-level configuration file at ~/Library/Application Support/ruff/ruff.toml on macOS. This feature was deprecated in v0.5 in favor of using the XDG specification (usually resolving to ~/.config/ruff/ruff.toml), like on Linux. The fallback and accompanying deprecation warning have now been removed.

    \n
  • \n
\n

Removed Rules

\n

The following rules have been removed:

\n\n

Stabilization

\n

The following rules have been stabilized and are no longer in preview:

\n\n

The following behaviors have been stabilized:

\n\n
\n

... (truncated)

\n
\n
\nChangelog\n

Sourced from ruff's changelog.

\n
\n

0.13.0

\n

Check out the blog post for a migration\nguide and overview of the changes!

\n

Breaking changes

\n
    \n
  • \n

    Several rules can now add from __future__ import annotations automatically

    \n

    TC001, TC002, TC003, RUF013, and UP037 now add from __future__ import annotations as part of their fixes when the\nlint.future-annotations setting is enabled. This allows the rules to move\nmore imports into TYPE_CHECKING blocks (TC001, TC002, and TC003),\nuse PEP 604 union syntax on Python versions before 3.10 (RUF013), and\nunquote more annotations (UP037).

    \n
  • \n
  • \n

    Full module paths are now used to verify first-party modules

    \n

    Ruff now checks that the full path to a module exists on disk before\ncategorizing it as a first-party import. This change makes first-party\nimport detection more accurate, helping to avoid false positives on local\ndirectories with the same name as a third-party dependency, for example. See\nthe FAQ\nsection on import categorization for more details.

    \n
  • \n
  • \n

    Deprecated rules must now be selected by exact rule code

    \n

    Ruff will no longer activate deprecated rules selected by their group name\nor prefix. As noted below, the two remaining deprecated rules were also\nremoved in this release, so this won't affect any current rules, but it will\nstill affect any deprecations in the future.

    \n
  • \n
  • \n

    The deprecated macOS configuration directory fallback has been removed

    \n

    Ruff will no longer look for a user-level configuration file at\n~/Library/Application Support/ruff/ruff.toml on macOS. This feature was\ndeprecated in v0.5 in favor of using the XDG\nspecification\n(usually resolving to ~/.config/ruff/ruff.toml), like on Linux. The\nfallback and accompanying deprecation warning have now been removed.

    \n
  • \n
\n

Removed Rules

\n

The following rules have been removed:

\n\n

Stabilization

\n

The following rules have been stabilized and are no longer in preview:

\n\n
\n

... (truncated)

\n
\n
\nCommits\n
    \n
  • a1fdd66 Bump 0.13.0 (#20336)
  • \n
  • 8770b95 [ty] introduce DivergentType (#20312)
  • \n
  • 65982a1 [ty] Use 'unknown' specialization for upper bound on Self (#20325)
  • \n
  • 57d1f71 [ty] Simplify unions of enum literals and subtypes thereof (#20324)
  • \n
  • 7a75702 Ignore deprecated rules unless selected by exact code (#20167)
  • \n
  • 9ca632c Stabilize adding future import via config option (#20277)
  • \n
  • 64fe7d3 [flake8-errmsg] Stabilize extending raw-string-in-exception (EM101) to ...
  • \n
  • beeeb8d Stabilize the remaining Airflow rules (#20250)
  • \n
  • b6fca52 [flake8-bugbear] Stabilize support for non-context-manager calls in `assert...
  • \n
  • ac7f882 [flake8-commas] Stabilize support for trailing comma checks in type paramet...
  • \n
  • Additional commits viewable in compare view
  • \n
\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.6&new-version=0.13.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3213/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3211", - "id": 3416312512, - "node_id": "PR_kwDOADL-3s6ojPQZ", - "number": 3211, - "title": "build(deps-dev): bump pip-tools from 7.4.1 to 7.5.0", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-15T05:08:08Z", - "updated_at": "2025-09-16T01:36:07Z", - "closed_at": "2025-09-16T01:35:09Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3211", - "html_url": "https://github.com/RDFLib/rdflib/pull/3211", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3211.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3211.patch", - "merged_at": "2025-09-16T01:35:09Z" - }, - "body": "Bumps [pip-tools](https://github.com/jazzband/pip-tools) from 7.4.1 to 7.5.0.\n
\nRelease notes\n

Sourced from pip-tools's releases.

\n
\n

v7.5.0

\n

2025-07-30

\n

Bug fixes

\n
    \n
  • \n

    Fixed the ordering of format controls to preserve underlying pip behavior -- by @\u200bsethmlarson.

    \n

    PRs and issues: #2082

    \n
  • \n
  • \n

    Fixed NoCandidateFound exception to be compatible with pip >= 24.1 -- by @\u200bchrysle.

    \n

    PRs and issues: #2083

    \n
  • \n
  • \n

    pip-compile now produces relative paths for editable dependencies -- by @\u200bmacro1.

    \n

    PRs and issues: #2087

    \n
  • \n
  • \n

    Fixed crash failures due to incompatibility with pip >= 25.1 -- by @\u200bgkreitz and @\u200bsirosen.

    \n

    PRs and issues: #2176, #2178

    \n
  • \n
\n

Features

\n
    \n
  • \n

    pip-compile now treats package versions requested on the command line as constraints for the underlying pip usage.\nThis applies to build deps in addition to normal package requirements.

    \n

    -- by @\u200bchrysle

    \n

    PRs and issues: #2106

    \n
  • \n
  • \n

    pip-tools now tests on and officially supports Python 3.12 -- by @\u200bsirosen.

    \n

    PRs and issues: #2188

    \n
  • \n
  • \n

    Requirements file paths in pip-compile output are now normalized to POSIX-style, even when pip-compile is run on Windows.\nThis provides more consistent output across various platforms.

    \n

    -- by @\u200bsirosen

    \n

    PRs and issues: #2195

    \n
  • \n
  • \n

    pip-tools now tests against and supports pip up to version 25.1 -- by @\u200bsirosen.

    \n

    PRs and issues: #2195

    \n
  • \n
\n

Removals and backward incompatible breaking changes

\n
    \n
  • pip-compile will now relativize the requirements paths which are recorded in its output.\nPaths are made relative to the working directory.\nThis provides more consistent results across pip versions.
  • \n
\n\n
\n

... (truncated)

\n
\n
\nChangelog\n

Sourced from pip-tools's changelog.

\n
\n

v7.5.0

\n

2025-07-30

\n

Bug fixes

\n
    \n
  • \n

    Fixed the ordering of format controls to preserve underlying pip behavior\n-- by {user}sethmlarson.

    \n

    PRs and issues: {issue}2082

    \n
  • \n
  • \n

    Fixed NoCandidateFound exception to be compatible with pip >= 24.1\n-- by {user}chrysle.

    \n

    PRs and issues: {issue}2083

    \n
  • \n
  • \n

    pip-compile now produces relative paths for editable dependencies\n-- by {user}macro1.

    \n

    PRs and issues: {issue}2087

    \n
  • \n
  • \n

    Fixed crash failures due to incompatibility with pip >= 25.1\n-- by {user}gkreitz and {user}sirosen.

    \n

    PRs and issues: {issue}2176, {issue}2178

    \n
  • \n
\n

Features

\n
    \n
  • \n

    pip-compile now treats package versions requested on the command line as\nconstraints for the underlying pip usage.\nThis applies to build deps in addition to normal package requirements.

    \n

    -- by {user}chrysle

    \n

    PRs and issues: {issue}2106

    \n
  • \n
  • \n

    pip-tools now tests on and officially supports Python 3.12\n-- by {user}sirosen.

    \n

    PRs and issues: {issue}2188

    \n
  • \n
  • \n

    Requirements file paths in pip-compile output are now normalized to\nPOSIX-style, even when pip-compile is run on Windows.\nThis provides more consistent output across various platforms.

    \n

    -- by {user}sirosen

    \n

    PRs and issues: {issue}2195

    \n
  • \n
  • \n

    pip-tools now tests against and supports pip up to version 25.1

    \n
  • \n
\n\n
\n

... (truncated)

\n
\n
\nCommits\n
    \n
  • debe5a4 Update changelog for version 7.5.0
  • \n
  • 1c7d9fb Merge pull request #2210 from webknjaz/bugfixes/release-env-context-access
  • \n
  • 96ed4d2 Merge pull request #2209 from webknjaz/maintenance/release-attestations-cleanup
  • \n
  • a180dd9 \ud83d\udcdd Link the PR #2209 change note to PR #2149
  • \n
  • 7f9512a \ud83d\udcdd Link the PR #2210 change note to PR #2149
  • \n
  • 396da33 Run the dist build job in PRs
  • \n
  • 7b1c22c Fix accessing repo id in the release workflow
  • \n
  • 05daad6 Drop release attestations for Jazzband upload
  • \n
  • b4ddd75 Merge pull request #2203 from sirosen/use-towncrier
  • \n
  • a136172 Add a run of 'changelog-draft' to QA CI jobs
  • \n
  • Additional commits viewable in compare view
  • \n
\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pip-tools&package-manager=pip&previous-version=7.4.1&new-version=7.5.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3211/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3210", - "id": 3415928342, - "node_id": "PR_kwDOADL-3s6oh7Eb", - "number": 3210, - "title": "chore: address dependabot security vulnerabilities", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": 
"public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-09-15T01:24:49Z", - "updated_at": "2025-09-16T03:18:21Z", - "closed_at": "2025-09-16T03:18:19Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3210", - "html_url": "https://github.com/RDFLib/rdflib/pull/3210", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3210.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3210.patch", - "merged_at": "2025-09-16T03:18:19Z" - }, - "body": "# Summary of changes\r\n\r\n- Upgrade [urllib3](https://pypi.org/project/urllib3/) to `2.5.0`\r\n - https://github.com/RDFLib/rdflib/security/dependabot/25\r\n - https://github.com/RDFLib/rdflib/security/dependabot/27\r\n- Upgrade [requests](https://pypi.org/project/requests/) to `2.32.5`\r\n - https://github.com/RDFLib/rdflib/security/dependabot/24\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep 
your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3210/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3209", - "id": 3392612128, - "node_id": "PR_kwDOADL-3s6nTQkG", - "number": 3209, - "title": "build(deps-dev): bump coverage from 7.8.2 to 7.10.6", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-09-08T05:03:55Z", - "updated_at": "2025-09-12T03:48:06Z", - "closed_at": "2025-09-12T03:48:04Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3209", - "html_url": "https://github.com/RDFLib/rdflib/pull/3209", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3209.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3209.patch", - "merged_at": "2025-09-12T03:48:04Z" - }, - "body": "Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.8.2 to 7.10.6.\n
Changelog

Sourced from coverage's changelog.

Version 7.10.6 — 2025-08-29

- Fix: source directories were not properly communicated to subprocesses that ran in different directories, as reported in issue 1499_. This is now fixed.
- Performance: Alex Gaynor continues fine-tuning <pull 2038_>_ the speed of combination, especially with many contexts.

.. _issue 1499: nedbat/coveragepy#1499
.. _pull 2038: nedbat/coveragepy#2038

.. _changes_7-10-5:

Version 7.10.5 — 2025-08-23

- Big speed improvements for coverage combine: it's now about twice as fast! Huge thanks to Alex Gaynor for pull requests 2032 <pull 2032_>, 2033 <pull 2033_>, and 2034 <pull 2034_>_.

.. _pull 2032: nedbat/coveragepy#2032
.. _pull 2033: nedbat/coveragepy#2033
.. _pull 2034: nedbat/coveragepy#2034

.. _changes_7-10-4:

Version 7.10.4 — 2025-08-16

- Added patch = fork for times when the built-in forking support is insufficient.
- Fix: patch = execv also inherits the entire coverage configuration now.

.. _changes_7-10-3:

Version 7.10.3 — 2025-08-10

- Fixes for patch = subprocess:
  - If subprocesses spawned yet more subprocesses simultaneously, some coverage could be missed. This is now fixed, closing issue 2024_.
  - If subprocesses were created in other directories, their data files were

... (truncated)

Commits

- 88c55ff docs: sample HTML for 7.10.6
- 01d8995 docs: prep for 7.10.6
- 9b0c24f docs: thanks Alex #2038
- 66d6910 fix: make source paths absolute where they exist. #1499
- bb3382f build: no need for the combine/html times now
- 9ea349a lab: warn_executed.py
- 808c9b4 build: changing metacov.ini should trigger metacov
- 384f5f2 build: oops, some 'if's are really line pragmas
- a7224af perf: pre-compute the mapping between other_db.context and main.context (#2038)
- 5c00c5b chore: bump the action-dependencies group with 3 updates (#2039)
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coverage&package-manager=pip&previous-version=7.8.2&new-version=7.10.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
Dependabot commands and options
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3209/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3208", - "id": 3392611610, - "node_id": "PR_kwDOADL-3s6nTQc1", - "number": 3208, - "title": "build(deps-dev): bump mkdocs-material from 9.6.14 to 9.6.19", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-09-08T05:03:41Z", - "updated_at": "2025-09-12T04:00:23Z", - "closed_at": "2025-09-12T03:59:59Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3208", - "html_url": "https://github.com/RDFLib/rdflib/pull/3208", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3208.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3208.patch", - "merged_at": "2025-09-12T03:59:59Z" - }, - "body": "Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.6.14 to 9.6.19.\n
Release notes

Sourced from mkdocs-material's releases.

mkdocs-material-9.6.19
- Added support for Python 3.14
- Updated Bahasa Malaysia translations

mkdocs-material-9.6.18
- Updated Azerbaijani translations
- Fixed last compat issues with minijinja, now 100% compatible

mkdocs-material-9.6.17
- Fixed #8396: Videos do not autoplay when inside a content tab
- Fixed #8394: Stroke width not effective in Mermaid.js diagrams
- Fixed disappearing version selector when hiding page title

mkdocs-material-9.6.16
- Fixed #8349: Info plugin doesn't correctly detect virtualenv in some cases
- Fixed #8334: Find-in-page detects matches in hidden search result list

mkdocs-material-9.6.15
- Updated Mongolian translations
- Improved semantic markup of "edit this page" button
- Improved info plugin virtual environment resolution
- Fixed #8291: Large font size setting throws of breakpoints in JavaScript

Changelog

Sourced from mkdocs-material's changelog.

mkdocs-material-9.6.19 (2025-09-07)
- Added support for Python 3.14
- Updated Bahasa Malaysia translations

mkdocs-material-9.6.18 (2025-08-22)
- Updated Azerbaijani translations
- Fixed last compat issues with [minijinja], now 100% compatible

mkdocs-material-9.6.17 (2025-08-15)
- Fixed #8396: Videos do not autoplay when inside a content tab
- Fixed #8394: Stroke width not effective in Mermaid.js diagrams
- Fixed disappearing version selector when hiding page title

mkdocs-material-9.6.16 (2025-07-26)
- Fixed #8349: Info plugin doesn't correctly detect virtualenv in some cases
- Fixed #8334: Find-in-page detects matches in hidden search result list

mkdocs-material-9.6.15 (2025-07-01)
- Updated Mongolian translations
- Improved semantic markup of "edit this page" button
- Improved info plugin virtual environment resolution
- Fixed #8291: Large font size setting throws of breakpoints in JavaScript

mkdocs-material-9.6.14 (2025-05-13)
- Fixed #8215: Social plugin crashes when CairoSVG is updated to 2.8

mkdocs-material-9.6.13 (2025-05-10)
- Fixed #8204: Annotations showing list markers in print view
- Fixed #8153: Improve style of cardinality symbols in Mermaid.js ER diagrams

mkdocs-material-9.6.12 (2025-04-17)
- Fixed #8158: Flip footnote back reference icon for right-to-left languages

mkdocs-material-9.6.11 (2025-04-01)
- Updated Docker image to latest Alpine Linux
- Bump required Jinja version to 3.1
- Fixed #8133: Jinja filter items not available (9.6.10 regression)
- Fixed #8128: Search plugin not entirely disabled via enabled setting

mkdocs-material-9.6.10 (2025-03-30)

... (truncated)

Commits

- 2fe55ee Prepare 9.6.19 release
- c9d5303 Documentation
- 3a0cea1 Bump actions/upload-pages-artifact from 3 to 4
- 3026a57 Bump actions/checkout from 4 to 5
- cb1fc6f Updated dependencies
- 1f3c48e Fixed pillow version range
- 13c9c77 Added pillow 11 to supported version range
- 0d262ec Documentation
- 97ae22f Updated Premium sponsors
- ee6484e Updated Premium sponsors
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocs-material&package-manager=pip&previous-version=9.6.14&new-version=9.6.19)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
Dependabot commands and options
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3208/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3207", - "id": 3392609945, - "node_id": "PR_kwDOADL-3s6nTQFT", - "number": 3207, - "title": "build(deps-dev): bump mkdocs-include-markdown-plugin from 7.1.5 to 7.1.7", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-08T05:02:56Z", - "updated_at": "2025-09-12T04:23:57Z", - "closed_at": "2025-09-12T04:23:40Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3207", - "html_url": "https://github.com/RDFLib/rdflib/pull/3207", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3207.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3207.patch", - "merged_at": "2025-09-12T04:23:40Z" - }, - "body": "Bumps [mkdocs-include-markdown-plugin](https://github.com/mondeja/mkdocs-include-markdown-plugin) from 7.1.5 to 7.1.7.\n
Release notes

Sourced from mkdocs-include-markdown-plugin's releases.

v7.1.7

Bug fixes
- Fix passing negative values to heading-offset argument of include-markdown directive.

v7.1.6

Bug fixes
- Fix internal anchor in included file incorrectly rewritten.

Commits
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocs-include-markdown-plugin&package-manager=pip&previous-version=7.1.5&new-version=7.1.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
Dependabot commands and options
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3207/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3206", - "id": 3392609909, - "node_id": "PR_kwDOADL-3s6nTQEx", - "number": 3206, - "title": "build(deps): bump actions/setup-python from 5 to 6", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4132956439, - "node_id": "LA_kwDOADL-3s72V-kX", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", - "name": "github_actions", - "color": "000000", - "default": false, - "description": "Pull requests that update GitHub Actions code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 4, - "created_at": "2025-09-08T05:02:55Z", - "updated_at": "2025-09-12T04:12:29Z", - "closed_at": "2025-09-12T04:11:27Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3206", - "html_url": "https://github.com/RDFLib/rdflib/pull/3206", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3206.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3206.patch", - "merged_at": "2025-09-12T04:11:26Z" - }, - "body": "Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.\n
Release notes

Sourced from actions/setup-python's releases.

v6.0.0

What's Changed

Breaking Changes

Make sure your runner is on version v2.327.1 or later to ensure compatibility with this release. See Release Notes

Enhancements:

Bug fixes:

Dependency updates:

New Contributors

Full Changelog: https://github.com/actions/setup-python/compare/v5...v6.0.0

v5.6.0

What's Changed

Full Changelog: https://github.com/actions/setup-python/compare/v5...v5.6.0

v5.5.0

What's Changed

Enhancements:

Bug fixes:

... (truncated)

Commits

- e797f83 Upgrade to node 24 (#1164)
- 3d1e2d2 Revert "Enhance cache-dependency-path handling to support files outside the w...
- 65b0712 Clarify pythonLocation behavior for PyPy and GraalPy in environment variables...
- 5b668cf Bump actions/checkout from 4 to 5 (#1181)
- f62a0e2 Change missing cache directory error to warning (#1182)
- 9322b3c Upgrade setuptools to 78.1.1 to fix path traversal vulnerability in PackageIn...
- fbeb884 Bump form-data to fix critical vulnerabilities #182 & #183 (#1163)
- 03bb615 Bump idna from 2.9 to 3.7 in /tests/data (#843)
- 36da51d Add version parsing from Pipfile (#1067)
- 3c6f142 update documentation (#1156)
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/setup-python&package-manager=github_actions&previous-version=5&new-version=6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
Dependabot commands and options
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3206/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3205", - "id": 3392609468, - "node_id": "PR_kwDOADL-3s6nTP-k", - "number": 3205, - "title": "build(deps-dev): bump pytest from 8.3.5 to 8.4.2", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-09-08T05:02:44Z", - "updated_at": "2025-09-12T04:35:58Z", - "closed_at": "2025-09-12T04:34:37Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3205", - "html_url": "https://github.com/RDFLib/rdflib/pull/3205", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3205.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3205.patch", - "merged_at": "2025-09-12T04:34:37Z" - }, - "body": "Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.5 to 8.4.2.\n
Release notes (sourced from pytest's releases):

8.4.2 — pytest 8.4.2 (2025-09-03)

Bug fixes
- #13478: Fixed a crash when using `console_output_style` with `times` and a module is skipped.
- #13530: Fixed a crash when using `pytest.approx` and `decimal.Decimal` instances with the `decimal.FloatOperation` trap set.
- #13549: No longer evaluate type annotations in Python 3.14 when inspecting function signatures. This prevents crashes during module collection when modules do not explicitly use `from __future__ import annotations` and import types for annotations within an `if TYPE_CHECKING:` block.
- #13559: Added missing `int` and `float` variants to the `Literal` type annotation of the `type` parameter in `pytest.Parser.addini`.
- #13563: `pytest.approx` now only imports `numpy` if NumPy is already in `sys.modules`. This fixes unconditional import behavior introduced in 8.4.0.

Improved documentation
- #13577: Clarify that `pytest_generate_tests` is discovered in test modules/classes; other hooks must be in `conftest.py` or plugins.

Contributor-facing changes
- #13480: Self-testing: fixed a few test failures when run with `-Wdefault` or a similar override.
- #13547: Self-testing: corrected expected message for `test_doctest_unexpected_exception` in Python 3.14.
- #13684: Make pytest's own testsuite insensitive to the presence of the `CI` environment variable — by ogrisel.

8.4.1 — pytest 8.4.1 (2025-06-17)

Bug fixes
- #13461: Corrected `_pytest.terminal.TerminalReporter.isatty` to support being called as a method. Before, it was just a boolean, which could break correct code when using `-o log_cli=true`.
- #13477: Reintroduced `pytest.PytestReturnNotNoneWarning`, which was removed by accident in pytest 8.4. This warning is raised when a test function returns a value other than None, which is often a mistake made by beginners. See return-not-none for more information.
- #13497: Fixed compatibility with Twisted 25+.

Improved documentation
- #13492: Fixed outdated warning about faulthandler not working on Windows.

8.4.0 — pytest 8.4.0 (2025-06-02)

... (truncated)

Commits
- bfae422 Prepare release version 8.4.2
- 8990538 Fix passenv CI in tox ini and make tests insensitive to the presence of the C...
- ca676bf Merge pull request #13687 from pytest-dev/patchback/backports/8.4.x/e63f6e51c...
- 975a60a Merge pull request #13686 from pytest-dev/patchback/backports/8.4.x/12bde8af6...
- 7723ce8 Merge pull request #13683 from even-even/fix_Exeption_to_Exception_in_errorMe...
- b7f0568 Merge pull request #13685 from CoretexShadow/fix/docs-pytest-generate-tests
- 2c94c4a add missing colon (#13640) (#13641)
- c3d7684 Merge pull request #13606 from pytest-dev/patchback/backports/8.4.x/5f9938563...
- dc6e3be Merge pull request #13605 from The-Compiler/training-update-2025-07
- f87289c Fix crash with times output style and skipped module (#13573) (#13579)
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest&package-manager=pip&previous-version=8.3.5&new-version=8.4.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3205/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3204", - "id": 3392609060, - "node_id": "PR_kwDOADL-3s6nTP41", - "number": 3204, - "title": "build(deps-dev): bump typing-extensions from 4.13.2 to 4.15.0", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-08T05:02:32Z", - "updated_at": "2025-09-12T05:00:39Z", - "closed_at": "2025-09-12T04:59:56Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3204", - "html_url": "https://github.com/RDFLib/rdflib/pull/3204", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3204.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3204.patch", - "merged_at": "2025-09-12T04:59:56Z" - }, - "body": "Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.13.2 to 4.15.0.\n
Release notes (sourced from typing-extensions's releases):

4.15.0

No user-facing changes since 4.15.0rc1.

New features since 4.14.1:
- Add the `@typing_extensions.disjoint_base` decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add `typing_extensions.type_repr`, a backport of `annotationlib.type_repr`, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in `typing_extensions.evaluate_forward_ref`. Backport of CPython PR #137227 by Jelle Zijlstra.

4.15.0rc1
- Add the `@typing_extensions.disjoint_base` decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add `typing_extensions.type_repr`, a backport of `annotationlib.type_repr`, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in `typing_extensions.evaluate_forward_ref`. Backport of CPython PR #137227 by Jelle Zijlstra.

4.14.1 — Release 4.14.1 (July 4, 2025)
- Fix usage of `typing_extensions.TypedDict` nested inside other types (e.g., `typing.Type[typing_extensions.TypedDict]`). This is not allowed by the type system but worked on older versions, so we maintain support.

4.14.0

This release adds several new features, including experimental support for inline typed dictionaries (PEP 764) and sentinels (PEP 661), and support for changes in Python 3.14. In addition, Python 3.8 is no longer supported.

Changes since 4.14.0rc1:
- Remove `__or__` and `__ror__` methods from `typing_extensions.Sentinel` on Python versions <3.10. PEP 604 was introduced in Python 3.10, and typing_extensions does not generally attempt to backport PEP-604 methods to prior versions.
- Further update `typing_extensions.evaluate_forward_ref` with changes in Python 3.14.

Changes included in 4.14.0rc1:
- Drop support for Python 3.8 (including PyPy-3.8). Patch by Victorien Plot.
- Do not attempt to re-export names that have been removed from `typing`, anticipating the removal of `typing.no_type_check_decorator` in Python 3.15. Patch by Jelle Zijlstra.
- Update `typing_extensions.Format`, `typing_extensions.evaluate_forward_ref`, and `typing_extensions.TypedDict` to align

... (truncated)

Changelog (sourced from typing-extensions's changelog):

Release 4.15.0 (August 25, 2025)

No user-facing changes since 4.15.0rc1.

Release 4.15.0rc1 (August 18, 2025)
- Add the `@typing_extensions.disjoint_base` decorator, as specified in PEP 800. Patch by Jelle Zijlstra.
- Add `typing_extensions.type_repr`, a backport of `annotationlib.type_repr`, introduced in Python 3.14 (CPython PR #124551, originally by Jelle Zijlstra). Patch by Semyon Moroz.
- Fix behavior of type params in `typing_extensions.evaluate_forward_ref`. Backport of CPython PR #137227 by Jelle Zijlstra.

Release 4.14.1 (July 4, 2025)
- Fix usage of `typing_extensions.TypedDict` nested inside other types (e.g., `typing.Type[typing_extensions.TypedDict]`). This is not allowed by the type system but worked on older versions, so we maintain support.

Release 4.14.0 (June 2, 2025)

Changes since 4.14.0rc1:
- Remove `__or__` and `__ror__` methods from `typing_extensions.Sentinel` on Python versions <3.10. PEP 604 was introduced in Python 3.10, and typing_extensions does not generally attempt to backport PEP-604 methods to prior versions.
- Further update `typing_extensions.evaluate_forward_ref` with changes in Python 3.14.

Release 4.14.0rc1 (May 24, 2025)
- Drop support for Python 3.8 (including PyPy-3.8). Patch by Victorien Plot.
- Do not attempt to re-export names that have been removed from `typing`, anticipating the removal of `typing.no_type_check_decorator` in Python 3.15. Patch by Jelle Zijlstra.
- Update `typing_extensions.Format`, `typing_extensions.evaluate_forward_ref`, and `typing_extensions.TypedDict` to align with changes in Python 3.14. Patches by Jelle Zijlstra.
- Fix tests for Python 3.14 and 3.15. Patches by Jelle Zijlstra.

New features:
- Add support for inline typed dictionaries (PEP 764). Patch by Victorien Plot.
- Add `typing_extensions.Reader` and `typing_extensions.Writer`. Patch by Sebastian Rittau.
- Add support for sentinels (PEP 661). Patch by Victorien Plot.

Commits
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typing-extensions&package-manager=pip&previous-version=4.13.2&new-version=4.15.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3204/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3203", - "id": 3392607684, - "node_id": "PR_kwDOADL-3s6nTPlp", - "number": 3203, - "title": "build(deps-dev): bump pytest-cov from 6.1.1 to 6.3.0", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-09-08T05:01:54Z", - "updated_at": "2025-09-12T05:12:59Z", - "closed_at": "2025-09-12T05:12:15Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3203", - "html_url": "https://github.com/RDFLib/rdflib/pull/3203", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3203.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3203.patch", - "merged_at": "2025-09-12T05:12:15Z" - }, - "body": "Bumps [pytest-cov](https://github.com/pytest-dev/pytest-cov) from 6.1.1 to 6.3.0.\n
Changelog (sourced from pytest-cov's changelog):

6.3.0 (2025-09-06)
- Added support for markdown reports. Contributed by Marcos Boger in #712 and #714.
- Fixed some formatting issues in docs. Anonymous contribution in #706.

6.2.1 (2025-06-12)
- Added a version requirement for pytest's pluggy dependency (1.2.0, released 2023-06-21) that has the required new-style hookwrapper API.
- Removed deprecated license classifier (packaging).
- Disabled coverage warnings in two more situations where they have no value:
  - "module-not-measured" in workers
  - "already-imported" in subprocesses

6.2.0 (2025-06-11)
- The plugin now adds 3 rules to the filterwarnings configuration to prevent common coverage warnings being raised as obscure errors:

      default:unclosed database in <sqlite3.Connection object at:ResourceWarning
      once::PytestCovWarning
      once::CoverageWarning

  This fixes most of the bad interactions occurring on pytest 8.4 with filterwarnings=error.

  The plugin will check whether there are already matching rules for the 3 categories (ResourceWarning, PytestCovWarning, CoverageWarning) and message (unclosed database in <sqlite3.Connection object at) before adding the filters.

  This means you can have this in your pytest configuration for complete oblivion (not recommended, if that is not clear):

      filterwarnings = [
          "error",
          "ignore:unclosed database in <sqlite3.Connection object at:ResourceWarning",
          "ignore::PytestCovWarning",
          "ignore::CoverageWarning",
      ]

Commits
- a69d1ab Bump version: 6.2.1 → 6.3.0
- 475bf32 Update changelog.
- 3834009 Add GitHub Actions example and fix example to not break with default markdown...
- 0824728 Small phrasing adustments in Markdown docs
- 474c1f4 Move markdown dest files check to StoreReport for earlier error and parser.er...
- 7b21833 Default markdown-append to coverage.md and raise warning if both markdown opt...
- 3a15312 Fix usage of Path.open() to write/append to files
- 4b79449 Change output file cov-append.md in md-append example
- 40e9e8e Add docs and update AUTHORS.rst
- f5ca33a Add tests for markdown and markdown-append
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest-cov&package-manager=pip&previous-version=6.1.1&new-version=6.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3203/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3202", - "id": 3382370163, - "node_id": "PR_kwDOADL-3s6myRLC", - "number": 3202, - "title": "Merge 7-maintenance changes into main", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - 
"site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-09-04T06:37:03Z", - "updated_at": "2025-09-09T03:33:26Z", - "closed_at": "2025-09-09T03:33:25Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3202", - "html_url": "https://github.com/RDFLib/rdflib/pull/3202", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3202.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3202.patch", - "merged_at": "2025-09-09T03:33:25Z" - }, - "body": "# Summary of changes\r\n\r\nThis PR integrates the recent features and bug fixes from `7-maintenance` branch into `main`. This will be merged when the 7.2.0 version is released.\r\n\r\nMany merge conflicts were resolved and all tests and checks are passing.\r\n\r\nThis PR supersedes https://github.com/RDFLib/rdflib/pull/3199.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - 
"reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3202/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3197", - "id": 3378517619, - "node_id": "PR_kwDOADL-3s6mlSqI", - "number": 3197, - "title": "feat: canonicalization with longturtle serializer now optional", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - 
"user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-09-03T06:41:22Z", - "updated_at": "2025-09-08T01:31:53Z", - "closed_at": "2025-09-08T01:31:52Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3197", - "html_url": "https://github.com/RDFLib/rdflib/pull/3197", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3197.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3197.patch", - "merged_at": "2025-09-08T01:31:52Z" - }, - "body": "Fixes https://github.com/RDFLib/rdflib/issues/3196\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 
0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3197/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3195", - "id": 3365279463, - "node_id": "PR_kwDOADL-3s6l6NCn", - "number": 3195, - "title": "Revert \"remove old hacks against 2to3 (#3076)\"", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 7242799529, - "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", - "name": "7.1", - "color": "FC7848", - 
"default": false, - "description": "Issues planned to fix in v7.1" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-08-29T03:03:18Z", - "updated_at": "2025-08-29T03:50:11Z", - "closed_at": "2025-08-29T03:50:10Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3195", - "html_url": "https://github.com/RDFLib/rdflib/pull/3195", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3195.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3195.patch", - "merged_at": "2025-08-29T03:50:10Z" - }, - "body": "This reverts commit b74c6574fd982b410aed1aa43853eed37504bf15.\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nFixes https://github.com/RDFLib/rdflib/issues/3193\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 
0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3195/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3194", - "id": 3365240233, - "node_id": "PR_kwDOADL-3s6l6E02", - "number": 3194, - "title": "Fix failing webtest", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 7242799529, - "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", 
- "name": "7.1", - "color": "FC7848", - "default": false, - "description": "Issues planned to fix in v7.1" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-08-29T02:35:40Z", - "updated_at": "2025-08-29T05:33:39Z", - "closed_at": "2025-08-29T05:33:36Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3194", - "html_url": "https://github.com/RDFLib/rdflib/pull/3194", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3194.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3194.patch", - "merged_at": "2025-08-29T05:33:36Z" - }, - "body": "\r\n\r\n# Summary of changes\r\n\r\nFixes https://github.com/RDFLib/rdflib/issues/3192\r\n\r\nNote: this is a cascading PR and includes https://github.com/RDFLib/rdflib/pull/3195 to ensure all fixes to tests are applied before merging into `7-maintenance` branch.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to 
date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3194/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3189", - "id": 3351175415, - "node_id": "PR_kwDOADL-3s6lLaSS", - "number": 3189, - "title": "build(deps): bump actions/setup-java from 4 to 5", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4132956439, - "node_id": "LA_kwDOADL-3s72V-kX", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", - "name": "github_actions", - "color": "000000", - "default": false, - "description": "Pull requests that update GitHub Actions code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-08-25T09:48:26Z", - "updated_at": "2025-09-12T05:24:57Z", - "closed_at": "2025-09-12T05:24:16Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3189", - "html_url": "https://github.com/RDFLib/rdflib/pull/3189", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3189.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3189.patch", - "merged_at": "2025-09-12T05:24:16Z" - }, - "body": "Bumps [actions/setup-java](https://github.com/actions/setup-java) from 4 to 5.\n
\nRelease notes\nSourced from actions/setup-java's releases.\n\nv5.0.0\n\nWhat's Changed\n\nBreaking Changes\n\nMake sure your runner is updated to this version or newer to use this release. v2.327.1 Release Notes\n\nDependency Upgrades\n\nBug Fixes\n\nNew Contributors\n\nFull Changelog: https://github.com/actions/setup-java/compare/v4...v5.0.0\n\nv4.7.1\n\nWhat's Changed\n\nDocumentation changes\n\nDependency updates:\n\nFull Changelog: https://github.com/actions/setup-java/compare/v4...v4.7.1\n\nv4.7.0\n\nWhat's Changed\n\n... (truncated)\n\nCommits\n- dded088 Bump actions/checkout from 4 to 5 (#896)\n- 0913e9a Upgrade to node 24 (#888)\n- e9343db Bumps form-data (#887)\n- ae2b61d Bump undici from 5.28.5 to 5.29.0 (#833)\n- c190c18 Bump eslint-plugin-jest from 27.9.0 to 29.0.1 (#730)\n- 67aec00 Fix: prevent default installation of JetBrains pre-releases (#859)\n- ebb356c Improve Error Handling for Setup-Java Action to Help Debug Intermittent Failu...\n- f4f1212 Update publish-immutable-actions.yml (#798)\n- See full diff in compare view\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/setup-java&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3189/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3185", - "id": 3329618398, - "node_id": "PR_kwDOADL-3s6kDWYr", - "number": 3185, - "title": "build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/latest", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-08-18T07:56:51Z", - "updated_at": "2025-09-12T05:36:21Z", - "closed_at": "2025-09-12T05:35:57Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3185", - "html_url": "https://github.com/RDFLib/rdflib/pull/3185", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3185.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3185.patch", - "merged_at": "2025-09-12T05:35:57Z" - }, - "body": "Bumps library/python from 3.13.3-slim to 3.13.7-slim.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.3-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3185/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3184", - "id": 3329609992, - "node_id": "PR_kwDOADL-3s6kDUnS", - "number": 3184, - "title": "build(deps): bump actions/checkout from 4 to 5", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4132956439, - "node_id": "LA_kwDOADL-3s72V-kX", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/github_actions", - "name": "github_actions", - "color": "000000", - "default": false, - "description": "Pull requests that update GitHub Actions code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-08-18T07:54:08Z", - "updated_at": "2025-09-12T05:46:47Z", - "closed_at": "2025-09-12T05:46:06Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3184", - "html_url": "https://github.com/RDFLib/rdflib/pull/3184", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3184.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3184.patch", - "merged_at": "2025-09-12T05:46:06Z" - }, - "body": "Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.\n
\nRelease notes\nSourced from actions/checkout's releases.\n\nv5.0.0\n\nWhat's Changed\n\n\u26a0\ufe0f Minimum Compatible Runner Version\n\nv2.327.1 Release Notes\n\nMake sure your runner is updated to this version or newer to use this release.\n\nFull Changelog: https://github.com/actions/checkout/compare/v4...v5.0.0\n\nv4.3.0\n\nWhat's Changed\n\nNew Contributors\n\nFull Changelog: https://github.com/actions/checkout/compare/v4...v4.3.0\n\nv4.2.2\n\nWhat's Changed\n\nFull Changelog: https://github.com/actions/checkout/compare/v4.2.1...v4.2.2\n\nv4.2.1\n\nWhat's Changed\n\nNew Contributors\n\nFull Changelog: https://github.com/actions/checkout/compare/v4.2.0...v4.2.1\n\n... (truncated)\n\nChangelog\nSourced from actions/checkout's changelog.\n\nChangelog\n\nV5.0.0\n\nV4.3.0\n\nv4.2.2\n\nv4.2.1\n\nv4.2.0\n\nv4.1.7\n\nv4.1.6\n\nv4.1.5\n\nv4.1.4\n\nv4.1.3\n\n... (truncated)\n\nCommits\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/checkout&package-manager=github_actions&previous-version=4&new-version=5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3184/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3183", - "id": 3329595517, - "node_id": "PR_kwDOADL-3s6kDRhT", - "number": 3183, - "title": "build(deps): bump library/python from 3.13.3-slim to 3.13.7-slim in /docker/unstable", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-08-18T07:49:17Z", - "updated_at": "2025-09-12T05:58:56Z", - "closed_at": "2025-09-12T05:57:58Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3183", - "html_url": "https://github.com/RDFLib/rdflib/pull/3183", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3183.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3183.patch", - "merged_at": "2025-09-12T05:57:58Z" - }, - "body": "Bumps library/python from 3.13.3-slim to 3.13.7-slim.\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.3-slim&new-version=3.13.7-slim)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. 
You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3183/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3182", - "id": 3328829189, - "node_id": "PR_kwDOADL-3s6kAyCB", - "number": 3182, - "title": "Fix #3181", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": "https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - 
}, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-08-18T00:52:09Z", - "updated_at": "2025-08-18T00:52:18Z", - "closed_at": "2025-08-18T00:52:17Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3182", - "html_url": "https://github.com/RDFLib/rdflib/pull/3182", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3182.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3182.patch", - "merged_at": "2025-08-18T00:52:17Z" - }, - "body": "README link fix", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3182/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3178", - "id": 3311960710, - "node_id": "PR_kwDOADL-3s6jJe79", - "number": 3178, - "title": "Creation of an RDFLib Charter", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": 
"https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-08-11T23:35:05Z", - "updated_at": "2025-08-18T00:43:02Z", - "closed_at": "2025-08-18T00:43:00Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3178", - "html_url": "https://github.com/RDFLib/rdflib/pull/3178", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3178.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3178.patch", - "merged_at": "2025-08-18T00:43:00Z" - }, - "body": "Edits of the Contributing guidelines to streamline their advice and to add a Charter that states RDFLib's community's principles.", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3178/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3177", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3177", - "id": 3309552816, - "node_id": "PR_kwDOADL-3s6jBhWX", - "number": 3177, - "title": "Feature: Add Tentris Plugin to docs", - "user": { - "login": "bigerl", - "id": 933146, - "node_id": "MDQ6VXNlcjkzMzE0Ng==", - "avatar_url": "https://avatars.githubusercontent.com/u/933146?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/bigerl", - "html_url": "https://github.com/bigerl", - "followers_url": "https://api.github.com/users/bigerl/followers", - "following_url": "https://api.github.com/users/bigerl/following{/other_user}", - "gists_url": "https://api.github.com/users/bigerl/gists{/gist_id}", - "starred_url": "https://api.github.com/users/bigerl/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/bigerl/subscriptions", - "organizations_url": "https://api.github.com/users/bigerl/orgs", - "repos_url": "https://api.github.com/users/bigerl/repos", - "events_url": "https://api.github.com/users/bigerl/events{/privacy}", - "received_events_url": "https://api.github.com/users/bigerl/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-08-11T11:09:53Z", - "updated_at": "2025-08-11T23:43:05Z", - "closed_at": "2025-08-11T23:43:05Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3177", 
- "html_url": "https://github.com/RDFLib/rdflib/pull/3177", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3177.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3177.patch", - "merged_at": "2025-08-11T23:43:05Z" - }, - "body": "# Summary of changes\r\n\r\nAt Tentris, we developed a plugin that allows users to run their `rdflib.Graph` (1) with a native in-memory Tentris instance and (2) connect it to an Tentris SPARQL HTTP endpoint. \r\n\r\nI have added it to the list of Plugins. \r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change has a potential impact on users of this project:\r\n - [x] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n \r\n **Note: Some organization policy seems to prevent that. 
If anybody is aware how I can adjust that I am happy to change it.**\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3177/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3176", - "id": 3309030353, - "node_id": "PR_kwDOADL-3s6i_zhJ", - "number": 3176, - "title": "build(deps): bump poetry from 2.0.0 to 2.1.4 in /devtools", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 4, - "created_at": "2025-08-11T08:40:57Z", - "updated_at": "2025-09-16T06:01:05Z", - "closed_at": "2025-09-16T06:00:35Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3176", - "html_url": "https://github.com/RDFLib/rdflib/pull/3176", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3176.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3176.patch", - "merged_at": "2025-09-16T06:00:35Z" - }, - "body": "Bumps [poetry](https://github.com/python-poetry/poetry) from 2.0.0 to 2.1.4.\n
Release notes

Sourced from poetry's releases.

2.1.4

Changed
- Require virtualenv<20.33 to work around an issue where Poetry uses the wrong Python version (#10491).
- Improve the error messages for the validation of the pyproject.toml file (#10471).

Fixed
- Fix an issue where project plugins were installed even though poetry install was called with --no-plugins (#10405).
- Fix an issue where dependency resolution failed for self-referential extras with duplicate dependencies (#10488).

Docs
- Clarify how to include files that were automatically excluded via VCS ignore settings (#10442).
- Clarify the behavior of poetry add if no version constraint is explicitly specified (#10445).

2.1.3

Changed
- Require importlib-metadata<8.7 for Python 3.9 because of a breaking change in importlib-metadata 8.7 (#10374).

Fixed
- Fix an issue where re-locking failed for incomplete multiple-constraints dependencies with explicit sources (#10324).
- Fix an issue where the --directory option did not work if a plugin, which accesses the poetry instance during its activation, was installed (#10352).
- Fix an issue where poetry env activate -v printed additional information to stdout instead of stderr so that the output could not be used as designed (#10353).
- Fix an issue where the original error was not printed if building a git dependency failed (#10366).
- Fix an issue where wheels for the wrong platform were installed in rare cases (#10361).

poetry-core (2.1.3)
- Fix an issue where the union of specific inverse or partially inverse markers was not simplified (#858).
- Fix an issue where optional dependencies defined in the project section were treated as non-optional when a source was defined for them in the tool.poetry section (#857).
- Fix an issue where markers with === were not parsed correctly (#860).
- Fix an issue where local versions with upper case letters caused an error (#859).
- Fix an issue where extra markers with a value starting with "in" were not validated correctly (#862).

2.1.2

Changed
- Improve performance of locking dependencies (#10275).

Fixed
- Fix an issue where markers were not locked correctly (#10240).
- Fix an issue where the result of poetry lock was not deterministic (#10276).
- Fix an issue where poetry env activate returned the wrong command for tcsh (#10243).
- Fix an issue where poetry env activate returned the wrong command for pwsh on Linux (#10256).

Docs

... (truncated)

Changelog

Sourced from poetry's changelog.

[2.1.4] - 2025-08-05

Changed
- Require virtualenv<20.33 to work around an issue where Poetry uses the wrong Python version (#10491).
- Improve the error messages for the validation of the pyproject.toml file (#10471).

Fixed
- Fix an issue where project plugins were installed even though poetry install was called with --no-plugins (#10405).
- Fix an issue where dependency resolution failed for self-referential extras with duplicate dependencies (#10488).

Docs
- Clarify how to include files that were automatically excluded via VCS ignore settings (#10442).
- Clarify the behavior of poetry add if no version constraint is explicitly specified (#10445).

[2.1.3] - 2025-05-04

Changed
- Require importlib-metadata<8.7 for Python 3.9 because of a breaking change in importlib-metadata 8.7 (#10374).

Fixed
- Fix an issue where re-locking failed for incomplete multiple-constraints dependencies with explicit sources (#10324).
- Fix an issue where the --directory option did not work if a plugin, which accesses the poetry instance during its activation, was installed (#10352).
- Fix an issue where poetry env activate -v printed additional information to stdout instead of stderr so that the output could not be used as designed (#10353).
- Fix an issue where the original error was not printed if building a git dependency failed (#10366).
- Fix an issue where wheels for the wrong platform were installed in rare cases (#10361).

poetry-core (2.1.3)
- Fix an issue where the union of specific inverse or partially inverse markers was not simplified (#858).
- Fix an issue where optional dependencies defined in the project section were treated as non-optional when a source was defined for them in the tool.poetry section (#857).
- Fix an issue where markers with === were not parsed correctly (#860).
- Fix an issue where local versions with upper case letters caused an error (#859).
- Fix an issue where extra markers with a value starting with "in" were not validated correctly (#862).

[2.1.2] - 2025-03-29

Changed
- Improve performance of locking dependencies (#10275).

Fixed
- Fix an issue where markers were not locked correctly (#10240).

... (truncated)

Commits
- a8f0889 release: bump version to 2.1.4
- 683fd83 fix: adjust virtualenv constraint in pyproject.toml to < 20.33.0 (#10491)
- 501346e solver: fix dependency resolution for self-referential extras with duplicate ...
- c9e8a4c fix deprecated parts in pyproject example in README (#10479)
- 2855b2e Fix test_python_get_preferred_default for rc Python releases (#10478)
- 9ee000a improve pyproject.toml validation error messages by replacing data with `to...
- 6d6c2f1 docs: update unspecified version docs for add (#10445)
- 5e58233 Documentation: Clarified negating VCS excluded files (#10442)
- ac51717 fix: typo in dependency-specification.md (#10427)
- c1220a7 Add missing tmp_venv mock to test_no_additional_output_in_verbose_mode (#10397)
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=poetry&package-manager=pip&previous-version=2.0.0&new-version=2.1.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nYou can trigger a rebase of this PR by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\nDependabot will merge this PR once it's up-to-date and CI passes on it, as requested by @edmondchuc.\n\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
\n\n> **Note**\n> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3176/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3158", - "id": 3166176394, - "node_id": "PR_kwDOADL-3s6bj-5z", - "number": 3158, - "title": "Fix contributing guide link in README.md", - "user": { - "login": "rodrigosetti", - "id": 99732, - "node_id": "MDQ6VXNlcjk5NzMy", - "avatar_url": "https://avatars.githubusercontent.com/u/99732?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/rodrigosetti", - "html_url": "https://github.com/rodrigosetti", - "followers_url": "https://api.github.com/users/rodrigosetti/followers", - "following_url": "https://api.github.com/users/rodrigosetti/following{/other_user}", - "gists_url": "https://api.github.com/users/rodrigosetti/gists{/gist_id}", - "starred_url": "https://api.github.com/users/rodrigosetti/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/rodrigosetti/subscriptions", - "organizations_url": "https://api.github.com/users/rodrigosetti/orgs", - "repos_url": "https://api.github.com/users/rodrigosetti/repos", - "events_url": "https://api.github.com/users/rodrigosetti/events{/privacy}", - 
"received_events_url": "https://api.github.com/users/rodrigosetti/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-06-22T18:47:04Z", - "updated_at": "2025-08-18T03:28:21Z", - "closed_at": "2025-08-18T00:45:52Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3158", - "html_url": "https://github.com/RDFLib/rdflib/pull/3158", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3158.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3158.patch", - "merged_at": "2025-08-18T00:45:51Z" - }, - "body": "Previous link was broken", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3158/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3146", - "id": 3108065541, - "node_id": "PR_kwDOADL-3s6YhDye", - "number": 3146, - "title": "Replacement for #3125", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - 
"gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": "https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-06-01T23:59:34Z", - "updated_at": "2025-06-02T00:48:58Z", - "closed_at": "2025-06-02T00:48:56Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3146", - "html_url": "https://github.com/RDFLib/rdflib/pull/3146", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3146.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3146.patch", - "merged_at": "2025-06-02T00:48:56Z" - }, - "body": "This PR replaces #3125 since a bunch of conflicts from subsequent PRs needed merging into it.\r\n\r\nCloses #3125", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3146/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3146/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3145", - "id": 3106545238, - "node_id": "PR_kwDOADL-3s6Ycf5X", - "number": 3145, - "title": "replace PR 3109; improve plugins modules docs; change header colour t\u2026", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": "https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-06-01T04:12:57Z", - 
"updated_at": "2025-06-01T04:25:26Z", - "closed_at": "2025-06-01T04:25:24Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3145", - "html_url": "https://github.com/RDFLib/rdflib/pull/3145", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3145.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3145.patch", - "merged_at": "2025-06-01T04:25:24Z" - }, - "body": "Closes #3109", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3145/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3144", - "id": 3105339557, - "node_id": "PR_kwDOADL-3s6YYuTz", - "number": 3144, - "title": "Pr/3143", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": "https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": 
"https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-05-31T11:16:16Z", - "updated_at": "2025-06-01T02:23:59Z", - "closed_at": "2025-06-01T02:23:58Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3144", - "html_url": "https://github.com/RDFLib/rdflib/pull/3144", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3144.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3144.patch", - "merged_at": "2025-06-01T02:23:58Z" - }, - "body": "Replacement for #3143 with some black & mypy additions\r\n\r\nCloses #3143", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3144/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/comments", - "events_url": 
"https://api.github.com/repos/RDFLib/rdflib/issues/3142/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3142", - "id": 3090061335, - "node_id": "PR_kwDOADL-3s6XlAuc", - "number": 3142, - "title": "build(deps-dev): bump coverage from 7.7.1 to 7.8.2", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - 
"assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-05-26T05:35:59Z", - "updated_at": "2025-05-31T10:00:27Z", - "closed_at": "2025-05-31T10:00:25Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3142", - "html_url": "https://github.com/RDFLib/rdflib/pull/3142", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3142.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3142.patch", - "merged_at": "2025-05-31T10:00:25Z" - }, - "body": "Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.7.1 to 7.8.2.\n
Changelog (sourced from coverage's changelog):

Version 7.8.2 — 2025-05-23
- Wheels are provided for Windows ARM64 on Python 3.11, 3.12, and 3.13. Thanks, Finn Womack (nedbat/coveragepy#1972).

Version 7.8.1 — 2025-05-21
- A number of EncodingWarnings were fixed that could appear if you've enabled PYTHONWARNDEFAULTENCODING, fixing issue 1966 (nedbat/coveragepy#1966). Thanks, Henry Schreiner (nedbat/coveragepy#1967).
- Fixed a race condition when using sys.monitoring with free-threading Python, closing issue 1970 (nedbat/coveragepy#1970).

Version 7.8.0 — 2025-03-30
- Added a new source_dirs setting for symmetry with the existing source_pkgs setting. It's preferable to the existing source setting, because you'll get a clear error when directories don't exist. Fixes issue 1942 (nedbat/coveragepy#1942). Thanks, Jeremy Fleischman (nedbat/coveragepy#1943).
- Fix: the PYTHONSAFEPATH environment variable, new in Python 3.11, is properly supported, closing issue 1696 (nedbat/coveragepy#1696). Thanks, Philipp A. (nedbat/coveragepy#1700). This works properly except for a detail when using the coverage command on Windows. There you can use python -m coverage instead if you need exact emulation.

Commits:
- 51ab2e5 build: have to keep expected dist counts in sync
- be7bbf2 docs: sample HTML for 7.8.2
- 3cee850 docs: prep for 7.8.2
- 39bc6b0 docs: provide more details if the kit matrix is edited.
- a608fb3 build: add support for Windows arm64 (#1972)
- 2fe6225 build: run tox lint if actions have changed
- 3d93a78 docs: docs need scriv for making github releases
- 0c443a2 build: bump version to 7.8.2
- ed98b87 docs: sample HTML for 7.8.1
- b98bc9b docs: prep for 7.8.1
- Additional commits viewable in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=coverage&package-manager=pip&previous-version=7.7.1&new-version=7.8.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3142/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3141", - "id": 3089982425, - "node_id": "PR_kwDOADL-3s6Xkvpa", - "number": 3141, - "title": "fix: do not automatically generate header id in RDF patch generation and fix missing fullstop", - "user": { - "login": "recalcitrantsupplant", - "id": 10570038, - "node_id": "MDQ6VXNlcjEwNTcwMDM4", - "avatar_url": "https://avatars.githubusercontent.com/u/10570038?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/recalcitrantsupplant", - "html_url": "https://github.com/recalcitrantsupplant", - "followers_url": "https://api.github.com/users/recalcitrantsupplant/followers", - "following_url": "https://api.github.com/users/recalcitrantsupplant/following{/other_user}", - "gists_url": "https://api.github.com/users/recalcitrantsupplant/gists{/gist_id}", - "starred_url": "https://api.github.com/users/recalcitrantsupplant/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/recalcitrantsupplant/subscriptions", - "organizations_url": "https://api.github.com/users/recalcitrantsupplant/orgs", - "repos_url": "https://api.github.com/users/recalcitrantsupplant/repos", - "events_url": 
"https://api.github.com/users/recalcitrantsupplant/events{/privacy}", - "received_events_url": "https://api.github.com/users/recalcitrantsupplant/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-05-26T04:42:48Z", - "updated_at": "2025-05-31T09:48:58Z", - "closed_at": "2025-05-31T09:48:58Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3141", - "html_url": "https://github.com/RDFLib/rdflib/pull/3141", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3141.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3141.patch", - "merged_at": "2025-05-31T09:48:58Z" - }, - "body": "# Summary of changes\r\n\r\nFixes a bug in `PatchSerializer` where the `H prev` header line missed a trailing period. Also, `header_id` is now treated as optional; the `H id` line is only written if `header_id` is provided, removing the previous default UUID generation. This change is backwards compatible and primarily addresses a formatting issue and refines header generation.\r\n\r\n# Checklist\r\n\r\n- [x] Checked that there aren't other open pull requests for the same change.\r\n- [x] Checked that all tests and type checking passes. 
\r\n- If the change has a potential impact on users of this project:\r\n - [x] Added or updated tests that fail without the change.\r\n - [N/A] Updated relevant documentation to avoid inaccuracies.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3141/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3139", - "id": 3074399060, - "node_id": "PR_kwDOADL-3s6WwPrJ", - "number": 3139, - "title": "List on docs the COTTAS store backend", - "user": { - "login": "arenas-guerrero-julian", - "id": 18464038, - "node_id": "MDQ6VXNlcjE4NDY0MDM4", - "avatar_url": "https://avatars.githubusercontent.com/u/18464038?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/arenas-guerrero-julian", - "html_url": "https://github.com/arenas-guerrero-julian", - "followers_url": "https://api.github.com/users/arenas-guerrero-julian/followers", - "following_url": "https://api.github.com/users/arenas-guerrero-julian/following{/other_user}", - "gists_url": 
"https://api.github.com/users/arenas-guerrero-julian/gists{/gist_id}", - "starred_url": "https://api.github.com/users/arenas-guerrero-julian/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/arenas-guerrero-julian/subscriptions", - "organizations_url": "https://api.github.com/users/arenas-guerrero-julian/orgs", - "repos_url": "https://api.github.com/users/arenas-guerrero-julian/repos", - "events_url": "https://api.github.com/users/arenas-guerrero-julian/events{/privacy}", - "received_events_url": "https://api.github.com/users/arenas-guerrero-julian/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-05-19T16:25:00Z", - "updated_at": "2025-05-20T01:49:26Z", - "closed_at": "2025-05-20T01:49:26Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3139", - "html_url": "https://github.com/RDFLib/rdflib/pull/3139", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3139.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3139.patch", - "merged_at": "2025-05-20T01:49:26Z" - }, - "body": "# Summary of changes\r\n\r\nAdded [COTTAS](https://github.com/arenas-guerrero-julian/pycottas) store backend to the docs.\r\n\r\n# Checklist\r\n\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - 
"heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3139/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3134", - "id": 3044690733, - "node_id": "PR_kwDOADL-3s6VNJb_", - "number": 3134, - "title": "[7.x] fix namespace prefixes in longturtle serialization", - "user": { - "login": "edmondchuc", - "id": 37032744, - "node_id": "MDQ6VXNlcjM3MDMyNzQ0", - "avatar_url": "https://avatars.githubusercontent.com/u/37032744?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/edmondchuc", - "html_url": "https://github.com/edmondchuc", - "followers_url": "https://api.github.com/users/edmondchuc/followers", - "following_url": "https://api.github.com/users/edmondchuc/following{/other_user}", - "gists_url": "https://api.github.com/users/edmondchuc/gists{/gist_id}", - "starred_url": "https://api.github.com/users/edmondchuc/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/edmondchuc/subscriptions", - "organizations_url": "https://api.github.com/users/edmondchuc/orgs", - "repos_url": "https://api.github.com/users/edmondchuc/repos", - "events_url": "https://api.github.com/users/edmondchuc/events{/privacy}", - "received_events_url": "https://api.github.com/users/edmondchuc/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - 
"created_at": "2025-05-07T05:00:00Z", - "updated_at": "2025-05-20T01:50:14Z", - "closed_at": "2025-05-20T01:50:13Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3134", - "html_url": "https://github.com/RDFLib/rdflib/pull/3134", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3134.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3134.patch", - "merged_at": "2025-05-20T01:50:13Z" - }, - "body": "7.x PR of https://github.com/RDFLib/rdflib/pull/3106.\r\n\r\n- [ ] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [ ] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [ ] Created an issue to discuss the change and get in-principle agreement.\r\n - [ ] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [ ] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [ ] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3134/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132", - 
"repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3132", - "id": 3039033122, - "node_id": "PR_kwDOADL-3s6U52DB", - "number": 3132, - "title": "Cope with Namespace annotations in Python 3.14", - "user": { - "login": "nphilipp", - "id": 820624, - "node_id": "MDQ6VXNlcjgyMDYyNA==", - "avatar_url": "https://avatars.githubusercontent.com/u/820624?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nphilipp", - "html_url": "https://github.com/nphilipp", - "followers_url": "https://api.github.com/users/nphilipp/followers", - "following_url": "https://api.github.com/users/nphilipp/following{/other_user}", - "gists_url": "https://api.github.com/users/nphilipp/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nphilipp/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nphilipp/subscriptions", - "organizations_url": "https://api.github.com/users/nphilipp/orgs", - "repos_url": "https://api.github.com/users/nphilipp/repos", - "events_url": "https://api.github.com/users/nphilipp/events{/privacy}", - "received_events_url": "https://api.github.com/users/nphilipp/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-05-05T08:57:17Z", - "updated_at": "2025-06-01T06:50:30Z", - "closed_at": "2025-06-01T06:50:30Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3132", - "html_url": 
"https://github.com/RDFLib/rdflib/pull/3132", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3132.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3132.patch", - "merged_at": "2025-06-01T06:50:30Z" - }, - "body": "I submitted this already in #3084 which got merged, but the change is missing from the main branch. So here we go again:\r\n\r\n-----------------------\r\n\r\nThe __annotations__ member can be incomplete, use the get_annotations() helper from annotationlib (Python >= 3.14) or inspect (Python >= 3.10) if available.\r\n\r\nRelated: #3083\r\n\r\n\r\n\r\n# Summary of changes\r\n\r\nThis fixes accessing Namespace annotations on Python 3.14, which makes `import rdflib` fail on this Python version. This should be backwards-compatible.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes. \u21d2 Other tests (sparql) fail on Python 3.14, see #3083 \r\n- If the change adds new features or changes the RDFLib public API: n/a\r\n- If the change has a potential impact on users of this project: n/a (covered by existing tests)\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3132/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127", - "repository_url": 
"https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3127", - "id": 3033489921, - "node_id": "PR_kwDOADL-3s6Uny5a", - "number": 3127, - "title": "Fix incorrect deskolemization of literals", - "user": { - "login": "ddeschepper", - "id": 1130183, - "node_id": "MDQ6VXNlcjExMzAxODM=", - "avatar_url": "https://avatars.githubusercontent.com/u/1130183?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/ddeschepper", - "html_url": "https://github.com/ddeschepper", - "followers_url": "https://api.github.com/users/ddeschepper/followers", - "following_url": "https://api.github.com/users/ddeschepper/following{/other_user}", - "gists_url": "https://api.github.com/users/ddeschepper/gists{/gist_id}", - "starred_url": "https://api.github.com/users/ddeschepper/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/ddeschepper/subscriptions", - "organizations_url": "https://api.github.com/users/ddeschepper/orgs", - "repos_url": "https://api.github.com/users/ddeschepper/repos", - "events_url": "https://api.github.com/users/ddeschepper/events{/privacy}", - "received_events_url": "https://api.github.com/users/ddeschepper/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-05-01T09:16:35Z", - "updated_at": "2025-09-18T05:02:33Z", - "closed_at": "2025-09-18T05:02:33Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3127", - 
"html_url": "https://github.com/RDFLib/rdflib/pull/3127", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3127.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3127.patch", - "merged_at": "2025-09-18T05:02:33Z" - }, - "body": "Fixes issue https://github.com/RDFLib/rdflib/issues/3126.\r\n\r\nGraph.de_skolemize() incorrectly tries to deskolemize literals, which fails in some edge cases. Limiting deskolemization of objects to `URIRef`s only fixes this behaviour.", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3127/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3121", - "id": 3023781553, - "node_id": "PR_kwDOADL-3s6UGzEk", - "number": 3121, - "title": "build(deps-dev): bump typing-extensions from 4.13.0 to 4.13.2", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-04-28T05:53:17Z", - "updated_at": "2025-05-31T09:46:31Z", - "closed_at": "2025-05-31T09:46:29Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3121", - "html_url": "https://github.com/RDFLib/rdflib/pull/3121", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3121.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3121.patch", - "merged_at": "2025-05-31T09:46:29Z" - }, - "body": "Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.13.0 to 4.13.2.\n
Release notes (sourced from typing-extensions's releases):

4.13.2
- Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu.
- Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.

4.13.1
This is a bugfix release fixing two edge cases that appear on old bugfix releases of CPython.
Bugfixes:
- Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.
- Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.

Changelog (sourced from typing-extensions's changelog):

Release 4.13.2 (April 10, 2025)
- Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu.
- Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.

Release 4.13.1 (April 3, 2025)
Bugfixes:
- Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.
- Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.

Commits:
- 4525e9d Prepare release 4.13.2 (#583)
- 88a0c20 Do not shadow user arguments in generated __new__ by @deprecated (#581)
- 281d7b0 Add 3rd party tests for litestar (#578)
- 8092c39 fix TypeAliasType union with typing.TypeAliasType (#575)
- 45a8847 Prepare release 4.13.1 (#573)
- f264e58 Move CI to "ubuntu-latest" (round 2) (#570)
- 5ce0e69 Fix TypeError with evaluate_forward_ref on some 3.10 and 3.9 versions (#558)
- 304f5cb Add SQLAlchemy to third-party daily tests (#561)
- ebe2b94 Fix duplicated keywords for typing._ConcatenateGenericAlias in 3.10.2 (#557)
- 9f93d6f Add intersphinx links for 3.13 typing features (#550)
- See full diff in compare view
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typing-extensions&package-manager=pip&previous-version=4.13.0&new-version=4.13.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3121/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3118", - "id": 3007735442, - "node_id": "PR_kwDOADL-3s6TQzkU", - "number": 3118, - "title": "build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/unstable", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-04-21T06:55:35Z", - "updated_at": "2025-05-31T09:43:35Z", - "closed_at": "2025-05-31T09:43:28Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3118", - "html_url": "https://github.com/RDFLib/rdflib/pull/3118", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3118.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3118.patch", - "merged_at": "2025-05-31T09:43:28Z" - }, - "body": "Bumps library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4.\n\n\n[![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1&new-version=sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nYou can trigger a rebase of this PR by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
\n\n> **Note**\n> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3118/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3117", - "id": 3007648012, - "node_id": "PR_kwDOADL-3s6TQgTc", - "number": 3117, - "title": "build(deps): bump library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4 in /docker/latest", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": 
"https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4545133062, - "node_id": "LA_kwDOADL-3s8AAAABDuk6Bg", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/docker", - "name": "docker", - "color": "21ceff", - "default": false, - "description": "Pull requests that update Docker code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-04-21T06:06:30Z", - "updated_at": "2025-05-31T09:42:54Z", - "closed_at": "2025-05-31T09:42:52Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3117", - "html_url": "https://github.com/RDFLib/rdflib/pull/3117", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3117.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3117.patch", - "merged_at": "2025-05-31T09:42:52Z" - }, - "body": "Bumps library/python from 3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1 to sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4.\n\n\n[![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=library/python&package-manager=docker&previous-version=3.13.1-slim@sha256:1127090f9fff0b8e7c3a1367855ef8a3299472d2c9ed122948a576c39addeaf1&new-version=sha256:34dc8eb488136014caf530ec03a3a2403473a92d67a01a26256c365b5b2fc0d4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3117/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3115", - "id": 2999054664, - "node_id": "PR_kwDOADL-3s6Szxae", - "number": 3115, - "title": "fix: remove Literal.toPython date conversion for gYear/gYearMonth", - "user": { - "login": "lu-pl", - "id": 128675670, - "node_id": "U_kgDOB6tvVg", - "avatar_url": "https://avatars.githubusercontent.com/u/128675670?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/lu-pl", - "html_url": "https://github.com/lu-pl", - "followers_url": "https://api.github.com/users/lu-pl/followers", - "following_url": "https://api.github.com/users/lu-pl/following{/other_user}", - "gists_url": "https://api.github.com/users/lu-pl/gists{/gist_id}", - "starred_url": "https://api.github.com/users/lu-pl/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/lu-pl/subscriptions", - "organizations_url": "https://api.github.com/users/lu-pl/orgs", - "repos_url": "https://api.github.com/users/lu-pl/repos", - "events_url": "https://api.github.com/users/lu-pl/events{/privacy}", - "received_events_url": "https://api.github.com/users/lu-pl/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], 
- "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 1, - "created_at": "2025-04-16T09:34:25Z", - "updated_at": "2025-05-31T09:51:02Z", - "closed_at": "2025-05-31T09:51:02Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3115", - "html_url": "https://github.com/RDFLib/rdflib/pull/3115", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3115.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3115.patch", - "merged_at": "2025-05-31T09:51:02Z" - }, - "body": "\r\n\r\n# Summary of changes\r\n\r\nIssue #3078 reports, that `rdflib.Literal.toPython`-casting of `xsd:gYear` and `xsd:gYearMonth` to datetime objects should not be possible, as there is no appropriate Python equivalence for those types. \r\n\r\nThe current implementation casts `xsd:gYear` and `xsd:gYearMonth` to datetime objects assuming January 1st for `xsd:gYear` and the 1st day of the given month for `xsd:gYearMonth`. This is plain wrong.\r\n\r\nThe change removes datetime casting for `xsd:gYear` and `xsd:gYearMonth` for `rdflib.Literal.toPython` and adapts the `rdflib.Literal` tests accordingly.\r\n\r\nNote that validation of `xsd:gYear` and `xsd:gYearMonth` is lost as a result, but could be easily implemented using regex checks. As I understand it, XSD types without an entry in the `rdflib.term.XSDToPython` mapping are never typed-checked though; at least for `xsd:gYear` and `xsd:gYearMonth` the xsd-type checks ran as part of `rdflib.xsd_datetime.parse_xsd_gyear` and `rdflib.xsd_datetime.parse_xsd_gyearmonth`. 
\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change adds new features or changes the RDFLib public API:\r\n \r\n - [x] Created an issue to discuss the change and get in-principle agreement.\r\n - [] Considered adding an example in `./examples`.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [x] Added or updated tests that fail without the change.\r\n - [x] Updated relevant documentation to avoid inaccuracies.\r\n - [x] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\nNote: I looked through the docs and couldn't find a place where `xsd:gYear` or `xsd:gYearMonth` casting was mentioned (apart from the generated references).", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3115/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3106", - "id": 2961384878, - "node_id": 
"PR_kwDOADL-3s6Q0v6d", - "number": 3106, - "title": "fix namespace prefixes in longturtle serialization", - "user": { - "login": "ddeschepper", - "id": 1130183, - "node_id": "MDQ6VXNlcjExMzAxODM=", - "avatar_url": "https://avatars.githubusercontent.com/u/1130183?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/ddeschepper", - "html_url": "https://github.com/ddeschepper", - "followers_url": "https://api.github.com/users/ddeschepper/followers", - "following_url": "https://api.github.com/users/ddeschepper/following{/other_user}", - "gists_url": "https://api.github.com/users/ddeschepper/gists{/gist_id}", - "starred_url": "https://api.github.com/users/ddeschepper/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/ddeschepper/subscriptions", - "organizations_url": "https://api.github.com/users/ddeschepper/orgs", - "repos_url": "https://api.github.com/users/ddeschepper/repos", - "events_url": "https://api.github.com/users/ddeschepper/events{/privacy}", - "received_events_url": "https://api.github.com/users/ddeschepper/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 3, - "created_at": "2025-03-31T19:28:56Z", - "updated_at": "2025-05-31T09:47:31Z", - "closed_at": "2025-05-31T09:47:31Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3106", - "html_url": "https://github.com/RDFLib/rdflib/pull/3106", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3106.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3106.patch", - "merged_at": "2025-05-31T09:47:31Z" - }, - "body": "\r\n\r\n# Summary of changes\r\n\r\n\r\n\r\nSolves https://github.com/RDFLib/rdflib/issues/3105 by storing the namespace manager of the graph that 
is to be serialized temporarily and reapplying it to the graph that is actually serialized by the implementation of the longturtle serializer.\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [x] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [x] Checked that all tests and type checking passes.\r\n- If the change has a potential impact on users of this project:\r\n \r\n - [x] Added or updated tests that fail without the change.\r\n - [ ] Updated relevant documentation to avoid inaccuracies.\r\n - [ ] Considered adding additional documentation.\r\n- [x] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3106/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3101", - "id": 2957748383, - "node_id": "PR_kwDOADL-3s6QpJbk", - "number": 3101, - "title": "build(deps): bump rdflib from 7.1.2 to 7.1.4 in /docker/latest", - "user": { - "login": "dependabot[bot]", - "id": 49699333, - "node_id": "MDM6Qm90NDk2OTkzMzM=", - "avatar_url": 
"https://avatars.githubusercontent.com/in/29110?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/dependabot%5Bbot%5D", - "html_url": "https://github.com/apps/dependabot", - "followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers", - "following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}", - "gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}", - "starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions", - "organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs", - "repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos", - "events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}", - "received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events", - "type": "Bot", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 1999840232, - "node_id": "MDU6TGFiZWwxOTk5ODQwMjMy", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/dependencies", - "name": "dependencies", - "color": "0366d6", - "default": false, - "description": "Pull requests that update a dependency file" - }, - { - "id": 4181259078, - "node_id": "LA_kwDOADL-3s75OPNG", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/python", - "name": "python", - "color": "2b67c6", - "default": false, - "description": "Pull requests that update Python code" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-03-29T04:48:39Z", - "updated_at": "2025-05-27T03:04:02Z", - "closed_at": "2025-05-27T03:04:01Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": 
"https://api.github.com/repos/RDFLib/rdflib/pulls/3101", - "html_url": "https://github.com/RDFLib/rdflib/pull/3101", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3101.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3101.patch", - "merged_at": "2025-05-27T03:04:01Z" - }, - "body": "Bumps [rdflib](https://github.com/RDFLib/rdflib) from 7.1.2 to 7.1.4.\n
\nRelease notes\n

Sourced from rdflib's releases.

\n
\n

2025-03-29 RELEASE 7.1.4

\n

A tidy-up release with no major updates over 7.1.3. This may be the last 7.x release as we move to a version 8 with breaking changes to Dataset and a few APIs.

\n

Interesting PRs merged:

\n\n

... and lots of boring dependency bump PRs merged!

\n

2025-01-18 RELEASE 7.1.3

\n

A fix-up release that re-adds support for Python 3.8 after it was accidentally\nremoved in Release 7.1.2.

\n

This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out\ntyping changes that are not compatible\nwith Python 3.8.

\n

Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0.

\n

Included are PRs such as Defined Namespace warnings fix, sort longturtle\nblank nodes, deterministic longturtle serialisation and Dataset documentation\nimprovements.

\n
\n
\n
\nChangelog\n

Sourced from rdflib's changelog.

\n
\n

2025-03-29 RELEASE 7.1.4

\n

A tidy-up release with no major updates over 7.1.3. This may be the last 7.x\nrelease as we move to a version 8 with breaking changes to Dataset and a few\nAPIs.

\n

Interesting PRs merged:

\n\n

... and lots of boring dependency bump PRs merged!

\n

2025-01-17 RELEASE 7.1.3

\n

A fix-up release that re-adds support for Python 3.8 after it was accidentally\nremoved in Release 7.1.2.

\n

This release cherrypicks many additions to 7.1.2 added to 7.1.1 but leaves out\ntyping changes that are not compatible\nwith Python 3.8.

\n

Also not carried over from 7.1.2 is the change from Poetry 1.x to 2.0.

\n

Included are PRs such as Defined Namespace warnings fix, sort longturtle\nblank nodes, deterministic longturtle serialisation and Dataset documentation\nimprovements.

\n

For the full list of included PRs, see the preparatory PR:\nRDFLib/rdflib#3036.

\n
\n
\n
\nCommits\n\n
\n
\n\n\n[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=rdflib&package-manager=pip&previous-version=7.1.2&new-version=7.1.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)\n\nDependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.\n\n[//]: # (dependabot-automerge-start)\n[//]: # (dependabot-automerge-end)\n\n---\n\n
\nDependabot commands and options\n
\n\nYou can trigger Dependabot actions by commenting on this PR:\n- `@dependabot rebase` will rebase this PR\n- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it\n- `@dependabot merge` will merge this PR after your CI passes on it\n- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it\n- `@dependabot cancel merge` will cancel a previously requested merge and block automerging\n- `@dependabot reopen` will reopen this PR if it is closed\n- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually\n- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency\n- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)\n- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)\n\n\n
", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3101/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3098", - "id": 2957658627, - "node_id": "PR_kwDOADL-3s6Qo2lb", - "number": 3098, - "title": "7.1.4 pre-release", - "user": { - "login": "nicholascar", - "id": 7321872, - "node_id": "MDQ6VXNlcjczMjE4NzI=", - "avatar_url": "https://avatars.githubusercontent.com/u/7321872?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/nicholascar", - "html_url": "https://github.com/nicholascar", - "followers_url": "https://api.github.com/users/nicholascar/followers", - "following_url": "https://api.github.com/users/nicholascar/following{/other_user}", - "gists_url": "https://api.github.com/users/nicholascar/gists{/gist_id}", - "starred_url": "https://api.github.com/users/nicholascar/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/nicholascar/subscriptions", - "organizations_url": "https://api.github.com/users/nicholascar/orgs", - "repos_url": "https://api.github.com/users/nicholascar/repos", - "events_url": "https://api.github.com/users/nicholascar/events{/privacy}", - "received_events_url": "https://api.github.com/users/nicholascar/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": 
false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 0, - "created_at": "2025-03-29T02:09:03Z", - "updated_at": "2025-03-29T02:19:06Z", - "closed_at": "2025-03-29T02:19:05Z", - "author_association": "MEMBER", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3098", - "html_url": "https://github.com/RDFLib/rdflib/pull/3098", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3098.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3098.patch", - "merged_at": "2025-03-29T02:19:05Z" - }, - "body": "A tidy-up release with no major updates over 7.1.3. This may be the last 7.x release as we move to a version 8 with breaking changes to Dataset and a few APIs.\r\n\r\nFixed some small pre-commit issues too", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3098/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3075", - "id": 2869027589, - "node_id": "PR_kwDOADL-3s6MDLLC", - "number": 3075, - "title": "Specify `Optional` parameters in `Graph.triples_choices`", - "user": { - "login": "slahn", - "id": 3298124, - "node_id": "MDQ6VXNlcjMyOTgxMjQ=", - "avatar_url": 
"https://avatars.githubusercontent.com/u/3298124?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/slahn", - "html_url": "https://github.com/slahn", - "followers_url": "https://api.github.com/users/slahn/followers", - "following_url": "https://api.github.com/users/slahn/following{/other_user}", - "gists_url": "https://api.github.com/users/slahn/gists{/gist_id}", - "starred_url": "https://api.github.com/users/slahn/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/slahn/subscriptions", - "organizations_url": "https://api.github.com/users/slahn/orgs", - "repos_url": "https://api.github.com/users/slahn/repos", - "events_url": "https://api.github.com/users/slahn/events{/privacy}", - "received_events_url": "https://api.github.com/users/slahn/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [ - { - "id": 7242799529, - "node_id": "LA_kwDOADL-3s8AAAABr7RZqQ", - "url": "https://api.github.com/repos/RDFLib/rdflib/labels/7.1", - "name": "7.1", - "color": "FC7848", - "default": false, - "description": "Issues planned to fix in v7.1" - } - ], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 2, - "created_at": "2025-02-21T13:26:01Z", - "updated_at": "2025-09-03T04:59:58Z", - "closed_at": "2025-09-03T04:59:58Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3075", - "html_url": "https://github.com/RDFLib/rdflib/pull/3075", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3075.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3075.patch", - "merged_at": "2025-09-03T04:59:58Z" - }, - "body": "\r\n\r\n# Summary of changes\r\n\r\nChange the typing of `Graph.triples_choices` and `Store.triples_choises`\r\nto match the actual types allowed by the code.\r\n\r\nThe two non-list 
parameters can be `None`, but this is not reflected in\r\nthe type hint today.\r\n\r\nIntroduce a type alias to simplify method signatures, and update all\r\noverloads of `triples_choises` to use this alias.\r\n\r\n\r\n\r\n# Checklist\r\n\r\n\r\n\r\n- [\u2713] Checked that there aren't other open pull requests for\r\n the same change.\r\n- [\u2713] Checked that all tests and type checking passes.\r\n - Did not run webtests (`pytest -m \"not webtest\"`), since I could not get them working at all.\r\n `7266 passed, 61 skipped, 333 deselected, 330 xfailed, 36 xpassed, 6925 warnings`\r\n- [\u2713] Considered granting [push permissions to the PR branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork),\r\n so maintainers can fix minor issues and keep your PR up to date.\r\n\r\n", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3075/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - }, - { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020", - "repository_url": "https://api.github.com/repos/RDFLib/rdflib", - "labels_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/labels{/name}", - "comments_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/comments", - "events_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/events", - "html_url": "https://github.com/RDFLib/rdflib/pull/3020", - "id": 2769455166, - "node_id": "PR_kwDOADL-3s6GwndC", - "number": 3020, - "title": "notation3.py: don't normalize float representation", - "user": { - "login": "tgbugs", - "id": 4299776, - "node_id": "MDQ6VXNlcjQyOTk3NzY=", - "avatar_url": 
"https://avatars.githubusercontent.com/u/4299776?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/tgbugs", - "html_url": "https://github.com/tgbugs", - "followers_url": "https://api.github.com/users/tgbugs/followers", - "following_url": "https://api.github.com/users/tgbugs/following{/other_user}", - "gists_url": "https://api.github.com/users/tgbugs/gists{/gist_id}", - "starred_url": "https://api.github.com/users/tgbugs/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/tgbugs/subscriptions", - "organizations_url": "https://api.github.com/users/tgbugs/orgs", - "repos_url": "https://api.github.com/users/tgbugs/repos", - "events_url": "https://api.github.com/users/tgbugs/events{/privacy}", - "received_events_url": "https://api.github.com/users/tgbugs/received_events", - "type": "User", - "user_view_type": "public", - "site_admin": false - }, - "labels": [], - "state": "closed", - "locked": false, - "assignee": null, - "assignees": [], - "milestone": null, - "comments": 5, - "created_at": "2025-01-05T21:21:48Z", - "updated_at": "2025-09-18T04:19:43Z", - "closed_at": "2025-09-18T04:19:43Z", - "author_association": "CONTRIBUTOR", - "type": null, - "active_lock_reason": null, - "draft": false, - "pull_request": { - "url": "https://api.github.com/repos/RDFLib/rdflib/pulls/3020", - "html_url": "https://github.com/RDFLib/rdflib/pull/3020", - "diff_url": "https://github.com/RDFLib/rdflib/pull/3020.diff", - "patch_url": "https://github.com/RDFLib/rdflib/pull/3020.patch", - "merged_at": "2025-09-18T04:19:43Z" - }, - "body": "fix behavior of the n3 parser family to avoid normalizing raw float string representation which makes it impossible to roundtrip the exact original string representation of e.g. 
`1e10`", - "reactions": { - "url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/reactions", - "total_count": 0, - "+1": 0, - "-1": 0, - "laugh": 0, - "hooray": 0, - "confused": 0, - "heart": 0, - "rocket": 0, - "eyes": 0 - }, - "timeline_url": "https://api.github.com/repos/RDFLib/rdflib/issues/3020/timeline", - "performed_via_github_app": null, - "state_reason": null, - "score": 1.0 - } -] \ No newline at end of file
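The roundtripping problem that PR #3020 describes is easy to reproduce in plain Python (a quick illustration of the issue, not RDFLib code): parsing a float literal and serialising it back does not preserve the original lexical form.

```python
# Parsing "1e10" to a float and converting it back to a string yields a
# different spelling, so the exact source text cannot be recovered. This
# is why the n3 parser family now keeps the raw string instead of
# normalising it.
original = "1e10"
normalized = str(float(original))  # float() discards the "1e10" spelling

print(normalized)              # 10000000000.0
print(normalized == original)  # False: the original lexical form is lost
```

Keeping the raw string representation sidesteps this, at the cost of not canonicalising equivalent spellings such as `1E10` and `1e10`.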
Release notes

Sourced from mkdocstrings's releases.

0.30.0

0.30.0 - 2025-07-23

Compare with 0.29.1

Features

- Add data-skip-inventory boolean attribute for elements to skip registration in local inventory (f856160 by Bartosz Sławecki). Issue-671, PR-774
- Add I18N support (translations) (2b4ed54 by Nyuan Zhang). PR-645, Co-authored-by: Timothée Mazzucotelli dev@pawamoy.fr