
Wheel installation fails when local path dependency introduced #4868

Closed
3 tasks done
stonelazy opened this issue Dec 7, 2021 · 14 comments
Labels
kind/bug Something isn't working as expected

Comments

@stonelazy

stonelazy commented Dec 7, 2021

ERROR: Exception:
Traceback (most recent call last):
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3021, in _dep_map
    return self.__dep_map
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2815, in __getattr__
    raise AttributeError(attr)
AttributeError: _DistInfoDistribution__dep_map

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3101, in __init__
    super(Requirement, self).__init__(requirement_string)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/packaging/requirements.py", line 117, in __init__
    raise InvalidRequirement(f"Invalid URL: {req.url}")
pip._vendor.packaging.requirements.InvalidRequirement: Invalid URL: appname-pkg3

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/cli/base_command.py", line 173, in _main
    status = self.run(options, args)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/cli/req_command.py", line 203, in wrapper
    return func(self, options, args)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/commands/install.py", line 315, in run
    requirement_set = resolver.resolve(
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 94, in resolve
    result = self._result = resolver.resolve(
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", line 472, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", line 366, in resolve
    failure_causes = self._attempt_to_pin_criterion(name)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", line 212, in _attempt_to_pin_criterion
    criteria = self._get_updated_criteria(candidate)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", line 202, in _get_updated_criteria
    for requirement in self._p.get_dependencies(candidate=candidate):
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/provider.py", line 197, in get_dependencies
    return [r for r in candidate.iter_dependencies(with_requires) if r is not None]
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/provider.py", line 197, in <listcomp>
    return [r for r in candidate.iter_dependencies(with_requires) if r is not None]
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 250, in iter_dependencies
    requires = self.dist.requires() if with_requires else ()
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2736, in requires
    dm = self._dep_map
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3023, in _dep_map
    self.__dep_map = self._compute_dependencies()
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3033, in _compute_dependencies
    reqs.extend(parse_requirements(req))
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3094, in parse_requirements
    yield Requirement(line)
  File "/Users/sudharsan-2598/opt/anaconda3/envs/npsp6/lib/python3.9/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3103, in __init__
    raise RequirementParseError(str(e))
pip._vendor.pkg_resources.RequirementParseError: Invalid URL: appname-pkg3
  • Tree of the repo
├── README.md
├── appname
│   └── pkg1
│       ├── __init__.py
│       └── sample.py
├── appname-pkg3
│   ├── appname
│   │   └── pkg3
│   │       ├── __init__.py
│   │       └── sample.py
│   └── pyproject.toml
├── dist
│   ├── appname-1-py3-none-any.whl
│   └── appname-1.tar.gz
├── error.txt
├── poetry.lock
└── pyproject.toml
  • Expected Result:

    • Installation completes and I am able to do the following Python imports.
    from appname.pkg3 import sample
    from appname.pkg1 import sample
    
  • Why should this be a Poetry issue and not something wrong with the code?

    • If I do poetry lock and then poetry install, I don't face any issue.
@stonelazy stonelazy added kind/bug Something isn't working as expected status/triage This issue needs to be triaged labels Dec 7, 2021
@m-credera

m-credera commented Feb 15, 2022

I get this behavior as well - building a wheel with a local path dependency results in a wheel that can't be installed on other systems (even when wheels for that path dependency have been previously built and installed).

Stacktrace:

ERROR: Exception:
Traceback (most recent call last):
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3021, in _dep_map
    return self.__dep_map
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2815, in __getattr__
    raise AttributeError(attr)
AttributeError: _DistInfoDistribution__dep_map

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3101, in __init__
    super(Requirement, self).__init__(requirement_string)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/packaging/requirements.py", line 130, in __init__
    raise InvalidRequirement("Invalid URL: {0}".format(req.url))
pip._vendor.packaging.requirements.InvalidRequirement: Invalid URL: ../../libs/analysis

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/cli/base_command.py", line 180, in _main
    status = self.run(options, args)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/cli/req_command.py", line 204, in wrapper
    return func(self, options, args)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/commands/install.py", line 318, in run
    requirement_set = resolver.resolve(
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 127, in resolve
    result = self._result = resolver.resolve(
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/resolvelib/resolvers.py", line 473, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/resolvelib/resolvers.py", line 367, in resolve
    failure_causes = self._attempt_to_pin_criterion(name)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/resolvelib/resolvers.py", line 213, in _attempt_to_pin_criterion
    criteria = self._get_criteria_to_update(candidate)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/resolvelib/resolvers.py", line 202, in _get_criteria_to_update
    for r in self._p.get_dependencies(candidate=candidate):
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/resolution/resolvelib/provider.py", line 175, in get_dependencies
    return [r for r in candidate.iter_dependencies(with_requires) if r is not None]
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/resolution/resolvelib/provider.py", line 175, in <listcomp>
    return [r for r in candidate.iter_dependencies(with_requires) if r is not None]
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 259, in iter_dependencies
    requires = self.dist.requires() if with_requires else ()
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2736, in requires
    dm = self._dep_map
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3023, in _dep_map
    self.__dep_map = self._compute_dependencies()
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3033, in _compute_dependencies
    reqs.extend(parse_requirements(req))
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3094, in parse_requirements
    yield Requirement(line)
  File "/Users/michael/envs/testing/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3103, in __init__
    raise RequirementParseError(str(e))
pip._vendor.pkg_resources.RequirementParseError: Invalid URL: ../../libs/analysis

@KholdStare

KholdStare commented Mar 7, 2022

Currently setting up a monorepo with python services/libraries, and need to work around these current limitations.

At the end of the day, poetry supports local paths for dependencies so I think there needs to be a sane outcome when running poetry build in this case. IMHO, it should most likely package all the local dependencies up into the built .whl file - that's the simplest solution. If something is local, and we're building a module to be installed somewhere, it makes sense to bundle any non-external dependencies with the build, because there is no other way to get them.

I mentioned the same in #1168

@abn
Member

abn commented Mar 8, 2022

@stonelazy I suspect your issue is related to pypa/pip#6658 (comment). For poetry-core's part, at least for now, we do write the dependency specification as required by the wheel METADATA. Outside of that, it is the PEP 517 front end that needs to handle it.

Also, even in a case where the above is fixed, your project will need the following snippet in order for the wheel to include the package as a vendored dependency.

include = [
    { path = "appname-pkg3", format = ["sdist", "wheel"] }
]

This will ensure the source is vendored into your wheel when built.
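For context, here is a sketch of where that include entry sits in the OP's pyproject.toml. Only the include entry comes from the comment above; the other [tool.poetry] keys and the version are hypothetical placeholders:

```toml
[tool.poetry]
name = "appname"
version = "1.0"
description = ""
authors = ["stonelazy"]
# Vendor the path dependency's source into the built artifacts.
include = [
    { path = "appname-pkg3", format = ["sdist", "wheel"] }
]
```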

@michaelcredera for your issue, the relative paths are problematic. When wheels are being built under PEP-517, the source is expected to be moved into an isolated environment at build time. This means your relative paths will need to hold even when your project source is moved to an ephemeral build directory. Unlike the OP's case yours is trickier. The general question of "what should this be relative to?" is still an open question within the packaging group. See here for more on this.

The alternative approach for this case is to build ../../libs/analysis into a wheel and use pip install --find-links /wheelhouse project.whl on install. This might not be applicable in all cases.

@KholdStare

KholdStare commented Mar 8, 2022

When wheels are being built under PEP-517, the source is expected to be moved into an isolated environment at build time. This means your relative paths will need to hold even when your project source is moved to an ephemeral build directory. Unlike the OP's case yours is trickier. The general question of "what should this be relative to?" is still an open question within the packaging group. See here for more on this.

At the risk of sounding ignorant, I'm speaking from a user/"peasant" perspective of someone trying to make things work with the available tools. It just doesn't seem right to leave it up to pip to build the relative path dependencies - it would be the wrong level of abstraction, no? When poetry build finishes, I would think the user expects a self-contained package that knows how to get its dependencies. Only including one "level" (just the current package), and then leaving the rest of the job up to pip, seems very strange. Imagine if there are 3 levels of dependencies - the first level is the wheel, the second level would be up to pip to make an on-the-fly build using poetry, and then again recursively try to build the third level. I am not sure I understand why pip has to be in the middle of this at that point if the references are relative/local.

IMHO, it should be a special case, or at least there should be a commandline switch or configuration to do this with poetry all during the build step itself, and not unnecessarily postpone half of the process to a separate tool (i.e. pip). It just overcomplicates something that users intuitively expect.

@TBBle
Contributor

TBBle commented Mar 8, 2022

I wouldn't want my pathed dependencies to be vendored in, personally. I'd rather pathed dependencies be turned into package-dependencies, perhaps with a version-equality exactly matching the version-at-time-of-build of the pathed dependencies. My approach would be to poetry build each of my packages in the monorepo, and then make all the resulting wheels available so that installing the top-level one (or ones) pulls in the appropriate lower-level wheel from the same directory.

So similar to what #4868 (comment) expected, that pre-installing the resulting depended-upon wheel would work correctly.

@KholdStare

Understood - I can agree with not vendoring in by default. However,

I'd rather pathed dependencies be turned into package-dependencies, perhaps with a version-equality exactly matching the version-at-time-of-build of the pathed dependencies.

This is quite cumbersome and non-intuitive from a usability perspective. It means having a pathed dependency requires a host of conventions and scripts that come with it to make it work. I.e. if I have dependency= {path = "../dependency", develop = true} listed in my poetry project, then I need:

  • to make sure I also have a versioned dependency dependency = "0.1.1"
  • I need to keep that version manually synced up with the version in the other directory (more scripts or manual work here)
  • I cannot use the result of poetry build directly
  • I need external scripts outside of poetry that will "stitch up the build" in CI.

Otherwise IMO the path dependency is "incomplete" as a feature because it has all these caveats attached to it - it's not usable. At the very least these should be expressed in the documentation so users know what to expect. Hopefully it would be possible to do this without any of the extra steps.

@TBBle
Contributor

TBBle commented Mar 8, 2022

I was actually thinking that poetry build would be doing the first two dot points, because when it looks at the pathed dependency, it can see the version of the pointed-to package and generate that == dependency as part of the resulting setup.py pretty trivially.

I agree that it's not a workflow for every occasion: It works well for "I run poetry build on every package in parallel in my CI system, upload them to a directory, and then COPY that directory into a Dockerfile where I call pip install for the particular top-level packages being installed in this case", but vendoring as you describe is a better workflow for poetry build -o X.whl && pip install X.whl for a particular top-level package X.

The flow I want to use also works for "I build each package and push it to our PyPI repo in parallel", particularly if there are consumers of the libraries (not the root packages) who're not in the same monorepo, and hence need a normally exported package anyway. This flow means that both the monorepo output and the external users are using the same wheels.

That said, if vendoring into a single wheel is implemented and reliable, I'd probably use it in preference to the current approach, which is exposing all the source into the containers and running poetry install there. It'd be a significant improvement, even if it's not my vision of perfection.

I'll also note that one of my considered approaches was to only use pathed dependencies in dev-dependencies, and then use normal versioned package dependencies in the non-dev dependencies. In this case, I'd want poetry build to ignore the dev-dependencies (which is a different problem...) and poetry install to fail if the two dependencies for the package conflict (i.e. what's at the path has a current version that doesn't meet the version requirement).

In this flow, it's required (and expected) that people are maintaining the versioned-dependency (dot point 2), and intentionally so, as that version is an ABI dependency, and should be being bumped when ABIs are changed/added/removed, so you can't successfully install mismatching packages.

However, Poetry has some other limitations that currently prevent this, i.e. the resolver merges dependencies and dev-dependencies too early to make this work, if I recall correctly. (There's a ticket elsewhere here tracking this use-case, so it's not specifically-relevant here, I just wanted to share some context on my approach here.)

@m-credera

m-credera commented Mar 8, 2022

The alternative approach for this case is to build ../../libs/analysis into a wheel and use pip install --find-links /wheelhouse project.whl on install. This might not be applicable in all cases.

@abn Pip seems to still have trouble with relative paths - for me this resulted in the same "Invalid URL" error:

pip._vendor.pkg_resources.RequirementParseError: Invalid URL: ../../libs/analysis

Process:

  1. Build analysis package which has only pypi dependencies
  2. Build other package which depends on ../../libs/analysis
  3. Install analysis wheel - no issues
  4. Install other wheel - breaks due to invalid requirement for other - the relative path "url"

Edit: Maybe I'm misunderstanding your suggestion. Perhaps you mean to change other's pyproject.toml to be something like:

[tool.poetry.dependencies]
python = "~3.8"
analysis = {path = "wheelhouse/", develop = false}

And having every module reference other modules via their wheelhouse? As TBBle pointed out, this seems like it would require a decent amount of automation / scripting to manually manage dependency build order, wheel staging, and then installation order in a remote machine (as opposed to a simple poetry install just working out of the box the way it does with a relative path).

That being said, I probably just don't understand what you're suggesting :)

@abn
Member

abn commented Mar 8, 2022

@michaelcredera what I am suggesting is closer to what @TBBle is already doing I reckon.

Here is an example project.

poetry new foo
cd foo
poetry new bar
poetry add -D ./bar
poetry add bar@*

And then build it as shown in this Dockerfile.

FROM docker.io/python:3.10 AS build

COPY . /build

WORKDIR /build

RUN pip install --quiet build

RUN mkdir wheelhouse
RUN python -m build --wheel --outdir ./wheelhouse ./bar
RUN python -m build --wheel --outdir ./wheelhouse .

FROM docker.io/python:3.10-slim AS runtime

COPY --from=build /build/wheelhouse /wheelhouse

RUN python -m venv .venv
RUN python -m pip install --find-links /wheelhouse foo

ENTRYPOINT ["/.venv/bin/python"]

The "wheelhouse" is simply a directory with wheels for both bar and foo. The important bit in the foo project is that we add bar both as a dependency (poetry add bar@*) and as a dev group dependency (poetry add -D ./bar). The latter is added as a path dependency; the former is not.

This is more a workaround for now. Issues #936 and #2270 might be of interest.

Additionally, the pyproject.toml file for foo should ideally also contain the following.

include = [
    { path = "bar", format = "sdist" }
]
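Putting the pieces of this workaround together, here is a sketch of what foo's pyproject.toml might look like after the poetry add commands above. The version numbers, description, and authors are illustrative, not from the thread:

```toml
[tool.poetry]
name = "foo"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]
# Ship bar's source in the sdist so source builds still work.
include = [
    { path = "bar", format = "sdist" }
]

[tool.poetry.dependencies]
python = "^3.10"
# Runtime dependency by name only; resolved from the wheelhouse
# (or an index) at install time.
bar = "*"

[tool.poetry.dev-dependencies]
# Path dependency for local development only; never written into
# the wheel's METADATA.
bar = { path = "bar", develop = true }

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
```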

@abn
Member

abn commented Mar 8, 2022

The root cause of the issue has been addressed in #4868 (comment).

As there is no standard mechanism in wheels to handle relative path dependencies, and as any resolution should come from pip and/or the wheel spec, I am closing this.

Regarding the broader use cases for monorepos/subprojects, see #936 and #2270. Regarding local path dependency support, see #1168.

If you would like help regarding your project and/or guidance around workarounds, I suggest starting a discussion.

@abn abn closed this as completed Mar 8, 2022
@mkniewallner mkniewallner removed the status/triage This issue needs to be triaged label Jun 11, 2022
@Jack1082013

So I keep having issues with the wheel dependency for pysha3

Building wheel for pysha3 (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [18 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-aarch64-cpython-311
copying sha3.py -> build/lib.linux-aarch64-cpython-311
running build_ext
building '_pysha3' extension
creating build/temp.linux-aarch64-cpython-311
creating build/temp.linux-aarch64-cpython-311/Modules
creating build/temp.linux-aarch64-cpython-311/Modules/_sha3
aarch64-linux-android-clang -DNDEBUG -g -fwrapv -O3 -Wall -fstack-protector-strong -O3 -fstack-protector-strong -O3 -fPIC -DPY_WITH_KECCAK=1 -I/data/data/com.termux/files/usr/include/python3.11 -c Modules/_sha3/sha3module.c -o build/temp.linux-aarch64-cpython-311/Modules/_sha3/sha3module.o
In file included from Modules/_sha3/sha3module.c:20:
Modules/_sha3/backport.inc:78:10: fatal error: 'pystrhex.h' file not found
78 | #include "pystrhex.h"
| ^~~~~~~~~~~~
1 error generated.
error: command '/data/data/com.termux/files/usr/bin/aarch64-linux-android-clang' failed with exit code 1
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pysha3
Running setup.py clean for pysha3
Failed to build pysha3
ERROR: Could not build wheels for pysha3, which is required to install pyproject.toml-based projects
Retrying installation of merkletools...
Collecting merkletools
Using cached merkletools-1.0.3-py3-none-any.whl
Collecting pysha3>=1.0b1 (from merkletools)
Using cached pysha3-1.0.2.tar.gz (829 kB)
Preparing metadata (setup.py) ... done
Building wheels for collected packages: pysha3
Building wheel for pysha3 (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [18 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-aarch64-cpython-311
copying sha3.py -> build/lib.linux-aarch64-cpython-311
running build_ext
building '_pysha3' extension
creating build/temp.linux-aarch64-cpython-311
creating build/temp.linux-aarch64-cpython-311/Modules
creating build/temp.linux-aarch64-cpython-311/Modules/_sha3
aarch64-linux-android-clang -DNDEBUG -g -fwrapv -O3 -Wall -fstack-protector-strong -O3 -fstack-protector-strong -O3 -fPIC -DPY_WITH_KECCAK=1 -I/data/data/com.termux/files/usr/include/python3.11 -c Modules/_sha3/sha3module.c -o build/temp.linux-aarch64-cpython-311/Modules/_sha3/sha3module.o
In file included from Modules/_sha3/sha3module.c:20:
Modules/_sha3/backport.inc:78:10: fatal error: 'pystrhex.h' file not found
78 | #include "pystrhex.h"
| ^~~~~~~~~~~~
1 error generated.
error: command '/data/data/com.termux/files/usr/bin/aarch64-linux-android-clang' failed with exit code 1
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pysha3
Running setup.py clean for pysha3
Failed to build pysha3
ERROR: Could not build wheels for pysha3, which is required to install pyproject.toml-based projects
Failed to install merkletools.
dpkg: error processing package zeronet (--configure):
installed zeronet package post-installation script subprocess returned error exit status 1
Setting up wuzz (0.5.0-4) ...
Errors were encountered while processing:
zeronet
E: Sub-process /data/data/com.termux/files/usr/bin/dpkg returned an error code (1)

I have used pip install " " --no-cache-dir; it works sometimes. But I would like to solve the issue. The old Python was made obsolete and the new has no id

@TBBle
Contributor

TBBle commented Oct 19, 2023

@Jack1082013 Your problem does not appear to have any connection to this issue, or in fact any connection with Poetry at all.

I guess that you've hit Tierion/pymerkletools#20 and should use the approach described there, and follow up with zeronet (the thing you appear to be trying to install, which is pulling in pymerkletools, which is pulling in pysha3) if you still can't get it to work.

@Jack1082013

Jack1082013 commented Oct 19, 2023 via email


This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Feb 29, 2024