Avoid suggesting bad practice of building in the same GHA job as publishing #14
Comments
Thank you. I will split the job into two parts. Good idea.
Poetry is striking back.
I shall use twine in the second job. No Poetry needed.
@webknjaz Do we like this version more?
Well, at this point, why not just use my action that wraps twine and acquires tokens internally?
The publish job doesn't need to check out the repository or request the contents permission. Plus, see my comment above — this is reinventing the wheel.
I am well aware of gh-action-pypi-publish. The initial goal was not to give people a tool that can upload to PyPI in arbitrary ways. We wanted to restrict their options and enforce trusted publishing. For that, it was enough to mint the token and run poetry publish; the token mint was a byproduct of that process. Following the GitHub mess with deployments to protected tags, I need to revisit our deployment process anyway.
By the way, the example workflow should also recommend using GitHub Environments, since those allow for extra protection, like requiring a manual button click before the job can even start. P.S. If you're building a scalable process to enforce OIDC, I'd recommend jsonschema-based linting. I like https://check-jsonschema.rtfd.io/en/latest/precommit_usage.html#example-usages, which lets one check GHA workflow files and action repos for basic problems. There's an example of enforcing that.
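As a rough sketch, a pre-commit configuration wiring up check-jsonschema's built-in GitHub Workflows hook could look like the following (the `rev` pin is illustrative; pick a current release):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/python-jsonschema/check-jsonschema
    rev: 0.28.0  # illustrative pin; use the latest tagged release
    hooks:
      # Validates .github/workflows/*.yml against the SchemaStore workflow schema
      - id: check-github-workflows
```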
We are using a release environment in the example. Do you mean we should mention it more explicitly, or are we overlooking something here?
OK, we run the checks on the workflows. I have not specified any schema, though. I assume it will check against the standard schema for workflows?
Well, in the current README it's not used, only in the example above. But environments are auto-created. The problem is that they are created without any protection enabled. So it'd be important to call out that it's recommended to create the environment manually and set up required reviewers there, for example, so the workflow is paused before starting the job. Otherwise, it's not doing much. Setting this up means only trusted humans would be able to resume the workflow execution, letting the publishing actually happen.
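For illustration only, referencing a manually created environment from the publish job might look like this (the `release` name is just a convention; the required-reviewers rule itself is configured under the repository's Settings → Environments, not in YAML):

```yaml
jobs:
  publish:
    runs-on: ubuntu-latest
    # Create this environment by hand under Settings → Environments and
    # add "Required reviewers" so the job pauses until a human approves it.
    environment: release
```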
Yes, it runs the checks against the standard schemas published on SchemaStore. You can add another check pointing at an in-repo schema file that you make yourself.
OK, understood. Might come in handy in other projects where the YAML file is, say, the configuration for a professional engine trading on international stock markets :-)
I see. I guess we have done this almost correctly, then. I tend to use the release environment, and I have protection rules in place for it. So, I guess we ramp up the protection for this environment.
I usually call the environment
I like that idea.
Having access to OIDC opens up a can of worms related to privilege elevation during the build. And the compromised targets might not just be PyPI, but any other OIDC integrations people may have set up (and sometimes misconfigured). This also doesn't allow for adequate workflows for synchronized publishing of platform-specific wheels.
Here's a note regarding the security considerations that we have in pypi-publish, for example: https://github.com/marketplace/actions/pypi-publish#trusted-publishing.
Based on the above, I suggest modifying the README example to use two jobs and pass GHA artifacts between them. This also makes it possible to smoke-test the build even if publishing is skipped.
We're currently upgrading the PyPUG guide with similar considerations: pypa/packaging.python.org#1261.
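A sketch of the suggested two-job layout, assuming a standard PEP 517 build and the pypa/gh-action-pypi-publish action with trusted publishing (the artifact name and environment name are illustrative):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - run: python -m pip install build
      # Builds both the sdist and the wheel into dist/
      - run: python -m build
      - uses: actions/upload-artifact@v4
        with:
          name: dist  # illustrative artifact name
          path: dist/

  publish:
    needs: build  # the build is smoke-tested even when publishing is skipped
    runs-on: ubuntu-latest
    environment: release  # protect with required reviewers in repo settings
    permissions:
      id-token: write  # the only permission trusted publishing needs
    steps:
      # No checkout needed: the publish job only consumes the built artifact
      - uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
```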