GitHub Actions Debian Builds #1812
Conversation
(force-pushed from b09265d to 1c95c91)
It seems that …
Requesting review from @peternewman
Looks good to me, thanks!
Possibly @yoe or @peternewman may have some ideas about the failed test phases? I imagine these issues would affect the official Debian builds as well.
Agreed, I can't see any changes you've made that would cause this.
Note that the Debian CI fails to find some packages and so doesn't run the tests, whereas we try to test and get some symbol type errors.
If we're going down that route, we should do this eventually too:
That looks about right, thanks.
I've been testing the Python 3 changes with Python 3 libraries, so I think they'll be about right.
I think there's an "allow failure" flag that I can add to that job for now so that this can be merged without marking everyone's pulls as broken.
Yes! Nightly builds in an automatic repo would be fantastic for continually testing new features/stability! I've tried to do this manually once before and failed utterly, but that GitHub Action looks like a one-and-done solution!
OK, wow, a proper allow-failure feature that lets a job fail and be marked as failed, but not block merging, is contentious: actions/runner#2347. For now it will soft-fail but won't raise any alarms. Once the packaging issue is fixed it can be re-enabled. Should this target 0.10?
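For reference, the soft-fail behaviour described above maps to GitHub Actions' continue-on-error setting. A minimal sketch, assuming a placeholder job name and test script rather than the actual workflow in this PR:

```yaml
debian-test:
  runs-on: ubuntu-latest
  # Allow this job to fail without marking the whole check run (and PRs) as broken.
  continue-on-error: true
  steps:
    - uses: actions/checkout@v3
    - name: Run the Debian package tests
      run: ./run-debian-tests.sh  # placeholder for the real test commands
```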
More generally this could target 0.10 too, like the other ones. @DaAwesomeP regarding the test failure, we want this commit, or the important bit of it: As per: If you're feeling really keen, fancy trying a PR against 0.10 which pulls in most of @yoe's stuff from https://salsa.debian.org/wouter/ola/-/tree/master (as he's sadly not keeping his GitHub mirror up to date anymore)? I think we want to keep our postinst stuff in Debian, and the local JS libs, but most of the rest of it could be ported back from downstream, which would avoid gotchas like this compilation failure!
Will do!
I'll take a stab at it!
Amazing. As mentioned, I think cherry-picking that one commit into this branch should be enough to fix the autopkgtest stuff, which is hopefully enough to get this to the stage where it can be merged in too!
(force-pushed from f333252 to 1516247)
This change pulled from Debian by Perry Naseck <[email protected]>
https://salsa.debian.org/wouter/ola/-/commit/223f9ddaf9d4001dfc908734f0641315edbf9ea2
Co-authored-by: Wouter Verhelst <[email protected]>
Co-authored-by: Perry Naseck <[email protected]>
@peternewman See this build: https://github.com/OpenLightingProject/ola/actions/runs/4235745316/jobs/7359746616 Is there a configure flag to not require cpplint and flake8?
Huzzah! Thanks to #1835 the test part is working on bullseye! Now I just need to figure out why it insists on installing Python 2 for the RDM tests!
@peternewman OK, I figured out the issue. Debian is inferring that it needs Python 2 by looking at the shebangs in the files that it installs. You can see the commit history; I tried everything I could think of to force it to use Python 3. Python 2 works on bullseye because python2 is still available, but it fails on anything later than that. The only workaround I can think of besides changing the shebang is to somehow patch these files during the build process to use a Python 3 shebang. @yoe simply disabled the RDM tests in the Salsa repo, but I'm not sure that we want to do that here: https://salsa.debian.org/wouter/ola/-/blob/175a46c5e0f8bdadc26a35597711159ebbfeff6d/debian/control#L42-62 Thoughts?
Also a note that … https://www.debian.org/doc/packaging-manuals/python-policy/
A messy workaround would maybe be to provide our own environment … I'm wondering if there is a way we can patch all instances of …
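To make that "patch all instances" idea concrete, a rough debian/rules sketch follows. The install path, the override hook, and the shebang pattern are assumptions for illustration only, not what this PR does:

```makefile
# Sketch: rewrite python/python2 shebangs to python3 on installed scripts.
# debian/tmp and the override target are assumptions, not the actual packaging.
override_dh_auto_install:
	dh_auto_install
	find debian/tmp -type f -perm -u+x -exec \
		sed -i '1s|^#!.*python2\?\s*$$|#!/usr/bin/python3|' {} +
```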
This reverts commit a44c1de.
Hey, I solved it! Turns out you can tell … This does mean that we need to be sure that every single file with a … This also means that we're on-track to once again offer the …
@peternewman This is ready for review!
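The exact mechanism isn't fully legible in this thread (the inline code spans were lost), so the following is only an assumption about what "telling" the tooling might look like: dh_python3 does provide a --shebang option that rewrites script shebangs. A sketch, not a record of what was actually merged:

```makefile
# Assumption only: force a Python 3 shebang on packaged scripts via dh_python3,
# so Debian's dependency scanning no longer infers a Python 2 dependency.
override_dh_python3:
	dh_python3 --shebang=/usr/bin/python3
```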
@peternewman polite ping!
Looks good, just some minor potential improvements
debian/rules (outdated):

        dh_makeshlibs -V

    override_dh_clean:
        find . -type d -name '__pycache__' -print0 | xargs -0 rm -rf
I thought there might be a better clean tool and it seems it's this:
https://manpages.ubuntu.com/manpages/bionic/man1/py3clean.1.html
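For comparison, a py3clean-based override might look like the sketch below (untested here; the surrounding rules file layout is assumed):

```makefile
# Sketch: use py3clean to drop __pycache__/*.pyc artifacts, then run the normal clean.
override_dh_clean:
	py3clean . || true
	dh_clean
```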
Maybe also check what other packages use?
I got this from downstream: https://salsa.debian.org/wouter/ola/-/blob/4a0ec01039b06f8ff2d350520a1d031b86e7920b/debian/rules#L27-29
Not many results (although this is not a definitive search) for using py3clean with an override: https://www.google.com/search?q=%22override_dh_clean%22+py3clean
It seems that dh_python3 may run py3clean at some point (the prerm-py3clean step), but given that this was deemed necessary downstream, I sort of trust that it is deleting something that isn't getting removed otherwise? I suppose this should be easy to test out.
Pushed a change, I'll check the build for any rogue __pycache__ folders when it completes. https://github.com/DaAwesomeP/ola/actions/runs/4471323397
OK, I checked all of the *.deb files in the bullseye build and there are not any __pycache__ folders, *.pyc files, or *.pyo files!
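That kind of check can be scripted with dpkg-deb; a rough sketch (the glob is a placeholder for wherever the built packages end up):

```sh
# Flag any Python bytecode or cache directories shipped inside the built .debs.
for deb in ./*.deb; do
  if dpkg-deb -c "$deb" | grep -E '__pycache__|\.py[co]$'; then
    echo "Python build artifacts found in $deb" >&2
  fi
done
```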
@yoe is there some technical reason we're missing why this needs to be included? Maybe because we build and compile files for ola-rdm-tests, but you don't currently clean them correctly as you don't currently install/package ola-rdm-tests?
tbh, I'm not sure why I added that line and not the py3clean one. For sure the cache and pyc and pyo files are required not to be shipped but to be generated on Debian systems, so the fact that we remove them before packaging is not something we can get rid of, but how we do it is very much open for discussion...
    strategy:
      fail-fast: false
      matrix:
        image_tag: [bullseye, bookworm, sid]
I'm no Debian naming expert. I guess we want to use static names so people know what they're getting, although if we used dynamic ones we'd always be testing the bleeding edge in testing.
Yes, I think we want to use the static names because every so often everything shifts (i.e. testing becomes stable). We probably want to manually account for those shifts. It is also preferable to name APT repo dists with the OS version name and not stable, testing, etc., because then when the version shifts you are not suddenly receiving broken packages.
I guess we could cheat it and either build testing too, or ideally check whether testing is an alias for one of those other names. Or just remember to check/add occasionally, which is probably easier!
Actually presumably the oldest (bullseye?) will fail to build when they shuffle them all on, as its repo may go away?
The repo won't disappear right away (it becomes "oldstable", I believe). I'm mainly thinking ahead to possibly having a nightly APT repo, where we wouldn't want to break packages. It also makes it easier to go back in time if you are unfortunately stuck managing an old machine (i.e. you can still find the last build of an old release even if it is no longer being built).
I think we can just do our best to add new releases and remove the old ones when they shuffle down. I can possibly add a check for it in a later PR (just have a Python script read the CI yaml). Personally I install Debian about once every other month so I am happy to submit update PRs for the time being.
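A possible shape for that later check, sketched with assumptions: the workflow path, the job/matrix key names, and reading release codenames from the distro-info-data CSV are all illustrative choices, not part of this PR.

```python
# Sketch: warn when the CI matrix lists Debian release names that distro-info
# doesn't know about (e.g. a typo, or a name that has been retired).
import csv

import yaml  # PyYAML

WORKFLOW = ".github/workflows/debian.yaml"        # assumed path
DEBIAN_CSV = "/usr/share/distro-info/debian.csv"  # from the distro-info-data package

with open(WORKFLOW) as f:
    workflow = yaml.safe_load(f)
tags = set(workflow["jobs"]["debian-build"]["strategy"]["matrix"]["image_tag"])

with open(DEBIAN_CSV) as f:
    known = {row["series"] for row in csv.DictReader(f)}

unknown = sorted(tags - known)
if unknown:
    raise SystemExit(f"Unknown Debian release names in the CI matrix: {unknown}")
print("CI matrix release names all look valid:", sorted(tags))
```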
    debian-test:
      name: 'Debian Test ${{ matrix.image_tag }} amd64'
      needs: debian-build
      if: "always()"  # Run if some builds fail but ensure they all complete first
I don't know if you've changed this, but on a previous test run, say bookworm had failed, it still tried to run all the tests. We can't easily make them run as three two-step actions, rather than three actions linked to another three actions, can we? Can we use an if on the previous matrix step's status?
Possibly one for a later PR...
No, the real limitation is that GitHub Actions has no way to properly define matrix dependencies, only whole-job dependencies. What we can do is probably a better check than always(), so that it properly shows those matrix iterations as skipped instead of failed. I will look into it for a future PR.
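One possible shape for that future improvement, sketched under the assumption that each build job uploads an artifact named per image_tag (the artifact name and test command are placeholders, not this PR's actual workflow):

```yaml
debian-test:
  name: 'Debian Test ${{ matrix.image_tag }} amd64'
  needs: debian-build
  if: "!cancelled()"  # keep running sibling legs even if one build failed
  strategy:
    fail-fast: false
    matrix:
      image_tag: [bullseye, bookworm, sid]
  runs-on: ubuntu-latest
  steps:
    - name: Fetch the matching build artifact
      id: fetch
      uses: actions/download-artifact@v3
      continue-on-error: true
      with:
        name: debian-packages-${{ matrix.image_tag }}
    - name: Run the package tests
      # This step shows as skipped (rather than the job failing) when the
      # corresponding build produced no artifact.
      if: steps.fetch.outcome == 'success'
      run: ./run-debian-tests.sh  # placeholder
```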
Just the one minor improvement before we merge, I think.
Feel free to merge once the arch stuff is working.
Thanks again for all of this!
Done! armhf, arm64, and i386 builds in the future would be great.
I don't think I have permission to do this.
Thanks for those final tweaks.
Amazing! I'm going to use this build in a production project at the end of the month!
This pull adds preliminary automatic builds for Debian on GitHub Actions. This is most useful as a way of producing easily installed artifacts for testing new features/branches, without requiring every person who wishes to test to build it themselves.
The build workflow was modeled on the Debian GitLab CI workflow.
I have not yet added the autopkgtest workflow. It currently only builds for amd64 (GitHub Actions architecture), but cross-compilation for armhf and aarch64 should be possible. Building for Debian in this repo may alleviate the need for as many specific patches in that repository to fix Debian build issues (should that be desired).
I had some difficulties getting it to build surrounding outdated/missing Python 2 packages no longer available in different Debian releases. I simply substituted the Python 3 equivalents, but I did so without checking for proper compatibility. Please let me know if the code base is actually running on Python 2 or if these changes are correct.