
Create workflow for nightly builds #36

Open · wants to merge 4 commits into main

Conversation

@Mudb0y commented Jun 5, 2024

This uses a very similar workflow to the release one, but it gets triggered on each push or new pull request to create nightly builds of the engine. All builds get built, packaged and uploaded to GitHub Artifacts for use with a service such as nightly.link or individual downloads.

Mudb0y added 3 commits June 5, 2024 15:36
@patricus3

It's actually the only reason I build from source: to get the latest stuff faster.
@samtupy what do you think?

@Mudb0y (Author) commented Jun 5, 2024

Seems like this isn't quite as ready as I thought; I can't get the final package to upload successfully. If someone could take a look at this and fix it, that'd be great.

@samtupy (Owner) commented Jun 5, 2024

Hi,
First thanks much for all your work on this, and for the new bash tricks I learned by reading that file!
My only concern, as you might imagine, is GitHub artifacts.
Are there any storage limits for these things, and how do I dynamically retrieve the links? I'm still looking at nightly.link to understand it.
My biggest concern, however: don't we have some sort of 500 MB or 1 GB storage limit on artifacts? For example, the build jobs generate about 60 MB of data per build, and the installers which use that data then add, I think, over 100 MB, so we are talking about over 150 MB of GH artifacts nightly.
Are we sure that this is wise? Perhaps it is; I have little experience with artifacts, which is why I'm asking. I've tried several times to look up GitHub artifact storage limits, and the answers are vague: there's a four-year-old issue of people questioning the limits, people saying it's 1 GB (and then maybe 2 GB for Pro accounts or something), and people talking about needing yet another workflow to delete old artifacts. A Stack Overflow question updated just two months ago, which mentions a 0.5 GB quota, sadly seems disconcerting and single-handedly reduces my trust in GitHub artifacts for this sort of thing. Do we have any actual information on this issue? I have over 800 GB available on the FTP server used by the release workflow. The retention-days argument of the artifact action may be worth looking into.
Furthermore, do artifacts allow for any sort of testing that doesn't involve executing the entire workflow over again?
For example, if I'm only having issues with the final package job, I can simply set if: false in every other job and comment out the needs line in the one I want to test. However, I fear that with artifacts it would be difficult for the final package job to access artifacts from previous failed runs. It seems like this would dramatically increase the difficulty and time required to test the CI upon any minor change where we just need to make sure that the final package operations work successfully.
As such unless someone can ease my concerns about gh artifacts we may switch this to use the existing ftp solution, but to be clear I am certainly open to learn more about these things and only am concerned because I don't want to randomly run into issues a few days after we start using this because of some sort of storage limit.
Again though thanks for the great work on this workflow, we'll continue this discussion about whether artifacts or our custom solution are best for this so that we can get it merged soon!

@patricus3

FOSS projects don't have such limits; public repositories can use as much as they want. I've read about this quite a lot in GitHub's documentation.

@Lucas18503 (Collaborator) commented Jun 5, 2024

More specifically, I believe this is covered here:

GitHub Actions usage is free for standard GitHub-hosted runners in public repositories, and for self-hosted runners. For private repositories, each GitHub account receives a certain amount of free minutes and storage for use with GitHub-hosted runners, depending on the account's plan. Any usage beyond the included amounts is controlled by spending limits.
...
The storage used by a repository is the total storage used by GitHub Actions artifacts and GitHub Packages.

@patricus3

And you can always auto-remove old stuff, right?

@samtupy (Owner) commented Jun 5, 2024

Well, if all these limits only apply to private repositories, then that is at least one problem solved. Now for the other questions: say I modify just the packaging step of the workflow. Maybe I want to package the documentation in another format, include a changelog, anything. How do I do that without running the entire workflow, instead of just the last job, which I think would not have access to the artifacts from previous workflow runs? For example, if I'm trying to update the final build package and it fails because I forgot an = character, I must now wait an extra 30 minutes just to test that one final packaging job again, where I could easily find another mistake and then need to wait still another 30 minutes to see if I've fixed that one. GitHub's native workflow re-run feature is not an option, because to my knowledge you cannot re-run one failed job of a workflow given a new commit. Maybe one could create a secondary workflow with a manual run trigger that accepts a run-id input that can be passed to download-artifact?
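A secondary workflow along those lines could be sketched like this. Note that actions/download-artifact@v4 does accept a run-id input (together with a github-token) for pulling artifacts from another run; the packaging script name below is a made-up placeholder:

```yaml
# Hypothetical debug workflow: re-run only the packaging step against
# artifacts from an earlier (possibly failed) run.
name: Repackage from previous run
on:
  workflow_dispatch:
    inputs:
      run-id:
        description: "Run ID to pull artifacts from"
        required: true

jobs:
  repackage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Download artifacts from the given run
        uses: actions/download-artifact@v4
        with:
          run-id: ${{ inputs.run-id }}
          github-token: ${{ secrets.GITHUB_TOKEN }}
      - name: Final package
        run: ./build/package_nightly.sh  # hypothetical packaging script
```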
Then once these artifacts are uploaded, what do we do with them? I already know that by uploading the files to a custom server I can just move them somewhere and display them. How do we get these showing on nvgt.gg? Or are we planning to link people to the artifacts run page? I know that once we do release tags we can somehow put artifacts on those, but I haven't yet had time to learn how any of that works on git, so I'm a bit clueless on that.
Finally, the last comment by the author says that this workflow is currently broken and does not upload the final package anyway, but neglected to provide any error information to debug the issue. If someone wants to either spend 30 minutes per workflow run until they get this working, or wants to figure out how to just re-run the failed package job with previous artifacts, I'm certainly willing to continue considering this. I think, however, it would be better to spend my own time in the short term fixing things with NVGT that specifically require a level of experience with the NVGT source tree that nobody else has, at least in part because these comments show I'm clearly not the most experienced git user here, and thus I may not be the best candidate to fix this anyway. Of course I will get to it in time myself, but this doesn't seem like a good highest priority for me at the moment, especially since I don't understand the advantage of GitHub artifacts over the already existing system, except that it exists as a standard. If someone wants to tackle it, it would be appreciated!
I'll keep an eye out for updates, thanks for the contribution!

@Mudb0y (Author) commented Jun 5, 2024

I just pushed a potential fix to the uploading stage; it should work fine now. Once everything builds we'll find out, and I'll update this pull request if everything passes.
To answer your other questions: the way you would get a link to put the nightly builds on the site is through nightly.link, a service that gives you static direct links to the latest artifacts without requiring you to log in to GitHub to download them, which is an ideal solution. I'd set the links up too if I could, but this has to be done by one of the maintainers once this is merged.

@pauliyobo

Hello.
I guess I'm replying a bit too late, but figured I'd also share my two cents for what it's worth.
@samtupy

say I modify just the packaging step of the workflow. Maybe I want to package the documentation in another format, include a changelog, anything. How do I do that without running the entire workflow instead of just the last job which I think would not have access to the artifacts from previous workflow builds?

I'm not entirely sure I have understood the question, but generally, what you could do to isolate documentation (using it as an example here) is either build the documentation in its own workflow, or put every single step behind a flag. For example, you can skip a step if certain criteria are not met, e.g. the commit message contains (or does not contain) a particular word; this way you can skip or run certain jobs based on whatever criteria you choose.
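A commit-message flag of that kind could look like this (the "[docs]" marker and script name are made-up examples):

```yaml
# Hypothetical: run the documentation job only when the commit message
# contains "[docs]".
jobs:
  docs:
    if: contains(github.event.head_commit.message, '[docs]')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./build/make_docs.sh  # hypothetical docs build script
```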
Ideally, though, it would be good to set up linting and validation steps in order to catch these kinds of errors before starting to build in the first place, which should shorten the wait time considerably.
As for downloading artifacts, sadly it is not possible to natively download artifacts from jobs that aren't part of the same workflow run using the download-artifact action. To do that you'd have to create a script that interacts with the GitHub API to retrieve the latest ones, but with linting and caching set up, I think it shouldn't be as bad in terms of waiting time. Of course we'd need to understand whether this approach is applicable in our context; I'm not entirely familiar with the codebase.
Releases are actually relatively easy to handle; there's a dedicated action you can use, found here.
Not only can you attach artifacts to the release, but you can also generate release notes based on commits. I recommend setting the release as a draft so that you can manually edit the information before publishing.
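The draft-release-with-artifacts approach described above can be sketched with a community release action. The specific action below (softprops/action-gh-release) is an assumption, since the original comment's link isn't preserved here:

```yaml
# Sketch: publish a draft release with attached artifacts and
# auto-generated notes. Action choice and file glob are assumptions.
- name: Create draft release
  uses: softprops/action-gh-release@v2
  if: startsWith(github.ref, 'refs/tags/')
  with:
    draft: true                    # review before publishing
    generate_release_notes: true   # notes built from commit history
    files: release/*.zip
```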
I am happy to try to help with this issue if this is still something you're interested in, and I hope I haven't misunderstood or missed anything.
Feel free to ask more questions if I was not clear.

@JessicaTegner

What is the status of this, @Mudb0y @samtupy?
I think getting this off the ground could significantly help NVGT be adopted by the larger community.

To grab the links automatically (or have static links to the latest nightly version), you could do something similar to what I do in my projects, which is to generate a "latest" release/tag and attach the artifacts to that.
That way, the link becomes nvgt/releases/latest/nvgt-xyz.extension

Look here for a reference (under the publisher-latest job): https://github.com/JessicaTegner/pypandoc/blob/master/.github/workflows/ci.yaml
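The rolling "latest" release described above could be sketched roughly as follows (job name, action choice, and file glob are all assumptions; the linked pypandoc workflow is the authoritative reference):

```yaml
# Hypothetical job: keep a rolling "latest" prerelease carrying the most
# recent nightly artifacts, so download URLs stay stable.
jobs:
  publish-latest:
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions:
      contents: write  # needed to create/update the release
    steps:
      - uses: actions/download-artifact@v4  # collect artifacts from this run
      - uses: softprops/action-gh-release@v2
        with:
          tag_name: latest
          prerelease: true
          files: "**/*.zip"
```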

@samtupy (Owner) commented Oct 27, 2024

Hi,

So basically we need to redo this such that the nightly build workflow calls the release workflow with a parameter. The way this pull request is currently set up would require modifying both the nightly build workflow and the release workflow every time something about NVGT's build changes, when in reality the release workflow already does everything required for nightly builds, save for uploading them somewhere different with modified filenames. So we need to modify this workflow to call the release workflow with some sort of nightly argument that causes artifacts to be uploaded somewhere other than the official releases page, instead of doubling the maintenance cost by trying to maintain multiple workflows that do 95% of the same things.

Practically this means copying the artifact usage from the workflow in this PR to release.yml, as it's best to use artifacts in the build-to-packaging stages so that people can test most of the workflow on repo forks, and then adding an argument to release.yml that causes it to run a nightly_package step instead of the final_package step that runs for an official release.

In short, we don't need 2 workflows for this, as it causes a very unnecessary amount of repeated instructions and increased maintenance.

I'm planning to solve this once we start using the github releases page in general. At that time, the release.yml workflow will be updated to use artifacts to upload to the releases page anyway, so I can add that extra argument while I'm at it.
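The call-with-a-parameter setup described above maps onto GitHub's reusable-workflow mechanism. A sketch, where the input name and the branching comments are assumptions about how release.yml might be restructured:

```yaml
# .github/workflows/release.yml (sketch)
on:
  workflow_call:
    inputs:
      nightly:
        type: boolean
        default: false
# ... existing build jobs stay as they are ...
# In the packaging stage, branch on the flag:
#   nightly_package runs when ${{ inputs.nightly }} is true,
#   final_package runs otherwise.
---
# .github/workflows/nightly.yml (sketch) — all it does is call release.yml
name: Nightly
on:
  push:
    branches: [main]
jobs:
  nightly:
    uses: ./.github/workflows/release.yml
    with:
      nightly: true
```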

Aside from this, the last comment from the author of this pr was that he'll update here if everything passes, which never happened, so I'm unsure as to the status on that front, which is fine since as I said I think we'll mostly use release.yml for this, and if we need a nightly.yml file, all it will do is call release.yml with a certain argument or environment variable or something set to make the release workflow function a bit differently.

@JessicaTegner

Good to hear, @samtupy. As you'll probably figure out when you look into it (or if you look at the reference I linked above), what I basically do for release versus nightly builds is publish only on main (while still running tests and building/generation on forks and other branches): if a tag is on the commit (like vx.y.z), do a release build; if not, do a nightly build.
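That tag-versus-nightly split can be expressed with ref conditions on the jobs (a sketch; job names and steps are illustrative):

```yaml
# Sketch: run a release build on version tags, a nightly build on plain
# pushes to main.
jobs:
  release:
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    steps:
      - run: echo "tagged commit -> release build"
  nightly:
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - run: echo "push to main -> nightly build"
```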

6 participants