goPackages: make it easier to automate updates #13819
cstrahan wants to merge 1 commit into NixOS:master from
Conversation
This reverts commit 7760d6e.
This reverts commit 1371084. Introducing yet another non-standard package style / auto-updater without proper discussion is really not a good idea. We really need a proper process for this.
GitHub won't show anything in the "Files Changed" tab, so if you want to review the changes without pulling this down, you can look at https://github.com/NixOS/nixpkgs/pull/13819.patch

Maybe it's better to view the individual files in GitHub's UI, or with some unusual tool (perhaps word-diff).
Yeah, you can also look at the individual files and compare with the parent commit. But I figure it's much easier to see what exactly is going on by just looking at the patch. Conceptually, this is a really simple, mechanical change (in fact, the JSON was dumped by a program).

Oh, I see. I only looked at the beginning of the patch, and I thought it looked like first adding the whole of the new file and then removing the whole of the old file. Now I see I was wrong.
/cc @edolstra @copumpkin @viric @hrdinka Feedback in the positive, negative or passive would be greatly appreciated.

and @wkennington
With what I propose, we won't lose the ability to continue updating packages as we currently do, that is, manually. When a tool is created, I would definitely want to see it documented, so people could benefit from it. Unlike the Haskell case, however, we could continue to update packages by hand without having them clobbered and rolled back upon running an update tool (which is something I've had happen long ago when I first tried contributing something Haskell related).
I could make the format Nix, and just parse (or evaluate) the file in the (as of yet hypothetical) tool I suggest using to update package revisions.
If it's possible to pull off a single, universal updater, then I agree -- that sounds great. However, I think that's a monumental effort, vs creating a few specialized tools, and providing really good documentation on how to use them (which is precisely what I intend to do once I've written such a tool, but that comes next). I would argue that the burden of needing to know how to use a tool for automated or assisted updates far outweighs the present burden of updating Go packages.
Speaking of burden, I think there's an important difference to keep in mind between Go and C packages. The release cycle for C libs usually spans something between months and years; the Go release cycle is effectively each commit -- if you don't keep up with changes, your stuff is broken. One change in a single library causes a cascade of required version bumps. C libs have versioned releases; Go packages don't. Trying to keep up with changes by hand is going to become an intractable ordeal as we package more Go packages. I wanted to update
I suppose that, as an alternative to this PR, I could write my own tool that parses the

You could try using my updater. It's not precise at adding new dependencies.

@wkennington Where is your updater hosted?
https://github.com/triton/triton/blob/master/pkgs/development/tools/godep/updatePackageAndDeps.sh

@wkennington Thanks
I'm edging towards just writing my own personal tool to update

But it would work. And it would get us out of the predicament that we're currently in; if you look at

Either way, I need to get unblocked. I need to use a bunch of Hashicorp's stuff for some upcoming projects, and I don't want to put in 10 hours updating it all if I'm just going to put in another 10 hours every two to four weeks. And, ideally, everyone gets to benefit from whatever I do. Nothing is more crushing than knowing someone else has put in the hours to do something after you've gone through the exact same steps, and you realize that -- had they contributed their work -- you could have spent an hour or two with friends, instead of doing boring, uninspiring, monotonous grunt work in front of a computer screen.

Will this only work for GitHub repos? What should we do with projects hosted on Bitbucket and others, then?
Hey, thanks for taking a look at this :).
As this stands, this just attempts to factor out the GitHub repos, as they represent the overwhelming majority of Go packages (there are only 4 non-GitHub packages, while there are 344 GitHub packages). I would also be fine with extending this to support other hosts, and I'm open to suggestions for how to do so (i.e. what should

That's actually something that I think needs to be discussed. For big projects that use

Personally, within the context of Go packaging, I feel a _lot_ more comfortable using the vendored packages, as they have been vetted by the authors. I greatly appreciate your work on

_So some questions remain:_
Personally, this makes sense to me:
Whatever we do, I think we must keep in mind that the perfect is the enemy of the good. The Go community has yet to settle on any meaningful packaging/versioning story, and given the built-in support for vendoring coming in Go 1.6, I doubt that's going to change any time soon.
On vendoring, I think it's undesirable in the context of Nix, as much as it's appealing from the "developer chose the versions" angle. That's probably a whole separate can of worms to discuss in another thread, as it's broader than just Go, but it also applies heavily to Go, as you say.
@copumpkin I completely agree about it being undesirable. I'm torn, really. Regardless, I want to reiterate that this PR is focused on how to improve the current state of affairs, where we try to provide a single, curated set of Go packages. Our Go stuff is woefully outdated, and I want an effective way to keep it up to date. I will soon be recording a professional-grade, free instructional course on Nix/NixOS/NixOps, and I need this stuff updated before I can finish the course outline and start recording.
Why can't we just generate Nix expressions from Godeps.json files for each project? Maintaining one big set of working packages is not practical for any language except Haskell, because they already do all the work. It's an exponential problem that's made worse by the fact that each commit is a potential candidate in Go (but it's also impractical for all the other languages, IMO). Especially for Go, it doesn't make much difference on the closure size, since

At least by using the project's Godeps.json, we will be using the exact same set of dependencies that they vetted as working with the project. It means that if we have any issue to report to upstream, we will also be much more credible. I think that's quite important if we want to be perceived as a reliable platform.
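Such a generator could be quite small. Here is a hedged sketch of the idea: `ImportPath` and `Rev` are godep's actual field names in Godeps.json, but the sample data, the function name, and the emitted Nix template are illustrative, not an existing tool.

```python
import json

# Sketch: turn one entry of a project's Godeps.json into a
# fetchFromGitHub call. "ImportPath" and "Rev" are godep's real
# field names; the sample repo data below is illustrative.
godeps = json.loads("""
{
  "ImportPath": "github.com/hashicorp/serf",
  "Deps": [
    {"ImportPath": "github.com/armon/go-metrics",
     "Rev": "345426c77237ece5dab0e1605c3e4b35c3f54757"}
  ]
}
""")

def to_fetch_expr(dep):
    # GitHub-only, mirroring the scope of this PR; other hosts
    # would need their own fetchers.
    host, owner, repo = dep["ImportPath"].split("/")[:3]
    assert host == "github.com"
    return ('fetchFromGitHub {\n'
            '  owner = "%s";\n'
            '  repo = "%s";\n'
            '  rev = "%s";\n'
            '  sha256 = "<fill in via nix-prefetch>";\n'
            '}' % (owner, repo, dep["Rev"]))

print(to_fetch_expr(godeps["Deps"][0]))
```

Note that the sha256 still has to be prefetched per revision, which is why fully automatic generation needs a fetch step on top of the JSON-to-Nix translation.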
I have one big thing (that may fail) in mind, and I hope that I find some time to tackle it next week. I want to start from the roots and extract all

But I also agree with @cstrahan that the perfect is the enemy of the good. So if this PR could help us to update current packages quickly, I would merge it. As @cstrahan said, it won't make it harder to update packages by hand, but it gives us the possibility to automate updates.
I tried to generate complete separate derivations for a few big packages with
It took me a while, but I have a basic PoC of the Go packages rework that I'm thinking about. A few things you can see there:

Many things need to be done, like using
@cstrahan what do you think about my proposal?
This pull request has been superseded.
This is a new PR based on the old #13701, which was merged in ee3b295 and reverted in 7760d6e (see there for commentary re: process).
To restate what this does:
What this does
This moves revisions and checksums to a separate, machine-readable (specifically, JSON) file.
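For illustration, here is the kind of shape such a file might have, and how trivially a tool can consume it. The field names, revisions, and hash below are made up for this sketch, not the PR's exact schema:

```python
import json

# Illustrative shape for the machine-readable deps file; the field
# names, revision, and sha256 here are made-up placeholders, not the
# PR's exact schema.
deps_json = """
{
  "serf": {
    "owner": "hashicorp",
    "repo": "serf",
    "rev": "668982d8f90f5eff4a766583c1286393c1d27f68",
    "sha256": "0000000000000000000000000000000000000000000000000000"
  }
}
"""

deps = json.loads(deps_json)

# An updater can rewrite a pinned revision without parsing any Nix
# syntax, which is the whole point of keeping this data in JSON.
deps["serf"]["rev"] = "cafebabe00000000000000000000000000000000"
print(deps["serf"]["rev"][:8])
```

A Nix-format file would need a real evaluator (or a fragile parser) to do the same round-trip.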
Why?
Presently, we have to go chasing each dependency of each Go package in order to update revisions. This requires opening a browser, going to each GitHub repo, looking at the commits and tags, thinking long and hard about which revision to choose, opening a terminal to get the sha256, then plugging the revision and sha256 back into the `go-packages.nix` file. I tried updating the `serf` package by hand, and that took almost 3 hours.

Ideally, a tool would provide a central way to pull the last couple of commits and tags from each GitHub repo, and either completely automatically update the package set (or a given subset, like `serf` and its closure), or provide a UI that presents the revisions from upstream so you can make intelligent decisions about which revision to use (e.g. there are 4 new commits on master, but version 1.0.0 was tagged just a couple of days ago... I think I'll go with the tag).

However, this commit doesn't place a dependency on such tooling (in fact, none exists as of yet), so we can consider the design later.
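The tag-versus-commit choice described above could be encoded as a simple heuristic. A minimal sketch, where the data, the function name, and the 14-day window are all hypothetical:

```python
from datetime import date

# Made-up upstream state: two untagged commits on master and one tag.
commits = [
    {"rev": "aaa1111", "date": date(2016, 3, 10)},
    {"rev": "bbb2222", "date": date(2016, 3, 9)},
]
tags = [
    {"name": "v1.0.0", "rev": "ccc3333", "date": date(2016, 3, 8)},
]

def pick_revision(commits, tags, tag_window_days=14):
    # If a tag was cut close to the tip of master, trust the
    # maintainer's release over a handful of newer commits.
    newest = max(c["date"] for c in commits)
    recent = [t for t in tags
              if (newest - t["date"]).days <= tag_window_days]
    if recent:
        return max(recent, key=lambda t: t["date"])["rev"]
    # Otherwise fall back to the tip of master.
    return max(commits, key=lambda c: c["date"])["rev"]

print(pick_revision(commits, tags))
```

An interactive UI could present the same candidates and let the human override the default.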
Why JSON, and not Nix?
The Node and Haskell package sets are good examples of where we can update the whole package set by regenerating the Nix expressions from scratch. That works for those ecosystems because they have a culture of tagging releases. They also explicitly state their dependencies and the version bounds thereof. So, with minimal tweaks, a separate file (e.g. `pkgs/development/haskell-modules/configuration-common.nix`) can extend the few troublesome packages with patches and such.

However, that doesn't work for Go packages. Go projects don't explicitly list their dependencies and version bounds. Go projects don't tag releases[1]. If we want to have any chance of updating Go packages in an assisted or automatic way, we need to know what revisions we currently package.
What this doesn't do
[1]: If you're unfamiliar with the Go ecosystem, I repeat: they don't tag releases. They don't have any sort of package repository, so they all pull from each other's master and hope everything works out.