feat(datasource): add debian datasource #30071
base: main
Conversation
Thanks for your contribution, and the well-documented functionality!
This is my first pass, as there is a lot going on here and we have to be sure that the caching is set up efficiently.
Would this work for every APT repository?
lib/modules/datasource/deb/index.ts
Outdated
 * @returns An array of component URLs.
 * @throws Will throw an error if required parameters are missing from the URL.
 */
constructComponentUrls(registryUrl: string): string[] {
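To illustrate the reviewer's suggestion, here is a minimal sketch of `constructComponentUrls` as a standalone exported function rather than a class method. The query-parameter names (`release`, `components`, `binaryArch`) and the resulting URL layout are taken from the PR description; the body itself is a hypothetical reconstruction, not the PR's actual implementation.

```typescript
// Sketch: standalone function (not a method), per review feedback.
// Builds one component URL per entry in the `components` query parameter.
export function constructComponentUrls(registryUrl: string): string[] {
  const url = new URL(registryUrl);
  const release = url.searchParams.get('release');
  const components = url.searchParams.get('components');
  const binaryArch = url.searchParams.get('binaryArch');
  if (!release || !components || !binaryArch) {
    // Mirrors the documented behavior: throw when required parameters are missing.
    throw new Error('Missing required parameters in registry URL');
  }
  return components
    .split(',')
    .map(
      (component) =>
        `${url.origin}${url.pathname}/dists/${release}/${component}/binary-${binaryArch}`,
    );
}
```

With the example URL from the PR description, this yields the two `binary-amd64` component URLs shown there.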
Please use functions and not methods where possible. Also move functions into separate files.
lib/modules/datasource/deb/index.ts
Outdated
  });
};

const getReleaseParam = (url: URL): string => {
No local functions, please.
We should start simple and then start to optimize.
Thx for the initial review, I will fix the findings soon.
It should work with every APT repository that follows the https://wiki.debian.org/DebianRepository standard.
  });
});

const getComponentUrl = (
use proper function declarations
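For illustration, the style the reviewer is asking for replaces arrow-function consts such as `const getReleaseParam = (url: URL): string => { … }` with plain function declarations. The helper name comes from the diff above; its body here is a hypothetical sketch assuming a `release` query parameter, not the PR's actual code.

```typescript
// Sketch: a proper function declaration instead of an arrow-function const.
function getReleaseParam(url: URL): string {
  // Assumed parameter name, matching the registry URL format in the PR description.
  const release = url.searchParams.get('release');
  if (!release) {
    throw new Error('Missing release parameter in registry URL');
  }
  return release;
}
```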
lib/modules/datasource/deb/readme.md
Outdated
<!-- prettier-ignore -->
!!! tip
    We recommend you try `loose` versioning for distribution packages first.
    This is because the version number usually doesn't match Renovate's default `semver-coerced` specification.
No, don't set any versioning; the datasource should set `deb` versioning as default, which should work in those cases.
@@ -33,7 +33,7 @@ First you would set a custom manager in your `renovate.json` file for `Dockerfil

Then you would put comments in your Dockerfile, to tell Renovate where to find the updates:

-```docker
+```dockerfile
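For context, the readme section being edited here describes a custom regex manager in `renovate.json`. A rough sketch of what such a configuration might look like follows; the field names (`customManagers`, `matchStrings`, `registryUrlTemplate`, `datasourceTemplate`) are standard Renovate regex-manager options, but the regex, the comment format, and the datasource name `deb` (inferred from the PR's module path) are illustrative assumptions, not the PR's documented example.

```json
{
  "customManagers": [
    {
      "customType": "regex",
      "fileMatch": ["^Dockerfile$"],
      "matchStrings": [
        "# renovate: release=(?<release>[a-z-]+) depName=(?<depName>\\S+)\\nENV \\S+_VERSION=(?<currentValue>\\S+)"
      ],
      "registryUrlTemplate": "https://ftp.debian.org/debian?release={{release}}&components=main&binaryArch=amd64",
      "datasourceTemplate": "deb"
    }
  ]
}
```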
Co-authored-by: Sebastian Poxhofer <[email protected]>
…t/debian-datasource
Changes
Context
Resolves: #7041, #24906
Concept
Based on the discussion and the reused implementation (see the related issue for more details):
1. **Evaluation of Provided or Default Registry URL**:
   `https://ftp.debian.org/debian?release=bullseye&components=main,contrib&binaryArch=amd64`
   is evaluated to:
   - `.../debian/dists/bullseye/main/binary-amd64`
   - `.../debian/dists/bullseye/contrib/binary-amd64`
2. **Check for Existing Compressed Package**: check whether the `Packages.gz` file has already been downloaded and extracted.
3. **Iterate through Package Index**
4. **Return relevant release information**
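The iteration step above can be sketched as follows. A Debian `Packages` index is a series of paragraphs separated by blank lines, each paragraph consisting of `Field: value` lines; this minimal parser pulls out the package name and version. It is a hedged illustration of the concept, with its own hypothetical `PackageRelease` shape, not the PR's actual parsing code.

```typescript
// Minimal shape for the extracted release information (illustrative only).
interface PackageRelease {
  name: string;
  version: string;
}

// Parse a decompressed Debian Packages index: paragraphs separated by
// blank lines, each with `Field: value` lines (e.g. Package:, Version:).
function parsePackagesIndex(content: string): PackageRelease[] {
  const releases: PackageRelease[] = [];
  for (const paragraph of content.split(/\n\n+/)) {
    const fields = new Map<string, string>();
    for (const line of paragraph.split('\n')) {
      const match = /^([A-Za-z-]+):\s*(.*)$/.exec(line);
      if (match) {
        fields.set(match[1], match[2]);
      }
    }
    const name = fields.get('Package');
    const version = fields.get('Version');
    if (name && version) {
      releases.push({ name, version });
    }
  }
  return releases;
}
```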
Documentation (please check one with an [x])
- [x] included documentation for new datasource
How I've tested my work (please select one)
I have verified these changes via:
- [x] Real Test
Open Points
Does the current caching implementation for the datasource need to be reworked? Can the Renovate team provide a concept/details for this? (e.g., invalidation of the compressed file based on its timestamp)
Is the current approach generic enough? Are there any points that were not considered?
If significant rework is required, please let me know in advance.
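On the open caching question, one possible direction is invalidating the cached `Packages.gz` based on the file's timestamp, as the parenthetical suggests. The sketch below is purely hypothetical (the TTL value and file layout are assumptions, and the Renovate team may well prefer HTTP conditional requests such as `If-Modified-Since` instead):

```typescript
import * as fs from 'fs';

// Assumed freshness window; not a Renovate default.
const CACHE_TTL_MS = 24 * 60 * 60 * 1000;

// Returns true when the cached file is missing or older than the TTL,
// based on its modification timestamp.
function isCacheStale(filePath: string, now: number = Date.now()): boolean {
  try {
    const stats = fs.statSync(filePath);
    return now - stats.mtimeMs > CACHE_TTL_MS;
  } catch {
    return true; // a missing file counts as stale
  }
}
```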