-
Notifications
You must be signed in to change notification settings - Fork 9
Add maven artifact pinning support #412
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
WalkthroughThis change extends repository configuration by adding Maven and Spark installation functions in the WORKSPACE file. The update integrates JSON configuration support via a new parameter Changes
Sequence Diagram(s)sequenceDiagram
participant W as WORKSPACE
participant L as load_all_dependencies
participant M as Maven Installer
participant S as Spark Installer
participant R as repository()
W->>L: load_all_dependencies()
W->>M: pinned_maven_repo_install()
M->>R: Call repository(..., maven_install_json)
W->>S: pinned_spark_repo_install()
S->>R: Call repository(..., maven_install_json)
Possibly related PRs
Suggested reviewers
Poem
Warning Review ran into problems🔥 ProblemsGitHub Actions and Pipeline Checks: Resource not accessible by integration - https://docs.github.com/rest/actions/workflow-runs#list-workflow-runs-for-a-repository. Please grant the required permissions to the CodeRabbit GitHub App under the organization or repository settings. 🪧 TipsChatThere are 3 ways to chat with CodeRabbit:
Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments. CodeRabbit Commands (Invoked using PR comments)
Other keywords and placeholders
CodeRabbit Configuration File (
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 2
🔭 Outside diff range comments (1)
tools/build_rules/dependencies/maven_repository.bzl (1)
151-152:⚠️ Potential issueRemove duplicate dependency.
vertx-unitis listed twice with the same version."io.vertx:vertx-unit:4.5.10", - "io.vertx:vertx-unit:4.5.10",
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (5)
WORKSPACE(1 hunks)tools/build_rules/dependencies/defs.bzl(2 hunks)tools/build_rules/dependencies/load_dependencies.bzl(1 hunks)tools/build_rules/dependencies/maven_repository.bzl(1 hunks)tools/build_rules/dependencies/spark_repository.bzl(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: bazel_config_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (4)
tools/build_rules/dependencies/load_dependencies.bzl (1)
21-21: LGTM!The change enables JSON-based configuration for Maven installations.
tools/build_rules/dependencies/defs.bzl (1)
14-14: LGTM!The new optional parameter maintains backward compatibility.
Also applies to: 31-31
tools/build_rules/dependencies/spark_repository.bzl (1)
9-9:✅ Verification successful
Verify that spark_install.json exists.
🏁 Script executed:
#!/bin/bash # Check if spark_install.json exists in the repository root test -f spark_install.json || echo "spark_install.json not found"Length of output: 28
Spark install file verified.
- Line 9 in
tools/build_rules/dependencies/spark_repository.bzlcorrectly references//:spark_install.json.- The repository root contains
spark_install.jsonas confirmed by the shell check.tools/build_rules/dependencies/maven_repository.bzl (1)
11-163:❓ Verification inconclusive
Verify dependency versions.
Several dependencies use very recent versions. Ensure they are stable and compatible.
🏁 Script executed:
#!/bin/bash # Check Maven Central for latest stable versions and any reported issues for pkg in "io.vertx:vertx-core:4.5.10" "ch.qos.logback:logback-classic:1.5.6" "software.amazon.awssdk:dynamodb:2.30.13" do artifact=$(echo $pkg | cut -d':' -f1,2) version=$(echo $pkg | cut -d':' -f3) curl -s "https://search.maven.org/solrsearch/select?q=g:$(echo $artifact | cut -d':' -f1)+AND+a:$(echo $artifact | cut -d':' -f2)&rows=1&wt=json" | jq -r '.response.docs[0].latestVersion' doneLength of output: 1665
Action: Verify and confirm dependency versions for stability.
- io.vertx:vertx-core: Currently set to 4.5.10; Maven Central reports 5.0.0.CR5 (a release candidate). Confirm if the current version is intentional.
- ch.qos.logback:logback-classic: Set to 1.5.6; update available to 1.5.16. Verify upgrade compatibility.
- software.amazon.awssdk:dynamodb: Using 2.30.13; Maven Central shows 2.30.24. Assess if an update is needed.
Ensure all dependencies are intentionally pinned to stable versions.
| load("@maven//:defs.bzl", pinned_maven_repo_install = "pinned_maven_install") | ||
| pinned_maven_repo_install() | ||
|
|
||
| load("@spark//:defs.bzl", pinned_spark_repo_install = "pinned_maven_install") |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Fix the alias name for spark repository install.
-load("@spark//:defs.bzl", pinned_spark_repo_install = "pinned_maven_install")
+load("@spark//:defs.bzl", pinned_spark_repo_install = "pinned_spark_install")📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| load("@spark//:defs.bzl", pinned_spark_repo_install = "pinned_maven_install") | |
| load("@spark//:defs.bzl", pinned_spark_repo_install = "pinned_spark_install") |
| pinned = False, | ||
| maven_install_json = "//:maven_install.json", |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Resolve conflicting pinning configuration.
pinned is set to False but maven_install_json is provided. This configuration appears contradictory since JSON pinning requires pinned=True.
- pinned = False,
+ pinned = True,📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| pinned = False, | |
| maven_install_json = "//:maven_install.json", | |
| pinned = True, | |
| maven_install_json = "//:maven_install.json", |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 0
🧹 Nitpick comments (1)
devnotes.md (1)
91-98: Concise Documentation: Pinning Artifacts
The new section clearly describes Maven (and Spark) artifact pinning. Consider rewording “can take up a very long time” to “can be time-consuming” for brevity.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (1)
devnotes.md(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: bazel_config_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (1)
devnotes.md (1)
99-108: Clear Command Instructions
The re-pin commands are clear and correct. Ensure that the JSON file updates are committed as noted.
## Summary Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds. Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and check-in the updated json files. ``` REPIN=1 bazel run @maven//:pin REPIN=1 bazel run @spark//:pin ``` ## Checklist - [ ] Added Unit Tests - [x] Covered by existing CI - [ ] Integration tested - [ ] Documentation update <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Integrated enhanced repository management for Maven and Spark, providing improved dependency installation. - Added support for JSON configuration files for Maven and Spark installations. - **Chores** - Updated documentation to include instructions on pinning Maven artifacts and managing dependency versions effectively. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
## Summary Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds. Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and check-in the updated json files. ``` REPIN=1 bazel run @maven//:pin REPIN=1 bazel run @spark//:pin ``` ## Checklist - [ ] Added Unit Tests - [x] Covered by existing CI - [ ] Integration tested - [ ] Documentation update <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Integrated enhanced repository management for Maven and Spark, providing improved dependency installation. - Added support for JSON configuration files for Maven and Spark installations. - **Chores** - Updated documentation to include instructions on pinning Maven artifacts and managing dependency versions effectively. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
## Summary Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds. Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and check-in the updated json files. ``` REPIN=1 bazel run @maven//:pin REPIN=1 bazel run @spark//:pin ``` ## Checklist - [ ] Added Unit Tests - [x] Covered by existing CI - [ ] Integration tested - [ ] Documentation update <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Integrated enhanced repository management for Maven and Spark, providing improved dependency installation. - Added support for JSON configuration files for Maven and Spark installations. - **Chores** - Updated documentation to include instructions on pinning Maven artifacts and managing dependency versions effectively. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
## Summary Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds. Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and check-in the updated json files. ``` REPIN=1 bazel run @maven//:pin REPIN=1 bazel run @spark//:pin ``` ## Checklist - [ ] Added Unit Tests - [x] Covered by existing CI - [ ] Integration tested - [ ] Documentation update <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Integrated enhanced repository management for Maven and Spark, providing improved dependency installation. - Added support for JSON configuration files for Maven and Spark installations. - **Chores** - Updated documentation to include instructions on pinning Maven artifacts and managing dependency versions effectively. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
## Summary Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds. Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and cheour clients-in the updated json files. ``` REPIN=1 bazel run @maven//:pin REPIN=1 bazel run @spark//:pin ``` ## Cheour clientslist - [ ] Added Unit Tests - [x] Covered by existing CI - [ ] Integration tested - [ ] Documentation update <!-- This is an auto-generated comment: release notes by coderabbit.ai --> ## Summary by CodeRabbit - **New Features** - Integrated enhanced repository management for Maven and Spark, providing improved dependency installation. - Added support for JSON configuration files for Maven and Spark installations. - **Chores** - Updated documentation to include instructions on pinning Maven artifacts and managing dependency versions effectively. <!-- end of auto-generated comment: release notes by coderabbit.ai -->
Summary
Added pinning support for both our maven and spark repositories so we don't have to resolve them during builds.
Going forward whenever we make any updates to the artifacts in either maven or spark repositories, we would need to re-pin the changed repos using following commands and check-in the updated json files.
Checklist
Summary by CodeRabbit
New Features
Chores