# Scope all workflow paths to self #140
Conversation
**Walkthrough:** The pull request refines the GitHub Actions workflow trigger paths across multiple workflow files so that each workflow is also triggered by changes to its own file.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (4)
- .github/workflows/test_python.yaml (1 hunks)
- .github/workflows/test_scala_fmt.yaml (1 hunks)
- .github/workflows/test_scala_no_spark.yaml (2 hunks)
- .github/workflows/test_scala_spark.yaml (1 hunks)
✅ Files skipped from review due to trivial changes (2)
- .github/workflows/test_python.yaml
- .github/workflows/test_scala_fmt.yaml
🔇 Additional comments (1)
.github/workflows/test_scala_spark.yaml (1)
9-9: LGTM! Path scoping changes look correct.
The addition of .github/workflows/test_scala_spark.yaml to the trigger paths ensures the workflow runs when modified while maintaining the existing Spark-related triggers. This change aligns with the PR objective of scoping workflow paths to self.
Let's verify that all Spark-related paths are included:
Also applies to: 16-16
✅ Verification successful
Let me verify the paths in the workflow file against the discovered Spark-related files.
Paths configuration is complete and accurate
The current path triggers in the workflow file:
- `spark/**`: covers all Spark-related source files and tests found in the repository
- `build.sbt`: captures any Scala/SBT build configuration changes
- `.github/workflows/test_scala_spark.yaml`: self-referential, so the workflow runs on its own changes
This matches perfectly with the discovered files from our verification, as all Spark-related files are under the spark/ directory, and the build configuration is covered by build.sbt. No additional paths need to be included.
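As a local sanity check, the coverage claim above can be approximated in the shell. `matches_trigger` is a hypothetical helper (not part of any tool), and the file names passed to it are made up for illustration; note that `case` pattern matching is only an approximation of Actions' own path-filter semantics, since `*` in a `case` pattern already crosses `/`.

```shell
#!/bin/bash
# Hypothetical helper: approximate GitHub's path filtering with shell
# `case` patterns. In pattern-matching context, `*` and `**` both match
# across `/`, which is close enough for a quick coverage check.
matches_trigger() {
  file="$1"; shift
  for pat in "$@"; do
    # shellcheck disable=SC2254  # pattern must stay unquoted to match
    case "$file" in
      $pat) return 0 ;;
    esac
  done
  return 1
}

# The three trigger paths listed in the review above; the source file
# names below are invented for illustration.
matches_trigger 'spark/src/main/scala/Join.scala' \
  'spark/**' 'build.sbt' '.github/workflows/test_scala_spark.yaml' \
  && echo 'spark source: covered'
matches_trigger 'api/src/Api.scala' \
  'spark/**' 'build.sbt' '.github/workflows/test_scala_spark.yaml' \
  || echo 'api source: not covered'
```

Running this prints `spark source: covered` and `api source: not covered`, matching the reviewer's conclusion that everything Spark-related sits under `spark/`.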
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify all Spark-related paths in the repository
# to ensure we haven't missed any important triggers
# Find all Spark-related source files and configurations
fd . spark/ -t f | grep -v '/target/' | sort | uniq
Length of output: 6576
Script:
#!/bin/bash
# Get the current paths configuration from the workflow file
cat .github/workflows/test_scala_spark.yaml
Length of output: 3225
```diff
       - 'online/**'
       - 'api/**'
-      - '.github/workflows/**'
+      - '.github/workflows/test_scala_no_spark.yaml'
```
💡 Codebase verification
Workflow paths need to be updated to include missing Scala modules
The current workflow paths are missing several Scala modules that are being tested in the jobs section:
- `hub/**` is missing, but hub tests are run in the jobs
- `service/**` is missing, but service tests are run in the jobs
Additionally, there are other Scala modules in the codebase that might need consideration:
- `cloud_aws/**`
- `cloud_gcp/**`
- `quickstart/mongo-online-impl/**`
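A follow-up check for these gaps can be scripted. `check_paths` is a hypothetical helper sketch: the module names come from the review comment, not from the workflow file itself, and it assumes trigger paths are written as single-quoted `- 'pattern'` lines as in the diff shown above.

```shell
#!/bin/bash
# Hypothetical sketch: report required trigger paths that a workflow
# file does not list in its on.push.paths section. Assumes paths appear
# as single-quoted `- 'pattern'` lines.
check_paths() {
  wf="$1"; shift
  [ -f "$wf" ] || { echo "no such file: $wf"; return 1; }
  for required in "$@"; do
    grep -qF -- "- '$required'" "$wf" \
      || echo "missing trigger path: $required"
  done
}

# Modules the review flags as exercised by jobs but absent from the triggers.
check_paths .github/workflows/test_scala_no_spark.yaml 'hub/**' 'service/**'
```

Run from the repository root, this would print one `missing trigger path: …` line per gap, which makes the reviewer's finding easy to re-verify after edits.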
🔗 Analysis chain
LGTM! Path scoping changes look correct.
The addition of .github/workflows/test_scala_no_spark.yaml to the trigger paths ensures the workflow runs when modified while maintaining the existing Scala-related triggers. This change aligns with the PR objective of scoping workflow paths to self.
Let's verify that all relevant Scala paths are included:
Also applies to: 22-22
Let me verify the paths in the workflow file against the Scala files found in the repository.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify all Scala-related paths in the repository
# to ensure we haven't missed any important triggers
# Find all Scala source files
fd -e scala -e sbt . | grep -v '/target/' | sort | uniq
Length of output: 17065
Script:
#!/bin/bash
# Check the current paths in the workflow file
cat .github/workflows/test_scala_no_spark.yaml
Length of output: 1639
## Summary

Scope all the GitHub workflow trigger paths regarding workflows to self, so that changing one workflow does not unnecessarily trigger all the other workflows. See the Slack [thread](https://zipline-2kh4520.slack.com/archives/C072LUA50KA/p1734555635936189).

## Checklist

- [ ] Added Unit Tests
- [ ] Covered by existing CI
- [ ] Integration tested
- [ ] Documentation update

<!-- av pr metadata
This information is embedded by the av CLI when creating PRs to track the status of stacks when using Aviator. Please do not delete or edit this section of the PR.
```
{"parent":"main","parentHead":"","trunk":"main"}
```
-->

Co-authored-by: Sean Lynch <[email protected]>
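To apply the same scoping across the whole repository, one quick audit (a sketch, assuming trigger paths are written as single-quoted entries; `find_broad_globs` is a made-up helper name) is to list the workflows still using the broad `.github/workflows/**` glob:

```shell
#!/bin/bash
# Sketch: list workflow files that still trigger on every workflow change
# rather than being scoped to their own file. Assumes single-quoted
# `- '.github/workflows/**'` entries, as in the diffs above.
find_broad_globs() {
  grep -lF -- "- '.github/workflows/**'" "$@" 2>/dev/null
}

find_broad_globs .github/workflows/*.yaml
```

Any file this prints would still re-run on unrelated workflow edits and is a candidate for the same self-scoping change.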