# feat: use log4j2 everywhere in spark consistently #99
## Conversation
**Walkthrough**

The changes in this pull request involve modifications to the logging configuration across several files.

**Changes**
**Warning: Rate limit exceeded**

@tchow-zlai has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 9 minutes and 33 seconds before requesting another review.

⌛ **How to resolve this issue?** After the wait time has elapsed, a review can be triggered using the review command. We recommend that you space out your commits to avoid hitting the rate limit.

🚦 **How do rate limits work?** CodeRabbit enforces hourly rate limits for each developer per organization. Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout. Please see our FAQ for further information.

📜 **Recent review details**

Configuration used: CodeRabbit UI

📒 Files selected for processing (5)
🚧 Files skipped from review as they are similar to previous changes (3)
🔇 Additional comments (4)

.github/workflows/test_scala_no_spark.yaml (2)
Adding `build.sbt` to the workflow triggers is appropriate given the logging dependency changes in this PR. Also applies to: 23-23
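For illustration, a `paths` trigger that includes `build.sbt` might be shaped like the sketch below. This is an assumption about the workflow's structure, not a copy of the actual file:

```yaml
# Hypothetical excerpt of a test workflow: re-run the suite whenever the
# build definition (and thus the logging dependencies) or sources change.
on:
  push:
    paths:
      - 'build.sbt'
      - '**/*.scala'
  pull_request:
    paths:
      - 'build.sbt'
      - '**/*.scala'
```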
Since this PR aims for consistent logging across the codebase, we should ensure the spark testing workflow is similarly updated. Also applies to: 23-23

✅ Verification successful

**Changes in spark workflow are consistent with no-spark workflow**

The verification confirms that the spark workflow also triggers on `build.sbt`.

🏁 Scripts executed

The following scripts were executed for the analysis:

```bash
#!/bin/bash
# Description: Verify if spark workflow has similar build.sbt trigger
# Expected: Should find build.sbt in paths section of spark workflow
rg -A 5 "paths:" .github/workflows/test_scala_spark.yaml
```

Length of output: 265

.github/workflows/test_scala_spark.yaml (2)
Adding `build.sbt` to the workflow triggers is essential since the PR involves logging library changes. This ensures that any modifications to the logging dependencies will trigger the test suite. Also applies to: 17-17

Line range hint: Since this PR implements log4j2 consistently across the codebase, we should verify that the Spark tests properly exercise the logging configuration. Let's check for logging-related test coverage. Consider adding specific test cases to verify:
✅ Verification successful

Based on the search results, I can now generate the final response:

**Spark tests properly exercise logging configuration**

The codebase shows comprehensive logging test coverage.
The tests verify both logging functionality and log content through:
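The specific files and assertions are collapsed above. As a generic illustration only, log content can be verified in a test by attaching an in-memory appender to a Log4j2 logger; `CapturingAppender` below is a hypothetical helper, not code from this repository:

```scala
import org.apache.logging.log4j.LogManager
import org.apache.logging.log4j.core.{LogEvent, LoggerContext}
import org.apache.logging.log4j.core.appender.AbstractAppender
import org.apache.logging.log4j.core.config.Property
import scala.collection.mutable.ArrayBuffer

// Hypothetical helper: buffers formatted log messages so a test can assert on them.
class CapturingAppender extends AbstractAppender(
    "capture", null, null, true, Property.EMPTY_ARRAY) {
  val messages = ArrayBuffer.empty[String]
  override def append(event: LogEvent): Unit =
    messages += event.getMessage.getFormattedMessage
}

object LogCaptureExample {
  def main(args: Array[String]): Unit = {
    val ctx      = LogManager.getContext(false).asInstanceOf[LoggerContext]
    val appender = new CapturingAppender
    appender.start()
    // Attach to the package logger that the log4j2 configuration is expected to define.
    val logger = ctx.getLogger("ai.chronon")
    logger.addAppender(appender)

    logger.info("computing join")
    assert(appender.messages.exists(_.contains("computing join")))
  }
}
```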
🏁 Scripts executed

The following scripts were executed for the analysis:

```bash
#!/bin/bash
# Description: Check for logging-related test files and assertions in Spark tests

# Look for test files that might contain logging-related tests
rg -l "log.*Test" spark/src/test
rg -l "Log4j" spark/src/test

# Look for logging assertions or setup in test files
ast-grep --pattern 'assert($$$log$$$)' spark/src/test
ast-grep --pattern 'Logger.$$$' spark/src/test
```
Length of output: 168

```bash
#!/bin/bash
# Let's try a broader search for logging-related code in Spark tests and configuration

# Search for any logging-related imports or configurations
rg -l "import.*log" spark/src/test
rg -l "log4j" --ignore-case spark/src/test

# Check for logging configuration files
fd "log4j2?.properties|log4j2?.xml" spark/src

# Look for logging setup in test resources
fd "log.*properties|log.*xml" spark/src/test/resources

# Search for logging-related code with context
rg "log" -A 3 -B 3 spark/src/test
```
Length of output: 107782

```bash
#!/bin/bash
# Let's check for log4j2 configuration and test setup

# Look for log4j2 configuration files
fd "log4j2.*" spark/src

# Check for test configuration files that might set up logging
rg -l "log4j2?" spark/src/test/resources

# Look for logging configuration in test setup
rg "SparkSessionBuilder.*build" -A 5 -B 5 spark/src/test

# Check for any logging-related test assertions
rg "assert.*log" spark/src/test
```
Length of output: 45625

🪧 Tips

**Chat**

There are 3 ways to chat with CodeRabbit:
Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

**CodeRabbit Commands (Invoked using PR comments)**
**Other keywords and placeholders**

**CodeRabbit Configuration File**
09a2d1d to 8c2b333 (Compare)
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (1)
build.sbt (1)
Line range hint 275-279: **Inconsistent logging configuration in hub project**

The hub project still overrides the Logback version, which conflicts with the PR objective of using log4j2 everywhere consistently. Consider:
- Removing the logback override
- Adding log4j2 dependencies to the hub project
```diff
 // Ensure consistent versions of logging libraries
 dependencyOverrides ++= Seq(
-  "org.slf4j" % "slf4j-api" % "1.7.36",
-  "ch.qos.logback" % "logback-classic" % "1.2.13"
+  "org.slf4j" % "slf4j-api" % "1.7.36"
 )
```
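If the Logback override is removed, the hub project also needs Log4j2 on its classpath. A minimal sbt sketch of what that could look like; the artifact versions are placeholders, not values taken from this PR:

```scala
// Hypothetical hub-project settings: route SLF4J through Log4j2 instead of Logback.
// Versions below are illustrative; pin them to match the rest of the build.
libraryDependencies ++= Seq(
  "org.apache.logging.log4j" % "log4j-api"        % "2.20.0",
  "org.apache.logging.log4j" % "log4j-core"       % "2.20.0",
  // Binding for SLF4J 1.7.x; use log4j-slf4j2-impl when on SLF4J 2.x.
  "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.20.0"
)
```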
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (3)
- build.sbt (2 hunks)
- docker-init/demo/run.sh (1 hunks)
- spark/src/main/resources/log4j2.properties (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- spark/src/main/resources/log4j2.properties
🔇 Additional comments (3)
docker-init/demo/run.sh (1)
Line range hint 24-33: Verify Log4j2 configuration pickup in Spark environment
Since this PR aims to implement consistent log4j usage, we should verify that Spark properly picks up the Log4j2 configuration without explicit configuration in spark-submit.
Let's verify the logging setup:
✅ Verification successful
Log4j2 configuration is properly set up in the Spark environment
The verification shows that Log4j2 is correctly configured:
- Log4j2 configuration files are present in both `docker-init/demo/log4j2.properties` and `spark/src/main/resources/log4j2.properties`
- Both configuration files have identical, proper Log4j2 settings, including:
  - Root logger configuration
  - Console appender with pattern layout
  - Specific logger for the `ai.chronon` package
  - ANSI color support enabled
- The configuration files are in the correct locations where Spark will automatically pick them up
- No explicit logging configurations are needed in spark-submit as the default classpath will include these configuration files
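For reference, a minimal `log4j2.properties` with the settings described above might be shaped like this. It is a sketch of the file's general form, not the actual contents from this PR; the pattern string and levels are assumptions:

```properties
# Console appender with a pattern layout; %highlight{...} enables ANSI colors.
appender.console.type = Console
appender.console.name = console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %highlight{%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{1} - %msg%n}

# Root logger configuration
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = console

# Dedicated logger for the ai.chronon package
logger.chronon.name = ai.chronon
logger.chronon.level = info
logger.chronon.additivity = false
logger.chronon.appenderRef.stdout.ref = console
```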
🏁 Scripts executed
The following scripts were executed for the analysis:
```bash
#!/bin/bash
# Description: Check for Log4j2 configuration files and related settings

# Check for Log4j2 configuration files in the codebase
echo "Searching for Log4j2 configuration files..."
fd -e properties -e xml -e yaml -e yml -e json | rg -i "log4j2"

# Check for logging-related configurations in Spark configuration files
echo "Checking Spark configuration files for logging settings..."
rg -l "log4j" . | rg -i "spark.*\.conf$"

# Check for logging-related dependencies in build files
echo "Checking build files for logging dependencies..."
rg -A 5 "log4j" . | rg "\.sbt$|pom\.xml$|build\.gradle$"
```
Length of output: 570
```bash
#!/bin/bash
# Let's examine the content of the log4j2 configuration files to verify their setup
echo "Content of docker-init/demo/log4j2.properties:"
cat docker-init/demo/log4j2.properties

echo -e "\nContent of spark/src/main/resources/log4j2.properties:"
cat spark/src/main/resources/log4j2.properties

# Check build files for Log4j2 dependencies
echo -e "\nChecking build files for Log4j2 dependencies:"
rg -i "log4j2" . | rg "\.sbt$|pom\.xml$|build\.gradle$|\.gradle$"

# Check if there are any explicit logging configurations in spark-submit scripts
echo -e "\nChecking spark-submit scripts for logging configurations:"
rg "spark-submit" -A 10 | rg -i "log"
```
Length of output: 5176
build.sbt (2)
170-174: LGTM! Comprehensive exclusion rules for logging conflicts
The exclusion rules correctly prevent multiple logging implementations from conflicting with Log4j2.
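As a hedged illustration of what such exclusion rules commonly look like in sbt (the exact rules at lines 170-174 of build.sbt are not reproduced here):

```scala
// Hypothetical build-wide exclusions: keep competing logging backends off the
// classpath so Log4j2 is the only implementation that SLF4J can bind to.
excludeDependencies ++= Seq(
  ExclusionRule("ch.qos.logback", "logback-classic"), // Logback backend
  ExclusionRule("org.slf4j", "slf4j-log4j12"),        // SLF4J -> log4j 1.x binding
  ExclusionRule("log4j", "log4j")                     // log4j 1.x itself
)
```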
Line range hint 1-1: Verify complete removal of Logback configurations
Let's verify that all Logback configurations have been removed and Log4j2 is consistently used across all projects.
very nice!
0f27fd6 to b9c30db (Compare)
## Summary

Testing this locally: I see no spark logging churn. Looks like it preserves @nikhil-zlai's PR's behavior: #96

<img width="760" alt="Screenshot 2024-11-26 at 7 55 03 PM" src="https://github.com/user-attachments/assets/844a44e1-c769-4089-b245-a86d138e1d1a">

## Checklist

- [x] Added Unit Tests
- [x] Covered by existing CI
- [x] Integration tested
- [ ] Documentation update

## Summary by CodeRabbit

- **New Features**
  - Introduced a new logging configuration using Log4j2, enhancing logging capabilities and readability.
- **Bug Fixes**
  - Removed outdated logging configuration references, streamlining Docker container execution and Spark application setup.
- **Chores**
  - Updated dependency management to replace Logback with Log4j2 for consistent logging behavior across the project.
  - Enhanced CI/CD workflows to trigger on changes to the `build.sbt` file, improving responsiveness to updates.