
Conversation

@kumar-zlai
Contributor

@kumar-zlai kumar-zlai commented Feb 4, 2025

Summary

Adds the missing dependencies for the Flink module that came out of our recent changes, keeping it in sync with sbt.

Tested locally by running Flink jobs from DataprocSubmitterTest

Checklist

- [ ] Added Unit Tests
- [x] Covered by existing CI
- [ ] Integration tested
- [ ] Documentation update

Summary by CodeRabbit

  • New Features
    • Expanded Kafka integration with enhanced authentication, client functionality, and Protobuf serialization support.
    • Improved JSON processing support for Scala-based operations.
    • Adjusted dependency versions to ensure better compatibility and stability with Kafka and cloud services.

@coderabbitai
Contributor

coderabbitai bot commented Feb 4, 2025

Walkthrough

The changes add several new artifact dependencies to various scala_library targets across build files. In particular, Kafka-related dependencies are introduced in cloud_gcp/BUILD.bazel and flink/BUILD.bazel, while a Jackson Scala module is added in online/BUILD.bazel. Additionally, the Maven repository configuration is updated with a Kafka client version downgrade and a new managed Kafka handler version. The existing structure remains intact.

Changes

| Files | Change Summary |
| --- | --- |
| `cloud_gcp/BUILD.bazel`, `tools/build_rules/.../maven_repository.bzl` | Added Kafka dependencies `managed-kafka-auth-login-handler` and `kafka-clients`; downgraded the `kafka-clients` version from 3.9.0 to 3.8.1 in the Maven repository config. |
| `flink/BUILD.bazel` | Added `io.confluent:kafka-protobuf-provider` to both `deps` and `test_deps` for Kafka Protobuf support. |
| `online/BUILD.bazel` | Added `com.fasterxml.jackson.module:jackson-module-scala` to `deps` for JSON processing support. |

Possibly related PRs

Suggested reviewers

  • piyush-zlai
  • nikhil-zlai
  • tchow-zlai

Poem

New artifacts join the code parade,
Kafka and Jackson dance in cascade.
Build files sing a vibrant tune,
Dependencies set, under the moon.
Cheers to code that’s freshly made!


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)

📥 Commits

Reviewing files that changed from the base of the PR and between a104e39 and 05c6906.

📒 Files selected for processing (4)
  • cloud_gcp/BUILD.bazel (2 hunks)
  • flink/BUILD.bazel (1 hunks)
  • online/BUILD.bazel (1 hunks)
  • tools/build_rules/dependencies/maven_repository.bzl (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (2)
  • GitHub Check: no_spark_scala_tests
  • GitHub Check: enforce_triggered_workflows
🔇 Additional comments (4)
online/BUILD.bazel (1)

21-21: LGTM!

The jackson-module-scala addition complements existing Jackson dependencies.
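For illustration, a minimal sketch of what this module enables; the case class and values below are hypothetical, not taken from this PR:

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical Scala type; without DefaultScalaModule registered, Jackson
// cannot reliably handle Scala case classes, Options, or collections.
case class GroupByEvent(key: String, value: Option[Long])

object JsonSketch {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // Registering the Scala module is exactly what jackson-module-scala provides.
    mapper.registerModule(DefaultScalaModule)

    val json = mapper.writeValueAsString(GroupByEvent("item_view", Some(42L)))
    println(json) // {"key":"item_view","value":42}
  }
}
```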

flink/BUILD.bazel (1)

22-22: Verify Kafka Protobuf provider version compatibility.

The addition looks good but ensure version compatibility with Flink's Kafka connector.

✅ Verification successful

Version compatibility confirmed.

  • Kafka protobuf provider: version 7.8.0.
  • Flink Kafka connector: version 1.17.0.

Both dependencies appear compatible.
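For context, a minimal sketch of the kind of API kafka-protobuf-provider brings in; the schema string below is a hypothetical example, not something from this repository:

```scala
import com.google.protobuf.DynamicMessage
import io.confluent.kafka.schemaregistry.protobuf.ProtobufSchema

object ProtobufSchemaSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical .proto source, e.g. as fetched from a schema registry at runtime.
    val schemaString =
      """syntax = "proto3";
        |message Event {
        |  string key = 1;
        |  int64 value = 2;
        |}""".stripMargin

    // ProtobufSchema (shipped in kafka-protobuf-provider) parses the schema so
    // payloads can be decoded into DynamicMessage without generated classes.
    val schema = new ProtobufSchema(schemaString)
    val descriptor = schema.toDescriptor
    println(descriptor.getFullName) // Event

    val empty = DynamicMessage.newBuilder(descriptor).build()
    println(empty.getAllFields.isEmpty) // true
  }
}
```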

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check version compatibility between Flink Kafka connector and Protobuf provider
rg -A 1 "flink-connector-kafka|kafka-protobuf-provider" tools/build_rules/dependencies/maven_repository.bzl

Length of output: 337

cloud_gcp/BUILD.bazel (1)

24-24: LGTM!

The GCP Kafka auth handler and client dependencies are properly placed.

Also applies to: 37-37
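For reference, a rough sketch of the client properties these two dependencies are typically used for (GCP Managed Kafka over SASL/OAUTHBEARER). The bootstrap address is a placeholder, and the login callback handler class name is an assumption about the managed-kafka-auth-login-handler library rather than something taken from this PR:

```scala
import java.util.Properties
import org.apache.kafka.clients.CommonClientConfigs
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.config.SaslConfigs

object ManagedKafkaPropsSketch {
  def consumerProps(): Properties = {
    val props = new Properties()
    // Placeholder bootstrap endpoint for a GCP Managed Kafka cluster.
    props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "bootstrap.managedkafka.example:9092")
    props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL")
    props.put(SaslConfigs.SASL_MECHANISM, "OAUTHBEARER")
    props.put(SaslConfigs.SASL_JAAS_CONFIG,
      "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;")
    // Assumed handler class from managed-kafka-auth-login-handler; verify against that library's docs.
    props.put(SaslConfigs.SASL_LOGIN_CALLBACK_HANDLER_CLASS,
      "com.google.cloud.hosted.kafka.auth.GcpLoginCallbackHandler")
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "flink-local-test")
    props
  }
}
```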

tools/build_rules/dependencies/maven_repository.bzl (1)

67-67: Verify Kafka client version compatibility with Flink.

The kafka-clients downgrade to 3.8.1 needs verification with Flink 1.17.0.

Also applies to: 96-96

✅ Verification successful

Kafka Client Compatibility Verified.

  • The Kafka clients version 3.8.1 is used with Flink 1.17.0.
  • Flink’s documentation confirms modern Kafka clients are backwards compatible, so the downgrade poses no compatibility issues.
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check Flink's recommended Kafka client versions
rg -A 5 "kafka-clients.*3\.8\.1|flink.*1\.17\.0" tools/build_rules/dependencies/maven_repository.bzl

# Look for any version constraints in Flink docs
curl -s "https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/connectors/datastream/kafka/" | grep -i "kafka.*version"

Length of output: 2382



@tchow-zlai
Collaborator

@kumar-zlai would you mind also updating the following:

sbt flink/assembly
sbt service/assembly

and

ZIPLINE_FLINK_JAR_DEFAULT = "flink-assembly-0.1.0-SNAPSHOT.jar"

once this is in?

@kumar-zlai
Contributor Author

> @kumar-zlai would you mind also updating the following:
>
> sbt flink/assembly
> sbt service/assembly
>
> and
>
> ZIPLINE_FLINK_JAR_DEFAULT = "flink-assembly-0.1.0-SNAPSHOT.jar"
>
> once this is in?

Sure, I can do that change today

@kumar-zlai kumar-zlai merged commit b4b6da4 into main Feb 4, 2025
5 checks passed
@kumar-zlai kumar-zlai deleted the flink_bazel_changes branch February 4, 2025 18:01
kumar-zlai added a commit that referenced this pull request Apr 25, 2025
kumar-zlai added a commit that referenced this pull request Apr 29, 2025
chewy-zlai pushed a commit that referenced this pull request May 15, 2025
chewy-zlai pushed a commit that referenced this pull request May 15, 2025
chewy-zlai pushed a commit that referenced this pull request May 16, 2025