Support scala version cross builds #379
Conversation
Walkthrough
This PR updates the build configuration and Scala source code for improved Scala version management. It introduces new options for Scala 2.12 and 2.13 in Bazel files, replaces static settings with conditional select statements, and standardizes collection conversions (switching from asScala to toScala). Dependency declarations and formatting attributes across various BUILD and Scala files have been updated. A new repository rule in scala_config.bzl derives the Scala version from the environment.
Sequence Diagram(s)
sequenceDiagram
participant Dev as Developer
participant Baz as Bazel
participant Config as scala_config.bzl
participant WS as WORKSPACE
Dev->>Baz: Set SCALA_VERSION in environment
Baz->>Config: Invoke _scala_version_impl
Config->>Config: Read SCALA_VERSION (default "2.12.18" if unset)
Config->>WS: Generate version files (BUILD, version.bzl)
WS->>Baz: Supply Scala version to build rules
Actionable comments posted: 1
🔭 Outside diff range comments (2)
spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala (1)
79-79: 🛠️ Refactor suggestion: Replace Await.result with async handling.
Based on past feedback, blocking with Await.result should be avoided in production code.
Consider refactoring to handle futures asynchronously:
- Await.result(aggregatedFuture, Duration(10L, TimeUnit.SECONDS))
+ // Return the future and let the caller decide how to handle it
+ aggregatedFuture
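A minimal sketch of the non-blocking shape this suggestion points at; SummaryUploadRunner and its aggregatedFuture parameter are illustrative stand-ins rather than the actual class in this PR:

```scala
import org.slf4j.LoggerFactory
import scala.concurrent.{ExecutionContext, Future}

// Hypothetical wrapper: surface the Future to the caller instead of blocking on it.
class SummaryUploadRunner(aggregatedFuture: Future[Unit])(implicit ec: ExecutionContext) {
  private val logger = LoggerFactory.getLogger(getClass)

  // Callers can compose this Future, or await it once at the edge of the application.
  def run(): Future[Unit] =
    aggregatedFuture.recoverWith { case ex: Throwable =>
      logger.error("summary upload failed", ex)
      Future.failed(ex)
    }
}
```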
flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala (1)
20-33: 💡 Verification agent 🧩 Analysis chain
Verify impact of removed hashCode method.
Removing hashCode while keeping equals might affect hash-based collections.
🏁 Script executed:
#!/bin/bash
# Search for uses of TimestampedIR in hash-based collections
rg -l "Map.*TimestampedIR|Set.*TimestampedIR|HashMap.*TimestampedIR|HashSet.*TimestampedIR"

Length of output: 164
Action: Implement a hashCode for TimestampedIR
- TimestampedIR is used in hash-based collections (e.g., in flink/src/main/scala/ai/chronon/flink/window/FlinkRowAggregators.scala), so the equals/hashCode contract must be maintained.
- Add a hashCode method (using, for instance, util.Arrays.deepHashCode for ir and combining latestTsMillis) to avoid erratic behavior.
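A hedged sketch of the kind of hashCode being asked for; the field types (ir: Array[Any], latestTsMillis: Long) follow the discussion above, and the class body is illustrative rather than the exact FlinkTypes definition:

```scala
import java.util.Arrays

// Illustrative stand-in for TimestampedIR: hash the array contents (not the reference)
// so the equals/hashCode contract holds for deep array comparison.
class TimestampedIRSketch(val ir: Array[Any], val latestTsMillis: Long) {
  override def equals(other: Any): Boolean = other match {
    case that: TimestampedIRSketch =>
      Arrays.deepEquals(ir.asInstanceOf[Array[AnyRef]], that.ir.asInstanceOf[Array[AnyRef]]) &&
        latestTsMillis == that.latestTsMillis
    case _ => false
  }

  override def hashCode(): Int = {
    val irHash = Arrays.deepHashCode(ir.asInstanceOf[Array[AnyRef]])
    31 * irHash + java.lang.Long.hashCode(latestTsMillis)
  }
}
```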
🧹 Nitpick comments (13)
online/src/main/scala/ai/chronon/online/HTTPKVStore.scala (2)
19-38: Consider more granular response handling. The current implementation returns a single boolean for multiple requests.
- case StatusCode.Ok => Seq(true)
- case _ =>
-   logger.error(s"HTTP multiPut failed with status ${response.code}: ${response.body}")
-   Seq(false)
+ case StatusCode.Ok => Seq.fill(putRequests.size)(true)
+ case _ =>
+   logger.error(s"HTTP multiPut failed with status ${response.code}: ${response.body}")
+   Seq.fill(putRequests.size)(false)
53-53: Consider using collection.Seq for consistency. For complete cross-version compatibility, consider updating the jsonList signature.
- def jsonList(requests: Seq[PutRequest]): String = {
+ def jsonList(requests: collection.Seq[PutRequest]): String = {
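For context on why pinning collection.Seq helps (illustrative snippet, not code from this PR): plain Seq resolves to scala.collection.Seq on 2.12 but to scala.collection.immutable.Seq on 2.13, so an explicit collection.Seq parameter keeps signatures source-compatible when callers pass mutable collections.

```scala
import scala.collection.Seq            // pin the cross-version alias explicitly
import scala.collection.mutable.ArrayBuffer

object SeqCompatSketch {
  // Accepts both immutable Seqs and mutable buffers on 2.12 and 2.13 alike.
  def firstKey(requests: Seq[String]): Option[String] = requests.headOption

  def demo(): Option[String] = firstKey(ArrayBuffer("a", "b"))
}
```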
flink/src/main/scala/ai/chronon/flink/AvroCodecFn.scala (1)
50-52: Consider performance impact of extra array creation. The change from x.flatMap to x.toArray.flatMap creates an intermediate array. While functionally correct, it may impact performance for large maps. Consider using the original approach or stream operations:
- x.toArray.flatMap { case (key, value) => Array(key, value) }
+ x.iterator.flatMap { case (key, value) => Array(key, value) }.toArray

spark/src/main/scala/ai/chronon/spark/format/Format.scala (4)
17-26: Check sub-partition error message.
You might include more context in the NotImplementedError to guide callers.
51-76: Add parameter validations.
Ensure arguments (tableName, partitionColumns, etc.) are non-empty before DDL generation.
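As a rough sketch of the validation being suggested (the method name and signature here are hypothetical, not the actual Format.scala API):

```scala
object FormatValidationSketch {
  // Fail fast on empty inputs before any DDL string is assembled.
  def createTableSql(tableName: String, partitionColumns: Seq[String]): String = {
    require(tableName.trim.nonEmpty, "tableName must be non-empty")
    require(partitionColumns.forall(_.trim.nonEmpty), "partition column names must be non-empty")
    val partitionClause =
      if (partitionColumns.isEmpty) "" else partitionColumns.mkString(" PARTITIONED BY (", ", ", ")")
    s"CREATE TABLE $tableName (...)$partitionClause" // column definitions elided in this sketch
  }
}
```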
78-90: Batch property alterations carefully.
Consider verifying table property keys before altering to reduce potential runtime errors.
99-101: Finalize creation logic. canCreateTable remains false; remove once all formats support table creation.

.bazelrc (1)
9-13: Document version override in README. Add instructions for version override to project documentation.
api/src/main/scala/ai/chronon/api/ScalaJavaConversions.scala (1)
51-57: Consider lazy conversion. Using iterator().asScala.toList creates two intermediate collections.
- def toScala: List[T] = {
-   if (list == null) {
-     null
-   } else {
-     list.iterator().asScala.toList
-   }
- }
+ def toScala: List[T] = {
+   if (list == null) {
+     null
+   } else {
+     list.asScala.toList
+   }
+ }
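For context, the conversion helpers the PR standardizes on look roughly like the sketch below (a simplified approximation of ai.chronon.api.ScalaJavaConversions, not its exact contents); the suggested change above simply drops the intermediate iterator:

```scala
import scala.collection.JavaConverters._ // compiles on both 2.12 and 2.13 (deprecated on 2.13)

// Simplified approximation of the cross-version conversion utility.
object ScalaJavaConversionsSketch {
  implicit class JListOps[T](val list: java.util.List[T]) extends AnyVal {
    // Null-safe Java -> Scala conversion; asScala.toList avoids the extra iterator step.
    def toScala: List[T] =
      if (list == null) null else list.asScala.toList
  }

  implicit class ListOps[T](val list: List[T]) extends AnyVal {
    // Null-safe Scala -> Java conversion.
    def toJava: java.util.List[T] =
      if (list == null) null else list.asJava
  }
}
```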
tools/build_rules/artifact.bzl (1)
3-3: Remove unused import. SCALA_VERSION is imported but not used in this file.
-load("@io_bazel_rules_scala_config//:config.bzl", "SCALA_MAJOR_VERSION", "SCALA_VERSION") +load("@io_bazel_rules_scala_config//:config.bzl", "SCALA_MAJOR_VERSION")spark/src/main/scala/ai/chronon/spark/stats/drift/Expressions.scala (1)
117-118: Consider performance optimization. The two-step conversion (toMap.toJava) creates an unnecessary intermediate collection.
- .toMap
- .toJava
+ .toJava

spark/src/main/scala/ai/chronon/spark/LogFlattenerJob.scala (2)
182-182: Redundant conversion to Seq. hashes is already a Seq[String], making .toSeq unnecessary.
- .where(col(Constants.SchemaHash).isin(hashes.toSeq: _*))
+ .where(col(Constants.SchemaHash).isin(hashes: _*))
221-221: Simplify map conversion. The .map(identity) is redundant as mapValues already creates a new map.
- val schemaMap = schemaStringsMap.mapValues(LoggingSchema.parseLoggingSchema).map(identity).toMap
+ val schemaMap = schemaStringsMap.mapValues(LoggingSchema.parseLoggingSchema).toMap
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (72)
- .bazelrc (1 hunks)
- .scalafmt.conf (1 hunks)
- WORKSPACE (1 hunks)
- aggregator/BUILD.bazel (2 hunks)
- aggregator/src/main/scala/ai/chronon/aggregator/row/StatsGenerator.scala (1 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/ApproxHistogramTest.scala (2 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/FrequentItemsTest.scala (2 hunks)
- api/BUILD.bazel (2 hunks)
- api/src/main/scala/ai/chronon/api/Extensions.scala (1 hunks)
- api/src/main/scala/ai/chronon/api/ScalaJavaConversions.scala (2 hunks)
- api/src/test/scala/ai/chronon/api/test/TileSeriesSerializationTest.scala (3 hunks)
- cloud_aws/BUILD.bazel (3 hunks)
- cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala (11 hunks)
- cloud_aws/src/test/scala/ai/chronon/integrations/aws/DynamoDBKVStoreTest.scala (1 hunks)
- cloud_gcp/BUILD.bazel (2 hunks)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigTableKVStoreImpl.scala (1 hunks)
- flink/BUILD.bazel (3 hunks)
- flink/src/main/scala/ai/chronon/flink/AsyncKVStoreWriter.scala (1 hunks)
- flink/src/main/scala/ai/chronon/flink/AvroCodecFn.scala (2 hunks)
- flink/src/main/scala/ai/chronon/flink/FlinkJob.scala (3 hunks)
- flink/src/main/scala/ai/chronon/flink/SparkExpressionEvalFn.scala (2 hunks)
- flink/src/main/scala/ai/chronon/flink/TestFlinkJob.scala (3 hunks)
- flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala (1 hunks)
- flink/src/main/scala/ai/chronon/flink/window/FlinkRowAggregators.scala (1 hunks)
- flink/src/main/scala/ai/chronon/flink/window/KeySelectorBuilder.scala (2 hunks)
- flink/src/test/scala/ai/chronon/flink/test/FlinkJobIntegrationTest.scala (5 hunks)
- flink/src/test/scala/ai/chronon/flink/test/FlinkTestUtils.scala (4 hunks)
- hub/BUILD.bazel (2 hunks)
- hub/src/main/scala/ai/chronon/hub/handlers/ConfHandler.scala (1 hunks)
- hub/src/main/scala/ai/chronon/hub/handlers/DriftHandler.scala (1 hunks)
- hub/src/main/scala/ai/chronon/hub/store/MonitoringModelStore.scala (3 hunks)
- online/BUILD.bazel (2 hunks)
- online/src/main/scala/ai/chronon/online/CatalystUtil.scala (2 hunks)
- online/src/main/scala/ai/chronon/online/Fetcher.scala (3 hunks)
- online/src/main/scala/ai/chronon/online/FetcherBase.scala (2 hunks)
- online/src/main/scala/ai/chronon/online/FetcherMain.scala (2 hunks)
- online/src/main/scala/ai/chronon/online/GroupByServingInfoParsed.scala (2 hunks)
- online/src/main/scala/ai/chronon/online/HTTPKVStore.scala (1 hunks)
- online/src/main/scala/ai/chronon/online/TileCodec.scala (1 hunks)
- online/src/main/scala/ai/chronon/online/stats/DriftStore.scala (2 hunks)
- online/src/main/scala/ai/chronon/online/stats/PivotUtils.scala (2 hunks)
- online/src/test/scala/ai/chronon/online/test/stats/PivotUtilsTest.scala (2 hunks)
- orchestration/BUILD.bazel (2 hunks)
- orchestration/src/main/scala/ai/chronon/orchestration/RepoIndex.scala (2 hunks)
- orchestration/src/main/scala/ai/chronon/orchestration/RepoTypes.scala (1 hunks)
- orchestration/src/main/scala/ai/chronon/orchestration/VersionUpdate.scala (1 hunks)
- orchestration/src/main/scala/ai/chronon/orchestration/utils/TablePrinter.scala (2 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (3 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/RelevantLeftForJoinPartSpec.scala (1 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/RepoIndexSpec.scala (2 hunks)
- scala_config.bzl (1 hunks)
- spark/BUILD.bazel (3 hunks)
- spark/src/main/scala/ai/chronon/spark/BootstrapInfo.scala (2 hunks)
- spark/src/main/scala/ai/chronon/spark/GroupBy.scala (2 hunks)
- spark/src/main/scala/ai/chronon/spark/GroupByUpload.scala (1 hunks)
- spark/src/main/scala/ai/chronon/spark/JoinBase.scala (1 hunks)
- spark/src/main/scala/ai/chronon/spark/JoinUtils.scala (3 hunks)
- spark/src/main/scala/ai/chronon/spark/LogFlattenerJob.scala (3 hunks)
- spark/src/main/scala/ai/chronon/spark/TableUtils.scala (3 hunks)
- spark/src/main/scala/ai/chronon/spark/format/Format.scala (1 hunks)
- spark/src/main/scala/ai/chronon/spark/scripts/ObservabilityDemo.scala (2 hunks)
- spark/src/main/scala/ai/chronon/spark/stats/StatsCompute.scala (2 hunks)
- spark/src/main/scala/ai/chronon/spark/stats/drift/Expressions.scala (1 hunks)
- spark/src/main/scala/ai/chronon/spark/stats/drift/Summarizer.scala (2 hunks)
- spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/MockKVStore.scala (1 hunks)
- tools/build_rules/artifact.bzl (2 hunks)
- tools/build_rules/common.bzl (1 hunks)
- tools/build_rules/dependencies/maven_repository.bzl (4 hunks)
- tools/build_rules/dependencies/spark_repository.bzl (2 hunks)
- tools/build_rules/prelude_bazel (2 hunks)
- tools/config/BUILD (1 hunks)
✅ Files skipped from review due to trivial changes (17)
- .scalafmt.conf
- flink/src/main/scala/ai/chronon/flink/window/FlinkRowAggregators.scala
- online/src/main/scala/ai/chronon/online/TileCodec.scala
- hub/src/main/scala/ai/chronon/hub/handlers/DriftHandler.scala
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigTableKVStoreImpl.scala
- spark/src/main/scala/ai/chronon/spark/GroupByUpload.scala
- online/src/main/scala/ai/chronon/online/GroupByServingInfoParsed.scala
- orchestration/src/main/scala/ai/chronon/orchestration/VersionUpdate.scala
- online/src/test/scala/ai/chronon/online/test/stats/PivotUtilsTest.scala
- cloud_aws/src/test/scala/ai/chronon/integrations/aws/DynamoDBKVStoreTest.scala
- spark/src/main/scala/ai/chronon/spark/stats/StatsCompute.scala
- aggregator/src/main/scala/ai/chronon/aggregator/row/StatsGenerator.scala
- online/src/main/scala/ai/chronon/online/FetcherBase.scala
- spark/src/test/scala/ai/chronon/spark/test/MockKVStore.scala
- spark/src/main/scala/ai/chronon/spark/JoinBase.scala
- spark/src/main/scala/ai/chronon/spark/stats/drift/Summarizer.scala
- online/src/main/scala/ai/chronon/online/Fetcher.scala
🧰 Additional context used
🧠 Learnings (3)
spark/src/main/scala/ai/chronon/spark/stats/drift/Expressions.scala (1)
Learnt from: nikhil-zlai
PR: zipline-ai/chronon#17
File: spark/src/main/scala/ai/chronon/spark/stats/drift/Expressions.scala:76-102
Timestamp: 2024-11-12T09:38:33.532Z
Learning: In performance-critical sections of the codebase, prefer mutable data structures over immutable ones for better performance.
cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala (4)
Learnt from: piyush-zlai
PR: zipline-ai/chronon#33
File: cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala:29-30
Timestamp: 2024-11-12T09:38:33.532Z
Learning: In the codebase, the `KVStore` implementation provides an implicit `ExecutionContext` in scope, so it's unnecessary to import another.
Learnt from: piyush-zlai
PR: zipline-ai/chronon#33
File: cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala:245-260
Timestamp: 2024-11-12T09:38:33.532Z
Learning: In `DynamoDBKVStoreImpl.scala`, refactoring methods like `extractTimedValues` and `extractListValues` to eliminate code duplication is discouraged if it would make the code more convoluted.
Learnt from: piyush-zlai
PR: zipline-ai/chronon#33
File: cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala:167-229
Timestamp: 2024-11-12T09:38:33.532Z
Learning: Leveraging DynamoDB batch writes for performance optimization in `multiPut` will be addressed in a future review.
Learnt from: piyush-zlai
PR: zipline-ai/chronon#33
File: cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala:167-229
Timestamp: 2024-11-12T09:38:33.532Z
Learning: Errors and logging are handled at a granular level in `handleDynamoDbOperation` by passing the metrics context and updating relevant metric values based on the invoking operation.
spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala (3)
Learnt from: chewy-zlai
PR: zipline-ai/chronon#50
File: spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala:37-40
Timestamp: 2024-11-12T09:38:33.532Z
Learning: Avoid using `Await.result` in production code; prefer handling `Future`s asynchronously when possible to prevent blocking.
Learnt from: chewy-zlai
PR: zipline-ai/chronon#62
File: spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala:9-10
Timestamp: 2024-11-12T09:38:33.532Z
Learning: In Spark applications, when defining serializable classes, passing an implicit `ExecutionContext` parameter can cause serialization issues. In such cases, it's acceptable to use `scala.concurrent.ExecutionContext.Implicits.global`.
Learnt from: nikhil-zlai
PR: zipline-ai/chronon#50
File: spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala:19-47
Timestamp: 2024-11-12T09:38:33.532Z
Learning: In Scala, the `grouped` method on collections returns an iterator, allowing for efficient batch processing without accumulating all records in memory.
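A quick illustration of that last point (values here are arbitrary):

```scala
object GroupedBatches extends App {
  // grouped yields batches on demand; nothing beyond the current batch is materialized.
  val records: Seq[Int] = (1 to 10).toSeq
  val batches: Iterator[Seq[Int]] = records.grouped(3)
  batches.foreach(batch => println(batch.mkString(","))) // 1,2,3 then 4,5,6 then 7,8,9 then 10
}
```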
🔇 Additional comments (90)
flink/src/main/scala/ai/chronon/flink/window/KeySelectorBuilder.scala (2)
4-4: LGTM! Import changes for Scala 2.13 compatibility.The import changes correctly support cross-building for Scala 2.12 and 2.13.
Also applies to: 8-8
27-27: LGTM! Collection conversion updated for Scala 2.13. The change from asScala to toScala aligns with Scala 2.13's collection conversion methods.

flink/src/main/scala/ai/chronon/flink/SparkExpressionEvalFn.scala (2)
9-9: LGTM: Import for cross-version compatibility.
43-44: LGTM: Collection conversions updated for Scala 2.13. Using toScala from ScalaJavaConversions ensures compatibility across Scala versions.

online/src/main/scala/ai/chronon/online/stats/PivotUtils.scala (2)
15-15: LGTM!Custom conversion utility aligns with cross-build support.
72-72: LGTM!Standardized collection conversion method.
online/src/main/scala/ai/chronon/online/CatalystUtil.scala (2)
77-77: LGTM! Simplified type declaration. Using Seq directly is more idiomatic Scala.
115-115: LGTM! Consistent type declaration.Matches the case class update above.
flink/src/test/scala/ai/chronon/flink/test/FlinkJobIntegrationTest.scala (3)
4-4: LGTM! Import changes align with Scala 2.13 compatibility. The switch to the ScalaJavaConversions utility and the explicit Seq import support cross-version builds.
Also applies to: 22-22
41-41: LGTM! Collection conversions updated for Scala 2.13. Conversion from asScala to toScala maintains compatibility across Scala versions.
Also applies to: 46-46
95-95: LGTM! Test collection handling updated for cross-version support. Collection conversions in test assertions updated to use toScala.
Also applies to: 133-133
online/src/main/scala/ai/chronon/online/HTTPKVStore.scala (2)
9-9: LGTM: Explicit import for cross-version compatibility.The explicit import of scala.collection.Seq aligns with Scala 2.13 compatibility goals.
17-17: LGTM: Method signature updated for version compatibility.Explicit use of collection.Seq improves cross-version compatibility.
flink/src/main/scala/ai/chronon/flink/AvroCodecFn.scala (1)
22-22: LGTM: Explicit Seq import for cross-version compatibility.spark/src/main/scala/ai/chronon/spark/format/Format.scala (1)
28-41: Validate partition column existence.
Consider handling missing partitionColumn keys to avoid empty sets.

tools/config/BUILD (1)
1-6: LGTM.
Good introduction of the scala_2_13 config.

scala_config.bzl (1)
1-12: Looks solid.
Defaulting to "2.12.18" is sensible. The environment lookup is straightforward.

tools/build_rules/common.bzl (1)
1-1: Consider utilizing the imported SCALA_VERSION.The SCALA_VERSION is imported but not used in the file.
Should this be used in the scala_jar function for version-specific naming?

orchestration/BUILD.bazel (1)
4-7:✅ Verification successful
Temporary format disable looks good.
Matches PR objective to disable ScalaFmt for 2.13 temporarily.
Track re-enabling when rules_scala fix is released:
Also applies to: 34-37
🏁 Script executed:
#!/bin/bash
# Check latest rules_scala release
gh api repos/bazelbuild/rules_scala/releases/latest | jq .tag_name

Length of output: 77
Review Approved: Temporary Workaround Verified
- The workaround for disabling ScalaFmt for 2.13 is valid.
- Current latest rules_scala release is "v6.6.0"; track and re-enable when the fix is released.
- Applies also to lines 34–37.
api/src/main/scala/ai/chronon/api/ScalaJavaConversions.scala (1)
20-20: Verify toSeq performance impact.The explicit toSeq call might create an unnecessary copy.
Consider benchmarking with large collections.
tools/build_rules/dependencies/spark_repository.bzl (2)
16-20: LGTM! Added Spark artifacts for Scala 2.13.The additions enable cross-building with Scala 2.13.
44-45: LGTM! Excluded Scala core libraries. Prevents version conflicts by excluding scala-library and scala-reflect.

cloud_aws/BUILD.bazel (1)
cloud_aws/BUILD.bazel (1)
4-7: LGTM! Disabled formatting for Scala 2.13.Temporarily disabled due to known ScalaFmt issues with Scala 2.13.
Also applies to: 43-46
orchestration/src/main/scala/ai/chronon/orchestration/utils/TablePrinter.scala (1)
3-4: LGTM! Updated collection handling for Scala 2.13.Added explicit Seq import and toSeq conversions for cross-version compatibility.
Also applies to: 24-24, 29-29
tools/build_rules/artifact.bzl (1)
40-43: LGTM! Enhanced maven_artifact with version support. Added optional version parameter for flexible artifact versioning.

aggregator/BUILD.bazel (1)
aggregator/BUILD.bazel (1)
4-7: LGTM: Conditional formatting based on Scala versionFormat disabled for Scala 2.13 as a temporary workaround until ScalaFmt fix is released.
Also applies to: 56-59
orchestration/src/main/scala/ai/chronon/orchestration/RepoTypes.scala (1)
5-5: LGTM: Explicit Seq import for Scala 2.13. Improves clarity and version compatibility.

api/src/test/scala/ai/chronon/api/test/TileSeriesSerializationTest.scala (1)
api/src/test/scala/ai/chronon/api/test/TileSeriesSerializationTest.scala (1)
4-4: LGTM: Updated collection conversions. Switched to toScala for better Scala 2.13 compatibility.
Also applies to: 42-42, 72-72
hub/BUILD.bazel (1)
4-7: LGTM: Consistent format handling. Format disabled for Scala 2.13, matching the pattern across other BUILD files.
Also applies to: 63-66
api/BUILD.bazel (1)
27-30: LGTM: Format attribute changes align with PR objective.Correctly disables formatting for Scala 2.13 builds while maintaining it for other versions.
Also applies to: 76-79
spark/src/main/scala/ai/chronon/spark/stats/drift/SummaryUploader.scala (1)
19-19: LGTM: Explicit Seq import and usage.Improves type specificity and clarity.
Also applies to: 31-31
online/BUILD.bazel (1)
4-7: LGTM: Format attribute changes align with PR objective.Correctly disables formatting for Scala 2.13 builds while maintaining it for other versions.
Also applies to: 73-76
WORKSPACE (1)
5-8:✅ Verification successful
LGTM: Dynamic Scala version configuration.
Properly loads Scala version from external config, enabling version cross builds.
Verify the config file exists and contains valid versions:
🏁 Script executed:
#!/bin/bash # Check if scala_config.bzl exists and contains version definitions cat scala_config.bzlLength of output: 435
LGTM: Dynamic Scala version configuration verified.
- Loads Scala version from external config.
scala_config.bzl correctly defines and uses the version.

flink/BUILD.bazel (2)
4-7: LGTM: Conditional formatting based on Scala version.Format disabled for Scala 2.13 as a workaround for ScalaFmt issues.
48-48: LGTM: Added protobuf dependency. Required for testing with protocol buffers.

hub/src/main/scala/ai/chronon/hub/handlers/ConfHandler.scala (1)
hub/src/main/scala/ai/chronon/hub/handlers/ConfHandler.scala (1)
13-13: LGTM: Using standard Scala collection type. Explicit Seq import improves cross-version compatibility.

tools/build_rules/prelude_bazel (2)
tools/build_rules/prelude_bazel (2)
3-4: LGTM: Dynamic Scala version configuration.Using SCALA_VERSION variable enables cross-version builds.
Also applies to: 34-35
37-46: LGTM: Consolidated test dependencies. Common Scala test libraries grouped together.

flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala (1)
flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala (1)
46-46: LGTM: Using Seq instead of List. Change prevents null list access issues.

cloud_gcp/BUILD.bazel (1)
cloud_gcp/BUILD.bazel (1)
4-7: LGTM: Format attribute changes align with PR objectives.Format is correctly disabled for Scala 2.13 to work around ScalaFmt issues.
Also applies to: 93-96
aggregator/src/test/scala/ai/chronon/aggregator/test/FrequentItemsTest.scala (1)
11-11: LGTM: Collection conversion updates.Standardized Scala-Java collection conversions using custom utility.
Also applies to: 160-160
orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (1)
8-8: LGTM: Explicit sequence conversions added.Added explicit toSeq conversions for better cross-version compatibility.
Also applies to: 109-109, 114-114, 125-125
spark/BUILD.bazel (2)
4-7: LGTM: Format attribute changes align with PR objectives.Format is correctly disabled for Scala 2.13 to work around ScalaFmt issues.
Also applies to: 96-99
48-48:❓ Verification inconclusive
Verify hadoop-common dependency version compatibility.
Please ensure the hadoop-common version is compatible with both Scala 2.12 and 2.13.
🏁 Script executed:
#!/bin/bash
# Check hadoop-common version compatibility
rg -A 5 "hadoop-common" tools/build_rules/spark/

Length of output: 48
Action Required: Confirm Hadoop Version Compatibility
The current artifact usage in
spark/BUILD.bazel does not specify a version, and our automated search in tools/build_rules/spark/ returned no details.
• Please manually verify that the hadoop-common dependency is set to a version compatible with Scala 2.12 and 2.13.

flink/src/main/scala/ai/chronon/flink/AsyncKVStoreWriter.scala (1)
23-23: LGTM: Import addition supports cross-build compatibility.The explicit import of
scala.collection.Seq ensures compatibility across Scala 2.12 and 2.13.

aggregator/src/test/scala/ai/chronon/aggregator/test/ApproxHistogramTest.scala (2)
9-9: LGTM: Using custom conversion utilities.Standardizing collection conversions using custom utilities.
147-147: LGTM: Consistent collection conversion methods.Using
toScala and toJava for explicit collection conversions.
Also applies to: 150-150

flink/src/test/scala/ai/chronon/flink/test/FlinkTestUtils.scala (2)
flink/src/test/scala/ai/chronon/flink/test/FlinkTestUtils.scala (2)
11-11: LGTM: Import changes support cross-build compatibility.Using explicit Seq import and ScalaJavaConversions.
Also applies to: 38-38
103-103: LGTM: Consistent collection conversions.Using
toScalafor explicit collection conversions.Also applies to: 116-116
hub/src/main/scala/ai/chronon/hub/store/MonitoringModelStore.scala (2)
27-27: LGTM: Import addition supports cross-build compatibility.Explicit import of
scala.collection.Seqensures compatibility.
60-60: LGTM: Explicit collection type conversion.Using
toSeqfor explicit collection type conversion.Also applies to: 81-81
orchestration/src/test/scala/ai/chronon/orchestration/test/RepoIndexSpec.scala (5)
12-12: LGTM!Import added for explicit Seq type usage.
49-49: LGTM!Explicit conversion to Seq improves type safety.
12-12: LGTM! Collection type conversion is handled correctly.The explicit import of
scala.collection.Seqand use oftoSeqaligns with Scala 2.13's collection changes.Also applies to: 49-49
12-12: LGTM! Import aligns with Scala 2.13 support.
49-49: LGTM! Explicit collection conversion improves cross-version compatibility.

orchestration/src/test/scala/ai/chronon/orchestration/test/RelevantLeftForJoinPartSpec.scala (3)
9-10: LGTM!Import added for explicit Seq type usage.
9-9: LGTM! Import aligns with Scala 2.13.The addition of
scala.collection.Seqimport ensures proper collection type handling.
9-9: LGTM! Import aligns with Scala 2.13 support.

spark/src/main/scala/ai/chronon/spark/scripts/ObservabilityDemo.scala (5)
31-31: LGTM!Import added for explicit Seq type usage.
149-149: LGTM!Explicit conversion to Seq improves type safety.
31-31: LGTM! Explicit collection type handling.The import and explicit
toSeqconversion ensure proper type handling in Scala 2.13.Also applies to: 149-149
31-31: LGTM! Import aligns with Scala 2.13 support.
149-149: LGTM! Explicit collection conversion improves cross-version compatibility.

flink/src/main/scala/ai/chronon/flink/TestFlinkJob.scala (5)
19-19: LGTM!Import added for standardized Java-Scala collection conversions.
135-135: LGTM!Explicit toScala conversion improves type safety.
Also applies to: 148-148
19-19: LGTM! Standardized collection conversions.The changes from
asScalatotoScalaand the addition ofScalaJavaConversionsimport improve Java-Scala interop.Also applies to: 135-135, 148-148
19-19: LGTM! Import enables standardized Java-to-Scala collection conversions.
135-136: LGTM! Standardized collection conversions improve cross-version compatibility.Also applies to: 148-148
tools/build_rules/dependencies/maven_repository.bzl (3)
3-3: LGTM: Dynamic version loading.Loading Scala version from config enables cross-compilation support.
11-12: LGTM: Dynamic core library versions.Using SCALA_VERSION variable for core libraries ensures version consistency.
13-160:✅ Verification successful
Verify dependency version compatibility.
The parallel 2.12/2.13 dependencies look good, but verify version compatibility across the ecosystem.
🌐 Web query:
Are the versions of scala-collection-compat (2.6.0), scalatest (3.2.15), and circe (0.14.9) compatible with both Scala 2.12 and 2.13?💡 Result:
The compatibility of these library versions with Scala 2.12 and 2.13 can be confirmed as follows:
scala-collection-compat (v2.x+)
- Explicitly designed for cross-building between Scala versions
- Supports Scala 2.x (including both 2.x)
- Provides backward compatibility for Scala APIs [1][5]
- Binary compatibility guaranteed from v onwards [1]
ScalaTest (v3.*
- Officially supports Scala and on JVM/JS/Native
- Version specifically updated to use:
- Scala (PR #2171)
- Scala (PR #2171) [6]
- Confirmed working with async tests and Scala features [6][8]
Circe (v0.*
- Cross-built for Scala [3]
- Version specifically listed in dependencies as compatible:
"io.circe" %% "circe-core" % "0.** // Works on both- Known to require
-Ymacro-annotationsflag only on [3]All three libraries meet the cross-compilation requirements through:
- Explicit multi-Scala version publishing (via
%%syntax in sbt)- Backward compatibility layers (scala-collection-compat)
- Version-specific compiler flag handling (Circe)
No version conflicts or incompatible API changes are reported in their respective release notes for these combinations[1][6][3].
Citations:
- 1: https://github.com/scala/scala-collection-compat
- 2: https://www.scalatest.org/release_notes/3.2.15
- 3: https://circe.github.io/circe/quickstart.html
- 4: https://community.cloudera.com/t5/Community-Articles/Spark-Scala-Version-Compatibility-Matrix/ta-p/383713
- 5: https://index.scala-lang.org/scala/scala-collection-compat
- 6: https://github.com/scalatest/scalatest/releases
- 7: https://cir.is/docs/overview
- 8: https://www.scalatest.org/release_notes/3.2.0
- 9: https://docs.scala-lang.org/overviews/core/collections-migration-213.html
- 10: https://docs.scala-lang.org/overviews/jdk-compatibility/overview.html
Dependency Compatibility Verified
The dependency versions for
scala-collection-compat (2.6.0), scalatest (3.2.15), and circe (0.14.9) have been confirmed to be cross-compatible with Scala 2.12 and 2.13. No further changes are needed.

online/src/main/scala/ai/chronon/online/stats/DriftStore.scala (2)
30-30: LGTM: Explicit Seq import.Improves clarity by explicitly importing Seq.
60-60: LGTM: Explicit Map conversion. Ensures consistent Map type across Scala versions.

online/src/main/scala/ai/chronon/online/FetcherMain.scala (2)
online/src/main/scala/ai/chronon/online/FetcherMain.scala (2)
5-5: LGTM: Custom conversion utilities.Using ScalaJavaConversions provides consistent conversion behavior.
166-172: LGTM: Consistent toScala usage. Using the toScala method ensures uniform conversion behavior.

orchestration/src/main/scala/ai/chronon/orchestration/RepoIndex.scala (2)
orchestration/src/main/scala/ai/chronon/orchestration/RepoIndex.scala (2)
11-11: LGTM: Explicit Seq import.Improves clarity by explicitly importing Seq.
161-166: LGTM: Explicit sequence conversion. Using toSeq ensures consistent sequence type.

spark/src/main/scala/ai/chronon/spark/BootstrapInfo.scala (1)
spark/src/main/scala/ai/chronon/spark/BootstrapInfo.scala (1)
111-112: LGTM! Explicit sequence conversions improve type safety.The changes make collection type handling more explicit and safer.
Also applies to: 146-147

cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala (1)
cloud_aws/src/main/scala/ai/chronon/integrations/aws/DynamoDBKVStoreImpl.scala (1)
102-103: LGTM! Consistent use of toScala/toJava improves code clarity.The changes standardize collection conversions across the codebase.
Also applies to: 135-135, 154-154, 188-188, 200-200, 223-223, 277-278, 295-296, 338-339

spark/src/main/scala/ai/chronon/spark/JoinUtils.scala (2)
spark/src/main/scala/ai/chronon/spark/JoinUtils.scala (2)
38-38: LGTM! Added explicit Seq import.The import ensures type safety by using the explicit Scala collection type.
154-154: LGTM! Explicit collection type conversion.The
toSeq conversions ensure type safety when working with collections.
Also applies to: 173-173

flink/src/main/scala/ai/chronon/flink/FlinkJob.scala (2)
flink/src/main/scala/ai/chronon/flink/FlinkJob.scala (2)
9-9: LGTM! Added necessary imports.The imports support standardized collection conversions.
Also applies to: 50-50
380-380: LGTM! Standardized Map conversion.Using
toJavafrom ScalaJavaConversions for consistent conversion.spark/src/main/scala/ai/chronon/spark/TableUtils.scala (3)
177-177: LGTM! Explicit Map conversion.Using
toMapensures immutability and type safety.
289-289: LGTM! Explicit Seq conversion.Using
toSeqensures type safety for partition columns.
796-797: LGTM! Improved null handling.Better handling of empty partitions by explicitly checking isEmpty.
spark/src/main/scala/ai/chronon/spark/GroupBy.scala (1)
743-743: LGTM! Explicit Seq conversion. Using toSeq ensures type safety when selecting columns.

api/src/main/scala/ai/chronon/api/Extensions.scala (1)
920-920: Add toMap for Scala 2.13 compatibility. The change ensures compatibility with Scala 2.13, where mapValues returns a view instead of a Map.
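An illustrative snippet (not from the PR) of why the explicit .toMap matters: on Scala 2.13, mapValues returns a lazy MapView, whereas on 2.12 it already returned a Map.

```scala
object MapValuesCompatSketch {
  val raw: Map[String, Int] = Map("a" -> 1, "b" -> 2)

  // Without .toMap this is a MapView on 2.13 and would not satisfy a Map[String, Int] type.
  val doubled: Map[String, Int] = raw.mapValues(_ * 2).toMap
}
```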
spark/src/main/scala/ai/chronon/spark/LogFlattenerJob.scala (1)
262-262: LGTM! Explicit conversion to Map improves type safety.
common:scala_2.12 --repo_env=SCALA_VERSION=2.12.18
common:scala_2.12 --define=SCALA_VERSION=2.12.18
common:scala_2.13 --repo_env=SCALA_VERSION=2.13.12
common:scala_2.13 --define=SCALA_VERSION=2.13.12
💡 Verification agent
🧩 Analysis chain
Verify Scala version compatibility.
Let's check if these are the latest compatible versions.
🌐 Web query:
What are the latest stable versions of Scala 2.12 and 2.13?
💡 Result:
As of February 2025, the latest stable versions of Scala are:
| Series | Latest Version | Release Date | Notable Features/Changes |
|---|---|---|---|
| Scala 2.13 | 2.13.16 | January 15, 2025 | • Breaking change: .tail/.init now throw exceptions on empty strings[1]• JDK 24 support via ASM upgrade[1] • JLine upgrades for REPL on Windows[1] |
| Scala 2.12 | 2.12.20 | September 4, 2024 | • Security vulnerability fixes[7] • Compatibility with newer JVMs[7] • Minimal maintenance continues for sbt compatibility[4] |
Key details:
- Scala 2.13.x remains actively maintained with regular updates aligning with Scala 3 features and JDK support[1][7]
- Scala 2.12.x receives security patches and JVM compatibility updates but minimal feature development[4][7]
- Both versions maintain binary compatibility within their respective series[5][7]
For new projects, consider migrating to Scala 3.x series which offers improved features and tooling support[3][4]. Existing projects should upgrade to these latest versions for security fixes and JDK compatibility improvements[7][10].
Citations:
- 1: https://github.com/scala/scala/releases
- 2: https://www.scala-lang.org/download/2.13.13.html
- 3: https://scala-cli.virtuslab.org/docs/reference/scala-versions/
- 4: https://www.scala-lang.org/development/
- 5: https://www.scala-lang.org/download/2.13.0.html
- 6: https://dzone.com/articles/scala-213-has-scala-done-it-again
- 7: https://www.scala-lang.org/download/2.12.20.html
- 8: https://www.scala-lang.org/download/2.13.14.html
- 9: https://www.scala-lang.org/download/changelog.html
- 10: https://www.scala-lang.org/download/2.12.19.html
Action: Update Scala Versions in .bazelrc
- Currently set: 2.12.18 and 2.13.12.
- Latest stable: 2.12.20 and 2.13.16 (Feb 2025).
- Verify if updating is feasible and document any migration details.
piyush-zlai
left a comment
Let's fix the equals and hashCode behavior prior to merging.
override def toString: String =
  s"TimestampedIR(ir=${ir.mkString(", ")}, latestTsMillis=$latestTsMillis)"

override def hashCode(): Int =
can we add an implementation of hashCode here given we have an equals?
Good catch. Sure added it back.
override def hashCode(): Int =
  Objects.hash(keys.toArray.deep, tileBytes, latestTsMillis.asInstanceOf[java.lang.Long])
s"TimestampedTile(keys=${keys.iterator().toScala.mkString(", ")}, tileBytes=${java.util.Base64.getEncoder
this isn't a legit hashCode implementation right?
Modified it.
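For reference, a deep, content-based hashCode for such a tile could look roughly like the sketch below; the field types (keys: java.util.List[Any], tileBytes: Array[Byte], latestTsMillis: Long) are assumed from the snippet above, and the class shape is illustrative rather than the actual FlinkTypes code.

```scala
import java.util.{Arrays, Objects}

// Illustrative stand-in: hash the byte contents rather than the array reference.
class TimestampedTileSketch(val keys: java.util.List[Any], val tileBytes: Array[Byte], val latestTsMillis: Long) {
  override def equals(other: Any): Boolean = other match {
    case that: TimestampedTileSketch =>
      keys == that.keys && Arrays.equals(tileBytes, that.tileBytes) && latestTsMillis == that.latestTsMillis
    case _ => false
  }

  override def hashCode(): Int =
    Objects.hash(keys, Int.box(Arrays.hashCode(tileBytes)), Long.box(latestTsMillis))
}
```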
// for derivations we only need to read one row at a time.
// for interactive we set the limit to 16.
.config("spark.sql.parquet.columnarReaderBatchSize", "16")
.config("spark.driver.bindAddress", "127.0.0.1")
is this needed? If so, can you add a 1-line comment explaining why?
Done
Actionable comments posted: 0
🧹 Nitpick comments (1)
orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (1)
97-98: Create issue for TODO and ignored test.The TODO and ignored test indicate unresolved iterator distinct functionality.
Would you like me to create an issue to track this?
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (2)
- orchestration/src/main/scala/ai/chronon/orchestration/utils/CollectionExtensions.scala (1 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (3 hunks)
✅ Files skipped from review due to trivial changes (1)
- orchestration/src/main/scala/ai/chronon/orchestration/utils/CollectionExtensions.scala
⏰ Context from checks skipped due to timeout of 90000ms (10)
- GitHub Check: streaming_tests
- GitHub Check: spark_tests
- GitHub Check: fetcher_tests
- GitHub Check: groupby_tests
- GitHub Check: join_tests
- GitHub Check: analyzer_tests
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: bazel_config_tests
- GitHub Check: non_spark_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (3)
orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (3)
8-8: LGTM: Explicit Seq import.Good practice for cross-version compatibility.
110-110: LGTM: Iterator to Seq conversion.Necessary change for consistent behavior across Scala versions.
Also applies to: 115-115
118-127:❓ Verification inconclusive
Verify distinct behavior for complex objects.
Test is ignored and assertion style changed, suggesting potential issues.
🏁 Script executed:
#!/bin/bash
# Search for other uses of distinct with complex objects
rg -l "distinct.*case class" --type scala

Length of output: 43
Distinct behavior: Verify test and assertions
- Test in orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (L118-127) remains disabled.
- Assertion style now differs; confirm that iterator.distinct reliably returns a unique sequence for complex objects.
- Please double-check if the test should be re-enabled and consistent assertions applied.
Summary
Changes to support builds/tests with both Scala 2.12 and 2.13 versions. By default we build against the 2.12 version; pass the "--config scala_2.13" option to "bazel build/test" to override it.
ScalaFmt seems to be breaking for 2.13 with the bazel rules_scala package. A fix (bazel-contrib/rules_scala#1631) is already deployed, but a release with that change is not available yet, so ScalaFmt checks are temporarily disabled for 2.13 and will be re-enabled once the fix is released.
Checklist
- [ ] Added Unit Tests
- [x] Covered by existing CI
- [ ] Integration tested
- [ ] Documentation update
Summary by CodeRabbit

New Features
- Enabled flexible Scala version selection (2.12 and 2.13) for smoother builds and enhanced compatibility.
- Introduced a default Scala version constant and a repository rule for improved version management.
- Added support for additional Scala 2.13 dependencies in the build configuration.

Refactor and Improvements
- Streamlined build and dependency management for increased stability and performance.
- Consolidated collection conversion utilities to boost reliability in tests and runtime processing.
- Enhanced type safety and clarity in collection handling across various modules.
- Improved handling of Scala collections and maps throughout the codebase for better type consistency and safety.
- Updated method implementations to ensure explicit type conversions, enhancing clarity and preventing runtime errors.
- Modified method signatures and internal logic to utilize Seq for improved type clarity and consistency.
- Enhanced the maven_artifact function to accept an optional version parameter for better dependency management.