# Changes to fix all the test failures with Bazel #310
## Conversation
```text
# Conflicts:
#	flink/src/main/scala/ai/chronon/flink/FlinkJob.scala
```
Actionable comments posted: 0
🧹 Nitpick comments (3)
service_commons/src/main/java/ai/chronon/service/RouteHandlerWrapper.java (3)
Line range hint 130-156: Add Javadoc for the public method. The method looks good but needs documentation since it's now public.
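For illustration, a hedged sketch of such Javadoc on a reflective POJO-to-map converter. The converter body and the `Example` class here are hypothetical stand-ins, not the project's actual implementation:

```java
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

public class JavadocSketch {

    /**
     * Converts the declared fields of a POJO into a map of field name to
     * stringified field value. Null-valued fields are skipped.
     *
     * @param pojo the object whose fields are converted; must be non-null
     * @return a mutable map from field name to the field's string representation
     * @throws RuntimeException if a field cannot be read via reflection
     */
    public static Map<String, String> convertPojoToMap(Object pojo) {
        Map<String, String> result = new HashMap<>();
        for (Field field : pojo.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            try {
                Object value = field.get(pojo);
                if (value != null) {
                    result.put(field.getName(), String.valueOf(value));
                }
            } catch (IllegalAccessException e) {
                throw new RuntimeException("Error converting field " + field.getName(), e);
            }
        }
        return result;
    }

    // Hypothetical POJO used only to exercise the sketch above.
    static class Example {
        String name = "alice";
        int count = 3;
    }

    public static void main(String[] args) {
        Map<String, String> m = convertPojoToMap(new Example());
        System.out.println(m.get("name"));   // alice
        System.out.println(m.get("count"));  // 3
    }
}
```

The point of the Javadoc is to spell out the null-handling and failure behavior, which is exactly what becomes load-bearing once the method is public.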
240-262: Enhance error handling. Add more context to the error message:

```diff
- throw new RuntimeException("Error converting field " + entry.getKey(), e);
+ throw new RuntimeException("Error converting field '" + entry.getKey() + "' of class " + pojo.getClass().getSimpleName(), e);
```
288-333: Consider breaking down the method. Large method with complex logic; consider extracting type-specific conversions into separate methods.

```diff
 private static String convertToString(Object value) {
   if (value == null) return null;
   Class<?> valueClass = value.getClass();
-  // Handle primitive types and their wrappers
-  if (valueClass.isPrimitive() ||
-      Number.class.isAssignableFrom(valueClass) ||
-      Boolean.class == valueClass ||
-      String.class == valueClass) {
-    return String.valueOf(value);
-  }
+  if (isPrimitiveOrWrapper(valueClass)) return String.valueOf(value);
+  if (valueClass.isEnum()) return convertEnumToString(value);
+  if (List.class.isAssignableFrom(valueClass)) return convertListToString((List<?>) value);
+  if (Map.class.isAssignableFrom(valueClass)) return convertMapToString((Map<?, ?>) value);
   // ... rest of the method
 }
+
+private static boolean isPrimitiveOrWrapper(Class<?> clazz) {
+  return clazz.isPrimitive() ||
+      Number.class.isAssignableFrom(clazz) ||
+      Boolean.class == clazz ||
+      String.class == clazz;
+}
```
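A minimal, runnable sketch of the suggested decomposition. Helper names follow the review suggestion, but the enum and list branches are simplified stand-ins and the map branch is omitted:

```java
import java.util.List;
import java.util.stream.Collectors;

public class ConvertSketch {

    // Extracted predicate, as suggested: primitives, numeric/boolean wrappers, and String.
    static boolean isPrimitiveOrWrapper(Class<?> clazz) {
        return clazz.isPrimitive()
            || Number.class.isAssignableFrom(clazz)
            || Boolean.class == clazz
            || String.class == clazz;
    }

    // Dispatcher: each supported type family gets its own small branch.
    static String convertToString(Object value) {
        if (value == null) return null;
        Class<?> valueClass = value.getClass();
        if (isPrimitiveOrWrapper(valueClass)) return String.valueOf(value);
        if (valueClass.isEnum()) return ((Enum<?>) value).name();
        if (List.class.isAssignableFrom(valueClass)) {
            return ((List<?>) value).stream()
                .map(ConvertSketch::convertToString)
                .collect(Collectors.joining(",", "[", "]"));
        }
        // Fallback for types this sketch does not model (maps, POJOs, ...).
        return String.valueOf(value);
    }

    enum Color { RED, GREEN }

    public static void main(String[] args) {
        System.out.println(convertToString(42));             // 42
        System.out.println(convertToString(Color.RED));      // RED
        System.out.println(convertToString(List.of(1, 2)));  // [1,2]
    }
}
```

The win is that each branch becomes independently testable, and the top-level method reads as a type dispatch table.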
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (1)
service_commons/src/main/java/ai/chronon/service/RouteHandlerWrapper.java (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (8)
- GitHub Check: fetcher_spark_tests
- GitHub Check: table_utils_delta_format_spark_tests
- GitHub Check: other_spark_tests
- GitHub Check: mutation_spark_tests
- GitHub Check: no_spark_scala_tests
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: join_spark_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (1)
service_commons/src/main/java/ai/chronon/service/RouteHandlerWrapper.java (1)
264-286: LGTM! Clean implementation with proper edge case handling.
```diff
  * @param latestTsMillis timestamp of the current event being processed
  */
-class TimestampedTile(var keys: List[Any], var tileBytes: Array[Byte], var latestTsMillis: Long) {
+class TimestampedTile(var keys: Seq[Any], var tileBytes: Array[Byte], var latestTsMillis: Long) {
```
leave a comment
done
```java
  throw new IllegalArgumentException("Unsupported type: " + targetType.getTypeName());
}

public static Map<String, String> convertPojoToMap(Object pojo) {
```
remove these
done
```scala
 * sbt spark/test -> Will skip the test suite
 * sbt "spark/testOnly -- -n foo" -> Will include the tests in the suite if tagName = foo
 * This allows us to skip some tests selectively by default while still being able to invoke them individually
 */
```
add comment - or we need to rely on online test
done
Actionable comments posted: 0
🧹 Nitpick comments (7)
cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigTableKVStoreTest.scala (1)
Line range hint 204-226: Consider using MockitoSugar's mock syntax. The mock creation can be simplified using ScalaTest's MockitoSugar trait.

```diff
- val mockDataClient = mock[BigtableDataClient](withSettings().mockMaker("mock-maker-inline"))
+ val mockDataClient = mock[BigtableDataClient]
```

flink/src/test/scala/org/apache/spark/sql/avro/AvroDeSerializationSupportSpec.scala (1)
Line range hint 71-106: Add error handling. Consider adding validation for required fields and error handling for object creation.
online/src/test/scala/ai/chronon/online/test/stats/AssignIntervalsTest.scala (2)
13-14: Remove commented code and consider standardizing the number format.

```diff
- //val interval = 0.25
  val expected = Array(0.0, 1.0 / 3.0, 1.0 / 3.0, 1.0 / 3.0 + 1.0 / 2.0, 1.0 / 2.0, 2.5, 0.5, 1, 0)
```
16-21: Remove the debug println and consider using scalatest's built-in diff output.

```diff
  val result = AssignIntervals.on(ptiles = percentiles.map(_.toDouble), breaks = breaks.map(_.toDouble))
  expected.zip(result).foreach { case (e, r) =>
-   println(s"exp: $e res: $r")
    r shouldEqual e
  }
```
95-99: Consider using a helper method for timer verification. Multiple test methods use identical timer verification blocks; extract this pattern to reduce duplication.

```diff
+ private def verifyErrorResponse(async: TestContext#Async, expectedCode: Int): Unit = {
+   vertx.setTimer(1000, _ => {
+     verify(response).setStatusCode(expectedCode)
+     async.complete()
+   })
+ }

- vertx.setTimer(1000,
-   _ => {
-     verify(response).setStatusCode(400)
-     async.complete()
-   })
+ verifyErrorResponse(async, 400)
```

Also applies to: 112-116, 128-132, 144-148, 160-164, 182-186, 242-246, 260-264, 277-281, 294-298, 311-315, 328-332, 345-349
513-518: Consider using a case class for query parameters. The parameter lists are long and duplicated; a case class would improve maintainability.

```diff
+ private case class QueryParams(
+   startTs: Long,
+   endTs: Long,
+   metricType: String,
+   metrics: String,
+   offset: Option[String] = None,
+   algorithm: Option[String] = None,
+   granularity: Option[String] = None
+ )
```

Also applies to: 531-537
566-576: Simplify mock data generation logic. The conditional logic for numeric vs categorical data can be simplified using pattern matching.

```diff
- val percentileDrifts =
-   if (isNumeric) List.fill(timestamps.size())(JDouble.valueOf(0.12)).asJava
-   else List.fill[JDouble](timestamps.size())(null).asJava
- val histogramDrifts =
-   if (isNumeric) List.fill[JDouble](timestamps.size())(null).asJava
-   else List.fill(timestamps.size())(JDouble.valueOf(0.23)).asJava
+ val (percentileDrifts, histogramDrifts) = isNumeric match {
+   case true => (
+     List.fill(timestamps.size())(JDouble.valueOf(0.12)).asJava,
+     List.fill[JDouble](timestamps.size())(null).asJava
+   )
+   case false => (
+     List.fill[JDouble](timestamps.size())(null).asJava,
+     List.fill(timestamps.size())(JDouble.valueOf(0.23)).asJava
+   )
+ }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (39)
- aggregator/src/test/scala/ai/chronon/aggregator/test/ApproxHistogramTest.scala (4 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/DataGen.scala (3 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/FrequentItemsTest.scala (3 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/RowAggregatorTest.scala (1 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/SawtoothOnlineAggregatorTest.scala (1 hunks)
- aggregator/src/test/scala/ai/chronon/aggregator/test/TwoStackLiteAggregatorTest.scala (4 hunks)
- api/src/test/scala/ai/chronon/api/test/DataTypeConversionTest.scala (1 hunks)
- api/src/test/scala/ai/chronon/api/test/TileSeriesSerializationTest.scala (2 hunks)
- cloud_aws/src/test/scala/ai/chronon/integrations/aws/DynamoDBKVStoreTest.scala (1 hunks)
- cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigTableKVStoreTest.scala (6 hunks)
- flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala (1 hunks)
- flink/src/test/scala/ai/chronon/flink/test/AsyncKVStoreWriterTest.scala (5 hunks)
- flink/src/test/scala/ai/chronon/flink/test/FlinkJobIntegrationTest.scala (5 hunks)
- flink/src/test/scala/ai/chronon/flink/test/FlinkTestUtils.scala (4 hunks)
- flink/src/test/scala/ai/chronon/flink/test/SchemaRegistrySchemaProviderSpec.scala (3 hunks)
- flink/src/test/scala/ai/chronon/flink/test/UserAvroSchema.scala (1 hunks)
- flink/src/test/scala/org/apache/spark/sql/avro/AvroDeSerializationSupportSpec.scala (4 hunks)
- hub/src/test/scala/ai/chronon/hub/handlers/ConfHandlerTest.scala (8 hunks)
- hub/src/test/scala/ai/chronon/hub/handlers/DriftHandlerTest.scala (9 hunks)
- hub/src/test/scala/ai/chronon/hub/handlers/TimeSeriesHandlerTest.scala (28 hunks)
- online/src/test/scala/ai/chronon/online/test/CatalystUtilHiveUDFTest.scala (1 hunks)
- online/src/test/scala/ai/chronon/online/test/CatalystUtilTest.scala (1 hunks)
- online/src/test/scala/ai/chronon/online/test/DataRangeTest.scala (2 hunks)
- online/src/test/scala/ai/chronon/online/test/FetcherBaseTest.scala (4 hunks)
- online/src/test/scala/ai/chronon/online/test/LRUCacheTest.scala (0 hunks)
- online/src/test/scala/ai/chronon/online/test/stats/AssignIntervalsTest.scala (1 hunks)
- online/src/test/scala/ai/chronon/online/test/stats/PivotUtilsTest.scala (11 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (3 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/RelevantLeftForJoinPartSpec.scala (9 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/RepoIndexSpec.scala (4 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/SequenceMapSpec.scala (1 hunks)
- orchestration/src/test/scala/ai/chronon/orchestration/test/TimeExpressionSpec.scala (1 hunks)
- service_commons/src/main/java/ai/chronon/service/RouteHandlerWrapper.java (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/AnalyzerTest.scala (2 hunks)
- spark/src/test/scala/ai/chronon/spark/test/ResultValidationAbilityTest.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/SchemaEvolutionTest.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/StatsComputeTest.scala (3 hunks)
- spark/src/test/scala/ai/chronon/spark/test/TaggedFilterSuite.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/stats/drift/DriftTest.scala (1 hunks)
💤 Files with no reviewable changes (1)
- online/src/test/scala/ai/chronon/online/test/LRUCacheTest.scala
✅ Files skipped from review due to trivial changes (24)
- spark/src/test/scala/ai/chronon/spark/test/stats/drift/DriftTest.scala
- orchestration/src/test/scala/ai/chronon/orchestration/test/SequenceMapSpec.scala
- online/src/test/scala/ai/chronon/online/test/DataRangeTest.scala
- online/src/test/scala/ai/chronon/online/test/CatalystUtilTest.scala
- cloud_aws/src/test/scala/ai/chronon/integrations/aws/DynamoDBKVStoreTest.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/RowAggregatorTest.scala
- hub/src/test/scala/ai/chronon/hub/handlers/DriftHandlerTest.scala
- spark/src/test/scala/ai/chronon/spark/test/AnalyzerTest.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/FrequentItemsTest.scala
- online/src/test/scala/ai/chronon/online/test/CatalystUtilHiveUDFTest.scala
- orchestration/src/test/scala/ai/chronon/orchestration/test/TimeExpressionSpec.scala
- spark/src/test/scala/ai/chronon/spark/test/SchemaEvolutionTest.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/ApproxHistogramTest.scala
- orchestration/src/test/scala/ai/chronon/orchestration/test/RepoIndexSpec.scala
- api/src/test/scala/ai/chronon/api/test/TileSeriesSerializationTest.scala
- spark/src/test/scala/ai/chronon/spark/test/ResultValidationAbilityTest.scala
- hub/src/test/scala/ai/chronon/hub/handlers/ConfHandlerTest.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/SawtoothOnlineAggregatorTest.scala
- orchestration/src/test/scala/ai/chronon/orchestration/test/RelevantLeftForJoinPartSpec.scala
- spark/src/test/scala/ai/chronon/spark/test/StatsComputeTest.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/DataGen.scala
- flink/src/test/scala/ai/chronon/flink/test/SchemaRegistrySchemaProviderSpec.scala
- aggregator/src/test/scala/ai/chronon/aggregator/test/TwoStackLiteAggregatorTest.scala
- online/src/test/scala/ai/chronon/online/test/stats/PivotUtilsTest.scala
🚧 Files skipped from review as they are similar to previous changes (5)
- flink/src/main/scala/ai/chronon/flink/types/FlinkTypes.scala
- flink/src/test/scala/ai/chronon/flink/test/UserAvroSchema.scala
- service_commons/src/main/java/ai/chronon/service/RouteHandlerWrapper.java
- spark/src/test/scala/ai/chronon/spark/test/TaggedFilterSuite.scala
- flink/src/test/scala/ai/chronon/flink/test/FlinkJobIntegrationTest.scala
⏰ Context from checks skipped due to timeout of 90000ms (8)
- GitHub Check: table_utils_delta_format_spark_tests
- GitHub Check: other_spark_tests
- GitHub Check: mutation_spark_tests
- GitHub Check: join_spark_tests
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: fetcher_spark_tests
- GitHub Check: no_spark_scala_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (21)
orchestration/src/test/scala/ai/chronon/orchestration/test/CollectionExtensionsTest.scala (2)
87-91: LGTM! Clean formatting.
124-124: LGTM! Better spacing.
cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigTableKVStoreTest.scala (2)
33-49: LGTM! Clean implementation of EmulatorWrapper. The class effectively manages the Bigtable emulator lifecycle with clear before/after methods.
Line range hint 55-84: LGTM! Test setup correctly uses EmulatorWrapper. The setup properly initializes the emulator and configures the test clients.
flink/src/test/scala/ai/chronon/flink/test/FlinkTestUtils.scala (3)
11-11: LGTM! Import changes align with the new return types and Java collection requirements.
Also applies to: 23-24
44-47: LGTM! Return type change matches the broader codebase updates, and Java collection conversion is properly handled.
59-62: LGTM! Consistent with E2EEventSource changes while preserving watermark functionality.
flink/src/test/scala/org/apache/spark/sql/avro/AvroDeSerializationSupportSpec.scala (2)
3-3: LGTM! Clean import addition and improved class formatting.
Also applies to: 15-21
27-27: LGTM! Schema retrieval simplified across all test cases.
Also applies to: 42-45, 59-59
online/src/test/scala/ai/chronon/online/test/FetcherBaseTest.scala (4)
148-148: LGTM! Clean alignment of pattern match cases.
233-236: LGTM! Clean map formatting.
249-252: LGTM! Clean map formatting.
265-268: LGTM! Clean map formatting.
api/src/test/scala/ai/chronon/api/test/DataTypeConversionTest.scala (2)
34-44: LGTM! Improved readability. The reformatting enhances code readability while preserving functionality.
34-45: Verify test coverage with Bazel. Since this PR aims to fix Bazel test failures, let's verify the test execution.
✅ Verification successful
Test file inclusion verified in Bazel targets.
- The BUILD.bazel file is present under api/src/test.
- The DataTypeConversionTest.scala file is detected, confirming its inclusion as a Bazel test target.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Description: Check if this test is included in Bazel test targets
# Find the corresponding BUILD file
fd -t f "BUILD.bazel" "api/src/test"
# Search for test target that includes this file
rg "DataTypeConversionTest" "api/src/test"
```

Length of output: 194
online/src/test/scala/ai/chronon/online/test/stats/AssignIntervalsTest.scala (1)
9-11: LGTM! Clear test description with well-formatted input data.

hub/src/test/scala/ai/chronon/hub/handlers/TimeSeriesHandlerTest.scala (1)
24-24: LGTM! Proper mock cleanup implementation. The addition of `AutoCloseable` mocks and a `tearDown` method ensures proper cleanup of Mockito resources.
Also applies to: 57-57, 67-67, 79-84
flink/src/test/scala/ai/chronon/flink/test/AsyncKVStoreWriterTest.scala (4)
7-8: LGTM: Import changes align with Java migration. Appropriate replacement of Scala-specific imports with Java equivalents.
Also applies to: 12-14
26-30: LGTM: Stream creation and result collection updates. Proper migration from Scala collections to Java streams.
Also applies to: 41-41, 52-55, 67-67
16-16: Verify removal of TaggedFilterSuite. The trait removal might affect test filtering capabilities.
✅ Verification successful
Removal Verified:
- `AsyncKVStoreWriterTest.scala` now extends `AnyFlatSpec` without `TaggedFilterSuite`.
- `TaggedFilterSuite` remains in use in the online and spark tests, which is likely intentional.

🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check if TaggedFilterSuite is used elsewhere
rg "TaggedFilterSuite" -l
```

Length of output: 418
72-72: Verify if tagName is still needed. The commented-out override suggests uncertainty about its necessity.
✅ Verification successful
AsyncKVStoreWriterTest does not need a custom tagName override.
- Other tests override tagName only when a custom tag is required.
- This file appears to rely on the default implementation, so the commented code can be removed.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check for other tagName usages
rg "def tagName" -A 2
```

Length of output: 1857
```scala
    capacity: Int = kvStoreConcurrency): DataStream[WriteResponse] = {
  // We use the Java API here as we have encountered issues in integration tests in the
  // past using the Scala async datastream API.
  new DataStream(
```
lets restore this - will potentially crop up as CI failures
```scala
.aggregate(
  // See Flink's "ProcessWindowFunction with Incremental Aggregation"
  preAggregator = new FlinkRowAggregationFunction(groupByServingInfoParsed.groupBy, inputSchema),
  windowFunction = new FlinkRowAggProcessFunction(groupByServingInfoParsed.groupBy, inputSchema)
```
nice to revert to this form from a readability standpoint
```scala
import org.apache.avro.Schema
import org.apache.avro.SchemaBuilder

object UserAvroSchema {
```
can we revert to using the avsc file instead? It's a lot more readable than this..
```scala
class AvroDeSerializationSupportSpec extends AnyFlatSpec {

  it should "deserialize avro data" in {
    val schemaSrc = Source.fromURL(getClass.getClassLoader.getResource("user.avsc"))
```
can we revert to this?
```python
maven_artifact("junit:junit"),
maven_artifact("org.junit.platform:junit-platform-launcher"),
maven_artifact("org.junit.platform:junit-platform-reporting"),
maven_artifact("net.bytebuddy:byte-buddy"),
```
curious what is this needed for?
## Summary
Fixed all the test failures after the Bazel migration. The known remaining failures are 3 Spark tests that deal with resource loading of test data; we plan to temporarily disable them for now, and we have a way to fix them after killing sbt.

## Checklist
- [ ] Added Unit Tests
- [x] Covered by existing CI
- [ ] Integration tested
- [ ] Documentation update

<!-- This is an auto-generated comment: release notes by coderabbit.ai -->
## Summary by CodeRabbit
Based on the comprehensive summary of changes, here are the concise release notes:

## Release Notes
- **New Features**
  - Added support for Bazelisk installation in Docker environment
  - Enhanced Scala and Java dependency management
  - Expanded Protobuf and gRPC support
  - Introduced new testing utilities and frameworks
- **Dependency Updates**
  - Updated Flink, Spark, and Kafka-related dependencies
  - Added new Maven artifacts for Avro, Thrift, and testing libraries
  - Upgraded various library versions
- **Testing Improvements**
  - Introduced `scala_junit_suite` for more flexible test execution
  - Added new test resources and configurations
  - Enhanced test coverage and dependency management
- **Build System**
  - Updated Bazel build configurations
  - Improved dependency resolution and repository management
  - Added new build rules and scripts
- **Code Quality**
  - Refactored package imports and type conversions
  - Improved code formatting and readability
  - Streamlined dependency handling across modules
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: nikhil-zlai <[email protected]>