
[HUDI-9336] Extract common logic of getting reader for secondary index #40527

Triggered via pull request April 25, 2025 22:07
Status: Success
Total duration: 51m 47s
Artifacts

bot.yml

on: pull_request
validate-source (32s)
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-functional-tests
Matrix: test-spark-java-unit-tests
Matrix: test-spark-java11-17-java-functional-tests
Matrix: test-spark-java11-17-java-unit-tests
Matrix: test-spark-java11-17-scala-dml-tests
Matrix: test-spark-java11-17-scala-other-tests
Matrix: test-spark-java17-java-functional-tests
Matrix: test-spark-java17-java-unit-tests
Matrix: test-spark-java17-scala-dml-tests
Matrix: test-spark-java17-scala-other-tests
Matrix: test-spark-scala-dml-tests
Matrix: test-spark-scala-other-tests
Matrix: validate-bundles

Annotations

12 errors and 210 warnings
The following validate-bundles jobs each reported the same Flink classloader error (twice each for the flink1.18 and flink1.17 jobs):
validate-bundles (scala-2.12, flink1.16, 1.11.1, spark3.4, spark3.4.3)
validate-bundles (scala-2.12, flink1.18, 1.11.1, spark3.5, spark3.5.1)
validate-bundles (scala-2.12, flink1.17, 1.11.1, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
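The message itself names the escape hatch: Flink's 'classloader.check-leaked-classloader' option. As a minimal sketch, assuming a local Flink environment used only for debugging (the class and environment below are illustrative, not part of this CI setup), the check could be disabled like this:

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DisableLeakCheckSketch {
    public static void main(String[] args) {
        // Switch suggested by the error message above; on a cluster this
        // would normally go into flink-conf.yaml instead:
        //   classloader.check-leaked-classloader: false
        Configuration conf = new Configuration();
        conf.setString("classloader.check-leaked-classloader", "false");

        // Hypothetical local environment, just to show where the setting is applied.
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.createLocalEnvironment(conf);
        System.out.println("parallelism: " + env.getParallelism());
    }
}
```

Note that the check exists to surface real classloader leaks, so disabling it is a debugging aid, not a fix.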
test-spark-java17-java-functional-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3....
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222134576, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222134574, actionState=COMPLETED'}
test-spark-java17-java-functional-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3....
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222158849, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222158853, actionState=COMPLETED'}
test-spark-java11-17-java-functional-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spar...
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222150958, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222150960, actionState=COMPLETED'}
test-spark-java-functional-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222127953, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222127955, actionState=COMPLETED'}
test-spark-java-functional-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222123307, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222123262, actionState=COMPLETED'}
test-spark-java11-17-java-functional-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spar...
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222152184, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222152209, actionState=COMPLETED'}
test-spark-java-functional-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250425222131558, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250425222131561, actionState=COMPLETED'}
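These failures are Hudi's concurrency control rejecting one of two overlapping commits whose instant times differ by only a few milliseconds. For orientation, here is a minimal sketch of a Spark DataSource write configured for optimistic concurrency control, the mode under which overlapping writes are conflict-checked and the losing commit fails; the table name, fields, path, and the in-process lock provider are illustrative assumptions, not the functional tests' actual configuration:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public class OccWriteSketch {
    // df and all names/paths below are illustrative assumptions.
    static void writeWithOcc(Dataset<Row> df) {
        df.write().format("hudi")
            .option("hoodie.table.name", "example_table")
            .option("hoodie.datasource.write.recordkey.field", "uuid")
            .option("hoodie.datasource.write.precombine.field", "ts")
            // Multi-writer mode: overlapping commits are conflict-checked at
            // commit time, producing errors like the ones above on overlap.
            .option("hoodie.write.concurrency.mode", "optimistic_concurrency_control")
            .option("hoodie.cleaner.policy.failed.writes", "LAZY")
            .option("hoodie.write.lock.provider",
                "org.apache.hudi.client.transaction.lock.InProcessLockProvider")
            .mode(SaveMode.Append)
            .save("/tmp/hudi/example_table");
    }
}
```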
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Building Hudi with Java 8
docker_test_java17.sh Done building Hudi with Java 8
docker_test_java17.sh copying hadoop conf
docker_test_java17.sh starting hadoop hdfs
docker_test_java17.sh starting datanode:1
docker_test_java17.sh starting datanode:2
docker_test_java17.sh starting datanode:3
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker_test_java17.sh Running tests with Java 17
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh Building Hudi with Java 8
docker_test_java17.sh Done building Hudi with Java 8
docker_test_java17.sh copying hadoop conf
docker_test_java17.sh starting hadoop hdfs
docker_test_java17.sh starting datanode:1
docker_test_java17.sh starting datanode:2
docker_test_java17.sh starting datanode:3
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker_test_java17.sh Running tests with Java 17
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Building Hudi with Java 8
docker_test_java17.sh Done building Hudi with Java 8
docker_test_java17.sh copying hadoop conf
docker_test_java17.sh starting hadoop hdfs
docker_test_java17.sh starting datanode:1
docker_test_java17.sh starting datanode:2
docker_test_java17.sh starting datanode:3
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker_test_java17.sh Running tests with Java 17
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
validate-bundles (scala-2.13, flink1.20, 1.11.3, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh validating cli bundle
validate.sh setting up CLI bundle validation
validate.sh done validating cli bundle
validate.sh validating utilities bundle
validate-bundles (scala-2.13, flink1.19, 1.11.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh validating cli bundle
validate.sh setting up CLI bundle validation
validate.sh done validating cli bundle
validate.sh validating utilities bundle
validate-bundles (scala-2.12, flink1.15, 1.10.0, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating cli bundle for non-spark3.5 build
validate.sh skip validating utilities bundle for non-spark3.5 build
validate-bundles (scala-2.12, flink1.16, 1.11.1, spark3.4, spark3.4.3)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating cli bundle for non-spark3.5 build
validate.sh skip validating utilities bundle for non-spark3.5 build
validate-bundles (scala-2.12, flink1.18, 1.11.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh validating cli bundle
validate.sh setting up CLI bundle validation
validate-bundles (scala-2.12, flink1.17, 1.11.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh validating cli bundle
validate.sh setting up CLI bundle validation
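The recurring step "Writing sample data via Spark DataSource and run Hive Sync..." corresponds to a Hudi write with Hive sync enabled against the metastore that validate.sh sets up. A minimal sketch of such a write, assuming standard Hudi data source options; the dataset, table names, and paths are illustrative, since validate.sh's actual invocation is not shown in this log:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public class HiveSyncWriteSketch {
    // df and all names/paths below are illustrative assumptions.
    static void writeAndSync(Dataset<Row> df) {
        df.write().format("hudi")
            .option("hoodie.table.name", "sample_table")
            .option("hoodie.datasource.write.recordkey.field", "uuid")
            .option("hoodie.datasource.write.partitionpath.field", "dt")
            // Sync the written table into the Hive metastore so it can be
            // queried via Spark SQL and HiveQL, as the later steps do.
            .option("hoodie.datasource.hive_sync.enable", "true")
            .option("hoodie.datasource.hive_sync.mode", "hms")
            .option("hoodie.datasource.hive_sync.database", "default")
            .option("hoodie.datasource.hive_sync.table", "sample_table")
            .option("hoodie.datasource.hive_sync.partition_fields", "dt")
            .mode(SaveMode.Overwrite)
            .save("/tmp/hudi/sample_table");
    }
}
```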