refactor: Add Lombok annotations to hudi-utilities (Part 3) #47499
Triggered via pull request
January 15, 2026 06:00
Status: Success
Total duration: 1h 7m 57s
Artifacts: –
bot.yml
on: pull_request
validate-source: 42s
test-hudi-trino-plugin: 8m 12s
Matrix: build-flink-java17
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-flink-1
Matrix: test-flink-2
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-tests-part1
Matrix: test-spark-java-tests-part2
Matrix: test-spark-java-tests-part3
Matrix: test-spark-java11-17-java-tests-part1
Matrix: test-spark-java11-17-java-tests-part2
Matrix: test-spark-java11-17-java-tests-part3
Matrix: test-spark-java11-17-scala-dml-tests
Matrix: test-spark-java11-17-scala-other-tests
Matrix: test-spark-java17-java-tests-part1
Matrix: test-spark-java17-java-tests-part2
Matrix: test-spark-java17-java-tests-part3
Matrix: test-spark-java17-scala-dml-tests
Matrix: test-spark-java17-scala-other-tests
Matrix: test-spark-scala-dml-tests
Matrix: test-spark-scala-other-tests
Matrix: validate-bundle-spark4
Matrix: validate-bundles-java11
Matrix: validate-bundles
Annotations
19 errors and 146 warnings
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
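The warning itself names the Flink option that controls this check. A minimal sketch of turning the check off programmatically, assuming a job built via StreamExecutionEnvironment (the same key can go in flink-conf.yaml instead); the job body is hypothetical, and note this only silences the check, the classloader leak itself remains:

    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class DisableClassloaderLeakCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Suppress the closed-classloader check when the leak sits in a
            // third-party library, as the warning text suggests.
            conf.setString("classloader.check-leaked-classloader", "false");
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment(conf);
            env.fromElements(1, 2, 3).print();
            env.execute("leak-check-disabled-demo");
        }
    }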
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
org/apache/hudi/metaserver/HoodieMetaserver has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
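Class file version 55.0 is Java 11 bytecode and 52.0 is Java 8, so a Java 11-built HoodieMetaserver class is being loaded on a Java 8 runtime here. A small sketch for checking which major version a class file carries; the file path is hypothetical:

    import java.io.DataInputStream;
    import java.io.FileInputStream;

    public class ClassFileVersion {
        public static void main(String[] args) throws Exception {
            // Class file header: u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version.
            try (DataInputStream in = new DataInputStream(
                    new FileInputStream("HoodieMetaserver.class"))) { // hypothetical path
                in.readInt();                       // magic
                int minor = in.readUnsignedShort();
                int major = in.readUnsignedShort(); // 52 = Java 8, 55 = Java 11
                System.out.printf("major=%d minor=%d%n", major, minor);
            }
        }
    }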
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
org/apache/hudi/metaserver/HoodieMetaserver has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
org/apache/hudi/metaserver/HoodieMetaserver has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
org/apache/hudi/metaserver/HoodieMetaserver has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
test-spark-java11-17-java-tests-part2 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061825537), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061825524)
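This is Hudi's optimistic concurrency control rejecting one of two overlapping commits: the still-inflight writer conflicts with a commit that completed a few milliseconds earlier, which is likely what these test annotations capture. For context, a sketch of the writer-side options that enable multi-writer OCC, assuming a local Spark session with an in-process lock provider; the table name and path are hypothetical:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class OccWriteSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("hudi-occ-sketch").master("local[*]").getOrCreate();
            Dataset<Row> df = spark.range(100).withColumnRenamed("id", "key").toDF();
            df.write().format("hudi")
                    .option("hoodie.table.name", "occ_demo") // hypothetical
                    .option("hoodie.datasource.write.recordkey.field", "key")
                    .option("hoodie.datasource.write.precombine.field", "key")
                    .option("hoodie.datasource.write.keygenerator.class",
                            "org.apache.hudi.keygen.NonpartitionedKeyGenerator")
                    .option("hoodie.write.concurrency.mode", "optimistic_concurrency_control")
                    // LAZY cleaning lets a failed writer's files be cleaned up by the
                    // table service instead of being eagerly rolled back.
                    .option("hoodie.cleaner.policy.failed.writes", "LAZY")
                    .option("hoodie.write.lock.provider",
                            "org.apache.hudi.client.transaction.lock.InProcessLockProvider")
                    .mode(SaveMode.Append)
                    .save("/tmp/hudi/occ_demo"); // hypothetical
            spark.stop();
        }
    }

When two such writers touch the same file group in the same window, one commit fails with exactly this exception.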
test-spark-java17-java-tests-part2 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061755264), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061755245)
test-spark-java17-java-tests-part2 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061831573), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061831555)
test-spark-java11-17-java-tests-part2 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061942250), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061942231)
test-spark-java17-java-tests-part2 (scala-2.13, spark4.0, hudi-spark-datasource/hudi-spark4.0.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115062001036), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115062001016)
test-spark-java-tests-part2 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061833789), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061833771)
test-spark-java-tests-part2 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061848836), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061848818)
test-spark-java-tests-part2 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Cannot resolve conflicts for overlapping writes between first operation = ConcurrentOperation(actionState=INFLIGHT, actionType=commit, instantTime=20260115061948730), second operation = ConcurrentOperation(actionState=COMPLETED, actionType=commit, instantTime=20260115061948751)
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh done validating flink 2.x bundle
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh done validating Flink bundle validation was successful.
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh validating flink 2.0 bundle
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh done validating flink 2.x bundle
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh done validating Flink bundle validation was successful.
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink2.1, 1.11.4, 1.15.2, spark3.5, spark3.5.1)
validate.sh validating flink 2.0 bundle
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh done validating flink 2.x bundle
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh done validating Flink bundle validation was successful.
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh validating flink 2.0 bundle
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh done validating flink 2.x bundle
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh done validating Flink bundle validation was successful.
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
validate.sh validating flink 2.0 bundle
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Running tests with Java 17
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:3
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:2
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:1
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting hadoop hdfs
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh copying hadoop conf
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Done building Hudi
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Building Hudi
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh Running tests with Java 17
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh starting datanode:3
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh starting datanode:2
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh starting datanode:1
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh starting hadoop hdfs
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh copying hadoop conf
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh Done building Hudi
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh Building Hudi
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Running tests with Java 17
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:3
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:2
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting datanode:1
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh starting hadoop hdfs
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh copying hadoop conf
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Done building Hudi
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Building Hudi
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh Running tests with Java 17
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh starting datanode:3
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh starting datanode:2
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh starting datanode:1
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh starting hadoop hdfs
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh copying hadoop conf
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh Done building Hudi
docker-java17-test (scala-2.13, flink1.20, spark4.0, spark4.0.0)
docker_test_java17.sh Building Hudi
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh done validating cli bundle
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh setting up CLI bundle validation
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh validating cli bundle
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh done validating spark & hadoop-mr bundle
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
Use default java runtime under /opt/java/openjdk
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
Use default java runtime under /opt/java/openjdk
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundle-spark4 (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark4.0, spark4.0.0)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating cli bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up CLI bundle validation
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating cli bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating cli bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up CLI bundle validation
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating cli bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating cli bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up CLI bundle validation
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating cli bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating cli bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up CLI bundle validation
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating cli bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh skip validating cli bundle for Spark < 3.5 build
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
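These validate.sh steps trace a bundle check that writes a sample table through the Spark DataSource with Hive Sync enabled and then queries it back with Spark SQL and HiveQL. A rough sketch of that flow under assumed names (table, path, and metastore URI are hypothetical; the Hive Sync keys are standard Hudi options):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class HiveSyncValidationSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("hudi-hive-sync-sketch").master("local[*]").getOrCreate();
            Dataset<Row> df = spark.range(10).withColumnRenamed("id", "key").toDF();

            // Write sample data and sync the table to the Hive metastore.
            df.write().format("hudi")
                    .option("hoodie.table.name", "sample_tbl") // hypothetical
                    .option("hoodie.datasource.write.recordkey.field", "key")
                    .option("hoodie.datasource.write.precombine.field", "key")
                    .option("hoodie.datasource.write.keygenerator.class",
                            "org.apache.hudi.keygen.NonpartitionedKeyGenerator")
                    .option("hoodie.datasource.hive_sync.enable", "true")
                    .option("hoodie.datasource.hive_sync.mode", "hms")
                    .option("hoodie.datasource.hive_sync.metastore.uris",
                            "thrift://localhost:9083") // hypothetical
                    .mode(SaveMode.Overwrite)
                    .save("/tmp/hudi/sample_tbl"); // hypothetical

            // Query-and-validate step: read back and check the row count.
            spark.read().format("hudi").load("/tmp/hudi/sample_tbl")
                    .createOrReplaceTempView("sample_tbl");
            spark.sql("SELECT COUNT(*) FROM sample_tbl").show();
            spark.stop();
        }
    }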
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh skip validating cli bundle for Spark < 3.5 build
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh skip validating cli bundle for Spark < 3.5 build
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh skip validating cli bundle for Spark < 3.5 build
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
validate.sh validating spark & hadoop-mr bundle