2 changes: 2 additions & 0 deletions .github/workflows/ci.yml
@@ -20,6 +20,8 @@ jobs:
test: spark3-iceberg
- image: spark3-delta
test: spark3-delta
- image: spark3-hudi
Member

Can you enable tests for this image? We just recently added them to almost all images. See bin/test.sh. The test is just a simple smoke test to see if a container using this image will start.

Contributor Author
done
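For reference, a container-start smoke test along the lines described above could look roughly like this; the compose file path and service name are taken from this PR, but the real logic lives in bin/test.sh and may differ:

#!/usr/bin/env bash
# Illustrative sketch only -- the actual smoke test is implemented in bin/test.sh.
set -euo pipefail

COMPOSE_FILE=etc/compose/spark3-hudi/docker-compose.yml

# Start the spark service and give it a moment to initialize.
docker-compose -f "${COMPOSE_FILE}" up -d spark
sleep 30

# Verify the container is still running; dump logs and fail otherwise.
CONTAINER_ID=$(docker-compose -f "${COMPOSE_FILE}" ps -q spark)
if [ "$(docker inspect -f '{{.State.Status}}' "${CONTAINER_ID}")" != "running" ]; then
    docker-compose -f "${COMPOSE_FILE}" logs spark
    echo "spark3-hudi container failed to start" >&2
    exit 1
fi

docker-compose -f "${COMPOSE_FILE}" down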

test: spark3-hudi
- image: kerberos
test: kerberos
- image: gpdb-6
4 changes: 4 additions & 0 deletions etc/compose/spark3-hudi/docker-compose.yml
@@ -0,0 +1,4 @@
version: '2.0'
services:
spark:
image: testing/spark3-hudi:latest
53 changes: 53 additions & 0 deletions testing/spark3-hudi/Dockerfile
@@ -0,0 +1,53 @@
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

FROM testing/centos7-oj11:unlabelled

ARG SPARK_VERSION=3.2.1
Contributor
@findinpath findinpath Oct 12, 2022

From https://hudi.apache.org/docs/quick-start-guide/

Hudi     Supported Spark 3 versions
0.12.x   3.3.x (default build), 3.2.x, 3.1.x
0.11.x   3.2.x (default build, Spark bundle only), 3.1.x

Given that Hudi is a fresh connector in the Trino ecosystem, let's use the latest Hudi 0.12.x, which supports the latest Spark 3.3.0 and can run on Java 8/11/17.

Let's therefore build on top of the testing/centos7-oj17:unlabelled base image.

Contributor Author

I have kept the same version as in trino-hudi. The plan is to upgrade to Hudi 0.12.1, which will be out very soon. Then I'll make the changes here as well in a follow-up PR.

Member

@codope Doesn't Hudi support different versions between server and client?
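Relatedly, if the Hudi/Spark combination suggested above is adopted later, the build arguments below could in principle be overridden at build time instead of editing the Dockerfile. This is only a hypothetical invocation, not something this PR wires up; it assumes the centos7-oj base image is already built locally, and note that the Spark 3.3.x download artifacts use a hadoop3 suffix rather than hadoop3.2:

# Hypothetical build invocation picking Spark 3.3.0 + Hudi 0.12.0;
# the image is normally built by the project's own tooling with the defaults below.
docker build \
    --build-arg SPARK_VERSION=3.3.0 \
    --build-arg HADOOP_VERSION=3 \
    --build-arg HUDI_VERSION=0.12.0 \
    -t testing/spark3-hudi:latest \
    testing/spark3-hudi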

ARG HADOOP_VERSION=3.2
ARG HUDI_VERSION=0.11.1
ARG SCALA_VERSION=2.12

ARG SPARK_ARTIFACT="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}"

ENV SPARK_HOME=/spark

RUN set -xeu; \
wget -nv "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_ARTIFACT}.tgz"; \
tar -xf ${SPARK_ARTIFACT}.tgz; \
rm ${SPARK_ARTIFACT}.tgz; \
ln -sn /${SPARK_ARTIFACT} ${SPARK_HOME}

WORKDIR ${SPARK_HOME}/jars

# Install the AWS SDK so we can access S3; the version must match the hadoop-* jars that are part of the Spark distribution
RUN wget -nv "https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.1/hadoop-aws-3.3.1.jar"
RUN wget -nv "https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.12.48/aws-java-sdk-bundle-1.12.48.jar"

# install Hudi
RUN wget -nv "https://repo1.maven.org/maven2/org/apache/hudi/hudi-spark3-bundle_${SCALA_VERSION}/${HUDI_VERSION}/hudi-spark3-bundle_${SCALA_VERSION}-${HUDI_VERSION}.jar"

# Create Hive user to match Hive container
RUN adduser hive

ENV PATH="${SPARK_HOME}/bin:${PATH}"

EXPOSE 10213

HEALTHCHECK --interval=10s --timeout=5s --start-period=10s \
CMD curl -f http://localhost:10213/
CMD spark-submit \
--master "local[*]" \
--class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
--name "Thrift JDBC/ODBC Server" \
--conf spark.hive.server2.thrift.port=10213 \
spark-internal
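
Once a container from this image is running, one way to sanity-check the Thrift server started by the CMD above is to connect with the beeline client shipped in the Spark distribution and run a couple of trivial Hudi statements; the table name here is purely illustrative:

# Illustrative check against the Thrift JDBC server exposed on port 10213.
beeline -u jdbc:hive2://localhost:10213 \
    -e "CREATE TABLE hudi_smoke_test (id BIGINT, name STRING) USING hudi" \
    -e "INSERT INTO hudi_smoke_test VALUES (1, 'hello')" \
    -e "SELECT * FROM hudi_smoke_test"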