Add spark3-hudi image #136
Changes from all commits
First file — a Docker Compose service definition (new file, 4 lines):

```yaml
version: '2.0'
services:
  spark:
    image: testing/spark3-hudi:latest
```
Second file — the image's Dockerfile (new file, 53 lines), shown up to `ARG SPARK_VERSION`, where a review thread is attached:

```dockerfile
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

FROM testing/centos7-oj11:unlabelled

ARG SPARK_VERSION=3.2.1
```
**Contributor** commented on `ARG SPARK_VERSION`:

> From https://hudi.apache.org/docs/quick-start-guide/ — given that Hudi is a fresh connector in the Trino ecosystem, let's make use of the latest Hudi. Let's therefore build on top of

**Author** replied:

> I have kept the same version as in trino-hudi. The plan is to upgrade to Hudi 0.12.1, which will be out very soon. Then I'll make the changes here as well in a follow-up PR.

**Member** replied:

> @codope Hudi doesn't support different versions between server and client?
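For context, the Hudi quick-start guide referenced above launches Spark with the Hudi bundle along these lines. This is a sketch, not part of the PR; the `--packages` coordinate and conf flags are taken from the guide, with versions matching this image's ARGs:

```shell
# Sketch based on the Hudi quick-start guide (assumed invocation, not from this PR).
# Versions mirror the Dockerfile: Scala 2.12, Hudi 0.11.1.
spark-shell \
  --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.11.1 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```

In this image the bundle jar is instead baked into `${SPARK_HOME}/jars`, so no `--packages` download is needed at container start.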
The Dockerfile continues:

```dockerfile
ARG HADOOP_VERSION=3.2
ARG HUDI_VERSION=0.11.1
ARG SCALA_VERSION=2.12

ARG SPARK_ARTIFACT="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}"

ENV SPARK_HOME=/spark

RUN set -xeu; \
    wget -nv "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_ARTIFACT}.tgz"; \
    tar -xf ${SPARK_ARTIFACT}.tgz; \
    rm ${SPARK_ARTIFACT}.tgz; \
    ln -sn /${SPARK_ARTIFACT} ${SPARK_HOME}

WORKDIR ${SPARK_HOME}/jars

# install AWS SDK so we can access S3; the version must match the hadoop-* jars which are part of the Spark distribution
RUN wget -nv "https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.1/hadoop-aws-3.3.1.jar"
RUN wget -nv "https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.12.48/aws-java-sdk-bundle-1.12.48.jar"

# install Hudi
RUN wget -nv "https://repo1.maven.org/maven2/org/apache/hudi/hudi-spark3-bundle_${SCALA_VERSION}/${HUDI_VERSION}/hudi-spark3-bundle_${SCALA_VERSION}-${HUDI_VERSION}.jar"

# Create Hive user to match Hive container
RUN adduser hive

ENV PATH="${SPARK_HOME}/bin:${PATH}"

EXPOSE 10213

HEALTHCHECK --interval=10s --timeout=5s --start-period=10s \
    CMD curl -f http://localhost:10213/

CMD spark-submit \
    --master "local[*]" \
    --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
    --name "Thrift JDBC/ODBC Server" \
    --conf spark.hive.server2.thrift.port=10213 \
    spark-internal
```
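As a plain-shell illustration of how the ARG values above resolve (mirroring the Dockerfile's variable substitution; this runs outside the image and is not part of the PR):

```shell
# Mirror the Dockerfile ARGs in plain shell to show the resolved artifact names.
SPARK_VERSION=3.2.1
HADOOP_VERSION=3.2
HUDI_VERSION=0.11.1
SCALA_VERSION=2.12
SPARK_ARTIFACT="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}"

echo "${SPARK_ARTIFACT}.tgz"                                    # the Spark tarball fetched by the RUN step
echo "hudi-spark3-bundle_${SCALA_VERSION}-${HUDI_VERSION}.jar"  # the Hudi bundle placed in ${SPARK_HOME}/jars
```

This prints `spark-3.2.1-bin-hadoop3.2.tgz` and `hudi-spark3-bundle_2.12-0.11.1.jar`, matching the URLs downloaded in the Dockerfile.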
A reviewer commented:

> Can you enable tests for this image? We just recently added them to almost all images; see `bin/test.sh`. The test is just a simple smoke test to see if a container using this image will start.

The author replied:

> done
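A smoke test in the spirit of `bin/test.sh` might look like the sketch below. The container name and retry loop are hypothetical; the health URL mirrors the Dockerfile's own `HEALTHCHECK`, and running it requires Docker plus a locally built `testing/spark3-hudi:latest` image:

```shell
#!/bin/sh
# Hedged sketch, not the actual bin/test.sh: start a container from the image,
# poll the same endpoint the HEALTHCHECK uses, then clean up.
set -eu
NAME=spark3-hudi-smoke  # hypothetical container name
docker run -d --name "$NAME" testing/spark3-hudi:latest
trap 'docker rm -f "$NAME" >/dev/null' EXIT

for i in $(seq 1 30); do
    if docker exec "$NAME" curl -fs http://localhost:10213/ >/dev/null 2>&1; then
        echo "container started and is serving on port 10213"
        exit 0
    fi
    sleep 10
done
echo "container failed to become healthy" >&2
exit 1
```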