[SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff #29150
dongjoon-hyun (Member) approved these changes on Jul 17, 2020 and left a comment:
+1, LGTM. Thank you, @williamhyun. Merged to master/3.0.
This Docker test is not run by Jenkins or GitHub Actions, so I verified it locally as follows.
```
$ echo $SPARK_HOME
/Users/dongjoon/APACHE/spark-release/spark-3.0.0-bin-hadoop3.2
$ ./build
Sending build context to Docker daemon 3.072kB
Step 1/3 : FROM ubuntu:20.04
---> adafef2e596e
Step 2/3 : RUN apt-get update && apt-get install -y less openjdk-11-jre-headless iproute2 vim-tiny sudo openssh-server && rm -rf /var/lib/apt/lists/*
---> Using cache
---> c0a25f1156a1
Step 3/3 : ENV SPARK_HOME /opt/spark
---> Running in c6fa50285efd
Removing intermediate container c6fa50285efd
---> 670b40e6a27b
Successfully built 670b40e6a27b
Successfully tagged spark-test-base:latest
Sending build context to Docker daemon 4.608kB
Step 1/3 : FROM spark-test-base
---> 670b40e6a27b
Step 2/3 : ADD default_cmd /root/
---> 50e6d8e839fc
Step 3/3 : CMD ["/root/default_cmd"]
---> Running in bb9acde96ee2
Removing intermediate container bb9acde96ee2
---> 78b609510012
Successfully built 78b609510012
Successfully tagged spark-test-master:latest
Sending build context to Docker daemon 4.608kB
Step 1/4 : FROM spark-test-base
---> 670b40e6a27b
Step 2/4 : ENV SPARK_WORKER_PORT 8888
---> Running in aefed698790f
Removing intermediate container aefed698790f
---> 0bf56e9f36fb
Step 3/4 : ADD default_cmd /root/
---> 0373eecd3965
Step 4/4 : ENTRYPOINT ["/root/default_cmd"]
---> Running in f42de0824560
Removing intermediate container f42de0824560
---> e31ccdd0101e
Successfully built e31ccdd0101e
Successfully tagged spark-test-worker:latest
$ docker images | grep spark-test
spark-test-worker latest e31ccdd0101e 18 seconds ago 405MB
spark-test-master latest 78b609510012 19 seconds ago 405MB
spark-test-base latest 670b40e6a27b 20 seconds ago 405MB
$ docker run -v $SPARK_HOME:/opt/spark spark-test-master
CONTAINER_IP=172.17.0.2
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/07/17 18:59:39 INFO Master: Started daemon with process name: 9@df23bd555689
20/07/17 18:59:39 INFO SignalUtils: Registered signal handler for TERM
20/07/17 18:59:39 INFO SignalUtils: Registered signal handler for HUP
20/07/17 18:59:39 INFO SignalUtils: Registered signal handler for INT
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.0.0.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/07/17 18:59:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/07/17 18:59:40 INFO SecurityManager: Changing view acls to: root
20/07/17 18:59:40 INFO SecurityManager: Changing modify acls to: root
20/07/17 18:59:40 INFO SecurityManager: Changing view acls groups to:
20/07/17 18:59:40 INFO SecurityManager: Changing modify acls groups to:
20/07/17 18:59:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
20/07/17 18:59:40 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
20/07/17 18:59:40 INFO Master: Starting Spark master at spark://172.17.0.2:7077
20/07/17 18:59:40 INFO Master: Running Spark version 3.0.0
20/07/17 18:59:41 INFO Utils: Successfully started service 'MasterUI' on port 8080.
20/07/17 18:59:41 INFO MasterWebUI: Bound MasterWebUI to 172.17.0.2, and started at http://172.17.0.2:8080
20/07/17 18:59:41 INFO Master: I have been elected leader! New state: ALIVE
20/07/17 19:00:18 INFO Master: Registering worker 172.17.0.3:8888 with 10 cores, 20.5 GiB RAM

$ docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://172.17.0.2:7077
CONTAINER_IP=172.17.0.3
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/07/17 19:00:16 INFO Worker: Started daemon with process name: 9@ba805965c4c0
20/07/17 19:00:16 INFO SignalUtils: Registered signal handler for TERM
20/07/17 19:00:16 INFO SignalUtils: Registered signal handler for HUP
20/07/17 19:00:16 INFO SignalUtils: Registered signal handler for INT
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.0.0.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/07/17 19:00:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/07/17 19:00:17 INFO SecurityManager: Changing view acls to: root
20/07/17 19:00:17 INFO SecurityManager: Changing modify acls to: root
20/07/17 19:00:17 INFO SecurityManager: Changing view acls groups to:
20/07/17 19:00:17 INFO SecurityManager: Changing modify acls groups to:
20/07/17 19:00:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
20/07/17 19:00:17 INFO Utils: Successfully started service 'sparkWorker' on port 8888.
20/07/17 19:00:17 INFO Worker: Starting Spark worker 172.17.0.3:8888 with 10 cores, 20.5 GiB RAM
20/07/17 19:00:17 INFO Worker: Running Spark version 3.0.0
20/07/17 19:00:17 INFO Worker: Spark home: /opt/spark
20/07/17 19:00:17 INFO ResourceUtils: ==============================================================
20/07/17 19:00:17 INFO ResourceUtils: Resources for spark.worker:
20/07/17 19:00:17 INFO ResourceUtils: ==============================================================
20/07/17 19:00:18 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
20/07/17 19:00:18 INFO WorkerWebUI: Bound WorkerWebUI to 172.17.0.3, and started at http://172.17.0.3:8081
20/07/17 19:00:18 INFO Worker: Connecting to master 172.17.0.2:7077...
20/07/17 19:00:18 INFO TransportClientFactory: Successfully created connection to /172.17.0.2:7077 after 41 ms (0 ms spent in bootstraps)
20/07/17 19:00:18 INFO Worker: Successfully registered with master spark://172.17.0.2:7077
```
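For reference, the Dockerfile contents can be read back from the `Step x/y` lines in the build log above. The following is a sketch reconstructed from that output, not the files as committed in the repository:

```
# Sketch reconstructed from the build log above (three separate Dockerfiles, shown together here).

# --- spark-test-base ---
FROM ubuntu:20.04
RUN apt-get update && apt-get install -y less openjdk-11-jre-headless iproute2 vim-tiny sudo openssh-server && rm -rf /var/lib/apt/lists/*
ENV SPARK_HOME /opt/spark

# --- spark-test-master (separate Dockerfile, builds on the base image) ---
FROM spark-test-base
ADD default_cmd /root/
CMD ["/root/default_cmd"]

# --- spark-test-worker (separate Dockerfile, builds on the base image) ---
FROM spark-test-base
ENV SPARK_WORKER_PORT 8888
ADD default_cmd /root/
ENTRYPOINT ["/root/default_cmd"]
```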
dongjoon-hyun pushed a commit that referenced this pull request on Jul 17, 2020:
Closes #29150 from williamhyun/docker.
Authored-by: William Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 7dc1d89)
Signed-off-by: Dongjoon Hyun <[email protected]>
williamhyun (Member, Author) commented:
Thank you, @dongjoon-hyun
What changes were proposed in this pull request?
This PR aims to update the docker/spark-test images and clean up unused parts.
Why are the changes needed?
Since Spark 3.0.0, Java 11 has been supported, so we should use a more recent Java version and OS base image.
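As an illustrative sanity check (not part of this PR), one can confirm which Java runtime ends up inside the rebuilt base image; the image name spark-test-base matches the build log in the review comment above:

```
# Illustrative check only, assuming spark-test-base was built via ./build as in the review transcript.
# Prints the JRE version inside the image; it should report OpenJDK 11,
# matching the openjdk-11-jre-headless package installed on ubuntu:20.04.
docker run --rm spark-test-base java -version
```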
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Manually, by running the following as described in https://github.com/apache/spark/blob/master/external/docker/spark-test/README.md:
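The commands referred to (as listed in the merged commit message) are:

```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```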