5 changes: 0 additions & 5 deletions bin/docker-image-tool.sh
@@ -233,7 +233,6 @@ Commands:

Options:
-f file (Optional) Dockerfile to build for JVM based Jobs. By default builds the Dockerfile shipped with Spark.
-    For Java 17, use `-f kubernetes/dockerfiles/spark/Dockerfile.java17`
-p file (Optional) Dockerfile to build for PySpark Jobs. Builds Python dependencies and ships with Spark.
Skips building PySpark docker image if not specified.
-R file (Optional) Dockerfile to build for SparkR Jobs. Builds R dependencies and ships with Spark.
@@ -277,10 +276,6 @@ Examples:
# Note: buildx, which does cross building, needs to do the push during build
# So there is no separate push step with -X

-  - Build and push Java17-based image with tag "v3.3.0" to docker.io/myrepo
-    $0 -r docker.io/myrepo -t v3.3.0 -f kubernetes/dockerfiles/spark/Dockerfile.java17 build
-    $0 -r docker.io/myrepo -t v3.3.0 push

EOF
}

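With `Dockerfile.java17` deleted and the default Dockerfile now Java 17-based, the `-f` flag is no longer needed to get a Java 17 image. A dry-run sketch of the simplified workflow (it prints the commands instead of running them; the repo and tag are placeholder values, not taken from this PR):

```shell
#!/usr/bin/env bash
# Dry-run sketch: print the build/push commands rather than invoking Docker.
# REPO and TAG are placeholders; the docker-image-tool.sh flags come from
# the usage text above.
REPO=docker.io/myrepo
TAG=v3.4.0
BUILD_CMD="./bin/docker-image-tool.sh -r $REPO -t $TAG build"
PUSH_CMD="./bin/docker-image-tool.sh -r $REPO -t $TAG push"
echo "$BUILD_CMD"
echo "$PUSH_CMD"
```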
2 changes: 1 addition & 1 deletion project/SparkBuild.scala
@@ -813,7 +813,7 @@ object KubernetesIntegrationTests {
val bindingsDir = s"$sparkHome/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings"
val javaImageTag = sys.props.get("spark.kubernetes.test.javaImageTag")
val dockerFile = sys.props.getOrElse("spark.kubernetes.test.dockerFile",
-    s"$sparkHome/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17")
+    s"$sparkHome/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile")
val pyDockerFile = sys.props.getOrElse("spark.kubernetes.test.pyDockerFile",
s"$bindingsDir/python/Dockerfile")
val rDockerFile = sys.props.getOrElse("spark.kubernetes.test.rDockerFile",
@@ -14,7 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
-ARG java_image_tag=11-jre-focal
+ARG java_image_tag=17-jre

FROM eclipse-temurin:${java_image_tag}

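The `ARG java_image_tag` default moves from `11-jre-focal` to `17-jre`, but a different eclipse-temurin tag can still be supplied at build time with `--build-arg`. A dry-run sketch (the Dockerfile path matches the one used in SparkBuild.scala; the `21-jre` tag is an assumed example, not something this PR uses):

```shell
#!/usr/bin/env bash
# Dry-run sketch: the build-arg name matches the ARG in the Dockerfile above;
# 21-jre is an assumed eclipse-temurin tag, shown only to illustrate overriding.
DOCKERFILE=resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile
BUILD_CMD="docker build --build-arg java_image_tag=21-jre -f $DOCKERFILE ."
echo "$BUILD_CMD"
```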

This file was deleted.

3 changes: 1 addition & 2 deletions resource-managers/kubernetes/integration-tests/README.md
@@ -20,9 +20,8 @@ To run tests with Java 11 instead of Java 8, use `--java-image-tag` to specify t
To run tests with a custom docker image, use `--docker-file` to specify the Dockerfile.
Note that if both `--docker-file` and `--java-image-tag` are used, `--docker-file` is preferred,
and the custom Dockerfile needs to include a Java installation by itself.
-Dockerfile.java17 is an example of custom Dockerfile, and you can specify it to run tests with Java 17.

-    ./dev/dev-run-integration-tests.sh --docker-file ../docker/src/main/dockerfiles/spark/Dockerfile.java17
+    ./dev/dev-run-integration-tests.sh --docker-file ../docker/src/main/dockerfiles/spark/Dockerfile

To run tests with Hadoop 2.x instead of Hadoop 3.x, use `--hadoop-profile`.

2 changes: 1 addition & 1 deletion resource-managers/kubernetes/integration-tests/pom.xml
@@ -43,7 +43,7 @@
<spark.kubernetes.test.master></spark.kubernetes.test.master>
<spark.kubernetes.test.namespace></spark.kubernetes.test.namespace>
<spark.kubernetes.test.serviceAccountName></spark.kubernetes.test.serviceAccountName>
-    <spark.kubernetes.test.dockerFile>Dockerfile.java17</spark.kubernetes.test.dockerFile>
+    <spark.kubernetes.test.dockerFile>Dockerfile</spark.kubernetes.test.dockerFile>

<test.exclude.tags></test.exclude.tags>
<test.default.exclude.tags>org.apache.spark.deploy.k8s.integrationtest.YuniKornTag</test.default.exclude.tags>
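The pom default now resolves to the plain `Dockerfile`, and `spark.kubernetes.test.dockerFile` can still be overridden on the Maven command line with `-D`. A dry-run sketch (the property name comes from the pom.xml above; the `integration-test` phase and the custom Dockerfile name are illustrative assumptions, not part of this diff):

```shell
#!/usr/bin/env bash
# Dry-run sketch: the property name is taken from the pom.xml above; the
# Maven phase and the custom Dockerfile name are illustrative assumptions.
CUSTOM_DOCKERFILE=Dockerfile.custom
MVN_CMD="mvn integration-test -Dspark.kubernetes.test.dockerFile=$CUSTOM_DOCKERFILE"
echo "$MVN_CMD"
```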