From 2b55ed22848b32c3bbdf39ec0091f621351cdb22 Mon Sep 17 00:00:00 2001
From: Andrew Or
Date: Fri, 5 Dec 2014 14:47:43 -0800
Subject: [PATCH 1/2] Document standalone cluster supervise mode

---
 docs/spark-standalone.md | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index ae7b81d5bb71..0b465f36c1a2 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -257,7 +257,7 @@ To run an interactive Spark shell against the cluster, run the following command
 You can also pass an option `--total-executor-cores <numCores>` to control the number of cores
 that spark-shell uses on the cluster.
 
-# Launching Compiled Spark Applications
+# Launching Spark Applications
 
 The [`spark-submit` script](submitting-applications.html) provides the most straightforward way to
 submit a compiled Spark application to the cluster. For standalone clusters, Spark currently
@@ -272,6 +272,15 @@ should specify them through the `--jars` flag using comma as a delimiter (e.g. `
 To control the application's configuration or execution environment, see
 [Spark Configuration](configuration.html).
 
+Additionally, standalone `cluster` mode supports restarting your application on failure. To use
+this feature, you may pass in the `--supervise` flag to `spark-submit` when launching your
+application. Then, if you wish to kill an application that is failing repeatedly, you may do so
+through:
+
+    ./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
+
+You can find the driver ID through the standalone Master web UI at `http://<master url>:8080`.
+
 # Resource Scheduling
 
 The standalone cluster mode currently only supports a simple FIFO scheduler across applications.
From 9ca0908e632fc7434aea1a98511bd5edd1c744ff Mon Sep 17 00:00:00 2001
From: Andrew Or
Date: Tue, 9 Dec 2014 18:01:54 -0800
Subject: [PATCH 2/2] Wording changes

---
 docs/spark-standalone.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 0b465f36c1a2..5c6084fb4625 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -272,10 +272,10 @@ should specify them through the `--jars` flag using comma as a delimiter (e.g. `
 To control the application's configuration or execution environment, see
 [Spark Configuration](configuration.html).
 
-Additionally, standalone `cluster` mode supports restarting your application on failure. To use
-this feature, you may pass in the `--supervise` flag to `spark-submit` when launching your
-application. Then, if you wish to kill an application that is failing repeatedly, you may do so
-through:
+Additionally, standalone `cluster` mode supports restarting your application automatically if it
+exited with non-zero exit code. To use this feature, you may pass in the `--supervise` flag to
+`spark-submit` when launching your application. Then, if you wish to kill an application that is
+failing repeatedly, you may do so through:
 
     ./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
 
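The workflow these two patches document can be sketched as a shell session. This is a usage sketch only, not part of the patch series: the master URL, application class, jar path, and driver ID below are placeholders invented for illustration, and the commands assume you run them from a Spark distribution directory against a live standalone cluster.

```shell
# Placeholder master URL for a standalone cluster (not a value from the patches).
MASTER_URL="spark://203.0.113.10:7077"

# Launch the application in cluster mode with supervision enabled, so the
# standalone Master restarts the driver if it exits with a non-zero code.
# org.example.MyApp and the jar path are hypothetical.
./bin/spark-submit \
  --master "$MASTER_URL" \
  --deploy-mode cluster \
  --supervise \
  --class org.example.MyApp \
  /path/to/my-app.jar

# If the application keeps failing and restarting, look up its driver ID in the
# Master web UI (port 8080 by default) and kill it; the ID below is an example
# of the form the UI shows, not a real one.
./bin/spark-class org.apache.spark.deploy.Client kill "$MASTER_URL" driver-20141209180154-0001
```

The kill command goes through `org.apache.spark.deploy.Client` rather than the usual `spark-submit` entry point because the supervising Master, not the client machine, owns the driver's lifecycle in this mode.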