docs/submitting-applications.md
4 additions & 4 deletions
@@ -5,7 +5,7 @@ title: Submitting Applications
 
 The `spark-submit` script in Spark's `bin` directory is used to launch applications on a cluster.
 It can use all of Spark's supported [cluster managers](cluster-overview.html#cluster-manager-types)
-through a uniform interface so you don't have to configure your application specially for each one.
+through a uniform interface so you don't have to configure your application especially for each one.
 
 # Bundling Your Application's Dependencies
 If your code depends on other projects, you will need to package them alongside
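
To make the uniform interface concrete, here is a minimal sketch of submitting the same application to two different cluster managers by changing only the `--master` URL; the host, jar path, and version are illustrative assumptions, not taken from this diff.

{% highlight bash %}
# Same application, different cluster managers: only --master changes.
# The host, jar path, and version below are illustrative.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master-host:7077 \
  examples/jars/spark-examples_2.12-3.4.0.jar \
  100

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  examples/jars/spark-examples_2.12-3.4.0.jar \
  100
{% endhighlight %}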
@@ -58,7 +58,7 @@ for applications that involve the REPL (e.g. Spark shell).
 
 Alternatively, if your application is submitted from a machine far from the worker machines (e.g.
 locally on your laptop), it is common to use `cluster` mode to minimize network latency between
-the drivers and the executors. Currently, standalone mode does not support cluster mode for Python
+the drivers and the executors. Currently, the standalone mode does not support cluster mode for Python
 applications.
 
 For Python applications, simply pass a `.py` file in the place of `<application-jar>` instead of a JAR,
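
For the Python case this hunk describes, a minimal sketch might look like the following; `my_app.py`, `deps.zip`, and the master URL are hypothetical names used only for illustration.

{% highlight bash %}
# A Python application is passed directly in place of a JAR.
# my_app.py, deps.zip, and the master URL are hypothetical.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --py-files deps.zip \
  my_app.py
{% endhighlight %}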
@@ -68,7 +68,7 @@ There are a few options available that are specific to the
 [cluster manager](cluster-overview.html#cluster-manager-types) that is being used.
 For example, with a [Spark standalone cluster](spark-standalone.html) with `cluster` deploy mode,
 you can also specify `--supervise` to make sure that the driver is automatically restarted if it
-fails with non-zero exit code. To enumerate all such options available to `spark-submit`,
+fails with a non-zero exit code. To enumerate all such options available to `spark-submit`,
 run it with `--help`. Here are a few examples of common options:
 
 {% highlight bash %}
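
As a hedged sketch of the `--supervise` option this hunk touches, on a standalone cluster in `cluster` deploy mode; the class name, host, and jar name are hypothetical.

{% highlight bash %}
# Restart the driver automatically if it exits with a non-zero code.
# org.example.MyApp, master-host, and my-app.jar are hypothetical.
./bin/spark-submit \
  --class org.example.MyApp \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --supervise \
  my-app.jar
{% endhighlight %}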
@@ -192,7 +192,7 @@ debugging information by running `spark-submit` with the `--verbose` option.
 
 # Advanced Dependency Management
 When using `spark-submit`, the application jar along with any jars included with the `--jars` option
-will be automatically transferred to the cluster. URLs supplied after `--jars` must be separated by commas. That list is included on the driver and executor classpaths. Directory expansion does not work with `--jars`.
+will be automatically transferred to the cluster. URLs supplied after `--jars` must be separated by commas. That list is included in the driver and executor classpaths. Directory expansion does not work with `--jars`.
 
 Spark uses the following URL scheme to allow different strategies for disseminating jars:
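
A short sketch of the `--jars` behavior this hunk rewords: jars are passed as a comma-separated list of URLs, and directories are not expanded. The dependency names and URLs below are hypothetical.

{% highlight bash %}
# Comma-separated jar URLs; directories are not expanded by --jars.
# The dependency names and URLs are hypothetical.
./bin/spark-submit \
  --class org.example.MyApp \
  --master spark://master-host:7077 \
  --jars /opt/libs/dep1.jar,hdfs://namenode:9000/libs/dep2.jar \
  my-app.jar
{% endhighlight %}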