
Commit a7471e4

[SPARK-23165][DOC] Grammar and spelling mistake fix in submitting-application.
1 parent b665356

File tree

1 file changed: +4 additions, -4 deletions

docs/submitting-applications.md

Lines changed: 4 additions & 4 deletions
@@ -5,7 +5,7 @@ title: Submitting Applications
 
 The `spark-submit` script in Spark's `bin` directory is used to launch applications on a cluster.
 It can use all of Spark's supported [cluster managers](cluster-overview.html#cluster-manager-types)
-through a uniform interface so you don't have to configure your application specially for each one.
+through a uniform interface so you don't have to configure your application especially for each one.
 
 # Bundling Your Application's Dependencies
 If your code depends on other projects, you will need to package them alongside
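The hunk above describes `spark-submit`'s uniform interface: the same flags work regardless of cluster manager. As a minimal sketch of such an invocation (`--class`, `--master`, and `--deploy-mode` are real `spark-submit` flags; the class name, host, and jar path below are hypothetical placeholders):

```shell
# Launch an assembled application jar on a standalone cluster.
# The class name, master host, and jar path are placeholders.
./bin/spark-submit \
  --class org.example.MyApp \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  path/to/my-app-assembly.jar
```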
@@ -58,7 +58,7 @@ for applications that involve the REPL (e.g. Spark shell).
 
 Alternatively, if your application is submitted from a machine far from the worker machines (e.g.
 locally on your laptop), it is common to use `cluster` mode to minimize network latency between
-the drivers and the executors. Currently, standalone mode does not support cluster mode for Python
+the drivers and the executors. Currently, the standalone mode does not support cluster mode for Python
 applications.
 
 For Python applications, simply pass a `.py` file in the place of `<application-jar>` instead of a JAR,
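Since the hunk above notes that standalone mode does not support cluster mode for Python applications, a cluster-mode Python submission would target another manager such as YARN. A sketch (the script name and argument are hypothetical):

```shell
# Submit a Python application in cluster deploy mode on YARN;
# standalone mode does not support cluster mode for Python apps.
# The script name and its argument are placeholders.
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  my_script.py input_path
```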
@@ -68,7 +68,7 @@ There are a few options available that are specific to the
 [cluster manager](cluster-overview.html#cluster-manager-types) that is being used.
 For example, with a [Spark standalone cluster](spark-standalone.html) with `cluster` deploy mode,
 you can also specify `--supervise` to make sure that the driver is automatically restarted if it
-fails with non-zero exit code. To enumerate all such options available to `spark-submit`,
+fails with a non-zero exit code. To enumerate all such options available to `spark-submit`,
 run it with `--help`. Here are a few examples of common options:
 
 {% highlight bash %}
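To illustrate the `--supervise` behavior discussed in this hunk, a standalone-cluster submission might look like the sketch below (host, class, and jar names are hypothetical):

```shell
# Run in cluster deploy mode on a standalone cluster; --supervise
# restarts the driver automatically if it fails with a non-zero
# exit code. The class, host, and jar path are placeholders.
./bin/spark-submit \
  --class org.example.MyApp \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --supervise \
  path/to/my-app.jar
```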
@@ -192,7 +192,7 @@ debugging information by running `spark-submit` with the `--verbose` option.
 
 # Advanced Dependency Management
 When using `spark-submit`, the application jar along with any jars included with the `--jars` option
-will be automatically transferred to the cluster. URLs supplied after `--jars` must be separated by commas. That list is included on the driver and executor classpaths. Directory expansion does not work with `--jars`.
+will be automatically transferred to the cluster. URLs supplied after `--jars` must be separated by commas. That list is included in the driver and executor classpaths. Directory expansion does not work with `--jars`.
 
 Spark uses the following URL scheme to allow different strategies for disseminating jars:
 