Commit 13cab5b

nartz authored and andrewor14 committed
add spark.driver.memory to config docs
It took me a minute to track this down, so I thought it could be useful to have it in the docs. I'm unsure if 512mb is the default for spark.driver.memory? Also - there could be a better value for the 'description' to differentiate it from spark.executor.memory.

Author: nartz <[email protected]>
Author: Nathan Artz <[email protected]>

Closes apache#2410 from nartz/docs/add-spark-driver-memory-to-config-docs and squashes the following commits:

a2f6c62 [nartz] Update configuration.md
74521b8 [Nathan Artz] add spark.driver.memory to config docs

1 parent: 86b3929

1 file changed: +8 −0 lines


docs/configuration.md

Lines changed: 8 additions & 0 deletions
```diff
@@ -103,6 +103,14 @@ of the most common options to set are:
     (e.g. <code>512m</code>, <code>2g</code>).
   </td>
 </tr>
+<tr>
+  <td><code>spark.driver.memory</code></td>
+  <td>512m</td>
+  <td>
+    Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
+    (e.g. <code>512m</code>, <code>2g</code>).
+  </td>
+</tr>
 <tr>
   <td><code>spark.serializer</code></td>
   <td>org.apache.spark.serializer.<br />JavaSerializer</td>
```
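For context, a minimal sketch of how the documented property is typically set. The property name and the 512m default come from the diff above; the `2g` value is an arbitrary placeholder for illustration:

```
# spark-defaults.conf — driver JVM heap for the process where SparkContext
# is initialized (documented default above: 512m; 2g is a placeholder)
spark.driver.memory   2g
```

The same property can also be passed at submit time, e.g. `spark-submit --conf spark.driver.memory=2g ...` (or via the `--driver-memory 2g` shorthand flag).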
