
Commit 92e6047

Update a few comments (minor)
1 parent 22b1acd · commit 92e6047

4 files changed: 4 additions, 5 deletions

bin/spark-class2.cmd

Lines changed: 1 addition & 1 deletion
@@ -127,7 +127,7 @@ if not "x%JAVA_HOME%"=="x" set RUNNER=%JAVA_HOME%\bin\java
 
 rem In Spark submit client mode, the driver is launched in the same JVM as Spark submit itself.
 rem Here we must parse the properties file for relevant "spark.driver.*" configs before launching
-rem the driver JVM itself. Instead of handling this complexity in Bash, we launch a separate JVM
+rem the driver JVM itself. Instead of handling this complexity here, we launch a separate JVM
 rem to prepare the launch environment of this driver JVM.
 
 rem In this case, leave out the main class (org.apache.spark.deploy.SparkSubmit) and use our own.
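For context: the "separate JVM" this comment refers to is SparkSubmitDriverBootstrapper (touched later in this commit), which reads the "spark.driver.*" entries out of the properties file before the real driver JVM starts. Below is a minimal, hypothetical Scala sketch of that kind of filtering, assuming a Java-properties-style conf file; the path and names are illustrative, not the bootstrapper's actual code.

    import java.io.FileInputStream
    import java.util.Properties
    import scala.collection.JavaConverters._

    // Hypothetical conf path for illustration only.
    val props = new Properties()
    val in = new FileInputStream("conf/spark-defaults.conf")
    try props.load(in) finally in.close()

    // Keep only the driver-side settings that affect how the driver JVM is launched.
    val driverConfs = props.asScala.filter { case (k, _) => k.startsWith("spark.driver.") }
    driverConfs.foreach { case (k, v) => println(s"$k -> $v") }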

bin/spark-submit.cmd

Lines changed: 0 additions & 1 deletion
@@ -29,7 +29,6 @@ set SPARK_SUBMIT_DRIVER_MEMORY=
 set SPARK_SUBMIT_LIBRARY_PATH=
 set SPARK_SUBMIT_CLASSPATH=
 set SPARK_SUBMIT_OPTS=
-set SPARK_DRIVER_MEMORY=
 set SPARK_SUBMIT_BOOTSTRAP_DRIVER=
 
 :loop

core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala

Lines changed: 1 addition & 1 deletion
@@ -139,7 +139,7 @@ private[spark] object SparkSubmitDriverBootstrapper {
     stderrThread.start()
 
     // In Windows, the subprocess reads directly from our stdin, so we should avoid spawning
-    // a thread that also reads from stdin and contends with the subprocess.
+    // a thread that contends with the subprocess in reading from System.in.
     if (Utils.isWindows) {
       // For the PySpark shell, the termination of this process is handled in java_gateway.py
       process.waitFor()
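The reworded comment describes a real constraint: on Windows, java.lang.Process hands the parent's stdin straight to the child, so a second reader on System.in would steal bytes from it. A rough Scala sketch of the resulting branch, using an illustrative child command rather than the bootstrapper's actual launch logic:

    import java.lang.ProcessBuilder.Redirect

    // Illustrative child command; not the actual SparkSubmit launch.
    val isWindows = System.getProperty("os.name").toLowerCase.contains("windows")
    val builder = new ProcessBuilder("java", "-version")
    builder.redirectOutput(Redirect.INHERIT)
    builder.redirectError(Redirect.INHERIT)
    // On Windows, let the child read our stdin directly instead of relaying it.
    if (isWindows) builder.redirectInput(Redirect.INHERIT)
    val process = builder.start()

    if (!isWindows) {
      // Safe to relay stdin from a thread here; on Windows this thread would
      // contend with the subprocess for reads on System.in.
      val stdinRelay = new Thread("stdin-relay") {
        override def run(): Unit = {
          val out = process.getOutputStream
          Iterator.continually(System.in.read()).takeWhile(_ != -1).foreach(b => out.write(b))
          out.close()
        }
      }
      stdinRelay.setDaemon(true)
      stdinRelay.start()
    }
    process.waitFor()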

python/pyspark/java_gateway.py

Lines changed: 2 additions & 2 deletions
@@ -70,7 +70,7 @@ def preexec_func():
         error_msg += "--------------------------------------------------------------\n"
         raise Exception(error_msg)
 
-    # Ensure the Java child processes do not linger after python has exited in Windows.
+    # In Windows, ensure the Java child processes do not linger after Python has exited.
     # In UNIX-based systems, the child process can kill itself on broken pipe (i.e. when
     # the parent process' stdin sends an EOF). In Windows, however, this is not possible
     # because java.lang.Process reads directly from the parent process' stdin, contending
@@ -81,7 +81,7 @@ def preexec_func():
     # (because the UNIX "exec" command is not available). This means we cannot simply
     # call proc.kill(), which kills only the "spark-submit.cmd" process but not the
     # JVMs. Instead, we use "taskkill" with the tree-kill option "/t" to terminate all
-    # child processes.
+    # child processes in the tree.
     def killChild():
         Popen(["cmd", "/c", "taskkill", "/f", "/t", "/pid", str(proc.pid)])
     atexit.register(killChild)
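Since the diff only touches the comment, here is the same tree-kill idea as a small sketch in Scala (matching the other sketches in this write-up): the PID is a placeholder, and sys.addShutdownHook stands in for Python's atexit. "/t" kills the whole tree rooted at the PID and "/f" forces termination, which is what makes this work when proc.kill() would stop only the launcher shell.

    import scala.sys.process._

    // Terminate a process and all of its children on Windows.
    def killProcessTree(pid: Long): Unit = {
      Seq("cmd", "/c", "taskkill", "/f", "/t", "/pid", pid.toString).!
    }

    // Register at JVM shutdown, analogous to Python's atexit.register(killChild).
    sys.addShutdownHook(killProcessTree(1234L)) // 1234 is a placeholder PID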
