
Commit 3fb53c0

Authored by Sean Owen (srowen), committed by Andrew Or
SPARK-4300 [CORE] Race condition during SparkWorker shutdown
Close the appenders saving stdout/stderr before destroying the process, to avoid an exception on reading a closed input stream. (This also removes a redundant `waitFor()`, although it was harmless.) CC tdas since I think you wrote this method.

Author: Sean Owen <[email protected]>

Closes #4787 from srowen/SPARK-4300 and squashes the following commits:

e0cdabf [Sean Owen] Close appender saving stdout/stderr before destroying process to avoid exception on reading closed input stream
1 parent 5f3238b commit 3fb53c0

File tree

1 file changed: 1 addition (+1), 2 deletions (-2)


core/src/main/scala/org/apache/spark/deploy/worker/ExecutorRunner.scala

Lines changed: 1 addition & 2 deletions
@@ -85,14 +85,13 @@ private[spark] class ExecutorRunner(
     var exitCode: Option[Int] = None
     if (process != null) {
       logInfo("Killing process!")
-      process.destroy()
-      process.waitFor()
       if (stdoutAppender != null) {
         stdoutAppender.stop()
       }
       if (stderrAppender != null) {
         stderrAppender.stop()
       }
+      process.destroy()
       exitCode = Some(process.waitFor())
     }
     worker ! ExecutorStateChanged(appId, execId, state, message, exitCode)
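The fix reorders shutdown so the threads draining the process's output are stopped before the process itself is destroyed, and the exit code is collected with a single `waitFor()`. A minimal sketch of the same ordering in Python (all names here are hypothetical; `subprocess.Popen` stands in for the executor process and a reader thread stands in for the stdout `FileAppender`):

```python
import subprocess
import sys
import threading

def kill_process_safely():
    # Child that chatters on stdout, standing in for the executor process.
    proc = subprocess.Popen(
        [sys.executable, "-c",
         "import time\nwhile True:\n print('tick', flush=True)\n time.sleep(0.05)"],
        stdout=subprocess.PIPE,
    )

    stop = threading.Event()

    def drain(stream):
        # Reader thread draining stdout, standing in for the FileAppender.
        # It checks the stop flag after each line rather than reading until
        # the stream is forcibly closed underneath it.
        for _ in iter(stream.readline, b""):
            if stop.is_set():
                break

    reader = threading.Thread(target=drain, args=(proc.stdout,))
    reader.start()

    # Shutdown order mirroring the fix:
    stop.set()               # 1. stop the appender/reader first...
    reader.join()            #    ...and wait for it to finish
    proc.terminate()         # 2. only then destroy the process
    exit_code = proc.wait()  # 3. collect the exit code exactly once
    return exit_code

exit_code = kill_process_safely()
```

Stopping the consumer before killing the producer is the point of the patch: in the old order, destroying the process closed the streams while the appender threads were still reading them, which raised an exception on the closed input stream.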
