
Commit 6d392b3

liancheng authored and rxin committed
[SPARK-2608][Core] Fixed command line option passing issue over Mesos via SPARK_EXECUTOR_OPTS
This is another try after #2145 to fix [SPARK-2608](https://issues.apache.org/jira/browse/SPARK-2608). The basic idea is to pass `extraJavaOpts` and `extraLibraryPath` together via the environment variable `SPARK_EXECUTOR_OPTS`. This variable is recognized by `spark-class` and not used anywhere else. In this way, we still launch Mesos executors with `spark-class`/`spark-executor`, but avoid the executor-side Spark home issue.

Quoted strings with spaces are not allowed in either `extraJavaOpts` or `extraLibraryPath` when using Spark over Mesos. The reason is that Mesos passes the whole command line as a single string argument to `sh -c` to start the executor, which makes shell string escaping non-trivial to handle. This should be fixed in a later release.

Classes in the package `org.apache.spark.deploy` shouldn't be used, as they assume Spark is deployed in standalone mode and give the wrong executor-side Spark home directory. Please refer to the comments in #2145 for more details.

Author: Cheng Lian <[email protected]>

Closes #2161 from liancheng/mesos-fix-with-env-var and squashes the following commits:

ba59190 [Cheng Lian] Added fine grained Mesos executor support
1174076 [Cheng Lian] Draft fix for CoarseMesosSchedulerBackend

(cherry picked from commit 935bffe)
Signed-off-by: Reynold Xin <[email protected]>
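The heart of the change is folding the two optional settings into one string for `SPARK_EXECUTOR_OPTS`. Below is a minimal, self-contained sketch of that pattern, not the actual Spark code: `conf` here is a plain `Map` standing in for `SparkConf`, and `executorOpts` is a hypothetical helper name. It mirrors the `Seq(...).flatten.mkString(" ")` idiom that both diffs below use:

```scala
// Sketch only (hypothetical names; `conf` stands in for SparkConf) of how the
// two optional settings are folded into a single SPARK_EXECUTOR_OPTS value.
object ExecutorOptsSketch {
  def executorOpts(conf: Map[String, String]): String = {
    val extraJavaOpts = conf.get("spark.executor.extraJavaOptions")
    val extraLibraryPath = conf.get("spark.executor.extraLibraryPath")
      .map(lp => s"-Djava.library.path=$lp")
    // flatten drops unset options, so the result is "" when neither is set
    Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
  }

  def main(args: Array[String]): Unit = {
    val conf = Map(
      "spark.executor.extraJavaOptions" -> "-XX:+UseConcMarkSweepGC",
      "spark.executor.extraLibraryPath" -> "/opt/native/lib")
    // prints: -XX:+UseConcMarkSweepGC -Djava.library.path=/opt/native/lib
    println(executorOpts(conf))
  }
}
```

Because this value travels as an environment variable rather than as interpolated command-line arguments, `sh -c` never re-tokenizes it on the executor side; only values that themselves contain quoted spaces remain problematic, as noted above.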
1 parent: 70d8146 · commit: 6d392b3

File tree: 2 files changed (+24, −4 lines)


core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala

Lines changed: 10 additions & 4 deletions
@@ -122,6 +122,12 @@ private[spark] class CoarseMesosSchedulerBackend(
     val extraLibraryPath = conf.getOption(libraryPathOption).map(p => s"-Djava.library.path=$p")
     val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
 
+    environment.addVariables(
+      Environment.Variable.newBuilder()
+        .setName("SPARK_EXECUTOR_OPTS")
+        .setValue(extraOpts)
+        .build())
+
     sc.executorEnvs.foreach { case (key, value) =>
       environment.addVariables(Environment.Variable.newBuilder()
         .setName(key)
@@ -140,16 +146,16 @@ private[spark] class CoarseMesosSchedulerBackend(
     if (uri == null) {
       val runScript = new File(sparkHome, "./bin/spark-class").getCanonicalPath
       command.setValue(
-        "\"%s\" org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %s %d".format(
-          runScript, extraOpts, driverUrl, offer.getSlaveId.getValue, offer.getHostname, numCores))
+        "\"%s\" org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d".format(
+          runScript, driverUrl, offer.getSlaveId.getValue, offer.getHostname, numCores))
     } else {
       // Grab everything to the first '.'. We'll use that and '*' to
       // glob the directory "correctly".
       val basename = uri.split('/').last.split('.').head
       command.setValue(
         ("cd %s*; " +
-          "./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %s %d")
-          .format(basename, extraOpts, driverUrl, offer.getSlaveId.getValue,
+          "./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d")
+          .format(basename, driverUrl, offer.getSlaveId.getValue,
             offer.getHostname, numCores))
       command.addUris(CommandInfo.URI.newBuilder().setValue(uri))
     }
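For context, here is a hedged sketch of how the pieces above fit together with the Mesos protobuf API: the extra options are attached to the executor's `Environment` instead of being interpolated into the launch command. This is illustrative only, assuming the Mesos Java bindings (`org.apache.mesos.Protos`) are on the classpath; the helper name `buildCommand` is made up and this is not the actual `CoarseMesosSchedulerBackend` code:

```scala
import org.apache.mesos.Protos.{CommandInfo, Environment}

// Illustrative sketch: the launch command no longer embeds extraOpts; the
// options travel in SPARK_EXECUTOR_OPTS and are picked up by spark-class on
// the executor host. `buildCommand` is a hypothetical helper.
object CoarseCommandSketch {
  def buildCommand(runScript: String, driverUrl: String, slaveId: String,
      hostname: String, numCores: Int, extraOpts: String): CommandInfo = {
    val environment = Environment.newBuilder()
      .addVariables(Environment.Variable.newBuilder()
        .setName("SPARK_EXECUTOR_OPTS")
        .setValue(extraOpts)
        .build())
    CommandInfo.newBuilder()
      .setEnvironment(environment)
      .setValue(
        "\"%s\" org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d"
          .format(runScript, driverUrl, slaveId, hostname, numCores))
      .build()
  }
}
```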

core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala

Lines changed: 14 additions & 0 deletions
@@ -90,6 +90,20 @@ private[spark] class MesosSchedulerBackend(
       "Spark home is not set; set it through the spark.home system " +
       "property, the SPARK_HOME environment variable or the SparkContext constructor"))
     val environment = Environment.newBuilder()
+    sc.conf.getOption("spark.executor.extraClassPath").foreach { cp =>
+      environment.addVariables(
+        Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
+    }
+    val extraJavaOpts = sc.conf.getOption("spark.executor.extraJavaOptions")
+    val extraLibraryPath = sc.conf.getOption("spark.executor.extraLibraryPath").map { lp =>
+      s"-Djava.library.path=$lp"
+    }
+    val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
+    environment.addVariables(
+      Environment.Variable.newBuilder()
+        .setName("SPARK_EXECUTOR_OPTS")
+        .setValue(extraOpts)
+        .build())
     sc.executorEnvs.foreach { case (key, value) =>
       environment.addVariables(Environment.Variable.newBuilder()
         .setName(key)
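Note the `foreach` on the classpath option above: an unset `spark.executor.extraClassPath` adds no variable at all. A small runnable sketch of that conditional pattern follows (again assuming the Mesos Java bindings; `addClasspath` is a made-up helper name):

```scala
import org.apache.mesos.Protos.Environment

// Sketch of the conditional-variable pattern: Option.foreach adds
// SPARK_CLASSPATH only when the config value is actually set.
object ClasspathEnvSketch {
  def addClasspath(env: Environment.Builder,
      extraClassPath: Option[String]): Environment.Builder = {
    extraClassPath.foreach { cp =>
      env.addVariables(
        Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
    }
    env
  }

  def main(args: Array[String]): Unit = {
    // 1 variable when set, 0 when unset
    println(addClasspath(Environment.newBuilder(), Some("/opt/libs/extra.jar")).getVariablesCount)
    println(addClasspath(Environment.newBuilder(), None).getVariablesCount)
  }
}
```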
