
Commit c970d86

Bharath Bhushan authored and pwendell committed
[SPARK-1403] Move the class loader creation back to where it was in 0.9.0
[SPARK-1403] I investigated why Spark 0.9.0 loads fine on Mesos while Spark 1.0.0 fails. What I found was that in SparkEnv.scala, while creating the SparkEnv object, the current thread's class loader is null. But in 0.9.0, at the same place, it is set to org.apache.spark.repl.ExecutorClassLoader. I saw that 7edbea4 moved it to its current place. I moved it back and saw that 1.0.0 started working fine on Mesos.

I just created a minimal patch that allows me to run Spark on Mesos correctly. It seems like SecurityManager's creation needs to be taken into account for a complete fix. Also, moving the creation of the serializer out of SparkEnv might be part of the right solution. PTAL.

Author: Bharath Bhushan <[email protected]>

Closes #322 from manku-timma/spark-1403 and squashes the following commits:

606c2b9 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
ec8f870 [Bharath Bhushan] revert the logger change for java 6 compatibility as PR 334 is doing it
728beca [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
044027d [Bharath Bhushan] fix compile error
6f260a4 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
b3a053f [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
04b9662 [Bharath Bhushan] add missing line
4803c19 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
f3c9a14 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
42d3d6a [Bharath Bhushan] used code fragment from @ueshin to fix the problem in a better way
89109d7 [Bharath Bhushan] move the class loader creation back to where it was in 0.9.0

(cherry picked from commit ca11919)
Signed-off-by: Patrick Wendell <[email protected]>
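The patch applies the usual save/set/restore pattern for the thread context class loader around the executor registration. As a minimal standalone sketch of that pattern (the withContextClassLoader helper name is illustrative and not part of the patch or of Spark):

// Sketch of the save/set/restore pattern used by the fix; not Spark code.
def withContextClassLoader[T](loader: ClassLoader)(body: => T): T = {
  val saved = Thread.currentThread.getContextClassLoader
  try {
    // Make class lookups (e.g. during deserialization) resolve against `loader`
    // instead of whatever the calling Mesos driver thread carries (possibly null).
    Thread.currentThread.setContextClassLoader(loader)
    body
  } finally {
    // Always restore the previous loader, even if `body` throws.
    Thread.currentThread.setContextClassLoader(saved)
  }
}

// Usage sketch: run the registration logic with this class's loader as the context loader.
// withContextClassLoader(getClass.getClassLoader) { /* register executor */ }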
1 parent 52d401b commit c970d86

1 file changed: 15 additions, 7 deletions


core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala

Lines changed: 15 additions & 7 deletions
@@ -50,13 +50,21 @@ private[spark] class MesosExecutorBackend
       executorInfo: ExecutorInfo,
       frameworkInfo: FrameworkInfo,
       slaveInfo: SlaveInfo) {
-    logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
-    this.driver = driver
-    val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
-    executor = new Executor(
-      executorInfo.getExecutorId.getValue,
-      slaveInfo.getHostname,
-      properties)
+    val cl = Thread.currentThread.getContextClassLoader
+    try {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(getClass.getClassLoader)
+      logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
+      this.driver = driver
+      val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
+      executor = new Executor(
+        executorInfo.getExecutorId.getValue,
+        slaveInfo.getHostname,
+        properties)
+    } finally {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(cl)
+    }
   }

   override def launchTask(d: ExecutorDriver, taskInfo: TaskInfo) {
