[SPARK-2608] Fix executor backend launch command in Mesos mode #1986
Conversation
Can one of the admins verify this patch?
Is there any difference between this PR and the one you closed?
@tnachen, no, actually; I just merged with the new master.
Passing only `sc.executorEnvs` to the command object here is not enough. Since we only use `command` with `CommandUtils.buildCommandSeq` below to generate the command-line string, `command.environment` is only used to run `bin/compute-classpath` (see here); it is not propagated to the target executor process.
Yes, here I should set the environment on `CommandInfo` so that it propagates to the target executor process.
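A minimal sketch of that fix, assuming the Mesos protobuf Java API the scheduler backend already uses (`launchCommand` is a hypothetical placeholder for the spark-class launch string built elsewhere):

```scala
import org.apache.mesos.Protos.{CommandInfo, Environment}

// Copy the driver-side executor environment into the Mesos CommandInfo so
// that Mesos exports it in the launched executor process, instead of the
// variables only being visible to the local command-building step.
val envBuilder = Environment.newBuilder()
for ((key, value) <- sc.executorEnvs) {
  envBuilder.addVariables(
    Environment.Variable.newBuilder().setName(key).setValue(value).build())
}

val command = CommandInfo.newBuilder()
  .setEnvironment(envBuilder.build())
  .setValue(launchCommand) // hypothetical: the launch string built above
  .build()
```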
Hi @scwf, given that there is another PR that supersedes this one, would you mind closing this? There are many duplicates of the same PR, and it's a little confusing that all of them are open.
https://issues.apache.org/jira/browse/SPARK-2608
This is the updated version of #1513
The Mesos scheduler backend uses spark-class/spark-executor to launch the executor backend, which leads to two issues:
1. When `spark.executor.extraJavaOptions` is set, `CoarseMesosSchedulerBackend` throws errors because of the launch command: `("./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %s %d").format(basename, extraOpts, driverUrl, offer.getSlaveId.getValue, offer.getHostname, numCores)`
2. `spark.executor.extraJavaOptions` and `spark.executor.extraLibraryPath` set in SparkConf will not take effect.
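To illustrate the first issue, here is a hedged sketch (all values are made up; this is not the actual Spark code) of why interpolating the options string directly into the command line is fragile:

```scala
// Hypothetical values for illustration only.
val extraOpts = "-XX:+UseG1GC -XX:MaxGCPauseMillis=20" // spark.executor.extraJavaOptions
val cmd = ("./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend " +
           "%s %s %s %s %d")
  .format(extraOpts, "spark://driver:7077", "slave-1", "host-1", 4)
// The options are interpolated unquoted, so when Mesos hands this string to
// a shell, "-XX:MaxGCPauseMillis=20" is tokenized as a separate positional
// argument to the backend rather than as a JVM option.
```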