[SPARK-28525][DEPLOY] Allow Launcher to be applied Java options #25265
sarutak wants to merge 3 commits into apache:master
Conversation
Test build #108234 has finished for PR 25265 at commit
Test build #108250 has finished for PR 25265 at commit
retest this please.
Test build #108252 has finished for PR 25265 at commit
Is the |
@xuanyuanking Launcher and SparkSubmit run in different JVM processes, so we can't use |
bin/spark-class
Better to initialize CMD_START_FLAG to an empty string before the loop.
Or you could use DELIM for the same effect.
Thanks. I'll initialize the flag.
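For context, a minimal sketch of what the suggestion amounts to in bin/spark-class (the surrounding read loop is sketched in the description further below):

```bash
# Give the flag a defined value (e.g. an empty string, per the suggestion) before
# entering the loop that reads the launcher's output.
CMD_START_FLAG=""
```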
Do we want to update |
@kiszk I think this environment variable is for developers so how about update |
Test build #108364 has finished for PR 25265 at commit
retest this please.
Test build #108369 has finished for PR 25265 at commit
retest this please.
Test build #108372 has finished for PR 25265 at commit
Lots of flaky tests...
retest this please.
Test build #108383 has finished for PR 25265 at commit
Merging to master. (Hopefully nobody will ask for this on Windows...)
I fetched this commit and my spark-shell seemed to be broken... am I missing something?
In my environment (Ubuntu 18.04), it works well.
oh, I see. I'll check again later, thx!
It seems my bash is too old; |
It's strange. I use the same version of bash as you, but the macOS version is different. Is this the cause?
That's also interesting because that line hasn't changed.
Takeshi, I can see that happening if you have the new script with an old build of the launcher jar. Otherwise, the logic seems right (and works for me).
That's it! You're right; I rebuilt the package and the problem was gone. Thx!
I also had the same issue as @maropu. Solution: rebuild |
I have this problem when trying to run spark-shell. My Java env path seems OK, but it's still giving me this error. Here's the error while trying to run the shell: |
@jahidhasanlinix |
Launcher is implemented as a Java application, and sometimes I'd like to apply Java options to it.
One situation I have run into is attaching a debugger to Launcher.
Launcher is started from bin/spark-class, but there is no way to pass Java options to it.
Since Java options are rarely needed for Launcher, one compromise is for the user to modify spark-class directly, as sketched below.
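The hand-edit the description refers to was not captured on this page. A minimal sketch, assuming the build_command function that bin/spark-class used at the time (which invokes org.apache.spark.launcher.Main via $RUNNER and $LAUNCH_CLASSPATH), would be to append the debug options to that invocation:

```bash
build_command() {
  # Hand-edited compromise: attach jdwp so a debugger can connect to the launcher JVM.
  "$RUNNER" -Xmx128m \
    -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000 \
    -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
  printf "%d\0" $?
}
```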
However, this doesn't work when any text related to the Java options is written to standard output, because the whole output is used as the command string for spark-shell and spark-submit in the current implementation.
One example is jdwp. When the agentlib option is applied to use jdwp for debugging, we get output like the example below.
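The example output is missing from this capture. With a jdwp agent configured as above (suspend=y, address=8000), the JVM prints a banner like the following to standard output before the launcher emits its command string:

```
Listening for transport dt_socket at address: 8000
```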
The output shown above is not part of the command string, so spark-submit and spark-shell will fail.
To enable Java options for Launcher, we need to distinguish the command string from other output.
I changed launcher/Main.java and bin/spark-class to print a separator character and handle it, as sketched below.
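A sketch of the approach on the bin/spark-class side, assuming the separator is a NULL ('\0') character followed by a newline that launcher/Main.java prints before the command string (the launcher already emits its arguments NULL-delimited); the merged diff may differ in details:

```bash
# Lines printed before the NULL marker (e.g. the jdwp banner) are echoed through to
# stdout; once the marker is seen, switch to NULL-delimited reads and collect the
# command string that the launcher prints.
CMD=()
DELIM=$'\n'
CMD_START_FLAG="false"
while IFS= read -d "$DELIM" -r ARG; do
  if [ "$CMD_START_FLAG" == "true" ]; then
    CMD+=("$ARG")
  else
    if [ "$ARG" == $'\0' ]; then
      # The marker line reads back as empty (bash drops the NULL byte), so this
      # matches it; from here on, read the NULL-delimited command arguments.
      DELIM=''
      CMD_START_FLAG="true"
    elif [ "$ARG" != "" ]; then
      echo "$ARG"
    fi
  fi
done < <(build_command "$@")
```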
How was this patch tested?
Tested manually using Spark Shell with and without LAUNCHER_JAVA_OPTIONS, as shown below.
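The actual test commands were not captured. A sketch of this kind of manual check, using the LAUNCHER_JAVA_OPTIONS name from this description (the variable may have been renamed during review) and assuming bin/spark-class forwards it to the launcher JVM:

```bash
# Without launcher Java options: spark-shell should start as before.
./bin/spark-shell

# With launcher Java options: the launcher JVM suspends and waits for a debugger on
# port 8000, and its jdwp banner must not leak into the generated command string.
LAUNCHER_JAVA_OPTIONS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000" \
  ./bin/spark-shell
```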