[SPARK-21578][CORE] Add JavaSparkContextSuite #18778
dongjoon-hyun wants to merge 2 commits into apache:master from dongjoon-hyun:SPARK-21578
Conversation
Test build #80061 has finished for PR 18778 at commit
The Python failure is irrelevant.

Retest this please

Could you check whether the Java APIs still work? Could you add the related test cases?
Test build #80062 has finished for PR 18778 at commit
Thank you for the review, @gatorsmile .

Retest this please.
Do you want to create a new test suite for that under https://github.com/apache/spark/tree/master/core/src/test/java/test/org/apache/spark ? The following suite seems to be irrelevant for that purpose because it uses
@gatorsmile . I see your concern here. If we are locked in here in order to preserve the previous behavior, what about updating those comments? It's not about SI-8479 anymore. Now, it's just for backward compatibility.
Test build #80069 has finished for PR 18778 at commit
@gatorsmile . Three usages (the commented-out lines) were broken here, as you were concerned:

```java
@Test
public void scalaSparkContext() {
  List<String> jars = List$.MODULE$.empty();
  Map<String, String> environment = Map$.MODULE$.empty();

  new SparkContext(new SparkConf().setMaster("local").setAppName("name")).stop();
  new SparkContext("local", "name", new SparkConf()).stop();
  // new SparkContext("local", "name").stop();
  // new SparkContext("local", "name", "sparkHome").stop();
  // new SparkContext("local", "name", "sparkHome", jars).stop();
  new SparkContext("local", "name", "sparkHome", jars, environment).stop();
}
```

This Scala behavior is due to SI-4278 instead of SI-8479. Since SI-4278 is
@gatorsmile . I fixed the comment and added an explicit test suite to prevent future regressions.
Test build #80071 has finished for PR 18778 at commit
```java
@Test
public void javaSparkContext() {
  String[] jars = new String[] {};
  java.util.Map<String, String> environment = new java.util.HashMap<>();
```
Nit: just import these classes as usual?
Thank you for review, @srowen !
This is due to the import conflict on `Map`. Java code still doesn't allow import aliasing; we can do aliasing only in Scala code.
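As a side note, the same situation arises with any pair of JDK classes sharing a simple name. A minimal sketch (unrelated to Spark, purely illustrative) using `java.util.Date` and `java.sql.Date`:

```java
import java.util.Date;

public class ImportConflict {
    public static void main(String[] args) {
        // Only one `Date` can be imported; Java has no import aliasing,
        // so the other class must be referenced by its fully qualified name.
        // (Scala would allow: import java.sql.{Date => SqlDate})
        Date utilDate = new Date();
        java.sql.Date sqlDate = new java.sql.Date(utilDate.getTime());
        System.out.println(sqlDate.getClass().getName());
    }
}
```

This is why the test above writes `java.util.Map` inline instead of importing it: `scala.collection.Map` is already imported in that file.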
Hi, @gatorsmile and @srowen .

Thanks! Merging to master.

Thank you, @gatorsmile and @srowen !
What changes were proposed in this pull request?

Due to SI-8479, SPARK-1093 introduced redundant SparkContext constructors. However, SI-8479 is already fixed in Scala 2.10.5 and Scala 2.11.1. The real reason to provide these constructors is so that Java code can access `SparkContext` directly; that is a Scala behavior, SI-4278. So, this PR adds an explicit test suite, `JavaSparkContextSuite`, to prevent future regressions, and fixes the outdated comment, too.

How was this patch tested?

Pass the Jenkins with a new test suite.
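The interop gap behind these constructors can be sketched without Spark: Scala default arguments are not exposed as overloads to Java callers, so explicit overloaded constructors are the usual workaround. Below is a hypothetical plain-Java illustration of that telescoping-overload pattern (the class and field names are made up for this sketch and do not appear in Spark):

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for SparkContext: each overload fills in the
// "default" values that Java callers cannot obtain from Scala default arguments.
public class MiniContext {
    private final String master;
    private final String appName;
    private final String sparkHome;
    private final List<String> jars;
    private final Map<String, String> environment;

    public MiniContext(String master, String appName) {
        this(master, appName, null);
    }

    public MiniContext(String master, String appName, String sparkHome) {
        this(master, appName, sparkHome, Collections.emptyList());
    }

    public MiniContext(String master, String appName, String sparkHome, List<String> jars) {
        this(master, appName, sparkHome, jars, Collections.emptyMap());
    }

    public MiniContext(String master, String appName, String sparkHome,
                       List<String> jars, Map<String, String> environment) {
        this.master = master;
        this.appName = appName;
        this.sparkHome = sparkHome;
        this.jars = jars;
        this.environment = environment;
    }

    public String describe() {
        return master + "/" + appName + "/jars=" + jars.size();
    }

    public static void main(String[] args) {
        // Java can call every overload directly, unlike Scala default arguments.
        System.out.println(new MiniContext("local", "name").describe());
    }
}
```

Each shorter constructor delegates to the full one, which is exactly the role the "redundant" SparkContext constructors play for Java callers.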