Conversation

@dongjoon-hyun
Member

What changes were proposed in this pull request?

This PR re-enables the disabled test. The fix was verified manually via the following steps, and the rebuilt test.jar is committed along with this change.

$ ./build/sbt -Pyarn -Phadoop-2.3 -Pkinesis-asl -Phive-thriftserver -Phive package assembly/assembly streaming-kafka-assembly/assembly streaming-flume-assembly/assembly streaming-mqtt-assembly/assembly streaming-mqtt/test:assembly streaming-kinesis-asl-assembly/assembly

$ cd sql/hive/src/test/resources/regression-test-SPARK-8489/

$ scalac -classpath ~/spark/assembly/target/scala-2.11/spark-assembly-2.0.0-SNAPSHOT-hadoop2.3.0.jar Main.scala MyCoolClass.scala

$ rm test.jar

$ jar cvf test.jar *.class

$ cd ~/spark

$ './bin/spark-submit' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--driver-java-options' '-Dderby.system.durability=test' '--class' 'Main' 'sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar'
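For context, the regression driver in that directory is small: it prints a start marker, constructs a SparkContext and a HiveContext, builds a DataFrame from a case class (the runtime-reflection path that SPARK-8489 is about), and prints the success marker that the suite greps for. A minimal sketch, assuming the general shape of the committed Main.scala and MyCoolClass.scala (field names here are illustrative):

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Illustrative stand-in for MyCoolClass.scala; the real field names may differ.
case class MyCoolClass(praises: String, hugs: String, kisses: String)

object Main {
  def main(args: Array[String]): Unit = {
    println("Running regression test for SPARK-8489.")
    val sc = new SparkContext("local", "testing")
    val hc = new HiveContext(sc)
    // createDataFrame on a Seq of case-class instances uses TypeTag-based
    // runtime reflection -- the code path that threw MissingRequirementError.
    val df = hc.createDataFrame(Seq(MyCoolClass("a", "b", "c")))
    df.collect()
    println("Regression test for SPARK-8489 success!")
    sc.stop()
  }
}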

How was this patch tested?

Passes the Jenkins tests. (Also verified manually with the following.)

$ build/sbt "project hive" "test-only *HiveSparkSubmitSuite -- -z SPARK-8489"
[info] HiveSparkSubmitSuite:
...
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.

@dongjoon-hyun
Member Author

Jenkins is still running, but we can already see from the log that the re-enabled test passes correctly.

- stdout> Regression test for SPARK-8489 success!
[info] - SPARK-8489: MissingRequirementError during reflection (16 seconds, 504 milliseconds)

@SparkQA

SparkQA commented Mar 10, 2016

Test build #52821 has finished for PR 11630 at commit de507c5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun
Member Author

Hi, @rxin.
Could you review and merge this PR?
It relates to your SPARK-12653, targeting 2.0.0.

Please note that the Spark PR dashboard on Appspot currently shows incorrect information for this PR. I don't know why Jenkins adds SPARK-8489 to this PR and marks it with 1.4.1 and 1.5.0.

@srowen
Member

srowen commented Mar 10, 2016

Do we know what fixed the issue that originally forced this test to be disabled?

@dongjoon-hyun
Member Author

Sure. It is described in the JIRA.

@dongjoon-hyun
Member Author

I fixed this according to @rxin's original guess, which turned out to be right: the prebuilt jar file had been built with an old version of Spark.

@dongjoon-hyun
Member Author

@srowen, please let me know if I made any mistakes.
Thank you always!

@srowen
Member

srowen commented Mar 10, 2016

Yeah I don't know what that referred to -- just local build artifacts or something that needs to be rebuilt and committed somewhere? Just trying to figure out whether the outcome was "this should work now" or "we need to do X for it to work again". But you show it seems to work.

@dongjoon-hyun
Member Author

Ah, the description in the JIRA issue was short, and I didn't include the real error message. Sorry for the lack of detail.

When I enabled the test and investigated the log on the master branch, I found the real error message.

'./bin/spark-submit' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--driver-java-options' '-Dderby.system.durability=test' '--class' 'Main' 'sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar'
Running regression test for SPARK-8489.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$6()Lscala/collection/Map;
        at Main$.main(Main.scala:34)
        at Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:737)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

This happens frequently when versions are mismatched, and it matches @rxin's guess, too.
So I rebuilt the jar from source, since the real purpose of this test case is to guarantee that the test passes against the current source code.
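For anyone puzzling over the stack trace: $lessinit$greater is the JVM-safe encoding of the constructor name <init>, so the missing method is the synthetic getter for SparkContext's 6th default constructor argument, which scalac places on the companion object. A minimal sketch (hypothetical class, not Spark code) of how that mechanism produces a runtime NoSuchMethodError under version skew:

// Hypothetical example illustrating Scala's default-argument encoding.
class Widget(val name: String, val tags: Map[String, String] = Map.empty)

object DefaultArgDemo {
  def main(args: Array[String]): Unit = {
    // scalac compiles this call as roughly:
    //   new Widget("w", Widget$.MODULE$.$lessinit$greater$default$2())
    // The caller's bytecode hard-codes that synthetic method name, so a jar
    // compiled against one version of Widget breaks at runtime with
    // NoSuchMethodError if a later version drops or retypes the parameter.
    val w = new Widget("w")
    println(w.tags)
  }
}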

@dongjoon-hyun
Member Author

Please note that the error message has changed. I think that's due to changes on the master branch.

@dongjoon-hyun
Member Author

Hmm.
I'll test with Scala 2.10 and report here soon.
I hope I didn't make the same mistake again.
Thank you for the review, @srowen.

@dongjoon-hyun
Member Author

With Scala 2.10 there is still a problem, so I will close this PR now.
