call sc.stop() in accordance with spark 0.8 #5
base: master
Conversation
```diff
@@ -27,6 +27,7 @@ class LogisticRegressionSuite extends FunSuite with LocalSparkContext {
   x = model.predict(MLVector(Array(0.0,0.0))).toNumber
   println("Model prediction for (-1.0,0.0): " + x)
   assert(x <= 0.5)
+  sc.stop()
```
For these test suites, shouldn't the afterEach() method in the LocalSparkContext trait take care of stopping these contexts? SparkContext.stop() is idempotent and an extra call wouldn't cause problems, though.
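For reference, a minimal sketch of how such a trait typically works (ScalaTest's `BeforeAndAfterEach` and the Spark 0.8 package layout are assumed; this is not the project's actual code):

```scala
// Sketch only: a LocalSparkContext-style trait whose afterEach() stops the
// context after every test, making an explicit sc.stop() inside a test
// redundant (though harmless, since SparkContext.stop() is idempotent).
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterEach, Suite}

trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>
  @transient var sc: SparkContext = _

  override def afterEach() {
    if (sc != null) {
      sc.stop()  // idempotent, so an extra sc.stop() in the test body is fine
      sc = null
    }
    // Free the driver port so the next suite can bind it again.
    System.clearProperty("spark.driver.port")
    super.afterEach()
  }
}
```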
You're right; after playing with it some more, it appears to fail non-deterministically. Adding `sc.stop()` here only masked the problem by reducing how often it fails. I'm not sure of the root cause (maybe `sbt/sbt test` runs suites in parallel?), but I'll set this aside for now and check back later.
Thanks for looking at this, Austin. In general, I'll plan to add an `MLContext.stop` and an `MLContext.broadcast`. I'm also wondering if an sbt upgrade will fix the non-determinism issue (though I haven't observed it myself). I'll plan to push those in a bit.
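A rough sketch of what those delegating methods might look like (the `sc` field name and `MLContext` internals are assumptions, not the actual implementation):

```scala
// Hypothetical sketch of the planned additions: MLContext forwarding stop()
// and broadcast() to its underlying SparkContext.
import org.apache.spark.SparkContext

class MLContext(val sc: SparkContext) {
  def stop(): Unit = sc.stop()
  def broadcast[T](value: T) = sc.broadcast(value)
}
```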
Yeah, this issue only arose when swapping in Spark master, so it might never amount to anything.
I think that SBT runs tests in parallel by default; Spark's SparkBuild.scala contains a line to disable parallel tests:

```scala
// Only allow one test at a time, even across projects, since they run in the same JVM
concurrentRestrictions in Global += Tags.limit(Tags.Test, 1),
```

We might want to do this here, too.
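A hedged sketch of where that setting could live in this project's build definition (the project id, base, and build object name are assumptions; only the `concurrentRestrictions` line comes from Spark's `SparkBuild.scala`):

```scala
// Hypothetical project/Build.scala in the sbt 0.12-era Build trait style.
import sbt._
import Keys._

object MLIBuild extends Build {
  lazy val root = Project(
    id = "mli",        // assumed project id
    base = file("."),
    settings = Defaults.defaultSettings ++ Seq(
      // Only allow one test at a time, even across projects, since they run in the same JVM
      concurrentRestrictions in Global += Tags.limit(Tags.Test, 1)
    )
  )
}
```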
Hullo,

This pull request adds a corresponding `sc.stop()` for each `val sc = new SparkContext(...)` to avoid `address already in use` errors that come up when running `./sbt/sbt test` against the current version of Spark.
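In isolation, the pattern the description refers to looks roughly like this (suite name illustrative; the try/finally wrapper goes slightly beyond what the diff itself adds):

```scala
// Illustrative only: each test that builds its own context now stops it,
// releasing the driver's listening port before the next suite starts.
import org.apache.spark.SparkContext

val sc = new SparkContext("local", "LogisticRegressionSuite")
try {
  // ... exercise the model against sc ...
} finally {
  sc.stop()  // without this, later suites can hit "address already in use"
}
```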