Conversation

@HyukjinKwon HyukjinKwon (Member) commented Dec 5, 2018

What is this PR for?

This is just to update Scala to 2.11.12, to be consistent with Spark (SPARK-24418).
This PR takes over and closes #3033

There was a minor conflict with a change introduced by my PR (#3206). That change is compatible with both Scala 2.11.8 and 2.11.12, so we don't need to change it further.
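
For reference (not part of the diff), a quick way to confirm that the interpreter actually picked up the new Scala patch version is to print it from a notebook paragraph; `versionNumberString` is part of the standard library:

```scala
// Runnable in a %spark paragraph or any Scala 2.11 REPL.
// Prints e.g. "2.11.12" once the build uses the updated dependency.
import scala.util.Properties

println(Properties.versionNumberString)
```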

What type of PR is it?

[Improvement]

Todos

  • None

What is the Jira issue?

How should this be tested?

  • CI pass

Screenshots (if appropriate)

Questions:

  • Do the license files need an update? No
  • Are there breaking changes for older versions? No
  • Does this need documentation? No

@HyukjinKwon HyukjinKwon closed this Dec 5, 2018
@HyukjinKwon HyukjinKwon reopened this Dec 5, 2018
@felixcheung felixcheung (Member) left a comment

seems good to me

.travis.yml Outdated
- sudo: required
  jdk: "oraclejdk8"
  dist: trusty
  env: BUILD_PLUGINS="true" PYTHON="3" SCALA_VER="2.10" PROFILE="-Pspark-1.6 -Pscala-2.10" SPARKR="true" BUILD_FLAG="install -DskipTests -DskipRat -am" TEST_FLAG="test -DskipRat -am" MODULES="-pl zeppelin-zengine,spark/interpreter,spark/spark-dependencies" TEST_PROJECTS="-Dtest=SparkIntegrationTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

why remove SCALA_VER="2.10" again?

@GezimSejdiu

Hi @HyukjinKwon,
any news about this PR? Does it already support the Spark 2.4.0 (Scala 2.11.x) interpreter? I just created a new branch that builds a customized Zeppelin Docker image based on the BDE Spark Docker image, and while testing it on SANSA via SANSA-Notebooks I found that it does not work with Spark 2.4.0 (see the stack trace below):

java.lang.NoSuchMethodException: scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$loopPostInit()
	at java.lang.Class.getMethod(Class.java:1786)
	at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:268)
	at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:262)
	at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:84)
	at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
	at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
	at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)

Is there a plan to add that support in the upcoming release, or do we have to downgrade and use Spark 2.3.x instead?

Looking forward to hearing from you.

Best regards,
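
For context on the trace above: the interpreter resolves `ILoop`'s private `loopPostInit` method reflectively, by its compiler-mangled name, and Scala 2.11.12 (bundled with Spark 2.4.0) reorganized `ILoop`'s initialization so that no method with that name exists any more, which is why `Class.getMethod` throws. A minimal sketch of the pattern (illustrative only, not the exact Zeppelin code):

```scala
// Reflective call in the style of BaseSparkScalaInterpreter.callMethod:
// looking a method up by name at runtime fails with NoSuchMethodException
// when the Scala version on the classpath no longer defines it.
def callMethod(obj: AnyRef, name: String): AnyRef = {
  val method = obj.getClass.getMethod(name) // throws NoSuchMethodException if absent
  method.setAccessible(true)
  method.invoke(obj)
}

// Works against Scala 2.11.8, fails against 2.11.12 exactly as in the trace:
// callMethod(iloop, "scala$tools$nsc$interpreter$ILoop$$loopPostInit")
```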

@HyukjinKwon HyukjinKwon (Member, Author) commented Dec 12, 2018

Spark 2.4 support was added in #3206, which addresses exactly the issue you faced - see this line: https://github.com/apache/zeppelin/pull/3206/files#diff-b935226b71a3cfbabfb5324b9264c430L84. It will be available in the next release of Zeppelin.
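
The general way to stay compatible with both Scala 2.11.8 and 2.11.12 is to probe for the method instead of assuming it exists, and fall back to an equivalent re-implementation when it is gone. This is only an illustrative sketch of that idea (the helper name is made up, and it is not the actual change in #3206):

```scala
// Hypothetical helper: invoke the named method if this Scala version still
// defines it, otherwise run a fallback that replays the same post-init steps.
def callIfPresent(obj: AnyRef, name: String)(fallback: => Unit): Unit =
  obj.getClass.getMethods.find(_.getName == name) match {
    case Some(m) => m.invoke(obj) // Scala 2.11.8: the mangled method exists
    case None    => fallback      // Scala 2.11.12: it is gone, use our own copy
  }

// callIfPresent(iloop, "scala$tools$nsc$interpreter$ILoop$$loopPostInit") {
//   // re-run the interpreter post-initialization by hand here
// }
```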

@HyukjinKwon (Member, Author)

I'm not aware of the release plan for Zeppelin, since I'm just one of the contributors. For now, Spark should be downgraded, as far as I can tell.

@GezimSejdiu

Many thanks for your response!
For now, I will downgrade the Spark version and look forward to the new release of Zeppelin.

Best regards,

@felixcheung felixcheung closed this Jan 2, 2019
@felixcheung felixcheung reopened this Jan 2, 2019
@HyukjinKwon HyukjinKwon closed this Jan 3, 2019
@HyukjinKwon HyukjinKwon reopened this Jan 3, 2019
@felixcheung (Member)

hmm, not sure why; this is the error, but I don't see recent changes that might have broken it:


09:52:10,841  INFO org.apache.zeppelin.notebook.Paragraph:381 - Run paragraph [paragraph_id: paragraph_1546509130838_525871799, interpreter: org.apache.zeppelin.spark.SparkInterpreter, note_id: 2DZC2NGPW, user: anonymous]
09:52:10,841 DEBUG org.apache.zeppelin.interpreter.remote.RemoteInterpreter:206 - st:
z.run(1)

then the test timed out

@HyukjinKwon (Member, Author)

Yup.. let me take a look

@github-actions

This pull request has been inactive for over a year. If no further activity occurs within the next 30 days, it will be automatically closed. If you believe this PR is still relevant, please feel free to leave a comment or make an update. Thank you!

@github-actions github-actions bot added the Stale label Jul 19, 2025
@github-actions

This pull request has been automatically closed due to prolonged inactivity (over one year without updates). If you feel this was done in error or would like to continue the discussion, feel free to reopen it. Thank you for your contributions!

@github-actions github-actions bot closed this Aug 21, 2025