[ZEPPELIN-3351] Fix build error with 'spark-2.3' profile #2880
Conversation
felixcheung left a comment
    </spark.src.download.url>
    <spark.bin.download.url>
        http://www-us.apache.org/dist/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
    </spark.bin.download.url>
This is a pre-existing problem, but we really need to change to use the ASF mirrors instead. We can't release with this...
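A minimal sketch of one way around the hard-coded host at build time, assuming Maven's standard command-line property override applies to the spark.bin.download.url property quoted above; the archive.apache.org URL is an assumption, not something settled in this thread:

```sh
# Sketch: override the hard-coded download URL when building.
# archive.apache.org is the long-term ASF archive; routing through the
# ASF mirror system would be the other option raised in this thread.
mvn clean package -DskipTests -Pspark-2.3 -Pscala-2.11 \
  -Dspark.bin.download.url=https://archive.apache.org/dist/spark/spark-2.3.0/spark-2.3.0-bin-without-hadoop.tgz
```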
Right, this is tracked. But I'm afraid it can't be fixed in 0.8.
@woowahan-jaehoon Zeppelin 0.8 can still run Spark 2.3 without this PR. The spark-2.3 profile only controls which Spark version is embedded in Zeppelin. As @felixcheung mentioned, we can make this change after the download URL is fixed.
So, how did you solve the download issue? Just by using a CDN?
I think we should try to fix the build profile as in this PR. But we must fix the download source; otherwise, given what I've seen recently, we cannot release (it's not up to me).
For now, we already have a spark-2.3 profile.
@jongyoul, I know the 'spark-2.3' profile already exists in some Maven projects, but building with the spark-2.3 profile has some bugs.
@woowahan-jaehoon Ok, got it. Then, could you please file your issue? You can also change the title of ZEPPELIN-3351, which you created. WDYT?
I modified the title and content. Is that enough, or should I describe more?
You'd better specify your problem. If your issue is urgent, we need to fix it as soon as possible.
I want to build with spark-2.3 support. When I built with these commands, the build failed with the following errors, one per attempt:

1. Spark 2.3 no longer supports Scala 2.10, so the build failed under Scala 2.10.
2. 'http://d3kbcqa49mib13.cloudfront.net' no longer hosts the Spark source and binary archives, so the downloads failed.
3. A wrong property.
4. An unused import.

After fixing each of these, the build succeeded.
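For reference, this is the invocation quoted in the PR description below; pairing it with the four failures above is an inference from this comment, not something stated explicitly:

```sh
# Build command from the PR description; each of the errors listed
# above reportedly appeared on successive runs while iterating on it.
mvn package -DskipTests -Pspark-2.3 -Phadoop-2.7 -Pyarn -Ppyspark -Pscala-2.11
```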
You don't need to build with the spark-2.3 profile to use Spark 2.3 in Zeppelin. You can just build it with the following simple command and use Spark 2.3. The purpose of the spark-2.3 profile is the embedded Spark version (which isn't supported yet); you can use Spark 2.3 by setting SPARK_HOME in the interpreter setting.
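The exact command from this comment was not preserved in the thread; a minimal sketch of the build-then-configure flow it describes, assuming a profile-less build and an external Spark 2.3 distribution (paths are examples):

```sh
# Assumed plain build, without the spark-2.3 profile
mvn clean package -DskipTests -Pscala-2.11

# Then point the Spark interpreter at an external Spark 2.3 install,
# e.g. via conf/zeppelin-env.sh or the interpreter setting
export SPARK_HOME=/opt/spark-2.3.0-bin-hadoop2.7
```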
Oh, I missed that. It's not urgent.
Hi, the latest release doesn't seem to show the scheduler icon in the Zeppelin UI.
@usmanskhan It is because cron is disabled by default; you can set the property that enables it in zeppelin-site.xml.
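The property name was elided above; per the Zeppelin 0.8 documentation it is zeppelin.notebook.cron.enable, so the zeppelin-site.xml entry would look roughly like:

```xml
<!-- Sketch: enable note cron scheduling (disabled by default in 0.8).
     Property name taken from the Zeppelin docs, not from this thread. -->
<property>
  <name>zeppelin.notebook.cron.enable</name>
  <value>true</value>
</property>
```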
@woowahan-jaehoon Can you please close this PR if it's not going to proceed any further?
Ok
Thanks!! :-)
What is this PR for?
I failed to build with this option (specifically the spark-2.3 profile):

    mvn package -DskipTests -Pspark-2.3 -Phadoop-2.7 -Pyarn -Ppyspark -Pscala-2.11

It is caused by these reasons: Spark 2.3 no longer supports Scala 2.10; the old CloudFront URL no longer hosts the Spark source and binary archives; a wrong property; and an unused import. This PR fixes these errors.
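A sketch of the kind of pom.xml change this implies, reusing the spark.archive and spark.bin.download.url properties quoted in the review above; the profile body, version value, and URL are assumptions for illustration, not the PR's actual diff:

```xml
<!-- Hypothetical spark-2.3 profile: Scala 2.11 only, and a download URL
     that still hosts Spark archives (the old cloudfront.net host no
     longer does). -->
<profile>
  <id>spark-2.3</id>
  <properties>
    <spark.version>2.3.0</spark.version>
    <spark.archive>spark-${spark.version}</spark.archive>
    <spark.bin.download.url>
      https://archive.apache.org/dist/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
    </spark.bin.download.url>
  </properties>
</profile>
```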
What type of PR is it?
Bug Fix
Todos
What is the Jira issue?
ZEPPELIN-3351
How should this be tested?
Screenshots (if appropriate)
Questions: