[SPARK-3787] Assembly jar name is wrong when we build with sbt omitting -Dhadoop.version #2647
Conversation
QA tests have started for PR 2647 at commit
project/SparkBuild.scala
Typo cnahged -> changed
Thanks, I've fixed it.
QA tests have finished for PR 2647 at commit
retest this please.
QA tests have started for PR 2647 at commit
QA tests have finished for PR 2647 at commit
I'd hate to have to hard-code more stuff like this. How about just documenting that
Yeah, enforcing to set
Is there a programmatic way to read POM properties from SBT? For example,
@srowen @liancheng Thanks for your advice! I found a systematic way which refers
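The question above (reading POM properties programmatically from the build) could be sketched along the following lines. This is a minimal, hypothetical helper using only the JDK's built-in DOM parser; `PomProps` and `readProperty` are illustrative names, not the code this PR actually merged:

```scala
import javax.xml.parsers.DocumentBuilderFactory
import java.io.ByteArrayInputStream

// Hypothetical sketch: look up a <properties> entry such as
// <hadoop.version> in POM XML. Illustrative only; the mechanism
// actually adopted in SparkBuild.scala may differ.
object PomProps {
  def readProperty(pomXml: String, name: String): Option[String] = {
    val doc = DocumentBuilderFactory.newInstance()
      .newDocumentBuilder()
      .parse(new ByteArrayInputStream(pomXml.getBytes("UTF-8")))
    val nodes = doc.getElementsByTagName(name)
    if (nodes.getLength > 0) Some(nodes.item(0).getTextContent.trim)
    else None
  }
}
```

Reading the defaults out of `pom.xml` this way would keep sbt and Maven builds in sync without hard-coding version strings in two places.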
retest this please.
Jenkins wouldn't pick this PR up, so I opened another PR, #3046, for this issue.
This PR is another solution for this issue. When we build with sbt with a profile for Hadoop but without the property for the Hadoop version, like:
the jar name always uses the default version (1.0.4).
When we build with Maven under the same conditions, the default version for each profile is used instead. For instance, if we build like:
the jar name uses hadoop2.2.0, the default version of the hadoop-2.2 profile.
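The behaviour described above can be sketched as a version-fallback function. The map and function names below are illustrative, not the actual SparkBuild.scala code: Maven falls back to the active profile's default Hadoop version, while the buggy sbt build fell through to the global default 1.0.4 whenever -Dhadoop.version was omitted.

```scala
// Hypothetical profile -> default Hadoop version table (illustrative values).
val profileDefaults = Map("hadoop-2.2" -> "2.2.0", "hadoop-2.3" -> "2.3.0")

// Maven-like resolution: an explicit -Dhadoop.version wins, otherwise the
// active profile's default, otherwise the global default 1.0.4. The sbt bug
// amounted to skipping the middle step.
def hadoopVersionForJar(activeProfile: Option[String],
                        explicitVersion: Option[String]): String =
  explicitVersion
    .orElse(activeProfile.flatMap(profileDefaults.get))
    .getOrElse("1.0.4")
```

With this fallback, building with the hadoop-2.2 profile and no explicit version would name the jar with 2.2.0, matching Maven, instead of 1.0.4.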