[SPARK-31644][BUILD] Make Spark's guava version configurable from the command line #28455
Conversation
…ommand line.

This adds the Maven property guava.version, which can be used to control the guava version for a build. It does not change the current version.

Change-Id: Icc20a9bb2a73432f4a56f5def52a88f4bf7c06b6
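As a rough illustration only (not part of the patch itself; the version and extra flags are placeholders), overriding the new property during a local Maven build would look something like:

```
# Hypothetical example: build Spark against a newer guava than the pom default.
./build/mvn -DskipTests -Dguava.version=27.0-jre package
```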
Thank you so much for taking a look at this, @steveloughran. :)
dongjoon-hyun left a comment
+1, LGTM. Merged to master/3.0.
… command line

### What changes were proposed in this pull request?
This adds the Maven property guava.version, which can be used to control the guava version for a build. It does not change the current version.

### Why are the changes needed?
All future Hadoop releases are going to be built with a later guava version, including Hadoop 3.1.4. This means that to run the Spark tests with that release you need to update the Spark guava version. This patch lets whoever builds Spark do this locally.

### Does this PR introduce _any_ user-facing change?
no

### How was this patch tested?
Ran the hadoop-cloud module tests with the 3.1.4 RC0:
```
mvn -T 1 -Phadoop-3.2 -Dhadoop.version=3.1.4 -Psnapshots-and-staging -Phadoop-cloud,yarn,kinesis-asl test -pl hadoop-cloud
```
Observed the linkage problem:
```
java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
```
Made the version configurable, retested with:
```
-Phadoop-3.2 -Dhadoop.version=3.1.4 -Psnapshots-and-staging -Dguava.version=27.0-jre
```
All good.

Closes #28455 from steveloughran/SPARK-31644-guava-version.

Authored-by: Steve Loughran <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 86c4e43)
Signed-off-by: Dongjoon Hyun <[email protected]>
BTW, for the other reviewers: the Hive module will fail if we use the latest Guava. This PR is a preparation for the next steps.
Test build #122323 has finished for PR 28455 at commit
> Hive module will fail if we use the latest Guava. This PR is a preparation for next steps.

Hive. joy
…e from the command line for sbt

### What changes were proposed in this pull request?
This PR proposes to make the guava version configurable from the command line for sbt.

### Why are the changes needed?
#28455 added the configurability for Maven but not for sbt. sbt is usually faster than Maven, so it's useful for developers.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
I confirmed the guava version is changed with the following commands.
```
$ build/sbt "inspect tree clean" | grep guava
[info] +-spark/*:dependencyOverrides = Set(com.google.guava:guava:14.0.1, xerces:xercesImpl:2.12.0, jline:jline:2.14.6, org.apache.avro:avro:1.8.2)
```
```
$ build/sbt -Dguava.version=25.0-jre "inspect tree clean" | grep guava
[info] +-spark/*:dependencyOverrides = Set(com.google.guava:guava:25.0-jre, xerces:xercesImpl:2.12.0, jline:jline:2.14.6, org.apache.avro:avro:1.8.2)
```

Closes #28822 from sarutak/guava-version-for-sbt.

Authored-by: Kousuke Saruta <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
What changes were proposed in this pull request?
This adds the Maven property guava.version, which can be used to control the guava version for a build. It does not change the current version.
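For illustration, one quick way to confirm which guava version a build resolves after the override; this sketch assumes the stock maven-dependency-plugin, and the version shown is a placeholder:

```
# Illustrative check: print the resolved guava artifact after overriding the property.
./build/mvn -Dguava.version=27.0-jre dependency:tree -Dincludes=com.google.guava:guava
```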
Why are the changes needed?
All future Hadoop releases are going to be built with a later guava version, including Hadoop 3.1.4. This means that to run the Spark tests with that release you need to update the Spark guava version. This patch lets whoever builds Spark do this locally.
Does this PR introduce any user-facing change?
no
How was this patch tested?
Ran the hadoop-cloud module tests with the 3.1.4 RC0:
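```
mvn -T 1 -Phadoop-3.2 -Dhadoop.version=3.1.4 -Psnapshots-and-staging -Phadoop-cloud,yarn,kinesis-asl test -pl hadoop-cloud
```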
Observed the linkage problem:
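```
java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
```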
Made the version configurable, retested with:
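```
-Phadoop-3.2 -Dhadoop.version=3.1.4 -Psnapshots-and-staging -Dguava.version=27.0-jre
```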
All good.