Conversation

@warrenzhu25
Contributor

What changes were proposed in this pull request?

Upgrade the Guava version to 27.0-jre to be consistent with Hadoop.

Why are the changes needed?

When using hadoop-aws 3.2 with Spark 3.0, the error below is thrown. This is caused by a Guava version mismatch: Hadoop uses Guava 27.0-jre while Spark uses 14.0.1 (a short sketch of why the linkage fails follows the stack trace).

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V
at org.apache.hadoop.fs.s3a.S3AUtils.lookupPassword(S3AUtils.java:742)
at org.apache.hadoop.fs.s3a.S3AUtils.getAWSAccessKeys(S3AUtils.java:712)
at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:559)
at org.apache.hadoop.fs.s3a.DefaultS3ClientFactory.createS3Client(DefaultS3ClientFactory.java:52)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:264)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1853)
at org.apache.spark.deploy.history.EventLogFileWriter.<init>(EventLogFileWriters.scala:60)
at org.apache.spark.deploy.history.SingleEventLogFileWriter.<init>(EventLogFileWriters.scala:213)
at org.apache.spark.deploy.history.EventLogFileWriter$.apply(EventLogFileWriters.scala:181)
at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:66)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2588)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:937)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:931)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:30)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:944)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1023)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1032)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
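
For context (this snippet is not part of the PR), a minimal sketch of why the call fails at link time: Hadoop 3.2 is compiled against Guava 27, where the fixed-arity overload `Preconditions.checkArgument(boolean, String, Object, Object)` exists, so its bytecode references that exact method descriptor. Guava 14.0.1 only provides the varargs overload `checkArgument(boolean, String, Object...)`, so the JVM throws `NoSuchMethodError` when the `S3AUtils` call is linked. The class and usage below are hypothetical, just to illustrate the failure mode:

```java
import com.google.common.base.Preconditions;

public class GuavaOverloadDemo {
    public static void main(String[] args) {
        // Compiled against Guava 27.x, javac binds this call to the fixed-arity overload
        // with descriptor (ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V.
        // Run with Guava 14.0.1 on the classpath, that overload does not exist (only the
        // varargs form does), so the JVM throws NoSuchMethodError at link time --
        // the same failure seen in the S3AUtils.lookupPassword frame above.
        Preconditions.checkArgument(args.length == 2,
            "Expected %s arguments but got %s", 2, args.length);
    }
}
```

Compiling this against guava-27.0-jre and running it with guava-14.0.1 on the classpath should reproduce the same error outside of Spark.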

Does this PR introduce any user-facing change?

No

How was this patch tested?

Manually tested

@github-actions github-actions bot added the BUILD label Sep 27, 2021
@AmplabJenkins

Can one of the admins verify this patch?

@sunchao
Member

sunchao commented Sep 27, 2021

@warrenzhu25 this won't work because of Guava dependencies from Hive. Check #33989 and #29326 for the latest efforts on this.

@warrenzhu25
Contributor Author

> @warrenzhu25 this won't work because of Guava dependencies from Hive. Check #33989 and #29326 for the latest efforts on this.

Thanks for the comment. I'll close this.

warrenzhu25 deleted the guava branch March 14, 2022 04:39