[SPARK-4034] Change the scope of guava to compile #2876
Conversation
CC @vanzin. This should not be changed, as it has to do with how Guava is shaded, I believe. As I say, this does not seem to be a problem for the Maven build or IDEA in general. I think you have locally modified your project.
Can one of the admins verify this patch?
I think the root cause is this: the scope of guava in the root pom.xml is "provided", and every time we do a reimport (right-click the whole project, then Maven -> Reimport), the scope is reset to "provided", which causes the exception. If we change it to "compile", the exception never occurs.
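
(For reference, a minimal sketch of the kind of entry being discussed in the root pom.xml; the version shown is illustrative, not taken from the Spark tree:)

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
      <!-- "provided": on the compile classpath, but IDEA's run
           configurations may omit it from the runtime classpath -->
      <scope>provided</scope>
    </dependency>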
I need to think a bit about this. There is an issue when running unit tests if you rely on spark-core but don't have an explicit dependency on guava; at the same time, I don't like exposing Guava as a transitive dependency because the Spark assembly does not expose it. Let me think if there's a better way to handle this.
@baishuo can you try whether setting the dependency to "runtime" works? That avoids my main worry (leaking Guava into the compilation classpath of client apps) while hopefully allowing unit tests to run.
(BTW, a pre-emptive note: when changing the root pom to "runtime", you may have to explicitly add guava as a "provided" dependency to the sub-modules that actually compile against Guava.)
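
(A sketch of that suggestion, with the version purely illustrative: the root pom declares Guava with "runtime" scope, and each sub-module that compiles against Guava re-declares it as "provided":)

    <!-- root pom.xml: Guava stays off the compile classpath of client apps -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
      <scope>runtime</scope>
    </dependency>

    <!-- pom.xml of a sub-module that compiles against Guava -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <scope>provided</scope>
    </dependency>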
Can one of the admins verify this patch?
Hi @vanzin, I have modified 4 pom.xml files, changing the scope of guava to "runtime" in the root pom.xml, and all tests of the sql project pass. Can this change be tested?
(That last comment was meant for the dependencies you added in your patch... man, GitHub's review interface is confusing.) Anyway, I can't trigger tests for you; an admin would have to do it.
I can trigger the tests, but it sounds like this is not the suggested way to resolve the problem. If that's the case, then I suggest we close this issue.
Related: #3658
After clicking Maven -> Reimport on the Spark project in IDEA and then starting "SparkSQLCLIDriver", we get an exception:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/ThreadFactoryBuilder
at org.apache.spark.util.Utils$.<init>(Utils.scala:611)
at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:178)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:36)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:256)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:149)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
This is caused by the fact that, after Maven -> Reimport is clicked, the scope of the guava jar in the spark-hive-thriftserver project is changed to "provided" (right-click the spark-hive-thriftserver project and choose the Dependencies tab to see the scope of each jar in the project). We can change it to "compile" and restart SparkSQLCLIDriver, and the exception disappears. But if we re-run Maven -> Reimport, the scope of the guava jar returns to "provided".
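
(For reference, the change described by the PR title amounts to the following sketch; note that "compile" is also Maven's default scope when none is given:)

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <!-- "compile": on both the compile and runtime classpaths,
           so IDEA run configurations can resolve Guava classes -->
      <scope>compile</scope>
    </dependency>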