The spark-connect jar is 100 MB, which is too large and exceeds our platform's size limit #43
@PhilosophyBuns Hi! The repo you submitted this issue to was the doc repo. I have transferred this issue to the nebula-spark-connector repo.
Strangely, once I depend on nebula-spark-connector, Scala gets pulled in. Even after I add an exclusion, Scala is still there, which causes a conflict between the Scala in my packaged jar and the Scala already present on Spark.
Does your project reference any other dependencies that themselves depend on Scala? I tried excluding the scala-library artifact, and it no longer appeared among the dependencies.
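The exclusion the commenter describes can be sketched as a Maven dependency declaration like the following. This is a minimal sketch assuming a Maven build; the version number is illustrative, and the coordinates should be checked against Maven Central before use.

```xml
<!-- Hedged sketch: excluding the transitive scala-library from nebula-spark-connector.
     The version shown is illustrative only; verify the actual coordinates and version
     on Maven Central for your Spark/Scala combination. -->
<dependency>
  <groupId>com.vesoft</groupId>
  <artifactId>nebula-spark-connector</artifactId>
  <version>3.0.0</version> <!-- illustrative version -->
  <exclusions>
    <exclusion>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

After adding the exclusion, running `mvn dependency:tree` shows whether scala-library still arrives transitively through some other dependency, which would explain it persisting despite the exclusion.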
Why does a connector need to be this large? It makes no sense at all.