
The spark-connect jar is 100 MB, which exceeds our platform's size limit #43

Closed
PhilosophyBuns opened this issue Apr 21, 2022 · 3 comments · Fixed by #68


@PhilosophyBuns

PhilosophyBuns commented Apr 21, 2022

Why does a connector need to be this large? It makes no sense at all.

@randomJoe211 randomJoe211 transferred this issue from vesoft-inc/nebula-docs-cn Apr 21, 2022
@randomJoe211
Contributor

@PhilosophyBuns Hi! The repo you submitted this issue to was the docs repo. I have transferred it to the nebula-spark-connector repo.

@PhilosophyBuns
Author

Strangely, as soon as I depend on nebula-spark-connector, Scala gets pulled in. Even after I add an exclusion, Scala is still there, which causes a conflict between the Scala bundled into my jar and the Scala already on Spark.

@Nicole00
Contributor

> Strangely, as soon as I depend on nebula-spark-connector, Scala gets pulled in. Even after I add an exclusion, Scala is still there, which causes a conflict between the Scala bundled into my jar and the Scala already on Spark.

Does your project reference any other dependencies that themselves depend on Scala? I tried excluding the scala-library package, and it no longer appears among the dependencies:

<dependency>
    <groupId>com.vesoft</groupId>
    <artifactId>nebula-spark-connector</artifactId>
    <version>3.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </exclusion>
    </exclusions>
</dependency>
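A quick way to verify that the exclusion took effect (a standard Maven command, not something shown in the thread) is to ask Maven to print the resolved dependency tree, filtered to the Scala group:

```
mvn dependency:tree -Dincludes=org.scala-lang
```

If the exclusion works, org.scala-lang:scala-library should no longer appear under nebula-spark-connector; if it still shows up, the tree output reveals which other dependency is transitively pulling it in.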

[screenshot: resolved dependency list, with scala-library no longer present]
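On the original size complaint: a common way to keep the application jar small (a general Maven pattern, not advice given in this thread; the spark-core coordinates below are illustrative and must match your cluster's Spark and Scala versions) is to mark dependencies that already exist on the Spark cluster as `provided`, so they are not bundled into the fat jar:

```
<!-- Spark itself is already on the cluster; do not bundle it into the application jar. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.8</version>
    <scope>provided</scope>
</dependency>
```

With `provided` scope the dependency is available at compile time but excluded from the packaged artifact, which typically shrinks the jar considerably.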
