Build: Require JDK 17 / 21 for Spark 4.0 support #13381
Conversation
spark/v4.0/build.gradle
Outdated
    String sparkMajorVersion = '4.0'
    String scalaVersion = '2.13'

    if (!JavaVersion.current().isCompatibleWith(JavaVersion.VERSION_17)) {
how about JDK 21?
Thank you for the suggestion. Is there any background for that, given Spark 4.0 only requires JDK 17?
Spark 4.0 can compile with JDK 21, so we shouldn't fail the build on it.
But won't that make Iceberg + Spark 4.0 require at least JDK 21 to build? For example, I can currently build them with JDK 17 without failures, but after that change I would have to switch to JDK 21.
Am I missing something here?
This is Spark's JDK requirement, which is 17 for 4.0: https://github.com/apache/spark/blob/fa33ea000a0bda9e5a3fa1af98e8e85b8cc5e4d4/pom.xml#L117
No, I mean it should build with either JDK 17 or JDK 21.
Ahh, got it.
Since we already have the 11 / 17 / 21 check in the root script, do you mean we should add our own checks for each Spark module?
I'll update the PR for all Spark versions if I'm not misunderstanding something here.
@zhztheplayer What @manuzhang is saying is that you can build with a later JDK but set the target to 17. That way the JARs produced will still work on JDK 17: https://www.baeldung.com/java-source-target-options#backward-compatibility-with-older-java-versions
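For illustration, a minimal Gradle sketch of that approach (not code from this PR; it uses the standard JavaCompile task and its release option):

    // Sketch only: compile with a newer JDK (e.g. 21) while emitting
    // JDK 17 bytecode. Using --release also validates API usage
    // against the JDK 17 class library, not just the bytecode level.
    tasks.withType(JavaCompile).configureEach {
        options.release = 17
    }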
@Fokko Yes, I understand this point. I was thinking that since we already have an 11 / 17 / 21 check in the root script, we could just add a check for version >= 17, which would inherently cover both 17 and 21.
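A minimal sketch of such a check in a Gradle script, built around the same isCompatibleWith call shown in the diff above (illustrative only; the error message here is made up, not the PR's exact wording):

    // Illustrative only: isCompatibleWith(VERSION_17) is true for
    // JDK 17 and anything newer, so this accepts both 17 and 21.
    if (!JavaVersion.current().isCompatibleWith(JavaVersion.VERSION_17)) {
        throw new GradleException(
            "Building Iceberg for Spark 4.0 requires JDK 17 or later, " +
            "found ${JavaVersion.current()}.")
    }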
spark/v3.5/build.gradle
Outdated
    def currentJavaVersion = JavaVersion.current()

    if (!(currentJavaVersion.is(JavaVersion.VERSION_11) || currentJavaVersion.is(JavaVersion.VERSION_17)
Spark 3.5 builds with all supported JDKs, so no check is needed here.
@manuzhang I just decided to only cover 4.0 in this PR... I am not sure about the user-facing impact of adding the enforcement for 3.x at the moment.
Issue: #13380
Currently, errors like "error: cannot access JavaRDD" are thrown when building Iceberg + Spark 4.0 with JDK 11. The patch adds a check to the Gradle script requiring a minimum JDK version of 17. When that requirement is not satisfied, the build reports the following error: