Make Delta able to cross-compile against Spark Latest Release (3.5) and Spark Master (4.0) #2877
Merged
scottsand-db merged 1 commit into delta-io:master on Apr 10, 2024
Conversation
Contributor
@scottsand-db if you have a chance, could you look at PR #2828? It also makes some changes to match the Scala version with the Spark main and 3.5 branches, and includes security updates. Thanks.
vkorukanti approved these changes on Apr 10, 2024
Collaborator
vkorukanti left a comment
The shim model LGTM. One question: does this require Spark to publish the jars from the master branch daily or periodically?
Collaborator
Author
@vkorukanti we just depend on 4.0.0-SNAPSHOT, which is a nightly Spark build. If they skip a night, then 🤷 that's fine too.
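For context, here is a minimal sketch of how such a nightly snapshot dependency might be wired up in `build.sbt`. The version mapping, the pinned 3.5.x patch release, and the exact settings are illustrative assumptions, not taken from this PR:

```scala
// Hypothetical build.sbt fragment (assumption, not the actual Delta build definition).
// Pick the Spark version from a JVM property, defaulting to the latest release line.
val sparkVersion = sys.props.getOrElse("sparkVersion", "3.5") match {
  case "master" => "4.0.0-SNAPSHOT" // nightly Spark build from the master branch
  case _        => "3.5.1"          // pinned 3.5.x release (exact patch version assumed)
}

// Snapshot artifacts are not on Maven Central, so a snapshot repository is needed.
resolvers += "Apache Snapshots" at
  "https://repository.apache.org/content/repositories/snapshots/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
)
```

With this shape, `build/sbt -DsparkVersion=master spark/compile` resolves whatever 4.0.0-SNAPSHOT jars were published most recently, so a skipped nightly simply means compiling against a slightly older snapshot.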
andreaschat-db pushed a commit to andreaschat-db/delta that referenced this pull request on Apr 16, 2024
Make Delta able to cross-compile against Spark Latest Release (3.5) and Spark Master (4.0) (delta-io#2877)

#### Which Delta project/connector is this regarding?

- [X] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description

### What DOES this PR do?

- Changes Delta's `build.sbt` to compile `delta-spark` against Spark master. Compilation succeeds. Tests pass against Spark 3.5. Tests run but fail against Spark master, e.g. `build/sbt -DsparkVersion=master spark/test`.
- The default Spark version for Delta is still Spark 3.5.
- Testing requires building unidoc for (unfortunately) ALL projects in build.sbt. That breaks since Spark master uses JDK 17 but delta-iceberg uses JDK 8; thus, we disable unidoc for delta-spark compiling against spark-master for now.
- Delta: creates `spark-3.5` and `spark-master` folders. Delta will be able to cross-compile against both. These folders will contain `shims` (code that will be selectively pulled in to compile against a single Spark version) but also spark-version-only code.

### What does this PR NOT do?

- This PR does not update any build infra (GitHub Actions) to actually compile or test delta-spark against Spark master. That will come later.

## How was this patch tested?

Existing tests.

- `build/sbt -DsparkVersion=3.5 spark/test` ✅
- `build/sbt -DsparkVersion=master spark/compile` ✅
- `build/sbt -DsparkVersion=master spark/test` ❌ (expected, these fixes will come later)

## Does this PR introduce _any_ user-facing changes?

No
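To make the cross-compile mechanism described above concrete, here is a minimal, hypothetical `build.sbt` sketch of how per-Spark-version source folders (and the shims they contain) might be selected. The folder paths and setting values are assumptions based on the description, not the PR's actual build code:

```scala
// Hypothetical sketch only: wire an extra source directory into the spark project
// depending on which Spark version was requested with -DsparkVersion.
val targetSpark = sys.props.getOrElse("sparkVersion", "3.5") // "3.5" (default) or "master"

lazy val spark = (project in file("spark"))
  .settings(
    // Pull in spark/src/main/spark-3.5/... or spark/src/main/spark-master/...
    // (assumed paths). Each folder holds shims: the same class names with
    // version-specific bodies, so the rest of delta-spark compiles unchanged
    // against either Spark version.
    Compile / unmanagedSourceDirectories +=
      baseDirectory.value / "src" / "main" / s"spark-$targetSpark",
    Test / unmanagedSourceDirectories +=
      baseDirectory.value / "src" / "test" / s"spark-$targetSpark"
  )
```

In this model a shim is a source file that exists under both folders with the same fully qualified name but a version-specific implementation, so only one copy is ever on the compile path at a time.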