ORC column map fix #227
Merged
Changes from all commits
26 commits
All commits are by edgarRd.

e8e8619  Read iceberg to ORC mapping from ORC file if found
c4d8017  Use the right ORC schema in SparkOrcReader
9d7c4fd  Use ORC 1.6.0 TypeDescription attributes for column mapping
a833102  Address some PR comments
ec8bba5  Add example on ORC schema evolution handling
3c84d0a  Add metadata for converting different types of binary fields
e0bbee2  Verify roundtrip convertion of types
028acc6  Add GenericOrcWriter implementation in iceberg-data
7f86427  Update to ORC 1.6.1
39b3b62  Save Game
2fca5c1  Fix projection by computing last max iceberg id
6d26fee  Fix duplicated classes in runtime
407b2e8  Fix style check
e392ffb  Make hadoop dependency compileOnly
df64260  Remove unnecessary call to buildOrcProjection
c7f9a80  Remove empty comment
e332067  Fix typo on test column id
f740509  Handle case when renaming a column and reusing previous column name
dc30658  Avoid trying to project ORC columns if no Iceberg ID is found
2c0687a  Revert white space changes in build.gradle
0782063  Upgrade ORC to 1.6.2
3e86d28  Fix typo
eeb0a56  Use RuntimeIOException
8378036  Using this for setting instance fields
1f5668e  Add more complex schema roundtrip conversion tests
37f7c8a  Split ORC schema tests for build projection tests
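Several of the commits above center on recording the Iceberg-to-ORC column mapping as TypeDescription attributes, which ORC introduced in 1.6.0. The snippet below is only a minimal sketch of that idea, not the PR's actual code: the attribute key `iceberg.id`, the example schema, and the fallback message are illustrative assumptions.

```java
import org.apache.orc.TypeDescription;

public class OrcColumnIdSketch {
  // Hypothetical attribute key for illustration; the PR may use a different name.
  private static final String ICEBERG_ID_ATTR = "iceberg.id";

  public static void main(String[] args) {
    // Writer side: tag each ORC column with its Iceberg field id.
    TypeDescription schema = TypeDescription.fromString("struct<id:bigint,data:string>");
    schema.getChildren().get(0).setAttribute(ICEBERG_ID_ATTR, "1");
    schema.getChildren().get(1).setAttribute(ICEBERG_ID_ATTR, "2");

    // Reader side: recover the Iceberg id from the ORC type, if present.
    for (int i = 0; i < schema.getChildren().size(); i++) {
      TypeDescription column = schema.getChildren().get(i);
      String icebergId = column.getAttributeValue(ICEBERG_ID_ATTR);
      System.out.printf("%s -> %s%n",
          schema.getFieldNames().get(i),
          icebergId != null
              ? "iceberg id " + icebergId
              : "no id attribute; skip projection for this column");
    }
  }
}
```

The null branch above is in the spirit of the "Avoid trying to project ORC columns if no Iceberg ID is found" commit: files written without id attributes cannot be projected by id alone.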
Conversations
Why was it necessary to add this hadoop-common dependency?
Since org.apache.hadoop:hadoop-common is a non-transitive dependency used at compile time in the other projects, we need to add it here because ORC requires it in order to use org.apache.hadoop.io.WritableComparable on HiveDecimalWritable.
I just thought that hadoop-client would pull it in transitively.
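For context, a minimal build.gradle sketch of the arrangement being discussed; the coordinates, configuration names, and omitted versions are assumptions for illustration, not the PR's exact dependency block.

```groovy
// Illustrative only: versions are assumed to be managed elsewhere
// (e.g. a BOM or the project's dependency recommendations).
dependencies {
  // ORC's HiveDecimalWritable implements org.apache.hadoop.io.WritableComparable,
  // so hadoop-common has to be on the compile classpath even though it is not
  // pulled in transitively here. Declaring it compileOnly keeps it out of the
  // runtime artifact, in line with the "Make hadoop dependency compileOnly" commit.
  compileOnly 'org.apache.hadoop:hadoop-common'

  // The ORC dependency; its vectorized API exposes HiveDecimalWritable,
  // which is what triggers the compile-time need for hadoop-common.
  compile 'org.apache.orc:orc-core'
}
```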