# [SPARK-34404][SQL] Add new Avro datasource options to control datetime rebasing in read #31529
Conversation
@cloud-fan @gengliangwang @HyukjinKwon Could you review this PR, please.

jenkins, retest this, please

@cloud-fan @gengliangwang @HyukjinKwon This PR is a companion to #31489, and should solve the same issues. Any concerns about it?

thanks, merging to master!
### What changes were proposed in this pull request?

Mention the DS options introduced by #31529 and by #31489 in `SparkUpgradeException`.

### Why are the changes needed?

To improve user experience with Spark SQL. Before the changes, the error message recommends setting SQL configs, but the configs cannot help in some situations (see the PRs for more details).

### Does this PR introduce _any_ user-facing change?

Yes. After the changes, the error message is:

_org.apache.spark.SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: reading dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z from Parquet files can be ambiguous, as the files may be written by Spark 2.x or legacy versions of Hive, which uses a legacy hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian calendar. See more details in SPARK-31404. You can set the SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInRead' or the datasource option 'datetimeRebaseMode' to 'LEGACY' to rebase the datetime values w.r.t. the calendar difference during reading. To read the datetime values as it is, set the SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInRead' or the datasource option 'datetimeRebaseMode' to 'CORRECTED'._

### How was this patch tested?

1. By checking coding style: `./dev/scalastyle`
2. By running the related test suite:
```
$ build/sbt -Phive-2.3 -Phive-thriftserver "test:testOnly *ParquetRebaseDatetimeV1Suite"
```

Closes #31562 from MaxGekk/rebase-upgrade-exception.

Authored-by: Max Gekk <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
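For illustration, a minimal sketch of the two remedies the updated message recommends; the `spark` session is assumed to exist and the input path is hypothetical:

```scala
// Sketch of the two remedies named in the new error message; the path is a placeholder.
// Remedy 1: session-wide SQL config (affects every Parquet read in the session).
spark.conf.set("spark.sql.legacy.parquet.datetimeRebaseModeInRead", "CORRECTED")

// Remedy 2: per-read datasource option (affects only this read).
val df = spark.read
  .option("datetimeRebaseMode", "CORRECTED") // read ancient dates/timestamps as they are
  .parquet("/path/to/old-files.parquet")     // placeholder path
```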
… options and SQL configs

### What changes were proposed in this pull request?

In the PR, I propose to update the Spark SQL guide about the SQL configs that are related to datetime rebasing:
- spark.sql.parquet.int96RebaseModeInWrite
- spark.sql.parquet.datetimeRebaseModeInWrite
- spark.sql.parquet.int96RebaseModeInRead
- spark.sql.parquet.datetimeRebaseModeInRead
- spark.sql.avro.datetimeRebaseModeInWrite
- spark.sql.avro.datetimeRebaseModeInRead

Parquet options added by #31489:
- datetimeRebaseMode
- int96RebaseMode

and Avro options added by #31529:
- datetimeRebaseMode

![Screenshot 2021-02-17 at 21 42 09](https://user-images.githubusercontent.com/1580697/108252043-3afb8900-7169-11eb-8568-511e21fa7f78.png)

### Why are the changes needed?

To inform users about supported DS options and SQL configs.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

By generating the doc and manually checking:
```
$ SKIP_API=1 SKIP_SCALADOC=1 SKIP_PYTHONDOC=1 SKIP_RDOC=1 jekyll serve --watch
```

Closes #31564 from MaxGekk/doc-rebase-options.

Authored-by: Max Gekk <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
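As a hedged illustration of the per-read Parquet options listed above (the input path is an assumption):

```scala
// Illustrative sketch of the per-read Parquet options documented by this PR;
// the input path is a placeholder.
val df = spark.read
  .option("datetimeRebaseMode", "LEGACY") // rebase dates/timestamps from the hybrid calendar
  .option("int96RebaseMode", "LEGACY")    // rebase INT96 timestamps as well
  .parquet("/path/to/spark2-written.parquet")
```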
### What changes were proposed in this pull request?

In the PR, I propose a new option `datetimeRebaseMode` for the Avro datasource. The option influences the loading of ancient date and timestamp column values from Avro files. The option supports the same values as the SQL config `spark.sql.legacy.avro.datetimeRebaseModeInRead`, namely (see the usage sketch after this list):
- `"LEGACY"`: when the option is set to this value, Spark rebases dates/timestamps from the legacy hybrid calendar (Julian + Gregorian) to the Proleptic Gregorian calendar.
- `"CORRECTED"`: dates/timestamps are read AS IS from Avro files.
- `"EXCEPTION"`: when it is set as the option value, Spark fails the reading if it sees ancient dates/timestamps that are ambiguous between the two calendars.
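A minimal usage sketch of the new option; the spark-avro module on the classpath and the file path are assumptions:

```scala
// Minimal sketch, assuming the spark-avro module is available and the path is a placeholder.
val df = spark.read
  .format("avro")
  .option("datetimeRebaseMode", "CORRECTED") // or "LEGACY" / "EXCEPTION"
  .load("/path/to/ancient-dates.avro")
```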
### Why are the changes needed?

The new option makes it possible to control the rebasing mode per read. Before the changes, reading Avro files that require different rebasing modes in the same query is impossible because the SQL config `spark.sql.legacy.avro.datetimeRebaseModeInRead` influences both reads at once.
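A sketch of that motivating scenario, with assumed paths, where a single session-wide config could not express the two modes:

```scala
// Sketch of the motivating scenario (paths assumed): two Avro sources that need
// different rebasing modes in the same query.
val fromSpark2 = spark.read.format("avro")
  .option("datetimeRebaseMode", "LEGACY")    // files written by Spark 2.x (hybrid calendar)
  .load("/data/written-by-spark2.avro")
val fromSpark3 = spark.read.format("avro")
  .option("datetimeRebaseMode", "CORRECTED") // files already in Proleptic Gregorian
  .load("/data/written-by-spark3.avro")
val combined = fromSpark2.union(fromSpark3)  // assumes both sources share a schema
```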
### Does this PR introduce _any_ user-facing change?

No.
### How was this patch tested?
By running the modified test suites:
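The original list of suites is truncated here; a representative invocation, with the suite name as an assumption, follows the sbt pattern shown above:

```
$ build/sbt "test:testOnly *AvroSuite"   # suite name assumed; the original list is truncated
```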