chore: Suppress javac deprecation warnings in SparkCatalog #2394
Merged: dimas-b merged 1 commit into apache:main from dimas-b:suppress-deprecation-in-spark-client on Aug 20, 2025
Conversation
SparkCatalog intentionally overrides and uses deprecated methods from Spark's TableCatalog. This PR adds suppression annotations to allow for clean compilation given that the deprecated method calls and overrides are clearly expected in this case.
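For illustration, a minimal, hedged sketch of what such a suppression can look like on the deprecated `createTable(Identifier, StructType, Transform[], Map)` override from Spark's `TableCatalog`; the class name below is made up, and the exact annotated members in Polaris's `SparkCatalog` may differ:

```java
import java.util.Map;
import org.apache.spark.sql.connector.catalog.Identifier;
import org.apache.spark.sql.connector.catalog.Table;
import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.connector.expressions.Transform;
import org.apache.spark.sql.types.StructType;

// Hypothetical illustration, not the actual Polaris SparkCatalog.
public abstract class DeprecationSuppressionSketch implements TableCatalog {

  // The StructType-based createTable is deprecated in recent Spark versions,
  // but Spark still routes calls through it, so the override is intentional.
  // The annotation keeps javac's -Xlint:deprecation output clean.
  @Override
  @SuppressWarnings("deprecation")
  public Table createTable(
      Identifier ident,
      StructType schema,
      Transform[] partitions,
      Map<String, String> properties) {
    // A real implementation would delegate to the backing catalog here.
    throw new UnsupportedOperationException("sketch only");
  }
}
```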
flyrain approved these changes on Aug 19, 2025
gh-yzou approved these changes on Aug 20, 2025
adutra approved these changes on Aug 20, 2025
adutra (Contributor) left a comment:
Yeah it seems Spark delegates the new method to the old one, forcing implementors to implement the latter rather than the former:
In 3.5.6:
In 4.0.0:
🤷‍♂️ 😅
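For reference, the delegation described above looks roughly like the following paraphrased sketch (interface name is made up; consult the actual Spark 3.5.6 / 4.0.0 `TableCatalog` source for the authoritative version):

```java
import java.util.Map;
import org.apache.spark.sql.connector.catalog.CatalogV2Util;
import org.apache.spark.sql.connector.catalog.Column;
import org.apache.spark.sql.connector.catalog.Identifier;
import org.apache.spark.sql.connector.catalog.Table;
import org.apache.spark.sql.connector.expressions.Transform;
import org.apache.spark.sql.types.StructType;

// Paraphrased sketch of the delegation pattern, not the real Spark interface:
// the newer Column[]-based createTable has a default body that converts the
// columns to a StructType and calls the deprecated overload, so a catalog has
// to override the deprecated signature for either entry point to do anything.
interface DelegationSketch {

  @Deprecated
  Table createTable(
      Identifier ident, StructType schema, Transform[] partitions, Map<String, String> properties);

  default Table createTable(
      Identifier ident, Column[] columns, Transform[] partitions, Map<String, String> properties) {
    // Spark performs a similar Column[] -> StructType conversion before delegating.
    return createTable(ident, CatalogV2Util.v2ColumnsToStructType(columns), partitions, properties);
  }
}
```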
dimas-b added a commit to dimas-b/polaris that referenced this pull request on Sep 19, 2025
Background: apache#2394

Since we have to override the deprecated `createTable` method, we suppress deprecation warnings produced by `javac`. Suppressing `RedundantSuppression` is needed for IntelliJ, which appears to consider this a normal situation and does not issue a deprecation warning.
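A standalone toy example of the two-key suppression that commit message describes (`LegacyApi` and `oldApi` are made up for illustration, not Polaris code):

```java
// A deprecated API declared in a separate top-level class, so that calling it
// from another class triggers javac's deprecation warning.
class LegacyApi {
  @Deprecated
  static void oldApi() {}
}

public class SuppressionKeysExample {

  // "deprecation" silences javac's warning about calling LegacyApi.oldApi();
  // IntelliJ can consider that suppression redundant when its own inspection
  // does not flag the usage, so "RedundantSuppression" is listed as well.
  @SuppressWarnings({"deprecation", "RedundantSuppression"})
  public static void main(String[] args) {
    LegacyApi.oldApi();
  }
}
```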
snazy added a commit to snazy/polaris that referenced this pull request on Nov 20, 2025
* feat: Add Pod Disruption Budget support to Helm chart (apache#2380)
* chore(deps): update quay.io/keycloak/keycloak docker tag to v26.3.3 (apache#2407)
* Mention Helm chart support for PodDisruptionBudget in CHANGELOG.md (apache#2408)
* chore: Suppress javac deprecation warnings in SparkCatalog (apache#2394)

  SparkCatalog intentionally overrides and uses deprecated methods from Spark's TableCatalog. This PR adds suppression annotations to allow for clean compilation given that the deprecated method calls and overrides are clearly expected in this case.

* Python client auto generate (apache#2192)
  * Python client auto generate
  * Python client auto generate
  * Python client auto generate
  * Python client auto generate
  * Python client auto generate
  * Python client auto generate
  * Remove auto generated doc
  * undo
  * Fix doc
  * Fix docker ref from CONTAINER_TOOL to DOCKER
  * Add client help manual to GH action
* Add missing region to MinIO getting-started example (apache#2411)

  The example was missing an AWS region, thus causing Spark to fail with:

  ```
  spark-sql ()> create table ns.t1 as select 'abc';
  25/08/20 16:25:06 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
  software.amazon.awssdk.core.exception.SdkClientException: Unable to load region from any of the providers in the chain software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain@47578c86: [software.amazon.awssdk.regions.providers.SystemSettingsRegionProvider@1656f847: Unable to load region from system settings. Region must be specified either via environment variable (AWS_REGION) or system property (aws.region)., software.amazon.awssdk.regions.providers.AwsProfileRegionProvider@2bbaabe3: No region provided in profile: default, software.amazon.awssdk.regions.providers.InstanceProfileRegionProvider@54b1cfd8: Unable to contact EC2 metadata service.]
  ...
  at org.apache.iceberg.aws.AwsClientFactories$DefaultAwsClientFactory.s3(AwsClientFactories.java:119)
  at org.apache.iceberg.aws.s3.S3FileIO.client(S3FileIO.java:391)
  at org.apache.iceberg.aws.s3.S3FileIO.newOutputFile(S3FileIO.java:193)
  ```

* Add feature config to allow dropping views without purging (apache#2369)

  With tables, the client can decide whether to purge the table on drop or not. However, Polaris Servers used to unconditionally perform the purge on dropping a view. After apache#1619 that behaviour effectively prevents dropping views if the admin user does not set `DROP_WITH_PURGE_ENABLED`. The latter, though, is not currently advisable per apache#1617. This change introduces a new feature configuration (`PURGE_VIEWS_ON_DROP`) that allows the admin user to instruct Polaris servers to drop views without purging, to achieve operational parity with tables. Fixes apache#2367

  * review: rename to PURGE_VIEW_METADATA_ON_DROP
  * review: re-fix description
* Last merged commit c97b150

---------

Co-authored-by: Bryan Maloyer <[email protected]>
Co-authored-by: Mend Renovate <[email protected]>
Co-authored-by: Alexandre Dutra <[email protected]>
Co-authored-by: Dmitri Bourlatchkov <[email protected]>
Co-authored-by: Yong Zheng <[email protected]>
snazy added a commit to snazy/polaris that referenced this pull request on Nov 20, 2025
* Suppress deprecation warnings in `PolarisSparkCatalog.createTable()` (apache#2631)

  Background: apache#2394

  Since we have to override the deprecated `createTable` method, we suppress deprecation warnings produced by `javac`. Suppressing `RedundantSuppression` is needed for IntelliJ, which appears to consider this a normal situation and does not issue a deprecation warning.

* Service: Add Events for PolarisServiceImpl APIs (apache#2482)
* CHANGELOG: Freeze change log for 1.1 and clear out unreleased version (apache#2635)
* Re-add CHANGELOG.md entry for apache#2197 (apache#2638)

  Using `git log -p apache-polaris-1.1.0-incubating..553cb06 -- CHANGELOG.md` to find changes missed in the previous CHANGELOG update (apache#2635)

* Azure: Fix azure expires-at prefix for the credentials refresh (apache#2633)
* Remove unused LOG in SparkCatalog (apache#2639)
* fix(deps): update dependency com.google.errorprone:error_prone_core to v2.42.0 (apache#2636)
* fix(deps): update dependency io.smallrye.config:smallrye-config-core to v3.14.0 (apache#2637)
* Fix client license check (apache#2642)
* fix(deps): update dependency software.amazon.awssdk:bom to v2.34.0 (apache#2645)
* fix(deps): update mockito monorepo to v5.20.0 (apache#2641)
* chore(deps): update docker.io/prom/prometheus docker tag to v3.6.0 (apache#2644)
* chore(events): unify in-memory buffer listeners implementations (apache#2628)
* fix(deps): update quarkus platform and group (apache#2595)
* Update jandex dependency to 3.5.0 (apache#2649)
* Last merged commit e6796f7

---------

Co-authored-by: Dmitri Bourlatchkov <[email protected]>
Co-authored-by: Adnan Hemani <[email protected]>
Co-authored-by: Prashant Singh <[email protected]>
Co-authored-by: Mend Renovate <[email protected]>
Co-authored-by: Yong Zheng <[email protected]>
Co-authored-by: Alexandre Dutra <[email protected]>