support create table like in flink catalog #12199
Conversation
```java
        FlinkCreateTableOptions.CATALOG_TABLE.key(), tablePath.getObjectName());
catalogAndTableProps.put("connector", FlinkDynamicTableFactory.FACTORY_IDENTIFIER);
catalogAndTableProps.putAll(table.properties());
return toCatalogTableWithProps(table, catalogAndTableProps);
```
Could you help me understand why the table properties need to be added here?
We also pass the table as a parameter. Wouldn't that be enough?
Just thinking out loud:
- Maybe the code would be easier to read if we sent only the catalogProps to toCatalogTableWithProps and created the merged map when calling the Flink method.
- This is somewhat suboptimal, as we create an extra map.

Even if we decide to follow your approach, the parameter name should reflect that in the declaration of toCatalogTableWithProps, and some comments or javadoc would be nice there for future generations 😉
WDYT?
Yeah, merged it here, as merging later on would need an extra map.
Renamed the method to toCatalogTableWithCustomProps and modified the parameter names. Hope it's more readable now.
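Pulling the thread together, a minimal sketch of the agreed-upon shape, assuming the surrounding getTable() context from the diff above (the exact argument list and preceding puts are reconstructions, not the verbatim PR code):

```java
// Sketch only: merge the catalog-identifying options and the Iceberg table's own
// properties into a single map, then build the Flink CatalogTable from it.
Map<String, String> catalogAndTableProps = Maps.newHashMap();
catalogAndTableProps.put(
    FlinkCreateTableOptions.CATALOG_TABLE.key(), tablePath.getObjectName());
catalogAndTableProps.put("connector", FlinkDynamicTableFactory.FACTORY_IDENTIFIER);
catalogAndTableProps.putAll(table.properties());
return toCatalogTableWithCustomProps(table, catalogAndTableProps);
```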
```java
} else if (!("connector".equalsIgnoreCase(entry.getKey())
    || FlinkCreateTableOptions.SRC_CATALOG_PROPS_KEY.equalsIgnoreCase(entry.getKey()))) {
```
It took me some time to decipher this 😄
Could we please break up the negation so it doesn't span multiple lines? Maybe:

```java
} else if (!"connector".equalsIgnoreCase(entry.getKey())
    && !FlinkCreateTableOptions.SRC_CATALOG_PROPS_KEY.equalsIgnoreCase(entry.getKey())) {
```

Or is it just me?
Also, a comment here would be nice.
```java
String srcCatalogProps =
    FlinkCreateTableOptions.toJson(
        getName(), tablePath.getDatabaseName(), tablePath.getObjectName(), catalogProps);

ImmutableMap.Builder<String, String> mergedProps = ImmutableMap.builder();
mergedProps.put("connector", FlinkDynamicTableFactory.FACTORY_IDENTIFIER);
mergedProps.put(FlinkCreateTableOptions.SRC_CATALOG_PROPS_KEY, srcCatalogProps);
mergedProps.putAll(table.properties());
```
Could we please add a comment here noting that we store the catalog properties in the merged property list to work around the Flink API limitations?
```java
tableProps.forEach(flinkConf::setString);

String catalogName = flinkConf.getString(CATALOG_NAME);
Map<String, String> mergedProps = mergeSrcCatalogProps(tableProps);
```
Either here or in the javadoc for mergeSrcCatalogProps, please describe what we are doing, and why.
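For future readers, here is a hypothetical sketch of what such a documented mergeSrcCatalogProps could look like. The fromJson helper, the accessor names, and the CATALOG_* constants are assumptions modeled on the toJson call and option names seen elsewhere in this PR, not the actual implementation:

```java
/**
 * Unpacks the source catalog configuration that FlinkCatalog.getTable() packed
 * into a single JSON-valued option under SRC_CATALOG_PROPS_KEY. Flink's
 * CREATE TABLE ... LIKE only copies flat string options, so the catalog name,
 * database, table, and catalog properties travel as one serialized value and
 * are restored here before the factory connects to the source Iceberg catalog.
 */
private static Map<String, String> mergeSrcCatalogProps(Map<String, String> tableProps) {
  String srcCatalogJson = tableProps.get(FlinkCreateTableOptions.SRC_CATALOG_PROPS_KEY);
  if (srcCatalogJson == null) {
    // The table was not created via CREATE TABLE ... LIKE; nothing to merge.
    return tableProps;
  }

  // fromJson(...) and the accessors below are assumed counterparts of the
  // toJson(name, db, table, props) call seen earlier in this PR.
  FlinkCreateTableOptions src = FlinkCreateTableOptions.fromJson(srcCatalogJson);
  Map<String, String> mergedProps = Maps.newHashMap(tableProps);
  mergedProps.remove(FlinkCreateTableOptions.SRC_CATALOG_PROPS_KEY);
  mergedProps.put(FlinkCreateTableOptions.CATALOG_NAME.key(), src.catalogName());
  mergedProps.put(FlinkCreateTableOptions.CATALOG_DATABASE.key(), src.catalogDatabase());
  mergedProps.put(FlinkCreateTableOptions.CATALOG_TABLE.key(), src.catalogTable());
  mergedProps.putAll(src.catalogProps());
  return mergedProps;
}
```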
pvary
left a comment
I generally like this approach.
Please add some comments to the code for future developers.
Otherwise looks good to me.
Added comments to the code.
```java
        .noDefaultValue()
        .withDescription("Properties for the underlying catalog for iceberg table.");

public static final String SRC_CATALOG_PROPS_KEY = "src-catalog";
```
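To make the mechanism concrete, here is a sketch of the resulting table options, assuming a JSON layout matching the toJson(name, db, table, props) call shown earlier. The field names and values are illustrative, not taken from the PR:

```java
// Illustrative only: a table created via CREATE TABLE ... LIKE carries the source
// catalog configuration packed into one flat option, roughly like:
//
//   'connector'   = 'iceberg'
//   'src-catalog' = '{"catalog-name": "iceberg_catalog",
//                     "catalog-database": "db",
//                     "catalog-table": "source_table",
//                     "catalog-props": {"catalog-type": "hive", "uri": "thrift://..."}}'
//
// The dynamic table factory later unpacks 'src-catalog' to reconnect to the
// source Iceberg catalog (see the mergeSrcCatalogProps sketch above).
```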
Also, does Flink CREATE TABLE ... LIKE work for Hive tables as a source? I assume not; otherwise, we could refer to how that is implemented.
It feels tacky to use a special property key to carry over the catalog props and the source table identifier. Does it make sense to modify the Flink API to support this more elegantly?
Right, this is not supported in Hive:
https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/dev/table/hive-compatibility/hive-dialect/create/
It is supported in Kafka, but there is no concept of a catalog there; the Kafka connector mostly uses flat cluster properties.
Sure, we can look into Flink API changes for the long term, but since not all connectors have a concept of a catalog or a hierarchy like Iceberg, I'm not sure how that would work out.
Yeah, Kafka doesn't have a catalog concept. We can take this discussion separately; it won't be a blocker for this PR.
stevenzwu
left a comment
waiting for CI to complete
thanks @swapna267 for the contribution and @pvary for the review

Thanks @stevenzwu and @pvary for the review.
…1.18 and v1.19 backports #12199
This PR supports creating a dynamic Iceberg table in a Flink catalog from the underlying physical Iceberg table using the LIKE clause.
Currently (without the changes in this PR), creating a table in a Flink catalog works by configuring the Flink connector as described in
flink-connector
but that requires the user to provide the schema for the table. A way to tackle that is CREATE TABLE ... LIKE, using a DDL such as the sketch below.
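The DDL itself is not included in this excerpt; the following is a minimal sketch of what it could look like, driven through Flink's Table API. The catalog, database, and table names and the metastore URI are placeholders, not values from the PR:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CreateTableLikeExample {
  public static void main(String[] args) {
    TableEnvironment tEnv =
        TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

    // Register an Iceberg catalog backed by a Hive metastore (placeholder URI).
    tEnv.executeSql(
        "CREATE CATALOG iceberg_catalog WITH ("
            + "'type'='iceberg', 'catalog-type'='hive', 'uri'='thrift://localhost:9083')");

    // Create a table in Flink's default (in-memory) catalog from the physical
    // Iceberg table, copying its schema and options. With this PR, getTable()
    // also returns the connector and catalog-identifying options, so they no
    // longer have to be spelled out in a WITH clause here.
    tEnv.executeSql(
        "CREATE TABLE default_catalog.default_database.my_table "
            + "LIKE iceberg_catalog.db.source_table");
  }
}
```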
Previously, options like connector, catalog-name, catalog-database, and catalog-table had to be duplicated, because the Iceberg FlinkCatalog didn't return any catalog-related properties from getTable. This PR addresses the issue by including these properties when getTable is called, so Flink can use them when creating the table in the Flink catalog.
Previous discussion related to this feature is in PR #12116.