[HUDI-5585][flink] Fix: when Flink creates and writes a table, Spark alter table reports an error #7706
Conversation
Thanks for the fix @waywtdcc, can we describe at a high level what we are fixing here?
This is a Spark and Flink metadata incompatibility issue: a table created by Flink cannot be altered by Spark.
schema = HiveSchemaUtils.convertTableSchema(hiveTable);
}
org.apache.flink.table.api.Schema resultSchema = DataTypeUtils.dropIfExistsColumns(schema, HoodieRecord.HOODIE_META_COLUMNS_WITH_OPERATION);
In line 419, we already ignore the metadata columns, so why drop them again?
Indeed, I have removed it.
HUDI-5585.patch.zip
Then rebase with the latest master code and force push?
The two failed tests are unrelated to this patch; would merge it soon~
…alter table reports an error (#7706) Co-authored-by: danny0405 <yuzhao.cyz@gmail.com>
Which released version includes this fix? Does the Hudi 0.12 version have to stick with this problem?
0.13.0
Change Logs
Fix: when Flink creates and writes a table, Spark alter table reports an error.
After the Flink Hive catalog creates a table, the Hive metadata does not include the Hudi meta fields, such as _hoodie_commit_time. However, tables created by Spark do include these fields in the Hive metadata, which leads to metadata incompatibility issues.
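The mismatch can be illustrated with a minimal, self-contained sketch (plain Java, no Hudi dependency). The column names below are the standard Hudi meta columns, but `dropMetaColumns` is a hypothetical stand-in for the `DataTypeUtils.dropIfExistsColumns` call used in the actual fix:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class MetaColumnSketch {
    // The Hudi metadata columns that a Spark-created table records in Hive,
    // while the Flink Hive catalog omits them (mirrors HoodieRecord.HOODIE_META_COLUMNS;
    // HOODIE_META_COLUMNS_WITH_OPERATION additionally contains _hoodie_operation).
    static final List<String> META_COLUMNS = Arrays.asList(
            "_hoodie_commit_time", "_hoodie_commit_seqno",
            "_hoodie_record_key", "_hoodie_partition_path", "_hoodie_file_name");

    // Hypothetical helper: drop the meta columns if present, so that both
    // engines see the same logical (data-only) schema.
    static List<String> dropMetaColumns(List<String> schemaColumns) {
        return schemaColumns.stream()
                .filter(c -> !META_COLUMNS.contains(c))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // A Spark-created Hive table carries the meta columns plus data columns.
        List<String> sparkSchema = Arrays.asList(
                "_hoodie_commit_time", "_hoodie_commit_seqno",
                "_hoodie_record_key", "_hoodie_partition_path",
                "_hoodie_file_name", "id", "name");
        System.out.println(dropMetaColumns(sparkSchema)); // prints [id, name]
    }
}
```

Normalizing the schema on the read path this way means the two engines agree on the column list regardless of which one created the table.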
Impact
Fixes the error reported by Spark alter table on tables created and written by Flink.
Risk level (write none, low medium or high below)
low
Documentation Update
Contributor's checklist