Flink: Upgrade to flink 1.13.2 #3116
Conversation
+1 to this. We have a number of customized options for things like AWS S3 / IAM etc. that go on catalogs, and we need to be able to support them.
Does this mean we're dropping support entirely for Flink 1.12 in the next release (and effectively in the master branch)? That seems potentially premature. I thought that the Flink 1.13 interface added new methods alongside the old ones. Is it possible we can do the same (continue to implement both interfaces, or have two separate classes) to keep support for Flink 1.12, @Flyangz?
We do not have a plan to upgrade Flink to 1.13 in the near term; is there a way to support both the 1.12 and 1.13 versions?
I believe that @zhangjun0x01's original PR supported both Flink 1.12 and Flink 1.13. Like I mentioned, dropping support for Flink 1.12 is a big decision in my opinion, and should be brought up on the mailing list or at the community sync-up. Does my idea of continuing to implement both interfaces seem workable?
Force-pushed from cc38d44 to ff57f9b.
This committed code implements both interfaces.
Oh, that's rather unfortunate. I had hoped that Flink 1.13 would use the new interface, if present, while allowing the old interface to also be implemented for backwards compatibility. Perhaps somebody more involved with the Flink community than myself can comment on whether there is, or could be, a flag to tell Flink to prefer the new interface first (when present).

I would suggest implementing both interfaces for now (and grouping the functions by which interface they belong to), so as to keep support for Flink 1.12. While this won't give us the benefit of the new API, we can upgrade to Flink 1.13.2 and then weigh the importance of that later on. I would still implement the new API functions though (since it's already been done), and then we can track that in a follow-up issue. We might indeed decide dropping support for 1.12 is worth it, but we can't make that decision on our own.

EDIT: Perhaps it can be broken up into two different classes, and then when we build the jar only one META-INF file is placed on the class path depending on the Flink version. Seems really messy, but possibly we can consider this down the road. Any thoughts on this @stevenzwu @openinx?
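To make the "one class, both interfaces" idea concrete, here is a minimal hypothetical sketch (not the committed Iceberg code). It only compiles against Flink 1.13, since `CatalogFactory.Context` does not exist in 1.12 — which is exactly the compatibility problem discussed here. The `"iceberg"` identifier and the `"type"`/`"*"` values are illustrative:

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.factories.CatalogFactory;

// Hypothetical sketch: one factory class implementing both the new (1.13)
// and the deprecated (1.12-era) discovery stacks.
public class DualStackCatalogFactorySketch implements CatalogFactory {

  // --- New 1.13 stack (discovered via org.apache.flink.table.factories.Factory) ---

  @Override
  public String factoryIdentifier() {
    return "iceberg"; // illustrative identifier
  }

  @Override
  public Set<ConfigOption<?>> requiredOptions() {
    return Collections.emptySet();
  }

  @Override
  public Set<ConfigOption<?>> optionalOptions() {
    return Collections.emptySet();
  }

  @Override
  public Catalog createCatalog(Context context) {
    // Delegate to the old entry point so both stacks share one code path.
    return createCatalog(context.getName(), context.getOptions());
  }

  // --- Old stack (discovered via org.apache.flink.table.factories.TableFactory) ---

  @Override
  public Map<String, String> requiredContext() {
    return Collections.singletonMap("type", "iceberg");
  }

  @Override
  public List<String> supportedProperties() {
    return Collections.singletonList("*");
  }

  @Override
  public Catalog createCatalog(String name, Map<String, String> properties) {
    // The real factory would build an Iceberg catalog from the property map.
    throw new UnsupportedOperationException("sketch only");
  }
}
```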
@@ -1,7 +1,7 @@
org.slf4j:* = 1.7.25
org.apache.avro:avro = 1.10.1
org.apache.calcite:* = 1.10.0
Do we know if our calcite version needs to be upgraded?
The Apache Flink runtime won't depend on this calcite jar; it was introduced for Hive before. So I think we don't need to change the calcite version for the Flink upgrade.
public static void assertThrowsRootCause(String message,
                                         Class<? extends Exception> expected,
                                         String containedInMessage,
                                         Runnable runnable) {
Nit: it's not the correct Iceberg indent. I think you will need to configure your IDE by following this: https://iceberg.apache.org/community/#setting-up-ide-and-code-style
fixed it.
}
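For context, here is a sketch of what a helper with this signature plausibly does; the body below is guessed from the name and parameters, not copied from the Iceberg code:

```java
import org.junit.Assert;

public class AssertHelpersSketch {

  // Hypothetical body: run the runnable, unwrap the exception's cause chain
  // to its root, then assert the root cause's type and message substring.
  public static void assertThrowsRootCause(String message,
                                           Class<? extends Exception> expected,
                                           String containedInMessage,
                                           Runnable runnable) {
    try {
      runnable.run();
      Assert.fail(message + ": expected " + expected.getName() + " but nothing was thrown");
    } catch (Exception actual) {
      Throwable root = actual;
      while (root.getCause() != null && root.getCause() != root) {
        root = root.getCause();
      }
      Assert.assertTrue(message + ": root cause should be " + expected.getName(),
          expected.isInstance(root));
      Assert.assertTrue(message + ": root cause message should contain '" + containedInMessage + "'",
          root.getMessage() != null && root.getMessage().contains(containedInMessage));
    }
  }
}
```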
@Override
public Map<String, String> requiredContext() {
Since requiredContext and supportedProperties have been marked as deprecated, I'd suggest implementing these in requiredOptions & optionalOptions instead.
Since the Context class is missing in 1.12, I think we can only implement these deprecated methods.
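In contrast to the dual-stack sketch above, a minimal sketch of the deprecated-only shape this reply describes (hypothetical; the `"type"` key and `"iceberg"`/`"*"` values are illustrative). On 1.12 these are the required TableFactory methods, and on 1.13 they override the deprecated defaults, so a single class never references the 1.13-only Context type:

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.factories.CatalogFactory;

// Hypothetical sketch: only the 1.12-era (deprecated-in-1.13) methods
// are implemented, keeping the class loadable on both Flink versions.
public class DeprecatedStackCatalogFactorySketch implements CatalogFactory {

  @Override
  public Map<String, String> requiredContext() {
    // Matched against the catalog's 'type' option during discovery.
    return Collections.singletonMap("type", "iceberg");
  }

  @Override
  public List<String> supportedProperties() {
    // "*" lets arbitrary, unvalidated options through -- this is what keeps
    // the customized catalog options mentioned above working.
    return Collections.singletonList("*");
  }

  @Override
  public Catalog createCatalog(String name, Map<String, String> properties) {
    // The real factory would build an Iceberg catalog from the property map.
    throw new UnsupportedOperationException("sketch only");
  }
}
```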
org.apache.iceberg.flink.FlinkDynamicTableFactory
org.apache.iceberg.flink.FlinkDynamicTableFactory
org.apache.iceberg.flink.FlinkCatalogFactory
Should we remove this file (org.apache.flink.table.factories.TableFactory) if we plan to upgrade the Flink version to 1.13.2 directly?
If we only support 1.13, then it is OK to remove it. But if we want to be compatible with 1.12, FlinkCatalogFactory needs to stay registered as a TableFactory: the Context class is missing in 1.12, so without implementing createCatalog(Context context), placing FlinkCatalogFactory in org.apache.flink.table.factories.Factory will throw an exception.
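In file terms, the layout this comment implies would look roughly like this (which factory sits in which service file is inferred from the explanation above, not copied from the PR):

```
# META-INF/services/org.apache.flink.table.factories.Factory
# (new discovery -- safe for the dynamic table factory, which exists in both versions)
org.apache.iceberg.flink.FlinkDynamicTableFactory

# META-INF/services/org.apache.flink.table.factories.TableFactory
# (old discovery -- keeps FlinkCatalogFactory working on Flink 1.12)
org.apache.iceberg.flink.FlinkCatalogFactory
```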
@kbendick @Flyangz According to the last discussion on the mailing list, I think we'd better keep the same strategy as the Apache Spark plan; users from the Apache Iceberg community have also asked to support both Flink 1.12 & Flink 1.13. Since the difference between Flink 1.12 & 1.13 is not so large, I think it's possible to support both of them in a single flink module (I mean we don't need to separate them into two different modules).
In order to be compatible with both Flink 1.12 and 1.13, this commit only implements the deprecated createCatalog.
Does the deprecated createCatalog work for both Flink 1.12 & Flink 1.13?
I think so; at least it passes all the unit tests in the iceberg-flink module compiled with 1.12 or 1.13.
Okay, so are there any other problems with guaranteeing compatibility for both Flink 1.12 & Flink 1.13 in your own context? I think we will need to make the whole unit test suite work for both Flink 1.12 & Flink 1.13; the work should be similar to this PR: 111fe81. I think it's good to make that a separate PR!
openinx left a comment
@Flyangz Please file a new issue to address the unit test work, and attach the issue to the [Priority 1] Flink: Upgrade to 1.13.2 project.
Sorry for the late reply. Do you mean the #3183 issue that you have already created?
@Flyangz Yes, this is the correct issue. You can just assign it to yourself.
This job is for #2558 and is completed on the basis of #2629, so thanks to @zhangjun0x01 for the contribution.
Below are some explanations:
FlinkCatalogFactory.createCatalog() is kept to be compatible with the old implementation, and this way we can use customized options as before without pre-settings.
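As a usage illustration of that pass-through behavior (a hypothetical snippet; the option keys other than 'type' are examples, not a verified list):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CreateCatalogSketch {
  public static void main(String[] args) {
    TableEnvironment env =
        TableEnvironment.create(EnvironmentSettings.newInstance().build());

    // Every option in WITH (...) lands in the Map<String, String> handed to
    // FlinkCatalogFactory.createCatalog(name, properties), so customized
    // keys (e.g. storage/credentials settings) need no pre-registration.
    env.executeSql(
        "CREATE CATALOG my_iceberg WITH ("
            + " 'type'='iceberg',"
            + " 'catalog-type'='hadoop',"
            + " 'warehouse'='s3://my-bucket/warehouse'"
            + ")");
  }
}
```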