[SPARK-53812][SDP] Refactor DefineDataset and DefineFlow protos to group related properties and future-proof #52532
Conversation
SCHJonathan left a comment:
mostly LGTM with one minor comment
```protobuf
optional SourceCodeLocation source_code_location = 6;

oneof details {
  WriteRelationFlowDetails relation_flow_details = 7;
```
Open to discussion: wdyt about defining an `ImplicitFlowDetails` that only has `relation`, and a `StandaloneFlowDetails` that has `flow_name`, `relation`, and potentially an optional `bool once` in the future?
The way I had been thinking about it, the "details" are conceptually independent from whether the flow is implicitly specified as part of the dataset definition. The details describe the computation that happens during flow execution.
Even though our current APIs don't permit it, if we add new flow types in the future, I think we might want the option to specify them either...
- As part of the statement that defines the table
- As a standalone flow
E.g.

```sql
CREATE STREAMING TABLE t1
AS MERGE ...
```

or

```sql
CREATE STREAMING TABLE t1

CREATE FLOW f1
AS MERGE INTO t1 ...
```

And it would be good to only need to add one `MergeIntoFlowDetails` message in this case.
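Sketching that idea (all names and fields below are hypothetical, not actual protos), a single details message could be defined once and shared by both definition styles:

```protobuf
// Hypothetical sketch: one MergeIntoFlowDetails message, defined once.
// Whether the flow was created implicitly (as part of CREATE STREAMING TABLE)
// or as a standalone CREATE FLOW would be tracked separately, so this message
// would not need implicit/standalone variants.
message MergeIntoFlowDetails {
  // The source relation to merge into the target table.
  optional spark.connect.Relation source = 1;
  // The merge condition, e.g. a SQL boolean expression.
  optional string merge_condition = 2;
}
```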
> The details describe the computation that happens during flow execution.
Agreed, and the direction I’m proposing is aligned with that philosophy. The idea is to categorize flow definitions with a finer granularity while keeping the schema extensible for future flow types (e.g., MergeIntoFlow). This approach preserves the semantic separation between implicit and standalone flows while allowing us to evolve the protocol incrementally as new flow behaviors emerge.
```protobuf
message DefineFlow {
  ...
  oneof details {
    // Can be planned as a CompleteFlow, StreamingFlow, ExternalSinkFlow, or
    // ForeachBatchSinkFlow, depending on the dataset it writes to.
    ImplicitFlow implicit_flow = 1;
    // Can either be an AppendFlow or an UpdateFlow.
    StandaloneFlow standalone_flow = 2;
    AutoCdcFlow auto_cdc_flow = 3;
    AutoCdcFromSnapshotFlow auto_cdc_from_snapshot_flow = 4;
    Any extension = 999;
  }

  // An implicit flow only has the unresolved logical plan; it doesn't have a
  // user-specified flow name, as the name is inferred from the dataset it
  // writes to.
  message ImplicitFlow {
    optional spark.connect.Relation relation = 1;
  }

  message StandaloneFlow {
    optional spark.connect.Relation relation = 1;
    // User-specified flow name for a standalone flow.
    optional string name = 2;
    // Whether it is a one-time operation.
    optional bool once = 3;
    // Can either be Append or Update.
    optional string output_mode = 4;
  }
}
```
What I mean is that I want to avoid something like...

```protobuf
oneof details {
  StandaloneWriteRelationFlowDetails standalone_write_relation = 1;
  ImplicitWriteRelationFlowDetails implicit_write_relation = 2;
  StandaloneMergeIntoFlowDetails standalone_merge_into = 3;
  ImplicitMergeIntoFlowDetails implicit_merge_into = 4;
  ...
}
```
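In other words (an illustrative sketch only; field names and numbers here are hypothetical), the implicit/standalone distinction would live outside the `details` oneof, so each new computation type adds exactly one entry rather than a pair:

```protobuf
message DefineFlow {
  // Set only for standalone flows; implicit flows infer their name
  // from the dataset they write to.
  optional string flow_name = 1;

  // One entry per computation type, regardless of how the flow is defined.
  oneof details {
    WriteRelationFlowDetails write_relation = 2;
    MergeIntoFlowDetails merge_into = 3;  // hypothetical future addition
  }
}
```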
dongjoon-hyun left a comment:
+1, LGTM from my side. Thank you, @sryza and all.
Could you resolve the conflicts, @sryza?
(force-pushed from 9cef21b to a3c61f7)
Thank you for rebasing.
(force-pushed from a3c61f7 to db20995)
Oh, @sryza. In general, we don't delete ASF community assets without the author's permission, even though we have super-user permissions in GitHub. It's like the ASF mailing list, which we use as an archive for community communication. If something is inappropriate for the community, it's better to ask the authors to delete those comments themselves.
@dongjoon-hyun my bad! In this case, I talked to @SCHJonathan before deleting. In the future, I'll leave it to others to manage their own comments and avoid manually doing anything like this myself.
Ya, I thought exactly that. No problem at all. I just wanted to leave a note to clarify, because those kinds of records can be misleading in the community. 😄
Fixed the conflicts – now merging to master
Thank you!
…th `4.1.0-preview3` RC1

### What changes were proposed in this pull request?

This PR aims to update Spark Connect-generated Swift source code with Apache Spark `4.1.0-preview3` RC1.

### Why are the changes needed?

There are many changes between Apache Spark 4.1.0-preview2 and preview3.
- apache/spark#52685
- apache/spark#52613
- apache/spark#52553
- apache/spark#52532
- apache/spark#52517
- apache/spark#52514
- apache/spark#52487
- apache/spark#52328
- apache/spark#52200
- apache/spark#52154
- apache/spark#51344

To use the latest bug fixes and new messages to develop new features for `4.1.0-preview3`.

```
$ git clone -b v4.1.0-preview3 https://github.com/apache/spark.git
$ cd spark/sql/connect/common/src/main/protobuf/
$ protoc --swift_out=. spark/connect/*.proto
$ protoc --grpc-swift_out=. spark/connect/*.proto

// Remove empty GRPC files
$ cd spark/connect
$ grep 'This file contained no services' * | awk -F: '{print $1}' | xargs rm
```

### Does this PR introduce _any_ user-facing change?

Pass the CIs.

### How was this patch tested?

Pass the CIs. I manually tested with `Apache Spark 4.1.0-preview3` (with the two SDP ignored tests).

```
$ swift test --no-parallel
...
✔ Test run with 203 tests in 21 suites passed after 19.088 seconds.
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #252 from dongjoon-hyun/SPARK-54043.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
…oup related properties and future-proof

### What changes were proposed in this pull request?

- In `DefineDataset`, pulls out all the properties that are table-specific into their own sub-message.
- In `DefineFlow`, pulls out the `relation` property into its own sub-message.

### Why are the changes needed?

The `DefineDataset` and `DefineFlow` Spark Connect protos are moshpits of properties that could be refactored into a more coherent structure:
- In `DefineDataset`, there are a set of properties that are only relevant to tables (not views). They can be
- In `DefineFlow`, the `relation` property refers to flows that write the results of a relation to a target table. In the future, we may want to introduce additional flow types that mutate the target table in different ways.

### Does this PR introduce _any_ user-facing change?

No, these protos haven't been shipped yet.

### How was this patch tested?

Updated existing tests.

### Was this patch authored or co-authored using generative AI tooling?

Closes apache#52532 from sryza/table-flow-details.

Lead-authored-by: Sandy Ryza <[email protected]>
Co-authored-by: Sandy Ryza <[email protected]>
Signed-off-by: Sandy Ryza <[email protected]>
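A rough sketch of the shape this refactor gives the protos. The `source_code_location`/`relation_flow_details` fields and the `WriteRelationFlowDetails` name appear in the PR diff; everything else below (sub-message names, fields, numbers) is illustrative, not the actual definitions:

```protobuf
message DefineDataset {
  optional string dataset_name = 1;

  // Properties that only make sense for tables (not views), grouped into
  // their own sub-message. Field names here are hypothetical.
  optional TableDetails table_details = 2;

  message TableDetails {
    map<string, string> table_properties = 1;
    repeated string partition_cols = 2;
  }
}

message DefineFlow {
  optional string flow_name = 1;
  optional SourceCodeLocation source_code_location = 6;

  // The relation moves into its own sub-message inside a oneof, leaving
  // room for future flow types that mutate the target table differently.
  oneof details {
    WriteRelationFlowDetails relation_flow_details = 7;
  }

  message WriteRelationFlowDetails {
    optional spark.connect.Relation relation = 1;
  }
}
```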