[SPARK-52779][SQL][CONNECT] Support TimeType literal in Connect #51462
Conversation
ping @MaxGekk to take a look, please.
@peter-toth Could you review this PR since you are working on a similar one: #51464
Can we use SparkDateTimeUtils.localTimeToNanos() here?
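For context, a minimal sketch of the suggested reuse, assuming `SparkDateTimeUtils.localTimeToNanos` takes a `java.time.LocalTime` and returns the nanoseconds elapsed since midnight as a `Long` (the actual call site in this PR may differ):

```scala
import java.time.LocalTime

import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils

// Hypothetical illustration: convert a LocalTime into the nanos-of-day value
// carried by a TimeType literal, using the shared helper instead of computing
// it by hand. Assumes localTimeToNanos(LocalTime): Long exists as suggested above.
val time = LocalTime.of(12, 13, 14)
val nanos: Long = SparkDateTimeUtils.localTimeToNanos(time)
// Expected to match time.toNanoOfDay, i.e. (12 * 3600 + 13 * 60 + 14) * 1000000000L
```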
Ditto.
peter-toth left a comment
LGTM, just minor nits.
+1, LGTM. Merging to master.
Quick question: does it work with the Python client?
Thanks for pointing this out, @zhengruifeng. I think it works with the Python client, but I need to check and add some PySpark tests.
### What changes were proposed in this pull request?
This is the follow-up of #51462 to support TimeType literal in pyspark connect.

### Why are the changes needed?
To align the Python Connect client with the Java/Scala Connect client.

### Does this PR introduce _any_ user-facing change?
Yes, we can use TimeType literal in several ways, for example, `PySparkSession.sql("SELECT TIME '12:13:14'")` and `pyspark.sql.connect.functions.lit(datetime.time(12, 13, 14))`.

### How was this patch tested?
1. Add some local literal convert tests
2. Add pyspark SQL tests

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #51515 from dengziming/SPARK-52779.

Authored-by: dengziming <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
…th `4.1.0-preview2`

### What changes were proposed in this pull request?
This PR aims to update Spark Connect-generated Swift source code with Apache Spark `4.1.0-preview2`.

### Why are the changes needed?
There are many changes from Apache Spark 4.1.0.
- apache/spark#52342
- apache/spark#52256
- apache/spark#52271
- apache/spark#52242
- apache/spark#51473
- apache/spark#51653
- apache/spark#52072
- apache/spark#51561
- apache/spark#51563
- apache/spark#51489
- apache/spark#51507
- apache/spark#51462
- apache/spark#51464
- apache/spark#51442

To use the latest bug fixes and new messages to develop for new features of `4.1.0-preview2`.

```
$ git clone -b v4.1.0-preview2 https://github.com/apache/spark.git
$ cd spark/sql/connect/common/src/main/protobuf/
$ protoc --swift_out=. spark/connect/*.proto
$ protoc --grpc-swift_out=. spark/connect/*.proto

// Remove empty GRPC files
$ cd spark/connect
$ grep 'This file contained no services' *
catalog.grpc.swift:// This file contained no services.
commands.grpc.swift:// This file contained no services.
common.grpc.swift:// This file contained no services.
example_plugins.grpc.swift:// This file contained no services.
expressions.grpc.swift:// This file contained no services.
ml_common.grpc.swift:// This file contained no services.
ml.grpc.swift:// This file contained no services.
pipelines.grpc.swift:// This file contained no services.
relations.grpc.swift:// This file contained no services.
types.grpc.swift:// This file contained no services.
$ rm catalog.grpc.swift commands.grpc.swift common.grpc.swift example_plugins.grpc.swift expressions.grpc.swift ml_common.grpc.swift ml.grpc.swift pipelines.grpc.swift relations.grpc.swift types.grpc.swift
```

### Does this PR introduce _any_ user-facing change?
Pass the CIs.

### How was this patch tested?
Pass the CIs.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #250 from dongjoon-hyun/SPARK-53777.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
What changes were proposed in this pull request?
Support TimeType literal in Spark Connect.
Why are the changes needed?
Part of SPARK-51162, which introduces TimeType.
Does this PR introduce any user-facing change?
Yes, TimeType literals are supported in Connect; for example, we can use functions such as `org.apache.spark.sql.functions.lit()` to pass a time literal, as illustrated in the sketch below.
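A rough usage sketch of this change (not taken from the PR itself; the remote address is a placeholder, and it assumes `lit()` maps a `java.time.LocalTime` to a TimeType literal as described above):

```scala
import java.time.LocalTime

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

// Connect to a Spark Connect server (placeholder address).
val spark = SparkSession.builder().remote("sc://localhost:15002").getOrCreate()

// Pass a TIME literal through the Connect client, via lit() or plain SQL.
val fromLit = spark.range(1).select(lit(LocalTime.of(12, 13, 14)).as("t"))
val fromSql = spark.sql("SELECT TIME '12:13:14' AS t")

fromLit.show()
fromSql.show()
```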
How was this patch tested?
Added some unit tests.
Was this patch authored or co-authored using generative AI tooling?
No.