[SPARK-25496][SQL] Deprecate from_utc_timestamp and to_utc_timestamp #24195
Changes from all commits
791b4e2
8665159
9c00cfb
2f6fcf2
d37483e
1720e5b
a4971e1
9c896b0
c4a4a93
daa1eb7
56e2b3f
eb0fb52
828da8b
036bfd7
7d232d8
c80be0a
de77ac2
8a8d51b
9f57bdc
```diff
@@ -26,11 +26,13 @@ import scala.util.control.NonFatal
 
 import org.apache.commons.lang3.StringEscapeUtils
 
+import org.apache.spark.sql.AnalysisException
 import org.apache.spark.sql.catalyst.InternalRow
 import org.apache.spark.sql.catalyst.expressions.codegen._
 import org.apache.spark.sql.catalyst.expressions.codegen.Block._
 import org.apache.spark.sql.catalyst.util.{DateTimeUtils, TimestampFormatter}
 import org.apache.spark.sql.catalyst.util.DateTimeUtils._
+import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types._
 import org.apache.spark.unsafe.types.{CalendarInterval, UTF8String}
```
|
```diff
@@ -1021,6 +1023,11 @@ case class TimeAdd(start: Expression, interval: Expression, timeZoneId: Option[S
 case class FromUTCTimestamp(left: Expression, right: Expression)
   extends BinaryExpression with ImplicitCastInputTypes {
 
+  if (!SQLConf.get.utcTimestampFuncEnabled) {
+    throw new AnalysisException(s"The $prettyName function has been disabled since Spark 3.0. " +
+      s"Set ${SQLConf.UTC_TIMESTAMP_FUNC_ENABLED.key} to true to enable this function.")
+  }
+
   override def inputTypes: Seq[AbstractDataType] = Seq(TimestampType, StringType)
   override def dataType: DataType = TimestampType
   override def prettyName: String = "from_utc_timestamp"
```
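To illustrate the behavior this guard introduces, here is a minimal sketch of hitting and then bypassing the check from user code. The literal config key is an assumption; the diff only references it as `SQLConf.UTC_TIMESTAMP_FUNC_ENABLED.key` and never shows the string itself.

```scala
import org.apache.spark.sql.SparkSession

object FromUtcTimestampSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("sketch").getOrCreate()

    // With the flag at its default (false), analysis fails with something like:
    //   org.apache.spark.sql.AnalysisException: The from_utc_timestamp function has been
    //   disabled since Spark 3.0. Set <config key> to true to enable this function.
    // spark.sql("SELECT from_utc_timestamp(timestamp '2019-03-25 10:00:00', 'Europe/Berlin')")

    // Assumed key: the literal below is a guess at SQLConf.UTC_TIMESTAMP_FUNC_ENABLED.key,
    // which is not shown in this hunk.
    spark.conf.set("spark.sql.legacy.utcTimestampFunc.enabled", "true")
    spark.sql("SELECT from_utc_timestamp(timestamp '2019-03-25 10:00:00', 'Europe/Berlin')").show()

    spark.stop()
  }
}
```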
```diff
@@ -1227,6 +1234,11 @@ case class MonthsBetween(
 case class ToUTCTimestamp(left: Expression, right: Expression)
   extends BinaryExpression with ImplicitCastInputTypes {
 
+  if (!SQLConf.get.utcTimestampFuncEnabled) {
+    throw new AnalysisException(s"The $prettyName function has been disabled since Spark 3.0. " +
+      s"Set ${SQLConf.UTC_TIMESTAMP_FUNC_ENABLED.key} to true to enable this function.")
+  }
```
Member
@cloud-fan, I don't think this is the right way to deprecate. We should at least throw a warning instead, or, better, have a proper mechanism for deprecating things on the SQL side. Do we really want to add a configuration for every deprecation?

Contributor
As I said, I don't think a warning log is the right way to deprecate a SQL function, since users won't see it. This is not good either, but it is the best I can think of.

Member
Then we should have the right way in place first, before we start doing it differently from what I have done so far.
```diff
+
   override def inputTypes: Seq[AbstractDataType] = Seq(TimestampType, StringType)
   override def dataType: DataType = TimestampType
   override def prettyName: String = "to_utc_timestamp"
```
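The `SQLConf.UTC_TIMESTAMP_FUNC_ENABLED` entry that both guards reference is not part of the hunks shown here. The sketch below shows how such a legacy flag is typically declared in SQLConf.scala; the key string and doc wording are assumptions, and only the builder pattern mirrors how boolean flags are usually defined there.

```scala
// Sketch of the pieces the guards above rely on; key string and doc text are assumptions.

// In object SQLConf: the config entry itself.
val UTC_TIMESTAMP_FUNC_ENABLED = buildConf("spark.sql.legacy.utcTimestampFunc.enabled")
  .doc("When true, the deprecated from_utc_timestamp/to_utc_timestamp functions are allowed.")
  .booleanConf
  .createWithDefault(false)

// In class SQLConf: the accessor used as SQLConf.get.utcTimestampFuncEnabled.
def utcTimestampFuncEnabled: Boolean = getConf(UTC_TIMESTAMP_FUNC_ENABLED)
```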
Ur, @MaxGekk. Usually, deprecation means showing warnings instead of throwing an exception. This looks like a ban to me.

Yea, deprecation is deprecation. I think we can simply deprecate them in functions.scala, and that's all.
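For contrast, the Scala-level deprecation the last comment alludes to would look roughly like the sketch below: annotating the entry points in functions.scala so callers get a compile-time warning instead of an analysis failure. The wrapper object and the message text are illustrative, not Spark's actual wording.

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions

// Illustrative only: in Spark itself the annotation would sit directly on the
// from_utc_timestamp/to_utc_timestamp definitions in functions.scala.
object DeprecatedUtcFunctions {
  @deprecated("from_utc_timestamp is deprecated; rely on the session time zone instead.", "3.0.0")
  def from_utc_timestamp(ts: Column, tz: String): Column = functions.from_utc_timestamp(ts, tz)

  @deprecated("to_utc_timestamp is deprecated; rely on the session time zone instead.", "3.0.0")
  def to_utc_timestamp(ts: Column, tz: String): Column = functions.to_utc_timestamp(ts, tz)
}
```

Calling `DeprecatedUtcFunctions.from_utc_timestamp(...)` then compiles with a deprecation warning while keeping the old behavior, which is the trade-off the reviewers are weighing against the config-gated exception.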