
[SPARK-20909][SQL] Add built-in SQL function - DAYOFWEEK #18134

Closed
wangyum wants to merge 4 commits into apache:master from wangyum:SPARK-20909

Conversation

@wangyum
Member

@wangyum wangyum commented May 28, 2017

What changes were proposed in this pull request?

Add built-in SQL function - DAYOFWEEK

How was this patch tested?

unit tests

@SparkQA

SparkQA commented May 28, 2017

Test build #77478 has finished for PR 18134 at commit 02b62b4.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • case class DayOfWeek(child: Expression) extends UnaryExpression with ImplicitCastInputTypes

Member

@ueshin ueshin left a comment


LGTM except for a minor comment.


select to_timestamp(null), to_timestamp('2016-12-31 00:12:00'), to_timestamp('2016-12-31', 'yyyy-MM-dd');

select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'), dayofweek(null), dayofweek('1582-10-15 13:10:15'); No newline at end of file
Member


nit: Can you add a line break at the end of file?


override protected def nullSafeEval(date: Any): Any = {
c.setTimeInMillis(date.asInstanceOf[Int] * 1000L * 3600L * 24L)
c.get(Calendar.DAY_OF_WEEK)
Member


In WeekOfYear, we set Calendar.MONDAY as the first day of a week. Here it seems we assume it's Calendar.SUNDAY. Is there any conflict we will encounter?
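For reference: java.util.Calendar's DAY_OF_WEEK field always uses the fixed constants SUNDAY = 1 through SATURDAY = 7, while setFirstDayOfWeek only affects week-relative fields such as WEEK_OF_YEAR, so the two expressions should not clash. A quick stdlib-only sketch (not part of the patch):

```scala
import java.util.{Calendar, TimeZone}

// 2017-05-27 (one of the dates in the PR's tests) was a Saturday.
val c = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
c.clear()
c.set(2017, Calendar.MAY, 27) // months are 0-based in Calendar
println(c.get(Calendar.DAY_OF_WEEK)) // 7, i.e. Calendar.SATURDAY

// DAY_OF_WEEK values are unchanged by the first-day-of-week setting:
c.setFirstDayOfWeek(Calendar.MONDAY)
println(c.get(Calendar.DAY_OF_WEEK)) // still 7
```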

Member Author


Keep pace with Hive's DayOfWeek.

}
}

// scalastyle:off line.size.limit
Member


As Sunday and Saturday are included, it is not only weekdays. The description should read "Returns the day of the week ....".
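For illustration, under the Hive-style numbering this patch adopts (1 = Sunday through 7 = Saturday), the dates from the test query above work out as follows; a stdlib-only sketch, independent of the patch itself:

```scala
import java.text.SimpleDateFormat
import java.util.{Calendar, TimeZone}

// Day of the week per java.util.Calendar: 1 = Sunday ... 7 = Saturday.
def dow(date: String): Int = {
  val fmt = new SimpleDateFormat("yyyy-MM-dd")
  fmt.setTimeZone(TimeZone.getTimeZone("UTC"))
  val c = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
  c.setTime(fmt.parse(date))
  c.get(Calendar.DAY_OF_WEEK)
}

println(dow("2007-02-03")) // 7 (Saturday)
println(dow("2009-07-30")) // 5 (Thursday)
println(dow("2017-05-27")) // 7 (Saturday)
```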

@viirya
Member

viirya commented May 30, 2017

LGTM except for one comment about the function description.

@SparkQA

SparkQA commented May 30, 2017

Test build #77514 has finished for PR 18134 at commit 87defdc.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • case class UnresolvedRelation(
  • case class StringReplace(srcExpr: Expression, searchExpr: Expression, replaceExpr: Expression)

@SparkQA

SparkQA commented May 30, 2017

Test build #77513 has finished for PR 18134 at commit dcae776.

  • This patch passes all tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented May 30, 2017

Test build #77515 has finished for PR 18134 at commit 69d4227.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@ueshin
Member

ueshin commented May 30, 2017

Thanks! Merging to master.

@asfgit asfgit closed this in d797ed0 May 30, 2017
@sergiobilello-eb

sergiobilello-eb commented Nov 10, 2017

@ueshin This function is not listed in the API documentation: https://spark.apache.org/docs/2.2.0/api/java/index.html?org/apache/spark/sql/functions.html or https://spark.apache.org/docs/2.1.0/api/java/index.html?org/apache/spark/sql/functions.html
How is that possible? I am trying to use it from spark-sql on a Spark 2.1.0 cluster.
Thanks

@gatorsmile
Member

@sergiobilello-eb This is just a SQL function. You can call it in the SQL interface or via df.select(expr("dayofweek('2009-07-30')")). It is not part of the DataFrame functions. You can submit a PR or report it as an issue to add such an API.

@sergiobilello-eb

sergiobilello-eb commented Nov 11, 2017

Thanks @gatorsmile :) How can I call it from spark-sql?

@sergiobilello-eb

sergiobilello-eb commented Nov 11, 2017

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_131
Branch
Compiled by user jenkins on 2017-06-30T22:58:04Z
Revision
Url
Type --help for more information.


spark-sql> select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'), dayofweek(null), dayofweek('1582-10-15 13:10:15');
17/11/10 16:08:23 INFO SparkSqlParser: Parsing command: select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'), dayofweek(null), dayofweek('1582-10-15 13:10:15')
17/11/10 16:08:23 INFO HiveMetaStore: 0: get_database: default
17/11/10 16:08:23 INFO audit: ugi=sergio.bilello	ip=unknown-ip-addr	cmd=get_database: default
17/11/10 16:08:23 INFO HiveMetaStore: 0: get_database: default
17/11/10 16:08:23 INFO audit: ugi=sergio.bilello	ip=unknown-ip-addr	cmd=get_database: default
17/11/10 16:08:23 INFO HiveMetaStore: 0: get_function: default.dayofweek
17/11/10 16:08:23 INFO audit: ugi=sergio.bilello	ip=unknown-ip-addr	cmd=get_function: default.dayofweek
Error in query: Undefined function: 'dayofweek'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7

@gatorsmile
Member

https://issues.apache.org/jira/browse/SPARK-20909

This is not part of 2.2. Based on the JIRA, it will be included in 2.3.

@sergiobilello-eb

Thanks @gatorsmile! Do you suggest any workaround until then? I mean, without rebuilding Spark with that patch... Can I register my own UDF that contains that logic?

@gatorsmile
Member

Yeah, you always can implement such a UDF.
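A possible shape for such a workaround on 2.2 (the name and the simplified date parsing are illustrative, not from this PR) is a Scala UDF built on the same Calendar logic:

```scala
import java.text.SimpleDateFormat
import java.util.{Calendar, TimeZone}

// Illustrative stand-in for the built-in: 1 = Sunday ... 7 = Saturday.
val dayOfWeekUdf: String => Int = { s =>
  val fmt = new SimpleDateFormat("yyyy-MM-dd")
  fmt.setTimeZone(TimeZone.getTimeZone("UTC"))
  val c = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
  c.setTime(fmt.parse(s.take(10))) // tolerate timestamp-style input
  c.get(Calendar.DAY_OF_WEEK)
}

// With a SparkSession in scope (e.g. in spark-shell):
//   spark.udf.register("dayofweek", dayOfWeekUdf)
//   spark.sql("select dayofweek('2009-07-30')").show()
```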
