[SPARK-33453][SQL][TESTS] Unify v1 and v2 SHOW PARTITIONS tests #30377
Status: Closed. MaxGekk wants to merge 18 commits into apache:master from MaxGekk:unify-dsv1_v2-show-partitions-tests.
Changes from all commits (18 commits)
All commits by MaxGekk:

- 29e5bae Create ShowPartitionsParserSuite
- d7bf651 Move tests to ShowPartitionsParserSuite
- c23048e Add v1/v2 ShowPartitionsSuite
- 87e86b5 Move a view test
- 851929b Add hive.execution.command.ShowPartitionsSuite
- 82432c8 Move tests from HiveCommandSuite
- 2351b64 Move "filter by partitions" to v1 ShowPartitionsSuite
- cc89024 Move the test "show partitions from a datasource"
- a9bcdbb de-dup code
- 38d3c67 Move the test "non-partitioning columns"
- f86f159 Fix "show partitions of not partitioned table"
- 76e6399 Move the test "show partitions of a view"
- 8707d3e Fix v1/ShowPartitionsSuite
- 8fe17a0 Add TODO
- cd04107 not partitioned -> non-partitioned
- 47925f2 Don't override table creation in Hive: USING HIVE
- e3cd5e1 Merge remote-tracking branch 'origin/master' into unify-dsv1_v2-show-…
- 59f2b38 Fix ShowPartitionsParserSuite
...ore/src/test/scala/org/apache/spark/sql/execution/command/ShowPartitionsParserSuite.scala (52 additions, 0 deletions)

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.execution.command

import org.apache.spark.sql.catalyst.analysis.AnalysisTest
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser.parsePlan
import org.apache.spark.sql.catalyst.parser.ParseException
import org.apache.spark.sql.catalyst.plans.logical.ShowPartitionsStatement
import org.apache.spark.sql.execution.SparkSqlParser
import org.apache.spark.sql.test.SharedSparkSession

class ShowPartitionsParserSuite extends AnalysisTest with SharedSparkSession {
  test("SHOW PARTITIONS") {
    Seq(
      "SHOW PARTITIONS t1" -> ShowPartitionsStatement(Seq("t1"), None),
      "SHOW PARTITIONS db1.t1" -> ShowPartitionsStatement(Seq("db1", "t1"), None),
      "SHOW PARTITIONS t1 PARTITION(partcol1='partvalue', partcol2='partvalue')" ->
        ShowPartitionsStatement(
          Seq("t1"),
          Some(Map("partcol1" -> "partvalue", "partcol2" -> "partvalue"))),
      "SHOW PARTITIONS a.b.c" -> ShowPartitionsStatement(Seq("a", "b", "c"), None),
      "SHOW PARTITIONS a.b.c PARTITION(ds='2017-06-10')" ->
        ShowPartitionsStatement(Seq("a", "b", "c"), Some(Map("ds" -> "2017-06-10")))
    ).foreach { case (sql, expected) =>
      val parsed = parsePlan(sql)
      comparePlans(parsed, expected)
    }
  }

  test("empty values in non-optional partition specs") {
    val e = intercept[ParseException] {
      new SparkSqlParser().parsePlan(
        "SHOW PARTITIONS dbx.tab1 PARTITION (a='1', b)")
    }.getMessage
    assert(e.contains("Found an empty partition key 'b'"))
  }
}
```
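The "empty values" test above checks that a partition spec entry with no value (the bare `b` in `PARTITION (a='1', b)`) is rejected at parse time. As a minimal, Spark-free sketch of that validation idea — `PartitionSpecCheck` and `validateSpec` are hypothetical illustrations, not Spark's actual parser code — the rule can be reduced to:

```scala
// Hypothetical reduction of the check the parser test exercises:
// every partition key in a non-optional spec must carry a value;
// the first key without one produces an error mirroring the test's message.
object PartitionSpecCheck {
  def validateSpec(spec: Seq[(String, Option[String])]): Either[String, Map[String, String]] =
    spec.find { case (_, value) => value.isEmpty } match {
      case Some((key, _)) => Left(s"Found an empty partition key '$key'")
      case None => Right(spec.map { case (k, v) => k -> v.get }.toMap)
    }
}

object SpecCheckDemo extends App {
  // Mirrors "SHOW PARTITIONS dbx.tab1 PARTITION (a='1', b)": `b` has no value.
  println(PartitionSpecCheck.validateSpec(Seq("a" -> Some("1"), "b" -> None)))
}
```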
sql/core/src/test/scala/org/apache/spark/sql/execution/command/ShowPartitionsSuiteBase.scala (36 additions, 0 deletions)

```scala
/* Apache License 2.0 header, identical to the one above. */

package org.apache.spark.sql.execution.command

import org.scalactic.source.Position
import org.scalatest.Tag

import org.apache.spark.sql.QueryTest
import org.apache.spark.sql.test.SQLTestUtils

trait ShowPartitionsSuiteBase extends QueryTest with SQLTestUtils {
  protected def version: String
  protected def catalog: String
  protected def defaultNamespace: Seq[String]
  protected def defaultUsing: String

  override def test(testName: String, testTags: Tag*)(testFun: => Any)
      (implicit pos: Position): Unit = {
    super.test(s"SHOW PARTITIONS $version: " + testName, testTags: _*)(testFun)
  }
}
```
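The base trait's pattern — abstract configuration plus a decorated `test` that prefixes every test name with the catalog version — can be illustrated without a Spark or ScalaTest dependency. This is a minimal sketch; `ShowPartitionsLikeBase`, `testName`, and `V1Flavor` are hypothetical names invented for the illustration:

```scala
// Spark-free sketch of the shared-suite pattern used in the patch:
// a base trait declares the per-catalog knobs abstractly and decorates
// test names, and each concrete suite (v1 session catalog, v2 catalogs)
// supplies the specifics.
trait ShowPartitionsLikeBase {
  protected def version: String
  protected def defaultUsing: String

  // Mirrors the overridden test(...) that prepends "SHOW PARTITIONS $version: ".
  def testName(name: String): String = s"SHOW PARTITIONS $version: " + name
}

object V1Flavor extends ShowPartitionsLikeBase {
  override protected def version: String = "V1"
  override protected def defaultUsing: String = "USING parquet"
}

object TraitDemo extends App {
  println(V1Flavor.testName("show everything"))
}
```

This keeps every behavioral test written once in the base trait while the reported test names still say which catalog implementation ran them.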
sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowPartitionsSuite.scala (184 additions, 0 deletions)

```scala
/* Apache License 2.0 header, identical to the one above. */

package org.apache.spark.sql.execution.command.v1

import org.apache.spark.sql.{AnalysisException, Row, SaveMode}
import org.apache.spark.sql.catalyst.analysis.NoSuchTableException
import org.apache.spark.sql.connector.catalog.CatalogManager
import org.apache.spark.sql.execution.command
import org.apache.spark.sql.test.SharedSparkSession

trait ShowPartitionsSuiteBase extends command.ShowPartitionsSuiteBase {
  override def version: String = "V1"
  override def catalog: String = CatalogManager.SESSION_CATALOG_NAME
  override def defaultNamespace: Seq[String] = Seq("default")
  override def defaultUsing: String = "USING parquet"

  private def createDateTable(table: String): Unit = {
    sql(s"""
      |CREATE TABLE $table (price int, qty int, year int, month int)
      |$defaultUsing
      |partitioned by (year, month)""".stripMargin)
    sql(s"INSERT INTO $table PARTITION(year = 2015, month = 1) SELECT 1, 1")
    sql(s"INSERT INTO $table PARTITION(year = 2015, month = 2) SELECT 2, 2")
    sql(s"INSERT INTO $table PARTITION(year = 2016, month = 2) SELECT 3, 3")
    sql(s"INSERT INTO $table PARTITION(year = 2016, month = 3) SELECT 3, 3")
  }

  test("show everything") {
    val table = "dateTable"
    withTable(table) {
      createDateTable(table)
      checkAnswer(
        sql(s"show partitions $table"),
        Row("year=2015/month=1") ::
          Row("year=2015/month=2") ::
          Row("year=2016/month=2") ::
          Row("year=2016/month=3") :: Nil)

      checkAnswer(
        sql(s"show partitions default.$table"),
        Row("year=2015/month=1") ::
          Row("year=2015/month=2") ::
          Row("year=2016/month=2") ::
          Row("year=2016/month=3") :: Nil)
    }
  }

  test("filter by partitions") {
    val table = "dateTable"
    withTable(table) {
      createDateTable(table)
      checkAnswer(
        sql(s"show partitions default.$table PARTITION(year=2015)"),
        Row("year=2015/month=1") ::
          Row("year=2015/month=2") :: Nil)
      checkAnswer(
        sql(s"show partitions default.$table PARTITION(year=2015, month=1)"),
        Row("year=2015/month=1") :: Nil)
      checkAnswer(
        sql(s"show partitions default.$table PARTITION(month=2)"),
        Row("year=2015/month=2") ::
          Row("year=2016/month=2") :: Nil)
    }
  }

  test("show everything more than 5 part keys") {
    val table = "wideTable"
    withTable(table) {
      sql(s"""
        |CREATE TABLE $table (
        |  price int, qty int,
        |  year int, month int, hour int, minute int, sec int, extra int)
        |$defaultUsing
        |PARTITIONED BY (year, month, hour, minute, sec, extra)""".stripMargin)
      sql(s"""
        |INSERT INTO $table
        |PARTITION(year = 2016, month = 3, hour = 10, minute = 10, sec = 10, extra = 1) SELECT 3, 3
        """.stripMargin)
      sql(s"""
        |INSERT INTO $table
        |PARTITION(year = 2016, month = 4, hour = 10, minute = 10, sec = 10, extra = 1) SELECT 3, 3
        """.stripMargin)
      checkAnswer(
        sql(s"show partitions $table"),
        Row("year=2016/month=3/hour=10/minute=10/sec=10/extra=1") ::
          Row("year=2016/month=4/hour=10/minute=10/sec=10/extra=1") :: Nil)
    }
  }

  test("non-partitioning columns") {
    val table = "dateTable"
    withTable(table) {
      createDateTable(table)
      val errMsg = intercept[AnalysisException] {
        sql(s"SHOW PARTITIONS $table PARTITION(abcd=2015, xyz=1)")
      }.getMessage
      assert(errMsg.contains("Non-partitioning column(s) [abcd, xyz] are specified"))
    }
  }

  test("show partitions of non-partitioned table") {
    val table = "not_partitioned_table"
    withTable(table) {
      sql(s"CREATE TABLE $table (col1 int) $defaultUsing")
      val errMsg = intercept[AnalysisException] {
        sql(s"SHOW PARTITIONS $table")
      }.getMessage
      assert(errMsg.contains("not allowed on a table that is not partitioned"))
    }
  }

  test("show partitions of a view") {
    val table = "dateTable"
    withTable(table) {
      createDateTable(table)
      val view = "view1"
      withView(view) {
        sql(s"CREATE VIEW $view as select * from $table")
        val errMsg = intercept[AnalysisException] {
          sql(s"SHOW PARTITIONS $view")
        }.getMessage
        assert(errMsg.contains("is not allowed on a view"))
      }
    }
  }

  test("show partitions of a temporary view") {
    val viewName = "test_view"
    withTempView(viewName) {
      spark.range(10).createTempView(viewName)
      val errMsg = intercept[NoSuchTableException] {
        sql(s"SHOW PARTITIONS $viewName")
      }.getMessage
      assert(errMsg.contains(s"Table or view '$viewName' not found"))
    }
  }
}

class ShowPartitionsSuite extends ShowPartitionsSuiteBase with SharedSparkSession {
  // The test is placed here because it fails with `USING HIVE`:
  // org.apache.spark.sql.AnalysisException:
  //   Hive data source can only be used with tables, you can't use it with CREATE TEMP VIEW USING
  test("issue exceptions on the temporary view") {
    val viewName = "test_view"
    withTempView(viewName) {
      sql(s"""
        |CREATE TEMPORARY VIEW $viewName (c1 INT, c2 STRING)
        |$defaultUsing""".stripMargin)
      val errMsg = intercept[NoSuchTableException] {
        sql(s"SHOW PARTITIONS $viewName")
      }.getMessage
      assert(errMsg.contains(s"Table or view '$viewName' not found"))
    }
  }

  test("show partitions from a datasource") {
    import testImplicits._
    withTable("part_datasrc") {
      val df = (1 to 3).map(i => (i, s"val_$i", i * 2)).toDF("a", "b", "c")
      df.write
        .partitionBy("a")
        .format("parquet")
        .mode(SaveMode.Overwrite)
        .saveAsTable("part_datasrc")

      assert(sql("SHOW PARTITIONS part_datasrc").count() == 3)
    }
  }
}
```
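The expected rows in the suite above all follow a fixed `key=value/key=value` layout, e.g. `Row("year=2015/month=1")`. As a hedged, Spark-free illustration of the format the assertions rely on — `PartitionRow` and `renderSpec` are hypothetical helpers for this sketch, not part of the patch — the string can be built like this:

```scala
// Hypothetical helper mirroring the partition-row format asserted above:
// each partition column becomes "col=value", joined by "/" in
// partition-column declaration order.
object PartitionRow {
  def renderSpec(spec: Seq[(String, String)]): String =
    spec.map { case (col, value) => s"$col=$value" }.mkString("/")
}

object RowDemo extends App {
  // Matches the first expected row of the "show everything" test.
  println(PartitionRow.renderSpec(Seq("year" -> "2015", "month" -> "1")))
}
```

Keeping the expected strings in this one canonical order is what lets the same assertions be shared across v1 and v2 suites.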
Review comments
On "show partitions from a datasource": this seems like it is testing the DataFrameWriter API, not the SHOW PARTITIONS command.
Follow-up: ah, the test was already there. Let's keep it then.