Conversation

@Peng-Lei (Contributor) commented Nov 26, 2021

What changes were proposed in this pull request?

Unify test cases:

  1. Move the test case from DDLParserSuite.scala to ShowCreateTableParserSuite.scala.
  2. Move the test cases from DataSourceV2SQLSuite to v2.ShowCreateTableSuite.
    If a test case also works with V1/Hive, move it to ShowCreateTableSuiteBase.
  3. Move the test cases from sql/ShowCreateTableSuite.scala to v1.ShowCreateTableSuite.
    If a test case also works with Hive, move it to v1.ShowCreateTableSuiteBase.
    If a test case also works with V2/Hive, move it to ShowCreateTableSuiteBase.
  4. Move the test cases from HiveShowCreateTableSuite to hive.ShowCreateTableSuite.
    If a test case also works with V1/V2, move it to ShowCreateTableSuiteBase.
    If a test case also works with V1 only, move it to v1.ShowCreateTableSuiteBase.
  5. Use getShowCreateDDL instead of checkCreateTable to check the results (a rough sketch of the resulting suite layering follows this list).
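
A rough sketch of that layering, using the class names reported by the test build below; the trait body and import paths follow Spark's existing command test suites and are illustrative only:

import org.apache.spark.sql.QueryTest
import org.apache.spark.sql.execution.command.DDLCommandTestUtils

// Common SHOW CREATE TABLE tests shared by every catalog implementation.
trait ShowCreateTableSuiteBase extends QueryTest with DDLCommandTestUtils {
  override val command = "SHOW CREATE TABLE"
  // Test cases that behave identically for DSv1, Hive DSv1 and DSv2 live here,
  // checked via getShowCreateDDL against the expected DDL strings.
}

// Catalog-specific suites (each in its own package) mix the shared base in, e.g.:
//   class ShowCreateTableSuite extends command.ShowCreateTableSuiteBase with CommandSuiteBase  // DSv2
//   class ShowCreateTableSuite extends v1.ShowCreateTableSuiteBase with CommandSuiteBase       // V1 / Hive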

Fix behavior differences:

  1. Add one space after create table xxx in the output of the SHOW CREATE TABLE AS SERDE command to unify the output.
  2. Add one space after OPTIONS in the output of the v2 SHOW CREATE TABLE command to unify the output.

The changes follow the approach of #30287 and #34305.

Why are the changes needed?

  1. The unification allows running common SHOW CREATE TABLE tests for DSv1, Hive DSv1, and DSv2.
  2. We can detect missing features and differences between the DSv1 and DSv2 implementations.

Does this PR introduce any user-facing change?

No

How was this patch tested?

$ build/sbt -Phive-2.3 -Phive-thriftserver "test:testOnly *ShowCreateTableSuite"

@github-actions github-actions bot added the SQL label Nov 26, 2021
@SparkQA commented Nov 26, 2021

Test build #145651 has finished for PR 34719 at commit 47d5ca4.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class ShowCreateTableParserSuite extends AnalysisTest
  • trait ShowCreateTableSuiteBase extends QueryTest with DDLCommandTestUtils
  • class ShowCreateTableSuite extends ShowCreateTableSuiteBase with CommandSuiteBase
  • class ShowCreateTableSuite extends command.ShowCreateTableSuiteBase with CommandSuiteBase
  • class ShowCreateTableSuite extends v1.ShowCreateTableSuiteBase with CommandSuiteBase

@SparkQA commented Nov 26, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50122/

@SparkQA commented Nov 26, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50123/

)
checkCreateTable(table)
}
}
Contributor Author:

Since CLUSTERED BY/SORTED BY cannot be converted into a transform, we could not put this test into ShowCreateTableSuiteBase for V2.
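
For context, a minimal sketch of the kind of bucketed-table DDL involved (table name, columns, and the SparkSession `spark` are assumptions, not the exact test code):

// V1 bucket spec with a sort column; per the comment above it cannot be expressed
// as a V2 partition transform, so this test stays out of the shared base suite.
spark.sql(
  """CREATE TABLE tbl (a INT, b STRING, c INT) USING parquet
    |CLUSTERED BY (a) SORTED BY (b) INTO 2 BUCKETS
    |""".stripMargin)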

"'prop3' = '3',",
"'prop4' = '4')"
))
}
Contributor Author (@Peng-Lei) commented Nov 26, 2021:

Differences from V2:
1. Options are rendered as key value instead of 'key' = 'value'.
2. The partition column comes last among the columns.
3. The location starts with file:.
4. TBLPROPERTIES ( instead of TBLPROPERTIES(.

Member:

Please place the diffs in catalog-specific test suites instead of describing them in text. Also, open JIRAs and add TODOs to fix the differences.

Contributor Author:

@MaxGekk Diff 1 and diff 3 are already resolved. Diff 2 will be resolved by #34780, and diff 4 will be resolved by #34773.

| hours(ts))
""".stripMargin)
// V1 transforms cannot be converted to partition columns: bucket,year,...)
val showDDL = getShowCreateDDL(s"SHOW CREATE TABLE $t")
Contributor Author:

Since the transforms cannot be converted to partition columns for V1, this test is placed here just for V2.
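
For reference, a sketch of the V2-only shape being tested here; the catalog, provider and table names are assumptions, and getShowCreateDDL is the suite helper mentioned above:

// Partition transforms like bucket(...) and hours(...) exist only for V2 catalogs;
// V1 partition columns cannot represent them, so the test lives in v2.ShowCreateTableSuite.
spark.sql(
  """CREATE TABLE testcat.ns1.tbl (id BIGINT, ts TIMESTAMP) USING foo
    |PARTITIONED BY (
    |  bucket(16, id),
    |  hours(ts))
    |""".stripMargin)
val showDDL = getShowCreateDDL("SHOW CREATE TABLE testcat.ns1.tbl")
// The suite then asserts that the transforms round-trip through showDDL.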

@Peng-Lei (Contributor Author):

@MaxGekk @cloud-fan @imback82 Could you take a look? Thank you.

@SparkQA commented Nov 26, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50123/

@SparkQA commented Nov 26, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50122/

@SparkQA commented Nov 26, 2021

Test build #145652 has finished for PR 34719 at commit 22356f7.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class ShowCreateTableParserSuite extends AnalysisTest
  • trait ShowCreateTableSuiteBase extends QueryTest with DDLCommandTestUtils
  • class ShowCreateTableSuite extends ShowCreateTableSuiteBase with CommandSuiteBase
  • class ShowCreateTableSuite extends command.ShowCreateTableSuiteBase with CommandSuiteBase
  • class ShowCreateTableSuite extends v1.ShowCreateTableSuiteBase with CommandSuiteBase

@MaxGekk (Member) commented Nov 29, 2021

@Peng-Lei Thank you for the ping. Will review this PR today.

import org.apache.spark.sql.catalyst.plans.logical.ShowCreateTable

class ShowCreateTableParserSuite extends AnalysisTest {
test("SHOW CREATE table") {
Contributor:

Suggested change
test("SHOW CREATE table") {
test("SHOW CREATE TABLE") {

Contributor Author:

done

}
}

test("DO NOT SUPPORT TEMP VIEW") {
Contributor:

I think this test is better to be in the view suite, as we can run it with different view types automatically.

Contributor:

And we know that this check is done by the analyzer and has nothing to do with the underlying table/catalog.

Contributor Author:

And we know that this check is done by the analyzer and has nothing to do with the underlying table/catalog.

Makes sense.

Contributor Author:

done

override val command = "SHOW CREATE TABLE"

test("SPARK-33892: SHOW CREATE TABLE w/ char/varchar") {
val db = "ns1"
Contributor Author:

I will move this test case into CharVarcharDDLTestBase later.

"`c` BIGINT,",
"`extraCol` ARRAY<INT>,",
"`<another>` STRUCT<`x`: INT, `y`: ARRAY<BOOLEAN>>,",
"`a` BIGINT NOT NULL)",
Contributor:

This looks like a bug in v1, as Spark silently changes the table schema from what the user specified.

Contributor:

@Peng-Lei Can you confirm the actual table schema? Maybe it's just a bug in SHOW CREATE TABLE.

Contributor Author:

ok.

Contributor Author:

@cloud-fan I added the following code after creating the table:

      val actual = spark.sharedState.externalCatalog.getTable("default", t)
      println(actual)

The result is as follows:

CatalogTable(
Database: default
Table: tbl
Created Time: Tue Nov 30 03:34:05 PST 2021
Last Access: UNKNOWN
Created By: Spark 3.3.0-SNAPSHOT
Type: EXTERNAL
Provider: parquet
Comment: This is a comment
Table Properties: [prop1=1, prop2=2, prop3=3, prop4=4]
Location: file:/tmp
Storage Properties: [from=0, to=1, via=2]
Partition Provider: Catalog
Partition Columns: [`a`]
Schema: root
 |-- b: long (nullable = true)
 |-- c: long (nullable = true)
 |-- extraCol: array (nullable = true)
 |    |-- element: integer (containsNull = true)
 |-- <another>: struct (nullable = true)
 |    |-- x: integer (nullable = true)
 |    |-- y: array (nullable = true)
 |    |    |-- element: boolean (containsNull = true)
 |-- a: long (nullable = false)
)

Contributor Author:

Do you think this is a bug in v1, with the wrong column order in the schema? If so, I'd like to try to resolve it.

Contributor:

yea I think so

Contributor Author:

@cloud-fan This problem may take me a bit longer to fix. I'd like to finish this PR first, and I have added a TODO in the code.

Comment on lines 40 to 41
val db = "ns1"
val table = "tbl"
Member:

Just inline db and table into withNamespaceAndTable since they are used only once.

Contributor Author:

done

}
}

test("DO NOT SUPPORT TEMP VIEW") {
Member:

Why is it in upper case? Could you align the test titles with the other unified test suites?

Contributor Author:

done.


test("DO NOT SUPPORT TEMP VIEW") {
val viewName = "spark_28383"
withTempView(viewName) {
Member:

Do you really test all catalogs this way?

Contributor Author:

No, I will change it and put it into SQLViewSuite. It already fails in the analysis phase, before the resolution phase.

test("SPARK-36012: ADD NULL FLAG WHEN SHOW CREATE TABLE") {
val db = "ns1"
val table = "tbl"
withNamespaceAndTable(db, table) { t =>
Member:

Use "ns1" and "tbl" directly since they are used only once.

Contributor Author:

done

withTable(table) {
sql(
s"""CREATE TABLE $table
|USING json
Member:

Why not $defaultUsing?

Contributor Author:

For Hive V1, defaultUsing == "USING HIVE", and it would fail. The reasons are as follows:

  1. The test data source does not support the int data type.
  2. Dynamic partition strict mode requires at least one static partition column.
  3. Creating a Hive serde table is not supported yet.

So I changed it to "USING json". It just persists a data source table into the Hive metastore as JSON, which is not compatible with Hive.

} else {
sql(s"SHOW CREATE TABLE ${table.quotedString}").head().getString(0)
}
sql(s"DROP $checkType ${table.quotedString}")
Member:

The table should exist before the check; otherwise another test left garbage behind, which can be considered a bug. Please remove the DROP ... command.

Contributor Author:

Thank you. That's true, you are right.

withTable(t) {
  ...
  checkCreateTableOrView // without the drop command
}

seems safer. I will modify it.

Comment on lines 77 to 59
} finally {
sql(s"DROP $checkType IF EXISTS ${table.table}")
}
Member:

withTable has more complex logic in handling exceptions, see:

  protected def withTable(tableNames: String*)(f: => Unit): Unit = {
    Utils.tryWithSafeFinally(f) {
      tableNames.foreach { name =>
        spark.sql(s"DROP TABLE IF EXISTS $name")
      }
    }
  }

def tryWithSafeFinally[T](block: => T)(finallyBlock: => Unit): T = {
  var originalThrowable: Throwable = null
  try {
    block
  } catch {
    case t: Throwable =>
      // Purposefully not using NonFatal, because even fatal exceptions
      // we don't want to have our finallyBlock suppress
      originalThrowable = t
      throw originalThrowable
  } finally {
    try {
      finallyBlock
    } catch {
      case t: Throwable if (originalThrowable != null && originalThrowable != t) =>
        originalThrowable.addSuppressed(t)
        logWarning(s"Suppressing exception in finally: ${t.getMessage}", t)
        throw originalThrowable
    }
  }
}

Are you sure that your custom code is reliable enough?

Contributor Author:

No, I'll try to use existing framework code instead of my custom code. Thank you.

Contributor Author:

@MaxGekk I very much agree with you. The check function is the original code for V1 and Hive. For now I have modified the check logic to compare the DDL strings for V1 and V2. Hive is left as is: its logic compares the normalized CatalogTable of the originally created table against the table re-created from the DDL returned by SHOW CREATE TABLE. So I kept that check logic for Hive. Do you agree?
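
For readers, a simplified sketch of that Hive round-trip check; the normalize helper, the table name, and the use of the default database are illustrative, not the exact suite code:

// 1) Capture the DDL reported for the original table and its metadata.
val shownDDL = spark.sql(s"SHOW CREATE TABLE $tableName").head().getString(0)
val original = spark.sharedState.externalCatalog.getTable("default", tableName)

// 2) Re-create the table from that DDL and fetch its metadata again.
spark.sql(s"DROP TABLE $tableName")
spark.sql(shownDDL)
val roundTripped = spark.sharedState.externalCatalog.getTable("default", tableName)

// 3) Compare the two CatalogTables after normalizing volatile fields
//    (creation time, transient Hive properties, and so on).
assert(normalize(original) == normalize(roundTripped))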


test("SHOW CREATE TABLE") {
val t = "tbl"
withTable(t) {
Member:

Use withNamespaceAndTable

Contributor Author:

done.

"'prop3' = '3',",
"'prop4' = '4')"
))
}
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Please, place the diffs to catalog specific test suites instead of describing the diffs in text. Also, open JIRAs and add TODOs to fix the differences.

cloud-fan pushed a commit that referenced this pull request Dec 1, 2021
…2 command

### What changes were proposed in this pull request?
1. Rename the method `makeQualifiedNamespacePath` -> `makeQualifiedLocationPath` in `CatalogUtils`, so it is not only for a db/namespace but also for a table.
2. Overload the method `makeQualifiedLocationPath` to take more parameter types.
3. In `CreateTableExec`, handle the conversion of the `location` property.
4. Handle the `REPLACE TABLE` command as well.

### Why are the changes needed?
Keep v1 and v2 consistent, as discussed at [#comments](#34719 (comment))

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Existing test cases.

Closes #34758 from Peng-Lei/qualify-location.

Authored-by: PengLei <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
cloud-fan pushed a commit that referenced this pull request Dec 2, 2021
…LE` command

### What changes were proposed in this pull request?
1. Change the v1 `SHOW CREATE TABLE` command behavior so that the options output matches v2, e.g.
   `'key' = 'value'`
2. Sort the options output.
3. Sort the properties output.

### Why are the changes needed?
Match v2 behavior, as discussed at [#comments](#34719 (comment))

### Does this PR introduce _any_ user-facing change?
Yes. With `SHOW CREATE TABLE`, the output of properties and options is sorted, and options are rendered as `'key' = 'value'`.

### How was this patch tested?
Add test case.

Closes #34753 from Peng-Lei/v1-show-create-table.

Authored-by: PengLei <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
@cloud-fan (Contributor):

can we continue this PR now?

@Peng-Lei (Contributor Author):

can we continue this PR now?

I've been dealing with other things for a while. I will continue working on it.

Contributor:

The TODO says we should move it to the super class CharVarcharTestSuite, not ShowCreateTableSuiteBase

Contributor Author:

Could you help me review #34773? In that PR, I try to fix two problems:
1. Move the SHOW CREATE TABLE w/ char/varchar test to CharVarcharDDLTestBase.
2. Fix the TBLPROPERTIES behavior difference with the v1 command.
Thank you very much. @cloud-fan

Contributor:

Suggested change
""".stripMargin).show()
""".stripMargin)

Contributor:

why doesn't this test work in v2?

Contributor Author (@Peng-Lei) commented Jan 5, 2022:

sql(
  s"""CREATE TABLE $t
     |USING json
     |PARTITIONED BY (c)
     |CLUSTERED BY (a) SORTED BY (b) INTO 2 BUCKETS
     |COMMENT 'This is a comment'
     |TBLPROPERTIES ('a' = '1')
     |AS SELECT 1 AS a, "foo" AS b, 2.5 AS c
   """.stripMargin
)

v2 result is

PARTITIONED BY (c, bucket(2, a, b))
...
TBLPROPERTIES(
...

v1 and hive result is

PARTITIONED BY (c)
CLUSTERED BY (a)
SORTED BY (b)
INTO 2 BUCKETS
...
TBLPROPERTIES (
...

Contributor Author:

@cloud-fan Since the V2 command is used by default, do you think we should change the V1 command behavior to match V2?

Contributor:

Why doesn't this test work in hive?

Contributor Author (@Peng-Lei) commented Jan 5, 2022:

Two diffs between v1 and Hive:

  1. The default USING clause does not match: defaultUsing == "USING HIVE" when testing the Hive catalog, but the SHOW CREATE TABLE output is USING TEXT. Because a Hive table is not a data source table, we call convertTableMetadata for SHOW CREATE TABLE and then HiveSerDe.serdeToSource to get the source used as the provider in the output. The code is at #L1100.
  2. Hive has more properties (transient_lastDdlTime).

Contributor:

Can you confirm that the test cases in this file only work in Hive?

Contributor Author:

  1. sql/ShowCreateTableSuite.scala moves to v1.ShowCreateTableSuiteBase to test V1/Hive; if a test case also works for V2, it moves to command.ShowCreateTableSuiteBase.
  2. Most of the test cases in HiveShowCreateTableSuite.scala only work in Hive, so they move to hive.ShowCreateTableSuite. I tried every test case against V2/V1; if it works for V2/V1, it moves to the base.

cloud-fan pushed a commit that referenced this pull request Jan 5, 2022
…CREATE TABLE` command

### What changes were proposed in this pull request?
1. Move the `SHOW CREATE TABLE w/ char/varchar` test to `CharVarcharDDLTestBase`.
2. Fix the `TBLPROPERTIES` behavior difference with the v1 command.

### Why are the changes needed?
Before the [#PR](#34719) merge, we should handle some behavior differences and bugs between the v1 and v2 commands.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Existing test cases.

Closes #34773 from Peng-Lei/v2-behavior-show-create-table.

Authored-by: PengLei <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
gengliangwang pushed a commit that referenced this pull request Jan 10, 2022
…reate table

### What changes were proposed in this pull request?

Add options in the description document for `SHOW CREATE TABLE ` command.
<img width="767" alt="1" src="https://user-images.githubusercontent.com/41178002/148747443-ecd6586f-e4c4-4ae4-8ea5-969896b7d416.png">
<img width="758" alt="2" src="https://user-images.githubusercontent.com/41178002/148747457-873bc0c3-08fa-4d31-89e7-b44440372462.png">

### Why are the changes needed?
[#discussion](#34719 (comment))

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
SKIP_API=1 SKIP_RDOC=1 SKIP_PYTHONDOC=1 SKIP_SCALADOC=1 bundle exec jekyll build

Closes #35107 from Peng-Lei/SPARK-37818.

Authored-by: PengLei <[email protected]>
Signed-off-by: Gengliang Wang <[email protected]>
gengliangwang pushed a commit that referenced this pull request Jan 10, 2022
…reate table

(cherry picked from commit 2f70e4f)
Signed-off-by: Gengliang Wang <[email protected]>
@Peng-Lei (Contributor Author) commented Jan 12, 2022

To summarize:

Test cases:

  1. Move the test case from DDLParserSuite.scala to ShowCreateTableParserSuite.scala.
  2. Move the test cases from DataSourceV2SQLSuite to v2.ShowCreateTableSuite.
    If a test case also works with V1/Hive, move it to ShowCreateTableSuiteBase.
  3. Move the test cases from sql/ShowCreateTableSuite.scala to v1.ShowCreateTableSuite.
    If a test case also works with Hive, move it to v1.ShowCreateTableSuiteBase.
    If a test case also works with V2/Hive, move it to ShowCreateTableSuiteBase.
  4. Move the test cases from HiveShowCreateTableSuite to hive.ShowCreateTableSuite.
    If a test case also works with V1/V2, move it to ShowCreateTableSuiteBase.
    If a test case also works with V1 only, move it to v1.ShowCreateTableSuiteBase.
  5. Use getShowCreateDDL instead of checkCreateTable to check the results.

Code:

  1. Add one space after create table xxx in the output of the SHOW CREATE TABLE AS SERDE command to unify the output.
  2. Add one space after OPTIONS in the output of the v2 SHOW CREATE TABLE command to unify the output.

Contributor:

can we move the test to SQLViewSuite to test all view types?

Contributor Author:

ok, I will do it.

}
}

test("temp view") {
Contributor (@cloud-fan) commented Jan 12, 2022:

Ah, sorry, I pointed to the wrong test suite. This is the old test suite; we have a new one for views: SQLViewTestSuite.

For a temp view, the test can be put in TempViewTestSuite, which extends SQLViewTestSuite and runs with both local and global temp views.

For a persistent view, the test can be put in PersistedViewTestSuite.
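
As a rough sketch of what the relocated temp-view test could look like inside those view suites (the test title, helper usage, and the assertion on the error message are assumptions, not the merged code):

test("SHOW CREATE TABLE is not supported on a temporary view") {
  val viewName = "spark_28383"
  withTempView(viewName) {
    sql(s"CREATE TEMPORARY VIEW $viewName AS SELECT 1 AS a")
    val e = intercept[AnalysisException] {
      sql(s"SHOW CREATE TABLE $viewName")
    }
    // The exact message comes from the analyzer; checking for the view name is illustrative.
    assert(e.getMessage.contains(viewName))
  }
}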

Contributor Author:

done.

}
}

test("hive partitioned view is not supported") {
Contributor Author:

@cloud-fan I didn't find the right location for this test case. So I put it here for now.

@cloud-fan (Contributor):

thanks, merging to master!

@cloud-fan cloud-fan closed this in e7cc032 Jan 13, 2022
" COMMENT 'table comment'" +
" TBLPROPERTIES ( 'prop1' = 'value1', 'prop2' = 'value2')" +
" AS SELECT 1 AS c1, '2' AS c2"
assert(getShowCreateDDL(formattedViewName(viewName), serde) == expected)
Member:

Hm, this test is flaky:

sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: "...BLPROPERTIES ( 'prop[2' = 'value2', 'prop1' = 'value1]') AS SELECT 1 AS c1..." did not equal "...BLPROPERTIES ( 'prop[1' = 'value1', 'prop2' = 'value2]') AS SELECT 1 AS c1..."
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
	at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
	at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
	at org.apache.spark.sql.execution.PersistedViewTestSuite.$anonfun$new$139(SQLViewTestSuite.scala:643)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1475)
	at org.apache.spark.sql.test.SQLTestUtilsBase.withView(SQLTestUtils.scala:317)
	at org.apache.spark.sql.test.SQLTestUtilsBase.withView$(SQLTestUtils.scala:315)
	at org.apache.spark.sql.execution.SQLViewTestSuite.withView(SQLViewTestSuite.scala:37)
	at org.apache.spark.sql.execution.PersistedViewTestSuite.$anonfun$new$138(SQLViewTestSuite.scala:635)
	at org.apache.spark.sql.execution.PersistedViewTestSuite.$anonfun$new$138$adapted(SQLViewTestSuite.scala:634)
	at scala.collection.immutable.List.foreach(List.scala:333)
	at org.apache.spark.sql.execution.PersistedViewTestSuite.$anonfun$new$137(SQLViewTestSuite.scala:634)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:203)
	at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)

Member:

looks like we should sort TBLPROPERTIES
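
A minimal sketch of the deterministic-ordering idea, assuming the v1 command builds the TBLPROPERTIES clause from a Map[String, String]; the actual fix landed in the followup below:

// Sort by key before rendering so SHOW CREATE TABLE output is stable across runs,
// matching what the v2 ShowCreateTableExec already does.
def showTblProperties(props: Map[String, String]): String =
  props.toSeq
    .sortBy(_._1)
    .map { case (k, v) => s"'$k' = '$v'" }
    .mkString("TBLPROPERTIES (\n  ", ",\n  ", "\n)")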

Member:

Made a followup: #35222

Contributor Author:

Thank you for fixing it.

HyukjinKwon added a commit that referenced this pull request Jan 17, 2022
…on VIEW (v1)

### What changes were proposed in this pull request?

This PR is a sort of followup of #34719. It added a test, but the test is flaky due to the order of `TBLPROPERTIES` in `SHOW CREATE TABLE` on `VIEW` in the v1 code path.

This PR proposes to have a deterministic order by sorting the table properties in the show command. This is already being sorted in v2 (see `ShowCreateTableExec`).

### Why are the changes needed?

To have the deterministic order, and fix the flaky test.

### Does this PR introduce _any_ user-facing change?

Virtually no. It might affect the order of TBLPROPERTIES in the output of `SHOW CREATE TABLE` on `VIEW` when users rely on it.

### How was this patch tested?

Fixed the flaky unit test to explicitly test the order.

Closes #35222 from HyukjinKwon/SPARK-37924.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
dchvn pushed a commit to dchvn/spark that referenced this pull request Jan 19, 2022
…reate table
dchvn pushed a commit to dchvn/spark that referenced this pull request Jan 19, 2022

Closes apache#34719 from Peng-Lei/SPARK-37381.

Authored-by: PengLei <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
catalinii pushed a commit to lyft/spark that referenced this pull request Feb 22, 2022
…reate table
catalinii pushed a commit to lyft/spark that referenced this pull request Mar 4, 2022
…reate table
kazuyukitanimura pushed a commit to kazuyukitanimura/spark that referenced this pull request Aug 10, 2022
…reate table