Changes from 9 commits
@@ -495,9 +495,13 @@ object TypeCoercion {
i
}

case i @ In(a, b) if b.exists(_.dataType != a.dataType) =>
findWiderCommonType(i.children.map(_.dataType)) match {
case Some(finalDataType) => i.withNewChildren(i.children.map(Cast(_, finalDataType)))
case i @ In(value, list) if list.exists(_.dataType != value.dataType) =>
findWiderCommonType(list.map(_.dataType)) match {
Contributor:

Are you sure this is the behavior of binary comparison? It seems we should call findTightestCommonType and then findCommonTypeForBinaryComparison.

Member Author:

findTightestCommonType(value.dataType, listType).orElse(findCommonTypeForBinaryComparison(value.dataType, listType, conf))

This cannot cover these cases:

  • bigint vs decimal
  • float vs decimal
  • double vs decimal

Example:

-- !query 43
SELECT cast(1 as bigint) in (cast(1 as decimal(10, 0))) FROM t
-- !query 43 schema
struct<>
-- !query 43 output
org.apache.spark.sql.AnalysisException
cannot resolve '(CAST(1 AS BIGINT) IN (CAST(1 AS DECIMAL(10,0))))' due to data type mismatch: Arguments must be same type but were: bigint != decimal(10,0); line 1 pos 25

spark-sql> explain SELECT cast(1 as bigint) > (cast(1 as decimal(10, 0)));
== Physical Plan ==
*(1) Project [false AS (CAST(CAST(CAST(1 AS BIGINT) AS DECIMAL(20,0)) AS DECIMAL(20,0)) > CAST(CAST(1 AS DECIMAL(10,0)) AS DECIMAL(20,0)))#29]
+- *(1) Scan OneRowRelation[]

Member Author:

It seems we should call findCommonTypeForBinaryComparison -> findWiderTypeForDecimal -> findTightestCommonType.

Because findCommonTypeForBinaryComparison is applied in PromoteStrings (TypeCoercion.scala#L52), findWiderTypeForDecimal in DecimalPrecision (TypeCoercion.scala#L53), and findTightestCommonType in ImplicitTypeCasts (TypeCoercion.scala#L63).
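A minimal, self-contained sketch of that precedence (with hypothetical stand-in types and rules, not Spark's actual Catalyst classes or helper signatures):

```scala
// Hypothetical, simplified model of the coercion precedence discussed above;
// the real Spark helpers operate on Catalyst DataTypes and take a SQLConf.
object CoercionPrecedenceSketch {
  sealed trait SqlType
  case object StringT extends SqlType
  case object BigIntT extends SqlType
  case class DecimalT(precision: Int, scale: Int) extends SqlType

  // Stand-in for findCommonTypeForBinaryComparison (PromoteStrings):
  // a string compared with a non-string is cast to the non-string side.
  def forBinaryComparison(a: SqlType, b: SqlType): Option[SqlType] = (a, b) match {
    case (StringT, other) if other != StringT => Some(other)
    case (other, StringT) if other != StringT => Some(other)
    case _                                    => None
  }

  // Stand-in for findWiderTypeForDecimal (DecimalPrecision):
  // bigint vs decimal widens to a decimal wide enough for a 64-bit integer.
  def widerForDecimal(a: SqlType, b: SqlType): Option[SqlType] = (a, b) match {
    case (BigIntT, DecimalT(p, s)) => Some(DecimalT(math.max(20 + s, p), s))
    case (DecimalT(p, s), BigIntT) => Some(DecimalT(math.max(20 + s, p), s))
    case _                         => None
  }

  // Stand-in for findTightestCommonType (ImplicitTypeCasts):
  // identical types only, in this toy model.
  def tightest(a: SqlType, b: SqlType): Option[SqlType] =
    if (a == b) Some(a) else None

  // The chain the PR settles on, in rule-batch order:
  // PromoteStrings -> DecimalPrecision -> ImplicitTypeCasts.
  def commonType(a: SqlType, b: SqlType): Option[SqlType] =
    forBinaryComparison(a, b)
      .orElse(widerForDecimal(a, b))
      .orElse(tightest(a, b))
}
```

Under this ordering, bigint vs decimal(10, 0) resolves via the decimal-widening step to decimal(20, 0), matching the explain output above, instead of failing as a chain starting from findTightestCommonType does.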

Contributor:

After taking a closer look, the existing logic for binary comparison seems to be:

if (not decimal type) {
  apply findTightestCommonType
  apply findCommonTypeForBinaryComparison
} else {
  if (both sides are decimal) {
    apply widerDecimalType
  } else if (one side is not decimal) {
    apply function integralAndDecimalLiteral
    apply function nondecimalAndDecimal
  }
}

Member Author:

@cloud-fan @maropu It seems the logic is PromoteStrings -> DecimalPrecision -> ImplicitTypeCasts:

scala> spark.conf.set("spark.sql.optimizer.planChangeLog.level", "WARN")

scala> spark.sql("select  bigint(1) > double(2), 1 > '1', bigint(1) > cast(1 as decimal(10, 0)) ").explain
20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1: Batch Hints has no effect.
20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1: Batch Simple Sanity Check has no effect.
20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1: Batch Substitution has no effect.
20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1:
=== Applying Rule org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions ===
!'Project [unresolvedalias(('bigint(1) > 'double(2)), None), unresolvedalias((1 > 1), None), unresolvedalias(('bigint(1) > cast(1 as decimal(10,0))), None)]   'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > 1), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]
 +- OneRowRelation                                                                                                                                             +- OneRowRelation

20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1:
=== Applying Rule org.apache.spark.sql.catalyst.analysis.ResolveTimeZone ===
 'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > 1), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]   'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > 1), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]
 +- OneRowRelation                                                                                                                                                                  +- OneRowRelation

20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1:
=== Applying Rule org.apache.spark.sql.catalyst.analysis.TypeCoercion$PromoteStrings ===
!'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > 1), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]   'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > cast(1 as int)), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]
 +- OneRowRelation                                                                                                                                                                  +- OneRowRelation

20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1:
=== Applying Rule org.apache.spark.sql.catalyst.analysis.DecimalPrecision ===
!'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > cast(1 as int)), None), unresolvedalias((cast(1 as bigint) > cast(1 as decimal(10,0))), None)]   'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > cast(1 as int)), None), unresolvedalias((cast(cast(1 as bigint) as decimal(20,0)) > cast(1 as decimal(10,0))), None)]
 +- OneRowRelation                                                                                                                                                                               +- OneRowRelation

20/02/02 19:31:10 WARN HiveSessionStateBuilder$$anon$1:
=== Applying Rule org.apache.spark.sql.catalyst.analysis.TypeCoercion$ImplicitTypeCasts ===
!'Project [unresolvedalias((cast(1 as bigint) > cast(2 as double)), None), unresolvedalias((1 > cast(1 as int)), None), unresolvedalias((cast(cast(1 as bigint) as decimal(20,0)) > cast(1 as decimal(10,0))), None)]   'Project [unresolvedalias((cast(cast(1 as bigint) as double) > cast(2 as double)), None), unresolvedalias((1 > cast(1 as int)), None), unresolvedalias((cast(cast(1 as bigint) as decimal(20,0)) > cast(1 as decimal(10,0))), None)]
 +- OneRowRelation

case Some(listType) =>
val finalDataType = findCommonTypeForBinaryComparison(value.dataType, listType, conf)
Member:

Can you leave some comments about the discussion above?

.orElse(findWiderTypeForDecimal(value.dataType, listType))
.orElse(findTightestCommonType(value.dataType, listType))
finalDataType.map(t => i.withNewChildren(i.children.map(Cast(_, t)))).getOrElse(i)
case None => i
}
}
@@ -1459,6 +1459,11 @@ class TypeCoercionSuite extends AnalysisTest {
In(Cast(Literal("a"), StringType),
Seq(Cast(Literal(1), StringType), Cast(Literal("b"), StringType)))
)
ruleTest(inConversion,
In(Literal(Decimal(3.13)), Seq(Literal("1"), Literal(2))),
In(Cast(Literal(Decimal(3.13)), DoubleType),
Seq(Cast(Literal("1"), DoubleType), Cast(Literal(2), DoubleType)))
)
}

test("SPARK-15776 Divide expression's dataType should be casted to Double or Decimal " +
@@ -69,7 +69,7 @@ true
-- !query 8
SELECT cast(1 as tinyint) in (cast(1 as string)) FROM t
-- !query 8 schema
struct<(CAST(CAST(1 AS TINYINT) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS TINYINT) AS TINYINT) IN (CAST(CAST(1 AS STRING) AS TINYINT))):boolean>
Member:

Should also update migration guide.

Member Author:

This is the BinaryComparison behavior:

scala> spark.sql("explain SELECT cast(1 as tinyint) > (cast(1 as string))").show(false)
+---------------------------------------------------------------------------------------------------------------------------------------+
|plan                                                                                                                                   |
+---------------------------------------------------------------------------------------------------------------------------------------+
|== Physical Plan ==
*(1) Project [false AS (CAST(1 AS TINYINT) > CAST(CAST(1 AS STRING) AS TINYINT))#5]
+- *(1) Scan OneRowRelation[]

|
+---------------------------------------------------------------------------------------------------------------------------------------+

Member:

But since this is a behaviour change in the existing IN, I think it's worth updating the guide.

-- !query 8 output
true

@@ -169,7 +169,7 @@ true
-- !query 20
SELECT cast(1 as smallint) in (cast(1 as string)) FROM t
-- !query 20 schema
struct<(CAST(CAST(1 AS SMALLINT) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS SMALLINT) AS SMALLINT) IN (CAST(CAST(1 AS STRING) AS SMALLINT))):boolean>
-- !query 20 output
true

@@ -269,7 +269,7 @@ true
-- !query 32
SELECT cast(1 as int) in (cast(1 as string)) FROM t
-- !query 32 schema
struct<(CAST(CAST(1 AS INT) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS INT) AS INT) IN (CAST(CAST(1 AS STRING) AS INT))):boolean>
-- !query 32 output
true

@@ -369,7 +369,7 @@ true
-- !query 44
SELECT cast(1 as bigint) in (cast(1 as string)) FROM t
-- !query 44 schema
struct<(CAST(CAST(1 AS BIGINT) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS BIGINT) AS BIGINT) IN (CAST(CAST(1 AS STRING) AS BIGINT))):boolean>
-- !query 44 output
true

@@ -469,9 +469,9 @@ true
-- !query 56
SELECT cast(1 as float) in (cast(1 as string)) FROM t
-- !query 56 schema
struct<(CAST(CAST(1 AS FLOAT) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS FLOAT) AS FLOAT) IN (CAST(CAST(1 AS STRING) AS FLOAT))):boolean>
-- !query 56 output
false
true


-- !query 57
@@ -569,9 +569,9 @@ true
-- !query 68
SELECT cast(1 as double) in (cast(1 as string)) FROM t
-- !query 68 schema
struct<(CAST(CAST(1 AS DOUBLE) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS DOUBLE) AS DOUBLE) IN (CAST(CAST(1 AS STRING) AS DOUBLE))):boolean>
-- !query 68 output
false
true


-- !query 69
@@ -669,7 +669,7 @@ true
-- !query 80
SELECT cast(1 as decimal(10, 0)) in (cast(1 as string)) FROM t
-- !query 80 schema
struct<(CAST(CAST(1 AS DECIMAL(10,0)) AS STRING) IN (CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS DECIMAL(10,0)) AS DOUBLE) IN (CAST(CAST(1 AS STRING) AS DOUBLE))):boolean>
-- !query 80 output
true

@@ -713,55 +713,55 @@ cannot resolve '(CAST(1 AS DECIMAL(10,0)) IN (CAST('2017-12-11 09:30:00' AS DATE
-- !query 85
SELECT cast(1 as string) in (cast(1 as tinyint)) FROM t
-- !query 85 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS TINYINT) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS TINYINT) IN (CAST(CAST(1 AS TINYINT) AS TINYINT))):boolean>
-- !query 85 output
true


-- !query 86
SELECT cast(1 as string) in (cast(1 as smallint)) FROM t
-- !query 86 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS SMALLINT) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS SMALLINT) IN (CAST(CAST(1 AS SMALLINT) AS SMALLINT))):boolean>
-- !query 86 output
true


-- !query 87
SELECT cast(1 as string) in (cast(1 as int)) FROM t
-- !query 87 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS INT) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS INT) IN (CAST(CAST(1 AS INT) AS INT))):boolean>
-- !query 87 output
true


-- !query 88
SELECT cast(1 as string) in (cast(1 as bigint)) FROM t
-- !query 88 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS BIGINT) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS BIGINT) IN (CAST(CAST(1 AS BIGINT) AS BIGINT))):boolean>
-- !query 88 output
true


-- !query 89
SELECT cast(1 as string) in (cast(1 as float)) FROM t
-- !query 89 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS FLOAT) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS FLOAT) IN (CAST(CAST(1 AS FLOAT) AS FLOAT))):boolean>
-- !query 89 output
false
true


-- !query 90
SELECT cast(1 as string) in (cast(1 as double)) FROM t
-- !query 90 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS DOUBLE) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS DOUBLE) IN (CAST(CAST(1 AS DOUBLE) AS DOUBLE))):boolean>
-- !query 90 output
false
true


-- !query 91
SELECT cast(1 as string) in (cast(1 as decimal(10, 0))) FROM t
-- !query 91 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(1 AS DECIMAL(10,0)) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS DOUBLE) IN (CAST(CAST(1 AS DECIMAL(10,0)) AS DOUBLE))):boolean>
-- !query 91 output
true

@@ -777,35 +777,33 @@ true
-- !query 93
SELECT cast(1 as string) in (cast('1' as binary)) FROM t
-- !query 93 schema
struct<>
struct<(CAST(CAST(1 AS STRING) AS BINARY) IN (CAST(CAST(1 AS BINARY) AS BINARY))):boolean>
-- !query 93 output
org.apache.spark.sql.AnalysisException
cannot resolve '(CAST(1 AS STRING) IN (CAST('1' AS BINARY)))' due to data type mismatch: Arguments must be same type but were: string != binary; line 1 pos 25
true


-- !query 94
SELECT cast(1 as string) in (cast(1 as boolean)) FROM t
-- !query 94 schema
struct<>
struct<(CAST(CAST(1 AS STRING) AS BOOLEAN) IN (CAST(CAST(1 AS BOOLEAN) AS BOOLEAN))):boolean>
-- !query 94 output
org.apache.spark.sql.AnalysisException
cannot resolve '(CAST(1 AS STRING) IN (CAST(1 AS BOOLEAN)))' due to data type mismatch: Arguments must be same type but were: string != boolean; line 1 pos 25
true


-- !query 95
SELECT cast(1 as string) in (cast('2017-12-11 09:30:00.0' as timestamp)) FROM t
-- !query 95 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS TIMESTAMP) IN (CAST(CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) AS TIMESTAMP))):boolean>
-- !query 95 output
false
NULL


-- !query 96
SELECT cast(1 as string) in (cast('2017-12-11 09:30:00' as date)) FROM t
-- !query 96 schema
struct<(CAST(CAST(1 AS STRING) AS STRING) IN (CAST(CAST(2017-12-11 09:30:00 AS DATE) AS STRING))):boolean>
struct<(CAST(CAST(1 AS STRING) AS DATE) IN (CAST(CAST(2017-12-11 09:30:00 AS DATE) AS DATE))):boolean>
-- !query 96 output
false
NULL


-- !query 97
@@ -874,10 +872,9 @@ cannot resolve '(CAST('1' AS BINARY) IN (CAST(1 AS DECIMAL(10,0))))' due to data
-- !query 104
SELECT cast('1' as binary) in (cast(1 as string)) FROM t
-- !query 104 schema
struct<>
struct<(CAST(CAST(1 AS BINARY) AS BINARY) IN (CAST(CAST(1 AS STRING) AS BINARY))):boolean>
-- !query 104 output
org.apache.spark.sql.AnalysisException
cannot resolve '(CAST('1' AS BINARY) IN (CAST(1 AS STRING)))' due to data type mismatch: Arguments must be same type but were: binary != string; line 1 pos 27
true


-- !query 105
@@ -981,10 +978,9 @@ cannot resolve '(true IN (CAST(1 AS DECIMAL(10,0))))' due to data type mismatch:
-- !query 116
SELECT true in (cast(1 as string)) FROM t
-- !query 116 schema
struct<>
struct<(CAST(true AS BOOLEAN) IN (CAST(CAST(1 AS STRING) AS BOOLEAN))):boolean>
-- !query 116 output
org.apache.spark.sql.AnalysisException
cannot resolve '(true IN (CAST(1 AS STRING)))' due to data type mismatch: Arguments must be same type but were: boolean != string; line 1 pos 12
true


-- !query 117
@@ -1088,9 +1084,9 @@ cannot resolve '(CAST('2017-12-12 09:30:00.0' AS TIMESTAMP) IN (CAST(2 AS DECIMA
-- !query 128
SELECT cast('2017-12-12 09:30:00.0' as timestamp) in (cast(2 as string)) FROM t
-- !query 128 schema
struct<(CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS STRING) IN (CAST(CAST(2 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS TIMESTAMP) IN (CAST(CAST(2 AS STRING) AS TIMESTAMP))):boolean>
-- !query 128 output
false
NULL


-- !query 129
@@ -1193,9 +1189,9 @@ cannot resolve '(CAST('2017-12-12 09:30:00' AS DATE) IN (CAST(2 AS DECIMAL(10,0)
-- !query 140
SELECT cast('2017-12-12 09:30:00' as date) in (cast(2 as string)) FROM t
-- !query 140 schema
struct<(CAST(CAST(2017-12-12 09:30:00 AS DATE) AS STRING) IN (CAST(CAST(2 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(2017-12-12 09:30:00 AS DATE) AS DATE) IN (CAST(CAST(2 AS STRING) AS DATE))):boolean>
-- !query 140 output
false
NULL


-- !query 141
@@ -1291,7 +1287,7 @@ true
-- !query 152
SELECT cast(1 as tinyint) in (cast(1 as tinyint), cast(1 as string)) FROM t
-- !query 152 schema
struct<(CAST(CAST(1 AS TINYINT) AS STRING) IN (CAST(CAST(1 AS TINYINT) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS TINYINT) AS TINYINT) IN (CAST(CAST(1 AS TINYINT) AS TINYINT), CAST(CAST(1 AS STRING) AS TINYINT))):boolean>
-- !query 152 output
true

@@ -1391,7 +1387,7 @@ true
-- !query 164
SELECT cast(1 as smallint) in (cast(1 as smallint), cast(1 as string)) FROM t
-- !query 164 schema
struct<(CAST(CAST(1 AS SMALLINT) AS STRING) IN (CAST(CAST(1 AS SMALLINT) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS SMALLINT) AS SMALLINT) IN (CAST(CAST(1 AS SMALLINT) AS SMALLINT), CAST(CAST(1 AS STRING) AS SMALLINT))):boolean>
-- !query 164 output
true

@@ -1491,7 +1487,7 @@ true
-- !query 176
SELECT cast(1 as int) in (cast(1 as int), cast(1 as string)) FROM t
-- !query 176 schema
struct<(CAST(CAST(1 AS INT) AS STRING) IN (CAST(CAST(1 AS INT) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS INT) AS INT) IN (CAST(CAST(1 AS INT) AS INT), CAST(CAST(1 AS STRING) AS INT))):boolean>
-- !query 176 output
true

@@ -1591,7 +1587,7 @@ true
-- !query 188
SELECT cast(1 as bigint) in (cast(1 as bigint), cast(1 as string)) FROM t
-- !query 188 schema
struct<(CAST(CAST(1 AS BIGINT) AS STRING) IN (CAST(CAST(1 AS BIGINT) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS BIGINT) AS BIGINT) IN (CAST(CAST(1 AS BIGINT) AS BIGINT), CAST(CAST(1 AS STRING) AS BIGINT))):boolean>
-- !query 188 output
true

@@ -1691,7 +1687,7 @@ true
-- !query 200
SELECT cast(1 as float) in (cast(1 as float), cast(1 as string)) FROM t
-- !query 200 schema
struct<(CAST(CAST(1 AS FLOAT) AS STRING) IN (CAST(CAST(1 AS FLOAT) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS FLOAT) AS FLOAT) IN (CAST(CAST(1 AS FLOAT) AS FLOAT), CAST(CAST(1 AS STRING) AS FLOAT))):boolean>
-- !query 200 output
true

@@ -1791,7 +1787,7 @@ true
-- !query 212
SELECT cast(1 as double) in (cast(1 as double), cast(1 as string)) FROM t
-- !query 212 schema
struct<(CAST(CAST(1 AS DOUBLE) AS STRING) IN (CAST(CAST(1 AS DOUBLE) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS DOUBLE) AS DOUBLE) IN (CAST(CAST(1 AS DOUBLE) AS DOUBLE), CAST(CAST(1 AS STRING) AS DOUBLE))):boolean>
-- !query 212 output
true

@@ -1891,7 +1887,7 @@ true
-- !query 224
SELECT cast(1 as decimal(10, 0)) in (cast(1 as decimal(10, 0)), cast(1 as string)) FROM t
-- !query 224 schema
struct<(CAST(CAST(1 AS DECIMAL(10,0)) AS STRING) IN (CAST(CAST(1 AS DECIMAL(10,0)) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(1 AS DECIMAL(10,0)) AS DOUBLE) IN (CAST(CAST(1 AS DECIMAL(10,0)) AS DOUBLE), CAST(CAST(1 AS STRING) AS DOUBLE))):boolean>
-- !query 224 output
true

@@ -2310,7 +2306,7 @@ cannot resolve '(CAST('2017-12-12 09:30:00.0' AS TIMESTAMP) IN (CAST('2017-12-12
-- !query 272
SELECT cast('2017-12-12 09:30:00.0' as timestamp) in (cast('2017-12-12 09:30:00.0' as timestamp), cast(1 as string)) FROM t
-- !query 272 schema
struct<(CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS STRING) IN (CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS TIMESTAMP) IN (CAST(CAST(2017-12-12 09:30:00.0 AS TIMESTAMP) AS TIMESTAMP), CAST(CAST(1 AS STRING) AS TIMESTAMP))):boolean>
-- !query 272 output
true

@@ -2415,7 +2411,7 @@ cannot resolve '(CAST('2017-12-12 09:30:00' AS DATE) IN (CAST('2017-12-12 09:30:
-- !query 284
SELECT cast('2017-12-12 09:30:00' as date) in (cast('2017-12-12 09:30:00' as date), cast(1 as string)) FROM t
-- !query 284 schema
struct<(CAST(CAST(2017-12-12 09:30:00 AS DATE) AS STRING) IN (CAST(CAST(2017-12-12 09:30:00 AS DATE) AS STRING), CAST(CAST(1 AS STRING) AS STRING))):boolean>
struct<(CAST(CAST(2017-12-12 09:30:00 AS DATE) AS DATE) IN (CAST(CAST(2017-12-12 09:30:00 AS DATE) AS DATE), CAST(CAST(1 AS STRING) AS DATE))):boolean>
-- !query 284 output
true
