@@ -107,6 +107,7 @@ abstract class Optimizer(catalogManager: CatalogManager)
   RewriteCorrelatedScalarSubquery,
   EliminateSerialization,
   RemoveRedundantAliases,
+  UnwrapCastInBinaryComparison,
   RemoveNoopOperators,
   CombineWithFields,
   SimplifyExtractValueOps,
@@ -0,0 +1,236 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.apache.spark.sql.catalyst.optimizer

import org.apache.spark.sql.catalyst.expressions._
import org.apache.spark.sql.catalyst.expressions.Literal.FalseLiteral
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.types._

/**
* Unwrap casts in binary comparison operations with patterns like the following:
*
* `BinaryComparison(Cast(fromExp, toType), Literal(value, toType))`
* or
* `BinaryComparison(Literal(value, toType), Cast(fromExp, toType))`
*
* This rule optimizes expressions with the above pattern by either replacing the cast with simpler
* constructs, or moving the cast from the expression side to the literal side, which enables them
* to be optimized away later and pushed down to data sources.
*
* Currently this only handles cases where:
* 1). `fromType` (of `fromExp`) and `toType` are of integral types (i.e., byte, short, int and
* long)
* 2). `fromType` can be safely coerced to `toType` without precision loss (e.g., short to int,
* int to long, but not long to int)
*
* If the above conditions are satisfied, the rule checks to see if the literal `value` is within
* range `(min, max)`, where `min` and `max` are the minimum and maximum value of `fromType`,
* respectively. If this is true, it means we can safely cast `value` to `fromType` and are thus
* able to move the cast to the literal side. That is:
*
* `cast(fromExp, toType) op value` ==> `fromExp op cast(value, fromType)`
*
* If the `value` is not within range `(min, max)`, the rule breaks the scenario into different
* cases and tries to replace each with simpler constructs.
*
* if `value > max`, the cases are as follows:
* - `cast(fromExp, toType) > value` ==> if(isnull(fromExp), null, false)
* - `cast(fromExp, toType) >= value` ==> if(isnull(fromExp), null, false)
* - `cast(fromExp, toType) === value` ==> if(isnull(fromExp), null, false)
* - `cast(fromExp, toType) <=> value` ==> false (if `fromExp` is deterministic)
* - `cast(fromExp, toType) <=> value` ==> cast(fromExp, toType) <=> value (if `fromExp` is
* non-deterministic)
* - `cast(fromExp, toType) <= value` ==> if(isnull(fromExp), null, true)
* - `cast(fromExp, toType) < value` ==> if(isnull(fromExp), null, true)
*
* if `value == max`, the cases are as follows:
* - `cast(fromExp, toType) > value` ==> if(isnull(fromExp), null, false)
* - `cast(fromExp, toType) >= value` ==> fromExp == max
* - `cast(fromExp, toType) === value` ==> fromExp == max
* - `cast(fromExp, toType) <=> value` ==> fromExp <=> max
* - `cast(fromExp, toType) <= value` ==> if(isnull(fromExp), null, true)
* - `cast(fromExp, toType) < value` ==> fromExp =!= max
*
* Similarly for the cases when `value == min` and `value < min`.
Member:

Is it worth specifically handling value == max and value == min? Rules add cost to the planning phase, and we are probably not going to run into those two cases often in real-world queries. Should we handle them as if the value is within the range?

Member Author (@sunchao, Sep 1, 2020):

It is hard to answer this question without knowing things like how users normally write queries, how much time this rule takes compared to other rules, and how much time these two specific cases take. IMO the optimization for these two cases can offer a significant speedup, and the extra cost is just the pattern matching:

else if (maxCmp == 0) {
      exp match {
        case GreaterThan(_, _) =>
          fromExp.falseIfNotNull
        case LessThanOrEqual(_, _) =>
          fromExp.trueIfNotNull
        case LessThan(_, _) =>
          Not(EqualTo(fromExp, Literal(max, fromType)))
        case GreaterThanOrEqual(_, _) | EqualTo(_, _) | EqualNullSafe(_, _) =>
          EqualTo(fromExp, Literal(max, fromType))
        case _ => exp
      }

which I think should be pretty fast compared to the rule itself, which may require walking the whole plan tree.

Member:

If we don't handle maxCmp == 0 and minCmp == 0, it will be optimized to the following. Those can still be pushed down.

      val lit = Cast(Literal(value), fromType)
      exp match {
        case GreaterThan(_, _) => GreaterThan(fromExp, lit)
        case GreaterThanOrEqual(_, _) => GreaterThanOrEqual(fromExp, lit)
        case EqualTo(_, _) => EqualTo(fromExp, lit)
        case EqualNullSafe(_, _) => EqualNullSafe(fromExp, lit)
        case LessThan(_, _) => LessThan(fromExp, lit)
        case LessThanOrEqual(_, _) => LessThanOrEqual(fromExp, lit)
        case _ => exp
      }

Maybe we can have separate rules that optimize GreaterThan(fromExp, lit) to fromExp.falseIfNotNull when lit is the max of lit.dataType, so more plans can take advantage of it.

Member Author:

Those filters are not as effective though. We could add separate rules, but I think that may be more expensive (we'd again need to check both sides of the comparison, what type the literal is, the min/max, what kind of expression fromExp is, and potential edge cases when both sides have different types, etc.).

IMO the advantage of doing it here is that we've already done most of the checks and I don't see much extra cost.

Member:

I was thinking for value == max, we can have a separate rule such as

 *  - `fromExp > value` where fromExp.type == value.dataType && getRange(fromExp)._2 == value` ==> if(isnull(fromExp), null, false)
...
...

which should be valid for other types. Thus, we don't need to handle maxCmp == 0, and it opens up opportunities to optimize expressions that don't have a cast. WDYT?

Member Author:

Yes, +1 on exploring that. I think we should keep the maxCmp == 0 case for now, until that is ready, since it is more optimized with not much extra cost (without maxCmp == 0 we still have to do the pattern matching in the else case).

Member:

Let's consider it in the follow-up PR.
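For context, here is a minimal sketch of what the standalone rule proposed in this thread could look like, assuming the same imports as the new file under review; the rule name and the restriction to the four integral types are made up for illustration and are not part of this PR:

object SimplifyComparisonWithMaxValue extends Rule[LogicalPlan] {
  // Maximum value of the supported integral types; None disables the rewrite for other types.
  private def maxOf(dt: DataType): Option[Any] = dt match {
    case ByteType => Some(Byte.MaxValue)
    case ShortType => Some(Short.MaxValue)
    case IntegerType => Some(Int.MaxValue)
    case LongType => Some(Long.MaxValue)
    case _ => None
  }

  override def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
    // `e > <max of e's type>` can never be true: the result is null when `e` is null and
    // false otherwise, written as `and(isnull(e), null)` so it can still be pushed down.
    case GreaterThan(e, Literal(v, dt))
        if e.dataType == dt && !e.foldable && e.deterministic && maxOf(dt).contains(v) =>
      And(IsNull(e), Literal(null, BooleanType))
  }
}

The other comparison operators would get analogous cases, mirroring the maxCmp == 0 branch in simplifyIntegralComparison below.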

*
* Further, the above `if(isnull(fromExp), null, false)` is represented using the conjunction
* `and(isnull(fromExp), null)`, to enable further optimization and filter pushdown to data sources.
* Similarly, `if(isnull(fromExp), null, true)` is represented with `or(isnotnull(fromExp), null)`.
*/
object UnwrapCastInBinaryComparison extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan transform {
    case l: LogicalPlan =>
      l transformExpressionsUp {
        case e @ BinaryComparison(_, _) => unwrapCast(e)
      }
  }

  private def unwrapCast(exp: Expression): Expression = exp match {
    // Not a canonical form. In this case we first canonicalize the expression by swapping the
    // literal and cast side, then process the result and swap the literal and cast again to
    // restore the original order.
    case BinaryComparison(Literal(_, literalType), Cast(fromExp, toType, _))
        if canImplicitlyCast(fromExp, toType, literalType) =>
      def swap(e: Expression): Expression = e match {
        case GreaterThan(left, right) => LessThan(right, left)
        case GreaterThanOrEqual(left, right) => LessThanOrEqual(right, left)
        case EqualTo(left, right) => EqualTo(right, left)
        case EqualNullSafe(left, right) => EqualNullSafe(right, left)
        case LessThanOrEqual(left, right) => GreaterThanOrEqual(right, left)
        case LessThan(left, right) => GreaterThan(right, left)
        case _ => e
      }

      swap(unwrapCast(swap(exp)))

    // In case both sides have integral type, optimize the comparison by removing casts or
    // moving cast to the literal side.
    case be @ BinaryComparison(
      Cast(fromExp, toType: IntegralType, _), Literal(value, literalType))
        if canImplicitlyCast(fromExp, toType, literalType) =>
      simplifyIntegralComparison(be, fromExp, toType, value)

    case _ => exp
  }

  /**
   * Check if the input `value` is within range `(min, max)` of the `fromType`, where `min` and
   * `max` are the minimum and maximum value of the `fromType`. If the above is true, this
   * optimizes the expression by moving the cast to the literal side. Otherwise, this replaces
   * the input binary comparison `exp` with simpler expressions.
   */
  private def simplifyIntegralComparison(
      exp: BinaryComparison,
      fromExp: Expression,
      toType: IntegralType,
      value: Any): Expression = {

    val fromType = fromExp.dataType
    val (min, max) = getRange(fromType)
    val (minInToType, maxInToType) = {
      (Cast(Literal(min), toType).eval(), Cast(Literal(max), toType).eval())
    }
    val ordering = toType.ordering.asInstanceOf[Ordering[Any]]
    val minCmp = ordering.compare(value, minInToType)
    val maxCmp = ordering.compare(value, maxInToType)

    if (maxCmp > 0) {
      exp match {
        case EqualTo(_, _) | GreaterThan(_, _) | GreaterThanOrEqual(_, _) =>
          falseIfNotNull(fromExp)
        case LessThan(_, _) | LessThanOrEqual(_, _) =>
          trueIfNotNull(fromExp)
        // make sure the expression is evaluated if it is non-deterministic
        case EqualNullSafe(_, _) if exp.deterministic =>
          FalseLiteral
Member:

@cloud-fan For safety, we will skip the optimization if exp is non-deterministic. WDYT?

Contributor:

SGTM

        case _ => exp
      }
    } else if (maxCmp == 0) {
      exp match {
        case GreaterThan(_, _) =>
          falseIfNotNull(fromExp)
        case LessThanOrEqual(_, _) =>
          trueIfNotNull(fromExp)
        case LessThan(_, _) =>
          Not(EqualTo(fromExp, Literal(max, fromType)))
        case GreaterThanOrEqual(_, _) | EqualTo(_, _) =>
          EqualTo(fromExp, Literal(max, fromType))
        case EqualNullSafe(_, _) =>
          EqualNullSafe(fromExp, Literal(max, fromType))
        case _ => exp
      }
    } else if (minCmp < 0) {
      exp match {
        case GreaterThan(_, _) | GreaterThanOrEqual(_, _) =>
          trueIfNotNull(fromExp)
        case LessThan(_, _) | LessThanOrEqual(_, _) | EqualTo(_, _) =>
          falseIfNotNull(fromExp)
        // make sure the expression is evaluated if it is non-deterministic
        case EqualNullSafe(_, _) if exp.deterministic =>
          FalseLiteral
        case _ => exp
      }
    } else if (minCmp == 0) {
      exp match {
        case LessThan(_, _) =>
          falseIfNotNull(fromExp)
        case GreaterThanOrEqual(_, _) =>
          trueIfNotNull(fromExp)
        case GreaterThan(_, _) =>
          Not(EqualTo(fromExp, Literal(min, fromType)))
        case LessThanOrEqual(_, _) | EqualTo(_, _) =>
          EqualTo(fromExp, Literal(min, fromType))
        case EqualNullSafe(_, _) =>
          EqualNullSafe(fromExp, Literal(min, fromType))
        case _ => exp
      }
    } else {
      // This means `value` is within range `(min, max)`. Optimize this by moving the cast to the
      // literal side.
      val lit = Cast(Literal(value), fromType)
      exp match {
        case GreaterThan(_, _) => GreaterThan(fromExp, lit)
        case GreaterThanOrEqual(_, _) => GreaterThanOrEqual(fromExp, lit)
        case EqualTo(_, _) => EqualTo(fromExp, lit)
        case EqualNullSafe(_, _) => EqualNullSafe(fromExp, lit)
        case LessThan(_, _) => LessThan(fromExp, lit)
        case LessThanOrEqual(_, _) => LessThanOrEqual(fromExp, lit)
        case _ => exp
      }
    }
  }

  /**
   * Check if the input `fromExp` can be safely cast to `toType` without any loss of precision,
   * i.e., the conversion is injective. Note this only handles the case when both sides are of
   * integral type.
   */
  private def canImplicitlyCast(fromExp: Expression, toType: DataType,
Contributor:

indentation nit:

def func(
    para1: T,
    para2: T,
    para3: T): R = ...

Contributor:

BTW I'll also check !from.foldable, otherwise it's simpler to not run this rule.

Member:

Ya. +1 for foldable checking.

Member Author:

Thanks. I'll check foldable in the follow-up PR that handles more types, and also fix the indentation.

      literalType: DataType): Boolean = {
    toType.sameType(literalType) &&
Member:

This additional check looks good and robust. BTW, do we have test coverage for this?

Member Author:

I tried to come up with a test for this, but it seems the query compiler always wraps a cast to make sure the types on both sides are the same.
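As a rough, hand-built illustration of that point (the column name shortCol is made up; this is not a test from the PR): after analysis, the cast's target type and the literal's type already agree, so the extra sameType guard can only be exercised by constructing the comparison manually.

// Shape the analyzer produces for a predicate like `shortCol > 100000`:
// both sides end up as IntegerType.
val shortCol = AttributeReference("shortCol", ShortType)()
val analyzed = GreaterThan(Cast(shortCol, IntegerType), Literal(100000))

// A mismatched pair (IntegerType cast vs. LongType literal) has to be built by hand,
// which is why it is hard to cover this guard from a SQL query.
val mismatched = GreaterThan(Cast(shortCol, IntegerType), Literal(100000L))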

      fromExp.dataType.isInstanceOf[IntegralType] &&
      toType.isInstanceOf[IntegralType] &&
      Cast.canUpCast(fromExp.dataType, toType)
  }

  private def getRange(dt: DataType): (Any, Any) = dt match {
    case ByteType => (Byte.MinValue, Byte.MaxValue)
    case ShortType => (Short.MinValue, Short.MaxValue)
    case IntegerType => (Int.MinValue, Int.MaxValue)
    case LongType => (Long.MinValue, Long.MaxValue)
Member:

You may want to add a safeguard here: case _ => throw new UnsupportedOperationException.

Member:

We probably need to handle this exception in the caller to prevent the job from being killed.

Member Author:

Will do. Ideally this should never happen though unless someone adds another subclass under IntegralType (and in that case the implementor should fulfill their responsibility by checking every place that does pattern matching on the parent type). It's unfortunate that we can't seal the class and rely on Scala's exhaustive pattern match compile-time check :(

Regarding catching the exception, I don't think it is necessary, since it would be an error during query compilation, before the job starts. I see a similar pattern in ColumnType.

    case other => throw new IllegalArgumentException(s"Unsupported type: ${other.catalogString}")
  }

  /**
   * Wraps input expression `e` with `if(isnull(e), null, false)`. The if-clause is represented
   * using `and(isnull(e), null)`, which is semantically equivalent under three-valued logic.
   */
  private[optimizer] def falseIfNotNull(e: Expression): Expression = {
    And(IsNull(e), Literal(null, BooleanType))
  }

  /**
   * Wraps input expression `e` with `if(isnull(e), null, true)`. The if-clause is represented
   * using `or(isnotnull(e), null)`, which is semantically equivalent under three-valued logic.
   */
  private[optimizer] def trueIfNotNull(e: Expression): Expression = {
    Or(IsNotNull(e), Literal(null, BooleanType))
  }
}
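
To see the rewrite end to end, here is a minimal usage sketch applying the rule to a hand-built plan; the byte column `b` and the LocalRelation are made up for illustration:

import org.apache.spark.sql.catalyst.expressions._
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LocalRelation}
import org.apache.spark.sql.types._

// Hypothetical byte column; the maximum value of ByteType is 127.
val b = AttributeReference("b", ByteType)()
val relation = LocalRelation(b)

// `cast(b as int) > 1000` can never be true for a byte column ...
val plan = Filter(GreaterThan(Cast(b, IntegerType), Literal(1000)), relation)

// ... so the rule rewrites the condition to `and(isnull(b), null)`, i.e.
// And(IsNull(b), Literal(null, BooleanType)), which later rules can simplify
// and data sources can evaluate without the cast.
val optimized = UnwrapCastInBinaryComparison(plan)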