[SPARK-25036][SQL] avoid match may not be exhaustive in Scala-2.12 #22014
kiszk wants to merge 3 commits into apache:master
Conversation
Test build #94314 has finished for PR 22014 at commit
srowen left a comment

Looks fine. I'd avoid `RuntimeException` as it's non-specific, and prefer `IllegalStateException` or `IllegalArgumentException` in most cases. However, this isn't done consistently in the code, so I don't feel strongly about it at all.
```scala
val newMax = if (n1.max <= n2.max) n1.max else n2.max
(Some(EstimationUtils.fromDouble(newMin, dt)),
  Some(EstimationUtils.fromDouble(newMax, dt)))
case _ => throw new RuntimeException(s"Not supported pair: $r1, $r2 at intersect()")
```
Shall we do UnsupportedOperationException?
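The suggested change can be sketched as follows — a minimal, self-contained illustration using hypothetical stand-in types (`Range`, `NumericRange`, `InfiniteRange` are simplifications, not Spark's actual statsEstimation classes), where the default case throws `UnsupportedOperationException` instead of a non-specific `RuntimeException`:

```scala
// Hypothetical stand-ins for Spark's value-range types.
sealed trait Range
case class NumericRange(min: Double, max: Double) extends Range
case object InfiniteRange extends Range

// Intersect two ranges; the default case fails fast with a specific exception
// type, which also satisfies Scala 2.12's exhaustiveness check on the pair.
def intersect(r1: Range, r2: Range): (Double, Double) = (r1, r2) match {
  case (n1: NumericRange, n2: NumericRange) =>
    val newMin = if (n1.min <= n2.min) n2.min else n1.min
    val newMax = if (n1.max <= n2.max) n1.max else n2.max
    (newMin, newMax)
  case _ =>
    throw new UnsupportedOperationException(s"Not supported pair: $r1, $r2 at intersect()")
}
```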
```scala
case NewFunctionSpec(functionName, None, None) => functionName
case NewFunctionSpec(functionName, Some(_), Some(innerClassInstance)) =>
  innerClassInstance + "." + functionName
case _ => null // nothing to do since addNewFunctionInternal() must return one of them
```
Shall we throw an IllegalArgumentException?
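The suggestion can be sketched with a hypothetical simplification of codegen's `NewFunctionSpec` (the real class lives in Spark's `CodeGenerator`; this stand-in only mirrors its shape): throw `IllegalArgumentException` from the default case rather than returning `null`.

```scala
// Simplified stand-in for the codegen function spec.
case class NewFunctionSpec(
    functionName: String,
    innerClassName: Option[String],
    innerClassInstance: Option[String])

// Returns the fully qualified call name; the default case now fails loudly
// instead of silently producing null.
def callName(spec: NewFunctionSpec): String = spec match {
  case NewFunctionSpec(name, None, None) => name
  case NewFunctionSpec(name, Some(_), Some(instance)) => instance + "." + name
  case other =>
    throw new IllegalArgumentException(s"Unexpected function spec: $other")
}
```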
```scala
(endpointsExpression.dataType, endpointsExpression.eval()) match {
  case (ArrayType(elementType, _), arrayData: ArrayData) =>
    arrayData.toObjectArray(elementType).map(_.toString.toDouble)
  case _ => throw new RuntimeException("not found at endpoints")
```
Can we do this like:

```scala
val endpointsType = endpointsExpression.dataType.asInstanceOf[ArrayType]
val endpoints = endpointsExpression.eval().asInstanceOf[ArrayData]
endpoints.toObjectArray(endpointsType.elementType).map(_.toString.toDouble)
```

```scala
def attributesFor[T: TypeTag]: Seq[Attribute] = schemaFor[T] match {
  case Schema(s: StructType, _) =>
    s.toAttributes
  case _ => throw new RuntimeException(s"$schemaFor is not supported at attributesFor()")
```
How about this:

```scala
case other =>
  throw new UnsupportedOperationException(s"Attributes for type $other is not supported")
```
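The `case other` style can be sketched in a runnable form with simplified stand-ins for Spark's `Schema`/`StructType` (the types and field representation below are hypothetical): binding the unmatched value as `other` lets the error message report exactly what was received.

```scala
// Hypothetical stand-ins for Catalyst's data types.
sealed trait DataType
case class StructType(fieldNames: Seq[String]) extends DataType
case object IntegerType extends DataType

// Only struct types have attributes; anything else is reported precisely.
def attributesFor(schema: DataType): Seq[String] = schema match {
  case StructType(fieldNames) => fieldNames
  case other =>
    throw new UnsupportedOperationException(s"Attributes for type $other is not supported")
}
```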
LGTM except for those minor nits.
Test build #94388 has finished for PR 22014 at commit

Retest this please.
```scala
// For the top-level row writer, it always writes to the beginning of the global buffer
// holder, which means its fixed-size region is always in the same position, so we don't
// need to call `reset` to set up its fixed-size region every time.
if (inputs.map(_.isNull).forall(_ == "false")) {
```
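The comparison against the string `"false"` can look odd out of context: in whole-stage codegen each input is described by generated Java source, so the null-check is itself a code fragment. A minimal sketch, using a hypothetical simplified `ExprCode` stand-in (Spark's real class has more fields):

```scala
// isNull holds generated Java source; the literal "false" means the input is
// statically known to be non-null.
case class ExprCode(isNull: String, value: String)

val inputs = Seq(ExprCode("false", "value_0"), ExprCode("false", "value_1"))

// True only when every input's null-check is the constant "false".
val allNonNull = inputs.map(_.isNull).forall(_ == "false")
```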
Test build #94390 has finished for PR 22014 at commit

Test build #94402 has finished for PR 22014 at commit

Merged to master
…ala-2.12.

## What changes were proposed in this pull request?

This is a follow-up PR of #22014. We still have some more compilation errors in Scala-2.12 with sbt:

```
[error] [warn] /.../sql/core/src/main/scala/org/apache/spark/sql/DataFrameNaFunctions.scala:493: match may not be exhaustive.
[error] It would fail on the following input: (_, _)
[error] [warn] val typeMatches = (targetType, f.dataType) match {
[error] [warn]
[error] [warn] /.../sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/MicroBatchExecution.scala:393: match may not be exhaustive.
[error] It would fail on the following input: (_, _)
[error] [warn] prevBatchOff.get.toStreamProgress(sources).foreach {
[error] [warn]
[error] [warn] /.../sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/AggUtils.scala:173: match may not be exhaustive.
[error] It would fail on the following input: AggregateExpression(_, _, false, _)
[error] [warn] val rewrittenDistinctFunctions = functionsWithDistinct.map {
[error] [warn]
[error] [warn] /.../sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/SymmetricHashJoinStateManager.scala:271: match may not be exhaustive.
[error] It would fail on the following input: (_, _)
[error] [warn] keyWithIndexToValueMetrics.customMetrics.map {
[error] [warn]
[error] [warn] /.../sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala:959: match may not be exhaustive.
[error] It would fail on the following input: CatalogTableType(_)
[error] [warn] val tableTypeString = metadata.tableType match {
[error] [warn]
[error] [warn] /.../sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:923: match may not be exhaustive.
[error] It would fail on the following input: CatalogTableType(_)
[error] [warn] hiveTable.setTableType(table.tableType match {
[error] [warn]
```

## How was this patch tested?

Manually build with Scala-2.12.

Closes #22039 from ueshin/issues/SPARK-25036/fix_match.

Authored-by: Takuya UESHIN <ueshin@databricks.com>
Signed-off-by: Sean Owen <srowen@gmail.com>
…ala-2.12.

## What changes were proposed in this pull request?

This is a follow-up PR of #22014 and #22039. We still have some more compilation errors in mllib with Scala-2.12 with sbt:

```
[error] [warn] /home/ishizaki/Spark/PR/scala212/spark/mllib/src/main/scala/org/apache/spark/ml/evaluation/ClusteringEvaluator.scala:116: match may not be exhaustive.
[error] It would fail on the following inputs: ("silhouette", _), (_, "cosine"), (_, "squaredEuclidean"), (_, String()), (_, _)
[error] [warn] ($(metricName), $(distanceMeasure)) match {
[error] [warn]
```

## How was this patch tested?

Existing UTs

Closes #22058 from kiszk/SPARK-25036c.

Authored-by: Kazuaki Ishizaki <ishizaki@jp.ibm.com>
Signed-off-by: Sean Owen <srowen@gmail.com>
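The mllib fix follows the same pattern, which can be sketched with a hypothetical simplified signature (not `ClusteringEvaluator`'s actual API, and the numeric results below are placeholders): a match on a pair of `String`s can never be proven exhaustive by the compiler, so Scala 2.12 requires a default case.

```scala
// Placeholder results stand in for the real silhouette computations.
def silhouette(metricName: String, distanceMeasure: String): Double =
  (metricName, distanceMeasure) match {
    case ("silhouette", "squaredEuclidean") => 0.5 // placeholder
    case ("silhouette", "cosine") => 0.6 // placeholder
    case (mn, dm) =>
      // The added default case both silences the warning and rejects bad input.
      throw new IllegalArgumentException(s"No support for metric $mn, distance measure $dm")
  }
```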
…with an internal error

### What changes were proposed in this pull request?

In this PR I propose to replace the legacy error class `_LEGACY_ERROR_TEMP_2014`, added as an `IllegalArgumentException` to avoid non-exhaustive case-matching in #22014, with an internal error as it is not triggered by the user space.

### Why are the changes needed?

As the error is not triggered by the user space, the legacy error class can be replaced by an internal error.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

No new tests were added and no tests needed changing because of the nature of the updated error class.

Closes #40951 from amousavigourabi/add-new-function-internal-error.

Lead-authored-by: amousavigourabi <28668597+amousavigourabi@users.noreply.github.com>
Co-authored-by: Atour <28668597+amousavigourabi@users.noreply.github.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
What changes were proposed in this pull request?

The PR removes the following compilation error, seen when building with Scala-2.12 and sbt, by adding a default case to each non-exhaustive `match`.

How was this patch tested?

Existing UTs with Scala-2.11.
Manually build with Scala-2.12.
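The core pattern of this PR can be shown on a toy example (the function and type names below are illustrative, not Spark code): under Scala 2.12, a match over values the compiler cannot enumerate — here a pair of `String`s, like the `(targetType, f.dataType)` match flagged in `DataFrameNaFunctions` — emits "match may not be exhaustive" unless a default case is added.

```scala
// A toy version of a type-compatibility check over string pairs.
def typeMatches(targetType: String, actualType: String): Boolean =
  (targetType, actualType) match {
    case ("numeric", "int") => true
    case ("numeric", "double") => true
    case ("string", "string") => true
    case _ => false // the added default case silences the exhaustiveness warning
  }
```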