[SPARK-41431][CORE][SQL][UI] Protobuf serializer for SQLExecutionUIData #39139
Maven pom change (@@ -147,6 +147,11 @@), adding an explicit protobuf-java dependency:

```diff
@@ -147,6 +147,11 @@
       <groupId>org.apache.xbean</groupId>
       <artifactId>xbean-asm9-shaded</artifactId>
     </dependency>
+    <dependency>
+      <groupId>com.google.protobuf</groupId>
+      <artifactId>protobuf-java</artifactId>
+      <version>${protobuf.version}</version>
+    </dependency>
     <dependency>
       <groupId>org.scalacheck</groupId>
       <artifactId>scalacheck_${scala.binary.version}</artifactId>
```
Member: This is not required, right?

Contributor (Author): For sbt, it is not required. But for Maven, it is required now, because the sql module inherits

Contributor (Author): If the place we mentioned yesterday can be changed to 3.21.11, then we can remove this declaration.

Member: OK, do we need to shade it?

Contributor (Author): No need to shade it. I built a Spark client with this PR for testing, and it can be parsed normally. EDIT: ran TPCDSQueryBenchmark with the spark-client and checked the UI.

Member: Hmmmm, did you test with both Maven and SBT? Then why does the core module require shading?

Contributor (Author): Yes, tested with both Maven and SBT. Personally, I think the core module needs shade+relocation because we don't want the protobuf version that Spark depends on to affect other third-party projects, just like

In fact, because Spark already uses the unified protobuf version, the
New file (@@ -0,0 +1,18 @@), registering the serializer for service loading:

```
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

org.apache.spark.status.protobuf.sql.SQLExecutionUIDataSerializer
```
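The resource path of this file is not shown in the excerpt, but its format is that of a `java.util.ServiceLoader` registration (presumably a `META-INF/services` entry keyed by the `ProtobufSerDe` interface). A minimal sketch of how such a registration is typically discovered at runtime, assuming the `ProtobufSerDe` trait from this PR:

```scala
import java.util.ServiceLoader

import scala.collection.JavaConverters._

import org.apache.spark.status.protobuf.ProtobufSerDe

// Each non-comment line in the services file names an implementation class;
// ServiceLoader instantiates each via its no-arg constructor.
val serdes = ServiceLoader.load(classOf[ProtobufSerDe]).asScala.toSeq
serdes.foreach(s => println(s.supportClass.getName))
```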
New file (@@ -0,0 +1,90 @@), the serializer itself:

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.status.protobuf.sql

import java.util.Date

import collection.JavaConverters._

import org.apache.spark.JobExecutionStatus
import org.apache.spark.sql.execution.ui.SQLExecutionUIData
import org.apache.spark.status.protobuf.{ProtobufSerDe, StoreTypes}
import org.apache.spark.status.protobuf.Utils.getOptional

class SQLExecutionUIDataSerializer extends ProtobufSerDe {

  override val supportClass: Class[_] = classOf[SQLExecutionUIData]

  override def serialize(input: Any): Array[Byte] = {
    val ui = input.asInstanceOf[SQLExecutionUIData]
    val builder = StoreTypes.SQLExecutionUIData.newBuilder()
    builder.setExecutionId(ui.executionId)
    builder.setDescription(ui.description)
    builder.setDetails(ui.details)
    builder.setPhysicalPlanDescription(ui.physicalPlanDescription)
    ui.modifiedConfigs.foreach {
      case (k, v) => builder.putModifiedConfigs(k, v)
    }
    ui.metrics.foreach(m => builder.addMetrics(SQLPlanMetricSerializer.serialize(m)))
    builder.setSubmissionTime(ui.submissionTime)
    ui.completionTime.foreach(ct => builder.setCompletionTime(ct.getTime))
    ui.errorMessage.foreach(builder.setErrorMessage)
    ui.jobs.foreach {
      case (id, status) =>
        builder.putJobs(id.toLong, StoreTypes.JobExecutionStatus.valueOf(status.toString))
    }
    ui.stages.foreach(stageId => builder.addStages(stageId.toLong))
    val metricValues = ui.metricValues
    if (metricValues != null) {
```
Contributor (Author): Found a corner case.

Contributor (Author): See spark/sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala, lines 498 to 517 (at commit 5b1ff91). Or we can protect
```scala
      metricValues.foreach {
        case (k, v) => builder.putMetricValues(k, v)
      }
    }
    builder.build().toByteArray
  }

  override def deserialize(bytes: Array[Byte]): SQLExecutionUIData = {
    val ui = StoreTypes.SQLExecutionUIData.parseFrom(bytes)
    val completionTime =
      getOptional(ui.hasCompletionTime, () => new Date(ui.getCompletionTime))
    val errorMessage = getOptional(ui.hasErrorMessage, () => ui.getErrorMessage)
    val metrics =
      ui.getMetricsList.asScala.map(m => SQLPlanMetricSerializer.deserialize(m)).toSeq
    val jobs = ui.getJobsMap.asScala.map {
      case (jobId, status) => jobId.toInt -> JobExecutionStatus.valueOf(status.toString)
    }.toMap
    val metricValues = ui.getMetricValuesMap.asScala.map {
      case (k, v) => k.toLong -> v
    }.toMap

    new SQLExecutionUIData(
      executionId = ui.getExecutionId,
      description = ui.getDescription,
      details = ui.getDetails,
      physicalPlanDescription = ui.getPhysicalPlanDescription,
      modifiedConfigs = ui.getModifiedConfigsMap.asScala.toMap,
      metrics = metrics,
      submissionTime = ui.getSubmissionTime,
      completionTime = completionTime,
      errorMessage = errorMessage,
      jobs = jobs,
      stages = ui.getStagesList.asScala.map(_.toInt).toSet,
      metricValues = metricValues
    )
  }
}
```
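`Utils.getOptional` is imported above but its definition is not part of this excerpt. A minimal sketch of what such a helper presumably looks like, mapping a protobuf `hasX` presence check plus getter onto a Scala `Option` (hypothetical reconstruction, not this diff's actual code):

```scala
// Hypothetical reconstruction of the imported helper:
// wraps a protobuf presence flag and its getter into an Option.
def getOptional[T](isDefined: Boolean, getValue: () => T): Option[T] =
  if (isDefined) Some(getValue()) else None
```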
New file (@@ -0,0 +1,36 @@), the nested metric serializer:

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.status.protobuf.sql

import org.apache.spark.sql.execution.ui.SQLPlanMetric
import org.apache.spark.status.protobuf.StoreTypes

object SQLPlanMetricSerializer {

  def serialize(metric: SQLPlanMetric): StoreTypes.SQLPlanMetric = {
    StoreTypes.SQLPlanMetric.newBuilder()
      .setName(metric.name)
      .setAccumulatorId(metric.accumulatorId)
      .setMetricType(metric.metricType)
      .build()
  }

  def deserialize(metrics: StoreTypes.SQLPlanMetric): SQLPlanMetric = {
    SQLPlanMetric(metrics.getName, metrics.getAccumulatorId, metrics.getMetricType)
  }
}
```
New file (@@ -0,0 +1,88 @@), the round-trip test suite:

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.status.protobuf.sql

import org.apache.spark.SparkFunSuite
import org.apache.spark.sql.execution.ui.SQLExecutionUIData
import org.apache.spark.status.api.v1.sql.SqlResourceSuite
import org.apache.spark.status.protobuf.KVStoreProtobufSerializer

class KVStoreProtobufSerializerSuite extends SparkFunSuite {

  private val serializer = new KVStoreProtobufSerializer()

  test("SQLExecutionUIData") {
    val input = SqlResourceSuite.sqlExecutionUIData
    val bytes = serializer.serialize(input)
    val result = serializer.deserialize(bytes, classOf[SQLExecutionUIData])
    assert(result.executionId == input.executionId)
    assert(result.description == input.description)
    assert(result.details == input.details)
    assert(result.physicalPlanDescription == input.physicalPlanDescription)
    assert(result.modifiedConfigs == input.modifiedConfigs)
    assert(result.metrics == input.metrics)
    assert(result.submissionTime == input.submissionTime)
    assert(result.completionTime == input.completionTime)
    assert(result.errorMessage == input.errorMessage)
    assert(result.jobs == input.jobs)
    assert(result.stages == input.stages)
    assert(result.metricValues == input.metricValues)
  }

  test("SQLExecutionUIData with metricValues is empty map and null") {
    val templateData = SqlResourceSuite.sqlExecutionUIData

    val input1 = new SQLExecutionUIData(
      executionId = templateData.executionId,
      description = templateData.description,
      details = templateData.details,
      physicalPlanDescription = templateData.physicalPlanDescription,
      modifiedConfigs = templateData.modifiedConfigs,
      metrics = templateData.metrics,
      submissionTime = templateData.submissionTime,
      completionTime = templateData.completionTime,
      errorMessage = templateData.errorMessage,
      jobs = templateData.jobs,
      stages = templateData.stages,
      metricValues = Map.empty
    )
    val bytes1 = serializer.serialize(input1)
    val result1 = serializer.deserialize(bytes1, classOf[SQLExecutionUIData])
    // input.metricValues is an empty map; result.metricValues is an empty map.
    assert(result1.metricValues.isEmpty)

    val input2 = new SQLExecutionUIData(
      executionId = templateData.executionId,
      description = templateData.description,
      details = templateData.details,
      physicalPlanDescription = templateData.physicalPlanDescription,
      modifiedConfigs = templateData.modifiedConfigs,
      metrics = templateData.metrics,
      submissionTime = templateData.submissionTime,
      completionTime = templateData.completionTime,
      errorMessage = templateData.errorMessage,
      jobs = templateData.jobs,
      stages = templateData.stages,
      metricValues = null
    )
    val bytes2 = serializer.serialize(input2)
    val result2 = serializer.deserialize(bytes2, classOf[SQLExecutionUIData])
    // input.metricValues is null; result.metricValues is also an empty map.
```
Contributor (Author): @gengliangwang For a collection-type field, when the input is null it comes back from deserialization as an empty collection, which may be different from the JSON serializer's behavior.

Member: Returning an empty collection seems safer. It won't affect the UI page anyway. Thanks for noticing it.

Contributor (Author): OK
```scala
    assert(result2.metricValues.isEmpty)
  }
}
```
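The null-to-empty round trip verified above follows from protobuf semantics: proto3 map and repeated fields have no null state, so a field that was never populated parses back as an empty collection. A minimal sketch against this PR's generated `StoreTypes` message, assuming its standard protobuf builder/parse API:

```scala
// Build a message without ever touching the map field...
val bytes = StoreTypes.SQLExecutionUIData.newBuilder().build().toByteArray

// ...and the parsed message still exposes an empty, never-null map, so a
// null Scala input can only round-trip to an empty collection.
val parsed = StoreTypes.SQLExecutionUIData.parseFrom(bytes)
assert(parsed.getMetricValuesMap.isEmpty)
```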
Should we define it in the `core` module? Maybe it's better to put it in the `sql` module?

It is OK to define it in the `core` module; there is a "SQL" concept in the core module anyway. Can we skip shading protobuf in the SQL module in this way?

If we define all proto messages in the core module, I think we can skip shading protobuf in the SQL module; the generated message code can find the corresponding shaded+relocated dependencies in the core module.