11 changes: 9 additions & 2 deletions core/src/main/scala/org/apache/spark/status/AppStatusStore.scala
@@ -22,7 +22,7 @@ import java.util.{List => JList}
import scala.collection.JavaConverters._
import scala.collection.mutable.HashMap

-import org.apache.spark.{JobExecutionStatus, SparkConf}
+import org.apache.spark.{JobExecutionStatus, SparkConf, SparkException}
import org.apache.spark.status.api.v1
import org.apache.spark.ui.scope._
import org.apache.spark.util.Utils
@@ -36,7 +36,14 @@ private[spark] class AppStatusStore(
val listener: Option[AppStatusListener] = None) {

def applicationInfo(): v1.ApplicationInfo = {
Contributor

Instead of making this function blocking, how about capturing the exception outside and displaying a proper message?
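
A minimal sketch of what "capturing the exception outside" could look like at a call site (the renderAppSummary helper and its fallback message are hypothetical, for illustration only, not code from this PR):

def renderAppSummary(statusStore: AppStatusStore): String = {
  try {
    val info = statusStore.applicationInfo()
    s"Application: ${info.name} (${info.id})"
  } catch {
    case _: NoSuchElementException =>
      // Hypothetical fallback: show a friendly message instead of failing
      // while the application is still starting up.
      "Application information is not available yet."
  }
}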

Member Author

If blocking is considered harmful, I'd prefer an optional result to throwing an exception here. Due to the previous design and implementation, though, all callers of this method expect a valid ApplicationInfo, which means they currently cannot handle a None value or an exception. If we changed the return type to Option, the logic of every caller (and possibly of their callers in turn) would have to be adjusted, which would open up even more design choices.

Adding the retry logic solves the problem at the root. That said, I'm happy to take a different approach if we reach a consensus.
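
For comparison, a hypothetical Option-returning variant of the method might look like this (a sketch only, not part of this PR; it assumes the store field and the JavaConverters import already present in AppStatusStore):

def applicationInfoOption(): Option[v1.ApplicationInfo] = {
  // headOption avoids the NoSuchElementException that iterator().next()
  // throws on an empty view, but pushes the None case onto every caller.
  store.view(classOf[ApplicationInfoWrapper]).max(1).asScala.headOption.map(_.info)
}

Every call site would then have to handle the None case explicitly (and possibly propagate that decision upward), which is the ripple effect described above.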

Member

I think capturing the exception outside is actually a good compromise, considering the issue is a corner case. Let's keep the scope narrow here since the issue is rather minor, and consider a better fix with a standard approach if a bigger design issue is found.

Member

Also sounds reasonable to me.

Member Author

Thanks for your feedback, @HyukjinKwon. However, I didn't quite get your point (sorry about that). Did you mean changing the implementation to capture the exception outside, or should we just keep the current approach?

Member

Sorry, I wasn't clear. I was thinking of something like the following:

-    store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
+    try {
+      store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
+    } catch {
+      case _: NoSuchElementException =>
+        throw new NoSuchElementException(
+          "Application information was not found. If your Spark application is " +
+            "starting up, try again when your Spark application is ready.")
+    }

Member Author

Thanks for the explanation. I'll update the PR as suggested.

-    store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
+    try {
+      // The ApplicationInfo may not be available when Spark is starting up.
+      store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
+    } catch {
+      case _: NoSuchElementException =>
+        throw new SparkException("Failed to get the application information. " +
+          "If you are starting up Spark, please wait a while until it's ready.")
+    }
}

def environmentInfo(): v1.ApplicationEnvironmentInfo = {
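
For reference, applicationInfo() in AppStatusStore reads as follows once the diff above is applied:

def applicationInfo(): v1.ApplicationInfo = {
  try {
    // The ApplicationInfo may not be available when Spark is starting up.
    store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
  } catch {
    case _: NoSuchElementException =>
      throw new SparkException("Failed to get the application information. " +
        "If you are starting up Spark, please wait a while until it's ready.")
  }
}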