
Conversation

@nblintao (Contributor) commented Aug 2, 2016

What changes were proposed in this pull request?

Added a "Back to Master" link to the Executor Page of the Application Detail UI when the application runs in standalone mode.

The link points to the main page of the Master UI.

How was this patch tested?

Ran Spark as a "local cluster" with ./bin/spark-shell --master=local-cluster[2,1,1024].

Then viewed the Executor Page of the Application Detail UI (http://192.168.1.104:4040/executors/). Users can go to the Master page via the "Back to Master" link on that page.

[screenshot: detailpagetomaster]
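For context, a minimal Scala sketch of the idea (hypothetical names, not the actual Spark ExecutorsPage code): render a "Back to Master" link only when a master web UI URL is known, which in practice means standalone mode.

```scala
// Hedged sketch: the real Spark page builds scala.xml nodes; a plain string
// is used here to keep the example self-contained.
object BackToMasterSketch {
  // masterWebUiUrl would be Some(...) only when the app registered with a
  // standalone Master; None on YARN/Mesos/local.
  def backToMasterLink(masterWebUiUrl: Option[String]): String =
    masterWebUiUrl
      .map(url => s"""<a href="$url">Back to Master</a>""")
      .getOrElse("") // no link when there is no standalone Master UI

  def main(args: Array[String]): Unit = {
    println(backToMasterLink(Some("http://192.168.1.104:8080")))
    println(backToMasterLink(None).isEmpty)
  }
}
```

Keying the link off an `Option` means YARN and Mesos deployments render nothing rather than a dead link.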

@SparkQA commented Aug 2, 2016

Test build #63129 has finished for PR 14461 at commit f35afa2.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • case class RegisteredApplication(appId: String, master: RpcEndpointRef,

@ajbozarth (Member)

I'll take some time to check this out and run it later today, but on a quick pass the code looks OK. I have some issues with the actual PR, though. The link on the application summary page is redundant: clicking the Spark logo already goes back to the master summary. As for the Executors page, I'll have to test it, but I think there are quite a few situations where an admin might not want a user to access the master UI, even in a standalone cluster.

@ajbozarth (Member)

I forgot to comment, but I did check this out and it runs well. I still think the link on the app summary is redundant, though.

@nblintao nblintao changed the title [SPARK-16856] [WEBUI] [CORE] Link application summary page and detail page to the master page [SPARK-16856] [WEBUI] [CORE] Link the application's executor page to the master's UI Aug 6, 2016
@SparkQA commented Aug 6, 2016

Test build #63302 has finished for PR 14461 at commit 0192b37.

  • This patch fails Scala style tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

# Conflicts:
#	project/MimaExcludes.scala
@nblintao (Contributor, Author) commented Aug 6, 2016

@ajbozarth Thanks for pointing that out. I have removed the link on the application summary page and reverted the related changes.

@nblintao (Contributor, Author) commented Aug 6, 2016

retest it please

@SparkQA commented Aug 6, 2016

Test build #63303 has finished for PR 14461 at commit 4564bc5.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • public class ShuffleIndexInformation
    • public class ShuffleIndexRecord
    • case class Least(children: Seq[Expression]) extends Expression
    • case class Greatest(children: Seq[Expression]) extends Expression
    • case class CreateTable(tableDesc: CatalogTable, mode: SaveMode, query: Option[LogicalPlan])
    • case class PreprocessDDL(conf: SQLConf) extends Rule[LogicalPlan]

@SparkQA commented Aug 6, 2016

Test build #63306 has finished for PR 14461 at commit e3707db.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@nblintao (Contributor, Author) commented Aug 6, 2016

[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/project/MimaExcludes.scala:50: ')' expected but '.' found.
[error]       ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerApplicationStart.apply")
[error]                     ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/project/MimaExcludes.scala:51: ';' expected but '.' found.
[error]       ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerApplicationStart.copy")
[error]                     ^

I think I have fixed the conflict correctly. Why is this happening?

@ajbozarth (Member)

The excludes are a list; you forgot to add commas between them.
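To illustrate (with stand-in code, not the real MiMa ProblemFilters DSL): the excludes in MimaExcludes.scala are elements of one Scala Seq, so consecutive entries need a separating comma; omitting it produces exactly the "')' expected" / "';' expected" errors quoted above.

```scala
// Stand-in for ProblemFilters.exclude[DirectMissingMethodProblem](...) —
// the point is only the comma between consecutive Seq elements.
object MimaExcludesSketch {
  def exclude(name: String): String = s"exclude($name)"

  val excludes: Seq[String] = Seq(
    exclude("org.apache.spark.scheduler.SparkListenerApplicationStart.apply"), // comma required here
    exclude("org.apache.spark.scheduler.SparkListenerApplicationStart.copy")   // last element, no comma needed
  )

  def main(args: Array[String]): Unit =
    assert(excludes.size == 2)
}
```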

@nblintao (Contributor, Author) commented Aug 6, 2016

Oh yes. Thanks :)

@SparkQA commented Aug 6, 2016

Test build #63308 has finished for PR 14461 at commit 76e68eb.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@ajbozarth (Member)

@nblintao I'd like to take another look at this, mind rebasing?

@nblintao (Contributor, Author)

@ajbozarth I finally have a chance to rebase it over the winter break. Could you please have a look? Thanks!

@SparkQA commented Dec 29, 2016

Test build #70698 has finished for PR 14461 at commit d598e55.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@ajbozarth (Member)

Thanks for rebasing, I'll check this out sometime this week

@ajbozarth (Member)

@nblintao did you try this code on different cluster types, like local, standalone, YARN, and Mesos?

@ajbozarth (Member)

Also, @srowen and @vanzin, could you take a look at this one too?

@ajbozarth (Member)

I did some testing: the link properly shows up on local and standalone, and doesn't show up on YARN. So this LGTM.

      appAttemptId: Option[String],
-     driverLogs: Option[Map[String, String]] = None) extends SparkListenerEvent
+     driverLogs: Option[Map[String, String]] = None,
+     masterWebUiUrl: Option[String] = None) extends SparkListenerEvent
Review comment from a Contributor:

Do you need this? It seems the standalone registration messages already take care of this.

If you want this, are you intentionally not saving this in event logs? The way it is, this is only ever useful when the application is starting but hasn't yet registered with the master. That's a very short period of time and probably not worth it.

If you add code to save to event logs, I'd rename this to be more cluster manager-agnostic. e.g. "clusterWebUiUrl".
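A hedged sketch of the pattern under discussion (a simplified, hypothetical event class, not the actual SparkListenerApplicationStart): appending an optional field with a None default keeps existing call sites source-compatible, but still changes the generated apply and copy methods, which is why the MiMa excludes were needed.

```scala
// Simplified stand-in for a SparkListenerEvent-style case class.
case class AppStartEvent(
    appName: String,
    driverLogs: Option[Map[String, String]] = None,
    masterWebUiUrl: Option[String] = None)

object AppStartSketch {
  def main(args: Array[String]): Unit = {
    // Old call sites that never mention masterWebUiUrl still compile.
    val e = AppStartEvent("myApp")
    assert(e.masterWebUiUrl.isEmpty)

    // New code can supply the master's web UI URL explicitly.
    val e2 = e.copy(masterWebUiUrl = Some("http://master:8080"))
    assert(e2.masterWebUiUrl.contains("http://master:8080"))
  }
}
```

Source compatibility is not binary compatibility: the synthesized `apply`/`copy` signatures change, so binary-compatibility checkers like MiMa still flag the addition.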

private val listener = parent.listener

def render(request: HttpServletRequest): Seq[Node] = {
  val masterWebUiUrl = listener.masterWebUiUrl
Review comment from a Contributor:

I think this information makes way more sense in the "Environment" tab.

@HyukjinKwon (Member)

(gentle ping @nblintao)

@nblintao (Contributor, Author) commented Feb 9, 2017

@HyukjinKwon Thanks for the reminder. I will have a look ASAP; too many things to do recently.

@HyukjinKwon (Member)

(gentle ping @nblintao)

@nblintao (Contributor, Author)

@HyukjinKwon Thanks for the reminder. I have finals this week; I will fix this after the 15th.

@asfgit asfgit closed this in b771fed Jun 8, 2017
zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025
# What changes were proposed in this pull request?

This PR proposes to close stale PRs, mostly the same instances with apache#18017

Closes apache#11459
Closes apache#13833
Closes apache#13720
Closes apache#12506
Closes apache#12456
Closes apache#12252
Closes apache#17689
Closes apache#17791
Closes apache#18163
Closes apache#17640
Closes apache#17926
Closes apache#18163
Closes apache#12506
Closes apache#18044
Closes apache#14036
Closes apache#15831
Closes apache#14461
Closes apache#17638
Closes apache#18222

Added:
Closes apache#18045
Closes apache#18061
Closes apache#18010
Closes apache#18041
Closes apache#18124
Closes apache#18130
Closes apache#12217

Added:
Closes apache#16291
Closes apache#17480
Closes apache#14995

Added:
Closes apache#12835
Closes apache#17141

## How was this patch tested?

N/A

Author: hyukjinkwon <[email protected]>

Closes apache#18223 from HyukjinKwon/close-stale-prs.