[SPARK-3454] separate json endpoints for data in the UI #4435
```diff
@@ -214,6 +214,11 @@
       <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
       <version>3.2.10</version>
     </dependency>
+    <dependency>
+      <groupId>com.fasterxml.jackson.module</groupId>
+      <artifactId>jackson-module-scala_2.10</artifactId>
+      <version>2.3.1</version>
```
**Contributor:** this seems like a common dependency for our users. How well does jackson work against different versions?

**Contributor:** this is rxin: ah, I see we are already adding it. The question stands for @pwendell / @andrewor14: should we shade this? Is it robust enough to not shade?

**Contributor (Author):** Not exactly what you're asking, but this module doesn't always play nicely with different versions of jackson itself. So if jackson somehow gets bumped to a later version because of some transitive dependency, this often needs to get bumped along with it.

**Contributor:** v2.4.4 is already included in master, so if you rebase you won't have to include it again.

**Contributor:** What I was asking @pwendell / @andrewor14 was the probability that this breaks user applications (or user applications break Spark) due to version conflicts for both Jackson and the Scala module.

**Contributor:** @squito just a reminder that you should update this version to
```diff
+    </dependency>
     <dependency>
       <groupId>org.apache.mesos</groupId>
       <artifactId>mesos</artifactId>
```
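For context on the version-coupling concern in the thread above: jackson-module-scala is what teaches Jackson to serialize Scala collections and case classes, and it compiles against jackson-databind internals, which is why its version has to move in lockstep with Jackson's. A minimal sketch of its use (illustration only, not code from this PR):

```scala
// Sketch only: shows what jackson-module-scala provides. If the module's
// version drifts from jackson-databind's, registerModule can fail at
// runtime with linkage errors, which is the review concern above.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object JacksonScalaSketch {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // Without this module, a Scala Map serializes as an empty object.
    mapper.registerModule(DefaultScalaModule)
    println(mapper.writeValueAsString(Map("appId" -> "app-001")))
  }
}
```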
New file (`@@ -0,0 +1,23 @@`):

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.spark.status.api;

public enum StageStatus {
  Active,
  Complete,
  Failed,
  Pending
}
```
New file (`@@ -0,0 +1,56 @@`):

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.spark.deploy.history

import javax.servlet.http.HttpServletRequest

import org.apache.spark.status.{UIRoot, StatusJsonRoute}
import org.apache.spark.status.api.ApplicationInfo
import org.apache.spark.deploy.master.{ApplicationInfo => InternalApplicationInfo}

class AllApplicationsJsonRoute(val uiRoot: UIRoot) extends StatusJsonRoute[Seq[ApplicationInfo]] {

  override def renderJson(request: HttpServletRequest): Seq[ApplicationInfo] = {
    // TODO filter on some query params, e.g. completed, minStartTime, etc.
    uiRoot.getApplicationInfoList
  }

}

object AllApplicationsJsonRoute {
  def appHistoryInfoToPublicAppInfo(app: ApplicationHistoryInfo): ApplicationInfo = {
    ApplicationInfo(
      id = app.id,
      name = app.name,
      startTime = app.startTime,
      endTime = app.endTime,
      sparkUser = app.sparkUser,
      completed = app.completed
    )
  }

  def convertApplicationInfo(
      internal: InternalApplicationInfo,
      completed: Boolean): ApplicationInfo = {
    ApplicationInfo(
      id = internal.id,
      name = internal.desc.name,
      startTime = internal.startTime,
      endTime = internal.endTime,
      sparkUser = internal.desc.user,
      completed = completed
    )
  }
}
```
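The TODO in `renderJson` above mentions filtering on query params such as `completed` and `minStartTime`. A minimal sketch of what that filter could look like; the parameter names and the `AppRecord` stand-in type are assumptions, not code from this PR:

```scala
// Hypothetical sketch of the query-param filtering the TODO describes.
// AppRecord is a simplified stand-in for the public ApplicationInfo.
case class AppRecord(id: String, startTime: Long, completed: Boolean)

def filterApps(
    apps: Seq[AppRecord],
    completed: Option[Boolean],
    minStartTime: Option[Long]): Seq[AppRecord] = {
  apps.filter { app =>
    // An unset param (None) matches everything: Option.forall(None) is true.
    completed.forall(_ == app.completed) &&
    minStartTime.forall(app.startTime >= _)
  }
}
```

Each param being an `Option` keeps "not supplied" distinct from any concrete value, so unfiltered requests still return the full listing.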
```diff
@@ -122,7 +122,10 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
     }
   }

-  override def getListing() = applications.values
+  override def getListing(refresh: Boolean) = {
+    if (refresh) checkForLogs()
```
**Contributor:** nit: We only use the shorthand when using
```diff
+    applications.values
+  }

   override def getAppUI(appId: String): Option[SparkUI] = {
     try {
```
```diff
@@ -34,7 +34,7 @@ private[spark] class HistoryPage(parent: HistoryServer) extends WebUIPage("") {
     val requestedIncomplete =
       Option(request.getParameter("showIncomplete")).getOrElse("false").toBoolean

-    val allApps = parent.getApplicationList().filter(_.completed != requestedIncomplete)
+    val allApps = parent.getApplicationList(true).filter(_.completed != requestedIncomplete)
     val actualFirst = if (requestedFirst < allApps.size) requestedFirst else 0
     val apps = allApps.slice(actualFirst, Math.min(actualFirst + pageSize, allApps.size))

@@ -67,7 +67,7 @@ private[spark] class HistoryPage(parent: HistoryServer) extends WebUIPage("") {

       <h4>
         Showing {actualFirst + 1}-{last + 1} of {allApps.size}
-        {if (requestedIncomplete) "(Incomplete applications)"}
+        ({if (requestedIncomplete) "Incomplete" else "Complete"} applications)
         <span style="float: right">
           {
             if (actualPage > 1) {

@@ -90,7 +90,7 @@ private[spark] class HistoryPage(parent: HistoryServer) extends WebUIPage("") {
       </h4> ++
       appTable
     } else {
-      <h4>No completed applications found!</h4> ++
+      <h4>No {if (requestedIncomplete) "running" else "completed"} applications found!</h4> ++
       <p>Did you specify the correct logging directory?
         Please verify your setting of <span style="font-style:italic">
         spark.history.fs.logDirectory</span> and whether you have the permissions to
```
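The pagination arithmetic in the first hunk above (fall back to page one when `requestedFirst` runs past the end, and clamp the slice to the list size) can be sketched standalone; the helper name is mine, not the PR's:

```scala
// Standalone sketch of the HistoryPage paging logic above.
object PagingSketch {
  def page[T](allApps: Seq[T], requestedFirst: Int, pageSize: Int): Seq[T] = {
    // Out-of-range start index falls back to the first page.
    val actualFirst = if (requestedFirst < allApps.size) requestedFirst else 0
    // slice's upper bound is clamped so the last page can be short.
    allApps.slice(actualFirst, math.min(actualFirst + pageSize, allApps.size))
  }
}
```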
```diff
@@ -21,6 +21,9 @@ import java.util.NoSuchElementException
 import javax.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}

 import com.google.common.cache._
+import org.apache.spark.deploy.master.ui.MasterApplicationJsonRoute
+import org.apache.spark.status.api.ApplicationInfo
+import org.apache.spark.status.{UIRoot, JsonRequestHandler}
 import org.eclipse.jetty.servlet.{ServletContextHandler, ServletHolder}

 import org.apache.spark.{Logging, SecurityManager, SparkConf}

@@ -45,7 +48,7 @@ class HistoryServer(
     provider: ApplicationHistoryProvider,
     securityManager: SecurityManager,
     port: Int)
-  extends WebUI(securityManager, port, conf) with Logging {
+  extends WebUI(securityManager, port, conf) with Logging with UIRoot {

   // How many applications to retain
   private val retainedApplications = conf.getInt("spark.history.retainedApplications", 50)

@@ -71,6 +74,7 @@ class HistoryServer(
   protected override def doGet(req: HttpServletRequest, res: HttpServletResponse): Unit = {
     val parts = Option(req.getPathInfo()).getOrElse("").split("/")
     if (parts.length < 2) {
+      logError("bad path info!")
```
**Contributor:** log the path too?
```diff
       res.sendError(HttpServletResponse.SC_BAD_REQUEST,
         s"Unexpected path info in request (URI = ${req.getRequestURI()}")
       return

@@ -98,6 +102,10 @@ class HistoryServer(
     }
   }

+  def getSparkUI(appKey: String): Option[SparkUI] = {
+    Option(appCache.get(appKey))
+  }
+
   initialize()

   /**

@@ -108,6 +116,11 @@ class HistoryServer(
    */
   def initialize() {
     attachPage(new HistoryPage(this))
+
```
**Contributor:** nit: delete
```diff
+    val jsonHandler = new JsonRequestHandler(this, securityManager)
+    attachHandler(jsonHandler.jsonContextHandler)
+
     attachHandler(createStaticHandler(SparkUI.STATIC_RESOURCE_DIR, "/static"))

     val contextHandler = new ServletContextHandler

@@ -145,7 +158,11 @@ class HistoryServer(
    *
    * @return List of all known applications.
    */
-  def getApplicationList() = provider.getListing()
+  def getApplicationList(refresh: Boolean) = provider.getListing(refresh)
+
+  def getApplicationInfoList: Seq[ApplicationInfo] = {
+    getApplicationList(true).map{AllApplicationsJsonRoute.appHistoryInfoToPublicAppInfo}.toSeq
+  }

   /**
    * Returns the provider configuration to show in the listing page.
```
New file (`@@ -0,0 +1,34 @@`):

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.spark.deploy.history

import javax.servlet.http.HttpServletRequest

import org.apache.spark.status.{UIRoot, JsonRequestHandler, StatusJsonRoute}
import org.apache.spark.status.api.ApplicationInfo

class OneApplicationJsonRoute(val uiRoot: UIRoot) extends StatusJsonRoute[ApplicationInfo] {
  override def renderJson(request: HttpServletRequest): ApplicationInfo = {
    val appIdOpt = JsonRequestHandler.extractAppId(request.getPathInfo)
    appIdOpt.map { appId =>
      val apps = uiRoot.getApplicationInfoList.find(_.id == appId)
      apps.getOrElse(throw new IllegalArgumentException("unknown app: " + appId))
    }.getOrElse {
      throw new IllegalArgumentException("no application id specified")
    }
  }
}
```
New file (`@@ -0,0 +1,44 @@`):

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.spark.deploy.master.ui

import javax.servlet.http.HttpServletRequest

import akka.pattern.ask

import org.apache.spark.deploy.DeployMessages.{RequestMasterState, MasterStateResponse}
import org.apache.spark.status.StatusJsonRoute
import org.apache.spark.status.api.ApplicationInfo

import scala.concurrent.Await

class MasterApplicationJsonRoute(val parent: MasterWebUI) extends StatusJsonRoute[ApplicationInfo] {
  private val master = parent.masterActorRef
  private val timeout = parent.timeout

  override def renderJson(request: HttpServletRequest): ApplicationInfo = {
    // TODO not really the app id
    val appId = request.getPathInfo()
    println("pathInfo = " + request.getPathInfo())
    val stateFuture = (master ? RequestMasterState)(timeout).mapTo[MasterStateResponse]
    val state = Await.result(stateFuture, timeout)
    state.activeApps.find(_.id == appId).orElse {
      state.completedApps.find(_.id == appId)
    }.map(MasterJsonRoute.masterAppInfoToPublicAppInfo).getOrElse(null)
  }
}
```
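One observation on the file above: on a miss this route returns `null`, whereas `OneApplicationJsonRoute` throws `IllegalArgumentException`. A hedged sketch of the same active-then-completed lookup with the throwing behavior instead, using simplified stand-in types (`AppRecord` and `findApp` are mine, not the PR's):

```scala
// Sketch: the lookup pattern from MasterApplicationJsonRoute above,
// but failing loudly on an unknown id instead of returning null.
case class AppRecord(id: String)

def findApp(active: Seq[AppRecord], completed: Seq[AppRecord], appId: String): AppRecord = {
  active.find(_.id == appId)
    .orElse(completed.find(_.id == appId))  // fall back to completed apps
    .getOrElse(throw new IllegalArgumentException("unknown app: " + appId))
}
```

Chaining `find` with `orElse` keeps the two-phase search in one expression, and the final `getOrElse` makes the failure mode explicit rather than propagating `null` to the JSON layer.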
**Contributor:** is this OK? Or should the Apache header go into the test resource files as well?

**Contributor:** the header shouldn't go into those files.