docs/monitoring.md: 73 additions & 0 deletions
@@ -174,6 +174,79 @@ making it easy to identify slow tasks, data skew, etc.
Note that the history server only displays completed Spark jobs. One way to signal the completion of a Spark job is to stop the Spark Context explicitly (`sc.stop()`); in Python you can instead use `with SparkContext() as sc:` to handle the Spark Context setup and teardown, and the job history will still show up in the UI.
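As a minimal sketch of the second approach (the application name and the tiny computation below are hypothetical, not part of the official docs), the context-manager form stops the context automatically when the block exits:

```python
from pyspark import SparkContext

# The context manager calls sc.stop() when the block exits, so the
# application is recorded as completed and appears in the history server.
with SparkContext(appName="history-server-example") as sc:
    counts = sc.parallelize(["spark", "history", "spark"]).countByValue()
    print(dict(counts))
```
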
## REST API

In addition to viewing the metrics in the UI, they are also available as JSON. This gives developers
an easy way to create new visualizations and monitoring tools for Spark. The JSON is available for
both running applications and in the history server. The endpoints are mounted at `/json/v1`. E.g.,
for the history server, they would typically be accessible at `http://<server_url>:18080/json/v1`.
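As a rough sketch of consuming these endpoints (the host/port, the use of Python's standard library, and the `id` field in the response are assumptions here, and the exact response fields may vary by Spark version), they can be queried like any JSON-over-HTTP service, using the endpoints listed in the table below:

```python
import json
from urllib.request import urlopen

# Assumed history server address; adjust to your deployment.
base_url = "http://localhost:18080/json/v1"

# List all applications known to the history server.
apps = json.load(urlopen(base_url + "/applications"))

# Fetch the jobs for the first application returned, if any.
if apps:
    app_id = apps[0]["id"]
    jobs = json.load(urlopen("%s/applications/%s/jobs" % (base_url, app_id)))
    print(json.dumps(jobs, indent=2))
```
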
<table class="table">
  <tr><th>Endpoint</th><th>Meaning</th></tr>
  <tr>
    <td>`/applications`</td>
    <td>A list of all applications</td>
  </tr>
  <tr>
    <td>`/applications/<app_id>/jobs`</td>
    <td>A list of all jobs for a given application</td>
  </tr>
  <tr>
    <td>`/applications/<app_id>/jobs/<job_id>`</td>
    <td>Details for one job</td>
  </tr>
  <tr>
    <td>`/applications/<app_id>/stages`</td>
    <td>A list of all stages for a given application</td>