[SPARK-20428][Core]REST interface about 'v1/submissions/ kill/', currently only supports delete a single ‘driver’, now let spark support delete some ‘drivers’ #17714
Conversation
Commits:
- …ucceeded|failed|unknown]
- …remove redundant description.
- …, currently only supports delete a single ‘driver’, now let spark support delete some ‘drivers’
@HyukjinKwon Could you help with the code review? Thank you.
I guess I am not used to this code path. Git blame says @tnachen changed this code recently.
@tnachen Could you help with the code review? Thank you.
```diff
     response: HttpServletResponse): Unit = {
-    val submissionId = parseSubmissionId(request.getPathInfo)
-    val responseMessage = submissionId.map(handleKill).getOrElse {
+    val submissionIds = parseSubmissionId(request.getPathInfo)
```
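The renamed `submissionIds` suggests the handler now treats the trailing path segment as a comma-separated list. A minimal Python sketch of that parsing idea (the helper name and exact behavior are assumptions for illustration, not Spark's actual implementation):

```python
def parse_submission_ids(path_info):
    """Split the trailing path segment of a kill request into submission IDs.

    Hypothetical helper mirroring the PR's idea: the path after
    /v1/submissions/kill/ may hold several IDs separated by commas.
    """
    if not path_info:
        return []
    # Strip the leading "/" and split on commas, dropping empty entries.
    segment = path_info.lstrip("/")
    return [s for s in segment.split(",") if s]

ids = parse_submission_ids("/driver-20170421111514-0000,driver-20170421111515-0001")
print(ids)
```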
I don't think having submission IDs parsed from the request path is a good idea.
I would assume most use cases for batch delete arise when you have a large number of drivers to delete (otherwise you would be fine just deleting a few one by one).
But most URLs are length-limited.
You might be better off creating a new request for this that takes a body.
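A body-based batch endpoint, as suggested here, might accept a JSON payload instead of path segments. A hypothetical sketch of such a payload (Spark's REST API does not define this endpoint; the `action` and field names are invented for illustration):

```python
import json

# Hypothetical request body for a batch-kill endpoint, avoiding URL length
# limits. Spark's REST API does not actually define this action.
payload = {
    "action": "KillSubmissionsRequest",
    "submissionIds": [
        "driver-20170421111514-0000",
        "driver-20170421111515-0001",
    ],
}
body = json.dumps(payload)
print(body)
```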
@tnachen Batch delete e.g., Single delete e.g.,
Hello, this PR has been open for some time now. Please help review the code, thanks.
Unfortunately I'm not a committer, so I need to loop in someone who is to help merge it. @srowen do you know who's responsible for the general deploy package?
@srowen
@SparkQA please test it.
ok to test
One general question: what do we expect to gain from this improvement? It seems it's just looping over the drivers and sending kill requests?
@jiangxb1987
Let me be explicit - I don't think this improvement is really needed in Spark, as long as it's just looping over the drivers and sending blocking KillDriver messages, since we gain nothing in performance this way. If you have a huge cluster with many drivers dying simultaneously (which, in my mind, should be a really extreme case), then it's fine to write a script that calls the API from outside of Spark.
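The external-script approach suggested here amounts to looping over the existing single-kill endpoint. A small sketch under that assumption (the master URL and IDs are placeholders; the path follows the `v1/submissions/kill/<id>` form shown in the PR description):

```python
import urllib.request

MASTER = "http://zdh120:6066"  # placeholder master REST URL from the PR description

def kill_urls(master, submission_ids):
    """Build one single-kill URL per driver, using the existing REST endpoint."""
    return [f"{master}/v1/submissions/kill/{sid}" for sid in submission_ids]

def kill_all(master, submission_ids):
    """Send one blocking POST per driver - exactly the looping the reviewer
    notes Spark would be doing internally anyway."""
    for url in kill_urls(master, submission_ids):
        req = urllib.request.Request(url, data=b"", method="POST")
        urllib.request.urlopen(req)  # network call; requires a live master

urls = kill_urls(MASTER, ["driver-20170421111514-0000", "driver-20170421111515-0001"])
print(urls)
```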
Really appreciate your contribution! Sorry, based on the comments above, we might need to close this PR, but please submit more PRs in the future. Thanks again!
## What changes were proposed in this pull request?

This PR proposes to close stale PRs, mostly the same instances as apache#18017. I believe the author in apache#14807 removed his account.

Closes apache#7075
Closes apache#8927
Closes apache#9202
Closes apache#9366
Closes apache#10861
Closes apache#11420
Closes apache#12356
Closes apache#13028
Closes apache#13506
Closes apache#14191
Closes apache#14198
Closes apache#14330
Closes apache#14807
Closes apache#15839
Closes apache#16225
Closes apache#16685
Closes apache#16692
Closes apache#16995
Closes apache#17181
Closes apache#17211
Closes apache#17235
Closes apache#17237
Closes apache#17248
Closes apache#17341
Closes apache#17708
Closes apache#17716
Closes apache#17721
Closes apache#17937

Added:
Closes apache#14739
Closes apache#17139
Closes apache#17445
Closes apache#18042
Closes apache#18359

Added:
Closes apache#16450
Closes apache#16525
Closes apache#17738

Added:
Closes apache#16458
Closes apache#16508
Closes apache#17714

Added:
Closes apache#17830
Closes apache#14742

## How was this patch tested?

N/A

Author: hyukjinkwon <[email protected]>

Closes apache#18417 from HyukjinKwon/close-stale-pr.
What changes were proposed in this pull request?
Make a POST request to the REST interface:
http://ip:6066/v1/submissions/kill/driver-20170421111514-0000
Currently this only supports deleting a single ‘driver’. But our large data management platform would like to delete several ‘drivers’ in one request, because these drivers may be in an abnormal state.
This PR lets Spark delete several ‘drivers’ at once.
With a POST request such as:
http://zdh120:6066/v1/submissions/kill/**driver-20170421111514-0000,driver-20170421111515-0001,driver-20170421111517-0002,driver-20170421111517-0003**
The ‘submissionId’ values must be separated by commas. Through this interface, four drivers can be deleted together in one request.
How was this patch tested?
manual tests
Please review http://spark.apache.org/contributing.html before opening a pull request.