
Conversation

@guoxiaolongzte commented Apr 21, 2017

What changes were proposed in this pull request?

The kill REST interface currently accepts a POST request of the form:
http://ip:6066/v1/submissions/kill/driver-20170421111514-0000
and only supports killing a single driver per request. On our large data management platform we would like to kill several drivers in one request, because those drivers may be in an abnormal state.

This PR lets Spark kill several drivers at once, with a POST request such as:
http://zdh120:6066/v1/submissions/kill/driver-20170421111514-0000,driver-20170421111515-0001,driver-20170421111517-0002,driver-20170421111517-0003

The submission IDs must be separated by commas. Through this interface, the four drivers above can be killed together in a single request.
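For illustration only, a minimal sketch (not the actual patch) of how the kill servlet could handle such a comma-separated path segment. It is meant to sit inside KillRequestServlet, where parseSubmissionId and handleKill already exist; the method name handleBatchKill is a hypothetical name used here:

// Sketch only: parse the path segment, split it on commas, and kill each driver.
// parseSubmissionId and handleKill are the servlet's existing helpers;
// handleBatchKill is hypothetical and not part of the actual patch.
protected def handleBatchKill(request: HttpServletRequest): Seq[KillSubmissionResponse] = {
  parseSubmissionId(request.getPathInfo)   // the raw comma-separated id list from the URL path
    .toSeq
    .flatMap(_.split(","))
    .map(_.trim)
    .filter(_.nonEmpty)
    .map(handleKill)                       // one KillSubmissionResponse per driver
}

Each id in the path then maps to one handleKill call, so the response can report the outcome per driver.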

How was this patch tested?

manual tests

Please review http://spark.apache.org/contributing.html before opening a pull request.

郭小龙 10207633 added 19 commits March 31, 2017 21:57
…, currently only supports deleting a single 'driver'; now let Spark support deleting several 'drivers'
@guoxiaolongzte (Author)

@HyukjinKwon Could you help with the code review? Thank you.

@HyukjinKwon (Member)

I am not too familiar with this code path. Git blame says @tnachen changed this code recently.

@guoxiaolongzte (Author)

@tnachen Could you help with the code review? Thank you.

      response: HttpServletResponse): Unit = {
-   val submissionId = parseSubmissionId(request.getPathInfo)
-   val responseMessage = submissionId.map(handleKill).getOrElse {
+   val submissionIds = parseSubmissionId(request.getPathInfo)
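(The change under review replaces the single-id lookup in doPost with a submissionIds parse, so the path segment can carry several comma-separated ids.)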
@tnachen (Contributor) commented on the diff:

I don't think parsing the submission IDs out of the request path is a good idea.
I would assume most use cases for batch delete arise when you have a large number of drivers to delete (otherwise you would be fine just deleting a few one by one).
But most URLs are length-limited.
You might be better off creating a new request for this that takes a body.

@guoxiaolongzte (Author) commented May 3, 2017

@tnachen
You make a fair point. I have added a batch-kill interface. Please help with the code review, thank you.

Batch delete, e.g.:
Request URL: http://ip:6066/v1/submissions/kill/batch
Method: POST
Request body:
{
  "submissionIds": "driver-20170503112644-0002,driver-20170503112537-0001"
}

Single delete, e.g.:
Request URL: http://ip:6066/v1/submissions/kill/driver-20170503112537-0001
Method: POST
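For illustration only, a rough sketch of how the servlet side of such a batch endpoint might read that body. It assumes the json4s library that the REST submission server already uses for its protocol messages, is meant to sit inside KillRequestServlet next to the existing handleKill helper, and handleBatchKillFromBody is a hypothetical name, not the actual patch:

import scala.io.Source
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Sketch only: read the JSON body of POST /v1/submissions/kill/batch,
// extract the comma-separated "submissionIds" field, and kill each driver
// with the servlet's existing handleKill helper.
protected def handleBatchKillFromBody(request: HttpServletRequest): Seq[KillSubmissionResponse] = {
  implicit val formats: Formats = DefaultFormats
  val body = Source.fromInputStream(request.getInputStream, "UTF-8").mkString
  val ids = (parse(body) \ "submissionIds").extract[String]
  ids.split(",").map(_.trim).filter(_.nonEmpty).toSeq.map(handleKill)
}

Reading the ids from a body rather than the URL path also avoids the URL length limit raised in the review comment above.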

@guoxiaolongzte (Author)

Hello, this PR has been open for some time now. @tnachen, please help review the code, thanks.

@tnachen (Contributor) commented Jun 2, 2017

Unfortunately I'm not a committer, so we need to loop in someone who is to help merge it. @srowen, do you know who's responsible for the general deploy package?

@guoxiaolongzte (Author)

@srowen Please help merge this to master, thanks.

@guoxiaolongzte (Author)

@SparkQA please test it.

@jiangxb1987 (Contributor)

ok to test

@jiangxb1987 (Contributor)

One general question: what do we expect to gain from this improvement? It seems it's just looping over the drivers and sending kill requests?

@guoxiaolongzte (Author) commented Jun 24, 2017

@jiangxb1987
It lets me delete several drivers in one request when those drivers are already dead and no longer meaningful.

@jiangxb1987 (Contributor) commented Jun 24, 2017

Let me be explicit - I don't think this improvement is really needed in Spark, as long as it's just looping over the drivers and sending blocking KillDriver messages, since we gain nothing on performance this way. If you have a huge cluster with many drivers dying simultaneously (which, in my mind, should be a really extreme case), then it's fine to write a script that calls the API from outside of Spark.

@gatorsmile (Member)

Really appreciate your contribution! Sorry, based on the comment, we might need to close this PR, but please submit more PRs in the future. Thanks again!

@asfgit closed this in b32bd00 on Jun 27, 2017
zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025
## What changes were proposed in this pull request?

This PR proposes to close stale PRs, mostly the same instances as apache#18017

I believe the author in apache#14807 removed his account.

Closes apache#7075
Closes apache#8927
Closes apache#9202
Closes apache#9366
Closes apache#10861
Closes apache#11420
Closes apache#12356
Closes apache#13028
Closes apache#13506
Closes apache#14191
Closes apache#14198
Closes apache#14330
Closes apache#14807
Closes apache#15839
Closes apache#16225
Closes apache#16685
Closes apache#16692
Closes apache#16995
Closes apache#17181
Closes apache#17211
Closes apache#17235
Closes apache#17237
Closes apache#17248
Closes apache#17341
Closes apache#17708
Closes apache#17716
Closes apache#17721
Closes apache#17937

Added:
Closes apache#14739
Closes apache#17139
Closes apache#17445
Closes apache#18042
Closes apache#18359

Added:
Closes apache#16450
Closes apache#16525
Closes apache#17738

Added:
Closes apache#16458
Closes apache#16508
Closes apache#17714

Added:
Closes apache#17830
Closes apache#14742

## How was this patch tested?

N/A

Author: hyukjinkwon <[email protected]>

Closes apache#18417 from HyukjinKwon/close-stale-pr.
