H/2 stress many failure in content posting methods #55261
Tagging subscribers to this area: @dotnet/ncl

Issue details: Reproduces both in and out of Docker. Started happening between 6.0 Preview 4 and 6.0 Preview 6 (on Preview 4 it ran; on Preview 6 it started failing) - tested locally.
Potentially kestrel error, cc: @JamesNK @Tratcher @halter73 Discovered in #55098
Some more digging:
It happens much more often with a slow content-sending operation:
Cancellation:
@karelz What makes this seem like a Kestrel regression? The above exception is only thrown if Kestrel received a
I'm assuming this means the client received a 500 response. This could be caused by an uncaught exception on the server, but not an exception caused by

Can you collect more verbose server logs and/or collect a network trace?
Client logs show that these HTTP/2 POST tests are failing only on Windows, and only with a timeout in one of the following two places:

The timeout exception message is
My first guess would be that the server hasn't responded in the given time.
My comment above might not be related to this issue, because here the server responds with 500 to the client, whereas I observed timeouts. I'll continue the investigation.
It seems the issue is caused by the client sending EndOfStream without sending a RST_STREAM first when it has set the Content-Length and the server is still reading the request body when EndOfStream arrives. It's a protocol violation so the server throws the following exception and responds with 500.
It might indicate there is a bug in how Http2Stream sends EndOfStream in SendRequestBodyAsync.
https://datatracker.ietf.org/doc/html/rfc7540#section-8.1.2.6
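The rule the server is enforcing comes from RFC 7540 §8.1.2.6: a request that carries a content-length header is malformed if the sum of the DATA frame payload lengths does not equal that header's value when the frame with the END_STREAM flag arrives. Below is a minimal sketch of that check for illustration only - the function name and data shapes are invented, and this is not Kestrel's actual implementation:

```python
# Hypothetical sketch of the RFC 7540 section 8.1.2.6 check, not Kestrel's code.
# Each DATA frame is modeled as a (payload_size, end_stream_flag) pair.

def check_stream(content_length, data_frames):
    """Return 'ok', 'malformed', or 'incomplete' for a request stream
    that declared the given content-length header value."""
    received = 0
    for size, end_stream in data_frames:
        received += size
        if end_stream:
            # END_STREAM arrived: total DATA bytes must equal content-length,
            # otherwise the request is malformed (the server in this issue
            # reacts by throwing and responding with 500).
            return "ok" if received == content_length else "malformed"
    return "incomplete"  # END_STREAM never arrived

# The failing pattern from this issue: the client declares Content-Length
# but signals END_STREAM before sending all the declared bytes.
print(check_stream(100, [(40, False), (40, True)]))  # malformed
print(check_stream(100, [(60, False), (40, True)]))  # ok
```

Per the discussion above, the client ending the stream early without first sending RST_STREAM is exactly what drives the "malformed" branch and the 500 response.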
Can someone please try to repro this failure on .NET 5? That would help us understand if this is a regression.
This seemed to cause massive failures in CI.
AFAIK it started after updating to the latest Kestrel version in #55098, which we needed for HTTP/3, but in the same PR we marked the failures with "active issue" so they wouldn't pollute runs. Most likely the massive failures you've seen were only for that PR, not for main?
The tests were disabled on 7/8, so that checks out ;) @alnikola, will you be able to run your repro easily against an earlier version of Kestrel (either 5.0 or an earlier 6.0 Preview)?
I will try to run stress tests on 6.0 Preview 4 (it's reported to be healthy) or .NET 5. |
@adityamandaleeka I ran these POST tests on .NET Core 5.0.8 / ASP.NET Core 5.0.8 and they completed successfully. I did not get any errors.
@alnikola would it be possible to use the latest .NET Core libraries on the client side while using the old Kestrel? That would tell us whether our current client has a bug or if it is Kestrel.
Please disregard the above - there was a mistake. It is reproducing.
I tested with |
Thank you for confirming, @alnikola. We'll prioritize investigating this regression. |
@alnikola Can you please share instructions for how to run the stress test that reproduces this? |
First, globally install the required SDK and runtime versions because stress tests use the globally installed .NET. Then, in
Depends on dotnet/aspnetcore#34768 |
Triage: punting from 6.0 since the bug is handled on the ASP.NET side, so the root cause should get fixed in time. We can re-enable the stress-test part once the fix is published in the nightly Docker image, which might land after ZBB.
Enable HTTP/2 POST stress tests, because Kestrel's issue with incorrect HTTP/2 stream resetting appears to be fixed by dotnet/aspnetcore#34768. Fixes #55261
Reopening to track backport to 6.0. |