GracefulConnectionClosureHandlingTest failure #2117

Open
Scottmitch opened this issue Feb 28, 2022 · 21 comments · Fixed by #2834
Labels: flaky tests (Unit tests are flaky)

@Scottmitch (Member)

https://github.com/apple/servicetalk/runs/5347779658

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xac0bf9d7, L:/127.0.0.1:39712 ! R:127.0.0.1/127.0.0.1:42991]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:470)


Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xac0bf9d7, L:/127.0.0.1:39712 ! R:127.0.0.1/127.0.0.1:42991]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)
@Scottmitch mentioned this issue Feb 28, 2022
@idelpivnitskiy added the "flaky tests" label Apr 4, 2022
@idelpivnitskiy (Member)

https://github.com/apple/servicetalk/pull/2174/checks?check_run_id=5796849625

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xb2f977e3, L:/127.0.0.1:60616 ! R:127.0.0.1/127.0.0.1:33911]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:470)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xb2f977e3, L:/127.0.0.1:60616 ! R:127.0.0.1/127.0.0.1:33911]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/runs/6044078838

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x4ea4e931, L:/127.0.0.1:46650 ! R:127.0.0.1/127.0.0.1:45391]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:471)


Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x4ea4e931, L:/127.0.0.1:46650 ! R:127.0.0.1/127.0.0.1:45391]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/runs/6536106368

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x028bd8bc, L:/127.0.0.1:55348 ! R:127.0.0.1/127.0.0.1:36561]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:471)

Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x028bd8bc, L:/127.0.0.1:55348 ! R:127.0.0.1/127.0.0.1:36561]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/runs/6631729272

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x28726a78, L:/127.0.0.1:47236 ! R:127.0.0.1/127.0.0.1:34131]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:471)

Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x28726a78, L:/127.0.0.1:47236 ! R:127.0.0.1/127.0.0.1:34131]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/pull/2257/checks?check_run_id=7156982960

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x78d236d3, L:/127.0.0.1:45684 ! R:127.0.0.1/127.0.0.1:37845]
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:471)


Caused by: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0x78d236d3, L:/127.0.0.1:45684 ! R:127.0.0.1/127.0.0.1:37845]
Caused by: io.servicetalk.transport.netty.internal.StacklessClosedChannelException
	at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@idelpivnitskiy (Member)

https://github.com/apple/servicetalk/pull/2356/checks?check_run_id=8308215695

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:470)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
Caused by: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/actions/runs/4768582573/jobs/8478257970

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:471)

Caused by: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@idelpivnitskiy (Member)

https://github.com/apple/servicetalk/pull/2614/checks?check_run_id=13905833408

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 secure=true initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:544)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
Caused by: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/pull/2621/checks?check_run_id=14422322949

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 secure=true initiateClosureFromClient=false useUds=false viaProxy=true

java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:121)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:92)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:544)
...
Caused by: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/pull/2640/checks?check_run_id=14801441626

GracefulConnectionClosureHandlingTest.5: protocol=HTTP_1 secure=true initiateClosureFromClient=false useUds=false viaProxy=true
java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:126)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:95)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:544)

Caused by: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@bryce-anderson (Contributor) commented Nov 1, 2023

https://github.com/apple/servicetalk/actions/runs/6724690520/job/18277489377?pr=2741

This one is unique in that it failed with the HTTP_2 protocol, and with a timeout.

GracefulConnectionClosureHandlingTest > closeAfterRequestMetaDataSentNoResponseReceived(HttpProtocol, boolean, boolean, boolean, boolean) > 15: protocol=HTTP_2 secure=true initiateClosureFromClient=false useUds=false viaProxy=true FAILED
    java.util.concurrent.TimeoutException: closeAfterRequestMetaDataSentNoResponseReceived(io.servicetalk.http.netty.HttpProtocol, boolean, boolean, boolean, boolean) timed out after 30 seconds
        at org.junit.jupiter.engine.extension.TimeoutExceptionFactory.create(TimeoutExceptionFactory.java:29)
        at org.junit.jupiter.engine.extension.SameThreadTimeoutInvocation.proceed(SameThreadTimeoutInvocation.java:58)

@bryce-anderson (Contributor)

https://github.com/apple/servicetalk/actions/runs/6897145342/job/18765420150#step:7:1342
Another unique test failure:

GracefulConnectionClosureHandlingTest > closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(HttpProtocol, boolean, boolean, boolean, boolean) > 5: protocol=HTTP_1 secure=true initiateClosureFromClient=false useUds=false viaProxy=true FAILED
    java.util.concurrent.ExecutionException: io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer
        at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:126)
        at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:95)
        at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:543)

        Caused by:
        io.netty.channel.unix.Errors$NativeIoException: recvAddress(..) failed: Connection reset by peer

@bryce-anderson (Contributor)

These all smell similar to the problem in https://github.com/apple/servicetalk/issues/2403#issuecomment-1809344023 where the channel close is surfacing differently than we expected.

@Scottmitch (Member Author)

https://github.com/apple/servicetalk/pull/2799/checks?check_run_id=20544782936

GracefulConnectionClosureHandlingTest.14: protocol=HTTP_2 secure=true initiateClosureFromClient=false useUds=true viaProxy=false
java.util.concurrent.ExecutionException: io.netty.channel.StacklessClosedChannelException
	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:126)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:95)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.assertResponsePayloadBody(GracefulConnectionClosureHandlingTest.java:700)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:569)
	at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)

...


Caused by: io.netty.channel.StacklessClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(Object, ChannelPromise)(Unknown Source)

@bryce-anderson (Contributor)

This test is pretty easy to reproduce locally: keep all your cores busy with something else and run it. For me it fails about 50% of the time.

I can't reproduce it with cleartext, but it fails very often with TLS enabled.

@idelpivnitskiy (Member)

Spent some time debugging it today and found that this happens only for closeAfterRequestMetaDataSentResponseMetaDataReceived with protocol=HTTP_2 secure=true initiateClosureFromClient=false; the other parameters don't matter. The easiest config to reproduce is index=16. Run this test repeatedly until failure in IntelliJ IDEA, then open another project and build it with Gradle: the extra load triggers the flaky behavior.

The test is structured in the following way:

  1. The client sends request headers and awaits response headers.
  2. The server initiates graceful closure.
  3. After that, we release the latches for the request and response payload body writes.
  4. When the server sees that the full request was read and the full response was written, it closes the stream, and KeepAliveManager closes the channel (see close0(null)). Because we operate on a DuplexChannel, it does the following (a sketch of this sequence follows the list):
    i. channel.writeAndFlush(EMPTY_BUFFER)
    ii. sslHandler.closeOutbound()
    iii. duplexChannel.shutdownOutput()
  5. Concurrently, the client tries to assert the response payload body. It does request(Long.MAX_VALUE) on the payload subscription, which translates into NettyChannelPublisher.requestN -> StreamChannel.read() -> Http2ChannelUnsafe.beginRead() -> stream channel is active -> updateLocalWindowIfNeeded() -> write(Http2WindowUpdateFrame).

So far, all steps are as expected. But the write fails because the parent channel is already closed.
Wire-logging shows that almost immediately after the server closed its output side (sent close_notify followed by FIN), the client processed SslCloseCompletionEvent on the pipeline. This is not expected!
The expected behavior is that the client receives all response data (the last DATA frame with endStream=true), and because the parent channel always has autoRead=true, it must parse that data and propagate it to the StreamChannel. That adds it to NettyChannelPublisher's pending queue, where it waits for requestN. Only after reading and decrypting all data from the network should SslHandler process close_notify, fire SslCloseCompletionEvent, then process the FIN and fire ChannelInputShutdownReadComplete.

Somehow, SslHandler appears to have lost data, or failed to propagate it before firing SslCloseCompletionEvent.

2024-01-24 13:27:12,392                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a] REGISTERED
2024-01-24 13:27:12,392                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a] CONNECT: /127.0.0.1:57502
2024-01-24 13:27:12,399                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] ACTIVE
2024-01-24 13:27:12,399                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 24B
2024-01-24 13:27:12,401                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 33B
2024-01-24 13:27:12,401                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 13B
2024-01-24 13:27:12,401                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,401                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,401                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,431                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] USER_EVENT: class io.netty.handler.ssl.SslHandshakeCompletionEvent
2024-01-24 13:27:12,432                  client-io-2-1 a= b= c= [DEBUG] HttpDebugUtils                 - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] HTTP/2.0 pipeline initialized: SslHandler#0, ServiceTalkWireLogger#0, Http2FrameCodec#0, Http2MultiplexHandler#0, H2ClientParentConnectionContext$DefaultH2ClientParentConnection#0, DefaultChannelPipeline$TailContext#0
2024-01-24 13:27:12,440                  client-io-2-1 a= b= c= [DEBUG] ReservableRequestConcurrencyControllers - IdleTimeoutConnectionFilterImpl[300000 ms](io.servicetalk.http.netty.HttpMessageDiscardWatchdogClientFilter$1([id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502])) Received transportEventStream(HttpEventKey{name='max-concurrency-no-offloading', type=ConsumableEvent}@2d6eac87) event: IgnoreConsumedEvent{event=100}
2024-01-24 13:27:12,444                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,444                  server-io-1-2 a= b= c= [DEBUG] AlpnChannelSingle$AlpnChannelHandler - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] ALPN negotiated h2 protocol
2024-01-24 13:27:12,444                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,444                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,446                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] Configured for duplex channel with policy=DefaultKeepAlivePolicy{idleDuration=PT0S, ackTimeout=PT5S, withoutActiveStreams=false}
2024-01-24 13:27:12,448                  server-io-1-2 a= b= c= [DEBUG] HttpDebugUtils                 - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] HTTP/2.0 pipeline initialized: SslHandler#0, ServiceTalkWireLogger#0, AlpnChannelSingle$AlpnChannelHandler#0, Http2FrameCodec#0, Http2MultiplexHandler#0, H2ServerParentConnectionContext$DefaultH2ServerParentConnection#0, DefaultChannelPipeline$TailContext#0
2024-01-24 13:27:12,449                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 34B
2024-01-24 13:27:12,454                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 9B
2024-01-24 13:27:12,454                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,455                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 9B
2024-01-24 13:27:12,455                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,455                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,455                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,464                  client-io-2-1 a= b= c= [DEBUG] CloseHandler                   - io.servicetalk.transport.netty.internal.CloseHandler.LogLevel=null
2024-01-24 13:27:12,510                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 9B
2024-01-24 13:27:12,510                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 31B
2024-01-24 13:27:12,512                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,546                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 14B
2024-01-24 13:27:12,547                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,547                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,547                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,553                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] Close gracefully with activeStreams=1, keepAliveState=null
2024-01-24 13:27:12,555                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 41B
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] Close gracefully with activeStreams=1, keepAliveState=null
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 17B
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 8B
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 17B
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 17B
2024-01-24 13:27:12,556                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,557                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,557                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,557                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,558                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 9B
2024-01-24 13:27:12,559                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 15B
2024-01-24 13:27:12,559                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,559                  client-io-2-1 a= b= c= [DEBUG] DefaultNettyConnection         - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502](H2 - 3) Received a user event: io.servicetalk.transport.netty.internal.CloseHandler$OutboundDataEndEvent
2024-01-24 13:27:12,559                  server-io-1-2 a= b= c= [DEBUG] DefaultNettyConnection         - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503](H2 - 3) Received a user event: io.servicetalk.transport.netty.internal.CloseHandler$OutboundDataEndEvent
2024-01-24 13:27:12,561                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 9B
2024-01-24 13:27:12,561                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 0B
2024-01-24 13:27:12,561                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,561                  client-io-2-1 a= b= c= [DEBUG] WriteStreamSubscriber          - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502](H2 - 3) Terminate subscriber, state: 11001
2024-01-24 13:27:12,561                  server-io-1-2 a= b= c= [DEBUG] WriteStreamSubscriber          - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503](H2 - 3) Terminate subscriber, state: 11001
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 34B
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,562                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] Graceful close PING(ACK) received, writing the second GO_AWAY frame, activeStreams=1
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 17B
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] Graceful close PING(ACK) received, writing the second GO_AWAY frame, activeStreams=1
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 17B
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 8B
2024-01-24 13:27:12,562                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,563                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ: 25B
2024-01-24 13:27:12,563                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,563                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,563                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,563                  server-io-1-2 a= b= c= [DEBUG] DefaultNettyConnection         - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503](H2 - 3) Received a user event: io.servicetalk.transport.netty.internal.CloseHandler$InboundDataEndEvent
2024-01-24 13:27:12,564                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] Marking all states as CLOSED with activeStreams=0, gracefulCloseState=GRACEFUL_CLOSE_SECOND_GO_AWAY_SENT, keepAliveState=null
2024-01-24 13:27:12,566                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] USER_EVENT: class io.netty.handler.ssl.SslCloseCompletionEvent
2024-01-24 13:27:12,566                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 - R:/127.0.0.1:57503] Observed input shutdown, other side is shutdown too, closing the channel with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,566                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,566                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,566                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,600                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] USER_EVENT: class io.netty.channel.socket.ChannelInputShutdownEvent
2024-01-24 13:27:12,600                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_COMPLETE
2024-01-24 13:27:12,601                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,601                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] READ_REQUEST
2024-01-24 13:27:12,601                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] USER_EVENT: class io.netty.channel.socket.ChannelInputShutdownReadComplete
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] DefaultNettyConnection         - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502](H2 - 3) Received a user event: io.servicetalk.transport.netty.internal.CloseHandler$InboundDataEndEvent
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] Marking all states as CLOSED with activeStreams=0, gracefulCloseState=GRACEFUL_CLOSE_SECOND_GO_AWAY_SENT, keepAliveState=null
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] WRITE: 0B
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] FLUSH
2024-01-24 13:27:12,603                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 ! R:/127.0.0.1:57503] Channel closed with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] USER_EVENT: class io.netty.channel.socket.ChannelOutputShutdownEvent
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] Observed output shutdown, other side is shutdown too, closing the channel with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,603                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 - R:127.0.0.1/127.0.0.1:57502] CLOSE
2024-01-24 13:27:12,603                  server-io-1-2 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x7740d711, L:/127.0.0.1:57502 ! R:/127.0.0.1:57503] Channel closed with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] Input and output shutdown, closing the channel
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] CLOSE
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] INACTIVE
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] Channel closed with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] servicetalk-tests-wire-logger  - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] UNREGISTERED
2024-01-24 13:27:12,605                  client-io-2-1 a= b= c= [DEBUG] KeepAliveManager               - [id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502] Channel closed with activeStreams=0, gracefulCloseState=CLOSED, keepAliveState=CLOSED
2024-01-24 13:27:12,611       ForkJoinPool-1-worker-19 a= b= c= [DEBUG] ReservableRequestConcurrencyControllers - IdleTimeoutConnectionFilterImpl[300000 ms](io.servicetalk.http.netty.HttpMessageDiscardWatchdogClientFilter$1([id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502])) Received transportEventStream(HttpEventKey{name='max-concurrency-no-offloading', type=ConsumableEvent}@2d6eac87) event: IgnoreConsumedEvent{event=0}
2024-01-24 13:27:12,612       ForkJoinPool-1-worker-19 a= b= c= [DEBUG] ReservableRequestConcurrencyControllers - IdleTimeoutConnectionFilterImpl[300000 ms](io.servicetalk.http.netty.HttpMessageDiscardWatchdogClientFilter$1([id: 0x66d8955a, L:/127.0.0.1:57503 ! R:127.0.0.1/127.0.0.1:57502])) transportEventStream(HttpEventKey{name='max-concurrency-no-offloading', type=ConsumableEvent}@2d6eac87) stream completes
2024-01-24 13:27:12,612            client-executor-3-1 a= b= c= [DEBUG] RoundRobinLoadBalancer         - RoundRobinLoadBalancer{id=DefaultHttpLoadBalancerFactory@23059c9, targetResource=/127.0.0.1:57502} is closing nongracefully. Last seen addresses (size=1): [Host{lbDescription=RoundRobinLoadBalancer{id=DefaultHttpLoadBalancerFactory@23059c9, targetResource=/127.0.0.1:57502}, address=/127.0.0.1:57502, state=ACTIVE(failedConnections=0), #connections=0}].
2024-01-24 13:27:12,615            client-executor-3-1 a= b= c= [DEBUG] NettyHttpServer                - Stopped HTTP server for address /127.0.0.1:57502.

java.util.concurrent.ExecutionException: io.netty.channel.StacklessClosedChannelException

	at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:126)
	at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:95)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.assertResponsePayloadBody(GracefulConnectionClosureHandlingTest.java:701)
	at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closeAfterRequestMetaDataSentResponseMetaDataReceived(GracefulConnectionClosureHandlingTest.java:450)
	...
Caused by: io.netty.channel.StacklessClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(Object, ChannelPromise)(Unknown Source)

@idelpivnitskiy (Member)

Debugged it a little further today. Actually, SslHandler does its job correctly: it fires channelRead with the data before propagating SslCloseCompletionEvent and closing the parent channel.

This test started to fail more frequently after we merged "disable auto read for stream channels" (#2799). The problem is that the last DATA frame ends up sitting in the AbstractHttp2StreamChannel.inboundBuffer queue, waiting for more demand. When demand arrives, we translate requestN into StreamChannel.read() -> Http2ChannelUnsafe.beginRead(). Then, because of #9400, AbstractHttp2StreamChannel tries to write an Http2WindowUpdateFrame instead of draining the queue first.
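To illustrate the demand path above, a simplified, assumed shape of how downstream demand becomes a channel read once auto-read is disabled (the real NettyChannelPublisher is considerably more involved; this class is hypothetical):

```java
import io.netty.channel.Channel;
import org.reactivestreams.Subscription;

final class ChannelReadSubscription implements Subscription {
    private final Channel streamChannel;

    ChannelReadSubscription(Channel streamChannel) {
        this.streamChannel = streamChannel;
    }

    @Override
    public void request(long n) {
        // Demand arrived: ask the transport for more data. With auto-read
        // disabled, nothing reaches the subscriber until read() is called;
        // for an HTTP/2 stream channel this lands in Http2ChannelUnsafe.beginRead().
        streamChannel.read();
    }

    @Override
    public void cancel() {
        streamChannel.close();
    }
}
```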

I read the discussions around #9400, and they seem reasonable. An alternative fix could be to enhance updateLocalWindowIfNeeded() to check whether the parent is still active, either via parentContext().isRemoved() or parentContext().channel().isActive(), before doing the write0. I will draft a PR for Netty; implementing a test may take some time.
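As a rough sketch of that guard (accessor names taken from the comment above; this is not the actual Netty patch):

```java
import io.netty.channel.ChannelHandlerContext;

final class WindowUpdateGuardSketch {
    // Hypothetical pre-check for updateLocalWindowIfNeeded(): only attempt the
    // WINDOW_UPDATE write while the parent channel can still accept writes,
    // instead of failing the stream write with a ClosedChannelException.
    static boolean canWriteWindowUpdate(ChannelHandlerContext parentCtx) {
        return !parentCtx.isRemoved() && parentCtx.channel().isActive();
    }
}
```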

@idelpivnitskiy (Member)

Proposed fix for netty: netty/netty#13812

@idelpivnitskiy (Member)

Will be fixed with Netty 4.1.107.Final

@bryce-anderson (Contributor)

Until we can upgrade Netty, I've put up a PR to disable the tests: #2827

idelpivnitskiy added a commit to idelpivnitskiy/servicetalk that referenced this issue Feb 13, 2024
idelpivnitskiy added a commit that referenced this issue Feb 13, 2024
This reverts #2827 (commit db9a026).
Resolves #2117.
@idelpivnitskiy (Member)

https://github.com/apple/servicetalk/actions/runs/11127685633/job/30920582786?pr=3071

GracefulConnectionClosureHandlingTest > closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(HttpProtocol, boolean, boolean, boolean, boolean) > 5: protocol=HTTP_1 secure=true initiateClosureFromClient=false useUds=false viaProxy=true FAILED
    java.util.concurrent.ExecutionException: io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xc70c6092, L:/127.0.0.1:57154 ! R:localhost/127.0.0.1:32865]
        at io.servicetalk.concurrent.api.SourceToFuture.reportGet(SourceToFuture.java:134)
        at io.servicetalk.concurrent.api.SourceToFuture.get(SourceToFuture.java:103)
        at io.servicetalk.http.netty.GracefulConnectionClosureHandlingTest.closePipelinedAfterTwoRequestsSentBeforeAnyResponseReceived(GracefulConnectionClosureHandlingTest.java:543)

        Caused by:
        io.servicetalk.transport.netty.internal.CloseHandler$CloseEventObservedException: CHANNEL_CLOSED_OUTBOUND(The transport backing this connection has been shutdown (write)) [id: 0xc70c6092, L:/127.0.0.1:57154 ! R:localhost/127.0.0.1:32865]

            Caused by:
            io.servicetalk.transport.netty.internal.StacklessClosedChannelException
                at io.servicetalk.transport.netty.internal.DefaultNettyConnection.channelInactive(...)(Unknown Source)

@idelpivnitskiy reopened this Oct 1, 2024