
There appears to be a bug with the http2 header decoding #28632

Open
murgatroid99 opened this issue Jul 11, 2019 · 29 comments
Labels
help wanted Issues that need assistance from volunteers or PRs that need help to proceed. http2 Issues or PRs related to the http2 subsystem.

Comments

@murgatroid99
Contributor

  • Version: 10.15.3
  • Platform: node: 10-alpine official docker image
  • Subsystem: http2

This comes from googleapis/nodejs-datastore#415. It looks like the http2 module occasionally provides headers objects with invalid and clearly incorrect header keys. For example, some bad headers objects look like these:

[Object: null prototype] {
 ':status': 200,
 'content-disposition': 'attachment',
 'content-type': 'application/grpc',
 ' pø0': 'Wed, 10 Jul 2019 23:39:46 GMT',
 'alt-svc': 'quic=":443"; ma=2592000; v="46,43,39"'
}

[Object: null prototype] {
 ':status': 200,
 'content-disposition': 'attachment',
 'content-type': 'application/grpc',
 'A\u0000\u0000\u0000': 'Wed, 10 Jul 2019 22:37:34 GMT',
 'alt-svc': 'quic=":443"; ma=2592000; v="46,43,39"'
}

[Object: null prototype] {
 ':status': 200,
 'content-disposition': 'attachment',
 'content-type': 'application/grpc',
 ' ÐÕJ': 'Wed, 10 Jul 2019 23:58:43 GMT',
 'alt-svc': 'quic=":443"; ma=2592000; v="46,43,39"'
}

while valid headers look like this:

[Object: null prototype] {
 ':status': 200,
 'content-disposition': 'attachment',
 'content-type': 'application/grpc',
 date: 'Wed, 10 Jul 2019 23:05:02 GMT',
 'alt-svc': 'quic=":443"; ma=2592000; v="46,43,39"'
}

In all of the cases we have seen, it looks like the date key is replaced with this garbled nonsense. My guess is that it is relevant that this header is the one most likely to be encoded in the format Literal Header Field Never Indexed — Indexed Name, because its value changes with every request.
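
For illustration, here is a hand-assembled sketch (mine, not captured from the wire) of what that representation would look like for one of the date values above, assuming no Huffman coding. Note that the header name is never sent as literal bytes; it is carried only as a reference to entry 33 ("date") in the HPACK static table:

// Hand-assembled illustration of RFC 7541 §6.2.3 "Literal Header Field
// Never Indexed" with an indexed name, for one of the date values above.
// The name appears only as static table index 33 ("date").
static const unsigned char never_indexed_date[] = {
  0x1f, 0x12,  // 0001xxxx pattern; 4-bit prefix integer 15 + 18 = index 33
  0x1d,        // value length 29, Huffman bit clear
  'W','e','d',',',' ','1','0',' ','J','u','l',' ','2','0','1','9',' ',
  '2','3',':','3','9',':','4','6',' ','G','M','T',
};
// The decoder has to resolve index 33 back to the string "date"; in the bad
// headers above the value still decodes fine, but the name comes out garbled.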

@addaleax addaleax added the http2 Issues or PRs related to the http2 subsystem. label Jul 11, 2019
@addaleax
Member

Do you happen to have some kind of reproduction for this?

@murgatroid99
Contributor Author

Unfortunately, not yet. I was hoping that someone more familiar with the implementation might be able to see what the error is from the information I have.

@addaleax
Member

@murgatroid99 Okay – this is happening on the client side and it is only happening with the date header so far, correct? Do the invalid header names vary from process to process, or from request to request?

@pworkpop

@addaleax
We have had this error resurface since Oct 10th – the same key across several containers (it does not vary by process or by request):

{ Error: Metadata key "@���" contains illegal characters
    at Http2CallStream.call.on (/app/bundle/programs/server/npm/node_modules/@grpc/grpc-js/build/src/client.js:101:45)
    at emitOne (events.js:121:20)
    at Http2CallStream.emit (events.js:211:7)
    at process.nextTick (/app/bundle/programs/server/npm/node_modules/@grpc/grpc-js/build/src/call-stream.js:71:22)
    at _combinedTickCallback (internal/process/next_tick.js:132:7)
    at process._tickDomainCallback (internal/process/next_tick.js:219:9)
  code: 2,
  details: 'Metadata key "@\u0000\u0000\u0000" contains illegal characters',
  metadata: Metadata { options: undefined, internalRepr: Map {} },
  note: 'Exception occurred in retry method that was not classified as transient' }

I can't confirm this being a client side bug either - we are using Google PubSub.

Platform: node: 8-stretch (docker) image (one instance on 10-stretch)

@addaleax
Member

@pworkpop I guess the same question goes for you then – do you have some kind of reproduction?

@pworkpop

@addaleax Unfortunately not. We are battling some connection issues that surface as "deadline exceeded" errors in PubSub, so it may well be the servers sending the invalid key.

@jasnell
Member

jasnell commented Oct 16, 2019

One thing you can potentially try is setting up Wireshark to intercept the HTTPS-secured HTTP/2 traffic flowing into the endpoint. Obviously that depends on your setup, but it would be interesting to see whether the headers coming into our header handling code are correct. Given the kind of corruption occurring in the examples, it almost looks like the pointers could be uninitialized, which would mean our ref counting is off and nghttp2 could be freeing the buffers earlier than expected. That shouldn't be possible with the current implementation unless something is definitely wrong, and we very likely would have seen the bug crop up previously. If we can rule out errors in the inbound data, that would help in lieu of a reproducible test case.

@remduf

remduf commented Dec 12, 2019

Is this problem also present in Node 12?

@murgatroid99
Contributor Author

I'm not sure. We've had multiple recent reports of this error but they haven't specified the Node version.

@jasnell jasnell added the help wanted Issues that need assistance from volunteers or PRs that need help to proceed. label Jun 26, 2020
@clshortfuse
Contributor

clshortfuse commented Jun 27, 2020

Sounds like an encoding issue with a child dependency, not NodeJS. It does look like it's fixed now:

googleapis/nodejs-datastore#415

googleapis/gax-nodejs#521

Edit: Unless we want NodeJS to do the urlencoding...

@murgatroid99
Contributor Author

I don't think that gax-nodejs PR is at all related to this bug. And people have reported the bug to us much more recently than the last update in either of those issues, so I don't see any reason to think that this is resolved.

@clshortfuse
Contributor

@murgatroid99 You might be right. The error is likely coming from:

https://github.com/grpc/grpc-node/blob/95289edcaf36979cccf12797cc27335da8d01f03/packages/grpc-js/src/metadata.ts#L250

https://github.com/grpc/grpc-node/blob/95289edcaf36979cccf12797cc27335da8d01f03/packages/grpc-js/src/call-stream.ts#L471

Do you know the exact version of grpc-js you're using? It will help narrow down what's going on. Based on this code, which is from the current master branch, it would be NodeJS that includes the mangled key name.

@murgatroid99
Contributor Author

To be clear, I am the author of grpc-js, and I filed this issue because the mangled key names were reported to me, and I am sure they are coming from Node.

@clshortfuse
Contributor

@murgatroid99 Yeah, I saw your name in the source code and realized that after I commented. 😅

Yeah, so it's either:

  • the client that's sending the header isn't packing it correctly,
  • nghttp2 isn't unpacking it when it should, or
  • NodeJS isn't expecting an unpacked header when it should.

It's curious that it's always a short 3–4 character header. If I were to take a guess, I'd say nghttp2 is giving us the pointer for the index, which is 4 bytes. The first character sometimes being blank might be a 00, meaning the 4-byte integer is little-endian. But what's curious is that it doesn't always happen. Always is easy. Sometimes is hard.

I don't know if we have a test for HPACK, but maybe we should. Looking at our .cc code, we aren't using nghttp2_on_header_callback, but nghttp2_on_header_callback2. And the only reference to HPACK exists in nghttp2_on_header_callback.

Another thing that may be possible is that the reference to the header name lives inside the dynamic table, and an awkward timing of events may cause the dynamic table to be different when the HPACK'd header arrives. That could explain why it only happens sometimes.

@clshortfuse
Contributor

Actually, I read the spec wrong. The name field is included in the header when it's not indexed (duh). Standard practice is to index, which means it shouldn't be pointing to the dynamic table for this field.

@murgatroid99
Contributor Author

murgatroid99 commented Jun 27, 2020

As I said at the beginning of the issue, my guess is that this header is represented as "Literal Header Field Never Indexed — Indexed Name", because the header name "date" is in the static table without a corresponding value.

Among the headers listed in the initial issue, date could easily be the only one represented that way, because it's the only one with a value that is different for every request.

@clshortfuse
Contributor

We can't exclude the possibility that the client is sending it corrupted. NodeJS's HTTP2 client doesn't even have the ability to send a header with Never Indexed (I checked). We're expecting nghttp2 to do the unpacking for us. Still, I did find something odd:

nghttp2_on_header_callback2
This callback behaves like nghttp2_on_header_callback, except that name and value are stored in reference counted buffer. If application wishes to keep these references without copying them, use nghttp2_rcbuf_incref() to increment their reference count. It is the application's responsibility to call nghttp2_rcbuf_decref() if they called nghttp2_rcbuf_incref() so as not to leak memory.

@jasnell Aren't we missing a Http2RcBufferPointer::inc(name); Http2RcBufferPointer::inc(value); here?

bool Http2Stream::AddHeader(nghttp2_rcbuf* name,
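
For reference, this is the contract I read that documentation as describing — a minimal sketch, not Node's actual code (PendingHeader and the pending list are made up):

#include <nghttp2/nghttp2.h>
#include <vector>

// Sketch of the documented contract: if the application wants to keep
// name/value past the callback, it must incref them *inside*
// on_header_callback2 and decref them later once it is done with them.
struct PendingHeader {
  nghttp2_rcbuf* name;
  nghttp2_rcbuf* value;
};

static std::vector<PendingHeader> pending_headers;

static int on_header2(nghttp2_session* session, const nghttp2_frame* frame,
                      nghttp2_rcbuf* name, nghttp2_rcbuf* value,
                      uint8_t flags, void* user_data) {
  nghttp2_rcbuf_incref(name);    // keep the rcbufs alive past this callback
  nghttp2_rcbuf_incref(value);
  pending_headers.push_back({name, value});
  return 0;
}

// Later, after the strings have been copied out (e.g. into V8 strings):
static void release_pending_headers() {
  for (PendingHeader& h : pending_headers) {
    nghttp2_rcbuf_decref(h.name);
    nghttp2_rcbuf_decref(h.value);
  }
  pending_headers.clear();
}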

@murgatroid99
Contributor Author

This error is happening on the client, and it has been reported with non-Node.js servers. So we can't assume anything about what kinds of header encodings the sender can use.

@clshortfuse
Contributor

@murgatroid99 Sorry, I see. I was misinterpreting this as a server environment. In either case, NodeJS isn't able to send request headers with the never-index flag.

It seems as though nghttp2 will prevent malformed headers from landing in NodeJS. The connection should actually fail before the 'response' event fires if the server were responding with bad data. So it is probably a leak somewhere in the NodeJS C++ code. We are getting the header name, presumably fine, with a length of 4 when nghttp2 hands it to us. But by the time it hits JavaScript, it's gone and the reference is lost.

@murgatroid99
Contributor Author

In either case, NodeJS isn't able to send request headers with the never-index flag.

The error in question is on response headers received by a Node client, sent by a different server. Those could be using any header encodings.

@clshortfuse
Contributor

@murgatroid99 Yes. It's just that since NodeJS can't send never-index, we can't build a self-contained test around it, so it likely can't be reproduced with just NodeJS code (e.g. a NodeJS client receiving data from a NodeJS server). We need to incorporate the never-index flag into outbound requests so we can test what happens when a NodeJS stream (either client or server) receives headers with the never-index flag. I'm going to start by filing that issue.
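
In the meantime, a non-Node test peer built directly on nghttp2 could force that representation with the NGHTTP2_NV_FLAG_NO_INDEX flag. A rough sketch — the submit/response plumbing is omitted, and the helper name is made up:

#include <nghttp2/nghttp2.h>
#include <string.h>

// Build an nv entry that asks the HPACK encoder to use the
// "Literal Header Field Never Indexed" representation for the date header.
static nghttp2_nv make_never_indexed_date(const char* value) {
  nghttp2_nv nv;
  nv.name = (uint8_t*)"date";
  nv.namelen = 4;
  nv.value = (uint8_t*)value;
  nv.valuelen = strlen(value);
  nv.flags = NGHTTP2_NV_FLAG_NO_INDEX;  // never-indexed in HPACK
  return nv;
}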

@addaleax
Member

If never-index being set causes this, then this particular issue here is most likely one in nghttp2, not Node.js itself. But yeah, Node.js should really provide a way to set never-index.

@clshortfuse
Contributor

clshortfuse commented Jun 28, 2020

So, I've manually traced the events as:

To my understanding, on inflated headers nghttp2_rcbuf_incref is called far too late. It needs to be called within OnHeaderCallback, or else the buffer risks being garbage by the time it's handed to JavaScript.

@addaleax
Member

@clshortfuse Thanks for looking into this so thoroughly! Maybe I’m missing something, but the Http2Header in Http2Stream::AddHeader() does take ownership and calls incref() on the rcbufs it receives, right? And the reset() in GetName()/GetValue() decreases the refcount, but only if we really don’t use the rcbuf anymore (because GetInternalizedString() creates its own copy of the string data)?
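
In simplified terms, the ownership pattern I mean looks roughly like this — a sketch only, not the real NgRcBufPointer/Http2Header code:

#include <nghttp2/nghttp2.h>

// Sketch: an owning wrapper that increments the refcount when it takes a
// buffer and decrements it when it lets go, so the rcbuf stays valid for as
// long as someone holds a reference to it.
struct RcBufRef {
  nghttp2_rcbuf* buf = nullptr;

  void reset(nghttp2_rcbuf* b) {
    release();
    buf = b;
    if (buf != nullptr) nghttp2_rcbuf_incref(buf);  // take ownership
  }

  void release() {
    if (buf != nullptr) nghttp2_rcbuf_decref(buf);  // give ownership up
    buf = nullptr;
  }

  ~RcBufRef() { release(); }
};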

@clshortfuse
Contributor

clshortfuse commented Jun 28, 2020

@addaleax I don't think AddHeader() ever increments the buffer count. It adds the header to a stack. OnFrameReceive iterates through that stack, and only when it hands it off to V8 does the buffer count increment. But by then it's too late. I was clued into this by this comment:

node/src/node_http_common.h

Lines 495 to 498 in 241ed44

// Calling GetName and GetValue will have the effect of releasing
// control over the reference counted buffer from this NgHeader
// object to the v8 string. Once the v8 string is garbage collected,
// the reference counter will be decremented.

It's very possible that with nghttp (HTTP/1.x) you don't have to worry about it calling nghttp2_rcbuf_decref until you get all the headers. But with nghttp2, you can't wait for all the headers before you call NgRcBufPointer::inc(). You need to call it per header (within on_header_callback2). To clarify, on_header here means "on individual header", not "on header collection".

Yes, it's more efficient to hand over all the headers to JS once we're done getting them all, but it seems like that won't work with inflated HTTP/2 header items.

Edit: Increment buffer count, not increase buffer

@clshortfuse
Contributor

clshortfuse commented Jun 28, 2020

@addaleax Confusingly, despite its name, .reset() will increase the buffer reference count on first call. It'll decrement when it's called again (by V8 garbage collection). (Edit: Effectively, on first call. Technically, if the pointer is still valid.)

@addaleax
Member

I don't think AddHeader() ever increments the buffer count.

To be clear, this is what I’m referring to here:

(gdb) bt
#0  0x00005561981f0830 in nghttp2_rcbuf_incref ()
#1  0x0000556197311ee0 in node::http2::Http2RcBufferPointerTraits::inc (buf=0x55619ad87ac0 <static_table+384>) at ../src/node_http2.h:128
#2  0x0000556197320119 in node::NgRcBufPointer<node::http2::Http2RcBufferPointerTraits>::reset (this=0x7ffc8d595980, ptr=0x55619ad87ac0 <static_table+384>, internalizable=true) at ../src/node_http_common.h:356
#3  0x000055619731c9af in node::NgHeader<node::http2::Http2HeaderTraits>::NgHeader (this=0x7ffc8d595970, env=0x55619bdc44d0, token=-1, name=0x55619ad87ac0 <static_table+384>, value=0x55619ad87ae8 <static_table+424>, flags=0 '\000') at ../src/node_http_common-inl.h:124
#4  0x0000556197316a06 in node::NgHeader<node::http2::Http2HeaderTraits>::NgHeader (this=0x7ffc8d595970, env=0x55619bdc44d0, name=0x55619ad87ac0 <static_table+384>, value=0x55619ad87ae8 <static_table+424>, flags=0 '\000') at ../src/node_http_common-inl.h:113
#5  0x00005561972f52ef in node::http2::Http2Stream::AddHeader (this=0x55619c0da040, name=0x55619ad87ac0 <static_table+384>, value=0x55619ad87ae8 <static_table+424>, flags=0 '\000') at ../src/node_http2.cc:2218
#6  0x00005561972ee7ec in node::http2::Http2Session::OnHeaderCallback (handle=0x55619c054dd8, frame=0x55619c054f80, name=0x55619ad87ac0 <static_table+384>, value=0x55619ad87ae8 <static_table+424>, flags=0 '\000', user_data=0x55619c060050) at ../src/node_http2.cc:845
#7  0x00005561981f8fb5 in nghttp2_session_mem_recv ()
#8  0x00005561972ee338 in node::http2::Http2Session::ConsumeHTTP2Data (this=0x55619c060050) at ../src/node_http2.cc:744
#9  0x00005561972f2d5c in node::http2::Http2Session::OnStreamRead (this=0x55619c060050, nread=67, buf_=...) at ../src/node_http2.cc:1797
#10 0x0000556197236593 in node::StreamResource::EmitRead (this=0x55619c05fed8, nread=67, buf=...) at ../src/stream_base-inl.h:104
#11 0x000055619741cc78 in node::LibuvStreamWrap::OnUvRead (this=0x55619c05fe80, nread=67, buf=0x7ffc8d595e00) at ../src/stream_wrap.cc:283
#12 0x000055619741c856 in node::LibuvStreamWrap::<lambda(uv_stream_t*, ssize_t, const uv_buf_t*)>::operator()(uv_stream_t *, ssize_t, const uv_buf_t *) const (__closure=0x0, stream=0x55619c05ff20, nread=67, buf=0x7ffc8d595e00) at ../src/stream_wrap.cc:205
#13 0x000055619741c886 in node::LibuvStreamWrap::<lambda(uv_stream_t*, ssize_t, const uv_buf_t*)>::_FUN(uv_stream_t *, ssize_t, const uv_buf_t *) () at ../src/stream_wrap.cc:206
#14 0x0000556197d2f363 in uv__read (stream=stream@entry=0x55619c05ff20) at ../deps/uv/src/unix/stream.c:1239
#15 0x0000556197d2f8f8 in uv__stream_io (loop=<optimized out>, w=0x55619c05ffa8, events=1) at ../deps/uv/src/unix/stream.c:1306
#16 0x0000556197d362dd in uv__io_poll (loop=loop@entry=0x55619ae09280 <default_loop_struct>, timeout=767) at ../deps/uv/src/unix/linux-core.c:431
#17 0x0000556197d23c93 in uv_run (loop=0x55619ae09280 <default_loop_struct>, mode=UV_RUN_DEFAULT) at ../deps/uv/src/unix/core.c:381
#18 0x000055619733464e in node::NodeMainInstance::Run (this=0x7ffc8d599390) at ../src/node_main_instance.cc:130
#19 0x00005561972658ad in node::Start (argc=2, argv=0x7ffc8d5995a8) at ../src/node.cc:1083
#20 0x0000556198078b7b in main (argc=2, argv=0x7ffc8d5995a8) at ../src/node_main.cc:127

i.e. yes, AddHeader() (frame 5 here) does increment the reference count.

@addaleax Confusingly, despite its name, .reset() will increase the buffer reference count on first call.

Not if it’s called without an argument.

@clshortfuse
Contributor

clshortfuse commented Jun 28, 2020

I guess I was wrong about it not being called. I had missed node_http_common-inl.h. I see how reset() works now.

I don't see why name_.reset(name, true); is called twice:

if (token == -1) {
  CHECK_NOT_NULL(name);
  name_.reset(name, true);  // Internalizable
}
CHECK_NOT_NULL(value);
name_.reset(name, true);  // Internalizable

But I don't think it really matters. I'll have to pursue another avenue. Maybe I'm on the wrong track about the nghttp2_rcbuf_decref.

@addaleax
Member

I don't see why name_.reset(name, true); is called twice:

Yeah, me neither. 😄
