Add a filter to support HTTP/JSON<->gRPC transcoding #1079
mattklein123 merged 42 commits into envoyproxy:master
Conversation
The class implements TranscoderInputStream from the transcoding library, taking input from Envoy::Buffer; this also allows Envoy::Buffer to be used as a protobuf::io::ZeroCopyInputStream. For envoyproxy#501
This is a strange include. I was looking for the interface and documentation, where can I find this?
The source files needed for descriptor generation, at https://github.com/lizan/envoy/blob/aa3aada1fa9e2557e34c1144d91bb2241da84ebf/test/proto/BUILD#L24 in envoyproxy/envoy#1079
The name looks weird. Do you have plans to support protocols other than REST in this filter?
Let @mattklein123 or @htuch decide the name, candidates are:
I'm fine with either, or a new name.
Force-pushed aa3aada to b68400e
    "type": "both",
    "name": "grpc_json_transcoder",
    "config": {
      "proto_descriptors": "proto.pb",
(Not a full review, will do full code review tomorrow)
Can we get a description of each of these config parameters? If I understand correctly, it's required that all descriptors get merged into a single pb file, right? What if there are multiple services from multiple protos? How do those get merged? Can you maybe provide an example?
For "services" what happens if the .pb file contains more services than you specify? Those aren't transcoded? Can you clarify?
Also, in general, I would love to talk about reflection possibilities. Obviously this is a follow up, but from Lyft perspective (and I think others), this filter is vastly less useful and more difficult to use if Envoy has to know ahead of time the .pb files and we need to keep them up to date. If we could do this with reflection I think that is the true absolutely killer feature. cc @twoism @rodaine @louiscryan
Can we get a description of each of these config parameters? If I understand correctly, it's required that all descriptors get merged into a single pb file, right? What if there are multiple services from multiple protos? How do those get merged? Can you maybe provide an example?
Will write it in the docs. In short, protoc accepts multiple input files, so just pass them all in together.
For "services" what happens if the .pb file contains more services than you specify? Those aren't transcoded? Can you clarify?
Right, those aren't transcoded.
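For illustration, a hedged sketch of a full filter entry based on the snippet above (the service name bookstore.Bookstore is hypothetical, and the PR uses both proto_descriptor and proto_descriptors in different places, so check the final docs for the exact key):

```json
{
  "type": "both",
  "name": "grpc_json_transcoder",
  "config": {
    "proto_descriptor": "proto.pb",
    "services": ["bookstore.Bookstore"]
  }
}
```

Only methods of the services listed under "services" would be transcoded; anything else in the descriptor set is ignored.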
Also, in general, I would love to talk about reflection possibilities. Obviously this is a follow up, but from Lyft perspective (and I think others), this filter is vastly less useful and more difficult to use if Envoy has to know ahead of time the .pb files and we need to keep them up to date. If we could do this with reflection I think that is the true absolutely killer feature.
I totally agree that would provide better usability; I'm just saying it won't be included in this PR, as it is big enough :) Feel free to open an issue and assign it to me once this gets merged.
@lizan please ping me once the doc comment is fixed and this is passing tests and I will take another pass.
@mattklein123 Docs are updated and travis has almost passed. PTAL, thanks!
htuch left a comment
A few more comments, it's generally looking good. One meta comment is that this PR is now quite huge, as a reviewer I would appreciate it if we could try and break it up a little further in the future. Thanks.
proto_descriptors
  *(required, string)* Supplies the binary protobuf descriptor set for the gRPC services.
  The descriptor set has to include all types that are used in the services. Make sure to
  use the ``--include_imports`` option for ``protoc``.
To generate a protobuf descriptor set for the gRPC service, you'll also need to clone the
googleapis repository.

.. code-block:: bash

  protoc -I$(GOOGLEAPIS_DIR) -I. --include_imports --include_source_info \
I think we can make this more seamless via some bazel run command in the Envoy repository, that uses our googleapis proto imports. Not for this PR, but maybe raise an issue.
I don't think bazel run is easier in this case.
The Envoy customer (not developer) might not know how to build Envoy and may just be using docker images or binaries. Asking them to check out Envoy and install Bazel sounds like too much.
protoc is something that they are already using when they try Envoy with gRPC.
Agreed I don't think we should assume that the user of envoy has access to the envoy code or bazel.
    ":transcoder_input_stream_lib",
    "//include/envoy/http:filter_interface",
    "//source/common/common:base64_lib",
    "//source/common/http:headers_lib",
We can't reference external deps like googleapis with "@googleapis", they need to appear in external_deps instead. We won't be able to perform the remapping during the Google import otherwise.
test/proto/BUILD
    srcs = [":bookstore.proto"],
    deps = [
        "@googleapis//:http_api_protos",
        "@protobuf_bzl//:cc_wkt_protos",
Same point as earlier here and everywhere, we can't use "@foo" deps, they need to be bound to something that gets referenced in external_deps.
Done, added a function in envoy_build_system.bzl and invoked it here.
I still see a reference to @protobuf_bzl here, maybe I'm looking at a stale patch..
    namespace {

    const std::string TypeUrlPrefix{"type.googleapis.com"};
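For background, protobuf type URLs are formed by joining this prefix with a message's fully qualified name; a minimal sketch (the helper function is illustrative, not PR code):

```python
TYPE_URL_PREFIX = "type.googleapis.com"

def type_url(message_full_name: str) -> str:
    # A type URL is the prefix plus the fully qualified message name,
    # e.g. "type.googleapis.com/bookstore.Shelf". The transcoder uses
    # these URLs to resolve message types from the descriptor set.
    return f"{TYPE_URL_PREFIX}/{message_full_name}"

print(type_url("bookstore.Shelf"))  # type.googleapis.com/bookstore.Shelf
```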
    const std::string proto_descriptor_file = config.getString("proto_descriptor");
    FileDescriptorSet descriptor_set;
    if (!descriptor_set.ParseFromString(Filesystem::fileReadToEnd(proto_descriptor_file))) {
      throw EnvoyException("transcoding_filter: Unable to parse proto descriptor");
I don't see any EXPECT_THROW or EXPECT_THROW_WITH_MESSAGE statements below. Are we testing and obtaining coverage of this error handling (and below)?
    Status status = methodToRequestInfo(method_descriptor, &request_info);
    if (!status.ok()) {
      return status;
Can you confirm that you've looked at coverage for this file and obtained 100%? The reason I ask is that there are a lot of error handling cases, and a coverage check is the easiest way to validate this.
Is 100% required to merge the PR? I already added some tests to cover config error cases. The coverage is currently 92.7%.
To obtain 100% coverage, it would require adding mocks for the config and transcoder. I would like to do that in a follow-up PR.
100% coverage is expected when reasonable. In this case this PR is big enough already. Let's do a follow-up, but please open an issue and link it here so that we can track it.
    name = "envoy_eds",
    actual = "@envoy_api//api:eds",
    )
    native.bind(
This seems like the wrong if statement to have this under? I will defer to @htuch for review on all of the bazel changes.
googleapis is loaded from envoy-api, so I added it here; let me know if this should go somewhere else. @htuch
Yeah, this is OK, but in theory you could source http_api_protos independently, let's keep it like this for now.
      method);
      break;
    default:
      break;
Can this happen? Should there be some kind of warning here? If it can't happen can we delete default case?
The only possibility is ::google::api::HttpRule::PATTERN_NOT_SET (added a comment here), and we do nothing in that case.
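The oneof dispatch being discussed can be sketched in hypothetical Python (the dict-based rule model is mine, not the PR's):

```python
# google.api.HttpRule's "pattern" oneof can be one of these verbs, or unset.
HTTP_VERBS = ("get", "put", "post", "delete", "patch", "custom")

def resolve_pattern(rule: dict):
    # The rule is modeled as a dict with at most one verb key set,
    # mirroring the oneof semantics.
    for verb in HTTP_VERBS:
        if verb in rule:
            return verb, rule[verb]
    # Corresponds to HttpRule::PATTERN_NOT_SET: no HTTP binding, do nothing.
    return None

print(resolve_pattern({"get": "/shelves/{shelf}"}))  # ('get', '/shelves/{shelf}')
print(resolve_pattern({}))                           # None
```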
    Http::FilterDataStatus JsonTranscoderFilter::decodeData(Buffer::Instance& data, bool end_stream) {
      if (error_) {
        return Http::FilterDataStatus::StopIterationNoBuffer;
I don't think this can happen if you send a local complete reply. I would ASSERT(!error_)
    Http::FilterHeadersStatus JsonTranscoderFilter::encodeHeaders(Http::HeaderMap& headers,
                                                                  bool end_stream) {
      if (error_) {
Use error_ || !transcoder_ and delete the duplicated return below.
@lizan P.S. I did some light copy editing on the docs and pushed a commit to your branch.
htuch left a comment
LGTM modulo remaining comments.
    config_.createTranscoder(headers, request_in_, response_in_, transcoder_, method_);

    if (!status.ok()) {
      return Http::FilterHeadersStatus::Continue;
Why continue on failure here? Please add a comment if this is intentional.
Added a comment. We shouldn't fail the request just because transcoder creation failed; the request might be a normal gRPC request that should be passed through to upstream.
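The pass-through behavior described here can be sketched as follows (hypothetical Python; the helper and its arguments are mine, not the filter's code):

```python
def should_transcode(configured_paths: set, content_type: str, path: str) -> bool:
    # Native gRPC requests and requests that match no configured
    # google.api.http rule continue to upstream unmodified instead
    # of failing at the filter.
    if content_type == "application/grpc":
        return False
    return path in configured_paths

routes = {"/shelves"}
print(should_transcode(routes, "application/json", "/shelves"))  # True
print(should_transcode(routes, "application/grpc", "/shelves"))  # False
print(should_transcode(routes, "application/json", "/unknown"))  # False
```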
    Buffer::OwnedImpl data;
    readToBuffer(*transcoder_->RequestOutput(), data);

    if (data.length()) {
    const Http::HeaderEntry* grpc_status_header = trailers.GrpcStatus();
    if (grpc_status_header) {
      uint64_t grpc_status_code;
      if (!StringUtil::atoul(grpc_status_header->value().c_str(), grpc_status_code)) {
You might want to use Common::getGrpcStatus here as it has additional error checking.
Hmm, since we're just forwarding grpc-status to the client, I don't need the additional error checking here. There is a plan to add HTTP status code mapping; will do that in a follow-up.
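What the trailer handling amounts to can be sketched as (hypothetical Python; the 0-16 range is the standard gRPC status code space, not something this PR enforces):

```python
def parse_grpc_status(value: str):
    # Strictly parse the grpc-status trailer: gRPC status codes are small
    # non-negative integers (0 = OK .. 16 = UNAUTHENTICATED). Anything
    # else is treated as malformed.
    try:
        code = int(value)
    except ValueError:
        return None
    return code if 0 <= code <= 16 else None

print(parse_grpc_status("5"))    # 5 (NOT_FOUND)
print(parse_grpc_status("abc"))  # None
```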
    std::unique_ptr<google::grpc::transcoding::TypeHelper> type_helper_;
    };

    typedef std::shared_ptr<JsonTranscoderConfig> TranscodingConfigSharedPtr;
Is there a reason this is shared ptr?
It has to be a shared_ptr to be captured into the lambda here:
https://github.com/lizan/envoy/blob/grpc_json_transcoding/source/server/config/http/grpc_json_transcoder.cc#L17
    @@ -0,0 +1,159 @@
    syntax = "proto3";

    package bookstore;
This seems a rather complicated proto for doing tests, can't we express everything we need to test in a much smaller proto?
This is already a simplified version compared to what we have in Cloud Endpoints, but feel free to add an item to #1192 to see what can be done here.
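For comparison, a sketch of a much smaller bookstore-style proto (not the PR's actual file) that would still exercise a basic get binding:

```proto
syntax = "proto3";

package bookstore;

import "google/api/annotations.proto";

service Bookstore {
  rpc GetShelf(GetShelfRequest) returns (Shelf) {
    option (google.api.http) = {
      get: "/shelves/{shelf}"
    };
  }
}

message GetShelfRequest {
  int64 shelf = 1;
}

message Shelf {
  int64 id = 1;
  string theme = 2;
}
```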
Random software engineer thanks you all. Looking forward to using this filter! +1 grpc_json_bridge
Closes #501