What are the common patterns for including the intermediate .proto files? #2597

Closed

doubleyou opened this issue Jan 17, 2017 · 1 comment

@doubleyou
Hi folks,

I've been working with grpc-gateway and stumbled upon the following behavior of Protocol Buffers.

Say we have a file server.proto that imports a file some/location/http.proto. Then, in order to make a generated file like server_pb.py work, we need to make sure that a Python module some/location/http_pb.py actually exists and is importable. The same applies to most other languages, I guess.
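To illustrate the behavior described above: the generated code's imports mirror the .proto file paths one-to-one. A minimal sketch of that mapping for protoc's Python plugin (which in practice emits `*_pb2.py` files; the `_pb.py` names above abbreviate this):

```python
def generated_module(proto_path: str) -> str:
    """Return the dotted Python module name that protoc's --python_out
    plugin generates for a given .proto file path."""
    assert proto_path.endswith(".proto")
    stem = proto_path[: -len(".proto")]
    # "some/location/http.proto" becomes module "some.location.http_pb2",
    # so that package must exist on sys.path for the importer to succeed.
    return stem.replace("/", ".") + "_pb2"

# server.proto contains: import "some/location/http.proto";
# so the generated server_pb2.py does: from some.location import http_pb2
print(generated_module("some/location/http.proto"))  # some.location.http_pb2
```

This is why a clashing package name on sys.path breaks the generated code: the import path is fixed by the .proto path, not configurable per-build.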

This can be a problem, for example when a package with the same name already exists, and it also makes automation a bit harder. My real-life case is described in a bit more detail at grpc-ecosystem/grpc-gateway#298.

An alternative would be to walk the entire dependency tree, resolve it into a single file, and compile that file into one module. But my guess is there was a deliberate idea behind the current behavior (more explicit dependency updating, perhaps?).

So, my question is: what's the common practice for importing .proto files from external sources and making sure their compiled counterparts are also accessible? In my example with Python, I could only come up with a workaround: grpc-ecosystem/grpc-gateway#298 (comment).

Sorry if my explanation is too vague; I can provide a more concrete description of the problem, with code examples, if needed.

Thanks!

@acozzette
Member

I think the usual practice is to express the dependencies in your build system so that it can keep the generated code up to date, for example by rerunning protoc when a .proto file changes. If you prefer, you can also check in the generated code and have a script for updating it; this is a bit clunky but might be simpler depending on the situation.
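A minimal sketch of the "rerun protoc when a .proto changes" idea, using file modification times. Real build systems (Bazel, Make, etc.) track this for you; the paths and protoc flags here are illustrative assumptions, not part of any particular project's setup.

```python
import os
import subprocess


def is_stale(proto: str, generated: str) -> bool:
    """True if the generated file is missing or older than its .proto source."""
    if not os.path.exists(generated):
        return True
    return os.path.getmtime(proto) > os.path.getmtime(generated)


def regenerate_if_needed(proto: str, generated: str, out_dir: str = ".") -> None:
    """Invoke protoc only when the generated code is out of date."""
    if is_stale(proto, generated):
        subprocess.run(
            ["protoc", f"--python_out={out_dir}", "-I", ".", proto],
            check=True,
        )
```

The same staleness check extends naturally to imported .proto files: a generated module is stale if any file in its import chain is newer.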

For depending on .proto files in external repos, this varies a bit with each language and its packaging mechanism, but usually the easiest approach is to publish the generated code as part of the project artifact. This way, if you depend on a package that uses protos, the package comes with the generated code and you don't have to run protoc yourself. The one exception is C++: it doesn't seem to have a dominant packaging mechanism, and for C++ we don't attempt to keep the generated code compatible across versions of the runtime library.

@TeBoring closed this as completed Mar 7, 2017