Support protoc --proto_path #8
Comments
Such functionality is necessary. I am struggling a little with how it should be provided. Conceptually it is a compile-time dependency for […]. It would be ideal if something like this worked:

```groovy
sourceSets {
    main {
        proto {
            importDir 'path/to/dir'
        }
    }
}
```
In earlier discussion with the protobuf team I suggested we primarily provide this functionality through deps on Maven Central, like JARs with the protos inside. I've not looked at the suggested config yet.
For this use case we actually want to compile the protos from the JARs, which is what the plugin is doing right now. @mmmdreg, can you point me to a project that would benefit from this feature? That would help me decide what the best solution is.
It's internal so I can't share specifics, but a simplified form looks like:

- SharedProject
- DependentProject1 (Swing client)
- DependentProject2 (Swing client)
- PackagedProject (Java applet)

The protos in the dependent projects import some protos from the shared project. If we were to prevent proto compilation on the shared project, the packaged project would have to deal with the copies compiled by both dependent projects. It's cleaner to simply allow import without compilation. Refactoring suggestions are welcome, but the immediate goal is to achieve functionality similar to our Maven build with minimal structural changes; the application is a bit legacy. Maybe two types of configuration dependencies for the two cases could work. It would also be nice to create dependencies on configurations of sibling projects without needing to package the protos into archives, e.g.:

```groovy
protobuf project(path: ':sharedProject', configuration: 'protos')
```
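The sibling-project idea above could look something like the following sketch. All configuration and task names here are hypothetical illustrations of the approach, not the plugin's actual API:

```groovy
// sharedProject/build.gradle -- expose the raw .proto files via a custom
// configuration, separate from the compiled classes.
configurations {
    protos
}

task protoZip(type: Zip) {
    from 'src/main/proto'
    classifier = 'proto'
}

artifacts {
    protos protoZip
}

// dependentProject1/build.gradle -- pull the protos straight from the
// sibling project, without a repository round-trip.
dependencies {
    protobuf project(path: ':sharedProject', configuration: 'protos')
}
```

The custom configuration carries only proto sources, so a consumer can choose to add them to the include path rather than treat them as compiled code.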
But I'm saying we want the import from deps on Maven Central. Also, I didn't think the current compile-from-JAR support worked with JARs from Maven Central; does it?
The dependencies API supports pulling from Maven Central. I'm not sure whether the way the plugin uses the API pulls artifacts; if not, it should be trivial to make it do so.
So how do we see someone using `importDir` along with a JAR dependency?
@mmmdreg ideally […]
I think the issue was that the build of the dependent projects included the artifacts (i.e. compiled Java) from the shared project, so adding them to the `--proto_path` and compiling them again would mean duplicate classes as well as wasted effort. It's possible to exclude these duplicates, but that seems unnecessary too. The Maven protobuf plugin by default adds proto definitions from any Maven dependencies to the include path, so the shared protos are compiled once and included as a normal dependency, and the dependent protos are compiled against the included shared protos without recompiling them. For now I will just use gradle-protoc-plugin, but in any case it would be nice if you could expose this protoc feature at some point.
I thought adding the proto directories of the shared project to the dependent projects' […]
Oh right, I misread your post. That's exactly right. :)
One issue we ran into while figuring out how to share message definitions was the question of where the Java code should be generated, compiled, and shared. The problem is that the Java code generated by protoc is tied to a specific version of the protoc compiler, which in turn expects the generated code to be compiled against a specific version of the protobuf-java library. Not only must it be compiled against that version, that same version MUST also be used at runtime; otherwise you can get linking errors if, for example, you used protoc 2.3 to generate the Java code but end up using version 2.5 of the library at runtime.

The general solution we came up with was to NEVER share the generated Java classes, but instead always share the proto artifacts; each deployable service (not a library or API) should ALWAYS generate its own Java version of the protos and compile them against whatever supported protoc version it wishes. For libraries and APIs, we generally recommend not exposing signatures that depend on specific generated protoc Java classes, but instead defining interface facades, so that the consuming service, if it chooses to, can provide a wrapper implementation that uses the underlying proto object as a data source. This also has the nice benefit of providing a place to define business logic and other processing functions, since a protobuf message is a data structure and not a rich object. To simplify the coding effort and code maintenance, we rely on protoc code-generator plugins to apply code templates that encapsulate the wrapper class code, so it all gets embedded into the generated Java class, i.e. in a […]. The good news is that the code templates can also be packaged as artifacts and shared as build dependencies.

:) So if it seems like a complex setup, it only sounds that way from the explanation; in reality it is very simple and has a very low maintenance cost, while virtually preventing any Java serialization or signature-incompatibility linking errors from throwing a wrench into the gearbox. Since protos are really a communication data protocol, this isolates the wire data from the in-JVM rich implementation. While we want to standardize and optimize the wire protocol, we are all about in-JVM flexibility of usage, so trying to have EVERY repo depend on exactly the same version of a specific library is way too tight a coupling. This design also helps with creating easy test mocks, since the interfaces don't have to be proto wrappers but can just be locally created mocks. All in all, we consider this solution a triple win. Feel free to ask clarifying questions in case I've rambled through some things without clearly explaining the details; I'd be happy to explain and provide more info.
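The interface-facade pattern described above can be sketched as follows. All class and method names here are hypothetical, and `UserProto` is a hand-written stand-in for a protoc-generated message class (real code would come from protoc and depend on protobuf-java):

```java
// The API a library exposes instead of the generated proto class itself.
interface User {
    String name();
    int age();
}

// Stand-in for a protoc-generated message (illustration only).
class UserProto {
    private final String name;
    private final int age;
    UserProto(String name, int age) { this.name = name; this.age = age; }
    String getName() { return name; }
    int getAge() { return age; }
}

// Consumer-side wrapper: uses the proto object purely as a data source,
// so the library's signatures never mention generated classes.
class ProtoBackedUser implements User {
    private final UserProto proto;
    ProtoBackedUser(UserProto proto) { this.proto = proto; }
    @Override public String name() { return proto.getName(); }
    @Override public int age() { return proto.getAge(); }
}
```

Because callers only see `User`, a test can substitute any other implementation, and each service is free to regenerate `UserProto` with its own protoc version.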
I agree with the idea behind "never share the generated Java classes," and I've spent a lot of time educating others as to why it is required and why it sucks (but is the only short-term option). However, protobuf-java will soon 1) ship .proto files for the well-known protobufs and 2) include pre-compiled code in the runtime. We also have cases where multiple projects within a single application want to use common protos, and files that depend on those. Using --proto_path actually works fine in these scenarios and prevents multiple copies of proto-generated .class files without adding another layer of abstraction. We are going to need to differentiate whether or not we generate code for "dependencies." If we are generating code for a "dependency," it isn't acting like a dependency so much as an extension of the source.
@ejona86, completely agree with the need to differentiate. IMHO it seems that if you declare something as […] Thoughts? P.S. […]
@aantono, dependencies need not be external, and external artifacts need not be "dependencies." As we've already done for protoc itself, we can grab artifacts even if they aren't in the […]. Having a separate protobufInclude could work. Concerning the P.S., this is why I brought up protobuf-java itself: it is going to become a common occurrence to use […]
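For illustration, the split being discussed might look like this in a build script. The `protobufInclude` configuration is hypothetical, not an existing plugin feature, and the coordinates are placeholders:

```groovy
dependencies {
    // Protos that protoc should compile into this project's generated sources.
    protobuf 'com.example:shared-protos:1.0'

    // Protos added only to --proto_path so that imports resolve;
    // no code is generated for them in this project.
    protobufInclude 'com.example:common-protos:1.0'
}
```

The distinction mirrors the earlier point: `protobuf` deps act as an extension of the source, while `protobufInclude` deps act like true compile-time dependencies.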
It does make sense to support both (all 3) then... One thing we do have to do now is to copy the […]
You can already see that […]. It seems like we are on the same page as far as what features to support, but how we expose the features to the user isn't quite agreed.
Agreed. I think we just need to decide what would be the most appropriate and intuitive way to define the configuration that makes sense. |
There is a third option -- we search in the […]
This sounds good, but I fear that searching through every single dependency JAR for included […] @ejona86, thoughts?
I had a bit of concern about searching through the JAR files as well, but really, that is already done during compilation. In addition, ZIP does not have a tree of directories; it just has a flat list of files (some with […]). Since this would only be used with […]. Overall, sounds fine.
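The "flat list of entries" point above means a scan is a single pass over entry names, not a directory walk. A minimal sketch using the standard `java.util.zip` API (class and method names are my own):

```java
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class ProtoScan {
    // Collect every .proto entry in a JAR/ZIP. A ZIP archive stores a flat
    // list of entries, so one linear pass over the names is all that's needed.
    static List<String> protoEntries(File jar) throws IOException {
        List<String> found = new ArrayList<>();
        try (ZipFile zip = new ZipFile(jar)) {
            for (Enumeration<? extends ZipEntry> e = zip.entries(); e.hasMoreElements(); ) {
                ZipEntry entry = e.nextElement();
                if (!entry.isDirectory() && entry.getName().endsWith(".proto")) {
                    found.add(entry.getName());
                }
            }
        }
        return found;
    }
}
```

This is the same amount of I/O the Java compiler already performs when it reads classes out of a dependency JAR, which is why the performance concern is modest.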
This has converged with #15. |
When you define protos that import definitions from another project, `protoc --proto_path=blah` allows you to depend on those definitions without re-compiling them.
In your plugin, adding another srcDir compiles the dependent protos, resulting in duplicates in a multi-module project. The Maven protobuf plugin correctly adds dependent protos to the include path without compiling them, as does Thomas Lee's gradle-protoc-plugin.
It would be great if you could include the ability to modify the include path only, without adding the included proto files as arguments requiring compilation. Whether this is done using simple filesets or dependency archives isn't an issue.
Thanks
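Concretely, the requested behavior maps onto protoc's own command line as below. The paths are placeholders for illustration; this assumes the project layout from the discussion above:

```sh
# Only files passed as positional arguments are compiled.
# --proto_path directories are merely searched when resolving imports.
protoc \
  --proto_path=dependentProject1/src/main/proto \
  --proto_path=sharedProject/src/main/proto \
  --java_out=dependentProject1/build/generated-sources \
  dependentProject1/src/main/proto/client.proto
```

Here sharedProject's protos resolve the imports in `client.proto` but produce no Java output of their own, which is exactly the "import without compile" behavior being requested.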