Tensor Bridge: OpenAPI spec and REST wrapper around TensorFlow Serving

CoventryResearch/tf-bridge

Repository files navigation

Tensor Bridge

Tensor Bridge is an OpenAPI specification for TensorFlow Serving, together with a simple REST wrapper built on Connexion.

The specification was obtained by compiling an annotated tensor_bridge.proto using grpc-gateway. The result is located in swagger/tensor_bridge.json.

How is this useful?

The publicly available version of TensorFlow Serving works over gRPC.

With this API you can build your own REST service and use JSON to talk to your TensorFlow models. A full example is included in this repo (see app.py and api/). If you prefer Go, you can even generate a reverse proxy automatically using grpc-gateway.
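To give a feel for what talking JSON to the bridge looks like, here is a minimal sketch in Python using only the standard library. The endpoint path and payload shape below are illustrative assumptions, not the bridge's actual routes — consult swagger/tensor_bridge.json or the Swagger UI for the exact API:

```python
import json
import urllib.request

# Assumed host/port, matching the docker run command later in this README.
BRIDGE_URL = "http://localhost:9001"


def build_predict_request(model_name, inputs):
    """Assemble a predict-style JSON body for the named model.

    The structure here (model_spec + inputs) mirrors the shape of
    TensorFlow Serving's PredictRequest, but treat it as a sketch:
    check the generated spec for the real schema.
    """
    return {
        "model_spec": {"name": model_name},
        "inputs": inputs,
    }


def predict(model_name, inputs, path="/v1/prediction"):
    # NOTE: "/v1/prediction" is a placeholder path, not taken from the spec.
    body = json.dumps(build_predict_request(model_name, inputs)).encode("utf-8")
    req = urllib.request.Request(
        BRIDGE_URL + path,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Build (but don't send) a request body for the bundled MNIST model.
    print(build_predict_request("mnist", {"images": [[0.0] * 784]}))
```

Sending the request requires a running bridge, but the payload-building half is plain data munging and works anywhere.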

Installation

Simply run

docker build -t tf-bridge .

from the project root.

This will take a while, as TensorFlow and TensorFlow Serving are compiled from source. Consider dedicating around 6-8 GB of RAM to Docker.

Once the image is built, you can start the servers:

docker run -d -p 9001:9001 -p 9000:9000 -e MODEL=mnist tf-bridge

Tensor Bridge can be queried on port 9001.

The gRPC endpoint remains available on port 9000 for convenience and testing.

You will notice that the MODEL environment variable selects which model to serve. As an example, an exported MNIST model is included in this repo.

To see the Swagger UI, go to http://localhost:9001/ui/

Client

There is also a simple client in client/mnist_client.py for testing purposes. Make sure to install the necessary dependencies from requirements.txt first.

If everything went well, you will shortly see the following output:

Inference error rate: 10.4%
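The reported figure is simply the fraction of test images the served model misclassifies. As a rough sketch of the computation (the function name and list-based inputs here are illustrative, not the client's actual code):

```python
def error_rate(predictions, labels):
    """Fraction of examples where the predicted class differs from the label."""
    wrong = sum(1 for p, y in zip(predictions, labels) if p != y)
    return wrong / len(labels)


if __name__ == "__main__":
    # 1 mismatch out of 4 examples -> 25.0%
    rate = error_rate([1, 2, 3, 4], [1, 2, 3, 0])
    print(f"Inference error rate: {rate * 100:.1f}%")
```

A rate around 10% is expected for the simple bundled MNIST model; it is a smoke test for the bridge, not a state-of-the-art classifier.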
