This repository contains a sample backend API implemented in Go as part of a job application process.
- Data is stored in the low-code platform Microsoft Dataverse, accessed via OData Web API.
- Tables: `Player`, `Character`, `Class`, `Race`, `DiceRoll`.
- A dedicated endpoint should return all `Players` and their `Characters`, using a Redis cache.
- There should be simple business logic:
  - The `Player` fields `VatId` and `Address` should be fetched automatically from the ARES service.
  - The `Character` fields `Strength`, `Dexterity`, `Intelligence`, and `Charisma` are the sum of (see the sketch after this list):
    - User input.
    - Base value from the `Class`.
    - Base value from the `Race`.
    - Random value from the `DiceRoll`. The roll is stored for audit purposes.
- Development should be optimized for a small team of 1–2 Go developers and 1–2 UI developers.
- The backend is consumed primarily by Vue.js UI.
- The solution is tailored for agile development and fast modifications.
- No existing Go libraries generate entities from Dataverse/OData models.
- Quality libraries for this domain exist primarily for `C#` and `Java`.
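For concreteness, the attribute rule above boils down to a simple sum per attribute. A minimal sketch in Go, assuming illustrative type and field names (the real entities are generated from the Dataverse model, as described later):

```go
// Minimal sketch of the attribute rule; all types and field names are
// illustrative, not the repository's generated Dataverse entities.
package main

import (
	"fmt"
	"math/rand"
)

type Attributes struct {
	Strength, Dexterity, Intelligence, Charisma int
}

// rollDice returns a random 1-20 roll; the real application stores the
// roll as a DiceRoll record for audit purposes.
func rollDice() int { return rand.Intn(20) + 1 }

// finalAttributes sums user input with the Class and Race base values and
// a random dice roll for each attribute, returning the rolls for auditing.
func finalAttributes(input, classBase, raceBase Attributes) (result, rolls Attributes) {
	rolls = Attributes{rollDice(), rollDice(), rollDice(), rollDice()}
	result = Attributes{
		Strength:     input.Strength + classBase.Strength + raceBase.Strength + rolls.Strength,
		Dexterity:    input.Dexterity + classBase.Dexterity + raceBase.Dexterity + rolls.Dexterity,
		Intelligence: input.Intelligence + classBase.Intelligence + raceBase.Intelligence + rolls.Intelligence,
		Charisma:     input.Charisma + classBase.Charisma + raceBase.Charisma + rolls.Charisma,
	}
	return result, rolls
}

func main() {
	final, rolls := finalAttributes(Attributes{Strength: 2}, Attributes{Strength: 1}, Attributes{Dexterity: 1})
	fmt.Println(final, rolls) // rolls would be persisted as DiceRoll records
}
```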
- 🚀 Designed for rapid development in a small team with easy testing:
- Fast, high-quality development is enabled by the right technology choices.
- Dependencies are selected with a focus on long-term maintainability and easy updates.
- The code is divided into small, single-purpose, easily testable packages.
- ⚡ The API structure is defined using the Protocol Buffers format.
- A Go server and a JS/TS client for the UI are generated using Connect RPC, read more.
- 🤖 Development benefits from AI assistance:
- AI is used to modify or extend the API by adjusting Protocol Buffer definitions.
- Definitions are much shorter than the code itself, reducing issues with AI context-size limitations.
- The `proto` definition language is limited, decreasing the risk of errors.
- There is no variability in code style.
- As a result, 🕵️‍♂️ reviewing the generated definitions is much simpler than reviewing generated code.
- Business logic code is also streamlined with AI, focusing solely on the logic itself.
- ✔️ Automatic input/output validation:
- Validation rules are part of the Protocol Buffers definitions.
- 📊 OpenTelemetry standard manages logs, traces, and metrics:
- Data can be exported to various services, read more.
- ✍ Manual work was minimized.
- ✅ Functionality is validated by end-to-end tests, read more.
- ⏱️ Implementation took approximately 5 MD (man-days).
- Most time was spent on Dataverse: studying and creating the code generator.
- Significant effort went into selecting technologies for the API.
The repository follows the recommended Go project layout.
Development uses Docker to ensure consistent environments.
- .devcontainer/Containerfile: `dev` image specification.
- .devcontainer/compose.yaml:
  - Defines services:
    - `dev`: Main DEV container.
      - Run the API server using `make run` and open http://localhost:8000.
    - `dev-no-ports`: DEV container without exposed ports, for IDE integration or test execution.
    - `redis`: Redis cache server.
    - `redisinsight`: Redis Insight, a UI for Redis (http://localhost:5540).
    - `telemetry`: OpenObserve server, an example telemetry receiver.
      - Processes logs, traces, and metrics (http://localhost:5080).
Common tasks are defined in the Makefile:
- `make clean`: Remove all Docker containers.
- `make shell`: Start a shell in the `dev` container.
- Other `Makefile` commands should be executed inside the `dev` container (`make shell`):
  - `make lint`: Code linting.
  - `make fix`: Code linting with automatic fixes.
  - `make test`: Run all tests.
  - `make deps-tidy`: Update `go.mod` and `go.sum`.
  - `make deps-upgrade`: Interactive dependency upgrade.
  - `make gen-model`: Generate Go entities and repositories from the Dataverse model.
  - `make gen-api`: Generate the API server and JavaScript client from protobuf definitions.
  - `make gen-wire`: Generate dependency initialization code with Google Wire.
  - `make buf-lint`: Lint protobuf files.
  - `make buf-update`: Update external protobuf files.
Code linting uses GolangCI-Lint, configured in build/ci/lint.yaml.
Telemetry is implemented using OpenTelemetry:
- Logs, traces, and metrics can be sent to various backends.
- Logs use a structured JSON format.
- The slog package from the standard library is used (see the sketch below).
- Logs are always sent to `stdout`, and optionally also to a remote OpenTelemetry HTTP endpoint.
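For illustration, a minimal sketch of the logging setup described above, using only the standard library (the project's actual logger lives under internal/pkg/common/log and additionally supports the optional OpenTelemetry HTTP exporter):

```go
// Minimal sketch: structured JSON logs written to stdout via slog.
// The optional remote OpenTelemetry HTTP exporter is not shown here.
package main

import (
	"log/slog"
	"os"
)

func main() {
	logger := slog.New(slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{
		Level: slog.LevelInfo,
	}))
	slog.SetDefault(logger)

	slog.Info("server started", slog.String("listenAddress", "0.0.0.0:8000"))
}
```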
- The application implements a straightforward approach to graceful shutdown.
- Services must be terminated in reverse order of their creation.
- This simple LIFO stack is implemented by the shutdown package.
- The main `context` is created in main.go and can be terminated with `SIGTERM`, triggering a graceful shutdown.
- Examples: `down.OnShutdown(...)` calls register cleanup callbacks that run on shutdown (see the sketch below).
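A minimal sketch of the LIFO idea behind the shutdown package; the actual API of the package (and of `down.OnShutdown`) may differ:

```go
// Minimal sketch of a LIFO shutdown stack: callbacks registered first run
// last, so services stop in reverse order of creation. The real shutdown
// package in this repository may expose a different API.
package main

import (
	"context"
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

type Stack struct {
	callbacks []func(context.Context)
}

// OnShutdown registers a callback, mirroring the idea behind down.OnShutdown.
func (s *Stack) OnShutdown(fn func(context.Context)) {
	s.callbacks = append(s.callbacks, fn)
}

// Shutdown runs callbacks in reverse registration order (LIFO).
func (s *Stack) Shutdown(ctx context.Context) {
	for i := len(s.callbacks) - 1; i >= 0; i-- {
		s.callbacks[i](ctx)
	}
}

func main() {
	// The main context is cancelled by SIGTERM/SIGINT, triggering graceful shutdown.
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM, os.Interrupt)
	defer stop()

	down := &Stack{}
	down.OnShutdown(func(context.Context) { fmt.Println("close Redis client") }) // created first, stopped last
	down.OnShutdown(func(context.Context) { fmt.Println("stop HTTP server") })   // created last, stopped first

	<-ctx.Done()
	down.Shutdown(context.Background())
}
```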
Application-wide configuration is defined in internal/pkg/app/demo/config/config.go.
- Each component/package has its own configuration structure.
- It encourages the single-responsibility principle.
- Each part is independently testable.
- All partial configurations combine into the application configuration.
- Configuration can be set using ENVs or CLI flags.
- If needed in the future, the configuration can also be loaded from a YAML or JSON file.
Partial configurations:
- Logger: internal/pkg/common/log/config/config.go
- Telemetry: internal/pkg/common/telemetry/config/config.go
- Dataverse Web API: internal/pkg/common/dataverse/webapi/config.go
- Server: internal/pkg/app/demo/server/config/config.go
- Redis: internal/pkg/common/cache/redis/config.go
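For illustration, a partial configuration might look roughly like the sketch below. The kong-style struct tags (github.com/alecthomas/kong) are an assumption inferred from the help output format; the repository's actual structs and tags may differ:

```go
// A hedged sketch of partial configuration structs, assuming a kong-style
// flag/ENV parser; field names and tags are illustrative.
package config

// ServerConfig mirrors the --server-listen-address flag and the
// DEMO_SERVER_LISTEN_ADDRESS environment variable shown below.
type ServerConfig struct {
	ListenAddress string `name:"server-listen-address" env:"DEMO_SERVER_LISTEN_ADDRESS" default:"0.0.0.0:8000" help:"Address the HTTP server listens on."`
}

// RedisConfig mirrors the --redis-* flags.
type RedisConfig struct {
	Address  string `name:"redis-address" env:"DEMO_REDIS_ADDRESS" help:"Redis server address."`
	Username string `name:"redis-username" env:"DEMO_REDIS_USERNAME" help:"Redis username."`
	Password string `name:"redis-password" env:"DEMO_REDIS_PASSWORD" help:"Redis password."`
	DB       int    `name:"redis-db" env:"DEMO_REDIS_DB" default:"0" help:"Redis database index."`
}

// Config combines the partial configurations into the application configuration.
type Config struct {
	Server ServerConfig `embed:""`
	Redis  RedisConfig  `embed:""`
}
```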
Run `go run ./cmd/demo --help` to see all available options:
Usage: demo [flags]
Note: Each flag can be set as an ENV.
Flags:
-h, --help
--logger-exporter="none" ($DEMO_LOGGER_EXPORTER)
--logger-http-endpoint-url="http://localhost:4318/v1/logs" ($DEMO_LOGGER_HTTP_ENDPOINT_URL)
--logger-http-authorization="Basic ...." ($DEMO_LOGGER_HTTP_AUTHORIZATION)
--telemetry-trace-exporter="none" ($DEMO_TELEMETRY_TRACE_EXPORTER)
--telemetry-trace-http-endpoint-url="http://localhost:4318/v1/traces" ($DEMO_TELEMETRY_TRACE_HTTP_ENDPOINT_URL)
--telemetry-trace-http-authorization="Basic ...." ($DEMO_TELEMETRY_TRACE_HTTP_AUTHORIZATION)
--telemetry-metric-exporter="none" ($DEMO_TELEMETRY_METRIC_EXPORTER)
--telemetry-metric-http-endpoint-url="http://localhost:4318/v1/metrics" ($DEMO_TELEMETRY_METRIC_HTTP_ENDPOINT_URL)
--telemetry-metric-http-authorization="Basic ...." ($DEMO_TELEMETRY_METRIC_HTTP_AUTHORIZATION)
--server-listen-address="0.0.0.0:8000" ($DEMO_SERVER_LISTEN_ADDRESS)
--model-tenant-id=STRING ($DEMO_MODEL_TENANT_ID)
--model-client-id=STRING ($DEMO_MODEL_CLIENT_ID)
--model-client-secret=STRING ($DEMO_MODEL_CLIENT_SECRET)
--model-api-host=STRING ($DEMO_MODEL_API_HOST)
--model-debug-request ($DEMO_MODEL_DEBUG_REQUEST)
--model-debug-response ($DEMO_MODEL_DEBUG_RESPONSE)
--redis-address=STRING ($DEMO_REDIS_ADDRESS)
--redis-username=STRING ($DEMO_REDIS_USERNAME)
--redis-password=STRING ($DEMO_REDIS_PASSWORD)
--redis-db=0 ($DEMO_REDIS_DB)
Dependency management in Go usually avoids "magic" frameworks.
- Dependencies are parameters to a constructor or a provider function.
- This project uses Google Wire to automate service wiring.
Example:
- ares.NewClient depends on `*http.Client`, which is provided by httpclient.New.
  - The wire.go file in the `httpclient` package defines `var WireSet = wire.NewSet(New)`.
  - It means the `New` function should be used to create `*http.Client`.
- All application dependencies are defined in internal/pkg/app/demo/cmd/wire.go.
  - It includes `httpclient.WireSet` and `ares.NewClient` from the example above.
- The `make gen-wire` command generates wire_gen.go, composing all dependencies together.
- Different initialization code can be generated for tests, using mocked services.
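A hedged sketch of what such a wire.go injector looks like, using local stand-ins for the httpclient and ares packages (the real providers live under internal/pkg/...; names here are illustrative). Running `make gen-wire` turns the injector into plain initialization code in wire_gen.go:

```go
//go:build wireinject

// wire.go - a sketch of the injector pattern; stand-in types replace the
// real httpclient and ares packages so the example is self-contained.
package main

import (
	"net/http"

	"github.com/google/wire"
)

// AresClient is a stand-in for the result of ares.NewClient (assumption).
type AresClient struct{ httpClient *http.Client }

// NewHTTPClient mirrors httpclient.New: it provides *http.Client.
func NewHTTPClient() *http.Client { return &http.Client{} }

// WireSet mirrors `var WireSet = wire.NewSet(New)` in the httpclient package.
var WireSet = wire.NewSet(NewHTTPClient)

// NewAresClient mirrors ares.NewClient: it depends on *http.Client.
func NewAresClient(c *http.Client) *AresClient { return &AresClient{httpClient: c} }

// newAresClient is the injector; the wire tool (via `make gen-wire`)
// replaces its body with generated initialization code in wire_gen.go.
func newAresClient() *AresClient {
	wire.Build(WireSet, NewAresClient)
	return nil
}
```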
The main challenge was integrating the Dataverse low-code model with Go.
As mentioned in the Analysis, Dataverse and Go are not a typical combination of technologies.
- There are no high-quality pre-built solutions available.
- However, since it was a strict requirement of the assignment, I took on the challenge and made it work.
A simple HTTP client has been composed to interact with the Dataverse Web API.
- See the dataverse/webapi package.
- OAuth2 authorization is handled transparently by the golang.org/x/oauth2 package.
A simple Metadata API client has been created to fetch entity metadata.
- See the dataverse/metadata package.
- It fetches custom `tables` and their `columns`.
Dataverse metadata is used in a custom code generator.
- The main part of the generator logic is in dataverse/entitygen/entity.go.
- It generates Go structures representing individual tables / entities.
- It also generates a repository with methods like `Create`, `Update`, `Delete`, `ById`, and others.
- For example, the `fields` method generates the struct fields.
- Similarly, the `createMethod` generates the `Create` method in the repository.
- It is not necessary to study the code generator in detail.
  - See the examples of generated code below.

Code has been generated using `make gen-model`.
Generated code is committed to the repository:
Features:
- webapi.Lookup[T] represents foreign key references.
- webapi.ChangeSet supports batching multiple changes atomically❗
- The `<entity>.TrackChanges` method provides change tracking.
- `PATCH` updates contain changed fields only❗
- Use the ENVs below to log request/response details to `stdout`:
  - `DEMO_MODEL_DEBUG_REQUEST=true`
  - `DEMO_MODEL_DEBUG_RESPONSE=true`
Functionality of generated code was validated with a test:
Further examples are demonstrated in the API Service section below.
- The API service is defined using Protocol Buffers `.proto` files:
- The API model is distinct from the Dataverse model, as they are not and will never be 1:1.
- Validation rules are included in the definition using bufbuild/protovalidate.
Advantages
- The syntax is concise and allows for easy modifications.
- Works well with AI due to its limited DSL nature, reducing potential errors and simplifying code review.
- The Protocol Buffers language is a Google-developed standard with long-term stability.
Alternatives
- There are several ways to generate a server from the `proto` service definition.
- I chose Connect RPC.
  - The code is generated using `make gen-api`.
- Fewer layers compared to alternatives.
- Part of the Cloud Native Computing Foundation.
- Utilizes many components from Go's standard library, ensuring higher reliability.
- Built on HTTP, without requiring HTTP/2, making browser calls straightforward.
- All requests use the HTTP `POST` method.
- Connect RPC does not seek to be compatible with REST standards and OpenAPI.
  - Hence its simplicity; for example, fields are not separated between `query` and `body`.
- These are advantages, especially in our case, where the API is directly consumed by the UI.
- gRPC
  - More suitable for communication between microservices.
  - Requires HTTP/2.
- gRPC + gRPC-Gateway
  - Allows for RESTful API generation from gRPC definitions.
  - More complex; requires more components.
  - If necessary, the definitions can easily be supplemented.
  - The result would be a REST API that follows the standards.
  - But it is (unnecessarily) more work.
Connect RPC example:
rpc UpdatePlayer(UpdatePlayerRequest) returns (Player);
gRPC-Gateway example:
rpc UpdatePlayer(UpdatePlayerRequest) returns (Player) {
  option (google.api.http) = {
    patch: "/player/{id}"
    body: "*"
  };
}
- A Go server is generated based on the `.proto` files mentioned above.
  - Example generated code: api/gen/go/demo/v1/apiconnect/api.connect.go
  - The key part is the `ApiServiceHandler` interface, which must be implemented by us (see the sketch below).
  - The rest is managed by the generated code.
- A JS/TS client for the web is also generated from the same definitions❗
  - Example generated code: api/gen/ts/demo/v1/api_pb.ts
  - Integration with Connect Query for TanStack Query can also be generated:
    - TanStack Query is "asynchronous state management for TS/JS, React, Solid, Vue, Svelte, and Angular".
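A hedged sketch of implementing one method of the generated `ApiServiceHandler` interface. The message types below are local stand-ins for the protobuf-generated ones in api/gen/go/demo/v1; the actual method set, names, and fields are defined by the generated code, not by this example:

```go
// Sketch of a Connect RPC handler method; stand-in structs replace the
// protobuf-generated request/response messages from api/gen/go/demo/v1.
package service

import (
	"context"
	"errors"

	"connectrpc.com/connect"
)

// Stand-ins for the generated messages (assumption, for illustration only).
type UpdatePlayerRequest struct {
	Id   string
	Name string
}

type Player struct {
	Id   string
	Name string
}

// ApiService would implement the generated ApiServiceHandler interface.
type ApiService struct{}

// UpdatePlayer shows the handler shape: a typed request in, a typed
// response out, and Connect error codes for failures. In the real service,
// the mapper and playerbiz packages would be called here.
func (s *ApiService) UpdatePlayer(
	ctx context.Context,
	req *connect.Request[UpdatePlayerRequest],
) (*connect.Response[Player], error) {
	if req.Msg.Id == "" {
		return nil, connect.NewError(connect.CodeInvalidArgument, errors.New("id is required"))
	}
	return connect.NewResponse(&Player{Id: req.Msg.Id, Name: req.Msg.Name}), nil
}
```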
The mapper package provides mapping between Dataverse entities and API models:
The service package implements service methods:
The playerbiz package implements player business logic:
The characterbiz package implements character business logic:
- Results of the `ListPlayersAndCharacters` method are cached in Redis.
- API responses are cached directly, not the entities from Dataverse; in this use case the distinction doesn't really matter.
- The cache is invalidated using `tags` whenever any `Player` or `Character` is updated (see the sketch below).
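A hedged sketch of how tag-based invalidation can work with go-redis; the cache package under internal/pkg/common/cache/redis may implement it differently (key layout, TTLs, and method names here are assumptions):

```go
// A minimal sketch of tag-based cache invalidation using go-redis.
// Key names, tag layout, and TTLs are assumptions, not the repository's
// actual implementation.
package cache

import (
	"context"
	"time"

	"github.com/redis/go-redis/v9"
)

type TaggedCache struct {
	client *redis.Client
	ttl    time.Duration
}

// Set stores a value and records its key under each tag set, so the entry
// can later be invalidated by tag (e.g. "player", "character").
func (c *TaggedCache) Set(ctx context.Context, key string, value []byte, tags ...string) error {
	if err := c.client.Set(ctx, key, value, c.ttl).Err(); err != nil {
		return err
	}
	for _, tag := range tags {
		if err := c.client.SAdd(ctx, "tag:"+tag, key).Err(); err != nil {
			return err
		}
	}
	return nil
}

// InvalidateTag deletes all cached entries associated with the tag,
// e.g. when any Player or Character is updated.
func (c *TaggedCache) InvalidateTag(ctx context.Context, tag string) error {
	keys, err := c.client.SMembers(ctx, "tag:"+tag).Result()
	if err != nil {
		return err
	}
	if len(keys) > 0 {
		if err := c.client.Del(ctx, keys...).Err(); err != nil {
			return err
		}
	}
	return c.client.Del(ctx, "tag:"+tag).Err()
}
```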
How to try the application.
- Clone the repo: `git clone https://github.com/michaljurecko/api-demo.git`
- Open the directory: `cd api-demo`
- Copy `env.example` to `env.local`: `cp env.example env.local`
- Edit `env.local` and set the following variables:
  - `DEMO_MODEL_TENANT_ID=...`
  - `DEMO_MODEL_CLIENT_ID=...`
  - `DEMO_MODEL_CLIENT_SECRET=...`
  - `DEMO_MODEL_API_HOST=...`
- Optionally, enable export of logs, traces, and metrics to OpenObserve:
  - Set the following variables:
    - `DEMO_LOGGER_EXPORTER=http`
    - `DEMO_TELEMETRY_TRACE_EXPORTER=http`
    - `DEMO_TELEMETRY_METRIC_EXPORTER=http`
    - `DEMO_LOGGER_HTTP_AUTHORIZATION="Basic <token>"`
    - `DEMO_TELEMETRY_TRACE_HTTP_AUTHORIZATION="Basic <token>"`
    - `DEMO_TELEMETRY_METRIC_HTTP_AUTHORIZATION="Basic <token>"`
  - Start the OpenObserve container: `docker compose -f .devcontainer/compose.yaml up -d telemetry`
  - Open http://localhost:5080/web/ingestion/custom/logs/otel
  - Copy the token and use it as the `<token>` placeholder in the variables above.
- Run the `dev` container: `make shell`
- In the `dev` container, start the server: `make run`
  - If you want raw logs with all details, you can use `make run-raw` instead.
  - A graceful shutdown can be triggered by `Ctrl+C` (`SIGTERM`).
- The server is running at http://localhost:8000.
  - The root endpoint `/` contains interactive OpenAPI documentation for easy testing.
  - All endpoints use the HTTP `POST` method; read "Things that can surprise".
To run all unit and E2E tests, execute `make test`.
E2E tests are implemented using ginkgo.
- The server is launched on a random port during tests.
- It provides true end-to-end testing.
- Requests are executed using the generated Go client.
- Player tests
- Character tests
- Class tests
- Race tests
- Aggregation tests - test Redis cache invalidation
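The tests listed above follow the usual ginkgo structure, roughly as sketched below; in the real suite the server is started on a random port and requests go through the generated Connect Go client, which is omitted here:

```go
// Sketch of the ginkgo/gomega test shape; the real specs call the API via
// the generated Connect Go client against a server on a random port.
package e2e_test

import (
	"testing"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

func TestE2E(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "E2E Suite")
}

var _ = Describe("Player", func() {
	It("creates a player", func() {
		// Placeholder for: create via the generated client, then assert
		// on the typed response returned by the API.
		created := struct{ Name string }{Name: "Aragorn"}
		Expect(created.Name).To(Equal("Aragorn"))
	})
})
```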
- The Dataverse model does not define alternate keys on `<entity>.name`.
  - So in some places, all entities are iterated in a `for` loop.
  - I did not modify the model, as stated in the assignment.
- Authentication and authorization were not addressed
- Both would be implemented as middleware in the API server.
- Rate limiting and retry logic were not addressed.
- Cascading deletion for `Player` -> `Character` -> `DiceRoll` was not implemented.