The Titanic API exposes several endpoints over the Titanic dataset, enabling queries for individual passengers and aggregated passenger data.
Before using the Makefile commands, make sure you have the following tools and dependencies installed:
- Go: Required to build the API server and run the tests.
- Docker: Required to build and run the Docker images.
- Docker Compose (optional): If you plan to deploy the application with Docker Compose.
- Kubernetes cluster (optional): If you plan to deploy the application with kubectl or Helm.
The application uses environment variables defined in .env. If you are not using the Makefile, make sure to set these variables yourself; if you are, they will be loaded from .env automatically.
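As a minimal sketch, a .env file might look like the following; the variable names here are hypothetical, so check the .env shipped with the repository for the actual keys (the port matches the default noted below):

```
# Hypothetical keys; the real ones live in the repository's .env
API_HOST=localhost
API_PORT=8089
```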
The project structure follows the unofficial, but widely adopted, standard Go project layout (a simplified version suitable for an API): https://github.com/golang-standards/project-layout
Once the API is running, you can access the OpenAPI UI at /api/docs/ or /api/docs/index.html.
You can also access the custom UI, built with HTMX, at /ui.
Note that the default host and port are http://localhost:8089.
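For example, once the server is up you can verify the docs endpoint from the command line:

```sh
# Fetch the OpenAPI UI page from the default host and port
curl http://localhost:8089/api/docs/index.html
```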
The API uses the Titanic CSV dataset located at /data/csv/titanic.csv.
The SQLite database at /data/sqlite/titanic.db is a copy of that CSV data, created with the following commands in the sqlite3 terminal:
```sql
CREATE TABLE passengers (
    id INTEGER PRIMARY KEY,
    survived INTEGER,
    class INTEGER,
    name TEXT,
    sex TEXT,
    age TEXT,
    siblings_spouses INTEGER,
    parents_children INTEGER,
    ticket TEXT,
    fare REAL,
    cabin TEXT,
    embarked TEXT
);
```
and then:

```
.mode csv
.import data/csv/titanic.csv passengers
```
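To sanity-check the import, you can query the database directly, for example:

```sh
# Count surviving passengers in the imported table
sqlite3 data/sqlite/titanic.db "SELECT COUNT(*) FROM passengers WHERE survived = 1;"
```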
NOTICE: If you want to try both implementations, you can set the store type in config.yaml to either SQLITE or CSV.
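A minimal sketch of that setting; the key name here is an assumption, so check config.yaml for the actual one:

```yaml
# Hypothetical key name; the accepted values SQLITE and CSV come from the notice above
store_type: SQLITE
```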
Not all test files are complete; however, you can find edge-case testing examples in the API handler test files for reference.
The Makefile provides targets for the following tasks:

- Run the API server in standalone mode using go run.
- Build the API server binary using go build.
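Without the Makefile, the equivalent commands look roughly like this; the ./cmd/api entrypoint path is an assumption based on the golang-standards layout linked above:

```sh
# Run the server directly (entrypoint path assumed from the standard layout)
go run ./cmd/api

# Build a binary (the output path is an arbitrary choice)
go build -o bin/api ./cmd/api
```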
- Build the API server as a Docker image.
- Build the data store as a Docker image.
- Run the API server as a Docker container.
- Run the data store as a Docker container.
- Build the Docker API image and run it.
- Remove the Docker images and containers.
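The underlying Docker commands are approximately the following; the image and container names are assumptions, not the names used by the Makefile:

```sh
# Build the API image (tag assumed)
docker build -t titanic-api .

# Run the container, publishing the default port from this README
docker run -d --name titanic-api --env-file .env -p 8089:8089 titanic-api

# Remove the container and image
docker rm -f titanic-api
docker rmi titanic-api
```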
- Build Docker images for the API and store and deploy them using Docker Compose.
- Stop the Docker Compose services.
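With Docker Compose these reduce to the standard commands, assuming a compose file at the repository root:

```sh
# Build images and start all services in the background
docker compose up -d --build

# Stop and remove the services
docker compose down
```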
- Deploy Kubernetes resources.
- Remove Kubernetes resources.
- Deploy Kubernetes resources using Helm.
- Remove Kubernetes resources using Helm.
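Outside the Makefile, deployment typically reduces to the following; the manifest directory, release name, and chart path are all assumptions:

```sh
# Plain manifests (directory assumed)
kubectl apply -f deployments/
kubectl delete -f deployments/

# Helm (release name and chart path assumed)
helm install titanic-api ./deployments/helm
helm uninstall titanic-api
```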
- Generate the OpenAPI 3 specification and save it as docs/openapi.json.
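The generator this target wraps is not shown here; as a purely hypothetical sketch, it might shell out to something like the following, with the real command defined in the Makefile:

```sh
# Hypothetical generator entrypoint; only the output path docs/openapi.json is from the README
go run ./cmd/openapi-gen > docs/openapi.json
```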
- Run the tests for the project.
- Measure code coverage for the tests.
- Generate an HTML report of the code coverage.
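These map onto the standard Go toolchain commands:

```sh
# Run all tests
go test ./...

# Measure coverage and write a profile
go test -coverprofile=coverage.out ./...

# Render the profile as an HTML report
go tool cover -html=coverage.out -o coverage.html
```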