This repo shows you how to deploy your dbt project with Lightdash to GCP Cloud Run.
This directory is a standard dbt project repo with 3 extra files:

- `lightdash-dockerfile`: the simple Dockerfile you need to create to deploy Lightdash with your dbt project
- `lightdash-entrypoint.sh` (optional): a script to run any dbt commands before deploying Lightdash
- `profiles/profiles.yml`: credentials to access your data warehouse
```
.
├── data/
├── dbt_project.yml
├── lightdash-dockerfile
├── lightdash-entrypoint.sh
├── models/
└── profiles
    └── profiles.yml
```
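A minimal `lightdash-dockerfile` might look like the sketch below. The base image tag and the copy paths are assumptions — check the Lightdash documentation for the currently recommended base image and project location:

```dockerfile
# Sketch only: image tag and paths are assumptions, not the repo's exact file.
FROM lightdash/lightdash:latest

# Copy the dbt project (including profiles/) into the image
COPY . /usr/app/dbt
WORKDIR /usr/app/dbt

# Optional: run dbt commands (e.g. `dbt deps`) before Lightdash starts,
# then hand off to the base image's default start command.
COPY lightdash-entrypoint.sh /usr/bin/lightdash-entrypoint.sh
ENTRYPOINT ["/usr/bin/lightdash-entrypoint.sh"]
```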
Your `profiles.yml` should contain a profile matching the `profile` name in your `dbt_project.yml`. Do not put secrets in here; instead, use the `env_var` function. You can see an example in this repo.
BigQuery tip: by default, Cloud Run can access BigQuery through its service account, so you don't need to pass any credentials (see the `profiles/profiles.yml` in this repo).
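As an illustration, a BigQuery profile on Cloud Run can rely on the service account via `method: oauth` and pull non-secret values from environment variables with dbt's `env_var` function. The profile name, dataset, and `DBT_PROJECT_ID` variable below are placeholders, not values from this repo:

```yaml
# Sketch of profiles/profiles.yml for BigQuery on Cloud Run.
# `method: oauth` uses the Cloud Run service account's application-default
# credentials, so no key file is needed.
my_dbt_project:        # must match the `profile:` in dbt_project.yml
  target: prod
  outputs:
    prod:
      type: bigquery
      method: oauth
      project: "{{ env_var('DBT_PROJECT_ID') }}"  # set as a Cloud Run env var
      dataset: analytics
      threads: 4
```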
Now you can use this repo to launch a Cloud Run container using Docker. The steps below show you how to do this in the GCP console (https://console.cloud.google.com/run) using the GitHub integration.
Select "Deploy continuously... from a source repository" and point it to your GitHub repo. Select your branch and the name of the Dockerfile (in our case it's `lightdash-dockerfile`).
Hit advanced settings and check that the port is set to `8080`. Also set minimum instances to 1: Lightdash is slow to start up, so it's better to keep it live.
Set environment variables to populate all the values in your `profiles.yml` file that use the `env_var` function.
In our example we don't consider these values secrets, so we add them under "environment variables". However, if you need to pass database passwords or other secrets to your `profiles.yml` file, use the "secrets" functionality instead.
Finally, choose who can access the service. In our example we just make it public, but you may want to run it in a VPC, such as your internal company network.
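The console steps above can also be done from the command line. This is a sketch, not the repo's documented workflow: the service name, region, image path, and environment variable are placeholders, and it assumes you have already built and pushed an image from `lightdash-dockerfile`:

```shell
# Build and push the image (gcloud builds submit expects a file named
# "Dockerfile" by default, so you may need to rename or use a build config).
gcloud builds submit --tag gcr.io/MY_PROJECT/lightdash .

# Deploy to Cloud Run with the settings described above.
gcloud run deploy lightdash \
  --image gcr.io/MY_PROJECT/lightdash \
  --region us-central1 \
  --port 8080 \
  --min-instances 1 \
  --set-env-vars DBT_PROJECT_ID=my-gcp-project \
  --allow-unauthenticated   # omit this to keep the service private
```

Secrets can be passed with `--set-secrets` instead of `--set-env-vars`, mirroring the console's "secrets" functionality.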