Social Data Dashboard and Data Processing
```sh
export AWS_PROFILE={profile}
export AWS_DEFAULT_REGION={region}
make build
make updev
make downdev
```
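The bodies of these make targets are not shown here. As a minimal sketch, assuming the local development environment runs on Docker Compose (an assumption, not confirmed by this README), they might map to:

```make
# Hypothetical Makefile — assumes Docker Compose drives the local dev stack.
build:      ## build the container images
	docker compose build

updev:      ## start the local development stack in the background
	docker compose up -d

downdev:    ## tear the local development stack down
	docker compose down
```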
For the first-time setup, register the following variables as GitHub secrets for CI/CD on GitHub Actions:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- GCP_PROJECT_ID
- GCLOUD_REGION
- GCP_SERVICE_ACCOUNT_KEY
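For reference, a hedged sketch of how these secrets might be consumed inside a workflow job (the action choice and structure are assumptions, not taken from this repository):

```yaml
# Hypothetical workflow fragment showing the secrets wired into a job.
env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  GCP_PROJECT_ID: ${{ secrets.GCP_PROJECT_ID }}
  GCLOUD_REGION: ${{ secrets.GCLOUD_REGION }}
steps:
  - uses: google-github-actions/auth@v2
    with:
      credentials_json: ${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}
```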
For the first-time setup, place the GCP service account credential file on your local machine and set its file path in the `GOOGLE_APPLICATION_CREDENTIALS` environment variable. You also need to configure AWS access information on your local machine.
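As a concrete sketch, the local credentials can be wired up as below; the key file path and profile name are hypothetical placeholders, not values from this repository.

```shell
# Hypothetical locations — replace with your actual key file and profile.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/.config/gcloud/service-account.json"
# AWS access information, e.g. a profile created earlier with `aws configure`
export AWS_PROFILE=default
echo "GCP key file: $GOOGLE_APPLICATION_CREDENTIALS"
echo "AWS profile: $AWS_PROFILE"
```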
For the first-time setup, store a gcloud token under the `~/.config/gcloud` directory on your local machine:

```sh
gcloud auth application-default login
```
Then set up the Poetry environment for running dbt:

```sh
poetry shell
```
To check the configuration, run:

```sh
dbt debug --project-dir dbt/ --profiles-dir dbt/ --vars '{bq_dataset_name: <BQ_DATASET_NAME>}'
```
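Since `--profiles-dir dbt/` is passed, a `profiles.yml` is expected in that directory. A hedged sketch of what it might contain for BigQuery follows; the profile name `default` and every value are assumptions, not taken from this repository.

```yaml
# dbt/profiles.yml — hypothetical; the profile name must match dbt_project.yml.
default:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      project: "{{ env_var('GCP_PROJECT_ID') }}"
      dataset: "{{ var('bq_dataset_name') }}"
      keyfile: "{{ env_var('GOOGLE_APPLICATION_CREDENTIALS') }}"
      threads: 4
```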
As described above, set the GCP service account credential file path in the `GOOGLE_APPLICATION_CREDENTIALS` environment variable, and install the gcloud CLI.
Runners are built on Cloud Build and deployed to and run on Cloud Run. Before deploying and running, create with Terraform a Cloud Run service and a Pub/Sub topic whose names match the runner's name.
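A hedged HCL sketch of those two resources follows; the resource names, image, and variables are placeholders, and the only constraint taken from this README is that the service and topic share the runner's name.

```hcl
# Hypothetical names — both resources must carry the runner's name.
resource "google_cloud_run_service" "runner" {
  name     = "example-runner"
  location = var.region

  template {
    spec {
      containers {
        image = "gcr.io/${var.project_id}/example-runner:latest"
      }
    }
  }
}

resource "google_pubsub_topic" "runner" {
  name = "example-runner"
}
```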
When you create a pull request, the Terraform and dbt scripts are checked. The check details are as follows:
- terraform
  - check the formatting and the actual diff with `terraform plan`
- dbt
  - check the configuration and parse the Jinja templates with `dbt debug` and `dbt parse`
- runner
  - check that the build succeeds with `gcloud builds submit`
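The checks above could be wired into a GitHub Actions job roughly as follows; the workflow name, step layout, and the dataset value are assumptions for illustration, not the repository's actual workflow.

```yaml
# Hypothetical CI job for the pull-request checks.
name: check
on: pull_request
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Terraform formatting and diff
        run: terraform plan
      - name: dbt configuration and Jinja parse
        run: |
          poetry run dbt debug --project-dir dbt/ --profiles-dir dbt/ --vars '{bq_dataset_name: example_dataset}'
          poetry run dbt parse --project-dir dbt/ --profiles-dir dbt/ --vars '{bq_dataset_name: example_dataset}'
      - name: Runner build
        run: gcloud builds submit
```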
When you create a pull request, the Terraform and dbt scripts are deployed. The deploy details are as follows:
- terraform
  - deploy the diff with `terraform apply`
  - confirm that no diff remains with `terraform plan`
- dbt
  - deploy the data models with `dbt run`
  - check the deploy results with `dbt test`
- runner
  - deploy to the Cloud Run service with `gcloud run deploy`
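The deploy steps above could similarly be sketched as a GitHub Actions job; again the workflow name, service name, and dataset value are hypothetical placeholders. `terraform plan -detailed-exitcode` returns a non-zero exit code when a diff remains, which makes the "no diff" check fail the job automatically.

```yaml
# Hypothetical CI job for the deploy steps.
name: deploy
on: pull_request
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Terraform apply and verify
        run: |
          terraform apply -auto-approve
          terraform plan -detailed-exitcode
      - name: dbt run and test
        run: |
          poetry run dbt run --project-dir dbt/ --profiles-dir dbt/ --vars '{bq_dataset_name: example_dataset}'
          poetry run dbt test --project-dir dbt/ --profiles-dir dbt/ --vars '{bq_dataset_name: example_dataset}'
      - name: Deploy runner
        run: gcloud run deploy example-runner --image gcr.io/$GCP_PROJECT_ID/example-runner --region $GCLOUD_REGION
```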