
# End-to-End Serverless ELT with Google Cloud, dbt and Terraform

The ELT pipeline we’ve developed leverages several Google Cloud Services including Google Cloud Storage (GCS), BigQuery, Pub/Sub, Cloud Workflows, Cloud Run, and Cloud Build. We also use dbt for data transformation and Terraform for infrastructure as code.

Full article 👉 Medium

![img.png](img.png)

## Deploy services in Google Cloud

Navigate to the `infra` folder; we will deploy the project using Terraform (the full command sequence is shown below the list):

1. Initialize your Terraform workspace, which downloads the provider plugins for Google Cloud: `terraform init`
2. Plan the deployment and review the proposed changes: `terraform plan`
3. If everything looks good, apply the changes: `terraform apply`
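
For reference, here are the same steps as one command sequence (a minimal sketch, assuming the Terraform configuration lives in `infra/` and that Terraform can pick up your Google Cloud credentials, e.g. via `gcloud auth application-default login`):

```sh
# Run from the repository root
cd infra

# Download the Google Cloud provider plugins and initialize local state
terraform init

# Preview the resources Terraform will create or change
terraform plan

# Apply the changes once the plan looks correct
terraform apply
```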

## Testing

Finally, to test the workflow from end to end, we can launch the script in the `scripts` folder:

```sh
sh upload_include_dataset_to_gcs.sh
```
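
Judging by its name, the script copies the dataset bundled with the repository into the landing GCS bucket, which then triggers the pipeline via the bucket's Pub/Sub notification. The sketch below illustrates that idea only; the bucket name and data path are hypothetical placeholders, not the values used by the actual script:

```sh
#!/usr/bin/env bash
# Hypothetical sketch of the upload step: copy the included dataset to the
# landing bucket so the GCS -> Pub/Sub notification starts the workflow.
# BUCKET and DATA_DIR are placeholders, not the repository's real values.
set -euo pipefail

BUCKET="gs://your-retail-landing-bucket"   # placeholder bucket name
DATA_DIR="./dataset"                       # placeholder path to the included data

gsutil -m cp -r "${DATA_DIR}"/* "${BUCKET}/"
```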

## Contact
