Overview

  1. Prerequisites
  2. Configure credentials
  3. Setup Postgres (optional)
  4. Setup virtual environment
  5. Installation for development
  6. Run the integration tests
  7. Creating a new integration test

Prerequisites

  • python3
  • Docker

Configure credentials

Edit the env file for your TARGET in integration_tests/.env/[TARGET].env.

Load the environment variables:

set -a; source integration_tests/.env/[TARGET].env; set +a

or, more specifically:

set -a; source integration_tests/.env/postgres.env; set +a
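
The env file itself is just shell variable assignments holding the connection details for your test warehouse. As an illustration only (the variable names and values below are assumptions; check the actual file for the keys it defines), a Postgres env file looks roughly like:

# integration_tests/.env/postgres.env -- illustrative keys and values only
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_PORT=5432
POSTGRES_TEST_USER=root
POSTGRES_TEST_PASS=''
POSTGRES_TEST_DBNAME=dbt_utils_test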

Setup Postgres (optional)

Docker and docker-compose are both used in testing. Specific installation instructions for your OS can be found in the Docker documentation.

Postgres offers the easiest way to test most dbt-utils functionality today. Its tests are the fastest to run, and the easiest to set up. To run the Postgres integration tests, you'll have to do one extra step of setting up the test database:

make setup-db

or, alternatively:

docker-compose up --detach postgres
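
If you want to confirm the container came up before running any tests, the usual docker-compose commands work (the postgres service name comes from the compose file used above):

docker-compose ps postgres
docker-compose logs postgres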

Setup virtual environment

We strongly recommend using virtual environments when developing code in dbt-utils. We recommend creating this virtualenv in the root of the dbt-utils repository. To create a new virtualenv, run:

python3 -m venv env
source env/bin/activate

This will create and activate a new Python virtual environment.
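
To double-check that the virtualenv is active, the Python interpreter on your PATH should now resolve to the one inside the env directory:

which python   # should print a path ending in env/bin/python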

Installation for development

First, make sure that you've set up your virtual environment as described above, and that you have the latest version of pip (pip install --upgrade pip). Next, install the dbt adapter for your target (which pulls in dbt-core) along with the development dependencies:

make dev target=[postgres|redshift|...]
# or
pip install --pre dbt-[postgres|redshift|...] -r dev-requirements.txt

or, more specifically:

make dev target=postgres
# or
pip install --pre dbt-postgres -r dev-requirements.txt
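
To confirm the installation, dbt's version output lists the installed core version and adapter plugins:

dbt --version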

Run the integration tests

To run all the integration tests on your local machine, the same way they run in CI (CircleCI):

make test target=[postgres|redshift|...]

or, more specifically:

make test target=postgres

Where possible, targets are run in Docker containers (this works for Postgres and, in the future, Spark, for example). For managed services like Snowflake, BigQuery, and Redshift this is not possible, so you have to provide your own credentials for these services in the appropriate env file at integration_tests/.env/[TARGET].env.
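
For example, to run against a managed warehouse such as BigQuery (assuming it is one of the supported targets and you have filled in its env file as described above):

set -a; source integration_tests/.env/bigquery.env; set +a
make test target=bigquery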

Creating a new integration test

Set up profiles

Do one of the following:

  1. Use DBT_PROFILES_DIR
    cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
    export DBT_PROFILES_DIR=$(cd integration_tests && pwd)
  2. Use ~/.dbt/profiles.yml
    • Copy contents from integration_tests/ci/sample.profiles.yml into ~/.dbt/profiles.yml.
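
Either way, before writing any tests you can sanity-check that dbt finds the profile and can connect (run from the integration_tests directory; this assumes your profile defines a postgres target):

cd integration_tests
dbt debug --target postgres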

Add your integration test

This directory contains an example dbt project which tests the macros in the dbt-utils package. An integration test typically involves making: 1) a new seed file, 2) a new model file, and 3) a generic test to assert the anticipated behaviour.
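
As a rough sketch of where those pieces live (the my_macro name and the paths are purely illustrative; mirror the existing files for the real layout):

integration_tests/data/sql/data_my_macro.csv        # 1) seed with fake input and expected output
integration_tests/models/sql/test_my_macro.sql      # 2) model that exercises the macro
integration_tests/models/sql/schema.yml             # 3) generic test asserting the expected behaviour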

For an example integration test, check out the tests for the get_url_parameter macro:

  1. Macro definition
  2. Seed file with fake data
  3. Model to test the macro
  4. A generic test to assert the macro works as expected

Once you've added all of these files, you should be able to run the following from the integration_tests folder:

dbt deps --target {your_target}
dbt seed --target {your_target}
dbt run --target {your_target} --select {your_model_name}
dbt test --target {your_target} --select {your_model_name}

Alternatively:

dbt deps --target {your_target}
dbt build --target {your_target} --select +{your_model_name}

If the tests all pass, then you're good to go! All tests will be run automatically when you create a PR against this repo.