@@ -134,6 +134,6 @@ Most of *airflow-dbt-python*'s operator and hook tests follow the same pattern:

 1. Initialize a specific operator or hook.
 2. Run it with a basic test *dbt* project against the test PostgreSQL database.
-3. Assert *dbt* executes succesfuly, any results are properly propagated, and any artifacts are pushed to where they need to go.
+3. Assert *dbt* executes successfully, any results are properly propagated, and any artifacts are pushed to where they need to go.

 However, *airflow-dbt-python* also includes DAG tests, which can be seen as broader integration tests. These are located under ``tests/dags/``. DAG tests focus on testing complete end-to-end DAGs, including those shown in :ref:`example_dags`.
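To make the three-step pattern above concrete, here is a minimal, hedged sketch of what such a test could look like. The fixture names (``test_project_dir``, ``test_profiles_dir``), the surrounding test database setup, and the exact shape of the value returned by ``execute`` are assumptions for illustration, not the repository's actual fixtures or guarantees.

.. code-block:: python

    from airflow_dbt_python.operators.dbt import DbtRunOperator


    def test_dbt_run_operator_with_test_project(test_project_dir, test_profiles_dir):
        # 1. Initialize a specific operator or hook.
        operator = DbtRunOperator(
            task_id="dbt_run_test",
            project_dir=test_project_dir,
            profiles_dir=test_profiles_dir,
        )

        # 2. Run it with the basic test dbt project against the test PostgreSQL database.
        results = operator.execute(context={})

        # 3. Assert dbt executed successfully and results were propagated.
        assert results is not None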
docs/getting_started.rst (+2 -2)
@@ -118,7 +118,7 @@ Accessing a *dbt* project

 1. Using a `local executor <https://airflow.apache.org/docs/apache-airflow/stable/executor/local.html>`_ with a single-machine installation means we can rely on the local machine's filesystem to store a *dbt* project. This also applies to ``DebugExecutor`` and ``SequentialExecutor``, but these executors are generally only used for debugging/development so we will ignore them. If you are running a setup like this, then simply ensure your *dbt* project and *profiles.yml* exist somewhere in the ``LocalExecutor``'s file system.

-2. Once your setup has evolved to a multi-machine/cloud installation with any remote executor, we must rely on a remote storage for *dbt* files. Currently, supported remote storages include AWS S3 and git remote repositories although more are in plans to be added. In this setup, your *dbt* project will need to be uploaded to a remote storage that Airflow can access. *airflow-dbt-python* can utilize Airflow connections to access these storages.
+2. Once your setup has evolved to a multi-machine/cloud installation with any remote executor, we must rely on remote storage for *dbt* files. Currently, supported remote storages include AWS S3, Google Cloud Storage, and Git repositories, although more are planned. In this setup, your *dbt* project will need to be uploaded to a remote storage that Airflow can access. *airflow-dbt-python* can use Airflow connections to access these storages.

 Single-machine installation
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -316,7 +316,7 @@ The DAG looks the same as the AWS S3 example, except that now we use the GitHub
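To illustrate the two scenarios described above, the sketch below shows a single-machine setup where both parameters point at the local filesystem; in a multi-machine setup the same parameters would instead point at a supported remote (an ``s3://`` or ``gs://`` URL, or a Git repository URL). The paths, DAG arguments, and names here are placeholders, not values prescribed by the project.

.. code-block:: python

    import datetime as dt

    from airflow import DAG
    from airflow_dbt_python.operators.dbt import DbtRunOperator

    with DAG(
        dag_id="example_dbt_run_local",
        start_date=dt.datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        dbt_run = DbtRunOperator(
            task_id="dbt_run",
            # Local filesystem paths, as in scenario 1; for the multi-machine
            # setup in scenario 2, replace these with remote URLs that Airflow
            # can access (for example s3:// or gs:// locations).
            project_dir="/opt/airflow/dbt/my_project/",
            profiles_dir="/opt/airflow/dbt/",
        )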
docs/how_does_it_work.rst (+1 -1)
@@ -58,7 +58,7 @@ This ensures *dbt* can work with any Airflow deployment, including most producti
 *dbt* remote hooks
 ------------------

-*dbt* remote hooks implement a simple interface to communicate with *dbt* remotes. A *dbt* remote can be any external storage that contains a *dbt* project and potentially also a *profiles.yml* file for example: an AWS S3 bucket or a GitHub repository. See the reference for a list of which remotes are currently supported.
+*dbt* remote hooks implement a simple interface to communicate with *dbt* remotes. A *dbt* remote can be any external storage that contains a *dbt* project and, potentially, a *profiles.yml* file: for example, an AWS S3 bucket, a Google Cloud Storage bucket, or a GitHub repository. See the reference for a list of currently supported remotes.
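As a purely illustrative sketch (not the library's actual code or class names), the choice of remote can be thought of as dispatching on the URL scheme of ``project_dir`` or ``profiles_dir``:

.. code-block:: python

    from urllib.parse import urlparse


    def describe_remote(url: str) -> str:
        """Illustrative only: map a URL scheme to the kind of dbt remote."""
        scheme = urlparse(url).scheme
        return {
            "s3": "AWS S3 remote",
            "gs": "Google Cloud Storage remote",
            "https": "Git repository remote (e.g. GitHub)",  # assumed mapping for illustration
            "": "local filesystem, no remote hook required",
        }.get(scheme, "unsupported remote")


    print(describe_remote("s3://my-bucket/dbt/project/"))   # AWS S3 remote
    print(describe_remote("/opt/airflow/dbt/project/"))     # local filesystem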
docs/introduction.rst (+3 -3)
@@ -14,7 +14,7 @@ Features
 We believe Airflow can **enhance** a *dbt* user's experience with several additional features that leverage Airflow as much as possible:

 * Configuring *dbt* connections with Airflow connections.
-* Downloading *dbt* projects from remote storages, like `AWS S3 <https://aws.amazon.com/s3/>`_ or Github repositories.
+* Downloading *dbt* projects from remote storages, like `AWS S3 <https://aws.amazon.com/s3/>`_, `Google Cloud Storage <https://cloud.google.com/storage/docs>`_, or GitHub repositories.
 * Communicate between tasks by pushing results and artifacts to `XCom <https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/xcoms.html>`_.

 Can you think of another way Airflow can enhance *dbt*? Let us know in a `GitHub issue <https://github.com/tomasfarias/airflow-dbt-python/issues/new/choose>`_!
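To illustrate the XCom bullet above, here is a hedged sketch of a downstream task reading whatever a ``DbtRunOperator`` pushed; the task ids, paths, and exact shape of the pushed value are assumptions for illustration.

.. code-block:: python

    import datetime as dt

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow_dbt_python.operators.dbt import DbtRunOperator


    def _report_dbt_results(ti):
        # The dbt task's return value lands under the default "return_value" XCom key.
        print(ti.xcom_pull(task_ids="dbt_run"))


    with DAG(
        dag_id="example_dbt_xcom",
        start_date=dt.datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        dbt_run = DbtRunOperator(
            task_id="dbt_run",
            project_dir="/opt/airflow/dbt/my_project/",  # placeholder paths
            profiles_dir="/opt/airflow/dbt/",
        )
        report = PythonOperator(
            task_id="report_dbt_results",
            python_callable=_report_dbt_results,
        )
        dbt_run >> report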
@@ -26,7 +26,7 @@ Read along for a breakdown of *airflow-dbt-python*'s main features, or head over
 Download dbt files from S3
 ^^^^^^^^^^^^^^^^^^^^^^^^^^

-The dbt parameters ``profiles_dir`` and ``project_dir`` would normally point to a directory containing a ``profiles.yml`` file and a dbt project in the local environment respectively (defined by the presence of a ``dbt_project.yml`` file). airflow-dbt-python extends these parameters to also accept an `AWS S3 <https://aws.amazon.com/s3/>`_ URL (identified by a ``s3://`` scheme):
+The dbt parameters ``profiles_dir`` and ``project_dir`` would normally point to a directory containing a ``profiles.yml`` file and a dbt project in the local environment respectively (defined by the presence of a ``dbt_project.yml`` file). airflow-dbt-python extends these parameters to also accept an `AWS S3 <https://aws.amazon.com/s3/>`_ URL (identified by an ``s3://`` scheme) or a `Google Cloud Storage <https://cloud.google.com/storage/docs>`_ URL (identified by a ``gs://`` scheme):

 * If an S3 URL is used for ``profiles_dir``, then this URL must point to a directory in S3 that contains a ``profiles.yml`` file. The ``profiles.yml`` file will be downloaded and made available for the operator to use when running.
 * If an S3 URL is used for ``project_dir``, then this URL must point to a directory in S3 containing all the files required for a dbt project to run. All of the contents of this directory will be downloaded and made available for the operator. The URL may also point to a zip file containing all the files of a dbt project, which will be downloaded, uncompressed, and made available for the operator.
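For example, a hedged sketch of pointing both parameters at remote URLs; the bucket names and prefixes are placeholders, and appropriate AWS or GCP Airflow connections are assumed to be configured.

.. code-block:: python

    import datetime as dt

    from airflow import DAG
    from airflow_dbt_python.operators.dbt import DbtRunOperator

    with DAG(
        dag_id="example_dbt_run_from_remote",
        start_date=dt.datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        dbt_run = DbtRunOperator(
            task_id="dbt_run",
            # An s3:// URL works the same way; a gs:// URL is shown here.
            project_dir="gs://my-dbt-bucket/my-project/",
            profiles_dir="gs://my-dbt-bucket/",
        )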
@@ -134,7 +134,7 @@ Use Airflow connections as dbt targets (without a profiles.yml)
 ) as dag:
     dbt_run = DbtRunOperator(
         task_id="dbt_run_hourly",
-        target="my_db_connection",
+        dbt_conn_id="my_db_connection",
         # Profiles file is not needed as we are using an Airflow connection.
         # If a profiles file is used, the Airflow connection will be merged to the
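For reference, one hedged way the ``my_db_connection`` connection referenced above could be defined; the connection details are placeholders, and the connection could just as well be created through the Airflow UI, CLI, or a secrets backend.

.. code-block:: python

    import os

    from airflow.models.connection import Connection

    conn = Connection(
        conn_id="my_db_connection",
        conn_type="postgres",        # any target type supported by your dbt adapter
        host="database.example.com",
        login="dbt_user",
        password="dbt_password",     # placeholder credentials
        schema="analytics",
        port=5432,
    )

    # Airflow picks up connections defined through AIRFLOW_CONN_* environment variables.
    os.environ[f"AIRFLOW_CONN_{conn.conn_id.upper()}"] = conn.get_uri()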