Commit aec7265

docs: update
1 parent: 3e41041

5 files changed (+14, -8 lines)


docs/development.rst

Lines changed: 2 additions & 2 deletions
@@ -43,7 +43,7 @@ If you wish to install a different version of Airflow for testing you may skip t
 
 .. code-block:: shell
 
-   pip install apache-airflow==2.2 apache-airflow-providers-amazon==5.0
+   pip install apache-airflow>=2.2 apache-airflow-providers-amazon>=3.0
 
 Modifying dependencies
 ----------------------
@@ -134,6 +134,6 @@ Most of *airflow-dbt-python*'s operator and hook tests follow the same pattern:
 
 1. Initialize a specific operator or hook.
 2. Run it with a basic test *dbt* project against the test PostgreSQL database.
-3. Assert *dbt* executes succesfuly, any results are properly propagated, and any artifacts are pushed to where they need to go.
+3. Assert *dbt* executes successfully, any results are properly propagated, and any artifacts are pushed to where they need to go.
 
 However, *airflow-dbt-python* also includes DAG tests, which can be seen as broader integration tests. These are located under ``tests/dags/``. DAG tests focus on testing complete end-to-end DAGs, including those shown in :ref:`example_dags`.
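
In code, that three-step pattern looks roughly like the sketch below. This is a minimal illustration rather than a test copied from the suite: the fixture names (``profiles_file``, ``dbt_project_file``) and the ``target`` value are assumptions standing in for whatever the real test fixtures provide.

    # Minimal sketch of the three-step test pattern; fixture names are assumptions.
    from airflow_dbt_python.operators.dbt import DbtRunOperator


    def test_dbt_run_operator(profiles_file, dbt_project_file):
        # 1. Initialize a specific operator.
        operator = DbtRunOperator(
            task_id="dbt_run",
            project_dir=dbt_project_file.parent,
            profiles_dir=profiles_file.parent,
            target="test",  # assumed to point at the test PostgreSQL database
            do_xcom_push=True,
        )

        # 2. Run it with the basic test dbt project.
        results = operator.execute({})

        # 3. Assert dbt executed successfully and results were propagated.
        assert results is not None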

docs/getting_started.rst

Lines changed: 2 additions & 2 deletions
@@ -118,7 +118,7 @@ Accessing a *dbt* project
 
 1. Using a `local executor <https://airflow.apache.org/docs/apache-airflow/stable/executor/local.html>`_ with a single-machine installation means we can rely on the local machine's filesystem to store a *dbt* project. This also applies to ``DebugExecutor`` and ``SequentialExecutor``, but these executors are generally only used for debugging/development so we will ignore them. If you are running a setup like this, then simply ensure your *dbt* project and *profiles.yml* exist somewhere in the ``LocalExecutor``'s file system.
 
-2. Once your setup has evolved to a multi-machine/cloud installation with any remote executor, we must rely on a remote storage for *dbt* files. Currently, supported remote storages include AWS S3 and git remote repositories although more are in plans to be added. In this setup, your *dbt* project will need to be uploaded to a remote storage that Airflow can access. *airflow-dbt-python* can utilize Airflow connections to access these storages.
+2. Once your setup has evolved to a multi-machine/cloud installation with any remote executor, we must rely on a remote storage for *dbt* files. Currently, supported remote storages include AWS S3, Google Cloud Storage, and Git repositories, although more are planned to be added. In this setup, your *dbt* project will need to be uploaded to a remote storage that Airflow can access. *airflow-dbt-python* can utilize Airflow connections to access these storages.
 
 Single-machine installation
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -316,7 +316,7 @@ The DAG looks the same as the AWS S3 example, except that now we use the GitHub
         project_dir="git+ssh://github.com:dbt-labs/jaffle-shop-classic",
         select=["+tag:daily"],
         exclude=["tag:deprecated"],
-        target="my_warehouse_connection",
+        dbt_conn_id="my_warehouse_connection",
         profile="my-project",
     )
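
Putting the snippet above into a complete DAG, a Git-remote setup might look like the following sketch. The DAG id, schedule, and start date are illustrative assumptions (``schedule`` assumes Airflow 2.4 or later); the operator arguments mirror the diff above.

    # Illustrative sketch; DAG id, schedule, and start date are assumptions.
    import pendulum
    from airflow import DAG

    from airflow_dbt_python.operators.dbt import DbtRunOperator

    with DAG(
        dag_id="dbt_run_from_git_remote",
        schedule="@daily",
        start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
        catchup=False,
    ) as dag:
        dbt_run = DbtRunOperator(
            task_id="dbt_run_daily",
            # Pull the dbt project from a Git remote over SSH.
            project_dir="git+ssh://github.com:dbt-labs/jaffle-shop-classic",
            select=["+tag:daily"],
            exclude=["tag:deprecated"],
            # Warehouse credentials come from an Airflow connection.
            dbt_conn_id="my_warehouse_connection",
            profile="my-project",
        )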

docs/how_does_it_work.rst

Lines changed: 1 addition & 1 deletion
@@ -58,7 +58,7 @@ This ensures *dbt* can work with any Airflow deployment, including most producti
 *dbt* remote hooks
 ------------------
 
-*dbt* remote hooks implement a simple interface to communicate with *dbt* remotes. A *dbt* remote can be any external storage that contains a *dbt* project and potentially also a *profiles.yml* file for example: an AWS S3 bucket or a GitHub repository. See the reference for a list of which remotes are currently supported.
+*dbt* remote hooks implement a simple interface to communicate with *dbt* remotes. A *dbt* remote can be any external storage that contains a *dbt* project and potentially also a *profiles.yml* file, for example: an AWS S3 bucket, a Google Cloud Storage bucket, or a GitHub repository. See the reference for a list of which remotes are currently supported.
 
 Implementing the ``DbtRemoteHook`` interface
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
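
Adding a new remote means implementing that interface. The outline below is a rough, hypothetical sketch only: the import path and the ``download``/``upload`` method names and signatures are assumptions, so check the hooks reference for the actual abstract methods a new remote must implement.

    # Hypothetical outline of a new remote; names and signatures are assumptions.
    from pathlib import Path

    from airflow_dbt_python.hooks.remote import DbtRemoteHook


    class DbtMyStorageRemoteHook(DbtRemoteHook):
        """A made-up remote backed by some external storage service."""

        def download(self, source, destination: Path, **kwargs) -> None:
            # Fetch a dbt project or profiles.yml from the remote URL into a
            # local path where dbt can read it.
            raise NotImplementedError

        def upload(self, source: Path, destination, **kwargs) -> None:
            # Push local dbt files (for example, generated artifacts) back to
            # the remote URL.
            raise NotImplementedError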

docs/introduction.rst

Lines changed: 3 additions & 3 deletions
@@ -14,7 +14,7 @@ Features
 We believe Airflow can **enhance** a *dbt* user's experience with several additional features that leverage Airflow as much as possible:
 
 * Configuring *dbt* connections with Airflow connections.
-* Downloading *dbt* projects from remote storages, like `AWS S3 <https://aws.amazon.com/s3/>`_ or Github repositories.
+* Downloading *dbt* projects from remote storages, like `AWS S3 <https://aws.amazon.com/s3/>`_, `Google Cloud Storage <https://cloud.google.com/storage/docs>`_, or GitHub repositories.
 * Communicate between tasks by pushing results and artifacts to `XCom <https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/xcoms.html>`_.
 
 Can you think of another way Airflow can enhance *dbt*? Let us know in a `GitHub issue <https://github.com/tomasfarias/airflow-dbt-python/issues/new/choose>`_!
@@ -26,7 +26,7 @@ Read along for a breakdown of *airflow-dbt-python*'s main features, or head over
 Download dbt files from S3
 ^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-The dbt parameters ``profiles_dir`` and ``project_dir`` would normally point to a directory containing a ``profiles.yml`` file and a dbt project in the local environment respectively (defined by the presence of a ``dbt_project.yml`` file). airflow-dbt-python extends these parameters to also accept an `AWS S3 <https://aws.amazon.com/s3/>`_ URL (identified by a ``s3://`` scheme):
+The dbt parameters ``profiles_dir`` and ``project_dir`` would normally point to a directory containing a ``profiles.yml`` file and a dbt project in the local environment respectively (defined by the presence of a ``dbt_project.yml`` file). airflow-dbt-python extends these parameters to also accept an `AWS S3 <https://aws.amazon.com/s3/>`_ URL (identified by a ``s3://`` scheme) and a `Google Cloud Storage <https://cloud.google.com/storage/docs>`_ URL (identified by a ``gs://`` scheme):
 
 * If an S3 URL is used for ``profiles_dir``, then this URL must point to a directory in S3 that contains a ``profiles.yml`` file. The ``profiles.yml`` file will be downloaded and made available for the operator to use when running.
 * If an S3 URL is used for ``project_dir``, then this URL must point to a directory in S3 containing all the files required for a dbt project to run. All of the contents of this directory will be downloaded and made available for the operator. The URL may also point to a zip file containing all the files of a dbt project, which will be downloaded, uncompressed, and made available for the operator.
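
As a quick illustration of the remote URLs the hunk above describes, both parameters can take a remote URL directly. This is a minimal sketch; the bucket names and prefixes are made up.

    # Illustrative only; bucket names and prefixes are made up.
    from airflow_dbt_python.operators.dbt import DbtRunOperator

    dbt_run = DbtRunOperator(
        task_id="dbt_run_from_remote",
        # profiles.yml is stored under a GCS prefix (gs:// scheme).
        profiles_dir="gs://my-bucket/dbt/profiles/",
        # The dbt project lives under an S3 prefix (s3:// scheme); a zip file
        # of the project would also work, per the bullet points above.
        project_dir="s3://my-bucket/dbt/project/",
        select=["+tag:hourly"],
    )
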
@@ -134,7 +134,7 @@ Use Airflow connections as dbt targets (without a profiles.yml)
     ) as dag:
         dbt_run = DbtRunOperator(
             task_id="dbt_run_hourly",
-            target="my_db_connection",
+            dbt_conn_id="my_db_connection",
             # Profiles file is not needed as we are using an Airflow connection.
             # If a profiles file is used, the Airflow connection will be merged to the
             # existing targets

docs/reference/hooks.rst

Lines changed: 6 additions & 0 deletions
@@ -33,3 +33,9 @@ The DbtRemoteHook interface
 
 .. automodule:: airflow_dbt_python.hooks.remote.s3
    :members:
+
+*dbt* GCS remote
+^^^^^^^^^^^^^^^^
+
+.. automodule:: airflow_dbt_python.hooks.remote.gcs
+   :members:
