Note

Starting from Airflow version 2.9, MWAA has open-sourced the original Docker image used in our production deployments. You can refer to our open-source image repository at https://github.com/aws/amazon-mwaa-docker-images to create a local environment identical to that of MWAA. You can also continue to use the MWAA Local Runner for testing and packaging requirements for all Airflow versions supported on MWAA.

About aws-mwaa-local-runner

This repository provides a command line interface (CLI) utility that replicates an Amazon Managed Workflows for Apache Airflow (MWAA) environment locally.

Please note: MWAA/AWS/DAG/Plugin issues should be raised through AWS Support or the Airflow Slack #airflow-aws channel. Issues here should be focused on this local-runner repository.

Please note: In this repository, dynamic configurations that depend on the environment class are aligned with the Large environment class.

About the CLI

The CLI builds a Docker container image locally that's similar to an MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying to MWAA.

What this repo contains

dags/
  example_lambda.py
  example_dag_with_taskflow_api.py
  example_redshift_data_execute_sql.py
docker/
  config/
    airflow.cfg
    constraints.txt
    mwaa-base-providers-requirements.txt
    webserver_config.py
    .env.localrunner
  script/
    bootstrap.sh
    entrypoint.sh
    systemlibs.sh
    generate_key.sh
  docker-compose-local.yml
  docker-compose-resetdb.yml
  docker-compose-sequential.yml
  Dockerfile
plugins/
  README.md
requirements/
  requirements.txt
.gitignore
CODE_OF_CONDUCT.md
CONTRIBUTING.md
LICENSE
mwaa-local-env
README.md
VERSION

Prerequisites

Get started

git clone https://github.com/aws/aws-mwaa-local-runner.git
cd aws-mwaa-local-runner

Step one: Building the Docker image

Build the Docker container image using the following command:

./mwaa-local-env build-image

Note: it takes several minutes to build the Docker image locally.

Step two: Running Apache Airflow

Local runner

Runs a local Apache Airflow environment whose configuration closely matches an MWAA environment.

./mwaa-local-env start

To stop the local environment, press Ctrl+C in the terminal and wait until the local runner and Postgres containers have stopped.

Step three: Accessing the Airflow UI

By default, the bootstrap.sh script creates a username and password for your local Airflow environment.

  • Username: admin
  • Password: test

Airflow UI

  • Open the Apache Airflow UI: http://localhost:8080/

Step four: Add DAGs and supporting files

The following section describes where to add your DAG code and supporting files. We recommend creating a directory structure similar to your MWAA environment.

DAGs

  1. Add DAG code to the dags/ folder.
  2. To run the sample code in this repository, see the example_dag_with_taskflow_api.py file.
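As a quick sanity check that the local environment picks up new DAG files, you could drop a minimal TaskFlow DAG such as the sketch below into dags/. The dag_id, task names, and schedule are illustrative placeholders, not part of this repository.

from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_local_runner_dag",   # illustrative name
    schedule=None,                       # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def example_local_runner_dag():
    @task
    def extract():
        # Produce a small payload for the downstream task.
        return {"greeting": "hello from the local runner"}

    @task
    def report(payload: dict):
        # Print the payload so it appears in the task log.
        print(payload["greeting"])

    report(extract())


example_local_runner_dag()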

Requirements.txt

  1. Add Python dependencies to requirements/requirements.txt.
  2. To test a requirements.txt without running Apache Airflow, use the following script:
./mwaa-local-env test-requirements

Let's say you add aws-batch==0.6 to your requirements/requirements.txt file. You should see output similar to:

Installing requirements.txt
Collecting aws-batch (from -r /usr/local/airflow/dags/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/5d/11/3aedc6e150d2df6f3d422d7107ac9eba5b50261cf57ab813bb00d8299a34/aws_batch-0.6.tar.gz
Collecting awscli (from aws-batch->-r /usr/local/airflow/dags/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/07/4a/d054884c2ef4eb3c237e1f4007d3ece5c46e286e4258288f0116724af009/awscli-1.19.21-py2.py3-none-any.whl (3.6MB)
    100% |████████████████████████████████| 3.6MB 365kB/s
...
...
...
Installing collected packages: botocore, docutils, pyasn1, rsa, awscli, aws-batch
  Running setup.py install for aws-batch ... done
Successfully installed aws-batch-0.6 awscli-1.19.21 botocore-1.20.21 docutils-0.15.2 pyasn1-0.4.8 rsa-4.7.2
  3. To package the necessary WHL files for your requirements.txt without running Apache Airflow, use the following script:
./mwaa-local-env package-requirements

For example usage, see Installing Python dependencies using the PyPi.org requirements file format, Option two: Python wheels (.whl).
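For reference, a requirements/requirements.txt that follows the constrained-installation pattern from the MWAA documentation might look like the sketch below. The Airflow and Python versions in the constraints URL are placeholders to match to the version you are running; the aws-batch pin is the example used above.

--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.11.txt"
aws-batch==0.6

After editing the file, run ./mwaa-local-env test-requirements again to verify that the pins resolve against the constraints.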

Custom plugins

  • There is a directory at the root of this repository called plugins.
  • In this directory, create a file for your new custom plugin.
  • Add any Python dependencies to requirements/requirements.txt.

Note: this step assumes you have a DAG that corresponds to the custom plugin. For example usage, see the MWAA Code Examples.
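A custom plugin file can be as small as the sketch below, which registers a Jinja template macro. The file name (for example, plugins/hello_plugin.py), plugin name, and macro are illustrative assumptions, not files shipped in this repository.

from airflow.plugins_manager import AirflowPlugin


def greeting(name: str) -> str:
    # Exposed to Jinja templates as macros.hello_plugin.greeting("world").
    return f"hello, {name}"


class HelloPlugin(AirflowPlugin):
    # The plugin name becomes the namespace for the macro above.
    name = "hello_plugin"
    macros = [greeting]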

Startup script

  • A sample shell script, startup.sh, is located in the startup_script directory at the root of this repository.
  • If you need to run additional setup (for example, installing system libraries or setting environment variables), modify the startup.sh script.
  • To test startup.sh without running Apache Airflow, use the following script:
./mwaa-local-env test-startup-script
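A startup_script/startup.sh that sets an environment variable and installs a system package might look like the sketch below, assuming yum is available via sudo as in the MWAA documentation examples; the variable name and the libaio package are illustrative placeholders.

#!/bin/sh
# Illustrative startup script sketch for the local runner.

# Export an environment variable for DAGs and plugins (placeholder name/value).
export ENVIRONMENT_STAGE="local"

# Install a system library (placeholder package; assumes sudo yum is permitted).
sudo yum -y install libaio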

What's next?

FAQs

The following section contains common questions and answers you may encounter when using your Docker container image.

Can I test execution role permissions using this repository?

How do I add libraries to requirements.txt and test install?

  • A requirements.txt file is included in the /requirements folder of your local Docker container image. We recommend adding libraries to this file and testing the installation locally.

What if a library is not available on PyPi.org?

Troubleshooting

The following section contains errors you may encounter when using the Docker container image in this repository.

My environment is not starting

  • If you encounter the error process fails with "dag_stats_table already exists", reset your database using the following command:
./mwaa-local-env reset-db
  • If you are moving from an older version of the local runner, you may need to run the reset-db command above, or delete your ./db-data folder. Note, too, that newer Airflow versions ship newer provider packages, which may require updating your DAG code.

Fernet Key InvalidToken

A Fernet key is generated during the image build (./mwaa-local-env build-image) and persists across all containers started from that image. The key is used to encrypt connection passwords in the Airflow DB. If you modify and rebuild the image, you may get a new key that does not match the key used when the Airflow DB was initialized; in that case, reset the DB (./mwaa-local-env reset-db).

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.