# Dynamic dbt Data Pipeline
An example of a dbt pipeline that generates Airflow tasks dynamically from a dbt `manifest.json` file.
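The idea is to parse `target/manifest.json` at DAG-parse time and create one Airflow task per dbt model, wiring dependencies from each node's `depends_on` metadata. Below is a minimal sketch of that pattern, not the repo's exact DAG: the DAG id, schedule, and `DBT_DIR` path are assumptions, and a real project would typically add `dbt test` tasks as well.

```python
import json
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/usr/local/airflow/dbt"  # assumed location of the dbt project

# The manifest is read at DAG-parse time, so it must exist before Airflow
# parses this file (see the "dbt compile" note below).
with open(f"{DBT_DIR}/target/manifest.json") as f:
    manifest = json.load(f)

with DAG(
    dag_id="dbt_dynamic_pipeline",  # hypothetical name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Seed task first, mirroring the dbt_seed step described in this README.
    dbt_seed = BashOperator(
        task_id="dbt_seed",
        bash_command=f"dbt seed --project-dir {DBT_DIR}",
    )

    # One run task per model node in the manifest.
    # (--select on recent dbt versions; older versions used --models.)
    run_tasks = {}
    for node_id, node in manifest["nodes"].items():
        if node["resource_type"] == "model":
            run_tasks[node_id] = BashOperator(
                task_id=f"run_{node['name']}",
                bash_command=f"dbt run --select {node['name']} --project-dir {DBT_DIR}",
            )

    # Wire model-to-model dependencies from the manifest; models with no
    # upstream models run right after the seed.
    for node_id, task in run_tasks.items():
        upstreams = [
            run_tasks[u]
            for u in manifest["nodes"][node_id]["depends_on"]["nodes"]
            if u in run_tasks
        ]
        if upstreams:
            upstreams >> task
        else:
            dbt_seed >> task
```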
## Run this DAG
1. Install the Astronomer CLI (skip this step if you already have the CLI installed).
2. Download the repository.
3. Navigate to where the repository was cloned and run `astro dev start` to spin up a local Airflow environment and run the DAG.
# Airflow DAGs for dbt
The code in this repository is meant to accompany this blog post on beginner and advanced implementation concepts at the intersection of dbt and Airflow.
To run these DAGs locally:
- Download the Astro CLI
- Download and run Docker
- Clone this repository and run `astro dev start` to spin up a local Airflow environment and run the accompanying DAGs on your machine.
## dbt project setup
We are currently using the jaffle_shop sample dbt project.
The only file required for the Airflow DAGs to run is `target/manifest.json`, but we included the models for completeness. If you would like to try these DAGs with your own dbt workflow, feel free to drop in your own project files.
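To see what the DAGs will build, you can inspect the manifest directly. A quick sketch, assuming the project lives under `dbt/` relative to where you run it:

```python
# Hedged sketch: list the model names a dynamically generated DAG would pick up.
# The manifest path is an assumption; adjust it to your project layout.
import json

with open("dbt/target/manifest.json") as f:
    nodes = json.load(f)["nodes"]

models = sorted(n["name"] for n in nodes.values() if n["resource_type"] == "model")
print(models)
```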
- If you make changes to the dbt project, you will need to run `dbt compile` to regenerate `target/manifest.json`. This may be done manually during development, as part of a CI/CD pipeline, or as a separate step in a production pipeline run before the Airflow DAG is triggered (see the sketch after this list).
- The sample dbt project contains a `profiles.yml` that is configured to use environment variables. If the environment variables are not set, it defaults to Astronomer's containerized Postgres database. This is solely for the purposes of this demo; in a production environment, you should use a production-ready database, with environment variables or some other form of secrets management for the database credentials.
- Each DAG runs a `dbt_seed` task at the beginning that loads sample data into the database. This is simply for the purpose of this demo.
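As one hedged illustration of the "separate step before the DAG is triggered" option above, a small pre-run script (for example, in a CI/CD job) could refresh the manifest. The path and setup are assumptions; it only requires the dbt CLI to be installed:

```python
# Hedged sketch of a pre-run step that regenerates target/manifest.json
# before Airflow parses or triggers the dynamically generated DAG.
import subprocess

DBT_DIR = "/usr/local/airflow/dbt"  # assumed project location

subprocess.run(["dbt", "compile", "--project-dir", DBT_DIR], check=True)
```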