Orchestrate Azure Data Factory in Airflow
Example DAG from the Astronomer Azure Operators tutorial demonstrating how to orchestrate Azure Data Factory pipelines in Airflow.
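The pipeline-run pattern this tutorial covers can be sketched with the Azure provider's `AzureDataFactoryRunPipelineOperator`. This is a minimal sketch, not the repo's actual DAG: the connection ID, resource group, factory, and pipeline names below are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)

with DAG(
    dag_id="adf_example",
    start_date=datetime(2023, 1, 1),
    schedule=None,  # trigger manually
    catchup=False,
) as dag:
    # Triggers an existing Data Factory pipeline run. All Azure-specific
    # values here are placeholders, not values from this repo.
    run_pipeline = AzureDataFactoryRunPipelineOperator(
        task_id="run_pipeline",
        azure_data_factory_conn_id="azure_data_factory_default",  # placeholder Airflow connection
        pipeline_name="my_adf_pipeline",          # placeholder pipeline name
        resource_group_name="my-resource-group",  # placeholder
        factory_name="my-data-factory",           # placeholder
    )
```

The operator reads Azure credentials from the Airflow connection, so no secrets need to appear in the DAG file itself.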
Run this DAG
1. Install the Astronomer CLI (skip if you already have our CLI):
2. Download the repository:
3. Navigate to where the repository was cloned and start the DAG:
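The three steps above correspond to commands like the following. The repository URL is a placeholder (the original link is not shown here), and the install command assumes a Linux/macOS shell:

```shell
# 1. Install the Astronomer CLI (skip if already installed)
curl -sSL https://install.astronomer.io | sudo bash -s

# 2. Download the repository (placeholder URL)
git clone https://github.com/<org>/<repo>.git

# 3. Navigate to the cloned repository and start Airflow locally
cd <repo>
astro dev start
```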
This repo is an Astronomer project containing example DAGs that show how to use Azure service operators in Airflow. The DAGs interact with the following Azure services; each links to a detailed guide walking through how to use its operators.
- Azure Container Instances
- Azure Data Explorer
- Azure Data Factory
- Azure Blob Storage
The easiest way to run these example DAGs is to use the Astronomer CLI to get an Airflow instance up and running locally:
- Install the Astronomer CLI
- Clone this repo somewhere locally and navigate to it in your terminal
- Initialize an Astronomer project by running `astro dev init`
- Start Airflow locally by running `astro dev start`
- Navigate to `localhost:8080` in your browser; you should see the tutorial DAGs there