Building Blocks for your Apache Airflow Data Pipelines.
Python classes that can be imported from a provider package for use in Airflow.
Modules are the building blocks of DAGs and include hooks, operators, sensors, and more.
Executes SQL code in a Snowflake database.
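As a rough sketch (assuming Airflow 2.x with the Snowflake provider installed), the operator can be dropped into a DAG like this; the connection ID, table name, and DAG settings are placeholder assumptions:

from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG("snowflake_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Runs the given SQL against the Airflow connection "snowflake_default";
    # "my_table" is a placeholder.
    run_sql = SnowflakeOperator(
        task_id="run_sql",
        snowflake_conn_id="snowflake_default",
        sql="SELECT COUNT(*) FROM my_table;",
    )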
Submits a Spark job run to Databricks using the api/2.0/jobs/runs/submit API endpoint.
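A minimal sketch of submitting a one-time run, assuming the Databricks provider is installed; the cluster spec and notebook path are illustrative placeholders:

from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG("databricks_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Submits a one-time run on a new cluster via api/2.0/jobs/runs/submit;
    # every value in the cluster spec and notebook task is a placeholder.
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_run",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Users/example@example.com/my-notebook"},
    )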
Retrieves Connections and Variables from HashiCorp Vault.
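As a sketch, this secrets backend is typically enabled in airflow.cfg (or the matching environment variables) rather than in DAG code; the Vault URL, mount point, and paths below are placeholder assumptions:

[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"connections_path": "connections", "variables_path": "variables", "mount_point": "airflow", "url": "http://127.0.0.1:8200"}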
An operator to leverage Great Expectations as a task in your Airflow DAG.
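A hedged sketch of the operator inside a DAG; the accepted arguments vary across versions of the Great Expectations provider, and the project directory and checkpoint name here are assumptions:

from datetime import datetime

from airflow import DAG
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator

with DAG("great_expectations_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Validates data against a pre-defined checkpoint; the project root
    # and checkpoint name are placeholders, and older provider versions
    # take different arguments (e.g. an expectation suite name).
    validate = GreatExpectationsOperator(
        task_id="validate_data",
        data_context_root_dir="/usr/local/airflow/great_expectations",
        checkpoint_name="my_checkpoint",
    )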
Polls the state of the EMR JobFlow (cluster) until it reaches one of the target states. If the job flow fails, the sensor raises an error, failing the task.
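A minimal sketch of the sensor (the exact import path varies slightly across Amazon provider versions); the upstream task it pulls the job flow ID from is a placeholder:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.sensors.emr import EmrJobFlowSensor

with DAG("emr_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Waits until the EMR cluster reaches a target (terminal) state; the
    # job flow ID is typically pulled via XCom from a create-cluster task,
    # which is assumed here as "create_cluster".
    wait_for_cluster = EmrJobFlowSensor(
        task_id="wait_for_cluster",
        job_flow_id="{{ ti.xcom_pull(task_ids='create_cluster') }}",
        aws_conn_id="aws_default",
    )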
FivetranOperator starts a Fivetran sync job.
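A short sketch using the community Fivetran provider package; the connector ID (found in the Fivetran dashboard) and connection ID are placeholders:

from datetime import datetime

from airflow import DAG
from fivetran_provider.operators.fivetran import FivetranOperator

with DAG("fivetran_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Kicks off a sync for a single Fivetran connector; "my_connector_id"
    # is a placeholder for a real connector ID.
    start_sync = FivetranOperator(
        task_id="fivetran_sync",
        fivetran_conn_id="fivetran_default",
        connector_id="my_connector_id",
    )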
Just getting started with Apache Airflow? Check out our certification program for an entry-level course on the fundamentals.
Explore our sample GitHub repo that explains how to write, build, and publish a provider package for your tool.
Publish & Share
Ready to publish your provider to the Astronomer Registry? Get in touch and we'll make it happen.
Ready to run Airflow in production?