Building Blocks for Your Apache Airflow Data Pipelines
Modules are the building blocks of DAGs: Python classes, such as hooks, operators, and sensors, that can be imported from a provider package for use in Airflow.
Executes SQL code in a Snowflake database.
Submits a Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.
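For context, a hedged sketch of the kind of JSON body the api/2.1/jobs/runs/submit endpoint accepts, mirroring the `json` argument such an operator forwards. The run name, cluster spec, and notebook path below are illustrative assumptions, not values from this page.

```python
# Illustrative payload for Databricks' api/2.1/jobs/runs/submit endpoint.
# All concrete values here (run name, Spark version, node type, notebook
# path) are placeholder assumptions for the sketch.
import json

payload = {
    "run_name": "example-run",                    # hypothetical run name
    "new_cluster": {
        "spark_version": "13.3.x-scala2.12",      # assumed runtime version
        "node_type_id": "i3.xlarge",              # assumed node type
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/Shared/example"},  # assumed path
}

# The payload is serialized as the request body of the POST call.
body = json.dumps(payload)
print(body)
```

The operator takes care of authentication and polling for run completion; the payload above is only the job specification itself.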
Retrieves Connections and Variables from Hashicorp Vault.
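As an illustration, a minimal `airflow.cfg` fragment enabling this secrets backend; the Vault URL and mount point below are placeholder assumptions for a local setup.

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"url": "http://127.0.0.1:8200", "connections_path": "connections", "variables_path": "variables", "mount_point": "airflow"}
```

With this in place, Airflow looks up Connections and Variables in Vault before falling back to its metadata database.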
This decorator allows users to write Python functions while treating SQL tables as dataframes.
Given a python function that returns a SQL statement and (optional) tables, execute the SQL statement and output the result into a SQL table.
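To make the pattern concrete, here is a minimal, dependency-free sketch of the idea behind such a decorator: the wrapped function returns a SQL statement, and named table arguments are interpolated into it. This is an illustration of the pattern only, not the provider's actual API, which also executes the statement and writes the result to an output table.

```python
# Minimal sketch: a decorator over a function that returns a SQL statement,
# with table names substituted in by keyword. Illustrative only.
def transform(func):
    def wrapper(**tables):
        sql = func()
        # Fill each {placeholder} in the SQL text with the given table name.
        return sql.format(**tables)
    return wrapper

@transform
def top_customers():
    # The function body is just a SQL statement with table placeholders.
    return "SELECT * FROM {customers} ORDER BY total DESC LIMIT 10"

sql = top_customers(customers="analytics.customers")
print(sql)
```

In the real provider, the rendered statement would be run against the database and its result materialized as a table for downstream tasks.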
An operator to leverage Great Expectations as a task in your Airflow DAG.
FivetranOperator starts a Fivetran sync job.
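For orientation, the sync the operator starts corresponds to Fivetran's REST endpoint `POST /v1/connectors/{connector_id}/sync`. The sketch below builds (but does not send) such a request; the connector ID is a hypothetical placeholder.

```python
# Build (without sending) the REST request behind a Fivetran sync trigger.
# The connector ID is a hypothetical placeholder.
from urllib.request import Request

connector_id = "my_connector"  # hypothetical connector ID
req = Request(
    url=f"https://api.fivetran.com/v1/connectors/{connector_id}/sync",
    method="POST",
)
print(req.full_url)
```

The operator wraps this call with Fivetran API authentication and can optionally poll until the sync completes.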
Just getting started with Apache Airflow? Check out our certification program for an entry-level course on the fundamentals.
Explore our sample GitHub repo that shows how to write, build, and publish a provider package for your tool.
Publish & Share
Ready to publish your provider to the Astronomer Registry? Get in touch and we'll make it happen.