Building Blocks for your Apache Airflow Data Pipelines.
Python classes that can be imported from a provider package for use in Airflow.
Modules are the building blocks of DAGs and include hooks, operators, sensors, and more.
Executes SQL code in a Snowflake database.
Submits a Spark job run to Databricks using the api/2.0/jobs/runs/submit API endpoint.
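To make the endpoint concrete, here is a hedged sketch of the kind of JSON body the operator sends to `api/2.0/jobs/runs/submit`; the cluster size, runtime label, node type, and notebook path are all placeholder values.

```python
import json

# Example runs/submit payload: an ephemeral cluster plus one notebook task.
# DatabricksSubmitRunOperator(json=payload, ...) would post a body like this.
payload = {
    "run_name": "airflow-submitted-run",
    "new_cluster": {
        "spark_version": "11.3.x-scala2.12",  # example runtime label
        "node_type_id": "i3.xlarge",          # example node type
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/Users/demo/etl-notebook"},
}

print(json.dumps(payload, indent=2))
```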
Retrieves Connections and Variables from Hashicorp Vault.
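A sketch of the `airflow.cfg` entry that enables the Vault secrets backend; the URL, token, and paths are placeholder values for a local Vault server.

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"connections_path": "connections", "variables_path": "variables", "url": "http://127.0.0.1:8200", "token": "<vault-token>"}
```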
FivetranOperator starts a Fivetran sync job.
An operator to leverage Great Expectations as a task in your Airflow DAG.
Polls the state of an EMR JobFlow (cluster) until it reaches one of the target states. If the cluster fails, the sensor raises an error and the task fails.
Convert a Table object into a Pandas DataFrame or persist a DataFrame result to a database table.
Execute SQL that is not expected to return data like DDL or DML operations.
Execute an explicit SELECT SQL statement. Data returned from this SQL is inserted into a temporary table that can be used by other downstream tasks.
Just getting started with Apache Airflow? Check out our certification program for an entry-level course on the fundamentals.
Explore our sample GitHub repo that explains how to write, build, and publish a provider package for your tool.
Publish & Share
Ready to publish your provider to the Astronomer Registry? Get in touch and we'll make it happen.
Ready to run Airflow in production?