Astronomer Registry
Building Blocks for your Apache Airflow Data Pipelines.
Providers
Python packages containing all relevant Airflow modules for a third-party service.
Modules
Python classes that can be imported from a provider package for use in Airflow.
Modules are the building blocks of DAGs and include hooks, operators, sensors, and more.
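For example, once a provider package is installed, its modules can be imported straight into a DAG file. A minimal sketch of the pattern, using the HTTP provider (package and class names shown are one example of many):

```python
# Modules ship inside provider packages: install the package, then import its classes.
#   pip install apache-airflow-providers-http
from airflow.providers.http.hooks.http import HttpHook
from airflow.providers.http.operators.http import SimpleHttpOperator
from airflow.providers.http.sensors.http import HttpSensor
```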
Submits a Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.
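A minimal sketch of how DatabricksSubmitRunOperator might be used; the cluster spec, notebook path, and connection id below are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

# Submit a one-time notebook run through the api/2.1/jobs/runs/submit endpoint.
with DAG(dag_id="databricks_example", start_date=datetime(2023, 1, 1), schedule=None):
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_spark_job",
        databricks_conn_id="databricks_default",  # placeholder Airflow connection
        json={
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            "notebook_task": {"notebook_path": "/Users/example/my-notebook"},
        },
    )
```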
Executes SQL code in a Snowflake database.
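A minimal sketch of SnowflakeOperator in a DAG; the connection id and SQL statement are placeholders:

```python
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Defined inside a `with DAG(...)` block.
create_table = SnowflakeOperator(
    task_id="create_table",
    snowflake_conn_id="snowflake_default",  # placeholder Airflow connection
    sql="CREATE TABLE IF NOT EXISTS example (id INT, name STRING);",
)
```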
Convert a Table object into a Pandas DataFrame or persist a DataFrame result to a database table.
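This description and the two that follow correspond to SQL-decorator style modules such as the Astro Python SDK's dataframe, transform, and run_raw_sql decorators; a combined example appears after the third description.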
Execute an explicit SELECT SQL statement. Data returned from this SQL is inserted into a temporary table that can be used by other downstream tasks.
Execute SQL that is not expected to return data, such as DDL or DML operations.
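A minimal sketch of these three patterns, assuming the astro-sdk-python package is installed and a connection named "postgres_default" exists (table names, columns, and import paths may differ by version):

```python
import pandas as pd
from astro import sql as aql
from astro.table import Table


@aql.transform
def top_customers(orders: Table):
    # The SELECT result is written to a temporary table that downstream tasks can use.
    return "SELECT customer_id, SUM(amount) AS total FROM {{ orders }} GROUP BY customer_id"


@aql.dataframe
def summarize(top: pd.DataFrame):
    # The upstream temporary table arrives here as a pandas DataFrame.
    return top.describe()


@aql.run_raw_sql
def add_index():
    # DDL/DML that returns no rows.
    return "CREATE INDEX IF NOT EXISTS orders_customer_idx ON orders (customer_id)"


# Inside a DAG, the tasks are wired by calling the decorated functions, e.g.:
# summarize(top_customers(orders=Table(name="orders", conn_id="postgres_default")))
```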
Retrieves Connections and Variables from HashiCorp Vault.
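The Vault secrets backend is enabled through Airflow's configuration rather than imported into a DAG. A minimal airflow.cfg sketch, assuming a KV v2 engine mounted at "airflow"; the paths and URL are placeholders:

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"connections_path": "connections", "variables_path": "variables", "mount_point": "airflow", "url": "http://127.0.0.1:8200"}
```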
An operator to leverage Great Expectations as a task in your Airflow DAG.
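A minimal sketch of GreatExpectationsOperator, assuming the airflow-provider-great-expectations package; the data context directory and checkpoint name are placeholders:

```python
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator

# Defined inside a `with DAG(...)` block: run a Great Expectations checkpoint as a task.
validate_orders = GreatExpectationsOperator(
    task_id="validate_orders",
    data_context_root_dir="/usr/local/airflow/include/great_expectations",  # placeholder
    checkpoint_name="orders_checkpoint",  # placeholder
)
```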
FivetranOperator starts a Fivetran sync job.
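A minimal sketch, assuming the airflow-provider-fivetran package; the connector id and connection id are placeholders:

```python
from fivetran_provider.operators.fivetran import FivetranOperator

# Defined inside a `with DAG(...)` block: trigger a sync for an existing Fivetran connector.
fivetran_sync = FivetranOperator(
    task_id="fivetran_sync",
    fivetran_conn_id="fivetran_default",  # placeholder Airflow connection
    connector_id="my_connector_id",       # placeholder Fivetran connector id
)
```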
Polls the state of the EMR JobFlow (cluster) until it reaches one of the target states. If the cluster enters a failed state, the sensor errors and the task fails.
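A minimal sketch of EmrJobFlowSensor, using the import path from recent versions of the Amazon provider; the job flow id is a placeholder and the target/failed states use the sensor defaults:

```python
from airflow.providers.amazon.aws.sensors.emr import EmrJobFlowSensor

# Defined inside a `with DAG(...)` block: wait for an existing EMR cluster to terminate.
wait_for_cluster = EmrJobFlowSensor(
    task_id="wait_for_emr_cluster",
    job_flow_id="j-EXAMPLE12345",  # placeholder cluster (job flow) id
    aws_conn_id="aws_default",
)
```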
Learn Airflow
Just getting started with Apache Airflow? Check out our certification program for an entry-level course on the fundamentals.
Provider Template
Explore our sample GitHub repo that explains how to write, build, and publish a provider package for your tool.
Publish & Share
Ready to publish your provider to the Astronomer Registry? Get in touch and we'll make it happen.
Ready to run Airflow in production?