While not an explicit provider package, Apache Airflow contains a set of native modules that are closely coupled to the project codebase. Many providers extend these core modules to add more specialized functionality.
The modules below contain the Python classes and callables available from this package.
Interface that providers can implement to be discovered by ProvidersManager.
Abstract base class for sql hooks.
Execute a Bash script, command or set of commands.
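As a minimal sketch of the execution semantics (not the operator's implementation; `run_bash` is a hypothetical helper, and assumes `bash` is on the PATH): the command runs through bash, a non-zero exit code fails the task, and the last line of stdout is the operator's return value.

```python
import subprocess


def run_bash(command: str) -> str:
    """Run a command through bash, mimicking the operator's semantics:
    non-zero exit raises, and the last line of stdout is returned."""
    result = subprocess.run(
        ["bash", "-c", command], capture_output=True, text=True
    )
    if result.returncode != 0:
        raise RuntimeError(
            f"Bash command failed with exit code {result.returncode}"
        )
    output = result.stdout.strip()
    # The real operator pushes the last line of stdout to XCom.
    return output.splitlines()[-1] if output else ""
```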
Performs checks against a database. The SQLCheckOperator expects a SQL query that will return a single row. Each value on that first row is evaluated using Python bool casting; if any value returns False, the check fails and the task errors out.
Checks that the values of metrics given as SQL expressions are within a certain tolerance of the ones from days_back before.
Performs a value check using SQL code against a minimum threshold and a maximum threshold. Thresholds can be in the form of a numeric value OR a SQL statement that results in a numeric value.
Performs a simple value check using sql code.
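As a rough sketch of the check semantics described above (plain Python, not the operators' actual implementation; `evaluate_check_row` and `threshold_check` are hypothetical helpers):

```python
def evaluate_check_row(row):
    """Sketch of SQLCheckOperator's pass/fail rule: each value in the
    single result row is cast with Python's bool(); the check passes
    only if every value is truthy."""
    return all(bool(value) for value in row)


def threshold_check(value, min_threshold, max_threshold):
    """Sketch of the threshold check's rule: the queried result must
    fall within [min_threshold, max_threshold] to pass."""
    return min_threshold <= value <= max_threshold
```

A row like `(0,)` or `(None,)` therefore fails the check, which is why these operators are commonly used for "table is not empty" style data-quality gates.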
Triggers a DAG run for a specified dag_id.
Operator that does literally nothing. It can be used to group tasks in a DAG.
Sends an email.
Allows a workflow to skip tasks that are not running during the most recent schedule interval.
Allows a workflow to “branch” or follow a single path after the execution of this task.
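The branching rule can be sketched in plain Python (an illustrative helper, not the operator's code): the callable returns a task id, or list of task ids, and every other direct downstream task is skipped.

```python
def resolve_branch(branch_result, downstream_task_ids):
    """Sketch of branching semantics: the returned task id(s) select
    which direct downstream tasks run; all others are skipped."""
    if isinstance(branch_result, str):
        branch_result = [branch_result]
    followed = set(branch_result) & set(downstream_task_ids)
    skipped = set(downstream_task_ids) - followed
    return followed, skipped
```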
Executes a Python callable.
Allows one to run a function in a virtualenv that is created and destroyed automatically (with certain caveats).
Allows a workflow to continue only if a condition is met. Otherwise, the workflow “short-circuits” and downstream tasks are skipped.
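The short-circuit rule, sketched as plain Python (a hypothetical helper, not the operator's code):

```python
def short_circuit(condition_result, downstream_task_ids):
    """Sketch: a falsy condition result skips every downstream task;
    a truthy result skips nothing and execution continues normally."""
    if condition_result:
        return []  # nothing skipped
    return list(downstream_task_ids)  # all downstream tasks skipped
```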
Wraps a Python callable and captures args/kwargs when called for execution.
Executes SQL code in a specific database.
Sensor operators are derived from this class and inherit these attributes.
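The core sensor loop these classes share can be sketched as follows (an illustrative helper under simplified assumptions, not the base class's actual code): `poke()` is called until it returns True, sleeping `poke_interval` between attempts, and the sensor fails once `timeout` elapses.

```python
import time


def run_sensor(poke, poke_interval=1.0, timeout=60.0,
               clock=time.monotonic, sleep=time.sleep):
    """Sketch of a sensor's poke loop. `clock` and `sleep` are
    injectable for testing."""
    started = clock()
    while not poke():
        if clock() - started > timeout:
            raise TimeoutError("Sensor timed out")
        sleep(poke_interval)
    return True
```

The real base class also supports a reschedule mode that frees the worker slot between pokes instead of sleeping in-process.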
This runs a sub dag. By convention, a sub dag’s dag_id should be prefixed by its parent and a dot, as in parent.child. Although SubDagOperator can occupy a pool/concurrency slot, users can specify mode=reschedule so that the slot is released periodically to avoid potential deadlock.
Abstract base class to retrieve a Connection object given a conn_id, or a Variable given a key.
Retrieves Connection objects and Variables from environment variables.
Retrieves Connection objects and Variables from local files.
Retrieves Connection objects and Variables from the Airflow metastore database.
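For the environment-variables backend, Airflow's convention is that a connection named `my_db` is looked up under `AIRFLOW_CONN_MY_DB`, whose value is a connection URI. A minimal sketch of that lookup (`get_connection_uri` is a hypothetical helper, not the backend's API):

```python
import os


def get_connection_uri(conn_id, environ=os.environ):
    """Sketch of the environment-variables secrets backend lookup.
    Returns None when the variable is unset, at which point the
    configured backend chain would fall through to the next backend
    (local files, then the metastore)."""
    return environ.get(f"AIRFLOW_CONN_{conn_id.upper()}")
```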
Executes a bash command/script and returns True if and only if the return code is 0.
Waits until the specified datetime.
Waits for a Python callable to return True.
This class stores a sensor work with decoded context value. It is only used inside of smart sensor. A sensor work is created based on a sensor instance record.
Smart sensor operators are derived from this class.
Waits for a timedelta after the task’s execution_date + schedule_interval. In Airflow, the daily task stamped with execution_date 2016-01-01 can only start running on 2016-01-02; the timedelta here represents the time after the execution period has closed.
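The target instant this sensor waits for can be sketched as simple datetime arithmetic (an illustrative helper, not the sensor's code):

```python
from datetime import datetime, timedelta


def timedelta_sensor_target(execution_date, schedule_interval, delta):
    """Sketch of what the sensor waits for: the end of the execution
    period (execution_date + schedule_interval) plus the configured
    delta."""
    return execution_date + schedule_interval + delta
```

For the daily task stamped 2016-01-01 with a two-hour delta, the target is 2016-01-02 02:00.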
Waits until the specified time of the day.
Waits until the first specified day of the week. For example, if the execution day of the task is ‘2018-12-22’ (Saturday) and you pass ‘FRIDAY’, the task will wait until the next Friday.
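The wait computation reduces to modular weekday arithmetic, sketched here as a standalone helper (not the sensor's implementation):

```python
from datetime import date

# Python's date.weekday(): Monday == 0 ... Sunday == 6.
WEEKDAYS = {"MONDAY": 0, "TUESDAY": 1, "WEDNESDAY": 2, "THURSDAY": 3,
            "FRIDAY": 4, "SATURDAY": 5, "SUNDAY": 6}


def days_until(day_name, today):
    """Sketch of the wait: days from `today` until the next occurrence
    of the named weekday (0 when today already is that day)."""
    return (WEEKDAYS[day_name] - today.weekday()) % 7
```

From Saturday 2018-12-22, ‘FRIDAY’ is six days away, matching the example above.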
Abstract base class for hooks. Hooks are meant as an interface to interact with external systems: MySqlHook, HiveHook, and PigHook return objects that can handle the connection and interaction with specific instances of these systems, and expose consistent methods to interact with them.
Waits for a different DAG, or a task in a different DAG, to complete for a specific execution_date.
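The poke condition can be sketched as a lookup keyed by (dag_id, execution_date), where the dict stands in for a metastore query (an illustrative helper, not the sensor's code):

```python
def external_task_poke(dag_runs, external_dag_id, execution_date,
                       allowed_states=("success",)):
    """Sketch of the poke condition: succeed once the external DAG's
    run for the same execution_date has reached an allowed state."""
    state = dag_runs.get((external_dag_id, execution_date))
    return state in allowed_states
```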
Allows for interaction with a file server.
This is a base class for creating operators with branching functionality, similar to BranchPythonOperator.
Interact with HDFS. This class is a wrapper around the hdfscli library.