Astro SDK (Certified)

Execute SQL that is not expected to return data, such as DDL or DML operations.


Last Updated: Jun. 13, 2022

Access Instructions

Install the Astro SDK provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
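A minimal sketch of those two steps, assuming the provider package is `astro-sdk-python`; the connection ID and table below are placeholders, not part of this page:

```python
# Install the provider first, e.g.: pip install astro-sdk-python
from astro import sql as aql  # the Astro SDK SQL module

# Decorate a function with your desired parameters; the returned
# string is executed as raw SQL against the given connection.
@aql.run_raw_sql(conn_id="my_db_conn")  # "my_db_conn" is a hypothetical Airflow connection ID
def create_events_table():
    return "CREATE TABLE IF NOT EXISTS events (id INT, created_at TIMESTAMP)"
```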


Parameters

- python_callable (Optional[Callable]): The function to be decorated.
- multiple_outputs (Optional[bool]): If set to True, the decorated function's return value is unrolled into multiple XCom values. A dict unrolls into XCom values keyed by its keys.
- conn_id (str): The ID of the configured Airflow Connection to use to connect to a database.
- autocommit (bool): If True, each SQL command is automatically committed.
- parameters (Optional[dict or iterable]): The parameters to render the SQL query with.
- database (Optional[str]): The name of the database to use when executing the SQL.
- schema (Optional[str]): The name of the schema to use when executing the SQL.
- warehouse (Optional[str]): The name of the warehouse to use when executing the SQL.
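As an illustration of combining a few of these arguments (a hedged sketch: the connection ID, table, and pyformat-style parameter binding are assumptions that depend on your database driver, not fixed by this page):

```python
from astro import sql as aql

# conn_id, autocommit, and parameters are the decorator arguments
# described above; the connection ID and SQL are hypothetical.
@aql.run_raw_sql(
    conn_id="my_db_conn",
    autocommit=True,
    parameters={"cutoff": "2022-01-01"},
)
def delete_old_events():
    # %(cutoff)s is bound from the parameters dict by the database driver
    return "DELETE FROM events WHERE created_at < %(cutoff)s"
```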


Most ETL use cases can be addressed by cross-sharing task outputs. For SQL operations that don't return tables but may take tables as arguments, the SDK provides the @run_raw_sql decorator.

from astro import sql as aql

@aql.run_raw_sql
def drop_table(table_to_drop):
    return "DROP TABLE IF EXISTS {{table_to_drop}}"
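Conceptually, the table placeholder in the returned string is substituted with the table passed at call time. A naive stand-in for that rendering step (plain string replacement, not the SDK's actual templating engine, which also handles quoting and Table objects):

```python
def render_sql(sql: str, **tables: str) -> str:
    # Naive placeholder substitution: swap each {{name}} for the given identifier.
    for name, identifier in tables.items():
        sql = sql.replace("{{" + name + "}}", identifier)
    return sql

# A DROP TABLE body like the one above would render like this
# ("tmp_orders" is a hypothetical table name):
print(render_sql("DROP TABLE IF EXISTS {{table_to_drop}}", table_to_drop="tmp_orders"))
# → DROP TABLE IF EXISTS tmp_orders
```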
