transform_file


Execute a SELECT SQL statement contained in a file. Data returned from this SQL is inserted into a temporary table which can be used by other downstream tasks.


Last Updated: Dec. 16, 2021

Access Instructions

Install the Astro provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params, as shown in the sketch below.
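For example, once the package is installed, the function can be imported and used inside an ordinary DAG definition. The snippet below is a minimal sketch rather than official package documentation: the import paths may vary between astro SDK versions, and the DAG name (example_transform_file), SQL file path (include/my_query.sql), and output table name (my_results) are hypothetical.

from datetime import datetime

from airflow import DAG
from astro.sql import transform_file  # import path may vary between astro SDK versions
from astro.sql.table import Table

with DAG(
    dag_id="example_transform_file",   # hypothetical DAG name
    start_date=datetime(2021, 12, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Run the SELECT statement stored in the SQL file and write the
    # results to the given output table.
    my_results = transform_file(
        sql="include/my_query.sql",    # hypothetical path to the SQL file
        conn_id="postgres_conn",
        output_table=Table("my_results"),
    )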

Parameters

sql (str), required: Path to a SQL file.
conn_id (str), optional: The ID of the configured Airflow Connection used to connect to the database.
parameters (dict or iterable), optional: The parameters to render the SQL query with.
database (str), optional: The name of the database to use when executing the SQL.
schema (str), optional: The name of the schema to use when executing the SQL.
warehouse (str), optional: The name of the warehouse to use when executing the SQL.
output_table (astro.sql.table.Table), optional: The table to write the SQL results to.

Documentation

Another option for larger SQL queries is to use the transform_file function to pass an external SQL file to the DAG. All of the same parameter templating works for the query in the file: values passed through the parameters argument are rendered into the SQL before it runs.

Example:
from pathlib import Path

from astro.sql import transform_file  # import path may vary between astro SDK versions
from astro.sql.table import Table

cwd = Path(__file__).parent  # directory containing this DAG file and the SQL file

f = transform_file(
    sql=str(cwd) + "/my_sql_function.sql",
    conn_id="postgres_conn",
    database="pagila",
    parameters={
        "actor": Table("actor"),
        "film_actor_join": Table("film_actor"),
        "unsafe_parameter": "G%%",
    },
    output_table=Table("my_table_from_file"),
)
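Because transform_file returns a reference to the populated output table, that handle can be passed to downstream tasks. The following is a minimal sketch, assuming the aql.transform decorator from the same package; the downstream function count_rows and the table name my_table_from_file_counts are hypothetical.

from astro import sql as aql
from astro.sql.table import Table


@aql.transform
def count_rows(source: Table):
    # The Table argument is templated into the query at render time.
    return "SELECT COUNT(*) AS n FROM {{source}}"


# Pass the table produced by transform_file (f, above) into the downstream task.
counts = count_rows(source=f, output_table=Table("my_table_from_file_counts"))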
