SnowflakeOperator

Snowflake

Executes SQL code in a Snowflake database


Last Updated: Jan. 20, 2022

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.

Parameters

snowflake_conn_id (str): Reference to a Snowflake connection id.
sql (Required): The SQL code to be executed. Can receive a str representing a SQL statement, a list of str (SQL statements), or a reference to a template file. Template references are recognized by a str ending in '.sql'. (templated)
autocommit (bool): If True, each command is automatically committed. (default: True)
parameters (dict or iterable): (optional) The parameters to render the SQL query with.
warehouse (str): Name of the warehouse (will overwrite any warehouse defined in the connection's extra JSON).
database (str): Name of the database (will overwrite the database defined in the connection).
schema (str): Name of the schema (will overwrite the schema defined in the connection).
role (str): Name of the role (will overwrite any role defined in the connection's extra JSON).
authenticator (str): Authenticator for Snowflake. 'snowflake' (default) uses the internal Snowflake authenticator. 'externalbrowser' authenticates using your web browser with Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) defined for your account. 'https://<your_okta_account_name>.okta.com' authenticates through native Okta.
session_parameters (dict): Session-level parameters to set at the time you connect to Snowflake.

Documentation

Executes SQL code in a Snowflake database

See also

For more information on how to use this operator, take a look at the guide: SnowflakeOperator

Example DAGs

snowflake

Example use of Snowflake related operators.

Snowflake
Big Data & Analytics
lineage_emission_dag

Lineage Emission

DataHub
Data Management & Governance
Using Census for Reverse ETL in Airflow

This reverse-ETL pipeline syncs customer data from a Snowflake data warehouse to a marketing platform using Census.

Census, Apache Airflow, Snowflake
Databases, ETL/ELT
Using Hightouch for Reverse ETL in Airflow

An ELT pipeline that extracts Salesforce customer data to Snowflake for transformation and enrichment, then syncs it back to Salesforce via Hightouch.

Hightouch, Amazon, Snowflake, Apache Airflow
Databases, ETL/ELT, Storage, Big Data & Analytics
Modern ELT from Salesforce to Snowflake

A modern ELT pipeline that extracts Salesforce data, then loads and transforms it in Snowflake.

Amazon, Snowflake, Apache Airflow
Databases, ETL/ELT, Storage, Big Data & Analytics
Extract Zendesk to Snowflake

Uploads the following Zendesk objects to S3: Tickets, Organizations, and Users. From S3, loads them into Snowflake. Loads can be daily increments or full extracts.

Snowflake, Amazon, Apache Airflow, HTTP
ETL/ELT, Storage, Databases
Taxi Data Quality Example With Snowflake

An example DAG showing the use of SnowflakeCheckOperators.

Apache Airflow, Snowflake, Amazon
ETL/ELT, Data Quality, Databases
dbt Test Copy Store Failures Snowflake

A DAG that runs a dbt project and its tests, then loads the store_failures table into a permanent table so subsequent runs do not overwrite it.

Apache Airflow, Snowflake
Storage, ETL/ELT, Data Quality, Data Management & Governance
Great Expectations Snowflake Example

Example DAG showcasing loading and data quality checking with Snowflake and Great Expectations.

Apache Airflow, Great Expectations, Snowflake, Amazon
Data Quality, ETL/ELT
Google Analytics to Snowflake

The google_analytics_to_snowflake DAG transfers data from Google Analytics (GA) to Google Cloud Storage (GCS), and then loads the data to Snowflake staging tables. From staging, data is loaded to Snow…

Apache Airflow, Snowflake
ETL/ELT, Data Processing, Storage, Databases, Data Management & Governance
