SnowflakeHook

Snowflake

A client to interact with Snowflake.


Last Updated: Apr. 30, 2021

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

account (Optional[str]): Snowflake account name
authenticator (Optional[str]): authenticator for Snowflake. 'snowflake' (default) uses the internal Snowflake authenticator; 'externalbrowser' authenticates through your web browser with Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) defined for your account; 'https://<okta_account_name>.okta.com' authenticates through native Okta.
warehouse (Optional[str]): name of the Snowflake warehouse
database (Optional[str]): name of the Snowflake database
region (Optional[str]): name of the Snowflake region
role (Optional[str]): name of the Snowflake role
schema (Optional[str]): name of the Snowflake schema
session_parameters (Optional[dict]): session-level parameters to set at the time you connect to Snowflake

Documentation

A client to interact with Snowflake.

This hook requires the snowflake_conn_id connection. The Snowflake host, login, and password fields must be set up in the connection. Other inputs can be defined either in the connection or at hook instantiation. If used with the S3ToSnowflakeOperator, add 'aws_access_key_id' and 'aws_secret_access_key' to the extra field in the connection.

Note

get_sqlalchemy_engine() depends on snowflake-sqlalchemy

See also

For more information on how to use this Snowflake connection, take a look at the guide: SnowflakeOperator

Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
