S3ToSnowflakeOperator

Snowflake

Executes a COPY command to load files from S3 into Snowflake.


Last Updated: Apr. 25, 2021

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
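The install step above typically amounts to adding the Snowflake provider distribution to the environment that runs your Airflow workers (package name assumed to be the standard provider distribution; pin a version matching your Airflow release):

```shell
# Install the Snowflake provider for Airflow.
# Pin the version to one compatible with your Airflow deployment.
pip install apache-airflow-providers-snowflake
```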

Parameters

s3_keys (list): reference to a list of S3 keys
table (str): reference to a specific table in the Snowflake database
schema (str): name of schema (will overwrite schema defined in connection)
stage (str): reference to a specific Snowflake stage. If the stage's schema is not the same as the table's, it must be specified
prefix (str): cloud storage location specified to limit the set of files to load
file_format (str): reference to a specific file format
warehouse (str): name of warehouse (will overwrite any warehouse defined in the connection's extra JSON)
database (str): reference to a specific database in the Snowflake connection
columns_array (list): reference to a specific columns array in the Snowflake database
snowflake_conn_id (str): reference to a specific Snowflake connection
role (str): name of role (will overwrite any role defined in the connection's extra JSON)
authenticator (str): authenticator for Snowflake. 'snowflake' (default) uses the internal Snowflake authenticator; 'externalbrowser' authenticates using your web browser and Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) defined for your account; 'https://<your_okta_account_name>.okta.com' authenticates through native Okta
session_parameters (dict): session-level parameters to set at the time you connect to Snowflake
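To illustrate how these parameters combine, the sketch below assembles the kind of COPY INTO statement the operator ultimately issues against Snowflake. This is a hypothetical helper for illustration, not the provider's actual implementation; the function name and defaults are assumptions.

```python
def build_copy_statement(table, stage, s3_keys=None, prefix=None,
                         file_format="(type = 'CSV')", schema=None,
                         columns_array=None):
    """Sketch: assemble a Snowflake COPY INTO statement from
    operator-style parameters (illustrative only)."""
    # Qualify the table with its schema when one is given.
    qualified_table = f"{schema}.{table}" if schema else table

    # An optional columns array restricts which columns are loaded.
    into = qualified_table
    if columns_array:
        into += "(" + ", ".join(columns_array) + ")"

    # The stage (plus optional prefix) identifies the files' location.
    location = f"@{stage}/"
    if prefix:
        location += prefix

    sql = f"COPY INTO {into} FROM {location}"
    # Explicit S3 keys limit the load to specific files.
    if s3_keys:
        files = ", ".join(f"'{key}'" for key in s3_keys)
        sql += f" files=({files})"
    sql += f" file_format={file_format}"
    return sql
```

For example, `build_copy_statement("traffic", "my_stage", s3_keys=["day1.csv"], schema="public")` yields a statement beginning `COPY INTO public.traffic FROM @my_stage/`.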

Documentation

Executes a COPY command to load files from S3 into Snowflake.

See also

For more information on how to use this operator, take a look at the guide: S3ToSnowflakeOperator
