Executes an UNLOAD command to S3 as a CSV with headers.


Last Updated: May 7, 2021

Access Instructions

Install the Amazon provider package (`apache-airflow-providers-amazon`) into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.


- `schema` (str): reference to a specific schema in the Redshift database
- `table` (str): reference to a specific table in the Redshift database
- `s3_bucket` (str): reference to a specific S3 bucket
- `s3_key` (str): reference to a specific S3 key. If `table_as_file_name` is set to `False`, this param must include the desired file name
- `redshift_conn_id` (str): reference to a specific Redshift database
- `aws_conn_id` (str): reference to a specific S3 connection
- `verify` (bool or str): whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values:
  - `False`: do not validate SSL certificates. SSL will still be used (unless `use_ssl` is `False`), but SSL certificates will not be verified.
  - `path/to/cert/bundle.pem`: a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
- `unload_options` (list): reference to a list of UNLOAD options
- `autocommit` (bool): if set to `True`, the UNLOAD statement is committed automatically; otherwise it is committed right before the Redshift connection is closed
- `include_header` (bool): if set to `True`, the S3 file contains the header columns
- `table_as_file_name` (bool): if set to `True`, the S3 file will be named after the table
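Conceptually, these parameters are assembled into a Redshift UNLOAD statement. The sketch below uses a hypothetical `build_unload_query` helper to show roughly how the pieces fit together; it is not the operator's actual implementation, and it omits the credentials clause and header handling that the real operator injects:

```python
def build_unload_query(
    schema: str,
    table: str,
    s3_bucket: str,
    s3_key: str,
    unload_options=None,
    table_as_file_name: bool = True,
) -> str:
    """Illustrative sketch of how an UNLOAD statement could be built
    from the operator's parameters (credentials omitted)."""
    # When table_as_file_name is True, the table name is used as the
    # file prefix under the given S3 key.
    if table_as_file_name:
        s3_path = f"s3://{s3_bucket}/{s3_key}/{table}_"
    else:
        s3_path = f"s3://{s3_bucket}/{s3_key}"
    # UNLOAD options (e.g. CSV, HEADER) are appended verbatim.
    options = "\n".join(unload_options or [])
    return (
        f"UNLOAD ('SELECT * FROM {schema}.{table}')\n"
        f"TO '{s3_path}'\n"
        f"{options}"
    )

print(build_unload_query("public", "users", "my-bucket", "exports",
                         unload_options=["CSV", "HEADER"]))
```

Passing `unload_options=["CSV", "HEADER"]` is one way to get a headered CSV; the operator's `include_header` flag exists for the same purpose.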



Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
