S3ToRedshiftOperator

Provider: Amazon

Executes a COPY command to load files from S3 into Redshift.


Last Updated: Feb. 8, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters, as sketched below.
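A minimal sketch of the install and import steps. The package name and import path below are those used by the Amazon provider on Airflow 2.x:

# Install the provider package into your Airflow environment:
#   pip install apache-airflow-providers-amazon

from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator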

Parameters

schema (str): reference to a specific schema in the Redshift database
table (str): reference to a specific table in the Redshift database
s3_bucket (str): reference to a specific S3 bucket
s3_key (str): reference to a specific S3 key
redshift_conn_id (str): reference to a specific Redshift connection
aws_conn_id (str): reference to a specific S3 connection
verify (bool or str): whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
- path/to/cert/bundle.pem: a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
copy_options (list): reference to a list of COPY options
truncate_table (bool): whether or not to truncate the destination table before the COPY
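Putting the parameters together, a minimal DAG might look like the sketch below. The DAG ID, bucket, key, schema, table, and COPY options are placeholder values for illustration, not part of the operator's API:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="s3_to_redshift_example",  # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Loads s3://my-data-bucket/exports/orders.csv into public.orders,
    # emptying the table first. Roughly equivalent to running:
    #   TRUNCATE TABLE public.orders;
    #   COPY public.orders FROM 's3://my-data-bucket/exports/orders.csv'
    #       with credentials '...' CSV IGNOREHEADER 1;
    load_orders = S3ToRedshiftOperator(
        task_id="load_orders",
        schema="public",
        table="orders",
        s3_bucket="my-data-bucket",     # placeholder bucket
        s3_key="exports/orders.csv",    # placeholder key
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
        copy_options=["CSV", "IGNOREHEADER 1"],
        truncate_table=True,
    )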

Documentation

Executes a COPY command to load files from S3 into Redshift. If truncate_table is set, the destination table is truncated before the data is copied.

See also

For more information on how to use this operator, take a look at the guide: S3 To Redshift Transfer Operator
