LocalFilesystemToS3Operator

Amazon

Uploads a file from a local filesystem to Amazon S3.


Last Updated: Aug. 14, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
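A minimal sketch of that workflow, assuming the provider's standard import path, an Airflow connection named aws_default, and placeholder file, key, and bucket names:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.local_to_s3 import LocalFilesystemToS3Operator

with DAG(
    dag_id="local_to_s3_example",      # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Upload a local file to s3://my-example-bucket/data/file.ext
    upload_file = LocalFilesystemToS3Operator(
        task_id="upload_file",
        filename="/path/to/file.ext",     # local path to upload (templated)
        dest_key="data/file.ext",         # key relative to the bucket root (templated)
        dest_bucket="my-example-bucket",  # placeholder bucket name (templated)
        aws_conn_id="aws_default",
    )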

Parameters

filename (str, required): Path to the local file. The path can be either absolute (e.g. /path/to/file.ext) or relative (e.g. ../../foo/*/*.csv). (templated)
dest_key (str, required): The key of the object to copy to. (templated) It can be either a full s3:// style URL or a path relative to the root level. When it is specified as a full s3:// URL, passing dest_bucket as well results in a TypeError.
dest_bucket (str): Name of the S3 bucket to which the object is copied. (templated) Passing this when dest_key is provided as a full s3:// URL results in a TypeError.
aws_conn_id (str): Connection ID of the S3 connection to use.
verify (bool or str): Whether or not to verify SSL certificates for the S3 connection. By default SSL certificates are verified. You can provide the following values: False: do not validate SSL certificates. SSL will still be used, but SSL certificates will not be verified. path/to/cert/bundle.pem: a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
replace (bool): A flag to decide whether or not to overwrite the key if it already exists. If replace is False and the key exists, an error will be raised.
encrypt (bool): If True, the file will be encrypted on the server side by S3 and will be stored in an encrypted form while at rest in S3.
gzip (bool): If True, the file will be compressed locally before upload.
acl_policy (str): String specifying the canned ACL policy for the file being uploaded to the S3 bucket.
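A sketch of the alternative calling convention described above, added inside the same with DAG(...) block as the earlier example: here dest_key is given as a full s3:// URL (so dest_bucket is omitted) and the optional flags are set explicitly. The bucket, key, and file names are placeholders.

    # dest_key as a full s3:// URL; dest_bucket must be omitted in this form
    upload_report = LocalFilesystemToS3Operator(
        task_id="upload_report",
        filename="../../data/report.csv",                         # relative local path (templated)
        dest_key="s3://my-example-bucket/reports/report.csv.gz",  # full s3:// style URL (templated)
        aws_conn_id="aws_default",
        verify=True,      # verify SSL certificates (the default behaviour)
        replace=True,     # overwrite the key if it already exists
        encrypt=True,     # store the object with S3 server-side encryption at rest
        gzip=True,        # compress the file locally before uploading
        acl_policy="bucket-owner-full-control",  # canned ACL applied to the uploaded object
    )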

Documentation

Uploads a file from a local filesystem to Amazon S3.
