SFTPToS3Operator

Provider: Amazon

This operator transfers files from an SFTP server to Amazon S3.


Last Updated: May 7, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.

Parameters

- `sftp_conn_id` (str): The SFTP connection ID — the name or identifier used to establish a connection to the SFTP server.
- `sftp_path` (str): The remote path of the file to download from the SFTP server.
- `s3_conn_id` (str): The S3 connection ID — the name or identifier used to establish a connection to S3.
- `s3_bucket` (str): The target S3 bucket to which the file is uploaded.
- `s3_key` (str): The target S3 key, i.e. the path under which the file is uploaded.
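The parameters above map directly onto the operator's constructor. A minimal sketch of instantiating the operator inside a DAG — the connection IDs, bucket name, and file paths below are placeholder values you would replace with your own:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.sftp_to_s3 import SFTPToS3Operator

with DAG(
    dag_id="example_sftp_to_s3",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Download /tmp/data.csv from the SFTP server and upload it to
    # s3://my-bucket/data/data.csv. All connection IDs, paths, and
    # bucket/key names here are placeholders for illustration.
    transfer = SFTPToS3Operator(
        task_id="sftp_to_s3",
        sftp_conn_id="sftp_default",
        sftp_path="/tmp/data.csv",
        s3_conn_id="aws_default",
        s3_bucket="my-bucket",
        s3_key="data/data.csv",
    )
```

The `sftp_conn_id` and `s3_conn_id` values must match connections configured in your Airflow environment (via the UI, CLI, or environment variables) before the task runs.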


Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written, descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
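The directory layout the steps above describe can be sketched as follows — the DAG filename is a placeholder, and the commands assume you are in the top-level source directory of the provider package:

```shell
# Create the example_dags package with an empty __init__.py.
mkdir -p example_dags
touch example_dags/__init__.py

# Add your example DAG file (name here is hypothetical).
touch example_dags/example_sftp_to_s3.py

# Verify the layout before opening the pull request.
ls example_dags
```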
