MySQLToS3Operator

Amazon

Saves data from a specific MySQL query into a file in S3.

Last Updated: Apr. 27, 2021

Access Instructions

Install the Amazon provider package (apache-airflow-providers-amazon) into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
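For example, a minimal instantiation might look like the sketch below. The import path assumes the provider's `transfers` module, and the DAG id, query, bucket, key, and connection IDs are placeholders to adapt to your environment.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.mysql_to_s3 import MySQLToS3Operator

with DAG(
    dag_id="example_mysql_to_s3",        # placeholder DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Export the result of a MySQL query to a CSV file in S3.
    mysql_to_s3 = MySQLToS3Operator(
        task_id="mysql_to_s3",
        query="SELECT * FROM my_table",  # placeholder query
        s3_bucket="my-bucket",           # placeholder bucket
        s3_key="exports/my_table.csv",   # placeholder key
        mysql_conn_id="mysql_default",
        aws_conn_id="aws_default",
    )
```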

Parameters

query (str): The SQL query to be executed. To execute a file instead, provide the absolute path to it, ending with a .sql extension. (templated)
s3_bucket (str): Bucket where the data will be stored. (templated)
s3_key (str): Desired key for the file, including the file name. (templated)
mysql_conn_id (str): Reference to a specific MySQL database.
aws_conn_id (str): Reference to a specific S3 connection.
verify (bool or str): Whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values:
  - False: Do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
  - path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
pd_csv_kwargs (dict): Arguments to pass to pandas DataFrame.to_csv (header, index, columns, ...).
index (bool): Whether to include the DataFrame index in the file.
header (bool): Whether to include a header row in the S3 file.
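
To illustrate how the pandas-related and SSL options fit together, here is a hedged sketch; the import path assumes the provider's `transfers` module, and the query, bucket, key, and certificate path are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.mysql_to_s3 import MySQLToS3Operator

with DAG(
    dag_id="example_mysql_to_s3_csv_options",  # placeholder DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Write a semicolon-separated file without the DataFrame index,
    # using a custom CA bundle to verify the S3 connection.
    orders_to_s3 = MySQLToS3Operator(
        task_id="orders_to_s3",
        query="SELECT id, total FROM orders",  # placeholder query
        s3_bucket="my-bucket",                 # placeholder bucket
        s3_key="exports/orders.csv",           # placeholder key
        mysql_conn_id="mysql_default",
        aws_conn_id="aws_default",
        verify="/path/to/cert/bundle.pem",     # or False to skip verification
        pd_csv_kwargs={"sep": ";"},            # extra pandas to_csv options
        index=False,                           # drop the DataFrame index
        header=True,                           # keep the header row
    )
```

Because query, s3_bucket, and s3_key are templated, Jinja expressions such as {{ ds }} can be used in those values.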

Documentation

Saves data from a specific MySQL query into a file in S3. The query results are written to the given S3 key as a CSV file, with the output format controlled by the pandas-related parameters (pd_csv_kwargs, index, header).

Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
