ExasolToS3Operator

Amazon

Export data from an Exasol database to an AWS S3 bucket.


Last Updated: May 7, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

query_or_table (str): The SQL statement to be executed, or the name of the table to export.
key (str): S3 key that will point to the file.
bucket_name (str): Name of the bucket in which to store the file.
replace (bool): A flag to decide whether or not to overwrite the key if it already exists. If replace is False and the key exists, an error will be raised.
encrypt (bool): If True, the file will be encrypted on the server side by S3 and will be stored in an encrypted form while at rest in S3.
gzip (bool): If True, the file will be compressed locally.
acl_policy (str): String specifying the canned ACL policy for the file being uploaded to the S3 bucket.
query_params (dict): Query parameters passed to the underlying export_to_file method of pyexasol.connection.ExaConnection.
export_params (dict): Extra parameters passed to the underlying export_to_file method of pyexasol.connection.ExaConnection.

Documentation

Export data from an Exasol database to an AWS S3 bucket.
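
Once the provider package is installed, the operator can be wired into a DAG like any other transfer operator. The snippet below is a minimal sketch using the parameters documented above; the DAG ID, query, bucket, key, and the `with_column_names` entry in `export_params` are placeholder or illustrative values, and the example assumes default `exasol_default` and `aws_default` connections are configured in your Airflow environment.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.exasol_to_s3 import ExasolToS3Operator

with DAG(
    dag_id="example_exasol_to_s3",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Export the result of a query (or a bare table name) from Exasol
    # into an object in S3.
    export_customers = ExasolToS3Operator(
        task_id="export_customers",
        # Either a full SQL statement or a table name to export.
        query_or_table="SELECT * FROM retail.customers",
        # Destination object in the target bucket (placeholder values).
        key="exports/customers.csv",
        bucket_name="my-data-lake",
        # Overwrite the key if it already exists.
        replace=True,
        # Request server-side encryption; skip local gzip compression.
        encrypt=True,
        gzip=False,
        # Extra options forwarded to pyexasol's export_to_file;
        # with_column_names is one illustrative pyexasol option.
        export_params={"with_column_names": True},
    )
```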

Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
