AwsBaseHook

Amazon

Interact with AWS. This class is a thin wrapper around the boto3 Python library.


Last Updated: Mar. 26, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate the hook with your desired parameters, as shown in the sketch below.
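A minimal sketch of installing the provider and instantiating the hook. The connection ID "aws_default" and the choice of client_type="s3" are illustrative assumptions, not requirements:

```python
# Install the provider first (shell): pip install apache-airflow-providers-amazon
from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

# "aws_default" is Airflow's conventional AWS connection ID; any
# connection holding your AWS credentials works here.
hook = AwsBaseHook(aws_conn_id="aws_default", client_type="s3")
```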

Parameters

aws_conn_id (str): The Airflow connection used for AWS credentials. If None or empty, the default boto3 behaviour is used. When running Airflow in a distributed manner with aws_conn_id set to None or empty, the default boto3 configuration is used and must be maintained on each worker node.
verify (Union[bool, str, None]): Whether or not to verify SSL certificates. See https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
region_name (Optional[str]): The AWS region name. If not specified, the default boto3 behaviour is used.
client_type (Optional[str]): The boto3.client type, e.g. 's3' or 'emr'.
resource_type (Optional[str]): The boto3.resource type, e.g. 'dynamodb'.
config (Optional[botocore.client.Config]): Configuration for the botocore client. See https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
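A sketch combining the parameters above; the region, retry count, and S3 client type are assumed values chosen for illustration. get_conn() returns the underlying boto3 client (or resource) built from these settings:

```python
from botocore.config import Config

from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

hook = AwsBaseHook(
    aws_conn_id="aws_default",                   # Airflow connection holding AWS credentials
    client_type="s3",                            # build a boto3 S3 client
    region_name="us-east-1",                     # assumed region; omit to use boto3 defaults
    verify=True,                                 # verify SSL certificates
    config=Config(retries={"max_attempts": 3}),  # botocore client configuration
)

s3_client = hook.get_conn()  # the boto3 client built from the settings above
response = s3_client.list_buckets()
print([bucket["Name"] for bucket in response["Buckets"]])
```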


Example DAGs

Improve this module by creating an example DAG.

1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
2. Add your DAG to this directory. Be sure to include a well-written, descriptive docstring (a minimal sketch follows this list).
3. Create a pull request against the source code. Once the package is released, your DAG will show up on the Registry.
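As a starting point, a minimal sketch of such an example DAG, assuming the apache-airflow-providers-amazon package is installed and an "aws_default" connection exists; the DAG ID and task ID are placeholders:

```python
"""Example DAG demonstrating AwsBaseHook: lists S3 buckets via the hook's boto3 client."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook


def list_buckets():
    """Print the names of all S3 buckets visible to the configured credentials."""
    hook = AwsBaseHook(aws_conn_id="aws_default", client_type="s3")
    client = hook.get_conn()
    for bucket in client.list_buckets()["Buckets"]:
        print(bucket["Name"])


with DAG(
    dag_id="example_aws_base_hook",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="list_buckets", python_callable=list_buckets)
```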
