AzureBlobStorageToGCSOperator

Microsoft Azure

Transfers data from Azure Blob Storage to a specified bucket in Google Cloud Storage.

Last Updated: May 7, 2021

Access Instructions

Install the Microsoft Azure provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
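
A minimal sketch of the install and import, assuming the apache-airflow-providers-microsoft-azure package and the module path used by this provider at the time of this page's last update (the path may differ in later provider versions):

    # pip install apache-airflow-providers-microsoft-azure
    from airflow.providers.microsoft.azure.transfers.azure_blob_to_gcs import (
        AzureBlobStorageToGCSOperator,
    )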

Parameters

wasb_conn_id (str): Reference to the wasb connection.
gcp_conn_id (str): The connection ID to use when fetching connection info.
blob_name (str, required): Name of the blob.
file_path (str, required): Path to the file to download.
container_name (str, required): Name of the container.
bucket_name (str, required): The bucket to upload to.
object_name (str, required): The object name to set when uploading the file.
filename (str, required): The local file path to the file to be uploaded.
gzip (bool, required): Option to compress the local file or file data for upload.
delegate_to (str, required): The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
impersonation_chain (Union[str, Sequence[str]]): Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account from the list granting this role to the originating account.
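
A minimal sketch of instantiating the operator inside a DAG using the parameters above; the connection IDs, container, bucket, blob name, and local paths are placeholder values for illustration only:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.microsoft.azure.transfers.azure_blob_to_gcs import (
        AzureBlobStorageToGCSOperator,
    )

    with DAG(
        dag_id="azure_blob_to_gcs_example",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        transfer_blob_to_gcs = AzureBlobStorageToGCSOperator(
            task_id="transfer_blob_to_gcs",
            wasb_conn_id="wasb_default",          # Azure Blob Storage (wasb) connection
            gcp_conn_id="google_cloud_default",   # Google Cloud connection
            blob_name="data/report.csv",          # blob to read from Azure (placeholder)
            container_name="my-azure-container",  # source container (placeholder)
            file_path="/tmp/report.csv",          # local path the blob is downloaded to
            filename="/tmp/report.csv",           # local file then uploaded to GCS
            bucket_name="my-gcs-bucket",          # destination bucket (placeholder)
            object_name="data/report.csv",        # destination object name in GCS
            gzip=False,                           # set True to gzip before upload
            delegate_to=None,                     # domain-wide delegation account, if any
            impersonation_chain=None,             # optional short-term credential chain
        )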

Documentation

Transfers data from Azure Blob Storage to a specified bucket in Google Cloud Storage.

See also

For more information on how to use this operator, take a look at the guide: Transfer Data from Blob Storage to Google Cloud Storage
