OracleToAzureDataLakeOperator

Microsoft Azure

Moves data from Oracle to Azure Data Lake. The operator runs the query against Oracle and stores the file locally before loading it into Azure Data Lake.


Last Updated: May 7, 2021

Access Instructions

Install the Microsoft Azure provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

filename (str, required): Name of the CSV file staged locally before upload.
azure_data_lake_conn_id (str, required): Destination Azure Data Lake connection.
azure_data_lake_path (str, required): Destination path in Azure Data Lake for the file.
oracle_conn_id (str, required): Source Oracle connection.
sql (str, required): SQL query to execute against the Oracle database. (templated)
sql_params (Optional[dict]): Parameters to use in the SQL query. (templated)
delimiter (str): Field delimiter in the file.
encoding (str): Encoding type for the file.
quotechar (str): Character to use in quoting.
quoting (str): Quoting strategy. See unicodecsv quoting for more information.
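The delimiter, quotechar, and quoting parameters control how the locally staged CSV is written and behave like their counterparts in Python's csv/unicodecsv writer. A small stdlib sketch of what those settings do to the file contents (the values shown are just the defaults):

```python
import csv
import io

# Emulate the operator's file-format parameters with the stdlib csv writer;
# unicodecsv, which the operator uses, mirrors this API.
rows = [("id", "name"), (1, "O'Brien"), (2, "smith, jr.")]

buf = io.StringIO()
writer = csv.writer(
    buf,
    delimiter=",",              # field delimiter in the file
    quotechar='"',              # character to use in quoting
    quoting=csv.QUOTE_MINIMAL,  # quote only fields that need it
)
writer.writerows(rows)

print(buf.getvalue())
# With QUOTE_MINIMAL, only the field containing the delimiter is quoted:
# id,name
# 1,O'Brien
# 2,"smith, jr."
```

Passing csv.QUOTE_ALL instead would wrap every field in the quotechar, which some downstream parsers require.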

