DataprocSubmitHiveJobOperator

Google

Start a Hive query job on a Cloud Dataproc cluster.


Last Updated: May 7, 2021

Access Instructions

Install the Google provider package (`apache-airflow-providers-google`) into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters, as in the sketch below.
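
For reference, a minimal sketch of the install and import steps, assuming the standard `apache-airflow-providers-google` package name and import path:

```python
# Install the provider package into the environment that runs Airflow, e.g.:
#   pip install apache-airflow-providers-google

# Then import the operator in your DAG file:
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitHiveJobOperator
```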

Parameters

query (str): The query or a reference to the query file (.q extension).
query_uri (str): The HCFS URI of the script that contains the Hive queries.
variables (dict): Map of named parameters for the query.
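
To illustrate how these parameters fit together, here is a hedged sketch: the bucket path, cluster name, and region are placeholders, and `variables` passes a templated value into the job. You would typically supply either `query` or `query_uri`, not both.

```python
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitHiveJobOperator

# Submit a Hive script stored on GCS (an HCFS URI) and parameterize it
# with named variables; {{ ds }} is Airflow's execution-date template.
scripted_hive = DataprocSubmitHiveJobOperator(
    task_id="hive_from_script",
    query_uri="gs://my-bucket/queries/daily_report.q",  # hypothetical bucket and script
    variables={"run_date": "{{ ds }}"},
    cluster_name="my-dataproc-cluster",  # placeholder cluster
    region="us-central1",                # placeholder region
)
```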

Documentation

Start a Hive query job on a Cloud Dataproc cluster.
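
Below is a minimal DAG sketch showing the operator in context. The project, region, cluster, and query are placeholder values, and the Dataproc cluster is assumed to already exist.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitHiveJobOperator

with DAG(
    dag_id="example_dataproc_hive",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Submit a simple Hive query to an existing Dataproc cluster.
    show_databases = DataprocSubmitHiveJobOperator(
        task_id="show_databases",
        query="SHOW DATABASES;",
        cluster_name="my-dataproc-cluster",  # placeholder
        region="us-central1",                # placeholder
        project_id="my-gcp-project",         # placeholder
    )
```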

Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package gets released, your DAG will show up on the Registry.
