Runs a Hive job in a Data Proc cluster.


Last Updated: May 7, 2021

Access Instructions

Install the Yandex provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
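The provider can typically be installed from PyPI; the package name below is the standard Apache Airflow provider distribution for Yandex (verify the version pin against your Airflow release):

```shell
# Install the Yandex provider into the active Airflow environment.
pip install apache-airflow-providers-yandex
```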


query (Optional[str]): Hive query.
query_file_uri (Optional[str]): URI of the script that contains Hive queries. Can be placed in HDFS or S3.
properties (Optional[Dict[str, str]]): A mapping of property names to values, used to configure Hive.
script_variables (Optional[Dict[str, str]]): Mapping of query variable names to values.
continue_on_failure (bool): Whether to continue executing queries if a query fails.
name (str): Name of the job. Used for labeling.
cluster_id (Optional[str]): ID of the cluster to run the job in. Will try to take the ID from the Dataproc Hook object if not specified. (templated)
connection_id (Optional[str]): ID of the Yandex.Cloud Airflow connection.
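A minimal DAG sketch using the parameters above. The import path shown matches the Yandex provider package at the time of writing, but check it against your installed provider version; the cluster ID, connection ID, query, and variable values are placeholders, not real resources:

```python
# Sketch of a DAG that submits a Hive job to an existing Data Proc cluster.
# All IDs and values here are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.yandex.operators.yandexcloud_dataproc import (
    DataprocCreateHiveJobOperator,
)

with DAG(
    dag_id="example_dataproc_hive",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    hive_job = DataprocCreateHiveJobOperator(
        task_id="run_hive_query",
        # Either pass an inline query, or set query_file_uri to a script
        # stored in HDFS or S3.
        query="SHOW DATABASES;",
        properties={"hive.exec.dynamic.partition": "true"},
        script_variables={"TARGET_DB": "analytics"},  # hypothetical variable
        continue_on_failure=False,
        cluster_id="c9q1ab2c3d4e5f6g7h8i",  # hypothetical; omit to take it from the hook
        connection_id="yandexcloud_default",
    )
```

If `cluster_id` is omitted, the operator falls back to the cluster tracked by the Dataproc hook, which is convenient when the cluster is created earlier in the same DAG.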


