SparkSqlOperator


Execute Spark SQL query


Last Updated: May 7, 2021

Access Instructions

Install the Spark provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters, as shown in the sketch below.
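As a minimal sketch of those two steps, the snippet below assumes the provider package is apache-airflow-providers-apache-spark (installed with pip) and that a Spark SQL connection already exists in your Airflow environment; the connection ID and table name are placeholders.

```python
# Assumes the provider package has been installed, e.g.:
#   pip install apache-airflow-providers-apache-spark
from airflow.providers.apache.spark.operators.spark_sql import SparkSqlOperator

# Minimal instantiation; "spark_sql_default" is a placeholder connection ID
# that must exist in your Airflow connections, and "my_table" is hypothetical.
count_rows = SparkSqlOperator(
    task_id="count_rows",
    sql="SELECT COUNT(*) FROM my_table",
    master="local",
    conn_id="spark_sql_default",
)
```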

Parameters

sql (str): The SQL query to execute. (templated)
conf (str, format PROP=VALUE): Arbitrary Spark configuration property.
conn_id (str): The connection ID string.
total_executor_cores (int): (Standalone & Mesos only) Total cores for all executors (default: all available cores on the worker).
executor_cores (int): (Standalone & YARN only) Number of cores per executor (default: 2).
executor_memory (str): Memory per executor, e.g. 1000M, 2G (default: 1G).
keytab (str): Full path to the file that contains the keytab.
master (str): spark://host:port, mesos://host:port, yarn, or local.
name (str): Name of the job.
num_executors (int): Number of executors to launch.
verbose (bool): Whether to pass the verbose flag to spark-sql.
yarn_queue (str): The YARN queue to submit to (default: "default").
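To show how several of these parameters fit together, here is a hedged example of the operator inside a DAG. The DAG ID, table, and connection ID are placeholders, not part of the original page, and the SQL string illustrates the templating noted above ({{ ds }} is rendered by Airflow at runtime).

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_sql import SparkSqlOperator

with DAG(
    dag_id="spark_sql_example",       # hypothetical DAG ID
    start_date=datetime(2021, 5, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # "sql" is a templated field: {{ ds }} is replaced with the logical date at runtime.
    aggregate_sales = SparkSqlOperator(
        task_id="aggregate_sales",
        sql=(
            "SELECT region, SUM(amount) FROM sales "
            "WHERE ds = '{{ ds }}' GROUP BY region"
        ),
        master="yarn",
        conn_id="spark_sql_default",  # placeholder connection ID
        name="aggregate_sales",
        num_executors=2,
        executor_cores=2,
        executor_memory="2G",
        yarn_queue="default",
        verbose=False,
    )
```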

Documentation

Execute Spark SQL query

See also

For more information on how to use this operator, take a look at the guide: SparkSqlOperator
