BashOperator

Apache Airflow Certified

Execute a Bash script, command or set of commands.

Last Updated: Apr. 27, 2021

Access Instructions

Install the Apache Airflow provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

bash_command (str) – The command, set of commands, or reference to a bash script (must be '.sh') to be executed. (templated)

env (dict) – If env is not None, it must be a dict that defines the environment variables for the new process; these are used instead of inheriting the current process environment, which is the default behavior. (templated)

output_encoding (str) – Output encoding of the bash command.
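The replace-rather-than-inherit semantics of the env parameter can be sketched with plain subprocess (an illustrative stand-in, not Airflow's actual implementation): when a dict is supplied, the child shell sees only those variables.

```python
import subprocess

# Mirrors the `env` parameter's behavior: the child process receives ONLY
# the variables in this dict instead of inheriting the parent environment.
result = subprocess.run(
    ["bash", "-c", 'echo "$GREETING"'],
    env={"GREETING": "hello"},
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # prints: hello
```

Because the current environment is not inherited, anything the command needs (for example PATH entries for non-builtin tools) must be listed in the dict explicitly.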

Documentation

See also

For more information on how to use this operator, take a look at the guide: BashOperator

If BaseOperator.do_xcom_push is True, the last line written to stdout will also be pushed to an XCom when the bash command completes.
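The "last line of stdout" selection can be illustrated with plain subprocess (a sketch of the behavior, not the operator itself): the value pushed to XCom corresponds to the final stdout line.

```python
import subprocess

# Sketch of the do_xcom_push behavior: BashOperator pushes the LAST line
# written to stdout; here we reproduce that selection with subprocess.
completed = subprocess.run(
    ["bash", "-c", "echo step one; echo step two; echo final value"],
    capture_output=True,
    text=True,
)
xcom_value = completed.stdout.strip().splitlines()[-1]
print(xcom_value)  # prints: final value
```

This is why a multi-line command that should return a value for downstream tasks must emit that value as its final line of output.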

When this operator executes, the task is marked up for retry if an exception is raised. However, if a sub-command exits with a non-zero value, Airflow will not recognize it as a failure unless the whole shell exits with a failure. The easiest way to achieve this is to prefix the command with set -e; for example:

bash_command = "set -e; python3 script.py '{{ next_execution_date }}'"
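The effect of the set -e prefix on the shell's exit status can be checked directly (an illustrative subprocess sketch, outside Airflow): without it, a failing sub-command is masked as long as the last command succeeds.

```python
import subprocess

# Without `set -e`, the shell's exit status is that of the LAST command,
# so the failing `false` is silently masked by the succeeding `echo`.
without = subprocess.run(
    ["bash", "-c", "false; echo done"], capture_output=True
).returncode

# With `set -e`, the shell exits immediately on the first non-zero status,
# which is what lets Airflow see the sub-command failure.
with_set_e = subprocess.run(
    ["bash", "-c", "set -e; false; echo done"], capture_output=True
).returncode

print(without, with_set_e)  # prints: 0 1
```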

Note

Add a space after the script name when directly calling a .sh script with the bash_command argument – for example bash_command="my_script.sh ". Without the trailing space, Airflow tries to load the file and process it as a Jinja template because it ends with .sh, which is likely not what most users want.

Warning

Care should be taken with “user” input or when using Jinja templates in the bash_command, as this bash operator does not perform any escaping or sanitization of the command.

This applies mostly to using "dag_run" conf, as that can be submitted by users in the Web UI. Most of the default template variables are not at risk.

For example, do not do this:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["message"] if dag_run else "" }}\'"',
)

Instead, you should pass this via the env kwarg and use double-quotes inside the bash_command, as below:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "here is the message: \'$message\'"',
    env={'message': '{{ dag_run.conf["message"] if dag_run else "" }}'},
)

Example DAGs

complex

Example Airflow DAG that shows the complex DAG structure.

bash_operator

Example DAG demonstrating the usage of the BashOperator.

task_group

Example DAG demonstrating the usage of the TaskGroup.

trigger_target_dag

Example usage of the TriggerDagRunOperator. This example holds 2 DAGs: 1. 1st DAG (example_trigger_…

xcomargs

Example DAG demonstrating the usage of the XComArgs.

test_utils

Used for unit tests

docker

docker_copy_data

This sample "listens to a directory", moves the new file and prints it, using Docker containers. The fol…

singularity

datafusion

Example Airflow DAG that shows how to use DataFusion.

kubernetes_engine

Example Airflow DAG for Google Kubernetes Engine.

natural_language

Example Airflow DAG for Google Cloud Natural Language service

translate

Example Airflow DAG that translates text in Google Cloud Translate service in the Google Cloud.

sheets

passing_params_via_test_command

Example DAG demonstrating the usage of the params arguments in templated arguments.

tutorial

### Tutorial Documentation Documentation that goes along with the Airflow tutorial located [here](h…

cloud_memorystore

Example Airflow DAG for Google Cloud Memorystore service.

mlengine

Example Airflow DAG for Google ML Engine service.

pubsub

Example Airflow DAG that uses Google PubSub services.

video_intelligence

Example Airflow DAG that demonstrates operators for the Google Cloud Video Intelligence service in …

bigquery_queries

Example Airflow DAG for Google BigQuery service.

kubernetes

This is an example dag for using the KubernetesPodOperator.

cloud_build

Example Airflow DAG that displays interactions with Google Cloud Build. This DAG relies on the foll…

datacatalog

Example Airflow DAG that interacts with Google Data Catalog service

vision

Example Airflow DAG that creates, gets, updates and deletes Products and Product Sets in the Google…

twitter_dag

This is an example dag for managing twitter data.

bigquery_operations

Example Airflow DAG for Google BigQuery service.

gcs

Example Airflow DAG for Google Cloud Storage operators.
