airflow.operators.bash
Module Contents
Classes
BashOperator – Execute a Bash script, command or set of commands.
- class airflow.operators.bash.BashOperator(*, bash_command: str, env: Optional[Dict[str, str]] = None, output_encoding: str = 'utf-8', skip_exit_code: int = 99, cwd: str = None, **kwargs)
Bases: airflow.models.BaseOperator
Execute a Bash script, command or set of commands.
See also
For more information on how to use this operator, take a look at the guide: BashOperator
If BaseOperator.do_xcom_push is True, the last line written to stdout will also be pushed to an XCom when the bash command completes.
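A minimal sketch of the XCom behavior described above (the task id, command, and downstream template are illustrative assumptions, not part of this module):
from airflow.operators.bash import BashOperator

# do_xcom_push is True by default, so the last line printed to stdout
# (the date string here) becomes this task's XCom return value.
get_date = BashOperator(
    task_id="get_date",
    bash_command="date +%Y-%m-%d",
)
A downstream templated field could then read the value with {{ ti.xcom_pull(task_ids="get_date") }}.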
- Parameters
bash_command (str) – The command, set of commands or reference to a bash script (must be ‘.sh’) to be executed. (templated)
env (dict) – If env is not None, it must be a dict that defines the environment variables for the new process; these are used instead of inheriting the current process environment, which is the default behavior. (templated)
output_encoding (str) – Output encoding of bash command
skip_exit_code (int) – If task exits with this exit code, leave the task in skipped state (default: 99). If set to None, any non-zero exit code will be treated as a failure.
cwd (str) – Working directory to execute the command in. If None (default), the command is run in a temporary directory.
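A sketch of how the parameters above might be combined in a DAG file (the script name, directory, and environment values are assumptions for illustration only):
from airflow.operators.bash import BashOperator

run_report = BashOperator(
    task_id="run_report",
    # Templated command; {{ ds }} renders to the logical date
    bash_command="python3 generate_report.py {{ ds }}",
    # Used instead of the inherited process environment (values are also templated)
    env={"REPORT_STAGE": "staging"},
    output_encoding="utf-8",
    # Exit code 99 from the command leaves the task in the skipped state
    skip_exit_code=99,
    # Run in a fixed directory instead of the default temporary one
    cwd="/opt/reports",
)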
Airflow will evaluate the exit code of the bash command. In general, a non-zero exit code will result in task failure and zero will result in task success. Exit code 99 (or another set in skip_exit_code) will throw an airflow.exceptions.AirflowSkipException, which will leave the task in skipped state. You can have all non-zero exit codes be treated as a failure by setting skip_exit_code=None.
Exit code | Behavior
0 | success
skip_exit_code (default: 99) | raise airflow.exceptions.AirflowSkipException
otherwise | failure
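For example (the commands and task ids below are hypothetical), a command that exits with the skip exit code leaves the task skipped rather than failed, while skip_exit_code=None turns every non-zero exit code into a failure:
from airflow.operators.bash import BashOperator

# Exits 99 when the input file is empty or missing, so the task ends up skipped
maybe_process = BashOperator(
    task_id="maybe_process",
    bash_command="test -s /tmp/input.csv || exit 99; echo 'processing'",
)

# With skip_exit_code=None, any non-zero exit code fails the task instead
strict_process = BashOperator(
    task_id="strict_process",
    bash_command="test -s /tmp/input.csv",
    skip_exit_code=None,
)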
Note
Airflow will not recognize a non-zero exit code unless the whole shell exits with a non-zero exit code. This can be an issue if the non-zero exit arises from a sub-command. The easiest way of addressing this is to prefix the command with set -e;
Example:
bash_command = "set -e; python3 script.py '{{ next_execution_date }}'"
Note
Add a space after the script name when directly calling a .sh script with the bash_command argument – for example bash_command="my_script.sh ". This is because Airflow tries to load this file and process it as a Jinja template when it ends with .sh, which will likely not be what most users want.
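A minimal sketch of the trailing-space workaround (the script name and task id are hypothetical):
from airflow.operators.bash import BashOperator

run_script = BashOperator(
    task_id="run_script",
    # The trailing space stops Airflow from loading "my_script.sh" as a Jinja template file
    bash_command="my_script.sh ",
)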
Warning
Care should be taken with "user" input or when using Jinja templates in the bash_command, as this bash operator does not perform any escaping or sanitization of the command. This applies mostly to using "dag_run" conf, as that can be submitted by users in the Web UI. Most of the default template variables are not at risk.
For example, do not do this:
bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["message"] if dag_run else "" }}\'"',
)
Instead, you should pass this via the env kwarg and use double-quotes inside the bash_command, as below:
bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo \"here is the message: '$message'\"",
    env={"message": '{{ dag_run.conf["message"] if dag_run else "" }}'},
)
- get_env(self, context)
Builds the set of environment variables to be exposed for the bash command.