airflow.operators.bash

Module Contents

class airflow.operators.bash.BashOperator(*, bash_command: str, env: Optional[Dict[str, str]] = None, output_encoding: str = 'utf-8', skip_exit_code: int = 99, **kwargs)[source]

Bases: airflow.models.BaseOperator

Execute a Bash script, command or set of commands.

See also

For more information on how to use this operator, take a look at the guide: BashOperator

If BaseOperator.do_xcom_push is True, the last line written to stdout will also be pushed to an XCom when the bash command completes.
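The XCom value is simply the final line written to stdout. A minimal sketch of that behavior using plain subprocess (not the operator's actual hook; the function name is illustrative):

```python
import subprocess


def last_stdout_line(bash_command: str, output_encoding: str = "utf-8") -> str:
    """Sketch: return what BashOperator would push to XCom (last stdout line)."""
    out = subprocess.run(
        ["bash", "-c", bash_command], capture_output=True
    ).stdout.decode(output_encoding)
    lines = out.splitlines()
    return lines[-1] if lines else ""
```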

Parameters
  • bash_command (str) -- The command, set of commands or reference to a bash script (must be '.sh') to be executed. (templated)

  • env (dict) -- If env is not None, it must be a dict that defines the environment variables for the new process; these are used instead of inheriting the current process environment, which is the default behavior. (templated)

  • output_encoding (str) -- Output encoding of bash command

  • skip_exit_code (int) -- If task exits with this exit code, leave the task in skipped state (default: 99). If set to None, any non-zero exit code will be treated as a failure.

Airflow will evaluate the exit code of the bash command. In general, a non-zero exit code will result in task failure and zero will result in task success. Exit code 99 (or another code set in skip_exit_code) will raise an airflow.exceptions.AirflowSkipException, which will leave the task in skipped state. You can have all non-zero exit codes treated as a failure by setting skip_exit_code=None.

Exit code                      Behavior
0                              success
skip_exit_code (default: 99)   raise airflow.exceptions.AirflowSkipException
otherwise                      raise airflow.exceptions.AirflowException
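The dispatch in the table above can be sketched with plain subprocess. The function name and string return values here are illustrative only; the real operator raises the listed exceptions rather than returning a status:

```python
import subprocess
from typing import Optional


def classify_exit(bash_command: str, skip_exit_code: Optional[int] = 99) -> str:
    """Sketch of BashOperator's exit-code handling (real code raises exceptions)."""
    returncode = subprocess.run(["bash", "-c", bash_command]).returncode
    if returncode == 0:
        return "success"  # task success
    if skip_exit_code is not None and returncode == skip_exit_code:
        return "skipped"  # BashOperator raises AirflowSkipException here
    return "failed"       # BashOperator raises AirflowException here
```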

Note

Airflow will not recognize a non-zero exit code unless the whole shell exits with a non-zero exit code. This can be an issue if the non-zero exit arises from a sub-command. The easiest way of addressing this is to prefix the command with set -e;

Example:

    bash_command = "set -e; python3 script.py '{{ next_execution_date }}'"

Note

Add a space after the script name when directly calling a .sh script with the bash_command argument -- for example bash_command="my_script.sh ". This is because Airflow tries to load this file and process it as a Jinja template when it ends with .sh, which will likely not be what most users want.
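The trailing-space trick works because the template-loading decision keys off the string's extension (see template_ext below). A hypothetical helper illustrating the idea -- not Airflow's actual internal check:

```python
def is_templated_file(bash_command: str, template_ext=(".sh", ".bash")) -> bool:
    """Hypothetical sketch: Airflow loads and Jinja-renders the string as a
    file only when it ends with one of template_ext; a trailing space defeats
    the endswith check, so the command is executed as a literal bash command."""
    return bash_command.endswith(template_ext)
```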

Warning

Care should be taken with "user" input or when using Jinja templates in the bash_command, as this bash operator does not perform any escaping or sanitization of the command.

This applies mostly to using "dag_run" conf, as that can be submitted by users via the Web UI. Most of the default template variables are not at risk.

For example, do not do this:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["message"] if dag_run else "" }}\'"',
)

Instead, you should pass this via the env kwarg and use double-quotes inside the bash_command, as below:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "here is the message: \'$message\'"',
    env={'message': '{{ dag_run.conf["message"] if dag_run else "" }}'},
)
template_fields = ['bash_command', 'env'][source]
template_fields_renderers[source]
template_ext = ['.sh', '.bash'][source]
ui_color = '#f0ede4'[source]
subprocess_hook(self)[source]

Returns hook for running the bash command

get_env(self, context)[source]

Builds the set of environment variables to be exposed for the bash command

execute(self, context)[source]
on_kill(self)[source]
