airflow.contrib.operators.ssh_operator

Module Contents

class airflow.contrib.operators.ssh_operator.SSHOperator(ssh_hook=None, ssh_conn_id=None, remote_host=None, command=None, timeout=10, do_xcom_push=False, environment=None, get_pty=False, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

SSHOperator executes commands on a given remote host using the ssh_hook.

Parameters
  • ssh_hook (airflow.contrib.hooks.ssh_hook.SSHHook) – predefined ssh_hook to use for remote execution. Either ssh_hook or ssh_conn_id needs to be provided.

  • ssh_conn_id (str) – connection id from airflow Connections. ssh_conn_id will be ignored if ssh_hook is provided.

  • remote_host (str) – remote host to connect to (templated). Optional; if provided, it replaces the remote_host defined in ssh_hook or predefined in the connection identified by ssh_conn_id.

  • command (str) – command to execute on remote host. (templated)

  • timeout (int) – timeout (in seconds) for executing the command. The default is 10 seconds.

  • environment (dict) – a dict of shell environment variables to pass to the remote session. Note that the server will silently ignore them if AcceptEnv is not set in its SSH config.

  • do_xcom_push (bool) – if True, return the command's stdout, which is also pushed to XCom by the Airflow platform.

  • get_pty (bool) – request a pseudo-terminal from the server. Set to True to have the remote process killed upon task timeout. The default is False but note that get_pty is forced to True when the command starts with sudo.

template_fields = ['command', 'remote_host'][source]
template_ext = ['.sh'][source]
execute(self, context)[source]
tunnel(self)[source]
