Module Contents

class airflow.providers.ssh.operators.ssh.SSHOperator(*, ssh_hook: Optional[SSHHook] = None, ssh_conn_id: Optional[str] = None, remote_host: Optional[str] = None, command: Optional[str] = None, timeout: int = 10, environment: Optional[dict] = None, get_pty: bool = False, **kwargs)

Bases: airflow.models.BaseOperator

SSHOperator executes commands on a given remote host using the ssh_hook.

Parameters

  • ssh_hook (airflow.providers.ssh.hooks.ssh.SSHHook) -- predefined ssh_hook to use for remote execution. Either ssh_hook or ssh_conn_id needs to be provided.

  • ssh_conn_id (str) -- ssh connection id from airflow Connections. ssh_conn_id will be ignored if ssh_hook is provided.

  • remote_host (str) -- remote host to connect to. (templated) Nullable; if provided, it overrides the remote_host defined in ssh_hook or in the connection referenced by ssh_conn_id.

  • command (str) -- command to execute on remote host. (templated)

  • timeout (int) -- timeout (in seconds) for executing the command. The default is 10 seconds.

  • environment (dict) -- a dict of shell environment variables. Note that the server will silently drop them unless AcceptEnv is set for them in the server's SSH config.

  • get_pty (bool) -- request a pseudo-terminal from the server. Set to True to have the remote process killed upon task timeout. The default is False; note that get_pty is forced to True when the command starts with sudo.
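The precedence rules above (ssh_hook vs. ssh_conn_id, the remote_host override, and get_pty being forced for sudo) can be sketched as plain Python. This is a stdlib-only illustration of the documented behavior, not Airflow's actual implementation; the function name is hypothetical.

```python
# Sketch of the documented SSHOperator argument rules; names are illustrative,
# this is NOT Airflow's implementation.

def resolve_ssh_args(ssh_hook=None, ssh_conn_id=None, remote_host=None,
                     command=None, timeout=10, get_pty=False):
    """Mirror the documented precedence rules for SSHOperator arguments."""
    if ssh_hook is None and ssh_conn_id is None:
        # Either ssh_hook or ssh_conn_id needs to be provided.
        raise ValueError("ssh_hook or ssh_conn_id needs to be provided")
    if ssh_hook is not None:
        # ssh_conn_id is ignored when ssh_hook is provided.
        ssh_conn_id = None
    if command is not None and command.startswith("sudo"):
        # get_pty is forced to True when the command starts with sudo.
        get_pty = True
    return {
        "ssh_conn_id": ssh_conn_id,
        "remote_host": remote_host,  # overrides the hook/connection host if set
        "command": command,
        "timeout": timeout,          # seconds; default is 10
        "get_pty": get_pty,
    }
```

For example, `resolve_ssh_args(ssh_conn_id="ssh_default", command="sudo reboot")` yields `get_pty=True` even though it was not requested.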

template_fields = ['command', 'remote_host']
template_ext = ['.sh']
execute(self, context)

Open the SSH connection and execute the command on the remote host.
