airflow.providers.ssh.triggers.ssh_remote_job

SSH Remote Job Trigger for deferrable execution.

Classes

SSHRemoteJobTrigger

Trigger that monitors a remote SSH job and streams logs.

Module Contents

class airflow.providers.ssh.triggers.ssh_remote_job.SSHRemoteJobTrigger(ssh_conn_id, remote_host, job_id, job_dir, log_file, exit_code_file, remote_os, poll_interval=5, log_chunk_size=65536, log_offset=0, command_timeout=30.0)[source]

Bases: airflow.triggers.base.BaseTrigger

Trigger that monitors a remote SSH job and streams logs.

This trigger polls the remote host to check job completion status and reads log output incrementally.

Parameters:
  • ssh_conn_id (str) – SSH connection ID from Airflow Connections

  • remote_host (str | None) – Optional override for the remote host

  • job_id (str) – Unique identifier for the remote job

  • job_dir (str) – Remote directory containing job artifacts

  • log_file (str) – Path to the log file on the remote host

  • exit_code_file (str) – Path to the exit code file on the remote host

  • remote_os (Literal['posix', 'windows']) – Operating system of the remote host (‘posix’ or ‘windows’)

  • poll_interval (int) – Seconds between polling attempts

  • log_chunk_size (int) – Maximum bytes to read per poll

  • log_offset (int) – Current byte offset in the log file

  • command_timeout (float) – Timeout in seconds for each SSH command issued while polling

ssh_conn_id[source]
remote_host[source]
job_id[source]
job_dir[source]
log_file[source]
exit_code_file[source]
remote_os[source]
poll_interval = 5[source]
log_chunk_size = 65536[source]
log_offset = 0[source]
command_timeout = 30.0[source]
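The log_offset and log_chunk_size attributes together implement incremental log streaming: each poll reads at most log_chunk_size bytes starting at log_offset, then advances the offset by the number of bytes actually read. A minimal local sketch of that bookkeeping (the trigger itself performs the read over SSH, not on the local filesystem; the helper name here is hypothetical):

```python
def read_log_chunk(path: str, offset: int, chunk_size: int = 65536) -> tuple[bytes, int]:
    """Read up to chunk_size bytes starting at offset; return (data, new_offset).

    Hypothetical local illustration of the log_offset/log_chunk_size
    semantics described above, not the trigger's actual remote read.
    """
    with open(path, "rb") as f:
        f.seek(offset)          # resume where the previous poll stopped
        data = f.read(chunk_size)
    return data, offset + len(data)
```

Carrying the returned offset into the next call is what prevents the same log bytes from being emitted twice across polls.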
serialize()[source]

Serialize the trigger for storage.
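By the BaseTrigger contract, serialize() returns the trigger's import path together with the keyword arguments needed to reconstruct it in the triggerer process. An airflow-free sketch of that pattern, assuming the kwargs simply mirror the constructor parameters listed above (the class name here is hypothetical):

```python
CLASSPATH = "airflow.providers.ssh.triggers.ssh_remote_job.SSHRemoteJobTrigger"

class SSHRemoteJobTriggerSketch:
    """Hedged, airflow-free illustration of the (classpath, kwargs) contract."""

    def __init__(self, ssh_conn_id, job_id, job_dir, log_file, exit_code_file,
                 remote_os, remote_host=None, poll_interval=5,
                 log_chunk_size=65536, log_offset=0, command_timeout=30.0):
        # Store every constructor argument so serialize() can round-trip it.
        self._kwargs = {
            "ssh_conn_id": ssh_conn_id,
            "remote_host": remote_host,
            "job_id": job_id,
            "job_dir": job_dir,
            "log_file": log_file,
            "exit_code_file": exit_code_file,
            "remote_os": remote_os,
            "poll_interval": poll_interval,
            "log_chunk_size": log_chunk_size,
            "log_offset": log_offset,
            "command_timeout": command_timeout,
        }

    def serialize(self) -> tuple[str, dict]:
        # Classpath + kwargs is what the triggerer persists and later
        # uses to re-instantiate the trigger.
        return (CLASSPATH, dict(self._kwargs))
```

Serializing the current log_offset is what lets a resumed trigger continue streaming from where the previous one stopped.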

async run()[source]

Poll the remote job status and yield events with log chunks.

This method runs in a loop, checking the job status and reading logs at each poll interval. It yields a TriggerEvent each time with the current status and any new log output.
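The loop described above can be simulated locally: each cycle reads any new log bytes, checks whether the exit-code file has appeared, and yields an event; once the exit code exists, a final event is emitted and the loop ends. This is a hedged local stand-in (plain dicts against the local filesystem; the function name is hypothetical), not the trigger's actual SSH-based implementation, which yields TriggerEvent objects:

```python
import asyncio
import os

async def poll_job(log_file: str, exit_code_file: str,
                   poll_interval: float = 0.01, chunk_size: int = 65536):
    """Yield one status/log event per poll cycle; stop when the exit code appears."""
    offset = 0
    while True:
        chunk = b""
        if os.path.exists(log_file):
            with open(log_file, "rb") as f:
                f.seek(offset)              # read only bytes new since last poll
                chunk = f.read(chunk_size)
                offset += len(chunk)
        if os.path.exists(exit_code_file):
            with open(exit_code_file) as f:
                code = int(f.read().strip() or 0)
            # Final event: job finished, include any last log output.
            yield {"status": "success" if code == 0 else "failed",
                   "exit_code": code, "log": chunk.decode(errors="replace")}
            return
        # Intermediate event: job still running, stream what we have so far.
        yield {"status": "running", "log": chunk.decode(errors="replace")}
        await asyncio.sleep(poll_interval)
```

Yielding intermediate events rather than only a terminal one is what allows the operator to surface remote logs while the job is still running.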
