Module Contents



Cloud Batch trigger to check whether a templated job has finished.


DEFAULT_BATCH_LOCATION = 'us-central1'[source]

class CloudBatchJobFinishedTrigger(job_name, project_id, location=DEFAULT_BATCH_LOCATION, gcp_conn_id='google_cloud_default', impersonation_chain=None, polling_period_seconds=10, timeout=None)[source]

Bases: airflow.triggers.base.BaseTrigger

Cloud Batch trigger to check whether a templated job has finished.

Parameters:
  • job_name (str) – Required. Name of the job.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – Optional. The location where the job is executed. If set to None, the value of DEFAULT_BATCH_LOCATION is used.

  • gcp_conn_id (str) – The connection ID to use connecting to Google Cloud.

  • impersonation_chain (str | Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • polling_period_seconds (float) – Optional. Polling period in seconds to check for the job status.
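As a hedged sketch of the parameter list above, the stand-in class below shows which arguments are required and which defaults apply; it is an illustrative mock, not the real Airflow class.

```python
# Illustrative stand-in for the trigger's constructor; it only mirrors
# the parameters and defaults documented above and stores them.
DEFAULT_BATCH_LOCATION = "us-central1"

class CloudBatchJobFinishedTriggerSketch:
    def __init__(
        self,
        job_name,
        project_id,
        location=DEFAULT_BATCH_LOCATION,
        gcp_conn_id="google_cloud_default",
        impersonation_chain=None,
        polling_period_seconds=10,
        timeout=None,
    ):
        self.job_name = job_name
        self.project_id = project_id
        self.location = location
        self.gcp_conn_id = gcp_conn_id
        self.impersonation_chain = impersonation_chain
        self.polling_period_seconds = polling_period_seconds
        self.timeout = timeout

# Typical instantiation: only job_name and project_id are required.
trigger = CloudBatchJobFinishedTriggerSketch(
    job_name="my-batch-job", project_id="my-gcp-project"
)
```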


serialize()[source]

Serialize class arguments and classpath.
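Serialization follows the BaseTrigger contract of returning a (classpath, kwargs) tuple so the triggerer process can re-instantiate the trigger. The helper below is a hedged sketch of that shape, not the actual method body:

```python
# Hedged sketch of serialize(): a BaseTrigger returns a (classpath, kwargs)
# tuple; the kwargs mirror the constructor parameters documented above.
def serialize_trigger(
    job_name,
    project_id,
    location="us-central1",
    gcp_conn_id="google_cloud_default",
    impersonation_chain=None,
    polling_period_seconds=10,
    timeout=None,
):
    classpath = (
        "airflow.providers.google.cloud.triggers.cloud_batch."
        "CloudBatchJobFinishedTrigger"
    )
    kwargs = {
        "job_name": job_name,
        "project_id": project_id,
        "location": location,
        "gcp_conn_id": gcp_conn_id,
        "impersonation_chain": impersonation_chain,
        "polling_period_seconds": polling_period_seconds,
        "timeout": timeout,
    }
    return classpath, kwargs
```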

async run()[source]

Fetch the job status or yield certain Events.

Main loop of the class, in which it fetches the job status and yields the corresponding Event.

If the job has succeeded, it yields a TriggerEvent with success status; if the job has failed, one with error status; and if the job is being deleted, one with deleted status. In any other case, the trigger waits for the interval stored in the self.polling_period_seconds variable and polls again.
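The polling loop described above can be sketched self-contained as follows; the enum values and function names are illustrative stand-ins for the real Google Batch job states and the trigger's internals, but the status-to-event mapping matches the description:

```python
import asyncio
from enum import Enum

class JobStatusState(Enum):
    # Stand-ins for Google Batch job states; names are illustrative.
    RUNNING = "RUNNING"
    SUCCEEDED = "SUCCEEDED"
    FAILED = "FAILED"
    DELETION_IN_PROGRESS = "DELETION_IN_PROGRESS"

def event_for_state(state, job_name):
    """Map a terminal job state to the TriggerEvent payload described
    above; return None for non-terminal states (keep polling)."""
    if state is JobStatusState.SUCCEEDED:
        return {"status": "success", "job_name": job_name}
    if state is JobStatusState.FAILED:
        return {"status": "error", "job_name": job_name}
    if state is JobStatusState.DELETION_IN_PROGRESS:
        return {"status": "deleted", "job_name": job_name}
    return None

async def run_sketch(polled_states, job_name, polling_period_seconds=0.01):
    """Sketch of the trigger's main loop: poll until a terminal state,
    sleeping polling_period_seconds between polls."""
    for state in polled_states:  # stands in for repeated Batch API calls
        event = event_for_state(state, job_name)
        if event is not None:
            return event
        await asyncio.sleep(polling_period_seconds)
    return None  # the real trigger would keep polling (or time out)
```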
