Module Contents



Dataflow trigger to check if a templated job has finished.


DEFAULT_DATAFLOW_LOCATION = 'us-central1'[source]
class TemplateJobStartTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None, cancel_timeout=5 * 60)[source]

Bases: airflow.triggers.base.BaseTrigger

Dataflow trigger to check if a templated job has finished.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • job_id (str) – Required. ID of the job.

  • location (str) – Optional. The location where the job is executed. If set to None, the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use connecting to Google Cloud.

  • poll_sleep (int) – Optional. The time in seconds to wait between successive checks of the job status.

  • impersonation_chain (str | Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • cancel_timeout (int | None) – Optional. How long (in seconds) the operator should wait for the pipeline to be successfully cancelled when the task is killed.
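
As a hedged illustration of the parameters above, the following sketch assembles the keyword arguments a deferrable operator might pass when deferring to this trigger. The job ID and project name are hypothetical placeholders, and the defer call in the comment is only the usual deferrable-operator pattern, not code from this module.

```python
# Hedged sketch: keyword arguments mirroring the constructor documented above.
# "my-gcp-project" and the job ID are hypothetical placeholder values.
trigger_kwargs = {
    "job_id": "2024-01-01_00_00_00-1234567890123456789",  # hypothetical Dataflow job ID
    "project_id": "my-gcp-project",                        # hypothetical project
    "location": "us-central1",                             # DEFAULT_DATAFLOW_LOCATION
    "gcp_conn_id": "google_cloud_default",
    "poll_sleep": 10,
    "impersonation_chain": None,
    "cancel_timeout": 5 * 60,                              # seconds to wait on cancel
}

# In a deferrable operator, these would typically be used along the lines of:
#   self.defer(trigger=TemplateJobStartTrigger(**trigger_kwargs),
#              method_name="execute_complete")
```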


serialize()[source]

Serializes class arguments and classpath.
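
BaseTrigger subclasses serialize to a (classpath, kwargs) tuple so the triggerer process can recreate the trigger. The sketch below assumes the serialized field set matches the constructor parameters documented above; it is an illustration of the contract, not the actual implementation.

```python
# Hedged sketch of the (classpath, kwargs) serialization contract.
# The exact field set is an assumption based on the constructor above.
def serialize_sketch(job_id, project_id, location="us-central1",
                     gcp_conn_id="google_cloud_default", poll_sleep=10,
                     impersonation_chain=None, cancel_timeout=5 * 60):
    classpath = (
        "airflow.providers.google.cloud.triggers.dataflow.TemplateJobStartTrigger"
    )
    kwargs = {
        "job_id": job_id,
        "project_id": project_id,
        "location": location,
        "gcp_conn_id": gcp_conn_id,
        "poll_sleep": poll_sleep,
        "impersonation_chain": impersonation_chain,
        "cancel_timeout": cancel_timeout,
    }
    return classpath, kwargs
```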

async run()[source]

Main loop of the class, in which it fetches the job status and yields a TriggerEvent.

If the job has succeeded, it yields a TriggerEvent with success status; if the job has failed, a TriggerEvent with error status. In any other case, the trigger waits for the number of seconds stored in self.poll_sleep before polling again.
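
The polling loop described above can be sketched as an async generator. The stub get_job_status coroutine stands in for the real Dataflow hook call, and the event dictionaries stand in for TriggerEvent payloads; the terminal state names follow the Dataflow JobState enum, but the payload shape here is an assumption for illustration.

```python
import asyncio

# Terminal Dataflow job states (from the Dataflow JobState enum).
JOB_STATE_DONE = "JOB_STATE_DONE"
JOB_STATE_FAILED = "JOB_STATE_FAILED"


async def run_sketch(get_job_status, job_id, poll_sleep=10):
    """Hedged sketch of the run() loop: poll, yield on a terminal state,
    otherwise sleep poll_sleep seconds and poll again."""
    while True:
        status = await get_job_status(job_id)
        if status == JOB_STATE_DONE:
            # Stands in for a TriggerEvent with success status.
            yield {"status": "success", "job_id": job_id}
            return
        if status == JOB_STATE_FAILED:
            # Stands in for a TriggerEvent with error status.
            yield {"status": "error", "job_id": job_id}
            return
        await asyncio.sleep(poll_sleep)


async def demo():
    # Stub hook: the job is running on the first poll, done on the second.
    states = iter(["JOB_STATE_RUNNING", JOB_STATE_DONE])

    async def stub_status(job_id):
        return next(states)

    return [event async for event in run_sketch(stub_status, "job-1", poll_sleep=0)]
```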
