airflow.providers.google.cloud.sensors.workflows

Module Contents

Classes

WorkflowExecutionSensor

Checks state of an execution for the given workflow_id and execution_id.

class airflow.providers.google.cloud.sensors.workflows.WorkflowExecutionSensor(*, workflow_id, execution_id, location, project_id, success_states=None, failure_states=None, retry=DEFAULT, request_timeout=None, metadata=(), gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks state of an execution for the given workflow_id and execution_id.

Parameters
  • workflow_id (str) -- Required. The ID of the workflow.

  • execution_id (str) -- Required. The ID of the execution.

  • project_id (str) -- Required. The ID of the Google Cloud project that the workflow belongs to.

  • location (str) -- Required. The Google Cloud region in which to handle the request.

  • success_states (Optional[Set[google.cloud.workflows.executions_v1beta.Execution.State]]) -- Execution states to be considered successful; by default only the SUCCEEDED state.

  • failure_states (Optional[Set[google.cloud.workflows.executions_v1beta.Execution.State]]) -- Execution states to be considered failures; by default the FAILED and CANCELLED states.

  • retry (Union[google.api_core.retry.Retry, google.api_core.gapic_v1.method._MethodDefault]) -- A retry object used to retry requests. If None is specified, requests will not be retried.

  • request_timeout (Optional[float]) -- The amount of time, in seconds, to wait for the request to complete. Note that if retry is specified, the timeout applies to each individual attempt.

  • metadata (Sequence[Tuple[str, str]]) -- Additional metadata that is provided to the method.

template_fields: Sequence[str] = ['location', 'workflow_id', 'execution_id'][source]
poke(self, context)[source]

Checks the state of the execution; returns True once the execution reaches one of the success_states.
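A minimal DAG sketch showing how the sensor might be wired up. The project, location, workflow, and execution IDs below are placeholder values, not part of the provider's API; in practice the execution ID is often pulled from XCom after an operator that creates the execution.

```python
# Sketch: wait for a Cloud Workflows execution to finish.
# All IDs below are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.sensors.workflows import WorkflowExecutionSensor

with DAG(
    dag_id="example_workflow_execution_sensor",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    wait_for_execution = WorkflowExecutionSensor(
        task_id="wait_for_execution",
        project_id="my-project",       # placeholder
        location="us-central1",        # placeholder
        workflow_id="my-workflow",     # placeholder
        execution_id="my-execution",   # placeholder; a templated value works too
        poke_interval=30,              # re-check every 30 seconds in poke mode
    )
```

Because `location`, `workflow_id`, and `execution_id` are template fields, any of them can be a Jinja expression resolved at runtime.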
