Module Contents

class airflow.providers.google.cloud.sensors.workflows.WorkflowExecutionSensor(*, workflow_id: str, execution_id: str, location: str, project_id: str, success_states: Optional[Set[Execution.State]] = None, failure_states: Optional[Set[Execution.State]] = None, retry: Optional[Retry] = None, request_timeout: Optional[float] = None, metadata: Optional[Sequence[Tuple[str, str]]] = None, gcp_conn_id: str = 'google_cloud_default', impersonation_chain: Optional[Union[str, Sequence[str]]] = None, **kwargs)

Bases: airflow.sensors.base.BaseSensorOperator

Checks the state of an execution for the given workflow_id and execution_id.

  • workflow_id (str) -- Required. The ID of the workflow.

  • execution_id (str) -- Required. The ID of the execution.

  • project_id (str) -- Required. The ID of the Google Cloud project the workflow belongs to.

  • location (str) -- Required. The Google Cloud region in which to handle the request.

  • success_states (Optional[Set[Execution.State]]) -- Execution states to be considered successful; by default, only the SUCCEEDED state.

  • failure_states (Optional[Set[Execution.State]]) -- Execution states to be considered failures; by default, the FAILED and CANCELLED states.

  • retry (google.api_core.retry.Retry) -- A retry object used to retry requests. If None is specified, requests will not be retried.

  • request_timeout (float) -- The amount of time, in seconds, to wait for the request to complete. Note that if retry is specified, the timeout applies to each individual attempt.

  • metadata (Sequence[Tuple[str, str]]) -- Additional metadata that is provided to the method.

  • gcp_conn_id (str) -- The connection ID used to connect to Google Cloud.

  • impersonation_chain (Optional[Union[str, Sequence[str]]]) -- Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, each identity in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account.

template_fields = ['location', 'workflow_id', 'execution_id']

poke(self, context)
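The poke() behavior described above can be sketched as follows. This is a minimal, self-contained approximation, not the provider's actual implementation: the State enum stands in for google.cloud.workflows.executions_v1beta.Execution.State, and RuntimeError stands in for Airflow's AirflowException raised on a failure state.

```python
from enum import Enum
from typing import Optional, Set


class State(Enum):
    """Placeholder for Execution.State from the Cloud Workflows client."""
    ACTIVE = 1
    SUCCEEDED = 2
    FAILED = 3
    CANCELLED = 4


def poke(current_state: State,
         success_states: Optional[Set[State]] = None,
         failure_states: Optional[Set[State]] = None) -> bool:
    """Approximate the sensor's poke(): return True when the execution has
    reached a success state, False (keep poking) while it is still running,
    and raise on a failure state."""
    # Defaults mirror the documented behavior above.
    success_states = success_states or {State.SUCCEEDED}
    failure_states = failure_states or {State.FAILED, State.CANCELLED}

    if current_state in failure_states:
        raise RuntimeError(f"Execution ended in failure state: {current_state}")
    if current_state in success_states:
        return True  # sensor condition met; the task succeeds
    return False  # still running; the scheduler pokes again after poke_interval
```

In the real sensor, the current state comes from a hook call against the Workflows Executions API using the configured gcp_conn_id; the retry, request_timeout, and metadata parameters are forwarded to that call.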
