airflow.providers.google.cloud.sensors.dataproc

This module contains a Dataproc Job sensor.

Module Contents

Classes

DataprocJobSensor

Check for the state of a previously submitted Dataproc job.

class airflow.providers.google.cloud.sensors.dataproc.DataprocJobSensor(*, project_id: str, dataproc_job_id: str, region: Optional[str] = None, location: Optional[str] = None, gcp_conn_id: str = 'google_cloud_default', wait_timeout: Optional[int] = None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Check for the state of a previously submitted Dataproc job.

Parameters
  • project_id (str) -- The ID of the Google Cloud project in which the job is running. (templated)

  • dataproc_job_id (str) -- The Dataproc job ID to poll. (templated)

  • region (str) -- Required. The Cloud Dataproc region in which to handle the request. (templated)

  • location (str) -- (Deprecated; use region instead.) The Cloud Dataproc region in which to handle the request. (templated)

  • gcp_conn_id (str) -- The connection ID to use connecting to Google Cloud Platform.

  • wait_timeout (int) -- How many seconds to wait for the job to be ready.
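A minimal usage sketch, assuming the job was submitted earlier in the same DAG (for example by a DataprocSubmitJobOperator) and its ID is pulled via XCom; the DAG ID, task IDs, project, and region below are illustrative, not part of this API:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.sensors.dataproc import DataprocJobSensor

with DAG(
    dag_id="dataproc_job_sensor_example",  # illustrative DAG id
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Wait for a job submitted elsewhere; the XCom pull from a
    # hypothetical 'submit_job' task is illustrative.
    wait_for_job = DataprocJobSensor(
        task_id="wait_for_dataproc_job",
        project_id="my-gcp-project",  # illustrative
        region="europe-west1",        # illustrative
        dataproc_job_id="{{ ti.xcom_pull(task_ids='submit_job') }}",
        poke_interval=30,  # inherited from BaseSensorOperator
    )
```

The templated fields (project_id, region, dataproc_job_id) are rendered with Jinja before the first poke, which is why the job ID can be an XCom expression.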

template_fields :Sequence[str] = ['project_id', 'region', 'dataproc_job_id'][source]
ui_color = '#f0eee4'[source]
execute(self, context: airflow.utils.context.Context) -> None[source]

This is the main method to derive when creating an operator. The context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.

poke(self, context: airflow.utils.context.Context) -> bool[source]

Override of the BaseSensorOperator hook method: check the state of the Dataproc job and return True once it has completed.
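Conceptually, poke maps the job's current state to a boolean (keep waiting or done) or an error. A simplified, stdlib-only sketch of that state handling, assuming state names from the Dataproc JobStatus.State enum; evaluate_job_state is a hypothetical stand-in for the sensor's internal logic, and the local AirflowException class stands in for airflow.exceptions.AirflowException:

```python
class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


def evaluate_job_state(state: str, job_id: str) -> bool:
    """Sketch of poke-style state handling for a Dataproc job."""
    # Terminal failure states abort the sensor immediately.
    if state == "ERROR":
        raise AirflowException(f"Job failed:\n{job_id}")
    if state == "CANCELLED":
        raise AirflowException(f"Job was cancelled:\n{job_id}")
    # DONE means the condition is met; the sensor succeeds.
    if state == "DONE":
        return True
    # Any other state (PENDING, RUNNING, ...) keeps the sensor poking.
    return False
```

The scheduler calls poke repeatedly, every poke_interval seconds, until it returns True or the sensor times out.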
