airflow.providers.google.cloud.sensors.dataproc

This module contains a Dataproc Job sensor.

Module Contents

Classes

DataprocJobSensor

Check for the state of a previously submitted Dataproc job.

class airflow.providers.google.cloud.sensors.dataproc.DataprocJobSensor(*, project_id, dataproc_job_id, region=None, location=None, gcp_conn_id='google_cloud_default', wait_timeout=None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Check for the state of a previously submitted Dataproc job.

Parameters
  • project_id (str) -- The ID of the Google Cloud project the Dataproc job belongs to. (templated)

  • dataproc_job_id (str) -- The Dataproc job ID to poll. (templated)

  • region (Optional[str]) -- Required. The Cloud Dataproc region in which to handle the request. (templated)

  • location (Optional[str]) -- (To be deprecated in favor of region.) The Cloud Dataproc region in which to handle the request. (templated)

  • gcp_conn_id (str) -- The connection ID to use when connecting to Google Cloud Platform.

  • wait_timeout (Optional[int]) -- How many seconds to wait for the job to be ready before timing out.
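The wait_timeout parameter governs how long the sensor tolerates the job not yet being visible before giving up. The following is a minimal, self-contained sketch of that behavior; fetch_job, JobNotFound, and poke_with_timeout are hypothetical stand-ins, not the sensor's real hook API.

```python
import time

class JobNotFound(Exception):
    """Placeholder for 'the submitted job cannot be fetched yet'."""

def poke_with_timeout(fetch_job, start_time, wait_timeout, now=None):
    """One poke: True if the job is visible, False to poke again, or raise.

    If the job still cannot be fetched and more than wait_timeout seconds
    have elapsed since the first poke, give up with an error instead of
    poking forever.
    """
    now = time.monotonic() if now is None else now
    try:
        fetch_job()
    except JobNotFound:
        if wait_timeout is not None and now - start_time > wait_timeout:
            raise TimeoutError(
                f"Timed out after {wait_timeout}s waiting for the job to be created"
            )
        return False  # job not visible yet; the sensor will poke again later
    return True

# Illustration: the job only becomes visible on the second poke.
_calls = {"n": 0}

def fetch_job_eventually():
    _calls["n"] += 1
    if _calls["n"] < 2:
        raise JobNotFound

def fetch_job_never():
    raise JobNotFound
```

With wait_timeout=None the sketch never raises on a missing job and simply keeps returning False, mirroring a sensor that relies solely on its own overall timeout.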

template_fields :Sequence[str] = ['project_id', 'region', 'dataproc_job_id'][source]
ui_color = #f0eee4[source]
execute(self, context)[source]

This is the main method to override when creating an operator. Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.

poke(self, context)[source]

Function that sensors deriving this class should override.
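For a job sensor, poke boils down to fetching the job's current state and mapping it to the sensor contract: return True on success, False to keep poking, raise on a terminal failure. Below is a hedged sketch of that mapping; the state names follow Dataproc's JobStatus.State enum, while poke_job_state and SensorFailure are illustrative stand-ins (the real sensor raises AirflowException via its hook).

```python
# Terminal states that should fail the sensor rather than keep it waiting.
TERMINAL_FAILURES = {"ERROR", "CANCELLED"}

class SensorFailure(Exception):
    """Stand-in for AirflowException in this sketch."""

def poke_job_state(state):
    """Map a Dataproc job state to the sensor's poke result."""
    if state in TERMINAL_FAILURES:
        raise SensorFailure(f"Job failed with state {state!r}")
    if state == "DONE":
        return True  # success: the sensor stops poking
    # e.g. PENDING, SETUP_DONE, RUNNING -> not finished, poke again
    return False
```

The base sensor's execute loop repeatedly calls poke until it returns True or the sensor's overall timeout expires, so returning False here simply schedules another check.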
