airflow.providers.amazon.aws.sensors.emr_containers

Module Contents

class airflow.providers.amazon.aws.sensors.emr_containers.EMRContainerSensor(*, virtual_cluster_id: str, job_id: str, max_retries: Optional[int] = None, aws_conn_id: str = 'aws_default', poll_interval: int = 10, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Polls the state of the job run until it reaches a failure state or a success state. If the job run fails, the task will fail.

Parameters
  • virtual_cluster_id (str) -- ID of the EMR on EKS virtual cluster that owns the job run

  • job_id (str) -- job_id to check the state of

  • max_retries (int) -- Number of times to poll for query state before returning the current state, defaults to None

  • aws_conn_id (str) -- aws connection to use, defaults to 'aws_default'

  • poll_interval (int) -- Time in seconds to wait between two consecutive calls to check the job run status, defaults to 10
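
A minimal usage sketch (not taken from the provider docs): the virtual cluster and job run IDs below are placeholders, and in a real DAG the job_id would typically be templated from the task that submitted the run.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.sensors.emr_containers import EMRContainerSensor

with DAG(
    dag_id="emr_eks_sensor_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    wait_for_job = EMRContainerSensor(
        task_id="wait_for_job_run",
        virtual_cluster_id="vc-1234567890abcdef0",  # placeholder virtual cluster id
        job_id="jr-0123456789abcdef0",  # placeholder job run id
        aws_conn_id="aws_default",
        poll_interval=30,  # check the job run state every 30 seconds
        max_retries=100,  # poll up to 100 times before returning the current state
    )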

INTERMEDIATE_STATES = ['PENDING', 'SUBMITTED', 'RUNNING'][source]
FAILURE_STATES = ['FAILED', 'CANCELLED', 'CANCEL_PENDING'][source]
SUCCESS_STATES = ['COMPLETED'][source]
template_fields = ['virtual_cluster_id', 'job_id'][source]
template_ext = [][source]
ui_color = '#66c3ff'[source]
poke(self, context: dict)[source]
hook(self)[source]

Create and return an EMRContainerHook
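
The poke method checks the job run state each time the sensor fires. The following is an illustrative sketch of that logic based on the state lists above, not the provider's actual source; the poll_query_status hook method name is an assumption about the hook's interface.

from airflow.exceptions import AirflowException


def poke(self, context: dict) -> bool:
    # Fetch the current job run state, polling up to max_retries times.
    state = self.hook.poll_query_status(self.job_id, self.max_retries)
    if state in self.FAILURE_STATES:
        # FAILED/CANCELLED/CANCEL_PENDING: fail the sensor task.
        raise AirflowException('EMR Containers sensor failed')
    if state in self.INTERMEDIATE_STATES:
        # PENDING/SUBMITTED/RUNNING: poke again after poll_interval seconds.
        return False
    # COMPLETED: the sensor succeeds.
    return True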
