AWS Batch Operators

AWS Batch enables you to run batch computing workloads on the AWS Cloud. Batch computing is a common way for developers, scientists, and engineers to access large amounts of compute resources. AWS Batch removes the undifferentiated heavy lifting of configuring and managing the required infrastructure.

Prerequisite Tasks

To use these operators, you must do a few things:

- Create the necessary AWS Batch resources (compute environment, job queue, and job definition) using the AWS Console or AWS CLI.
- Install the Amazon provider package, for example with pip install 'apache-airflow[amazon]'.
- Set up an AWS connection in Airflow.

AWS Batch Sensor

To wait on the state of an AWS Batch job until it reaches a terminal state, you can use BatchSensor.

airflow/providers/amazon/aws/example_dags/example_batch.py

from airflow.providers.amazon.aws.sensors.batch import BatchSensor

# Wait for the Batch job identified by JOB_ID to reach a terminal state.
wait_for_batch_job = BatchSensor(
    task_id='wait_for_batch_job',
    job_id=JOB_ID,
)
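
The sensor also accepts the standard tuning arguments inherited from BaseSensorOperator. The snippet below is a minimal sketch, not part of the example DAG, showing how polling frequency, timeout, and rescheduling might be configured; the task name, job ID, and connection ID are placeholders.

from airflow.providers.amazon.aws.sensors.batch import BatchSensor

# Illustrative only: poll the job every 60 seconds, free the worker slot
# between pokes, and fail the task if the job is still running after 6 hours.
wait_for_batch_job_tuned = BatchSensor(
    task_id='wait_for_batch_job_tuned',
    job_id='00000000-0000-0000-0000-000000000000',  # placeholder job ID
    aws_conn_id='aws_default',
    poke_interval=60,
    timeout=6 * 60 * 60,
    mode='reschedule',
)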

AWS Batch Operator

To submit a new AWS Batch job and monitor it until it reaches a terminal state, you can use BatchOperator.

airflow/providers/amazon/aws/example_dags/example_batch.py

from airflow.providers.amazon.aws.operators.batch import BatchOperator

# Submit a job to the given queue using the given job definition, apply any
# container overrides, and monitor it until it reaches a terminal state.
submit_batch_job = BatchOperator(
    task_id='submit_batch_job',
    job_name=JOB_NAME,
    job_queue=JOB_QUEUE,
    job_definition=JOB_DEFINITION,
    overrides=JOB_OVERRIDES,
)
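
The overrides argument adjusts the registered job definition at submission time. Below is a minimal, illustrative sketch of what JOB_OVERRIDES could contain, assuming a container-based job; the keys mirror the containerOverrides structure of the Batch SubmitJob API, and the command and environment values are placeholders.

# Illustrative only: replace the container command and add an environment
# variable; the dictionary follows the SubmitJob containerOverrides shape.
JOB_OVERRIDES = {
    'command': ['echo', 'hello world'],
    'environment': [
        {'name': 'MY_ENV_VAR', 'value': 'example-value'},
    ],
}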

Reference

For further information, look at:

- Boto3 Library Documentation for Batch: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/batch.html
- AWS Batch Documentation: https://docs.aws.amazon.com/batch/
