AWS Batch

AWS Batch enables you to run batch computing workloads on the AWS Cloud. Batch computing is a common way for developers, scientists, and engineers to access large amounts of compute resources. AWS Batch removes the undifferentiated heavy lifting of configuring and managing the required infrastructure.

Prerequisite Tasks

To use these operators, you must do a few things:

- Create the necessary AWS Batch resources (compute environment, job queue, and job definition) using the AWS Console or AWS CLI.
- Install the API libraries via pip: pip install 'apache-airflow[amazon]'
- Set up an Airflow connection to AWS.

Operators

Submit a new AWS Batch job

To submit a new AWS Batch job and monitor it until it reaches a terminal state, you can use BatchOperator.

airflow/providers/amazon/aws/example_dags/example_batch.py[source]

from airflow.providers.amazon.aws.operators.batch import BatchOperator

# Submit the job to AWS Batch and wait until it reaches a terminal state.
submit_batch_job = BatchOperator(
    task_id='submit_batch_job',
    job_name=JOB_NAME,
    job_queue=JOB_QUEUE,
    job_definition=JOB_DEFINITION,
    overrides=JOB_OVERRIDES,
)
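
The constants referenced above come from the example DAG. A minimal sketch of what they might look like is shown below; the names and the overrides payload are illustrative placeholders, not values taken from the example file. The overrides dict follows the containerOverrides structure accepted by the AWS Batch SubmitJob API.

# Illustrative placeholder values for the constants used in the snippet above.
JOB_NAME = "example-airflow-batch-job"
JOB_QUEUE = "example-job-queue"
JOB_DEFINITION = "example-job-definition"
JOB_OVERRIDES = {
    # containerOverrides fields such as command, environment, or
    # resourceRequirements can be set here.
    "command": ["echo", "hello world"],
}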

Sensors

Wait on an AWS Batch job state

To wait on the state of an AWS Batch job until it reaches a terminal state, you can use BatchSensor.

airflow/providers/amazon/aws/example_dags/example_batch.py[source]

from airflow.providers.amazon.aws.sensors.batch import BatchSensor

# Poke the AWS Batch API until the job with the given id reaches a terminal state.
wait_for_batch_job = BatchSensor(
    task_id='wait_for_batch_job',
    job_id=JOB_ID,
)
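
If the sensor runs in the same DAG as the submit task shown earlier, the job id can be taken from that task's XCom instead of being hard-coded. The sketch below assumes BatchOperator pushes the submitted job's id as its XCom return value.

from airflow.providers.amazon.aws.sensors.batch import BatchSensor

# The XComArg supplies the job id at runtime and also makes the sensor run
# downstream of the submit task, so no explicit dependency is needed.
wait_for_batch_job = BatchSensor(
    task_id='wait_for_batch_job',
    job_id=submit_batch_job.output,  # assumed to resolve to the AWS Batch job id
)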
