:mod:`airflow.contrib.operators.awsbatch_operator`
==================================================

.. py:module:: airflow.contrib.operators.awsbatch_operator

Module Contents
---------------

.. py:class:: AWSBatchOperator(job_name, job_definition, job_queue, overrides, max_retries=4200, aws_conn_id=None, region_name=None, **kwargs)

   Bases: :class:`airflow.models.BaseOperator`

   Execute a job on AWS Batch Service.

   .. warning::

      The ``queue`` parameter was renamed to ``job_queue`` to distinguish the
      internal CeleryExecutor queue from the AWS Batch job queue.

   :param job_name: the name for the job that will run on AWS Batch
   :type job_name: str
   :param job_definition: the job definition name on AWS Batch
   :type job_definition: str
   :param job_queue: the queue name on AWS Batch
   :type job_queue: str
   :param overrides: the same parameter that boto3 will receive on
       containerOverrides (templated):
       http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job
   :type overrides: dict
   :param max_retries: number of exponential backoff retries used while the
       botocore waiter is not merged; 4200 retries corresponds to roughly
       48 hours
   :type max_retries: int
   :param aws_conn_id: connection id of AWS credentials / region name. If None,
       the default boto3 credential strategy will be used
       (http://boto3.readthedocs.io/en/latest/guide/configuration.html).
   :type aws_conn_id: str
   :param region_name: region name to use in the AWS Hook. Overrides the
       region_name in the connection (if provided)
   :type region_name: str

   .. attribute:: ui_color
      :annotation: = #c3dae0

   .. attribute:: client

   .. attribute:: arn

   .. attribute:: template_fields
      :annotation: = ['overrides']

   .. method:: execute(self, context)

   .. method:: _wait_for_task_ended(self)

      Try to use a waiter from the pull request below:

      * https://github.com/boto/botocore/pull/1307

      If the waiter is not available, apply an exponential backoff:

      * docs.aws.amazon.com/general/latest/gr/api-retries.html

   .. method:: _check_success_task(self)

   .. method:: get_hook(self)

   .. method:: on_kill(self)
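The ``overrides`` parameter is passed through to boto3's ``submit_job`` as
``containerOverrides``, and the polling described under ``_wait_for_task_ended``
follows the AWS general retry guidance (delay doubling per attempt, with a cap).
The sketch below, using illustrative values and a hypothetical
``backoff_delays`` helper rather than code from the Airflow source, shows the
shape of both:

.. code-block:: python

   # Illustrative containerOverrides payload, in the shape boto3's
   # submit_job expects (command and environment values are made up).
   overrides = {
       "command": ["echo", "hello"],
       "environment": [{"name": "STAGE", "value": "test"}],
   }

   def backoff_delays(max_retries, base=1.0, cap=300.0):
       """Exponential backoff delays per the AWS general retry guidance:
       the delay doubles on each attempt and is capped at `cap` seconds.
       This helper is a sketch, not the operator's actual implementation."""
       return [min(cap, base * 2 ** attempt) for attempt in range(max_retries)]

   delays = backoff_delays(5)  # 1, 2, 4, 8, 16 seconds

A payload like ``overrides`` above can be templated in the operator, since
``overrides`` is listed in ``template_fields``.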