airflow.contrib.operators.ecs_operator

Module Contents

class airflow.contrib.operators.ecs_operator.ECSOperator(task_definition, cluster, overrides, aws_conn_id=None, region_name=None, launch_type='EC2', group=None, placement_constraints=None, platform_version='LATEST', network_configuration=None, tags=None, awslogs_group=None, awslogs_region=None, awslogs_stream_prefix=None, **kwargs)

Bases: airflow.models.BaseOperator

Execute a task on AWS Elastic Container Service (ECS)

Parameters
  • task_definition (str) – the task definition name on Elastic Container Service

  • cluster (str) – the cluster name on Elastic Container Service

  • overrides (dict) – the same parameter that boto3's run_task will receive (templated; see the usage sketch after this parameter list): http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task

  • aws_conn_id (str) – connection id of AWS credentials / region name. If None, boto3's default credential strategy will be used (http://boto3.readthedocs.io/en/latest/guide/configuration.html).

  • region_name (str) – region name to use in the AWS Hook. Overrides the region_name in the connection (if provided).

  • launch_type (str) – the launch type on which to run your task ('EC2' or 'FARGATE')

  • group (str) – the name of the task group associated with the task

  • placement_constraints (list) – an array of placement constraint objects to use for the task

  • platform_version (str) – the platform version on which your task is running

  • network_configuration (dict) – the network configuration for the task

  • tags (dict) – a dictionary of tags in the form of {'tagKey': 'tagValue'}.

  • awslogs_group (str) – the CloudWatch group where your ECS container logs are stored. Only required if you want logs to be shown in the Airflow UI after your job has finished.

  • awslogs_region (str) – the region in which your CloudWatch logs are stored. If None, this is the same as the region_name parameter. If that is also None, this is the default AWS region based on your connection settings.

  • awslogs_stream_prefix (str) – the stream prefix that is used for the CloudWatch logs. This is usually based on some custom name combined with the name of the container. Only required if you want logs to be shown in the Airflow UI after your job has finished.
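
A minimal usage sketch, assuming a Fargate task definition; the DAG id, task definition, cluster, container name, subnet id, and log group below are all placeholders. Because overrides is a template field, Jinja expressions such as {{ ds }} are rendered at runtime:

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.ecs_operator import ECSOperator

    dag = DAG(
        dag_id='ecs_fargate_example',       # hypothetical DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    )

    hello_task = ECSOperator(
        task_id='hello_ecs',
        task_definition='my-task-def',      # placeholder task definition name
        cluster='my-cluster',               # placeholder cluster name
        overrides={
            'containerOverrides': [
                {
                    'name': 'my-container',  # placeholder container name
                    # overrides is templated, so {{ ds }} is rendered
                    # to the execution date at runtime
                    'command': ['echo', 'run date: {{ ds }}'],
                },
            ],
        },
        launch_type='FARGATE',
        network_configuration={              # required for the FARGATE launch type
            'awsvpcConfiguration': {
                'subnets': ['subnet-12345678'],  # placeholder subnet id
                'assignPublicIp': 'ENABLED',
            },
        },
        awslogs_group='/ecs/my-task-def',          # placeholder log group
        awslogs_stream_prefix='ecs/my-container',  # placeholder stream prefix
        dag=dag,
    )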

ui_color = '#f0ede4'
client
arn
template_fields = ['overrides']
execute(self, context)
_wait_for_task_ended(self)
_check_success_task(self)
get_hook(self)
get_logs_hook(self)
on_kill(self)
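
Taken together, these methods wrap a short boto3 lifecycle: execute(self, context) calls run_task and records the resulting task ARN, _wait_for_task_ended(self) blocks on the tasks_stopped waiter, _check_success_task(self) inspects the stopped containers' exit codes, and on_kill(self) stops the task. A rough standalone sketch of the equivalent boto3 calls (the cluster, task definition, and region below are placeholders, and the error handling is simplified):

    import boto3

    client = boto3.client('ecs', region_name='us-east-1')  # placeholder region

    # execute(): start the task and remember its ARN.
    response = client.run_task(
        cluster='my-cluster',            # placeholder cluster name
        taskDefinition='my-task-def',    # placeholder task definition name
        launchType='EC2',
        overrides={},
    )
    task_arn = response['tasks'][0]['taskArn']

    # _wait_for_task_ended(): block until the task reaches STOPPED.
    waiter = client.get_waiter('tasks_stopped')
    waiter.wait(cluster='my-cluster', tasks=[task_arn])

    # _check_success_task(): fail if any container exited non-zero.
    described = client.describe_tasks(cluster='my-cluster', tasks=[task_arn])
    for task in described['tasks']:
        for container in task.get('containers', []):
            if container.get('exitCode') not in (0, None):
                raise RuntimeError('ECS task failed: {}'.format(container))

    # on_kill(): stop the task if the Airflow task instance is killed:
    # client.stop_task(cluster='my-cluster', task=task_arn,
    #                  reason='Task killed by the user')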
