airflow.executors.celery_kubernetes_executor

Module Contents

class airflow.executors.celery_kubernetes_executor.CeleryKubernetesExecutor(celery_executor, kubernetes_executor)[source]

Bases: airflow.utils.log.logging_mixin.LoggingMixin

CeleryKubernetesExecutor consists of a CeleryExecutor and a KubernetesExecutor. It chooses which executor to use based on the queue defined on the task. When the queue is the value of kubernetes_queue in section [celery_kubernetes_executor] of the configuration (default value: kubernetes), KubernetesExecutor is selected to run the task; otherwise, CeleryExecutor is used.
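The following is an illustrative sketch, not a required setup: the DAG id, task ids, and commands are made up, and it assumes the default kubernetes_queue value of kubernetes with CeleryKubernetesExecutor configured as the executor. A task opts into the Kubernetes queue simply by setting queue on the operator.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="routing_example", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
    # No queue set: the task goes to the default Celery queue and is run by CeleryExecutor.
    on_celery = BashOperator(
        task_id="runs_on_celery",
        bash_command="echo run by CeleryExecutor",
    )
    # queue matches kubernetes_queue: the task is run by KubernetesExecutor.
    on_kubernetes = BashOperator(
        task_id="runs_on_kubernetes",
        bash_command="echo run by KubernetesExecutor",
        queue="kubernetes",
    )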

KUBERNETES_QUEUE[source]
queued_tasks[source]

Return queued tasks from both the Celery and Kubernetes executors

running[source]

Return running tasks from both the Celery and Kubernetes executors

job_id[source]

This is a class attribute in BaseExecutor, but since this class is not really an executor itself, only a wrapper of executors, it is implemented as a property so that a custom setter can be defined.

slots_available[source]

Number of new tasks this executor instance can accept

start(self)[source]

Start both the Celery and Kubernetes executors

queue_command(self, task_instance: TaskInstance, command: CommandType, priority: int = 1, queue: Optional[str] = None)[source]

Queue a command via the Celery or Kubernetes executor, depending on the task's queue

queue_task_instance(self, task_instance: TaskInstance, mark_success: bool = False, pickle_id: Optional[str] = None, ignore_all_deps: bool = False, ignore_depends_on_past: bool = False, ignore_task_deps: bool = False, ignore_ti_state: bool = False, pool: Optional[str] = None, cfg_path: Optional[str] = None)[source]

Queue a task instance via the Celery or Kubernetes executor, depending on the task's queue

has_task(self, task_instance: TaskInstance)[source]

Check whether a task is queued or running in either the Celery or the Kubernetes executor.

Parameters

task_instance – TaskInstance

Returns

True if the task is known to this executor

heartbeat(self)[source]

Heartbeat sent to trigger new jobs in both the Celery and Kubernetes executors

get_event_buffer(self, dag_ids=None)[source]

Return and flush the event buffer from both the Celery and Kubernetes executors

Parameters

dag_ids – dag_ids for which to return events; if None, return events for all DAGs

Returns

a dict of events
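As a rough sketch of what this amounts to (not the library's code, and merged_event_buffer is a made-up helper name): the wrapper can merge the flushed buffers of both underlying executors, each a dict keyed by task instance key.

def merged_event_buffer(celery_executor, kubernetes_executor, dag_ids=None):
    # Each underlying executor returns and clears its own buffer;
    # the wrapper combines the two dicts into a single result.
    events = celery_executor.get_event_buffer(dag_ids)
    events.update(kubernetes_executor.get_event_buffer(dag_ids))
    return events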

try_adopt_task_instances(self, tis: List[TaskInstance])[source]

Try to adopt running task instances that have been abandoned by a SchedulerJob dying.

Anything that is not adopted is cleared by the scheduler (and then becomes eligible for re-scheduling)

Returns

any TaskInstances that were unable to be adopted

Return type

list[airflow.models.TaskInstance]
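As a sketch of the delegation (not the exact method body; adopt_split_by_queue is a hypothetical helper and the default kubernetes_queue value of kubernetes is assumed), the wrapper can split the task instances by queue, hand each group to the matching executor, and return whatever neither executor could adopt.

def adopt_split_by_queue(celery_executor, kubernetes_executor, tis, kubernetes_queue="kubernetes"):
    # Task instances on the kubernetes queue go to KubernetesExecutor,
    # everything else goes to CeleryExecutor.
    kubernetes_tis = [ti for ti in tis if ti.queue == kubernetes_queue]
    celery_tis = [ti for ti in tis if ti.queue != kubernetes_queue]
    # Collect the task instances that neither executor was able to adopt.
    not_adopted = list(celery_executor.try_adopt_task_instances(celery_tis))
    not_adopted += kubernetes_executor.try_adopt_task_instances(kubernetes_tis)
    return not_adopted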

end(self)[source]

End both the Celery and Kubernetes executors

terminate(self)[source]

Terminate both the Celery and Kubernetes executors

_router(self, simple_task_instance: SimpleTaskInstance)[source]

Return either celery_executor or kubernetes_executor, depending on the task's queue

Parameters

simple_task_instance – SimpleTaskInstance

Returns

celery_executor or kubernetes_executor

Return type

Union[CeleryExecutor, KubernetesExecutor]
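A minimal sketch of the routing rule described above (not the actual method body; pick_executor is a made-up name, and the default kubernetes_queue value of kubernetes is assumed):

def pick_executor(simple_task_instance, celery_executor, kubernetes_executor, kubernetes_queue="kubernetes"):
    # Tasks whose queue matches kubernetes_queue are routed to KubernetesExecutor;
    # all other tasks are routed to CeleryExecutor.
    if simple_task_instance.queue == kubernetes_queue:
        return kubernetes_executor
    return celery_executor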

debug_dump(self)[source]

Called in response to SIGUSR2 by the scheduler
