airflow.executors

Package Contents

class airflow.executors.LoggingMixin(context=None)[source]

Bases: object

Convenience super-class to have a logger configured with the class name.

logger
log
_set_context(self, context)
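
For reference, a minimal sketch of how a class typically uses this mixin; the subclass name below is hypothetical:

    from airflow.utils.log.logging_mixin import LoggingMixin

    class MyWorker(LoggingMixin):  # hypothetical subclass
        def run(self):
            # self.log is a standard logging.Logger whose name is derived
            # from the subclass, so records are tagged per class.
            self.log.info("starting work")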
airflow.executors.conf[source]
exception airflow.executors.AirflowException[source]

Bases: Exception

Base class for all of Airflow’s errors. Each custom exception should be derived from this class.

status_code = 500
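
A minimal sketch of deriving a custom error, as the docstring recommends; the subclass and function below are hypothetical:

    from airflow.exceptions import AirflowException

    class UpstreamServiceError(AirflowException):  # hypothetical subclass
        """Raised when a (hypothetical) upstream service cannot be reached."""

    def fetch_data(url):
        # Raising an AirflowException subclass lets Airflow handle the
        # failure uniformly (status_code 500 by default, per the class above).
        raise UpstreamServiceError("could not reach {}".format(url))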
class airflow.executors.BaseExecutor(parallelism=PARALLELISM)[source]

Bases: airflow.utils.log.logging_mixin.LoggingMixin

start(self)

Executors may need to get things started. For example, LocalExecutor starts N workers.

queue_command(self, simple_task_instance, command, priority=1, queue=None)
queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)
has_task(self, task_instance)

Checks if a task is either queued or running in this executor.

Parameters

task_instance – TaskInstance

Returns

True if the task is known to this executor

sync(self)

Sync will get called periodically by the heartbeat method. Executors should override this to gather task statuses.

heartbeat(self)
trigger_tasks(self, open_slots)

Trigger tasks

Parameters

open_slots – Number of open slots

change_state(self, key, state)
fail(self, key)
success(self, key)
get_event_buffer(self, dag_ids=None)

Returns and flushes the event buffer. If dag_ids is specified, it will only return and flush events for the given dag_ids. Otherwise, it returns and flushes all events.

Parameters

dag_ids – the dag_ids to return events for; if None, returns events for all dag_ids

Returns

a dict of events

execute_async(self, key, command, queue=None, executor_config=None)

This method will execute the command asynchronously. (A hedged sketch of an executor implementing this contract follows this class listing.)

end(self)

This method is called when the caller is done submitting jobs and wants to wait synchronously for all previously submitted jobs to finish.

terminate(self)

This method is called when the daemon receives a SIGTERM.
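
To make the contract above concrete, here is a hedged sketch of a custom executor built on this base class. The class name and the subprocess-per-task strategy are illustrative, not part of Airflow:

    import subprocess

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):  # hypothetical executor
        def start(self):
            # One Popen handle per task-instance key.
            self.procs = {}

        def execute_async(self, key, command, queue=None, executor_config=None):
            # The scheduler hands over the shell command built for the
            # task instance; run it without blocking the heartbeat.
            self.procs[key] = subprocess.Popen(command, shell=True)

        def sync(self):
            # Called periodically by heartbeat(): gather the status of
            # finished work and report it via success()/fail().
            for key, proc in list(self.procs.items()):
                returncode = proc.poll()
                if returncode is None:
                    continue  # still running
                if returncode == 0:
                    self.success(key)
                else:
                    self.fail(key)
                del self.procs[key]

        def end(self):
            # Wait synchronously for all previously submitted jobs to finish.
            for proc in self.procs.values():
                proc.wait()
            self.sync()

Results reported through success() and fail() land in the event buffer, which callers drain with get_event_buffer() after each heartbeat.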

class airflow.executors.LocalExecutor[source]

Bases: airflow.executors.base_executor.BaseExecutor

LocalExecutor executes tasks locally in parallel. It uses the multiprocessing Python library and queues to parallelize the execution of tasks. (A usage sketch follows this class listing.)

class _UnlimitedParallelism(executor)

Bases: object

Implements LocalExecutor with unlimited parallelism, starting one process per command to execute.

start(self)
execute_async(self, key, command)
Parameters
  • key (tuple(dag_id, task_id, execution_date)) – the key to identify the TI

  • command (str) – the command to execute

sync(self)
end(self)
class _LimitedParallelism(executor)

Bases: object

Implements LocalExecutor with limited parallelism using a task queue to coordinate work distribution.

start(self)
execute_async(self, key, command)
Parameters
  • key (tuple(dag_id, task_id, execution_date)) – the key to identify the TI

  • command (str) – the command to execute

sync(self)
end(self)
start(self)
execute_async(self, key, command, queue=None, executor_config=None)
sync(self)
end(self)
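
A short usage sketch; in this version, start() appears to select _UnlimitedParallelism when the configured parallelism is 0 and _LimitedParallelism otherwise, so treat that reading as an assumption:

    from airflow.executors.local_executor import LocalExecutor

    executor = LocalExecutor()  # parallelism is read from the core config
    executor.start()            # parallelism == 0 -> _UnlimitedParallelism,
                                # otherwise -> _LimitedParallelism workers
    # ... the scheduler queues task instances and calls heartbeat() ...
    executor.end()              # wait for outstanding work, then shut down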
class airflow.executors.SequentialExecutor[source]

Bases: airflow.executors.base_executor.BaseExecutor

This executor will only run one task instance at a time and can be useful for debugging. It is also the only executor that can be used with SQLite, since SQLite doesn’t support multiple connections.

Since we want Airflow to work out of the box, it defaults to this SequentialExecutor alongside SQLite when you first install it.

execute_async(self, key, command, queue=None, executor_config=None)
sync(self)
end(self)
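
A minimal usage sketch of the class above:

    from airflow.executors.sequential_executor import SequentialExecutor

    executor = SequentialExecutor()
    executor.start()
    # Commands queued between heartbeats are executed one after another
    # inside sync(), which keeps at most one task instance running and
    # makes this executor safe for SQLite's single connection.
    executor.end()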
airflow.executors.DEFAULT_EXECUTOR[source]
airflow.executors._integrate_plugins()[source]
Integrate plugins into the context.
airflow.executors.get_default_executor()[source]
Creates a new instance of the configured executor if none exists and returns it.
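A usage sketch; per the description above, the instance is created once and cached, so repeated calls return the same object:

    from airflow.executors import get_default_executor

    executor = get_default_executor()
    # On a fresh install this is a SequentialExecutor, the default noted above.
    print(type(executor).__name__)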
class airflow.executors.Executors[source]
LocalExecutor = LocalExecutor[source]
SequentialExecutor = SequentialExecutor[source]
CeleryExecutor = CeleryExecutor[source]
DaskExecutor = DaskExecutor[source]
MesosExecutor = MesosExecutor[source]
KubernetesExecutor = KubernetesExecutor[source]
DebugExecutor = DebugExecutor[source]
airflow.executors._get_executor(executor_name)[source]
Creates a new instance of the named executor. If the executor name is not known to Airflow, it is looked up in the plugins.
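A hedged sketch of calling the lookup directly with one of the constants on the Executors class above:

    from airflow.executors import Executors, _get_executor

    # Resolve a built-in executor by name; a name Airflow does not
    # recognize is searched for among the executors contributed by plugins.
    executor = _get_executor(Executors.LocalExecutor)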
