See also

For more information on how the DebugExecutor works, take a look at the guide: Debug Executor

Module Contents



This executor is meant for debugging purposes. It can be used with SQLite.

class airflow.executors.debug_executor.DebugExecutor[source]

Bases: airflow.executors.base_executor.BaseExecutor

This executor is meant for debugging purposes. It can be used with SQLite.

It executes one task instance at a time. Additionally, to support working with sensors, the mode of all sensors is automatically set to "reschedule".
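The single-threaded execution model can be sketched with a toy loop. This is an illustrative stand-in, not Airflow's actual implementation; the class and method names (`ToySerialExecutor`, `queue`, `run_all`) are hypothetical:

```python
from collections import deque

class ToySerialExecutor:
    """Hypothetical stand-in: runs one callable at a time, like DebugExecutor."""

    def __init__(self):
        self.tasks_to_run = deque()  # FIFO queue of (key, callable) pairs
        self.results = {}            # task key -> terminal state

    def queue(self, key, fn):
        self.tasks_to_run.append((key, fn))

    def run_all(self):
        # Execute strictly one task instance at a time.
        while self.tasks_to_run:
            key, fn = self.tasks_to_run.popleft()
            try:
                fn()
                self.results[key] = "success"
            except Exception:
                self.results[key] = "failed"

executor = ToySerialExecutor()
executor.queue("t1", lambda: None)
executor.queue("t2", lambda: 1 / 0)  # raises, so the task is marked failed
executor.run_all()
# executor.results == {"t1": "success", "t2": "failed"}
```

Because everything runs in a single process against an in-memory queue, no parallel database writes occur, which is why this executor works with SQLite.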

execute_async(self, *args, **kwargs)[source]

This method is replaced by a custom trigger_task implementation.


sync(self)[source]

Sync will get called periodically by the heartbeat method. Executors should override this to gather task statuses.

queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)[source]

Queues the task instance with an empty command, because the command is not needed.
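The queuing step can be sketched as follows. This is a hedged sketch under assumed data shapes, not the real signature; the key tuple and the `(command, priority)` pair are hypothetical simplifications:

```python
def queue_task_instance(queued_tasks, ti_key, priority=1):
    # Store the task instance keyed by its identifier with an empty (None)
    # command, since DebugExecutor never builds a CLI command to run the task.
    queued_tasks[ti_key] = (None, priority)

queued = {}
queue_task_instance(queued, ("example_dag", "example_task", "2021-01-01", 1))
# queued holds the key with command=None
```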

trigger_tasks(self, open_slots)[source]

Triggers tasks. Instead of calling execute_async, the task instance is simply added to the tasks_to_run queue.


open_slots (int) – Number of open slots
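The pattern can be sketched as a function that moves at most open_slots queued items onto an in-memory run list. This is a minimal sketch with assumed data shapes (a dict of `key -> (priority, task_instance)`), not Airflow's actual code:

```python
def trigger_tasks(queued_tasks, tasks_to_run, open_slots):
    # Sort queued keys by priority (higher first), then move at most
    # `open_slots` of them onto the in-memory run list instead of
    # launching them asynchronously.
    sorted_keys = sorted(queued_tasks, key=lambda k: queued_tasks[k][0], reverse=True)
    for key in sorted_keys[:open_slots]:
        _priority, ti = queued_tasks.pop(key)
        tasks_to_run.append(ti)

queued = {"a": (1, "ti_a"), "b": (3, "ti_b"), "c": (2, "ti_c")}
to_run = []
trigger_tasks(queued, to_run, open_slots=2)
# to_run == ["ti_b", "ti_c"]; "a" remains queued
```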


end(self)[source]

When this method is called, the states of queued tasks are set to UPSTREAM_FAILED, marking them as not executed.
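The shutdown behavior can be sketched as below. This is a hypothetical simplification, assuming a plain dict as the event buffer; the real executor uses Airflow's State constants:

```python
UPSTREAM_FAILED = "upstream_failed"  # mirrors Airflow's State.UPSTREAM_FAILED

def end(queued_tasks, event_buffer):
    # Any task still queued at shutdown is marked as not executed.
    for key in list(queued_tasks):
        event_buffer[key] = UPSTREAM_FAILED
        del queued_tasks[key]

queued = {"t1": object(), "t2": object()}
events = {}
end(queued, events)
# events maps both keys to "upstream_failed"; queued is empty
```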


terminate(self)[source]

This method is called when the daemon receives a SIGTERM.

change_state(self, key, state, info=None)[source]

Changes the state of the task.
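A state change can be sketched as recording the new state and dropping the task from the running set. The function signature and data structures here are assumed for illustration, not the real API:

```python
def change_state(running, event_buffer, key, state, info=None):
    # Record the new state in the event buffer and stop tracking the
    # task as running.
    event_buffer[key] = (state, info)
    running.discard(key)

running = {"t1"}
events = {}
change_state(running, events, "t1", "success")
# events == {"t1": ("success", None)}; running is now empty
```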

