airflow.executors.debug_executor¶
DebugExecutor
See also
For more information on how the DebugExecutor works, take a look at the guide: Debug Executor
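A quick orientation, sketched from the guide above rather than from this module's API: the executor is typically selected via the AIRFLOW__CORE__EXECUTOR environment variable (or an IDE run configuration) and the DAG file is then run directly. The DAG id, task, and dates below are placeholders.

```python
import os

# Select the DebugExecutor before Airflow is imported; the same setting can be
# made in an IDE run configuration (sketch based on the Debug Executor guide).
os.environ["AIRFLOW__CORE__EXECUTOR"] = "DebugExecutor"

import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="debug_executor_demo",          # placeholder DAG id
    start_date=datetime.datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    BashOperator(task_id="hello", bash_command="echo hello")

if __name__ == "__main__":
    # Running this file executes the DAG in-process, one task instance at a time.
    dag.clear()
    dag.run()
```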
Module Contents¶
Classes¶
DebugExecutor | This executor is meant for debugging purposes. It can be used with SQLite.
- class airflow.executors.debug_executor.DebugExecutor[source]¶
Bases:
airflow.executors.base_executor.BaseExecutor
This executor is meant for debugging purposes. It can be used with SQLite.
It executes one task instance at a time. Additionally, to support working with sensors, all sensors' mode will be automatically set to "reschedule".
- execute_async(self, *args, **kwargs)[source]¶
This method is replaced by the custom trigger_task implementation.
- sync(self)[source]¶
Sync is called periodically by the heartbeat method. Executors should override this to gather task statuses.
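To make the synchronous behaviour concrete, here is a minimal sketch (not the library's implementation) of draining a run queue in-process; the queue corresponds to the tasks_to_run list mentioned under trigger_tasks below, and the fail-fast option mirrors the behaviour described in the Debug Executor guide.

```python
# Illustrative sketch only: drain a queue of work items synchronously,
# running one item at a time in the current process.
def drain_queue(tasks_to_run, run_task, fail_fast=False):
    while tasks_to_run:
        item = tasks_to_run.pop(0)
        succeeded = run_task(item)
        if fail_fast and not succeeded:
            break  # remaining items are left unexecuted


# Tiny usage example with plain callables standing in for task instances.
queue = [lambda: True, lambda: False, lambda: True]
drain_queue(queue, run_task=lambda task: task(), fail_fast=True)
```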
- queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)[source]¶
Queues the task instance with an empty command, because this executor does not need one.
- trigger_tasks(self, open_slots)[source]¶
Triggers tasks. Instead of calling execute_async, the task instance is simply added to the tasks_to_run queue.
- Parameters
open_slots (int) -- Number of open slots
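A hedged sketch of the idea behind trigger_tasks; the mapping layout of queued_tasks is an assumption made for illustration only. Up to open_slots entries are moved from the queued work into the local tasks_to_run queue rather than being dispatched asynchronously.

```python
# Illustrative sketch only, not the library implementation.
def move_to_run_queue(queued_tasks, tasks_to_run, open_slots):
    # queued_tasks: assumed mapping of key -> (priority, task_instance)
    by_priority = sorted(queued_tasks.items(), key=lambda kv: kv[1][0], reverse=True)
    for key, (_priority, ti) in by_priority[:open_slots]:
        del queued_tasks[key]
        tasks_to_run.append(ti)
```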
- end(self)[source]¶
When this method is called, the states of queued tasks are set to UPSTREAM_FAILED, marking them as not executed.
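Conceptually, and assuming only that queued_tasks and change_state behave as documented on this page, shutting down amounts to marking everything still queued as UPSTREAM_FAILED:

```python
from airflow.utils.state import State

# Illustrative sketch only: mark anything still queued as not executed.
def mark_queued_as_upstream_failed(executor):
    for key in list(executor.queued_tasks):
        executor.change_state(key, State.UPSTREAM_FAILED)
```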
- change_state(self, key, state, info=None)[source]¶
Changes the state of the task.
- Parameters
info -- Executor information for the task instance
key (airflow.models.taskinstance.TaskInstanceKey) -- Unique key for the task instance
state (str) -- State to set for the task.
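A hedged usage example of change_state: the DebugExecutor constructor call and the TaskInstanceKey field values are placeholders, and the exact TaskInstanceKey fields vary between Airflow versions.

```python
from airflow.executors.debug_executor import DebugExecutor
from airflow.models.taskinstance import TaskInstanceKey
from airflow.utils.state import State

executor = DebugExecutor()

# Placeholder identifiers; field names follow recent Airflow 2.x releases.
key = TaskInstanceKey(
    dag_id="example_dag",
    task_id="example_task",
    run_id="manual__2021-01-01T00:00:00",
    try_number=1,
)

executor.change_state(key, State.FAILED, info="marked failed for demonstration")
```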