airflow.executors.base_executor

Module Contents

airflow.executors.base_executor.PARALLELISM[source]
class airflow.executors.base_executor.BaseExecutor(parallelism=PARALLELISM)[source]

Bases: airflow.utils.log.logging_mixin.LoggingMixin

start(self)[source]

Executors may need to get things started. For example, LocalExecutor starts N workers.
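
A minimal, hedged sketch of how a subclass might override start(). SubprocessExecutor and its _processes attribute are names invented for illustration only (they are not part of Airflow) and are reused in the sketches further down this page:

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):
        """Hypothetical executor used for illustration only."""

        def start(self):
            # Set up whatever state the executor needs before tasks arrive;
            # here, a mapping from task instance key to subprocess handle.
            self._processes = {}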

queue_command(self, simple_task_instance, command, priority=1, queue=None)[source]
queue_task_instance(self, task_instance, mark_success=False, pickle_id=None, ignore_all_deps=False, ignore_depends_on_past=False, ignore_task_deps=False, ignore_ti_state=False, pool=None, cfg_path=None)[source]
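
Neither queue_command nor queue_task_instance runs anything by itself; queued work is only executed once heartbeat() picks it up. A hedged usage sketch, in which run_task_instances is a hypothetical driver function and task_instances is an iterable of TaskInstance objects obtained elsewhere:

    import time

    def run_task_instances(executor, task_instances):
        # Hypothetical driver loop: queue work, then let heartbeat() trigger
        # queued commands and poll their state until nothing is left.
        executor.start()
        for ti in task_instances:
            executor.queue_task_instance(ti)
        while any(executor.has_task(ti) for ti in task_instances):
            executor.heartbeat()
            time.sleep(1)
        executor.end()
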
has_task(self, task_instance)[source]

Checks if a task is either queued or running in this executor

Parameters

task_instance – TaskInstance

Returns

True if the task is known to this executor
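
A small hypothetical helper built on has_task, useful for avoiding double submission (maybe_queue is not part of Airflow):

    def maybe_queue(executor, task_instance):
        # Hypothetical guard: only queue the task instance if the executor
        # is not already tracking it as queued or running.
        if executor.has_task(task_instance):
            return False
        executor.queue_task_instance(task_instance)
        return True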

sync(self)[source]

Sync will get called periodically by the heartbeat method. Executors should override this to gather the statuses of running tasks.
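
As a hedged illustration, the hypothetical SubprocessExecutor from the start() sketch above could implement sync() by polling the process handles that its execute_async() (sketched further below) stores in self._processes:

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):
        def sync(self):
            # Poll each launched process; terminal states are reported back
            # through success()/fail(), which record the new state for
            # get_event_buffer().
            for key, proc in list(self._processes.items()):
                returncode = proc.poll()
                if returncode is None:
                    continue  # still running
                del self._processes[key]
                if returncode == 0:
                    self.success(key)
                else:
                    self.fail(key)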

heartbeat(self)[source]
change_state(self, key, state)[source]
fail(self, key)[source]
success(self, key)[source]
get_event_buffer(self, dag_ids=None)[source]

Returns and flushes the event buffer. If dag_ids is specified, it will only return and flush events for those dag_ids. Otherwise, it returns and flushes all events.

Parameters

dag_ids – the dag_ids to return events for; if None, returns events for all DAGs

Returns

a dict of events
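
A hedged consumption sketch, assuming the buffer maps task instance keys to their latest state (drain_events is a hypothetical name, not an Airflow function):

    def drain_events(executor, dag_ids=None):
        # Hypothetical consumer: read and clear buffered state changes.
        # Each key identifies a task instance; each value is its new state.
        for key, state in executor.get_event_buffer(dag_ids).items():
            print("task instance %s is now %s" % (key, state))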

execute_async(self, key, command, queue=None, executor_config=None)[source]

This method will execute the command asynchronously.
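
Continuing the hypothetical SubprocessExecutor sketch, one possible implementation launches the command in a child process and records the handle for sync() to reap. It assumes command arrives either as a list of arguments or as a single shell string:

    import shlex
    import subprocess

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):
        def execute_async(self, key, command, queue=None, executor_config=None):
            # Normalize the command to an argument list and launch it without
            # blocking; sync() later reaps the process and reports its state.
            args = command if isinstance(command, list) else shlex.split(command)
            self.log.info("Launching %s for task instance %s", args, key)
            self._processes[key] = subprocess.Popen(args)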

end(self)[source]

This method is called when the caller is done submitting jobs and wants to wait synchronously for all previously submitted jobs to finish.
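
For the hypothetical SubprocessExecutor, end() could simply keep syncing until every launched process has finished:

    import time

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):
        def end(self):
            # Keep syncing until every launched process has been reaped.
            while self._processes:
                self.sync()
                time.sleep(1)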

terminate(self)[source]

This method is called when the daemon receives a SIGTERM.
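
The same hypothetical SubprocessExecutor could implement terminate() by signalling its child processes and letting sync() report their final states:

    from airflow.executors.base_executor import BaseExecutor

    class SubprocessExecutor(BaseExecutor):
        def terminate(self):
            # Best-effort shutdown: signal every child process, then let
            # sync() record whatever exit codes they produce.
            for proc in self._processes.values():
                proc.terminate()
            self.sync()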