Listeners¶
You can write listeners to enable Airflow to notify you when events happen. Pluggy powers these listeners.
Airflow supports notifications for the following events:
Lifecycle Events¶
- on_starting
- before_stopping
Lifecycle events allow you to react to start and stop events for an Airflow Job, like SchedulerJob or BackfillJob.
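As a sketch of what such a listener can look like (assuming Airflow 2.x, where the hookimpl marker lives in airflow.listeners; the no-op fallback decorator below is illustration-only, so the snippet also runs without an Airflow installation):

```python
# Hedged sketch of a lifecycle listener. The hookimpl marker normally comes
# from airflow.listeners; the fallback stand-in is only here so this snippet
# runs without Airflow installed.
try:
    from airflow.listeners import hookimpl
except ImportError:
    def hookimpl(func):  # illustration-only stand-in for the real marker
        return func


@hookimpl
def on_starting(component):
    # component is the Job that is about to start, e.g. a SchedulerJob.
    print(f"starting: {type(component).__name__}")


@hookimpl
def before_stopping(component):
    # Called just before the Job shuts down.
    print(f"stopping: {type(component).__name__}")
```

The parameter name (component) follows the lifecycle hookspec; check the specs shipped with your Airflow version before relying on it.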
DagRun State Change Events¶
- on_dag_run_running
- on_dag_run_success
- on_dag_run_failed
DagRun state change events occur when a DagRun changes state.
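For example, a listener that reacts to failed DagRuns might look like the sketch below (parameter names follow the DagRun hookspecs as of Airflow 2.x; the fallback decorator is illustration-only so the snippet runs without Airflow):

```python
# Hedged sketch of a DagRun state-change listener.
try:
    from airflow.listeners import hookimpl
except ImportError:
    def hookimpl(func):  # illustration-only stand-in for the real marker
        return func


@hookimpl
def on_dag_run_failed(dag_run, msg):
    # dag_run is the DagRun that entered the failed state;
    # msg describes the reason for the state change.
    print(f"DagRun failed: {dag_run.dag_id} ({msg})")
```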
TaskInstance State Change Events¶
- on_task_instance_running
- on_task_instance_success
- on_task_instance_failed
TaskInstance state change events occur when a TaskInstance changes state.
You can use these events to react to LocalTaskJob state changes.
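A sketch of such a listener, e.g. for pushing alerts or metrics to an external system (the previous_state/task_instance/session parameter names follow the TaskInstance hookspecs as of Airflow 2.x; the fallback decorator is illustration-only):

```python
# Hedged sketch of a TaskInstance state-change listener.
try:
    from airflow.listeners import hookimpl
except ImportError:
    def hookimpl(func):  # illustration-only stand-in for the real marker
        return func


@hookimpl
def on_task_instance_failed(previous_state, task_instance, session):
    # Called when a TaskInstance moves to the failed state.
    # previous_state is the state it transitioned from.
    print(f"task failed: {task_instance.task_id}")
```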
Dataset Events¶
- on_dataset_created
- on_dataset_changed
Dataset events occur when Dataset management operations are run.
This is an experimental feature.
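Assuming the dataset hookspecs pass the Dataset object as a single dataset parameter (true as of Airflow 2.x, but worth verifying against your version since the feature is experimental), a sketch:

```python
# Hedged sketch of a Dataset event listener.
try:
    from airflow.listeners import hookimpl
except ImportError:
    def hookimpl(func):  # illustration-only stand-in for the real marker
        return func


@hookimpl
def on_dataset_changed(dataset):
    # dataset identifies the Dataset whose change produced the event.
    print(f"dataset changed: {dataset.uri}")
```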
Usage¶
To create a listener:

- import hookimpl from airflow.listeners
- implement the hookimpls for the events that you'd like to generate notifications for
Airflow defines each event's specification as a hookspec. Your implementation must accept the same named parameters that the hookspec defines; if it doesn't, Pluggy raises an error when you try to use your plugin. You don't have to implement every method, though: many listeners implement only one method, or a subset of methods.
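To see why the parameter names matter, here is a simplified pure-Python sketch of Pluggy's name-based dispatch (the real matching is done by Pluggy itself; call_like_pluggy and HOOKSPEC_PARAMS below are hypothetical names used only for this illustration):

```python
# Simplified model of how Pluggy calls a hookimpl: it inspects the
# implementation's argument names and supplies only those arguments from
# the hookspec's named parameters. A name outside the spec cannot be
# matched and causes an error.
import inspect

def call_like_pluggy(impl, available_kwargs):
    wanted = set(inspect.signature(impl).parameters)
    unknown = wanted - set(available_kwargs)
    if unknown:
        raise TypeError(f"unknown hookspec argument(s): {unknown}")
    return impl(**{name: available_kwargs[name] for name in wanted})

# An implementation may accept just the subset of the spec's parameters
# it actually needs:
def on_task_instance_success(task_instance):
    return f"succeeded: {task_instance}"

print(call_like_pluggy(
    on_task_instance_success,
    {"previous_state": None, "task_instance": "ti_1", "session": None},
))
# A mismatched name (e.g. "ti" instead of "task_instance") would raise
# a TypeError at call time.
```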
To register the listener in your Airflow installation, include it as part of an Airflow plugin.
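A registration sketch (the AirflowPlugin fallback and the my_listener module object are illustration-only stand-ins; in a real deployment you import AirflowPlugin from airflow.plugins_manager and point listeners at the module that defines your hookimpls):

```python
import types

# In a real installation this import always succeeds; the stand-in class
# only lets the sketch run without Airflow installed.
try:
    from airflow.plugins_manager import AirflowPlugin
except ImportError:
    class AirflowPlugin:  # illustration-only stand-in
        pass

# Stand-in for your listener module; in practice, import the module
# containing your @hookimpl functions.
my_listener = types.ModuleType("my_listener")


class MyListenerPlugin(AirflowPlugin):
    name = "my_listener_plugin"
    # Airflow collects listener modules from this attribute.
    listeners = [my_listener]
```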
The listener API is meant to be called across all DAGs and all operators; you can't listen to events generated by specific DAGs. For per-DAG or per-operator behavior, use methods like on_success_callback and pre_execute, which give individual DAG authors and operator creators their own callbacks. Logs and print() calls made inside listener hooks are handled as part of the listener's execution.