Writing logs to Azure Blob Storage
Airflow can be configured to read and write task logs in Azure Blob Storage. It uses an existing Airflow connection to read or write logs; if you don’t have a connection properly set up, this process will fail.
Follow the steps below to enable Azure Blob Storage logging:
1. Airflow’s logging system requires a custom .py file to be located on the PYTHONPATH, so that it’s importable from Airflow. Start by creating a directory to store the config file; $AIRFLOW_HOME/config is recommended.

2. Create empty files called $AIRFLOW_HOME/config/log_config.py and $AIRFLOW_HOME/config/__init__.py.

3. Copy the contents of airflow/config_templates/airflow_local_settings.py into the log_config.py file created in Step 2.

4. Customize the following portions of the template:
       # wasb buckets should start with "wasb" just to help Airflow select the correct handler
       REMOTE_BASE_LOG_FOLDER = 'wasb-<whatever you want here>'

       # Rename DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG
       LOGGING_CONFIG = ...
5. Make sure an Azure Blob Storage (Wasb) connection has been defined in Airflow. The connection should have read and write access to the Azure Blob Storage container named in REMOTE_BASE_LOG_FOLDER above. One way to register such a connection programmatically is sketched below.
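   A minimal sketch of registering the connection from Python follows. The conn_id wasb_logs and the credential values are placeholders, and login/password as storage account name and key is only one supported authentication mode; the same connection can equally be created through the Airflow UI or CLI.

       from airflow import settings
       from airflow.models import Connection

       # "wasb_logs" is a placeholder conn_id; point remote_log_conn_id
       # at it later in airflow.cfg.
       conn = Connection(
           conn_id="wasb_logs",
           conn_type="wasb",
           login="<your storage account name>",    # placeholder
           password="<your storage account key>",  # placeholder
       )

       session = settings.Session()
       session.add(conn)
       session.commit()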
6. Update $AIRFLOW_HOME/airflow.cfg to contain:

       [logging]
       remote_logging = True
       logging_config_class = log_config.LOGGING_CONFIG
       remote_log_conn_id = <name of the Azure Blob Storage connection>
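   As a quick sanity check that Airflow picked these values up, you can read them back through its configuration API. A minimal sketch, assuming the [logging] section shown above:

       from airflow.configuration import conf

       # Should print the values set in airflow.cfg above.
       print(conf.getboolean("logging", "remote_logging"))  # True
       print(conf.get("logging", "logging_config_class"))   # log_config.LOGGING_CONFIG
       print(conf.get("logging", "remote_log_conn_id"))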
7. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
8. Verify that logs are showing up for newly executed tasks in the container you have defined.
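   If you would rather verify from code than from the Azure portal, you can list the container’s blobs with the same connection. This is a sketch assuming Airflow 2 with the Microsoft Azure provider installed; the connection id and container name are placeholders:

       from airflow.providers.microsoft.azure.hooks.wasb import WasbHook

       # "wasb_logs" and "airflow-logs" are placeholders; use your own
       # remote_log_conn_id and the container from REMOTE_BASE_LOG_FOLDER.
       hook = WasbHook(wasb_conn_id="wasb_logs")
       for blob_name in hook.get_blobs_list(container_name="airflow-logs"):
           print(blob_name)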