Writing logs to Azure Blob Storage
Airflow can be configured to read and write task logs in Azure Blob Storage. Remote logging uses an existing Airflow connection to read and write logs; if the connection is not properly set up, this process will fail.
Follow the steps below to enable Azure Blob Storage logging:
Airflow's logging system requires a custom `.py` file to be located in the `PYTHONPATH`, so that it's importable from Airflow. Start by creating a directory to store the config file, e.g. `$AIRFLOW_HOME/config`.

Create empty files called `$AIRFLOW_HOME/config/log_config.py` and `$AIRFLOW_HOME/config/__init__.py`.

Copy the contents of `airflow/config_templates/airflow_local_settings.py` into the `log_config.py` file created in the step above.
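The file setup described above can be sketched as a few shell commands. `$AIRFLOW_HOME` is assumed to be set; the fallback to `~/airflow` is only for illustration:

```shell
# Create the config package and the (initially empty) log config module.
# AIRFLOW_HOME is assumed; ~/airflow is only a fallback for this sketch.
export AIRFLOW_HOME="${AIRFLOW_HOME:-$HOME/airflow}"
mkdir -p "$AIRFLOW_HOME/config"
touch "$AIRFLOW_HOME/config/__init__.py" "$AIRFLOW_HOME/config/log_config.py"
# Then paste the airflow_local_settings.py template into log_config.py.
```

The empty `__init__.py` makes the directory an importable Python package, which is what lets `airflow.cfg` later refer to `log_config.LOGGING_CONFIG`.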
Customize the following portions of the template:

```python
# wasb buckets should start with "wasb" just to help Airflow select correct handler
REMOTE_BASE_LOG_FOLDER = 'wasb-<whatever you want here>'

# Rename DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG
LOGGING_CONFIG = ...
```
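One way to carry out the rename is to deep-copy the template dict under the new name. The sketch below uses a small stand-in dict in place of the real template (which you would already have pasted into `log_config.py`), so the keys shown are illustrative, not the template's full contents:

```python
import copy

# Stand-in for the dict copied from airflow_local_settings.py;
# the real template defines many more handlers, formatters and loggers.
DEFAULT_LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {},
}

# wasb remote log folders should start with "wasb" so Airflow
# selects the Wasb handler; the suffix here is a placeholder.
REMOTE_BASE_LOG_FOLDER = 'wasb-my-logs'

# airflow.cfg will reference log_config.LOGGING_CONFIG, hence the rename.
LOGGING_CONFIG = copy.deepcopy(DEFAULT_LOGGING_CONFIG)
```

Deep-copying rather than aliasing keeps later edits to `LOGGING_CONFIG` from silently mutating the original template dict.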
Make sure an Azure Blob Storage (Wasb) connection hook has been defined in Airflow. The hook should have read and write access to the Azure Blob Storage bucket defined above in `REMOTE_BASE_LOG_FOLDER`.
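One way (among others, such as the Airflow UI) to define such a connection is an `AIRFLOW_CONN_*` environment variable. The connection name `wasb_logs` and the credential placeholders below are illustrative assumptions; check the exact URI fields the Wasb hook expects against your Airflow version:

```shell
# Hypothetical connection named "wasb_logs"; the storage account
# name and key are placeholders, not real credentials.
export AIRFLOW_CONN_WASB_LOGS='wasb://<storage_account_name>:<storage_account_key>@'
```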
Update `$AIRFLOW_HOME/airflow.cfg` to contain:

```ini
[logging]
remote_logging = True
logging_config_class = log_config.LOGGING_CONFIG
remote_log_conn_id = <name of the Azure Blob Storage connection>
```
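As a quick sanity check that the fragment above parses the way a configparser-based reader (such as Airflow's) would see it, the sketch below reads an inline stand-in for the `[logging]` section; the connection id `my_wasb_conn` is a placeholder:

```python
import configparser

# Inline stand-in for the [logging] section of airflow.cfg.
CFG = """
[logging]
remote_logging = True
logging_config_class = log_config.LOGGING_CONFIG
remote_log_conn_id = my_wasb_conn
"""

parser = configparser.ConfigParser()
parser.read_string(CFG)

# getboolean accepts True/true/1/yes, matching Airflow's lenient parsing.
assert parser.getboolean("logging", "remote_logging")
print(parser.get("logging", "logging_config_class"))  # prints log_config.LOGGING_CONFIG
```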
Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
Verify that logs are showing up for newly executed tasks in the bucket you have defined.