Writing logs to Amazon S3

Remote logging to Amazon S3 uses an existing Airflow connection to read or write logs. If you don't have a connection properly set up, this process will fail.
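
For example, here is a minimal sketch of registering such a connection programmatically, assuming the connection id MyS3Conn used in the configuration below. The conn_type and credential values are placeholders; you can equally create the connection through the Airflow UI or CLI:

import json

from airflow.models import Connection
from airflow.settings import Session

# Placeholder credentials; substitute your real AWS keys.
# conn_type "aws" assumes the Amazon provider's connection type.
conn = Connection(
    conn_id="MyS3Conn",
    conn_type="aws",
    login="YOUR_AWS_ACCESS_KEY_ID",
    password="YOUR_AWS_SECRET_ACCESS_KEY",
    extra=json.dumps({"region_name": "us-east-1"}),
)

# Persist the connection in the Airflow metadata database.
session = Session()
session.add(conn)
session.commit()
session.close()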

Enabling remote logging

To enable this feature, airflow.cfg must be configured as follows:

[logging]
# Airflow can store logs remotely in AWS S3. Users must supply a remote
# location URL (starting with 's3://') and an Airflow connection
# id that provides access to the storage location.
remote_logging = True
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
# Set to True to use server-side encryption for logs stored in S3
encrypt_s3_logs = False

In the above example, Airflow will try to use S3Hook('MyS3Conn').
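
As a quick sanity check, you can exercise the same hook yourself. A minimal sketch, assuming the bucket name from the example above (on older Airflow versions the import path is airflow.hooks.S3_hook instead):

from airflow.providers.amazon.aws.hooks.s3 import S3Hook

hook = S3Hook(aws_conn_id="MyS3Conn")

# Returns True when the connection's credentials can see the bucket.
print(hook.check_for_bucket("my-bucket"))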

You can also use LocalStack to emulate Amazon S3 locally. To configure it, you must additionally set the endpoint URL to point to your LocalStack instance. You can do this via the Connection Extra host field, for example: {"host": "http://localstack:4572"}
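
A sketch of such a connection, following the registration example above (LocalStack accepts arbitrary credentials, so the dummy values are placeholders):

import json

from airflow.models import Connection

localstack_conn = Connection(
    conn_id="MyS3Conn",
    conn_type="aws",
    login="test",       # LocalStack accepts any credentials
    password="test",
    extra=json.dumps({"host": "http://localstack:4572"}),
)

# Add it to the metadata database with a Session, as in the earlier sketch.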
