Writing logs to Google Stackdriver
Airflow can be configured to read and write task logs in Google Stackdriver Logging.
To enable this feature, airflow.cfg must be configured as in this example:

    [logging]
    # Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
    # Users must supply an Airflow connection id that provides access to the storage
    # location. If remote_logging is set to true, see UPDATING.md for additional
    # configuration requirements.
    remote_logging = True
    remote_base_log_folder = stackdriver://logs-name
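The same two options can also be supplied without editing airflow.cfg, using Airflow's standard environment-variable override convention (AIRFLOW__{SECTION}__{KEY}). A minimal sketch, reusing the log name logs-name from the example above:

```shell
# Override the [logging] section options via environment variables.
# Airflow maps AIRFLOW__LOGGING__<KEY> onto "key" in the [logging] section.
export AIRFLOW__LOGGING__REMOTE_LOGGING=True
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=stackdriver://logs-name
```

These variables must be set in the environment of every Airflow component (scheduler, workers, webserver) that should emit or read remote logs.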
All configuration options are in the [logging] section shown above.
The remote_logging field must always be set to True for this feature to work. Turning this option off will result in data not being sent to Stackdriver.
The remote_base_log_folder option contains the URL that specifies the type of handler to be used. For integration with Stackdriver, this option should start with stackdriver://. The path section of the URL specifies the name of the log, e.g. stackdriver://logs-name writes logs under the name logs-name.
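To illustrate how such a URL splits into a handler scheme and a log name, here is a minimal sketch using Python's standard urllib.parse module (this is only an illustration of the URL structure, not the parsing code Airflow itself uses):

```python
from urllib.parse import urlparse

# Parse the remote_base_log_folder value from the example above.
url = urlparse("stackdriver://logs-name")

# The scheme selects the handler type; the remainder is the log name.
print(url.scheme)  # "stackdriver"
print(url.netloc)  # "logs-name"
```

Changing only the part after stackdriver:// thus changes the log name under which task logs are written, while the scheme keeps the Stackdriver handler selected.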