airflow.providers.google.cloud.log.gcs_task_handler

Module Contents

class airflow.providers.google.cloud.log.gcs_task_handler.GCSTaskHandler(*, base_log_folder: str, gcs_log_folder: str, filename_template: str, gcp_key_path: Optional[str] = None, gcp_keyfile_dict: Optional[dict] = None, gcp_scopes: Optional[Collection[str]] = _DEFAULT_SCOPES, project_id: Optional[str] = None)[source]

Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin

GCSTaskHandler is a Python log handler that handles and reads task instance logs. It extends the Airflow FileTaskHandler and uploads logs to, and reads them from, GCS remote storage. If the remote log cannot be read, it falls back to the log on the host machine’s local disk. A construction sketch follows the parameter list below.

Parameters
  • base_log_folder (str) – Base log folder to place logs.

  • gcs_log_folder (str) – Path to a remote location where logs will be saved. It must have the prefix gs://. For example: gs://bucket/remote/log/location

  • filename_template (str) – template filename string

  • gcp_key_path (str) – Path to Google Cloud Service Account file (JSON). Mutually exclusive with gcp_keyfile_dict. If omitted, authorization based on the Application Default Credentials will be used.

  • gcp_keyfile_dict (dict) – Dictionary of keyfile parameters. Mutually exclusive with gcp_key_path.

  • gcp_scopes (Collection[str]) – OAuth2 scopes to request for the credentials

  • project_id (str) – Project ID for the GCS client. If not passed, the project ID from the credentials will be used.
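
A minimal construction sketch. In a deployment, Airflow normally builds this handler from the logging configuration rather than by direct instantiation; every value below is illustrative, not a default of this module.

```python
from airflow.providers.google.cloud.log.gcs_task_handler import GCSTaskHandler

handler = GCSTaskHandler(
    base_log_folder="/tmp/airflow/logs",           # local staging folder (example)
    gcs_log_folder="gs://my-bucket/airflow/logs",  # must carry the gs:// prefix
    filename_template="{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log",
    # gcp_key_path="/path/to/service-account.json",  # or omit to use Application
    #                                                # Default Credentials
)
```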

client(self)[source]

Returns GCS Client.
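
A hedged usage sketch: per the parameters above, the client is assumed to be built from gcp_key_path, gcp_keyfile_dict, or Application Default Credentials; the bucket name is illustrative.

```python
# client() returns a google.cloud.storage.Client authenticated with the
# handler's credentials; here we only probe an example bucket with it.
storage_client = handler.client()
print(storage_client.bucket("my-bucket").exists())
```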

set_context(self, ti)[source]

Set the handler context from the given task instance, deriving the local and remote log locations.

close(self)[source]

Close and upload local log file to remote storage GCS.
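
A sketch of the write-side lifecycle as Airflow is assumed to drive it: set_context(ti) binds the handler to a task instance, records stream to the local file, and close() uploads that file to GCS. Here ti stands for a TaskInstance obtained elsewhere.

```python
import logging

logger = logging.getLogger("airflow.task")
handler.set_context(ti)       # derive local and remote log paths from ti
logger.addHandler(handler)
logger.info("task started")   # written to the local log file first
handler.close()               # flush, then upload the file to gcs_log_folder
```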

gcs_write(self, log, remote_log_location)[source]

Writes the log to the remote_log_location. Fails silently if no log was created.

Parameters
  • log (str) – the log to write to the remote_log_location

  • remote_log_location (str (path)) – the log’s location in remote storage
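
An illustrative call, assuming the handler from the sketches above; the object URI is an example value.

```python
# Writes the given string at the remote location; per the docstring, a
# failed upload is not raised to the caller ("fails silently").
handler.gcs_write(
    log="manual log line\n",
    remote_log_location="gs://my-bucket/airflow/logs/example_task.log",
)
```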
