- class airflow.providers.google.cloud.log.gcs_task_handler.GCSTaskHandler(*, base_log_folder, gcs_log_folder, filename_template=None, gcp_key_path=None, gcp_keyfile_dict=None, gcp_scopes=_DEFAULT_SCOPES, project_id=None)¶
GCSTaskHandler is a Python log handler that handles and reads task instance logs. It extends Airflow's FileTaskHandler and uploads to and reads from GCS remote storage. Upon log reading failure, it falls back to reading from the host machine's local disk.
base_log_folder (str) – Base log folder to place logs.
gcs_log_folder (str) – Path to a remote location where logs will be saved. It must have the prefix gs://. For example: gs://bucket/remote/log/location
filename_template (str | None) – template string for the log file name
gcp_key_path (str | None) – Path to Google Cloud Service Account file (JSON). Mutually exclusive with gcp_keyfile_dict. If omitted, authorization based on the Application Default Credentials will be used.
gcp_keyfile_dict (dict | None) – Dictionary of keyfile parameters. Mutually exclusive with gcp_key_path.
gcp_scopes (Collection[str] | None) – OAuth2 scopes that the credentials should have access to
project_id (str | None) – Project ID to read the secrets from. If not passed, the project ID from credentials will be used.
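In practice this handler is usually enabled through Airflow's remote logging settings rather than instantiated by hand. A minimal sketch of the relevant airflow.cfg section, with a placeholder bucket and connection id:

```
[logging]
; hypothetical values -- substitute your own bucket and connection
remote_logging = True
remote_base_log_folder = gs://my-bucket/airflow/logs
remote_log_conn_id = google_cloud_default
```

The same options can be set via the corresponding AIRFLOW__LOGGING__* environment variables.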
- client¶
Returns GCS Client.
- set_context(ti)¶
Provide task_instance context to airflow task handler.
Generally speaking, this returns None. But if the attribute maintain_propagate has been set to propagate, it returns the sentinel MAINTAIN_PROPAGATE instead. This overrides the default behavior of setting propagate to False whenever set_context is called. At the time of writing, this functionality is used only in unit testing.
ti – task instance object
- close()¶
Close and upload the local log file to GCS remote storage.
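The lifecycle described above (write locally while the task runs, upload on close) can be sketched with a stdlib-only toy analogue. Everything here is illustrative, not the provider's real internals: a dict stands in for GCS, and ToyRemoteTaskHandler is a hypothetical name.

```python
import logging
import os
import tempfile

class ToyRemoteTaskHandler(logging.FileHandler):
    """Toy analogue of GCSTaskHandler: buffer logs in a local file,
    then 'upload' them to a stand-in remote store on close()."""

    def __init__(self, local_path, remote_store):
        super().__init__(local_path)
        self.local_path = local_path
        self.remote_store = remote_store  # dict standing in for GCS

    def close(self):
        super().close()  # flush and close the local file first
        with open(self.local_path) as f:
            # copy the completed local log to "remote storage"
            self.remote_store[self.local_path] = f.read()

remote = {}
path = os.path.join(tempfile.mkdtemp(), "task.log")
handler = ToyRemoteTaskHandler(path, remote)
log = logging.getLogger("toy_task")
log.addHandler(handler)
log.warning("task finished")  # written to the local file
handler.close()               # now visible in the remote store
```

The real handler additionally falls back to the local file when the remote read fails, as noted in the class description.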
- gcs_write(log, remote_log_location)¶
Writes the log to the remote_log_location. Fails silently if no log was created.
log – the log to write to the remote_log_location
remote_log_location – the log’s location in remote storage
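The "fails silently" contract can be sketched with a toy stand-in. toy_gcs_write and the dict-backed store are hypothetical illustrations of the behavior described above, not the provider's actual implementation:

```python
def toy_gcs_write(store, log, remote_log_location):
    """Write `log` to `remote_log_location` in `store`; fail silently
    (return False) rather than raise if no log was created."""
    if not log:
        return False  # nothing to write: fail silently
    try:
        store[remote_log_location] = log
        return True
    except Exception:
        return False  # swallow storage errors instead of raising

store = {}
ok = toy_gcs_write(store, "step 1 done\n", "gs://bucket/task.log")
skipped = toy_gcs_write(store, "", "gs://bucket/other.log")
```

Here ok is True and the log text lands in the store, while the empty write is skipped without raising.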