airflow.providers.google.cloud.log.gcs_task_handler¶
Module Contents¶
Classes¶
GCSTaskHandler – GCSTaskHandler is a Python log handler that handles and reads task instance logs.
- class airflow.providers.google.cloud.log.gcs_task_handler.GCSTaskHandler(*, base_log_folder, gcs_log_folder, filename_template=None, gcp_key_path=None, gcp_keyfile_dict=None, gcp_scopes=_DEFAULT_SCOPESS, project_id=None)[source]¶
Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin
GCSTaskHandler is a Python log handler that handles and reads task instance logs. It extends the Airflow FileTaskHandler and uploads to and reads from GCS remote storage. If reading the remote log fails, it falls back to the host machine’s local disk. A configuration sketch follows the parameter list below.
- Parameters
base_log_folder (str) – Base log folder to place logs.
gcs_log_folder (str) – Path to a remote location where logs will be saved. It must have the prefix gs://. For example: gs://bucket/remote/log/location
filename_template (str | None) – template filename string
gcp_key_path (str | None) – Path to Google Cloud Service Account file (JSON). Mutually exclusive with gcp_keyfile_dict. If omitted, authorization based on the Application Default Credentials will be used.
gcp_keyfile_dict (dict | None) – Dictionary of keyfile parameters. Mutually exclusive with gcp_key_path.
gcp_scopes (Collection[str] | None) – Collection of OAuth2 scopes to use for the credentials
project_id (str | None) – Project ID to read the secrets from. If not passed, the project ID from credentials will be used.
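In a real deployment Airflow builds this handler from its remote-logging configuration rather than from user code, but a minimal sketch of direct instantiation (with a hypothetical bucket and service-account path) looks like this:

```python
# Minimal sketch, assuming a hypothetical bucket and key file; normally Airflow
# constructs this handler itself from the remote-logging settings.
from airflow.providers.google.cloud.log.gcs_task_handler import GCSTaskHandler

handler = GCSTaskHandler(
    base_log_folder="/opt/airflow/logs",           # local folder used for writing and as the read fallback
    gcs_log_folder="gs://my-bucket/airflow/logs",  # must carry the gs:// prefix
    gcp_key_path="/secrets/sa.json",               # optional; omit to use Application Default Credentials
)
```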
- set_context(ti)[source]¶
Provide task_instance context to airflow task handler.
Generally speaking, this returns None. But if the attribute maintain_propagate has been set, the sentinel MAINTAIN_PROPAGATE is returned instead. This has the effect of overriding the default behavior of setting propagate to False whenever set_context is called. At the time of writing, this functionality is only used in unit testing.
- Parameters
ti – task instance object
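A hedged sketch of how the handler is typically exercised: the task-instance context is attached before any log records are emitted, and GCSTaskHandler uploads the local log file to the configured gcs_log_folder when it is closed. The ti object here is assumed to be a real TaskInstance supplied by the worker, and handler is the instance from the sketch above.

```python
import logging

logger = logging.getLogger("airflow.task")

handler.set_context(ti)   # ti: the TaskInstance whose logs this handler should manage
logger.addHandler(handler)

logger.info("This line is written to the local log file first.")

handler.close()           # on close, the handler uploads the local log file to the gcs_log_folder
```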