airflow.providers.amazon.aws.log.s3_task_handler

Module Contents

class airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler(base_log_folder: str, s3_log_folder: str, filename_template: str)[source]

Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin

S3TaskHandler is a Python log handler that handles and reads task instance logs. It extends the Airflow FileTaskHandler and uploads to and reads from S3 remote storage.
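A minimal construction sketch, assuming a local staging folder, an example S3 bucket, and an illustrative filename template (none of these values come from this page):

    from airflow.providers.amazon.aws.log.s3_task_handler import S3TaskHandler

    handler = S3TaskHandler(
        base_log_folder="/opt/airflow/logs",             # local folder where logs are staged (example path)
        s3_log_folder="s3://my-airflow-logs/task-logs",  # remote base location (example bucket)
        filename_template="{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log",  # illustrative template
    )

In a normal deployment the handler is created by Airflow's logging configuration rather than by hand; the direct call above only illustrates the constructor arguments.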

hook(self)[source]

Returns an S3Hook.
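A hedged usage sketch: the returned S3Hook is an ordinary hook and can be used directly, for example to check that the log bucket is reachable. In the provider implementation the hook is exposed as a cached property, so it is accessed as an attribute here; the bucket name is an assumption.

    s3_hook = handler.hook                                    # S3Hook built from the remote log connection
    reachable = s3_hook.check_for_bucket("my-airflow-logs")   # assumed bucket name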

set_context(self, ti)[source]
close(self)[source]

Close the handler and upload the local log file to S3 remote storage.
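A sketch of the handler lifecycle as driven by Airflow's logging machinery, assuming ti is a TaskInstance obtained elsewhere:

    handler.set_context(ti)   # resolve the per-task log path for this task instance
    # ... the task runs and emits log records through the handler ...
    handler.close()           # flush the local log file and upload it to S3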

s3_log_exists(self, remote_log_location: str)[source]

Check if remote_log_location exists in remote storage.

Parameters

remote_log_location (str) -- log's location in remote storage

Returns

True if the location exists, else False
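A hedged example; the remote location below is an assumed key under the handler's s3_log_folder:

    remote_loc = "s3://my-airflow-logs/task-logs/example_dag/example_task/2024-01-01T00:00:00+00:00/1.log"
    if handler.s3_log_exists(remote_loc):
        print("remote log already uploaded")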

s3_read(self, remote_log_location: str, return_error: bool = False)[source]

Returns the log found at the remote_log_location. Returns '' if no logs are found or there is an error.

Parameters
  • remote_log_location (str (path)) -- the log's location in remote storage

  • return_error (bool) -- if True, return a string error message when an error occurs; otherwise return '' on error.

Returns

the log found at the remote_log_location
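A hedged example that reads the remote log and asks for an error message instead of an empty string on failure; remote_loc is the assumed location from the example above:

    log_text = handler.s3_read(remote_loc, return_error=True)
    if log_text:
        print(log_text)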

s3_write(self, log: str, remote_log_location: str, append: bool = True)[source]

Writes the log to the remote_log_location. Fails silently if no hook was created.

Parameters
  • log (str) -- the log to write to the remote_log_location

  • remote_log_location (str (path)) -- the log's location in remote storage

  • append (bool) -- if False, any existing log file is overwritten. If True, the new log is appended to any existing logs.
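A hedged example that appends one line to the remote log; remote_loc is the assumed location from the examples above. With append=True any existing log at that location is preserved and the new text is added after it; with append=False it would be overwritten:

    handler.s3_write("task finished with state=success\n", remote_loc, append=True)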
