airflow.providers.amazon.aws.log.s3_task_handler

Module Contents

Classes

S3TaskHandler

S3TaskHandler is a Python log handler that handles and reads task instance logs.

class airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler(base_log_folder, s3_log_folder, filename_template)[source]

Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin

S3TaskHandler is a Python log handler that handles and reads task instance logs. It extends the Airflow FileTaskHandler and uploads to and reads from S3 remote storage.
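In normal use Airflow wires this handler up through its logging configuration when remote logging is enabled; the sketch below constructs it directly only to illustrate the constructor arguments. The local path, bucket name, and filename template are hypothetical placeholders, not required values.

    from airflow.providers.amazon.aws.log.s3_task_handler import S3TaskHandler

    # All three values are illustrative placeholders: a local directory where
    # logs are written before upload, the remote S3 prefix, and a template
    # used to derive per-task-instance log file names.
    handler = S3TaskHandler(
        base_log_folder="/opt/airflow/logs",
        s3_log_folder="s3://my-airflow-bucket/logs",
        filename_template="{dag_id}/{task_id}/{execution_date}/{try_number}.log",
    )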

hook(self)[source]

Returns an S3Hook.

set_context(self, ti)[source]

Provide task_instance context to the Airflow task handler.

Parameters

ti – task instance object

close(self)[source]

Close the handler and upload the local log file to S3 remote storage.

s3_log_exists(self, remote_log_location)[source]

Check if remote_log_location exists in remote storage.

Parameters

remote_log_location (str) – log’s location in remote storage

Returns

True if the location exists, else False

Return type

bool
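
A minimal sketch of checking for a remote log before touching it, assuming the handler instance from the earlier sketch and a configured AWS connection; the location below is a hypothetical key under the handler's S3 prefix.

    # Hypothetical remote location under the configured s3_log_folder.
    remote_loc = "s3://my-airflow-bucket/logs/example_dag/example_task/2021-01-01T00:00:00/1.log"

    if handler.s3_log_exists(remote_loc):
        print("log already uploaded to S3")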

s3_read(self, remote_log_location, return_error=False)[source]

Returns the log found at the remote_log_location. Returns '' if no logs are found or an error occurs.

Parameters
  • remote_log_location (str) – the log’s location in remote storage

  • return_error (bool) – if True, returns a string error message when an error occurs; otherwise returns '' on error.

Returns

the log found at the remote_log_location

Return type

str
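
Continuing the same sketch, passing return_error=True makes a failed read surface a readable error-message string instead of the empty string:

    # With return_error=True a failure yields an error-message string
    # rather than '', so the caller can display it.
    log_text = handler.s3_read(remote_loc, return_error=True)
    print(log_text)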

s3_write(self, log, remote_log_location, append=True, max_retry=1)[source]

Writes the log to the remote_log_location. Fails silently if no hook was created.

Parameters
  • log (str) – the log to write to the remote_log_location

  • remote_log_location (str) – the log’s location in remote storage

  • append (bool) – if False, any existing log file is overwritten. If True, the new log is appended to any existing logs.

  • max_retry (int) – maximum number of times to retry on upload failure
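
Continuing the same sketch, a hedged example of appending one line to the remote log; with append=True any existing log at the location is preserved, and max_retry=1 allows a single retry after a failed upload:

    # Append to the existing remote log; the upload is attempted at most
    # twice (the initial attempt plus max_retry=1 retry).
    handler.s3_write(
        log="task finished\n",
        remote_log_location=remote_loc,
        append=True,
        max_retry=1,
    )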
