airflow.providers.amazon.aws.log.s3_task_handler
Module Contents

Classes
S3TaskHandler: a Python log handler that handles and reads task instance logs.
- class airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler(base_log_folder, s3_log_folder, filename_template=None)
Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin
S3TaskHandler is a Python log handler that handles and reads task instance logs. It extends the Airflow FileTaskHandler and uploads logs to, and reads them from, S3 remote storage.
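A minimal construction sketch, assuming the apache-airflow-providers-amazon package is installed and AWS credentials are reachable through the default connection. The folder and bucket names are hypothetical; in production the handler is normally wired up through Airflow's remote logging configuration rather than instantiated by hand:

```python
# Sketch: building the handler directly with the signature shown above.
# Paths and bucket are placeholders, not values from this documentation.
from airflow.providers.amazon.aws.log.s3_task_handler import S3TaskHandler

handler = S3TaskHandler(
    base_log_folder="/tmp/airflow/logs",           # local staging directory
    s3_log_folder="s3://my-airflow-logs/dag-logs", # remote S3 prefix
)
```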
- set_context(self, ti)
Provide task_instance context to the Airflow task handler.
- Parameters
ti (TaskInstance) – task instance object
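Airflow's logging machinery calls set_context itself when a task starts; the sketch below only illustrates the call order, with ti standing in for a TaskInstance supplied by the executor (not constructed here):

```python
import logging

# Illustrative sketch: `handler` is the S3TaskHandler built earlier and
# `ti` is a TaskInstance that Airflow provides at runtime.
def attach_task_logging(handler, ti):
    handler.set_context(ti)       # derive this task's log path from ti
    logger = logging.getLogger("airflow.task")
    logger.addHandler(handler)
    logger.info("task log line")  # written to the local staging file
    handler.close()               # close() uploads the staged log to S3
```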
- s3_log_exists(self, remote_log_location)
Check if remote_log_location exists in remote storage.
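For example, assuming handler from the earlier sketch and a made-up log location:

```python
# Probe for the remote log before reading, so a missing key is not
# treated as a read error. The location is a hypothetical example.
location = "s3://my-airflow-logs/dag-logs/my_dag/my_task/2024-01-01/1.log"
if handler.s3_log_exists(location):
    print("remote log found")
```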
- s3_read(self, remote_log_location, return_error=False)
Returns the log found at the remote_log_location. Returns ‘’ if no logs are found or there is an error.
- Parameters
remote_log_location (str) – the log’s location in remote storage
return_error (bool) – if True, returns a string error message when an error occurs; otherwise returns ‘’ on error
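A usage sketch with a hypothetical location; with return_error=True a readable error message comes back instead of the empty string:

```python
# Read the remote log; on failure this returns the error text because
# return_error=True was passed.
log_text = handler.s3_read(
    "s3://my-airflow-logs/dag-logs/my_dag/my_task/2024-01-01/1.log",
    return_error=True,
)
print(log_text or "no log found")
```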
- s3_write(self, log, remote_log_location, append=True, max_retry=1)
Writes the log to the remote_log_location. Fails silently if no hook was created.
- Parameters
log (str) – the log to write to the remote_log_location
remote_log_location (str) – the log’s location in remote storage
append (bool) – if False, any existing log file is overwritten. If True, the new log is appended to any existing logs.
max_retry (int) – Maximum number of times to retry on upload failure
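A usage sketch with a hypothetical location, appending rather than overwriting and allowing one retry:

```python
# Append one line to the remote log. If the S3 hook could not be created,
# the call fails silently, per the description above.
handler.s3_write(
    log="task finished\n",
    remote_log_location="s3://my-airflow-logs/dag-logs/my_dag/my_task/2024-01-01/1.log",
    append=True,   # merge with any existing remote log
    max_retry=1,   # retry the upload once if it fails
)
```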