Load the delete_local_logs configuration value if the Airflow version is 2.6 or later; return False otherwise. TODO: delete this function when the minimum Airflow version is >= 2.6.
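A minimal sketch of this version gate, using a stubbed config lookup in place of Airflow's `conf.getboolean`; the helper name and the inclusive 2.6 cutoff are assumptions drawn from the docstring above, not the provider's actual code:

```python
def get_delete_local_copy(airflow_version: str, getboolean) -> bool:
    """Sketch of the gate described above: read delete_local_logs only on
    Airflow 2.6+, where the option exists; return False on older versions.

    ``getboolean`` stands in for Airflow's ``conf.getboolean``.
    """
    major, minor = (int(part) for part in airflow_version.split(".")[:2])
    if (major, minor) >= (2, 6):
        return getboolean("logging", "delete_local_logs")
    return False


# Stubbed lookup pretending the option is set to True in airflow.cfg:
lookup = lambda section, key: True
print(get_delete_local_copy("2.7.1", lookup))  # True
print(get_delete_local_copy("2.5.3", lookup))  # False
```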
- class airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler(base_log_folder, s3_log_folder, filename_template=None, **kwargs)
S3TaskHandler is a Python log handler that handles and reads task instance logs. It extends Airflow's FileTaskHandler and uploads to and reads from S3 remote storage.
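In practice this handler is enabled through Airflow's remote logging settings rather than instantiated directly. A hedged airflow.cfg sketch, where the bucket path and connection id are placeholders:

```ini
[logging]
remote_logging = True
remote_base_log_folder = s3://my-bucket/airflow/logs
remote_log_conn_id = aws_default
```

With these set, Airflow routes task logs through this handler, writing to the configured S3 prefix and reading them back for the UI.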
Provide task_instance context to the Airflow task handler.
Generally returns None. But if the attribute maintain_propagate has been set to propagate, it returns the sentinel MAINTAIN_PROPAGATE instead. This overrides the default behavior of setting propagate to False whenever set_context is called. At the time of writing, this functionality is only used in unit testing.
ti – task instance object
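The sentinel behavior described above can be sketched with a minimal stand-in class; the names mirror the docstring, but this is a simplified illustration, not Airflow's actual implementation:

```python
MAINTAIN_PROPAGATE = object()  # illustrative sentinel, returned instead of None


class SketchHandler:
    maintain_propagate = False  # set to True to keep logger.propagate as-is

    def set_context(self, ti):
        self.task_instance = ti  # attach task-instance context (simplified)
        return MAINTAIN_PROPAGATE if self.maintain_propagate else None


# The caller sets propagate to False unless the sentinel was returned:
handler = SketchHandler()
result = handler.set_context(ti="fake-task-instance")
print(result is MAINTAIN_PROPAGATE)  # False by default
```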
Check if remote_log_location exists in remote storage.
- s3_read(remote_log_location, return_error=False)
Returns the log found at the remote_log_location. Returns ‘’ if no logs are found or there is an error.
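A simplified sketch of the error-handling contract described for s3_read, with a stub in place of the real S3 client call; `fetch` is an assumed placeholder, not part of the provider's API:

```python
def s3_read(fetch, remote_log_location, return_error=False):
    """Return the log at remote_log_location, or '' if the read fails.

    When return_error=True, return the error message instead of ''.
    ``fetch`` stands in for the S3 hook's read call.
    """
    try:
        return fetch(remote_log_location)
    except Exception as exc:
        msg = f"Could not read logs from {remote_log_location}: {exc}"
        return msg if return_error else ""


# Stubbed reads:
ok = lambda key: "log line 1\nlog line 2"
boom = lambda key: (_ for _ in ()).throw(IOError("no such key"))
print(s3_read(ok, "s3://bucket/dag/task/1.log"))  # the log text
print(s3_read(boom, "s3://bucket/missing.log"))   # ''
```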
- s3_write(log, remote_log_location, append=True, max_retry=1)
- Write the log to the remote_log_location and return True when done. Fails silently and returns False if no log was created.
log (str) – the log to write to the remote_log_location
remote_log_location (str) – the log’s location in remote storage
append (bool) – if False, any existing log file is overwritten. If True, the new log is appended to any existing logs.
max_retry (int) – Maximum number of times to retry on upload failure
Whether the log was successfully written to the remote location.
- Return type
bool
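The write semantics above (append, bounded retries, silent failure) can be sketched against an in-memory dict standing in for S3; the names and structure here are illustrative, not the provider's internals:

```python
def s3_write(store, log, remote_log_location, append=True, max_retry=1):
    """Write ``log`` to ``store`` (a dict standing in for S3) and return
    True when done; fail silently and return False if nothing was written."""
    if append and remote_log_location in store:
        # New log is appended to any existing log at that location.
        log = store[remote_log_location] + "\n" + log
    for attempt in range(max_retry + 1):  # initial try plus max_retry retries
        try:
            store[remote_log_location] = log
            return True
        except Exception:
            continue  # swallow the error and retry, per the docstring
    return False


store = {}
s3_write(store, "first run", "s3://bucket/task/1.log")
s3_write(store, "second run", "s3://bucket/task/1.log")
print(store["s3://bucket/task/1.log"])  # first run\nsecond run
```

With append=False the existing object is simply overwritten instead of extended.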