airflow.providers.amazon.aws.transfers.sftp_to_s3

Module Contents

Classes

SFTPToS3Operator

This operator enables the transfer of files from an SFTP server to Amazon S3.

class airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator(*, s3_bucket: str, s3_key: str, sftp_path: str, sftp_conn_id: str = 'ssh_default', s3_conn_id: str = 'aws_default', use_temp_file: bool = True, **kwargs)[source]

Bases: airflow.models.BaseOperator

This operator enables the transfer of files from an SFTP server to Amazon S3.

See also

For more information on how to use this operator, take a look at the guide: SFTPToS3Operator

Parameters
  • sftp_conn_id (str) -- The SFTP connection id used to establish a connection to the SFTP server.

  • sftp_path (str) -- The remote SFTP path of the file to download.

  • s3_conn_id (str) -- The S3 connection id used to establish a connection to Amazon S3.

  • s3_bucket (str) -- The target S3 bucket to which the file is uploaded.

  • s3_key (str) -- The target S3 key under which the file is stored.

  • use_temp_file (bool) -- If True, the file is first downloaded to a local temporary file and then uploaded to S3; if False, it is streamed directly from SFTP to S3.
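The two transfer modes selected by use_temp_file can be sketched as follows. This is an illustration with hypothetical helper callables (sftp_open and upload_fileobj are stand-ins, not the operator's actual hook methods), not the operator's real implementation:

```python
import io
import tempfile


def transfer(sftp_open, upload_fileobj, use_temp_file=True):
    """Sketch of the two transfer modes.

    sftp_open() returns a readable binary file-like object (the SFTP side);
    upload_fileobj(f) consumes one (the S3 side). Both are hypothetical.
    """
    if use_temp_file:
        # Download the whole file to local disk first, then upload it.
        with tempfile.NamedTemporaryFile() as tmp:
            tmp.write(sftp_open().read())
            tmp.seek(0)
            upload_fileobj(tmp)
    else:
        # Stream directly from the SFTP file object to S3,
        # never touching local disk.
        upload_fileobj(sftp_open())
```

The temp-file mode trades local disk usage for a retryable, fully buffered upload; the streaming mode avoids local storage, which matters for large files or workers with small disks.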

template_fields: Sequence[str] = ['s3_key', 'sftp_path'][source]
static get_s3_key(s3_key: str) str[source]

Parses the S3 key into a consistent format, regardless of whether it is passed as a full S3 URL or as a bare key.
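A sketch of the kind of normalization such a parser performs: treat the input as a URL so that a full s3:// URL and a bare key both reduce to the same relative key. This is an illustration of the behavior described above, not necessarily the library's exact implementation:

```python
from urllib.parse import urlsplit


def get_s3_key(s3_key: str) -> str:
    # For "s3://bucket/path/file" the URL path component is "/path/file";
    # for a bare "path/file" it is returned unchanged. Stripping the
    # leading slash yields the same relative key in both cases.
    return urlsplit(s3_key).path.lstrip("/")
```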

execute(self, context: airflow.utils.context.Context) None[source]

This is the main method to derive when creating an operator. The context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
