airflow.providers.amazon.aws.transfers.ftp_to_s3

Module Contents

class airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator(s3_bucket, s3_key, ftp_path, ftp_conn_id='ftp_default', aws_conn_id='aws_default', replace=False, encrypt=False, gzip=False, acl_policy=None, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

This operator transfers a file from an FTP server to S3.

Parameters
  • s3_bucket (str) -- The target S3 bucket to which the file will be uploaded.

  • s3_key (str) -- The target S3 key, i.e. the destination path of the file in S3.

  • ftp_path (str) -- The remote path on the FTP server, including the file name.

  • ftp_conn_id (str) -- The FTP connection ID used to connect to the FTP server.

  • aws_conn_id (str) -- The AWS connection ID used to connect to S3.

  • replace (bool) -- Whether to overwrite the key if it already exists. If replace is False and the key exists, an error will be raised.

  • encrypt (bool) -- If True, the file will be encrypted server-side by S3 and stored in encrypted form while at rest in S3.

  • gzip (bool) -- If True, the file will be gzip-compressed locally before upload.

  • acl_policy (str) -- String specifying the canned ACL policy for the file being uploaded to the S3 bucket.
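A typical use of the operator inside a DAG looks like the following. This is an illustrative sketch: the dag_id, start date, bucket, paths, and connection IDs are placeholder values, not values from this reference.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.ftp_to_s3 import FTPToS3Operator

with DAG(
    dag_id="ftp_to_s3_example",          # placeholder dag_id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    transfer = FTPToS3Operator(
        task_id="ftp_to_s3",
        ftp_path="/files/report.csv",    # remote path, including the file name
        s3_bucket="my-bucket",           # placeholder destination bucket
        s3_key="reports/report.csv",     # destination key within the bucket
        ftp_conn_id="ftp_default",
        aws_conn_id="aws_default",
        replace=True,                    # overwrite the key if it already exists
        gzip=False,
    )
```

Both connections must be configured in Airflow beforehand: `ftp_default` with the FTP host and credentials, and `aws_default` with AWS credentials that allow `s3:PutObject` on the target bucket.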

template_fields = ['s3_bucket', 's3_key', 'ftp_path'][source]
execute(self, context)[source]
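Conceptually, execute() retrieves the remote file into a local buffer via the FTP hook, optionally gzips it, and uploads the result via the S3 hook. The following stdlib-only sketch mirrors that flow; the `retrieve` and `upload` callables are hypothetical stand-ins for the FTP download and S3 upload steps, not the operator's actual hook calls.

```python
import gzip
import io


def transfer_ftp_to_s3(retrieve, upload, use_gzip=False):
    """Sketch of the operator's core flow.

    retrieve -- callable taking a writable buffer; stands in for the
                FTP download (e.g. ftplib's retrbinary writing into it).
    upload   -- callable taking a readable file object; stands in for
                the S3 upload of the (possibly compressed) payload.
    """
    buf = io.BytesIO()
    retrieve(buf)                 # pull the remote file into memory
    data = buf.getvalue()
    if use_gzip:
        data = gzip.compress(data)  # compress locally before upload
    upload(io.BytesIO(data))      # hand the payload to the uploader
```

For example, wiring in in-memory fakes for both sides shows that a gzipped transfer round-trips intact:

```python
captured = {}
transfer_ftp_to_s3(
    retrieve=lambda b: b.write(b"hello from ftp"),
    upload=lambda f: captured.setdefault("body", f.read()),
    use_gzip=True,
)
assert gzip.decompress(captured["body"]) == b"hello from ftp"
```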
