airflow.providers.amazon.aws.transfers.ftp_to_s3

Module Contents

Classes

FTPToS3Operator

This operator enables the transfer of files from an FTP server to S3. It can be used to transfer one or multiple files.

class airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator(*, ftp_path: str, s3_bucket: str, s3_key: str, ftp_filenames: Optional[Union[str, List[str]]] = None, s3_filenames: Optional[Union[str, List[str]]] = None, ftp_conn_id: str = 'ftp_default', aws_conn_id: str = 'aws_default', replace: bool = False, encrypt: bool = False, gzip: bool = False, acl_policy: Optional[str] = None, **kwargs)

Bases: airflow.models.BaseOperator

This operator enables the transfer of files from an FTP server to S3. It can be used to transfer one or multiple files. A usage sketch follows the parameter list below.

Parameters
  • ftp_path (str) -- The remote FTP path. For a single file, it must include the filename. For multiple files, it is the directory in which the files are located.

  • s3_bucket (str) -- The targeted S3 bucket in which to upload the file(s).

  • s3_key (str) -- The targeted S3 key. For a single file, it must include the full file path. For multiple files, it must end with "/".

  • ftp_filenames (Union[str, list]) -- Only used when moving multiple files. Either a list of exact filenames present in the FTP path, a prefix that all filenames must match, or the string '*' to move every file in the FTP path.

  • s3_filenames (Union[str, list]) -- Only used when moving multiple files and naming them differently from the originals on the FTP server. Either a list of target filenames or a file prefix (which replaces the FTP prefix).

  • ftp_conn_id (str) -- The FTP connection id: the name or identifier used to establish the connection to the FTP server.

  • aws_conn_id (str) -- The S3 connection id: the name or identifier used to establish the connection to S3.

  • replace (bool) -- Whether to overwrite the key if it already exists. If replace is False and the key exists, an error is raised.

  • encrypt (bool) -- If True, the file will be encrypted server-side by S3 and stored in encrypted form while at rest in S3.

  • gzip (bool) -- If True, the file will be compressed locally before upload.

  • acl_policy (str) -- String specifying the canned ACL policy for the file being uploaded to the S3 bucket.
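
A minimal usage sketch, assuming an Airflow 2.x DAG; the DAG id, schedule, paths, bucket name, and filenames below are illustrative assumptions, not values from this reference.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.transfers.ftp_to_s3 import FTPToS3Operator

    with DAG(
        dag_id="example_ftp_to_s3",  # illustrative DAG id
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # Single file: ftp_path includes the filename and s3_key is the full key.
        single_file = FTPToS3Operator(
            task_id="ftp_to_s3_single",
            ftp_path="/incoming/report.csv",
            s3_bucket="my-bucket",
            s3_key="data/report.csv",
            replace=True,
        )

        # Multiple files: ftp_path is the directory, s3_key ends with "/",
        # and ftp_filenames selects files by exact names, a prefix, or "*".
        all_files = FTPToS3Operator(
            task_id="ftp_to_s3_all",
            ftp_path="/incoming/",
            s3_bucket="my-bucket",
            s3_key="data/",
            ftp_filenames="*",
        )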

template_fields: Sequence[str] = ['ftp_path', 's3_bucket', 's3_key', 'ftp_filenames', 's3_filenames']
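
Because these fields are templated, Jinja expressions in them are rendered at task run time. A brief sketch; the date-stamped paths are illustrative assumptions.

    from airflow.providers.amazon.aws.transfers.ftp_to_s3 import FTPToS3Operator

    # {{ ds }} renders to the logical date (YYYY-MM-DD) when the task runs.
    daily_file = FTPToS3Operator(
        task_id="ftp_to_s3_daily",
        ftp_path="/incoming/report_{{ ds }}.csv",
        s3_bucket="my-bucket",
        s3_key="data/{{ ds }}/report.csv",
    )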
execute(self, context: airflow.utils.context.Context)

This is the main method to derive when creating an operator. Context is the same dictionary used when rendering Jinja templates; a brief sketch of deriving it follows below.

Refer to get_template_context for more context.
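
A minimal sketch of deriving execute() in a custom operator, assuming Airflow 2.x; the operator name and log message are illustrative assumptions.

    from airflow.models import BaseOperator


    class HelloOperator(BaseOperator):
        def execute(self, context):
            # `context` is the same dict used when rendering Jinja templates;
            # e.g. context["ds"] holds the logical date stamp.
            self.log.info("Running for %s", context["ds"])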
