airflow.providers.amazon.aws.transfers.local_to_s3

Module Contents

Classes

LocalFilesystemToS3Operator

Uploads a file from a local filesystem to Amazon S3.

class airflow.providers.amazon.aws.transfers.local_to_s3.LocalFilesystemToS3Operator(*, filename, dest_key, dest_bucket=None, aws_conn_id='aws_default', verify=None, replace=False, encrypt=False, gzip=False, acl_policy=None, **kwargs)[source]

Bases: airflow.models.BaseOperator

Uploads a file from a local filesystem to Amazon S3.

See also

For more information on how to use this operator, take a look at the guide: Local to Amazon S3 transfer operator

Parameters
  • filename (str) – Path to the local file. Path can be either absolute (e.g. /path/to/file.ext) or relative (e.g. ../../foo/*/*.csv). (templated)

  • dest_key (str) –

    The key of the object to copy to. (templated)

    It can be either a full s3:// style URL or a relative path from the root level.

    When it's specified as a full s3:// URL, please omit dest_bucket.

  • dest_bucket (str | None) – Name of the S3 bucket to where the object is copied. (templated)

  • aws_conn_id (str) – Connection ID of the S3 connection to use.

  • verify (str | bool | None) –

    Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified.

    You can provide the following values:

    • False: do not validate SSL certificates. SSL will still be used, but SSL certificates will not be verified.

    • path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.

  • replace (bool) – A flag to decide whether or not to overwrite the key if it already exists. If replace is False and the key exists, an error will be raised.

  • encrypt (bool) – If True, the file will be encrypted on the server-side by S3 and will be stored in an encrypted form while at rest in S3.

  • gzip (bool) – If True, the file will be compressed locally before upload.

  • acl_policy (str | None) – String specifying the canned ACL policy for the file being uploaded to the S3 bucket.

template_fields: Sequence[str] = ('filename', 'dest_key', 'dest_bucket')[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
