airflow.providers.amazon.aws.operators.s3
This module contains AWS S3 operators.
Module Contents
Classes
- S3CreateBucketOperator -- This operator creates an S3 bucket.
- S3DeleteBucketOperator -- This operator deletes an S3 bucket.
- S3GetBucketTaggingOperator -- This operator gets tagging from an S3 bucket.
- S3PutBucketTaggingOperator -- This operator puts tagging for an S3 bucket.
- S3DeleteBucketTaggingOperator -- This operator deletes tagging from an S3 bucket.
- S3CopyObjectOperator -- Creates a copy of an object that is already stored in S3.
- S3DeleteObjectsOperator -- Enables users to delete a single object or multiple objects from a bucket using a single HTTP request.
- S3FileTransformOperator -- Copies data from a source S3 location to a temporary location on the local filesystem.
- S3ListOperator -- List all objects from the bucket with the given string prefix in name.
- S3ListPrefixesOperator -- List all subfolders from the bucket with the given string prefix in name.
Attributes
- airflow.providers.amazon.aws.operators.s3.BUCKET_DOES_NOT_EXIST_MSG = "Bucket with name: %s doesn't exist"
- class airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator(*, bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', region_name: Optional[str] = None, **kwargs)
Bases: airflow.models.BaseOperator
This operator creates an S3 bucket.
See also
For more information on how to use this operator, take a look at the guide: Create and Delete Amazon S3 Buckets
- Parameters
bucket_name (str) -- The name of the bucket to create.
aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).
region_name (Optional[str]) -- AWS region name. If not specified, it is fetched from the connection.
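Example: a minimal usage sketch (the task id, bucket name, and region are illustrative; the operator is assumed to be instantiated inside a DAG definition):

    from airflow.providers.amazon.aws.operators.s3 import S3CreateBucketOperator

    # Creates the bucket using the default AWS connection ('aws_default')
    create_bucket = S3CreateBucketOperator(
        task_id='create_bucket',
        bucket_name='my-example-bucket',
        region_name='us-east-1',
    )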
- class airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator(bucket_name: str, force_delete: bool = False, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: airflow.models.BaseOperator
This operator deletes an S3 bucket.
See also
For more information on how to use this operator, take a look at the guide: Create and Delete Amazon S3 Buckets
- Parameters
bucket_name (str) -- The name of the bucket to delete.
force_delete (bool) -- Forcibly delete all objects in the bucket before deleting the bucket
aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).
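Example: a minimal usage sketch (the task id and bucket name are illustrative):

    from airflow.providers.amazon.aws.operators.s3 import S3DeleteBucketOperator

    # force_delete=True removes any remaining objects before deleting the bucket
    delete_bucket = S3DeleteBucketOperator(
        task_id='delete_bucket',
        bucket_name='my-example-bucket',
        force_delete=True,
    )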
- class airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: airflow.models.BaseOperator
This operator gets tagging from an S3 bucket.
See also
For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging
- Parameters
bucket_name (str) -- The name of the bucket to get tags from.
aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).
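Example: a minimal usage sketch (the task id and bucket name are illustrative):

    from airflow.providers.amazon.aws.operators.s3 import S3GetBucketTaggingOperator

    # Reads the bucket's current tag set using the default AWS connection
    get_bucket_tags = S3GetBucketTaggingOperator(
        task_id='get_bucket_tags',
        bucket_name='my-example-bucket',
    )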
- class airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator(bucket_name: str, key: Optional[str] = None, value: Optional[str] = None, tag_set: Optional[List[Dict[str, str]]] = None, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: airflow.models.BaseOperator
This operator puts tagging for an S3 bucket.
See also
For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging
- Parameters
bucket_name (str) -- The name of the bucket to add tags to.
key (str) -- The key portion of the key/value pair for a tag to be added. If a key is provided, a value must be provided as well.
value (str) -- The value portion of the key/value pair for a tag to be added. If a value is provided, a key must be provided as well.
tag_set (List[Dict[str, str]]) -- A list of key/value pairs.
aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then the default boto3 configuration would be used (and must be maintained on each worker node).
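Example: a minimal usage sketch (the task id, bucket name, and tag are illustrative). A single tag can be passed with key/value; multiple tags can be passed with tag_set instead:

    from airflow.providers.amazon.aws.operators.s3 import S3PutBucketTaggingOperator

    # Adds a single tag to the bucket
    put_bucket_tags = S3PutBucketTaggingOperator(
        task_id='put_bucket_tags',
        bucket_name='my-example-bucket',
        key='team',
        value='data-platform',
    )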
- class airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)
Bases: airflow.models.BaseOperator
This operator deletes tagging from an S3 bucket.
See also
For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging
- Parameters
bucket_name (str) -- The name of the bucket to delete tags from.
aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).
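Example: a minimal usage sketch (the task id and bucket name are illustrative):

    from airflow.providers.amazon.aws.operators.s3 import S3DeleteBucketTaggingOperator

    # Removes all tags from the bucket
    delete_bucket_tags = S3DeleteBucketTaggingOperator(
        task_id='delete_bucket_tags',
        bucket_name='my-example-bucket',
    )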
- class airflow.providers.amazon.aws.operators.s3.S3CopyObjectOperator(*, source_bucket_key: str, dest_bucket_key: str, source_bucket_name: Optional[str] = None, dest_bucket_name: Optional[str] = None, source_version_id: Optional[str] = None, aws_conn_id: str = 'aws_default', verify: Optional[Union[str, bool]] = None, acl_policy: Optional[str] = None, **kwargs)
Bases: airflow.models.BaseOperator
Creates a copy of an object that is already stored in S3.
Note: the S3 connection used here needs to have access to both source and destination bucket/key.
- Parameters
source_bucket_key (str) --
The key of the source object. (templated)
It can be either a full s3:// style url or a relative path from the root level.
When it's specified as a full s3:// url, please omit source_bucket_name.
dest_bucket_key (str) --
The key of the object to copy to. (templated)
The convention to specify dest_bucket_key is the same as source_bucket_key.
source_bucket_name (str) --
Name of the S3 bucket where the source object is stored. (templated)
It should be omitted when source_bucket_key is provided as a full s3:// url.
dest_bucket_name (str) --
Name of the S3 bucket to which the object is copied. (templated)
It should be omitted when dest_bucket_key is provided as a full s3:// url.
source_version_id (str) -- Version ID of the source object (OPTIONAL)
aws_conn_id (str) -- Connection id of the S3 connection to use
verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used, but SSL certificates will not be verified.
- path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
acl_policy (str) -- String specifying the canned ACL policy for the file being uploaded to the S3 bucket.
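Example: a minimal usage sketch (the task id, bucket names, and keys are illustrative). Equivalently, source_bucket_key and dest_bucket_key can be given as full s3:// urls, in which case the bucket name parameters are omitted:

    from airflow.providers.amazon.aws.operators.s3 import S3CopyObjectOperator

    # Copies one object between buckets; the connection must have access to both
    copy_object = S3CopyObjectOperator(
        task_id='copy_object',
        source_bucket_name='source-bucket',
        source_bucket_key='data/2021/01/report.csv',
        dest_bucket_name='dest-bucket',
        dest_bucket_key='backups/report.csv',
    )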
- class airflow.providers.amazon.aws.operators.s3.S3DeleteObjectsOperator(*, bucket: str, keys: Optional[Union[str, list]] = None, prefix: Optional[str] = None, aws_conn_id: str = 'aws_default', verify: Optional[Union[str, bool]] = None, **kwargs)
Bases: airflow.models.BaseOperator
Enables users to delete a single object or multiple objects from a bucket using a single HTTP request.
Users may specify up to 1000 keys to delete.
- Parameters
bucket (str) -- Name of the bucket in which you are going to delete object(s). (templated)
keys (str or list) -- The key(s) to delete from the S3 bucket. (templated)
When keys is a string, it's supposed to be the key name of the single object to delete.
When keys is a list, it's supposed to be the list of the keys to delete.
You may specify up to 1000 keys.
prefix (str) -- Prefix of objects to delete. (templated) All objects matching this prefix in the bucket will be deleted.
aws_conn_id (str) -- Connection id of the S3 connection to use
verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used, but SSL certificates will not be verified.
- path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
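Example: a minimal usage sketch (the task id, bucket, and keys are illustrative). keys may also be a single key string, or prefix may be used instead of keys:

    from airflow.providers.amazon.aws.operators.s3 import S3DeleteObjectsOperator

    # Deletes several objects with a single request (up to 1000 keys)
    delete_objects = S3DeleteObjectsOperator(
        task_id='delete_objects',
        bucket='my-example-bucket',
        keys=['data/2021/01/a.csv', 'data/2021/01/b.csv'],
    )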
- class airflow.providers.amazon.aws.operators.s3.S3FileTransformOperator(*, source_s3_key: str, dest_s3_key: str, transform_script: Optional[str] = None, select_expression=None, script_args: Optional[Sequence[str]] = None, source_aws_conn_id: str = 'aws_default', source_verify: Optional[Union[bool, str]] = None, dest_aws_conn_id: str = 'aws_default', dest_verify: Optional[Union[bool, str]] = None, replace: bool = False, **kwargs)
Bases: airflow.models.BaseOperator
Copies data from a source S3 location to a temporary location on the local filesystem. Runs a transformation on this file as specified by the transformation script and uploads the output to a destination S3 location.
The locations of the source and destination files in the local filesystem are provided as the first and second arguments to the transformation script. The transformation script is expected to read the data from the source, transform it, and write the output to the local destination file. The operator then takes over control and uploads the local destination file to S3.
S3 Select is also available to filter the source contents. Users can omit the transformation script if an S3 Select expression is specified.
- Parameters
source_s3_key (str) -- The key to be retrieved from S3. (templated)
dest_s3_key (str) -- The destination key to write to in S3. (templated)
transform_script (str) -- location of the executable transformation script
select_expression (str) -- S3 Select expression
script_args (sequence of str) -- arguments for transformation script (templated)
source_aws_conn_id (str) -- source s3 connection
source_verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
- path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
This is also applicable to dest_verify.
dest_aws_conn_id (str) -- destination s3 connection
dest_verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. See: source_verify
replace (bool) -- Replace dest S3 key if it already exists
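Example: a minimal usage sketch (the task id, S3 urls, and script path are illustrative). The transform script is called with the local source path and local destination path as its first and second arguments:

    from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator

    # Downloads the source object, runs the script on it, and uploads the result
    transform_file = S3FileTransformOperator(
        task_id='transform_file',
        source_s3_key='s3://source-bucket/raw/input.csv',
        dest_s3_key='s3://dest-bucket/processed/output.csv',
        transform_script='/usr/local/bin/transform.py',
        replace=True,
    )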
- class airflow.providers.amazon.aws.operators.s3.S3ListOperator(*, bucket: str, prefix: str = '', delimiter: str = '', aws_conn_id: str = 'aws_default', verify: Optional[Union[str, bool]] = None, **kwargs)
Bases: airflow.models.BaseOperator
List all objects from the bucket with the given string prefix in name.
This operator returns a Python list with the names of objects, which can be used by XCom in the downstream task.
- Parameters
bucket (str) -- The S3 bucket in which to find the objects. (templated)
prefix (str) -- Prefix string to filter the objects whose names begin with this prefix. (templated)
delimiter (str) -- The delimiter marks key hierarchy. (templated)
aws_conn_id (str) -- The connection ID to use when connecting to S3 storage.
verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
- path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
- Example:
The following operator would list all the files (excluding subfolders) from the S3 customers/2018/04/ key in the data bucket.

    s3_file = S3ListOperator(
        task_id='list_s3_files',
        bucket='data',
        prefix='customers/2018/04/',
        delimiter='/',
        aws_conn_id='aws_customers_conn'
    )
- class airflow.providers.amazon.aws.operators.s3.S3ListPrefixesOperator(*, bucket: str, prefix: str, delimiter: str, aws_conn_id: str = 'aws_default', verify: Optional[Union[str, bool]] = None, **kwargs)
Bases: airflow.models.BaseOperator
List all subfolders from the bucket with the given string prefix in name.
This operator returns a Python list with the names of all subfolders, which can be used by XCom in the downstream task.
- Parameters
bucket (str) -- The S3 bucket in which to find the subfolders. (templated)
prefix (str) -- Prefix string to filter the subfolders whose names begin with this prefix. (templated)
delimiter (str) -- The delimiter marks subfolder hierarchy. (templated)
aws_conn_id (str) -- The connection ID to use when connecting to S3 storage.
verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
- False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
- path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
- Example:
The following operator would list all the subfolders from the S3 customers/2018/04/ prefix in the data bucket.

    s3_file = S3ListPrefixesOperator(
        task_id='list_s3_prefixes',
        bucket='data',
        prefix='customers/2018/04/',
        delimiter='/',
        aws_conn_id='aws_customers_conn'
    )