airflow.providers.amazon.aws.operators.s3_bucket

This module contains AWS S3 operators.

Module Contents

class airflow.providers.amazon.aws.operators.s3_bucket.S3CreateBucketOperator(*, bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', region_name: Optional[str] = None, **kwargs)

Bases: airflow.models.BaseOperator

This operator creates an S3 bucket.

See also

For more information on how to use this operator, take a look at the guide: Create and Delete Amazon S3 Buckets

Parameters
  • bucket_name (str) -- The name of the bucket to create.

  • aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.

  • region_name (Optional[str]) -- The AWS region name. If not specified, it is fetched from the connection.

template_fields = ['bucket_name']
execute(self, context)
class airflow.providers.amazon.aws.operators.s3_bucket.S3DeleteBucketOperator(bucket_name: str, force_delete: bool = False, aws_conn_id: Optional[str] = 'aws_default', **kwargs)

Bases: airflow.models.BaseOperator

This operator deletes an S3 bucket.

See also

For more information on how to use this operator, take a look at the guide: Create and Delete Amazon S3 Buckets

Parameters
  • bucket_name (str) -- The name of the bucket to delete.

  • force_delete (bool) -- Forcibly delete all objects in the bucket before deleting the bucket itself.

  • aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.

template_fields = ['bucket_name']
execute(self, context)
