airflow.providers.amazon.aws.operators.s3_bucket_tagging

This module contains AWS S3 operators.

Module Contents

airflow.providers.amazon.aws.operators.s3_bucket_tagging.BUCKET_DOES_NOT_EXIST_MSG = Bucket with name: %s doesn't exist[source]
class airflow.providers.amazon.aws.operators.s3_bucket_tagging.S3GetBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

This operator gets tagging from an S3 bucket.

See also

For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging

Parameters
  • bucket_name (str) -- The name of the bucket to get tags from.

  • aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.

template_fields = ['bucket_name'][source]
execute(self, context)[source]
class airflow.providers.amazon.aws.operators.s3_bucket_tagging.S3PutBucketTaggingOperator(bucket_name: str, key: Optional[str] = None, value: Optional[str] = None, tag_set: Optional[List[Dict[str, str]]] = None, aws_conn_id: Optional[str] = 'aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

This operator puts tagging for an S3 bucket.

See also

For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging

Parameters
  • bucket_name (str) -- The name of the bucket to add tags to.

  • key (str) -- The key portion of the key/value pair for a tag to be added. If a key is provided, a value must be provided as well.

  • value (str) -- The value portion of the key/value pair for a tag to be added. If a value is provided, a key must be provided as well.

  • tag_set (List[Dict[str, str]]) -- A list of key/value pairs to add, each a dict of the form {'Key': ..., 'Value': ...}.

  • aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.

template_fields = ['bucket_name'][source]
template_fields_renderers[source]
execute(self, context)[source]
class airflow.providers.amazon.aws.operators.s3_bucket_tagging.S3DeleteBucketTaggingOperator(bucket_name: str, aws_conn_id: Optional[str] = 'aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

This operator deletes tagging from an S3 bucket.

See also

For more information on how to use this operator, take a look at the guide: Using Amazon S3 Bucket Tagging

Parameters
  • bucket_name (str) -- The name of the bucket to delete tags from.

  • aws_conn_id (Optional[str]) -- The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.

template_fields = ['bucket_name'][source]
execute(self, context)[source]
