airflow.providers.amazon.aws.operators.s3_tables

Amazon S3 Tables operators.

Classes

S3TablesCreateTableOperator

Create a new table in an Amazon S3 Tables namespace.

S3TablesDeleteTableOperator

Delete a table from an Amazon S3 Tables namespace.

S3TablesDeleteNamespaceOperator

Delete a namespace from an Amazon S3 Tables table bucket.

S3TablesCreateTableBucketOperator

Create an Amazon S3 Tables table bucket.

S3TablesDeleteTableBucketOperator

Delete an Amazon S3 Tables table bucket.

S3TablesCreateNamespaceOperator

Create a namespace in an Amazon S3 Tables table bucket.

Module Contents

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateTableOperator(*, table_bucket_arn, namespace, table_name, format='ICEBERG', metadata=None, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Create a new table in an Amazon S3 Tables namespace.

See also

For more information on how to use this operator, take a look at the guide: Create a Table

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket to create the table in. (templated)

  • namespace (str) – The namespace to associate with the table. (templated)

  • table_name (str) – The name of the table. (templated)

  • format (str) – The table format; currently only ICEBERG is supported. (templated)

  • metadata (dict[str, Any] | None) – Optional Iceberg schema metadata. (templated) Example: {"iceberg": {"schema": {"fields": [{"name": "id", "type": "int", "required": True}]}}}

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If running Airflow in a distributed manner with aws_conn_id set to None or empty, the default boto3 configuration is used and must be maintained on each worker node.

  • region_name – The AWS region name. If not specified, the default boto3 behaviour is used.

  • verify – Whether to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-value pairs) for the botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

template_fields: collections.abc.Sequence[str][source]
template_fields_renderers[source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
table_name[source]
format = 'ICEBERG'[source]
metadata = None[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
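
A minimal usage sketch for this operator; the task_id, table bucket ARN, and table names below are placeholders, and the task is assumed to be declared inside a DAG definition. The metadata value mirrors the schema example documented above.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesCreateTableOperator

    # Placeholder ARN and names; substitute values from your own account.
    create_table = S3TablesCreateTableOperator(
        task_id="create_table",
        table_bucket_arn="arn:aws:s3tables:us-east-1:123456789012:bucket/my-table-bucket",
        namespace="my_namespace",
        table_name="my_table",
        format="ICEBERG",
        metadata={
            "iceberg": {
                "schema": {
                    "fields": [{"name": "id", "type": "int", "required": True}]
                }
            }
        },
    )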

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteTableOperator(*, table_bucket_arn, namespace, table_name, version_token=None, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Delete a table from an Amazon S3 Tables namespace.

See also

For more information on how to use this operator, take a look at the guide: Delete a Table

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket containing the table. (templated)

  • namespace (str) – The namespace of the table. (templated)

  • table_name (str) – The name of the table to delete. (templated)

  • version_token (str | None) – Optional version token for optimistic concurrency. (templated)

template_fields: collections.abc.Sequence[str][source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
table_name[source]
version_token = None[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
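
A minimal usage sketch with placeholder values; version_token is omitted, so no optimistic-concurrency check is requested.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesDeleteTableOperator

    # Placeholder ARN and names; substitute values from your own account.
    delete_table = S3TablesDeleteTableOperator(
        task_id="delete_table",
        table_bucket_arn="arn:aws:s3tables:us-east-1:123456789012:bucket/my-table-bucket",
        namespace="my_namespace",
        table_name="my_table",
    )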

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteNamespaceOperator(*, table_bucket_arn, namespace, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Delete a namespace from an Amazon S3 Tables table bucket.

See also

For more information on how to use this operator, take a look at the guide: Delete a Namespace

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket. (templated)

  • namespace (str) – The namespace to delete. (templated)

template_fields: collections.abc.Sequence[str][source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
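
A minimal usage sketch with placeholder values, assumed to be declared inside a DAG definition.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesDeleteNamespaceOperator

    # Placeholder ARN and namespace; substitute values from your own account.
    delete_namespace = S3TablesDeleteNamespaceOperator(
        task_id="delete_namespace",
        table_bucket_arn="arn:aws:s3tables:us-east-1:123456789012:bucket/my-table-bucket",
        namespace="my_namespace",
    )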

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateTableBucketOperator(*, table_bucket_name, encryption_configuration=None, tags=None, if_exists='skip', **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Create an Amazon S3 Tables table bucket.

A table bucket is the top-level container for S3 Tables namespaces and tables.

See also

For more information on how to use this operator, take a look at the guide: Create a Table Bucket

Parameters:
  • table_bucket_name (str) – The name of the table bucket. (templated)

  • encryption_configuration (dict[str, str] | None) – Optional encryption configuration dict with sseAlgorithm and an optional kmsKeyArn. (templated)

  • tags – Optional tags to apply to the table bucket.

  • if_exists (Literal['fail', 'skip']) – Behavior when a table bucket with the same name already exists. "fail" raises an error; "skip" returns the existing bucket ARN.

template_fields: collections.abc.Sequence[str][source]
template_fields_renderers[source]
aws_hook_class[source]
table_bucket_name[source]
encryption_configuration = None[source]
tags = None[source]
if_exists = 'skip'[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
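
A minimal usage sketch with placeholder values. The encryption_configuration keys follow the parameter description above, but the "AES256" algorithm value (SSE-S3) is an assumption; check the S3 Tables API reference for the values your account supports.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesCreateTableBucketOperator

    create_table_bucket = S3TablesCreateTableBucketOperator(
        task_id="create_table_bucket",
        table_bucket_name="my-table-bucket",  # placeholder name
        # Assumed SSE-S3 setting; pair "sseAlgorithm" with an optional "kmsKeyArn" for KMS.
        encryption_configuration={"sseAlgorithm": "AES256"},
        # The default: return the existing bucket ARN rather than fail.
        if_exists="skip",
    )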

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteTableBucketOperator(*, table_bucket_arn, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Delete an Amazon S3 Tables table bucket.

See also

For more information on how to use this operator, take a look at the guide: Delete a Table Bucket

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket to delete. (templated)

template_fields: collections.abc.Sequence[str][source]
aws_hook_class[source]
table_bucket_arn[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
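
A minimal usage sketch with a placeholder ARN.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesDeleteTableBucketOperator

    # Placeholder ARN; substitute the value from your own account.
    delete_table_bucket = S3TablesDeleteTableBucketOperator(
        task_id="delete_table_bucket",
        table_bucket_arn="arn:aws:s3tables:us-east-1:123456789012:bucket/my-table-bucket",
    )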

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateNamespaceOperator(*, table_bucket_arn, namespace, if_exists='skip', **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Create a namespace in an Amazon S3 Tables table bucket.

See also

For more information on how to use this operator, take a look at the guide: Create a Namespace

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket. (templated)

  • namespace (str) – The namespace name to create. (templated)

  • if_exists (Literal['fail', 'skip']) – Behavior when the namespace already exists. "fail" raises an error; "skip" logs and returns.

template_fields: collections.abc.Sequence[str][source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
if_exists = 'skip'[source]
execute(context)[source]

Execute the task; this is the main method to override when deriving a new operator. The context argument is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
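
A minimal usage sketch with placeholder values, assumed to be declared inside a DAG definition.

    from airflow.providers.amazon.aws.operators.s3_tables import S3TablesCreateNamespaceOperator

    # Placeholder ARN and namespace; substitute values from your own account.
    create_namespace = S3TablesCreateNamespaceOperator(
        task_id="create_namespace",
        table_bucket_arn="arn:aws:s3tables:us-east-1:123456789012:bucket/my-table-bucket",
        namespace="my_namespace",
        # "fail" would raise if the namespace already exists; "skip" (the default) logs and returns.
        if_exists="skip",
    )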
