airflow.providers.amazon.aws.operators.s3_tables¶
Amazon S3 Tables operators.
Classes¶
- S3TablesCreateTableOperator: Create a new table in an Amazon S3 Tables namespace.
- S3TablesDeleteTableOperator: Delete a table from an Amazon S3 Tables namespace.
- S3TablesDeleteNamespaceOperator: Delete a namespace from an Amazon S3 Tables table bucket.
- S3TablesCreateTableBucketOperator: Create an Amazon S3 Tables table bucket.
- S3TablesDeleteTableBucketOperator: Delete an Amazon S3 Tables table bucket.
- S3TablesCreateNamespaceOperator: Create a namespace in an Amazon S3 Tables table bucket.
Module Contents¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateTableOperator(*, table_bucket_arn, namespace, table_name, format='ICEBERG', metadata=None, **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook]

Create a new table in an Amazon S3 Tables namespace.
See also
For more information on how to use this operator, take a look at the guide: Create a Table
- Parameters:
table_bucket_arn (str) – The ARN of the table bucket to create the table in. (templated)
namespace (str) – The namespace to associate with the table. (templated)
table_name (str) – The name of the table. (templated)
format (str) – The table format. (templated) Currently only ICEBERG is supported.
metadata (dict[str, Any] | None) – Optional Iceberg schema metadata. (templated) Example: {"iceberg": {"schema": {"fields": [{"name": "id", "type": "int", "required": True}]}}}
aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).
region_name – AWS region_name. If not specified then the default boto3 behaviour is used.
verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
- template_fields: collections.abc.Sequence[str][source]¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteTableOperator(*, table_bucket_arn, namespace, table_name, version_token=None, **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook]

Delete a table from an Amazon S3 Tables namespace.
See also
For more information on how to use this operator, take a look at the guide: Delete a Table
- Parameters:
table_bucket_arn (str) – The ARN of the table bucket containing the table. (templated)
namespace (str) – The namespace of the table. (templated)
table_name (str) – The name of the table to delete. (templated)
version_token (str | None) – Optional version token for optimistic concurrency. (templated)
- template_fields: collections.abc.Sequence[str][source]¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteNamespaceOperator(*, table_bucket_arn, namespace, **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Delete a namespace from an Amazon S3 Tables table bucket.
See also
For more information on how to use this operator, take a look at the guide: Delete a Namespace
- Parameters:
table_bucket_arn (str) – The ARN of the table bucket containing the namespace. (templated)
namespace (str) – The namespace to delete. (templated)
- template_fields: collections.abc.Sequence[str][source]¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateTableBucketOperator(*, table_bucket_name, encryption_configuration=None, tags=None, if_exists='skip', **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Create an Amazon S3 Tables table bucket.
A table bucket is the top-level container for S3 Tables namespaces and tables.
See also
For more information on how to use this operator, take a look at the guide: Create a Table Bucket
- Parameters:
table_bucket_name (str) – The name of the table bucket. (templated)
encryption_configuration (dict[str, str] | None) – Optional encryption configuration dict with sseAlgorithm and optional kmsKeyArn. (templated)
tags – Optional tags to apply to the table bucket.
if_exists (Literal['fail', 'skip']) – Behavior when a table bucket with the same name already exists. "fail" raises an error, "skip" returns the existing bucket ARN.
- template_fields: collections.abc.Sequence[str][source]¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteTableBucketOperator(*, table_bucket_arn, **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Delete an Amazon S3 Tables table bucket.
See also
For more information on how to use this operator, take a look at the guide: Delete a Table Bucket
- Parameters:
table_bucket_arn (str) – The ARN of the table bucket to delete. (templated)
- template_fields: collections.abc.Sequence[str][source]¶
- class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateNamespaceOperator(*, table_bucket_arn, namespace, if_exists='skip', **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.s3_tables.S3TablesHook]

Create a namespace in an Amazon S3 Tables table bucket.
See also
For more information on how to use this operator, take a look at the guide: Create a Namespace
- Parameters:
table_bucket_arn (str) – The ARN of the table bucket to create the namespace in. (templated)
namespace (str) – The name of the namespace to create. (templated)
if_exists (Literal['fail', 'skip']) – Behavior when a namespace with the same name already exists. "fail" raises an error, "skip" leaves the existing namespace in place.
- template_fields: collections.abc.Sequence[str][source]¶