airflow.providers.amazon.aws.transfers.dynamodb_to_s3

This module contains operators to replicate records from a DynamoDB table to S3.

Module Contents

Classes

JSONEncoder

Custom JSON encoder implementation.

DynamoDBToS3Operator

Replicates records from a DynamoDB table to S3.

class airflow.providers.amazon.aws.transfers.dynamodb_to_s3.JSONEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Bases: json.JSONEncoder

Custom JSON encoder implementation.

default(obj)[source]

Convert Decimal objects to a JSON-serializable format.
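DynamoDB returns numeric attributes as decimal.Decimal, which the standard json module cannot serialize. The following is a minimal sketch of how such an encoder typically works; the provider's actual implementation may differ in how it represents the number:

```python
import json
from decimal import Decimal


class DecimalJSONEncoder(json.JSONEncoder):
    """Sketch of a JSON encoder that can serialize Decimal values."""

    def default(self, obj):
        # DynamoDB returns numbers as decimal.Decimal; convert them to a
        # JSON-native type before falling back to the stock behavior.
        if isinstance(obj, Decimal):
            return float(obj)
        return super().default(obj)


print(json.dumps({"price": Decimal("9.99")}, cls=DecimalJSONEncoder))  # {"price": 9.99}
```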

class airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator(*, dynamodb_table_name, s3_bucket_name, file_size, dynamodb_scan_kwargs=None, s3_key_prefix='', process_func=_convert_item_to_json_bytes, **kwargs)[source]

Bases: airflow.providers.amazon.aws.transfers.base.AwsToAwsBaseOperator

Replicates records from a DynamoDB table to S3. It scans the DynamoDB table and writes the received records to a file on the local filesystem, flushing the file to S3 once its size exceeds the user-specified file size limit.

Users can also specify filtering criteria via dynamodb_scan_kwargs to replicate only the records that satisfy those criteria.

See also

For more information on how to use this operator, take a look at the guide: Amazon DynamoDB To Amazon S3 transfer operator
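A sketch of wiring the operator into a DAG follows; the DAG id, table name, bucket name, and filter below are illustrative, not taken from this page:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator
from boto3.dynamodb.conditions import Attr

with DAG(
    dag_id="dynamodb_to_s3_example",  # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    backup_users = DynamoDBToS3Operator(
        task_id="backup_users_table",
        dynamodb_table_name="users",        # illustrative table name
        s3_bucket_name="my-backup-bucket",  # illustrative bucket name
        s3_key_prefix="dynamodb/users/",
        file_size=1024 * 1024,  # flush to S3 once the local file exceeds ~1 MiB
        dynamodb_scan_kwargs={
            # Replicate only records matching the filter criteria.
            "FilterExpression": Attr("active").eq(True),
        },
    )
```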

Parameters

dynamodb_table_name (str) – DynamoDB table to replicate data from

s3_bucket_name (str) – S3 bucket to replicate data to

file_size (int) – Flush the local file to S3 once its size (in bytes) reaches file_size

dynamodb_scan_kwargs (dict | None) – Keyword arguments passed to the DynamoDB Table.scan call (for example, a FilterExpression)

s3_key_prefix (str) – Prefix of the S3 object key

process_func (Callable) – Function that converts a DynamoDB item to bytes; by default each item is dumped as a line of JSON
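A custom process_func only has to accept a DynamoDB item (a dict) and return bytes. The following hypothetical replacement mirrors the newline-delimited-JSON behavior of the default:

```python
import json

from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import JSONEncoder


def process_item(item: dict) -> bytes:
    # Hypothetical custom process_func: serialize one DynamoDB item as a
    # line of JSON, using the module's JSONEncoder to handle Decimal values.
    return (json.dumps(item, cls=JSONEncoder) + "\n").encode("utf-8")
```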
template_fields: Sequence[str] = ()[source]
template_fields_renderers[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
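Conceptually, execute runs the scan-and-flush loop described above. The following is a rough sketch assuming boto3's Table.scan pagination and S3Hook.load_file; it is illustrative, not the provider's actual code:

```python
from tempfile import NamedTemporaryFile


def _scan_and_upload(table, s3_hook, bucket, key_prefix, file_size, scan_kwargs, process_func):
    # Illustrative sketch of the scan/flush loop: page through the table,
    # buffer processed items locally, and upload whenever the size limit is hit.
    scan_kwargs = dict(scan_kwargs or {})
    temp_file = NamedTemporaryFile("wb", delete=False)
    while True:
        response = table.scan(**scan_kwargs)  # one page of results
        for item in response["Items"]:
            temp_file.write(process_func(item))
        if temp_file.tell() >= file_size:  # flush once the limit is reached
            temp_file.close()
            key = key_prefix + temp_file.name.rsplit("/", 1)[-1]
            s3_hook.load_file(temp_file.name, key, bucket_name=bucket)
            temp_file = NamedTemporaryFile("wb", delete=False)
        if "LastEvaluatedKey" not in response:  # no more pages
            break
        scan_kwargs["ExclusiveStartKey"] = response["LastEvaluatedKey"]
    # Upload whatever remains in the final (possibly partial) file.
    temp_file.close()
    key = key_prefix + temp_file.name.rsplit("/", 1)[-1]
    s3_hook.load_file(temp_file.name, key, bucket_name=bucket)
```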
