Amazon Redshift to S3 Transfer Operator

Use the RedshiftToS3Operator transfer to copy data from an Amazon Redshift table into an Amazon Simple Storage Service (S3) file.

Prerequisite Tasks

To use these operators, you must do a few things:

* Create the necessary resources using the AWS Console or AWS CLI.
* Install the API libraries via pip: pip install 'apache-airflow[amazon]'
* Set up an Amazon Web Services connection.

Amazon Redshift To Amazon S3

This operator loads data from an Amazon Redshift table to an existing Amazon S3 bucket.

For more information about this operator, see RedshiftToS3Operator.

Example usage:

airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py[source]

from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator

# Unload the Redshift table into the given S3 bucket under the given key prefix.
task_transfer_redshift_to_s3 = RedshiftToS3Operator(
    task_id='transfer_redshift_to_s3',
    s3_bucket=S3_BUCKET_NAME,
    s3_key=S3_KEY,
    schema='PUBLIC',
    table=REDSHIFT_TABLE,
)
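
Depending on the provider version, the operator can also export the result of a custom query instead of a whole table. The following is a minimal sketch, assuming the select_query parameter is available and using placeholder key and query values:

# Sketch: unload the result of a custom query rather than an entire table.
# The key prefix and the query are placeholders, not values from the example DAG.
task_transfer_query_to_s3 = RedshiftToS3Operator(
    task_id='transfer_query_to_s3',
    s3_bucket=S3_BUCKET_NAME,
    s3_key='query_results/',  # placeholder key prefix
    select_query='SELECT * FROM public.my_table WHERE updated_at > CURRENT_DATE - 7',  # placeholder query
)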

You can find more information about the UNLOAD command used here.
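
Any entries passed in unload_options are appended to that UNLOAD statement, so output settings such as CSV, HEADER, or PARALLEL OFF can be controlled there. A minimal sketch, reusing the placeholder bucket, key, and table names from the example above:

# Sketch: customize the generated UNLOAD statement via unload_options.
task_unload_as_csv = RedshiftToS3Operator(
    task_id='unload_as_csv',
    s3_bucket=S3_BUCKET_NAME,
    s3_key=S3_KEY,
    schema='PUBLIC',
    table=REDSHIFT_TABLE,
    unload_options=['CSV', 'HEADER', 'PARALLEL OFF'],  # forwarded to the UNLOAD statement
)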
