Amazon Redshift to Amazon S3¶
Use the RedshiftToS3Operator
transfer to copy data from an Amazon Redshift table to a file in an Amazon Simple
Storage Service (S3) bucket.
Prerequisite Tasks¶
To use these operators, you must do a few things:
Create necessary resources using AWS Console or AWS CLI (see the sketch after this list).
Install API libraries via pip.
pip install 'apache-airflow[amazon]'
Detailed information is available in the Installation documentation.
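As one way to prepare the S3 side of the transfer, the destination bucket could also be created programmatically with boto3 rather than the Console or CLI. This is a minimal sketch, assuming boto3 is installed and AWS credentials are configured; the bucket name and region are placeholders:

import boto3

# Hypothetical bucket name and region; adjust to your environment.
s3 = boto3.client("s3")
s3.create_bucket(
    Bucket="example-airflow-unload-bucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},  # omit for us-east-1
)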
Operators¶
Amazon Redshift To Amazon S3 transfer operator¶
This operator loads data from an Amazon Redshift table to an existing Amazon S3 bucket.
For more information about this operator, visit:
RedshiftToS3Operator
Example usage:
from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator

# S3_BUCKET_NAME, S3_KEY and REDSHIFT_TABLE are defined elsewhere in the example DAG.
task_transfer_redshift_to_s3 = RedshiftToS3Operator(
    task_id='transfer_redshift_to_s3',
    s3_bucket=S3_BUCKET_NAME,
    s3_key=S3_KEY,
    schema='PUBLIC',
    table=REDSHIFT_TABLE,
)
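Instead of naming a schema and table, the operator can also unload the result of an explicit SELECT statement and use named Airflow connections. The following is a sketch, not part of the original example; the query, S3 key, and connection IDs ('redshift_default' and 'aws_default' are the operator defaults) are placeholders:

task_transfer_query_to_s3 = RedshiftToS3Operator(
    task_id='transfer_query_to_s3',
    s3_bucket=S3_BUCKET_NAME,
    s3_key='filtered/',
    select_query=f'SELECT * FROM PUBLIC.{REDSHIFT_TABLE} WHERE load_date > CURRENT_DATE - 7',
    redshift_conn_id='redshift_default',  # Airflow connection to the Redshift cluster
    aws_conn_id='aws_default',            # Airflow connection holding AWS credentials
)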
You can find more information about the UNLOAD
command used
here.
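Options for the underlying UNLOAD statement can be passed through the operator's unload_options parameter as a list of strings appended to the generated statement. A short sketch, with illustrative options only:

# The options below are examples of valid UNLOAD clauses; choose the ones your use case needs.
task_transfer_redshift_to_s3_csv = RedshiftToS3Operator(
    task_id='transfer_redshift_to_s3_csv',
    s3_bucket=S3_BUCKET_NAME,
    s3_key=S3_KEY,
    schema='PUBLIC',
    table=REDSHIFT_TABLE,
    unload_options=["FORMAT AS CSV", "ALLOWOVERWRITE", "PARALLEL OFF"],
)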