airflow.operators.redshift_to_s3_operator
Transfers data from AWS Redshift into an S3 bucket.
Module Contents
class airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer(schema, table, s3_bucket, s3_key, redshift_conn_id='redshift_default', aws_conn_id='aws_default', verify=None, unload_options=tuple(), autocommit=False, include_header=False, *args, **kwargs)

Bases: airflow.models.BaseOperator
Executes an UNLOAD command to S3 as a CSV with headers.
- Parameters
schema (str) – reference to a specific schema in the Redshift database
table (str) – reference to a specific table in the Redshift database
s3_bucket (str) – reference to a specific S3 bucket
s3_key (str) – reference to a specific S3 key
redshift_conn_id (str) – reference to a specific Redshift connection
aws_conn_id (str) – reference to a specific S3 connection
verify (bool or str) – Whether or not to verify SSL certificates for the S3 connection. By default SSL certificates are verified. You can provide the following values:
False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
path/to/cert/bundle.pem: a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
unload_options (list) – reference to a list of UNLOAD options
autocommit (bool) – If set to True, the UNLOAD statement is committed automatically. Otherwise it is committed right before the Redshift connection is closed.
include_header (bool) – If set to True, the S3 file contains a header row with the column names.
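Under the hood, the operator composes a Redshift UNLOAD statement from these arguments and runs it over the Redshift connection. The sketch below is a simplified, hypothetical `build_unload_query` helper, not the operator's actual source: credential handling is stubbed with placeholder keys, and the extra query the operator builds when include_header is True is omitted. It only illustrates the general shape of the generated SQL.

```python
def build_unload_query(schema, table, s3_bucket, s3_key,
                       credentials, unload_options=()):
    # Hypothetical helper sketching roughly how an UNLOAD statement
    # is assembled from the operator's parameters; not Airflow code.
    options = "\n".join(unload_options)
    return (
        f"UNLOAD ('SELECT * FROM {schema}.{table}')\n"
        f"TO 's3://{s3_bucket}/{s3_key}/{table}_'\n"
        f"WITH CREDENTIALS '{credentials}'\n"
        f"{options};"
    )

query = build_unload_query(
    schema="public",
    table="events",
    s3_bucket="my-bucket",
    s3_key="exports",
    # Placeholder credentials string for illustration only.
    credentials="aws_access_key_id=XXX;aws_secret_access_key=YYY",
    unload_options=("DELIMITER ','", "ALLOWOVERWRITE"),
)
print(query)
```

The unload_options tuple is simply joined onto the end of the statement, which is why any UNLOAD option Redshift accepts (DELIMITER, PARALLEL, ALLOWOVERWRITE, and so on) can be passed through unchanged.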