airflow.providers.amazon.aws.transfers.redshift_to_s3¶
Transfers data from AWS Redshift into an S3 bucket.
Module Contents¶
- class airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator(*, schema: str, table: str, s3_bucket: str, s3_key: str, redshift_conn_id: str = 'redshift_default', aws_conn_id: str = 'aws_default', verify: Optional[Union[bool, str]] = None, unload_options: Optional[List] = None, autocommit: bool = False, include_header: bool = False, table_as_file_name: bool = True, **kwargs)[source]¶

  Bases: airflow.models.BaseOperator

  Executes an UNLOAD command to s3 as a CSV with headers.

  Parameters
- schema (str) -- reference to a specific schema in redshift database 
- table (str) -- reference to a specific table in redshift database 
- s3_bucket (str) -- reference to a specific S3 bucket 
- s3_key (str) -- reference to a specific S3 key. If table_as_file_name is set to False, this param must include the desired file name
- redshift_conn_id (str) -- reference to a specific Redshift connection 
- aws_conn_id (str) -- reference to a specific S3 connection 
- verify (bool or str) -- Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:
  - False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.
  - path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
- unload_options (list) -- reference to a list of UNLOAD options 
- autocommit (bool) -- If set to True, the UNLOAD statement is committed automatically. Otherwise it is committed right before the Redshift connection gets closed. 
- include_header (bool) -- If set to True, the S3 file contains the header columns. 
- table_as_file_name (bool) -- If set to True, the S3 file will be named after the table.
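To illustrate how these parameters fit together, the sketch below assembles an UNLOAD statement of the general shape the operator issues. It is an illustrative approximation, not the provider's exact SQL: `build_unload_query` is a hypothetical helper, and the IAM_ROLE clause is a placeholder (the real operator derives credentials from aws_conn_id).

```python
from typing import List, Optional


def build_unload_query(
    schema: str,
    table: str,
    s3_bucket: str,
    s3_key: str,
    unload_options: Optional[List[str]] = None,
    include_header: bool = False,
    table_as_file_name: bool = True,
) -> str:
    """Sketch of an UNLOAD statement mirroring the operator's parameters."""
    unload_options = list(unload_options or [])
    if include_header and "HEADER" not in unload_options:
        # Header columns are requested via Redshift's HEADER unload option.
        unload_options.append("HEADER")
    # With table_as_file_name=True the table name becomes the file prefix;
    # otherwise s3_key is used verbatim as the destination name.
    key = f"{s3_key}/{table}_" if table_as_file_name else s3_key
    options = "\n".join(unload_options)
    return (
        f"UNLOAD ('SELECT * FROM {schema}.{table}')\n"
        f"TO 's3://{s3_bucket}/{key}'\n"
        # Placeholder credentials clause; the operator builds the real one
        # from the aws_conn_id connection.
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload'\n"
        f"{options}"
    )
```

For example, with schema='public', table='users', s3_bucket='my-bucket', s3_key='exports' and the defaults above, the target becomes s3://my-bucket/exports/users_; setting table_as_file_name=False makes s3_key the full destination name instead.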