airflow.providers.amazon.aws.operators.redshift_data
Module Contents¶
Classes¶
RedshiftDataOperator | Executes SQL statements against an Amazon Redshift cluster using the Redshift Data API
- class airflow.providers.amazon.aws.operators.redshift_data.RedshiftDataOperator(database, sql, cluster_identifier=None, db_user=None, parameters=None, secret_arn=None, statement_name=None, with_event=False, wait_for_completion=True, poll_interval=10, return_sql_result=False, aws_conn_id='aws_default', region=None, await_result=None, **kwargs)[source]¶
Bases: airflow.models.BaseOperator
Executes SQL statements against an Amazon Redshift cluster using the Redshift Data API.
See also
For more information on how to use this operator, take a look at the guide: Execute a statement on an Amazon Redshift cluster
- Parameters
database (str) – the name of the database
sql (str | list) – the SQL statement or list of SQL statements to run
cluster_identifier (str | None) – unique identifier of a cluster
db_user (str | None) – the database username
parameters (list | None) – the parameters for the SQL statement
secret_arn (str | None) – the name or ARN of the secret that enables access to the database
statement_name (str | None) – the name of the SQL statement
with_event (bool) – indicates whether to send an event to EventBridge
wait_for_completion (bool) – whether to wait for the statement to finish executing; if True, block until the query completes, if False, return immediately after submitting it
poll_interval (int) – how often in seconds to check the query status
return_sql_result (bool) – if True, return the result of the SQL statement; if False (default), return the statement ID
aws_conn_id (str) – the AWS connection to use
region (str | None) – the AWS region to use
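Below is a minimal usage sketch, not taken from the provider documentation: the DAG id, cluster identifier (my-redshift-cluster), database (dev), user (awsuser), and table name are hypothetical placeholders. The first task submits a DDL statement and polls until it completes; the second runs a SELECT and returns the rows (pushed to XCom) instead of the statement ID because return_sql_result=True.

```python
# Hedged sketch: all cluster/database/user/table names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.redshift_data import RedshiftDataOperator

with DAG(
    dag_id="redshift_data_example",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,                   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # Submit a statement and poll every 10 seconds until it finishes.
    create_table = RedshiftDataOperator(
        task_id="create_table",
        cluster_identifier="my-redshift-cluster",  # hypothetical cluster
        database="dev",
        db_user="awsuser",
        sql="""
            CREATE TABLE IF NOT EXISTS fruit (
                id INT,
                name VARCHAR(64)
            );
        """,
        wait_for_completion=True,
        poll_interval=10,
        aws_conn_id="aws_default",
    )

    # Return the query rows (and push them to XCom) instead of the statement ID.
    select_fruit = RedshiftDataOperator(
        task_id="select_fruit",
        cluster_identifier="my-redshift-cluster",
        database="dev",
        db_user="awsuser",
        sql="SELECT * FROM fruit;",
        return_sql_result=True,
    )

    create_table >> select_fruit
```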