airflow.providers.amazon.aws.hooks.redshift

Interact with AWS Redshift clusters.

Module Contents

class airflow.providers.amazon.aws.hooks.redshift.RedshiftHook(*args, **kwargs)

    Bases: airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook

    Interact with AWS Redshift, using the boto3 library.

    Additional arguments (such as aws_conn_id) may be specified and are
    passed down to the underlying AwsBaseHook.

    See also: airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook

    Parameters:
        aws_conn_id (str) -- The Airflow connection used for AWS credentials.
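
    A minimal usage sketch; the connection id "aws_default" and the cluster
    identifier "my-cluster" are illustrative placeholders, not values from
    this page:

        from airflow.providers.amazon.aws.hooks.redshift import RedshiftHook

        # Keyword arguments such as aws_conn_id are forwarded to AwsBaseHook.
        hook = RedshiftHook(aws_conn_id="aws_default")

        # get_conn() (inherited from AwsBaseHook) returns the underlying
        # boto3 Redshift client, so any Redshift API call is available.
        client = hook.get_conn()
        response = client.describe_clusters(ClusterIdentifier="my-cluster")
        print(response["Clusters"][0]["ClusterStatus"])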

    delete_cluster(self, cluster_identifier: str, skip_final_cluster_snapshot: bool = True, final_cluster_snapshot_identifier: Optional[str] = None)

        Delete a cluster and optionally create a snapshot.
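
        For example, deleting a cluster while keeping a final snapshot; both
        identifiers below are illustrative:

            hook = RedshiftHook(aws_conn_id="aws_default")
            hook.delete_cluster(
                cluster_identifier="my-cluster",
                skip_final_cluster_snapshot=False,
                final_cluster_snapshot_identifier="my-cluster-final",
            )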

    describe_cluster_snapshots(self, cluster_identifier: str)

        Gets a list of snapshots for a cluster.

        Parameters:
            cluster_identifier (str) -- unique identifier of a cluster
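
        A sketch of iterating over the result; the field names assume the
        standard boto3 snapshot structure:

            snapshots = hook.describe_cluster_snapshots(cluster_identifier="my-cluster")
            for snapshot in snapshots or []:
                print(snapshot["SnapshotIdentifier"], snapshot["Status"])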

class airflow.providers.amazon.aws.hooks.redshift.RedshiftSQLHook

    Bases: airflow.hooks.dbapi.DbApiHook

    Execute statements against Amazon Redshift, using redshift_connector.

    This hook requires the redshift_conn_id connection.

    Parameters:
        redshift_conn_id (str) -- reference to Amazon Redshift connection id

    Note: get_sqlalchemy_engine() and get_uri() depend on sqlalchemy-amazon-redshift.
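
    A minimal sketch using query helpers inherited from DbApiHook; the
    connection id and table names are illustrative placeholders:

        from airflow.providers.amazon.aws.hooks.redshift import RedshiftSQLHook

        hook = RedshiftSQLHook(redshift_conn_id="redshift_default")

        # get_records() and run() are inherited from DbApiHook.
        rows = hook.get_records("SELECT COUNT(*) FROM public.my_table")
        hook.run("TRUNCATE TABLE public.my_staging_table")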

    get_uri(self)

        Overrides DbApiHook get_uri to use the redshift_connector sqlalchemy
        dialect as the driver name.
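
        Continuing the sketch above: the host, credentials, and database all
        come from the redshift_conn_id connection, and the string shown is an
        assumed illustration of the URI shape, not output from this page:

            uri = hook.get_uri()
            # e.g. "redshift+redshift_connector://awsuser:***@my-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev"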

    get_sqlalchemy_engine(self, engine_kwargs=None)

        Overrides DbApiHook get_sqlalchemy_engine to pass
        redshift_connector-specific kwargs.
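
        Continuing the sketch above, building an engine and running a query;
        engine_kwargs are assumed to pass through to SQLAlchemy's
        create_engine (echo is a standard create_engine argument, shown only
        as an example):

            from sqlalchemy import text

            engine = hook.get_sqlalchemy_engine(engine_kwargs={"echo": True})
            with engine.connect() as connection:
                print(connection.execute(text("SELECT 1")).fetchone())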

    get_table_primary_key(self, table: str, schema: Optional[str] = 'public')

        Helper method that returns the table's primary key.

        Parameters:
            table (str) -- Name of the target table
            schema (str) -- Name of the target schema, public by default

        Returns:
            Primary key columns list

        Return type:
            List[str]
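
        Continuing the sketch above, with a hypothetical "orders" table:

            pk_columns = hook.get_table_primary_key(table="orders", schema="public")
            # e.g. ["order_id"]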