airflow.providers.amazon.aws.hooks.redshift_sql

Module Contents

Classes

RedshiftSQLHook

Execute statements against Amazon Redshift.

class airflow.providers.amazon.aws.hooks.redshift_sql.RedshiftSQLHook(*args, aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.providers.common.sql.hooks.sql.DbApiHook

Execute statements against Amazon Redshift.

This hook requires the redshift_conn_id connection.

Note: For AWS IAM authentication, set iam to true in the extra connection parameters and leave the password field empty. The temporary token is obtained via the "aws_default" connection unless you override it with aws_conn_id when initializing the hook. The cluster identifier is extracted from the beginning of the host field, so it is optional; it can, however, be overridden in the extra field. Extras example: {"iam":true}
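A minimal sketch of the host-parsing behaviour described above (illustrative, not the hook's actual implementation): the cluster identifier is the leading dot-separated label of the host.

```python
# Illustrative helper: derive a Redshift cluster identifier from a host name
# by taking the label before the first dot, e.g.
# "my-cluster.abc123.us-east-1.redshift.amazonaws.com" -> "my-cluster".
def cluster_identifier_from_host(host: str) -> str:
    return host.split(".", 1)[0]
```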

Parameters

redshift_conn_id – Reference to the Amazon Redshift connection ID

Note

get_sqlalchemy_engine() and get_uri() depend on sqlalchemy-amazon-redshift

conn_name_attr = 'redshift_conn_id'[source]
default_conn_name = 'redshift_default'[source]
conn_type = 'redshift'[source]
hook_name = 'Amazon Redshift'[source]
supports_autocommit = True[source]
static get_ui_field_behaviour()[source]

Custom field behavior.

conn()[source]
get_iam_token(conn)[source]

Retrieve a temporary password to connect to Redshift.

A port is required; if none is provided, the service's default port is used.
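The port-defaulting idea can be sketched as follows. This is an illustration, not the hook's API; 5439 is Redshift's standard port.

```python
from typing import Optional

# Redshift's standard port (assumed default for illustration).
DEFAULT_REDSHIFT_PORT = 5439

def effective_port(port: Optional[int]) -> int:
    # Fall back to the service default when no port is given.
    return port if port is not None else DEFAULT_REDSHIFT_PORT
```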

get_uri()[source]

Overridden to use the Redshift dialect as driver name.
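A hedged sketch of what a Redshift-dialect SQLAlchemy URI looks like. The dialect name "redshift+redshift_connector" is an assumption based on the sqlalchemy-redshift dialect family; the hook's exact URI may differ.

```python
# Illustrative only: build a SQLAlchemy URI using a Redshift dialect name.
def redshift_sqlalchemy_uri(user: str, password: str, host: str,
                            port: int, database: str) -> str:
    return f"redshift+redshift_connector://{user}:{password}@{host}:{port}/{database}"
```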

get_sqlalchemy_engine(engine_kwargs=None)[source]

Overridden to pass Redshift-specific arguments.

get_table_primary_key(table, schema='public')[source]

Get the table’s primary key.

Parameters
  • table (str) – Name of the target table

  • schema (str | None) – Name of the target schema, public by default

Returns

Primary key columns list

Return type

list[str] | None
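For context, primary-key columns in a PostgreSQL-compatible catalog are commonly looked up via information_schema. The query below is illustrative only, an assumption for clarity rather than the query the hook actually issues.

```python
# Illustrative only: one common catalog query for a table's primary-key
# columns. Identifiers are interpolated here for readability; real code
# should use bound parameters.
def primary_key_query(table: str, schema: str = "public") -> str:
    return (
        "SELECT kcu.column_name "
        "FROM information_schema.table_constraints tco "
        "JOIN information_schema.key_column_usage kcu "
        "ON kcu.constraint_name = tco.constraint_name "
        "AND kcu.constraint_schema = tco.constraint_schema "
        "WHERE tco.constraint_type = 'PRIMARY KEY' "
        f"AND kcu.table_schema = '{schema}' "
        f"AND kcu.table_name = '{table}' "
        "ORDER BY kcu.ordinal_position"
    )
```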

get_conn()[source]

Get a redshift_connector.Connection object.
