airflow.providers.amazon.aws.operators.redshift_cluster

Module Contents

Classes

RedshiftCreateClusterOperator

Creates a new cluster with the specified parameters.

RedshiftCreateClusterSnapshotOperator

Creates a manual snapshot of the specified cluster. The cluster must be in the available state.

RedshiftDeleteClusterSnapshotOperator

Deletes the specified manual snapshot.

RedshiftResumeClusterOperator

Resume a paused AWS Redshift Cluster.

RedshiftPauseClusterOperator

Pause an AWS Redshift Cluster if it has status available.

RedshiftDeleteClusterOperator

Delete an AWS Redshift cluster.

class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftCreateClusterOperator(*, cluster_identifier, node_type, master_username, master_user_password, cluster_type='multi-node', db_name='dev', number_of_nodes=1, cluster_security_groups=None, vpc_security_group_ids=None, cluster_subnet_group_name=None, availability_zone=None, preferred_maintenance_window=None, cluster_parameter_group_name=None, automated_snapshot_retention_period=1, manual_snapshot_retention_period=None, port=5439, cluster_version='1.0', allow_version_upgrade=True, publicly_accessible=True, encrypted=False, hsm_client_certificate_identifier=None, hsm_configuration_identifier=None, elastic_ip=None, tags=None, kms_key_id=None, enhanced_vpc_routing=False, additional_info=None, iam_roles=None, maintenance_track_name=None, snapshot_schedule_identifier=None, availability_zone_relocation=None, aqua_configuration_status=None, default_iam_role_arn=None, aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Creates a new cluster with the specified parameters.

See also

For more information on how to use this operator, take a look at the guide: Create an Amazon Redshift cluster

Parameters
  • cluster_identifier (str) -- A unique identifier for the cluster.

  • node_type (str) -- The node type to be provisioned for the cluster. Valid Values: ds2.xlarge, ds2.8xlarge, dc1.large, dc1.8xlarge, dc2.large, dc2.8xlarge, ra3.xlplus, ra3.4xlarge, and ra3.16xlarge.

  • master_username (str) -- The username associated with the admin user account for the cluster that is being created.

  • master_user_password (str) -- The password associated with the admin user account for the cluster that is being created.

  • cluster_type (str) -- The type of the cluster single-node or multi-node. The default value is multi-node.

  • db_name (str) -- The name of the first database to be created when the cluster is created.

  • number_of_nodes (int) -- The number of compute nodes in the cluster. This parameter is required when cluster_type is multi-node.

  • cluster_security_groups (Optional[List[str]]) -- A list of security groups to be associated with this cluster.

  • vpc_security_group_ids (Optional[List[str]]) -- A list of VPC security groups to be associated with the cluster.

  • cluster_subnet_group_name (Optional[str]) -- The name of a cluster subnet group to be associated with this cluster.

  • availability_zone (Optional[str]) -- The EC2 Availability Zone (AZ).

  • preferred_maintenance_window (Optional[str]) -- The time range (in UTC) during which automated cluster maintenance can occur.

  • cluster_parameter_group_name (Optional[str]) -- The name of the parameter group to be associated with this cluster.

  • automated_snapshot_retention_period (int) -- The number of days that automated snapshots are retained. The default value is 1.

  • manual_snapshot_retention_period (Optional[int]) -- The default number of days to retain a manual snapshot.

  • port (int) -- The port number on which the cluster accepts incoming connections. The default value is 5439.

  • cluster_version (str) -- The version of a Redshift engine software that you want to deploy on the cluster.

  • allow_version_upgrade (bool) -- Whether major version upgrades can be applied during the maintenance window. The default value is True.

  • enhanced_vpc_routing (bool) -- Whether to create the cluster with enhanced VPC routing enabled. The default value is False.

  • additional_info (Optional[str]) -- Reserved

  • iam_roles (Optional[List[str]]) -- A list of IAM roles that can be used by the cluster to access other AWS services.

  • maintenance_track_name (Optional[str]) -- Name of the maintenance track for the cluster.

  • snapshot_schedule_identifier (Optional[str]) -- A unique identifier for the snapshot schedule.

  • availability_zone_relocation (Optional[bool]) -- Enable relocation for a Redshift cluster between Availability Zones after the cluster is created.

  • aqua_configuration_status (Optional[str]) -- The value represents how the cluster is configured to use AQUA (Advanced Query Accelerator).

  • default_iam_role_arn (Optional[str]) -- The Amazon Resource Name (ARN) of the IAM role to set as the default for the cluster.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

  • publicly_accessible (bool) -- Whether the cluster can be accessed from a public network. The default value is True.

  • encrypted (bool) -- Whether data in the cluster is encrypted at rest. The default value is False.

  • hsm_client_certificate_identifier (Optional[str]) -- Name of the HSM client certificate the Amazon Redshift cluster uses to retrieve the data encryption keys stored in an HSM.

  • hsm_configuration_identifier (Optional[str]) -- Name of the HSM configuration.

  • elastic_ip (Optional[str]) -- The Elastic IP (EIP) address for the cluster.

  • tags (Optional[List]) -- A list of tag instances to attach to the cluster.

  • kms_key_id (Optional[str]) -- The KMS key ID of the encryption key used to encrypt data in the cluster.

template_fields :Sequence[str] = ['cluster_identifier', 'cluster_type', 'node_type', 'number_of_nodes'][source]
ui_color = #eeaa11[source]
ui_fgcolor = #ffffff[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
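As a configuration sketch only, the parameters above might be wired into a DAG task like this. All identifiers, the DAG id, and the credentials are placeholders, not values taken from this page:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftCreateClusterOperator,
)

with DAG(
    dag_id="example_redshift_create_cluster",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # The password here is illustrative only; prefer pulling secrets
    # from a secrets backend rather than hard-coding them in a DAG file.
    create_cluster = RedshiftCreateClusterOperator(
        task_id="create_redshift_cluster",
        cluster_identifier="my-redshift-cluster",
        node_type="dc2.large",
        master_username="awsuser",
        master_user_password="********",
        cluster_type="single-node",  # number_of_nodes is required only for multi-node
        db_name="dev",
        publicly_accessible=False,
        aws_conn_id="aws_default",
    )
```

Note that cluster_identifier, cluster_type, node_type, and number_of_nodes are template fields, so they can also be rendered from Jinja expressions.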

class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftCreateClusterSnapshotOperator(*, snapshot_identifier, cluster_identifier, retention_period=-1, wait_for_completion=False, poll_interval=15, max_attempt=20, aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Creates a manual snapshot of the specified cluster. The cluster must be in the available state.

See also

For more information on how to use this operator, take a look at the guide: Create an Amazon Redshift cluster snapshot

Parameters
  • snapshot_identifier (str) -- A unique identifier for the snapshot that you are requesting.

  • cluster_identifier (str) -- The cluster identifier for which you want a snapshot.

  • retention_period (int) -- The number of days that a manual snapshot is retained. If the value is -1, the manual snapshot is retained indefinitely.

  • wait_for_completion (bool) -- Whether to wait for the cluster snapshot to reach the available state. The default value is False.

  • poll_interval (int) -- Time (in seconds) to wait between two consecutive calls to check the snapshot state.

  • max_attempt (int) -- The maximum number of attempts made to check the snapshot state.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
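A minimal configuration sketch, with placeholder identifiers. When wait_for_completion is True, the task polls the snapshot state every poll_interval seconds and gives up after max_attempt checks, so the maximum wait is roughly poll_interval * max_attempt seconds:

```python
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftCreateClusterSnapshotOperator,
)

# Polls every 30 s, up to 40 times: at most ~20 minutes of waiting.
create_snapshot = RedshiftCreateClusterSnapshotOperator(
    task_id="create_cluster_snapshot",
    snapshot_identifier="my-cluster-snapshot",
    cluster_identifier="my-redshift-cluster",
    retention_period=7,      # keep for 7 days; -1 keeps indefinitely
    wait_for_completion=True,
    poll_interval=30,
    max_attempt=40,
)
```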

class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftDeleteClusterSnapshotOperator(*, snapshot_identifier, cluster_identifier, wait_for_completion=True, aws_conn_id='aws_default', poll_interval=10, **kwargs)[source]

Bases: airflow.models.BaseOperator

Deletes the specified manual snapshot.

See also

For more information on how to use this operator, take a look at the guide: Delete an Amazon Redshift cluster snapshot

Parameters
  • snapshot_identifier (str) -- A unique identifier for the snapshot to be deleted.

  • cluster_identifier (str) -- The unique identifier of the cluster the snapshot was created from.

  • wait_for_completion (bool) -- Whether to wait for the snapshot deletion to complete. The default value is True.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

  • poll_interval (int) -- Time (in seconds) to wait between two consecutive calls to check the snapshot state.

execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

get_status()[source]
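A configuration sketch for deleting a manual snapshot; the identifiers are placeholders. With the default wait_for_completion=True, the task blocks, checking the snapshot state every poll_interval seconds until the snapshot is gone:

```python
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftDeleteClusterSnapshotOperator,
)

delete_snapshot = RedshiftDeleteClusterSnapshotOperator(
    task_id="delete_cluster_snapshot",
    snapshot_identifier="my-cluster-snapshot",
    cluster_identifier="my-redshift-cluster",
    poll_interval=10,  # check deletion progress every 10 seconds
)
```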
class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftResumeClusterOperator(*, cluster_identifier, aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Resume a paused AWS Redshift Cluster.

See also

For more information on how to use this operator, take a look at the guide: Resume an Amazon Redshift cluster

Parameters
  • cluster_identifier (str) -- Unique identifier of the AWS Redshift cluster to resume.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

template_fields :Sequence[str] = ['cluster_identifier'][source]
ui_color = #eeaa11[source]
ui_fgcolor = #ffffff[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftPauseClusterOperator(*, cluster_identifier, aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Pause an AWS Redshift Cluster if it has status available.

See also

For more information on how to use this operator, take a look at the guide: Pause an Amazon Redshift cluster

Parameters
  • cluster_identifier (str) -- Unique identifier of the AWS Redshift cluster to pause.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

template_fields :Sequence[str] = ['cluster_identifier'][source]
ui_color = #eeaa11[source]
ui_fgcolor = #ffffff[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
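A configuration sketch pausing a cluster and resuming it later in the same DAG; the cluster name is a placeholder. Pausing only succeeds while the cluster status is available, and resuming only applies to a paused cluster:

```python
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftPauseClusterOperator,
    RedshiftResumeClusterOperator,
)

# cluster_identifier is a template field on both operators, so it could
# also be rendered from a Jinja expression at run time.
pause_cluster = RedshiftPauseClusterOperator(
    task_id="pause_redshift_cluster",
    cluster_identifier="my-redshift-cluster",
)

resume_cluster = RedshiftResumeClusterOperator(
    task_id="resume_redshift_cluster",
    cluster_identifier="my-redshift-cluster",
)

pause_cluster >> resume_cluster  # resume runs after the pause task succeeds
```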

class airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftDeleteClusterOperator(*, cluster_identifier, skip_final_cluster_snapshot=True, final_cluster_snapshot_identifier=None, wait_for_completion=True, aws_conn_id='aws_default', poll_interval=30.0, **kwargs)[source]

Bases: airflow.models.BaseOperator

Delete an AWS Redshift cluster.

See also

For more information on how to use this operator, take a look at the guide: Delete an Amazon Redshift cluster

Parameters
  • cluster_identifier (str) -- Unique identifier of the cluster to delete.

  • skip_final_cluster_snapshot (bool) -- Whether to skip creating a final cluster snapshot before deletion. The default value is True.

  • final_cluster_snapshot_identifier (Optional[str]) -- Name of the final cluster snapshot; required when skip_final_cluster_snapshot is False.

  • wait_for_completion (bool) -- Whether to wait for the cluster deletion to complete. The default value is True.

  • aws_conn_id (str) -- The Airflow connection used for AWS credentials. The default connection id is aws_default.

  • poll_interval (float) -- Time (in seconds) to wait between two consecutive calls to check the cluster state.

template_fields :Sequence[str] = ['cluster_identifier'][source]
ui_color = #eeaa11[source]
ui_fgcolor = #ffffff[source]
execute(context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

delete_cluster()[source]
check_status()[source]
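A configuration sketch of a deletion task that keeps a final snapshot; all identifiers are placeholders. Taking a final snapshot requires setting skip_final_cluster_snapshot to False and supplying a snapshot name:

```python
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftDeleteClusterOperator,
)

delete_cluster = RedshiftDeleteClusterOperator(
    task_id="delete_redshift_cluster",
    cluster_identifier="my-redshift-cluster",
    skip_final_cluster_snapshot=False,              # take a final snapshot first
    final_cluster_snapshot_identifier="my-cluster-final-snapshot",
    wait_for_completion=True,                       # block until deletion finishes
    poll_interval=30.0,
)
```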
