Amazon Redshift

Amazon Redshift manages all the work of setting up, operating, and scaling a data warehouse: provisioning capacity, monitoring and backing up the cluster, and applying patches and upgrades to the Amazon Redshift engine. You can focus on using your data to acquire new insights for your business and customers.

Prerequisite Tasks

To use these operators, you must do a few things:

* Create the necessary resources using the AWS Console or the AWS CLI.
* Install the API libraries via pip (see the command after this list).
* Set up an Amazon Web Services connection in Airflow.
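
The provider libraries are pulled in with the standard Amazon extra of the Airflow package:

pip install 'apache-airflow[amazon]'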

Operators

Create an Amazon Redshift cluster

To create an Amazon Redshift cluster with the specified parameters, you can use RedshiftCreateClusterOperator.

airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py[source]

task_create_cluster = RedshiftCreateClusterOperator(
    task_id="redshift_create_cluster",
    cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
    cluster_type="single-node",
    node_type="dc2.large",
    master_username="adminuser",
    master_user_password="dummypass",
)
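
The snippets on this page are excerpts from the linked example DAG, so they omit imports and the cluster identifier they reference. A minimal sketch of what they assume, based on a recent Amazon provider release where the cluster operators and sensor live in the redshift_cluster modules; the identifier value is illustrative only:

from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftCreateClusterOperator,
    RedshiftDeleteClusterOperator,
    RedshiftPauseClusterOperator,
    RedshiftResumeClusterOperator,
)
from airflow.providers.amazon.aws.sensors.redshift_cluster import RedshiftClusterSensor

# Any unique cluster name works; this value is only an example.
REDSHIFT_CLUSTER_IDENTIFIER = "redshift-cluster-1"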

Resume an Amazon Redshift cluster

To resume a ‘paused’ Amazon Redshift cluster, you can use RedshiftResumeClusterOperator.

airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py[source]

task_resume_cluster = RedshiftResumeClusterOperator(
    task_id='redshift_resume_cluster',
    cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
)

Pause an Amazon Redshift cluster

To pause an ‘available’ Amazon Redshift cluster, you can use RedshiftPauseClusterOperator.

airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py[source]

task_pause_cluster = RedshiftPauseClusterOperator(
    task_id='redshift_pause_cluster',
    cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
)

Delete an Amazon Redshift cluster

To delete an Amazon Redshift cluster, you can use RedshiftDeleteClusterOperator.

airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py[source]

task_delete_cluster = RedshiftDeleteClusterOperator(
    task_id="delete_cluster",
    cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
)

Sensors

Wait on an Amazon Redshift cluster state

To poll the state of an Amazon Redshift cluster until it reaches the target status or another terminal state, you can use RedshiftClusterSensor.

airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py[source]

task_wait_cluster_available = RedshiftClusterSensor(
    task_id='sensor_redshift_cluster_available',
    cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
    target_status='available',
    poke_interval=5,  # seconds between status checks
    timeout=60 * 15,  # fail the sensor after 15 minutes
)
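
Taken together, the tasks above can be chained so that each step runs only after the cluster has reached the state it expects. A minimal sketch of one possible ordering on Airflow 2.x, assuming the imports shown earlier; the DAG id, schedule, identifier value, and the extra sensors between state transitions are illustrative and not necessarily the layout of the linked example DAG:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.redshift_cluster import (
    RedshiftCreateClusterOperator,
    RedshiftDeleteClusterOperator,
    RedshiftPauseClusterOperator,
    RedshiftResumeClusterOperator,
)
from airflow.providers.amazon.aws.sensors.redshift_cluster import RedshiftClusterSensor

REDSHIFT_CLUSTER_IDENTIFIER = "redshift-cluster-1"  # illustrative name

with DAG(
    dag_id="example_redshift_cluster",  # illustrative DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # trigger manually only
    catchup=False,
) as dag:
    create_cluster = RedshiftCreateClusterOperator(
        task_id="redshift_create_cluster",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
        cluster_type="single-node",
        node_type="dc2.large",
        master_username="adminuser",
        master_user_password="dummypass",
    )

    wait_cluster_available = RedshiftClusterSensor(
        task_id="sensor_redshift_cluster_available",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
        target_status="available",
        poke_interval=5,
        timeout=60 * 15,
    )

    pause_cluster = RedshiftPauseClusterOperator(
        task_id="redshift_pause_cluster",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
    )

    # Wait for the pause to finish before trying to resume.
    wait_cluster_paused = RedshiftClusterSensor(
        task_id="sensor_redshift_cluster_paused",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
        target_status="paused",
        poke_interval=5,
        timeout=60 * 15,
    )

    resume_cluster = RedshiftResumeClusterOperator(
        task_id="redshift_resume_cluster",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
    )

    # Wait for the cluster to be available again before deleting it.
    wait_cluster_resumed = RedshiftClusterSensor(
        task_id="sensor_redshift_cluster_resumed",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
        target_status="available",
        poke_interval=5,
        timeout=60 * 15,
    )

    delete_cluster = RedshiftDeleteClusterOperator(
        task_id="delete_cluster",
        cluster_identifier=REDSHIFT_CLUSTER_IDENTIFIER,
    )

    # Create, wait until available, exercise pause/resume, then tear down.
    create_cluster >> wait_cluster_available >> pause_cluster >> wait_cluster_paused
    wait_cluster_paused >> resume_cluster >> wait_cluster_resumed >> delete_cluster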
