Source code for airflow.providers.google.cloud.example_dags.example_bigtable
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Example Airflow DAG that creates an environment and performs the following operations on Cloud Bigtable:

- creates an Instance
- creates a Table
- updates a Cluster
- waits for Table replication completeness
- deletes the Table
- deletes the Instance

This DAG relies on the following environment variables:

* GCP_PROJECT_ID - Google Cloud project
* CBT_INSTANCE_ID - desired ID of a Cloud Bigtable instance
* CBT_INSTANCE_DISPLAY_NAME - desired human-readable display name of the Instance
* CBT_INSTANCE_TYPE - type of the Instance, e.g. 1 for DEVELOPMENT
  See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance  # noqa E501
* CBT_INSTANCE_LABELS - labels to add to the Instance
* CBT_CLUSTER_ID - desired ID of the main Cluster created for the Instance
* CBT_CLUSTER_ZONE - zone in which the main Cluster will be created, e.g. europe-west1-b
  See available zones: https://cloud.google.com/bigtable/docs/locations
* CBT_CLUSTER_NODES - initial number of nodes in the Cluster
* CBT_CLUSTER_NODES_UPDATED - number of nodes for BigtableClusterUpdateOperator
* CBT_CLUSTER_STORAGE_TYPE - storage type for the Cluster, e.g. 1 for SSD
  See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance.cluster  # noqa E501
* CBT_TABLE_ID - desired ID of the Table
* CBT_POKE_INTERVAL - number of seconds between every attempt of Sensor check
"""
import json
from datetime import datetime
from os import getenv

from airflow import models
from airflow.providers.google.cloud.operators.bigtable import (
    BigtableCreateInstanceOperator,
    BigtableCreateTableOperator,
    BigtableDeleteInstanceOperator,
    BigtableDeleteTableOperator,
    BigtableUpdateClusterOperator,
    BigtableUpdateInstanceOperator,
)
from airflow.providers.google.cloud.sensors.bigtable import BigtableTableReplicationCompletedSensor
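# The extracted page jumps from the imports straight to the DAG definition, so the
# variable assignments the DAG body relies on are missing. A minimal sketch of how
# the environment variables listed in the docstring would be read with getenv();
# every default value below is an illustrative assumption, not taken from the
# original file.
GCP_PROJECT_ID = getenv('GCP_PROJECT_ID', 'example-project')  # assumed default
CBT_INSTANCE_ID = getenv('CBT_INSTANCE_ID', 'some-instance-id')  # assumed default
CBT_INSTANCE_DISPLAY_NAME = getenv('CBT_INSTANCE_DISPLAY_NAME', 'Human-readable name')  # assumed default
CBT_INSTANCE_TYPE = int(getenv('CBT_INSTANCE_TYPE', '2'))  # assumed default
CBT_INSTANCE_LABELS = json.loads(getenv('CBT_INSTANCE_LABELS', '{}'))  # assumed default
CBT_CLUSTER_ID = getenv('CBT_CLUSTER_ID', 'some-cluster-id')  # assumed default
CBT_CLUSTER_ZONE = getenv('CBT_CLUSTER_ZONE', 'europe-west1-b')  # assumed default
CBT_CLUSTER_NODES = int(getenv('CBT_CLUSTER_NODES', '3'))  # assumed default
CBT_CLUSTER_NODES_UPDATED = int(getenv('CBT_CLUSTER_NODES_UPDATED', '5'))  # assumed default
CBT_CLUSTER_STORAGE_TYPE = int(getenv('CBT_CLUSTER_STORAGE_TYPE', '2'))  # assumed default
CBT_TABLE_ID = getenv('CBT_TABLE_ID', 'some-table-id')  # assumed default
CBT_POKE_INTERVAL = int(getenv('CBT_POKE_INTERVAL', '60'))  # assumed default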
with models.DAG(
    'example_gcp_bigtable_operators',
    schedule_interval='@once',  # Override to match your needs
    start_date=datetime(2021, 1, 1),
    catchup=False,
    tags=['example'],
) as dag:
    # [START howto_operator_gcp_bigtable_instance_create]