airflow.contrib.operators.gcp_container_operator
Module Contents
- class airflow.contrib.operators.gcp_container_operator.GKEClusterDeleteOperator(project_id, name, location, gcp_conn_id='google_cloud_default', api_version='v2', *args, **kwargs)

Bases: airflow.models.BaseOperator
Deletes the cluster, including the Kubernetes endpoint and all worker nodes.
To delete a certain cluster, you must specify the project_id, the name of the cluster, the location that the cluster is in, and the task_id.

Operator Creation:

operator = GKEClusterDeleteOperator(task_id='cluster_delete',
                                    project_id='my-project',
                                    location='cluster-location',
                                    name='cluster-name')
See also
For more detail about deleting clusters, have a look at the reference: https://google-cloud-python.readthedocs.io/en/latest/container/gapic/v1/api.html#google.cloud.container_v1.ClusterManagerClient.delete_cluster
- Parameters
project_id (str) – The Google Developers Console project ID or project number
name (str) – The name of the resource to delete, in this case the cluster name
location (str) – The name of the Google Compute Engine zone in which the cluster resides.
gcp_conn_id (str) – The connection ID to use when connecting to Google Cloud Platform.
api_version (str) – The API version to use.
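A common pattern is to run the delete operator as the final teardown step of a DAG. The sketch below is illustrative only and not taken from this module's documentation: the DAG name, dates, placeholder upstream task, and zone are assumptions, and trigger_rule='all_done' is used so the cluster is removed even if the upstream work fails.

# Minimal teardown sketch (illustrative values throughout).
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.gcp_container_operator import GKEClusterDeleteOperator
from airflow.operators.dummy_operator import DummyOperator

with DAG(dag_id='gke_teardown_example',
         start_date=datetime(2019, 1, 1),
         schedule_interval=None) as dag:

    run_workload = DummyOperator(task_id='run_workload')  # placeholder for real work

    delete_cluster = GKEClusterDeleteOperator(
        task_id='cluster_delete',
        project_id='my-project',
        location='us-central1-a',
        name='my-cluster-name',
        trigger_rule='all_done')  # delete the cluster even if upstream tasks fail

    run_workload >> delete_cluster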
- class airflow.contrib.operators.gcp_container_operator.GKEClusterCreateOperator(project_id, location, body=None, gcp_conn_id='google_cloud_default', api_version='v2', *args, **kwargs)

Bases: airflow.models.BaseOperator
Create a Google Kubernetes Engine Cluster of specified dimensions. The operator will wait until the cluster is created.

The minimum required to define a cluster to create is a dict:

cluster_def = {'name': 'my-cluster-name', 'initial_node_count': 1}

or a Cluster proto:

from google.cloud.container_v1.types import Cluster

cluster_def = Cluster(name='my-cluster-name', initial_node_count=1)

Operator Creation:

operator = GKEClusterCreateOperator(task_id='cluster_create',
                                    project_id='my-project',
                                    location='my-location',
                                    body=cluster_def)
See also
For more detail about creating clusters, have a look at the reference:
google.cloud.container_v1.types.Cluster
- Parameters
project_id (str) – The Google Developers Console project ID or project number
location (str) – The name of the Google Compute Engine zone in which the cluster resides.
body (dict or google.cloud.container_v1.types.Cluster) – The Cluster definition to create. Can be a protobuf message or a Python dict; if a dict, it must match the protobuf message Cluster.
gcp_conn_id (str) – The connection ID to use when connecting to Google Cloud Platform.
api_version (str) – The API version to use.
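As a sketch only, here is a slightly fuller cluster body than the minimal dict above, adding a node configuration. The NodeConfig fields shown and all concrete values (machine type, scopes, zone) are assumptions to be checked against the installed google-cloud-container client library.

# Sketch of a richer cluster definition; values are illustrative only.
from airflow.contrib.operators.gcp_container_operator import GKEClusterCreateOperator
from google.cloud.container_v1.types import Cluster, NodeConfig

cluster_def = Cluster(
    name='my-cluster-name',
    initial_node_count=2,
    node_config=NodeConfig(
        machine_type='n1-standard-1',
        oauth_scopes=['https://www.googleapis.com/auth/cloud-platform']))

create_cluster = GKEClusterCreateOperator(
    task_id='cluster_create',
    project_id='my-project',
    location='us-central1-a',
    body=cluster_def)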
- airflow.contrib.operators.gcp_container_operator.G_APP_CRED = GOOGLE_APPLICATION_CREDENTIALS
- class airflow.contrib.operators.gcp_container_operator.GKEPodOperator(project_id, location, cluster_name, gcp_conn_id='google_cloud_default', *args, **kwargs)

Bases: airflow.contrib.operators.kubernetes_pod_operator.KubernetesPodOperator
Executes a task in a Kubernetes pod in the specified Google Kubernetes Engine cluster.
This Operator assumes that the system has gcloud installed and either has working default application credentials or has configured a connection id with a service account.
The minimum required to run a pod are the variables task_id, project_id, location, cluster_name, name, namespace, and image.

Operator Creation:

operator = GKEPodOperator(task_id='pod_op',
                          project_id='my-project',
                          location='us-central1-a',
                          cluster_name='my-cluster-name',
                          name='task-name',
                          namespace='default',
                          image='perl')
See also
For more detail about application authentication have a look at the reference: https://cloud.google.com/docs/authentication/production#providing_credentials_to_your_application
- Parameters
project_id (str) – The Google Developers Console project ID
location (str) – The name of the Google Kubernetes Engine zone in which the cluster resides, e.g. 'us-central1-a'
cluster_name (str) – The name of the Google Kubernetes Engine cluster the pod should be spawned in
gcp_conn_id (str) – The Google Cloud connection ID to use. This allows users to specify a service account.
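Because GKEPodOperator extends KubernetesPodOperator, the usual pod arguments (for example cmds and arguments) can be passed through as well. The sketch below is illustrative only: the connection ID, zone, and image command are assumptions, not part of this module's documentation.

# Sketch: running a one-off pod workload on an existing GKE cluster.
from airflow.contrib.operators.gcp_container_operator import GKEPodOperator

compute_pi = GKEPodOperator(
    task_id='pod_op',
    project_id='my-project',
    location='us-central1-a',
    cluster_name='my-cluster-name',
    gcp_conn_id='my_gcp_service_account_conn',  # hypothetical connection id
    name='compute-pi',
    namespace='default',
    image='perl',
    cmds=['perl'],  # cmds/arguments come from the KubernetesPodOperator base class
    arguments=['-Mbignum=bpi', '-wle', 'print bpi(2000)'])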
- _set_env_from_extras(self, extras)

Sets the environment variable GOOGLE_APPLICATION_CREDENTIALS with either:

- The path to the keyfile from the specified connection id
- A generated file's path if the user specified JSON in the connection id. The file is assumed to be deleted after the process dies due to how mkstemp() works.

The environment variable is used inside the gcloud command to determine the correct service account to use.
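For illustration only, the behaviour described above could look roughly like the sketch below. The helper name and the extras key names are assumptions, not the actual implementation.

# Rough sketch of the behaviour described above (assumed key names).
import os
import tempfile

KEY_PATH_FIELD = 'extra__google_cloud_platform__key_path'          # assumed field name
KEYFILE_JSON_FIELD = 'extra__google_cloud_platform__keyfile_dict'  # assumed field name

def set_google_credentials_from_extras(extras):
    key_path = extras.get(KEY_PATH_FIELD)
    keyfile_json = extras.get(KEYFILE_JSON_FIELD)
    if key_path:
        # Point gcloud at the keyfile referenced by the connection.
        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = key_path
    elif keyfile_json:
        # Write the JSON from the connection to a temporary file. mkstemp()
        # leaves the file on disk, so it survives until the process dies and
        # the OS (or a tmp cleaner) removes it.
        fd, temp_path = tempfile.mkstemp(suffix='.json')
        with os.fdopen(fd, 'w') as handle:
            handle.write(keyfile_json)
        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = temp_path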
- _get_field(self, extras, field, default=None)

Fetches a field from extras, and returns it. This is some Airflow magic. The google_cloud_platform hook type adds custom UI elements to the hook page, which allow admins to specify service_account, key_path, etc. They get formatted as shown below.
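The snippet below illustrates the shape such extras typically take; the exact extra__google_cloud_platform__ prefix is an assumption about how these hook fields are stored, so treat it as illustrative rather than authoritative.

# Illustrative shape of the extras dict seen by _get_field (assumed key prefix).
extras = {
    'extra__google_cloud_platform__key_path': '/keys/my-service-account.json',
    'extra__google_cloud_platform__project': 'my-project',
}

# _get_field(extras, 'key_path') would then resolve the prefixed key above
# and return '/keys/my-service-account.json'.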