airflow.providers.cncf.kubernetes.sensors.spark_kubernetes

Module Contents

Classes

SparkKubernetesSensor

Checks a sparkApplication object in a Kubernetes cluster.

class airflow.providers.cncf.kubernetes.sensors.spark_kubernetes.SparkKubernetesSensor(*, application_name, attach_log=False, namespace=None, kubernetes_conn_id='kubernetes_default', api_group='sparkoperator.k8s.io', api_version='v1beta2', **kwargs)

Bases: airflow.sensors.base.BaseSensorOperator

Checks a sparkApplication object in a Kubernetes cluster.

See also

For more detail about the SparkApplication object, have a look at the API reference: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/v1beta2-1.1.0-2.4.5/docs/api-docs.md#sparkapplication

Parameters
  • application_name (str) -- name of the sparkApplication resource

  • namespace (Optional[str]) -- the Kubernetes namespace where the sparkApplication resides

  • kubernetes_conn_id (str) -- the connection ID of the Kubernetes cluster

  • attach_log (bool) -- whether the driver pod's logs should be appended to the sensor log

  • api_group (str) -- Kubernetes API group of the sparkApplication

  • api_version (str) -- Kubernetes API version of the sparkApplication

template_fields: Sequence[str] = ['application_name', 'namespace']
FAILURE_STATES = ['FAILED', 'UNKNOWN']
SUCCESS_STATES = ['COMPLETED']
poke(self, context)

Function that sensors deriving from this class should override. Here it checks the sparkApplication's current state: it returns True for a state in SUCCESS_STATES, raises an exception for a state in FAILURE_STATES, and returns False otherwise so the sensor pokes again.
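The terminal-state decision described above can be sketched in plain Python. Note that this is a simplified illustration, not the provider's actual implementation: the real sensor fetches the sparkApplication status through a Kubernetes hook and logs driver-pod output when attach_log is set, whereas here the state string is passed in directly and the helper name `evaluate_state` is an assumption.

```python
# Constants as documented on the sensor class.
FAILURE_STATES = ['FAILED', 'UNKNOWN']
SUCCESS_STATES = ['COMPLETED']


def evaluate_state(application_state: str) -> bool:
    """Sketch of poke()'s decision: True when done, False to poke again,
    an exception when the application has failed."""
    if application_state in FAILURE_STATES:
        # The real sensor raises AirflowException here; RuntimeError is a
        # stand-in so this sketch has no Airflow dependency.
        raise RuntimeError(
            f'Spark application failed with state: {application_state}'
        )
    if application_state in SUCCESS_STATES:
        return True
    # Any other state (e.g. SUBMITTED, RUNNING) means "not finished yet",
    # so the sensor will poke again after its poke_interval.
    return False


print(evaluate_state('COMPLETED'))  # True
print(evaluate_state('RUNNING'))    # False
```

Because the sensor raises on FAILURE_STATES rather than returning False, a failed sparkApplication fails the task immediately instead of waiting for the sensor's timeout.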
