Module Contents




class airflow.providers.cncf.kubernetes.operators.spark_kubernetes.SparkKubernetesOperator(*, application_file, namespace=None, kubernetes_conn_id='kubernetes_default', api_group='', api_version='v1beta2', **kwargs)[source]

Bases: airflow.models.BaseOperator

Creates a sparkApplication object in a Kubernetes cluster.

See also

For more detail about the Spark Application object, see the SparkApplication reference documentation.

Parameters

  • application_file (str) -- Defines the Kubernetes 'custom_resource_definition' of 'sparkApplication' as either a path to a '.yaml' or '.json' file, a YAML string, or a JSON string.

  • namespace (Optional[str]) -- The Kubernetes namespace in which to create the sparkApplication.

  • kubernetes_conn_id (str) -- The Kubernetes connection ID for the target Kubernetes cluster.

  • api_group (str) -- The Kubernetes API group of the sparkApplication.

  • api_version (str) -- The Kubernetes API version of the sparkApplication.
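A sparkApplication manifest passed via application_file might look like the following sketch, modeled on the common spark-pi example for the Kubernetes Operator for Apache Spark. The image, namespace, and resource values are illustrative assumptions, not operator defaults:

```yaml
# Illustrative sparkApplication manifest (field values are assumptions)
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    cores: 1
    instances: 1
    memory: 512m
```

Saving this as a '.yaml' file and passing its path as application_file would let the operator submit it to the cluster.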

template_fields :Sequence[str] = ['application_file', 'namespace'][source]
template_ext :Sequence[str] = ['.yaml', '.yml', '.json'][source]
ui_color = #f4a460[source]
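Because application_file also accepts a raw JSON string (and '.json' appears in template_ext), a manifest can be assembled in plain Python and serialized with the standard library. This is a sketch with illustrative field values, not operator defaults:

```python
import json

# Build an illustrative sparkApplication manifest as a Python dict.
# All field values here are example assumptions, not operator defaults.
manifest = {
    "apiVersion": "sparkoperator.k8s.io/v1beta2",
    "kind": "SparkApplication",
    "metadata": {"name": "spark-pi", "namespace": "default"},
    "spec": {
        "type": "Scala",
        "mode": "cluster",
        "mainClass": "org.apache.spark.examples.SparkPi",
        "sparkVersion": "3.1.1",
    },
}

# Serialize to a JSON string suitable for the application_file argument.
application_file = json.dumps(manifest, indent=2)
print(application_file)
```

The resulting string could be passed directly as application_file when constructing the operator.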
execute(self, context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
