tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql

Example Airflow DAG for DataprocSubmitJobOperator with a Spark SQL job. A usage sketch of the module attributes follows the listing below.

Module Contents

tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.ENV_ID[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.DAG_ID = dataproc_spark_sql[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.PROJECT_ID[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.CLUSTER_NAME[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.REGION = europe-west1[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.ZONE = europe-west1-b[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.CLUSTER_CONFIG[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.TIMEOUT[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.SPARK_SQL_JOB[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.create_cluster[source]
tests.system.providers.google.cloud.dataproc.example_dataproc_spark_sql.test_run[source]
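
The attributes listed above are typically wired together as a cluster-create, submit-job, cluster-delete pipeline. The following is a minimal sketch of that wiring, not the module's exact source: the machine types in CLUSTER_CONFIG, the SHOW DATABASES query in SPARK_SQL_JOB, and the DAG scheduling arguments are illustrative assumptions, and the TIMEOUT constant and the pytest test_run hookup are omitted.

from __future__ import annotations

import os
from datetime import datetime

from airflow.models.dag import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    DataprocSubmitJobOperator,
)

# Environment-derived identifiers, as in other Dataproc system-test examples.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
DAG_ID = "dataproc_spark_sql"
PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT")
CLUSTER_NAME = f"cluster-{ENV_ID}-{DAG_ID}".replace("_", "-")
REGION = "europe-west1"
ZONE = "europe-west1-b"

# Illustrative cluster shape; the real CLUSTER_CONFIG may differ.
CLUSTER_CONFIG = {
    "master_config": {
        "num_instances": 1,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
    "worker_config": {
        "num_instances": 2,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
}

# Dataproc Job proto with a spark_sql_job section; the query is a placeholder.
SPARK_SQL_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "spark_sql_job": {"query_list": {"queries": ["SHOW DATABASES;"]}},
}

with DAG(
    DAG_ID,
    schedule="@once",
    start_date=datetime(2021, 1, 1),
    catchup=False,
    tags=["example", "dataproc"],
) as dag:
    create_cluster = DataprocCreateClusterOperator(
        task_id="create_cluster",
        project_id=PROJECT_ID,
        cluster_config=CLUSTER_CONFIG,
        region=REGION,
        cluster_name=CLUSTER_NAME,
    )

    spark_sql_task = DataprocSubmitJobOperator(
        task_id="spark_sql_task",
        job=SPARK_SQL_JOB,
        region=REGION,
        project_id=PROJECT_ID,
    )

    delete_cluster = DataprocDeleteClusterOperator(
        task_id="delete_cluster",
        project_id=PROJECT_ID,
        cluster_name=CLUSTER_NAME,
        region=REGION,
    )

    create_cluster >> spark_sql_task >> delete_cluster

In the actual system test, a test_run object is also created from the DAG so the example can be executed via pytest; that boilerplate is left out of the sketch above.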