tests.system.google.cloud.dataproc.example_dataproc_spark

Example Airflow DAG for DataprocSubmitJobOperator with a Spark job.

Module Contents

tests.system.google.cloud.dataproc.example_dataproc_spark.ENV_ID[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.DAG_ID = 'dataproc_spark'[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.PROJECT_ID[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME_BASE[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME_FULL[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.REGION = 'europe-west1'[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_CONFIG[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.SPARK_JOB[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.create_cluster[source]
tests.system.google.cloud.dataproc.example_dataproc_spark.test_run[source]
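The attributes above follow the conventions used across Airflow's Google Cloud system tests: identifiers are read from environment variables, the cluster name is derived from the DAG ID, and `CLUSTER_CONFIG` / `SPARK_JOB` are plain dicts consumed by the cluster and job operators. A minimal sketch of how such constants are typically assembled; the concrete machine types, disk sizes, and jar path are illustrative assumptions, not values taken from the source module:

```python
import os

# ENV_ID and PROJECT_ID usually come from environment variables set by the
# system-test harness; the fallback values here are placeholders.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID", "default")
PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT", "default")
DAG_ID = "dataproc_spark"
REGION = "europe-west1"

# Dataproc cluster names may not contain underscores, hence the replace().
CLUSTER_NAME = f"cluster-{ENV_ID}-{DAG_ID}".replace("_", "-")

# An illustrative small-cluster configuration (machine types and disk
# sizes are assumptions for the sketch).
CLUSTER_CONFIG = {
    "master_config": {
        "num_instances": 1,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
    "worker_config": {
        "num_instances": 2,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
}

# Job definition in the shape DataprocSubmitJobOperator expects; SparkPi
# is the stock example class shipped on Dataproc images.
SPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "spark_job": {
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
        "main_class": "org.apache.spark.examples.SparkPi",
    },
}

print(CLUSTER_NAME)
print(SPARK_JOB["spark_job"]["main_class"])
```

In the real DAG, `create_cluster` would be a `DataprocCreateClusterOperator` taking `CLUSTER_CONFIG`, and the Spark job would be submitted via `DataprocSubmitJobOperator(job=SPARK_JOB, region=REGION, project_id=PROJECT_ID)`.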
