tests.system.google.cloud.dataflow.example_dataflow_native_python

Example Airflow DAG for testing Google Dataflow Beam Pipeline Operator with Python.

Attributes

ENV_ID

DAG_ID

RESOURCE_DATA_BUCKET

BUCKET_NAME

GCS_TMP

GCS_STAGING

GCS_OUTPUT

GCS_PYTHON_SCRIPT

LOCATION

default_args

create_bucket

test_run

Module Contents

tests.system.google.cloud.dataflow.example_dataflow_native_python.ENV_ID
tests.system.google.cloud.dataflow.example_dataflow_native_python.DAG_ID = 'dataflow_native_python'
tests.system.google.cloud.dataflow.example_dataflow_native_python.RESOURCE_DATA_BUCKET = 'airflow-system-tests-resources'
tests.system.google.cloud.dataflow.example_dataflow_native_python.BUCKET_NAME = 'bucket_dataflow_native_python_{ENV_ID}'
tests.system.google.cloud.dataflow.example_dataflow_native_python.GCS_TMP = 'gs://bucket_dataflow_native_python_{ENV_ID}/temp/'
tests.system.google.cloud.dataflow.example_dataflow_native_python.GCS_STAGING = 'gs://bucket_dataflow_native_python_{ENV_ID}/staging/'
tests.system.google.cloud.dataflow.example_dataflow_native_python.GCS_OUTPUT = 'gs://bucket_dataflow_native_python_{ENV_ID}/output'
tests.system.google.cloud.dataflow.example_dataflow_native_python.GCS_PYTHON_SCRIPT = 'gs://airflow-system-tests-resources/dataflow/python/wordcount_debugging.py'
tests.system.google.cloud.dataflow.example_dataflow_native_python.LOCATION = 'europe-west3'
tests.system.google.cloud.dataflow.example_dataflow_native_python.default_args
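
The values above are assembled at import time from DAG_ID and the test-environment identifier ENV_ID, which is read from an environment variable at runtime and therefore shown only as a placeholder here. A minimal sketch of the assumed pattern, mirroring other Airflow system tests (the SYSTEM_TESTS_ENV_ID variable name and the contents of default_args are assumptions, not confirmed by this page):

import os

# ENV_ID is resolved at runtime; the rendered values above show it as a placeholder.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID", "default")  # assumed variable name
DAG_ID = "dataflow_native_python"
RESOURCE_DATA_BUCKET = "airflow-system-tests-resources"

# Bucket and GCS paths are interpolated from DAG_ID and ENV_ID.
BUCKET_NAME = f"bucket_{DAG_ID}_{ENV_ID}"
GCS_TMP = f"gs://{BUCKET_NAME}/temp/"
GCS_STAGING = f"gs://{BUCKET_NAME}/staging/"
GCS_OUTPUT = f"gs://{BUCKET_NAME}/output"
GCS_PYTHON_SCRIPT = f"gs://{RESOURCE_DATA_BUCKET}/dataflow/python/wordcount_debugging.py"
LOCATION = "europe-west3"

# Assumed shape of default_args: route Dataflow temp and staging to the test bucket.
default_args = {
    "dataflow_default_options": {
        "tempLocation": GCS_TMP,
        "stagingLocation": GCS_STAGING,
    }
}
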
tests.system.google.cloud.dataflow.example_dataflow_native_python.create_bucket
tests.system.google.cloud.dataflow.example_dataflow_native_python.test_run
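
create_bucket and test_run suggest the usual system-test layout: a GCS bucket is created to back the pipeline's temp, staging, and output paths, wordcount_debugging.py is submitted as a Beam pipeline on the Dataflow runner, and test_run exposes the DAG to pytest. A minimal sketch under those assumptions, continuing from the constants above; the pipeline and cleanup task names, operator arguments, and the get_test_run import path are illustrative, not confirmed by this page:

from datetime import datetime

from airflow.models.dag import DAG
from airflow.providers.apache.beam.hooks.beam import BeamRunnerType
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
from airflow.providers.google.cloud.operators.gcs import (
    GCSCreateBucketOperator,
    GCSDeleteBucketOperator,
)
from airflow.utils.trigger_rule import TriggerRule

with DAG(
    DAG_ID,
    schedule="@once",
    start_date=datetime(2021, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["example", "dataflow"],
) as dag:
    # Bucket that backs GCS_TMP, GCS_STAGING, and GCS_OUTPUT.
    create_bucket = GCSCreateBucketOperator(task_id="create_bucket", bucket_name=BUCKET_NAME)

    # Submit wordcount_debugging.py to Dataflow via the Beam operator (illustrative task).
    start_python_job = BeamRunPythonPipelineOperator(
        runner=BeamRunnerType.DataflowRunner,
        task_id="start_python_job",
        py_file=GCS_PYTHON_SCRIPT,
        py_options=[],
        pipeline_options={"output": GCS_OUTPUT},
        py_requirements=["apache-beam[gcp]"],
        py_interpreter="python3",
        py_system_site_packages=False,
        dataflow_config={"location": LOCATION, "job_name": "start_python_job"},
    )

    # Clean up the bucket even when upstream tasks fail (illustrative task).
    delete_bucket = GCSDeleteBucketOperator(
        task_id="delete_bucket",
        bucket_name=BUCKET_NAME,
        trigger_rule=TriggerRule.ALL_DONE,
    )

    create_bucket >> start_python_job >> delete_bucket

# Standard system-test hook that lets pytest run the example DAG; import path is an assumption.
from tests.system.utils import get_test_run  # noqa: E402

test_run = get_test_run(dag)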
