tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python

Example Airflow DAG for testing the Google Dataflow Beam Pipeline Operator with Python for a streaming job.

Module Contents

tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.ENV_ID
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.PROJECT_ID
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.DAG_ID = 'dataflow_native_python_streaming'
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.RESOURCE_DATA_BUCKET = 'airflow-system-tests-resources'
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.BUCKET_NAME
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.GCS_TMP
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.GCS_STAGING
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.GCS_PYTHON_SCRIPT
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.LOCATION = 'europe-west3'
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.TOPIC_ID
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.default_args
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.create_bucket
tests.system.providers.google.cloud.dataflow.example_dataflow_streaming_python.test_run
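
The attributes listed above are module-level values in the example DAG. Below is a minimal, hypothetical sketch of how such a streaming Dataflow system test is commonly assembled; the operator choices (GCSCreateBucketOperator, BeamRunPythonPipelineOperator), the pipeline options, the GCS script path, the bucket/topic naming, and the get_test_run helper are illustrative assumptions, not the module's actual source:

from __future__ import annotations

import os
from datetime import datetime

from airflow.models.dag import DAG
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator

ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT")
DAG_ID = "dataflow_native_python_streaming"

RESOURCE_DATA_BUCKET = "airflow-system-tests-resources"
BUCKET_NAME = f"bucket_{DAG_ID}_{ENV_ID}"

GCS_TMP = f"gs://{BUCKET_NAME}/temp/"
GCS_STAGING = f"gs://{BUCKET_NAME}/staging/"
# Hypothetical script location; the real resource path may differ.
GCS_PYTHON_SCRIPT = f"gs://{RESOURCE_DATA_BUCKET}/dataflow/python/streaming_wordcount.py"
LOCATION = "europe-west3"
TOPIC_ID = f"topic-{DAG_ID}"

default_args = {
    "dataflow_default_options": {
        "tempLocation": GCS_TMP,
        "stagingLocation": GCS_STAGING,
    }
}

with DAG(
    DAG_ID,
    default_args=default_args,
    schedule="@once",
    start_date=datetime(2021, 1, 1),
    catchup=False,
    tags=["example", "dataflow", "streaming"],
) as dag:
    # Scratch bucket for the pipeline's temp and staging locations.
    create_bucket = GCSCreateBucketOperator(task_id="create_bucket", bucket_name=BUCKET_NAME)

    # Submit the streaming Beam pipeline to Dataflow. With streaming enabled the job
    # keeps running until drained or cancelled by a teardown task (omitted in this sketch,
    # along with Pub/Sub topic setup and bucket cleanup).
    start_streaming_python_job = BeamRunPythonPipelineOperator(
        task_id="start_streaming_python_job",
        runner="DataflowRunner",
        py_file=GCS_PYTHON_SCRIPT,
        pipeline_options={
            "temp_location": GCS_TMP,
            "input_topic": f"projects/{PROJECT_ID}/topics/{TOPIC_ID}",
            "streaming": True,
        },
        py_interpreter="python3",
        py_system_site_packages=False,
        dataflow_config={"location": LOCATION},
    )

    create_bucket >> start_streaming_python_job

# System test modules expose a `test_run` object so the DAG can be executed via pytest;
# the helper import below follows the common pattern and is assumed here.
from tests.system.utils import get_test_run  # noqa: E402

test_run = get_test_run(dag)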
