airflow.contrib.operators.datastore_import_operator
Module Contents
class airflow.contrib.operators.datastore_import_operator.DatastoreImportOperator(bucket, file, namespace=None, entity_filter=None, labels=None, datastore_conn_id='google_cloud_default', delegate_to=None, polling_interval_in_seconds=10, xcom_push=False, *args, **kwargs)

Bases: airflow.models.BaseOperator
Import entities from Cloud Storage into Google Cloud Datastore.
Parameters
bucket (str) – name of the Cloud Storage bucket that holds the exported data to import
file (str) – path of the backup metadata file in the specified Cloud Storage bucket. It should have the extension .overall_export_metadata
namespace (str) – optional namespace of the backup metadata file in the specified Cloud Storage bucket.
entity_filter (dict) – description of which data from the export is imported, for example specific kinds or namespaces; refer to https://cloud.google.com/datastore/docs/reference/rest/Shared.Types/EntityFilter and see the usage example below
labels (dict) – client-assigned labels for the import request
datastore_conn_id (str) – the name of the connection id to use
delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
polling_interval_in_seconds (int) – number of seconds to wait before polling for execution status again
xcom_push (bool) – push the operation name to XCom for reference
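Example (a minimal usage sketch, not taken from the Airflow source; the DAG settings, bucket name, backup file path, and entity filter values are hypothetical placeholders):

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.datastore_import_operator import DatastoreImportOperator

with DAG(
    dag_id="datastore_import_example",
    start_date=datetime(2019, 1, 1),
    schedule_interval=None,  # run on demand only
) as dag:
    import_entities = DatastoreImportOperator(
        task_id="import_from_gcs",
        # Hypothetical bucket and backup metadata file produced by a prior Datastore export.
        bucket="my-datastore-backups",
        file="2019-01-01/2019-01-01.overall_export_metadata",
        # Optional EntityFilter restricting which kinds/namespaces are imported
        # (fields follow the REST reference linked above).
        entity_filter={"kinds": ["Customer"], "namespaceIds": [""]},
        datastore_conn_id="google_cloud_default",
        polling_interval_in_seconds=10,
        # Per the parameter description above, push the operation name to XCom.
        xcom_push=True,
    )

With xcom_push=True the operation name is available in XCom, so a downstream task can look up the import operation for reference.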