airflow.contrib.operators.datastore_export_operator

Module Contents

class airflow.contrib.operators.datastore_export_operator.DatastoreExportOperator(bucket, namespace=None, datastore_conn_id='google_cloud_default', cloud_storage_conn_id='google_cloud_default', delegate_to=None, entity_filter=None, labels=None, polling_interval_in_seconds=10, overwrite_existing=False, xcom_push=False, *args, **kwargs)

Bases: airflow.models.BaseOperator
Export entities from Google Cloud Datastore to Cloud Storage.
- Parameters
bucket (str) – name of the Cloud Storage bucket to back up data to
namespace (str) – optional namespace path in the specified Cloud Storage bucket to back up data to. If this namespace does not exist in GCS, it will be created.
datastore_conn_id (str) – the name of the Datastore connection id to use
cloud_storage_conn_id (str) – the name of the Cloud Storage connection id used to write the backup
delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
entity_filter (dict) – description of what data from the project is included in the export, refer to https://cloud.google.com/datastore/docs/reference/rest/Shared.Types/EntityFilter
labels (dict) – client-assigned labels for cloud storage
polling_interval_in_seconds (int) – number of seconds to wait between polls of the export operation's status
overwrite_existing (bool) – if the storage bucket + namespace is not empty, it will be emptied prior to the export. This enables overwriting existing backups.
xcom_push (bool) – push the export operation name to XCom for reference by downstream tasks
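The operator submits an export request and then repeatedly polls the resulting long-running operation, sleeping `polling_interval_in_seconds` between checks until the operation leaves its in-progress state. A minimal sketch of that polling pattern, assuming an illustrative `get_state` callable and `"PROCESSING"`/`"SUCCESSFUL"` state names (not the hook's actual API):

```python
import time

def poll_until_done(get_state, polling_interval_in_seconds=10):
    """Poll a long-running operation until it leaves the "PROCESSING" state.

    get_state is a zero-argument callable returning the operation's
    current state string; the state names here are illustrative.
    """
    while True:
        state = get_state()
        if state != "PROCESSING":
            return state
        time.sleep(polling_interval_in_seconds)

# Simulate an export operation that completes on the third poll.
states = iter(["PROCESSING", "PROCESSING", "SUCCESSFUL"])
result = poll_until_done(lambda: next(states), polling_interval_in_seconds=0)
```

In the operator itself, the terminal state would determine whether the task succeeds or raises, and with `xcom_push=True` the operation name is made available to downstream tasks via XCom.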