:mod:`airflow.contrib.operators.datastore_export_operator`
===========================================================

.. py:module:: airflow.contrib.operators.datastore_export_operator


Module Contents
---------------

.. py:class:: DatastoreExportOperator(bucket, namespace=None, datastore_conn_id='google_cloud_default', cloud_storage_conn_id='google_cloud_default', delegate_to=None, entity_filter=None, labels=None, polling_interval_in_seconds=10, overwrite_existing=False, xcom_push=False, *args, **kwargs)

   Bases: :class:`airflow.models.BaseOperator`

   Export entities from Google Cloud Datastore to Cloud Storage.

   :param bucket: name of the Cloud Storage bucket to back up data to
   :type bucket: str
   :param namespace: optional namespace path in the specified Cloud Storage bucket
       to back up data to. If this namespace does not exist in GCS, it will be created.
   :type namespace: str
   :param datastore_conn_id: the name of the Datastore connection id to use
   :type datastore_conn_id: str
   :param cloud_storage_conn_id: the name of the Cloud Storage connection id used to
       force-write the backup
   :type cloud_storage_conn_id: str
   :param delegate_to: the account to impersonate, if any. For this to work, the
       service account making the request must have domain-wide delegation enabled.
   :type delegate_to: str
   :param entity_filter: description of which data from the project is included in
       the export; refer to
       https://cloud.google.com/datastore/docs/reference/rest/Shared.Types/EntityFilter
   :type entity_filter: dict
   :param labels: client-assigned labels for Cloud Storage
   :type labels: dict
   :param polling_interval_in_seconds: number of seconds to wait before polling for
       execution status again
   :type polling_interval_in_seconds: int
   :param overwrite_existing: if the storage bucket + namespace is not empty, it will
       be emptied prior to exports. This enables overwriting existing backups.
   :type overwrite_existing: bool
   :param xcom_push: push the operation name to XCom for reference
   :type xcom_push: bool

   .. method:: execute(self, context)
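
A minimal usage sketch is shown below, assuming a DAG defined in the usual way. The DAG id, schedule, bucket name, namespace, and entity filter values are hypothetical placeholders; the operator and its parameters are as documented above.

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.datastore_export_operator import DatastoreExportOperator

    with DAG(
        dag_id='datastore_export_example',           # hypothetical DAG id
        schedule_interval=None,
        start_date=datetime(2020, 1, 1),
    ) as dag:
        export_entities = DatastoreExportOperator(
            task_id='export_datastore_to_gcs',
            bucket='my-datastore-backups',           # hypothetical GCS bucket name
            namespace='nightly',                     # optional namespace path inside the bucket
            entity_filter={'kinds': ['Customer']},   # hypothetical filter: export only the 'Customer' kind
            overwrite_existing=True,                 # empty the bucket/namespace before exporting
            xcom_push=True,                          # push the export operation name to XCom
        )

Both connection id parameters default to ``google_cloud_default``, so they only need to be set explicitly when a different Airflow connection should be used for Datastore or Cloud Storage.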