airflow.providers.google.cloud.hooks.cloud_storage_transfer_service

This module contains a Google Storage Transfer Service Hook.

Module Contents

airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.log[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TIME_TO_SLEEP_IN_SECONDS = 10[source]
class airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.GcpTransferJobsStatus[source]

Class with Google Cloud Transfer jobs statuses.

ENABLED = 'ENABLED'[source]
DISABLED = 'DISABLED'[source]
DELETED = 'DELETED'[source]
class airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.GcpTransferOperationStatus[source]

Class with Google Cloud Transfer operations statuses.

IN_PROGRESS = 'IN_PROGRESS'[source]
PAUSED = 'PAUSED'[source]
SUCCESS = 'SUCCESS'[source]
FAILED = 'FAILED'[source]
ABORTED = 'ABORTED'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.ACCESS_KEY_ID = 'accessKeyId'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.ALREADY_EXISTING_IN_SINK = 'overwriteObjectsAlreadyExistingInSink'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.AWS_ACCESS_KEY = 'awsAccessKey'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.AWS_S3_DATA_SOURCE = 'awsS3DataSource'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.BODY = 'body'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.BUCKET_NAME = 'bucketName'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.COUNTERS = 'counters'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.DAY = 'day'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.DESCRIPTION = 'description'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.FILTER = 'filter'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.FILTER_JOB_NAMES = 'job_names'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.FILTER_PROJECT_ID = 'project_id'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.GCS_DATA_SINK = 'gcsDataSink'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.GCS_DATA_SOURCE = 'gcsDataSource'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.HOURS = 'hours'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.HTTP_DATA_SOURCE = 'httpDataSource'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.JOB_NAME = 'name'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.LIST_URL = 'list_url'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.METADATA = 'metadata'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.MINUTES = 'minutes'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.MONTH = 'month'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.NAME = 'name'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.OBJECT_CONDITIONS = 'object_conditions'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.OPERATIONS = 'operations'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.PROJECT_ID = 'projectId'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.SCHEDULE = 'schedule'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.SCHEDULE_END_DATE = 'scheduleEndDate'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.SCHEDULE_START_DATE = 'scheduleStartDate'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.SECONDS = 'seconds'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.SECRET_ACCESS_KEY = 'secretAccessKey'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.START_TIME_OF_DAY = 'startTimeOfDay'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.STATUS = 'status'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.STATUS1 = 'status'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_JOB = 'transfer_job'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_JOBS = 'transferJobs'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_JOB_FIELD_MASK = 'update_transfer_job_field_mask'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_OPERATIONS = 'transferOperations'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_OPTIONS = 'transfer_options'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.TRANSFER_SPEC = 'transferSpec'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.YEAR = 'year'[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.ALREADY_EXIST_CODE = 409[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.NEGATIVE_STATUSES[source]
airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.gen_job_name(job_name: str) -> str[source]
Adds a unique suffix to the job name. If a suffix already exists, it is updated. The suffix is the current timestamp.
Parameters

job_name (str) -- Name of the job

Returns

job_name with suffix

Return type

str
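The suffixing behavior can be sketched in plain Python (an illustrative reconstruction, not the library's implementation):

```python
import time


def gen_job_name(job_name: str) -> str:
    # Illustrative sketch: append the current timestamp as a unique suffix;
    # if a numeric suffix is already present, replace it with a fresh one.
    uniq = int(time.time())
    base, sep, suffix = job_name.rpartition("_")
    if sep and suffix.isdigit():
        return f"{base}_{uniq}"
    return f"{job_name}_{uniq}"


print(gen_job_name("transferJobs/my-job"))
```

Calling it twice on the same base name yields distinct names as long as a second has elapsed, which is why the hook uses it to avoid 409 ALREADY_EXISTS errors on job creation.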

class airflow.providers.google.cloud.hooks.cloud_storage_transfer_service.CloudDataTransferServiceHook(api_version: str = 'v1', gcp_conn_id: str = 'google_cloud_default', delegate_to: Optional[str] = None, impersonation_chain: Optional[Union[str, Sequence[str]]] = None)[source]

Bases: airflow.providers.google.common.hooks.base_google.GoogleBaseHook

Hook for Google Storage Transfer Service.

All hook methods that use project_id must be called with keyword arguments rather than positional arguments.

get_conn(self)[source]

Retrieves connection to Google Storage Transfer service.

Returns

Google Storage Transfer service object

Return type

dict

create_transfer_job(self, body: dict)[source]

Creates a transfer job that runs periodically.

Parameters

body (dict) -- (Required) A request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/create#request-body

Returns

transfer job. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#TransferJob

Return type

dict
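A request body for a periodic GCS-to-GCS job might look like the following sketch. Field names follow the TransferJob resource shape linked above; the project ID, bucket names, and schedule values are placeholders:

```python
import json

# Minimal transfer job body following the TransferJob REST resource shape.
# All concrete values below are placeholders for illustration.
body = {
    "description": "Nightly backup",
    "status": "ENABLED",
    "projectId": "my-project",
    "schedule": {
        "scheduleStartDate": {"day": 1, "month": 1, "year": 2024},
        "startTimeOfDay": {"hours": 2, "minutes": 0, "seconds": 0},
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": "source-bucket"},
        "gcsDataSink": {"bucketName": "sink-bucket"},
        "transferOptions": {"overwriteObjectsAlreadyExistingInSink": True},
    },
}

print(json.dumps(body, indent=2))
```

Such a dict would be passed as the body argument: `hook.create_transfer_job(body=body)`.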

get_transfer_job(self, job_name: str, project_id: str)[source]

Gets a transfer job in Google Storage Transfer Service.

Parameters
  • job_name (str) -- (Required) Name of the job to be fetched

  • project_id (str) -- (Optional) the ID of the project that owns the Transfer Job. If set to None or missing, the default project_id from the Google Cloud connection is used.

Returns

Transfer Job

Return type

dict

list_transfer_job(self, request_filter: Optional[dict] = None, **kwargs)[source]

Lists transfer jobs in Google Storage Transfer Service that match the specified filter.

Parameters

request_filter (dict) -- (Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/list#body.QUERY_PARAMETERS.filter

Returns

List of Transfer Jobs

Return type

list[dict]

enable_transfer_job(self, job_name: str, project_id: str)[source]

Enables a transfer job. New transfers will be performed based on the schedule.

Parameters
  • job_name (str) -- (Required) Name of the job to be updated

  • project_id (str) -- (Optional) the ID of the project that owns the Transfer Job. If set to None or missing, the default project_id from the Google Cloud connection is used.

Returns

If successful, TransferJob.

Return type

dict

update_transfer_job(self, job_name: str, body: dict)[source]

Updates a transfer job that runs periodically.

Parameters
  • job_name (str) -- (Required) Name of the job to be updated

  • body (dict) -- (Required) A request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/patch#request-body

Returns

If successful, TransferJob.

Return type

dict

delete_transfer_job(self, job_name: str, project_id: str)[source]

Deletes a transfer job. This is a soft delete. After a transfer job is deleted, the job and all the transfer executions are subject to garbage collection. Transfer jobs become eligible for garbage collection 30 days after soft delete.

Parameters
  • job_name (str) -- (Required) Name of the job to be deleted

  • project_id (str) -- (Optional) the ID of the project that owns the Transfer Job. If set to None or missing, the default project_id from the Google Cloud connection is used.

Return type

None

cancel_transfer_operation(self, operation_name: str)[source]

Cancels a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) -- Name of the transfer operation.

Return type

None

get_transfer_operation(self, operation_name: str)[source]

Gets a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) -- (Required) Name of the transfer operation.

Returns

transfer operation. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/Operation

Return type

dict

list_transfer_operations(self, request_filter: Optional[dict] = None, **kwargs)[source]

Lists transfer operations in Google Storage Transfer Service that match the specified filter.

Parameters

request_filter (dict) --

(Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/list#body.QUERY_PARAMETERS.filter With one additional improvement: project_id is optional here and defaults to the project ID from the Google Cloud connection.

Returns

List of transfer operations

Return type

list[dict]
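A request filter for this method can be sketched as a plain dict keyed by the module's FILTER_PROJECT_ID ('project_id') and FILTER_JOB_NAMES ('job_names') constants; the project and job values below are placeholders:

```python
import json

# Filter dict in the shape list_transfer_operations accepts; the hook
# serializes it for the API. Values here are placeholders.
request_filter = {
    "project_id": "my-project",        # optional: falls back to the connection's project
    "job_names": ["transferJobs/my-job"],
}

print(json.dumps(request_filter))
```

The same dict shape applies to list_transfer_job, which takes the filter under the same keys.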

pause_transfer_operation(self, operation_name: str)[source]

Pauses a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) -- (Required) Name of the transfer operation.

Return type

None

resume_transfer_operation(self, operation_name: str)[source]

Resumes a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) -- (Required) Name of the transfer operation.

Return type

None

wait_for_transfer_job(self, job: dict, expected_statuses: Optional[Set[str]] = None, timeout: Optional[Union[float, timedelta]] = None)[source]

Waits until the job reaches the expected state.

Parameters
  • job (dict) -- (Required) The transfer job to wait for. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#TransferJob

  • expected_statuses (Optional[Set[str]]) -- (Optional) The expected state(s) of the operation. See: GcpTransferOperationStatus

  • timeout (Optional[Union[float, timedelta]]) -- (Optional) Time in which the operation must reach an expected state.

Return type

None
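The waiting behavior can be sketched as a polling loop. Here fetch_statuses is a hypothetical stand-in for listing the job's transfer operations through the API; the real hook raises an Airflow exception on timeout, for which TimeoutError stands in:

```python
import time
from typing import Callable, Set

TIME_TO_SLEEP_IN_SECONDS = 10  # mirrors the module-level constant


def wait_for_statuses(
    fetch_statuses: Callable[[], Set[str]],
    expected_statuses: Set[str],
    timeout: float,
    poll_interval: float = TIME_TO_SLEEP_IN_SECONDS,
) -> None:
    # Poll until any reported status is one of the expected statuses,
    # or the deadline passes.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if fetch_statuses() & expected_statuses:
            return
        time.sleep(poll_interval)
    raise TimeoutError("Timeout: the transfer job did not reach an expected status.")


# A stubbed fetcher that reports SUCCESS immediately:
wait_for_statuses(lambda: {"SUCCESS"}, {"SUCCESS"}, timeout=5, poll_interval=0)
print("job reached an expected status")
```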

static operations_contain_expected_statuses(operations: List[dict], expected_statuses: Union[Set[str], str])[source]

Checks whether the operation list contains an operation with one of the expected statuses; returns True if it does. Raises airflow.exceptions.AirflowException if it encounters an operation in the FAILED or ABORTED state.

Parameters
  • operations (list[dict]) -- (Required) List of transfer operations to check.

  • expected_statuses (Union[Set[str], str]) -- (Required) The expected status(es) of the operations.

Returns

True if there is an operation with an expected status in the operation list, False otherwise.

Raises

airflow.exceptions.AirflowException -- If it encounters an operation in the FAILED or ABORTED state.

Return type

bool
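The check can be sketched in plain Python. RuntimeError stands in for airflow.exceptions.AirflowException so the snippet stays dependency-free, and each operation is assumed to carry its status under metadata["status"], mirroring the transferOperations resource:

```python
NEGATIVE_STATUSES = {"FAILED", "ABORTED"}  # mirrors the module constant


def operations_contain_expected_statuses(operations, expected_statuses):
    # Accept a single status string or a set of statuses.
    if isinstance(expected_statuses, str):
        expected_statuses = {expected_statuses}
    current = {op["metadata"]["status"] for op in operations}
    # Fail fast on FAILED/ABORTED operations that were not explicitly expected.
    negative = current & (NEGATIVE_STATUSES - expected_statuses)
    if negative:
        raise RuntimeError(f"An unexpected operation status was encountered: {negative}")
    return bool(current & expected_statuses)


ops = [{"metadata": {"status": "SUCCESS"}}, {"metadata": {"status": "IN_PROGRESS"}}]
print(operations_contain_expected_statuses(ops, "SUCCESS"))  # → True
```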
