airflow.contrib.hooks.gcp_transfer_hook

Module Contents

airflow.contrib.hooks.gcp_transfer_hook.TIME_TO_SLEEP_IN_SECONDS = 10[source]
class airflow.contrib.hooks.gcp_transfer_hook.GcpTransferJobsStatus[source]
ENABLED = 'ENABLED'[source]
DISABLED = 'DISABLED'[source]
DELETED = 'DELETED'[source]
class airflow.contrib.hooks.gcp_transfer_hook.GcpTransferOperationStatus[source]
IN_PROGRESS = 'IN_PROGRESS'[source]
PAUSED = 'PAUSED'[source]
SUCCESS = 'SUCCESS'[source]
FAILED = 'FAILED'[source]
ABORTED = 'ABORTED'[source]
airflow.contrib.hooks.gcp_transfer_hook.ACCESS_KEY_ID = 'accessKeyId'[source]
airflow.contrib.hooks.gcp_transfer_hook.ALREADY_EXISTING_IN_SINK = 'overwriteObjectsAlreadyExistingInSink'[source]
airflow.contrib.hooks.gcp_transfer_hook.AWS_ACCESS_KEY = 'awsAccessKey'[source]
airflow.contrib.hooks.gcp_transfer_hook.AWS_S3_DATA_SOURCE = 'awsS3DataSource'[source]
airflow.contrib.hooks.gcp_transfer_hook.BODY = 'body'[source]
airflow.contrib.hooks.gcp_transfer_hook.BUCKET_NAME = 'bucketName'[source]
airflow.contrib.hooks.gcp_transfer_hook.DAY = 'day'[source]
airflow.contrib.hooks.gcp_transfer_hook.DESCRIPTION = 'description'[source]
airflow.contrib.hooks.gcp_transfer_hook.FILTER = 'filter'[source]
airflow.contrib.hooks.gcp_transfer_hook.FILTER_JOB_NAMES = 'job_names'[source]
airflow.contrib.hooks.gcp_transfer_hook.FILTER_PROJECT_ID = 'project_id'[source]
airflow.contrib.hooks.gcp_transfer_hook.GCS_DATA_SINK = 'gcsDataSink'[source]
airflow.contrib.hooks.gcp_transfer_hook.GCS_DATA_SOURCE = 'gcsDataSource'[source]
airflow.contrib.hooks.gcp_transfer_hook.HOURS = 'hours'[source]
airflow.contrib.hooks.gcp_transfer_hook.HTTP_DATA_SOURCE = 'httpDataSource'[source]
airflow.contrib.hooks.gcp_transfer_hook.LIST_URL = 'list_url'[source]
airflow.contrib.hooks.gcp_transfer_hook.METADATA = 'metadata'[source]
airflow.contrib.hooks.gcp_transfer_hook.MINUTES = 'minutes'[source]
airflow.contrib.hooks.gcp_transfer_hook.MONTH = 'month'[source]
airflow.contrib.hooks.gcp_transfer_hook.NAME = 'name'[source]
airflow.contrib.hooks.gcp_transfer_hook.OBJECT_CONDITIONS = 'object_conditions'[source]
airflow.contrib.hooks.gcp_transfer_hook.OPERATIONS = 'operations'[source]
airflow.contrib.hooks.gcp_transfer_hook.PROJECT_ID = 'projectId'[source]
airflow.contrib.hooks.gcp_transfer_hook.SCHEDULE = 'schedule'[source]
airflow.contrib.hooks.gcp_transfer_hook.SCHEDULE_END_DATE = 'scheduleEndDate'[source]
airflow.contrib.hooks.gcp_transfer_hook.SCHEDULE_START_DATE = 'scheduleStartDate'[source]
airflow.contrib.hooks.gcp_transfer_hook.SECONDS = 'seconds'[source]
airflow.contrib.hooks.gcp_transfer_hook.SECRET_ACCESS_KEY = 'secretAccessKey'[source]
airflow.contrib.hooks.gcp_transfer_hook.START_TIME_OF_DAY = 'startTimeOfDay'[source]
airflow.contrib.hooks.gcp_transfer_hook.STATUS = 'status'[source]
airflow.contrib.hooks.gcp_transfer_hook.STATUS1 = 'status'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_JOB = 'transfer_job'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_JOB_FIELD_MASK = 'update_transfer_job_field_mask'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_JOBS = 'transferJobs'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_OPERATIONS = 'transferOperations'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_OPTIONS = 'transfer_options'[source]
airflow.contrib.hooks.gcp_transfer_hook.TRANSFER_SPEC = 'transferSpec'[source]
airflow.contrib.hooks.gcp_transfer_hook.YEAR = 'year'[source]
airflow.contrib.hooks.gcp_transfer_hook.NEGATIVE_STATUSES[source]
class airflow.contrib.hooks.gcp_transfer_hook.GCPTransferServiceHook(api_version='v1', gcp_conn_id='google_cloud_default', delegate_to=None)[source]

Bases: airflow.contrib.hooks.gcp_api_base_hook.GoogleCloudBaseHook

Hook for Google Storage Transfer Service.

_conn[source]
get_conn(self)[source]

Retrieves the connection to the Google Storage Transfer service.

Returns

Google Storage Transfer service object

Return type

dict
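
For illustration, a minimal sketch of obtaining the service object; it assumes a working google_cloud_default connection:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook(api_version='v1', gcp_conn_id='google_cloud_default')
    service = hook.get_conn()  # authorized Storage Transfer service object, cached in _conn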

create_transfer_job(self, body)[source]

Creates a transfer job that runs periodically.

Parameters

body (dict) – (Required) A request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/create#request-body

Returns

transfer job. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#TransferJob

Return type

dict
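
A hedged usage sketch; the project, buckets, dates, and description below are placeholders, and the body keys follow the module-level constants listed above:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    body = {
        'description': 'Nightly GCS-to-GCS copy',  # placeholder description
        'status': 'ENABLED',
        'projectId': 'my-project-id',              # placeholder project
        'schedule': {
            'scheduleStartDate': {'day': 1, 'month': 1, 'year': 2019},
        },
        'transferSpec': {
            'gcsDataSource': {'bucketName': 'source-bucket'},  # placeholder buckets
            'gcsDataSink': {'bucketName': 'sink-bucket'},
        },
    }
    job = hook.create_transfer_job(body=body)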

get_transfer_job(self, job_name, project_id=None)[source]

Gets the latest state of the specified transfer job in Google Storage Transfer Service.

Parameters
  • job_name (str) – (Required) Name of the job to be fetched

  • project_id (str) – (Optional) the ID of the project that owns the Transfer Job. If set to None or missing, the default project_id from the GCP connection is used.

Returns

Transfer Job

Return type

dict
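
An illustrative call; the job name and project ID are placeholders:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    job = hook.get_transfer_job(job_name='transferJobs/1234567890',
                                project_id='my-project-id')
    print(job['status'])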

list_transfer_job(self, filter)[source]

Lists transfer jobs in Google Storage Transfer Service that match the specified filter.

Parameters

filter (dict) – (Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/list#body.QUERY_PARAMETERS.filter

Returns

List of Transfer Jobs

Return type

list[dict]
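
A sketch of the filter shape; the snake_case keys mirror the FILTER_PROJECT_ID and FILTER_JOB_NAMES constants above, and the values are placeholders:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    jobs = hook.list_transfer_job(filter={
        'project_id': 'my-project-id',
        'job_names': ['transferJobs/1234567890'],
    })
    for job in jobs:
        print(job['name'])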

update_transfer_job(self, job_name, body)[source]

Updates a transfer job that runs periodically.

Parameters
  • job_name (str) – (Required) Name of the job to be updated

  • body (dict) – (Required) A request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/patch#request-body

Returns

If successful, TransferJob.

Return type

dict
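
A hedged sketch; the body keys mirror the TRANSFER_JOB and TRANSFER_JOB_FIELD_MASK constants above, and the job name and new description are placeholders:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    body = {
        'transfer_job': {'description': 'Updated nightly copy'},
        'update_transfer_job_field_mask': 'description',
    }
    job = hook.update_transfer_job(job_name='transferJobs/1234567890', body=body)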

delete_transfer_job(self, job_name, project_id)[source]

Deletes a transfer job. This is a soft delete. After a transfer job is deleted, the job and all the transfer executions are subject to garbage collection. Transfer jobs become eligible for garbage collection 30 days after soft delete.

Parameters
  • job_name (str) – (Required) Name of the job to be deleted

  • project_id (str) – (Optional) the ID of the project that owns the Transfer Job. If set to None or missing, the default project_id from the GCP connection is used.

Return type

None
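
For illustration, with placeholder names:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    # Soft delete: the job's status becomes DELETED and it is eligible for
    # garbage collection 30 days later.
    hook.delete_transfer_job(job_name='transferJobs/1234567890',
                             project_id='my-project-id')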

cancel_transfer_operation(self, operation_name)[source]

Cancels a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) – (Required) Name of the transfer operation.

Return type

None

get_transfer_operation(self, operation_name)[source]

Gets a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) – (Required) Name of the transfer operation.

Returns

transfer operation. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/Operation

Return type

dict

list_transfer_operations(self, filter)[source]

Lists transfer operations in Google Storage Transfer Service that match the specified filter.

Parameters

filter (dict) –

(Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/list#body.QUERY_PARAMETERS.filter, with one additional improvement: the project_id is optional if a default project_id is defined in the GCP connection.

Returns

List of transfer operations

Return type

list[dict]
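
A sketch mirroring the list_transfer_job filter shape; the values are placeholders:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    operations = hook.list_transfer_operations(filter={
        'project_id': 'my-project-id',             # optional if set on the connection
        'job_names': ['transferJobs/1234567890'],  # placeholder job name
    })
    for operation in operations:
        print(operation['name'])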

pause_transfer_operation(self, operation_name)[source]

Pauses a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) – (Required) Name of the transfer operation.

Return type

None

resume_transfer_operation(self, operation_name)[source]

Resumes a transfer operation in Google Storage Transfer Service.

Parameters

operation_name (str) – (Required) Name of the transfer operation.

Return type

None
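
A short sketch tying the three operation-control methods together; the operation name is a placeholder:

    from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

    hook = GCPTransferServiceHook()
    operation_name = 'transferOperations/1234567890'  # placeholder name
    hook.pause_transfer_operation(operation_name)   # suspend a running operation
    hook.resume_transfer_operation(operation_name)  # continue a paused operation
    hook.cancel_transfer_operation(operation_name)  # abort the operation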

wait_for_transfer_job(self, job, expected_statuses=GcpTransferOperationStatus.SUCCESS, timeout=60)[source]

Waits until the job reaches the expected state.

Parameters
  • job (dict) – (Required) The transfer job to wait for. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#TransferJob

  • expected_statuses (set[str]) – (Optional) The expected status(es) of the job's transfer operations. Defaults to GcpTransferOperationStatus.SUCCESS.

  • timeout (int) – (Optional) Time in seconds to wait before failing. Defaults to 60.

Return type

None
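
A hedged sketch, assuming the job dict comes from get_transfer_job and that an unmet status within the timeout raises AirflowException; all names are placeholders:

    from airflow.contrib.hooks.gcp_transfer_hook import (
        GCPTransferServiceHook,
        GcpTransferOperationStatus,
    )

    hook = GCPTransferServiceHook()
    job = hook.get_transfer_job(job_name='transferJobs/1234567890',
                                project_id='my-project-id')
    # Polls every TIME_TO_SLEEP_IN_SECONDS until the job's operations reach
    # SUCCESS, or fails after 120 seconds.
    hook.wait_for_transfer_job(job,
                               expected_statuses={GcpTransferOperationStatus.SUCCESS},
                               timeout=120)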

_inject_project_id(self, body, param_name, target_key)[source]
static operations_contain_expected_statuses(operations, expected_statuses)[source]

Checks whether the operation list contains an operation with one of the expected statuses and, if so, returns True. Raises airflow.exceptions.AirflowException if it encounters an operation in the FAILED or ABORTED state.

Parameters
  • operations (list[dict]) – (Required) List of transfer operations to check

  • expected_statuses (set[str]) – (Required) Statuses that are expected

Returns

True if there is an operation with an expected status in the operation list, False otherwise

Raises

airflow.exceptions.AirflowException – if any operation is in a FAILED or ABORTED state

Return type

bool
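
An illustrative call; the operation dict shape (status nested under metadata) is inferred from the METADATA and STATUS constants above:

    from airflow.contrib.hooks.gcp_transfer_hook import (
        GCPTransferServiceHook,
        GcpTransferOperationStatus,
    )

    operations = [
        {'name': 'transferOperations/123',  # placeholder operation
         'metadata': {'status': GcpTransferOperationStatus.SUCCESS}},
    ]
    done = GCPTransferServiceHook.operations_contain_expected_statuses(
        operations, expected_statuses={GcpTransferOperationStatus.SUCCESS})
    # done is True; a FAILED or ABORTED operation would raise AirflowException.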
