airflow.contrib.operators.gcp_transfer_operator

Module Contents

airflow.contrib.operators.gcp_transfer_operator.AwsHook[source]
class airflow.contrib.operators.gcp_transfer_operator.TransferJobPreprocessor(body, aws_conn_id='aws_default', default_schedule=False)[source]
_inject_aws_credentials(self)[source]
_reformat_date(self, field_key)[source]
_reformat_time(self, field_key)[source]
_reformat_schedule(self)[source]
process_body(self)[source]
static _convert_date_to_dict(field_date)[source]

Converts a native Python datetime.date object to a format supported by the API.

static _convert_time_to_dict(time)[source]

Converts a native Python datetime.time object to a format supported by the API.
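
For reference, the conversions target the API's Date and TimeOfDay message shapes. A minimal sketch of the assumed mapping (the key names follow the Storage Transfer Service resource representations; this mirrors, rather than reproduces, the private helpers):

import datetime

# Assumed mapping onto the API's Date message.
def convert_date_to_dict(field_date):
    return {'day': field_date.day, 'month': field_date.month, 'year': field_date.year}

# Assumed mapping onto the API's TimeOfDay message.
def convert_time_to_dict(time):
    return {'hours': time.hour, 'minutes': time.minute, 'seconds': time.second}

assert convert_date_to_dict(datetime.date(2019, 1, 31)) == {'day': 31, 'month': 1, 'year': 2019}
assert convert_time_to_dict(datetime.time(11, 30)) == {'hours': 11, 'minutes': 30, 'seconds': 0}
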

class airflow.contrib.operators.gcp_transfer_operator.TransferJobValidator(body)[source]
_verify_data_source(self)[source]
_restrict_aws_credentials(self)[source]
_restrict_empty_body(self)[source]
validate_body(self)[source]
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceJobCreateOperator(body, aws_conn_id='aws_default', gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Creates a transfer job that runs periodically.

Warning

This operator is NOT idempotent. If you run it many times, many transfer jobs will be created in Google Cloud Platform.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceJobCreateOperator

Parameters
  • body (dict) – (Required) The request body, as described in the Storage Transfer Service API documentation for transferJobs.create.

  • aws_conn_id (str) – The connection ID used to retrieve AWS credentials when the transfer source is Amazon S3.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).
template_fields = ['body', 'gcp_conn_id', 'aws_conn_id'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
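
Example (a minimal sketch; the project and bucket names are placeholders, and the body layout follows the Storage Transfer Service transferJobs resource, shown here as one possible shape rather than the only valid one):

import datetime

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceJobCreateOperator)

create_transfer_job_op = GcpTransferServiceJobCreateOperator(
    task_id='create_transfer_job_example',
    body={
        'description': 'example transfer job',
        'status': 'ENABLED',
        'projectId': 'my-gcp-project',
        # Native datetime.date values are converted for the API by
        # TransferJobPreprocessor.
        'schedule': {'scheduleStartDate': datetime.date(2019, 1, 1)},
        'transferSpec': {
            'awsS3DataSource': {'bucketName': 'my-s3-bucket'},
            'gcsDataSink': {'bucketName': 'my-gcs-bucket'},
        },
    },
    dag=my_dag)  # my_dag is your DAG object
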
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceJobUpdateOperator(job_name, body, aws_conn_id='aws_default', gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Updates a transfer job that runs periodically.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceJobUpdateOperator

Parameters
  • job_name (str) – (Required) Name of the transfer job to be updated.

  • body (dict) – (Required) The request body, as described in the Storage Transfer Service API documentation for transferJobs.patch.

  • aws_conn_id (str) – The connection ID used to retrieve AWS credentials when the transfer source is Amazon S3.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).
template_fields = ['job_name', 'body', 'gcp_conn_id', 'aws_conn_id'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
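
Example (a minimal sketch; the job name is a placeholder and the body layout follows the transferJobs.patch request body):

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceJobUpdateOperator)

update_transfer_job_op = GcpTransferServiceJobUpdateOperator(
    task_id='update_transfer_job_example',
    job_name='transferJobs/123456789',  # placeholder job name
    body={
        'projectId': 'my-gcp-project',
        'transferJob': {'description': 'updated description'},
        # Field mask selecting which fields of the job to patch.
        'updateTransferJobFieldMask': 'description',
    },
    dag=my_dag)
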
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceJobDeleteOperator(job_name, gcp_conn_id='google_cloud_default', api_version='v1', project_id=None, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Deletes a transfer job. This is a soft delete. After a transfer job is deleted, the job and all the transfer executions are subject to garbage collection. Transfer jobs become eligible for garbage collection 30 days after soft delete.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceJobDeleteOperator

Parameters
  • job_name (str) – (Required) Name of the transfer job to delete.

  • project_id (str) – (Optional) The ID of the project that owns the transfer job. If set to None or missing, the default project_id from the GCP connection is used.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).

template_fields = ['job_name', 'project_id', 'gcp_conn_id', 'api_version'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
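
Example (a minimal sketch; the job name and project are placeholders):

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceJobDeleteOperator)

delete_transfer_job_op = GcpTransferServiceJobDeleteOperator(
    task_id='delete_transfer_job_example',
    job_name='transferJobs/123456789',  # placeholder job name
    project_id='my-gcp-project',
    dag=my_dag)
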
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceOperationGetOperator(operation_name, gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Gets the latest state of a long-running operation in Google Storage Transfer Service.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceOperationGetOperator

Parameters
  • operation_name (str) – (Required) Name of the transfer operation.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).

template_fields = ['operation_name', 'gcp_conn_id'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
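
Example (a minimal sketch; the operation name is a placeholder):

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceOperationGetOperator)

get_operation_op = GcpTransferServiceOperationGetOperator(
    task_id='get_operation_example',
    operation_name='transferOperations/123456789',  # placeholder operation name
    dag=my_dag)
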
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceOperationsListOperator(filter, gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Lists long-running operations in Google Storage Transfer Service that match the specified filter.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceOperationsListOperator

Parameters
  • filter (dict) – (Required) A request filter, as described in the Storage Transfer Service API documentation for transferOperations.list.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).
template_fields = ['filter', 'gcp_conn_id'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
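
Example (a minimal sketch; the filter keys shown, project_id and job_names, are assumed to follow the hook's filter format, and the values are placeholders):

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceOperationsListOperator)

list_operations_op = GcpTransferServiceOperationsListOperator(
    task_id='list_operations_example',
    filter={'project_id': 'my-gcp-project',
            'job_names': ['transferJobs/123456789']},
    dag=my_dag)
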
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceOperationPauseOperator(operation_name, gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Pauses a transfer operation in Google Storage Transfer Service.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceOperationPauseOperator

Parameters
  • operation_name (str) – (Required) Name of the transfer operation.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).

template_fields = ['operation_name', 'gcp_conn_id', 'api_version'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
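
Example (a minimal sketch; the operation name is a placeholder). The Resume and Cancel operators below take the same arguments:

from airflow.contrib.operators.gcp_transfer_operator import (
    GcpTransferServiceOperationPauseOperator)

pause_operation_op = GcpTransferServiceOperationPauseOperator(
    task_id='pause_operation_example',
    operation_name='transferOperations/123456789',  # placeholder operation name
    dag=my_dag)
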
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceOperationResumeOperator(operation_name, gcp_conn_id='google_cloud_default', api_version='v1', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Resumes a transfer operation in Google Storage Transfer Service.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceOperationResumeOperator

Parameters
  • operation_name (str) – (Required) Name of the transfer operation.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • api_version (str) – API version used (e.g. v1).

template_fields = ['operation_name', 'gcp_conn_id', 'api_version'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
class airflow.contrib.operators.gcp_transfer_operator.GcpTransferServiceOperationCancelOperator(operation_name, api_version='v1', gcp_conn_id='google_cloud_default', *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Cancels a transfer operation in Google Storage Transfer Service.

See also

For more information on how to use this operator, take a look at the guide: GcpTransferServiceOperationCancelOperator

Parameters
  • operation_name (str) – (Required) Name of the transfer operation.

  • api_version (str) – API version used (e.g. v1).

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

template_fields = ['operation_name', 'gcp_conn_id', 'api_version'][source]
_validate_inputs(self)[source]
execute(self, context)[source]
class airflow.contrib.operators.gcp_transfer_operator.S3ToGoogleCloudStorageTransferOperator(s3_bucket, gcs_bucket, project_id=None, aws_conn_id='aws_default', gcp_conn_id='google_cloud_default', delegate_to=None, description=None, schedule=None, object_conditions=None, transfer_options=None, wait=True, timeout=None, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Synchronizes an S3 bucket with a Google Cloud Storage bucket using the GCP Storage Transfer Service.

Warning

This operator is NOT idempotent. If you run it many times, many transfer jobs will be created in Google Cloud Platform.

Example:

s3_to_gcs_transfer_op = S3ToGoogleCloudStorageTransferOperator(
    task_id='s3_to_gcs_transfer_example',
    s3_bucket='my-s3-bucket',
    project_id='my-gcp-project',
    gcs_bucket='my-gcs-bucket',
    dag=my_dag)
Parameters
  • s3_bucket (str) – (Required) The S3 bucket containing the objects to be transferred.

  • gcs_bucket (str) – (Required) The destination Google Cloud Storage bucket.

  • project_id (str) – (Optional) The ID of the project that owns the transfer job. If set to None or missing, the default project_id from the GCP connection is used.

  • aws_conn_id (str) – The connection ID used to retrieve AWS credentials for the source S3 bucket.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

  • description (str) – (Optional) The description of the transfer job.

  • schedule (dict) – (Optional) Schedule for the transfer job. If not set, the transfer job runs once as soon as the operator runs.

  • object_conditions (dict) – (Optional) Object conditions that determine which objects are transferred.

  • transfer_options (dict) – (Optional) Transfer options, e.g. whether to overwrite or delete objects at the destination.

  • wait (bool) – Wait for the transfer operation to finish before marking the task complete. Defaults to True.

  • timeout (datetime.timedelta) – (Optional) Time to wait for the operation to end.
template_fields = ['gcp_conn_id', 's3_bucket', 'gcs_bucket', 'description', 'object_conditions'][source]
ui_color = '#e09411'[source]
execute(self, context)[source]
_create_body(self)[source]
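
A second sketch showing the optional filtering and transfer options (the field names inside object_conditions and transfer_options are assumed to follow the API's ObjectConditions and TransferOptions messages; bucket and project names are placeholders):

s3_to_gcs_filtered_op = S3ToGoogleCloudStorageTransferOperator(
    task_id='s3_to_gcs_transfer_filtered',
    s3_bucket='my-s3-bucket',
    gcs_bucket='my-gcs-bucket',
    project_id='my-gcp-project',
    # Only transfer objects under the given prefix.
    object_conditions={'includePrefixes': ['logs/']},
    # Remove objects that exist only in the destination bucket.
    transfer_options={'deleteObjectsUniqueInSink': True},
    dag=my_dag)
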
class airflow.contrib.operators.gcp_transfer_operator.GoogleCloudStorageToGoogleCloudStorageTransferOperator(source_bucket, destination_bucket, project_id=None, gcp_conn_id='google_cloud_default', delegate_to=None, description=None, schedule=None, object_conditions=None, transfer_options=None, wait=True, timeout=None, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Copies objects from one bucket to another using the GCP Storage Transfer Service.

Warning

This operator is NOT idempotent. If you run it many times, many transfer jobs will be created in Google Cloud Platform.

Example:

gcs_to_gcs_transfer_op = GoogleCloudStorageToGoogleCloudStorageTransferOperator(
    task_id='gcs_to_gcs_transfer_example',
    source_bucket='my-source-bucket',
    destination_bucket='my-destination-bucket',
    project_id='my-gcp-project',
    dag=my_dag)
Parameters
  • source_bucket (str) – (Required) The source Google Cloud Storage bucket containing the objects to be transferred.

  • destination_bucket (str) – (Required) The destination Google Cloud Storage bucket.

  • project_id (str) – (Optional) The ID of the project that owns the transfer job. If set to None or missing, the default project_id from the GCP connection is used.

  • gcp_conn_id (str) – The connection ID used to connect to Google Cloud Platform.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

  • description (str) – (Optional) The description of the transfer job.

  • schedule (dict) – (Optional) Schedule for the transfer job. If not set, the transfer job runs once as soon as the operator runs.

  • object_conditions (dict) – (Optional) Object conditions that determine which objects are transferred.

  • transfer_options (dict) – (Optional) Transfer options, e.g. whether to overwrite or delete objects at the destination.

  • wait (bool) – Wait for the transfer operation to finish before marking the task complete. Defaults to True.

  • timeout (datetime.timedelta) – (Optional) Time to wait for the operation to end.
template_fields = ['gcp_conn_id', 'source_bucket', 'destination_bucket', 'description', 'object_conditions'][source]
ui_color = '#e09411'[source]
execute(self, context)[source]
_create_body(self)[source]
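
A second sketch showing an asynchronous run: with wait=False the task completes as soon as the transfer job is created, rather than blocking on the transfer operation (bucket and project names are placeholders):

gcs_to_gcs_nowait_op = GoogleCloudStorageToGoogleCloudStorageTransferOperator(
    task_id='gcs_to_gcs_transfer_nowait',
    source_bucket='my-source-bucket',
    destination_bucket='my-destination-bucket',
    project_id='my-gcp-project',
    wait=False,  # do not block until the transfer operation completes
    dag=my_dag)
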
