airflow.providers.amazon.aws.transfers.gcs_to_s3

This module contains the Google Cloud Storage to S3 transfer operator.

Module Contents

Classes

GCSToS3Operator

Synchronizes a Google Cloud Storage bucket with an S3 bucket.

class airflow.providers.amazon.aws.transfers.gcs_to_s3.GCSToS3Operator(*, bucket: str, prefix: Optional[str] = None, delimiter: Optional[str] = None, gcp_conn_id: str = 'google_cloud_default', google_cloud_storage_conn_id: Optional[str] = None, delegate_to: Optional[str] = None, dest_aws_conn_id: str = 'aws_default', dest_s3_key: str, dest_verify: Optional[Union[str, bool]] = None, replace: bool = False, google_impersonation_chain: Optional[Union[str, Sequence[str]]] = None, dest_s3_extra_args: Optional[Dict] = None, s3_acl_policy: Optional[str] = None, **kwargs)

Bases: airflow.models.BaseOperator

Synchronizes a Google Cloud Storage bucket with an S3 bucket.

Parameters
  • bucket (str) -- The Google Cloud Storage bucket in which to find the objects. (templated)

  • prefix (str) -- Prefix string that filters objects whose name begins with this prefix. (templated)

  • delimiter (str) -- The delimiter by which you want to filter the objects. (templated) For example, to list only the CSV files in a directory in GCS, you would use delimiter='.csv'.

  • gcp_conn_id (str) -- (Optional) The connection ID used to connect to Google Cloud.

  • google_cloud_storage_conn_id (str) -- (Deprecated) The connection ID used to connect to Google Cloud. Pass the gcp_conn_id parameter instead.

  • delegate_to (str) -- Google account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

  • dest_aws_conn_id (str) -- The connection ID for the destination S3 bucket.

  • dest_s3_key (str) -- The base S3 key to be used to store the files. (templated)

  • dest_verify (bool or str) --

    Whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values:

    • False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified.

    • path/to/cert/bundle.pem: A filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.

  • replace (bool) -- Whether or not to check for the existence of the files in the destination bucket. By default this is set to False. If set to True, all files are uploaded, replacing any existing ones in the destination bucket. If set to False, only the files that exist in the origin but not in the destination bucket are uploaded.

  • google_impersonation_chain (Union[str, Sequence[str]]) -- Optional Google service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account. (templated)

  • s3_acl_policy (str) -- (Optional) The string specifying the canned ACL policy for the object to be uploaded to S3.
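
A minimal usage sketch, assuming an Airflow 2.x environment; the DAG ID, bucket names, and prefix below are hypothetical placeholders, not values taken from this page:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator

    with DAG(
        dag_id="example_gcs_to_s3",  # hypothetical DAG ID
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        sync_gcs_to_s3 = GCSToS3Operator(
            task_id="sync_gcs_to_s3",
            bucket="my-gcs-bucket",                    # source GCS bucket (placeholder)
            prefix="exports/",                         # only objects under this prefix
            delimiter=".csv",                          # only objects ending in .csv
            dest_s3_key="s3://my-s3-bucket/imports/",  # base S3 key (placeholder)
            dest_aws_conn_id="aws_default",
            gcp_conn_id="google_cloud_default",
            replace=False,  # upload only files missing from the destination bucket
        )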

template_fields: Sequence[str] = ['bucket', 'prefix', 'delimiter', 'dest_s3_key', 'google_impersonation_chain']
ui_color = '#f0eee4'
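
Because bucket, prefix, delimiter, dest_s3_key, and google_impersonation_chain are template fields, they accept Jinja expressions. A sketch of a date-partitioned sync, placed inside the DAG from the example above (the layout is a hypothetical illustration):

    partitioned_sync = GCSToS3Operator(
        task_id="partitioned_sync",
        bucket="my-gcs-bucket",
        prefix="exports/{{ ds }}/",                         # rendered per logical date
        dest_s3_key="s3://my-s3-bucket/imports/{{ ds }}/",  # rendered per logical date
    )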
execute(self, context: airflow.utils.context.Context) -> List[str]

This is the main method to derive when creating an operator. The context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
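
Since execute returns a List[str] (the files that were uploaded), the result is pushed to XCom by default (BaseOperator's do_xcom_push=True) and can be read by a downstream task. A sketch, assuming the sync_gcs_to_s3 task from the example above:

    from airflow.operators.python import PythonOperator

    def report_uploaded(ti, **_):
        # Pull the transfer task's return value (the uploaded files) from XCom.
        uploaded = ti.xcom_pull(task_ids="sync_gcs_to_s3")
        print(f"Copied {len(uploaded)} file(s): {uploaded}")

    report = PythonOperator(
        task_id="report_uploaded",
        python_callable=report_uploaded,
    )
    sync_gcs_to_s3 >> report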
