S3ToSnowflakeOperator

Use the S3ToSnowflakeOperator to load data stored in AWS S3 into a Snowflake table.

Using the Operator

As with the SnowflakeOperator, use the snowflake_conn_id and the other relevant parameters to establish a connection with your Snowflake instance. This operator loads one or more named files from a specific Snowflake stage (a predefined S3 path). To do so, pass the relevant file names to the s3_keys parameter and the relevant Snowflake stage to the stage parameter. The file_format parameter can either reference an existing named Snowflake file format or take a custom string that defines one (see the Snowflake documentation).

An example usage of the S3ToSnowflakeOperator is as follows:

airflow/providers/snowflake/example_dags/example_snowflake.py


from airflow.providers.snowflake.transfers.s3_to_snowflake import S3ToSnowflakeOperator

copy_into_table = S3ToSnowflakeOperator(
    task_id='copy_into_table',
    s3_keys=[S3_FILE_PATH],        # names of the files in the stage to load
    table=SNOWFLAKE_SAMPLE_TABLE,  # target Snowflake table
    schema=SNOWFLAKE_SCHEMA,       # schema containing the target table
    stage=SNOWFLAKE_STAGE,         # named Snowflake stage pointing at the S3 path
    file_format="(type = 'CSV',field_delimiter = ';')",  # inline file format definition
    dag=dag,
)
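
If a file format has already been created in Snowflake, file_format can reference it by name instead of taking an inline definition. A minimal sketch of this variant, assuming a pre-created format named my_csv_format (a hypothetical name) and the default snowflake_default connection:

copy_into_table_named_format = S3ToSnowflakeOperator(
    task_id='copy_into_table_named_format',
    snowflake_conn_id='snowflake_default',  # connection to your Snowflake instance
    s3_keys=[S3_FILE_PATH],
    table=SNOWFLAKE_SAMPLE_TABLE,
    schema=SNOWFLAKE_SCHEMA,
    stage=SNOWFLAKE_STAGE,
    file_format='my_csv_format',  # hypothetical named file format created in Snowflake
    dag=dag,
)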
