SnowflakeOperator¶
Use the SnowflakeOperator to execute SQL commands in a Snowflake database.
Using the Operator¶
Use the snowflake_conn_id argument to connect to your Snowflake instance, where the connection metadata is structured as follows:
| Parameter | Input |
|---|---|
| Login: string | Snowflake user name |
| Password: string | Password for Snowflake user |
| Schema: string | Set schema to execute SQL operations on by default |
| Extra: dictionary | |
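How the connection itself is created is beyond the scope of this page, but as one hedged illustration the same metadata can be supplied as a connection URI through an environment variable named AIRFLOW_CONN_<CONN_ID>. The connection id, credentials, and query-string keys below are placeholders; the exact extra fields recognized (account, warehouse, database, region, role, and so on) depend on your Snowflake provider version, so treat this as a sketch rather than a reference:

import os

# Hypothetical connection id "snowflake_conn_id"; the login/password map to the
# Login/Password fields above, the path component to Schema, and the query
# string to the Extra dictionary.
os.environ["AIRFLOW_CONN_SNOWFLAKE_CONN_ID"] = (
    "snowflake://my_user:my_password@/my_schema"
    "?account=my_account&warehouse=my_warehouse&database=my_database&role=my_role"
)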
An example usage of the SnowflakeOperator is as follows:
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# SNOWFLAKE_CONN_ID and the other UPPER_CASE constants referenced below are
# defined elsewhere in the example DAG file.
dag = DAG(
    'example_snowflake',
    start_date=datetime(2021, 1, 1),
    default_args={'snowflake_conn_id': SNOWFLAKE_CONN_ID},
    tags=['example'],
    catchup=False,
)
snowflake_op_sql_str = SnowflakeOperator(
    task_id='snowflake_op_sql_str',
    dag=dag,
    sql=CREATE_TABLE_SQL_STRING,
    warehouse=SNOWFLAKE_WAREHOUSE,
    database=SNOWFLAKE_DATABASE,
    schema=SNOWFLAKE_SCHEMA,
    role=SNOWFLAKE_ROLE,
)
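# For illustration only (not part of the provider): CREATE_TABLE_SQL_STRING
# above is assumed to hold a plain DDL string, for example
#   "CREATE OR REPLACE TRANSIENT TABLE sample_table (name VARCHAR(250), id INT);"
# where the table name and columns are placeholders.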
snowflake_op_with_params = SnowflakeOperator(
    task_id='snowflake_op_with_params',
    dag=dag,
    sql=SQL_INSERT_STATEMENT,
    parameters={"id": 56},
    warehouse=SNOWFLAKE_WAREHOUSE,
    database=SNOWFLAKE_DATABASE,
    schema=SNOWFLAKE_SCHEMA,
    role=SNOWFLAKE_ROLE,
)
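# For illustration only: SQL_INSERT_STATEMENT above is assumed to be a
# parametrized statement using pyformat-style binding, for example
#   "INSERT INTO sample_table VALUES ('name', %(id)s)"
# so that parameters={"id": 56} supplies the value for the %(id)s placeholder.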
snowflake_op_sql_list = SnowflakeOperator(
    task_id='snowflake_op_sql_list',
    dag=dag,
    sql=SQL_LIST,
)
snowflake_op_sql_multiple_stmts = SnowflakeOperator(
    task_id='snowflake_op_sql_multiple_stmts',
    dag=dag,
    sql=SQL_MULTIPLE_STMTS,
)
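# For illustration only: SQL_LIST is assumed to be a Python list of SQL
# statements, each executed as its own query, while SQL_MULTIPLE_STMTS is
# assumed to be a single string joining those statements with semicolons, e.g.
#   SQL_LIST = [SQL_INSERT_STATEMENT % {"id": n} for n in range(10)]
#   SQL_MULTIPLE_STMTS = "; ".join(SQL_LIST)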
snowflake_op_template_file = SnowflakeOperator(
    task_id='snowflake_op_template_file',
    dag=dag,
    sql='/path/to/sql/<filename>.sql',
)
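In the last task, sql points at a file on disk rather than an inline statement. Because sql is a templated field, a path ending in .sql is read and rendered with Jinja before execution, so values can be passed into the file through the operator's templated params argument. The task below is a hedged sketch rather than part of the provider's examples; the parameter name and table name are placeholders:

snowflake_op_template_file_with_params = SnowflakeOperator(
    task_id='snowflake_op_template_file_with_params',
    dag=dag,
    sql='/path/to/sql/<filename>.sql',  # the file can reference {{ params.table_name }}
    params={'table_name': 'sample_table'},  # hypothetical template parameter
)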
Note
Parameters passed to the operator take priority over the parameters already set in the Airflow connection metadata (such as schema, role, database, and so forth).