
SqliteOperator

Use the SqliteOperator to execute Sqlite commands in a Sqlite database.

Using the Operator

Use the sqlite_conn_id argument to connect to your Sqlite instance where the connection metadata is structured as follows:

Sqlite Airflow Connection Metadata

Parameter          Input
Host (string)      Sqlite hostname
Schema (string)    Schema to execute SQL operations on by default
Login (string)     Sqlite user
Password (string)  Sqlite user password
Port (int)         Sqlite port
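One common way to define such a connection is through an environment variable holding a connection URI. This is a minimal sketch: the connection id `sqlite_conn_id` matches the examples below, and the database path `/tmp/example.db` is a hypothetical placeholder.

```shell
# Hypothetical: register a Sqlite connection named "sqlite_conn_id".
# Airflow reads connections from AIRFLOW_CONN_<CONN_ID> variables,
# where the URI scheme ("sqlite") selects the connection type and
# the path points at the database file.
export AIRFLOW_CONN_SQLITE_CONN_ID='sqlite:////tmp/example.db'
```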

An example usage of the SqliteOperator is as follows:

airflow/providers/sqlite/example_dags/example_sqlite.py


# Example of creating a task that calls a common CREATE TABLE sql command.
create_table_sqlite_task = SqliteOperator(
    task_id='create_table_sqlite',
    sqlite_conn_id='sqlite_conn_id',
    sql=r"""
    CREATE TABLE table_name (
        column_1 string,
        column_2 string,
        column_3 string
    );
    """,
    dag=dag,
)
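For reference, the DDL statement above can be tried outside Airflow with Python's built-in sqlite3 module. This is a standalone sketch against an in-memory database, not part of the example DAG:

```python
import sqlite3

# Execute the same CREATE TABLE statement the operator would run.
# Sqlite accepts arbitrary type names such as "string".
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE table_name (
        column_1 string,
        column_2 string,
        column_3 string
    );
""")

# Confirm the table was created by querying sqlite_master.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)  # ['table_name']
conn.close()
```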

Furthermore, you can use an external file to execute the SQL commands. The scripts folder must be at the same level as the DAG file.

airflow/providers/sqlite/example_dags/example_sqlite.py


# Example of creating a task that runs a SQL command from an external file.
external_create_table_sqlite_task = SqliteOperator(
    task_id='create_table_sqlite_external_file',
    sqlite_conn_id='sqlite_conn_id',
    sql='/scripts/create_table.sql',
    dag=dag,
)

Reference

For further information, see the SQLite documentation.

Note

Parameters passed to SqliteOperator() take precedence over parameters set via Airflow connection metadata (such as schema, login, password, etc.).
