airflow.operators.sql
Module Contents¶
Classes¶
BaseSQLOperator – This is a base class for generic SQL Operator to get a DB Hook
SQLCheckOperator – Performs checks against a db.
SQLValueCheckOperator – Performs a simple value check using sql code.
SQLIntervalCheckOperator – Checks that the values of metrics given as SQL expressions are within a certain tolerance of the ones from days_back before.
SQLThresholdCheckOperator – Performs a value check using sql code against a minimum threshold and a maximum threshold.
BranchSQLOperator – Allows a DAG to "branch" or follow a specified path based on the results of a SQL query.
- class airflow.operators.sql.BaseSQLOperator(*, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]¶
Bases: airflow.models.BaseOperator
This is a base class for generic SQL operators that provides a DB hook.
The provided method is .get_db_hook(). The default behavior will try to retrieve the DB hook based on the connection type. You can customize the behavior by overriding the .get_db_hook() method.
- get_db_hook(self) → airflow.hooks.dbapi.DbApiHook[source]¶
Get the database hook for the connection.
- Returns
the database hook object.
- Return type
airflow.hooks.dbapi.DbApiHook
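The hook-resolution behavior described above can be sketched in plain Python. This is an illustrative sketch only, not Airflow's implementation: the registry and class names below are hypothetical, chosen to show the override pattern.

```python
# Illustrative sketch (not Airflow's code): the default get_db_hook()
# resolves a hook from the connection type; a subclass may override it.
# HOOK_REGISTRY and all class names here are hypothetical.
HOOK_REGISTRY = {"postgres": "PostgresHook", "mysql": "MySqlHook"}

class BaseSQLOperatorSketch:
    def __init__(self, conn_type):
        self.conn_type = conn_type

    def get_db_hook(self):
        # Default behavior: look the hook up by connection type.
        return HOOK_REGISTRY[self.conn_type]

class CustomSQLOperatorSketch(BaseSQLOperatorSketch):
    def get_db_hook(self):
        # Overriding lets an operator force a specific hook.
        return "MyCustomHook"

print(BaseSQLOperatorSketch("postgres").get_db_hook())    # PostgresHook
print(CustomSQLOperatorSketch("postgres").get_db_hook())  # MyCustomHook
```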
- class airflow.operators.sql.SQLCheckOperator(*, sql: str, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]¶
Bases: BaseSQLOperator
Performs checks against a db. The SQLCheckOperator expects a sql query that will return a single row. Each value on that first row is evaluated using python bool casting. If any of the values return False, the check fails and errors out.
Note that Python bool casting evals the following as False:
- False
- 0
- Empty string ("")
- Empty list ([])
- Empty dictionary or set ({})
Given a query like SELECT COUNT(*) FROM foo, it will fail only if the count == 0. You can craft much more complex queries that could, for instance, check that the table has the same number of rows as the source table upstream, or that the count of today's partition is greater than yesterday's partition, or that a set of metrics are less than 3 standard deviations from the 7-day average.
This operator can be used as a data quality check in your pipeline, and depending on where you put it in your DAG, you have the choice to stop the critical path, preventing the publication of dubious data, or to run it on the side and receive email alerts without stopping the progress of the DAG.
- Parameters
sql (str) – the sql to be executed. (templated)
conn_id (str) – the connection ID used to connect to the database.
database (Optional[str]) – name of database which will overwrite the defined one in connection
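The bool-casting rule described above can be sketched as a standalone function. This is an illustrative sketch of the documented semantics, not Airflow's code; the function name is hypothetical.

```python
# Illustrative sketch (not Airflow's code) of the SQLCheckOperator rule:
# every value on the first returned row must be truthy under bool() casting.
def check_row(records):
    if not records:
        # The real operator also fails when the query returns no rows.
        raise ValueError("The query returned no rows")
    return all(bool(v) for v in records)

print(check_row([1, "2021-01-01", 3.14]))  # True: all values are truthy
print(check_row([5, 0]))                   # False: 0 casts to False
```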
- class airflow.operators.sql.SQLValueCheckOperator(*, sql: str, pass_value: Any, tolerance: Any = None, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]¶
Bases: BaseSQLOperator
Performs a simple value check using sql code.
- Parameters
sql (str) – the sql to be executed. (templated)
pass_value (Any) – the expected value the query result is compared against (templated)
tolerance (Any) – if set, the check passes when the result falls within pass_value * (1 ± tolerance)
conn_id (str) – the connection ID used to connect to the database.
database (Optional[str]) – name of database which will overwrite the defined one in connection
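A value check with an optional tolerance can be sketched as follows. This is an illustrative sketch under the assumption that the tolerance is a ratio applied symmetrically around pass_value; the function name is hypothetical, not Airflow's API.

```python
# Illustrative sketch (not Airflow's code): compare a query result against
# pass_value, optionally allowing a symmetric relative tolerance.
def value_check(result, pass_value, tolerance=None):
    if tolerance is None:
        # Without a tolerance, require exact equality.
        return result == pass_value
    # With a tolerance, accept anything within pass_value * (1 ± tolerance).
    return pass_value * (1 - tolerance) <= result <= pass_value * (1 + tolerance)

print(value_check(100, 100))                  # True: exact match
print(value_check(95, 100, tolerance=0.1))    # True: within 90..110
print(value_check(80, 100, tolerance=0.1))    # False: below 90
```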
- class airflow.operators.sql.SQLIntervalCheckOperator(*, table: str, metrics_thresholds: Dict[str, int], date_filter_column: Optional[str] = 'ds', days_back: SupportsAbs[int] = - 7, ratio_formula: Optional[str] = 'max_over_min', ignore_zero: bool = True, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]¶
Bases: BaseSQLOperator
Checks that the values of metrics given as SQL expressions are within a certain tolerance of the ones from days_back before.
- Parameters
table (str) – the table name
conn_id (str) – the connection ID used to connect to the database.
database (Optional[str]) – name of database which will overwrite the defined one in connection
days_back (Optional[int]) – number of days between ds and the ds we want to check against. Defaults to 7 days
date_filter_column (Optional[str]) – The column name for the dates to filter on. Defaults to ‘ds’
ratio_formula (str) – which formula to use to compute the ratio between the two metrics. Assuming cur is the metric of today and ref is the metric of today - days_back:
max_over_min: computes max(cur, ref) / min(cur, ref)
relative_diff: computes abs(cur - ref) / ref
Default: 'max_over_min'
ignore_zero (bool) – whether we should ignore zero metrics
metrics_thresholds (dict) – a dictionary of ratios indexed by metrics
- class airflow.operators.sql.SQLThresholdCheckOperator(*, sql: str, min_threshold: Any, max_threshold: Any, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]¶
Bases: BaseSQLOperator
Performs a value check using sql code against a minimum threshold and a maximum threshold. Thresholds can be in the form of a numeric value OR a sql statement that results in a numeric value.
- Parameters
sql (str) – the sql to be executed. (templated)
conn_id (str) – the connection ID used to connect to the database.
database (str) – name of database which will overwrite the defined one in connection
min_threshold (numeric or str) – numerical value or min threshold sql to be executed (templated)
max_threshold (numeric or str) – numerical value or max threshold sql to be executed (templated)
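The threshold check can be sketched as a bounds comparison where string thresholds are first resolved by running them as SQL. This is an illustrative sketch, not Airflow's code: threshold_check and the run_sql callback are hypothetical names.

```python
# Illustrative sketch (not Airflow's code): pass when the result lies within
# [min_threshold, max_threshold]. A str threshold is treated as SQL and
# resolved to a number via the (hypothetical) run_sql callback first.
def threshold_check(result, min_threshold, max_threshold, run_sql=None):
    def resolve(threshold):
        return run_sql(threshold) if isinstance(threshold, str) else threshold
    return resolve(min_threshold) <= result <= resolve(max_threshold)

print(threshold_check(5, 1, 10))    # True: 1 <= 5 <= 10
print(threshold_check(15, 1, 10))   # False: above the maximum
# A SQL string as the minimum, resolved by a stubbed-out query runner:
print(threshold_check(5, "SELECT MIN(x) FROM t", 10, run_sql=lambda q: 0))  # True
```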
- class airflow.operators.sql.BranchSQLOperator(*, sql: str, follow_task_ids_if_true: List[str], follow_task_ids_if_false: List[str], conn_id: str = 'default_conn_id', database: Optional[str] = None, parameters: Optional[Union[Mapping, Iterable]] = None, **kwargs)[source]¶
Bases: BaseSQLOperator, airflow.models.SkipMixin
Allows a DAG to “branch” or follow a specified path based on the results of a SQL query.
- Parameters
sql (str) – The SQL code to be executed, should return true or false (templated). Can receive a str representing a sql statement or a reference to a template file; template references are recognized by a str ending in '.sql'. The query is expected to return a Boolean (True/False), an integer (0 = False, any other value = True), or a string (true/y/yes/1/on/false/n/no/0/off).
follow_task_ids_if_true (str or list) – task id or task ids to follow if query returns true
follow_task_ids_if_false (str or list) – task id or task ids to follow if query returns false
conn_id (str) – the connection ID used to connect to the database.
database (str) – name of database which will overwrite the defined one in connection
parameters (mapping or iterable) – (optional) the parameters to render the SQL query with.
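The branching rule for the accepted result types (Boolean, integer, string) can be sketched as a standalone function. This is an illustrative sketch of the documented value mapping, not Airflow's implementation; resolve_branch is a hypothetical name.

```python
# Illustrative sketch (not Airflow's code): map a query result to the task
# ids to follow, using the accepted truthy/falsy spellings from the docs.
def resolve_branch(query_result, follow_if_true, follow_if_false):
    truthy = {"true", "y", "yes", "1", "on"}
    falsy = {"false", "n", "no", "0", "off"}
    # Check bool before int: in Python, bool is a subclass of int.
    if isinstance(query_result, bool):
        follow = query_result
    elif isinstance(query_result, int):
        follow = query_result != 0  # 0 = False, any other value = True
    elif isinstance(query_result, str) and query_result.lower() in truthy | falsy:
        follow = query_result.lower() in truthy
    else:
        raise ValueError(f"Unexpected query result: {query_result!r}")
    return follow_if_true if follow else follow_if_false

print(resolve_branch("yes", ["load"], ["skip"]))  # ['load']
print(resolve_branch(0, ["load"], ["skip"]))      # ['skip']
```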