airflow.operators.sql

Module Contents

class airflow.operators.sql.BaseSQLOperator(*, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]

Bases: airflow.models.BaseOperator

This is a base class for generic SQL operators that need a DB hook.

The provided method is .get_db_hook(). By default it tries to retrieve the DB hook based on the connection type. You can customize the behavior by overriding the .get_db_hook() method.

_hook(self)[source]

Get DB Hook based on connection type

get_db_hook(self)[source]

Get the database hook for the connection.

Returns

the database hook object.

Return type

DbApiHook
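The "hook based on connection type" idea can be sketched in plain Python. This is a hypothetical illustration only: the HOOK_REGISTRY mapping and the hook names in it are assumptions for the sketch, not Airflow's actual resolution mechanism, which goes through the Connection object.

```python
# Hypothetical registry mapping connection types to hook class names.
# Airflow's real get_db_hook() resolves the hook via the Connection
# object; this sketch only illustrates the dispatch-by-type idea.
HOOK_REGISTRY = {
    "postgres": "PostgresHook",
    "mysql": "MySqlHook",
    "sqlite": "SqliteHook",
}


def get_db_hook(conn_type: str) -> str:
    """Return the hook name registered for a connection type."""
    try:
        return HOOK_REGISTRY[conn_type]
    except KeyError:
        raise ValueError(f"No hook registered for connection type {conn_type!r}")
```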

class airflow.operators.sql.SQLCheckOperator(*, sql: str, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]

Bases: airflow.operators.sql.BaseSQLOperator

Performs checks against a database. The SQLCheckOperator expects a SQL query that returns a single row. Each value in that first row is evaluated using Python bool casting. If any of the values is False, the check fails and errors out.

Note that Python bool casting evaluates the following as False:

  • False

  • 0

  • Empty string ("")

  • Empty list ([])

  • Empty dictionary or set ({})

Given a query like SELECT COUNT(*) FROM foo, it will fail only if the count == 0. You can craft much more complex queries that could, for instance, check that the table has the same number of rows as the upstream source table, or that the count of today’s partition is greater than yesterday’s partition, or that a set of metrics is within 3 standard deviations of the 7-day average.

This operator can be used as a data quality check in your pipeline. Depending on where you place it in your DAG, you can either put it on the critical path, preventing dubious data from being published, or run it on the side and receive email alerts without stopping the progress of the DAG.
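The pass/fail rule above can be sketched in plain Python. This mirrors the described behavior (bool-cast every value of the first row; any falsy value fails the check), not the operator's actual source:

```python
def row_passes_check(row) -> bool:
    """Return True only if every value in the result row is truthy
    under Python bool casting, per the rule described above."""
    return all(bool(value) for value in row)
```

For example, a row of [0] or [""] fails, while [42, "ok"] passes.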

Parameters
  • sql (str) – the sql to be executed. (templated)

  • conn_id (str) – the connection ID used to connect to the database.

  • database (str) – name of the database which overrides the one defined in the connection

template_fields :Iterable[str] = ['sql'][source]
template_ext :Iterable[str] = ['.hql', '.sql'][source]
ui_color = #fff7e6[source]
execute(self, context=None)[source]
airflow.operators.sql._convert_to_float_if_possible(s)[source]

A small helper function to convert a string to a numeric value if appropriate.

Parameters

s (str) – the string to be converted
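The helper's idea can be sketched as follows. This is an assumed reading of "convert if appropriate" (try float(), fall back to the original value); the real helper's exact semantics may differ:

```python
def convert_to_float_if_possible(s):
    """Return float(s) when the string parses as a number,
    otherwise return the input unchanged."""
    try:
        return float(s)
    except (ValueError, TypeError):
        return s
```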

class airflow.operators.sql.SQLValueCheckOperator(*, sql: str, pass_value: Any, tolerance: Any = None, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]

Bases: airflow.operators.sql.BaseSQLOperator

Performs a simple value check using SQL code.

Parameters
  • sql (str) – the sql to be executed. (templated)

  • conn_id (str) – the connection ID used to connect to the database.

  • database (str) – name of the database which overrides the one defined in the connection

__mapper_args__[source]
template_fields :Iterable[str] = ['sql', 'pass_value'][source]
template_ext :Iterable[str] = ['.hql', '.sql'][source]
ui_color = #fff7e6[source]
execute(self, context=None)[source]
_to_float(self, records)[source]
_get_string_matches(self, records, pass_value_conv)[source]
_get_numeric_matches(self, numeric_records, numeric_pass_value_conv)[source]
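A tolerance-aware numeric check can be sketched in plain Python. The acceptance band (pass when the result lies within pass_value * (1 ± tolerance)) is an assumption for this sketch, not a quote of the operator's source:

```python
from typing import Optional


def numeric_value_check(result: float, pass_value: float,
                        tolerance: Optional[float] = None) -> bool:
    """With a tolerance t, accept results within pass_value * (1 +/- t);
    without one, require an exact match (assumed semantics)."""
    if tolerance is not None:
        return pass_value * (1 - tolerance) <= result <= pass_value * (1 + tolerance)
    return result == pass_value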
class airflow.operators.sql.SQLIntervalCheckOperator(*, table: str, metrics_thresholds: Dict[str, int], date_filter_column: Optional[str] = 'ds', days_back: SupportsAbs[int] = -7, ratio_formula: Optional[str] = 'max_over_min', ignore_zero: bool = True, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]

Bases: airflow.operators.sql.BaseSQLOperator

Checks that the values of metrics given as SQL expressions are within a certain tolerance of the ones from days_back before.

Parameters
  • table (str) – the table name

  • conn_id (str) – the connection ID used to connect to the database.

  • database (Optional[str]) – name of the database which will override the one defined in the connection

  • days_back (Optional[int]) – number of days between ds and the ds we want to check against. Defaults to 7 days

  • date_filter_column (Optional[str]) – The column name for the dates to filter on. Defaults to ‘ds’

  • ratio_formula (str) –

    which formula to use to compute the ratio between the two metrics, assuming cur is the metric of today and ref is the metric of today - days_back:

    max_over_min: computes max(cur, ref) / min(cur, ref)

    relative_diff: computes abs(cur - ref) / ref

    Default: ‘max_over_min’

  • ignore_zero (bool) – whether we should ignore zero metrics

  • metrics_thresholds (dict) – a dictionary of ratios indexed by metrics

__mapper_args__[source]
template_fields :Iterable[str] = ['sql1', 'sql2'][source]
template_ext :Iterable[str] = ['.hql', '.sql'][source]
template_fields_renderers[source]
ui_color = #fff7e6[source]
ratio_formulas[source]
execute(self, context=None)[source]
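The two ratio formulas documented above translate directly into Python. The strict less-than comparison against the threshold in within_threshold is an assumption of this sketch:

```python
def max_over_min(cur: float, ref: float) -> float:
    # Ratio of the larger metric to the smaller one.
    return max(cur, ref) / min(cur, ref)


def relative_diff(cur: float, ref: float) -> float:
    # Absolute change relative to the reference metric.
    return abs(cur - ref) / ref


def within_threshold(cur: float, ref: float, threshold: float,
                     formula=max_over_min) -> bool:
    """A metric passes when its ratio stays under the configured
    threshold (comparison direction assumed for this sketch)."""
    return formula(cur, ref) < threshold
```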
class airflow.operators.sql.SQLThresholdCheckOperator(*, sql: str, min_threshold: Any, max_threshold: Any, conn_id: Optional[str] = None, database: Optional[str] = None, **kwargs)[source]

Bases: airflow.operators.sql.BaseSQLOperator

Performs a value check using SQL code against a minimum threshold and a maximum threshold. Thresholds can be in the form of a numeric value OR a SQL statement that results in a numeric value.

Parameters
  • sql (str) – the sql to be executed. (templated)

  • conn_id (str) – the connection ID used to connect to the database.

  • database (str) – name of the database which overrides the one defined in the connection

  • min_threshold (numeric or str) – numerical value or min threshold sql to be executed (templated)

  • max_threshold (numeric or str) – numerical value or max threshold sql to be executed (templated)

template_fields = ['sql', 'min_threshold', 'max_threshold'][source]
template_ext :Iterable[str] = ['.hql', '.sql'][source]
execute(self, context=None)[source]
push(self, meta_data)[source]

Optional: Send data check info and metadata to an external database. Default functionality will log metadata.
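Once both thresholds have been resolved to numbers, the check itself reduces to a range test. A minimal sketch, assuming the bounds are inclusive:

```python
def threshold_check(result: float, min_threshold: float,
                    max_threshold: float) -> bool:
    # Pass when the query result lies within [min_threshold, max_threshold]
    # (inclusive bounds assumed for this sketch).
    return min_threshold <= result <= max_threshold
```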

class airflow.operators.sql.BranchSQLOperator(*, sql: str, follow_task_ids_if_true: List[str], follow_task_ids_if_false: List[str], conn_id: str = 'default_conn_id', database: Optional[str] = None, parameters: Optional[Union[Mapping, Iterable]] = None, **kwargs)[source]

Bases: airflow.operators.sql.BaseSQLOperator, airflow.models.SkipMixin

Executes SQL code in a specific database and branches the DAG based on the result.

Parameters
  • sql (str) – the SQL code to be executed (templated). Can receive a str representing a SQL statement or a reference to a template file; template references are recognized by a str ending in '.sql'. The query is expected to return a Boolean (True/False), an integer (0 = False, otherwise = True), or a string (true/y/yes/1/on or false/n/no/0/off).

  • follow_task_ids_if_true (str or list) – task id or task ids to follow if the query returns true

  • follow_task_ids_if_false (str or list) – task id or task ids to follow if the query returns false

  • conn_id (str) – the connection ID used to connect to the database.

  • database (str) – name of the database which overrides the one defined in the connection

  • parameters (mapping or iterable) – (optional) the parameters to render the SQL query with.

template_fields = ['sql'][source]
template_ext = ['.sql'][source]
ui_color = #a22034[source]
ui_fgcolor = #F7F7F7[source]
execute(self, context: Dict)[source]
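The accepted result types listed for the sql parameter (Boolean, integer, or keyword string) can be sketched as a small parser. This mirrors the documented contract, not the operator's actual source:

```python
TRUE_VALUES = {"true", "y", "yes", "1", "on"}
FALSE_VALUES = {"false", "n", "no", "0", "off"}


def parse_branch_result(value) -> bool:
    """Interpret a query result per the documented contract: booleans
    pass through, integers are truthy when non-zero, and strings are
    matched case-insensitively against the accepted keywords."""
    if isinstance(value, bool):
        return value
    if isinstance(value, int):
        return value != 0
    if isinstance(value, str):
        lowered = value.strip().lower()
        if lowered in TRUE_VALUES:
            return True
        if lowered in FALSE_VALUES:
            return False
    raise ValueError(f"Unexpected query result: {value!r}")
```

The branch taken would then be follow_task_ids_if_true when this returns True, and follow_task_ids_if_false otherwise.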
