airflow.providers.databricks.sensors.databricks_sql¶
This module contains Databricks sensors.
Classes¶
| DatabricksSqlSensor | Sensor that runs a SQL query on Databricks. |
Module Contents¶
- class airflow.providers.databricks.sensors.databricks_sql.DatabricksSqlSensor(*, databricks_conn_id=DatabricksSqlHook.default_conn_name, http_path=None, sql_warehouse_name=None, session_configuration=None, http_headers=None, catalog='', schema='default', sql, handler=fetch_all_handler, client_parameters=None, **kwargs)[source]¶
- Bases: airflow.sensors.base.BaseSensorOperator
- Sensor that runs a SQL query on Databricks.
- Parameters:
- databricks_conn_id (str) – Reference to Databricks connection id (templated), defaults to DatabricksSqlHook.default_conn_name. 
- sql_warehouse_name (str | None) – Optional name of Databricks SQL warehouse. If not specified, http_path must be provided as described below, defaults to None
- http_path (str | None) – Optional string specifying the HTTP path of a Databricks SQL warehouse or All-Purpose cluster. If not specified here, it must either be set in the Databricks connection’s extra parameters, or sql_warehouse_name must be specified.
- session_configuration – An optional dictionary of Spark session parameters. If not specified here, it may be set in the Databricks connection’s extra parameters, defaults to None
- http_headers (list[tuple[str, str]] | None) – An optional list of (k, v) pairs that will be set as HTTP headers on every request. (templated). 
- catalog (str) – An optional initial catalog to use. Requires Databricks Runtime version 9.0+ (templated), defaults to “” 
- schema (str) – An optional initial schema to use. Requires Databricks Runtime version 9.0+ (templated), defaults to “default” 
- sql (str | collections.abc.Iterable[str]) – SQL statement to be executed. 
- handler (Callable[[Any], Any]) – Handler for DbApiHook.run() to return results, defaults to fetch_all_handler 
- client_parameters (dict[str, Any] | None) – Additional parameters passed through to the Databricks SQL connector.
 
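A minimal usage sketch follows. The DAG id, connection id, warehouse name, and table are illustrative assumptions, not values defined on this page; the sensor is typically configured to succeed once the query returns at least one row.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.sensors.databricks_sql import DatabricksSqlSensor

with DAG(
    dag_id="databricks_sql_sensor_example",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # Poll Databricks once a minute for up to an hour; the task succeeds
    # when the query returns a non-empty result.
    wait_for_partition = DatabricksSqlSensor(
        task_id="wait_for_partition",
        databricks_conn_id="databricks_default",  # the hook's default connection name
        sql_warehouse_name="my-warehouse",        # hypothetical warehouse name
        catalog="hive_metastore",
        schema="default",
        sql="SELECT 1 FROM my_table WHERE ds = '{{ ds }}' LIMIT 1",
        poke_interval=60,
        timeout=3600,
    )
```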
 - template_fields: collections.abc.Sequence[str] = ('databricks_conn_id', 'sql', 'catalog', 'schema', 'http_headers')[source]¶
 - template_ext: collections.abc.Sequence[str] = ('.sql',)[source]¶
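Because sql is a template field and .sql is a template extension (see the attributes above), a sql value ending in .sql is treated as a file path and the file’s contents are rendered with Jinja before execution. A brief sketch, assuming a hypothetical sql/check_new_rows.sql file on the DAG’s template search path:

```python
# sql/check_new_rows.sql might contain, for example:
#   SELECT 1 FROM events WHERE event_date = '{{ ds }}' LIMIT 1
check_new_rows = DatabricksSqlSensor(
    task_id="check_new_rows",
    databricks_conn_id="databricks_default",
    sql_warehouse_name="my-warehouse",  # hypothetical warehouse name
    sql="sql/check_new_rows.sql",       # rendered as a Jinja template at runtime
)
```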
 - property hook: airflow.providers.databricks.hooks.databricks_sql.DatabricksSqlHook[source]¶
- Creates and returns a DatabricksSqlHook object.
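The hook property can also be used from a subclass to customize the poke logic. The sketch below is an illustrative assumption, not the provider’s implementation: it presumes the sql and handler constructor arguments are stored as attributes of the same name, and succeeds only once the query yields a minimum number of rows.

```python
from airflow.providers.databricks.sensors.databricks_sql import DatabricksSqlSensor


class DatabricksSqlRowCountSensor(DatabricksSqlSensor):
    """Hypothetical sensor that succeeds once a query yields at least min_rows rows."""

    def __init__(self, *, min_rows: int = 1, **kwargs):
        super().__init__(**kwargs)
        self.min_rows = min_rows

    def poke(self, context) -> bool:
        # self.hook builds a DatabricksSqlHook from the sensor's connection
        # settings; run() executes the statement and returns whatever the
        # configured handler (fetch_all_handler by default) produces.
        rows = self.hook.run(self.sql, handler=self.handler)
        return rows is not None and len(rows) >= self.min_rows
```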