airflow.providers.apache.druid.hooks.druid
Module Contents¶
Classes¶
DruidHook — Connection to Druid overlord for ingestion
DruidDbApiHook — Interact with Druid broker
- class airflow.providers.apache.druid.hooks.druid.DruidHook(druid_ingest_conn_id='druid_ingest_default', timeout=1, max_ingestion_time=None)[source]¶
Bases:
airflow.hooks.base.BaseHook
Connection to Druid overlord for ingestion
To connect to a Druid cluster that is secured with the druid-basic-security extension, add the username and password to the Druid ingestion connection.
- Parameters
druid_ingest_conn_id (str) – The connection id to the Druid overlord machine which accepts index jobs
timeout (int) – The interval between polling the Druid job for the status of the ingestion job. Must be greater than or equal to 1
max_ingestion_time (int | None) – The maximum ingestion time before assuming the job failed
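The timeout and max_ingestion_time parameters govern a simple polling loop: after submitting an index job, the hook checks the task status every timeout seconds until the task succeeds, fails, or max_ingestion_time elapses. A minimal sketch of that loop in plain Python, with a stubbed status function standing in for the real overlord status endpoint (the function name and status strings here are illustrative, not the provider's actual API):

```python
import time


def wait_for_ingestion(get_status, timeout=1, max_ingestion_time=None):
    """Poll get_status() every `timeout` seconds until the task finishes.

    get_status is a stand-in for the overlord status call; it is assumed
    to return one of "RUNNING", "SUCCESS", or "FAILED".
    """
    waited = 0
    while True:
        status = get_status()
        if status == "SUCCESS":
            return True
        if status == "FAILED":
            raise RuntimeError("Druid ingestion task failed")
        if max_ingestion_time is not None and waited >= max_ingestion_time:
            raise TimeoutError("Druid ingestion exceeded max_ingestion_time")
        time.sleep(timeout)
        waited += timeout


# Simulate a task that finishes after two polls.
statuses = iter(["RUNNING", "RUNNING", "SUCCESS"])
print(wait_for_ingestion(lambda: next(statuses), timeout=0))  # → True
```

This is why timeout must be at least 1 in the real hook: it is the sleep interval between status checks, and the elapsed-time counter advances by that amount on every poll.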
- class airflow.providers.apache.druid.hooks.druid.DruidDbApiHook(*args, schema=None, log_sql=True, **kwargs)[source]¶
Bases:
airflow.providers.common.sql.hooks.sql.DbApiHook
Interact with Druid broker
This hook is purely for querying the Druid broker. For ingestion, please use DruidHook.
- get_uri()[source]¶
Get the connection URI for the Druid broker.
e.g. druid://localhost:8082/druid/v2/sql/
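The URI above is assembled from the connection's host, port, and endpoint. A hedged sketch of that shape in plain Python (the helper name and the default endpoint are illustrative assumptions, not values read from an actual Airflow connection):

```python
def build_druid_uri(host, port, endpoint="druid/v2/sql"):
    """Assemble a druid://host:port/endpoint URI as shown above.

    The endpoint default is an assumption for illustration; in Airflow it
    would come from the configured Druid broker connection.
    """
    return f"druid://{host}:{port}/{endpoint.lstrip('/')}"


print(build_druid_uri("localhost", 8082))  # → druid://localhost:8082/druid/v2/sql
```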
- abstract insert_rows(table, rows, target_fields=None, commit_every=1000, replace=False, **kwargs)[source]¶
A generic way to insert a set of tuples into a table; a new transaction is created every commit_every rows.
- Parameters
table (str) – Name of the target table
rows (Iterable[tuple[str]]) – The rows to insert into the table
target_fields (Iterable[str] | None) – The names of the columns to fill in the table
commit_every (int) – The maximum number of rows to insert in one transaction. Set to 0 to insert all rows in one transaction.
replace (bool) – Whether to replace instead of insert
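The batching semantics described above (a new transaction every commit_every rows, or a single transaction when commit_every is 0) can be sketched against an in-memory SQLite database; the function body, table, and column names here are illustrative of the generic DbApiHook contract, not the Druid implementation:

```python
import sqlite3


def insert_rows(conn, table, rows, target_fields=None, commit_every=1000):
    """Insert rows in batches, committing every `commit_every` rows.

    commit_every=0 commits once at the end, i.e. one transaction for all rows.
    """
    fields = f" ({', '.join(target_fields)})" if target_fields else ""
    cur = conn.cursor()
    for i, row in enumerate(rows, start=1):
        placeholders = ", ".join("?" for _ in row)
        cur.execute(f"INSERT INTO {table}{fields} VALUES ({placeholders})", row)
        if commit_every and i % commit_every == 0:
            conn.commit()  # close the current batch's transaction
    conn.commit()  # commit any remainder (or everything, if commit_every == 0)


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (name TEXT, value INTEGER)")
insert_rows(conn, "metrics", [("a", 1), ("b", 2), ("c", 3)], commit_every=2)
print(conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0])  # → 3
```

Note that the method is marked abstract on DruidDbApiHook: the Druid broker is a query endpoint, so row inserts should go through the ingestion path (DruidHook) instead.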