airflow.providers.apache.hive.transfers.hive_to_samba

This module contains an operator to move data from Hive to Samba.

Module Contents

Classes

HiveToSambaOperator

Executes hql code in a specific Hive database and loads the results of the query as a csv to a Samba location.

class airflow.providers.apache.hive.transfers.hive_to_samba.HiveToSambaOperator(*, hql: str, destination_filepath: str, samba_conn_id: str = 'samba_default', hiveserver2_conn_id: str = 'hiveserver2_default', **kwargs)

Bases: airflow.models.BaseOperator

Executes hql code in a specific Hive database and loads the results of the query as a csv to a Samba location.

Parameters
  • hql (str) -- the hql to be executed and exported. (templated)

  • destination_filepath (str) -- the file path on the Samba share to which the query results will be pushed. (templated)

  • samba_conn_id (str) -- reference to the Samba destination connection.

  • hiveserver2_conn_id (str) -- reference to the Hive Server2 thrift service connection id.

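The operator's overall flow is a query-to-temp-file-to-share transfer. The sketch below illustrates that pattern in plain, stdlib-only Python; `fetch_hive_results` and `push_to_samba` are hypothetical stand-ins for the provider's HiveServer2 and Samba hooks, not real Airflow APIs:

```python
import csv
import tempfile
from pathlib import Path


def fetch_hive_results(hql: str) -> list:
    """Hypothetical stand-in for running hql via a HiveServer2 connection."""
    # In the real operator, these rows come from the hiveserver2_conn_id connection.
    return [("id", "name"), (1, "alpha"), (2, "beta")]


def push_to_samba(local_path: str, destination_filepath: str) -> str:
    """Hypothetical stand-in for uploading a file to a Samba share."""
    # The real operator uploads via the samba_conn_id connection instead.
    return "pushed {} -> {}".format(local_path, destination_filepath)


def transfer(hql: str, destination_filepath: str) -> str:
    # 1. Execute the hql and collect the result rows.
    rows = fetch_hive_results(hql)
    # 2. Write the results as CSV to a local temporary file.
    with tempfile.NamedTemporaryFile(
        "w", newline="", suffix=".csv", delete=False
    ) as tmp:
        csv.writer(tmp).writerows(rows)
        local_path = tmp.name
    # 3. Push the temporary file to the Samba destination path.
    result = push_to_samba(local_path, destination_filepath)
    Path(local_path).unlink()  # clean up the temporary file
    return result
```

Calling `transfer("SELECT id, name FROM t", "/share/export.csv")` runs the three steps in order; the real operator performs the same sequence inside `execute`.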
template_fields: Sequence[str] = ['hql', 'destination_filepath']
template_ext: Sequence[str] = ['.hql', '.sql']
execute(self, context: airflow.utils.context.Context)

This is the main method to derive when creating an operator. Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
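Because hql and destination_filepath are template fields, Airflow renders them with Jinja against the task context before execute runs. A simplified, stdlib-only stand-in for that substitution (Airflow actually uses Jinja2; the minimal renderer and the `ds` context key below are illustrative):

```python
import re


def render(template: str, context: dict) -> str:
    """Simplified stand-in for Jinja rendering of a template field."""
    # Replace each {{ key }} placeholder with the matching context value.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(context[m.group(1)]),
        template,
    )


hql = "SELECT * FROM events WHERE dt = '{{ ds }}'"
print(render(hql, {"ds": "2024-01-01"}))
# prints: SELECT * FROM events WHERE dt = '2024-01-01'
```

The operator's `execute` method receives the already-rendered values, so the hql sent to Hive contains the concrete date rather than the placeholder.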
