airflow.contrib.operators.vertica_to_hive¶
Module Contents¶
class airflow.contrib.operators.vertica_to_hive.VerticaToHiveTransfer(sql, hive_table, create=True, recreate=False, partition=None, delimiter=chr(1), vertica_conn_id='vertica_default', hive_cli_conn_id='hive_cli_default', *args, **kwargs)[source]¶
Bases: airflow.models.BaseOperator

Moves data from Vertica to Hive. The operator runs your query against Vertica and stores the result in a local file before loading it into a Hive table. If the create or recreate arguments are set to True, CREATE TABLE and DROP TABLE statements are generated. Hive data types are inferred from the cursor's metadata. Note that the table generated in Hive uses STORED AS textfile, which isn't the most efficient serialization format. If a large amount of data is loaded and/or if the table gets queried considerably, you may want to use this operator only to stage the data into a temporary table before loading it into its final destination using a HiveOperator. A usage sketch follows the parameter list below.

Parameters
- sql (str) – SQL query to execute against the Vertica database. (templated)
- hive_table (str) – target Hive table, use dot notation to target a specific database. (templated) 
- create (bool) – whether to create the table if it doesn’t exist 
- recreate (bool) – whether to drop and recreate the table at every execution 
- partition (dict) – target partition as a dict of partition columns and values. (templated) 
- delimiter (str) – field delimiter in the file 
- vertica_conn_id (str) – source Vertica connection 
- hive_cli_conn_id (str) – destination Hive CLI connection
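
The following is a minimal usage sketch of the operator inside a DAG. The DAG id, schedule, SQL query, table names, and partition spec are illustrative assumptions, not values prescribed by the documentation; the connection ids shown are the defaults from the operator signature.

```python
# A minimal usage sketch. The DAG settings, SQL, and table names below are
# illustrative assumptions; connection ids are the operator's defaults.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.vertica_to_hive import VerticaToHiveTransfer

dag = DAG(
    dag_id="vertica_to_hive_example",   # hypothetical DAG id
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
)

stage_orders = VerticaToHiveTransfer(
    task_id="stage_orders",
    sql="SELECT * FROM public.orders WHERE order_date = '{{ ds }}'",  # templated
    hive_table="staging.orders",        # dot notation targets the staging database
    create=True,                        # generate CREATE TABLE if the table is missing
    recreate=False,                     # do not drop and recreate the table each run
    partition={"ds": "{{ ds }}"},       # templated partition columns and values
    delimiter=chr(1),                   # default field delimiter for the staged file
    vertica_conn_id="vertica_default",
    hive_cli_conn_id="hive_cli_default",
    dag=dag,
)
```

As noted above, the staged table is STORED AS textfile, so a common pattern is to point hive_table at a temporary staging table and follow this task with a HiveOperator that inserts into the final, more efficiently stored destination table.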