apache-airflow-providers-apache-hive

This is a provider package for the apache.hive provider. All classes for this provider package are in the airflow.providers.apache.hive Python package.
You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum supported Airflow version) via

pip install apache-airflow-providers-apache-hive
PIP package | Version required
---|---
These dependencies may be needed in order to use all the features of the package; you need to install the listed provider packages in order to use them. You can install such cross-provider dependencies when installing from PyPI. For example:
pip install apache-airflow-providers-apache-hive[amazon]
Dependent package | Extra
---|---
You can download officially released packages and verify their checksums and signatures from the Official Apache Download site.
The apache-airflow-providers-apache-hive 4.0.1 sdist package (asc, sha512)
The apache-airflow-providers-apache-hive 4.0.1 wheel package (asc, sha512)
The hql parameter in get_records of HiveServer2Hook has been renamed to sql to match the get_records signature of DbApiHook. If you passed it as a positional argument, nothing changes for you, but if you passed it as a keyword argument, you need to rename it.
The hive_conf parameter has been renamed to parameters and is now the second parameter, to match the get_records signature of DbApiHook. You need to rename it if you used it.
The schema parameter in get_records is an optional keyword argument that you can add, to match the schema parameter of get_records in DbApiHook.
Deprecate hql parameters and synchronize DBApiHook method APIs (#25299)
Remove Smart Sensors (#25507)
Move all SQL classes to common-sql provider (#24836)
fix connection extra parameter 'auth_mechanism' in 'HiveMetastoreHook' and 'HiveServer2Hook' (#24713)
This release of provider is only available for Airflow 2.2+ as explained in the Apache Airflow providers support policy https://github.com/apache/airflow/blob/main/README.md#support-for-providers
chore: Refactoring and Cleaning Apache Providers (#24219)
AIP-47 - Migrate hive DAGs to new design #22439 (#24204)
Set larger limit get_partitions_by_filter in HiveMetastoreHook (#21504)
Fix Python 3.9 support in Hive (#21893)
Fix key typo in 'template_fields_renderers' for 'HiveOperator' (#21525)
Support for Python 3.10
Add how-to guide for hive operator (#21590)
Add more SQL template fields renderers (#21237)
Add conditional 'template_fields_renderers' check for new SQL lexers (#21403)
HiveHook fix get_pandas_df() failure when it tries to read an empty table (#17777)
Optimise connection importing for Airflow 2.2.0
Auto-apply apply_default decorator (#15667)
Warning
Due to the apply_default decorator removal, this version of the provider requires Airflow 2.1.0+.
If your Airflow version is < 2.1.0 and you want to install this provider version, first upgrade
Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded
automatically and you will have to manually run airflow upgrade db
to complete the migration.
Fix mistake and typos in doc/docstrings (#15180)
Fix grammar and remove duplicate words (#14647)
Resolve issue related to HiveCliHook kill (#14542)
Updated documentation and readme files.
Remove password if in LDAP or CUSTOM mode HiveServer2Hook (#11767)
Initial version of the provider.