This release of the provider is only available for Airflow 2.5+, as explained in the Apache Airflow providers support policy.
Bump min airflow version of providers (#34728)
Consolidate hook management in HiveOperator (#34430)
Refactor regex in providers (#33898)
Replace sequence concatenation by unpacking in Airflow providers (#33933)
Replace single element slice by next() in hive provider (#33937)
Use a single statement with multiple contexts instead of nested statements in providers (#33768)
Use startswith once with a tuple in Hive hook (#33765)
Refactor: Simplify a few loops (#33736)
E731: replace lambda by a def method in Airflow providers (#33757)
Use f-string instead of in Airflow providers (#33752)
The provider now uses pure-sasl, a pure-Python implementation of SASL that is better maintained than the previously used sasl implementation, even if slightly slower for the SASL interface. It also allows the Hive provider to be installed on Python 3.11.
Bring back hive support for Python 3.11 (#32607)
Refactor: Simplify code in Apache/Alibaba providers (#33227)
Simplify 'X for X in Y' to 'Y' where applicable (#33453)
Replace OrderedDict with plain dict (#33508)
Simplify code around enumerate (#33476)
Use str.splitlines() to split lines in providers (#33593)
Simplify conditions on len() in providers/apache (#33564)
Replace repr() with proper formatting (#33520)
Avoid importing pandas and numpy in runtime and module level (#33483)
Consolidate import and usage of pandas (#33480)
Fix Pandas2 compatibility for Hive (#32752)
Add more accurate typing for DbApiHook.run method (#31846)
Move Hive configuration to Apache Hive provider (#32777)
This release dropped support for Python 3.7.
Sanitize beeline principal parameter (#31983)
Replace unicodecsv with standard csv library (#31693)
This release of the provider is only available for Airflow 2.4+, as explained in the Apache Airflow providers support policy.
Bump minimum Airflow version in providers (#30917)
Update return types of 'get_key' methods on 'S3Hook' (#30923)
The auth option has been moved from the extra field to the auth parameter of the Hook. If you have extra parameters defined in your connections as auth, you should move them to the DAG where your HiveOperator or other Hive-related operators are used.
Move auth parameter from extra to Hook parameter (#30212)
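The migration described above can be sketched as follows. The extra payload, the "kerberos" value, and the comment about HiveOperator forwarding are illustrative stand-ins, not the provider's actual code:

```python
import json

# Hypothetical Hive connection extra as it might have looked before this release.
extra = json.loads('{"auth": "kerberos", "use_beeline": true}')

# After this change the hook no longer reads "auth" from the extra field,
# so remove it from the connection extra and pass it explicitly where the
# operator is defined in the DAG (e.g. HiveOperator(..., auth=auth)).
auth = extra.pop("auth", None)

print(auth)            # kerberos
print("auth" in extra) # False
```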
The apache.hive provider now provides the Hive macros that used to be provided by Airflow core. As of version 5.1.0 of apache.hive, the Hive macros are provided by the provider.
Move Hive macros to the provider (#28538)
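Once the provider is installed, the Hive macros remain usable from templated fields. The table name below is illustrative, and it is assumed that the macros.hive template namespace still resolves at render time as it did when the macros lived in core Airflow:

```python
# Templated HQL as you might pass to HiveOperator's hql argument; Airflow's
# Jinja templating resolves macros.hive.max_partition when the task renders.
hql = (
    "SELECT * FROM my_schema.my_table "
    "WHERE ds = '{{ macros.hive.max_partition('my_schema.my_table') }}'"
)
print("macros.hive.max_partition" in hql)  # True
```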
Make pandas dependency optional for Amazon Provider (#28505)
The hive_cli_params from the connection were moved to the Hook. If you have extra parameters defined in your connections as hive_cli_params extra, you should move them to the DAG where your HiveOperator is used.
Move hive_cli_params to hook parameters (#28101)
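As a sketch of this move, the value that used to live in the connection extra is now passed directly to the hook (or to the operator that creates it). The queue setting and the dict stand-in for the connection extra below are illustrative, not the real HiveCliHook code:

```python
# Hypothetical connection extra carrying CLI params before this release.
old_extra = {"hive_cli_params": "-hiveconf mapreduce.job.queuename=airflow"}

# Stand-in for the new call site: pass hive_cli_params directly, e.g.
# HiveCliHook(hive_cli_conn_id="hive_cli_default", hive_cli_params=...)
# or via the operator in your DAG definition.
hive_cli_params = old_extra.pop("hive_cli_params")

print(hive_cli_params)
```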
Improve filtering for invalid schemas in Hive hook (#27808)
This release of the provider is only available for Airflow 2.3+, as explained in the Apache Airflow providers support policy.
Move min airflow version to 2.3.0 for all providers (#27196)
Filter out invalid schemas in Hive hook (#27647)
The hql parameter in get_records of HiveServer2Hook has been renamed to sql to match the get_records DbApiHook signature. If you used it as a positional parameter, nothing changes for you, but if you used it as a keyword one, you need to rename it.
The hive_conf parameter has been renamed to parameters and it is now the second parameter, to match the get_records signature from the DbApiHook. You need to rename it if you used it.
The schema parameter in get_records is an optional kwargs extra parameter that you can add, to match the schema of get_records from the DbApiHook.
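The renames can be illustrated with a stand-in function that mirrors the described signature change; this is not the real HiveServer2Hook.get_records, only a sketch of the calling convention:

```python
# Stand-in mirroring the renamed signature; the real method lives on
# HiveServer2Hook, this sketch only demonstrates the calling change.
def get_records(sql, parameters=None, **kwargs):
    # old signature (pre-rename): get_records(hql, schema, hive_conf)
    schema = kwargs.get("schema", "default")
    return (sql, parameters, schema)

# Positional callers are unaffected by the hql -> sql rename:
print(get_records("SELECT 1", None))
# Keyword callers must rename hql -> sql and hive_conf -> parameters,
# and may pass schema as an extra keyword argument:
print(get_records(sql="SELECT 1", schema="analytics"))
```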
Deprecate hql parameters and synchronize DBApiHook method APIs (#25299)
Remove Smart Sensors (#25507)
Move all SQL classes to common-sql provider (#24836)
fix connection extra parameter 'auth_mechanism' in 'HiveMetastoreHook' and 'HiveServer2Hook' (#24713)
This release of the provider is only available for Airflow 2.2+, as explained in the Apache Airflow providers support policy.
chore: Refactoring and Cleaning Apache Providers (#24219)
AIP-47 - Migrate hive DAGs to new design #22439 (#24204)
Set larger limit get_partitions_by_filter in HiveMetastoreHook (#21504)
Fix Python 3.9 support in Hive (#21893)
Fix key typo in 'template_fields_renderers' for 'HiveOperator' (#21525)
Support for Python 3.10
Add how-to guide for hive operator (#21590)
Add more SQL template fields renderers (#21237)
Add conditional 'template_fields_renderers' check for new SQL lexers (#21403)
HiveHook: fix get_pandas_df() failure when it tries to read an empty table (#17777)
Optimise connection importing for Airflow 2.2.0
Auto-apply apply_default decorator (#15667)
Due to the apply_default decorator removal, this version of the provider requires Airflow 2.1.0+. If your Airflow version is < 2.1.0, and you want to install this provider version, first upgrade Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded automatically and you will have to manually run airflow upgrade db to complete the migration.
Fix mistake and typos in doc/docstrings (#15180)
Fix grammar and remove duplicate words (#14647)
Resolve issue related to HiveCliHook kill (#14542)
Updated documentation and readme files.
Remove password if in LDAP or CUSTOM mode HiveServer2Hook (#11767)
Initial version of the provider.