apache-airflow-providers-apache-hdfs

Hadoop Distributed File System (HDFS) and WebHDFS.
Note
The snakebite-py3 library used by this provider is an old package, and it pulls in an outdated
version of argparse as a dependency. In some cases this can cause Airflow commands to fail with
an error similar to TypeError: __init__() got an unexpected keyword argument 'encoding'. If this
happens, remove the stray package with pip uninstall argparse to get rid of the error.
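The workaround above can be sketched as a small shell snippet (a sketch, assuming `python3` and its bundled `pip` are on the PATH; the exact interpreter name on your system may differ):

```shell
# Sketch of the argparse workaround described above.
# The stray "argparse" backport from PyPI shadows the standard-library
# module; uninstalling it lets Python fall back to the built-in version.
if python3 -m pip show argparse >/dev/null 2>&1; then
    python3 -m pip uninstall -y argparse
fi
# Confirm that argparse now resolves to the standard-library module:
python3 -c "import argparse; print(argparse.ArgumentParser.__name__)"
# prints: ArgumentParser
```

Uninstalling the PyPI backport is safe on Python 3, because argparse has shipped in the standard library since Python 3.2.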
Release: 3.2.0
This is a provider package for apache.hdfs provider. All classes for this provider package
are in airflow.providers.apache.hdfs python package.
You can install this package on top of an existing Airflow 2 installation (see the Requirements
table below for the minimum supported Airflow version) via
pip install apache-airflow-providers-apache-hdfs
| PIP package | Version required |
|---|---|
| apache-airflow | >=2.3.0 |
This release of the provider is only available for Airflow 2.3+, as explained in the Apache Airflow providers support policy.
Move min airflow version to 2.3.0 for all providers (#27196)
This release of the provider is only available for Airflow 2.2+, as explained in the Apache Airflow providers support policy: https://github.com/apache/airflow/blob/main/README.md#support-for-providers
chore: Refactoring and Cleaning Apache Providers (#24219)
hdfs provider: allow SSL webhdfs connections (#17637)
Optimise connection importing for Airflow 2.2.0
Auto-apply apply_default decorator (#15667)
Warning
Due to the apply_default decorator removal, this version of the provider requires Airflow 2.1.0+.
If your Airflow version is earlier than 2.1.0 and you want to install this provider version, first
upgrade Airflow to at least 2.1.0. Otherwise your Airflow package will be upgraded automatically,
and you will have to manually run airflow db upgrade to complete the migration.
Updated documentation and readme files.
Initial version of the provider.