apache-airflow-providers-apache-spark

Changelog

4.1.3

Bug Fixes

  • Validate conn_prefix in extra field for Spark JDBC hook (#32946)
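The fix guards against malformed values supplied through the connection extra. A minimal sketch of such a guard, assuming a hypothetical validate_conn_prefix helper (the function name and the exact rejected characters are illustrative, not the provider's actual implementation):

```python
def validate_conn_prefix(conn_prefix: str) -> str:
    """Reject prefixes that could smuggle extra JDBC URL components.

    Hypothetical sketch: the real hook's validation may differ.
    """
    forbidden = {"?", "&"}  # characters that start query parameters
    if any(ch in conn_prefix for ch in forbidden):
        raise ValueError(f"invalid conn_prefix: {conn_prefix!r}")
    return conn_prefix
```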

4.1.2

Note

The provider now expects apache-airflow-providers-cncf-kubernetes in version 7.4.0+ installed in order to run Spark on Kubernetes jobs. You can install the provider with the cncf.kubernetes extra, pip install apache-airflow-providers-apache-spark[cncf.kubernetes], to get the right version of the cncf.kubernetes provider installed.

Misc

  • Move all k8s classes to cncf.kubernetes provider (#32767)

4.1.1

Note

This release dropped support for Python 3.7.

Misc

  • SparkSubmitOperator: rename spark_conn_id to conn_id (#31952)

4.1.0

Note

This release of the provider is only available for Airflow 2.4+, as explained in the Apache Airflow providers support policy.

Misc

  • Bump minimum Airflow version in providers (#30917)

4.0.1

Bug Fixes

  • Only restrict spark binary passed via extra (#30213)

  • Validate host and schema for Spark JDBC Hook (#30223)

  • Add spark3-submit to list of allowed spark-binary values (#30068)
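Taken together, these fixes mean the spark-binary connection extra is checked against a small allow-list rather than executed verbatim. A sketch of that check, with the allow-list now covering spark3-submit (the function name is illustrative, not the provider's internal API):

```python
ALLOWED_SPARK_BINARIES = {"spark-submit", "spark2-submit", "spark3-submit"}


def check_spark_binary(binary: str) -> str:
    # Only accept known launcher names; any other value passed via the
    # connection extra is rejected instead of being executed.
    if binary not in ALLOWED_SPARK_BINARIES:
        raise ValueError(
            f"spark-binary must be one of {sorted(ALLOWED_SPARK_BINARIES)}, "
            f"got {binary!r}"
        )
    return binary
```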

4.0.0

Note

This release of the provider is only available for Airflow 2.3+, as explained in the Apache Airflow providers support policy.

Breaking changes

Previously the spark-binary connection extra could be set to any binary, but with version 4.0.0 only two values are allowed for it: spark-submit and spark2-submit.

The spark-home connection extra is no longer allowed - the chosen binary must be available on the PATH in order to use SparkSubmitHook and SparkSubmitOperator.

  • Remove custom spark home and custom binaries for spark (#27646)
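Because spark-home is gone, the launcher must be resolvable on the PATH. A quick stdlib check along those lines (an illustrative sketch, not code from the provider):

```python
import shutil


def spark_binary_on_path(binary: str = "spark-submit") -> bool:
    # shutil.which mirrors how the shell would resolve the binary
    # against the current PATH, which is what the hook now relies on.
    return shutil.which(binary) is not None
```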

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

3.0.0

Breaking changes

Note

This release of the provider is only available for Airflow 2.2+, as explained in the Apache Airflow providers support policy.

Bug Fixes

  • Add typing for airflow/configuration.py (#23716)

  • Fix backwards-compatibility introduced by fixing mypy problems (#24230)

Misc

  • AIP-47 - Migrate spark DAGs to new design #22439 (#24210)

  • chore: Refactoring and Cleaning Apache Providers (#24219)

2.1.3

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.1.2

Misc

  • Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)

2.1.1

Bug Fixes

  • Fix param rendering in docs of SparkSubmitHook (#21788)

Misc

  • Support for Python 3.10

2.1.0

Features

  • Add more SQL template fields renderers (#21237)

  • Add optional features in providers. (#21074)

2.0.3

Bug Fixes

  • Ensure Spark driver response is valid before setting UNKNOWN status (#19978)
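This fix ensures a malformed driver-status response is not mapped to an UNKNOWN state. A hedged sketch of the idea: extract the driverState field from the response and report nothing at all when the response is invalid, so the caller can retry rather than record a bogus status (the helper and the exact field format are illustrative assumptions):

```python
import re
from typing import Optional


def parse_driver_state(response: str) -> Optional[str]:
    # spark-submit --status prints a JSON-like blob containing a
    # "driverState" field; return None when the field is absent
    # (i.e. the response is invalid) instead of guessing UNKNOWN.
    match = re.search(r'"driverState"\s*:\s*"(\w+)"', response)
    return match.group(1) if match else None
```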

2.0.2

Bug Fixes

  • Fix bug of SparkSqlOperator log going into an infinite loop (#19449)

2.0.1

Misc

  • Optimise connection importing for Airflow 2.2.0

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

Warning

Due to the apply_default decorator removal, this version of the provider requires Airflow 2.1.0+. If your Airflow version is < 2.1.0 and you want to install this provider version, first upgrade Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded automatically and you will have to manually run airflow db upgrade to complete the migration.

Bug fixes

  • Make SparkSqlHook use Connection (#15794)

1.0.3

Bug fixes

  • Fix 'logging.exception' redundancy (#14823)

1.0.2

Bug fixes

  • Use apache.spark provider without kubernetes (#14187)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.
