apache-airflow-providers-dbt-cloud

dbt Cloud is a hosted service that helps data analysts and engineers productionize dbt deployments. It comes equipped with turnkey support for scheduling jobs, CI/CD, serving documentation, monitoring & alerting, and an Integrated Development Environment (IDE).

Package apache-airflow-providers-dbt-cloud

dbt Cloud

Release: 3.1.0

Provider package

This is a provider package for the dbt.cloud provider. All classes for this provider package are in the airflow.providers.dbt.cloud Python package.

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported) via pip install apache-airflow-providers-dbt-cloud

Requirements

PIP package                      Version required
===============================  ================
apache-airflow                   >=2.3.0
apache-airflow-providers-http
asgiref
aiohttp

Cross provider package dependencies

These are dependencies that might be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-dbt-cloud[http]

Dependent package                Extra
===============================  =====
apache-airflow-providers-http    http

Downloading official packages

You can download officially released packages and verify their checksums and signatures from the Official Apache Download site.

Changelog

3.1.0

Features

  • Add 'DbtCloudJobRunAsyncSensor' (#29695)

3.0.0

Breaking changes

Beginning with version 2.0.0, users could specify single-tenant dbt Cloud domains via the schema parameter in an Airflow connection. Subsequently, version 2.3.1 let users connect to dbt Cloud instances outside of the US region, as well as private instances, by using the host parameter of their Airflow connection to specify the entire tenant domain; backwards compatibility for using schema was left in place. Version 3.0.0 removes support for using schema to specify the tenant domain of a dbt Cloud instance. If you wish to connect to a single-tenant instance, an instance outside of the US, or a private instance, you must use the host parameter to specify the entire tenant domain name in your Airflow connection.

  • Drop Connection.schema use in DbtCloudHook  (#29166)
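As a sketch of the host-based configuration described above, a dbt Cloud connection can be supplied through an environment variable in Airflow's URI form, with the full tenant domain carried in the host part. The connection id, account id, token, and domain below are illustrative placeholders, not values from this document:

```python
import os

# Hypothetical connection named "dbt_cloud_default":
#   scheme   -> the dbt-cloud connection type
#   login    -> dbt Cloud account id (placeholder)
#   password -> dbt Cloud API token (placeholder)
#   host     -> the ENTIRE tenant domain, e.g. a single-tenant,
#               non-US, or private instance domain
os.environ["AIRFLOW_CONN_DBT_CLOUD_DEFAULT"] = (
    "dbt-cloud://my-account-id:my-api-token@my-tenant.getdbt.com"
)
```

Any hook or operator that references this connection id would then resolve the tenant domain from the host, rather than from the now-removed schema parameter.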

Features

  • Allow downloading of dbt Cloud artifacts to non-existent paths (#29048)

  • Add deferrable mode to 'DbtCloudRunJobOperator' (#29014)

Misc

  • Provide more context for 'trigger_reason' in DbtCloudRunJobOperator (#28994)

2.3.1

Bug Fixes

  • Use entire tenant domain name in dbt Cloud connection (#28890)

2.3.0

This release of the provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

2.2.0

Features

  • Add 'DbtCloudListJobsOperator' (#26475)

2.1.0

Features

  • Improve taskflow type hints with ParamSpec (#25173)

2.0.1

Bug Fixes

  • Update providers to use functools compat for 'cached_property' (#24582)

2.0.0

Features

  • Enable dbt Cloud provider to interact with single tenant instances (#24264)

Bug Fixes

  • Fix typo in dbt Cloud provider description (#23179)

  • Fix new MyPy errors in main (#22884)

1.0.2

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

1.0.1

Initial version of the provider.
