Package apache-airflow-providers-google

Google services including:
Google Workspace (formerly G Suite)

Release: 2.1.0
Provider package

This is a provider package for the google provider. All classes for this provider package are in the airflow.providers.google Python package.
Installation

Note

In November 2020, a new version of pip (20.3) was released with a new 2020 resolver. This resolver does not yet work with Apache Airflow and might lead to errors during installation, depending on your choice of extras. To install Airflow, you need to either downgrade pip to version 20.2.4

pip install --upgrade pip==20.2.4

or, in case you use pip 20.3, add the option

--use-deprecated legacy-resolver

to your pip install command.

You can install this package on top of an existing Airflow 2.* installation via

pip install apache-airflow-providers-google
PIP requirements

PIP package | Version required
---|---
Cross provider package dependencies

These are dependencies that might be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-google[amazon]
Dependent package | Extra
---|---
Changelog

2.1.0

Features
Corrects order of argument in docstring in GCSHook.download method (#14497)
Refactor SQL/BigQuery/Qubole/Druid Check operators (#12677)
Add GoogleDriveToLocalOperator (#14191)
Add 'exists_ok' flag to BigQueryCreateEmptyTable(Dataset)Operator (#14026)
Add materialized view support for BigQuery (#14201)
Add BigQueryUpdateTableOperator (#14149)
Add param to CloudDataTransferServiceOperator (#14118)
Add gdrive_to_gcs operator, drive sensor, additional functionality to drive hook (#13982)
Improve GCSToSFTPOperator paths handling (#11284)
Bug Fixes
Fixes to dataproc operators and hook (#14086)
#9803 fix bug in copy operation without wildcard (#13919)
2.0.0

Breaking changes

Updated google-cloud-* libraries

This release of the provider package contains third-party library updates, which may require updating your DAG files or custom hooks and operators if you were using objects from those libraries. Updating these libraries is necessary to use the new features made available by newer versions of the libraries and to obtain bug fixes that are only available in those versions.

Details are covered in the UPDATING.md files for each library, but there are some details that you should pay attention to.
Library name | Previous constraints | Current constraints | Upgrade Documentation
---|---|---|---
The field names use the snake_case convention

If your DAG uses an object from the above-mentioned libraries passed by XCom, it is necessary to update the naming convention of the fields that are read. Previously, the fields used the camelCase convention; now the snake_case convention is used.
Before:
set_acl_permission = GCSBucketCreateAclEntryOperator(
    task_id="gcs-set-acl-permission",
    bucket=BUCKET_NAME,
    entity="user-{{ task_instance.xcom_pull('get-instance')['persistenceIamIdentity']"
    ".split(':', 2)[1] }}",
    role="OWNER",
)
After:
set_acl_permission = GCSBucketCreateAclEntryOperator(
    task_id="gcs-set-acl-permission",
    bucket=BUCKET_NAME,
    entity="user-{{ task_instance.xcom_pull('get-instance')['persistence_iam_identity']"
    ".split(':', 2)[1] }}",
    role="OWNER",
)
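When migrating many field references at once, a small helper can mechanically convert old camelCase field names to the new snake_case form. This is an illustrative sketch only, not part of the provider package; the function name camel_to_snake is hypothetical.

```python
import re


def camel_to_snake(name: str) -> str:
    """Convert a camelCase field name (e.g. 'persistenceIamIdentity')
    to snake_case (e.g. 'persistence_iam_identity')."""
    # Insert an underscore before every uppercase letter that is not at
    # the start of the string, then lowercase the whole result.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()
```

For example, camel_to_snake("persistenceIamIdentity") returns "persistence_iam_identity", which matches the field name used in the updated operator above.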
Features
Add Apache Beam operators (#12814)
Add Google Cloud Workflows Operators (#13366)
Replace 'google_cloud_storage_conn_id' by 'gcp_conn_id' when using 'GCSHook' (#13851)
Add How To Guide for Dataflow (#13461)
Generalize MLEngineStartTrainingJobOperator to custom images (#13318)
Add Parquet data type to BaseSQLToGCSOperator (#13359)
Add DataprocCreateWorkflowTemplateOperator (#13338)
Add OracleToGCS Transfer (#13246)
Add timeout option to gcs hook methods. (#13156)
Add regional support to dataproc workflow template operators (#12907)
Add project_id to client inside BigQuery hook update_table method (#13018)
Bug fixes
Fix four bugs in StackdriverTaskHandler (#13784)
Decode Remote Google Logs (#13115)
Fix and improve GCP BigTable hook and system test (#13896)
Updated Google DV360 Hook to fix SDF issue (#13703)
Fix insert_all method of BigQueryHook to support tables without schema (#13138)
Fix Google BigQueryHook method get_schema() (#13136)
Fix Data Catalog operators (#13096)
1.0.0

Initial version of the provider.