Microsoft Azure Data Lake Storage Gen2 Connection¶
The Microsoft Azure Data Lake Storage Gen2 connection type enables the ADLS Gen2 integrations.
Authenticating to Azure Data Lake Storage Gen2¶
Currently, there are two ways to connect to Azure Data Lake Storage Gen2 using Airflow.
- Use token credentials, i.e. add specific credentials (client_id, secret, tenant) and subscription id to the Airflow connection.
- Use a connection string, i.e. add the connection string to connection_string in the Airflow connection.
Only one authorization method can be used at a time. If you need to manage multiple credentials or keys then you should configure multiple connections.
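For illustration, the sketch below defines one Airflow Connection per method. It is only a sketch: the connection type name adls, the hypothetical connection IDs, and the placement of client_id and secret in the Login and Password fields are assumptions based on the field descriptions under "Configuring the Connection" below, and may differ across provider versions.

```python
# A minimal sketch, not the provider's canonical setup: the conn_type "adls",
# the connection IDs, and the field mapping are assumptions.
import json

from airflow.models import Connection

# Variant 1: token credentials (client_id, secret, tenant).
token_conn = Connection(
    conn_id="adls_token_example",        # hypothetical connection ID
    conn_type="adls",                    # assumed connection type name
    login="<client_id>",                 # assumed: client_id goes in Login
    password="<client_secret>",          # assumed: secret goes in Password
    extra=json.dumps({"tenant_id": "<tenant_id>"}),
)

# Variant 2: connection string.
conn_string_conn = Connection(
    conn_id="adls_connection_string_example",  # hypothetical connection ID
    conn_type="adls",
    extra=json.dumps({"connection_string": "<connection_string>"}),
)
```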
Default Connection IDs¶
All hooks and operators related to Microsoft Azure Data Lake Storage Gen2 use azure_data_lake_default by default.
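As a quick illustration, the snippet below instantiates a hook against that connection ID. The class name AzureDataLakeStorageV2Hook and its adls_conn_id parameter come from the Microsoft Azure provider package and should be treated as assumptions; check the version you have installed.

```python
# A short sketch, assuming AzureDataLakeStorageV2Hook and its adls_conn_id
# parameter from apache-airflow-providers-microsoft-azure.
from airflow.providers.microsoft.azure.hooks.data_lake import AzureDataLakeStorageV2Hook

# Uses the default connection ID described above.
hook = AzureDataLakeStorageV2Hook(adls_conn_id="azure_data_lake_default")

# Point at a differently named connection by passing its ID instead.
other_hook = AzureDataLakeStorageV2Hook(adls_conn_id="my_adls_gen2_conn")
```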
Configuring the Connection¶
- Login (optional)
Specify the login used for Azure Data Lake Storage Gen2. For use with Shared Key Credential and SAS Token authentication.
- Password (optional)
Specify the password used for Azure Data Lake Storage Gen2. For use with Active Directory (token credential) and shared key authentication.
- Host (optional)
Specify the account URL for anonymous public read, Active Directory, and shared access key authentication.
- Extra (optional)
Specify the extra parameters (as a JSON dictionary) that can be used in the Azure connection. The following parameters are all optional:
- tenant_id: Specify the tenant to use. Needed for Active Directory (token) authentication.
- connection_string: Connection string for use with connection string authentication.
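For reference, the snippet below shows how these optional extras can be inspected once the connection is stored. BaseHook.get_connection and extra_dejson are standard Airflow APIs; the simple branching between the two keys is only an illustration of how the extras distinguish the authentication methods.

```python
# A small sketch: read the Extra field of the stored connection and see
# which of the optional keys above is present.
from airflow.hooks.base import BaseHook

conn = BaseHook.get_connection("azure_data_lake_default")
extras = conn.extra_dejson  # the Extra field parsed as a JSON dictionary

if "connection_string" in extras:
    print("connection_string is set: connection string authentication")
elif "tenant_id" in extras:
    print("tenant_id is set: Active Directory (token) authentication")
else:
    print("no optional extras set")
```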
When specifying the connection as an environment variable, you should specify it using URI syntax.
Note that all components of the URI should be URL-encoded.
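Rather than hand-encoding each component, one way to produce such a URI is to build a Connection object and call get_uri(), which URL-encodes the parts for you. The conn_type "adls" and the placeholder credential values below are assumptions; the environment variable name is AIRFLOW_CONN_ followed by the connection ID in upper case.

```python
# A minimal sketch: generate a URL-encoded connection URI suitable for an
# environment variable. The conn_type "adls" and placeholders are assumptions.
import json

from airflow.models import Connection

conn = Connection(
    conn_id="azure_data_lake_default",
    conn_type="adls",
    login="<client_id>",
    password="<client_secret>",
    extra=json.dumps({"tenant_id": "<tenant_id>"}),
)

# get_uri() URL-encodes each component of the resulting URI.
print(conn.get_uri())
# Export the printed value as AIRFLOW_CONN_AZURE_DATA_LAKE_DEFAULT to make the
# connection available to Airflow without storing it in the metadata database.
```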