Setting Configuration Options
The first time you run Airflow, it will create a file called airflow.cfg in your $AIRFLOW_HOME directory (~/airflow by default). This file contains Airflow’s configuration and you can edit it to change any of the settings. You can also set options with environment variables by using this format: AIRFLOW__{SECTION}__{KEY} (note the double underscores).
For example, the metadata database connection string can either be set in airflow.cfg like this:
[database]
sql_alchemy_conn = my_conn_string
or by creating a corresponding environment variable:
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=my_conn_string
You can also derive the connection string at run time by appending _cmd to the key like this:
[database]
sql_alchemy_conn_cmd = bash_command_to_run
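As an illustration, a minimal sketch assuming the connection string is kept in a hypothetical file /run/secrets/sql_alchemy_conn readable by the Airflow user:
[database]
sql_alchemy_conn_cmd = cat /run/secrets/sql_alchemy_conn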
You can also derive the connection string at run time by appending _secret to the key like this:
[database]
sql_alchemy_conn_secret = sql_alchemy_conn
# You can also add a nested path
# example:
# sql_alchemy_conn_secret = database/sql_alchemy_conn
This will retrieve the config option from a secrets backend, e.g. HashiCorp Vault. See Secrets Backends for more details.
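For this to work, a secrets backend has to be configured in the [secrets] section. A minimal sketch, assuming the HashiCorp Vault backend and a hypothetical Vault server at http://127.0.0.1:8200:
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"config_path": "config", "url": "http://127.0.0.1:8200"}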
The following config options support this _cmd and _secret version:
- sql_alchemy_conn in [database] section
- fernet_key in [core] section
- broker_url in [celery] section
- flower_basic_auth in [celery] section
- result_backend in [celery] section
- password in [atlas] section
- smtp_password in [smtp] section
- secret_key in [webserver] section
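The same pattern works for every option in the list above. For example, a sketch for fernet_key, assuming the key is stored in a hypothetical file /run/secrets/fernet_key:
[core]
fernet_key_cmd = cat /run/secrets/fernet_key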
The _cmd config options can also be set using a corresponding environment variable the same way the usual config options can. For example:
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_CMD=bash_command_to_run
Similarly, _secret config options can also be set using a corresponding environment variable. For example:
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_SECRET=sql_alchemy_conn
Note
The config options must follow the config prefix naming convention defined within the secrets backend. This means that sql_alchemy_conn is not defined with a connection prefix, but with the config prefix. For example, it should be named airflow/config/sql_alchemy_conn.
The idea behind this is to not store passwords in plain text files on the boxes.
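As an illustration, assuming the HashiCorp Vault backend with a KV mount named airflow and the default layout where the option value is stored under the value key, the option could be written like this (paths and values are hypothetical):
vault kv put airflow/config/sql_alchemy_conn value=my_conn_string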
The universal order of precedence for all configuration options is as follows:
1. set as an environment variable (AIRFLOW__DATABASE__SQL_ALCHEMY_CONN)
2. set as a command environment variable (AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_CMD)
3. set as a secret environment variable (AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_SECRET)
4. set in airflow.cfg
5. command in airflow.cfg
6. secret key in airflow.cfg
7. Airflow’s built-in defaults
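For example, suppose airflow.cfg contains
[database]
sql_alchemy_conn = my_conn_string_from_cfg
and the environment also has
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=my_conn_string_from_env
Then the environment variable takes precedence and my_conn_string_from_env is the value Airflow uses (both values here are hypothetical).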
Note
For Airflow versions >= 2.2.1, < 2.3.0, Airflow’s built-in defaults took precedence over command and secret key in airflow.cfg in some circumstances.
You can check the current configuration with the airflow config list command. If you only want to see the value for one option, you can use the airflow config get-value command, as in the example below.
$ airflow config get-value core executor
SequentialExecutor
Note
For more information on configuration options, see Configuration Reference.
Note
See Modules Management for details on how Python and Airflow manage modules.
Note
Use the same configuration across all the Airflow components. While each component does not require all of them, some configuration options need to be the same; otherwise they will not work as expected. A good example is secret_key, which should be the same on the Webserver and Worker to allow the Webserver to fetch logs from the Worker.
The webserver key is also used to authorize requests to Celery workers when logs are retrieved. The token generated using the secret key has a short expiry time, so make sure that the time on ALL the machines that you run Airflow components on is synchronized (for example using ntpd); otherwise you might get “forbidden” errors when the logs are accessed.
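A minimal sketch of keeping secret_key in sync: generate the key once, then set the identical value on the webserver and on every worker host (openssl is just one way to generate a random key; the variable value below is a placeholder):
# run once, on a single machine
openssl rand -hex 30
# set the identical generated value on the webserver and on every worker
export AIRFLOW__WEBSERVER__SECRET_KEY=<generated value>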