Templates reference¶
Variables, macros and filters can be used in templates (see the Jinja Templating section).
The following come for free out of the box with Airflow. Additional custom macros can be added globally through Plugins, or at a DAG level through the DAG.user_defined_macros argument.
Variables¶
The Airflow engine passes a few variables by default that are accessible in all templates:
| Variable | Type | Description |
|---|---|---|
| `{{ data_interval_start }}` | `pendulum.DateTime` | Start of the data interval. Added in version 2.3. |
| `{{ data_interval_end }}` | `pendulum.DateTime` | End of the data interval. Added in version 2.3. |
| `{{ ds }}` | `str` | The DAG run's logical date as `YYYY-MM-DD`. Same as `{{ dag_run.logical_date \| ds }}`. |
| `{{ ds_nodash }}` | `str` | Same as `{{ dag_run.logical_date \| ds_nodash }}`. |
| `{{ ts }}` | `str` | Same as `{{ dag_run.logical_date \| ts }}`. Example: `2018-01-01T00:00:00+00:00`. |
| `{{ ts_nodash_with_tz }}` | `str` | Same as `{{ dag_run.logical_date \| ts_nodash_with_tz }}`. Example: `20180101T000000+0000`. |
| `{{ ts_nodash }}` | `str` | Same as `{{ dag_run.logical_date \| ts_nodash }}`. Example: `20180101T000000`. |
| `{{ prev_data_interval_start_success }}` | `pendulum.DateTime \| None` | Start of the data interval of the prior successful DAG run. Added in version 2.3. |
| `{{ prev_data_interval_end_success }}` | `pendulum.DateTime \| None` | End of the data interval of the prior successful DAG run. Added in version 2.3. |
| `{{ prev_start_date_success }}` | `pendulum.DateTime \| None` | Start date from prior successful DAG run (if available). |
| `{{ dag }}` | `DAG` | The currently running DAG. |
| `{{ task }}` | `BaseOperator` | The currently running task. |
| `{{ macros }}` | | A reference to the macros package. See Macros below. |
| `{{ task_instance }}` | `TaskInstance` | The currently running task instance. |
| `{{ ti }}` | `TaskInstance` | Same as `{{ task_instance }}`. |
| `{{ params }}` | `dict[str, Any]` | The user-defined params. This can be overridden by the mapping passed to `trigger_dag -c` if `dag_run_conf_overrides_params` is enabled in `airflow.cfg`. |
| `{{ var.value }}` | | Airflow variables. See Airflow Variables in Templates below. |
| `{{ var.json }}` | | Airflow variables. See Airflow Variables in Templates below. |
| `{{ conn }}` | | Airflow connections. See Airflow Connections in Templates below. |
| `{{ task_instance_key_str }}` | `str` | A unique, human-readable key to the task instance. The format is `{dag_id}__{task_id}__{ds_nodash}`. |
| `{{ conf }}` | `AirflowConfigParser` | The full configuration object representing the content of your `airflow.cfg`. See `airflow.configuration.conf`. |
| `{{ run_id }}` | `str` | The currently running DAG run's run ID. |
| `{{ dag_run }}` | `DagRun` | The currently running DAG run. |
| `{{ test_mode }}` | `bool` | Whether the task instance was run by the `airflow tasks test` CLI command. |
| `{{ expanded_ti_count }}` | `int \| None` | Number of task instances that a mapped task was expanded into. If the current task is not mapped, this is `None`. Added in version 2.5. |
Note
The DAG run's logical date, and values derived from it such as `ds` and `ts`, should not be considered unique in a DAG. Use `run_id` instead.
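To make the date-derived variables above concrete, here is a minimal stdlib-only sketch (not Airflow code; `logical_date` is a stand-in for the value Airflow injects into the template context) showing what `ds`, `ds_nodash`, `ts`, `ts_nodash` and `ts_nodash_with_tz` render to for a given logical date:

```python
from datetime import datetime, timezone

# Hypothetical logical date; in a real task Airflow supplies this for you.
logical_date = datetime(2018, 1, 1, tzinfo=timezone.utc)

# Each template variable is a fixed rendering of that datetime:
ds = logical_date.strftime("%Y-%m-%d")                        # {{ ds }}
ds_nodash = logical_date.strftime("%Y%m%d")                   # {{ ds_nodash }}
ts = logical_date.isoformat()                                 # {{ ts }}
ts_nodash = logical_date.strftime("%Y%m%dT%H%M%S")            # {{ ts_nodash }}
ts_nodash_with_tz = logical_date.strftime("%Y%m%dT%H%M%S%z")  # {{ ts_nodash_with_tz }}

print(ds, ts)  # 2018-01-01 2018-01-01T00:00:00+00:00
```

The outputs line up with the examples in the table; note that because the logical date repeats across manually triggered runs, none of these strings is a unique run identifier.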
The following variables are deprecated. They are kept for backward compatibility, but you should convert existing code to use other variables instead.

| Deprecated Variable | Description |
|---|---|
| `{{ execution_date }}` | the execution date (logical date), same as `dag_run.logical_date` |
| `{{ next_execution_date }}` | the logical date of the next scheduled run (if applicable); you may be able to use `data_interval_end` instead |
| `{{ next_ds }}` | the next execution date as `YYYY-MM-DD` if it exists, else `None` |
| `{{ next_ds_nodash }}` | the next execution date as `YYYYMMDD` if it exists, else `None` |
| `{{ prev_execution_date }}` | the logical date of the previous scheduled run (if applicable) |
| `{{ prev_ds }}` | the previous execution date as `YYYY-MM-DD` if it exists, else `None` |
| `{{ prev_ds_nodash }}` | the previous execution date as `YYYYMMDD` if it exists, else `None` |
| `{{ yesterday_ds }}` | the day before the execution date as `YYYY-MM-DD` |
| `{{ yesterday_ds_nodash }}` | the day before the execution date as `YYYYMMDD` |
| `{{ tomorrow_ds }}` | the day after the execution date as `YYYY-MM-DD` |
| `{{ tomorrow_ds_nodash }}` | the day after the execution date as `YYYYMMDD` |
| `{{ prev_execution_date_success }}` | execution date from prior successful DAG run |
Note that you can access the object's attributes and methods with simple dot notation. Here are some examples of what is possible: `{{ task.owner }}`, `{{ task.task_id }}`, `{{ ti.hostname }}`, …
Refer to the models documentation for more information on the objects' attributes and methods.
Airflow Variables in Templates¶
The `var` template variable allows you to access Airflow Variables. You can access them as either plain-text or JSON. If you use JSON, you are also able to walk nested structures, such as dictionaries, like: `{{ var.json.my_dict_var.key1 }}`.
It is also possible to fetch a variable by string if needed, with `{{ var.value.get('my.var', 'fallback') }}` or `{{ var.json.get('my.dict.var', {'key1': 'val1'}) }}`. Defaults can be supplied in case the variable does not exist.
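The lookup-with-default behaviour can be pictured with plain dictionaries. This is an illustration only, not Airflow's implementation; the `_variables` store and `var_json_get` helper are invented names:

```python
import json

# Hypothetical variable store holding JSON-encoded values (illustration only).
_variables = {"my_dict_var": json.dumps({"key1": "val1", "key2": "val2"})}

def var_json_get(name, default=None):
    """Mimics {{ var.json.get('name', default) }}: decode the stored JSON,
    returning the default when the variable does not exist."""
    raw = _variables.get(name)
    return default if raw is None else json.loads(raw)

print(var_json_get("my_dict_var")["key1"])            # nested walk, like var.json.my_dict_var.key1
print(var_json_get("missing", {"key1": "fallback"}))  # the default applies
```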
Airflow Connections in Templates¶
Similarly, Airflow Connections data can be accessed via the `conn` template variable. For example, you could use expressions in your templates like `{{ conn.my_conn_id.login }}`, `{{ conn.my_conn_id.password }}`, etc.
Just like with `var`, it's possible to fetch a connection by string (e.g. `{{ conn.get('my_conn_id_' + index).host }}`) or provide defaults (e.g. `{{ conn.get('my_conn_id', {"host": "host1", "login": "user1"}).host }}`).
Additionally, the extras field of a connection can be fetched as a Python dictionary with the `extra_dejson` field, e.g. `conn.my_aws_conn_id.extra_dejson.region_name` would fetch `region_name` out of extras.
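Under the hood, `extra_dejson` is simply the JSON-decoded form of the connection's extra field. A minimal sketch with stdlib `json` (the payload below is invented for illustration):

```python
import json

# Invented connection "extra" payload, stored as a JSON string.
extra = '{"region_name": "eu-west-1", "endpoint_url": "https://example.invalid"}'

# extra_dejson corresponds to the decoded dictionary form of that string.
extra_dejson = json.loads(extra)

# Dot access in templates maps to key lookup here:
print(extra_dejson["region_name"])  # eu-west-1
```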
Filters¶
Airflow defines some Jinja filters that can be used to format values. For example, using `{{ execution_date | ds }}` will output the execution_date in the `YYYY-MM-DD` format.
| Filter | Operates on | Description |
|---|---|---|
| `ds` | datetime | Format the datetime as `YYYY-MM-DD` |
| `ds_nodash` | datetime | Format the datetime as `YYYYMMDD` |
| `ts` | datetime | Same as `.isoformat()`. Example: `2018-01-01T00:00:00+00:00` |
| `ts_nodash` | datetime | Same as the `ts` filter without `-`, `:` or TimeZone info. Example: `20180101T000000` |
| `ts_nodash_with_tz` | datetime | As the `ts` filter without `-` or `:`. Example: `20180101T000000+0000` |
Macros¶
Macros are a way to expose objects to your templates and live under the `macros` namespace in your templates. A few commonly used libraries and methods are made available.

| Variable | Description |
|---|---|
| `macros.datetime` | The standard lib's `datetime.datetime` |
| `macros.timedelta` | The standard lib's `datetime.timedelta` |
| `macros.dateutil` | A reference to the `dateutil` package |
| `macros.time` | The standard lib's `time` |
| `macros.uuid` | The standard lib's `uuid` |
| `macros.random` | The standard lib's `random.random` |
Some Airflow-specific macros are also defined:
- airflow.macros.datetime_diff_for_humans(dt, since=None)[source]¶
Return a human-readable/approximate difference between datetimes. When only one datetime is provided, the comparison is against now.
- Parameters
dt (Any) – The datetime to display the diff for
since (DateTime | None) – When to display the date from. If None, the diff is between dt and now.
- airflow.macros.ds_add(ds, days)[source]¶
Add or subtract days from a YYYY-MM-DD date string.
- Parameters
ds (str) – anchor date in YYYY-MM-DD format to add to
days (int) – number of days to add to the ds; you can use negative values

>>> ds_add('2015-01-01', 5)
'2015-01-06'
>>> ds_add('2015-01-06', -5)
'2015-01-01'
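The behaviour can be reproduced with stdlib `datetime`. A sketch of what `ds_add` does (not the macro's actual source):

```python
from datetime import datetime, timedelta

def ds_add(ds, days):
    """Add (or, with a negative count, subtract) days from a YYYY-MM-DD string."""
    dt = datetime.strptime(ds, "%Y-%m-%d") + timedelta(days=days)
    return dt.strftime("%Y-%m-%d")

print(ds_add("2015-01-01", 5))   # 2015-01-06
print(ds_add("2015-01-06", -5))  # 2015-01-01
```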
- airflow.macros.ds_format(ds, input_format, output_format)[source]¶
Output a datetime string in a given format.
- Parameters
ds (str) – input string which contains a date
input_format (str) – input string format (e.g. %Y-%m-%d)
output_format (str) – output string format (e.g. %Y-%m-%d)

>>> ds_format('2015-01-01', "%Y-%m-%d", "%m-%d-%y")
'01-01-15'
>>> ds_format('1/5/2015', "%m/%d/%Y", "%Y-%m-%d")
'2015-01-05'
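This macro is essentially a parse-then-reformat round trip through `strptime`/`strftime`; a stdlib sketch (not the macro's actual source):

```python
from datetime import datetime

def ds_format(ds, input_format, output_format):
    """Re-render a date string from one strftime-style format into another."""
    return datetime.strptime(ds, input_format).strftime(output_format)

print(ds_format("2015-01-01", "%Y-%m-%d", "%m-%d-%y"))  # 01-01-15
print(ds_format("1/5/2015", "%m/%d/%Y", "%Y-%m-%d"))    # 2015-01-05
```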
- airflow.macros.random() → x in the interval [0, 1).¶
- airflow.macros.hive.closest_ds_partition(table, ds, before=True, schema='default', metastore_conn_id='metastore_default')[source]¶
Find the date in a list closest to the target date. An optional parameter can be given to get the closest date before or after the target.
- Parameters
table – A hive table name
ds – A datestamp %Y-%m-%d, e.g. yyyy-mm-dd
before – closest before (True), after (False) or either side of ds
schema – table schema
metastore_conn_id – which metastore connection to use
- Returns
The closest date
- Return type
str | None

>>> tbl = 'airflow.static_babynames_partitioned'
>>> closest_ds_partition(tbl, '2015-01-02')
'2015-01-01'
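Setting aside the Hive metastore lookup, the selection logic can be sketched over a plain list of partition date strings; `closest_date` and its candidate list are invented names for illustration, not Airflow's implementation:

```python
from datetime import datetime

def closest_date(target_ds, candidate_ds_list, before=True):
    """Pick the candidate date closest to target_ds.

    before=True  -> closest candidate on or before the target
    before=False -> closest candidate on or after the target
    before=None  -> closest candidate on either side
    """
    fmt = "%Y-%m-%d"
    target = datetime.strptime(target_ds, fmt)
    dates = [datetime.strptime(d, fmt) for d in candidate_ds_list]
    if before is True:
        dates = [d for d in dates if d <= target]
    elif before is False:
        dates = [d for d in dates if d >= target]
    if not dates:
        return None  # mirrors the str | None return type
    return min(dates, key=lambda d: abs(d - target)).strftime(fmt)

print(closest_date("2015-01-02", ["2015-01-01", "2015-01-05"]))  # 2015-01-01
```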
- airflow.macros.hive.max_partition(table, schema='default', field=None, filter_map=None, metastore_conn_id='metastore_default')[source]¶
Get the max partition for a table.
- Parameters
schema – The hive schema the table lives in
table – The hive table you are interested in; supports the dot notation as in "my_database.my_table". If a dot is found, the schema param is disregarded.
metastore_conn_id – The hive connection you are interested in. If your default is set, you don't need to use this parameter.
filter_map – partition_key:partition_value map used for partition filtering, e.g. {'key1': 'value1', 'key2': 'value2'}. Only partitions matching all partition_key:partition_value pairs will be considered as candidates of max partition.
field – the field to get the max value from. If there's only one partition field, this will be inferred.

>>> max_partition('airflow.static_babynames_partitioned')
'2015-01-01'