airflow.timetables.base

Module Contents

Classes

DataInterval: A data interval for a DagRun to operate over.

TimeRestriction: Restriction on when a DAG can be scheduled for a run.

DagRunInfo: Information to schedule a DagRun.

Timetable: Protocol that all Timetable classes are expected to implement.

class airflow.timetables.base.DataInterval[source]

Bases: NamedTuple

A data interval for a DagRun to operate over.

Both start and end MUST be “aware”, i.e. contain timezone information.

start: pendulum.DateTime[source]
end: pendulum.DateTime[source]
classmethod exact(at)[source]

Represent an “interval” containing only an exact time.
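
A minimal usage sketch (the dates below are illustrative, not part of the API):

import pendulum

from airflow.timetables.base import DataInterval

# A one-day interval; both endpoints must be timezone-aware.
start = pendulum.datetime(2023, 1, 1, tz="UTC")
interval = DataInterval(start=start, end=start.add(days=1))

# An "interval" that covers only a single point in time.
point = DataInterval.exact(pendulum.datetime(2023, 1, 2, tz="UTC"))
assert point.start == point.end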

class airflow.timetables.base.TimeRestriction[source]

Bases: NamedTuple

Restriction on when a DAG can be scheduled for a run.

Specifically, the run must not be earlier than earliest, nor later than latest. If catchup is False, the run must also not be earlier than the current time, i.e. “missed” schedules are not backfilled.

These values are generally derived from the DAG or task’s start_date, end_date, and catchup arguments.

Both earliest and latest, if not None, are inclusive; a DAG run can happen exactly at either point of time. They are guaranteed to be aware (i.e. contain timezone information) for TimeRestriction instances created by Airflow.

earliest: pendulum.DateTime | None[source]
latest: pendulum.DateTime | None[source]
catchup: bool[source]
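
For illustration, a restriction roughly equivalent to a DAG with a start_date, no end_date, and catchup=False could look like this (the values are hypothetical):

import pendulum

from airflow.timetables.base import TimeRestriction

restriction = TimeRestriction(
    earliest=pendulum.datetime(2023, 1, 1, tz="UTC"),  # from start_date
    latest=None,                                       # no end_date
    catchup=False,                                     # do not backfill missed schedules
)
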
class airflow.timetables.base.DagRunInfo[source]

Bases: NamedTuple

Information to schedule a DagRun.

Instances of this will be returned by timetables when they are asked to schedule a DagRun creation.

property logical_date: pendulum.DateTime[source]

Infer the logical date to represent a DagRun.

This replaces execution_date in Airflow 2.1 and prior. The idea is essentially the same, just a different name.

run_after: pendulum.DateTime[source]

The earliest time at which this DagRun can be created and its tasks scheduled.

This MUST be “aware”, i.e. contain timezone information.

data_interval: DataInterval[source]

The data interval this DagRun operates over.

classmethod exact(at)[source]

Represent a run on an exact time.

classmethod interval(start, end)[source]

Represent a run on a continuous schedule.

In such a schedule, each data interval starts right after the previous one ends, and each run is scheduled right after the interval ends. This applies to all schedules prior to AIP-39 except @once and None.
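
A short sketch of both constructors (the dates are illustrative):

import pendulum

from airflow.timetables.base import DagRunInfo

start = pendulum.datetime(2023, 1, 1, tz="UTC")
end = start.add(days=1)

# A run over a continuous interval; run_after is the interval's end.
info = DagRunInfo.interval(start=start, end=end)
assert info.run_after == end
assert info.data_interval.start == start

# A run at a single point in time, e.g. for "@once"-style schedules.
once = DagRunInfo.exact(pendulum.datetime(2023, 6, 1, tz="UTC"))
assert once.data_interval.start == once.data_interval.end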

class airflow.timetables.base.Timetable[source]

Bases: airflow.typing_compat.Protocol

Protocol that all Timetable classes are expected to implement.

property can_be_scheduled[source]
property summary: str[source]

A short summary for the timetable.

This is used to display the timetable in the web UI. A cron expression timetable, for example, can use this to display the expression. The default implementation returns the timetable’s type name.

description: str = ''[source]

Human-readable description of the timetable.

For example, this can produce something like 'At 21:30, only on Friday' from the cron expression '30 21 * * 5'. This is used in the webserver UI.

periodic: bool = True[source]

Whether this timetable runs periodically.

This defaults to and should generally be True, but some special setups like schedule=None and "@once" set it to False.

run_ordering: Sequence[str] = ('data_interval_end', 'execution_date')[source]

How runs triggered from this timetable should be ordered in the UI.

This should be a list of field names on the DAG run object.

active_runs_limit: int | None[source]

Override the max_active_runs parameter of any DAG using this timetable. This is called during DAG initialization and will set max_active_runs if it returns a value. In most cases this should return None, but in some cases (for example, the ContinuousTimetable) there are good reasons to limit DagRun parallelism.
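
As a sketch, a hypothetical timetable that never allows overlapping runs could override the property like this:

from __future__ import annotations

from airflow.timetables.base import Timetable

class SequentialTimetable(Timetable):
    """Hypothetical timetable that caps DagRun parallelism."""

    @property
    def active_runs_limit(self) -> int | None:
        # Forces max_active_runs to 1 for any DAG using this timetable.
        return 1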

classmethod deserialize(data)[source]

Deserialize a timetable from data.

This is called when a serialized DAG is deserialized. data will be whatever was returned by serialize during DAG serialization. The default implementation constructs the timetable without any arguments.

serialize()[source]

Serialize the timetable for JSON encoding.

This is called during DAG serialization to store timetable information in the database. This should return a JSON-serializable dict that will be fed into deserialize when the DAG is deserialized. The default implementation returns an empty dict.
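
A minimal sketch of a custom timetable that persists one parameter; the class and its hour field are hypothetical, and the required scheduling methods are omitted:

from __future__ import annotations

from airflow.timetables.base import Timetable

class HourOfDayTimetable(Timetable):
    """Hypothetical timetable parameterised by an hour of day."""

    def __init__(self, hour: int = 0) -> None:
        self.hour = hour

    def serialize(self) -> dict:
        # Stored with the serialized DAG; must be JSON-serializable.
        return {"hour": self.hour}

    @classmethod
    def deserialize(cls, data: dict) -> HourOfDayTimetable:
        # Receives exactly what serialize() returned.
        return cls(hour=data["hour"])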

validate()[source]

Validate the timetable is correctly specified.

Override this method to provide run-time validation; it is run when a DAG is put into a DagBag, and should raise an exception on failure. The default implementation does nothing.

Raises

AirflowTimetableInvalid on validation failure.
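
For example, a hypothetical timetable could validate its constructor argument like this:

from airflow.exceptions import AirflowTimetableInvalid
from airflow.timetables.base import Timetable

class HourOfDayTimetable(Timetable):
    """Hypothetical timetable that only accepts hours 0-23."""

    def __init__(self, hour: int) -> None:
        self.hour = hour

    def validate(self) -> None:
        # Called when the DAG is loaded into a DagBag.
        if not 0 <= self.hour <= 23:
            raise AirflowTimetableInvalid(f"invalid hour: {self.hour!r}")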

abstract infer_manual_data_interval(*, run_after)[source]

When a DAG run is manually triggered, infer a data interval for it.

This is used for e.g. manually-triggered runs, where run_after would be when the user triggers the run. The default implementation raises NotImplementedError.
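
As a sketch, a hypothetical day-aligned timetable could map a manual trigger to the most recent complete day:

import pendulum

from airflow.timetables.base import DataInterval, Timetable

class DailyTimetable(Timetable):
    """Hypothetical midnight-to-midnight (UTC) timetable."""

    def infer_manual_data_interval(self, *, run_after: pendulum.DateTime) -> DataInterval:
        # Cover the last complete day before the manual trigger.
        start = run_after.start_of("day").subtract(days=1)
        return DataInterval(start=start, end=start.add(days=1))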

abstract next_dagrun_info(*, last_automated_data_interval, restriction)[source]

Provide information to schedule the next DagRun.

The default implementation raises NotImplementedError.

Parameters
  • last_automated_data_interval (DataInterval | None) – The data interval of the associated DAG’s last scheduled or backfilled run (manual runs not considered).

  • restriction (TimeRestriction) – Restriction to apply when scheduling the DAG run. See documentation of TimeRestriction for details.

Returns

Information on when the next DagRun can be scheduled. None means a DagRun will not happen. This does not mean no more runs will ever be scheduled for this DAG; the timetable can return a DagRunInfo object when asked again at a later time.

Return type

DagRunInfo | None
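
Continuing the hypothetical day-aligned timetable sketched above, a simplified implementation of this method could look like this (a production timetable should handle edge cases such as interval alignment more carefully):

from __future__ import annotations

import pendulum

from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction, Timetable

class DailyTimetable(Timetable):
    """Hypothetical midnight-to-midnight (UTC) timetable."""

    def next_dagrun_info(
        self,
        *,
        last_automated_data_interval: DataInterval | None,
        restriction: TimeRestriction,
    ) -> DagRunInfo | None:
        if restriction.earliest is None:
            return None  # no start_date: never schedule automatically
        if last_automated_data_interval is not None:
            # Continue right after the previous automated run's interval.
            next_start = last_automated_data_interval.end
        else:
            # First automated run: begin at the earliest allowed time.
            next_start = restriction.earliest
            if not restriction.catchup:
                # Skip "missed" intervals: never start before today.
                next_start = max(next_start, pendulum.now("UTC").start_of("day"))
        if restriction.latest is not None and next_start > restriction.latest:
            return None  # past the DAG's end_date
        return DagRunInfo.interval(start=next_start, end=next_start.add(days=1))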

generate_run_id(*, run_type, logical_date, data_interval, **extra)[source]
