airflow.timetables.interval
Module Contents
Classes
CronDataIntervalTimetable | Timetable that schedules data intervals with a cron expression.
DeltaDataIntervalTimetable | Timetable that schedules data intervals with a time delta.
Attributes
Delta
- class airflow.timetables.interval.CronDataIntervalTimetable(cron: str, timezone: pendulum.tz.timezone.Timezone)[source]
Bases: _DataIntervalTimetable
Timetable that schedules data intervals with a cron expression.
This corresponds to schedule_interval=<cron>, where <cron> is either a five/six-segment cron representation or one of cron_presets.
The implementation builds on croniter to add timezone awareness, because croniter works only with naive timestamps and cannot consider DST when determining the next/previous time.
Don't pass @once here; use OnceTimetable instead.
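For concreteness, a minimal construction sketch follows; the cron expression and timezone are arbitrary example values, and the DAG-level equivalent is only noted in a comment.

```python
import pendulum
from airflow.timetables.interval import CronDataIntervalTimetable

# Equivalent to schedule_interval="0 6 * * *" on a DAG:
# one run per day at 06:00 in the given timezone.
timetable = CronDataIntervalTimetable(
    "0 6 * * *",                        # five-segment cron expression
    timezone=pendulum.timezone("UTC"),
)

# Per the summary property documented below, a cron timetable can
# display its expression as the short summary shown in the web UI.
print(timetable.summary)
```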
- classmethod deserialize(cls, data: Dict[str, Any]) → airflow.timetables.base.Timetable [source]
Deserialize a timetable from data.
This is called when a serialized DAG is deserialized. data will be whatever was returned by serialize during DAG serialization. The default implementation constructs the timetable without any arguments.
- __eq__(self, other: Any) → bool [source]
Both expression and timezone should match.
This is only for testing purposes and should not be relied on otherwise.
- property summary(self) → str [source]
A short summary for the timetable.
This is used to display the timetable in the web UI. A cron expression timetable, for example, can use this to display the expression. The default implementation returns the timetable’s type name.
- serialize(self) → Dict[str, Any] [source]
Serialize the timetable for JSON encoding.
This is called during DAG serialization to store timetable information in the database. This should return a JSON-serializable dict that will be fed into deserialize when the DAG is deserialized. The default implementation returns an empty dict.
- validate(self) → None [source]
Validate the timetable is correctly specified.
Override this method to provide run-time validation; errors are raised when a DAG is put into a dagbag. The default implementation does nothing.
- Raises: AirflowTimetableInvalid on validation failure.
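A hedged sketch of what a validation failure looks like; it assumes that an unparsable cron string is accepted at construction time and only rejected when validate() is called:

```python
import pendulum
from airflow.exceptions import AirflowTimetableInvalid
from airflow.timetables.interval import CronDataIntervalTimetable

bad = CronDataIntervalTimetable("not a cron", timezone=pendulum.timezone("UTC"))
try:
    bad.validate()
except AirflowTimetableInvalid as err:
    # Raised on validation failure, as documented above.
    print(f"rejected: {err}")
```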
- infer_manual_data_interval(self, *, run_after: pendulum.DateTime) → airflow.timetables.base.DataInterval [source]
When a DAG run is manually triggered, infer a data interval for it.
This is used for e.g. manually-triggered runs, where run_after would be when the user triggers the run. The default implementation raises NotImplementedError.
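To make the behaviour concrete, a small sketch of inferring an interval for a manual trigger; the trigger time is an arbitrary example, and the expected boundaries are stated only as an assumption in a comment rather than asserted:

```python
import pendulum
from airflow.timetables.interval import CronDataIntervalTimetable

timetable = CronDataIntervalTimetable("0 6 * * *", timezone=pendulum.timezone("UTC"))

# run_after is when the user triggered the run.
run_after = pendulum.datetime(2021, 10, 5, 14, 30, tz="UTC")
interval = timetable.infer_manual_data_interval(run_after=run_after)

# For a cron timetable we would expect this to cover the most recent complete
# cron period before run_after (an assumption, not asserted here).
print(interval.start, interval.end)
```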
- class airflow.timetables.interval.DeltaDataIntervalTimetable(delta: Delta)[source]
Bases: _DataIntervalTimetable
Timetable that schedules data intervals with a time delta.
This corresponds to schedule_interval=<delta>, where <delta> is either a datetime.timedelta or dateutil.relativedelta.relativedelta instance.
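A construction sketch using both accepted delta types; the specific durations are arbitrary examples:

```python
from datetime import timedelta
from dateutil.relativedelta import relativedelta
from airflow.timetables.interval import DeltaDataIntervalTimetable

# Equivalent to schedule_interval=timedelta(days=1) on a DAG.
daily = DeltaDataIntervalTimetable(timedelta(days=1))

# relativedelta is also accepted, e.g. for calendar-aware offsets.
monthly = DeltaDataIntervalTimetable(relativedelta(months=1))

print(daily.summary, monthly.summary)
```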
- classmethod deserialize(cls, data: Dict[str, Any]) → airflow.timetables.base.Timetable [source]
Deserialize a timetable from data.
This is called when a serialized DAG is deserialized. data will be whatever was returned by serialize during DAG serialization. The default implementation constructs the timetable without any arguments.
- __eq__(self, other: Any) → bool [source]
The offset should match.
This is only for testing purposes and should not be relied on otherwise.
- property summary(self) → str [source]
A short summary for the timetable.
This is used to display the timetable in the web UI. A cron expression timetable, for example, can use this to display the expression. The default implementation returns the timetable’s type name.
- serialize(self) → Dict[str, Any] [source]
Serialize the timetable for JSON encoding.
This is called during DAG serialization to store timetable information in the database. This should return a JSON-serializable dict that will be fed into deserialize when the DAG is deserialized. The default implementation returns an empty dict.
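The same round trip applies here; a brief sketch with a relativedelta, the case where JSON encoding is non-trivial because relativedelta itself is not JSON-serializable:

```python
import json
from dateutil.relativedelta import relativedelta
from airflow.timetables.interval import DeltaDataIntervalTimetable

timetable = DeltaDataIntervalTimetable(relativedelta(months=1))

data = timetable.serialize()
json.dumps(data)  # the returned dict must still be JSON-encodable

restored = DeltaDataIntervalTimetable.deserialize(data)
assert restored == timetable  # expected to hold; __eq__ compares the offset (testing only)
```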
- validate(self) → None [source]
Validate the timetable is correctly specified.
Override this method to provide run-time validation; errors are raised when a DAG is put into a dagbag. The default implementation does nothing.
- Raises: AirflowTimetableInvalid on validation failure.
- infer_manual_data_interval(self, *, run_after: pendulum.DateTime) → airflow.timetables.base.DataInterval [source]
When a DAG run is manually triggered, infer a data interval for it.
This is used for e.g. manually-triggered runs, where run_after would be when the user triggers the run. The default implementation raises NotImplementedError.
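And a parallel sketch for the delta timetable; the expected shape of the result is stated as an assumption in a comment rather than asserted:

```python
import pendulum
from datetime import timedelta
from airflow.timetables.interval import DeltaDataIntervalTimetable

timetable = DeltaDataIntervalTimetable(timedelta(days=1))

run_after = pendulum.datetime(2021, 10, 5, 14, 30, tz="UTC")
interval = timetable.infer_manual_data_interval(run_after=run_after)

# Presumably the interval ends at run_after and spans one delta back from it
# (an assumption based on the delta semantics, not asserted here).
print(interval.start, interval.end)
```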