Authoring and Scheduling
This section contains detailed documentation about advanced authoring and scheduling of Airflow DAGs. It is recommended that you first review the pages in Core Concepts.
Authoring
- Plugins
- Deferrable Operators & Triggers
- DAG File Processing
- Serialization
- Connections & Hooks
- Dynamic Task Mapping
  - Simple mapping
  - Mapping with non-TaskFlow operators
  - Assigning multiple parameters to a non-TaskFlow operator
  - Mapping over a task group
  - Filtering items from a mapped task
  - Transforming expanding data
  - Combining upstream data (aka “zipping”)
  - Concatenating multiple upstreams
  - What data types can be expanded?
  - How do templated fields and mapped arguments interact?
  - Placing limits on mapped tasks
  - Automatically skipping zero-length maps
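As a taste of what the Dynamic Task Mapping pages cover, here is a minimal sketch of a mapped task (assuming Airflow 2.3+ and the TaskFlow API; the DAG and task names are illustrative, not from these docs):

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def mapped_example():
    @task
    def make_list():
        # The length of this list is only known at run time;
        # it determines how many mapped task instances are created.
        return [1, 2, 3]

    @task
    def double(x):
        return x * 2

    # expand() creates one "double" task instance per element
    # returned by make_list() when the DAG actually runs.
    double.expand(x=make_list())


mapped_example()
```

Because the number of mapped instances is decided at run time rather than at DAG-parse time, the upstream task's return value, not the DAG file, controls the fan-out.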
Scheduling
- Cron & Time Intervals
- Time Zones
- Data-aware scheduling
  - Quickstart
  - What is a “dataset”?
  - What is a valid URI?
  - Extra information on datasets
  - How to use datasets in your DAGs
  - Multiple Datasets
  - Attaching extra information to an emitting dataset event
  - Fetching information from previously emitted dataset events
  - Fetching information from a triggering dataset event
  - Manipulating queued dataset events through REST API
  - Advanced dataset scheduling with conditional expressions
  - Example Use
  - Emitting dynamic dataset events and creating datasets through DatasetAlias
  - Combining dataset and time-based schedules
- Timetables
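To give a flavor of the data-aware scheduling pages, here is a minimal producer/consumer sketch (assuming Airflow 2.4+; the dataset URI and DAG names are illustrative, not from these docs):

```python
from datetime import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task

# A dataset is identified by a URI; the URI is an opaque identifier
# to Airflow, so this path is purely illustrative.
example_dataset = Dataset("s3://example-bucket/example.csv")


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def producer():
    # Declaring the dataset as an outlet makes a successful run of
    # this task emit a dataset event.
    @task(outlets=[example_dataset])
    def update_dataset():
        ...  # write the data the dataset URI refers to

    update_dataset()


# Passing datasets as the schedule makes this DAG run whenever the
# producer updates example_dataset, instead of on a time interval.
@dag(schedule=[example_dataset], start_date=datetime(2024, 1, 1), catchup=False)
def consumer():
    @task
    def process_dataset():
        ...  # read and process the updated data

    process_dataset()


producer()
consumer()
```

The consumer declares *what* it depends on rather than *when* to run, which is the core idea the Quickstart and the pages that follow it develop in detail.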