This module provides a way to nest your DAGs and so your levels of complexity.

Module Contents

class airflow.operators.subdag.SkippedStatePropagationOptions[source]

Bases: enum.Enum

Available options for propagating the skipped state of a subdag's tasks to the parent DAG's tasks.

ALL_LEAVES = all_leaves[source]
ANY_LEAF = any_leaf[source]
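The two options differ in how the leaf-task states are combined: ALL_LEAVES propagates the skip only when every leaf task of the subdag was skipped, while ANY_LEAF propagates it as soon as at least one leaf was skipped. A minimal plain-Python sketch of these semantics (the `should_skip_downstream` helper and the string states are illustrative, not part of the Airflow API):

```python
from enum import Enum


class SkippedStatePropagationOptions(Enum):
    """Illustrative re-declaration of the two propagation modes."""
    ALL_LEAVES = "all_leaves"
    ANY_LEAF = "any_leaf"


def should_skip_downstream(leaf_states, option):
    """Return True if the parent DAG's downstream tasks should be skipped,
    given the terminal states of the subdag's leaf tasks."""
    skipped = [state == "skipped" for state in leaf_states]
    if option is SkippedStatePropagationOptions.ALL_LEAVES:
        return all(skipped)  # every leaf must have been skipped
    return any(skipped)      # ANY_LEAF: a single skipped leaf suffices
```

For example, with leaf states `["skipped", "success"]`, ANY_LEAF propagates the skip while ALL_LEAVES does not.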
class airflow.operators.subdag.SubDagOperator(*, subdag: DAG, session: Optional[Session] = None, conf: Optional[Dict] = None, propagate_skipped_state: Optional[SkippedStatePropagationOptions] = None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

This runs a sub DAG. By convention, a sub DAG's dag_id should be prefixed by the dag_id of its parent DAG and a dot, as in parent.child. Although SubDagOperator can occupy a pool/concurrency slot, users can specify mode=reschedule so that the slot is released periodically, avoiding a potential deadlock.

  • subdag -- the DAG object to run as a subdag of the current DAG.

  • session -- sqlalchemy session

  • conf (dict) -- Configuration for the subdag

  • propagate_skipped_state -- defines whether, and under which condition, the skipped state of the subdag's leaf task(s) should be propagated to the parent DAG's downstream tasks.
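The parent.child naming convention described above is checked when the operator is constructed: the subdag's dag_id must equal the parent's dag_id joined with the task_id by a dot. A minimal sketch of that check in plain Python (`validate_subdag_dag_id` is a hypothetical helper shown for illustration, not an Airflow function):

```python
def validate_subdag_dag_id(parent_dag_id, task_id, subdag_dag_id):
    """Raise ValueError unless the subdag follows the parent.child convention."""
    expected = f"{parent_dag_id}.{task_id}"
    if subdag_dag_id != expected:
        raise ValueError(
            f"The subdag's dag_id should be {expected!r} but is {subdag_dag_id!r}."
        )


# A subdag named "parent.child" attached to task "child" of DAG "parent" passes:
validate_subdag_dag_id("parent", "child", "parent.child")
```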

ui_color = #555[source]
ui_fgcolor = #fff[source]
_validate_dag(self, kwargs)[source]
_validate_pool(self, session)[source]
_get_dagrun(self, execution_date)[source]
_reset_dag_run_and_task_instances(self, dag_run, execution_date)[source]

Set the DagRun state to RUNNING and reset failed TaskInstances to the None state so the scheduler picks them up.

  • dag_run -- DAG run

  • execution_date -- execution date

Returns: None
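A schematic of what this reset does, with plain dictionaries standing in for the DagRun and TaskInstance models (the function name, dictionary shape, and string states are illustrative, not Airflow's ORM API):

```python
def reset_dag_run_and_task_instances(dag_run, task_instances):
    """Set the run back to RUNNING and clear failed task instances so a
    scheduler-like loop would pick them up again."""
    dag_run["state"] = "running"
    for ti in task_instances:
        if ti["state"] == "failed":
            ti["state"] = None  # a None state means "not yet scheduled"
    return None


run = {"state": "failed"}
tis = [
    {"task_id": "a", "state": "failed"},
    {"task_id": "b", "state": "success"},
]
reset_dag_run_and_task_instances(run, tis)
# run is now RUNNING; only the failed task instance was cleared
```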

pre_execute(self, context)[source]
poke(self, context)[source]
post_execute(self, context, result=None)[source]
_check_skipped_states(self, context)[source]
_get_leaves_tis(self, execution_date)[source]
_skip_downstream_tasks(self, context)[source]
