airflow.operators.latest_only¶
Contains an operator to run downstream tasks only for the latest scheduled DagRun.
Module Contents¶
Classes¶
LatestOnlyOperator | Skip tasks that are not running during the most recent schedule interval.
- class airflow.operators.latest_only.LatestOnlyOperator(task_id, owner=DEFAULT_OWNER, email=None, email_on_retry=conf.getboolean('email', 'default_email_on_retry', fallback=True), email_on_failure=conf.getboolean('email', 'default_email_on_failure', fallback=True), retries=DEFAULT_RETRIES, retry_delay=DEFAULT_RETRY_DELAY, retry_exponential_backoff=False, max_retry_delay=None, start_date=None, end_date=None, depends_on_past=False, ignore_first_depends_on_past=DEFAULT_IGNORE_FIRST_DEPENDS_ON_PAST, wait_for_past_depends_before_skipping=DEFAULT_WAIT_FOR_PAST_DEPENDS_BEFORE_SKIPPING, wait_for_downstream=False, dag=None, params=None, default_args=None, priority_weight=DEFAULT_PRIORITY_WEIGHT, weight_rule=DEFAULT_WEIGHT_RULE, queue=DEFAULT_QUEUE, pool=None, pool_slots=DEFAULT_POOL_SLOTS, sla=None, execution_timeout=DEFAULT_TASK_EXECUTION_TIMEOUT, on_execute_callback=None, on_failure_callback=None, on_success_callback=None, on_retry_callback=None, pre_execute=None, post_execute=None, trigger_rule=DEFAULT_TRIGGER_RULE, resources=None, run_as_user=None, task_concurrency=None, max_active_tis_per_dag=None, max_active_tis_per_dagrun=None, executor_config=None, do_xcom_push=True, inlets=None, outlets=None, task_group=None, doc=None, doc_md=None, doc_json=None, doc_yaml=None, doc_rst=None, **kwargs)[source]¶
Bases: airflow.operators.branch.BaseBranchOperator
Skip tasks that are not running during the most recent schedule interval.
If the task is run outside the latest schedule interval (for example, during a backfill or catch-up run), all directly downstream tasks will be skipped. Note that downstream tasks are never skipped if the given DagRun is marked as externally triggered; see the usage sketch below.
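A minimal usage sketch, not taken from this page: the DAG id, task ids, and schedule are illustrative. Placing a LatestOnlyOperator upstream of a task causes that task to be skipped for every scheduled DagRun except the most recent one, while manually triggered runs always proceed.

```python
import pendulum

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.latest_only import LatestOnlyOperator

with DAG(
    dag_id="latest_only_example",  # illustrative DAG id
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=True,
) as dag:
    latest_only = LatestOnlyOperator(task_id="latest_only")

    # Runs for every DagRun, since it sits upstream of latest_only.
    extract = EmptyOperator(task_id="extract")

    # Skipped for all but the most recent schedule interval; manually
    # triggered runs are never skipped.
    notify = EmptyOperator(task_id="notify")

    extract >> latest_only >> notify
```

With catchup enabled, backfilled runs execute `extract` but skip `notify`; only the latest scheduled run (and any manually triggered run) executes the full chain.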
- choose_branch(context)[source]¶
Abstract method to choose which branch to run.
Subclasses should implement this, running whatever logic is necessary to choose a branch and returning a task_id or list of task_ids (see the sketch after the parameter list below).
- Parameters
context (airflow.utils.context.Context) – Context dictionary as passed to execute()
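For context on the contract that LatestOnlyOperator fulfils, here is a hypothetical BaseBranchOperator subclass; the class name, task ids, and the weekday rule are invented for illustration, not part of Airflow.

```python
from airflow.operators.branch import BaseBranchOperator


class WeekdayBranchOperator(BaseBranchOperator):
    """Illustrative only: follow a different downstream task on weekends."""

    def choose_branch(self, context):
        # `logical_date` comes from the template context passed to execute().
        # The task ids "weekday_task" and "weekend_task" are hypothetical.
        if context["logical_date"].weekday() < 5:
            return "weekday_task"
        return ["weekend_task"]  # returning a list of task_ids is also allowed
```

Any directly downstream task whose task_id is not returned from choose_branch is skipped, which is exactly how LatestOnlyOperator skips downstream work for non-latest DagRuns.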