airflow.providers.common.ai.decorators.llm_branch
TaskFlow decorator for LLM-driven branching.
The user writes a function that returns the prompt string. The decorator discovers downstream tasks from the DAG topology and asks the LLM to choose which branch(es) to execute using pydantic-ai structured output.
Functions

| `llm_branch_task` | Wrap a function that returns a prompt into an LLM-driven branching task. |
Module Contents
- airflow.providers.common.ai.decorators.llm_branch.llm_branch_task(python_callable=None, **kwargs)
Wrap a function that returns a prompt into an LLM-driven branching task.
The function body constructs the prompt. The decorator discovers downstream tasks from the DAG topology and asks the LLM to choose which branch(es) to execute.
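The selection step described above can be sketched in plain Python, with no Airflow dependency. This is a hypothetical illustration, not the provider's actual implementation: `choose_branch`, `ask_llm`, and the stub LLM are invented names, and the real decorator constrains the model via pydantic-ai structured output rather than post-hoc validation.

```python
# Hypothetical sketch of LLM-driven branch selection. The real decorator
# discovers downstream task IDs from the DAG topology and uses pydantic-ai
# structured output; here a plain callable stands in for the LLM call.

def choose_branch(prompt, downstream_task_ids, ask_llm):
    """Ask the LLM to pick exactly one of the downstream task IDs.

    ask_llm(prompt, options) stands in for a structured-output call
    whose response is constrained to the given options.
    """
    choice = ask_llm(prompt, downstream_task_ids)
    if choice not in downstream_task_ids:
        # Reject hallucinated branch names instead of silently mis-routing.
        raise ValueError(f"LLM chose unknown branch: {choice!r}")
    return choice

# Stub LLM: routes invoice-related tickets to the billing team.
def stub_llm(prompt, options):
    return "billing_team" if "invoice" in prompt else "support_team"

branches = ["billing_team", "support_team"]
print(choose_branch("Route this ticket: invoice overcharge", branches, stub_llm))
# → billing_team
```

In Airflow, the returned value would be the task_id of the branch to follow; all other downstream tasks are skipped, as with any branching operator.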
Usage:
```python
@task.llm_branch(
    llm_conn_id="openai_default",
    system_prompt="Route support tickets to the right team.",
)
def route_ticket(message: str):
    return f"Route this ticket: {message}"
```
With multiple branches:
```python
@task.llm_branch(
    llm_conn_id="openai_default",
    system_prompt="Select all applicable categories.",
    allow_multiple_branches=True,
)
def classify(text: str):
    return f"Classify this text: {text}"
```
- Parameters:
python_callable (collections.abc.Callable | None) – Function to decorate.
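When `allow_multiple_branches=True`, the LLM may select any subset of the downstream tasks rather than exactly one. The validation such a mode implies can be sketched as follows (hypothetical names again; the provider's real logic is internal to the decorator and relies on structured output):

```python
# Hypothetical sketch of the multiple-branch variant: the LLM returns a
# list of task IDs, and anything outside the known downstream set is
# rejected so a hallucinated branch fails fast.

def choose_branches(prompt, downstream_task_ids, ask_llm):
    """Ask the LLM for a subset of the downstream task IDs."""
    choices = ask_llm(prompt, downstream_task_ids)
    unknown = set(choices) - set(downstream_task_ids)
    if unknown:
        raise ValueError(f"LLM chose unknown branches: {sorted(unknown)}")
    return choices

# Stub LLM: selects every category whose keyword appears in the prompt.
def stub_llm(prompt, options):
    return [opt for opt in options if opt.split("_")[0] in prompt]

branches = ["billing_team", "support_team", "legal_team"]
print(choose_branches("Classify this text: billing and legal dispute",
                      branches, stub_llm))
# → ['billing_team', 'legal_team']
```

In Airflow terms, returning a list of task_ids causes every listed branch to execute and the rest to be skipped, matching the behavior of list-returning branch operators.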