airflow.operators.python¶
Module Contents¶
- 
airflow.operators.python.task(python_callable: Optional[Callable] = None, multiple_outputs: Optional[bool] = None, **kwargs)[source]¶
Deprecated function that calls @task.python and allows users to turn a Python function into an Airflow task. Please use the following instead:

    from airflow.decorators import task

    @task
    def my_task():
        ...

Parameters
- python_callable (python callable) -- A reference to an object that is callable 
- op_kwargs (dict) -- a dictionary of keyword arguments that will get unpacked in your function (templated) 
- op_args (list) -- a list of positional arguments that will get unpacked when calling your callable (templated) 
- multiple_outputs (bool) -- if set, the function's return value will be unrolled into multiple XCom values; a returned dict unrolls into XCom values keyed by the dict's keys (see the sketch below this list). Defaults to False.
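A minimal sketch of multiple_outputs with the TaskFlow API; the task name and returned keys are hypothetical:

    from airflow.decorators import task

    @task(multiple_outputs=True)
    def produce_values():
        # The returned dict is unrolled into separate XCom values keyed by "a" and "b"
        return {"a": 1, "b": 2}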
 
 
- 
class airflow.operators.python.PythonOperator(*, python_callable: Callable, op_args: Optional[List] = None, op_kwargs: Optional[Dict] = None, templates_dict: Optional[Dict] = None, templates_exts: Optional[List[str]] = None, **kwargs)[source]¶
Bases: airflow.models.BaseOperator

Executes a Python callable.

See also
For more information on how to use this operator, take a look at the guide: PythonOperator

Parameters
- python_callable (python callable) -- A reference to an object that is callable 
- op_kwargs (dict (templated)) -- a dictionary of keyword arguments that will get unpacked in your function 
- op_args (list (templated)) -- a list of positional arguments that will get unpacked when calling your callable 
- templates_dict (dict[str]) -- a dictionary where the values are templates that will get templated by the Airflow engine sometime between __init__ and execute and are made available in your callable's context after the template has been applied. (templated)
- templates_exts (list[str]) -- a list of file extensions to resolve while processing templated fields, for example ['.sql', '.hql']
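A minimal usage sketch, assuming a surrounding DAG definition; the task_id, callable, and arguments are illustrative only:

    from airflow.operators.python import PythonOperator

    def greet(name, greeting="Hello"):
        # op_args and op_kwargs are unpacked into this call at execution time
        print(f"{greeting}, {name}!")

    greet_task = PythonOperator(
        task_id="greet",
        python_callable=greet,
        op_args=["world"],
        op_kwargs={"greeting": "Hi"},
    )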
 
 
- 
class airflow.operators.python.BranchPythonOperator[source]¶
Bases: airflow.operators.python.PythonOperator, airflow.models.skipmixin.SkipMixin

Allows a workflow to "branch" or follow a path following the execution of this task.

It derives from the PythonOperator and expects a Python function that returns a single task_id or a list of task_ids to follow. The task_id(s) returned should point to a task directly downstream from {self}. All other "branches" or directly downstream tasks are marked with a state of skipped so that these paths can't move forward. The skipped states are propagated downstream to allow the DAG state to fill up and the DAG run's state to be inferred.
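A minimal sketch, assuming a surrounding DAG with hypothetical downstream task_ids path_a and path_b; the selection logic is illustrative only:

    import random

    from airflow.operators.python import BranchPythonOperator

    def choose_path():
        # Return the task_id of the directly downstream task to follow;
        # the other branch is marked as skipped
        return random.choice(["path_a", "path_b"])

    branch = BranchPythonOperator(
        task_id="branch",
        python_callable=choose_path,
    )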
- 
class airflow.operators.python.ShortCircuitOperator[source]¶
Bases: airflow.operators.python.PythonOperator, airflow.models.skipmixin.SkipMixin

Allows a workflow to continue only if a condition is met. Otherwise, the workflow "short-circuits" and downstream tasks are skipped.

The ShortCircuitOperator is derived from the PythonOperator. It evaluates a condition and short-circuits the workflow if the condition is False. Any downstream tasks are marked with a state of "skipped". If the condition is True, downstream tasks proceed as normal.

The condition is determined by the result of python_callable.
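A minimal sketch, assuming a surrounding DAG; the condition is illustrative only:

    import datetime

    from airflow.operators.python import ShortCircuitOperator

    def is_weekday():
        # A falsy return value short-circuits the workflow and skips downstream tasks
        return datetime.datetime.utcnow().weekday() < 5

    only_on_weekdays = ShortCircuitOperator(
        task_id="only_on_weekdays",
        python_callable=is_weekday,
    )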
- 
class airflow.operators.python.PythonVirtualenvOperator(*, python_callable: Callable, requirements: Optional[Iterable[str]] = None, python_version: Optional[Union[str, int, float]] = None, use_dill: bool = False, system_site_packages: bool = True, op_args: Optional[List] = None, op_kwargs: Optional[Dict] = None, string_args: Optional[Iterable[str]] = None, templates_dict: Optional[Dict] = None, templates_exts: Optional[List[str]] = None, **kwargs)[source]¶
Bases: airflow.operators.python.PythonOperator

Allows one to run a function in a virtualenv that is created and destroyed automatically (with certain caveats).

The function must be defined using def, and not be part of a class. All imports must happen inside the function and no variables outside of the scope may be referenced. A global scope variable named virtualenv_string_args will be available (populated by string_args). In addition, one can pass stuff through op_args and op_kwargs, and one can use a return value. Note that if your virtualenv runs in a different Python major version than Airflow, you cannot use return values, op_args, op_kwargs, or use any macros that are being provided to Airflow through plugins. You can use string_args though.

See also
For more information on how to use this operator, take a look at the guide: PythonVirtualenvOperator

Parameters
- python_callable (function) -- A python function with no references to outside variables, defined with def, which will be run in a virtualenv 
- requirements (list[str]) -- A list of requirements as specified in a pip install command 
- python_version (Optional[Union[str, int, float]]) -- The Python version to run the virtualenv with. Note that both 2 and 2.7 are acceptable forms. 
- use_dill (bool) -- Whether to use dill to serialize the args and result (pickle is default). This allows more complex types but requires you to include dill in your requirements.
- system_site_packages (bool) -- Whether to include system_site_packages in your virtualenv. See virtualenv documentation for more information. 
- op_args (list) -- A list of positional arguments to pass to python_callable. 
- op_kwargs (dict) -- A dict of keyword arguments to pass to python_callable. 
- string_args (list[str]) -- Strings that are present in the global var virtualenv_string_args, available to python_callable at runtime as a list[str]. Note that args are split by newline. 
- templates_dict (dict of str) -- a dictionary where the values are templates that will get templated by the Airflow engine sometime between __init__ and execute and are made available in your callable's context after the template has been applied
- templates_exts (list[str]) -- a list of file extensions to resolve while processing templated fields, for example ['.sql', '.hql']
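A minimal sketch, assuming a surrounding DAG; the requirement and callable are illustrative only, and all imports live inside the function:

    from airflow.operators.python import PythonVirtualenvOperator

    def render_timestamp():
        # Imports must happen inside the function, since it runs in the virtualenv
        import pendulum
        return pendulum.now().to_iso8601_string()

    render = PythonVirtualenvOperator(
        task_id="render_timestamp",
        python_callable=render_timestamp,
        requirements=["pendulum"],
        system_site_packages=False,
    )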
 
 
- 
airflow.operators.python.get_current_context() → Dict[str, Any][source]¶
Obtain the execution context for the currently executing operator without altering the user method's signature. This is the simplest method of retrieving the execution context dictionary.

Old style:

    def my_task(**context):
        ti = context["ti"]

New style:

    from airflow.operators.python import get_current_context

    def my_task():
        context = get_current_context()
        ti = context["ti"]

The current context will only have a value if this method was called after an operator started executing.