Use the PythonOperator to execute Python callables.


@task(task_id="print_the_context")
def print_context(ds=None, **kwargs):
    """Print the Airflow context and ds variable from the context."""
    print(kwargs)
    print(ds)
    return "Whatever you return gets printed in the logs"

run_this = print_context()

Passing in arguments

Use the op_args and op_kwargs arguments to pass additional arguments to the Python callable.


import time

# Generate 5 sleeping tasks, sleeping from 0.0 to 0.4 seconds respectively
for i in range(5):

    @task(task_id=f"sleep_for_{i}")
    def my_sleeping_function(random_base):
        """This is a function that will run within the DAG execution"""
        time.sleep(random_base)

    sleeping_task = my_sleeping_function(random_base=float(i) / 10)

    run_this >> sleeping_task


Airflow passes in an additional set of keyword arguments: one for each of the Jinja template variables and a templates_dict argument.

The templates_dict argument is templated, so each value in the dictionary is evaluated as a Jinja template.
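As a sketch of what this means for the callable (plain Python, no Airflow required; the function and key names here are illustrative, not from the page above): by the time the callable runs, Airflow has already rendered each Jinja expression in templates_dict, so the function receives plain concrete values.

```python
# Minimal sketch: simulate how a rendered templates_dict reaches
# the Python callable as a keyword argument.

def report(templates_dict=None, **kwargs):
    """Hypothetical callable: reads values Airflow has already rendered."""
    # A value defined as "{{ ds }}" in the DAG file arrives here
    # as the concrete rendered string.
    return f"report for {templates_dict['date']}"

# Airflow would render {"date": "{{ ds }}"} before invoking the
# callable; here we pass the rendered result by hand.
print(report(templates_dict={"date": "2024-01-01"}))
```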


Use the PythonVirtualenvOperator to execute Python callables inside a new Python virtual environment.


    @task.virtualenv(
        task_id="virtualenv_python", requirements=["colorama==0.4.0"], system_site_packages=False
    )
    def callable_virtualenv():
        """
        Example function that will be performed in a virtual environment.

        Importing at the module level ensures that it will not attempt to import the
        library before it is installed.
        """
        from time import sleep

        from colorama import Back, Fore, Style

        print(Fore.RED + 'some red text')
        print(Back.GREEN + 'and with a green background')
        print(Style.DIM + 'and in dim text')
        for _ in range(10):
            print(Style.DIM + 'Please wait...', flush=True)
            sleep(1)

    virtualenv_task = callable_virtualenv()

Passing in arguments

You can use the op_args and op_kwargs arguments the same way you use them in the PythonOperator. Unfortunately, serializing var and ti / task_instance is currently not supported, due to incompatibilities with the underlying library. To access Airflow context variables, either set system_site_packages to True or add apache-airflow to the requirements argument; otherwise, most of the Airflow context variables will not be available in op_kwargs. If you want the context variables that are datetime objects, such as data_interval_start, add pendulum and lazy_object_proxy to the requirements.
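As an illustration of the op_kwargs pass-through described above (plain Python, no Airflow required; the function and argument names are hypothetical): serializable values supplied via op_kwargs arrive as ordinary keyword arguments of the callable, while non-serializable context objects such as ti do not.

```python
# Sketch: op_kwargs supplied to the operator become ordinary keyword
# arguments of the callable, just as with PythonOperator.

def greet(greeting, name="world", **context):
    """Hypothetical callable run inside the virtual environment."""
    # Only serializable values (strings, numbers, ...) make it across
    # the process boundary into the virtualenv.
    return f"{greeting}, {name}!"

# Equivalent of op_kwargs={"greeting": "hello", "name": "venv"}
print(greet(greeting="hello", name="venv"))
```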

If additional parameters for package installation are needed, pass them in requirements.txt as in the example below:

SomePackage==0.2.1 --pre --index-url
AnotherPackage==1.4.3 --no-index --find-links /my/local/archives

All supported options are listed in the requirements file format.


You can use Jinja templating the same way you use it in the PythonOperator.
