Quick Start¶
This quick start guide will help you bootstrap an Airflow standalone instance on your local machine.
Note
Successful installation requires a Python 3 environment. Starting with Airflow 2.7.0, Airflow supports Python 3.8, 3.9, 3.10, 3.11 and 3.12.
Only pip installation is currently officially supported.
While there have been successes with using other tools like poetry or pip-tools, they do not share the same workflow as pip, especially when it comes to constraint vs. requirements management. Installing via Poetry or pip-tools is not currently supported.
There are known issues with bazel that might lead to circular dependencies when using it to install Airflow. Please switch to pip if you encounter such problems. The Bazel community is working on fixing the problem in this PR, so newer versions of bazel might handle it.
If you wish to install Airflow using those tools, you should use the constraint files and convert them to the appropriate format and workflow that your tool requires.
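For example, one rough sketch of such a conversion (the curl/grep pipeline and the package names filtered for are illustrative assumptions, not an official workflow):
AIRFLOW_VERSION=2.9.2
PYTHON_VERSION=3.8
# download the published constraint file for this Airflow/Python combination
curl -sSL "https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt" -o constraints.txt
# keep only the pins for packages your tool manages (illustrative filter)
grep -iE '^(apache-airflow|pendulum|sqlalchemy)==' constraints.txt > requirements.txt
The resulting requirements.txt can then be fed to whatever lock or sync step your tool provides.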
The installation of Airflow is straightforward if you follow the instructions below. Airflow uses constraint files to enable reproducible installation, so using pip and constraint files is recommended.
Set Airflow Home (optional):
Airflow requires a home directory, and uses ~/airflow by default, but you can set a different location if you prefer. The AIRFLOW_HOME environment variable is used to inform Airflow of the desired location. This step of setting the environment variable should be done before installing Airflow so that the installation process knows where to store the necessary files.
export AIRFLOW_HOME=~/airflow
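If you want the setting to persist across shell sessions, you can append it to your shell startup file (a minimal sketch, assuming bash):
# make AIRFLOW_HOME permanent for future bash sessions
echo 'export AIRFLOW_HOME=~/airflow' >> ~/.bashrc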
Install Airflow using the constraints file, which is determined based on the URL we pass:
AIRFLOW_VERSION=2.9.2

# Extract the version of Python you have installed. If you're currently using
# a Python version that is not supported by Airflow, you may want to set this
# manually.
# See above for supported versions.
PYTHON_VERSION="$(python -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")')"

CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
# For example this would install 2.9.2 with python 3.8:
# https://raw.githubusercontent.com/apache/airflow/constraints-2.9.2/constraints-3.8.txt

pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
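You can then confirm that the expected version was installed:
# prints the installed version, e.g. 2.9.2
airflow version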
Run Airflow Standalone:
The airflow standalone command initializes the database, creates a user, and starts all components.
airflow standalone
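The admin credentials are printed in the terminal output; in recent 2.x releases the generated password is also written to a file under $AIRFLOW_HOME (the filename below is based on that behavior and may differ between versions):
cat "$AIRFLOW_HOME/standalone_admin_password.txt"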
Access the Airflow UI:
Visit localhost:8080 in your browser and log in with the admin account details shown in the terminal. Enable the example_bash_operator DAG on the home page.
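If you prefer the command line, you can enable the same DAG by unpausing it:
airflow dags unpause example_bash_operator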
Upon running these commands, Airflow will create the $AIRFLOW_HOME folder and create the “airflow.cfg” file with defaults that will get you going fast. You can override defaults using environment variables; see Configuration Reference.
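Configuration environment variables follow the AIRFLOW__{SECTION}__{KEY} pattern; for example, to stop loading the example DAGs:
export AIRFLOW__CORE__LOAD_EXAMPLES=False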
You can inspect the file either in $AIRFLOW_HOME/airflow.cfg, or through the UI in the Admin->Configuration menu. The PID file for the webserver will be stored in $AIRFLOW_HOME/airflow-webserver.pid, or in /run/airflow/webserver.pid if started by systemd.
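That PID file can be used to stop a manually started webserver, for example (a sketch assuming the default, non-systemd location):
kill "$(cat "$AIRFLOW_HOME/airflow-webserver.pid")"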
Out of the box, Airflow uses a SQLite database, which you should outgrow fairly quickly since no parallelization is possible using this database backend. It works in conjunction with the SequentialExecutor, which will only run task instances sequentially. While this is very limiting, it allows you to get up and running quickly and take a tour of the UI and the command line utilities.
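You can check which executor and database backend are currently in effect (the database config section assumed below exists in Airflow 2.3+; older releases kept sql_alchemy_conn under core):
airflow config get-value core executor
airflow config get-value database sql_alchemy_conn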
As you grow and deploy Airflow to production, you will also want to move away from the standalone command we use here to running the components separately. You can read more in Production Deployment.
Here are a few commands that will trigger a few task instances. You should be able to see the status of the jobs change in the example_bash_operator DAG as you run the commands below.
# run your first task instance
airflow tasks test example_bash_operator runme_0 2015-01-01
# run a backfill over 2 days
airflow dags backfill example_bash_operator \
--start-date 2015-01-01 \
--end-date 2015-01-02
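Afterwards, you can list the runs the backfill created (a small usage sketch; list-runs and its -d/--dag-id flag are part of the standard 2.x CLI):
airflow dags list-runs -d example_bash_operator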
If you want to run the individual parts of Airflow manually rather than using the all-in-one standalone command, you can instead run:
airflow db migrate
airflow users create \
--username admin \
--firstname Peter \
--lastname Parker \
--role Admin \
--email spiderman@superhero.org
airflow webserver --port 8080
airflow scheduler
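The webserver and scheduler each occupy a terminal, so either run them in separate terminals or daemonize them (a sketch using the --daemon flag both commands accept; PID files then land under $AIRFLOW_HOME):
airflow webserver --port 8080 --daemon
airflow scheduler --daemon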
What’s Next?¶
From this point, you can head to the Tutorials section for further examples or the How-to Guides section if you’re ready to get your hands dirty.