Docker Image for Apache Airflow
For the ease of deployment in production, the community releases a production-ready reference container image.
The Apache Airflow community releases Docker images that serve as reference images for Apache Airflow. Every time a new version of Airflow is released, the images are prepared in the apache/airflow DockerHub repository for all the supported Python versions.
You can find the following images there (assuming Airflow version 2.2.3):
apache/airflow:latest - the latest released Airflow image with the default Python version (3.6 currently)
apache/airflow:latest-pythonX.Y - the latest released Airflow image with a specific Python version
apache/airflow:2.2.3 - the versioned Airflow image with the default Python version (3.6 currently)
apache/airflow:2.2.3-pythonX.Y - the versioned Airflow image with a specific Python version
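As an illustration, a tag following the pattern above can be composed from an Airflow version and a Python version like this (the version numbers are just examples, and the docker commands are shown commented out):

```shell
# Compose an image tag from an Airflow version and a Python version.
AIRFLOW_VERSION=2.2.3
PYTHON_VERSION=3.8
IMAGE="apache/airflow:${AIRFLOW_VERSION}-python${PYTHON_VERSION}"
echo "${IMAGE}"   # -> apache/airflow:2.2.3-python3.8

# Then pull the image and check which Airflow version it ships:
# docker pull "${IMAGE}"
# docker run --rm "${IMAGE}" airflow version
```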
Those are “reference” images. They contain the most common set of extras, dependencies and providers that are often used by users, and they are good for “trying things out” when you just want to take Airflow for a spin.
The Apache Airflow image provided as a convenience package is optimized for size: it contains just a minimal set of extras and dependencies, and in most cases you will want to either extend or customize it. You can see all possible extras in Reference for package extras. The set of extras used in the Airflow production image is available in the Dockerfile.
However, Airflow has more than 60 community-managed providers (installable via extras), and some of the default extras/providers are not used by everyone. Sometimes other extras/providers are needed, and sometimes (very often, actually) you need to add your own custom dependencies, packages or even custom providers. You can learn how to do it in Building the image.
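For instance, a minimal sketch of extending the reference image with extra PyPI dependencies could look like the Dockerfile below (the package names are placeholders - see Building the image for the full set of options and caveats):

```dockerfile
# Start from a versioned reference image rather than "latest"
FROM apache/airflow:2.2.3
# Install additional PyPI packages (placeholder package names - use your own)
RUN pip install --no-cache-dir lxml beautifulsoup4
```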
The production images are built in DockerHub from released versions and release candidates. There are also images published from branches, but they are used mainly for development and testing purposes. See Airflow Git Branching for details.
AIRFLOW_HOME is set by default to /opt/airflow/ - this means that DAGs are by default in the /opt/airflow/dags folder and logs are in the /opt/airflow/logs folder.
The working directory is /opt/airflow by default.
If no AIRFLOW__CORE__SQL_ALCHEMY_CONN variable is set, a SQLite database is created in the AIRFLOW_HOME directory.
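As an illustrative sketch, you can point Airflow away from the default SQLite database by passing this variable when starting the container (the Postgres host and credentials below are placeholders):

```shell
# Build a SQLAlchemy connection URI for a hypothetical external Postgres database.
SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow:airflow@postgres-host:5432/airflow"
echo "${SQL_ALCHEMY_CONN}"

# Pass it to the container as AIRFLOW__CORE__SQL_ALCHEMY_CONN:
# docker run --rm -e AIRFLOW__CORE__SQL_ALCHEMY_CONN="${SQL_ALCHEMY_CONN}" \
#     apache/airflow:2.2.3 airflow db check
```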
For example commands that start Airflow see: Executing commands.
Because Airflow is a distributed application, it requires many components to function. You may therefore also be interested in launching Airflow in the Docker Compose environment, see: Quick Start.
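A heavily simplified sketch of what such a Compose file might contain is shown below; the official Quick Start file defines more services (scheduler, init, Redis, etc.) and configuration, and all values here are illustrative:

```yaml
# Minimal illustrative sketch - not the official quick-start Compose file.
version: "3"
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
  webserver:
    image: apache/airflow:2.2.3
    command: webserver
    environment:
      AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
    ports:
      - "8080:8080"
    depends_on:
      - postgres
```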
You can use this image with the Helm Chart as well.