Command Line Interface Reference

Airflow has a very rich command line interface that allows for many types of operations on a DAG, for starting services, and for supporting development and testing.

usage: airflow [-h] GROUP_OR_COMMAND ...

Positional Arguments

GROUP_OR_COMMAND

Possible choices: celery, config, connections, dags, db, kerberos, kubernetes, pools, rotate-fernet-key, sync-perm, tasks, users, variables, backfill, generate_pod_template, serve_logs, scheduler, webserver, version, info

Sub-commands:

celery

Start Celery components. Works only when using the CeleryExecutor. For more information, see https://airflow.apache.org/docs/stable/executor/celery.html

airflow celery [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: flower, worker

Sub-commands:

flower

Start a Celery Flower

airflow celery flower [-h] [-A BASIC_AUTH] [-a BROKER_API] [-D] [-c FLOWER_CONF] [-H HOSTNAME] [-l LOG_FILE] [--pid [PID]] [-p PORT] [--stderr STDERR] [--stdout STDOUT] [-u URL_PREFIX]
Named Arguments
-A, --basic-auth

Securing Flower with Basic Authentication. Accepts user:password pairs separated by a comma. Example: flower_basic_auth = user1:password1,user2:password2

Default: “”

-a, --broker-api

Broker API

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-c, --flower-conf

Configuration file for flower

-H, --hostname

Set the hostname on which to run the server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the server

Default: 5555

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-u, --url-prefix

URL prefix for Flower

Default: “”
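
For example, Flower could be started on a non-default port with basic authentication enabled (the port and credentials shown are placeholders):

airflow celery flower --port 5556 --basic-auth user1:password1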

worker

Start a Celery worker node

airflow celery worker [-h] [-a AUTOSCALE] [-H CELERY_HOSTNAME] [-c CONCURRENCY] [-D] [-l LOG_FILE] [--pid [PID]] [-q QUEUES] [-s] [--stderr STDERR] [--stdout STDOUT]
Named Arguments
-a, --autoscale

Minimum and maximum number of workers to autoscale

-H, --celery-hostname

Set the hostname of the Celery worker if you have multiple workers on a single machine

-c, --concurrency

The number of worker processes

Default: 16

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-l, --log-file

Location of the log file

--pid

PID file location

-q, --queues

Comma delimited list of queues to serve

Default: “default”

-s, --skip-serve-logs

Don’t start the serve logs process along with the workers

Default: False

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file
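
For example, a worker could be started that serves two queues with reduced concurrency (the queue names are placeholders):

airflow celery worker --queues default,etl --concurrency 8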

config

View configuration

airflow config [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: list

Sub-commands:

list

List options for the configuration

airflow config list [-h]

connections

Manage connections

airflow connections [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: add, delete, list

Sub-commands:

add

Add a connection

airflow connections add [-h] [--conn-extra CONN_EXTRA] [--conn-host CONN_HOST] [--conn-login CONN_LOGIN] [--conn-password CONN_PASSWORD] [--conn-port CONN_PORT] [--conn-schema CONN_SCHEMA]
                        [--conn-type CONN_TYPE] [--conn-uri CONN_URI]
                        conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete a connection

Named Arguments
--conn-extra

Connection Extra field, optional when adding a connection

--conn-host

Connection host, optional when adding a connection

--conn-login

Connection login, optional when adding a connection

--conn-password

Connection password, optional when adding a connection

--conn-port

Connection port, optional when adding a connection

--conn-schema

Connection schema, optional when adding a connection

--conn-type

Connection type, required to add a connection without conn_uri

--conn-uri

Connection URI, required to add a connection without conn_type
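
For example, a connection could be added from a single URI of the form conn-type://login:password@host:port/schema (the connection id and credentials shown are placeholders):

airflow connections add my_postgres --conn-uri 'postgres://user:password@localhost:5432/mydb'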

delete

Delete a connection

airflow connections delete [-h] conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete a connection

list

List connections

airflow connections list [-h]

dags

Manage DAGs

airflow dags [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: delete, list, list-runs, next-execution, pause, report, show, state, trigger, unpause

Sub-commands:

delete

Delete all DB records related to the specified DAG

airflow dags delete [-h] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

list

List all the DAGs

airflow dags list [-h] [-S SUBDIR]
Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list-runs

List DAG runs given a DAG id. If the state option is given, it will only search for the dag runs with the given state. If the no_backfill option is given, it will filter out all backfill dag runs for the given dag id. If start_date is given, it will filter out all dag runs executed before this date. If end_date is given, it will filter out all dag runs executed after this date.

airflow dags list-runs [-h] [-d DAG_ID] [--no-backfill] [--state STATE]
Named Arguments
-d, --dag-id

The id of the dag

--no-backfill

Filter out all backfill dag runs for the given dag id

Default: False

--state

Only list the dag runs corresponding to the state
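
For example, to list only the successful, non-backfill runs of a DAG (the dag id is a placeholder):

airflow dags list-runs -d example_dag --state success --no-backfill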

next-execution

Get the next execution datetimes of a DAG. It returns one execution unless the num-executions option is given

airflow dags next-execution [-h] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

pause

Pause a DAG

airflow dags pause [-h] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

report

Show DagBag loading report

airflow dags report [-h] [-S SUBDIR]
Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

show

The --imgcat option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

The --save option saves the result to the indicated file.

The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file, execute the following command: airflow dags show <DAG_ID> --save output.png

If you want to create a DOT file, execute the following command: airflow dags show <DAG_ID> --save output.dot

airflow dags show [-h] [--imgcat] [-s SAVE] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
--imgcat

Displays graph using the imgcat tool.

Default: False

-s, --save

Saves the result to the indicated file.

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

state

Get the status of a dag run

airflow dags state [-h] [-S SUBDIR] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

trigger

Trigger a DAG run

airflow dags trigger [-h] [-c CONF] [-e EXEC_DATE] [-r RUN_ID] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

-e, --exec-date

The execution date of the DAG

-r, --run-id

Helps to identify this run

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
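
For example, a run could be triggered with a custom run id and a JSON conf payload (the dag id, run id, and payload are placeholders):

airflow dags trigger example_dag --run-id manual_2020_01_01 --conf '{"param": "value"}'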

unpause

Resume a paused DAG

airflow dags unpause [-h] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

db

Database operations

airflow db [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: check, init, reset, shell, upgrade

Sub-commands:

check

Check if the database can be reached

airflow db check [-h]

init

Initialize the metadata database

airflow db init [-h]

reset

Burn down and rebuild the metadata database

airflow db reset [-h] [-y]
Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

shell

Runs a shell to access the database

airflow db shell [-h]

upgrade

Upgrade the metadata database to latest version

airflow db upgrade [-h]

kerberos

Start a kerberos ticket renewer

airflow kerberos [-h] [-D] [-k [KEYTAB]] [-l LOG_FILE] [--pid [PID]] [--stderr STDERR] [--stdout STDOUT] [principal]

Positional Arguments

principal

kerberos principal

Named Arguments

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-k, --keytab

keytab

Default: “airflow.keytab”

-l, --log-file

Location of the log file

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

kubernetes

Tools to help run the KubernetesExecutor

airflow kubernetes [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: cleanup-pods, generate-dag-yaml

Sub-commands:

cleanup-pods

Clean up Kubernetes pods in evicted/failed/succeeded states

airflow kubernetes cleanup-pods [-h] [--namespace NAMESPACE]
Named Arguments
--namespace

Kubernetes Namespace

Default: “default”

generate-dag-yaml

Generate YAML files for all tasks in a DAG. Useful for debugging tasks without launching them into a cluster

airflow kubernetes generate-dag-yaml [-h] [-o OUTPUT_PATH] [-S SUBDIR] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
-o, --output-path

The output path for generated YAML files

Default: “[CWD]”

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
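
For example, pod YAML for a DAG could be written to a local directory (the dag id, execution date, and path are placeholders):

airflow kubernetes generate-dag-yaml example_dag 2020-01-01 --output-path /tmp/airflow-yaml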

pools

Manage pools

airflow pools [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete pool

airflow pools delete [-h] NAME
Positional Arguments
NAME

Pool name

export

Export all pools

airflow pools export [-h] FILEPATH
Positional Arguments
FILEPATH

Export all pools to JSON file

get

Get pool size

airflow pools get [-h] NAME
Positional Arguments
NAME

Pool name

import

Import pools

airflow pools import [-h] FILEPATH
Positional Arguments
FILEPATH

Import pools from JSON file

list

List pools

airflow pools list [-h]

set

Configure pool

airflow pools set [-h] NAME slots description
Positional Arguments
NAME

Pool name

slots

Pool slots

description

Pool description
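
For example, a pool with 16 slots could be created or updated as follows (the pool name and description are placeholders):

airflow pools set etl_pool 16 "Slots reserved for ETL tasks"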

rotate-fernet-key

Rotate all encrypted connection credentials and variables; see https://airflow.apache.org/docs/stable/howto/secure-connections.html#rotating-encryption-keys

airflow rotate-fernet-key [-h]

sync-perm

Update permissions for existing roles and DAGs

airflow sync-perm [-h]

tasks

Manage tasks

airflow tasks [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: clear, failed-deps, list, render, run, state, test

Sub-commands:

clear

Clear a set of task instances, as if they never ran

airflow tasks clear [-h] [-R] [-d] [-e END_DATE] [-X] [-x] [-c] [-f] [-r] [-s START_DATE] [-S SUBDIR] [-t TASK_REGEX] [-u] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-R, --dag-regex

Search dag_id as regex instead of exact string

Default: False

-d, --downstream

Include downstream tasks

Default: False

-e, --end-date

Override end_date YYYY-MM-DD

-X, --exclude-parentdag

Exclude ParentDAGs if the task cleared is part of a SubDAG

Default: False

-x, --exclude-subdags

Exclude subdags

Default: False

-c, --no_confirm

Do not request confirmation. Use with care!

Default: False

-f, --only-failed

Only failed jobs

Default: False

-r, --only-running

Only running jobs

Default: False

-s, --start-date

Override start_date YYYY-MM-DD

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task-regex

The regex to filter specific task_ids to backfill (optional)

-u, --upstream

Include upstream tasks

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False
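
For example, only the failed task instances matching a task regex could be cleared for a date range (the dag id, regex, and dates are placeholders):

airflow tasks clear example_dag -t "extract_.*" -s 2020-01-01 -e 2020-01-31 --only-failed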

failed-deps

Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn’t get scheduled and then queued by the scheduler, and then run by an executor.

airflow tasks failed-deps [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list

List the tasks within a DAG

airflow tasks list [-h] [-S SUBDIR] [-t] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --tree

Tree view

Default: False

render

Render a task instance’s template(s)

airflow tasks render [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

run

Run a single task instance

airflow tasks run [-h] [--cfg-path CFG_PATH] [-f] [-A] [-i] [-I] [-N] [-l] [-m] [-p PICKLE] [--pool POOL] [--ship-dag] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
--cfg-path

Path to config file to use instead of airflow.cfg

-f, --force

Ignore previous task instance state, and rerun the task regardless of whether it already succeeded or failed

Default: False

-A, --ignore-all-dependencies

Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps

Default: False

-i, --ignore-dependencies

Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies

Default: False

-I, --ignore-depends-on-past

Ignore depends_on_past dependencies (but respect upstream dependencies)

Default: False

-N, --interactive

Do not capture standard output and error streams (useful for interactive debugging)

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-m, --mark-success

Mark jobs as succeeded without running them

Default: False

-p, --pickle

Serialized pickle object of the entire dag (used internally)

--pool

Resource pool to use

--ship-dag

Pickles (serializes) the DAG and ships it to the worker

Default: False

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
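
For example, a single task instance could be run locally for a given execution date (the dag id, task id, and date are placeholders):

airflow tasks run example_dag example_task 2020-01-01 --local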

state

Get the status of a task instance

airflow tasks state [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

test

Test a task instance. This will run a task without checking for dependencies or recording its state in the database

airflow tasks test [-h] [-n] [-m] [-S SUBDIR] [-t TASK_PARAMS] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-n, --dry-run

Perform a dry run for each task. Only renders Template Fields for each task, nothing else

Default: False

-m, --post-mortem

Open debugger on uncaught exception

Default: False

-S, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task-params

Sends a JSON params dict to the task
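
For example, a task could be tested for a given execution date with extra params, without recording state in the database (the dag id, task id, date, and params are placeholders):

airflow tasks test example_dag example_task 2020-01-01 --task-params '{"limit": 10}'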

users

Manage users

airflow users [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: create, delete, list

Sub-commands:

create

Create a user

airflow users create [-h] -e EMAIL -f FIRSTNAME -l LASTNAME [-p PASSWORD] -r ROLE [--use-random-password] -u USERNAME
Named Arguments
-e, --email

Email of the user

-f, --firstname

First name of the user

-l, --lastname

Last name of the user

-p, --password

Password of the user, required to create a user without --use-random-password

-r, --role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

--use-random-password

Do not prompt for password. Use random string instead. Required to create a user without --password

Default: False

-u, --username

Username of the user

Example: To create a user with the "Admin" role and the username "admin", run:

airflow users create --username admin --firstname FIRST_NAME --lastname LAST_NAME --role Admin --email admin@example.org

delete

Delete a user

airflow users delete [-h] -u USERNAME
Named Arguments
-u, --username

Username of the user

list

List users

airflow users list [-h]

variables

Manage variables

airflow variables [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete variable

airflow variables delete [-h] key
Positional Arguments
key

Variable key

export

Export all variables

airflow variables export [-h] file
Positional Arguments
file

Export all variables to JSON file

get

Get variable

airflow variables get [-h] [-d VAL] [-j] key
Positional Arguments
key

Variable key

Named Arguments
-d, --default

Default value returned if variable does not exist

-j, --json

Deserialize JSON variable

Default: False

import

Import variables

airflow variables import [-h] file
Positional Arguments
file

Import variables from JSON file

list

List variables

airflow variables list [-h]

set

Set variable

airflow variables set [-h] key VALUE
Positional Arguments
key

Variable key

VALUE

Variable value
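
For example, a variable holding a JSON value could be set and then read back deserialized (the key and value are placeholders):

airflow variables set my_config '{"env": "prod"}'
airflow variables get my_config --json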

backfill

Run subsections of a DAG for a specified date range. If the reset_dag_run option is used, backfill will first prompt the user whether Airflow should clear all the previous dag_run and task_instance records within the backfill date range. If rerun_failed_tasks is used, backfill will automatically re-run the previously failed task instances within the backfill date range.

airflow backfill [-h] [-t TASK_REGEX] [-s START_DATE] [-e END_DATE] [-m] [-l] [-x] [-y] [-i] [-I] [-sd SUBDIR] [--pool POOL] [--delay_on_limit DELAY_ON_LIMIT] [-dr] [-v] [-c CONF] [--reset_dagruns]
                 [--rerun_failed_tasks] [-B]
                 dag_id

Positional Arguments

dag_id

The id of the dag

Named Arguments

-t, --task_regex

The regex to filter specific task_ids to backfill (optional)

-s, --start_date

Override start_date YYYY-MM-DD

-e, --end_date

Override end_date YYYY-MM-DD

-m, --mark_success

Mark jobs as succeeded without running them

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-x, --donot_pickle

Do not attempt to pickle the DAG object to send over to the workers, just tell the workers to run their version of the code.

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

-i, --ignore_dependencies

Skip upstream tasks, run only the tasks matching the regexp. Only works in conjunction with task_regex

Default: False

-I, --ignore_first_depends_on_past

Ignores depends_on_past dependencies for the first set of tasks only (subsequent executions in the backfill DO respect depends_on_past).

Default: False

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

--pool

Resource pool to use

--delay_on_limit

Amount of time in seconds to wait when the limit on maximum active dag runs (max_active_runs) has been reached before trying to execute a dag run again.

Default: 1.0

-dr, --dry_run

Perform a dry run for each task. Only renders Template Fields for each task, nothing else

Default: False

-v, --verbose

Make logging output more verbose

Default: False

-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

--reset_dagruns

If set, the backfill will delete existing backfill-related DAG runs and start anew with fresh, running DAG runs

Default: False

--rerun_failed_tasks

If set, the backfill will auto-rerun all the failed tasks for the backfill date range instead of throwing exceptions

Default: False

-B, --run_backwards

If set, the backfill will run tasks from the most recent day first. If there are tasks that depend_on_past, this option will throw an exception

Default: False
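
For example, one week of a DAG could be backfilled while automatically re-running previously failed task instances (the dag id and dates are placeholders):

airflow backfill example_dag -s 2020-01-01 -e 2020-01-07 --rerun_failed_tasks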

generate_pod_template

Reads your airflow.cfg and migrates your configuration into an airflow_template.yaml file. From this point, a user can link this file to Airflow using the pod_template_file argument and modify it using the Kubernetes API

airflow generate_pod_template [-h] [-o OUTPUT_PATH]

Named Arguments

-o, --output-path

Output path for the generated YAML file

Default: “[CWD]”

serve_logs

Serve logs generated by workers

airflow serve_logs [-h]

scheduler

Start a scheduler instance

airflow scheduler [-h] [-d DAG_ID] [-sd SUBDIR] [-r RUN_DURATION] [-n NUM_RUNS] [-p] [--pid [PID]] [-D] [--stdout STDOUT] [--stderr STDERR] [-l LOG_FILE]

Named Arguments

-d, --dag_id

The id of the dag to run

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’ where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-r, --run-duration

Set the number of seconds to execute before exiting

-n, --num_runs

Set the number of runs to execute before exiting

Default: -1

-p, --do_pickle

Attempt to pickle the DAG object to send over to the workers, instead of letting workers run their version of the code.

Default: False

--pid

PID file location

-D, --daemon

Daemonize instead of running in the foreground

Default: False

--stdout

Redirect stdout to this file

--stderr

Redirect stderr to this file

-l, --log-file

Location of the log file
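
For example, the scheduler could be started as a daemon that exits after a fixed number of scheduling runs (the run count and PID file path are placeholders):

airflow scheduler -D --num_runs 100 --pid /run/airflow/scheduler.pid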

webserver

Start an Airflow webserver instance

airflow webserver [-h] [-p PORT] [-w WORKERS] [-k {sync,eventlet,gevent,tornado}] [-t WORKER_TIMEOUT] [-hn HOSTNAME] [--pid [PID]] [-D] [--stdout STDOUT] [--stderr STDERR] [-A ACCESS_LOGFILE]
                  [-E ERROR_LOGFILE] [-l LOG_FILE] [--ssl_cert SSL_CERT] [--ssl_key SSL_KEY] [-d]

Named Arguments

-p, --port

The port on which to run the server

Default: 8080

-w, --workers

Number of workers to run the webserver on

Default: 4

-k, --workerclass

Possible choices: sync, eventlet, gevent, tornado

The worker class to use for Gunicorn

Default: “sync”

-t, --worker_timeout

The timeout for waiting on webserver workers

Default: 120

-hn, --hostname

Set the hostname on which to run the web server

Default: “0.0.0.0”

--pid

PID file location

-D, --daemon

Daemonize instead of running in the foreground

Default: False

--stdout

Redirect stdout to this file

--stderr

Redirect stderr to this file

-A, --access_logfile

The logfile to store the webserver access log. Use ‘-‘ to print to stderr.

Default: “-“

-E, --error_logfile

The logfile to store the webserver error log. Use ‘-‘ to print to stderr.

Default: “-“

-l, --log-file

Location of the log file

--ssl_cert

Path to the SSL certificate for the webserver

Default: “”

--ssl_key

Path to the key to use with the SSL certificate

Default: “”

-d, --debug

Use the server that ships with Flask in debug mode

Default: False
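
For example, the webserver could be started on the default port with four Gunicorn workers and daemonized (the values shown are placeholders):

airflow webserver --port 8080 --workers 4 -D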

version

Show the version

airflow version [-h]

info

Show information about current Airflow and environment

airflow info [-h] [--anonymize] [--file-io]

Named Arguments

--anonymize

Minimize any personally identifiable information. Use it when sharing output with others.

Default: False

--file-io

Send output to the file.io service and return a link.

Default: False
