Command Line Interface and Environment Variables Reference¶
CLI¶
airflowctl has a rich command line interface that supports many types of operations on Dags, starting services, and aiding development and testing.
Note
For more information on CLI usage, see Command Line Interface and Environment Variables Reference.
Usage: airflowctl [-h] GROUP_OR_COMMAND ...
Positional Arguments¶
- GROUP_OR_COMMAND
Possible choices: assets, auth, backfill, config, connections, dagrun, dags, jobs, pools, providers, variables, version
Sub-commands¶
assets¶
Perform Assets operations
airflowctl assets [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: create-event, delete-dag-queued-events, delete-queued-event, delete-queued-events, get, get-by-alias, get-dag-queued-event, get-dag-queued-events, get-queued-events, list, list-by-alias, materialize
Sub-commands¶
create-event¶
Perform create_event operation
airflowctl assets create-event [-h] [--asset-id ASSET_ID] [-e ENV] [--extra EXTRA] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for asset_event_body operation
- -e, --env
The environment to run the command in
Default:
'production'
- --extra
extra for asset_event_body operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
delete-dag-queued-events¶
Perform delete_dag_queued_events operation
airflowctl assets delete-dag-queued-events [-h] [--before BEFORE] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --before
before for delete_dag_queued_events operation in AssetsOperations
- --dag-id
dag_id for delete_dag_queued_events operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
delete-queued-event¶
Perform delete_queued_event operation
airflowctl assets delete-queued-event [-h] [--asset-id ASSET_ID] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for delete_queued_event operation in AssetsOperations
- --dag-id
dag_id for delete_queued_event operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
delete-queued-events¶
Perform delete_queued_events operation
airflowctl assets delete-queued-events [-h] [--asset-id ASSET_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for delete_queued_events operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get¶
Perform get operation
airflowctl assets get [-h] [--asset-id ASSET_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for get operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-by-alias¶
Perform get_by_alias operation
airflowctl assets get-by-alias [-h] [--alias ALIAS] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --alias
alias for get_by_alias operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-dag-queued-event¶
Perform get_dag_queued_event operation
airflowctl assets get-dag-queued-event [-h] [--asset-id ASSET_ID] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for get_dag_queued_event operation in AssetsOperations
- --dag-id
dag_id for get_dag_queued_event operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-dag-queued-events¶
Perform get_dag_queued_events operation
airflowctl assets get-dag-queued-events [-h] [--before BEFORE] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --before
before for get_dag_queued_events operation in AssetsOperations
- --dag-id
dag_id for get_dag_queued_events operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-queued-events¶
Perform get_queued_events operation
airflowctl assets get-queued-events [-h] [--asset-id ASSET_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --asset-id
asset_id for get_queued_events operation in AssetsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list¶
Perform list operation
airflowctl assets list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list-by-alias¶
Perform list_by_alias operation
airflowctl assets list-by-alias [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
materialize¶
Perform materialize operation
airflowctl assets materialize [-h] [--asset-id ASSET_ID]
Named Arguments¶
- --asset-id
asset_id for materialize operation in AssetsOperations
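As a quick orientation, a typical read-and-materialize flow over the commands above might look like the following; the asset ID is a placeholder and the output format is optional:
airflowctl assets list -o table
airflowctl assets get --asset-id 1
airflowctl assets materialize --asset-id 1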
auth¶
Manage authentication for the CLI. Either pass a token via an environment variable or parameter, or pass a username and password.
airflowctl auth [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: login
Sub-commands¶
login¶
Log in to the metadata database
airflowctl auth login [-h] [--api-token API_TOKEN] [--api-url API_URL] [-e ENV] [--password [PASSWORD]] [--username USERNAME]
Named Arguments¶
- --api-token
The token to use for authentication
- --api-url
The URL of the metadata database API
Default:
'http://localhost:8080'
- -e, --env
The environment to run the command in
Default:
'production'
- --password
The password to use for authentication
- --username
The username to use for authentication
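For example, a first login against a local API server could look like either of the following; the URL, username, password, and token value are placeholders:
airflowctl auth login --api-url http://localhost:8080 --username admin --password admin
airflowctl auth login --api-url http://localhost:8080 --api-token "$AIRFLOW_CLI_TOKEN"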
backfill¶
Perform Backfill operations
airflowctl backfill [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: cancel, create, create-dry-run, get, list, pause, unpause
Sub-commands¶
cancel¶
Perform cancel operation
airflowctl backfill cancel [-h] [--backfill-id BACKFILL_ID]
Named Arguments¶
- --backfill-id
backfill_id for cancel operation in BackfillOperations
create¶
Perform create operation
airflowctl backfill create [-h] [--dag-id DAG_ID] [--dag-run-conf DAG_RUN_CONF] [-e ENV] [--from-date FROM_DATE] [--max-active-runs MAX_ACTIVE_RUNS] [--reprocess-behavior REPROCESS_BEHAVIOR]
[--run-backwards | --no-run-backwards] [--to-date TO_DATE] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for backfill operation
- --dag-run-conf
dag_run_conf for backfill operation
- -e, --env
The environment to run the command in
Default:
'production'
- --from-date
from_date for backfill operation
- --max-active-runs
max_active_runs for backfill operation
- --reprocess-behavior
reprocess_behavior for backfill operation
- --run-backwards, --no-run-backwards
run_backwards for backfill operation (default: False)
Default:
False
- --to-date
to_date for backfill operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
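As an illustration, creating a month-long backfill for a hypothetical example_dag might look like this (the ISO date format is an assumption; the DAG ID and dates are placeholders):
airflowctl backfill create --dag-id example_dag --from-date 2025-01-01 --to-date 2025-01-31 --max-active-runs 2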
create-dry-run¶
Perform create_dry_run operation
airflowctl backfill create-dry-run [-h] [--dag-id DAG_ID] [--dag-run-conf DAG_RUN_CONF] [-e ENV] [--from-date FROM_DATE] [--max-active-runs MAX_ACTIVE_RUNS] [--reprocess-behavior REPROCESS_BEHAVIOR]
[--run-backwards | --no-run-backwards] [--to-date TO_DATE] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for backfill operation
- --dag-run-conf
dag_run_conf for backfill operation
- -e, --env
The environment to run the command in
Default:
'production'
- --from-date
from_date for backfill operation
- --max-active-runs
max_active_runs for backfill operation
- --reprocess-behavior
reprocess_behavior for backfill operation
- --run-backwards, --no-run-backwards
run_backwards for backfill operation (default: False)
Default:
False
- --to-date
to_date for backfill operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get¶
Perform get operation
airflowctl backfill get [-h] [--backfill-id BACKFILL_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --backfill-id
backfill_id for get operation in BackfillOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list¶
Perform list operation
airflowctl backfill list [-h] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for list operation in BackfillOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
pause¶
Perform pause operation
airflowctl backfill pause [-h] [--backfill-id BACKFILL_ID]
Named Arguments¶
- --backfill-id
backfill_id for pause operation in BackfillOperations
unpause¶
Perform unpause operation
airflowctl backfill unpause [-h] [--backfill-id BACKFILL_ID]
Named Arguments¶
- --backfill-id
backfill_id for unpause operation in BackfillOperations
config¶
Perform Config operations
airflowctl config [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: get, lint, list
Sub-commands¶
get¶
Perform get operation
airflowctl config get [-h] [-e ENV] [--option OPTION] [--section SECTION] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --option
option for get operation in ConfigOperations
- --section
section for get operation in ConfigOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
lint¶
Lint configuration options for changes required when migrating from Airflow 2 to Airflow 3
airflowctl config lint [-h] [--ignore-option IGNORE_OPTION] [--ignore-section IGNORE_SECTION] [--option OPTION] [--section SECTION] [-v]
Named Arguments¶
- --ignore-option
The configuration option being ignored
- --ignore-section
The configuration section being ignored
- --option
The option of the configuration
- --section
The section of the configuration
- -v, --verbose
Enables detailed output, including the list of ignored sections and options
Default:
False
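For example, a migration check limited to one section, followed by reading a single option, might look like this (the section and option names are illustrative):
airflowctl config lint --section core -v
airflowctl config get --section core --option executor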
list¶
Perform list operation
airflowctl config list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
connections¶
Perform Connections operations
airflowctl connections [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: create, create-defaults, delete, get, import, list, test, update
Sub-commands¶
create¶
Perform create operation
airflowctl connections create [-h] [--conn-type CONN_TYPE] [--connection-id CONNECTION_ID] [--description DESCRIPTION] [-e ENV] [--extra EXTRA] [--host HOST] [--login LOGIN] [--password PASSWORD]
[--port PORT] [--output (table, json, yaml, plain)]
Named Arguments¶
- --conn-type
conn_type for connection operation
- --connection-id
connection_id for connection operation
- --description
description for connection operation
- -e, --env
The environment to run the command in
Default:
'production'
- --extra
extra for connection operation
- --host
host for connection operation
- --login
login for connection operation
- --password
password for connection operation
- --port
port for connection operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
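For example, a Postgres connection might be created as follows; every value shown is a placeholder:
airflowctl connections create --connection-id my_postgres --conn-type postgres --host db.example.com --port 5432 --login app_user --password app_password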
create-defaults¶
Perform create_defaults operation
airflowctl connections create-defaults [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
delete¶
Perform delete operation
airflowctl connections delete [-h] [--conn-id CONN_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --conn-id
conn_id for delete operation in ConnectionsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get¶
Perform get operation
airflowctl connections get [-h] [--conn-id CONN_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --conn-id
conn_id for get operation in ConnectionsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
import¶
Import connections from a file. This feature is compatible with the Airflow CLI command airflow connections export a.json: export connections with the Airflow CLI, then import them securely with this command.
airflowctl connections import [-h] FILEPATH
Positional Arguments¶
- FILEPATH
Connections JSON file
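For example, to move connections from an existing Airflow deployment (the file name is arbitrary):
airflow connections export connections.json
airflowctl connections import connections.json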
list¶
Perform list operation
airflowctl connections list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
test¶
Perform test operation
airflowctl connections test [-h] [--conn-type CONN_TYPE] [--connection-id CONNECTION_ID] [--description DESCRIPTION] [--extra EXTRA] [--host HOST] [--login LOGIN] [--password PASSWORD] [--port PORT]
Named Arguments¶
- --conn-type
conn_type for connection operation
- --connection-id
connection_id for connection operation
- --description
description for connection operation
- --extra
extra for connection operation
- --host
host for connection operation
- --login
login for connection operation
- --password
password for connection operation
- --port
port for connection operation
update¶
Perform update operation
airflowctl connections update [-h] [--conn-type CONN_TYPE] [--connection-id CONNECTION_ID] [--description DESCRIPTION] [-e ENV] [--extra EXTRA] [--host HOST] [--login LOGIN] [--password PASSWORD]
[--port PORT] [--output (table, json, yaml, plain)]
Named Arguments¶
- --conn-type
conn_type for connection operation
- --connection-id
connection_id for connection operation
- --description
description for connection operation
- -e, --env
The environment to run the command in
Default:
'production'
- --extra
extra for connection operation
- --host
host for connection operation
- --login
login for connection operation
- --password
password for connection operation
- --port
port for connection operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
dagrun¶
Perform DagRun operations
airflowctl dagrun [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: get, list
Sub-commands¶
get¶
Perform get operation
airflowctl dagrun get [-h] [--dag-id DAG_ID] [--dag-run-id DAG_RUN_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for get operation in DagRunOperations
- --dag-run-id
dag_run_id for get operation in DagRunOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list¶
Perform list operation
airflowctl dagrun list [-h] [--dag-id DAG_ID] [--end-date END_DATE] [-e ENV] [--limit LIMIT] [--start-date START_DATE] [--state STATE] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for list operation in DagRunOperations
- --end-date
end_date for list operation in DagRunOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --limit
limit for list operation in DagRunOperations
- --start-date
start_date for list operation in DagRunOperations
- --state
state for list operation in DagRunOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
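For example, inspecting recent failed runs of a hypothetical example_dag might look like this (the state value is assumed to match the API's DAG run states):
airflowctl dagrun list --dag-id example_dag --state failed --limit 10 -o table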
dags¶
Perform Dags operations
airflowctl dags [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: delete, get, get-details, get-import-error, get-stats, get-tags, get-version, list, list-import-errors, list-version, list-warning, pause, trigger, unpause, update
Sub-commands¶
delete¶
Perform delete operation
airflowctl dags delete [-h] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for delete operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get¶
Perform get operation
airflowctl dags get [-h] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for get operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-details¶
Perform get_details operation
airflowctl dags get-details [-h] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for get_details operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-import-error¶
Perform get_import_error operation
airflowctl dags get-import-error [-h] [-e ENV] [--import-error-id IMPORT_ERROR_ID] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --import-error-id
import_error_id for get_import_error operation in DagsOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-stats¶
Perform get_stats operation
airflowctl dags get-stats [-h] [--dag-ids DAG_IDS] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-ids
dag_ids for get_stats operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get-version¶
Perform get_version operation
airflowctl dags get-version [-h] [--dag-id DAG_ID] [-e ENV] [--version-number VERSION_NUMBER] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for get_version operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --version-number
version_number for get_version operation in DagsOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list¶
Perform list operation
airflowctl dags list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list-import-errors¶
Perform list_import_errors operation
airflowctl dags list-import-errors [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list-version¶
Perform list_version operation
airflowctl dags list-version [-h] [--dag-id DAG_ID] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for list_version operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
list-warning¶
Perform list_warning operation
airflowctl dags list-warning [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
pause¶
Pause a Dag
airflowctl dags pause [-h] [--dag-id DAG_ID] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
The DAG ID of the DAG to pause or unpause
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
trigger¶
Perform trigger operation
airflowctl dags trigger [-h] [--conf CONF] [--dag-id DAG_ID] [--dag-run-id DAG_RUN_ID] [--data-interval-end DATA_INTERVAL_END] [--data-interval-start DATA_INTERVAL_START] [-e ENV]
[--logical-date LOGICAL_DATE] [--note NOTE] [--run-after RUN_AFTER] [--output (table, json, yaml, plain)]
Named Arguments¶
- --conf
conf for trigger_dag_run operation
- --dag-id
dag_id for trigger operation in DagsOperations
- --dag-run-id
dag_run_id for trigger_dag_run operation
- --data-interval-end
data_interval_end for trigger_dag_run operation
- --data-interval-start
data_interval_start for trigger_dag_run operation
- -e, --env
The environment to run the command in
Default:
'production'
- --logical-date
logical_date for trigger_dag_run operation
- --note
note for trigger_dag_run operation
- --run-after
run_after for trigger_dag_run operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
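For example, a manual run of a hypothetical example_dag with a configuration payload might look like this (--conf is assumed to accept a JSON string):
airflowctl dags trigger --dag-id example_dag --conf '{"param": "value"}' --note "manual test run"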
unpause¶
Unpause a Dag
airflowctl dags unpause [-h] [--dag-id DAG_ID] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
The DAG ID of the DAG to pause or unpause
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
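For example, temporarily pausing and later resuming a hypothetical example_dag:
airflowctl dags pause --dag-id example_dag
airflowctl dags unpause --dag-id example_dag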
update¶
Perform update operation
airflowctl dags update [-h] [--dag-id DAG_ID] [-e ENV] [--is-paused | --no-is-paused] [--output (table, json, yaml, plain)]
Named Arguments¶
- --dag-id
dag_id for update operation in DagsOperations
- -e, --env
The environment to run the command in
Default:
'production'
- --is-paused, --no-is-paused
is_paused for dag_body operation (default: False)
Default:
False
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
jobs¶
Perform Jobs operations
airflowctl jobs [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: list
Sub-commands¶
list¶
Perform list operation
airflowctl jobs list [-h] [-e ENV] [--hostname HOSTNAME] [--is-alive | --no-is-alive] [--job-type JOB_TYPE] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --hostname
hostname for list operation in JobsOperations
- --is-alive, --no-is-alive
is_alive for list operation in JobsOperations (default: False)
Default:
False
- --job-type
job_type for list operation in JobsOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
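For example, checking for live scheduler jobs might look like this; the job type string is an assumption about how job types are named on the server:
airflowctl jobs list --job-type SchedulerJob --is-alive -o table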
pools¶
Perform Pools operations
airflowctl pools [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: create, delete, export, get, import, list, update
Sub-commands¶
create¶
Perform create operation
airflowctl pools create [-h] [--description DESCRIPTION] [-e ENV] [--include-deferred | --no-include-deferred] [--name NAME] [--slots SLOTS] [--output (table, json, yaml, plain)]
Named Arguments¶
- --description
description for pool operation
- -e, --env
The environment to run the command in
Default:
'production'
- --include-deferred, --no-include-deferred
include_deferred for pool operation (default: False)
Default:
False
- --name
name for pool operation
- --slots
slots for pool operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
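For example, a small pool for resource-heavy tasks might be created like this (the name, slot count, and description are placeholders):
airflowctl pools create --name heavy_tasks --slots 4 --description "Resource-heavy tasks"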
delete¶
Perform delete operation
airflowctl pools delete [-h] [-e ENV] [--pool POOL] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --pool
pool for delete operation in PoolsOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
export¶
Export all pools
airflowctl pools export [-h] [--output (table, json, yaml, plain)] FILEPATH
Positional Arguments¶
- FILEPATH
File path to read from or write to. For import commands, it is a file to read from. For export commands, it is a file to write to.
Named Arguments¶
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
get¶
Perform get operation
airflowctl pools get [-h] [-e ENV] [--pool-name POOL_NAME] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --pool-name
pool_name for get operation in PoolsOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
import¶
Import pools
airflowctl pools import [-h] FILEPATH
Positional Arguments¶
- FILEPATH
File path to read from or write to. For import commands, it is a file to read from. For export commands, it is a file to write to.
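For example, exporting pools to a file and importing them again (the file name is arbitrary):
airflowctl pools export pools.json
airflowctl pools import pools.json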
list¶
Perform list operation
airflowctl pools list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
update¶
Perform update operation
airflowctl pools update [-h] [--description DESCRIPTION] [-e ENV] [--include-deferred | --no-include-deferred] [--pool POOL] [--slots SLOTS] [--output (table, json, yaml, plain)]
Named Arguments¶
- --description
description for pool_body operation
- -e, --env
The environment to run the command in
Default:
'production'
- --include-deferred, --no-include-deferred
include_deferred for pool_body operation (default: False)
Default:
False
- --pool
pool for pool_body operation
- --slots
slots for pool_body operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
providers¶
Perform Providers operations
airflowctl providers [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: list
Sub-commands¶
list¶
Perform list operation
airflowctl providers list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
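For example, listing the providers registered in a given environment:
airflowctl providers list -e production -o table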
variables¶
Perform Variables operations
airflowctl variables [-h] COMMAND ...
Positional Arguments¶
- COMMAND
Possible choices: create, delete, export, get, import, list, update
Sub-commands¶
create¶
Perform create operation
airflowctl variables create [-h] [--description DESCRIPTION] [-e ENV] [--key KEY] [--value VALUE] [--output (table, json, yaml, plain)]
Named Arguments¶
- --description
description for variable operation
- -e, --env
The environment to run the command in
Default:
'production'
- --key
key for variable operation
- --value
value for variable operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
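For example, creating a simple variable; the key, value, and description are placeholders:
airflowctl variables create --key my_var --value some_value --description "Example variable"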
delete¶
Perform delete operation
airflowctl variables delete [-h] [-e ENV] [--variable-key VARIABLE_KEY] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --variable-key
variable_key for delete operation in VariablesOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
export¶
Export all variables
airflowctl variables export [-h] FILEPATH
Positional Arguments¶
- FILEPATH
File path to read from or write to. For import commands, it is a file to read from. For export commands, it is a file to write to.
get¶
Perform get operation
airflowctl variables get [-h] [-e ENV] [--variable-key VARIABLE_KEY] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --variable-key
variable_key for get operation in VariablesOperations
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
import¶
Import variables
airflowctl variables import [-h] [-a {overwrite,fail,skip}] FILEPATH
Positional Arguments¶
- FILEPATH
File path to read from or write to. For import commands, it is a file to read from. For export commands, it is a file to write to.
Named Arguments¶
- -a, --action-on-existing-key
Possible choices: overwrite, fail, skip
Action to take if we encounter a variable key that already exists.
Default:
'overwrite'
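For example, exporting variables to a file and re-importing them while keeping existing keys (the file name is arbitrary):
airflowctl variables export variables.json
airflowctl variables import -a skip variables.json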
list¶
Perform list operation
airflowctl variables list [-h] [-e ENV] [--output (table, json, yaml, plain)]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
update¶
Perform update operation
airflowctl variables update [-h] [--description DESCRIPTION] [-e ENV] [--key KEY] [--value VALUE] [--output (table, json, yaml, plain)]
Named Arguments¶
- --description
description for variable operation
- -e, --env
The environment to run the command in
Default:
'production'
- --key
key for variable operation
- --value
value for variable operation
- --output, -o
Possible choices: table, json, yaml, plain
Output format. Allowed values: json, yaml, plain, table (default: json)
Default:
'json'
version¶
Show version information
airflowctl version [-h] [-e ENV] [--remote]
Named Arguments¶
- -e, --env
The environment to run the command in
Default:
'production'
- --remote
Fetch the Airflow version from the remote server; otherwise only the local airflowctl version is shown
Default:
False
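For example, printing the local airflowctl version, then also fetching the remote Airflow version for a given environment:
airflowctl version
airflowctl version --remote -e production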
Environment Variables¶
- AIRFLOW_CLI_TOKEN¶
The token used to authenticate with the Airflow API. It is only required if you are using the Airflow API and have not set up authentication another way, for example with a username and password.
- AIRFLOW_CLI_ENVIRONMENT¶
Environment name to use for the CLI. It determines which environment is used when running the CLI and is only needed if you have multiple environments configured and want to choose between them. If not set, the default environment, production, is used.
- AIRFLOW_CLI_DEBUG_MODE¶
Enables debug mode for the CLI. It disables some features, such as keyring integration and saving credentials to a file. It is only meant to be used if you are developing airflowctl or running API integration tests. Do not set this variable unless you know what you are doing.
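For example, a minimal shell setup before running commands against a non-default environment might look like this (a POSIX shell is assumed; the environment name and token are placeholders):
export AIRFLOW_CLI_ENVIRONMENT=staging
export AIRFLOW_CLI_TOKEN="<your-token>"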