Amazon Simple Queue Service (SQS)
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware, and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.
Prerequisite Tasks

To use these operators, you must do a few things:

- Create the necessary resources using the AWS Console or the AWS CLI (see the boto3 sketch after this list).
- Install the API libraries via pip:

```
pip install 'apache-airflow[amazon]'
```

Detailed information is available at Installation of Airflow®.
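The first prerequisite is loosely sketched below using boto3; the queue name Airflow-Example-Queue is an assumption taken from the example DAG later on this page, not a required value:

```python
import boto3

# Create the SQS queue used by the examples below.
# Assumes AWS credentials and a default region are already configured.
sqs = boto3.client("sqs")
response = sqs.create_queue(QueueName="Airflow-Example-Queue")
print(response["QueueUrl"])
```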
Generic Parameters

- aws_conn_id
  Reference to the Amazon Web Services Connection ID. If this parameter is set to None then the default boto3 behaviour is used without a connection lookup. Otherwise use the credentials stored in the Connection. Default: aws_default

- region_name
  AWS Region Name. If this parameter is set to None or omitted, then region_name from the AWS Connection Extra Parameter will be used. Otherwise use the specified value instead of the connection value. Default: None

- verify
  Whether or not to verify SSL certificates.
  - False - do not validate SSL certificates.
  - path/to/cert/bundle.pem - a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.
  If this parameter is set to None or omitted, then verify from the AWS Connection Extra Parameter will be used. Otherwise use the specified value instead of the connection value. Default: None

- botocore_config
  The provided dictionary is used to construct a botocore.config.Config. This configuration can be used, for example, to avoid throttling exceptions or to set timeouts:

  ```python
  {
      "signature_version": "unsigned",
      "s3": {
          "us_east_1_regional_endpoint": True,
      },
      "retries": {
          "mode": "standard",
          "max_attempts": 10,
      },
      "connect_timeout": 300,
      "read_timeout": 300,
      "tcp_keepalive": True,
  }
  ```

  If this parameter is set to None or omitted, then config_kwargs from the AWS Connection Extra Parameter will be used. Otherwise use the specified value instead of the connection value. Default: None

Note: Specifying an empty dictionary, {}, will overwrite the connection configuration for botocore.config.Config.
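These generic parameters are accepted by the SQS operators and sensors shown below. As a minimal sketch of passing them explicitly, where the connection ID my_aws_conn and the parameter values are illustrative assumptions:

```python
from airflow.providers.amazon.aws.operators.sqs import SqsPublishOperator

publish = SqsPublishOperator(
    task_id="publish_with_generic_params",
    sqs_queue="Airflow-Example-Queue",
    message_content="hello",
    aws_conn_id="my_aws_conn",  # hypothetical Airflow connection ID
    region_name="eu-west-1",  # overrides the region from the connection
    verify=False,  # skip SSL certificate validation
    botocore_config={"retries": {"mode": "standard", "max_attempts": 10}},
)
```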
Operators
Publish a message to an Amazon SQS queue

To publish a message to an Amazon SQS queue you can use the SqsPublishOperator. In the following example, the tasks publish_to_queue_1 and publish_to_queue_2 each publish a message containing the task instance and the logical date to a queue with a default name of Airflow-Example-Queue.
tests/system/amazon/aws/example_sqs.py
```python
from airflow.providers.amazon.aws.operators.sqs import SqsPublishOperator

publish_to_queue_1 = SqsPublishOperator(
    task_id="publish_to_queue_1",
    sqs_queue=sqs_queue,
    # Jinja template rendered at run time.
    message_content="{{ task_instance }}-{{ logical_date }}",
)
publish_to_queue_2 = SqsPublishOperator(
    task_id="publish_to_queue_2",
    sqs_queue=sqs_queue,
    message_content="{{ task_instance }}-{{ logical_date }}",
)
```
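Beyond message_content, SqsPublishOperator also accepts optional publishing parameters such as message_attributes and delay_seconds. A hedged sketch, where the attribute name and values are illustrative rather than taken from the example DAG:

```python
from airflow.providers.amazon.aws.operators.sqs import SqsPublishOperator

publish_with_attributes = SqsPublishOperator(
    task_id="publish_with_attributes",
    sqs_queue=sqs_queue,
    message_content="payload",
    # Message attributes use the boto3/SQS wire format.
    message_attributes={
        "environment": {"DataType": "String", "StringValue": "test"},
    },
    # Delay delivery of the message by 30 seconds.
    delay_seconds=30,
)
```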
Sensors

Read messages from an Amazon SQS queue

To read messages from an Amazon SQS queue until it is exhausted, use the SqsSensor. This sensor can also be run in deferrable mode by setting the deferrable param to True.
tests/system/amazon/aws/example_sqs.py
```python
from airflow.providers.amazon.aws.sensors.sqs import SqsSensor

read_from_queue = SqsSensor(
    task_id="read_from_queue",
    sqs_queue=sqs_queue,
)

# Retrieve multiple batches of messages from SQS.
# The SQS API only returns a maximum of 10 messages per poll.
read_from_queue_in_batch = SqsSensor(
    task_id="read_from_queue_in_batch",
    sqs_queue=sqs_queue,
    # Get a maximum of 10 messages each poll.
    max_messages=10,
    # Combine 3 polls before returning results.
    num_batches=3,
)
```
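A minimal sketch of the deferrable variant mentioned above; it assumes the Airflow triggerer component is running:

```python
from airflow.providers.amazon.aws.sensors.sqs import SqsSensor

# Deferrable mode releases the worker slot while waiting for messages.
read_from_queue_deferrable = SqsSensor(
    task_id="read_from_queue_deferrable",
    sqs_queue=sqs_queue,
    deferrable=True,
)
```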