
I have a problem connecting Celery to the AWS SQS service. My architecture looks like this: the first service is an API that uses the celery.send_task() method, then there are 2 SQS queues, and after that there are two Celery workers, where each worker (a separate container) takes elements from a specific queue. For now everything runs locally except, of course, AWS SQS. Everything works fine when I use Redis as the broker.
Here is Celery configuration from API side:

from celery import Celery

celery_client = Celery(__name__)
celery_client.conf.broker_url = 'sqs://'
celery_client.conf.broker_transport_options = {
    'predefined_queues': {
        'queue_1': {
            'url': <sqs url/queue_1>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
        'queue_2': {
            'url': <sqs url/queue_2>,
            'access_key_id': ...,
            'secret_access_key': ...,
        }
    }
}
celery_client.conf.task_routes = (
    {"foo_tasks.*": {"queue": "queue_1"}},
    {"bar_tasks.*": {"queue": "queue_2"}},
)
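
For reference, the API side then dispatches work roughly like this (a minimal sketch; the task names and arguments are placeholders, not the real ones):

# Dispatching from the API side (sketch; task names/args are placeholders).
# With the task_routes above, "foo_tasks.*" names go to queue_1 and
# "bar_tasks.*" names go to queue_2.
celery_client.send_task("foo_tasks.process_foo", args=[42])

# The queue can also be forced explicitly per call:
celery_client.send_task("bar_tasks.process_bar", kwargs={"item_id": 7}, queue="queue_2")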

I have created two working SQS queues (tested them through the AWS CLI). On the other side (consumer/worker) I have this configuration:

from celery import Celery
from celery.utils.log import get_task_logger

celery_client = Celery(__name__)
celery_client.conf.broker_url = 'sqs://'
celery_logger = get_task_logger(__name__)
celery_client.conf.broker_transport_options = {
    'predefined_queues': {
        'queue_1': {
            'url': <sqs url/queue_1>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
        'queue_2': {
            'url': <sqs url/queue_2>,
            'access_key_id': ...,
            'secret_access_key': ...,
        }
    }
}
celery_client.conf.imports = (
    "celery_service.tasks.foo_tasks",
    "celery_service.tasks.bar_tasks",
    ...,
)
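
The imports entries point at the task modules. A minimal foo_tasks module would look roughly like this (a sketch; the module path is inferred from the -A argument below and the task name/body are placeholders):

# celery_service/tasks/foo_tasks.py (illustrative sketch)
from celery_service.celery_worker import celery_client

@celery_client.task(name="foo_tasks.process_foo")  # placeholder task name
def process_foo(value):
    # placeholder body; the real task logic goes here
    return value * 2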

Each worker is started with its own queue:

celery -A celery_service.celery_worker.celery_client worker --loglevel=INFO -Q queue_1
celery -A celery_service.celery_worker.celery_client worker --loglevel=INFO -Q queue_2

It gives me this error:

 kombu.transport.SQS.UndefinedQueueException: Queue with name '925baf3c-6aff-39e6-9bc0-f566170afcdc-reply-celery-pidbox' must be defined in 'predefined_queues'.

1 Answer


I found out that

'queue_1': {
    'url': <sqs url/queue_1>,
    'access_key_id': ...,
    'secret_access_key': ...,
}
...

does not work for me. I wound up using this instead:

'queue_1': {
    'url': get_aws_sqs(queue_name)
}
...

and the function is:

# custom helper that looks up the queue URL with boto3
import boto3

def get_aws_sqs(queue_name):
    sqs = boto3.client(
        'sqs',
        aws_access_key_id=...,
        aws_secret_access_key=...,
        region_name=...
    )
    # get_queue_url returns a dict; return the actual URL string
    return sqs.get_queue_url(QueueName=queue_name)['QueueUrl']

Of course, this is just the skeleton of the function; you can add try/except blocks or timeouts.
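
Put together, the transport options then look roughly like this (again just a sketch of how the helper is wired into the configuration shown above):

celery_client.conf.broker_transport_options = {
    'predefined_queues': {
        'queue_1': {'url': get_aws_sqs('queue_1')},
        'queue_2': {'url': get_aws_sqs('queue_2')},
    }
}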
