I have a problem connecting Celery to AWS SQS service.
My architecture looks like this: the first service is an API that calls celery.send_task(); behind it are two SQS queues, and behind those are two Celery workers (in separate containers), each consuming from its own queue. For now everything runs locally except, of course, AWS SQS itself. Everything works fine when I use Redis as the broker.
Here is the Celery configuration on the API side:
celery_client = Celery(__name__)
celery_client.conf.broker_url = 'sqs://'
celery_client.conf.broker_transport_options = {
    'predefined_queues': {
        'queue_1': {
            'url': <sqs url/queue_1>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
        'queue_2': {
            'url': <sqs url/queue_2>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
    },
}
celery_client.conf.task_routes = (
    {"foo_tasks.*": {"queue": "queue_1"}},
    {"bar_tasks.*": {"queue": "queue_2"}},
)
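For context, Celery matches each task name against these glob patterns to decide which queue a task goes to. A minimal standalone sketch of that lookup (my illustration using fnmatch with hypothetical task names, not Celery's actual router code):

```python
from fnmatch import fnmatch

# Illustration only (not Celery internals): how glob-style
# task_routes map a task name to a queue name.
task_routes = (
    {"foo_tasks.*": {"queue": "queue_1"}},
    {"bar_tasks.*": {"queue": "queue_2"}},
)

def resolve_queue(task_name):
    for router in task_routes:
        for pattern, options in router.items():
            if fnmatch(task_name, pattern):
                return options["queue"]
    return None  # in Celery this falls back to the default queue

print(resolve_queue("foo_tasks.add"))     # queue_1
print(resolve_queue("bar_tasks.notify"))  # queue_2
```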
I have created two working SQS queues (tested them through the AWS CLI). On the other side (consumer/worker) I have this configuration:
celery_client = Celery(__name__)
celery_client.conf.broker_url = 'sqs://'
celery_logger = get_task_logger(__name__)
celery_client.conf.broker_transport_options = {
    'predefined_queues': {
        'queue_1': {
            'url': <sqs url/queue_1>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
        'queue_2': {
            'url': <sqs url/queue_2>,
            'access_key_id': ...,
            'secret_access_key': ...,
        },
    },
}
celery_client.conf.imports = (
    "celery_service.tasks.foo_tasks",
    "celery_service.tasks.bar_tasks",
    ...,
)
I start each worker with:

celery -A celery_service.celery_worker.celery_client worker --loglevel=INFO -Q queue_1

celery -A celery_service.celery_worker.celery_client worker --loglevel=INFO -Q queue_2
It gives me this error:
kombu.transport.SQS.UndefinedQueueException: Queue with name '925baf3c-6aff-39e6-9bc0-f566170afcdc-reply-celery-pidbox' must be defined in 'predefined_queues'.
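As far as I can tell, the SQS transport resolves every queue name it touches against the predefined_queues mapping, and the reply queue named in the traceback (a pidbox queue used by Celery's remote control) is requested automatically at worker startup. A simplified sketch of that lookup (my illustration, not kombu's actual code):

```python
# Simplified illustration of the predefined-queue lookup in kombu's
# SQS transport (not the real kombu code): any queue name missing
# from the mapping raises, including auto-generated pidbox queues.
class UndefinedQueueException(Exception):
    pass

predefined_queues = {
    "queue_1": {"url": "<sqs url/queue_1>"},
    "queue_2": {"url": "<sqs url/queue_2>"},
}

def queue_url(name):
    if name not in predefined_queues:
        raise UndefinedQueueException(
            f"Queue with name '{name}' must be defined in 'predefined_queues'."
        )
    return predefined_queues[name]["url"]

queue_url("queue_1")  # fine
# queue_url("925baf3c-...-reply-celery-pidbox") would raise, which
# matches the error I am seeing
```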