
I have a Django application that works with Celery and uses RabbitMQ as the message broker. I have a separate Scrapy project that scrapes data, and I want to push the scraped data into RabbitMQ so that Django can consume those messages through Celery. I need help consuming the messages that the Scrapy project pushes into RabbitMQ.

Code snippets:

Scrapy pipeline:

def process_item(self, item, spider):
    publish_message(item)    
    return item

def publish_message(data):
    import pika
    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host='localhost', port=5672))
    channel = connection.channel()
    channel.basic_publish(exchange='topic', routing_key='scrapy',
                          body='Hello From scrapy!')
    connection.close()
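
For context, process_item only runs if the pipeline is enabled in the Scrapy project's settings; a minimal sketch, assuming the pipeline class is named RabbitMQPipeline in myproject/pipelines.py (both names hypothetical):

# settings.py in the Scrapy project
ITEM_PIPELINES = {
    'myproject.pipelines.RabbitMQPipeline': 300,
}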

In the Django app, consumers.py:

import pika


connection = pika.BlockingConnection(pika.ConnectionParameters('localhost', heartbeat=600,
                                                               blocked_connection_timeout=300))
channel = connection.channel()

def callback(ch, method, properties, body):
    print(" data =============== ", body)
    # I will call the Celery task here once the code prints the data, to make sure it's running. Unfortunately it's not running. :(
    return


channel.basic_consume(queue='scrapy', on_message_callback=callback, auto_ack=True)
print("Started Consuming...")
channel.start_consuming()
connection.close()

celery.py:

import os

from celery import Celery
from kombu import Exchange, Queue

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_project.settings.development')

celery_app = Celery('my_project', broker='amqp://guest:guest@rabbit:5672', backend='rpc://0.0.0.0:5672')
celery_app.config_from_object('django.conf:settings', namespace='CELERY')
celery_app.autodiscover_tasks()

celery_app.conf.update(
    worker_max_tasks_per_child=1,
    broker_pool_limit=None
)

default_exchange = Exchange('default', type='topic')
scrapy_exchange = Exchange('scrapy', type='topic')

celery_app.conf.task_queues = (
    Queue('scrapy', scrapy_exchange, routing_key='scrapy.#'),
)
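
For reference, tasks can be routed onto that queue with task_routes; a minimal sketch, assuming a hypothetical task my_project.tasks.process_scraped_item and a worker started with celery -A my_project worker -Q scrapy:

# route the (hypothetical) task onto the 'scrapy' queue; the routing key
# 'scrapy.task' matches the binding pattern 'scrapy.#'
celery_app.conf.task_routes = {
    'my_project.tasks.process_scraped_item': {
        'queue': 'scrapy',
        'routing_key': 'scrapy.task',
    },
}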

M Usman Wahab

1 Answer


You didn't declare and bind a queue when consuming, so messages published to the exchange were dropped as unroutable. Try this:

Publisher

def process_item(self, item, spider):
    publish_message(item)
    return item

def publish_message(data):
    import pika
    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host='localhost', port=5672))
    channel = connection.channel()
    channel.exchange_declare(exchange='topic')
    channel.basic_publish(exchange='topic', routing_key='scrapy',
                          body='Hello From scrapy!')
    connection.close()

Consumer

import pika


connection = pika.BlockingConnection(pika.ConnectionParameters('localhost', heartbeat=600,
                                                               blocked_connection_timeout=300))
channel = connection.channel()

def callback(ch, method, properties, body):
    print(" data =============== ", body)
    # call the Celery task here once the print confirms messages arrive
    return

channel.exchange_declare(exchange='topic')
channel.queue_declare(queue='scrapy')
channel.queue_bind(exchange='topic', queue='scrapy', routing_key='scrapy')
channel.basic_consume(queue='scrapy', on_message_callback=callback, auto_ack=True)
print("Started Consuming...")
channel.start_consuming()
connection.close()
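
Once messages arrive, the callback can hand the payload off to Celery; a minimal sketch, assuming a hypothetical task process_scraped_item defined in the Django project's tasks.py:

# tasks.py in the Django project (the task name is an assumption)
from celery import shared_task

@shared_task
def process_scraped_item(payload):
    print("processing scraped item:", payload)

# consumers.py: dispatch the task from the callback instead of printing
from my_project.tasks import process_scraped_item  # hypothetical module path

def callback(ch, method, properties, body):
    process_scraped_item.delay(body.decode())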
Ahmad
• your solution worked but I'm getting this error after publishing a message: Django / Celery / Kombu worker error: Received and deleted unknown message. Wrong destination? – M Usman Wahab Jun 07 '22 at 09:31
  • This probably means Celery doesn't understand the message. Maybe this helps: https://stackoverflow.com/questions/14885396/django-celery-kombu-worker-error-received-and-deleted-unknown-message-wron – Ahmad Jun 07 '22 at 12:01
• Ahmad, I'm not using librabbitmq, as you can see in my code above: Celery, RabbitMQ, Kombu. – M Usman Wahab Jun 09 '22 at 14:35
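
As the last comments suggest, the "Received and deleted unknown message" error usually means the worker received a message that doesn't follow Celery's task-message protocol, which is what happens when raw pika publishes the body. One way around it is to publish from the Scrapy side through Celery itself with send_task; a minimal sketch, assuming the hypothetical task name my_project.tasks.process_scraped_item and the broker from the question:

# Scrapy side: publish via Celery's own protocol instead of raw pika,
# so the worker receives a well-formed task message.
from celery import Celery

celery_app = Celery(broker='amqp://guest:guest@localhost:5672')

def publish_message(data):
    # Scrapy items are not directly serializable; send a plain dict
    celery_app.send_task(
        'my_project.tasks.process_scraped_item',
        args=[dict(data)],
        queue='scrapy',
        routing_key='scrapy.task',
    )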