I have a celery task that adds messages to a database like so:
class ProcessRequests(Task):

    def run(self, batch):
        for e in q:  # q: the recipient queryset (defined elsewhere)
            msg = Message.objects.create(
                recipient_number=e.mobile,
                content=batch.content,
                sender=e.contact_owner,
                billee=batch.user,
                sender_name=batch.sender_name,
            )
            gateway = Gateway.objects.get(pk=2)
            msg.send(gateway)
Then msg.send(gateway) on the model dispatches another task, which actually sends the message:
class SendMessage(Task):
    name = "Sending SMS"
    max_retries = 10
    default_retry_delay = 3

    def run(self, message_id, gateway_id=None, **kwargs):
        logging.debug("About to send a message.")
        # ... do the actual sending here ...
        logging.debug("Done sending message.")
This all works fine (tested with over 1000 messages), but I read somewhere that you should not chain tasks together. However, this isn't chaining, is it? I don't wait for one task to finish before the other can run.
Is this example OK in terms of performance, etc.?
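For what it's worth, my understanding is that the "don't chain tasks" advice warns against a task that blocks waiting on another task's result, whereas .delay() is fire-and-forget: it enqueues and returns immediately. A toy sketch of that distinction (plain Python with a list standing in for the broker; no Celery required, and all names here are illustrative):

```python
# A list standing in for the message broker's queue (illustrative only).
broker_queue = []

def delay(task_name, *args):
    """Fire-and-forget dispatch, like Celery's Task.delay():
    enqueue the task and return immediately, without waiting."""
    broker_queue.append((task_name, args))

def process_requests(recipients):
    # Mirrors the loop in ProcessRequests.run: each send is queued,
    # never awaited, so the outer task does not block on the inner one.
    for number in recipients:
        delay("SendMessage", number)

process_requests(["555-0100", "555-0101"])
# Both sends are now queued; process_requests never blocked on them.
print(len(broker_queue))  # → 2
```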
send is something like this:
def send(self, message):
    """
    Use this gateway to send a message.

    If ``djcelery`` is installed, then we assume they have set up the
    ``celeryd`` server, and we queue for delivery. Otherwise, we will
    send in-process.

    .. note::
        It is strongly recommended to run this out of process,
        especially if you are sending as part of an HttpRequest, as this
        could take ~5 seconds per message that is to be sent.
    """
    if 'djcelery' in settings.INSTALLED_APPS:
        import sms.tasks
        sms.tasks.SendMessage.delay(message.pk, self.pk)
    else:
        self._send(message)
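That queue-if-available, otherwise-send-inline branch can be exercised in isolation. Here is a minimal sketch with stand-in functions (the names queue_send and send_in_process are hypothetical, not part of the real sms app):

```python
sent_async = []
sent_sync = []

def queue_send(message_pk):
    # Stand-in for sms.tasks.SendMessage.delay(message.pk, self.pk)
    sent_async.append(message_pk)

def send_in_process(message_pk):
    # Stand-in for self._send(message)
    sent_sync.append(message_pk)

def send(message_pk, installed_apps):
    # Same branch as Gateway.send: queue when djcelery is available,
    # otherwise fall back to sending in-process.
    if 'djcelery' in installed_apps:
        queue_send(message_pk)
    else:
        send_in_process(message_pk)

send(1, ['django.contrib.auth', 'djcelery'])  # queued
send(2, ['django.contrib.auth'])              # sent inline
```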