I want to experiment with grouping tasks from different Celery applications running on different/remote servers/machines.
So for example, in application 1, running on server 1, I have this task:
@app.task(name='func1', bind=True, queue='func1')
def func1(self, args):
    ret = {"func1_status": "success"}
    return ret
In application 2, running on server 2, I have this task:
@app.task(name='func2', bind=True, queue='func2')
def func2(self, args):
    ret = {"func2_status": "success"}
    return ret
Now in application 3, running on server 3, I want to have a task that uses the two remote tasks in a group:
from celery import chain, group
from celery.result import allow_join_result

@app.task(name='func3', bind=True, queue='func3')
def func3(self, args):
    ret = None
    res = group(func2.s(args), func1.s(args))()
    with allow_join_result():
        ret = res.get()
    return ret
Then I would simply call the task via func3 like this:
res = func3.delay(args)
The above is what I want to do, but I don't know how I can import/call the func1 and func2 tasks so they can be used remotely in the group inside func3.
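For clarity, here is a rough sketch of what I imagine func3 might look like if the remote tasks can be referenced by name instead of being imported. The signature()/.set(queue=...) calls, and the assumption that the task names and queues match what applications 1 and 2 registered, are my guesses and not something I have verified:

from celery import group
from celery.result import allow_join_result

@app.task(name='func3', bind=True, queue='func3')
def func3(self, args):
    # Build signatures by task name so the task functions do not need to be imported;
    # .set(queue=...) is intended to route each call to the queue its remote worker consumes.
    sig1 = self.app.signature('func1', args=(args,)).set(queue='func1')
    sig2 = self.app.signature('func2', args=(args,)).set(queue='func2')
    res = group(sig2, sig1)()
    with allow_join_result():
        return res.get()

I am not sure whether calling tasks by name like this is the right way to reach tasks registered in a different application, or whether there is a better approach.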
Any ideas are greatly appreciated. Thanks in advance!