
I'm trying to make a simple RPC server with SimpleXMLRPCServer and Celery. The idea is that a remote client (client.py) calls tasks via xmlrpc.client on the server (server.py), which exposes functions registered as Celery tasks (runnable.py).

The problem is that when an RPC function is registered via register_function, I can call it directly by name and it executes properly, but without going through Celery. What I would like is to call it via name.delay() from client.py, so that it is executed by Celery without blocking the server thread. So server.py should act like a proxy and allow multiple clients to run complete sets of functions like:

for task in flow:
    job = globals()[task]
    result = job.delay("some arg")
    while True:
        if result.ready():
            break

I've tried using register_instance with allow_dotted_names=True, but I ran into an error:

xmlrpc.client.Fault: <Fault 1: "<class 'TypeError'>:cannot marshal <class '_thread.RLock'> objects">
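
For reference, the failing attempt was along these lines (a sketch; the registered object is simplified):

# server.py - the register_instance attempt
from xmlrpc.server import SimpleXMLRPCServer
import modules.runnable as runnable

server = SimpleXMLRPCServer(("127.0.0.1", 8000))
# Dotted names expose everything reachable from the module,
# including the Celery task objects and whatever delay() returns
server.register_instance(runnable, allow_dotted_names=True)
server.serve_forever()

The Fault is raised because delay() returns an AsyncResult, and SimpleXMLRPCServer then tries to marshal that object (whose internals include an _thread.RLock) back to the client, which XML-RPC cannot do.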

Which led me to the question of whether it's even possible to do something like this.

Simplified code:

server.py

# ...runnable.py import
# ...rpc init
def register_tasks(self):
    # Expose each Celery task as a plain XML-RPC function under its bare name
    for task in get_all_tasks():
        setattr(self, task, globals()[task])
        self.server.register_function(getattr(self, task), task)
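
The elided parts of server.py boil down to something like this (a sketch; the class name and port are placeholders):

from xmlrpc.server import SimpleXMLRPCServer
from modules.runnable import *  # brings the tasks and get_all_tasks into globals()

class RPCServer:
    def __init__(self):
        # self.server is what register_tasks() registers functions on
        self.server = SimpleXMLRPCServer(("127.0.0.1", 8000))
        self.register_tasks()
        self.server.serve_forever()

    # register_tasks() as defined above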

runnable.py

from celery import Celery

app = Celery("tasks", backend="amqp", broker="amqp://")

@app.task()
def say_hello():
    return "hello there"

@app.task()
def say_goodbye():
    return "bye, bye"

def get_all_tasks():
    tasks = app.tasks
    runnable = []

    for t in tasks:
        if t.startswith("modules.runnable"):
            runnable.append(t.split(".")[-1])

    return runnable
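
Assuming this module lives at modules/runnable.py (so the registered task names start with "modules.runnable"), get_all_tasks() returns the bare function names:

>>> from modules.runnable import get_all_tasks
>>> sorted(get_all_tasks())
['say_goodbye', 'say_hello']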

Finally, client.py

import xmlrpc.client

s = xmlrpc.client.ServerProxy("http://127.0.0.1:8000")
print(s.say_hello())

1 Answer


I've come up with an idea that creates extra wrappers for the Celery delay functions. These are registered so that the RPC client can call rpc.the_remote_task.delay(*args). The call returns the Celery job ID; the client then asks whether the job is ready via rpc.ready(job_id) and fetches the result with rpc.get(job_id). For now there's an obvious security hole, since anyone who knows a job ID can fetch its result, but still - it works fine.

Registering tasks (server.py)

def register_tasks(self):
    for task in get_all_tasks():
        # Generate a named wrapper for each task at runtime; note the
        # indented "return", so the generated source is valid Python
        exec("""def """ + task + """_runtime_task_delay(*args):
    return celery_wrapper(""" + task + """, "delay", *args)
setattr(self, task + "_delay", """ + task + """_runtime_task_delay)
""")

        f_delay = task + "_delay"
        self.server.register_function(getattr(self, f_delay), task + ".delay")

    def job_ready(jid):
        return celery_wrapper(None, "ready", jid)

    def job_get(jid):
        return celery_wrapper(None, "get", jid)

    setattr(self, "ready", job_ready)
    setattr(self, "get", job_get)

    self.server.register_function(job_ready, "ready")
    self.server.register_function(job_get, "get")
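
As a side note, the exec is only there to give each wrapper a distinct name; the same registration works with a closure factory, which avoids generating source text (an untested sketch of the alternative):

def make_delay_wrapper(task_func):
    # Bind the concrete task object in a closure instead of exec'ing source
    def _delay(*args):
        return celery_wrapper(task_func, "delay", *args)
    return _delay

def register_tasks(self):
    for task in get_all_tasks():
        wrapper = make_delay_wrapper(globals()[task])
        setattr(self, task + "_delay", wrapper)
        self.server.register_function(wrapper, task + ".delay")
    # job_ready / job_get registration stays the same as above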

The wrapper (server.py)

def celery_wrapper(task, method, *args):
    if method == "delay":
        # Queue the task and hand the Celery job ID back to the client
        job = task.delay(*args)
        job_id = job.id

        return job_id
    elif method == "ready":
        # Look the job up by ID and report whether it has finished
        res = app.AsyncResult(args[0])
        return res.ready()
    elif method == "get":
        # Fetch the result of a finished job
        res = app.AsyncResult(args[0])
        return res.get()
    else:
        return "0"

And the RPC call (client.py)

import time

jid = s.the_remote_task.delay("arg1", "arg2")
is_running = True
while is_running:
    is_running = not s.ready(jid)

    if not is_running:
        print(s.get(jid))
    time.sleep(.01)