
I want to handle clients in several processes, where each process serves many client connections asynchronously. The code I have at the moment:

import os, socket
import queue
import asyncio

from multiprocessing import Process, Queue

async def process_client(client: socket.socket) -> None:
    loop = asyncio.get_event_loop()

    data = await loop.sock_recv(client, 256)

def main_process(q: Queue) -> None:
    loop = asyncio.get_event_loop()

    while True:
        try:
            client, addr = q.get(timeout=1)
            loop.create_task(process_client(client))
        except queue.Empty:
            pass

def main() -> None:
    server_ip = os.environ['SERVER_IP']
    server_port = int(os.environ['SERVER_PORT'])

    q = Queue()
    
    for _ in range(8):
        worker = Process(target=main_process, args=(q, ))
        worker.start()

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((server_ip, server_port))
        server.listen(100)

        while True:
            client, addr = server.accept()
            q.put((client, addr))

if __name__ == '__main__':
    main()

loop.create_task schedules the task, but the task never runs, because the while loop and q.get(timeout=1) block the thread, so the event loop never gets control. If the multiprocessing queue were asynchronous, I would be able to write, for example: client, addr = await q.get()
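
Here is a minimal, self-contained sketch of the problem as I understand it (hypothetical toy code, not my server):

import asyncio

async def hello() -> None:
    print('task ran')

loop = asyncio.new_event_loop()
task = loop.create_task(hello())  # only schedules the coroutine

# Nothing is printed: the loop is never given control. The task would
# only execute if the loop actually ran, e.g. loop.run_until_complete(task)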

Jurddox
  • Have you actually tested the above code? Sockets aren't serializable, so you cannot just put them on the queue anyway, at least not the way you expect. – Ahmed AEK Mar 18 '23 at 09:27
  • Yes, I know that a socket is a kernel object that is only available in the process in which it was created. I tested this code; I just ran the task through asyncio.run and everything worked. It looks like the multiprocessing queue does not only transfer data between processes, but also duplicates kernel objects – Jurddox Mar 18 '23 at 09:40
  • That depends on the platform; at least on Windows you need to call [socket.share](https://docs.python.org/3/library/socket.html#socket.socket.share) to do that. I think there's something similar for Linux. – Ahmed AEK Mar 18 '23 at 09:49
  • As I already said, the code works, just synchronously, so I don't have to do this. Now the question is how to achieve asynchrony. – Jurddox Mar 18 '23 at 09:54

1 Answer


You can call a blocking function from within an asyncio event loop using loop.run_in_executor:

client, addr = await loop.run_in_executor(None, functools.partial(q.get, timeout=1))

This needs to happen inside an async function, so you need to define one:

import functools, queue  # additional imports needed below

async def main_process_async(q: Queue) -> None:
    loop = asyncio.get_running_loop()

    while True:
        try:
            # q.get runs in a thread-pool thread, so the event loop
            # stays free to actually run the scheduled tasks
            client, addr = await loop.run_in_executor(
                None, functools.partial(q.get, timeout=1))
            loop.create_task(process_client(client))
        except queue.Empty:
            pass

def main_process(q: Queue) -> None:
    asyncio.run(main_process_async(q))
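
This works because run_in_executor hands the blocking q.get call to a thread-pool thread; while main_process_async is suspended on the await, the event loop is free to run the process_client tasks it has scheduled.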

This may still fail, because socket objects cannot be pickled directly on all platforms; there are platform-dependent methods for sharing sockets, such as socket.share on Windows.
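
As a rough sketch of the Windows mechanism (untested in this exact setup; client, worker, q and addr are the names from your code, and share needs the pid of the process that will receive the socket, which the current queue design does not track):

# parent process: serialize the accepted socket for a specific child
shared = client.share(worker.pid)  # bytes, safe to pickle and queue
q.put((shared, addr))

# child process: rebuild a usable socket from the shared bytes
shared, addr = q.get()
client = socket.fromshare(shared)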

Ahmed AEK
  • Thanks, it really worked! But I think the timeout=1 in await loop.run_in_executor(None, functools.partial(q.get, timeout=1)) is superfluous here; you can remove it and also remove the try block (see the sketch below). – Jurddox Mar 18 '23 at 10:58
  • I use Windows and everything works for me without socket.share. – Jurddox Mar 18 '23 at 10:59
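
A minimal sketch of the simplified loop described in the comment above (assuming it is acceptable for q.get to block an executor thread indefinitely):

async def main_process_async(q: Queue) -> None:
    loop = asyncio.get_running_loop()

    while True:
        # q.get blocks a thread-pool thread, not the event loop, so no
        # timeout or queue.Empty handling is needed here
        client, addr = await loop.run_in_executor(None, q.get)
        loop.create_task(process_client(client))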