In the code I am writing I have both threads and multiple processes created via fork:
- threads for a websocket connection (and some other background tasks)
- multiprocessing with the fork start method to create a process with isolated memory which can be reloaded
This resulted in a hanging process once in a while. I learned that mixing the two is a bad idea, since the forked process can end up waiting for a non-existent thread to give up its lock (a minimal demonstration follows the links below):
- https://pythonspeed.com/articles/python-multiprocessing/
- https://rachelbythebay.com/w/2011/06/07/forked/
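As far as I understand it, this is roughly the failure mode those articles describe. A minimal POSIX-only sketch (the `background_worker`/`child` names are just placeholders, not my real code):

```python
import multiprocessing
import threading
import time

lock = threading.Lock()
holding = threading.Event()

def background_worker():
    # Simulates a background thread that happens to hold a lock at fork time.
    with lock:
        holding.set()
        time.sleep(5)  # lock stays held while the main thread forks

def child():
    # Only the thread that called fork() exists in the child, so a lock that
    # was copied in the "held" state will never be released here.
    print("child: trying to acquire the lock...")
    lock.acquire()
    print("child: got the lock")  # never reached while the parent thread held it at fork time

if __name__ == "__main__":
    threading.Thread(target=background_worker, daemon=True).start()
    holding.wait()  # make sure the lock is actually held before forking
    p = multiprocessing.get_context("fork").Process(target=child)
    p.start()
    p.join(timeout=2)
    print("child still alive after 2s:", p.is_alive())  # True -> it hung on the lock
    p.terminate()
```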
I am thinking of rewriting my code to use asyncio for the background tasks instead, but I am not sure whether this solves my problem, since I am not familiar with how asyncio works under the hood. Does asyncio use locks to perform the context switching? Are all coroutines inherited by the forked process? Could the forked process get stuck somehow?
The linked articles suggest a few alternatives to solve this issue, which are not applicable in my case:
- use spawn instead of fork, so the current memory is not copied and the child starts as a fresh process (sketched below for reference)
- fork before the threads are spun off
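For reference, the spawn variant I mean would look roughly like this (just a sketch; `worker` is a placeholder):

```python
import multiprocessing

def worker():
    # runs in a fresh interpreter: nothing from the parent's memory is inherited
    print("hello from the spawned process")

if __name__ == "__main__":
    ctx = multiprocessing.get_context("spawn")
    p = ctx.Process(target=worker)
    p.start()
    p.join()
```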