I'm trying to attach an event listener to a Python subprocess that is called whenever the process closes/exits. The callback will respawn the subprocess and attach the same event listener to the new one. Something akin to calling `.on` from `child_process` in Node.js.
I'm aware of this thread already, but I'd ideally like to do this using `asyncio.subprocess` instead of running each event listener in a separate thread.
I was thinking of something like this:

```python
from asyncio import subprocess as asubprocess
from typing import Awaitable, Callable

async def handle_close(proc: asubprocess.Process, spawner: Callable[[], Awaitable[asubprocess.Process]]):
    while True:
        # Wait for the current process to exit, then respawn it.
        await proc.wait()
        proc = await spawner()
```
(This function could also be extended to take a list of processes and their respective spawners and `asyncio.wait` on all of them; once any of them stops, it is respawned and the cycle repeats. See the sketch below.)
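A minimal sketch of that multi-process variant, assuming each process comes paired with its spawner as a tuple (`handle_close_many` is just an illustrative name, not something I have working):

```python
import asyncio

async def handle_close_many(procs_and_spawners):
    # Map each pending wait() task back to its spawner so we know
    # which subprocess to respawn when it exits.
    pending = {
        asyncio.ensure_future(proc.wait()): spawner
        for proc, spawner in procs_and_spawners
    }
    while pending:
        done, _ = await asyncio.wait(pending, return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            spawner = pending.pop(task)
            new_proc = await spawner()
            pending[asyncio.ensure_future(new_proc.wait())] = spawner
```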
The `proc` argument would be the process returned by `asubprocess.create_subprocess_exec`, and `spawner` would be an async function that respawns that subprocess (or any other callback).
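For concreteness, a spawner could be as thin as a wrapper around `create_subprocess_exec`; the `my_worker` command and `spawn_worker` name here are placeholders, not real names from my code:

```python
from asyncio import subprocess as asubprocess

async def spawn_worker() -> asubprocess.Process:
    # Placeholder command; substitute the real executable and args.
    return await asubprocess.create_subprocess_exec("my_worker", "--some-flag")

async def main():
    proc = await spawn_worker()
    # Keep this one worker alive indefinitely.
    await handle_close(proc, spawn_worker)
```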
The only issue is that I don't know how to run this in the background without polluting my entire codebase with `async`. Ideally, there would be multiple subprocesses, each with its own `handle_close` running in the background, and the caller code should not be blocked while all of these handlers are running.
In my own use case, I don't need the process handles, so those can be discarded. As long as the processes keep running (respawning whenever they stop) and the caller code that spawns them is free to do other things, it's fine.
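The closest I've come up with (and I'm not sure it's idiomatic) is to confine all the async machinery to a single daemon thread that hosts its own event loop, reusing the hypothetical `spawn_worker` from above:

```python
import asyncio
import threading

async def supervise_all():
    # Spawn every worker (3 is just an example count), then keep
    # each one alive with its own handle_close().
    procs = [await spawn_worker() for _ in range(3)]
    await asyncio.gather(*(handle_close(p, spawn_worker) for p in procs))

# One daemon thread hosts the event loop; the caller continues immediately.
# (Note: asyncio subprocess support off the main thread may need extra
# setup on some Python versions/platforms.)
threading.Thread(target=asyncio.run, args=(supervise_all(),), daemon=True).start()
```

But that reintroduces a thread, and I'm not sure how safe asyncio subprocesses are off the main thread, hence this question.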