
I have a loop (all of this is in Python 3.10) that runs relatively fast compared to a function that needs to consume data from the loop. I don't want to slow the loop down, and I'm trying to find a way to run the function asynchronously but only start it again after the previous run has completed... basically:

import time
import multiprocessing

queue = []

def flow():
    # take the oldest item, simulate half a second of work, then remove it
    thing = queue[0]
    time.sleep(.5)
    print(str(thing))
    queue.pop(0)

p1 = multiprocessing.Process(target=flow)

while True:
    print('stream')
    time.sleep(.25)
    if len(queue) < 1:
        print('emptyQ')
        queue.append('flow')
        p1.start()


I've tried running the function in a thread and in a process, and both seem to start another run while the function is still going. I also tried using a queue both to pass the data and as a sort of semaphore, by not removing the item until the end of the function and only adding an item (and starting the thread or process) when the queue was empty, but that didn't seem to work either.
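
To illustrate the kind of guard I'm after, here is a minimal, untested sketch using a thread and Thread.is_alive() as the "still running" check (the names are placeholders, not my real code):

import threading
import time

def flow(thing):
    # slow consumer: pretend each item takes half a second to handle
    time.sleep(.5)
    print(thing)

worker = None

while True:
    print('stream')
    time.sleep(.25)
    # only start another run once the previous one has finished;
    # a Thread object can't be restarted, so a new one is created each time
    if worker is None or not worker.is_alive():
        worker = threading.Thread(target=flow, args=('flow',))
        worker.start()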

EDIT: to add an explicit question... Is there a way to execute a function asynchronously without executing it multiple times simultaneously?

EDIT2: Updated with functional test code (which accurately reproduces the failure), since the real code is a bit more substantial... I have noticed that it seems to work on the first execution of the function (although the print inside the function doesn't show...), but the next execution fails for whatever reason. I assume it tries to load the process twice?

The error I get is: RuntimeError: An attempt has been made to start a new process before the current process has finished its bootstrapping phase...
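
As far as I can tell, that RuntimeError is the one the multiprocessing docs describe when process-starting code isn't protected by the if __name__ == '__main__': idiom (required with the spawn start method, the default on Windows and macOS). A minimal sketch of just that idiom, with the data passed as an argument since a child process doesn't share the parent's list:

import time
import multiprocessing

def flow(thing):
    # runs in a separate process, so it can't see the parent's queue list;
    # the item is passed in as an argument instead
    time.sleep(.5)
    print(thing)

if __name__ == '__main__':
    # creating and starting processes only under the main guard avoids the
    # "bootstrapping phase" RuntimeError when the child re-imports this module
    p1 = multiprocessing.Process(target=flow, args=('flow',))
    p1.start()
    p1.join()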

kmdewey
  • I think the semaphore should have worked. Show your actual code and we'll help you get it working. – Barmar Mar 03 '22 at 01:52
  • @Barmar edited with functional test code and added some details. Thanks for the help! – kmdewey Mar 03 '22 at 04:26
  • I think you need to use multithreading, not multiprocessing. Multiprocessing gives each process its own memory, so the queue is not shared. And where is the semaphore? – Barmar Mar 03 '22 at 15:29
  • I tried it with threading... With the "queue" just referenced from inside the function, it works perfectly the first time and doesn't throw an error, but it crashes out before the second function call starts (I added a print and a wait after the loop and they don't execute). With the queue passed as an argument to the function, it treats the argument as the appended value itself and faults on queue.pop(0), because queue is the string "flow" (it also prints "f" for print(str(queue[0]))) instead of a list whose 0th element is the string "flow". – kmdewey Mar 03 '22 at 21:35
  • @Barmar I thought I was using the queue kind of like a semaphore... sorry if I'm using that term wrong, I'm far from an expert on coding... – kmdewey Mar 03 '22 at 21:39
  • I understand that, but you need some kind of mutual exclusion to mediate access to the variable between the threads. Python has real semaphores: https://docs.python.org/3/library/asyncio-sync.html#asyncio.Semaphore – Barmar Mar 03 '22 at 21:41
  • @Barmar So I added a lock (since my limit is only 1) by replacing the if statement with "if not lock.locked():" and then acquiring at the start of my function and releasing at the end. I'm now getting a fault that seems to relate to me not implementing this properly ("lock.acquire was never awaited")... I feel like I should go back to the drawing board, since I seem to be getting farther away from functioning code :( (see the sketch after these comments) – kmdewey Mar 03 '22 at 22:36
  • While I know the concepts, I don't actually have any expertise in using them in Python. – Barmar Mar 03 '22 at 23:50
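
For reference, a minimal sketch of the lock-based idea from these comments, assuming threading.Lock rather than the linked asyncio.Semaphore (the "lock.acquire was never awaited" warning suggests an asyncio lock was used outside an event loop, where it has to be awaited):

import threading
import time

lock = threading.Lock()

def flow(thing):
    # hold the lock for the duration of the work so the main loop can
    # tell whether a previous run is still in progress
    with lock:
        time.sleep(.5)
        print(thing)

while True:
    print('stream')
    time.sleep(.25)
    # skip this round if the previous run is still holding the lock;
    # even if two threads slip through in a tight race, the lock still
    # prevents them from doing the work simultaneously
    if not lock.locked():
        threading.Thread(target=flow, args=('flow',)).start()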
