
I can't quite find a solution for passing each worker a shared queue plus a unique number per worker.

My code:

The idea is to create several channels for playing audio songs. Each channel must be unique. So when a song arrives, I put it on whichever channel is available.

from multiprocessing import Pool, Queue
from functools import partial
import pygame

queue = Queue()


def play_song(shared_queue, chnl):
    channel = pygame.mixer.Channel(chnl)
    while True:
        sound_name = shared_queue.get()
        channel.play(pygame.mixer.Sound(sound_name))


if __name__ == "__main__":
    channels = [0, 1, 2, 3, 4]

    func = partial(play_song, queue)
    p = Pool(5, func, (channels,))

This code of course doesn't raise any visible error, because it's multiprocessing, but the problem is that `channels` is passed to `play_song` as a whole list instead of being mapped across the workers.

So basically, instead of each worker initializing its channel like this:

channel = pygame.mixer.Channel(0) # each worker would get one number from the list: 0, 1, 2, 3, 4

I am getting this

channel = pygame.mixer.Channel([0,1,2,3,4]) # for each worker

I tried playing with `partial`, but without success.

I did have some success with `pool.map`: it passed the individual numbers from the `channels` list, but then I couldn't share the `Queue` among the workers.
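For context on the `pool.map` failure: as far as I can tell, a `multiprocessing.Queue` refuses to be pickled outside of process creation, which is why it can't be handed over as a regular `map` argument. A minimal demonstration of that restriction:

```python
import pickle
from multiprocessing import Queue

# Trying to pickle the queue directly (which is what pool.map would
# have to do to send it to a worker) raises a RuntimeError.
try:
    pickle.dumps(Queue())
    print("pickled fine")
except RuntimeError as err:
    print(err)  # explains that queues are shared through inheritance only
```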

  • Pool already uses queues internally, what's the idea behind stuffing another one into it? – Darkonaut Jan 27 '19 at 15:39
  • I am not sure what you mean. https://docs.python.org/2/library/multiprocessing.html in 16.6.1.2 they also use the way I do – Martin Jan 27 '19 at 20:58
  • 1
    They don't use `Pool` in the docs but `Process`. You can build your own pool with `Process` and `Queues` or you just use `multiprocessing.Pool` which handles the plumbing for you in the background. When you use a pool-method like `pool.map` the worker-processes are fed over internal queues so if you want to pass something to your workers, you only have to use a pool-method. – Darkonaut Jan 27 '19 at 21:14
  • 1
  • I'm not sure you should use multiprocessing here. Have you tried multithreading before? I'm not familiar with pygame but I would expect it to call into some C lib for the sound processing. If most of the time is spent outside of Python, using a ThreadPool would be enough. – Darkonaut Jan 27 '19 at 21:28
  • I already solved my problem due to the Pygame functionality without using threads or multiprocessing. When I have time, I will post both solutions, with pygame and with your advice. Thanks – Martin Jan 28 '19 at 18:37
  • 1
    I'm glad you were able to resolve this. There's one important thing about pool-workers I would have addressed in an answer else: you cannot guarantee that every single pool-worker will get tasks at all ([more](https://stackoverflow.com/a/53746242/9059420)). Hence you cannot use `pool.map()` to pass unique channel number to every single worker-process. For this part you would indeed need the `initializer`-parameter with a shared `multiprocessing.Value` counter for example. – Darkonaut Jan 28 '19 at 18:57
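The `initializer` + shared `multiprocessing.Value` counter idea from the last comment could be sketched like this (a minimal, hypothetical demonstration: each pool worker atomically claims the next channel number at start-up and reports it back through a queue so we can inspect the assignment; the names `init_worker` and `claim_channel_numbers` are made up for the example):

```python
import multiprocessing as mp


def init_worker(counter, report_queue):
    # Runs once in every worker when the pool starts:
    # atomically claim the next free channel number.
    with counter.get_lock():
        chnl = counter.value
        counter.value += 1
    # Report which number this worker got (a real worker would instead
    # keep `chnl` in a global and open pygame.mixer.Channel(chnl)).
    report_queue.put(chnl)


def claim_channel_numbers(n_workers):
    counter = mp.Value("i", 0)
    report = mp.Queue()
    pool = mp.Pool(n_workers, initializer=init_worker,
                   initargs=(counter, report))
    numbers = sorted(report.get() for _ in range(n_workers))
    pool.close()
    pool.join()
    return numbers


if __name__ == "__main__":
    print(claim_channel_numbers(5))  # each of the 5 workers gets a unique number
```

Note that the queue and the counter reach the workers through `initargs`, i.e. through inheritance at process creation, which sidesteps the pickling restriction.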

1 Answer


Eventually I found a solution to my pygame problem that requires neither threads nor multiprocessing.


Background to the problem:

I was working with PyAudio, and since it is quite a low-level API for audio, I had problems mixing several sounds at the same time. The reasons:

1) It is not easy (maybe impossible) to start several streams at the same time or feed those streams at the same time (it looks like a hardware limitation).

2) Because of 1) I tried a different approach: have one stream where the audio waves from different sounds are summed before entering the stream. That works, but it's unreliable, because summing waveforms has limits: adding too many waves results in 'sound cracking' as the amplitudes get too high.

Because of 1) and 2) I wanted to try running streams in different processes, hence this question.
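Point 2) can be illustrated with plain integer samples (a toy sketch, not real DSP code): naively summing two loud 16-bit samples overflows the representable range, and clamping is the crude fix that trades overflow for distortion.

```python
# 16-bit signed audio samples live in the range [-32768, 32767].
MAX_16BIT = 32767
MIN_16BIT = -32768


def mix(a, b):
    # Naive mix: just add the samples; loud inputs overflow the range.
    return a + b


def mix_clipped(a, b):
    # Clamp the sum into the representable range instead of overflowing.
    return max(MIN_16BIT, min(MAX_16BIT, a + b))


print(mix(30000, 30000))          # 60000 -- outside the 16-bit range
print(mix_clipped(30000, 30000))  # 32767 -- clamped, audibly distorted
```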


Pygame solution (single process):

for sound in sounds:  # sounds are preloaded pygame.mixer.Sound objects
    available_channel = pygame.mixer.find_channel()  # with 8 channels, up to 8 sounds can play at once
    available_channel.play(sound)

If the sounds are already loaded, this gives near-simultaneous playback.

Multiprocessing solution

Thanks to Darkonaut, who pointed out the multiprocessing method, I managed to answer my initial question about multiprocessing. It is probably already answered somewhere on Stack Overflow, but I will include it.

The example is not finished, because I didn't use it in the end, but it answers my initial requirement: processes with a shared queue but different parameters.

import multiprocessing as mp

shared_queue = mp.Queue()


def channel(que, channel_num):
    que.put(channel_num)


if __name__ == '__main__':
    # Every process inherits the same queue but gets its own number.
    processes = [mp.Process(target=channel, args=(shared_queue, channel_num))
                 for channel_num in range(8)]

    for p in processes:
        p.start()

    for i in range(8):
        print(shared_queue.get())

    for p in processes:
        p.join()
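Extending that skeleton toward the original use case, one possible shape (a sketch with made-up names; the `results` queue stands in for actual pygame playback) is a set of workers that all consume song names from one shared queue while each holds its own channel number, with `None` sentinels to shut them down:

```python
import multiprocessing as mp


def worker(jobs, results, chnl):
    # Same `jobs` queue in every process, but a unique channel number each.
    while True:
        song = jobs.get()
        if song is None:              # sentinel: time to exit
            break
        results.put((chnl, song))     # stand-in for channel.play(...)


def play_all(songs, n_workers=4):
    jobs, results = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(jobs, results, c))
             for c in range(n_workers)]
    for p in procs:
        p.start()
    for song in songs:
        jobs.put(song)
    for _ in procs:
        jobs.put(None)                # one sentinel per worker
    played = [results.get() for _ in songs]
    for p in procs:
        p.join()
    return played


if __name__ == "__main__":
    print(sorted(s for _, s in play_all(["a.ogg", "b.ogg", "c.ogg"])))
```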