I have built an application (Python 3.6) that keeps checking a directory for incoming files; when files arrive, it spawns some Python processes to work on them. This work involves DB calls, and as of now there is no connection pooling. In the near future the load is going to grow so heavy that we can't sustain it without a DB connection pool. Is there a way to share a connection pool across multiple processes in Python? I have gone through the Python documentation and Stack Overflow but didn't find anything solid. At a high level, this is how I want it to work ....
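For context, a minimal sketch of the setup the question describes, assuming psycopg2 as the DB driver; the watch directory, polling interval, and the query inside `process_file()` are illustrative placeholders, not the asker's actual code:

```python
# Sketch of the current pattern: poll a directory, spawn a process per file,
# and open/close a DB connection inside each worker (the overhead the
# question wants to avoid). Assumes psycopg2; any DB-API driver is similar.
import os
import time
from multiprocessing import Process

import psycopg2

WATCH_DIR = "/tmp/incoming"  # hypothetical directory being polled


def process_file(path):
    # One connection per file: opened here, closed at the end.
    conn = psycopg2.connect(dbname="app", user="app")
    try:
        with conn.cursor() as cur:
            cur.execute("INSERT INTO processed_files (name) VALUES (%s)", (path,))
        conn.commit()
    finally:
        conn.close()


def main():
    while True:
        for name in os.listdir(WATCH_DIR):
            path = os.path.join(WATCH_DIR, name)
            # Real code would also move or record the file so it is not
            # picked up again on the next poll.
            Process(target=process_file, args=(path,)).start()
        time.sleep(5)


if __name__ == "__main__":
    main()
```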
- In short, no. You can't share a single (network) connection between multiple processes. It's not clear to me, though, why you can't have e.g. a *single* DB connection in each child process. It's also not clear what you mean by "... load is going to go so heavy that we can't sustain without db connection pool". What load are you talking about, and why do you think a DB connection pool will help? – Tom Dalton Oct 10 '19 at 22:02
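A rough sketch of that suggestion: each child process opens one connection and reuses it for every file it is given, instead of connecting per file. psycopg2 and the batch-splitting scheme here are assumptions, not part of the question:

```python
# One connection per child process: each worker handles a batch of files on
# a single connection instead of reconnecting for every file.
from multiprocessing import Process

import psycopg2


def process_batch(paths):
    conn = psycopg2.connect(dbname="app", user="app")  # one connection per child
    try:
        for path in paths:
            with conn.cursor() as cur:
                cur.execute("INSERT INTO processed_files (name) VALUES (%s)", (path,))
        conn.commit()
    finally:
        conn.close()


def spawn_workers(all_paths, n_workers=4):
    # Split the pending files into n_workers batches; each child reuses its
    # single connection for its whole batch.
    batches = [all_paths[i::n_workers] for i in range(n_workers)]
    procs = [Process(target=process_batch, args=(b,)) for b in batches if b]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```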
- Normally you can't do that... maybe you can explore an avenue with gevent (asynchronous jobs)? – Narcisse Doudieu Siewe Oct 10 '19 at 22:03
- @TomDalton Today I am opening and closing a connection to process each file, but later the number of files is going to be huge and it wouldn't make sense to open/close a connection for each file. – Varun Maurya Oct 14 '19 at 15:31
- Can you have a connection (or pool) per worker process? – Tom Dalton Oct 19 '19 at 17:28
- @TomDalton The problem is that I keep creating new processes: the main process gets the work, then creates n worker processes and distributes the work to them. – Varun Maurya Oct 25 '19 at 19:00
- I'd probably use a shared queue (e.g. https://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes); then you can spawn M worker processes, each of which takes a job from the queue, processes it, and repeats until the queue is marked as done. Or you can use a pool, which might avoid the need to explicitly create/use a queue at all, e.g. https://docs.python.org/2/library/multiprocessing.html#using-a-pool-of-workers – Tom Dalton Oct 26 '19 at 19:08
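A sketch of the pool-of-workers idea from that comment, assuming psycopg2; the `init_worker`/`handle_file` names and the query are hypothetical. Each worker process opens its own connection once, via the `Pool` initializer, and reuses it for every file it is handed:

```python
# Long-lived worker pool: one connection per worker process, created once in
# the initializer and reused for every job that worker receives.
from multiprocessing import Pool

import psycopg2

_conn = None  # per-worker connection, set up once in each worker process


def init_worker():
    global _conn
    _conn = psycopg2.connect(dbname="app", user="app")


def handle_file(path):
    # Runs inside a worker; the connection already exists and is reused.
    with _conn.cursor() as cur:
        cur.execute("INSERT INTO processed_files (name) VALUES (%s)", (path,))
    _conn.commit()
    return path


def run(paths, n_workers=4):
    with Pool(processes=n_workers, initializer=init_worker) as pool:
        for done in pool.imap_unordered(handle_file, paths):
            print("finished", done)
```

The point is that each worker keeps its single connection for its whole lifetime, so the per-file connect/disconnect cost disappears without having to share one pool object across processes.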