I am new to Python and to its multiprocessing module (this is my first Python project).
I have written a few modules and wired them together to run sequentially. I now have a requirement to speed a few things up.
What I want to achieve is:
module_one.py
    Read a JSON file and store it as a dict (a plain dict or a multiprocessing.Manager().dict())
    Call module_two.method()
module_two.py
    -- some methods for business logic --
    multiprocessing.Process(target=module_three.method)
module_three.py
    def method():
        multiprocessing.Process(target=module_four.method)
module_four.py
    def method():
        Should access the dict that was created in module_one,
        i.e. a global dict that multiple processes can access
        -- more business logic and data transformations --
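Condensed into a single runnable file, what I am attempting looks roughly like this (the function names and the JSON path are placeholders, and I have folded the four modules into one block for readability):

```python
import json
import multiprocessing

SHARED = {}  # the dict that module_one fills and the other modules hope to read

def module_four_method():
    # With the "spawn" start method (the default on Windows and macOS),
    # this process re-imports the file, so SHARED is empty here even
    # though the parent filled it.
    print("module_four sees:", SHARED)

def module_three_method():
    p = multiprocessing.Process(target=module_four_method)
    p.start()
    p.join()

def module_two_method():
    # ... business logic ...
    p = multiprocessing.Process(target=module_three_method)
    p.start()
    p.join()

if __name__ == "__main__":
    # module_one: read the JSON and store it in the "global" dict
    with open("input.json") as f:  # placeholder filename
        SHARED.update(json.load(f))
    module_two_method()
```

This prints an empty dict in module_four, which is the symptom I describe below.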
Note:
- I am constrained not to use any frameworks like Flask; otherwise I could have tried Flask's g object to store things globally.
- I am constrained not to use any external caching mechanisms such as Memcached or Redis.
To reduce the overhead, I tried combining modules three and four into one. That did not help either: the dict in module_four (or module_three) is always empty.
My questions are:
- Is it possible to achieve what I have described above?
- If it is not possible, what are the alternative ways to handle my requirements?
I have browsed Stack Overflow and other forums extensively. I found many single-module examples where the dict is created at module level or inside a class and then passed as an argument to the spawned processes. Based on those examples, it looks like I should pass the dict from module_one to module_two and so on, all the way down to module_four. I felt there might be a better approach than threading the dict through every module, hence this question.
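For reference, here is a minimal sketch of that pass-it-down pattern, extended through a nested process with a Manager dict (the names and the data are made up, not my real code):

```python
import multiprocessing

def module_four_method(shared):
    # The proxy talks to the Manager's server process, so the data loaded
    # in the parent is visible here, and writes flow back to the parent.
    print("module_four sees:", dict(shared))
    shared["transformed"] = True

def module_three_method(shared):
    p = multiprocessing.Process(target=module_four_method, args=(shared,))
    p.start()
    p.join()

if __name__ == "__main__":
    manager = multiprocessing.Manager()
    shared = manager.dict({"loaded": "from the json"})  # stand-in for my data
    p = multiprocessing.Process(target=module_three_method, args=(shared,))
    p.start()
    p.join()
    print("parent sees:", dict(shared))  # includes module_four's write
```

This works, but every method in the chain has to accept and forward the dict, which is exactly the plumbing I was hoping to avoid.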
Thanks, a newbie Python coder