My use case: I want a fire-and-forget function decorator that just calls the decorated function asynchronously.
Here's what I have right now:
from multiprocessing import Process

def fire_and_forget(func):
    def inner(*args, **kwargs):
        # Run the wrapped function in a separate process and return immediately,
        # without waiting for it to finish.
        proc = Process(target=func, args=args, kwargs=kwargs)
        proc.start()
    return inner
Simple enough: a typical Python decorator that just creates a new process for the wrapped function and starts it. I believe this gives me the async behaviour I'm looking for.
Now, I decorate a test function with this:
@fire_and_forget
def my_test_function(name, age=24):
    print(f'Name: {name}, Age: {age}')

my_test_function('John')
However, I get the following error:
Traceback (most recent call last):
File "<pyshell#10>", line 1, in <module>
my_test_function('John')
File "<pyshell#3>", line 4, in inner
proc.start()
File "C:\Users\John\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\process.py", line 112, in start
self._popen = self._Popen(self)
File "C:\Users\John\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Users\John\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\John\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
reduction.dump(process_obj, to_child)
File "C:\Users\John\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function my_test_function at 0x03C93810>: it's not the same object as __main__.my_test_function
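For what it's worth, after decorating, the name my_test_function in my session does seem to refer to the inner wrapper rather than the original function (output roughly from memory, address elided):

>>> my_test_function
<function fire_and_forget.<locals>.inner at 0x...>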
I have seen some articles explaining that this is a pickling issue, but I haven't been able to fully understand what is going on here. I don't want to give up the decorator approach, but I am open to using something other than the multiprocessing library.
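For reference, this thread-based sketch (my own rough workaround attempt, assuming a plain threading.Thread would be acceptable; the decorator and function names are just my examples from above) does roughly what I mean by fire and forget, but I would still like to understand why the multiprocessing version fails:

import functools
import threading

def fire_and_forget(func):
    @functools.wraps(func)
    def inner(*args, **kwargs):
        # Run the call on a background thread and return immediately without joining.
        threading.Thread(target=func, args=args, kwargs=kwargs).start()
    return inner

@fire_and_forget
def my_test_function(name, age=24):
    print(f'Name: {name}, Age: {age}')

my_test_function('John')  # returns right away; the print happens on the worker thread

This avoids pickling entirely, since nothing crosses a process boundary, but it is only a sketch of the behaviour I want, not an answer to the multiprocessing question.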