I have wrapped a small C library in Cython and can successfully call it from Python. Here's a simplified example of how it currently looks:
# wrapper_module.pyx
cdef extern from "my_c_library.h":
    void function1()
    int function2(int param1, char *param2)

class MyWrapperClass():
    def __init__(self):
        pass

    def do_func1(self):
        function1()

    def do_func2(self, p1, p2):
        function2(p1, p2)
This all works well. My goal now is to create and use an instance of MyWrapperClass in a separate process, like this:
# my_script.py
import multiprocessing as mp
from wrapper_module import MyWrapperClass

class MyProcess(mp.Process):
    def __init__(self):
        super().__init__()

    def run(self):
        self.mwc = MyWrapperClass()
        self.mwc.do_func1()
        # ... do more

if __name__ == '__main__':
    proc = MyProcess()
    proc.start()
When I run my_script.py, I get the following error:
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec(). Break on THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC() to debug.
I think I understand in general why this Cython module cannot simply be forked into a different process (because of resources that the underlying C library uses). But when I have encountered this type of problem in the past, either in pure Python or when calling DLLs through ctypes, I could solve it by placing all the critical code in the run method of MyProcess, so that everything is initialized only in the newly forked process.
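For example, here is roughly the ctypes pattern that has worked for me before (the library and function names here are just placeholders):

import ctypes
import multiprocessing as mp

class DllProcess(mp.Process):
    def run(self):
        # Load the shared library here, inside the child process, so the
        # parent process never touches it before forking.
        lib = ctypes.CDLL("libsomething.dylib")  # placeholder library name
        lib.some_init_function()                 # placeholder function name
        # ... call other library functions as needed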
In this Cython case, however, I don't know how to restrict the cdef extern code to the forked process only, in order to avoid this error. Any suggestions?
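The only idea I have at the Python level is deferring the import into run, roughly as sketched below, though I'm not sure whether deferring the import like this actually keeps the C library out of the parent process:

import multiprocessing as mp

class MyProcess(mp.Process):
    def run(self):
        # Import only in the child, so the parent process never loads
        # the compiled extension module before forking.
        from wrapper_module import MyWrapperClass
        self.mwc = MyWrapperClass()
        self.mwc.do_func1()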
(I am running Python 3.6.2 on macOS 10.12.)