Some context: I have some C code that, once compiled, I can call from the terminal like this: ./my_excec -params
It generates some files that I then use in Python to generate charts, among other things.
I want to package everything, both the C code and the Python code, into a single Python library. The C code is not a Python extension (it probably will be in the future, but right now it is not).
I have a Makefile to compile the C code, and I know I can call it from setup.py like this:
subprocess.call(['make', '-C', 'word2vec-src'])
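
For reference, here is a minimal sketch of how that call could be hooked into a custom build step; the class name BuildWithMake is hypothetical, and word2vec-src is just the directory from the make call above:

    import subprocess
    from setuptools import setup
    from setuptools.command.build_py import build_py

    class BuildWithMake(build_py):
        """Run the Makefile before the regular Python build."""
        def run(self):
            # Compile the C sources before packaging the Python code.
            subprocess.call(['make', '-C', 'word2vec-src'])
            build_py.run(self)

    setup(
        name='my_module',  # placeholder name
        version='0.1',
        cmdclass={'build_py': BuildWithMake},
    )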
What I want to be able to do is: pip install my_module
That should run the Makefile and compile the C code, so the user can call the binary (my_excec -params)
and also import the Python code around it.
The problem I am having is with packaging the Python package. I am using the data_files option in setup() like this:

    data_files=[('bin', ['bin/binary_file'])],

This copies the files from bin into the installation folder (in a virtualenv) and I can call them. But packaging also puts the compiled files into the tarball, so when I run pip install my_module it installs the binaries that were compiled on my computer.
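
For completeness, a sketch of the setup() call as it currently stands; everything apart from the data_files line is a placeholder:

    from setuptools import setup, find_packages

    setup(
        name='my_module',          # placeholder name
        version='0.1',
        packages=find_packages(),
        # Installs bin/binary_file into the environment's bin/ directory,
        # but also bundles the compiled file into the source tarball.
        data_files=[('bin', ['bin/binary_file'])],
    )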
Thanks.