For an optimization problem, I need to find argmin_u f(u, a) for many, many different parameters a. All my code is written in Python, so I figured I would use cython to speed up this particular task. I use scipy.optimize.minimize for the actual optimization.
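For concreteness, the plain-Python pattern I want to speed up looks roughly like this (f, the starting point and the parameter values are just placeholders, not my real objective):

from scipy.optimize import minimize

def f(u, a):
    # placeholder for the real objective, which is much more expensive
    return (u[0] - a) ** 2

parameter_values = [0.5, 1.0, 2.0]  # in reality: many, many values of a

minimizers = []
for a in parameter_values:
    # minimize over u with the parameter a held fixed
    res = minimize(f, [0.0], args=(a,))
    minimizers.append(res.x[0])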
I wrote a simple cython class to wrap the function, with separate methods to set the parameter and to evaluate the function with a single input. However, when I try to optimize it, compilation works but I receive a run-time error:
TypeError: wrap() takes exactly 2 positional arguments (1 given)
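In outline, the wrapper class looks something like this (the names and the body of f are placeholders, not my real code):

cdef class FunWrapper:
    cdef double a  # the parameter; in reality this is more than a single number

    cpdef set_parameter(self, double a):
        self.a = a

    cdef double eval(self, double u):
        # placeholder for the real f(u, a)
        return (u - self.a) ** 2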
Here is a minimal example in which I managed to reproduce the error (in a .pyx file):
from scipy.optimize import minimize

cdef double testfun(double x):
    return (x-3)**2

cdef class TestClass(object):
    cdef double eval(self, double x):
        return (x-5)**2

cpdef optimize_testfun():
    # this works:
    res = minimize(testfun, [0])
    print('Result: {}'.format(res.x[0]))

    # this does not work:
    cdef TestClass test_object = TestClass()
    res = minimize(test_object.eval, [0])
    print('Result: {}'.format(res.x[0]))
Note that for a regular compiled function, the optimization works fine. Also, for a regular Python class this construction works. I suppose the problem lies in the way that cython handles the self argument.
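For example, the equivalent construction with a plain Python class runs without problems:

from scipy.optimize import minimize

class PyTestClass(object):
    def eval(self, x):
        # x arrives as a length-1 array from minimize
        return (x[0] - 5) ** 2

py_object = PyTestClass()
res = minimize(py_object.eval, [0])  # no TypeError here
print('Result: {}'.format(res.x[0]))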
Does anyone have an idea how to get the code to work with the compiled class? Or does anyone have an alternative idea of how to wrap the multi-argument function to present it as a compiled single-argument function to minimize?
My experience with both optimization and cython is limited, so if this is a stupid way to go about things, please let me know.
[NB: My initial question was very vaguely formulated; I have almost completely rewritten it to give a clear and reproducible problem outline.]