
I am trying to speed up the functions called by scipy's minimize. They were originally all lambdas, so I thought I'd replace them with numba @njit functions.

But I get this exception:

  File "/blah/opt.py", line 142, in normalise
    result = minimize(
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_minimize.py", line 631, in minimize
    return _minimize_slsqp(fun, x0, args, jac, bounds,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/slsqp.py", line 375, in _minimize_slsqp
    sf = _prepare_scalar_function(func, x, jac=jac, args=args, epsilon=eps,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/optimize.py", line 261, in _prepare_scalar_function
    sf = ScalarFunction(fun, x0, args, grad, hess,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 159, in __init__
    self._update_grad()
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 238, in _update_grad
    self._update_grad_impl()
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 149, in update_grad
    self.g = grad_wrapped(self.x)
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 146, in grad_wrapped
    return np.atleast_1d(grad(np.copy(x), *args))
TypeError: <lambda>() takes 1 positional argument but 2 were given

Here is the code used:

import numpy as np
from numba import njit
from scipy.optimize import minimize


@njit(cache=True)
def fn(x, weights):
    return np.sum((x - weights) ** 2)


@njit(cache=True)
def fn_cons(x):
    return np.sum(np.abs(x)) - 1


cons = ({'type': 'eq',
         'fun': fn_cons
         })


class TestSpeedup:

    def normalise(self, weights):
        result = minimize(
            fn,
            np.array(weights),
            args=(weights,),
            jac=lambda x: 2 * (x - weights),
            bounds=[(0, np.infty) for _ in weights],
            constraints=cons
        )
        minimum = result.x

        # return np.max([new_weights, np.zeros(new_weights.size)], axis=0) / np.sum(np.max([new_weights, np.zeros(new_weights.size)], axis=0))
        return minimum / np.sum(np.abs(minimum))

weights = np.array([ 1.04632843e+00, -6.89001783e-02,  2.17089646e-01, -2.52113073e-01, 4.19467585e-03])


test = TestSpeedup()
result = test.normalise(weights)

The functions are defined outside the class, so the first parameter is not self. I'm not sure what I am missing here. Any advice?

1 Answer

The Jacobian function is called with the same arguments as the objective function, so you should rewrite the lambda so that it accepts the extra argument, for example:

lambda x, w: 2 * (x - w)
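
With that change, the call in normalise from the question would look roughly like this (a sketch reusing the question's fn, cons and weights):

result = minimize(
    fn,
    np.array(weights),
    args=(weights,),
    jac=lambda x, w: 2 * (x - w),  # the Jacobian now accepts the extra weights argument
    bounds=[(0, np.infty) for _ in weights],
    constraints=cons
)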

Alternatively, you can rewrite the objective function so that it also returns the Jacobian, and pass jac=True in the call to minimize():

@njit(cache=True)
def fn(x, weights):
    d = x - weights
    err = d @ d    # objective value: sum of squared differences
    jac = 2 * d    # gradient of the objective with respect to x
    return err, jac
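
A minimal sketch of the corresponding call, assuming the same cons and weights as in the question:

result = minimize(
    fn,
    np.array(weights),
    args=(weights,),
    jac=True,  # tells minimize that fn returns (value, gradient)
    bounds=[(0, np.infty) for _ in weights],
    constraints=cons
)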