
I have a relatively simple optimisation problem: a nonlinear loss/objective function of two parameters whose algebraic solution I'm unable to find, but whose value, Jacobian and Hessian are easy to compute numerically. In fact, they are so cheap to compute that I think the per-iteration overhead of the typical solvers/optimisers from scipy.optimize dominates the computation time. Since I need to perform this optimisation for a large number of parameter sets, I wonder whether there is a way to improve performance, e.g. using numba with a C-level implementation of an optimiser, or using a vectorised optimiser that can batch the operations across parameter sets.
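To illustrate what such a batched optimiser could look like: since the Hessian of a two-parameter problem is only 2×2, each Newton step can be solved in closed form inside a single numba-compiled loop over all parameter sets, so the Python-to-compiled-code boundary is crossed once per batch rather than once per function evaluation. The objective below is a hypothetical stand-in (the real loss isn't shown in the question), and `newton_batch` is an undamped Newton sketch, not a robust replacement for scipy's line-search/trust-region machinery:

```python
import numpy as np
from numba import njit

# Hypothetical stand-in objective (the real loss isn't shown in the question):
#   f(x, y; a, b) = cosh(x - a) + cosh(y - b) + 0.5*(x - a)*(y - b)
# It is strictly convex with its minimum at (a, b), and its gradient and
# Hessian are cheap closed-form expressions, mimicking the setup described.

@njit(cache=True)
def newton_batch(x0, a, b, tol=1e-12, max_iter=100):
    """Undamped Newton iteration, run independently for each parameter set.

    x0 : (n, 2) array of initial guesses (updated in place)
    a, b : (n,) arrays of per-problem parameters
    """
    n = x0.shape[0]
    for i in range(n):
        x, y = x0[i, 0], x0[i, 1]
        for _ in range(max_iter):
            u = x - a[i]
            v = y - b[i]
            # gradient of f
            g0 = np.sinh(u) + 0.5 * v
            g1 = np.sinh(v) + 0.5 * u
            # 2x2 symmetric Hessian of f
            h00 = np.cosh(u)
            h11 = np.cosh(v)
            h01 = 0.5
            # solve H @ step = g analytically (Cramer's rule), no LAPACK call
            det = h00 * h11 - h01 * h01
            s0 = (h11 * g0 - h01 * g1) / det
            s1 = (h00 * g1 - h01 * g0) / det
            x -= s0
            y -= s1
            if abs(s0) < tol and abs(s1) < tol:
                break
        x0[i, 0] = x
        x0[i, 1] = y
    return x0

# usage: 1000 parameter sets solved in a single compiled call
rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = rng.normal(size=1000)
x0 = np.column_stack((a + 0.3, b - 0.3))  # start near the known optimum (a, b)
sol = newton_batch(x0, a, b)
```

For even larger workloads the outer loop over parameter sets is a natural candidate for `numba.prange` with `@njit(parallel=True)`, since the problems are independent.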

burnpanck
  • How many times does your objective function get called per optimization? Show us, with timings, that overhead dominates the time. – hpaulj Oct 07 '18 at 16:13
  • On my machine, a vectorised version of the Jacobian compiled using `numba` takes 15 µs to execute on 1000 parameter sets in parallel, while calling a similarly compiled scalar version 1000 times inside a list comprehension takes 1.1 ms. In both cases, the pure-Python variant is about four times slower. Therefore, the call overhead is responsible for 98% of the runtime (a sketch of this comparison follows the comments). – burnpanck Oct 07 '18 at 16:46
  • On average, it seems to take about 100 function calls to find the optimum, though that of course also depends on the initial conditions. – burnpanck Oct 07 '18 at 16:47
  • Can you show us the objective? Someone here might be able to find a (partial) analytic solution. – dmuir Oct 08 '18 at 10:10
  • @dmuir: Wouldn't that belong on Math.SE? I'm actually more interested in the numerical solution: I come across similar problems regularly; sometimes it's optimisation, sometimes a more generic multivariate equation. A fast numerical method could help in many of those cases. – burnpanck Oct 08 '18 at 11:59
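
A minimal version of the timing experiment described in the comments might look as follows. Here `grad_scalar` and `grad_batch` are hypothetical stand-ins for the compiled Jacobian of the two-parameter loss, and the absolute numbers will of course differ per machine:

```python
import timeit

import numpy as np
from numba import njit

# Hypothetical stand-in for the compiled Jacobian of the 2-parameter loss.
@njit(cache=True)
def grad_scalar(x, y, a):
    return np.sinh(x - a) + 0.5 * y, np.sinh(y) + 0.5 * (x - a)

@njit(cache=True)
def grad_batch(xy, a):
    # same computation, but the Python -> compiled-code boundary is
    # crossed once per batch instead of once per parameter set
    out = np.empty_like(xy)
    for i in range(xy.shape[0]):
        out[i, 0], out[i, 1] = grad_scalar(xy[i, 0], xy[i, 1], a[i])
    return out

xy = np.random.rand(1000, 2)
a = np.random.rand(1000)
grad_batch(xy, a)            # warm-up calls so that JIT compilation
grad_scalar(0.0, 0.0, 0.0)   # is excluded from the timings

t_batch = timeit.timeit(lambda: grad_batch(xy, a), number=1000)
t_scalar = timeit.timeit(
    lambda: [grad_scalar(p[0], p[1], ai) for p, ai in zip(xy, a)],
    number=1000,
)
print(f"batched: {t_batch / 1000:.2e} s/call, scalar loop: {t_scalar / 1000:.2e} s/call")
```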

0 Answers