
I am using the dlib library for C++ to perform box-constrained optimization of a custom function:

dlib::find_min_box_constrained(dlib::bfgs_search_strategy(),
                               dlib::objective_delta_stop_strategy(DELTA),
                               m, dlib::derivative(m),
                               starting_point, MIN_CONS, MAX_CONS);

where m is the objective function, starting_point is a column vector of initial values for the variables, MIN_CONS is the minimum allowed value for each variable, and MAX_CONS is the maximum.

This works fine, but now I would like to add another constraint on the variables: they should sum to 1. I am able to do this with scipy.optimize.minimize in Python 3 (answered in this question). Is there any way to achieve this using dlib?

Honza Dejdar
  • Sometimes you can get away with normalizing x before evaluating the function. A better answer is: solvers for constrained NLP problems are readily available. – Erwin Kalvelagen Jul 18 '19 at 12:02
  • 1
    Not with box-constrained optimization. You will need some form ob linear-constrained optimization (and a quick search offers nothing in dlib). Apart from hacking a "projection onto the simplex" (this is your constraint) into some bfgs-code, you probably should look for another solver. One common candidate would be Ipopt. – sascha Jul 18 '19 at 19:43
  • 1
    I meant more something along the following lines. Instead of calling f(x) for function evaluation, call g(x) with g(x) calling f(x/||x||). Same thing for the gradient. No need to change the inside of the solver. – Erwin Kalvelagen Jul 20 '19 at 09:40
  • Thanks for the suggestions, I will look into them. – Honza Dejdar Jul 22 '19 at 08:34

0 Answers