
I want to solve the following optimization problem with Python:

I have a black box function f with multiple variables as input. The execution of the black box function is quite time consuming, therefore I would like to avoid a brute force approach.

I would like to find the optimum input parameters for that black box function f.

In the following, for simplicity I just write the dependency for one dimension x.

An optimum parameter x is one that maximizes the cost function cost(x), defined as the weighted sum of

  • the value f(x)
  • the maximum standard deviation of f(x)

cost(x) = A * f(x) + B * max(standardDeviation(f(x)))

The parameters A and B are fixed.
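In code, the cost I have in mind looks roughly like this. The black box `f`, the weights, and the way the standard deviation is estimated (a few evaluations around x) are just placeholders for my real setup:

```python
import numpy as np

# Placeholder for the expensive black-box function (illustration only).
def f(x):
    return np.exp(-(x - 1.0) ** 2) + 0.1 * np.sin(5 * x)

A, B = 1.0, 0.5  # fixed weights

def cost(x, n_samples=5, eps=0.1):
    # Estimate the spread of f around x from a few nearby evaluations.
    xs = x + np.linspace(-eps, eps, n_samples)
    values = np.array([f(xi) for xi in xs])
    return A * f(x) + B * values.std()
```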

E.g., for the picture below, the value of x at the position 'U' would be preferred over the value of x at the position 'V'.

My question is:

Is there any easily adaptable framework or process that I could utilize (similar to e.g. simulated annealing or Bayesian optimisation)?

As mentioned, I would like to avoid a brute force approach.
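For concreteness, this is the kind of call I have in mind with `scipy.optimize.dual_annealing`; the test function and bounds are just placeholders for my real black box:

```python
import numpy as np
from scipy.optimize import dual_annealing

# Placeholder black box; scipy minimizes, so the objective is negated.
def neg_f(x):
    return -np.exp(-(x[0] - 1.0) ** 2)

result = dual_annealing(neg_f, bounds=[(-5.0, 5.0)], seed=0, maxiter=100)
best_x = result.x[0]  # location of the maximum of the placeholder f
```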

[figure: one-dimensional example]

user7468395
  • The question is very broad. There are a couple of algorithms in `scipy.optimize`. You should choose one that you think is more appropriate to your function (e.g. local gradient approach, global annealing approach, etc.), then attempt to use it and come back here if you have problems with the code or the output. – Tarifazo Aug 06 '19 at 20:53
  • I am a bit unclear on your statements: what is the dimensionality of f_max(x)? If f(x) is a 1D array (a vector), then f_max(x) is a scalar, and there is no standard deviation for a scalar. If f(x) is a 2D array (a matrix), then f_max(x) is a vector but then you have to specify what it means to calculate a maximum on a 2D array (I.e., across which axis do you calculate the maximum?). Can you maybe post some code to show what you mean? – Infinity77 Aug 07 '19 at 18:06
  • @Infinity77 : you are right about the dimensionalities. I updated my question. – user7468395 Aug 10 '19 at 13:05

1 Answer


I’m still not 100% sure of your approach, but does this formula ring true to you:

A * max(f(x)) + B * max(standardDeviation(f(x)))

?

If it does, then I guess you may want to consider that maximizing f(x) may (or may not) be compatible with maximizing the standard deviation of f(x), which means you may be facing a multi-objective optimization problem.

Again, you haven’t specified what f(x) returns - is it a vector? I hope it is, otherwise I’m unclear on what you can calculate the standard deviation on.

The picture you posted is not so obvious to me. f(x) is the entire black curve; it has a maximum at the point V, but what can you say about the standard deviation? To calculate the standard deviation you have to take the entire f(x) curve into account (including the point U), not just the neighbourhoods of U and V.

If you only want the standard deviation in an interval around a maximum of f(x), then I think you're out of luck when it comes to off-the-shelf frameworks. The best approach that comes to my mind is to use a local (or, better, a global) optimization algorithm to hunt for the maximum of f(x) - simulated annealing, differential evolution, tunnelling, and so on - and then, once you have found a maximum of f(x), sample a few points to the left and right of your optimum and calculate the standard deviation of those evaluations. You then have to decide whether the combination of the maximum of f(x) and this standard deviation is good enough compared to any previously found "optimal" point.
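A minimal sketch of that recipe, using `scipy.optimize.differential_evolution` for the global step and a toy stand-in for your black box (the function, bounds, sampling interval and weights are all assumptions you would replace):

```python
import numpy as np
from scipy.optimize import differential_evolution

# Stand-in for the expensive black box (purely illustrative).
def f(x):
    return float(np.exp(-(np.atleast_1d(x)[0] - 2.0) ** 2))

# Step 1: global search for the maximum of f (scipy minimizes, so negate).
result = differential_evolution(lambda x: -f(x), bounds=[(-5.0, 5.0)], seed=0)
x_opt = result.x[0]

# Step 2: sample a few points left and right of the optimum and measure spread.
neighbours = x_opt + np.linspace(-0.5, 0.5, 7)
local_std = np.std([f(xi) for xi in neighbours])

# Step 3: combine into your cost with the fixed weights A and B.
A, B = 1.0, 0.5
score = A * f(x_opt) + B * local_std
```

Each candidate optimum then gets a `score`, and you keep the best one seen so far.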

This is all speculation, as I'm unsure whether your problem is really an optimization one or simply a "peak finding" exercise, for which there are many different - and more powerful and adequate - methods.

Andrea.

Infinity77