
I am trying to solve an optimization problem with SciPy's global optimization routines, specifically differential_evolution.

Code:

import numpy as np
from scipy import optimize

# MLmodel is my trained model, defined elsewhere.

def objective(x, *args):
    x = np.append(x, args)
    res = MLmodel.predict(x)
    return res


fun_history = []
x_values = []

def callback(x, convergence):
    fobj = objective(x)
    x_values.append(x)
    fun_history.append(fobj)

bounds = [(5.5, 8.8), (29, 40)]
load = (50,)

res = optimize.differential_evolution(objective, bounds, args=load, disp=True, callback=callback)

My objective function takes three parameters as input and returns a single output. I want to optimize over only the first two parameters, so I pass the third one as a fixed extra argument via args.
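To illustrate what I expect to happen, here is a small sketch with a made-up trial point inside my bounds (not something the optimizer actually produced). As I understand it, differential_evolution calls objective(x, *args), and np.append should then hand the model a single 1-D array with all three values:

x_trial = np.array([6.0, 30.0])        # hypothetical trial point within the bounds
load = (50,)

full_input = np.append(x_trial, load)  # what my objective builds internally
print(full_input)                      # [ 6. 30. 50.]
print(full_input.shape)                # (3,)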

When I run the optimizer I get an error after the first iteration, which says:

ValueError: operands could not be broadcast together with shapes (1,2) (3,) (1,2) 

My guess is that the extra argument is not being appended to x from the second iteration onwards.
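One way I thought of checking this is a debugging variant of my objective (the same logic as above, just with a print added) to see what shape actually reaches the model on each call:

def objective(x, *args):
    x = np.append(x, args)
    # Debug print: show what each caller actually passes in
    print("x after append:", x.shape, "args received:", args)
    return MLmodel.predict(x)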

Can someone help me with how to solve this problem?

chink
  • Where does the error occur? Without the traceback we are left guessing. What 3 parameters is your objective taking? I'm not familiar with `differential_evolution`, but typically the `optimize` functions take an initial value array (`x0`, sometimes 1d), and the `args` tuple. Do a test call, `objective(x0, *args)` (a sketch of such a call is included below). If needed, add a debugging print to `objective` so you have a clear idea of what `optimize` is passing to your objective. Test, no guessing. – hpaulj Jun 29 '19 at 15:46
  • To get some help you need to provide a [minimal, reproducible example](https://stackoverflow.com/help/minimal-reproducible-example) so we can reproduce your error in the first place. – SuperKogito Jun 29 '19 at 15:48
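A sketch of the kind of test call hpaulj suggests, with x0 as a hypothetical point chosen by hand from inside the bounds:

x0 = np.array([6.0, 30.0])   # hypothetical point inside the bounds
load = (50,)

objective(x0, *load)   # the way differential_evolution calls the objective
objective(x0)          # the way the callback above calls it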

0 Answers