
I'm starting in Python and I'm trying to solve a problem that fmincon solves in MATLAB.

Basically, my problem has 12 variables; a list of 2000 values is created from them (linearly), and my objective is to maximize the largest value of this list.

In addition, the problem has a linear constraint.

I have tried scipy, without success: in all the gradient-free or approximate-gradient solvers I tried, it is not possible to insert linear constraints.

I've also tried using cvxopt, but I did not find any gradient-free or approximate-gradient solver there.

In addition, I would not like to use tools like genetic algorithms, PSO, Harmony Search, etc.

I would like a gradient-free or approximate-gradient solver that accepts linear constraints, like fmincon in MATLAB.

This is my objective function:

import numpy as np

def max_receita(X, f, CONSTANTE_CVAR):
    # f is a matrix with 2000 rows and 12 columns
    # CONSTANTE_CVAR is a matrix with 2000 rows and 12 columns
    NSERIES = len(f)
    REC = np.zeros(NSERIES)
    X = np.transpose(X)

    # annual constant: row sums of CONSTANTE_CVAR
    CONST_ANUAL = np.sum(CONSTANTE_CVAR, 1)

    for i in range(NSERIES):
        REC[i] = np.dot(f[i], X) + CONST_ANUAL[i]

    return -max(REC)
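For completeness, this is roughly how I would wire the objective into scipy with the sum-to-one constraint and bounds (a minimal sketch with randomly generated placeholder data; the `(0, 1)` bounds are illustrative, not the real per-variable bounds):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
f = rng.normal(size=(2000, 12))               # placeholder for the real f
CONSTANTE_CVAR = rng.normal(size=(2000, 12))  # placeholder constants

def max_receita(X, f, CONSTANTE_CVAR):
    # vectorized version of the loop: REC has one value per series
    REC = f @ X + CONSTANTE_CVAR.sum(axis=1)
    return -REC.max()                         # minimize the negative

x0 = np.full(12, 1 / 12)                      # feasible start on the simplex
cons = ({'type': 'eq', 'fun': lambda X: X.sum() - 1.0},)  # sum(X) == 1
bnds = [(0.0, 1.0)] * 12                      # illustrative bounds

res = minimize(max_receita, x0, args=(f, CONSTANTE_CVAR),
               method='SLSQP', constraints=cons, bounds=bnds)
```

SLSQP uses numerical differentiation here, which is exactly the fragile part: `max` is not differentiable at ties, which may explain the convergence failures.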

There are 12 variables, represented by the vector X (12 elements), and this vector must sum to 1. That's my constraint.

Besides this, each variable has different bounds, so the solver must allow bounds.

The vector X (all 12 variables), together with the inputs f and CONSTANTE_CVAR, creates the vector REC (1x2000), and my objective is to maximize the biggest value of REC.

In this way, I need a solver that allows:

  • Gradient-free or approximate-gradient methods
  • Linear constraints
  • Nonlinear objective functions
  • Bounds
  • Python

Could anyone suggest any solver?

  • What's wrong with scipy's *SLSQP* or *COBYLA*? They both support (linear) constraints (and numerical differentiation). cvxopt is surely not the correct approach for black-box opt. (That being said: if this one constraint is all you have, it allows customized projection-based algorithms; but for such a small-scale problem you won't need it.) **Edit**: your objective looks somewhat misaligned and is hard to read. If np.max is what I think it is, you should convert this. ```max``` is not differentiable (which can be a problem). This usually can be rewritten with helper vars and linear constraints – sascha Mar 13 '18 at 12:57
  • As mentioned above, it's hard to get your objective. But it's quite possible it's very simple and there is no good reason to treat it as black-box. I think you are better off formulating your problem inside some well-defined standard-problem (LP? QP? Conic? NLP?). – sascha Mar 13 '18 at 13:03
  • First of all. Thanks for the attention. – Albuquerque Vinicius Mar 13 '18 at 13:27
  • _COBYLA_ does not allow bounds; at least, I don't know how to use them. _SLSQP_ does not converge. Unfortunately, this is one of the objectives of my problem: to maximize the biggest value of the list REC. Do you suggest reformulating the black box? How would you suggest doing it? – Albuquerque Vinicius Mar 13 '18 at 13:53
  • (1) Bounds can be formulated as (singleton) constraints. (2) Using max like that, you are invalidating the solver assumptions (already mentioned). (3) Again: I think this might be a trivial non-black-box opt problem and there is no need for NLP solvers. But despite my comments, you did not repair the syntax of your code, nor explained the objective (mathematically). It's also hard to help when there is no reproducible example (especially when *does not converge* is only some of what the solver returned)! – sascha Mar 13 '18 at 13:57
  • I do not think that I can change the objective because it's a method that the company wants to use. But if there is another method to optimize the max value of a list, I will think about it and try, because right now I do not know how. Another objective is to maximize the mean of the 1800 lowest values of REC. Do you have a suggestion? I edited the question, trying to explain the objective and the problem, but I do not know if it is enough. Thanks for your attention, again. – Albuquerque Vinicius Mar 13 '18 at 15:29
  • I did not say you should change it. I said you need to make it differentiable (google it) to be compatible with the solver. There are so many potential things to do here. But as presented, no help is possible. No code to run (and the code presented is not valid Python), not enough details. – sascha Mar 13 '18 at 15:32
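Following the comments' suggestion to reformulate this as a standard problem: each REC[i] is affine in X, so maximizing max(REC) decomposes into 2000 independent linear programs (one per series, then take the best). A sketch with placeholder data and illustrative `(0, 1)` bounds; no NLP solver or gradients needed:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
f = rng.normal(size=(2000, 12))                  # placeholder for the real f
const = rng.normal(size=(2000, 12)).sum(axis=1)  # stands in for CONST_ANUAL

A_eq, b_eq = np.ones((1, 12)), [1.0]             # sum(X) == 1
bounds = [(0.0, 1.0)] * 12                       # illustrative bounds

best_val, best_x = -np.inf, None
for i in range(len(f)):
    # maximize f[i] @ X  <=>  minimize -f[i] @ X
    res = linprog(-f[i], A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    if res.success and -res.fun + const[i] > best_val:
        best_val, best_x = -res.fun + const[i], res.x
```

Each LP is tiny (12 variables), so the loop is cheap; `best_x` is then the X achieving the largest possible REC value over all series.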

0 Answers