I am trying to minimize a function of two variables, x[0] and x[1]. A, B, and C are DataFrames with dimensions 10x10. The optimization works as intended when I don't use constraints, but I also need the constrained case. For the constrained case, I want
A.iloc[i,j]*x[0]*B.iloc[i,j]*x[1]*C.iloc[i,j]
to be greater than or equal to zero for every combination of i and j. To achieve this, I have defined the constraints in the following way:
cons = []
def f(a):
    def g(x):
        return A.iloc[i,j]*x[0]*B.iloc[i,j]*x[1]*C.iloc[i,j]
    return g
for i in range(10):
    for j in range(10):
        cons.append({'type': 'ineq', 'fun': f(t)})
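For reference, I then pass cons to the solver roughly like this (a minimal sketch; the objective function, starting point x0, and method='SLSQP' below are placeholders, not my actual setup):

from scipy.optimize import minimize
import numpy as np

def objective(x):
    # placeholder objective, stands in for my real one
    return (x[0] - 1.0)**2 + (x[1] - 1.0)**2

x0 = np.array([0.0, 0.0])
result = minimize(objective, x0, method='SLSQP', constraints=cons)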
While I am getting the right number of constraints (i.e. len(cons) = 100), the optimization result does not satisfy the constraints I had in mind: it returns values for x[0] and x[1] for which
A.iloc[i,j]*x[0]*B.iloc[i,j]*x[1]*C.iloc[i,j]
is smaller than zero for many combinations of i and j. I have verified that result.success = True, so the optimization terminating prematurely can be ruled out as a potential problem. While looking for a solution to this problem, I found this case of someone iterating over constraints in scipy as well, but they only iterated over one range rather than two, and I was not able to adapt their solution to my case.
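For what it's worth, this is roughly how I check the constraints afterwards (a sketch, assuming result is the OptimizeResult returned by minimize):

# count the (i, j) pairs where the constraint expression is negative at result.x
violated = [(i, j)
            for i in range(10)
            for j in range(10)
            if A.iloc[i, j]*result.x[0]*B.iloc[i, j]*result.x[1]*C.iloc[i, j] < 0]
print(len(violated), "of 100 constraints are violated")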