I recently started implementing the Benders decomposition algorithm in Python, using NumPy for the mathematical formulations.
My problem is that I need to calculate a lambda value (shown below), and I currently do this with a while loop, which causes long run times and sometimes crashes.
Any ideas on how to calculate lambda more efficiently?
Math:
L = -Obj - sum(mu_i * const_i) - lambda * variable
grad(L) = 0
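To make the condition concrete: with my four constraints (see the code below), L is linear in x and y, so grad(L) = 0 is linear in lambda. A small sympy sketch with the duals kept symbolic (just an illustration of the condition, not part of my actual code):

import sympy as sp

x, y, lam = sp.symbols('x y lam')
d1, d2, d3, d4 = sp.symbols('d1 d2 d3 d4')  # duals (mu) kept symbolic

# same L as in the code below, with the -lambda*x term included
L = (-y - d1*(y - x - 5) - d2*(y - x/2 - sp.Rational(15, 2))
     - d3*(y + x/2 - sp.Rational(35, 2)) - d4*(-y + x - 10) - lam*x)

print(sp.diff(L, x))                           # d1 + d2/2 - d3/2 - d4 - lam
print(sp.solve(sp.Eq(sp.diff(L, x), 0), lam))  # [d1 + d2/2 - d3/2 - d4]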
Here is my code for calculating lambda by setting the gradient to zero:
import numpy as np

x, y = np.mgrid[-100:100, -100:100]  # grid for evaluating the gradient

# dual values (mu) of the four constraints from the solver
d1 = s.dual[s.const1]
d2 = s.dual[s.const2]
d3 = s.dual[s.const3]
d4 = s.dual[s.const4]

L = (-y - d1*(y - x - 5.0) - d2*(y - x/2.0 - 15.0/2.0)
     - d3*(y + x/2.0 - 35.0/2.0) - d4*(-y + x - 10.0))

Lambda = -2.0  # Lambda_0 starts at -2.0
while Lambda <= 2.0:
    Ex, Ey = np.gradient(L - Lambda*x)
    # default tolerances work better for now; atol=0.001 also worked
    if np.allclose(Ex, 0.0) and np.allclose(Ey, 0.0):
        break
    Lambda += 0.001
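For completeness, a hypothetical stand-in for my solver object s (made-up dual values, not my real solver interface), so the snippet above can be run on its own:

class FakeSolver:  # hypothetical stub in place of the real LP solver
    const1, const2, const3, const4 = 'c1', 'c2', 'c3', 'c4'
    dual = {'c1': 0.5, 'c2': 0.0, 'c3': 1.0, 'c4': 0.0}

s = FakeSolver()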