
I am trying to simulate an exact line search experiment using CVXPY.

s = cvx.Variable()
objective = cvx.Minimize(func(x + s * grad(x)))
constraints = [s >= 0]
prob = cvx.Problem(objective, constraints)
prob.solve()

My input objective function is the one from the convex optimization book (Boyd & Vandenberghe, p. 472):

f(x) = c^T x - sum_i log(b_i - a_i^T x)

where the a_i are the columns of A. I implement it as:

def func(x):
    np.random.seed(1235813)
    A = np.asmatrix(np.random.randint(-1, 1, size=(n, m)))
    b = np.asmatrix(np.random.randint(50, 100, size=(m, 1)))
    c = np.asmatrix(np.random.randint(1, 50, size=(n, 1)))
    fx = c.transpose() * x - sum(np.log(b - A.transpose() * x))
    return fx

The gradient function:

def grad(x):
    # gradient of f: c + A * (1 / (b - A^T x))
    np.random.seed(1235813)
    A = np.asmatrix(np.random.randint(-1, 1, size=(n, m)))
    b = np.asmatrix(np.random.randint(50, 100, size=(m, 1)))
    c = np.asmatrix(np.random.randint(1, 50, size=(n, 1)))
    gradient = A * (1.0 / (b - A.transpose() * x)) + c
    return gradient

Using this setup to find the step size t (the variable s above) by minimising the objective function results in the error: 'AddExpression' object has no attribute 'log'.
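
As far as I can tell, the error comes from the np.log call inside func: once the argument involves the CVXPY variable s, it is a CVXPY expression rather than a NumPy array. A stripped-down snippet that produces the same error for me (A, b and the sizes here are just placeholders):

import numpy as np
import cvxpy as cvx

A = np.asmatrix(np.zeros((3, 4)))   # placeholder data, same shapes as in my code
b = np.asmatrix(np.ones((4, 1)))
s = cvx.Variable()
x = s * np.ones((3, 1))             # like x + s*grad(x), this is a CVXPY expression once s is involved
np.log(b - A.transpose() * x)       # AttributeError: 'AddExpression' object has no attribute 'log'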

I am new to CVXPY and optimization, and I would be grateful if someone could guide me on how to fix this error.

Thanks


1 Answer


You need to use CVXPY functions, not NumPy functions. Something like this should work:

import numpy as np
import cvxpy

def func(x):
    np.random.seed(1235813)
    A = np.asmatrix(np.random.randint(-1, 1, size=(n, m)))
    b = np.asmatrix(np.random.randint(50, 100, size=(m, 1)))
    c = np.asmatrix(np.random.randint(1, 50, size=(n, 1)))
    # cvxpy.log and cvxpy.sum_entries replace np.log and sum,
    # so the result stays a CVXPY expression
    fx = c.transpose() * x - cvxpy.sum_entries(cvxpy.log(b - A.transpose() * x))
    return fx
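
With func changed like that (and grad kept as in the question), the line-search problem itself can be set up the way you tried. Here is a rough sketch; the sizes n, m and the starting point x0 are placeholders I picked, since they are not given in the question, and it assumes the same CVXPY 0.x API as sum_entries above:

import numpy as np
import cvxpy

n, m = 10, 20                      # placeholder problem sizes
x0 = np.zeros((n, 1))              # placeholder starting point; b > 0, so b - A^T x0 > 0 here

s = cvxpy.Variable()               # step size to optimise over
# Note: for a descent step you would usually move along -grad(x0);
# the direction below just mirrors the code in the question.
objective = cvxpy.Minimize(func(x0 + s * grad(x0)))
prob = cvxpy.Problem(objective, [s >= 0])
prob.solve()
print(s.value)                     # the exact line-search step size

If you are on CVXPY 1.x, sum_entries has been renamed to cvxpy.sum.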