
As a test of my understanding of nonlinear optimization with Julia's JuMP modeling language, I am trying to minimize the Rosenbrock function in 10 dimensions subject to the bound constraints 0 <= x[i] <= 0.5. First, the Rosenbrock function with a variable number of arguments:

function rosen(x...)
    n = length(x)
    s = 0.0
    for i in 1:n-1
        s += 100 * (x[i+1] - x[i]^2)^2 + (x[i] - 1)^2
    end
    return s
end
## rosen (generic function with 1 method)
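As a quick sanity check (splatting a vector into the varargs function), rosen vanishes at the all-ones minimizer:

```julia
# Splat a 10-vector of ones into the varargs function;
# every term of the sum is zero at this point.
rosen(ones(10)...)
## 0.0
```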

Define the optimization model with Ipopt as solver,

using JuMP; using Ipopt
m = Model(solver = IpoptSolver())
## Feasibility problem with:
## * 0 linear constraints
## * 0 variables
## Solver is Ipopt

and the variables with bound constraints and starting values x[i] = 0.1:

@variable(m, 0.0 <= x[1:10] <= 0.5)
for i in 1:10
    setvalue(x[i], 0.1)
end

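The loop could probably be avoided; if this JuMP version supports a start keyword in @variable (an assumption on my part), the bounds and starting values fit in one line:

```julia
# Declare bounds and starting values together (assumes the `start`
# keyword argument is available in this JuMP version).
@variable(m, 0.0 <= x[1:10] <= 0.5, start = 0.1)
```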
Now I understand that I have to register the objective function.

JuMP.register(m, :rosen, 10, rosen, autodiff=true)

I am uncertain here whether I can do it like this, or if I need to define and register a mysquare function, as is done in the "User-defined Functions" section of the JuMP manual.

@NLobjective(m, Min, rosen(x[1],x[2],x[3],x[4],x[5],x[6],x[7],x[8],x[9],x[10]))

How can I write this more compactly? An expression like

@NLobjective(m, Min, rosen(x[1:10]))
##ERROR: Incorrect number of arguments for "rosen" in nonlinear expression.

gives an error. What if I would like to solve this problem with 100 variables?
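One workaround, pointed out in the linked question in the comments below, is to build the call expression programmatically and pass it to the unexported JuMP.setNLobjective; treat this as a sketch that leans on JuMP internals:

```julia
# Construct the expression rosen(x[1], ..., x[n]) by hand and set it
# as the nonlinear objective. setNLobjective is not exported, so this
# relies on internals and may change between JuMP versions.
n = 10
JuMP.setNLobjective(m, :Min, Expr(:call, :rosen, [x[i] for i in 1:n]...))
```

With this, switching to 100 variables only means changing n (and the corresponding @variable and register calls).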

Now we solve this model and, lo and behold, it returns a solution, and indeed the correct solution, as I know from solving the problem with the NEOS IPOPT solver.

sol = solve(m);
## ...
## EXIT: Optimal Solution Found.

As I am only interested in the exact value of x[10], I extract it like this:

getvalue(x[10])
## 0.00010008222367154784
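If more than one component is needed, getvalue should (if I read the docs correctly) also accept the whole variable container:

```julia
# Query all components at once; returns the vector of optimal values
# instead of a single scalar.
getvalue(x)
```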

Can this be simplified somehow? Think of it how easy it is to solve this problem with fminsearch in MATLAB or optim in R.

R> optim(rep(0.1,10), fnRosenbrock, method="L-BFGS-B",
         lower=rep(0.0,10), upper=rep(0.5,10),
         control=list(factr=1e-12, maxit=5000))
## $par
## [1] 0.50000000 0.26306537 0.08003061 0.01657414 0.01038065
## [6] 0.01021197 0.01020838 0.01020414 0.01000208 0.00000000

Except, of course, it says $par[10] is 0.0, which is not true.

Hans W.
  • Possibly related to: [Julia+JuMP: variable number of arguments to function](https://stackoverflow.com/questions/44710900/juliajump-variable-number-of-arguments-to-function/44711305#44711305) – Dan Getz Oct 08 '17 at 15:16
  • @DanGetz : This is a bloody trick, `setNLobjective` is not even exported from JuMP. I really don't understand what this `Expr` does, though it appears to work. Should there not be an easier approach for the casual user? -- as this is likely a common problem for every use of JuMP for nonlinear minimization.? – Hans W. Oct 08 '17 at 17:09
  • 1
    Here is some brief explanation. What @NLobjective does is similar to R's non-standard evaluation, and setNLobjective is the corresponding function doing the real work, `Expr` thing is just to make the expression object `rosen(x[1],x[2],x[3],x[4],x[5],x[6],x[7],x[8],x[9],x[10])`, in R it is like that you are constructing `quote(rosen(x[1],x[2],x[3],x[4],x[5],x[6],x[7],x[8],x[9],x[10]))`. I'm not sure but maybe we could post some issue and make a PR on this? Shouldn't be too difficult. – Consistency Oct 13 '17 at 20:16
