
I was trying to solve a simple optimization problem, first via the Python/CVXPY framework and then via Julia/JuMP, but the JuMP formulation runs about 15x slower.

My optimization problem:

  1. In Python/CVXPY (runtime: 4 sec):
# Run: time python this_file.py
import cvxpy as cp
import numpy as np
n = 2
b = np.array([2,3])
c1 = np.array([[3,4],[1,0],[0,1]])
c2 = [1,0,0]

x = cp.Variable(n)
prob = cp.Problem( cp.Minimize(b@x), [ c1@x >= c2 ])
prob.solve(solver=cp.MOSEK)   # FOSS alternative: prob.solve(solver=cp.GLPK)

print('Solution:', prob.value)
  2. In Julia/JuMP (runtime: 1 min 7 sec):
# Run: time julia this_file.jl
using JuMP
using Mosek, MosekTools   # FOSS alternative: using GLPK

function compute()
    n = 2
    b = [2,3]
    c1 = [3 4 ; 1 0 ; 0 1]
    c2 = [1,0,0]

    prob = Model(Mosek.Optimizer)
    # FOSS alternative: prob = Model(GLPK.Optimizer)
    @variable(prob, x[1:n])
    @objective(prob, Min, b'*x)
    @constraint(prob, c1*x .>= c2)
    JuMP.optimize!(prob)

    println("Solution: ", JuMP.objective_value(prob))
end

compute()

Any tips or tricks to speed up the Julia/JuMP code?

pqrz
  • You appear to be measuring compile time. If you try bigger problems, you will likely see very different results. – Oscar Smith Apr 26 '21 at 15:44
  • Yes, JuMP has extremely long compile times (sometimes up to a minute). But in production, when you have problems with several hundred thousand decision variables, that just does not matter. Also, when running `compute()` a second time in your case, you will see something like a few seconds. – Przemyslaw Szufel Apr 26 '21 at 20:38
  • Possibly worth mentioning that the biggest thing you can do to speed this up is upgrade to 1.6 if you haven't yet. It often compiles 2-3x faster. – Oscar Smith Apr 26 '21 at 21:05
  • Maybe you can show how you performed the benchmark? Did you run it from your OS's command line, or from within the Julia REPL? And did you allow it to compile before timing? Unfortunately, Mosek appears to be a commercial product that requires a license, so not everyone can test this out. – DNF Apr 27 '21 at 07:05
  • @OscarSmith - Thanks, julia v1.6 indeed seems promising – pqrz Apr 27 '21 at 07:42
  • @DNF - for the runtime I had used Linux's `time` utility, and for the solver you may use the FOSS alternative `GLPK`. I have updated the above code on both points. – pqrz Apr 27 '21 at 07:44
  • Do you mean to say that you are running this from the Linux command line? In that case you are including package loading and compilation, as well as jit time for your function. In that case you should not expect Julia to outperform python for short-lived computations. Until static compilation tools mature more in Julia, calling code from Bash is not an optimal workflow. – DNF Apr 27 '21 at 08:33
  • @PrzemyslawSzufel - Ok, this is more like a one-time cost. Thanks, didn't know that. – pqrz Apr 27 '21 at 08:35
  • @DNF - If not **bash**, what is the best possible workflow? Or, in general, what are the ways to speed up this script (other than the v1.6 upgrade)? – pqrz Apr 27 '21 at 08:46
  • @pqrz you can compile the package into a Julia sysimage. Here is how: https://stackoverflow.com/questions/62752978/why-julia-takes-long-time-to-import-a-package – Przemyslaw Szufel Apr 27 '21 at 08:53
  • @pqrz The usual workflow for Julia is from the Julia REPL. Of course, that is not convenient for everyone, but outside of the REPL, working with Julia requires some extra effort. Personally, I never use Julia outside of the REPL, so I'm not very familiar with alternative workflows. – DNF Apr 27 '21 at 09:24
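Przemyslaw Szufel's sysimage suggestion above could be sketched roughly like this (an illustration, assuming PackageCompiler, JuMP, and GLPK are already installed in the active environment):

```julia
# Sketch: bake JuMP and GLPK into a custom sysimage so that package loading
# and much of the compilation are paid once, at build time.
using PackageCompiler
create_sysimage([:JuMP, :GLPK]; sysimage_path="jump_sysimage.so")
```

Then start Julia with `julia --sysimage jump_sysimage.so this_file.jl`, which should cut most of the `using`-time out of the script's runtime.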

1 Answer


More than 1 minute is excessive. Did you update packages or something and recompile?

Here's what I get:

(base) oscar@Oscars-MBP lore % cat ~/Desktop/discourse.jl
@time using JuMP
@time using GLPK

function compute()
    n = 2
    b = [2,3]
    c1 = [3 4 ; 1 0 ; 0 1]
    c2 = [1,0,0]

    prob = Model(GLPK.Optimizer)
    @variable(prob, x[1:n])
    @objective(prob, Min, b' * x)
    @constraint(prob, c1 * x .>= c2)
    optimize!(prob)
    println("Solution: ", objective_value(prob))
end

@time compute()
@time compute()
(base) oscar@Oscars-MBP lore % time ~/julia --project=/tmp/jump ~/Desktop/discourse.jl
  4.070492 seconds (8.34 M allocations: 599.628 MiB, 4.17% gc time, 0.09% compilation time)
  0.280838 seconds (233.24 k allocations: 16.040 MiB, 41.37% gc time)
Solution: 0.6666666666666666
 12.746518 seconds (17.74 M allocations: 1.022 GiB, 3.71% gc time, 44.57% compilation time)
Solution: 0.6666666666666666
  0.000697 seconds (2.87 k allocations: 209.516 KiB)
~/julia --project=/tmp/jump ~/Desktop/discourse.jl  22.63s user 0.55s system 100% cpu 23.102 total

Breaking it down

  • Total: 23 seconds
  • Of which, 4 seconds is using JuMP
  • 13 seconds is the first solve
  • ~0 seconds is the second solve
  • so that leaves 6 seconds to start Julia

We're working on improving the using JuMP and our "time-to-first-solve" issue, but there are a few things you can do in the meantime.

  1. Don't run scripts via `julia file.jl`. Open Julia once and use the REPL. That avoids the 6-second startup overhead.
  2. Solve more than one JuMP model in a session. You only need to pay the 13 seconds once. The second solve was quick.
  3. Solve bigger models. If the solve time is measured in minutes, you probably don't care about 13 seconds of start-up.
  4. Use PackageCompiler https://github.com/JuliaLang/PackageCompiler.jl to avoid some of the latency issues.
  5. Use a different tool. If your workflow is to solve lots of small optimization problems and you can't do the above things, at this moment JuMP might not be the right tool for the job (although we plan on improving the latency issues going forward).
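For points 1 and 2, a typical session would look something like the following (a sketch, using the question's `this_file.jl` script, which defines and calls `compute()`):

```julia
# In the Julia REPL: pay package loading and compilation once per session.
julia> include("this_file.jl")   # loads JuMP, runs compute(); slow the first time
julia> compute()                 # reuses the already-compiled code; fast
julia> compute()                 # every further solve in this session stays fast
```

The key point is that the compilation cost is per-session, not per-solve, so keeping one long-lived Julia session amortizes it away.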
Oscar Dowson
  • Your timing is slightly wrong (see https://github.com/JuliaLang/julia/pull/39802). Using `@time @eval using JuMP` et cetera will account for most of those 6 seconds. Julia startup time should be closer to 0.25 seconds. – Oscar Smith Apr 28 '21 at 22:09