
I want to distinguish between optimizers (gradient-based and gradient-free). If I use the sample optimization from the main OpenMDAO webpage, which uses SLSQP, and check whether the optimizer supports gradients, I get "False", as in:

prob.driver.supports['gradients']

Is this an OpenMDAO or a SciPy related issue?

Is there another way to see whether the optimizer will use gradient calculations before the problem is run?


Based on the answer below, I added this at the beginning of my script. Thanks!

    # user_driver_object is the driver chosen elsewhere in the script
    prob = om.Problem()
    prob.driver = user_driver_object
    prob.setup()
    prob.final_setup()  # needed so the driver's 'supports' flags are correct
    grads.append(prob.driver.supports['gradients'])

1 Answer


In the ScipyOptimizeDriver not all optimizers support gradient-based optimization, so you cannot determine the correct value until you set up your driver. This is done in final_setup() of your problem (which calls _setup_driver() on your driver). This method is called in run_model() and run_driver(), but you can also call it on its own to get the correct properties of your optimizer.

In the example below I ask the driver three times whether it supports gradients. The first time, right after the problem setup, it gives a False answer (the default), because the driver has not been touched yet. If I call final_setup(), this sets up the driver, and all of the driver's properties will be correct. If run_model() or run_driver() is called, this will of course also set up the driver.

So my advice is to just call final_setup() before querying anything from your driver that can change during setup (mostly optimizer-specific properties).

import openmdao.api as om

# build the model
prob = om.Problem()
indeps = prob.model.add_subsystem('indeps', om.IndepVarComp())
indeps.add_output('x', 3.0)
indeps.add_output('y', -4.0)

prob.model.add_subsystem('paraboloid', om.ExecComp('f = (x-3)**2 + x*y + (y+4)**2 - 3'))

prob.model.connect('indeps.x', 'paraboloid.x')
prob.model.connect('indeps.y', 'paraboloid.y')

# setup the optimization
prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'

prob.model.add_design_var('indeps.x', lower=-50, upper=50)
prob.model.add_design_var('indeps.y', lower=-50, upper=50)
prob.model.add_objective('paraboloid.f')

prob.setup()
print("\nSupports gradients (after setup)?")
print(prob.driver.supports['gradients'])

prob.final_setup()
print("\nSupports gradients (after final setup)?")
print(prob.driver.supports['gradients'])

prob.run_driver()
print("\nSupports gradients (after run)?")
print(prob.driver.supports['gradients'])

This results in the following output:

Supports gradients (after setup)?
False
Supports gradients (after final setup)?
True
Optimization terminated successfully.    (Exit mode 0)
            Current function value: -27.33333333333333
            Iterations: 5
            Function evaluations: 6
            Gradient evaluations: 5
Optimization Complete
-----------------------------------
Supports gradients (after run)?
True
  • Ok, thanks for this. I was hoping to fix the problem in an automated manner, but it seems I won't be able to. Sometimes my problem size is too big for full Jacobians (for some components I have to use the 'fd' approximation, which populates the full Jacobian). So I have a flag in the setup() of each component so that, if a gradient-free optimizer is selected, the partials won't be declared (otherwise they won't fit into memory). But this flag is set based on the selected optimizer's gradient support, which I cannot know before setting up my problem. – ali ali Oct 14 '19 at 08:32
  • I will fix this by setting up a dummy problem with the user-given optimizer, checking the dummy problem's "supports gradients" flag, and then setting up my own components with respect to that. – ali ali Oct 14 '19 at 08:45
  • If you would like to go this way, making a dummy problem is indeed a solution. Making a dummy driver and calling `_setup_driver()`, or importing `_gradient_optimizers` from `scipy_optimizer` and checking whether your optimizer is in the list, also does the job (a sketch is shown below). Since both are private, these are somewhat dirtier solutions. – onodip Oct 14 '19 at 09:38
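
For reference, a minimal sketch of the private-list check mentioned in the last comment. Since `_gradient_optimizers` is a private name inside OpenMDAO's `scipy_optimizer` module, it may move or change between versions, so the `final_setup()` approach above is the safer option:

    from openmdao.drivers.scipy_optimizer import _gradient_optimizers

    optimizer_name = 'SLSQP'  # the user-selected optimizer name
    # True if the optimizer is gradient-based, without building any problem
    print(optimizer_name in _gradient_optimizers)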