In Matlab, I know how to program a multidimensional optimization. But I would like to dynamically choose a subset of variables to be optimized.

Suppose I have a three-dimensional variable vector, but I want Matlab to only optimize the first and the second variable. How can this be achieved?

x1 = 0:5; p_true = [1 2 3];             % True parameters
y1 = polyval(p_true, x1);               % True data
yn = y1 + sin(x1);                      % Noisy data

optimizationOptions = optimset('Display', 'final', 'TolFun', 1e-7, 'MaxFunEvals', 1e5,...
    'MaxIter', 1e4);
p0 = [0.5 0.75 1];                      % Initial guess
[p1, ~, ~, optOut] = fminsearch(@(parameters) objFunB(parameters, x1, yn), p0,...
    optimizationOptions);

plot(x1, yn, 'rx');
hold on
plot(x1, polyval([p1(1:2) 3], x1), 'b');

function rmse = objFunB(parameters, x1, yn)
    % Overwrite the third component so it stays fixed; fminsearch still
    % varies it, but the variation has no effect on the output
    parameters(3) = 3;
    rmse = sum((yn - polyval(parameters, x1)).^2);
end

This clumsy solution tricks fminsearch into treating the third variable as insensitive, since it is overwritten inside the objective function and thus does not affect the output value.

Defining the third value as a separate variable (i.e., outside parameters) is not an option, since this would require considerable recoding each time I choose a different variable to be optimized.

There must be a better solution. Any ideas?

winkmal
  • Did you read somewhere that this is a viable way of doing optimization, or is this your own idea? Is this supposed to be a way to influence the "direction" of convergence to the one optimum that you're looking for? Perhaps you might want to consider [Global Optimization](https://www.mathworks.com/products/global-optimization.html) algorithms (I just want to help you avoid reinventing the wheel, if it is the case). – Dev-iL Jan 18 '18 at 08:43
  • Actually, the variables of my optimization are the parameters of a much bigger model, of which some are known *a priori*. That's why I want to dynamically choose the ones which are optimized. – winkmal Jan 24 '18 at 12:33

2 Answers

You can use an optimization method that lets you specify lower and upper bounds on your parameters, and then set both bounds to the same value. This way the parameter is fixed and the optimizer will not try to change it.

This approach assumes you know the value of the parameter that you do not want to optimize. The result of the optimization may change if you change this value.

I used fmincon to solve your example:

x1 = 0:5; p_true = [1 2 3];             % True parameters
y1 = polyval(p_true, x1);               % True data
yn = y1 + sin(x1);                      % Noisy data


p0 = [0.5 0.75 1];                      % Initial guess

lb = [-Inf, -Inf, 3];
ub = [Inf, Inf, 3];

[p1, ~, ~, optOut] = fmincon(@(parameters) objFunB(parameters, x1, yn), p0, [],[],[],[], lb, ub);

plot(x1, yn, 'rx');
hold on
plot(x1, polyval([p1(1:2) 3], x1), 'b');


function rmse = objFunB(parameters, x1, yn)
    rmse = sum((yn - polyval(parameters, x1)).^2);
end
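
An equivalent way to pin a parameter with fmincon, instead of equal bounds, is a linear equality constraint `Aeq*p = beq`. The sketch below (reusing `p0` and `objFunB` from above) fixes the third parameter at 3 this way:

```matlab
% Sketch: fix the third parameter via a linear equality constraint
% [0 0 1] * p = 3, leaving the first two parameters free.
Aeq = [0 0 1];
beq = 3;
[p1, ~, ~, optOut] = fmincon(@(parameters) objFunB(parameters, x1, yn), ...
    p0, [], [], Aeq, beq);
```

With a mask vector, `Aeq` and `beq` can be assembled programmatically for whichever subset of parameters should stay fixed.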
Alex bGoode

While bounded optimization with lb == ub is possible, it limits the available algorithms to those that accept constraints (and may also have a performance impact, which I have yet to test). The built-in fminsearch does not support bound constraints (although fminsearchbnd from the File Exchange does).
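
For completeness, the equal-bounds trick also works with fminsearchbnd, assuming John D'Errico's File Exchange submission is on the path (its call signature mirrors fminsearch with added bounds):

```matlab
% Sketch, assuming fminsearchbnd (File Exchange) is installed.
% Equal lower and upper bounds pin the third parameter at 3.
lb = [-Inf, -Inf, 3];
ub = [ Inf,  Inf, 3];
p1 = fminsearchbnd(@(parameters) objFunB(parameters, x1, yn), p0, lb, ub);
```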

I have come up with the following solution, starting from line 4 of the original script. It makes use of logical indexing.

all_parameters = p_true';
logOfPar2Opt = logical([1 0 1]');
p0 = all_parameters(logOfPar2Opt);      % Initial guess (logical indexing;
                                        % nonzeros() would drop zero-valued parameters)
optimizationOptions = optimset('Display', 'final', 'TolFun', 1e-7, 'MaxFunEvals', 1e5,...
    'MaxIter', 1e4);
[p1, fval, ~, optInfo] = fminsearch(@(parameters) objFunB(parameters, logOfPar2Opt,...
        all_parameters, x1, yn), p0, optimizationOptions);
indOfPar2Opt = find(logOfPar2Opt);
p_opt = all_parameters;
p_opt(indOfPar2Opt) = p1;

plot(x1, yn, 'rx');
hold on
plot(x1, polyval(p_opt, x1), 'b');

%% Separate objective functions
function rmse = objFunB(par2opt, logOfPar2Opt, all_parameters, x1, yn)
    indOfPar2Opt = find(logOfPar2Opt);
    prms = all_parameters;
    prms(indOfPar2Opt) = par2opt;
    rmse = sum((yn - polyval(prms, x1)).^2);
end
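
The same idea can be packaged so that switching the optimized subset only means changing the mask. A sketch (the names `embed`, `freeMask`, and `fullObj` are illustrative, not part of the code above):

```matlab
% Illustrative generalization: one mask selects the free parameters.
freeMask = logical([1 1 0]');           % optimize p(1), p(2); keep p(3) fixed
fullObj  = @(p) sum((yn - polyval(p, x1)).^2);
p0Free   = all_parameters(freeMask);    % initial guess for the free subset
pFree    = fminsearch(@(pf) fullObj(embed(pf, freeMask, all_parameters)), p0Free);
p_opt    = embed(pFree, freeMask, all_parameters);

function pFull = embed(pFree, mask, pFull)
    % Re-insert the optimized subset into the full parameter vector
    pFull(mask) = pFree;
end
```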
winkmal