
I want to minimize the following function:

f(x) = sum_{i=1}^{n-1} [ 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]

Here, n can be 5, 10, 50, etc. I want to use MATLAB to solve this problem with gradient descent and a quasi-Newton method with the BFGS update, together with a backtracking line search. I am a novice in MATLAB. Can anyone help, please? I can find a solution for a similar problem at this link: https://www.mathworks.com/help/optim/ug/unconstrained-nonlinear-optimization-algorithms.html .

But I really don't know how to write a function of a vector argument in MATLAB (in my case the input x can be an n-dimensional vector).

kayas

1 Answer


You will have to make quite a leap to get where you want to be -- may I suggest going through a basic tutorial first in order to digest basic MATLAB syntax and concepts? Another useful read is the very basic example of unconstrained optimization in the documentation. However, the answer to your question involves only basic syntax, so we can go through it quickly nevertheless.

The absolute minimum required to invoke the unconstrained nonlinear optimization algorithms of the Optimization Toolbox is the formulation of an objective function. That function is supposed to return the function value f of your objective at any given point x, and in your case it reads

function f = objfun(x)
    % Generalized Rosenbrock function, evaluated with vectorized indexing
    f = sum(100 * (x(2:end) - x(1:end-1).^2).^2 + (1 - x(1:end-1)).^2);
end
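As a quick sanity check (the test point is my own addition), the objective vanishes at the all-ones vector, which is the known minimizer of this function:

objfun(ones(5,1))   % returns 0: the minimum is at x = [1; 1; ...; 1]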

Notice that

  • we select the individual components of the x vector by matrix indexing, and that

  • the .^ notation squares the operand elementwise (see the short check after this list).
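For instance, you can verify both points directly in the command window (the sample vector here is just for illustration):

x = [1; 2; 3; 4];
x(2:end) - x(1:end-1).^2   % yields [1; -1; -5] elementwise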

For simplicity, save this function to a file objfun.m in your current working directory, so that you have it available from the command window.
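Alternatively -- this is just an option, not required for what follows -- you can define such a one-liner as an anonymous function and skip the file:

objfun = @(x) sum(100 * (x(2:end) - x(1:end-1).^2).^2 + (1 - x(1:end-1)).^2);

If you do that, pass the handle as objfun rather than @objfun in the call below.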

Now all you have to do is call the appropriate optimization algorithm, say, the quasi-Newton method, from the command window:

n = 10; % Use n variables
options = optimoptions(@fminunc,'Algorithm','quasi-newton'); % Use the quasi-Newton method
x0 = rand(n,1); % Random starting guess
[x,fval,exitflag] = fminunc(@objfun, x0, options); % Solve!
fprintf('Final objval=%.2e, exitflag=%d\n', fval, exitflag);

On my machine I see that the algorithm converges:

Local minimum found.

Optimization completed because the size of the gradient is less than the default value of the optimality tolerance.

Final objval=5.57e-11, exitflag=1
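Since you also asked about gradient descent with a backtracking line search, here is a minimal hand-rolled sketch of steepest descent with Armijo backtracking. It reuses objfun from above; the function names graddescent and objgrad, as well as all parameter values, are my own choices and not part of the toolbox:

function [x, fval] = graddescent(objfun, objgrad, x0, tol, maxiter)
    % Steepest descent with Armijo backtracking line search (sketch)
    x = x0;
    for k = 1:maxiter
        g = objgrad(x);
        if norm(g) < tol
            break;              % gradient small enough: stop
        end
        p = -g;                 % steepest descent direction
        t = 1;                  % initial trial step length
        f0 = objfun(x);
        % Shrink t until the Armijo (sufficient decrease) condition holds
        while objfun(x + t*p) > f0 + 1e-4 * t * (g.' * p)
            t = 0.5 * t;
        end
        x = x + t*p;
    end
    fval = objfun(x);
end

function g = objgrad(x)
    % Gradient of the generalized Rosenbrock function above
    g = zeros(numel(x), 1);
    d = x(2:end) - x(1:end-1).^2;       % inner residuals x(i+1) - x(i)^2
    g(1:end-1) = -400 * x(1:end-1) .* d - 2 * (1 - x(1:end-1));
    g(2:end)   = g(2:end) + 200 * d;
end

Calling, e.g., [x, fval] = graddescent(@objfun, @objgrad, rand(10,1), 1e-6, 1e5); should approach the all-ones minimizer, although plain steepest descent converges far more slowly on this function than the quasi-Newton run above. For your BFGS variant you would set p = -H*g instead and update the inverse-Hessian approximation H with the BFGS formula after each step.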

Robert