
I'm facing an issue with the minimization of a specific objective function.

Consider the following scenario: we have a vector V of length 600.

Our goal is to fit the model x[0] + x[1] / sqrt(t) to V, i.e., to find the two unknown parameters x[0] and x[1] that minimize the sum of squared residuals (the objective function below).

Here's the code I've been using (the full script is at the end of this post):

I've tried various optimization methods such as trust-constr, Nelder-Mead, SLSQP, and more. However, all of these methods return similar results, and I'm concerned that none of them is actually minimizing the function properly.
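For example, this is roughly how I compare the methods (a sketch; it assumes V, objective, initial_guess, and param_bounds are defined as in the full script below):

from scipy.optimize import minimize

# Try several solvers on the same problem and compare the final objective values
# (Nelder-Mead honors bounds only in recent SciPy versions)
for method in ['trust-constr', 'Nelder-Mead', 'SLSQP', 'L-BFGS-B', 'Powell']:
    result = minimize(objective, initial_guess, method=method, bounds=param_bounds)
    print(method, '-> objective:', result.fun, 'params:', result.x)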

For instance, by trial and error I've found promising parameter values that give a lower objective value than the solutions the optimizers return. Even when I supply these values as the initial guess, the optimizer does not converge to (or stay near) them as I expected.
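Here is the kind of check I'm doing (a sketch; hand_tuned is just a placeholder for the promising values I found by trial and error, and result is the output of minimize from the script below):

# Compare the objective at a hand-tuned guess with the optimizer's answer
hand_tuned = [0.003, 6]  # placeholder values, not the actual hand-tuned ones
print('objective at hand-tuned params:', objective(hand_tuned))
print('objective at optimizer result :', objective(result.x))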

To be clear, the primary goal is simply to find the parameter values that give the lowest possible value of the objective function, regardless of which method gets there.


import numpy as np
from scipy.optimize import minimize
import matplotlib.pyplot as plt


# Load actual data into V here (a length-600 vector; not shown)

batch_size = 100
horizontalsteps = np.arange(1, 60000, batch_size)  # 600 sample points

def objective(x):
    # Sum of squared residuals between V and the model x[0] + x[1] / sqrt(t)
    sum_val = 0
    for i in range(len(horizontalsteps)):
        sum_val += (V[i] - (x[0] + x[1] / np.sqrt(horizontalsteps[i]))) ** 2
    return sum_val

# Initial guess for the parameters
initial_guess = [0.003, 6]

# Define parameter bounds
param_bounds = [(-10000, 10000), (-10000, 10000)]

# Use the trust-constr algorithm for optimization
result = minimize(objective, initial_guess, method='trust-constr', bounds=param_bounds)

optimized_params = result.x
print("Optimized Parameters:", optimized_params)
  • What is `V[I]`? Also, what makes you say the result is not optimal? – jared Aug 24 '23 at 15:05
  • Have you tried a genetic algorithm? https://en.wikipedia.org/wiki/Genetic_algorithm – Martin Clever Aug 24 '23 at 15:16
  • Is the objective function strictly convex, semi-definite, etc.? What is the optimal solution? There is not enough information here to help. – Michael Ruth Aug 24 '23 at 15:59
  • @jared, thanks. I've corrected it. It should be (i), the index into the vector V in the for loop. – p.golestaneh Aug 25 '23 at 06:30
  • @Michael Ruth, thank you very much! No, the objective function is not convex or semi-definite, and the optimal solution is unknown. I think these minimization methods don't work properly, since by trial and error I can find values of the two unknown parameters that are better than the methods' solutions (i.e., the MSE between the curve x[0] + x[1] / np.sqrt(horizontalsteps[i]) and the actual curve V is lower than for the parameters the methods return). – p.golestaneh Aug 25 '23 at 06:41
  • @Martin Clever, I'm afraid not. I haven't tried a GA for this problem; I have used one for other problems, but the results differ considerably from run to run. – p.golestaneh Aug 25 '23 at 07:20
  • Without V, we cannot really help you. – jared Aug 25 '23 at 15:22

1 Answer


I don't know if this is what you are looking for, but this program does find local minima:

import numpy as np
import random

batch_size = 100
horizontalsteps = np.arange(1, 60000, batch_size)
V = [random.randint(1, 100) for i in range(600)]  # placeholder data standing in for the real V
print(V)

def objective(x):
    sum_val = 0
    for i in range(len(horizontalsteps)):
        sum_val += (V[i] - (x[0] + x[1] / np.sqrt(horizontalsteps[i]))) ** 2
    return sum_val


# Initial guess for the parameters
initial_guess = [0.003, 6]
step = 1.0
value = objective(initial_guess)
# Pattern search: probe steps of size `step` in each coordinate direction,
# move to the best improving neighbor, shrink the step when stuck,
# and grow it while progress is being made in both coordinates.
while step > 2**-10:
    x, y = 0, 0
    for k in [-step, 0, step]:
        for j in [-step, 0, step]:
            new_value = objective([initial_guess[0] + k, initial_guess[1] + j])
            if new_value < value:
                x, y, value = k, j, new_value
    initial_guess = [initial_guess[0] + x, initial_guess[1] + y]
    if x == 0 and y == 0:
        step /= 2.0  # no neighbor improved: refine the step size
    elif x != 0 and y != 0:
        step *= 2    # improving in both coordinates: take bigger steps
    print(value, initial_guess, step)

It is basically a simple pattern search (a derivative-free hill climb) rather than gradient descent: it never computes a gradient, it just probes fixed-size steps in each coordinate direction and adapts the step size. If you are looking for something else, please be more specific in your question.
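If the goal is the global minimum of this particular objective rather than just a local one, note that the model x[0] + x[1] / np.sqrt(t) is linear in the two parameters, so the least-squares problem can also be solved in closed form with np.linalg.lstsq. A minimal sketch (it reuses a random placeholder V as above, since the real data isn't available):

import numpy as np
import random

batch_size = 100
horizontalsteps = np.arange(1, 60000, batch_size)
V = np.array([random.randint(1, 100) for i in range(600)])  # placeholder data

# Design matrix: a column of ones (for x[0]) and a column of 1/sqrt(t) (for x[1])
A = np.column_stack([np.ones(len(horizontalsteps)),
                     1.0 / np.sqrt(horizontalsteps)])

# Solve min ||A @ x - V||^2 exactly
params, residuals, rank, sv = np.linalg.lstsq(A, V, rcond=None)
print('Least-squares parameters:', params)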

Martin Clever