I'm facing an issue with the minimization of a specific objective function.
Let's consider the following scenario: we have a vector V with a length of 600.
Our objective function involves two unknown parameters, and the aim is to find the parameter values that minimize it. Concretely, I'm fitting the model x[0] + x[1] / sqrt(t) to V by minimizing the sum of squared residuals.
I've tried various optimization methods such as trust-constr, Nelder-Mead, and SLSQP, but they all perform similarly, and I'm concerned that none of them is actually minimizing the function properly (see the method-comparison sketch after the code below). For instance, I've found promising parameter values that, when supplied as the initial guess, should already be near-optimal; yet the parameters the optimizer returns don't converge to, or even stay at, those values as I expected.

To be clear, the primary goal is to identify the two unknown parameters that give the lowest possible value of the objective function, regardless of which method achieves it. Here's the code I've been using:
import numpy as np
from scipy.optimize import minimize
import matplotlib.pyplot as plt
# Load actual data: V is a length-600 array (loading code omitted here)
batch_size = 100
horizontalsteps = np.arange(1, 60000, batch_size)
def objective(x):
    # Sum of squared residuals between the data V and the model
    # x[0] + x[1] / sqrt(t), evaluated at every step in horizontalsteps
    residuals = V - (x[0] + x[1] / np.sqrt(horizontalsteps))
    return np.sum(residuals ** 2)
# Initial guess for the parameters
initial_guess = [0.003, 6]
# Define parameter bounds
param_bounds = [(-10000, 10000), (-10000, 10000)]
# Use trust-constr algorithm for optimization
result = minimize(objective, initial_guess, method='trust-constr', bounds=param_bounds)
optimized_params = result.x
print("Optimized Parameters:", optimized_params)