
I am using the scipy library for an optimization problem. My objective function is a trained SVR regressor. Different initial values give different optimal values. Why is that?


import numpy as np
from scipy.optimize import minimize
from scipy.optimize import Bounds

# lower and upper bounds for each of the 9 input variables
bounds = Bounds([26, 26, 8, 6, 400, 100, 0, 25, 2],
                [36, 38, 28, 28, 1800, 800, 100, 50, 7])


def objective(x):
    x_trail = x.reshape(1, -1)               # a single sample as a 2-D row
    x_trail = sc_X.transform(x_trail)        # apply the training-set feature scaling
    y_trail = regressorSVR.predict(x_trail)  # prediction in the scaled target space
    y_trail = sc_Y.inverse_transform(y_trail.reshape(1, -1))  # back to original units
    return y_trail[0, 0]                     # minimize() expects a scalar, not an array

x0 = np.array([26, 36, 11, 7, 580, 377, 84, 43, 4.3])  # initial guess
res = minimize(objective, x0, method='trust-constr',
               options={'verbose': 1}, bounds=bounds)

optimal_values = res.x

If I change x0 to different values, the optimal values are different. Why is that?

This is the code for the SVR regression:

X = dataset.iloc[:, :-1].values   # all columns but the last as features
y = dataset.iloc[:, 9:10].values  # the target column, kept 2-D for StandardScaler

# Splitting the dataset into the Training set and Test set
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.2, random_state = 0)

# Feature Scaling
from sklearn.preprocessing import StandardScaler
sc_X = StandardScaler()
sc_Y = StandardScaler()
X_train = sc_X.fit_transform(X_train)
X_test = sc_X.transform(X_test)

y_train = sc_Y.fit_transform(y_train)
y_test = sc_Y.transform(y_test)


from sklearn.svm import SVR
regressorSVR = SVR(kernel='rbf')

# SVR expects a 1-D target; ravel() avoids a shape warning/error
regressorSVR.fit(X_train, y_train.ravel())

1 Answer


I got the answer. My objective function is non-linear, so this is a non-convex optimization problem, and the solvers behind scipy.optimize.minimize only guarantee local convergence. On a non-convex problem each run converges to a local optimum that depends on the starting point, which is why different values of x0 give different "optimal" values. To search for the global optimum you need a global solver; scipy does ship a few (e.g. differential_evolution, basinhopping, dual_annealing), but no solver can guarantee the global optimum of an arbitrary non-convex problem. Roughly speaking, local vs. global convergence on non-convex problems is a P vs. NP kind of distinction.
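
As an illustration (not the original poster's code), here is a minimal multi-start sketch. It assumes the objective, sc_X, sc_Y, and regressorSVR defined in the question; the 20 restarts and the random seed are arbitrary choices. It restarts the local solver from random points inside the bounds, keeps the best result, and then tries scipy's differential_evolution on the same box for comparison:

import numpy as np
from scipy.optimize import minimize, differential_evolution, Bounds

# Bounds from the question, as arrays so we can sample inside the box.
lb = np.array([26, 26, 8, 6, 400, 100, 0, 25, 2], dtype=float)
ub = np.array([36, 38, 28, 28, 1800, 800, 100, 50, 7], dtype=float)

# Multi-start: run the local solver from several random initial points
# and keep the best local optimum found. 20 restarts is an arbitrary choice.
rng = np.random.default_rng(0)
best = None
for _ in range(20):
    x0 = rng.uniform(lb, ub)  # random start inside the box
    res = minimize(objective, x0, method='trust-constr', bounds=Bounds(lb, ub))
    if best is None or res.fun < best.fun:
        best = res
print('multi-start best:', best.fun, best.x)

# Alternatively, one of scipy's global optimizers on the same box.
res_de = differential_evolution(objective, list(zip(lb, ub)), seed=0)
print('differential_evolution:', res_de.fun, res_de.x)

Note that multi-start does not guarantee the global optimum either; it only makes it more likely that a good basin is found.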
