
Please help me find an approach to the following problem. Let X be an m×n matrix, X = (x_1, …, x_n), where each x_i is a time series, and let Y be an m×1 vector. To predict the values of Y, we train some model, say a linear regression, so that Y = f(X). Now we need to find X for some given value of Y. The most naive approach is brute force, but what are the proper ways to solve such problems? Perhaps the scipy.optimize package applies here; please enlighten me.

I would like an explanation, or material to read, to understand this.

1 Answer


Most scipy.optimize algorithms are gradient-based. For this kind of optimization problem, we can apply them to reverse-engineer the data (for example, to find the best date to invest in the stock market).

If you want good results, you should choose a suitable step size and an appropriate optimization method; a sketch of the basic setup follows below.
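Here is a minimal sketch of the inversion idea, assuming scikit-learn's LinearRegression as the fitted model. All data, the starting guess x0, and the target value y_target are illustrative, not from the question:

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
m, n = 100, 3                          # m samples, n time series (features)
X = rng.normal(size=(m, n))            # synthetic stand-in for the real data
Y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=m)

model = LinearRegression().fit(X, Y)   # Y = f(X)

y_target = 3.0                         # the given value of Y we want to invert

def loss(x):
    # squared error between the model's prediction at x and the target
    return (model.predict(x.reshape(1, -1))[0] - y_target) ** 2

x0 = np.zeros(n)                       # starting guess
res = minimize(loss, x0, method="BFGS")
print(res.x, model.predict(res.x.reshape(1, -1)))
```

Note that for n > 1 the inverse problem is underdetermined: many x map to the same y, so the minimizer returns one solution, which depends on the starting guess x0.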

However, we should not classify the problem as "prediction" of x_i, because what we are actually doing is finding a local/global minimum/maximum. With a method such as Newton-CG, your data/equation must already contain all the information needed (it is effectively a simulation); no prediction is made by the method itself. A sketch of Newton-CG on this problem follows below.
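Since the answer mentions Newton-CG specifically: that method requires an explicit gradient (the jac argument). For a linear model f(x) = w @ x + b, the squared-residual loss has a closed-form gradient, so it applies directly. The coefficients w, b, and y_target below are assumed illustrative values:

```python
import numpy as np
from scipy.optimize import minimize

w = np.array([1.5, -2.0, 0.5])     # assumed regression coefficients
b = 0.2                            # assumed intercept
y_target = 3.0                     # the given Y value

def loss(x):
    return (w @ x + b - y_target) ** 2

def grad(x):
    # d/dx (w@x + b - y)^2 = 2 * (w@x + b - y) * w
    return 2.0 * (w @ x + b - y_target) * w

res = minimize(loss, np.zeros(w.size), method="Newton-CG", jac=grad)
print(res.x, w @ res.x + b)        # res.x maps (approximately) to y_target
```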

If you want to make a prediction over "time", you could split the time data into categories ("year", "month", ...) and then use unsupervised learning to group the data. If a trend emerges, you can reverse-engineer the result to recover the time; see the sketch after this paragraph.
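A rough sketch of that suggestion, under the assumption that "grouping" means clustering with something like scikit-learn's KMeans; the date range, the derived features, and the number of clusters are all illustrative choices:

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
dates = pd.date_range("2020-01-01", periods=200, freq="D")
values = np.sin(np.arange(200) / 20.0) + 0.1 * rng.normal(size=200)

# expand each timestamp into coarse time categories plus the observed value
features = pd.DataFrame({
    "year": dates.year,
    "month": dates.month,
    "dayofweek": dates.dayofweek,
    "value": values,
})

labels = KMeans(n_clusters=4, n_init=10).fit_predict(features)

# inspect which time categories dominate each cluster to "recover" the time
print(features.assign(cluster=labels).groupby("cluster")["month"].agg(["min", "max"]))
```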
