
If I have data series and a set of constraints and want to predict the most likely values, what is the right algorithm or approach? For example, given the data table as follows:

[Image: data table with columns for D1–D3, G1–G4, and C1–C2; the first three rows show typical values]

The first three rows illustrate typical data values. Imagine we have dozens or hundreds of such rows. The constraints on the system are as follows:

G1 + G2 + G3 + G4 = D1 + D2 + D3
G1 + G2 = D1 - C1
G3 = D2 + C1 - C2
G4 = D3 + C2
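
For reference, these constraints can be written as a linear system A·x = b in the unknowns x = [G1, G2, G3, G4, C1, C2]. A minimal sketch (using hypothetical values for D1, D2, D3) shows that the system has rank 3 with 6 unknowns, i.e. the first equation is implied by the other three and the constraints alone leave 3 degrees of freedom:

```python
import numpy as np

# Unknowns ordered as x = [G1, G2, G3, G4, C1, C2].
# Hypothetical example values for D1, D2, D3 (the real ones come from the data rows).
D1, D2, D3 = 10.0, 7.0, 5.0

# One row per constraint, written as A @ x = b.
A = np.array([
    [1, 1, 1, 1,  0,  0],   # G1 + G2 + G3 + G4 = D1 + D2 + D3
    [1, 1, 0, 0,  1,  0],   # G1 + G2 + C1      = D1
    [0, 0, 1, 0, -1,  1],   # G3 - C1 + C2      = D2
    [0, 0, 0, 1,  0, -1],   # G4 - C2           = D3
], dtype=float)
b = np.array([D1 + D2 + D3, D1, D2, D3])

print("rank(A)  =", np.linalg.matrix_rank(A))  # 3: the first equation is redundant
print("unknowns =", A.shape[1])                # 6: the system is underdetermined
```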

So, given D1, D2, and D3, we need to predict G1, G2, G3, G4, C1, and C2. Note that there may not necessarily be enough information to solve the system by linear programming alone, so some kind of trend analysis or probability distribution might be needed.
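
To make that gap concrete, one naive baseline (illustrative only, not necessarily the right approach) would be to form a purely data-driven guess for G1–G4, C1, C2, e.g. their historical column means over the existing rows, and then project that guess onto the constraint set so the equations hold exactly. A minimal sketch, assuming hypothetical values for the D's and for the historical means:

```python
import numpy as np

# Constraint system A @ x = b in x = [G1, G2, G3, G4, C1, C2],
# keeping only the three independent equations (the first is redundant).
D1, D2, D3 = 10.0, 7.0, 5.0
A = np.array([
    [1, 1, 0, 0,  1,  0],   # G1 + G2 + C1 = D1
    [0, 0, 1, 0, -1,  1],   # G3 - C1 + C2 = D2
    [0, 0, 0, 1,  0, -1],   # G4 - C2      = D3
], dtype=float)
b = np.array([D1, D2, D3])

# Hypothetical data-driven guess: historical column means of G1..G4, C1, C2.
x_prior = np.array([4.0, 5.0, 8.0, 6.0, 1.0, 2.0])

# Project the guess onto {x : A @ x = b}: the closest point (in least squares)
# to x_prior that satisfies the constraints exactly.
x_hat = x_prior + np.linalg.pinv(A) @ (b - A @ x_prior)

print("prediction:", np.round(x_hat, 3))
print("constraint residual:", np.round(A @ x_hat - b, 9))  # ~0
```

Anything more sophisticated (per-variable trend models, a joint probability distribution over the G's and C's) would replace the simple column-mean prior while keeping the same constraint-projection step.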

What is the right algorithm or approach to solve a problem like this?

Tyler Durden
  • Without additional assumptions (e.g., some model for trend analysis) there is nothing you can do. There is probably no off-the-shelf approach. The general problem in optimization theory would be constrained optimization (which is hard). Depending on the model it might be convex (feasible) or nonconvex constrained optimization (very hard). As you know the most about the data, you have to create some statistical model. – sascha Jan 11 '17 at 23:35
  • What are the constraints on G1–G4 and C1–C2 in terms of the domain or set of values they could take? And in relation to the sample dataset you have? Because you could put any values in just to satisfy those 4 equations. – Barry Hurley Jan 12 '17 at 08:17

0 Answers