
I want to express and solve the equations below in a constraint programming language.

I have time values t and I am trying to find the best multipliers k that minimize my objective function.

Times t1, t2, t3, ... are given as input.

Multipliers k1, k2, k3, ... (these are continuous variables that need to be found).

c1, c2, ..., cN are constants.

Main equation: k1*sin(c1*x) + k2*sin(c2*x) + k3*sin(c3*x) + k4*cos(c1*x) + ...

The problem is to minimize the results of all the equations below with the best possible values of (k1, k2, k3, ...). It is also known that there is no exact solution to the problem. So:
when x is t1 --> P1-k1*sin(c1*t1)-k2*sin(c2*t1)-k3*sin(c3*t1)-k4*cos(c1*t1)...

when x is t2 --> P2-k1*sin(c1*t2)-k2*sin(c2*t2)-k3*sin(c3*t2)-k4*cos(c1*t2)...

when x is t3 --> P3-k1*sin(c1*t3)-k2*sin(c2*t3)-k3*sin(c3*t3)-k4*cos(c1*t3)...

P1 is the value bound to the time variable t1. P(t) is not an analytic function; I only have sampled values for it, e.g. when t1 = 5, P1 = 0.7; when t2 = 6, P2 = 0.3; etc.

Is it possible to solve this in MiniZinc or any other CP system?

Gökhan Yu
  • I may have misunderstood, but it seems for every time tj the expression `sin(ci*tj)` is a constant, call it `Sij`. Your equations then look like `Pj-k1*S1j-k2*S2j-k3*S3j...` i.e. they are simply linear. The question then is what you mean exactly by "minimize all equations": you could minimize their sum, their maximum, a pareto-optimum, etc – jschimpf May 23 '15 at 21:53
  • Another question to Gökhan: When you talk about the function "P(t)", is this the "P1", "P2" etc, so it should be "P(1)", "P(2)", etc? – hakank May 25 '15 at 07:06
  • @hakank Yes, P(t) gives P(t1)=P1, P(t2)=P2... etc. – Gökhan Yu May 25 '15 at 07:17
  • @jschimpf They are simply linear, but I wrote it that way because someone might make use of its gradient. And "minimize their sum" is the objective function. – Gökhan Yu May 25 '15 at 07:20

1 Answer


I don't think that CP is particularly suited to solve this problem, as you don't really have constraints here. All you have are functions you want to minimize (f1, ..., fi), and a few degrees of freedom to do so (k1, ..., ki).

I feel like the problem is a pretty good candidate for the least squares method. Instead of trying to "fit" your functions f to a given value, you are trying to minimize them. So what you can do is try to fit them to 0. (So we would be dealing with non-linear least squares in that case.)

Here is what it would look like written in Python:

import numpy as np
from scipy.optimize import curve_fit

xdata = np.array([t1, t2, t3, t4, ..., t10])  # the given time samples
ydata = np.zeros(10)  # this is your "target". 10 = number of ti

def func(x, k1, k2, ..., ki):
    # P(x) looks up the sampled value P1, P2, ... for the given time
    return (P(x) - k1*np.sin(c1*x) - k2*np.sin(c2*x) - k3*np.sin(c3*x) - k4*np.cos(c1*x) ...)**2  # the square is a trick to minimize the function

popt, pcov = curve_fit(func, xdata, ydata, p0=(1.0, 1.0, ...))  # p0 = initial guesses for the ki
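
As pointed out in the comments, each sin(ci*tj) and cos(ci*tj) is just a constant once the tj are known, so the model is actually linear in the ki and ordinary linear least squares works directly, with no initial guess needed. Below is a minimal, self-contained sketch of that route; the time samples, constants ci and sampled P values are made-up placeholder data, not values from the question.

import numpy as np

# Hypothetical stand-in data for the question's t, P and c values
t = np.array([5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0])   # times t1..t8
P = np.array([0.7, 0.3, 0.5, 0.1, 0.4, 0.6, 0.2, 0.8])      # sampled P(t1)..P(t8)
c = np.array([1.0, 2.0, 3.0])                                # constants c1..c3

# Design matrix: one sin column and one cos column per constant ci,
# so row j is [sin(c1*tj), sin(c2*tj), sin(c3*tj), cos(c1*tj), cos(c2*tj), cos(c3*tj)]
S = np.hstack([np.sin(np.outer(t, c)), np.cos(np.outer(t, c))])

# Solve min_k ||P - S k||^2 by ordinary linear least squares
k, residuals, rank, sv = np.linalg.lstsq(S, P, rcond=None)
print(k)          # the fitted multipliers k1..k6
print(P - S @ k)  # the remaining residual at each tj

The result is the set of ki that minimizes the sum of squared residuals over all the given times, which matches the "minimize their sum" objective up to the usual least-squares squaring.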
Eric Leibenguth