I want to simulate a time series y1 with an AR(p) process, and then solve the differential equation dy/dt = AR(p)(y1). This is the Scilab code I wrote (the AR coefficients are calculated with the lev() function and then normalized in a part of the code I omitted to keep it short).
t = [0:0.1:2*%pi];
y1 = sin(t);
C = 1;
y2 = -1*cos(t) + C;
ar = [1. -0.0195380 -0.0154317 -0.0116690 -0.0081661 -0.1015448]
y1_m = filter(1, ar, y1); //Generates simulated series
function y_m = far(t, y, ar, y1)
y_m = filter(1, ar, y1);
endfunction
//Solves ODE
y0 = y1(1); t0 = t(1);
y2m = ode(y0, t0, t, list(far, ar));
scf(1); plot(t, y2, '-k', t, y2m, '.k');
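For context on the failure below: Scilab's ode expects the argument function to return the value of dy/dt at a single point (t, y), whereas filter returns a whole series, which is the kind of mismatch that triggers error 98. As an illustration of the expected shape of the right-hand side, here is a minimal sketch in Python, using scipy.integrate.odeint as a stand-in for Scilab's ode, with the simple forcing y1 = sin(t) from the example above (the variable names are illustrative, not from the original code):

```python
import numpy as np
from scipy.integrate import odeint

t = np.arange(0, 2 * np.pi, 0.1)

# The right-hand side must return dy/dt evaluated at one (y, t) pair,
# not a whole pre-filtered series.
def rhs(y, t):
    return np.sin(t)  # the known forcing signal, y1 = sin(t)

y0 = -1.0                       # matches y2 = -cos(t) + C with C = 0
y2m = odeint(rhs, y0, t).ravel()

# analytic solution for comparison
y2 = -np.cos(t)
```

The same shape applies in Scilab: far(t, y) should return a scalar (or a vector of the same size as y), evaluated at that single t.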
However, the code fails with message:
y2m = ode(y0, t0, t, list(far, ar_n));
!--error 98
The variable returned by the Scilab argument function is not correct.
(Original message in Italian: "La variabile restituita dalla funzione argomento di Scilab non è corretta.")
at line 26 of exec file called by :
How can I solve this ODE?
Added Case Description:
What I have is a system that performs a certain operation on the input signal Y, which is a time series, producing an output Z. The system equations are known, so I can determine Z provided that Y is a known function of time.
And by writing that, I realize where my mistake lies: with an AR model, I am expressing Y not as a function of time, but as a function of its own past values.
So I should rather ask, how can I fit a function of time to a time series?
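To make that distinction concrete: an AR(p) model generates each sample from the previous p samples, so the time index never appears explicitly on the right-hand side. A minimal sketch in Python, with hypothetical AR(2) coefficients (not the ones computed by lev() in the question):

```python
import numpy as np

def simulate_ar(coeffs, x0, n):
    """Simulate an AR(p) series: x[k] = sum_i coeffs[i] * x[k-1-i].
    The recursion uses only past values x[k-1], ..., x[k-p];
    the time index k itself never enters the right-hand side."""
    p = len(coeffs)
    x = list(x0)
    for k in range(len(x0), n):
        past = x[k - p:k][::-1]  # x[k-1], ..., x[k-p]
        x.append(sum(c * v for c, v in zip(coeffs, past)))
    return np.array(x)

# illustrative AR(2) coefficients (hypothetical, not from lev())
series = simulate_ar([1.5, -0.9], [1.0, 1.0], 50)
```

This is why an ODE solver, which needs Y evaluated at arbitrary times t, cannot use the AR recursion directly, and why fitting an explicit function of time to the series is the natural reformulation of the question.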