I have studied the OpenCV Kalman filter implementation, done some basic mouse pointer simulations, and I understand the basics. I seem to be missing a few key points for using it in my application and was hoping someone here could provide a small example.
Using a simple model with velocity and position:
KF.statePre.at<float>(0) = mouse_info.x;
KF.statePre.at<float>(1) = mouse_info.y;
KF.statePre.at<float>(2) = 0;
KF.statePre.at<float>(3) = 0;
KF.transitionMatrix = (Mat_<float>(4, 4) <<
    1, 0, 1, 0,
    0, 1, 0, 1,
    0, 0, 1, 0,
    0, 0, 0, 1);
setIdentity(KF.measurementMatrix);
setIdentity(KF.processNoiseCov, Scalar::all(1e-2));
setIdentity(KF.measurementNoiseCov, Scalar::all(1e-1));
setIdentity(KF.errorCovPost, Scalar::all(.1));
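For completeness, the snippet above assumes the filter was created with four state variables (position and velocity) and two measured ones, i.e. something like:

KalmanFilter KF(4, 2, 0);          // state: x, y, vx, vy; measured: x, y
Mat_<float> measurement(2, 1);     // holds the observed x, y
measurement.setTo(Scalar(0));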
I can do a prediction:
Mat prediction = KF.predict();
and I can do a correction:
Mat estimated = KF.correct(measurement);
Doing this in a loop, I don't fully understand the meaning of the prediction, the estimate and the measurement.
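As far as I can tell, the prediction is the a priori state from the motion model, the measurement is the observed value, and the estimate is the corrected (a posteriori) state. One cycle would then look like this (the measurement declaration and mouse_info are just stand-ins for whatever sensor is observed):

Mat_<float> measurement(2, 1);
for (;;)
{
    Mat prediction = KF.predict();            // a priori: where the model says we should be
    measurement(0) = mouse_info.x;            // observed "truth" from the sensor
    measurement(1) = mouse_info.y;
    Mat estimated = KF.correct(measurement);  // a posteriori: prediction blended with the measurement
}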
Of course a measurement is a "truth" value measured with some equipment, let's say GPS latitude and longitude. I think this video shows some interesting ideas: https://www.youtube.com/watch?v=EQD0PH09Jvo. It uses a GPS unit that updates at 1 Hz and then uses a Kalman filter to predict the position at a 10 Hz rate instead.
How would such a setup look? Is the following example the way to do it?
Mat prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
Mat estimated = KF.correct(measurement);
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
estimated = KF.correct(measurement);
I am unsure whether one can predict 9 values and then, as the 10th, provide a measurement.
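If I understand the video correctly, the 10 Hz output would come from predict() and the 1 Hz GPS fix would go into correct(), with the 0.1 s step encoded in the transition matrix. A rough sketch of what I imagine (the dt = 0.1 value and the gpsLat/gpsLng variables are assumptions on my part):

float dt = 0.1f;  // 0.1 s step -> 10 predictions per second (assumption)
KF.transitionMatrix = (Mat_<float>(4, 4) <<
    1, 0, dt, 0,
    0, 1, 0, dt,
    0, 0, 1,  0,
    0, 0, 0,  1);

Mat_<float> measurement(2, 1);
float gpsLat = 0.f, gpsLng = 0.f;  // assume these are refreshed by the GPS receiver once per second

for (;;)
{
    for (int i = 0; i < 10; ++i)
    {
        Mat prediction = KF.predict();        // 10 Hz predicted position
        // use prediction.at<float>(0), prediction.at<float>(1)
    }
    measurement(0) = gpsLat;                  // the 1 Hz GPS fix
    measurement(1) = gpsLng;
    Mat estimated = KF.correct(measurement);
}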
I have some logged data that I would like to test with. The logged data is GPS data at 1 Hz, in a file where each row is: timestamp:lat:lng. Separately, a series of events is logged at 15 Hz in another file where each row is: timestamp:eventdata.
I would like to use the Kalman filter to simulate the data gathering and predict a position for all the eventdata timestamps based on the 1 Hz measurements. Of course a full example of doing this with OpenCV would be nice, but just starting pointers on the questions above about predict and correct would also be appreciated.
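In outline, what I am after is something like the loop below: correct whenever a 1 Hz fix is available and predict a position for every 15 Hz event timestamp in between. The GpsFix/Event structs and the pre-parsed, time-sorted vectors are stand-ins for the two log files, and I am assuming the filter is created with CV_64F so the double measurement matches:

struct GpsFix { double t, lat, lng; };  // 1 Hz file: timestamp lat lng
struct Event  { double t; };            // 15 Hz file: timestamp eventdata

std::vector<GpsFix> fixes;   // assume filled from the 1 Hz log, sorted by t
std::vector<Event>  events;  // assume filled from the 15 Hz log, sorted by t
Mat_<double> measurement(2, 1);

size_t f = 0;
for (const Event & ev : events)
{
    // feed the filter every GPS fix that precedes this event
    while (f < fixes.size() && fixes[f].t <= ev.t)
    {
        measurement(0) = fixes[f].lat;
        measurement(1) = fixes[f].lng;
        KF.correct(measurement);
        ++f;
    }
    // predict the position at the event's timestamp
    // (presumably the transition matrix needs the actual elapsed time here)
    Mat prediction = KF.predict();
    // ... write ev.t together with the predicted lat/lng ...
}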
Update:
I gave it a try.
#include <opencv2/video/tracking.hpp>
#include <fstream>
#include <iostream>
#include <vector>

using namespace cv;
using namespace std;

int main()
{
    const char * log = "C:/dev/sinnovations/opencv_examples/kalman_tracking/data/geosamples.txt";
    const char * log1 = "C:/dev/sinnovations/opencv_examples/kalman_tracking/data/geosamples1.txt";

    ifstream myReadFile(log);
    ofstream myfile(log1);
    myfile.precision(15);

    // 4 state variables (lat, lng, vlat, vlng), 2 measured (lat, lng), no control, double precision
    KalmanFilter KF(4, 2, 0, CV_64F);
    Mat_<double> measurement(2, 1);
    measurement.setTo(Scalar(0));

    KF.statePre.at<double>(0) = 0;
    KF.statePre.at<double>(1) = 0;
    KF.statePre.at<double>(2) = 0;
    KF.statePre.at<double>(3) = 0;

    // Constant-velocity model with an implicit time step of 1
    KF.transitionMatrix = (Mat_<double>(4, 4) <<
        1, 0, 1, 0,
        0, 1, 0, 1,
        0, 0, 1, 0,
        0, 0, 0, 1);

    KF.processNoiseCov = (cv::Mat_<double>(4, 4) <<
        0.2, 0, 0.2, 0,
        0, 0.2, 0, 0.2,
        0, 0, 0.3, 0,
        0, 0, 0, 0.3);   // note: overwritten by the setIdentity call below

    setIdentity(KF.measurementMatrix);
    setIdentity(KF.processNoiseCov, Scalar::all(1e-4));
    setIdentity(KF.measurementNoiseCov, Scalar::all(1e-1));
    setIdentity(KF.errorCovPost, Scalar::all(.1));

    // each line of the log holds: timestamp(ms) lat lng speed (speed is read but not used)
    std::vector<cv::Point3d> data;
    double ms, lat, lng, speed;
    while (myReadFile >> ms >> lat >> lng >> speed)
    {
        data.push_back(cv::Point3d(ms, lat, lng));
    }
    std::cout << data.size() << std::endl;

    for (int i = 1, ii = (int)data.size(); i < ii; ++i)
    {
        const cv::Point3d & last = data[i - 1];
        const cv::Point3d & current = data[i];
        double steps = current.x - last.x;   // milliseconds between the two fixes

        std::cout << "Time between points: " << steps << endl;
        std::cout << "Measurement: " << current << endl;

        // correct with the previous fix ...
        measurement(0) = last.y;
        measurement(1) = last.z;
        Mat estimated = KF.correct(measurement);
        std::cout << "Estimated: " << estimated.t() << endl;

        // ... then predict forward in 100 ms increments until the next fix
        Mat prediction = KF.predict();
        for (int j = 0; j < steps; j += 100)
        {
            prediction = KF.predict();
            myfile << (long)((last.x - data[0].x) + j) << " "
                   << prediction.at<double>(0) << " "
                   << prediction.at<double>(1) << endl;
        }
        std::cout << "Prediction: " << prediction.t() << endl << endl;
    }
    return 0;
}
but something is missing, as the results in the picture show. The red circles are the logged values and the blue line is the predicted values for the first coordinate of the lat/lng pairs:
I don't think the way I am handling the time values for the observations and the predictions is correct.
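My guess is that the actual time step has to be written into the transition matrix before each predict() call, something like the following (treating the velocity states as "units per 100 ms" is an assumption on my part):

// guess at the missing piece: put the real time step into the transition matrix before predicting
double dt = 100;  // predicting forward in 100 ms increments, as in the loop above
KF.transitionMatrix = (Mat_<double>(4, 4) <<
    1, 0, dt, 0,
    0, 1, 0, dt,
    0, 0, 1,  0,
    0, 0, 0,  1);
Mat prediction = KF.predict();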