Suppose I've got a set of results from a study into the behaviour of a particular migratory bird. The bird has been tagged, and a GPS receiver records the migration path it follows each year over a five-year period. The results are stored in a SQL Server table that contains one geography linestring for each year's path.
How would you go about defining the linestring representing the "average" path followed over the five-year period?
Note that each sample linestring may contain a different number of points. They also don't start and end at exactly the same points.
The best approach I've got so far is to interpolate points at fixed proportions along each linestring (the start point, a quarter of the way along, halfway along, and so on), then calculate the mean lat/long of the corresponding positions across all routes and construct a new geography linestring from those averaged points. Something like the sketch below.
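To make that concrete, here's a rough Python sketch of the idea, done outside the database; the planar distance approximation, the function names (`resample`, `average_path`) and the sample count `n` are just illustrative, not a finished implementation:

```python
import math

def resample(path, n=20):
    """Return n points at equal fractions of the path's total length.

    `path` is a list of (lat, lon) tuples. Distances use a crude
    equirectangular approximation, which is fine since super-accuracy
    isn't needed.
    """
    def dist(a, b):
        # rough planar distance in degrees, with longitude scaled by latitude
        mean_lat = math.radians((a[0] + b[0]) / 2)
        return math.hypot(a[0] - b[0], (a[1] - b[1]) * math.cos(mean_lat))

    # cumulative distance from the start to each vertex
    cum = [0.0]
    for a, b in zip(path, path[1:]):
        cum.append(cum[-1] + dist(a, b))
    total = cum[-1]

    points = []
    seg = 0
    for i in range(n):
        target = total * i / (n - 1)
        # advance to the segment containing the target distance
        while seg < len(path) - 2 and cum[seg + 1] < target:
            seg += 1
        span = cum[seg + 1] - cum[seg]
        t = (target - cum[seg]) / span if span else 0.0
        a, b = path[seg], path[seg + 1]
        points.append((a[0] + t * (b[0] - a[0]),
                       a[1] + t * (b[1] - a[1])))
    return points

def average_path(paths, n=20):
    """Resample each path to n points and take the per-index mean."""
    resampled = [resample(p, n) for p in paths]
    return [
        (sum(p[i][0] for p in resampled) / len(resampled),
         sum(p[i][1] for p in resampled) / len(resampled))
        for i in range(n)
    ]
```

The averaged point list would then be turned back into a geography linestring (e.g. by building WKT and constructing the geography from it).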
I've looked in a few computational geometry books to see if there's a better-known algorithm or technique for this, but there doesn't seem to be anything relevant. I can't believe that nobody has done this before, though...
I don't need exact code - just suggestions for any better general approaches. I don't need "super-accuracy" either. As a side note, I'd ideally like the approach to be applicable to two or more polygons too.
Thanks for any suggestions!