I'm measuring some system performance data and storing it in a database; from those data points I'm drawing line graphs over time. By their nature, those data points are a bit noisy, i.e. every single point deviates at least a bit from the local mean value. When drawing the line graph straight from one point to the next, it produces jagged graphs. At a large time scale, like more than 10 data points per pixel, this noise gets compressed into a wide jagged band that is, say, 20px high instead of the 1px line you get at smaller scales.
I've read about line smoothing, anti-aliasing, simplification, and similar techniques, but everything I've found seems to be about something else.
I don't need anti-aliasing; .NET already does that for me when drawing the line on the screen.
I don't want simplification either. The extreme values need to remain visible, at least most of them.
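To illustrate what I mean by keeping the extremes (a hypothetical sketch, not something I have working): when several raw samples collapse onto one pixel column, I'd want the per-window minimum and maximum to survive alongside the average, roughly like this:

```csharp
// Hypothetical sketch: reduce each window of raw samples to a
// (min, max, average) triple so spikes are not averaged away.
using System;
using System.Collections.Generic;
using System.Linq;

struct Bucket
{
    public double Min, Max, Avg;
}

static class Downsampler
{
    // Groups 'values' into buckets of 'windowSize' samples each and keeps
    // min/max alongside the mean.
    public static List<Bucket> Reduce(IList<double> values, int windowSize)
    {
        var result = new List<Bucket>();
        for (int i = 0; i < values.Count; i += windowSize)
        {
            var window = values.Skip(i).Take(windowSize).ToList();
            result.Add(new Bucket
            {
                Min = window.Min(),
                Max = window.Max(),
                Avg = window.Average()
            });
        }
        return result;
    }
}
```

The min/max pairs could then be drawn as a faint band behind the averaged line, but maybe there's a more standard way to do this.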
I think what I'm after goes in the direction of spline curves, but I couldn't find many example images to evaluate whether that's actually what I want. I did find a highly scientific book on Google Books, full of half-page-long formulas, which I wasn't about to read through right now...
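If splines are the right direction, .NET's built-in cardinal spline drawing (Graphics.DrawCurve) might already be close to what I imagine; this minimal sketch with made-up points is about the level of effort I'm hoping for:

```csharp
// Minimal sketch, assuming a Windows Forms Paint override and made-up points.
// Graphics.DrawCurve draws a cardinal spline through the points; the tension
// parameter controls how tightly the curve bends (0 = straight line segments).
using System.Drawing;
using System.Windows.Forms;

class GraphForm : Form
{
    protected override void OnPaint(PaintEventArgs e)
    {
        base.OnPaint(e);
        PointF[] points =
        {
            new PointF(10, 80), new PointF(40, 60), new PointF(70, 90),
            new PointF(100, 40), new PointF(130, 70), new PointF(160, 50)
        };
        using (var pen = new Pen(Color.SteelBlue, 2f))
        {
            e.Graphics.SmoothingMode = System.Drawing.Drawing2D.SmoothingMode.AntiAlias;
            e.Graphics.DrawCurve(pen, points, 0.5f);
        }
    }
}
```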
To give you an example, just look at Linux/GNOME's system monitor application: it draws the recent CPU/memory/network usage with a smoothed line. That may be a bit oversimplified, but I'd give it a try and see if I can tweak it.
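For reference, by "give it a try" I mean something as simple as a plain moving average over the raw samples before plotting; the window size below is just a guess I'd tune:

```csharp
// Minimal sketch of a centered moving average; windowSize is a guess to tune.
using System;
using System.Collections.Generic;

static class Smoothing
{
    public static double[] MovingAverage(IReadOnlyList<double> values, int windowSize)
    {
        var smoothed = new double[values.Count];
        int half = windowSize / 2;
        for (int i = 0; i < values.Count; i++)
        {
            // Clamp the averaging window to the ends of the series.
            int start = Math.Max(0, i - half);
            int end = Math.Min(values.Count - 1, i + half);
            double sum = 0;
            for (int j = start; j <= end; j++)
                sum += values[j];
            smoothed[i] = sum / (end - start + 1);
        }
        return smoothed;
    }
}
```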
I'd prefer C# code, but algorithms or code in other languages are fine too, as long as I can port them to C# without external references.