I have many (~1 million) irregularly spaced points, P, along a 1D line. These mark segments of the line, such that if the points are {0, x_a, x_b, x_c, x_d, ...}, the segments run 0->x_a, x_a->x_b, x_b->x_c, x_c->x_d, etc. I also have a y-value for each segment, which I wish to interpret as a depth of colour. I need to plot this line as an image, but there might be only (say) 1000 pixels available to represent the entire length of the line. These pixels, of course, correspond to regularly spaced intervals along the line, say 0..X1, X1..X2, X2..X3, etc., where X1, X2, X3 are regularly spaced. To work out the colour of each pixel, I need to take an average of all the y-values whose segments overlap that pixel, weighted by the length of the part of each segment that falls inside the pixel. There will also likely be pixels that contain no point of P at all; these simply take the colour value of the single segment that spans the entire pixel.
This seems like something that probably needs to be done a lot in image analysis. So is there a name for this operation, and what's the fastest way in numpy to calculate such a regularly-spaced set of average y-values? It's a bit like interpolation, I guess, only I don't want the mean of just the two surrounding points, but a weighted average of every segment within a regular interval (plus the partial segments overlapping its edges).
[Edit - added minimal example]
So say there are 5 segments along a horizontal line, delimited by [0, 1.1, 2.2, 2.3, 2.8, 4] (i.e. the line goes from 0 to 4). Assume each segment takes an arbitrary shading value; for example, the 5 shading values could be [0, 0.88, 0.55, 0.11, 0.44], where 0 is black and 1 is white. Then if I wanted to plot this using 4 pixels, I would need to create 4 values, for 0...1, 1...2, etc., and would expect the calculation to return the following values for each (a brute-force check is sketched after this list):
0...1 = 0 (this is covered by the first line segment, 0->1.1)
1...2 = 0.1 * 0 + 0.9 * 0.88 (1 ... 1.1 is covered by the first line segment, the rest by the second)
2...3 = 0.2 * 0.88 + 0.1 * 0.55 + 0.5 * 0.11 + 0.2 * 0.44 (this is covered by the second to the fifth line segments)
3...4 = 0.44 (this is covered by the last line segment, 2.8->4)
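To make the spec concrete, here is a brute-force version of the calculation I have in mind (the names P, y and edges are just placeholders for my actual data; this loop is O(pixels × segments), far too slow at my sizes):

```python
import numpy as np

P = np.array([0.0, 1.1, 2.2, 2.3, 2.8, 4.0])  # segment boundaries
y = np.array([0.0, 0.88, 0.55, 0.11, 0.44])   # shading value per segment
edges = np.linspace(0, 4, 5)                  # pixel boundaries: 0, 1, 2, 3, 4

pixels = np.zeros(len(edges) - 1)
for k in range(len(pixels)):
    lo, hi = edges[k], edges[k + 1]
    # length of the overlap between each segment and this pixel
    overlap = np.clip(np.minimum(P[1:], hi) - np.maximum(P[:-1], lo), 0, None)
    pixels[k] = (overlap * y).sum() / (hi - lo)

print(pixels)  # expect [0, 0.792, 0.374, 0.44]
```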
Whereas if I wanted to fit this data into a 2-pixel-long line, the 2 pixels would have the following values:
0...2 = 1.1 / 2 * 0 + 0.9 / 2 * 0.88
2...4 = 0.2 / 2 * 0.88 + 0.1 / 2 * 0.55 + 0.5 / 2 * 0.11 + 1.2 / 2 * 0.44
This seems like the "right" way to do downsampling along a 1D line. I'm looking for a fast implementation (ideally something built-in) for when I have (say) a million points along the line and only 1000 (or so) pixels to fit them into.
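For what it's worth, the fastest pure-numpy approach I've managed to sketch myself is to integrate the step function once with cumsum, evaluate that cumulative integral at every pixel edge via searchsorted, and take differences. I'm not sure this is the standard technique, so treat it as a sketch rather than a canonical implementation:

```python
import numpy as np

def downsample_segments(P, y, npix):
    """Area-weighted average of a piecewise-constant function onto
    npix equal pixels spanning [P[0], P[-1]].

    P: sorted array of n+1 segment boundaries; y: n segment values.
    """
    P = np.asarray(P, dtype=float)
    y = np.asarray(y, dtype=float)
    # cumulative integral of the step function at each segment boundary
    C = np.concatenate(([0.0], np.cumsum(y * np.diff(P))))
    # regularly spaced pixel edges
    X = np.linspace(P[0], P[-1], npix + 1)
    # index of the segment containing each pixel edge
    j = np.clip(np.searchsorted(P, X, side='right') - 1, 0, len(y) - 1)
    # cumulative integral evaluated at each pixel edge
    CX = C[j] + y[j] * (X - P[j])
    # pixel average = integral over the pixel / pixel width
    return np.diff(CX) / np.diff(X)
```

On the example above, downsample_segments(P, y, 4) gives [0, 0.792, 0.374, 0.44] and downsample_segments(P, y, 2) gives [0.396, 0.407], matching the hand calculations. The cost is one cumsum over the segments plus a binary search per pixel edge, all vectorized, so it should cope with a million points; but I'd still like to know whether the operation has a name and whether numpy/scipy already ship something for it.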