I am processing a waveform stored in a vector. The audio channels are interleaved. I have to process the channels separately and then provide the final result interleaved again. The functions that operate on a single channel accept a range delimited by random access iterators, and assume the elements of the range are contiguous.
As these operations have to be performed in real time (or close to it), I'd like to fake the deinterleave phase: in other words, I am looking for a way to make the single-channel functions operate on a particular channel without actually deinterleaving anything, and without changing their current code, which assumes that the `++` (or `--`) operator of an iterator moves to the next (or previous) element.
What do you suggest? I am currently thinking about writing a custom random access iterator. Is there any other viable solution? I'd prefer not to use Boost.