I would like a way to capture video from the camera interface on the Raspberry Pi, run it through a filter written as OpenGL shaders, and then send it to the hardware encoder.
This blog post talks about applying OpenGL shader filters to the camera output when using raspistill, and this is the corresponding source code. In that case, however, the output does not go to the video encoder, and it only runs on stills, not video. Also (I am not completely sure about this) I think it ties into the preview rather than the capture path; see these bits: "raspitex_state  A pointer to the GL preview state" and "state->ops.redraw = sobel_redraw".
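
To check whether I am reading the hook mechanism correctly: my rough understanding is that the sobel example plugs itself into the GL preview state like the sketch below. Only the state->ops.redraw assignment and the raspitex_state description are confirmed by the snippets above; the other hook and header names are my guess at the raspitex scene API, so please correct me if this is wrong.

    /* Sketch of how the sobel filter appears to hook into the GL preview.
     * Only state->ops.redraw = sobel_redraw is quoted from the source;
     * the header name and the gl_init hook are assumptions. */
    #include "RaspiTex.h"   /* assumed to define RASPITEX_STATE */

    static int sobel_init(RASPITEX_STATE *state)
    {
        /* Presumably compiles/links the sobel fragment shader. */
        return 0;
    }

    static int sobel_redraw(RASPITEX_STATE *state)
    {
        /* Presumably draws a full-screen quad sampling the camera texture
         * through the sobel shader. As far as I can tell this only updates
         * the on-screen preview; nothing is fed back to the encoder. */
        return 0;
    }

    int sobel_open(RASPITEX_STATE *state)
    {
        state->ops.gl_init = sobel_init;   /* assumed hook name */
        state->ops.redraw  = sobel_redraw; /* quoted in the question */
        return 0;
    }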
The blog also talks about a "fastpath"; can someone explain what that means in this context?