6

I would like a way to capture video from the camera interface on the Raspberry Pi, run it through a filter written as OpenGL shaders, and then send it to the hardware encoder.

This blog post describes applying OpenGL shader filters to the camera output when using raspistill, and this is the corresponding source code. In that example, however, the output does not go to the video encoder, and it runs only on stills, not video. Also (I'm not completely sure) I think it ties into the preview; see these bits: raspitex_state, described as "a pointer to the GL preview state", and state->ops.redraw = sobel_redraw.
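For reference, the hookup I mean looks roughly like this. This is a paraphrased fragment based on the linked source and the RaspiTex headers in the userland tree, so the exact names and fields may well be off:

    /* Paraphrased fragment, not the actual source: the sobel scene appears to
     * register its GL callbacks on the RaspiTex preview state (RASPITEX_STATE
     * from RaspiTex.h; sobel_init/sobel_redraw are defined elsewhere). */
    static int sobel_open(RASPITEX_STATE *raspitex_state)
    {
        raspitex_state->ops.gl_init = sobel_init;   /* compile/link the Sobel shaders */
        raspitex_state->ops.redraw  = sobel_redraw; /* draw the camera texture through the filter */
        return 0;
    }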

The blog also talks about a "fastpath"; can someone explain what that means in this context?

1 Answer

5

The texture conversion will work on any MMAL opaque buffer, i.e. camera preview, stills (up to 2000x2000 resolution), or video. However, the example code only does the GL plumbing for the stills preview. I think someone posted a patch on the RPI forums to make it work with RaspiVid, so you might be able to use that.

Fastpath basically means not copying the buffer data to ARM memory and doing a software conversion there. So, for the GL rendering, it means just passing a buffer handle to GL so that the GPU driver can do the conversion directly.
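Roughly, that zero-copy texture import looks like the sketch below. This is not the example code verbatim, just an outline of the mechanism using the Broadcom EGL_IMAGE_BRCM_MULTIMEDIA image target; the EGL/MMAL setup and error handling are omitted, and the Broadcom-specific bits may differ slightly between firmware versions.

    /* Fastpath sketch: wrap an opaque MMAL buffer in an EGLImage and bind it
     * to an external texture -- no copy to ARM memory; the GPU driver handles
     * the pixel format. Assumes an EGL context is already current. */
    #define EGL_EGLEXT_PROTOTYPES
    #define GL_GLEXT_PROTOTYPES
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <EGL/eglext_brcm.h>      /* EGL_IMAGE_BRCM_MULTIMEDIA (/opt/vc/include) */
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>
    #include <interface/mmal/mmal.h>  /* build with -I/opt/vc/include */

    static GLuint import_opaque_buffer(EGLDisplay display, MMAL_BUFFER_HEADER_T *buf)
    {
        /* For MMAL_ENCODING_OPAQUE buffers, buf->data is a GPU-side handle,
           not a pointer to pixels the ARM can read */
        EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
            EGL_IMAGE_BRCM_MULTIMEDIA, (EGLClientBuffer)buf->data, NULL);

        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* Use the EGLImage directly as the texture's storage */
        glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)image);
        return tex;
    }

The fragment shader then samples this through a samplerExternalOES uniform, and the EGLImage should be destroyed with eglDestroyImageKHR when the MMAL buffer is returned.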

Currently, there is no support/fastpath in the drivers for feeding the OpenGL rendered buffers into the video encoder. Instead, the slow and probably impractical path is to call glReadPixels, convert the buffer to YUV and pass the converted buffer to the encoder.
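To make the slow path concrete, a sketch of what it would involve is below. The function names are my own, the encoder is assumed to be an already-configured MMAL video_encode component, and error handling, the vertical flip of the glReadPixels output, and the buffer-callback plumbing are all omitted.

    /* Slow-path sketch: read the GL output back to ARM memory, convert to
     * I420 in software, and hand it to the encoder's input port. */
    #include <GLES2/gl2.h>
    #include <interface/mmal/mmal.h>  /* build with -I/opt/vc/include */
    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* The expensive part: copy the rendered frame out of the GPU.
       Note glReadPixels returns rows bottom-to-top; the flip is omitted here. */
    static uint8_t *read_back_frame(int w, int h)
    {
        uint8_t *rgba = malloc((size_t)w * h * 4);
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
        return rgba;
    }

    /* Software RGBA -> I420 (BT.601, 2x2 chroma subsampling, no filtering) */
    static void rgba_to_i420(const uint8_t *rgba, uint8_t *i420, int w, int h)
    {
        uint8_t *y = i420, *u = y + w * h, *v = u + (w / 2) * (h / 2);
        for (int j = 0; j < h; j++)
            for (int i = 0; i < w; i++) {
                const uint8_t *p = rgba + 4 * (j * w + i);
                int r = p[0], g = p[1], b = p[2];
                y[j * w + i] = (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
                if (!(i & 1) && !(j & 1)) {
                    int c = (j / 2) * (w / 2) + (i / 2);
                    u[c] = (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    v[c] = (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
    }

    /* Copy the converted frame into a buffer from the encoder input port's pool */
    static void send_to_encoder(MMAL_PORT_T *encoder_input, MMAL_POOL_T *pool,
                                const uint8_t *i420, size_t size)
    {
        MMAL_BUFFER_HEADER_T *buf = mmal_queue_get(pool->queue);
        if (buf) {
            memcpy(buf->data, i420, size);
            buf->length = size;
            mmal_port_send_buffer(encoder_input, buf);
        }
    }

Every frame this costs a full-frame copy out of the GPU plus a per-pixel software conversion on the ARM, which is why it's probably impractical at video rates.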

A fastpath is certainly possible and I've done some work on porting this to the RPI drivers, but there's some other framework required and I won't get a chance to look at this until the New Year.

Tim Gover
  • Thank you, this is good info. If I can summarize, MMAL buffer -> GL texture exists but is currently only used in RaspiStill, and GL texture -> encoder doesn't exist? What is the encoder input, a MMAL buffer? How does YUV in memory -> encoder work right now? Could you point to some docs on MMAL, VCHIQ, or the encoder? Thanks!! – Alex I Dec 20 '13 at 07:59
  • Also, you might find my other question interesting, here: http://stackoverflow.com/questions/19149441/decode-video-in-raspberry-pi-without-using-openmax Thanks again! – Alex I Dec 20 '13 at 08:02
  • Oh, one more question: Does RPi have unified or separate GPU and CPU memory? Is copying from one to the other (like glReadPixels) just equivalent to a memcpy() between two areas of the same memory, or is it more overhead than that? – Alex I Dec 20 '13 at 08:38
  • RPI has unified memory i.e. the CPU can address all of SDRAM e.g. as root you can look in /dev/vc-mem. However, at boot time the allocation is split between the GPU and CPU and configured by /boot/config.txt. – Tim Gover Jan 01 '14 at 13:06
  • glReadPixels is more than a memcpy because it has to DMA from GPU partition (physical contiguous) to virtual address region managed by ARM. The glReadPixels API also allows pixel format conversions and IIRC cropping etc. A better implementation is to allocate a frame-buffer in shared memory i.e. GPU region mapped directly into user-process. – Tim Gover Jan 01 '14 at 13:13
  • There's a Linux Kernel driver from another project to safely share the memory i.e. talk to the GPU allocator to pin the memory region. However, it needs some cache fixes and tidy-up before it can be released. It's also likely to be an RPI Kernel specific driver i.e. not upstreamable. I'll post on the RPI forums when it happens. – Tim Gover Jan 01 '14 at 13:15
  • AFAICT there is no MMAL solution**; I've asked elsewhere and have been looking for some months... **Tim says in the answer, re: RaspiVid, that "someone posted a patch on the RPI forums", but without a link this isn't a solution. A fastpath would be very welcome!!! – Jonathan Chetwynd Mar 24 '14 at 15:39