You basically need a combination of the app_sink.rs and app_src.rs examples from https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/tree/master/examples/src/bin .
The first shows you how to get buffers (in this case audio buffers) out of a pipeline. A callback will be invoked for each buffer that arrives. You can then either
- pull them from the callback and pass them through some kind of channel to your skia code
- just notify your skia code that a buffer is available and then pull it from the skia code
- directly pull from the skia code: this will block until a buffer is available
The second shows you how to use the gst_video::VideoFrame API to map a gst::Buffer and access the raw video data. That should allow you to somehow pass it to skia. I don't know how skia works, so you'll have to figure out that part yourself.
You will also need to make sure that GStreamer gives you the data in the correct format, e.g. ARGB. The app_sink.rs example does that by setting caps on the appsink element; in your case you want to set the corresponding video caps there, describing the format that skia would like to have.
Another example in that directory that could be useful is glupload.rs. It uses an appsink to get GStreamer buffers as a GL texture and then renders them via glutin. Something similar would also be possible with Vulkan, but the GStreamer Vulkan library is not part of the bindings yet.