I am trying to receive numpy arrays, frame by frame and in real time, from the GStreamer framework.

I have already tried to use a pipeline like this in Python (adapted from http://stackoverflow.com/questions/8187257/play-audio-and-video-with-a-pipeline-in-gstreamer-python/8197837):

# Source
self.filesrc = Gst.ElementFactory.make('filesrc')
self.filesrc.set_property('location', self.source_file)
self.pipeline.add(self.filesrc)

# Demuxer
self.decoder = Gst.ElementFactory.make('decodebin')
self.decoder.connect('pad-added', self.__on_decoded_pad)
self.pipeline.add(self.decoder)

# Video elements
self.videoqueue = Gst.ElementFactory.make('queue', 'videoqueue')
self.pipeline.add(self.videoqueue)

self.autovideoconvert = Gst.ElementFactory.make('autovideoconvert')
self.pipeline.add(self.autovideoconvert)

self.autovideosink = Gst.ElementFactory.make('autovideosink')
self.pipeline.add(self.autovideosink)

# Audio elements
self.audioqueue = Gst.ElementFactory.make('queue', 'audioqueue')
self.pipeline.add(self.audioqueue)

self.audioconvert = Gst.ElementFactory.make('audioconvert')
self.pipeline.add(self.audioconvert)

self.autoaudiosink = Gst.ElementFactory.make('autoaudiosink')
self.pipeline.add(self.autoaudiosink)

# Progress reporting
self.progressreport = Gst.ElementFactory.make('progressreport')
self.progressreport.set_property('update-freq', 1)
self.pipeline.add(self.progressreport)

All of the pipeline elements are also already linked, but I am running out of ideas about how to retrieve numpy arrays from the stream in real time. Do you have any suggestions?
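
For reference, a pad-added handler for a decodebin-based pipeline like the one above typically looks something like the following sketch; the handler name matches the connect call above, but the caps checks and links are illustrative assumptions rather than the original code:

def __on_decoded_pad(self, demuxer, pad):
    # decodebin adds its source pads dynamically, so linking happens here;
    # route each new pad to the matching branch based on its caps
    caps = pad.get_current_caps() or pad.query_caps(None)
    name = caps.get_structure(0).get_name()
    if name.startswith('video/'):
        pad.link(self.videoqueue.get_static_pad('sink'))
    elif name.startswith('audio/'):
        pad.link(self.audioqueue.get_static_pad('sink'))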

1 Answer

The pipeline in the original question is designed to display video and play audio, so it uses the autovideosink and autoaudiosink elements, respectively. If you want your video frames to go to your application instead of to the screen, you need to use a different sink element, namely appsink instead of autovideosink.

self.appsink = Gst.ElementFactory.make('appsink')
self.pipeline.add(self.appsink)

The appsink element has a "new-sample" signal that fires whenever a new frame is available; you can connect to it once you enable appsink's "emit-signals" property.

self.appsink.set_property("emit-signals", True)
handler_id = self.appsink.connect("new-sample", self.__on_new_sample)

Then it's a matter of converting GStreamer's buffer data into a numpy array (the code below assumes numpy is imported as np).

def __on_new_sample(self, app_sink):
    sample = app_sink.pull_sample()
    caps = sample.get_caps()

    # Extract the width and height info from the sample's caps
    height = caps.get_structure(0).get_value("height")
    width = caps.get_structure(0).get_value("width")

    # Get the actual data
    buffer = sample.get_buffer()
    # Get read access to the buffer data
    success, map_info = buffer.map(Gst.MapFlags.READ)
    if not success:
        raise RuntimeError("Could not map buffer data!")

    # The ndarray only wraps the mapped memory, so copy it if the frame
    # needs to outlive the unmap call below
    numpy_frame = np.ndarray(
        shape=(height, width, 3),
        dtype=np.uint8,
        buffer=map_info.data).copy()

    # Clean up the buffer mapping
    buffer.unmap(map_info)

    # The "new-sample" handler must return a Gst.FlowReturn value
    return Gst.FlowReturn.OK

Note that this code makes certain assumptions about the frame data, namely that it is in a 3-channel format like RGB and that the color data is stored as unsigned 8-bit integers.
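
If the decoded video might not already be in that layout, one option (a sketch, assuming the appsink is linked downstream of the autovideoconvert element from the question) is to set the appsink's "caps" property so the pipeline negotiates RGB before frames reach the callback:

self.appsink.set_property(
    'caps', Gst.Caps.from_string('video/x-raw,format=RGB'))

With that in place, the 3-channel, 8-bit assumption above holds for every sample the callback receives.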
