
My app, written in Python, loads the GStreamer library, then parses and launches a pipeline spec that composites subtitles from an SRT file on top of a prepared video from an MP4 file. It then creates a control source with a binding to the 'alpha' property of the videomixer sink pad that is linked to the subtitle image source.

First I wrote a small proof-of-concept, and it works like a champ. If you run it with an X server (on Unix or Linux, for example), you will see a black square on a green background. After a second, the black square gradually fades out over several seconds.
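The proof-of-concept pipeline itself is not shown here, but from the description it would be roughly equivalent to a pipeline description like the one below (element choices, caps, and pattern names are assumptions; the fade itself requires the control-binding code shown further down, which a plain pipeline description cannot express):

```
videomixer name=mixer ! videoconvert ! autovideosink
videotestsrc pattern=green ! mixer.sink_0
videotestsrc pattern=black ! video/x-raw,width=100,height=100 ! mixer.sink_1
```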

My app has a pipeline that is a bit more complex. Below is a summary of the relevant code:

pipeline_spec = '''
videomixer name=mixer ! ... other stuff downstream
filesrc location=sample_videos/my-video.mp4 ! decodebin name=demuxer ! mixer.sink_0
filesrc location=subtitles.srt ! subparse ! textrender ! mixer.sink_1
demuxer. ! audioconvert ! audioresample ! faac ! muxer.
'''

self.pipeline = Gst.parse_launch(pipeline_spec)
mixer = self.pipeline.get_by_name('mixer')
#vidpad = mixer.get_static_pad('sink_0')
srtpad = mixer.get_static_pad('sink_1')
self.logger.debug([ pad.name for pad in mixer.pads ])

cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
binding = GstController.DirectControlBinding.new(srtpad, 'alpha', cs)
srtpad.add_control_binding(binding)

with open(srtfilepath) as srtfile:
    for timestamps in parsesrt.parse(srtfile):
        start, end = timestamps
        self._set_subtitle_fade(cs, start, end)

def _set_fade_effect(self, controlsource, start, duration, alpha_begin, alpha_end):
    controlsource.set(start, alpha_begin)
    controlsource.set(start + duration, alpha_end)
    self.logger.debug('set fade-{0} from {1} to {2}'.format('in' if alpha_begin < alpha_end else 'out', start, start + duration))

def _set_subtitle_fade(self, controlsource, start_subtitle, end_subtitle):
    self._set_fade_effect(controlsource, start_subtitle, self.DURATION_FADEIN, 0, 1)
    self._set_fade_effect(controlsource, end_subtitle - self.DURATION_FADEOUT, self.DURATION_FADEOUT, 1, 0)
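Conceptually, what the LINEAR interpolation mode does with those timed values can be sketched in pure Python (this illustrates only the math, not the GstController API):

```python
def interpolate(points, t):
    """Linearly interpolate a value at time t from sorted (time, value) pairs."""
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Timed values like those set by _set_subtitle_fade:
# fade in over 1 s starting at 2 s, fade out over 1 s ending at 7 s.
SECOND = 10**9  # nanoseconds per second, same unit as Gst.SECOND
points = [(2 * SECOND, 0.0), (3 * SECOND, 1.0),
          (6 * SECOND, 1.0), (7 * SECOND, 0.0)]

print(interpolate(points, int(2.5 * SECOND)))  # midway through the fade-in: 0.5
```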

One difference between the two pipelines is that in the first example the videomixer pads are request pads, but in the real app they turn out to be static pads, and only 'sink_1' is present in the log statement.

DEBUG, ['src', 'sink_1']

I'm not sure why this is so or whether it makes a difference.

When I run the app in a web server and check in a browser, the subtitles appear but they do not fade in or out.

I checked the timestamps and they look good. They are in nanoseconds (1 s = 10^9 ns).

set fade-in from 2440000000 to 3440000000
set fade-out from 2375000000 to 4375000000
set fade-in from 7476000000 to 8476000000
...
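For reference, the nanosecond values above map back to SRT timestamps ('HH:MM:SS,mmm') with a conversion like this (parsesrt is my own module; this hypothetical helper just shows the arithmetic):

```python
def srt_to_ns(ts):
    """Convert an SRT timestamp such as '00:00:02,440' to nanoseconds."""
    hms, millis = ts.split(',')
    hours, minutes, seconds = (int(x) for x in hms.split(':'))
    total_ms = ((hours * 60 + minutes) * 60 + seconds) * 1000 + int(millis)
    return total_ms * 1_000_000  # 1 ms = 10^6 ns

print(srt_to_ns('00:00:02,440'))  # 2440000000, the first fade-in start above
```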

So what stone have I left unturned?

Lawrence I. Siden

1 Answer


The other big difference between your first and second prototypes is that videotestsrc changed to filesrc ! decodebin. gst_parse_launch won't immediately connect decodebin to videomixer. What will happen is this:

  • The pipeline is parsed, but decodebin doesn't know the contents of filesrc until it demuxes them. They could be audio, a PowerPoint presentation, a PGP signature, or anything, so it exposes no src pads initially.

  • You play the pipeline. decodebin begins receiving data from filesrc, identifies the content as MP4, and demuxes it. It discovers it has video content that matches a videomixer sink pad and makes the connection to the first open pad.

So what you probably need to do is listen for the pad-added signal on decodebin, check that the new pad is the right type, and then make your binding.

def decodebin_pad_added(self, decodebin, pad):
    # Return early if the pad is the wrong type,
    # e.g. inspect pad.query_caps(None) for 'video/x-raw'.

    # Then make the control binding to the pad that
    # decodebin just linked to the videomixer.
    pass

decodebin.connect('pad-added', self.decodebin_pad_added)

You can see in advance that this behavior will occur by running gst-inspect-1.0 on the element in question and examining its pads. decodebin has a "Sometimes" src pad template, whereas subparse has an always-present static pad:

subparse:
Pads:
...
  SRC: 'src'
    Implementation:
      Has custom eventfunc(): gst_sub_parse_src_event
      Has custom queryfunc(): gst_sub_parse_src_query
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
    Pad Template: 'src'

decodebin:
Pad Templates:
  SRC template: 'src_%u'
    Availability: Sometimes
    Capabilities:
      ANY
mpr
  • Thank you @mpr! I'm going to try that, then I'll get back to you. – Lawrence I. Siden Mar 16 '17 at 15:51
  • gst-inspect decodebin shows all sorts of useful goodness, such as the pad-added signal. But what version of gst-plugins-base do you have? My version does not show "custom eventfunc()" for the subparse.src pad. I'm using version 1.8.1 of the plugins. – Lawrence I. Siden Mar 16 '17 at 16:00
  • I'm on 1.6. I'm not sure that you need to worry about those custom funcs, think they're just informational. – mpr Mar 16 '17 at 16:08
  • I want to connect my control source to the videomixer sink that receives the rendered subtitle text. I set a listener for pad-added on videomixer, but the listener only gets called once and pad.get_name() ==> 'sink_0'. But there should be a second sink, 'sink_1'. Am I missing something else? – Lawrence I. Siden Mar 16 '17 at 17:16
  • I'd guess you're not getting the `pad-added` event for sink_1 because you're adding the event listener after the pipeline parses and connects what it can. For that one, maybe you can just get it by name using `gst_element_get_static_pad()`. – mpr Mar 16 '17 at 18:05
  • I just created a gist that shows my debug output from a pipeline test: https://gist.github.com/lsiden/48b0b05370fb43e7f43f9bee08265d32. It's now sending the rendered text to videomixer.sink_0 which is a static pad. sink_1 gets connected to the decoded video stream after the pipeline has started. So it should work now, but isn't. – Lawrence I. Siden Mar 16 '17 at 20:17
  • So you are saying it all connects but then when you change the values in the sliders it does nothing? – mpr Mar 16 '17 at 20:22
  • That's what it looks like. I just added "video/x-raw,height=540 ! mixer..." before each of the mixer sinks. Now both pads are available at startup. The subtitles are connected to videomixer.sink_1 again. Still no fade effect. I was afraid that the timestamps might be wrong, but I verified that they are in nanosec (10^9) and match the subtitle timestamps. – Lawrence I. Siden Mar 16 '17 at 20:43
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/138271/discussion-between-mpr-and-lawrence-i-siden). – mpr Mar 16 '17 at 20:51
  • If you're still interested, I just updated https://gist.github.com/lsiden/48b0b05370fb43e7f43f9bee08265d32 with my latest debug output. – Lawrence I. Siden Mar 16 '17 at 20:52