My app, written in Python, loads the GStreamer library, parses and launches a pipeline spec that composites subtitles from an SRT file on top of a prepared video from an MP4 file, and then creates a control source with a binding to the 'alpha' property of the videomixer sink pad that is linked to the subtitle image source.
First I wrote a small proof-of-concept, which works like a champ. If you run it under an X server (on Unix or Linux, for example), you see a black square on a green background; after a second, the black square gradually fades out over several seconds.
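For reference, here is a minimal sketch of the kind of proof-of-concept I mean (reconstructed for this post rather than the exact script): a black test-pattern square is composited over a green background, and an InterpolationControlSource drives the 'alpha' property of the mixer pad fed by the black square.

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstController', '1.0')
from gi.repository import Gst, GstController, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    'videotestsrc pattern=green ! videomixer name=mixer ! videoconvert ! autovideosink '
    'videotestsrc pattern=black ! video/x-raw,width=200,height=200 ! mixer.'
)

mixer = pipeline.get_by_name('mixer')
# With this ordering the black branch ends up on sink_1
pad = mixer.get_static_pad('sink_1')

cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
pad.add_control_binding(GstController.DirectControlBinding.new(pad, 'alpha', cs))

cs.set(1 * Gst.SECOND, 1.0)   # fully opaque at t=1s
cs.set(4 * Gst.SECOND, 0.0)   # fully transparent at t=4s

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()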
My app has a pipeline that is a bit more complex. Below is a summary of the relevant code:
pipeline_spec = '''
videomixer name=mixer ! ... other stuff downstream
filesrc location=sample_videos/my-video.mp4 ! decodebin name=demuxer ! mixer.sink_0
filesrc location=subtitles.srt ! subparse ! textrender ! mixer.sink_1
demuxer. ! audioconvert ! audioresample ! faac ! muxer.
'''
self.pipeline = Gst.parse_launch(pipeline_spec)
mixer = self.pipeline.get_by_name('mixer')
#vidpad = mixer.get_static_pad('sink_0')
srtpad = mixer.get_static_pad('sink_1')
self.logger.debug([ pad.name for pad in mixer.pads ])
cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
binding = GstController.DirectControlBinding.new(srtpad, 'alpha', cs)
cs.add_control_binding(binding)
with open(srtfilepath) as srtfile:
    for timestamps in parsesrt.parse(srtfile):
        start, end = timestamps
        self._set_subtitle_fade(cs, start, end)

def _set_fade_effect(self, controlsource, start, duration, alpha_begin, alpha_end):
    controlsource.set(start, alpha_begin)
    controlsource.set(start + duration, alpha_end)
    self.logger.debug('set fade-{0} from {1} to {2}'.format(
        'in' if alpha_begin < alpha_end else 'out', start, start + duration))

def _set_subtitle_fade(self, controlsource, start_subtitle, end_subtitle):
    self._set_fade_effect(controlsource, start_subtitle, self.DURATION_FADEIN, 0, 1)
    self._set_fade_effect(controlsource, end_subtitle - self.DURATION_FADEOUT, self.DURATION_FADEOUT, 1, 0)
One difference between the two pipelines is that in the proof-of-concept the videomixer pads are request pads, whereas in the real app they turn out to be static pads, and only 'sink_1' shows up in the log statement:

DEBUG, ['src', 'sink_1']

I'm not sure why this is so or whether it makes a difference.
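To poke at this, I could add a quick diagnostic along these lines (just a sketch of how I would inspect it, not code that is in the app yet): log each mixer pad's template presence, and log any pads that show up later, in case the decodebin branch only links its mixer pad in after demuxing starts.

def _on_pad_added(element, pad):
    self.logger.debug('pad added later: %s', pad.name)

mixer.connect('pad-added', _on_pad_added)
for pad in mixer.pads:
    tmpl = pad.get_pad_template()
    presence = tmpl.props.presence.value_nick if tmpl else 'unknown'
    self.logger.debug('pad %s presence=%s', pad.name, presence)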
When I run the app in a web server and check the output in a browser, the subtitles appear, but they do not fade in or out.
I checked the timestamps and they look good; they are in nanoseconds (1 second = 10^9 ns).
set fade-in from 2440000000 to 3440000000
set fade-out from 2375000000 to 4375000000
set fade-in from 7476000000 to 8476000000
...
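For clarity, the timestamp arithmetic amounts to something like this (a simplified sketch; the real parsing lives in my parsesrt helper, and srt_time_to_ns here is only illustrative):

def srt_time_to_ns(hours, minutes, seconds, millis):
    total_ms = ((hours * 60 + minutes) * 60 + seconds) * 1000 + millis
    return total_ms * Gst.MSECOND   # Gst.MSECOND is 1,000,000 ns

# e.g. the SRT time 00:00:02,440 becomes 2440000000 ns
assert srt_time_to_ns(0, 0, 2, 440) == 2440000000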
So what stone have I left unturned?