I have the following pipeline in gstreamer in Python that receives audio and video buffers and plays them:

pipeline2_str = ('appsrc name=videosrc do-timestamp=true format=3 ! jpegparse ! jpegdec ! '
                 'queue2 max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! '
                 'nvoverlaysink sync=false '
                 'appsrc name=audiosrc format=3 do-timestamp=true ! '
                 'audio/x-raw,format=F32LE,layout=interleaved,rate=44100,channels=1 ! '
                 'audioconvert ! queue2 ! alsasink device=hw:0,3 sync=false')

After a while, the video and audio lose synchronization and the video falls behind the audio. If I set sync=true on nvoverlaysink and alsasink, the video advances in still frames and the audio goes silent. I tried supplying PTS and DTS on the buffers, but it doesn't seem to help; maybe I'm doing something wrong, or maybe jpegparse ! jpegdec drops the timestamps. How can I debug my application? How can I synchronize the audio and video?
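For debugging, one option is to raise GStreamer's log level for the elements involved before Gst.init() runs. This is a sketch; the specific category levels are my guesses at what is useful for a sync problem, not settings from my application:

```python
import os

# Must be set before Gst.init() is called, i.e. before importing gi/Gst.
# Levels run 0 (none) to 9 (memdump); "2" is the global default here,
# with per-element overrides after the comma.
os.environ["GST_DEBUG"] = "2,appsrc:6,basesink:6,jpegdec:4"
os.environ["GST_DEBUG_FILE"] = "/tmp/gst_sync.log"  # write the log to a file
os.environ["GST_DEBUG_DUMP_DOT_DIR"] = "/tmp"       # enable pipeline .dot graph dumps
```

With basesink at a high level, the log should show how each buffer's timestamp compares to the clock at the sinks, i.e. whether the sinks consider buffers late, which would point at where the drift comes from.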

Thanks!

EDIT: I tried supplying PTS and DTS to the buffers for appsrc, but it doesn't seem to be working. The video still advances in still frames (about 1 FPS instead of 30) and the audio is silent.
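For reference, this is roughly how the timestamps can be computed for both streams (a minimal sketch; the helper names are mine, and GST_SECOND just mirrors the value of Gst.SECOND):

```python
GST_SECOND = 1_000_000_000  # nanoseconds per second; same value as Gst.SECOND


def frame_timestamps(frame_index, fps=30):
    """Return (pts, duration) in ns for video frame number `frame_index`."""
    duration = GST_SECOND // fps
    return frame_index * duration, duration


def audio_timestamps(sample_offset, n_samples, rate=44100):
    """Return (pts, duration) in ns for `n_samples` samples starting at `sample_offset`."""
    return sample_offset * GST_SECOND // rate, n_samples * GST_SECOND // rate


# Applied to each Gst.Buffer before emitting push-buffer (sketch):
#   buf.pts, buf.duration = frame_timestamps(frame_count)
#   buf.dts = buf.pts  # MJPEG has no B-frames, so DTS can equal PTS
```

One thing I'm unsure of: whether do-timestamp=true on appsrc overrides manually set PTS with pipeline-clock capture times, in which case setting timestamps by hand and do-timestamp=true shouldn't be combined.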
