
I work with GStreamer and have the following pipeline:

   appsrc name=source is-live=true block=true format=GST_FORMAT_TIME
   caps=video/x-raw,format=BGR,width=1280,height=720,framerate=30/1
   ! videoconvert ! video/x-raw,format=I420 ! x264enc  !
   h264parse config-interval=3  ! queue ! mpegtsmux !
   hlssink playlist-length=2 max-files=5
   playlist-location="/tmp/hls/stream.m3u8" playlist-root="/tmp/hls"
   location="/tmp/hls/fragment%06d.ts" target-duration=10

I feed the pipeline through appsrc, which is subscribed to the 'need-data' signal in my application.
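For reference, the timestamp bookkeeping that a need-data handler has to do can be sketched as below. This is a minimal sketch, not the asker's actual code: the names (`frame_timestamps`, `on_need_data`, `next_bgr_frame`) are illustrative, and the 30/1 framerate is taken from the caps above. hlssink can only cut segments at key frames with sane timestamps, so each buffer pushed into appsrc needs a monotonically increasing PTS and a duration.

```python
# Sketch of the PTS/duration computation for buffers pushed into appsrc.
# GStreamer timestamps are in nanoseconds (GST_SECOND = 10^9 ns).

GST_SECOND = 1_000_000_000


def frame_timestamps(frame_index, fps_num=30, fps_den=1):
    """Return (pts, duration) in nanoseconds for a given frame index."""
    duration = GST_SECOND * fps_den // fps_num
    return frame_index * duration, duration


# In the real application, the need-data handler would look roughly like
# (Gst calls shown only in comments, since they need PyGObject):
#
# def on_need_data(appsrc, length):
#     data = next_bgr_frame()                  # 1280 * 720 * 3 bytes of BGR
#     buf = Gst.Buffer.new_wrapped(data)
#     buf.pts, buf.duration = frame_timestamps(state.frame_index)
#     state.frame_index += 1
#     appsrc.emit("push-buffer", buf)
```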

After starting the application, I expect to see the segments fragment000000.ts, fragment000001.ts, fragment000002.ts ... fragment00000N.ts in the /tmp/hls folder. The playlist file stream.m3u8 should also appear here.

But in fact, only fragment000000.ts appears in /tmp/hls folder. This segment contains the entire video.

I can’t understand why my pipeline doesn’t segment the video.

There are so many examples of starting pipelines where the video source for hlssink is videotestsrc. For example, everything works great if I run the following:

gst-launch-1.0 videotestsrc is-live=true ! video/x-raw, framerate=25/1, width=720, height=576, format=I420 ! x264enc bitrate=1000 key-int-max=25 ! h264parse ! video/x-h264 ! queue ! mpegtsmux ! hlssink playlist-length=10 max-files=20  playlist-location="/tmp/hls/stream.m3u8" location="/tmp/hls/fragment%06d.ts" target-duration=10

But I could not find any example that starts GStreamer with the appsrc -> hlssink scheme. Please help me understand what the problem is.

gonzo

1 Answer


It looks like your encoder is not generating key frames, so hlssink has nowhere to "chop" the stream into segments. In the working example you posted, key-int-max=25 tells the encoder that every 25th frame should be a key frame; your appsrc pipeline sets no such property on x264enc.

Since you are using a 10 second target duration, you should have a key frame at least every 10 seconds.
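Applied to the pipeline from the question, the fix would look something like this (key-int-max=30 is an assumed value giving one key frame per second at the 30/1 framerate; any interval that guarantees a key frame within the 10 second target-duration should do):

   appsrc name=source is-live=true block=true format=GST_FORMAT_TIME
   caps=video/x-raw,format=BGR,width=1280,height=720,framerate=30/1
   ! videoconvert ! video/x-raw,format=I420 ! x264enc key-int-max=30 !
   h264parse config-interval=3 ! queue ! mpegtsmux !
   hlssink playlist-length=2 max-files=5
   playlist-location="/tmp/hls/stream.m3u8" playlist-root="/tmp/hls"
   location="/tmp/hls/fragment%06d.ts" target-duration=10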

Michiel