I have been trying to implement HTTP live streaming using MPEG-DASH and need guidance on a few issues.
Given:
- I have an encoded audio/video stream in an input buffer.
- An MPEG-2 transport stream of the same content is also available in a buffer.
Current approach:
- Save the transport stream into fixed-length chunks.
- Use ffmpeg to extract and transcode the video stream:
ffmpeg -i latest_chunk.ts -s 720x480 -c:v libx264 -b:v 600k -y -an output_video_stream.mp4
- Use ffmpeg to extract and transcode the audio stream:
ffmpeg -i latest_chunk.ts -c:a aac -b:a 128k -y -vn output_audio_stream.mp4
- Use MP4Box to create the DASH segments and MPD:
mp4box -dash 7000 -profile live output_video_stream.mp4 output_audio_stream.mp4 -out manifest.mpd
- A server running continuously in another thread serves the generated MPD and segments.
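For reference, what I would ideally like is to collapse the separate video, audio, and MP4Box steps above into a single ffmpeg invocation that writes DASH output directly. Something like the following sketch — I have not verified these dash muxer options against my ffmpeg version (older builds use `-min_seg_duration` in microseconds instead of `-seg_duration`):

```shell
# Sketch: one ffmpeg call replacing the separate video/audio/MP4Box steps.
# Assumes an ffmpeg build that includes the dash muxer; option names
# may differ across versions.
ffmpeg -i latest_chunk.ts \
    -map 0:v -c:v libx264 -s 720x480 -b:v 600k \
    -map 0:a -c:a aac -b:a 128k \
    -f dash -seg_duration 7 -use_template 1 -use_timeline 1 \
    manifest.mpd
```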
Issues:
- The above approach introduces considerable latency. Can this be done more efficiently?
- Is there a way to take the encoded stream buffer directly as input and produce MPEG-DASH segments and an MPD, with the HTTP server doing the rest? If there is, please provide an example.
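What I have in mind is something along these lines, feeding the transport stream buffer over a pipe so that no intermediate chunk files are written. This is a hypothetical sketch (the FIFO path is made up, and I am not sure the dash muxer behaves well in this live setup):

```shell
# Hypothetical: feed the TS buffer to ffmpeg over stdin and let the
# dash muxer write segments and a live-updating MPD directly.
# /path/to/ts_fifo is a placeholder for wherever the buffer is exposed.
# -window_size limits how many segments the MPD advertises.
cat /path/to/ts_fifo | ffmpeg -f mpegts -i pipe:0 \
    -c:v libx264 -b:v 600k -c:a aac -b:a 128k \
    -f dash -seg_duration 7 -window_size 5 -remove_at_exit 1 \
    manifest.mpd
```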
- I also passed the length of the transport stream chunks (in seconds) to MP4Box as
-mpd-refresh 12
but the player requests the MPD only once, plays the segments, and stops. The generated MPD also lacks the minimumUpdatePeriod attribute:
mp4box -dash 7000 -profile live -mpd-refresh 12 output_video_stream.mp4 output_audio_stream.mp4 -out manifest.mpd
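I suspect the MPD is being generated with type="static", which would explain the missing minimumUpdatePeriod. If I understand the MP4Box options correctly, -dynamic should request a dynamic manifest, though I have not confirmed this is the right fix:

```shell
# Assumption: -dynamic switches the MPD type from static to dynamic,
# which should make MP4Box emit minimumUpdatePeriod so the player
# keeps refreshing the manifest.
mp4box -dash 7000 -profile live -dynamic -mpd-refresh 12 \
    output_video_stream.mp4 output_audio_stream.mp4 -out manifest.mpd
```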
- Does MPEG-DASH support MPEG-2 encoded media streams?
Any advice, solutions, or references are appreciated.