
I'm working on a server-side re-streaming service for some IP cameras. I'm using ffserver on Linux to serve the streams and ffmpeg on Windows to feed them. I'm getting the cameras' video (H.264) over RTSP.

My ffserver config is as follows:

<Feed test.ffm>
File ./test.ffm
</Feed>

<Stream test.mjpg>
Feed test.ffm
Format mpjpeg
VideoFrameRate 3
VideoSize 704x480
NoAudio
FileMaxSize 100k 
VideoHighQuality
</Stream>

and the way ffmpeg is feeding:

ffmpeg -rtsp_transport tcp -i "rtsp://admin:admin@192.168.1.12:554/cam/realmonitor?channel=1&subtype=0" -vcodec copy -acodec copy "http://192.168.1.101:8090/test.ffm"
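
To rule out the camera side, the incoming RTSP stream can be inspected before it ever reaches ffserver, for example with ffprobe (a quick sketch reusing the RTSP URL from the command above):

ffprobe -rtsp_transport tcp -i "rtsp://admin:admin@192.168.1.12:554/cam/realmonitor?channel=1&subtype=0"

This prints the codec, resolution and (when the camera reports it) the bitrate of what is actually arriving, which makes it easier to tell whether the degradation happens before or after ffserver.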

The resulting video is very pixelated and differs noticeably from the real image.

The cameras' configuration is as follows:

Resolution: D1 (704*480)
FrameRate: 3
BitRate: 256k
BitRateType: Variable

Is there anything I'm missing or doing wrong?

Thanks in advance for any help.

Luis Ruiz

2 Answers


For anyone out there having the same issue, I solved it with this:

<Feed test.ffm>
File ./test.ffm
</Feed>

<Stream test.mjpg>
Feed test.ffm
Format mpjpeg
VideoFrameRate 3
VideoBufferSize 80000
VideoBitRate 512
VideoQMin 1
VideoQMax 10
VideoSize 1280x720
PreRoll 0
</Stream>

The streamed video is now the same quality as the source.
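
To check the served stream, it can be opened straight from ffserver, for example with ffplay (a small sketch; it assumes ffserver's HTTP Port is 8090, the same port used for the feed in the question):

ffplay "http://192.168.1.101:8090/test.mjpg"

If your ffserver.conf serves HTTP on a different Port, adjust the URL accordingly.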

Luis Ruiz

I think the problem is at the source camera's stream; in my opinion, 256 kb/s is a poor bitrate for D1 resolution.

If you can, and it doesn't affect your network bandwidth, try increasing the camera bitrate to 768 or 1024 kb/s to see the difference.

Also, a frame rate of 3 is rather low. Depending on what you are capturing with your camera (a static scene like a landscape, or a very dynamic one like a traffic road), 10 or 15 frames/s are more realistic values for a dynamic capture.
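
If you do raise the camera settings, the <Stream> section in ffserver.conf should be raised to match, since ffserver re-encodes the feed to MJPEG using whatever bitrate and frame rate the stream declares. A rough sketch based on the config from the question (the values below are only illustrative, not tested against this camera):

<Stream test.mjpg>
Feed test.ffm
Format mpjpeg
# illustrative values only; match them to what the camera actually sends
VideoFrameRate 15
VideoBitRate 1024
VideoSize 704x480
NoAudio
</Stream>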

Hokusai
  • One of the issues is that I (no matter what) cannot use a lot of bandwidth from the cameras, as they're streaming over a cellular connection. I know that is not the best setup, but the difference between those images (the pre- and the post-processed one) is huge. The one I'm getting from ffserver looks like a grid; I mean, I can clearly see a bunch of squares which I don't see in the original one. – Luis Ruiz Jan 10 '17 at 23:33
  • Then the camera source config seems to be OK. Have you tried increasing FileMaxSize to a higher value like 1M? (Only to check where the problem is.) – Hokusai Jan 10 '17 at 23:48
  • I have tried that as well. The only thing I'm getting with that is a hyperlapse whenever you request the video from ffserver, until it catches up with the actual time. That doesn't help with the image quality. – Luis Ruiz Jan 11 '17 at 00:09