
I can capture the camera image with ffmpeg and send it to ffserver, but what happens next with the data? Can I collect it on the other side with some other client that uses ffmpeg (e.g. some C# wrapper for it)? If so, what exactly does the data look like, and how can I present it to the final user? Can I just display the data on a display port? Or is there some other controller for that? Thanks!

randomuser1

1 Answer

ffserver takes a feed as input (your camera stream) and outputs a stream based on the specs you put in its configuration file for that particular feed. You decide the output format based on your needs.

You can then send it to any video player that supports the format you chose.

ffserver

Example configuration that will output a stream compatible with Windows Media Player:

<Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 200K
    ACL allow 127.0.0.1
</Feed>

# ASF compatible
<Stream test.asf>
    Feed feed1.ffm
    Format asf
    VideoFrameRate 15
    VideoSize 352x240
    VideoBitRate 256
    VideoBufferSize 40
    VideoGopSize 30
    AudioBitRate 64
    StartSendOnKey
</Stream>

You would then access your stream using http://<ffserver_ip_address_or_host_name>:<ffserver_port>/test.asf
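For the publishing side, here is a minimal sketch of the ffmpeg command that pushes the camera into the feed. It assumes a Linux machine with a V4L2 camera at /dev/video0 and ffserver listening on localhost at its default port 8090 — adjust all three for your setup:

```shell
# Capture from the camera and push it into the ffserver feed;
# ffserver then encodes it according to the <Stream> block
# attached to feed1.ffm in its configuration file.
ffmpeg -f v4l2 -i /dev/video0 http://localhost:8090/feed1.ffm
```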

In C# you could use a MediaElement to play it. Here's the list of supported formats.
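Before wiring up the C# side, you can sanity-check the stream with ffplay (bundled with FFmpeg); the host and port here are assumptions matching the example URL above:

```shell
# Open the ASF stream straight from ffserver to verify it plays
ffplay http://localhost:8090/test.asf
```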

Extra reading:

Streaming media with ffserver

Sample ffserver configuration

aergistal