
I wrote an Android app that sends a live video stream from the camera over a socket to my computer. Is it possible to use FFmpeg to decode the MPEG-4 video stream and somehow display what the camera is seeing in real time? I'm guessing I would have to create a bitmap from the latest information it had from the byte stream and display it on the computer at 20+ FPS.

How would I go about doing something like this? C++, C#, or Java is fine. From my understanding, FFmpeg is written in C++.

Ricky Casavecchia

1 Answer


First of all, do you want to create a video player yourself, or do you just want to see your stream? Because once you have a video stream, almost every modern video player can play streaming media. Try using VLC (which uses FFmpeg). In VLC just click "Media" -> "Open Network Stream..." and fill in your URL.

And secondly, FFmpeg is written in C (C99), not C++. FFmpeg itself is just a tool to convert media. When you want to create your own program, you can use the FFmpeg libraries (libavcodec, libavformat, etc.).

Omega
  • I don't need to create a video player, I just need to see the stream. From what I understand about reading a stream with VLC, it uses RTP and/or RTSP, which are protocols built on top of the video stream. Also, I will be running an app using 4G internet from my phone and streaming it to my computer. If I understand how VLC and RTP work in general, the source of the stream is the server. This won't work since my wireless company (AT&T) blocks the ports when running in 4G mode. – Ricky Casavecchia Mar 14 '13 at 22:55
  • So libavcodec contains the code needed to decode the MPEG-4 video stream. What does a decoded video stream give me? Would I be able to see the latest frame of the stream in real time? Also, there is no header on the encoded stream, since the header gets added after you are done recording. But I am streaming this live from the camera, so I will need to be able to do this without a header. Is this possible? – Ricky Casavecchia Mar 14 '13 at 23:00
  • Yes, you need to provide the video stream over one of the standard protocols; that's the easiest way. Don't reinvent the wheel by making your own socket solution. These protocols contain the headers needed to view your video. Also, libavcodec only contains code to encode/decode video, so you also need other FFmpeg libraries, like libavformat for reading from the stream, and some others. So just try to get a decent stream out of your device to open in VLC, since that program has all the needed FFmpeg libraries. – Omega Mar 15 '13 at 08:04
  • Then yes, you can see your stream, almost in real time. Streams always have a little delay depending on your settings; try keeping your GOP size small. The header will be provided in the transport stream. – Omega Mar 15 '13 at 08:10
  • "So just try to get a decent stream out of your device to open in VLC" – I can't use VLC, since the source of the stream has to be the server. Also, encoding the stream with RTP and RTSP sounds unnecessarily complicated. Wouldn't it be easiest to just decode the video stream and pull a picture of the latest frame as it comes in? – Ricky Casavecchia Mar 16 '13 at 19:31