
I have an OpenGL application that renders a simulation animation, outputs several PNG image files per second, and saves them to disk. I want to stream these images as live video over HTTP so I can view the animation from a web browser. I already have a robust socket server that handles WebSocket connections, and I can handle all of the handshake and message encoding/decoding. My server program and OpenGL application are written in C++.

A couple of questions in mind:

  1. What is the best way to stream this OpenGL animation output and view it from my web browser? The video frames are dynamically (continuously) generated by the OpenGL application as PNG image files. The web browser should display video corresponding to the OpenGL output, with minimal latency.

  2. How can I encode these PNG image files as a continuous (live) video programmatically in C/C++ (without manually pushing the image files into streaming-server software such as Flash Media Live Encoder)? What video format should I produce?

  3. Should I send/receive the animation data over a WebSocket, or is there a better way (a jQuery Ajax call instead, say; I am just making this up, so please guide me toward the correct way of implementing this)? It would be great if this live video streaming worked across different browsers.

  4. Does the HTML5 video tag support live video streaming, or does it only work with a complete video file that exists at a particular URL/directory (not a live stream)?

  5. Are there any existing code samples (tutorials) for this kind of live video streaming, where a C/C++/Java application produces image frames and a web browser consumes the output as a video stream? I could barely find tutorials on this topic after spending a few hours searching on Google.

all_by_grace

2 Answers


You definitely want to stop writing PNG files to disk and instead feed the frames of image data straight into a video encoder. A good bet is to use libav/ffmpeg. Next, you will have to encapsulate the encoded video in a network-friendly format. I would recommend x264 as the encoder and MPEG-4 or MPEG-2 TS as the stream format.

To view the video in the web browser, you'll have to choose a streaming format. HLS in HTML5 is supported by Safari, but unfortunately not much else. For wide client support you will need a plugin such as Flash or a media player.

The easiest way I can think of to do this is to use Wowza for a server-side restream. The GL program would stream MPEG-2 TS to Wowza, which would then prepare streams for HLS, RTMP (Flash), RTSP, and Microsoft Smooth Streaming (Silverlight). Wowza costs about $1000. You could set up an RTMP stream using Red5, which is free. Or you could do RTSP serving with VLC, but RTSP clients are universally terrible.

Unfortunately, at this time, the level of standardization for web video is very poor, and the video tooling is rather cumbersome. It's a large undertaking, but you can get hacking with ffmpeg/libav. A proof of concept would be to write image frames in YUV420p format to a pipe that ffmpeg is listening on, choosing an output stream that you can read with an RTSP client such as VLC, QuickTime, or Windows Media Player.

vipw
  • I marked this answer as accepted, because it provides an efficient solution for the given question. Feeding the frames directly into a video encoder is indeed more efficient than writing PNG files to disk. The live encoding with ffmpeg may not be easy, but I will try to write programs for ffmpeg to listen to live input streams. Are there any available C++ examples of an ffmpeg server program that listens to live input streams and encodes them automatically? Or alternatively, can I just send the image frames directly as an RTMP stream to a Red5 server? – all_by_grace Mar 20 '12 at 22:25
  • @all_by_grace You can probably do it without writing anything but a somewhat complicated command line to ffmpeg. If you can get your GL program to write "YUV4MPEG2" formatted frames to stdout (or some other pipe), then a command line like this could encode video and send it to your Red5 server: ffmpeg -f yuv4mpegpipe -i pipe:0 -vcodec libx264 -vpre normal -f flv rtmp://localhost/live/testStream Here's a link about the frame format: http://kylecordes.com/2007/pipe-ffmpeg – vipw Mar 21 '12 at 07:52

Most live video is MPEG-2 internally, wrapped up as RTMP (Flash) or HLS (Apple). There is probably a way to render your OpenGL output to frames and have them converted into MPEG-2 as a live stream, but I don't know exactly how (maybe FFmpeg?). Once that is done, you can push the stream through Flash Media Live Encoder (it's free) and stream it out to Flash clients directly using RTMP, or push-publish it into Wowza Media Server to package it for Flash, HTTP Live Streaming (Cupertino), and Smooth Streaming (Silverlight).

Basically, you can string together some COTS solutions into a pipeline and play the result in a standard player without handling the sockets and low-level stuff yourself.

PTiddy
  • I can already produce PNG image frames from the OpenGL program, and these will be continuous image frames. Using Flash Media Live Encoder means that I have to manually push the image frames through the Flash Media Live Encoder software, right? My main problem is that the OpenGL application will continue to produce new PNG image files as long as it runs, and I need these files to be converted to a video stream automatically (without my inserting the files manually) and displayed in the web browser as a live stream (until I stop the application). Can I do that programmatically? – all_by_grace Mar 14 '12 at 03:08
  • That is all that remains - finding a solution to batch that frames->MPEG process. Writing PNGs to disk at a decent frame rate is expensive, so doing it in memory would be advisable. I have experience with every step downstream of that... but you can probably find an encoder that will do exactly that: take PNGs as input and string them into live MPEG. You could possibly just find an MPEG library and write a service to do it. – PTiddy Mar 14 '12 at 13:04
  • Wowza has some posts on doing something similar. They have an Eclipse plugin, and development is easy. For HTTP Live streams you can write playlists and segments on the fly. That and the HTTP Live (Cupertino) streaming protocol may get you pointed in the right direction. http://www.wowza.com/forums/content.php?145 – PTiddy Mar 15 '12 at 13:38
  • Internet video is usually MPEG-4 AVC (H.264), not MPEG-2. But the basic workflow you describe is probably what the original poster needs. – vipw Mar 20 '12 at 07:43