
Hey, I'm running into a problem similar to this one: Converting RGB to YUV, + ffmpeg

From AIR, I figured the encoding step was too slow to render frames at a reasonable rate, so I exported the ARGB ByteArray from bitmap.getPixels(rect) directly to a file.

So for a 30-second Flash animation, I'd export, say, 1500 frames to 1500 .argb files.

This method works great. I was able to render HD video using this ffmpeg command:

ffmpeg -f image2 -pix_fmt argb -vcodec rawvideo -s 640x380 -i frame_%d.argb -r 24 -qscale 1.1 -s 640x380 -i ./music.mp3 -shortest render-high.mpg

So far so good! However, in between the two processes we need to store those ~3 GB of data.

I then tried appending all the ARGB frames to one single file and having ffmpeg consume it, but didn't get anything good out of it... I also tried messing with TCP/UDP but got stuck...

Does anyone know of a way to streamline that process and, hopefully, pipe AIR and ffmpeg together?

mika
  • You should be able to run `ffmpeg` from within your air application using `NativeProcess`. Combine the images into one raw video stream. Send it using stdin/stdout. You can also try using [named pipe](http://en.wikipedia.org/wiki/Named_pipe), if your systems supports them. – Piotr Praszmo Jul 10 '12 at 18:03
  • I like the named pipe idea, do you know how the command line would look like? – mika Jul 10 '12 at 18:26
  • @mika The [FFmpeg FAQ entry on joining video files](http://ffmpeg.org/faq.html#How-can-I-join-video-files_003f) has an example of using named pipes with FFmpeg that should be illustrative. – blahdiblah Jul 10 '12 at 19:40
  • This is good help! I am planing to try writing to named pipe from flash using tcp socket, would you recommend differently? As of the rest your link should do it! I'll keep you posted - Thanks! – mika Jul 11 '12 at 01:40
  • why was that question downvoted? – mika Oct 02 '13 at 14:33
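To make the named-pipe suggestion from the comments concrete, here is a minimal, POSIX-only sketch in Python. The pipe path and frame size are assumptions, and a trivial byte-counting reader thread stands in for ffmpeg; the real encoder would simply read the FIFO as a rawvideo input (e.g. `ffmpeg -f rawvideo -pix_fmt argb -s 640x480 -i /tmp/frames.fifo ...`):

```python
import os
import threading

FIFO = "/tmp/frames.fifo"   # hypothetical pipe path
FRAME = 640 * 480 * 4       # one 640x480 ARGB frame: 4 bytes per pixel
consumed = []               # filled in by the reader thread

os.mkfifo(FIFO)

def consumer():
    # Stand-in for ffmpeg: read the FIFO until EOF and count the bytes
    with open(FIFO, "rb") as f:
        consumed.append(len(f.read()))

t = threading.Thread(target=consumer)
t.start()

# Producer side: write raw frames straight into the pipe -- no .argb files on disk
with open(FIFO, "wb") as f:
    for _ in range(3):              # three dummy all-black frames
        f.write(b"\x00" * FRAME)

t.join()
os.remove(FIFO)
print(consumed[0])  # 3686400 = 3 frames * 1228800 bytes each
```

Opening the FIFO blocks until both a reader and a writer are attached, which is why the stand-in consumer must be started before the producer opens the write end.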

1 Answer


You need to start an ffmpeg NativeProcess with arguments like these:

ffmpeg -f rawvideo -pix_fmt argb -s 640x480 -r 24 -i - -c:v libx264 -b:v 1024k video.mp4

Here you need to specify the input frame size (-s), the frame rate (-r), the output video bitrate (-b:v), and the output filename. The order of these arguments matters: options placed before -i apply to the raw input, while options placed after it apply to the output.

Then you just pipe the byte arrays from bitmap.getPixels(rect) to the standardInput of this NativeProcess, frame by frame: _process.standardInput.writeBytes(data, 0, data.bytesAvailable);
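The same write-to-stdin flow can be sketched in Python, with a byte-counting child process standing in for ffmpeg so the example runs anywhere (the frame size and frame count are assumptions for illustration):

```python
import subprocess
import sys

FRAME = 640 * 480 * 4  # bytes in one 640x480 ARGB frame

# Stand-in for the ffmpeg NativeProcess: a child that counts bytes on stdin.
# With real ffmpeg the args would be the rawvideo command shown above, with "-i -".
child = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print(len(sys.stdin.buffer.read()))"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)

# Equivalent of _process.standardInput.writeBytes(...), frame by frame
for _ in range(24):
    child.stdin.write(b"\x00" * FRAME)

child.stdin.close()          # equivalent of _process.closeInput()
out = child.stdout.read().strip()
child.wait()
print(out)  # b'29491200' = 24 frames * 1228800 bytes
```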

Occasionally an IOErrorEvent.STANDARD_INPUT_IO_ERROR will occur. This means that ffmpeg can't keep up with your data, and frames will be dropped. Apart from lowering the frame size, frame rate, or bitrate, there is not much you can do about it. You may want to keep some kind of queue for your frames, but high-resolution uncompressed images are so large that you can only hold a few tens of them in memory, and you won't be able to spool them to disk because of slow I/O speeds. In practice this problem only shows up when encoding HD video.
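If you do want a small in-memory buffer as described above, a bounded queue that drops the oldest frame when the encoder falls behind might look like this (purely illustrative; the class name and the capacity are assumptions, and integers stand in for frame ByteArrays):

```python
from collections import deque

class FrameQueue:
    """Bounded buffer: holds at most `capacity` frames, drops the oldest on overflow."""
    def __init__(self, capacity=30):
        self.frames = deque(maxlen=capacity)  # deque discards from the left when full
        self.dropped = 0

    def push(self, frame):
        if len(self.frames) == self.frames.maxlen:
            self.dropped += 1  # encoder fell behind; the oldest buffered frame is lost
        self.frames.append(frame)

    def pop(self):
        return self.frames.popleft() if self.frames else None

q = FrameQueue(capacity=2)
for i in range(5):       # the producer outruns the consumer
    q.push(i)
print(q.dropped, list(q.frames))  # 3 [3, 4]
```

Counting dropped frames like this at least makes the data loss visible instead of silent.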

Call _process.closeInput(); when you have no more frames to send, and wait for the ffmpeg process to exit with code 0.

borisgolovnev