Hey, I'm running into a problem similar to the one in: Converting RGB to YUV, + ffmpeg
From AIR, I figured the encoding step took too long to render frames at a reasonable rate, so instead I export the raw ARGB ByteArray from bitmap.getPixels(rect)
directly to a file.
So for a 30-second Flash animation, I'd export, let's say, 1500 frames to 1500 .argb
files.
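For reference, the per-frame export looks roughly like this (a minimal sketch; bitmapData, frameNum and the 640x380 rectangle are placeholders for my actual values):

    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.geom.Rectangle;
    import flash.utils.ByteArray;

    var rect:Rectangle = new Rectangle(0, 0, 640, 380);
    var pixels:ByteArray = bitmapData.getPixels(rect); // 32-bit ARGB, row by row
    var file:File = File.applicationStorageDirectory.resolvePath("frame_" + frameNum + ".argb");
    var stream:FileStream = new FileStream();
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(pixels); // raw pixel bytes, no header
    stream.close();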
This method works great. I was able to render HD video using this ffmpeg command:
ffmpeg -f image2 -pix_fmt argb -vcodec rawvideo -s 640x380 -i frame_%d.argb -r 24 -qscale 1.1 -s 640x380 -i ./music.mp3 -shortest render-high.mpg
So far so good! However, in between the two processes I need to store those ~3 GB of data on disk.
I then tried appending all the .argb frames into a single file and having ffmpeg consume it, but I didn't get anything usable out of it... I also tried messing around with TCP/UDP, but I keep getting stuck...
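For the single-file attempt, I simply concatenated the raw frames and pointed ffmpeg's rawvideo demuxer at the result, with something like the command below (the exact options are my best guess, which may be part of the problem):

ffmpeg -f rawvideo -pix_fmt argb -s 640x380 -r 24 -i all-frames.argb -i ./music.mp3 -shortest -qscale 1.1 render-high.mpg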
Does anyone know of a way to streamline this process and, ideally, pipe AIR and ffmpeg together directly?
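What I'm picturing is something like the sketch below: launching ffmpeg from AIR with NativeProcess (which, as far as I know, requires the extendedDesktop profile) and writing each frame's raw bytes straight to its stdin, with ffmpeg reading from pipe:0. The executable path and arguments here are just placeholders:

    import flash.desktop.NativeProcess;
    import flash.desktop.NativeProcessStartupInfo;
    import flash.filesystem.File;

    var startupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    startupInfo.executable = new File("/usr/local/bin/ffmpeg"); // placeholder path
    startupInfo.arguments = new <String>[
        "-f", "rawvideo", "-pix_fmt", "argb", "-s", "640x380", "-r", "24",
        "-i", "pipe:0", // raw ARGB frames arrive on stdin
        "-i", "./music.mp3", "-shortest",
        "render-high.mpg"
    ];

    var process:NativeProcess = new NativeProcess();
    process.start(startupInfo);

    // then, for every rendered frame:
    process.standardInput.writeBytes(bitmapData.getPixels(rect));

The part I'm unsure about is whether this is the right way to have ffmpeg read raw frames from stdin, or whether I'm missing something.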