I ended up using a different package, ffmpeg-stream.
I was getting the images from AWS S3, so there is some AWS code below:
```javascript
// imports (assuming the aws-sdk v2 client and the ffmpeg-stream version exposing ffmpeg())
const {ffmpeg} = require('ffmpeg-stream')
const AWS = require('aws-sdk')

const s3 = new AWS.S3()
const frames = ['frame1.jpg', 'frame2.jpg' /* ... */]

const conv = ffmpeg() // create converter
const input = conv.input({f: 'image2pipe', r: 30}) // create input writable stream
conv.output('out.mp4', {vcodec: 'libx264', pix_fmt: 'yuv420p'}) // output to file

// for every frame create a function that returns a promise
frames
  .map(filename => () =>
    new Promise((fulfill, reject) =>
      s3
        .getObject({Bucket: '...', Key: filename})
        .createReadStream()
        .on('end', fulfill) // fulfill promise on frame end
        .on('error', reject) // reject promise on error
        .pipe(input, {end: false}) // pipe to converter, but don't end the input yet
    )
  )
  // reduce into a single promise, run sequentially
  .reduce((prev, next) => prev.then(next), Promise.resolve())
  // end converter input
  .then(() => input.end())

conv.run()
```
Here's the issue I posted on the GitHub repo:
https://github.com/phaux/node-ffmpeg-stream/issues/5