
Can anyone recommend a Java library that would allow me to create a video programmatically? Specifically, it would do the following:

  • take a series of BufferedImages as the frames
  • allow a background WAV/MP3 to be added
  • allow 'incidental' WAV/MP3s to be added at arbitrary, programmatically specified points
  • output the video in a common format (MPEG etc)

Can anybody recommend anything? For the picture/sound mixing, I'd even live with something that took a series of frames, and for each frame I had to supply the raw bytes of uncompressed sound data associated with that frame.

P.S. It doesn't even have to be a "third party library" as such if the Java Media Framework has the calls to achieve the above, but from my sketchy memory I have a feeling it doesn't.

Neil Coffey
  • You may try this. http://wiki.xuggle.com/MediaTool_Introduction – Ovilia Feb 18 '12 at 11:17
  • I looked at Xuggle, but it doesn't half look like a pain in the arse to set up. Not sure why these people have a mental block with just giving you a jar/dll/exe to download... – Neil Coffey Feb 18 '12 at 16:16
  • P.S. A solution I have for now-- pending anything better-- is to save the frames as e.g. PNG files then call the commandline ffmpeg utility on the frames. I suppose that's effectively what a library might do under the hood anyway. – Neil Coffey Feb 18 '12 at 16:18
  • For others coming across this question, if you're looking for a scalable system to do this, and you wouldn't mind temporarily allocating an Amazon EC2 node, you might have some luck with [MovieMasher](http://www.moviemasher.com/doc/?page=mmserver). Absolutely overkill if it's just a one-off project, but you could write Java code that creates XML like [this](http://www.moviemasher.com/demo/example/static/media/xml/mash.xml) and submit it to the MovieMasher server for rendering. – btown Feb 26 '12 at 07:45
  • once heard about this project but never used it. http://kenai.com/projects/trident/pages/Home – John Eipe Feb 27 '12 at 04:59
  • @btown -- interesting project, even if having to set up an Amazon VM does win my award for "most faffy solution to a simple problem" :) – Neil Coffey Feb 28 '12 at 02:57
  • @NeilCoffey did you successfully run a commandline generated native code on android without root? – Guy Dec 17 '12 at 12:28
  • Ummm who said anything about Android? – Neil Coffey Dec 17 '12 at 15:39

5 Answers


I've used the code mentioned below to successfully perform items 1, 2, and 4 on your requirements list in pure Java. It's worth a look and you could probably figure out how to include #3.

http://www.randelshofer.ch/blog/2010/10/writing-quicktime-movies-in-pure-java/

janoside
  • This is definitely functionally close to the specification I mentioned. As I see it, the main disadvantage is that it effectively amalgamates individual frames rather than using a video codec, so you end up with bigger files. On the other hand, I like the idea of there being no patent issues. – Neil Coffey Feb 28 '12 at 02:55

You can try a pure Java codec library called JCodec.
It has a very basic H.264 (AVC) encoder and an MP4 muxer. Here's a full code sample taken from their samples -- TranscodeMain.

private static void png2avc(String pattern, String out) throws IOException {
    FileChannel sink = null;
    try {
        sink = new FileOutputStream(new File(out)).getChannel();
        H264Encoder encoder = new H264Encoder();
        RgbToYuv420 transform = new RgbToYuv420(0, 0);

        int i;
        for (i = 0; i < 10000; i++) {
            // Frames are read from numbered image files matching the pattern, e.g. frame%04d.png
            File nextImg = new File(String.format(pattern, i));
            if (!nextImg.exists())
                continue;
            BufferedImage rgb = ImageIO.read(nextImg);
            // Convert the RGB frame to YUV 4:2:0, the colour space the H.264 encoder expects
            Picture yuv = Picture.create(rgb.getWidth(), rgb.getHeight(), ColorSpace.YUV420);
            transform.transform(AWTUtil.fromBufferedImage(rgb), yuv);
            ByteBuffer buf = ByteBuffer.allocate(rgb.getWidth() * rgb.getHeight() * 3);

            // Encode the frame and append the resulting data to the output stream
            ByteBuffer ff = encoder.encodeFrame(buf, yuv);
            sink.write(ff);
        }
        if (i == 1) {
            System.out.println("Image sequence not found");
            return;
        }
    } finally {
        if (sink != null)
            sink.close();
    }
}
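
For reference (not part of the original sample), a call might look like the line below; the frame file pattern and output name are placeholder assumptions. Note that the output is a raw H.264 elementary stream, not yet a playable container file:

// Hypothetical usage, assuming frames were saved beforehand as frame0000.png, frame0001.png, ...
png2avc("frame%04d.png", "video.264");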

This sample is more sophisticated and actually shows muxing of the encoded frames into an MP4 file:

private static void prores2avc(String in, String out, ProresDecoder decoder, RateControl rc) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        sink = writableFileChannel(out);
        source = readableFileChannel(in);

        // Demux the source file and set up a muxer for the output file
        MP4Demuxer demux = new MP4Demuxer(source);
        MP4Muxer muxer = new MP4Muxer(sink, Brand.MOV);

        // ProRes decodes to 4:2:2; the H.264 encoder works on 4:2:0
        Transform transform = new Yuv422pToYuv420p(0, 2);

        H264Encoder encoder = new H264Encoder(rc);

        MP4DemuxerTrack inTrack = demux.getVideoTrack();
        // The output track receives the already-compressed H.264 frames
        CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, (int) inTrack.getTimescale());

        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        Picture target1 = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.YUV422_10);
        Picture target2 = null;
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);

        ArrayList<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
        ArrayList<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();
        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        long start = System.currentTimeMillis();
        // Decode, convert, re-encode and mux up to 100 frames
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null && i < 100; i++) {
            Picture dec = decoder.decodeFrame(inFrame.getData(), target1.getData());
            if (target2 == null) {
                target2 = Picture.create(dec.getWidth(), dec.getHeight(), ColorSpace.YUV420);
            }
            transform.transform(dec, target2);
            _out.clear();
            ByteBuffer result = encoder.encodeFrame(_out, target2);
            if (rc instanceof ConstantRateControl) {
                // With constant rate control every frame has a fixed, predictable size
                int mbWidth = (dec.getWidth() + 15) >> 4;
                int mbHeight = (dec.getHeight() + 15) >> 4;
                result.limit(((ConstantRateControl) rc).calcFrameSize(mbWidth * mbHeight));
            }
            spsList.clear();
            ppsList.clear();
            // Pull SPS/PPS out of the stream; they go into the sample entry instead
            H264Utils.encodeMOVPacket(result, spsList, ppsList);
            outTrack.addFrame(new MP4Packet((MP4Packet) inFrame, result));
            if (i % 100 == 0) {
                long elapse = System.currentTimeMillis() - start;
                System.out.println((i * 100 / totalFrames) + "%, " + (i * 1000 / elapse) + "fps");
            }
        }
        // Register the codec configuration (SPS/PPS) and write the MP4 header
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        muxer.writeHeader();
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}
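
For the question as asked, the interesting part of this second sample is the muxing side rather than the ProRes decoding: each encoded frame is added with outTrack.addFrame(...), the H.264 codec configuration is registered via outTrack.addSampleEntry(...), and muxer.writeHeader() finalizes the MP4. The decoding half would be replaced by whatever produces your frames.
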
Stanislav Vitvitskyy
  • Your link is outdated; [this](https://github.com/jcodec/jcodec/blob/master/src/main/java/org/jcodec/api/transcode/TranscodeMain.java) is the newer one. – Eboubaker Nov 26 '19 at 22:43

I found a tool called ffmpeg, which can convert multimedia files from one format to another. ffmpeg includes a filtering library, libavfilter (the replacement for the old vhook mechanism), which allows the video/audio to be modified or examined between the decoder and the encoder. I think it should be possible to feed it raw frames and generate a video. I looked for a Java implementation of ffmpeg and found the page titled "Getting Started with FFMPEG-JAVA", which is a Java wrapper around FFMPEG using JNA.
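
If you end up driving the ffmpeg command line directly from Java rather than going through the wrapper, a minimal sketch is below. It covers items 1, 2 and 4 of the question (a numbered PNG frame sequence plus one background audio track); the file names, frame rate and codec choices are assumptions, so adjust the flags to your ffmpeg build:

import java.io.IOException;

// Hypothetical helper: shells out to the ffmpeg CLI to mux a PNG frame sequence
// (frame0001.png, frame0002.png, ...) with a background WAV into an MP4.
public class FfmpegMux {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg",
                "-y",                        // overwrite the output file if it exists
                "-framerate", "25",          // input frame rate of the image sequence
                "-i", "frame%04d.png",       // numbered frames written beforehand
                "-i", "background.wav",      // background audio track
                "-c:v", "libx264",           // encode video as H.264
                "-pix_fmt", "yuv420p",       // widely compatible pixel format
                "-c:a", "aac",               // encode audio as AAC
                "-shortest",                 // stop when the shorter stream ends
                "out.mp4")
                .inheritIO()                 // show ffmpeg's progress output in the console
                .start();
        int exit = p.waitFor();
        System.out.println("ffmpeg exited with code " + exit);
    }
}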

Som Pra

Try JavaFX.

JavaFX includes support for rendering images in multiple formats and for playing back audio and video on all platforms where JavaFX is supported.

Here is a tutorial on manipulating images.

Here is a tutorial on creating slideshows, timelines and scenes.

Here is a FAQ on adding sounds.

Most of these are for JavaFX 1.3; JavaFX 2.0 is now out.

John Eipe

Why not use FFMPEG?

There seems to be a Java wrapper for it:

http://fmj-sf.net/ffmpeg-java/getting_started.php

Here is an example of how to compile various media sources into one video with FFMPEG:

http://howto-pages.org/ffmpeg/#multiple

And, finally, the docs:

http://ffmpeg.org/ffmpeg.html

Jonathan
  • So at the moment I'm actually using FFMPEG, though in a slightly clunky way because I haven't figured out how to use all the options. ATM I'm saving all the frames as images in a folder, then running FFMPEG on them. I assume there's a way to pipe the frames directly one by one to FFMPEG without having to save them, but haven't figured this out yet. I thought there might already be a library that did this, for example. – Neil Coffey Feb 28 '12 at 14:53
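
Regarding the piping question in the comment above, here is a rough sketch (assuming ffmpeg is on the PATH) that writes PNG-encoded frames straight to ffmpeg's stdin via its image2pipe demuxer, so no intermediate files are needed. renderFrame() is a placeholder for whatever produces each BufferedImage, and the exact flags may need tweaking for a particular ffmpeg build:

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical sketch: stream frames to ffmpeg's stdin one PNG at a time.
public class PipeFramesToFfmpeg {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg",
                "-y",
                "-f", "image2pipe",          // read a stream of images from stdin
                "-framerate", "25",
                "-i", "-",                   // "-" means standard input
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "out.mp4")
                .redirectErrorStream(true)                       // merge ffmpeg's stderr into stdout
                .redirectOutput(ProcessBuilder.Redirect.INHERIT) // show ffmpeg's output in the console
                .start();

        try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            for (int i = 0; i < 250; i++) {
                BufferedImage frame = renderFrame(i);   // placeholder frame source
                ImageIO.write(frame, "png", toFfmpeg);  // each PNG written becomes one frame
            }
        }
        System.out.println("ffmpeg exited with code " + ffmpeg.waitFor());
    }

    // Placeholder frame generator: a blank 640x480 frame.
    private static BufferedImage renderFrame(int i) {
        return new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
    }
}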