I'm trying to write a program that can encode and decode H.264 video so that I can edit it. Can anyone tell me how to do this in Java?
-
`FFmpeg` could probably do this, but I'm not sure. If it does, there is surely a Java wrapper for it. – evotopid Jan 24 '12 at 15:19
-
Try [Xuggler](http://www.xuggle.com/xuggler/) API – JuanZe Jan 24 '12 at 15:21
-
Now you mention it, that seems a much better idea (than trying to use JMF). +1 – Andrew Thompson Jan 24 '12 at 15:30
-
Xuggler is dead since 2012, replaced by [humble-video](https://github.com/artclarke/humble-video) – Matthieu May 01 '18 at 07:34
4 Answers
You can use JCodec (http://jcodec.org).
To decode a video sequence:
int frameNumber = 10000;
FileChannelWrapper ch = null;
try {
    // "path/to/video.mp4" is a placeholder -- point this at your own file
    ch = NIOUtils.readableFileChannel(new File("path/to/video.mp4"));
    FrameGrab frameGrab = new FrameGrab(ch);
    frameGrab.seek(frameNumber);
    Picture frame;
    for (int i = 0; (frame = frameGrab.getNativeFrame()) != null && i < 200; i++) {
        // Do something with 'frame'
    }
} finally {
    NIOUtils.closeQuietly(ch);
}
To encode a sequence:
public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(Picture bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion (RGB -> YUV)
        transform.transform(bi, toEncode);

        // Encode image into an H.264 frame, the result is stored in the '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form the correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }
}
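Note that the class above hard-codes 25 fps and a buffer sized for 1080p frames; adjust both to your material. A minimal driving loop might look like the sketch below, assuming you already have your frames as RGB `Picture` objects in a hypothetical collection called `rgbFrames` and that `out.mp4` is just a placeholder path:
SequenceEncoder encoder = new SequenceEncoder(new File("out.mp4"));
for (Picture rgbFrame : rgbFrames) {
    // Each call converts the RGB frame to YUV420, encodes it and adds it to the MP4 track
    encoder.encodeImage(rgbFrame);
}
encoder.finish();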
And finally, to convert to and from an Android `Bitmap`, use:
public static Picture fromBitmap(Bitmap src) {
    Picture dst = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB);
    fromBitmap(src, dst);
    return dst;
}

public static void fromBitmap(Bitmap src, Picture dst) {
    int[] dstData = dst.getPlaneData(0);
    int[] packed = new int[src.getWidth() * src.getHeight()];
    src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
    // Unpack ARGB pixels into interleaved R, G, B samples in the Picture's plane
    for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
        for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
            int rgb = packed[srcOff];
            dstData[dstOff] = (rgb >> 16) & 0xff;
            dstData[dstOff + 1] = (rgb >> 8) & 0xff;
            dstData[dstOff + 2] = rgb & 0xff;
        }
    }
}
public static Bitmap toBitmap(Picture src) {
    Bitmap dst = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
    toBitmap(src, dst);
    return dst;
}

public static void toBitmap(Picture src, Bitmap dst) {
    int[] srcData = src.getPlaneData(0);
    int[] packed = new int[src.getWidth() * src.getHeight()];
    // Pack interleaved R, G, B samples back into ARGB pixels (alpha forced to opaque)
    for (int i = 0, dstOff = 0, srcOff = 0; i < src.getHeight(); i++) {
        for (int j = 0; j < src.getWidth(); j++, dstOff++, srcOff += 3) {
            packed[dstOff] = 0xff000000 | (srcData[srcOff] << 16) | (srcData[srcOff + 1] << 8) | srcData[srcOff + 2];
        }
    }
    dst.setPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
}
Finally, on decoding you will get a YUV frame out; to transform it into an RGB frame:
Transform transform = ColorUtil.getTransform(pic.getColor(), ColorSpace.RGB);
Picture rgb = Picture.create(pic.getWidth(), pic.getHeight(), ColorSpace.RGB);
transform.transform(pic, rgb);
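Tying the decoding pieces together, a hypothetical loop that grabs frames, converts each one to RGB and (on Android) turns it into a `Bitmap` using the `toBitmap` helper above could look like this; the file path is again just a placeholder:
FileChannelWrapper ch = null;
try {
    ch = NIOUtils.readableFileChannel(new File("path/to/video.mp4"));
    FrameGrab frameGrab = new FrameGrab(ch);
    Picture yuv;
    while ((yuv = frameGrab.getNativeFrame()) != null) {
        // Convert the decoded YUV frame to RGB
        Transform toRgb = ColorUtil.getTransform(yuv.getColor(), ColorSpace.RGB);
        Picture rgb = Picture.create(yuv.getWidth(), yuv.getHeight(), ColorSpace.RGB);
        toRgb.transform(yuv, rgb);
        Bitmap bmp = toBitmap(rgb); // or process the RGB samples directly
    }
} finally {
    NIOUtils.closeQuietly(ch);
}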
And make sure you download the JAR with all dependencies: http://jcodec.org/downloads/jcodec-0.1.3-uberjar.jar

-
Thank you for the answer. How computationally intensive is the encoding operation? – biggvsdiccvs Jun 15 '17 at 07:12
There is a Java wrapper available for VLC: http://www.capricasoftware.co.uk/projects/vlcj/
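vlcj is aimed mostly at playback rather than frame-level editing. As a rough sketch only (vlcj 3.x-style API, written from memory, so double-check the names against the vlcj documentation; it also requires a native VLC installation), embedding a player and playing an H.264 file looks roughly like this:
import javax.swing.JFrame;
import uk.co.caprica.vlcj.component.EmbeddedMediaPlayerComponent;

public class VlcjDemo {
    public static void main(String[] args) {
        // The component wraps a native VLC media player instance
        EmbeddedMediaPlayerComponent player = new EmbeddedMediaPlayerComponent();

        JFrame frame = new JFrame("vlcj demo");
        frame.setContentPane(player);
        frame.setSize(800, 600);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        // "path/to/video.mp4" is a placeholder
        player.getMediaPlayer().playMedia("path/to/video.mp4");
    }
}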

Check out the ffmpeg and x264 libraries; compile them and combine them. They have a very good API for video encoding/decoding. If you compile ffmpeg alone you can only decode H.264, not encode it; to encode H.264 you also need to build the x264 library. http://ubuntuforums.org/showthread.php?t=786095 is quite a good tutorial. This is a C API, but you can call it from Java through native methods.
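For the "native methods" part, the Java side would have roughly the shape sketched below; the class, method names and library name are all made up for illustration, and the matching C implementation (which would call into libavcodec/x264 through JNI) is not shown:
public class NativeH264 {
    static {
        // Loads libh264jni.so / h264jni.dll built from your C wrapper around ffmpeg/x264
        System.loadLibrary("h264jni");
    }

    // Hypothetical native entry points implemented in C
    public static native long openDecoder();
    public static native byte[] decodeFrame(long decoderHandle, byte[] h264Data);
    public static native void closeDecoder(long decoderHandle);
}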
Best regards, and sorry for my English.

Obtain or write a class that can encode and decode the desired format, and make sure it is registered as a JMF service provider. Add it to the run-time class path of the app, then play the video using the default JMF player. Not too sure about editing it, though.
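For the registration step, JMF codecs are plugged in through `javax.media.PlugInManager`. A rough, hypothetical sketch is below; the codec class `com.example.MyH264Decoder` and its formats are placeholders, and JMF itself is long unmaintained, so treat this only as an outline:
import java.io.IOException;
import javax.media.Format;
import javax.media.PlugInManager;
import javax.media.format.RGBFormat;
import javax.media.format.VideoFormat;

public class RegisterCodec {
    public static void main(String[] args) throws IOException {
        // Declare what the (placeholder) codec consumes and produces
        Format[] in = new Format[] { new VideoFormat("h264") };
        Format[] out = new Format[] { new RGBFormat() };
        PlugInManager.addPlugIn("com.example.MyH264Decoder", in, out, PlugInManager.CODEC);
        PlugInManager.commit(); // persist the registration for later JMF sessions
    }
}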
