
I am trying to save JPGs into a movie. I have tried jcodec, and although my S3 plays the result fine, other devices and players do not, including VLC and Windows Media Player.

I have just spent most of the day playing with MediaCodec. Although its required SDK level is high, it will at least help people on Jelly Bean and above. But I cannot work out how to get the image files into the encoder and then write out the movie file.

Ideally I want to support down to SDK 9/8.

Has anyone got any code they can share, either to get MediaCodec working or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I would need a very good guide.

My MediaCodec code so far:

public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
    private static int bitRate = 2000000;
    private static int MAX_INPUT = 100000;
    private static String mimeType = "video/avc";

    private int frameRate = 15;     
    private int colorFormat;
    private int stride = 1;
    private int sliceHeight = 2;        

    private MediaCodec encoder = null;
    private MediaFormat inputFormat;
    private MediaCodecInfo codecInfo = null;
    private MediaMuxer muxer;
    private boolean mMuxerStarted = false;
    private int mTrackIndex = 0;  
    private long presentationTime = 0;
    private Paint bmpPaint;

    private static int WAITTIME = 10000;    // dequeue timeout, in microseconds
    private static String TAG = "ENCODE";

    private ArrayList<String> mFilePaths;
    private String mPath;

    private EncodeListener mListener;
    private int width = 320;
    private int height = 240;
    private double mSpeed = 1;

    public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
        mFilePaths = filePaths;
        mPath = savePath;   

        // Create paint to draw BMP
        bmpPaint = new Paint();
        bmpPaint.setAntiAlias(true);
        bmpPaint.setFilterBitmap(true);
        bmpPaint.setDither(true);
    }

    public void setListener(EncodeListener listener) {
        mListener = listener;
    }

    // Set the speed, in frames per second
    public void setSpeed(int speed) {
        mSpeed = speed;
    }

    public double getSpeed() {
        return mSpeed;
    }

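    // Presentation timestamps are in microseconds: at mSpeed frames per
    // second, frame N lands at N * (1,000,000 / mSpeed).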
    private long computePresentationTime(int frameIndex) {
        final long ONE_SECOND = 1000000;
        return (long) (frameIndex * (ONE_SECOND / mSpeed));
    }

    public interface EncodeListener {
        public void finished();
        public void errored();
    }

    @TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
    @Override
    protected Boolean doInBackground(Integer... params) {

        try {
            muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (Exception e) {
            e.printStackTrace();
            return false;   // nothing to mux into, so bail out
        }

        // Find a codec that supports the mime type
        int numCodecs = MediaCodecList.getCodecCount();
        for (int i = 0; i < numCodecs && codecInfo == null; i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) {
                continue;
            }
            String[] types = info.getSupportedTypes();
            boolean found = false;

            for (int j = 0; j < types.length && !found; j++) {
                if (types[j].equals(mimeType))
                    found = true;
            }

            if (!found)
                continue;
            codecInfo = info;
        }


        // Second pass: prefer an encoder that advertises AVC High Profile, Level 4
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) {
                continue;
            }

            String[] types = info.getSupportedTypes();
            for (int j = 0; j < types.length; ++j) {
                if (!types[j].equals(mimeType))
                    continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
                for (int k = 0; k < caps.profileLevels.length; k++) {
                    if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh
                            && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
                        codecInfo = info;
                    }
                }
            }
        }

        if (codecInfo == null) {
            Log.e(TAG, "No encoder found for " + mimeType);
            return false;
        }
        Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);

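        // Pick the first 4:2:0 colour format the chosen encoder advertises.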
        MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
        for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
            int format = capabilities.colorFormats[i];
            switch (format) {
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                colorFormat = format;
                break;
            }
        }
        Log.d(TAG, "Using color format " + colorFormat);

        // Determine width, height and slice sizes
        if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
            // This codec doesn't support a width not a multiple of 16,
            // so round down.
            width &= ~15;
        }

        stride = width;
        sliceHeight = height;

        if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
            stride = (stride + 15) / 16 * 16;
            sliceHeight = (sliceHeight + 15) / 16 * 16;
        }

        inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
        inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
        inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
        inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
//          inputFormat.setInteger("stride", stride);
//          inputFormat.setInteger("slice-height", sliceHeight);
        inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);

        encoder = MediaCodec.createByCodecName(codecInfo.getName());
        encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();

        ByteBuffer[] inputBuffers = encoder.getInputBuffers();
        ByteBuffer[] outputBuffers = encoder.getOutputBuffers();

        int inputBufferIndex= -1, outputBufferIndex= -1;
        BufferInfo info = new BufferInfo();
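        // For each source image: decode and scale the JPG, centre it on a
        // blank ARGB frame, convert to YUV 4:2:0, feed the encoder, and
        // drain its output into the muxer.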
        for (int i = 0; i < mFilePaths.size(); i++) {

            // use decode sample to calculate inSample size and then resize
            Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);   

            // Create blank bitmap 
            Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);                   

            // Center scaled image
            Canvas canvas = new Canvas(bitmap);                 
            canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);

            Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
            byte[] dat = getNV12(width, height, bitmap);
            bitmap.recycle();

            // Exception occurred on the line below in the Emulator, LINE No. 182
            inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
            Log.i("DAT", "Size= "+dat.length);

            if(inputBufferIndex >= 0){
                int samplesiz= dat.length;
                inputBuffers[inputBufferIndex].put(dat);
                presentationTime = computePresentationTime(i);
                if (i == mFilePaths.size() - 1) {
                    encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    Log.i(TAG, "Last Frame");
                } else { 
                    encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                }

                while(true) {
                   outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
                   Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                   if (outputBufferIndex >= 0) {
                       ByteBuffer encodedData = outputBuffers[outputBufferIndex];
                       if (encodedData == null) {
                           throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
                                   " was null");
                       }

                       if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                           // The codec config data was pulled out and fed to the muxer when we got
                           // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                           Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                           info.size = 0;
                       }

                       if (info.size != 0) {
                           if (!mMuxerStarted) {
                               throw new RuntimeException("muxer hasn't started");
                           }

                           // adjust the ByteBuffer values to match BufferInfo (not needed?)
                           encodedData.position(info.offset);
                           encodedData.limit(info.offset + info.size);

                           muxer.writeSampleData(mTrackIndex, encodedData, info);
                           Log.d(TAG, "sent " + info.size + " bytes to muxer");
                       }

                       encoder.releaseOutputBuffer(outputBufferIndex, false);

                       inputBuffers[inputBufferIndex].clear();
                       outputBuffers[outputBufferIndex].clear();

                       if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                           break;      // out of while
                       }

                   } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                       // Subsequent data will conform to new format.
                       MediaFormat opmediaformat = encoder.getOutputFormat();
                       if (!mMuxerStarted) {
                           mTrackIndex = muxer.addTrack(opmediaformat);
                           muxer.start();
                           mMuxerStarted = true;
                       }
                       Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
                   } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                       outputBuffers = encoder.getOutputBuffers();
                       Log.d(TAG, "Output Buffer changed " + outputBuffers);
                   } else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                       // No Data, break out
                       break;
                   } else {
                       // Unexpected State, ignore it
                       Log.d(TAG, "Unexpected State " + outputBufferIndex);
                   }
                }

            }     
        }

        if (encoder != null) {
            encoder.flush();
            encoder.stop();
            encoder.release();
            encoder = null;
        }

        if (muxer != null) {
            muxer.stop();
            muxer.release();
            muxer = null;
        }

        return true;

    }

    @Override
    protected void onPostExecute(Boolean result) {
        if (result) {
            if (mListener != null)
                mListener.finished();
        } else {
            if (mListener != null)
                mListener.errored();
        }
        super.onPostExecute(result);
    }



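    // Convert an ARGB_8888 bitmap into a semi-planar YUV 4:2:0 buffer
    // (Y plane followed by interleaved chroma), width * height * 3/2 bytes,
    // matching the COLOR_FormatYUV420SemiPlanar family selected above.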
    byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
        int [] argb = new int[inputWidth * inputHeight];
        scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
        byte [] yuv = new byte[inputWidth*inputHeight*3/2];
        encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
        scaled.recycle();
        return yuv;
    }


    void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
        final int frameSize = width * height;
        int yIndex = 0;
        int uvIndex = frameSize;
        int a, R, G, B, Y, U, V;
        int index = 0;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {

                a = (argb[index] & 0xff000000) >>> 24; // alpha is not used
                R = (argb[index] & 0xff0000) >> 16;
                G = (argb[index] & 0xff00) >> 8;
                B = (argb[index] & 0xff);

                // Well-known BT.601 RGB -> YUV conversion
                Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                U = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                V = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                // Chroma is subsampled 2x2: write one interleaved U/V pair
                // per 2x2 block of luma samples.
                if (j % 2 == 0 && index % 2 == 0) {
                    yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                    yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                }

                index ++;
            }
        }
    }
}

This has now been tested on 4 of my devices and works fine. Is there a way to:

1/ Calculate the MAX_INPUT? (Too high and it crashes on the N7 II; I don't want that happening once released.) See the sketch after this list.

2/ Offer an API 16 solution?

3/ Do I need stride and slice height?
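
A minimal sketch for 1/, assuming the same API 16 buffer-array setup as in the code above: rather than hard-coding MAX_INPUT, check the capacity of the input buffer MediaCodec actually hands back (per fadden's comment below, the codec sizes its input buffers from the configured width/height/format, so KEY_MAX_INPUT_SIZE may not be needed at all):

    int index = encoder.dequeueInputBuffer(WAITTIME);
    if (index >= 0) {
        ByteBuffer buf = inputBuffers[index];
        buf.clear();
        // Fail loudly in Java rather than crashing inside the codec.
        if (dat.length > buf.capacity()) {
            throw new IllegalStateException("Frame of " + dat.length
                    + " bytes exceeds input buffer capacity " + buf.capacity());
        }
        buf.put(dat);
        // queueInputBuffer(...) as in the main loop above
    }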

Thanks

RuAware
  • Some `MediaCodec` samples can be found on http://bigflake.com/mediacodec/ and in https://github.com/google/grafika . – fadden Apr 03 '14 at 20:39
  • Thanks, I have seen bigflake and will take a look at grafika. At first look they take frames from the camera while it's active, rather than using the JPGs. – RuAware Apr 04 '14 at 07:41
  • None of them do exactly what you want. bigflake's EncodeAndMuxTest is pretty close; you would just need to convert bitmaps to GL textures with glTexImage2d and render them. This is more round-about than converting the bitmaps to YUV and handing them directly to MediaCodec, and it requires API 18 rather than 16, but you avoid the odd behavior noted in http://bigflake.com/mediacodec/#q9 . – fadden Apr 04 '14 at 14:27
  • Is this why I am getting INFO_OUTPUT_FORMAT_CHANGED? I have converted to YUV. – RuAware Apr 04 '14 at 14:34
  • On API 18+ you always get a format-changed message from the encoder. (That's so you have a MediaFormat that you can pass into the MediaMuxer configure call.) – fadden Apr 04 '14 at 14:38
  • I have put up my code so far; I'm using new BufferedOutputStream(new FileOutputStream( and not MediaMuxer. – RuAware Apr 04 '14 at 14:49
  • Nothing leaps out at me as wildly wrong. What failures are you seeing? Without MediaMuxer you'll have a raw H.264 file rather than a .mp4 file, but a few players can handle that. – fadden Apr 04 '14 at 16:34
  • I have updated the code; it now works on 4 devices. Just two questions left? – RuAware Apr 05 '14 at 16:30
  • (1) None of the bigflake code sets `MAX_INPUT_SIZE`. The input buffers will be sized appropriately by `MediaCodec` for width/height/format. (2) You have to use YUV input on API 16, and there are some format issues (e.g. the 2K alignment on Qualcomm SoCs) that go away in API 18. (3) Stride and slice-height are all sorts of confused in `MediaCodec`. None of the bigflake examples set them. The only non-public keys used by bigflake are in EncodeDecodeTest, when *decoding* to YUV (crop-*). – fadden Apr 05 '14 at 17:43
  • Thank you. I have noticed that not setting MAX_INPUT_SIZE crashes the app on my S3. Thanks for all your help. I will leave it as API 18+ for the moment. It will be in my next release of eduDroid. – RuAware Apr 05 '14 at 17:46
  • On API 18+, the YUV formats used for buffer input in the CTS tests have size (width * height * 3 / 2), where stride is always equal to the dimension (so make sure it's a multiple of 16, or 1080). The actual buffer allocated by the codec may be larger to accommodate a realignment, but it won't be smaller. – fadden Apr 05 '14 at 21:37
  • @fadden, what does the "slice-height" mean? – thiagolr May 14 '15 at 20:10

1 Answer


You can control ffmpeg with a few lines of Java code if you can afford a dependency on a 3rd-party app. I am not sure whether this project makes the best effort to use the hardware encoder.

Alternatively, on API 9+ you can use stagefright (you need JNI to communicate with it, and there are no public sources available except AOSP).

You can also build your own ffmpeg library; see e.g. http://www.origenboard.org/wiki/index.php/FFmpeg_on_Android. A sketch of invoking such a binary follows.
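
For illustration, a minimal sketch of driving a self-built ffmpeg binary from Java with ProcessBuilder; the binary location, file paths, and helper name below are placeholder assumptions, and on a stock device you would first have to build and bundle the executable yourself:

    // Minimal sketch: stitch numbered JPEGs into an MP4 with a bundled ffmpeg binary.
    int runFfmpeg() throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "/data/data/com.example.app/files/ffmpeg", // hypothetical binary path
                "-framerate", "15",                        // one image = one frame at 15 fps
                "-i", "/sdcard/frames/img%03d.jpg",        // hypothetical input pattern
                "-c:v", "libx264",                         // software H.264 encoder
                "-pix_fmt", "yuv420p",                     // widest player compatibility
                "/sdcard/out.mp4");
        pb.redirectErrorStream(true);                      // merge stderr into stdout
        return pb.start().waitFor();                       // 0 means success
    }

Since libx264 is a software encoder, this sidesteps MediaCodec entirely, which is what makes it viable below API 16.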

Alex Cohn
  • Thanks for the info. I will look into this when I have more time. I am going with MediaCodec and leaving the option to export to MP4 off for anything before API 18. – RuAware Apr 06 '14 at 11:26
  • Some of the links are broken on that origenboard page. – RuAware Apr 06 '14 at 11:43