
I'm passing a SurfaceView surface from Java to JNI where I obtain the native window from that surface. Stagefright decodes h264 frames from an mp4 file. During the decoding process I call ANativeWindow::queueBuffer() in order to send decoded frames to be rendered. There are no errors on decoding or on calling queueBuffer(), all I get is a black screen.

I really feel like I'm not setting up the native window properly so that when queueBuffer() is called, it is rendered to the screen. However, I can render pixels to the native window directly via memcpy. Unfortunately, after I instantiate the OMXClient a segfault occurs when trying to manually draw pixels, so it seems I must use queueBuffer().

My SurfaceView is set up in onCreate():

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    SurfaceView surfaceView = new SurfaceView(this);
    surfaceView.getHolder().addCallback(this);
    setContentView(surfaceView);
}    

Once the surface is created, I call my native init() function with the surface:

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

The native window is created in JNI and a decode thread is started:

nativeWindow = ANativeWindow_fromSurface(env, surface);
int ret = pthread_create(&decode_thread, NULL, &decode_frames, NULL);
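For the direct-pixel path mentioned above (drawing via memcpy), the window's buffer geometry has to match the frames being copied in. This is a minimal sketch using public NDK calls, not my actual code; the RGBA format and the frame parameters are assumptions:

```cpp
#include <android/native_window.h>
#include <string.h>
#include <stdint.h>

// Sketch: push one frame into the window by hand. Assumes the caller
// supplies RGBA8888 pixel data of the given dimensions.
void draw_frame_directly(ANativeWindow* window,
                         int frameWidth, int frameHeight,
                         const uint8_t* framePixels) {
    // Make the window's buffers match the frame; RGBA here is an assumption.
    ANativeWindow_setBuffersGeometry(window, frameWidth, frameHeight,
                                     WINDOW_FORMAT_RGBA_8888);

    ANativeWindow_Buffer buffer;
    if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
        // Copy row by row: the window's stride (in pixels) may be wider
        // than the frame itself.
        uint8_t* dst = (uint8_t*) buffer.bits;
        for (int y = 0; y < frameHeight; ++y) {
            memcpy(dst + y * buffer.stride * 4,
                   framePixels + y * frameWidth * 4,
                   frameWidth * 4);
        }
        ANativeWindow_unlockAndPost(window);
    }
}
```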

My routine for decoding frames, à la vec.io's Stagefright decoding example:

void* decode_frames(void*) {
    mNativeWindow = nativeWindow;
    sp<MediaSource> mVideoSource = new AVFormatSource();
    OMXClient mClient;
    mClient.connect();

    sp<MediaSource> mVideoDecoder = OMXCodec::Create(mClient.interface(), mVideoSource->getFormat(), false, mVideoSource, NULL, 0, mNativeWindow);
    mVideoDecoder->start();

    status_t err = OK;
    while (err != ERROR_END_OF_STREAM) {
        MediaBuffer *mVideoBuffer;
        MediaSource::ReadOptions options;
        err = mVideoDecoder->read(&mVideoBuffer, &options);

        if (err == OK) {
            if (mVideoBuffer->range_length() > 0) {

                sp<MetaData> metaData = mVideoBuffer->meta_data();
                int64_t timeUs = 0;
                metaData->findInt64(kKeyTime, &timeUs);
                status_t err1 = native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
                // This line results in a black frame
                status_t err2 = mNativeWindow->queueBuffer(mNativeWindow.get(), mVideoBuffer->graphicBuffer().get(), -1);

                if (err2 == 0) {
                    metaData->setInt32(kKeyRendered, 1);
                }
            }
            mVideoBuffer->release();
        }
    }

    mVideoSource.clear();
    mVideoDecoder->stop();
    mVideoDecoder.clear();
    mClient.disconnect();
    return NULL;
}
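For what it's worth, one way to sanity-check what the decoder actually negotiated is to query its output format after start(). This is a sketch using the same private libstagefright API as the code above (kKeyColorFormat, kKeyWidth, and kKeyHeight are real MetaData keys):

```cpp
// Sketch: query the decoder's negotiated output format after start().
// Relies on the same private libstagefright headers as the code above.
sp<MetaData> outFormat = mVideoDecoder->getFormat();
int32_t colorFormat = 0, width = 0, height = 0;
if (!outFormat->findInt32(kKeyColorFormat, &colorFormat)) {
    ALOGE("decoder reported no color format");
}
outFormat->findInt32(kKeyWidth, &width);
outFormat->findInt32(kKeyHeight, &height);
ALOGI("decoder output: colorFormat=0x%x, %dx%d", colorFormat, width, height);
```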

EDIT: Taking Ganesh's advice, I interfaced with the Awesome Renderer in order to change color space. During this it became apparent that the color format wasn't being set in Stagefright.

08-06 00:56:32.842: A/SoftwareRenderer(7326): frameworks/av/media/libstagefright/colorconversion/SoftwareRenderer.cpp:42 CHECK(meta->findInt32(kKeyColorFormat, &tmp)) failed.
08-06 00:56:32.842: A/libc(7326): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 7340 (hieu.alloclient)

Trying to set the color format explicitly (kKeyColorFormat to a YUV420 planar color space) leads to a dequeue problem, which probably makes sense, since the color format I specified was arbitrary.

08-06 00:44:30.878: V/OMXCodec(6937): matchComponentName (null)
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.qcom.video.decoder.avc' quirks 0x000000a8
08-06 00:44:30.888: V/OMXCodec(6937): matchComponentName (null) 
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.google.h264.decoder' quirks 0x00000000
08-06 00:44:30.888: V/OMXCodec(6937): Attempting to allocate OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): Successfully allocated OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): configureCodec protected=0
08-06 00:44:30.918: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] AVC profile = 66 (Baseline), level = 13
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] setVideoOutputFormat width=320, height=240
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] portIndex: 0, index: 0, eCompressionFormat=7 eColorFormat=0
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] found a match.
08-06 00:44:30.938: I/QCOMXCodec(6937): Decoder should be in arbitrary mode
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] video dimensions are 320 x 240
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] Crop rect is 320 x 240 @ (0, 0)
08-06 00:44:30.958: D/infoJNI(6937): before started
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 2 buffers of size 2097088 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x417037d8 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x41703828 on input port
08-06 00:44:30.978: V/OMXCodec(6937): native_window_set_usage usage=0x40000000
08-06 00:44:30.978: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 22 buffers from a native window of size 147456 on output port
08-06 00:44:30.978: E/OMXCodec(6937): dequeueBuffer failed: Invalid argument (22)
  • What is the color format of the decoded frames supported by the decoder? Please check, as I have a suspicion that this could be your problem. Alternatively, you could consider employing a Local Renderer interface which has an explicit `YUV-RGB` conversion, as can be observed here: http://androidxref.com/4.3_r2.1/xref/frameworks/av/media/libstagefright/AwesomePlayer.cpp#94 . Can you please confirm the version of Android in your case? – Ganesh Jul 31 '13 at 09:58
  • This was a suspicion of mine, too. I just assumed the frames of video would be in YUV format because of the h264 encoding and that Stagefright would decode in the same format. What would be the best way to verify what color format the decoded frames are? Stagefright is using OMX.qcom.video.decoder.avc. I also assumed that the ANativeWindow would be able to handle rendering a YUV image. Maybe I made a lot of assumptions, but that's how it seemed to work when I was decoding the same .mp4 file using the Java low level APIs with MediaCodec. The Android version I'm using would be 4.2.2_r1. – mathieujofis Jul 31 '13 at 20:41
  • I just interfaced with AwesomeLocalRenderer, and it revealed that kKeyColorFormat has not been set at all with my subclass of MediaSource. FFmpeg which handles the incoming data also reports that it can't read the color format. I tried setting the color format explicitly to OMX_COLOR_FormatYUV420Planar, but then the Stagefright buffers fail to dequeue. – mathieujofis Jul 31 '13 at 23:24
  • Is it possible for you to enable logs in `OMXCodec`, `AwesomePlayer` and share the same? It would become easier to help if you could get some logs. – Ganesh Jul 31 '13 at 23:32
  • Sorry, Ganesh-- you've been a great help so far. In the process of enabling logs in OMXCodec, recompiling and replacing libstagefright.so on the device, I bricked my phone (ugh). I recompiled using source from 4.2.2_r1 by using 'make -j4 stagefright' and I put the resulting libstagefright.so in my /sdcard/ via Android File Transfer. From there I went into adb shell and used the 'cp' command to copy it to /system/lib. Did I go wrong somewhere? – mathieujofis Aug 01 '13 at 20:39
  • I got my phone up and running again-- I enabled verbose logging and have posted the output above. – mathieujofis Aug 06 '13 at 01:02
  • Sorry, i couldn't respond earlier. From your logging, it looks like that `OMXCodec` was unsuccessful in setting the buffercount to `SurfaceTexture` i.e. `BufferQueue`. Is this error message reproducible? – Ganesh Aug 06 '13 at 16:13
  • Yes, it happens every time. – mathieujofis Aug 06 '13 at 18:28
  • From the available information, it appears that your application has `dequeued` some buffers from the same `native window` which is passed to the decoder. Can you please check if this is the case? Can you please ensure that native window is created afresh and not reused? – Ganesh Aug 07 '13 at 15:49
  • I'm not calling any function in my application that dequeues buffers from the native window, e.g. the way `OMXCodec` does in `native_window_dequeue_buffer_and_wait`. The native window is created from an existing `SurfaceView` named `ANativeWindow *nativeWindow` which is then assigned to `sp mNativeWindow`, as shown above. After that, the loop to read decoded frames and render them is entered, which upon exiting, ends the program. So I don't believe I'm reusing this window anywhere else. – mathieujofis Aug 07 '13 at 22:43
  • In your previous comment, you mentioned that you are creating `native window` from existing `SurfaceView`. Is it possible for you to try and create a new `Surface` and provide it to the player? Is it a hard constraint to reuse the same `Surface`? Could you give a small snapshot of your application code? – Ganesh Aug 08 '13 at 00:44
  • Create a new surface in native code? Here are the 3 files that make up the core of this project: [MainActivity.java](http://pastebin.com/3GgJvXUX), [MainJNICode (called AlloClient)](http://pastebin.com/ThAFe8Q8), and [MediaSource](http://pastebin.com/9pba54Cq) – mathieujofis Aug 08 '13 at 01:45

3 Answers


I ended up solving this issue by using the Java low-level APIs instead. I set up a native read_frame function that parses video frames using FFmpeg. I call this function from a separate Java decoder thread, and it returns a new frame of data to be decoded by MediaCodec. It was very straightforward to render this way: just pass MediaCodec the surface.
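Roughly, the decode loop looks like the sketch below. This is not my exact code: readFrame() stands in for the hypothetical native FFmpeg-backed parser, and error handling is omitted.

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class SurfaceDecoder {
    // Hypothetical native method backed by FFmpeg; returns one encoded
    // H.264 frame, or null at end of stream. Not a real API.
    private native byte[] readFrame();

    void decodeLoop(Surface surface, int width, int height) throws Exception {
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        // Handing MediaCodec the Surface here is what makes rendering work:
        // decoded buffers go straight to it on releaseOutputBuffer(..., true).
        codec.configure(format, surface, null, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers(); // pre-API-21 style
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean done = false;
        while (!done) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                byte[] frame = readFrame();
                if (frame == null) {
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                } else {
                    inputBuffers[inIndex].clear();
                    inputBuffers[inIndex].put(frame);
                    codec.queueInputBuffer(inIndex, 0, frame.length, 0, 0);
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                // true = render this output buffer to the Surface
                codec.releaseOutputBuffer(outIndex, true);
                done = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        codec.stop();
        codec.release();
    }
}
```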

Alternatively, I could have used MediaExtractor, but FFmpeg had some other functionality that I needed.

mathieujofis

In case the problem has not been solved: I had the same problem, and found the cause purely by accident!

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

You have to allocate the frame buffer with dimensions divisible by 16, the macroblock size; otherwise the graphic buffer is not large enough for the decoder output. The H.264 encoder internally uses a slightly larger frame size when the provided video's width or height is not aligned to a macroblock boundary. Just apply the following: width = ((width + 15) / 16) * 16; height = ((height + 15) / 16) * 16;
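The rounding above can be packaged as a tiny helper (a sketch; alignTo16 is just an illustrative name):

```java
public class Align {
    // Round a dimension up to the nearest multiple of 16, the H.264
    // macroblock size. Integer division truncates, so adding 15 first
    // makes the division round up instead of down.
    static int alignTo16(int dim) {
        return ((dim + 15) / 16) * 16;
    }

    public static void main(String[] args) {
        System.out.println(alignTo16(240)); // already aligned: 240
        System.out.println(alignTo16(250)); // rounded up: 256
    }
}
```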

Steven

You need to call native_window_set_scaling_mode(mNativeWindow.get(), NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);

ssddn