Questions tagged [android-mediacodec]

MediaCodec is a class in the android.media package of the Android API that provides access to low-level, native media codec (encoder/decoder) components. Along with common I/O, it supports input data of the following decoder MIME types:

  • "video/x-vnd.on2.vp8" - VPX video (i.e. video in .webm)
  • "video/avc" - H.264/AVC video
  • "video/mp4v-es" - MPEG4 video
  • "video/3gpp" - H.263 video
  • "audio/3gpp" - AMR narrowband audio
  • "audio/amr-wb" - AMR wideband audio
  • "audio/mpeg" - MPEG1/2 audio layer III
  • "audio/mp4a-latm" - AAC audio
  • "audio/vorbis" - vorbis audio
  • "audio/g711-alaw" - G.711 alaw audio
  • "audio/g711-mlaw" - G.711 ulaw audio

API reference: http://developer.android.com/reference/android/media/MediaCodec.html
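
A minimal sketch of selecting and configuring a decoder for one of these MIME types via MediaExtractor (the file path and the ByteBuffer-only output are placeholder assumptions, not part of the tag wiki):

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import java.io.IOException;

    public class DecoderSetup {
        // Opens the file, finds the first video track and returns a started decoder for it.
        public static MediaCodec createVideoDecoder(String path) throws IOException {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime != null && mime.startsWith("video/")) {   // e.g. "video/avc"
                    extractor.selectTrack(i);
                    MediaCodec decoder = MediaCodec.createDecoderByType(mime);
                    decoder.configure(format, null /* surface */, null /* crypto */, 0 /* flags */);
                    decoder.start();
                    return decoder;
                }
            }
            throw new IOException("No video track found in " + path);
        }
    }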

1173 questions
0 votes, 0 answers

Android decoder dequeueOutputBuffer returns -1

I'm trying to run the ExoPlayer demo app on my device, but I've hit this problem: outputIndex = codec.dequeueOutputBuffer(outputBufferInfo, 0); always returns -1. I've looked through all the related questions and answers on Stack Overflow, but nothing helped. I set the SPS and…
Jane
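
With a timeout of 0, a return value of -1 from dequeueOutputBuffer() is simply MediaCodec.INFO_TRY_AGAIN_LATER, i.e. no output is ready yet (often because input has not been queued or the codec needs more time). A sketch of a typical drain loop, assuming `decoder` is an already started MediaCodec:

    import android.media.MediaCodec;
    import android.media.MediaFormat;

    public class DecoderDrain {
        // Drain whatever output is currently available from a started decoder.
        public static void drain(MediaCodec decoder) {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            while (true) {
                // -1 here is MediaCodec.INFO_TRY_AGAIN_LATER, not an error; a non-zero
                // timeout gives the codec a chance to produce a frame.
                int outputIndex = decoder.dequeueOutputBuffer(info, 10000 /* us */);
                if (outputIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                    break; // nothing ready yet; feed more input and call again later
                } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    MediaFormat newFormat = decoder.getOutputFormat();
                } else if (outputIndex >= 0) {
                    boolean render = true; // render to the configured Surface, if any
                    decoder.releaseOutputBuffer(outputIndex, render);
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        break;
                    }
                }
            }
        }
    }
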
0 votes, 1 answer

Render Android MediaCodec output on two views for VR Headset compatibility

What I know so far is that I need a SurfaceTexture that can be rendered on two TextureViews simultaneously, so the chain will be: MediaCodec -> SurfaceTexture -> 2x TextureView. But how do I obtain a SurfaceTexture programmatically to be used in the…
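
A SurfaceTexture does not have to come from a TextureView; it can be created programmatically from a GL texture name and wrapped in a Surface that MediaCodec renders into. A sketch under the assumption that a GL context is current and that your own renderer then draws the external texture into each view:

    import android.graphics.SurfaceTexture;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;
    import android.view.Surface;

    public class DecoderSurface {
        private SurfaceTexture surfaceTexture;
        private Surface surface;

        // Must be called on a thread with a current EGL context.
        public Surface create() {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

            surfaceTexture = new SurfaceTexture(tex[0]);
            surfaceTexture.setOnFrameAvailableListener(st -> {
                // Call updateTexImage() on the GL thread, then draw the external
                // texture into each TextureView's EGL surface / viewport.
            });
            surface = new Surface(surfaceTexture);  // pass this to MediaCodec.configure()
            return surface;
        }
    }
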
0 votes, 1 answer

Converting a ByteBuffer obtained from MediaCodec to a Bitmap and storing it on the SD card

I'm using MediaCodec to read a video file and store the frames on the SD card. However, it stores a green rectangle instead of the actual frame. Here is the code: int outIndex = decoder.dequeueOutputBuffer(info, 10000); switch (outIndex) { …
Anirudh GP
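
A green frame usually means YUV data is being interpreted as RGB. One possible conversion, assuming the decoder output happens to be in an NV21-compatible layout (many devices emit other YUV420 variants, so MediaFormat.KEY_COLOR_FORMAT should be checked and the data converted first):

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.graphics.ImageFormat;
    import android.graphics.Rect;
    import android.graphics.YuvImage;
    import java.io.ByteArrayOutputStream;

    public class YuvToBitmap {
        // `yuv` holds the raw decoder output copied out of the ByteBuffer.
        public static Bitmap convert(byte[] yuv, int width, int height) {
            YuvImage image = new YuvImage(yuv, ImageFormat.NV21, width, height, null);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            image.compressToJpeg(new Rect(0, 0, width, height), 90, out);
            byte[] jpeg = out.toByteArray();
            return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
        }
    }
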
0 votes, 1 answer

Android MediaCodec with input buffer from JNI code

I use MediaCodec in the standard way, for example: public void run() { MediaExtractor extractor = new MediaExtractor(); try { extractor.setDataSource("/sdcard/video-only.mpg"); } catch (Exception e1) { } MediaFormat format =…
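
For reference, the input-feeding half of that standard loop usually looks like the sketch below (getInputBuffer() is API 21+); whether the bytes come from MediaExtractor or from JNI, the queueInputBuffer() call is the same:

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import java.nio.ByteBuffer;

    public class InputFeeder {
        // Returns true once the end-of-stream flag has been queued.
        public static boolean feedInput(MediaCodec decoder, MediaExtractor extractor) {
            int inputIndex = decoder.dequeueInputBuffer(10000 /* us */);
            if (inputIndex < 0) {
                return false; // no input buffer free right now
            }
            ByteBuffer inputBuffer = decoder.getInputBuffer(inputIndex);
            int size = extractor.readSampleData(inputBuffer, 0);
            if (size < 0) {
                decoder.queueInputBuffer(inputIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                return true;
            }
            decoder.queueInputBuffer(inputIndex, 0, size,
                    extractor.getSampleTime(), 0 /* flags */);
            extractor.advance();
            return false;
        }
    }
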
0 votes, 1 answer

Using MediaCodec to compress video

I'm trying to use MediaCodec on Android to compress videos. It looks like MediaCodec gives you back the raw stream. Is there a way to go from A->B (compress a video and end up with an output file)? Thanks.
James Nguyen
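
MediaCodec by itself only emits an elementary stream; getting a playable file at the end means routing the encoder's output buffers through MediaMuxer. A sketch of that step, assuming an `encoder` that is configured and drained elsewhere:

    import android.media.MediaCodec;
    import android.media.MediaMuxer;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class EncoderToFile {
        private final MediaMuxer muxer;
        private int trackIndex = -1;
        private boolean started = false;

        public EncoderToFile(String outputPath) throws IOException {
            muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        }

        // Call for each output buffer drained from the encoder.
        public void writeSample(MediaCodec encoder, int outputIndex, MediaCodec.BufferInfo info) {
            if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                encoder.releaseOutputBuffer(outputIndex, false);
                return; // csd is carried in the track format, not as a sample
            }
            if (!started) {
                // The encoder's real output format (with csd-0/csd-1) is only known
                // once output starts arriving; add the track then.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                started = true;
            }
            ByteBuffer data = encoder.getOutputBuffer(outputIndex);
            muxer.writeSampleData(trackIndex, data, info);
            encoder.releaseOutputBuffer(outputIndex, false);
        }

        public void finish() {
            muxer.stop();
            muxer.release();
        }
    }
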
0 votes, 1 answer

Nexus 5 crash: FinalizerWatchdogDaemon - SurfaceTexture.finalize()

We're seeing a peculiar issue when attempting to re-encode videos, and it only seems to happen on the Nexus 5. We use MediaCodec to compress and re-encode videos client-side, and sometimes when the task is running on a Nexus 5 we get the following…
gcgrant
0 votes, 0 answers

Extract a Bitmap from a SurfaceView

I'm displaying real-time video on a SurfaceView; the data comes from MediaCodec. If I'm decoding the data directly to a Surface, how can I get the Bitmap of the current frame efficiently (within 30 ms)? Currently I am dealing with the ByteBuffer…
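
On API 24+ one option that avoids the ByteBuffer path entirely is PixelCopy, which asynchronously reads back the SurfaceView's current contents into a Bitmap; a sketch (the callback interface is a placeholder):

    import android.graphics.Bitmap;
    import android.os.Handler;
    import android.os.Looper;
    import android.view.PixelCopy;
    import android.view.SurfaceView;

    public class FrameGrabber {
        public interface Callback { void onBitmap(Bitmap bitmap); }

        public static void grab(SurfaceView surfaceView, Callback callback) {
            Bitmap bitmap = Bitmap.createBitmap(
                    surfaceView.getWidth(), surfaceView.getHeight(), Bitmap.Config.ARGB_8888);
            PixelCopy.request(surfaceView, bitmap, copyResult -> {
                if (copyResult == PixelCopy.SUCCESS) {
                    callback.onBitmap(bitmap); // the current frame, read back from the surface
                }
            }, new Handler(Looper.getMainLooper()));
        }
    }
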
0 votes, 1 answer

MediaCodec different colours on genymotion and huddle 2

My aim: use filters (cropping, black and white, edge detection) on an MP4 video from the SD card using RenderScript. Attempted solutions: use MediaCodec to output to a surface directly. The rendered colours were correct but I could not find a way to…
dewijones92
0 votes, 0 answers

Combine EncodeAndMux with HelloEffects in Android

I am trying to create a video from a series of images using MediaCodec and MediaMuxer. I also want to apply an Effect to each Bitmap before encoding. The video gets created with the correct duration, but the output is only a black screen. I have…
Kaitis
0 votes, 1 answer

Delay frame while encoding video file using google/grafika

I'm using google/grafika's examples to decode, transform and encode a video clip back to a file. The transformation is downscaling and translating; it is done via a shader stored in Texture2dProgram. My main activity is based on CameraCaptureActivity.…
Krzysztof Kansy
0 votes, 0 answers

Android MediaCodec OutputBuffer approach

I am using MediaCodec on Android for H.264 stream decoding. The raw data stream consists of a series of NAL units. Every frame (640x480) in the video is divided into four parts in the stream. Each time I send a buffer (one NAL unit) into MediaCodec…
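
For a raw H.264 stream the decoder generally needs the SPS/PPS up front, either as csd-0/csd-1 in the MediaFormat or as the first queued buffers, and each queueInputBuffer() call should normally carry one complete access unit (all NAL units of a frame) rather than a single slice. A sketch of the format setup, where the SPS/PPS arrays (with start codes) are assumed to be parsed out of the stream:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class H264DecoderSetup {
        // sps/pps include their 0x00000001 start codes.
        public static MediaCodec create(byte[] sps, byte[] pps, int width, int height,
                                        Surface surface) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
            format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }
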
0 votes, 1 answer

Is it possible/how to feed MediaCodec decoded frames to MediaCodec encoder directly?

My goal is to splice together video fragments from several video files. Fragments are defined by arbitrary start and end times. Initially I wanted to do it using a library like mp4parser, but it can only cut streams at sync (IFRAME) points, while I…
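
On API 18+ the usual way to wire a decoder straight into an encoder is Surface-to-Surface: configure the encoder first, take its input Surface, and pass that Surface to the decoder so frames never leave the GPU. A sketch with placeholder resolution, bitrate and MIME type (in practice an intermediate OpenGL stage is often inserted between the two surfaces to handle scaling or effects):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    public class DecodeEncodePipe {
        public static void connect(MediaFormat decoderInputFormat) throws IOException {
            MediaFormat encoderFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            encoderFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            encoderFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
            encoderFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            encoderFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(encoderFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface encoderInput = encoder.createInputSurface(); // after configure(), before start()
            encoder.start();

            MediaCodec decoder = MediaCodec.createDecoderByType(
                    decoderInputFormat.getString(MediaFormat.KEY_MIME));
            // The decoder renders directly into the encoder's input surface, so decoded
            // frames never round-trip through ByteBuffers.
            decoder.configure(decoderInputFormat, encoderInput, null, 0);
            decoder.start();
        }
    }
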
0 votes, 1 answer

How to process audio (.3gp) files in Android

I need to process an audio file in Android to determine whether the voice is continuous, with no pauses. The audio file I need to process is not huge; it is a very small file in terms of both duration and size. I need to know if there is any library or…
user4338712
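
One approach is to decode the 3GP's AMR track ("audio/3gpp") to 16-bit PCM with MediaCodec, after which pause detection reduces to an amplitude check per window. A sketch of that check, assuming the samples have already been decoded into a short[] (the threshold is an arbitrary placeholder to tune):

    public class SilenceDetector {
        private static final int SILENCE_THRESHOLD = 500; // tune for your recordings

        // Returns true if the RMS amplitude of this window of 16-bit PCM samples is
        // below the threshold, i.e. the window is treated as a pause in the voice.
        public static boolean isSilent(short[] pcm, int offset, int length) {
            long sumSquares = 0;
            for (int i = offset; i < offset + length; i++) {
                sumSquares += (long) pcm[i] * pcm[i];
            }
            double rms = Math.sqrt((double) sumSquares / length);
            return rms < SILENCE_THRESHOLD;
        }
    }
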
0 votes, 1 answer

What, if any, are the differences in MediaCodec encoders (I420, NV12, Planar, Semi-Planar, etc)?

Referring to this page: http://bigflake.com/mediacodec/ A5. The color formats for the camera output and the MediaCodec encoder input are different. Camera supports YV12 (planar YUV 4:2:0) and NV21 (semi-planar YUV 4:2:0). The MediaCodec encoders…
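
The accepted input color formats are per-codec and can be queried at runtime instead of assumed; a sketch using MediaCodecList (API 21+):

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class ColorFormatQuery {
        // Prints the raw input color-format constants supported by each AVC encoder,
        // e.g. COLOR_FormatYUV420Planar (19) or COLOR_FormatYUV420SemiPlanar (21).
        public static void dumpAvcEncoderColorFormats() {
            MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
            for (MediaCodecInfo info : list.getCodecInfos()) {
                if (!info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase("video/avc")) continue;
                    MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                    for (int colorFormat : caps.colorFormats) {
                        System.out.println(info.getName() + " supports color format " + colorFormat);
                    }
                }
            }
        }
    }
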
0 votes, 1 answer

Android MediaCodec/NdkMediaCodec GLES2 interop

We are trying to decode AVC/H.264 bitstreams using the new NdkMediaCodec API. While decoding works fine now, we are struggling to get the contents of the decoded video frame mapped to GLES2 for rendering. The API allows passing an ANativeWindow at…
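
On the Java side the decoded frames reach GLES2 through a SurfaceTexture bound to a GL_TEXTURE_EXTERNAL_OES texture; the NDK path mirrors this by passing the ANativeWindow of that Surface to AMediaCodec_configure. A sketch of the per-frame latch, assuming the SurfaceTexture was created on the GL thread from the external texture id:

    import android.graphics.SurfaceTexture;

    public class ExternalTextureUpdater {
        private final SurfaceTexture surfaceTexture;
        private final float[] texMatrix = new float[16];

        public ExternalTextureUpdater(SurfaceTexture surfaceTexture) {
            this.surfaceTexture = surfaceTexture;
        }

        // Call on the GL thread after onFrameAvailable() has fired.
        public float[] latchFrame() {
            surfaceTexture.updateTexImage();              // latch the newest decoded frame
            surfaceTexture.getTransformMatrix(texMatrix); // pass as a uniform to the shader
            // The fragment shader must sample with samplerExternalOES and declare
            // #extension GL_OES_EGL_image_external : require
            return texMatrix;
        }
    }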