Questions tagged [openmax]

Questions related to OpenMAX, a royalty-free, cross-platform set of C-language programming interfaces that provides abstractions for routines especially useful for audio, video, and still images. This includes the OpenMAX AL, IL and DL layer APIs.

OpenMAX (Open Media Acceleration) is a royalty-free, cross-platform set of C-language programming interfaces that provides abstractions for routines especially useful for audio, video, and still images. It's intended for devices that process large amounts of multimedia data in predictable ways.

OpenMAX provides three layers of interfaces: Application Layer (AL), Integration Layer (IL) and Development Layer (DL). OpenMAX is managed by the non-profit technology consortium Khronos Group.

OpenMAX AL is the interface between multimedia applications, such as a media player, and the platform media framework. It allows companies that develop applications to easily migrate them to any platform that supports the OpenMAX AL API.
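
For a concrete sense of the AL call pattern, here is a minimal sketch of bringing up an OpenMAX AL engine object, following the pattern used in the Android NDK "native-media" sample; the header path is the Android one and error handling is reduced to early exits, so treat it as an illustration rather than production code.

    #include <OMXAL/OpenMAXAL.h>   /* Android NDK header; the path may differ on other platforms */

    int create_engine(void)
    {
        XAObjectItf engineObject = NULL;
        XAEngineItf engineItf = NULL;

        /* Create the engine object, the entry point to OpenMAX AL. */
        if (xaCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL) != XA_RESULT_SUCCESS)
            return -1;

        /* Realize it synchronously, then fetch the engine interface,
           which is used to create players, recorders and output mixes. */
        if ((*engineObject)->Realize(engineObject, XA_BOOLEAN_FALSE) != XA_RESULT_SUCCESS ||
            (*engineObject)->GetInterface(engineObject, XA_IID_ENGINE, &engineItf) != XA_RESULT_SUCCESS) {
            (*engineObject)->Destroy(engineObject);
            return -1;
        }

        /* ... create media player / recorder objects through engineItf ... */

        (*engineObject)->Destroy(engineObject);
        return 0;
    }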

OpenMAX IL is the interface between a media framework, such as DirectShow or GStreamer, and a set of multimedia components (such as audio or video codecs). It allows companies that build platforms (for example, an MP3 player) to easily swap components such as MP3 decoders and equalizer effects, and to buy components for their platform from different vendors.
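
As a rough illustration of what an IL client looks like, the sketch below loads a component through the standard OMX core entry points and registers the three callbacks every IL client supplies. The component name is just an example (the Broadcom H.264 decoder exposed on the Raspberry Pi), and all error checking is omitted.

    #include <stddef.h>
    #include <OMX_Core.h>

    /* The three callbacks an IL client registers with a component. */
    static OMX_ERRORTYPE on_event(OMX_HANDLETYPE comp, OMX_PTR app, OMX_EVENTTYPE ev,
                                  OMX_U32 data1, OMX_U32 data2, OMX_PTR ev_data)
    {
        return OMX_ErrorNone;           /* state changes, port events, errors, ... */
    }

    static OMX_ERRORTYPE on_empty_done(OMX_HANDLETYPE comp, OMX_PTR app,
                                       OMX_BUFFERHEADERTYPE *buf)
    {
        return OMX_ErrorNone;           /* an input buffer was consumed */
    }

    static OMX_ERRORTYPE on_fill_done(OMX_HANDLETYPE comp, OMX_PTR app,
                                      OMX_BUFFERHEADERTYPE *buf)
    {
        return OMX_ErrorNone;           /* an output buffer was produced */
    }

    int main(void)
    {
        OMX_CALLBACKTYPE callbacks = { on_event, on_empty_done, on_fill_done };
        OMX_HANDLETYPE decoder = NULL;

        OMX_Init();
        /* Component names are vendor-specific; this one is the Broadcom
           H.264 decoder found on the Raspberry Pi. */
        OMX_GetHandle(&decoder, "OMX.broadcom.video_decode", NULL, &callbacks);

        /* ... configure ports, allocate buffers, move the component to Executing,
           then exchange data with OMX_EmptyThisBuffer / OMX_FillThisBuffer ... */

        OMX_FreeHandle(decoder);
        OMX_Deinit();
        return 0;
    }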

OpenMAX DL is the interface between physical hardware, such as DSP chips and CPUs, and software, such as video codecs and 3D engines. It allows companies to easily integrate new hardware that supports OpenMAX DL without re-optimizing their low-level software.

More information on OpenMAX:

  1. http://www.khronos.org/openmax/
  2. http://en.wikipedia.org/wiki/OpenMAX
85 questions
1
vote
1 answer

OpenMAX, Raspberry Pi: Get Video Dimensions of H264

Is there any way to get the video dimensions of an H264 video on the Raspberry Pi using OpenMAX directly, without having to use ffmpeg or something else? All the Pi examples appear to have hardcoded values for that. Thanks!
moka
  • 4,353
  • 2
  • 37
  • 63
1
vote
1 answer

How do you build GStreamer's gst-launch pipelines?

Let's say you have a video file. From what I've found, you first need to know which container it uses, via the mediainfo command: $ mediainfo your_path_to_a_video.file You then need to find a demuxer for the container, so you do $ gst-inspect-1.0 |…
kukrt
  • 2,117
  • 3
  • 21
  • 32
1
vote
1 answer

Using a hardware encoder from the Android camera hardware implementation

I want to use a hardware encoder to create JPEG image files from my hardware/ti/omap3/camera/CameraHardware.cpp file. Currently it uses libjpeg to do the encoding. From Java code, I have hardware encoding working, but I don't know how to use it from…
Dennis Estenson
  • 1,022
  • 10
  • 11
1
vote
1 answer

mpeg2 ts android ffmpeg openmax

The setup is as follows: multicast server, 1000 Mb/s, UDP, MPEG2-TS Part 1 (H.222), streaming live TV channels. Quad-core 1.5 GHz Android 4.2.2 device, GLES 2.0 renderer. FFmpeg library. Eclipse Kepler, Android SDK/NDK, etc., running on Windows 8.1. Output…
WLGfx
  • 1,169
  • 15
  • 31
1
vote
1 answer

Stagefright: in which process context do the OMX subsystem in Stagefright and the OMX core run?

I am facing some issues with the Stagefright command-line utility: I am unable to understand whether the OMX subsystem (OMX, OMXMaster) in Stagefright and the OMX core run in the current application's process or in a different process. Which part of the…
stackuser
  • 629
  • 8
  • 19
1
vote
1 answer

Can we use OMX to do video encoding on Android?

There is a 'native-media' project in the NDK samples that calls OMX functions at the C level to decode and play video, but it seems that the NDK doesn't support OMX encoding now. Is that true? Besides, I also found this link. It seems that…
Brendon Tsai
  • 1,267
  • 1
  • 17
  • 31
1
vote
2 answers

Does the Android NDK support a native method (OMXAL) of video encoding?

In Android NDK r9c, Google provides a sample called 'native-media'. In this sample, we can use OMXAL at the C level to do the MediaPlayer job. I am wondering if we can do media encoding this way. I tried to write the corresponding functions, but I failed,…
Brendon Tsai
  • 1,267
  • 1
  • 17
  • 31
1
vote
0 answers

QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka converter

I need to handle YUV data from H/W decoding output on Android. I'm using a Nexus 4, and the decoding output format is the QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka type. But I need YUV420 planar format data; it needs to be…
1
vote
0 answers

Transport stream processing in OpenMAX AL

I am trying to understand the flow of audio/video in OpenMAX AL. I have the following doubts. According to my understanding, the audio/video flow goes like this: from the Android application, the MPEG-2 stream goes to OpenMAX AL, which gives it to the SoC, where…
Mayank Agarwal
  • 447
  • 1
  • 7
  • 21
1
vote
2 answers

How to analyse a .so file? [Android/Linux]

I need to analyse Android's libOMXAL.so (the one in the NDK folder) to find out whether this file contains the implementation of OMXClient.cpp or not. I want to analyse this .so file to find out which function it…
Brendon Tsai
  • 1,267
  • 1
  • 17
  • 31
1
vote
1 answer

How to shorten the latency of an OpenMAX AVC decoder?

I am trying to write an Android video-conference app using an OpenMAX codec. Having worked my way through OpenMAX IL for AVC decoding, I found a big latency between sending the empty-buffer command and the fill-buffer-done callback. My case is dealing with…
Zighouse
  • 13
  • 3
1
vote
1 answer

Is it possible to create multiple instances of OMXCodec using Stagefright?

I want to use libstagefright.so on Android phones for media operations. I have explored the example given on this page and have been trying to implement the same. While testing this implementation with multiple OMXCodec instances, its output buffer…
sam18
  • 631
  • 1
  • 10
  • 24
1
vote
1 answer

Variable-length structures

OMX provides a struct with the following definition: /* Parameter specifying the content URI to use. */ typedef struct OMX_PARAM_CONTENTURITYPE { OMX_U32 nSize; /**< size of the structure in bytes */ OMX_VERSIONTYPE nVersion; /**< OMX specification…
Stasik
  • 2,568
  • 1
  • 25
  • 44
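
The question above shows the (truncated) definition of OMX_PARAM_CONTENTURITYPE, whose last member in the IL spec is a trailing contentURI[1] array. A minimal sketch of the usual IL-client pattern for such variable-length structures, assuming that definition, is to over-allocate and record the total size in nSize:

    #include <stdlib.h>
    #include <string.h>
    #include <OMX_Core.h>   /* or whichever IL header defines OMX_PARAM_CONTENTURITYPE on your platform */

    /* Allocate an OMX_PARAM_CONTENTURITYPE large enough to hold `uri`.
       The trailing contentURI[1] member already accounts for the NUL byte. */
    static OMX_PARAM_CONTENTURITYPE *make_content_uri(const char *uri)
    {
        size_t len = strlen(uri);
        size_t total = sizeof(OMX_PARAM_CONTENTURITYPE) + len;

        OMX_PARAM_CONTENTURITYPE *p = calloc(1, total);
        if (!p)
            return NULL;

        p->nSize = (OMX_U32)total;            /* size of the whole variable-length block */
        p->nVersion.s.nVersionMajor = 1;      /* fill in the IL spec version you target */
        p->nVersion.s.nVersionMinor = 1;
        memcpy(p->contentURI, uri, len + 1);  /* copy the string including its NUL */
        return p;
    }

The component then reads nSize to learn how many URI bytes follow the fixed header, which is why nSize must describe the whole allocation rather than just sizeof(OMX_PARAM_CONTENTURITYPE).
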
0
votes
0 answers

FFmpeg codec internal error with my decoder

Hey, I wrote an FFmpeg codec based on OpenMAX. It runs successfully with my test code or the FFmpeg test. However, it fails when I use it in Chrome. The error is [3845:3880:0710/110530.508867:ERROR:ffmpeg_video_decoder.cc(487)] avcodec_open2…
0
votes
1 answer

keras.backend.function() won't accept model.layers[0].input as input tensor

I'm trying to use this TensorFlow implementation of OpenMax and adapt it to CIFAR-100 for use in my project. As part of OpenMax, instead of softmax, you have to get the activation vector of the penultimate layer, for which the following function is…
Saeed Aram
  • 11
  • 3