The Bravia Engine is mainly employed for video/image post-processing prior to rendering on the display. There is an interesting link at http://developer.sonymobile.com/2012/06/21/mobile-bravia-engine-explained-video/.
In AOSP, I presume the user settings from the menu are read and the corresponding filtering is then enabled/applied in the SurfaceFlinger or HwComposer parts of the framework. Another link of interest could be: http://blog.gsmarena.com/heres-what-sony-ericsson-mobile-bravia-engine-really-does-review/
EDIT: Interaction between Video Decoder - AwesomePlayer - HwComposer
The following summarizes the interactions between the different actors in the playback and composition pipeline.
AwesomePlayer acts as a sink to the OMX Video Decoder. Hence, it continuously polls for a new frame that may be available for rendering and processing.
When the OMX Video Decoder completes decoding a frame, the FillBufferDone callback of the codec unblocks a read invoked by AwesomePlayer.
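This callback-unblocks-read handoff can be sketched as a small producer/consumer exchange. The class name FrameSink and the int frame payload below are purely illustrative, not the actual AwesomePlayer/OMX types; only the blocking pattern is the point.

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

// Illustrative sketch: a sink (playing the role of AwesomePlayer)
// blocks in read() until a FillBufferDone-style callback from the
// decoder thread delivers a filled buffer.
class FrameSink {
public:
    // Called from the decoder thread when a buffer has been filled
    // (analogous to the codec's FillBufferDone callback).
    void fillBufferDone(int frame) {
        std::lock_guard<std::mutex> lock(mMutex);
        mFrames.push(frame);
        mCond.notify_one();  // unblock a pending read()
    }

    // Called by the player; blocks until a decoded frame is available.
    int read() {
        std::unique_lock<std::mutex> lock(mMutex);
        mCond.wait(lock, [this] { return !mFrames.empty(); });
        int frame = mFrames.front();
        mFrames.pop();
        return frame;
    }

private:
    std::mutex mMutex;
    std::condition_variable mCond;
    std::queue<int> mFrames;
};
```

The real path goes through OMX buffer headers and MediaBuffer objects, but the synchronization shape is the same: the read side parks on a condition until the decoder's completion callback signals it.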
Once the frame is available, it is subjected to A/V synchronization logic by the AwesomePlayer module and pushed into SurfaceTexture via the render call. All the aforementioned steps are performed as part of the AwesomePlayer::onVideoEvent method.
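The per-frame A/V sync decision can be sketched roughly as follows. The function name, thresholds, and three-way outcome are illustrative assumptions; the actual onVideoEvent logic also handles seeks, clock anchoring, and a wall-clock fallback when there is no audio track.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch of the per-frame A/V sync decision:
// compare the video frame's timestamp against the audio clock.
enum class SyncAction { Render, WaitAndRetry, Drop };

SyncAction decideSync(int64_t framePtsUs, int64_t audioClockUs) {
    int64_t latenessUs = audioClockUs - framePtsUs;
    if (latenessUs > 40000)    // more than ~one frame late: drop it
        return SyncAction::Drop;
    if (latenessUs < -10000)   // frame is early: repost the video event
        return SyncAction::WaitAndRetry;
    return SyncAction::Render; // close enough: render via SurfaceTexture
}
```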
The render call queues the buffer. This SurfaceTexture is one of the layers available to SurfaceFlinger for composition.
When a new layer is available, through a series of steps, SurfaceFlinger invokes the HwComposer to perform the composition of all the related layers.
AOSP only provides a template or an API for the HwComposer; the actual implementation is left to the vendor.
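The shape of that contract can be modeled as below. This is not the real hwcomposer.h HAL interface; the struct and field names are invented for illustration. It only mirrors the two-phase idea: SurfaceFlinger first asks the vendor which layers its hardware can take, then commits the frame.

```cpp
#include <cassert>
#include <vector>

// Illustrative model of the HwComposer contract (not the actual AOSP
// HAL): prepare() lets the vendor claim layers for hardware overlays,
// set() commits the frame; unclaimed layers fall back to GLES.
enum class CompositionType { Gles, Overlay };

struct Layer {
    CompositionType type = CompositionType::Gles;
};

struct VendorHwComposer {
    int maxOverlays;  // hypothetical hardware overlay limit

    // prepare(): claim as many layers as the hardware can compose.
    void prepare(std::vector<Layer>& layers) {
        int claimed = 0;
        for (auto& layer : layers) {
            layer.type = (claimed < maxOverlays) ? CompositionType::Overlay
                                                 : CompositionType::Gles;
            if (layer.type == CompositionType::Overlay) ++claimed;
        }
    }

    // set(): commit the frame; returns how many layers the hardware took.
    int set(const std::vector<Layer>& layers) {
        int overlays = 0;
        for (const auto& layer : layers)
            if (layer.type == CompositionType::Overlay) ++overlays;
        return overlays;
    }
};
```

A vendor adding Bravia-Engine-style post-processing would plausibly hook in at this level, filtering the video layer before it is composed, which is why the settings-driven filtering mentioned above would surface in SurfaceFlinger/HwComposer rather than in the decoder.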