I have a Unity3D application that plays videos on the UI, and I need a faster way to convert a YUV video buffer to an RGB buffer. My setup is this:
- Unity3D with a UI image that renders a video
- Gstreamer external process which actually plays the video
- A native plugin, called from Unity3D, that converts the YUV video buffer to an RGBA one
The part of my C++ native plugin that does the YUV->RGBA conversion:
unsigned char * rgba = (unsigned char *) obj->g_RgbaBuff;
unsigned char * yuv  = (unsigned char *) obj->g_Buffer;
size_t i  = 0; // read index into the YUYV buffer (4 bytes cover 2 pixels)
size_t ta = 0; // write index into the RGBA buffer (8 bytes cover 2 pixels)
while (ta < obj->g_BufferLength)
{
    // YUYV packing: one U/V chroma pair is shared by two luma samples
    int ty  = (int)yuv[i];
    int tu  = (int)yuv[i + 1];
    int tY2 = (int)yuv[i + 2];
    int tv  = (int)yuv[i + 3];

    // First pixel of the pair (BT.601 math, B/G/R byte order to match the GL_BGRA upload)
    int tp1 = (int)(1.164f * (ty - 16));
    int tr  = Clamp((int)(tp1 + 1.596f * (tv - 128)));
    int tg  = Clamp((int)(tp1 - 0.813f * (tv - 128) - 0.391f * (tu - 128)));
    int tb  = Clamp((int)(tp1 + 2.018f * (tu - 128)));
    rgba[ta]     = tb;
    rgba[ta + 1] = tg;
    rgba[ta + 2] = tr;
    rgba[ta + 3] = 255; // opaque alpha
    ta += 4;

    // Second pixel of the pair, reusing the same chroma
    int tp2 = (int)(1.164f * (tY2 - 16));
    int tr2 = Clamp((int)(tp2 + 1.596f * (tv - 128)));
    int tg2 = Clamp((int)(tp2 - 0.813f * (tv - 128) - 0.391f * (tu - 128)));
    int tb2 = Clamp((int)(tp2 + 2.018f * (tu - 128)));
    rgba[ta]     = tb2;
    rgba[ta + 1] = tg2;
    rgba[ta + 2] = tr2;
    rgba[ta + 3] = 255;
    ta += 4;

    i += 4; // advance to the next YUYV group
}
This code is called by Unity3D in a loop to continuously update my image, which correctly shows the video. The problem is that it's really slow: with just three 720p videos open, my frame rate drops from 60 FPS to well below 30. Is there a way to do this on the GPU, or a smarter way to do it on the CPU? Should I approach it differently?
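One CPU-side improvement I've been considering is replacing the per-pixel float math with fixed-point integer arithmetic, using the same BT.601 coefficients pre-scaled by 256. A minimal sketch of that idea (the function name and signature here are illustrative, not my plugin's actual API; it writes B,G,R,A byte order to match the GL_BGRA upload):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>

// Clamp to the displayable 0..255 range.
static inline uint8_t ClampByte(int v)
{
    return (uint8_t)std::min(255, std::max(0, v));
}

// YUYV (Y0 U Y1 V) -> BGRA, BT.601 coefficients scaled by 256 so the
// per-pixel work is integer multiplies, adds, and one shift.
static void ConvertYUYVToRGBA(const uint8_t* yuv, uint8_t* rgba, size_t pixelPairs)
{
    for (size_t p = 0; p < pixelPairs; ++p, yuv += 4, rgba += 8)
    {
        int u = yuv[1] - 128;
        int v = yuv[3] - 128;
        // Chroma terms are shared by both pixels of the pair.
        int rTerm =  409 * v;            // round(1.596 * 256)
        int gTerm = -208 * v - 100 * u;  // round(0.813 * 256), round(0.391 * 256)
        int bTerm =  517 * u;            // round(2.018 * 256)
        for (int k = 0; k < 2; ++k)
        {
            int y = 298 * (yuv[2 * k] - 16); // round(1.164 * 256)
            rgba[4 * k + 0] = ClampByte((y + bTerm + 128) >> 8); // B
            rgba[4 * k + 1] = ClampByte((y + gTerm + 128) >> 8); // G
            rgba[4 * k + 2] = ClampByte((y + rTerm + 128) >> 8); // R
            rgba[4 * k + 3] = 255;                               // A
        }
    }
}
```

This keeps the exact structure of my loop but removes all float conversions; the inner body also looks like a good candidate for SSE/NEON vectorization later.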
To render the buffer to a texture I use the following native code; the rendering is triggered every frame via Unity's GL.IssuePluginEvent():
static void ModifyTexturePixels(void* textureHandle, int w, int h, void* rgbaBuff)
{
    glBindTexture(GL_TEXTURE_2D, (GLuint)(size_t)textureHandle);
    // Upload the converted buffer; B,G,R,A byte order matches the conversion loop above
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, rgbaBuff);
}