So I have a Unity3D plugin written in C++ and compiled for Android.
When I started out I used OpenGL ES 2 to maximize device reach, but recently I decided to try moving up to OpenGL ES 3. So I included the gl3 headers instead of the gl2 headers, rebuilt, and switched the Graphics API in Unity to OpenGLES3.
Unfortunately, it no longer works correctly.
The code is the following:
In one plugin I have this:
glBindTexture(GL_TEXTURE_2D, textureID);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, videoWidth, videoHeight, glPixFormat, glPixType, frameBuffer->buff);
where textureID is the native ID of a texture passed in by Unity, and frameBuffer->buff is the byte array of the image I want to upload into it.
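Since glTexSubImage2D fails silently when the driver rejects it, checking glGetError() right after the upload is a cheap way to see what ES3 thinks of the call (GL_INVALID_OPERATION is the error the spec reserves for a format/type combination the texture does not accept). Below is a minimal self-contained sketch of the logging side; the GL error constants are mirrored from the GLES headers (with a trailing underscore so they don't clash with the real ones), and glErrorName is a hypothetical helper name, not part of any GL API.

```cpp
#include <string>

// Error codes mirrored from <GLES3/gl3.h> so this sketch compiles standalone.
constexpr unsigned GL_NO_ERROR_          = 0;
constexpr unsigned GL_INVALID_ENUM_      = 0x0500;
constexpr unsigned GL_INVALID_VALUE_     = 0x0501;
constexpr unsigned GL_INVALID_OPERATION_ = 0x0502;
constexpr unsigned GL_OUT_OF_MEMORY_     = 0x0505;

// Map a glGetError() code to a readable name for logcat output.
std::string glErrorName(unsigned err) {
    switch (err) {
        case GL_NO_ERROR_:          return "GL_NO_ERROR";
        case GL_INVALID_ENUM_:      return "GL_INVALID_ENUM";
        case GL_INVALID_VALUE_:     return "GL_INVALID_VALUE";
        case GL_INVALID_OPERATION_: return "GL_INVALID_OPERATION";
        case GL_OUT_OF_MEMORY_:     return "GL_OUT_OF_MEMORY";
        default:                    return "unknown GL error";
    }
}
```

In the plugin itself the idea would be to drain glGetError() in a loop after the glTexSubImage2D call and log each code through glErrorName; I haven't verified this against the exact setup in the question, but it usually narrows the problem down quickly.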
In a second plugin I have this:
static std::vector<unsigned char> emptyPixelsAlpha(height * width, 0);
glBindTexture(GL_TEXTURE_2D, alphaID);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_ALPHA, GL_UNSIGNED_BYTE, emptyPixelsAlpha.data());
where alphaID is the native ID of a texture passed in by Unity. (I've simplified this part a bit; all it does here is fill another texture, with the same dimensions as the previous one, with black on a single channel.)
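One other cheap sanity check on either upload path is that the client buffer is exactly as long as GL will read: with the default GL_UNPACK_ALIGNMENT of 4, every row except the last is padded up to a 4-byte boundary, which matters for a 1-byte-per-pixel GL_ALPHA upload whose width isn't a multiple of 4. A small sketch, assuming GL_UNSIGNED_BYTE data; expectedUploadSize is a hypothetical helper of mine, not a GL function.

```cpp
#include <cstddef>

// Size in bytes that glTexSubImage2D will read from the client buffer,
// accounting for GL_UNPACK_ALIGNMENT row padding (the last row is not padded).
// bytesPerPixel is 1 for GL_ALPHA/GL_UNSIGNED_BYTE, 4 for GL_RGBA/GL_UNSIGNED_BYTE.
std::size_t expectedUploadSize(std::size_t width, std::size_t height,
                               std::size_t bytesPerPixel,
                               std::size_t unpackAlignment = 4) {
    std::size_t rowBytes = width * bytesPerPixel;
    // Round each full row up to the unpack alignment.
    std::size_t stride = ((rowBytes + unpackAlignment - 1) / unpackAlignment)
                         * unpackAlignment;
    if (height == 0) return 0;
    return stride * (height - 1) + rowBytes;
}
```

Comparing this against emptyPixelsAlpha.size() (or the size behind frameBuffer->buff) would rule out a stride mismatch; alternatively, calling glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before the upload sidesteps the padding entirely.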
These two textures are fed to the following shader on the Unity side:
Shader "alphaMaskShader"
{
    Properties {
        _MainTex("Base (RGB)", 2D) = "white" {}
        _Alpha("Alpha (A)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Transparent" "Queue" = "Overlay" }
        ZWrite Off
        ZTest Off
        Blend SrcAlpha OneMinusSrcAlpha
        ColorMask RGB
        Pass {
            SetTexture[_MainTex] {
                Combine texture
            }
            SetTexture[_Alpha] {
                Combine previous, texture
            }
        }
    }
}
Before, this code would simply make the displayed texture completely invisible, since the "alphaID" texture becomes the alpha channel. Now it displays the "textureID" texture as if the alpha channel weren't there, or as if it were somehow fully opaque.
I read over the OpenGL ES 3 spec, which clearly states that it's backward compatible with OpenGL ES 2, and I haven't found much about porting issues.
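One thing worth ruling out first is which context Unity actually created after the switch: logging glGetString(GL_VERSION) from the plugin's render-thread callback shows it, and the ES spec guarantees the string has the form "OpenGL ES <major>.<minor> ..." followed by vendor-specific text. A small parser sketch for that log line; parseEsVersion is a hypothetical helper of mine.

```cpp
#include <cstdio>
#include <string>

// Parse a GL_VERSION string of the form "OpenGL ES <major>.<minor> ...".
// Returns true on success; major/minor are left untouched on failure.
bool parseEsVersion(const std::string& versionString, int& major, int& minor) {
    int maj = 0, min = 0;
    if (std::sscanf(versionString.c_str(), "OpenGL ES %d.%d", &maj, &min) == 2) {
        major = maj;
        minor = min;
        return true;
    }
    return false;
}
```

If the major version logged at runtime is still 2, the problem is the context setup rather than the texture code itself.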