
We're trying to optimize memory usage in our program, and we've noticed that some of our textures could be stored as greyscale with an alpha value, since when stored as RGBA the R, G and B values are all identical.

We'd like to save disk space as well as memory by only storing two channels, greyscale and alpha, both 8-bit, giving a 50% saving on both disk and memory over 32-bit RGBA.

However, the hurdle we've hit is that we can't work out how to give this information to OpenGL. We can store the image on disk either as one image with two 8-bit channels, or as two images with one 8-bit channel each. I can't find a way for OpenGL to treat this as a greyscale image, so that it only keeps these two channels in memory, without interpreting them as specific colours like red and green. I have found that GL_RG8 exists, which has the right number of channels at the right bit depth, but the wrong colours.

Is this even possible in OpenGL ES 2? Can we give two channels to OpenGL and tell it that they're not red and green but grey and alpha?

I have seen this question: Can I use a grayscale image with the OpenGL glTexImage2D function? but it's rather old, and GL_LUMINANCE is now deprecated.

genpfault
Force Gaia

2 Answers


Note that GL_RG8 only exists on OpenGL ES 3.x upwards; it's not part of OpenGL ES 2.x. For ES 2.x you can use GL_LUMINANCE_ALPHA. It's deprecated, but not removed, so should still work just fine.
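A minimal sketch of this approach, assuming the RGB channels really are identical: collapse the redundant channels on the CPU, then upload the two-channel buffer with GL_LUMINANCE_ALPHA. The helper name `rgba_to_la` and the buffer layout are my own assumptions, not from the answer.

```c
#include <stdint.h>
#include <stddef.h>

/* Collapse an RGBA8 buffer whose R, G and B channels are identical
   into an interleaved luminance+alpha (LA8) buffer: half the bytes. */
void rgba_to_la(const uint8_t *rgba, uint8_t *la, size_t pixels)
{
    for (size_t i = 0; i < pixels; ++i) {
        la[2 * i + 0] = rgba[4 * i + 0]; /* grey: R == G == B, so take R */
        la[2 * i + 1] = rgba[4 * i + 3]; /* alpha */
    }
}

/* Upload under ES 2.0 (requires a current GL context and bound texture):
 *
 *   glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, w, h, 0,
 *                GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, la);
 *
 * Sampling a luminance-alpha texture yields (L, L, L, A), so existing
 * shader code written for RGBA keeps working unchanged.
 */
```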

If you really want to use GL_RG8 then you can fix the "wrong colors" either by swizzling in the shader code to reorder the channels when you sample them, or by swizzling in the sampler by setting GL_TEXTURE_SWIZZLE_[R|G|B|A] using glTexParameteri().
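The sampler-side swizzle might look like this; note that texture swizzle only exists from ES 3.0 onwards, same as GL_RG8, and these calls need a current context and a bound texture (non-runnable fragment, shown for illustration):

```c
/* Make an RG8 texture read back as (grey, grey, grey, alpha). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_G, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_GREEN);

/* Shader-side alternative, in GLSL:
 *   vec4 c = texture(tex, uv).rrrg;
 */
```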

solidpixel

If you don't mind the venial sin: I have noticed that every ES2 implementation I have encountered (iOS, ANGLE, Adreno, Mali, and some other weird Android drivers) silently supports GL_RG8 (for internalFormat) and GL_RG (for format).
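In other words, the answer is reporting that an upload like the following tends to work in practice on ES 2.0 drivers, even though GL_RG is only specified from ES 3.0 (non-runnable fragment; needs a current context, and ES 2.0 requires internalformat and format to match, so the unsized GL_RG appears in both slots):

```c
glTexImage2D(GL_TEXTURE_2D, 0, GL_RG, w, h, 0,
             GL_RG, GL_UNSIGNED_BYTE, data);
```

Since this relies on out-of-spec driver behaviour, it's worth testing on every target device before shipping.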

nmr