
When I render the camera texture to a low-resolution SurfaceView, it looks pixelated.

It seems I need to generate mipmaps for the camera texture, but it doesn't work this way:

GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glGenTextures(1, glTextures, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, glTextures[0]);

GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

mInputSurfaceTexture = new SurfaceTexture(glTextures[0]);
mInputSurfaceTexture.setDefaultBufferSize(CCamera.SIZE.getWidth(), CCamera.SIZE.getHeight());
mInputSurfaceTexture.setOnFrameAvailableListener(new CameraFrameListener(), mGLHandler);
mInputSurface = new Surface(mInputSurfaceTexture);

// feed mInputSurface to the camera service.

public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    surfaceTexture.updateTexImage();
    GLES20.glGenerateMipmap(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);   
    //GLES11Ext.glGenerateMipmapOES(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
}

BTW, what is the difference between:

GLES11Ext.glGenerateMipmapOES

GLES20.glGenerateMipmap

BC.Lee

1 Answer


You can't, at least not directly.

Implement an offscreen pass that converts the YUV camera data to RGB, writing it into a regular RGB texture, and then generate mipmaps for that. If you know you only need the low-resolution version, that YUV-to-RGB pass could also implement the initial 2:1 downsample to minimize the memory bandwidth overheads.
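A minimal sketch of that offscreen pass, reusing CCamera.SIZE and the glTextures array from the question; drawExternalTexture() is a hypothetical helper that draws a full-screen quad sampling the OES texture (the quad and shader plumbing are omitted):

// One-time setup: a half-resolution GL_TEXTURE_2D render target (the initial 2:1 downsample).
int fboW = CCamera.SIZE.getWidth() / 2;
int fboH = CCamera.SIZE.getHeight() / 2;

int[] fbo = new int[1];
int[] rgbTex = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glGenTextures(1, rgbTex, 0);

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rgbTex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, fboW, fboH, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, rgbTex[0], 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

// Per frame, after surfaceTexture.updateTexImage():
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glViewport(0, 0, fboW, fboH);
drawExternalTexture(glTextures[0]);   // full-screen quad, samplerExternalOES -> RGB
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

// Mipmap generation works here because the target is GL_TEXTURE_2D,
// not GL_TEXTURE_EXTERNAL_OES.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rgbTex[0]);
GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);

The pass that draws to the low-resolution SurfaceView then samples rgbTex[0] with GL_LINEAR_MIPMAP_LINEAR instead of sampling the external texture directly.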

solidpixel
  • What do you mean by the initial 2:1 downsample? Like creating an FBO with only half the resolution of that OES texture, then mipmapping the FBO and rendering to my preview SurfaceView? – BC.Lee May 26 '21 at 05:40
  • Yes, exactly - don't write a full resolution FBO you don't really need. – solidpixel May 26 '21 at 09:46
  • If OP is rendering each downsampled camera frame only once before discarding, I wonder if they're better off avoiding mipmaps altogether and just writing a shader that implements a downsample/blur by taking lots of samples, perhaps using a Kawase filter (see the sketch after these comments). – Columbo May 26 '21 at 11:06
  • Possibly. I know some mobile GPUs are slower at sampling YUV than RGB, so converting to RGB first (at lower res) still might be a prudent choice. I'm going to have to go and write a test app now ... – solidpixel May 26 '21 at 22:45
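Following up on Columbo's comment, a rough sketch of such a single-pass downsample shader, written as a Java string in the style of the question (the names and uniforms are hypothetical). It box-filters a 2x2 neighbourhood of the external camera texture per output pixel, so it only suits modest reduction ratios; larger ones need more taps or a chain of passes:

private static final String DOWNSAMPLE_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES uCameraTex;\n" +
        "uniform vec2 uTexelSize;   // 1.0 / camera texture size\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "    // Average a 2x2 neighbourhood of source texels.\n" +
        "    vec4 c = texture2D(uCameraTex, vTexCoord + uTexelSize * vec2(-0.5, -0.5))\n" +
        "           + texture2D(uCameraTex, vTexCoord + uTexelSize * vec2( 0.5, -0.5))\n" +
        "           + texture2D(uCameraTex, vTexCoord + uTexelSize * vec2(-0.5,  0.5))\n" +
        "           + texture2D(uCameraTex, vTexCoord + uTexelSize * vec2( 0.5,  0.5));\n" +
        "    gl_FragColor = c * 0.25;\n" +
        "}\n";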