
As we know, glReadPixels() will block the pipeline and use the CPU to convert the data format, especially when I want to read depth values out to system RAM.

I tried the PBO technique provided by Songho, but I found it was only useful when the format parameter of glReadPixels() was set to GL_BGRA.

  • When I use a PBO with GL_BGRA, the read takes almost 0.1 ms and CPU usage is 4%.
  • When I change the format to GL_RGBA, the read takes 2 ms with 50% CPU usage.

It is the same when I try GL_DEPTH_COMPONENT. Apparently the slowness is caused by the format conversion, so does anyone know how to stop it from converting the data format?

In my program, I have to read the depth values and run my calculation 16*25 times in less than one second, so 2 ms per read is not acceptable.
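
For reference, this is roughly what my readback path looks like (a simplified single-buffer sketch of Songho's approach; WIDTH and HEIGHT stand in for my framebuffer size):

```c
/* Create a pixel-pack PBO so glReadPixels writes into driver-owned memory. */
GLuint pbo;
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glBufferData(GL_PIXEL_PACK_BUFFER, WIDTH * HEIGHT * 4, NULL, GL_STREAM_READ);

/* Fast path: GL_BGRA matches the window framebuffer, so no conversion. */
glReadPixels(0, 0, WIDTH, HEIGHT, GL_BGRA, GL_UNSIGNED_BYTE, 0);

/* Map later (ideally a frame later) so the transfer overlaps rendering. */
GLubyte *ptr = (GLubyte *)glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
if (ptr) {
    /* ... copy/inspect the pixels ... */
    glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
}
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
```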

unwind

2 Answers


so does anyone know how to stop it from converting the data format?

D'uh, by reading a data format that does not need converting. On-screen framebuffers are typically formatted as BGRA, and if you want something different, the data needs to be converted first.

You could use an FBO with texture/renderbuffer attachments in the expected format and render to that.
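
A minimal sketch of that idea, assuming a GL_DEPTH_COMPONENT24 renderbuffer; WIDTH, HEIGHT and depthBuffer are placeholder names, and the formats should be whatever your driver reads back fastest:

```c
/* Render into an FBO whose attachments already have the formats you
 * intend to read back, then read straight from that FBO. */
GLuint fbo, colorRb, depthRb;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenRenderbuffers(1, &colorRb);
glBindRenderbuffer(GL_RENDERBUFFER, colorRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, WIDTH, HEIGHT);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRb);

glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, WIDTH, HEIGHT);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);

/* ... render the scene into the FBO ... */

/* Read the depth attachment directly; combine with a PBO as before if
 * the transfer should stay asynchronous. */
glReadPixels(0, 0, WIDTH, HEIGHT, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, depthBuffer);
```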

datenwolf
  • Do you know the default format for the depth component of on-screen framebuffers? Only GL_DEPTH_COMPONENT is available for glReadPixels, and all data types like GL_UNSIGNED_BYTE and GL_UNSIGNED_SHORT have been tried without success – bluezones Jan 19 '13 at 04:05
  • @bluezones: There's also GL_DEPTH_STENCIL, which is what most GPUs actually use. The type must be GL_UNSIGNED_INT_24_8. Depth is in the upper 24 bits, stencil in the lower 8 (see the sketch below). – datenwolf Jan 19 '13 at 19:31
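
For illustration, unpacking that packed readback could look like this (a sketch only; WIDTH and HEIGHT are placeholders):

```c
/* Read the packed depth/stencil buffer: depth occupies the upper 24 bits,
 * stencil the lower 8. */
GLuint packed[WIDTH * HEIGHT];
glReadPixels(0, 0, WIDTH, HEIGHT, GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, packed);

GLuint  raw     = packed[0];
GLuint  depth24 = raw >> 8;               /* 24-bit fixed-point depth */
GLubyte stencil = raw & 0xFFu;            /* 8-bit stencil value      */
float   depth   = depth24 / 16777215.0f;  /* normalize to [0, 1]      */
```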

Desktop OpenGL will give you the data in whatever format you ask for, so unless you specify a format that doesn't require conversion, it will convert it for you. Because that's what you asked for.

Given an implementation that supports ARB_internalformat_query2 (just NVIDIA right now), you can simply ask. You ask for the GL_READ_PIXELS_FORMAT and GL_READ_PIXELS_TYPE, and then use those. It should return a format that doesn't require conversion.
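
A sketch of that query, assuming the attachment's internal format is GL_RGBA8 and that WIDTH, HEIGHT and pixels are placeholders for your own values:

```c
/* Ask the driver which format/type glReadPixels returns without
 * conversion (requires ARB_internalformat_query2 / GL 4.3). */
GLint fmt = GL_NONE, type = GL_NONE;
glGetInternalformativ(GL_RENDERBUFFER, GL_RGBA8, GL_READ_PIXELS_FORMAT, 1, &fmt);
glGetInternalformativ(GL_RENDERBUFFER, GL_RGBA8, GL_READ_PIXELS_TYPE,   1, &type);

/* Then read back with exactly what was reported. */
glReadPixels(0, 0, WIDTH, HEIGHT, (GLenum)fmt, (GLenum)type, pixels);
```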

Nicol Bolas
  • I downloaded the new GLEW, but I don't know how to use this function. I simply draw a cube with glBegin/glEnd, then use glReadBuffer(GL_FRONT) to select the framebuffer for reading; when I try to call the function like glGetInternalformativ(GL_RENDERBUFFER, GL_BGRA, GL_READ_PIXELS_FORMAT, 4, &pixelFormat), it throws an "access violation" exception – bluezones Jan 19 '13 at 04:10
  • 1
    @bluezones: `GL_BGRA` is *not* an [image format](http://www.opengl.org/wiki/Image_Format); it is a [pixel transfer format](http://www.opengl.org/wiki/Pixel_Transfer#Pixel_format). They're not the same thing. You pass image formats to `glGetInternalformativ`. Also, never read from the front buffer. – Nicol Bolas Jan 19 '13 at 04:33