
I've been working on this issue for a while and it's time to ask the greater community for help. I have read many other StackOverflow questions on this topic and have not yet found a relevant solution.

I have a well-established Android OpenGL project that renders to a texture and then renders that texture to the screen. This mechanism is fundamental to my application and I have a lot of history and confidence in it. I recently added new functionality to internally take a screenshot of the rendering; that is to say, my application can also save the rendered texture to a file. These images have traditionally been exactly the size of the display.

Now, I want to generate images that are larger than the screen size, so that the screenshots reflect the larger image size but are scaled down to the screen size when displayed. This should be a straightforward process; however, I am getting unexpected results. The resulting screenshot is the correct size, but is empty except for an area the size of the screen. For example, if the rendered texture and resulting screenshot are intended to be 4 times the screen size (twice the size of the screen in each dimension, X and Y), the screenshot image file will be that intended size, but only the upper-left quadrant of the image will have been drawn. In this example, here is the resulting generated screenshot: my viewport is 768x887, the screenshot is correctly 1536x1774, and within the screenshot, the only colored area is 768x887. For our purposes here, my fragment shader for rendering to texture is a test of the coordinate mapping to the screen...

gl_FragColor = vec4(uv.x, 0.0, uv.y, 1.0);  // during render to texture

Note that when we draw this same texture to the screen during execution, the full screen is colored consistently with that shader. Why is only one quadrant of the screenshot filled instead of the whole thing? And why, when this texture is drawn on screen, does it display only the part that's the size of the screen, rather than the whole thing with the three empty quadrants?

I get the original size of the viewport from GLSurfaceView.Renderer.onSurfaceChanged() and store it in _viewportWidth and _viewportHeight. I used to create the framebuffer texture directly at _viewportWidth by _viewportHeight. Now, as an example, I have...

float quality = 2f;
_frameBufferWidth = (int)((float)_viewportWidth * quality);
_frameBufferHeight = (int)((float)_viewportHeight * quality);

... and generate the frame buffer of size _frameBufferWidth by _frameBufferHeight.
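
For reference, the framebuffer setup itself is essentially the standard OpenGL ES 2.0 pattern sketched below. This is a simplified sketch rather than my exact code; the names _frameBufferId and _frameBufferTextureId and the texture parameters are just illustrative.

int[] ids = new int[1];

// Color texture backing the framebuffer, allocated at the enlarged size.
GLES20.glGenTextures(1, ids, 0);
_frameBufferTextureId = ids[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _frameBufferTextureId);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
        _frameBufferWidth, _frameBufferHeight, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

// Framebuffer object with the texture as its color attachment.
GLES20.glGenFramebuffers(1, ids, 0);
_frameBufferId = ids[0];
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, _frameBufferTextureId, 0);
if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) {
    // handle the error
}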

I am also calling glViewport() twice. After my first call to glBindFramebuffer() to render to the texture and not the screen, and after doing the relevant error handling, I call glViewport(0, 0, _frameBufferWidth, _frameBufferHeight), which passes without error. When I later want to draw this texture to the screen, I make my second glBindFramebuffer() call and, immediately after, call glViewport(0, 0, _viewportWidth, _viewportHeight). The idea is that the original render to texture goes into a _frameBufferWidth by _frameBufferHeight image, and when we present it on screen, we want it at _viewportWidth by _viewportHeight.
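
In outline, the per-frame flow is roughly the following sketch; drawSceneToTexture() and drawTextureToScreen() are placeholders for my actual draw code, not real method names.

// Pass 1: render the scene into the offscreen texture.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glViewport(0, 0, _frameBufferWidth, _frameBufferHeight);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
drawSceneToTexture();       // uses the vec4(uv.x, 0.0, uv.y, 1.0) test shader

// Pass 2: render that texture to the default (window) framebuffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glViewport(0, 0, _viewportWidth, _viewportHeight);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
drawTextureToScreen();      // samples u_sampler over a full-screen quad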

Any ideas what I may be missing? Thanks in advance.

EDIT (March 10, 2016): I just tried quality=0.5f and am getting unusual results. I would prefer to share more images to clarify this scenario, but I'm a new member and am only allowed two. When we draw to the screen with quality=0.5f, the screen is colored properly according to the GLSL code above: the display is identical to the 768x887 upper left quadrant of the screenshot linked above (corresponding to quality=2f). The quality=0.5f screenshot that is generated, however, is colored differently from the screen. This screenshot correctly has the intended 384x443 size, but is still being rendered as though it's 768x887 and just cropping out a 384x443 part.

Even though the code suggests otherwise, it seems as though we're always rendering to a _viewportWidth by _viewportHeight area, rather than the intended _frameBufferWidth by _frameBufferHeight area.

I use essentially a full-screen quad for both rendering passes, and I am used to that working fine. When I render to the screen, I sample the texture I just rendered to:

gl_FragColor = texture2D(u_sampler, uv);  // during render to screen

The u_sampler accesses the texture we rendered to, and uv is in [0,1] in both dimensions. So, for the screen to show anything, it must be doing a texture lookup to get its color information. Thus, the bright red and blue shown on the screen must exist in the framebuffer texture, even though they are missing from the correctly sized screenshot.
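
For completeness, the screen pass binds that texture in the usual way. Again a sketch; u_samplerLocation stands in for my cached uniform handle.

GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _frameBufferTextureId);
GLES20.glUniform1i(u_samplerLocation, 0);   // u_sampler reads texture unit 0
// ... then draw the full-screen quad with uv in [0,1] on both axes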

genpfault
patriot4code
  • This all sounds quite reasonable. Do you have other devices you can test it on? Ideally from different vendors, with different GPUs? – Reto Koradi Mar 10 '16 at 04:48
  • Hi Reto. I just tested this on another device and had the same result. I started with a Samsung Galaxy Tab A and just tried an LG G2. According to my OpenGL queries, both use the Qualcomm Adreno renderer and allow a max texture size and viewport dimensions of 4096. I believe I got the idea about multiple glViewport calls from one of your other [answers](http://stackoverflow.com/questions/25937282/opengl-es-drawing-to-texture). Any additional perspective you could provide would be immensely beneficial! – patriot4code Mar 11 '16 at 02:00
  • From the sound of it, you're doing a blit operation (glBlitFramebuffer). These have always worked for me. Could you post the exact GL calls you're using? – Swifter Mar 11 '16 at 03:00
  • Hi Swifter. I'm using OpenGL ES 2.0, which does not use glBlitFramebuffer. I am using a presumably simple/straightforward combination of glBindFramebuffer, glFramebufferTexture2D, and glTexImage2D. I'll contemplate this further and may post more specific calls. Thank you for your suggestion and help. – patriot4code Mar 11 '16 at 04:45
  • It seems like your glViewport call isn't working. Have you checked it by querying the viewport (glGetIntegerv with GL_VIEWPORT) afterwards? Also note that glViewport does not set a per-framebuffer viewport, but a piece of global GL state. Your order of operations needs to be: bind offscreen framebuffer -> set enlarged viewport -> draw offscreen -> bind screen framebuffer -> set screen viewport -> draw on screen. Is this the case? – Swifter Mar 13 '16 at 22:26
  • Hi Swifter. Correct, that is my order of operations. Using glGet* is a great suggestion and I just tried it after each glViewport call. The query results do validate that the viewport is internally being set correctly. – patriot4code Mar 22 '16 at 01:27
  • @patriot4code: I think you should post more code. It is not really clear what is going on. – derhass Mar 27 '16 at 13:25

1 Answer


I ran into the same issue before, on iOS, also with OpenGL ES. I tried to render to a 4096x4096 framebuffer, and the result ended up only in the top-left corner.

The solution I found is to add

glViewport(0, 0, 4096, 4096);

before any other GL calls such as

glClearColor(0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

in the main render function.

I was then able to render to the screen with

glViewport(0, 0, view.bounds.width*3, view.bounds.height*3);

since glViewport maps normalized device coordinates to window (pixel) coordinates.

Another thing to mention: on iOS a view's size is given in points, not pixels, so I had to multiply it by the screen's scale factor (3 on my device). You may want to check whether you are getting actual pixel dimensions on Android.
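
Translated to the Android/GLES20 side of the question, the ordering I mean looks roughly like this (just a sketch using the _frameBuffer*/_viewport* names from the question, not tested against the asker's code):

// Offscreen pass: set the enlarged viewport before clearing or drawing.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glViewport(0, 0, _frameBufferWidth, _frameBufferHeight);
GLES20.glClearColor(0f, 0f, 0f, 1f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// ... draw the offscreen pass ...

// On-screen pass: switch back to the window framebuffer and its viewport,
// using actual pixel dimensions, not points.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glViewport(0, 0, _viewportWidth, _viewportHeight);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// ... draw the on-screen pass ...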

qq456cvb