
I am working on a graphical application that supports multiple operating systems and graphics back ends. The window is created with GLFW and the graphics API is chosen at runtime. When running the program on Windows with OpenGL, VSync seems to be broken: the frame rate is locked at 60 FPS, yet screen tearing artifacts appear. According to the GLFW documentation, `glfwSwapInterval(0)` should unlock the frame rate from the default of using VSync, and that works as expected. `glfwSwapInterval(1)` should lock the frame rate to the monitor's refresh rate, and not calling `glfwSwapInterval()` at all should default to VSync. While the frame rate is correctly locked/unlocked by these calls, I observed some very strange behaviour.

When `glfwSwapInterval()` isn't called at all, VSync is enabled by default, but the wait for the next frame happens at the first draw call! One would expect the delay to happen in `glfwSwapBuffers()`. No screen artifacts are visible whatsoever.

When calling `glfwSwapInterval(1)`, VSync is enabled and the delay for the next frame happens in `glfwSwapBuffers()`, as expected. However, when VSync is set explicitly like this, screen tearing artifacts appear.

Right now, not calling `glfwSwapInterval()` at all seems like a hacky way to get VSync, but:

  • The user wouldn't be able to disable VSync without recreating the window,
  • The profiler reports the first draw call as taking far too long, since the VSync wait somehow happens there.
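For reference, a minimal sketch of the setup being described (the timing code and window parameters are illustrative, not my actual application code): it times the first GL call of the frame and the buffer swap separately, which makes it easy to see which of the two is absorbing the VSync wait.

```c
// Illustrative reproduction sketch, not the actual application code.
// Requires GLFW and an OpenGL context; link with -lglfw (and the
// platform's GL library).
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *window = glfwCreateWindow(800, 600, "vsync test", NULL, NULL);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // Case under test: explicitly request VSync. Omitting this call
    // entirely is the variant that (surprisingly) blocks in the first
    // draw call instead of in glfwSwapBuffers().
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        double t0 = glfwGetTime();
        glClear(GL_COLOR_BUFFER_BIT); // first GL call of the frame
        glFinish();                   // force completion so the timing is meaningful
        double t1 = glfwGetTime();

        glfwSwapBuffers(window);      // with glfwSwapInterval(1), the wait lands here
        double t2 = glfwGetTime();

        printf("draw: %.2f ms, swap: %.2f ms\n",
               (t1 - t0) * 1000.0, (t2 - t1) * 1000.0);
        glfwPollEvents();
    }

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```

On a 60 Hz monitor with VSync working normally, the two timings should sum to roughly 16.7 ms per frame, with the wait showing up in the swap; in the buggy case described above, the wait shows up in the draw timing instead.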

I have tried fiddling with GPU driver settings and testing the code on multiple machines. The problem persists across machines when using Windows and OpenGL.

If anyone can make any sense of this, please share, or if I am misunderstanding something, I would greatly appreciate some pointers in the right direction.

EDIT: One more detail: the tearing happens at a specific horizontal line; the rest of the frame renders properly.

After doing some more tests, everything seems to work as intended on integrated graphics. Correct me if I am wrong, but it looks like a graphics driver issue.

Lambda
  • Is the tearing stable, i.e. happens on nearly the same line each frame? – numzero Apr 15 '20 at 22:56
  • Yes! Happens at a specific line only! – Lambda Apr 15 '20 at 23:02
  • That means it does synchronize the buffer swap with vertical refresh, not just keeping the frequency. But it does that at the wrong time. Buggy drivers probably. Or as it happens on different machines, buggy OpenGL implementation, MS loves DX and hates GL, we all know that. – numzero Apr 15 '20 at 23:05
  • It makes perfect sense, as the screen is quite smooth everywhere, except on a line. Do you have any idea how I could fix this "shift" in the buffer swap timing? – Lambda Apr 15 '20 at 23:09
  • Sadly no. There *might* be some option in the driver configuration, but even here on Linux, I don’t remember any (I had a similar problem some time ago; no configuration helped IIRC, only switch to GPU of another vendor). – numzero Apr 15 '20 at 23:26
  • That isn't an option for me sadly. Well, thank you for pointing out the weird detail that the tearing happens at a specific line. Investigating that might lead somewhere – Lambda Apr 15 '20 at 23:32
  • @numzero: `TearFree` on the `intel` and `amdgpu` Xorg drivers? – genpfault Apr 16 '20 at 00:05
  • Are you running these tests on a multi-monitor setup ? – rotoglup Apr 18 '20 at 17:08
  • @genpfault `amdgpu` is fine currently (it had wrong reclocking timing earlier, but that’s a bit different problem). `TearFree` on `intel` was non-functional (can’t retest, I changed the MB since then). – numzero Apr 18 '20 at 19:55
  • and the question is about Windows™, not Xorg – numzero Apr 18 '20 at 19:55
  • Tearing can occur if your camera translation code is in another thread from your drawing thread and your camera changes position while your frame is drawn, which can happen if you use a callback for mouse or keyboard input and that code changes your camera position. So you should have 2 camera positions and only change the real one at the start or end of rendering a frame. – HopefullyHelpful Jul 20 '21 at 17:37
  • Sadly, the code I wrote when I uploaded this question was single-threaded. I can see how that could mess things up, but the root of the problem lies somewhere else. What I noticed is that with integrated Intel graphics, the tearing disappeared, but running the same code on a GTX 1060 showed tearing. Perhaps a driver issue? – Lambda Jul 20 '21 at 20:32

0 Answers