
I'm building a module that is supposed to display images at a certain rate (not predefined, but not very high – 10 Hz max for the swapping of images).

From my research I came to the conclusion that QGLWidget is the right tool for this task, after enabling V-Sync with OpenGL calls (the SwapInterval family).

Yet I am not sure how to actually implement the swapping mechanism. Should I use a timer? If I set a timer for 333.3 ms (3 Hz) and the refresh rate is 60 Hz (16.67 ms per cycle, so the timer spans 20 cycles), can I be sure the timing will be fine? And if the rate should be 9 Hz, do I need to set the timer to 100 + 16.67 ms, because that is the best I can get? And if a timer is OK, should I just call paintGL() when it sends me the timeout event?

Thanks

JLev

1 Answer


should I use a timer?

Yes, but not in a naive way. If you simply used a timer for pinpointing the presentation of the images, your timer frequency would beat against the display's V-Sync/refresh oscillator – program timers run from a different clock source than the display output.

This beating will result in missed swap intervals, which is perceived as frame stutter. For example, a plain 111 ms (9 Hz) timer on a 60 Hz display drifts against the 16.67 ms refresh grid, so some images end up being shown for 6 refresh periods and others for 7.

Instead you should do the following: use a V-Synced buffer swap (SwapBuffers¹) as the reference point to start a high-precision timer.

Then render the frame for the next presentation time you are aiming for; take into account that presentation times come at the granularity of the display refresh interval – unless G-Sync or FreeSync is used. Use glFinish to force completion of the frame rendering, then stop the timer and determine how long rendering took. If the frame finished earlier than the refresh period you aimed for, add a high-resolution delay that stalls your program into the targeted display period (aim for the middle of that period), followed by a SwapBuffers, which becomes the reference point for the next iteration.
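
Below is a minimal sketch of one iteration of that loop. It assumes a QGLWidget subclass with `setAutoBufferSwap(false)` and V-Sync enabled; `presentNextImage()`, `renderImage()`, `targetPeriodMs` and `refreshPeriodMs` are made-up names for illustration, not Qt API:

```cpp
#include <QGLWidget>
#include <chrono>
#include <thread>

class ImageWidget : public QGLWidget
{
public:
    // One iteration: render, measure, delay into the target period, swap.
    void presentNextImage()
    {
        makeCurrent();

        renderImage();                 // issue the GL commands for the next image
        glFinish();                    // wait until the GPU has actually finished

        // Time elapsed since the last V-Synced swap (the reference point).
        auto now = std::chrono::high_resolution_clock::now();
        double elapsedMs =
            std::chrono::duration<double, std::milli>(now - m_lastSwap).count();

        // Sleep so that swapBuffers() is issued in the middle of the refresh
        // period that precedes the presentation time we are aiming for.
        double waitMs = targetPeriodMs - 0.5 * refreshPeriodMs - elapsedMs;
        if (waitMs > 0.0)
            std::this_thread::sleep_for(
                std::chrono::duration<double, std::milli>(waitMs));

        swapBuffers();                 // blocks until V-Sync; new reference point
        m_lastSwap = std::chrono::high_resolution_clock::now();
    }

private:
    void renderImage() { /* draw the current image here */ }

    std::chrono::high_resolution_clock::time_point m_lastSwap =
        std::chrono::high_resolution_clock::now();
    double targetPeriodMs  = 333.3;    // e.g. 3 Hz image rate
    double refreshPeriodMs = 16.67;    // 60 Hz display
};
```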


¹: This will work reliably only for Nvidia and AMD cards and their proprietary drivers. The drivers for Intel GPUs have a different timing behaviour.

datenwolf
  • Thanks, but what do you mean by "V-Synced buffer swap"? How can I start measuring when it occurs? Is that something I can control or access? From what I understand buffer swap is called implicitly, and not by me. – JLev Feb 05 '17 at 09:43
  • 1
    @JLev: You're using QGLWidget, which technically is a legacy from Qt-4, and you're right that in Qt-5 with QOpenGLWindow things have become a little bit more complicated. Anyway, QGLWidget has a member function `QGLWidget::swapBuffers` which does exactly what its name implies. However to be actually synchronized V-Sync must be enabled. This can be enforced/prohibited in the graphics driver settings and behaviour requested at runtime fine tuned using the `…swap_control` extensions (see http://opengl.org/registry search for `swap_control`). – datenwolf Feb 05 '17 at 11:52
  • Trying to implement your suggestion, I'm having a hard time getting the V-Sync to be consistent. I think maybe the problem is initializing the first timer start – if I call swapBuffers just for reference, I get "QOpenGLContext::swapBuffers() called with non-exposed window, behavior is undefined". Should I just render something blank and use it? – JLev Feb 08 '17 at 11:30
  • And one more thing, when you say 'add a delay', is sleep() ok for this? – JLev Feb 08 '17 at 12:08
  • @JLev: you should use `usleep`, the microsecond resolution definitely is required and Qt is smart enough to choose the proper high resolution system timers to implement it. I also suggest that for the initial frame interval you skip the timing and delay, and only for subsequent frames implement it. – datenwolf Feb 08 '17 at 14:06
  • Sorry for bothering again, I tried it but I keep getting my GUI freezing and can't even measure the actual display rate. I suspect this has something to do with Qt's event loop, but I'm not really sure how to solve it. `void paintGL() { glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT); glEnable(GL_TEXTURE_2D); drawTexture((0,0,1,1),texture,GL_TEXTURE_2D); swapBuffers(); t1 = chrono::high_resolution_clock::now(); glDisable(GL_TEXTURE_2D); swapImages(); }` Is this the way to paint correctly? Currently I get "X11 connection broke: I/O error (code 1)" when I force-quit. – JLev Feb 09 '17 at 12:16
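
A minimal setup sketch for the V-Sync request mentioned in the comments above, assuming Qt's QGLFormat (the swap interval maps to the platform's `…swap_control` extension; the graphics driver settings can still override it):

```cpp
#include <QApplication>
#include <QGLFormat>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // Request one V-Sync per buffer swap for all QGLWidgets created afterwards.
    QGLFormat fmt = QGLFormat::defaultFormat();
    fmt.setSwapInterval(1);
    QGLFormat::setDefaultFormat(fmt);

    // ... create and show the image-displaying QGLWidget here ...
    return app.exec();
}
```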