
My Qt application relies on timer events (startTimer/killTimer) to animate GUI components. Recently, however, I compiled and ran my app on my Mac laptop (as opposed to the Windows desktop I was developing on) and found that everything now appears to run/update at half its usual speed.

The application is not lagging; the update rate simply appears to be less frequent than it was originally. What should I do to guarantee consistent timing across all platforms?

Alternatively, should I be using a different feature for these timer events? I would prefer not to, as timerEvent is unbelievably convenient for integrating update cycles into widgets, but I would be interested in alternatives if they provide consistent timing.
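For reference, the kind of alternative I have in mind is a QTimer connected to a slot or lambda instead of a timerEvent override. A minimal sketch of what I assume that would look like (MyTimerObject is just a placeholder name; it mirrors the example below):

// Sketch of the QTimer alternative: the timeout signal drives the
// update instead of a timerEvent() override.
#include <QObject>
#include <QTimer>
#include <QDebug>

class MyTimerObject: public QObject {

public:
  MyTimerObject() {
    connect(&timer, &QTimer::timeout, this, [this]() {
      qDebug() << (counter++);
      if(counter == 20) {
        timer.stop();
      }
    });
    timer.start(60);
  }

private:
  QTimer timer;
  int counter = 0;
};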

(basic example code for context):

// Once MyObject is created, counts to 20. 
// The time taken is noticeably different on each platform though.

#include <QObject>
#include <QDebug>

class MyObject: public QObject {

public:
  MyObject() {
    timerId = startTimer(60);
  }

protected:
  void timerEvent(QTimerEvent* event) {
    qDebug() << (counter++);
    if(counter == 20) {
       killTimer(timerId);
    }
    QObject::timerEvent(event);
  }

private:
  int timerId = -1, counter = 0;
};

1 Answer


You are likely seeing a difference in timer accuracy. QTimer's accuracy varies across platforms, as the documentation notes:

Note that QTimer's accuracy depends on the underlying operating system and hardware. The timerType argument allows you to customize the accuracy of the timer. See Qt::TimerType for information on the different timer types. Most platforms support an accuracy of 20 milliseconds; some provide more. If Qt is unable to deliver the requested number of timer events, it will silently discard some.

You could try passing Qt::PreciseTimer to startTimer (the default is Qt::CoarseTimer), but I also recommend checking the current timestamp against a start time, or against the timestamp of the previous tick. This lets you adjust for the varying amounts of time between timer events, not unlike how variable time steps are handled in games.

For example:

#include <QObject>
#include <QDebug>
#include <chrono>

class MyObject: public QObject {

public:
  MyObject() {
    timerId = startTimer(60, Qt::PreciseTimer);
    startTime = std::chrono::steady_clock::now();
  }

protected:
  void timerEvent(QTimerEvent* event) {
    qDebug() << (counter++);
    // Compare wall-clock time to the nominal interval: stop once 20
    // 60 ms periods have actually elapsed, regardless of tick count.
    if(std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::steady_clock::now() - startTime).count() / 60 >= 20) {
       killTimer(timerId);
    }
    QObject::timerEvent(event);
  }

private:
  int timerId = -1, counter = 0;
  std::chrono::steady_clock::time_point startTime;
};

Another example using QElapsedTimer:

#include <QObject>
#include <QDebug>
#include <QElapsedTimer>

class MyObject: public QObject {

public:
  MyObject() {
    timerId = startTimer(60, Qt::PreciseTimer);
    elapsedTimer.start();
  }

protected:
  void timerEvent(QTimerEvent* event) {
    qDebug() << (counter++);
    // elapsed() returns milliseconds, so this stops after 20 nominal 60 ms intervals.
    if(elapsedTimer.elapsed() / 60 >= 20) {
       killTimer(timerId);
    }
    QObject::timerEvent(event);
  }

private:
  int timerId = -1, counter = 0;
  QElapsedTimer elapsedTimer;
};
  • Oh awesome! Quick question then: Would it be wise to simply use “startTimer(1, Qt::PreciseTimer)” and then not run any of the TimerEvent contents until the change in time desired has occurred? I’m concerned about performance since the event function will be called so frequently. At most, I expect 5 widgets to have active timer events running simultaneously. – Griffort Apr 03 '18 at 23:23
  • 1
  • That seems like an overly aggressive interval to me unless you actually need millisecond-level precision. It depends on what you're trying to accomplish, though. See eg [this question](https://stackoverflow.com/questions/15730765/qt-high-resolution-timer) for discussion of timer resolution. They also point out a QElapsedTimer class that would likely be more appropriate than my example solution. – Daniel Waechter Apr 03 '18 at 23:37
  • My logic is that I’m using “startTimer(5)” for some of the widgets, so in order to account for the half-speed on Mac, I’ll need to double that to perhaps “startTimer(2)”? In which case, I might as well bring it down to 1. In these extreme examples though, the timer only runs for ~30 cycles. Everything else is 10 ~ 15, so perhaps an interval of 5 would be sufficient for the longer running ones? (Thanks for the other advice, will use QElapsedTimer) – Griffort Apr 03 '18 at 23:43
  • I would test if simply using my suggestions (`Qt::PreciseTimer` + calculating elapsed time) provides good enough behavior for you in practice before dropping the poll time that much. It might be sufficient. 1ms might also be fine, of course, depending on how expensive it turns out to be, but I tend to err toward minimizing polling frequency. – Daniel Waechter Apr 03 '18 at 23:48
  • Also remember that your observation that things happened at "half speed" might not change at all by reducing your timer interval if the coarse timer's precision on Mac just isn't sufficient. Using the precise timer might fix it completely, but interpolating your timer event's behavior based on the observed time step is your best chance for synchronizing the cross-platform behavior. – Daniel Waechter Apr 03 '18 at 23:55
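To illustrate that last comment: a minimal sketch of scaling each update by the measured time step, in the spirit of game loops (AnimatedObject, position, and speedPerMs are hypothetical placeholders):

#include <QObject>
#include <QElapsedTimer>
#include <QDebug>

class AnimatedObject: public QObject {

public:
  AnimatedObject() {
    timerId = startTimer(16, Qt::PreciseTimer); // request ~60 ticks per second
    clock.start();
  }

protected:
  void timerEvent(QTimerEvent* event) {
    // Measure the time actually elapsed since the previous tick.
    const qint64 dtMs = clock.restart();

    // Advance by speed * elapsed time instead of a fixed step per tick,
    // so slower or coarser timers on a given platform still move the
    // animation the same distance per second.
    position += speedPerMs * dtMs;
    qDebug() << "dt =" << dtMs << "ms, position =" << position;

    QObject::timerEvent(event);
  }

private:
  int timerId = -1;
  double position = 0.0;
  const double speedPerMs = 0.1; // placeholder speed (units per ms)
  QElapsedTimer clock;
};

With this structure, the per-second animation speed no longer depends on how many timer events the platform actually delivers.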