
I'm trying to make a Snake clone using C++ and OpenGL/GLUT. However, I've been having trouble programming the short time intervals allowed for input between movements. I've tried a few timing methods, and I ended up making a class for it (as you'll see below). This seems to be the best way to handle the input delays (rather than glutTimerFunc() or sleep()), because the timer runs separately from the game loop instead of putting the whole program on hold. This is important because I want the player to be able to pause at any time. Unfortunately, I'm having issues with this method too: my timer class seems to ignore the double I give it for the time limit (represented below as double limit).

To test the class, I've set up a simple, looping console program that displays directional input from the user at the point when the timer reaches the time limit. It's supposed to display input every 0.33 seconds. Instead, it displays input at fixed intervals that seem to be around 0.8 seconds apart, regardless of the value given for the time limit. Why won't it display input at the given time intervals, and why has it made its own time limit?

This also happens to be my first major C++/OpenGL project without a tutorial, so any comments or advice on my code/methods is appreciated!

#include <iostream>
#include <windows.h> // needed for GetAsyncKeyState()
#include "timer.h"
// Include all files necessary for OpenGL/GLUT here.

using namespace std;

Timer timer;

// Insert necessary OpenGL/GLUT code for display/looping here.

void update(int value)
{
    if (timer.checkTime())
    {
        if (GetAsyncKeyState(VK_LEFT))
            cout << "You pressed LEFT!" << endl;
        else if (GetAsyncKeyState(VK_RIGHT))
            cout << "You pressed RIGHT!" << endl;
        else if (GetAsyncKeyState(VK_UP))
            cout << "You pressed UP!" << endl;
        else if (GetAsyncKeyState(VK_DOWN))
            cout << "You pressed DOWN!" << endl;
    }

    glutTimerFunc(1000/60, update, 0);
    glutPostRedisplay();
}

timer.h

#pragma once
#include <time.h>

class Timer
{
public:
    Timer();
    bool    checkTime(double limit = 0.33);
private:
    double  getElapsed();
    time_t  start;
    time_t  now;
    double  elapsed;
    bool    running;
};

timer.cpp

#include "timer.h"

Timer::Timer()
{
    running = false;
}

bool Timer::checkTime(double limit)
{
    elapsed = getElapsed();

    if (elapsed < limit)
    {
        return false;
    }
    else // elapsed >= limit
    {
        running = false;
        return true;
    }
}

double Timer::getElapsed()
{
    if (! running)
    {
        time(&start);
        running = true;
        return 0.00;
    }
    else
    {
        time(&now);
        return difftime(now, start);
    }
}
  • Since you asked for advice/comments, you might want to consider a newer alternative to GLUT. At the very least, take a look at freeglut (if you aren't using it already). Also, you are using OpenGL 3+, right? – fintelia Jul 20 '14 at 04:27
  • I'm not even sure. I started with GLUT when I found a Pong tutorial that used it. No version information was included. I didn't know anything about graphics programming, and I just wanted to get started. I'll look into this, though. Do you have any suggestions for alternatives? –  Jul 23 '14 at 00:20
  • [GLFW](http://www.glfw.org/) seems to be a good alternative. It even exposes an [API](http://www.glfw.org/docs/latest/group__time.html) to access the current time. – fintelia Jul 23 '14 at 18:22
  • I'll check that one out, thanks! –  Jul 24 '14 at 16:56

2 Answers


Every 1000/60 milliseconds the glutTimer fires and you call Timer::checkTime, which calls getElapsed. You have all your time functions defined in terms of double, but you use time_t, which has a resolution of only one second.

Therefore, you get something that looks like this (simulated numbers):

start time: 1234.5 seconds
glutTimer: 1234.516 seconds  (getElapsed = 0) // checkTime returns false
glutTimer: 1234.532 seconds  (getElapsed = 0)
...
glutTimer: 1234.596 seconds (getElapsed = 0)
glutTimer: 1235.012 seconds (getElapsed = 1)  // condition finally returns true
...

So, the actual delay depends on when you set your start time relative to the start of a whole second since the epoch used by time().

I suspect it averages close to 0.5 seconds if you statistically measure it.

Answering question about "resolution":

Different functions for returning the current time return different levels of accuracy. For example, right now, the clock in the bottom right of my screen reads "12:14 PM". Is that exactly 12:14 and no seconds, or 12:14 and 59 seconds? I can't tell, because the "resolution" of the clock display is one minute. Similarly, I might say it's a quarter past 12 when it is really 14 minutes past the hour, if I report the time at a resolution of a quarter of an hour. As humans we do this all the time without thinking about it. In software you have to be conscious of these details for any function you call.
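
To make that concrete, here is a minimal sketch, assuming a C++11 compiler: it samples time() and std::chrono::steady_clock over the same ~100 ms interval, so the one-second clock usually reports 0 elapsed seconds while the high-resolution clock reports about 0.1.

#include <chrono>
#include <ctime>
#include <iostream>

int main()
{
    time_t t0 = time(nullptr);                    // one-second resolution
    auto   c0 = std::chrono::steady_clock::now(); // typically sub-microsecond

    // Busy-wait ~100 ms so both clocks are sampled over the same interval.
    while (std::chrono::steady_clock::now() - c0 < std::chrono::milliseconds(100))
        ;

    std::cout << "difftime:     " << difftime(time(nullptr), t0) << " s\n"; // 0 or 1
    std::cout << "steady_clock: "
              << std::chrono::duration<double>(std::chrono::steady_clock::now() - c0).count()
              << " s\n";                                                    // ~0.1
}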

If you are on Windows, there is a high-resolution timer available through the QueryPerformanceCounter APIs. On most platforms, the Performance Counter is hardware based and has a resolution in the micro-second range.

Here's an example of calling it: http://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx#examples_for_acquiring_time_stamps

#include <windows.h>

LARGE_INTEGER StartingTime, EndingTime, ElapsedMicroseconds;
LARGE_INTEGER Frequency;

QueryPerformanceFrequency(&Frequency);  // get number of ticks per second
QueryPerformanceCounter(&StartingTime); // get starting # of ticks

// Activity to be timed

QueryPerformanceCounter(&EndingTime);   // get ending # of ticks
ElapsedMicroseconds.QuadPart = EndingTime.QuadPart - StartingTime.QuadPart;

//
// We now have the elapsed number of ticks, along with the
// number of ticks-per-second. We use these values
// to convert to the number of elapsed microseconds.
// To guard against loss-of-precision, we convert
// to microseconds *before* dividing by ticks-per-second.
//

ElapsedMicroseconds.QuadPart *= 1000000;
ElapsedMicroseconds.QuadPart /= Frequency.QuadPart;

There may be a similar facility on Linux, but I am not familiar with it.
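
For reference, clock_gettime with CLOCK_MONOTONIC fills that role on Linux/POSIX systems; a rough sketch, assuming a POSIX target:

#include <time.h>

// Elapsed milliseconds between two CLOCK_MONOTONIC samples.
double elapsed_ms(const struct timespec *start, const struct timespec *end)
{
    return (end->tv_sec  - start->tv_sec)  * 1000.0
         + (end->tv_nsec - start->tv_nsec) / 1000000.0;
}

// Usage:
//   struct timespec t0, t1;
//   clock_gettime(CLOCK_MONOTONIC, &t0);
//   /* activity to be timed */
//   clock_gettime(CLOCK_MONOTONIC, &t1);
//   double ms = elapsed_ms(&t0, &t1);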

Try this:

void update(int value)
{
    if (timer.hasTicked())
    {
        if (GetAsyncKeyState(VK_LEFT))
            cout << "You pressed LEFT!" << endl;
        else if (GetAsyncKeyState(VK_RIGHT))
            cout << "You pressed RIGHT!" << endl;
        else if (GetAsyncKeyState(VK_UP))
            cout << "You pressed UP!" << endl;
        else if (GetAsyncKeyState(VK_DOWN))
            cout << "You pressed DOWN!" << endl;
    }
    else if (!timer.isRunning())
    {
       timer.start();
    }

    glutTimerFunc(1000/60, update, 0);
    glutPostRedisplay();
}

timer.h

#pragma once
#include <windows.h> // LARGE_INTEGER, QueryPerformanceCounter/Frequency

// This class provides a timer that can be polled, letting the user tell
// whether a period has elapsed.
// Note that this timer does NOT throw any events on timeout.
class PollTimer
{
public:
    PollTimer();

    // Assuming the time limit is a number of msec that fits within a normal
    // integer rather than the 64-bit variant (that would be a LONG LONG delay).
    void    setTimeout(int msDelay);

    // Timers generally expose start/stop, and it's generally not a good idea
    // to make a single function that overloads complex combinations of
    // behaviors, as checkTime() did with both the start and elapsed
    // operations. Admittedly this is a simple case, but it's a bad design
    // pattern that leads to "evil".
    void    start();
    void    stop();
    bool    isRunning();

    // Has the timer expired since the last poll?
    bool    hasTicked();

private:
    LARGE_INTEGER startTime;
    LARGE_INTEGER frequency; // ticks per second
    int     delay;           // in milliseconds
    bool    running;
};

timer.cpp

#include "timer.h"

PollTimer::PollTimer()
{
    // Omitting the error check for hardware that doesn't support this.
    QueryPerformanceFrequency(&frequency); // get number of ticks per second
    delay = 0;       // caller must set a real timeout via setTimeout()
    running = false;
}

void PollTimer::setTimeout(int msDelay)
{
     delay = msDelay;
}

void PollTimer::start()
{
    QueryPerformanceCounter(&startTime);
    running = true;
}

void PollTimer::stop()
{
    running = false;
}

bool PollTimer::isRunning()
{
    return running;
}

bool PollTimer::hasTicked()
{
    if (!running)
        return false;

    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);

    // Raw ticks since start(); convert to microseconds *before* dividing
    // by the frequency to guard against loss of precision.
    LARGE_INTEGER elapsed;
    elapsed.QuadPart = now.QuadPart - startTime.QuadPart;
    elapsed.QuadPart *= 1000000;
    elapsed.QuadPart /= frequency.QuadPart; // now microseconds
    elapsed.QuadPart /= 1000;               // now milliseconds

    // Compare the full 64-bit millisecond count against the delay.
    bool fExpired = (elapsed.QuadPart >= delay);
    if (fExpired)
    {
        // Reset the start time: don't copy/paste code you can call.
        start();
    }
    return fExpired;
}
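
Note that update() above never sets the delay, so the timeout has to be configured before the timer is first polled. A minimal setup sketch (the init() function and the 330 ms value, matching the question's 0.33 s limit, are assumptions):

PollTimer timer; // global, polled from update()

void init()
{
    timer.setTimeout(330); // 0.33 s between accepted inputs
    timer.start();
}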
Brad
  • Thanks for taking the time to answer! I'm not sure I'm following you though. What exactly does resolution mean in this context? –  Jul 20 '14 at 04:20
  • Edited answer above to answer this question – Brad Jul 20 '14 at 19:29
  • Okay, I understand the issue now. But I'm not sure how to use QueryPerformanceCounter with my code. How can I use this with the limit variable? I tried casting the LARGE_INTEGER for elapsed time to a double, so that checkTime() could use it with the limit variable. This caused it to seemingly never reach the time limit. I tested what the double elapsed was returning from getElapsed() to checkTime(), and it returns something like -7.17978e+009, the numbers never changing while the program is looping. –  Jul 23 '14 at 00:06
  • See my edit. (You could also send the timeout to hasTicked, which may be better for you.) – Brad Jul 23 '14 at 00:41
  • I've got it working now! Thanks a lot! But I'm curious, what exactly is a timeout, and what do you mean by "throwing for a timeout"? What would the practical use for setTimeout() be? –  Jul 24 '14 at 17:00
  • By "timeout" I just mean the timer has expired or reached its desired delay. By "throwing" I mean most event-driven programming models (e.g. Windows APIs, the Qt library, web pages) have timer objects that fire (or throw) an event when a timer expires, so you write your code in terms of handling events, kinda like this: while(1) { getNextEvent(); handleEvent(); } where there is an event handler for timers that expire. This timer requires that it be "polled" (regularly checked) for expiration instead of throwing events. – Brad Jul 25 '14 at 00:18

time_t

Arithmetic type capable of representing times.

Although not defined, this is almost always an integral value holding the number of seconds (not counting leap seconds) since 00:00, Jan 1 1970 UTC, corresponding to POSIX time.

(According to cppreference.com)

Since time_t typically holds an integer number of seconds, difftime can't have a precision greater than one second even though it returns a double. Thus any interval shorter than one second has to wait until time() reports that a whole second has elapsed. This might lead you to expect a full one-second delay. However, when the start time is 123.999 seconds it gets reported as 123 seconds*, so a millisecond later it appears that the full second has elapsed. Other times you do have to wait the full second. On average you would expect to observe a wait of about 0.5 seconds.

*The rounding rules might be different, but it really doesn't matter where the cutoff point is, since the net result is the same.
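
A minimal sketch that reproduces the effect, assuming a C++11 compiler: it polls difftime the same way the question's checkTime() does, and the measured wait lands anywhere between ~0 and ~1 s, never at 0.33 s.

#include <chrono>
#include <ctime>
#include <iostream>

int main()
{
    time_t start = time(nullptr);                     // the question's timer
    auto   ref   = std::chrono::steady_clock::now();  // high-resolution reference

    // Poll exactly like checkTime(limit = 0.33) does.
    while (difftime(time(nullptr), start) < 0.33)
        ;

    // difftime only jumps in whole seconds, so the real wait varies
    // with where inside a second the loop started.
    std::chrono::duration<double> waited = std::chrono::steady_clock::now() - ref;
    std::cout << "waited " << waited.count() << " s\n";
}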

fintelia
  • That makes sense, except for when I tried to use a number over 1 for my time limit. I had tried 5 seconds, but it still would display input approximately every 0.5 seconds. Shouldn't it happen somewhere between every 4 and 5 seconds? –  Jul 23 '14 at 00:10
  • That is very odd. Especially since 0.5 seconds is *more* frequent than you were getting before... – fintelia Jul 23 '14 at 18:14