Timing Code: why your game is broken

Discussion in 'Game Development (Technical)' started by princec, Mar 10, 2005.

  1. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    My laptop, which is about a year or so old now, has a 3.2GHz HT P4M SpeedStepped chip in it. It's not a particularly uncommon processor, and SpeedStepping and Hyperthreading are being shipped on most new machines now.

    Every single game I have played on it is broken, even my own games, which has made developing them somewhat tricky. I didn't really know how to fix it until a couple of days ago, and I'm exceedingly pleased with the results, because now my games are the only ones that run correctly on it; even Half-Life 2, with its multimillion-dollar budget, doesn't run right on it because of this problem.

    The problem is this:

    If you are using the Windows hires timer (QueryPerformanceCounter) to measure time, it gives pretty much random results on SpeedStep, Hyperthreaded, or dual-CPU machines (and in some other circumstances on older machines, but that's not really an issue). Even if you are using vsync you have to measure time with the hires timer, because vsync cannot be reliably determined to be working with anything but Nvidia drivers anyway.

    The problem manifests itself thus: 60fps gameplay for a few seconds... then suddenly 30fps for a few seconds... then back to 60fps... and so on. The whole experience is totally off-putting and ruins any game completely.

    I've fixed the problem by calibrating the hires timer with the O/S timer; it's not 100% perfect because it has to make a guess when the hires timer has gone awry and can only do this typically after the worst-case O/S resolution has passed (50ms, or about 3 frames). But it works brilliantly.
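
    The gist of it, as a rough C/C++ sketch (purely illustrative, not the actual code; the class name and the exact 50ms threshold are just placeholders):
    Code:
    // Cross-check the hires timer (QueryPerformanceCounter) against the
    // O/S timer (timeGetTime) and fall back when the hires delta has
    // clearly gone awry.
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime; link against winmm.lib
    #include <cmath>

    class SanityCheckedTimer {
    public:
        SanityCheckedTimer() {
            QueryPerformanceFrequency(&freq);
            QueryPerformanceCounter(&lastQpc);
            lastOs = timeGetTime();
        }

        // Elapsed milliseconds since the previous call.
        double elapsedMs() {
            LARGE_INTEGER qpc;
            QueryPerformanceCounter(&qpc);
            DWORD os = timeGetTime();

            double qpcMs = (qpc.QuadPart - lastQpc.QuadPart) * 1000.0 / freq.QuadPart;
            double osMs  = (double)(os - lastOs);

            lastQpc = qpc;
            lastOs  = os;

            // If the two disagree by more than the worst-case O/S timer
            // resolution (~50ms), the hires timer has probably jumped;
            // trust the O/S timer for this frame.
            return (std::fabs(qpcMs - osMs) > 50.0) ? osMs : qpcMs;
        }

    private:
        LARGE_INTEGER freq, lastQpc;
        DWORD lastOs;
    };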

    I urge you all to have a damned hard look at your timer code and get it fixed because the problem will only become more common as more and more CPUs get Speedstep and / or Hyperthreading and / or multicore.

    Cas :)
     
  2. dima

    Original Member

    Joined:
    Feb 7, 2005
    Messages:
    345
    Likes Received:
    0
    Thank you Cas.

    This definitely is a problem. I haven't personally had to deal with it before, but thanks for pointing it out. I use a regular timer, not a hi-res one, but I will keep an eye on this issue in future.
     
  3. Sharpfish

    Original Member

    Joined:
    Feb 25, 2005
    Messages:
    1,309
    Likes Received:
    0
    My laptop is an AMD64 (3400) and has a similar technology (clock throttling). The worst I have seen in games is that they can run too fast, because they calibrate against the initial 800MHz speed instead of the speed the CPU actually ramps up to while the game is running (around 2.2GHz). Speed Switch XP cures that for me.

    Not that this is exactly what you are talking about, but I have noticed it is something to take into account when coding. I use QueryPerformanceCounter (hi-res) and will have to have a good look through all the code and make sure it is tested on various CPUs before I can consider it robust.

    thanks for the info.
     
  4. luggage

    Moderator Original Member Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    2,132
    Likes Received:
    0
    That's quite odd. We just use timeGetTime - would that still be affected?

    Scott
     
  5. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    I'll tell you when I get home and try your games out on my laptop :)

    Further reading here.

    Cas :)
     
  6. Jim Buck

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    1,158
    Likes Received:
    0
    I seem to recall timeGetTime being the recommended fallback if the hi-res counters go screwy.
     
  7. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    S'rite. Unfortunately almost no-one does it, it seems, to my great disappointment :(

    Cas :)
     
  8. Hamumu

    Indie Author

    Joined:
    Jul 26, 2004
    Messages:
    557
    Likes Received:
    0
    I've always used timeGetTime, and I'm glad to hear that my cluelessness is a virtue!
     
  9. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    Ah, but Hamumu games aren't really renowned for being silky smooth, are they? :D

    Cas :)
     
  10. Hamumu

    Indie Author

    Joined:
    Jul 26, 2004
    Messages:
    557
    Likes Received:
    0
    Now you have me concerned... I haven't gotten any complaints. Is there something I don't know? They seem silky smooth to me (one issue: vsync doesn't work, in Supreme at least, but that's a separate problem).
     
  11. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    Don't worry about it; I think your target customers probably aren't used to silky smooth 60fps games anyway. I loved Supreme with Cheese just as it was.

    Cas :)
     
  12. Jim Buck

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    1,158
    Likes Received:
    0
    In something I was messing around with a couple of years ago, I was also initially using timeGetTime. I was using it to calculate velocity and acceleration as I moved the mouse around, and the values were not smooth at all - they jumped around a lot. With the hi-res timers I got what I had originally expected: very nicely smoothed velocity and acceleration curves. That's why timeGetTime is best kept as a worst-case fallback.

    Having said that, Quake2 uses only timeGetTime. :)
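
    For what it's worth, the usual shape of that fallback looks something like this in C/C++ (a sketch only; the function name is illustrative):
    Code:
    // Prefer the hi-res counter; drop back to timeGetTime only if the
    // hardware/driver doesn't provide one.
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime; link against winmm.lib

    double nowMs() {
        static LARGE_INTEGER freq;
        static BOOL haveQpc = QueryPerformanceFrequency(&freq);

        if (haveQpc) {
            LARGE_INTEGER t;
            QueryPerformanceCounter(&t);
            return t.QuadPart * 1000.0 / freq.QuadPart;
        }
        return (double)timeGetTime();   // 1ms resolution at best
    }

    // Usage: only take differences between calls; never compare absolute
    // values from the two sources, since their origins differ.
    // double dt = nowMs() - lastMs;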
     
  13. dima

    Original Member

    Joined:
    Feb 7, 2005
    Messages:
    345
    Likes Received:
    0
    Having a high-res timer isn't required for silky smooth animation. timeGetTime() is accurate to a millisecond; hi-res timers go down to microseconds. If your game is running at 60 FPS, a hi-res timer won't really help much. It does return more accurate time readings, but it sometimes messes up, so it's more difficult to sync the timer to the clock. I personally don't think hi-res timers are needed for these games; unless your game is running faster than 1000 fps, timeGetTime() is more than sufficient. There are also ways to smooth out the delta-time results, by averaging or other methods.
     
  14. Sharpfish

    Original Member

    Joined:
    Feb 25, 2005
    Messages:
    1,309
    Likes Received:
    0
    The code I use asks for a performance counter; if one is not available, it falls back on the old methods. I thought that was standard practice?

    I do have *some* trouble with vsync issues and jerky motion with time-based modeling on one of my (older) systems, though.
     
  15. dima

    Original Member

    Joined:
    Feb 7, 2005
    Messages:
    345
    Likes Received:
    0
    This is probably a common issue with time-based modeling. You use delta time to scale all the speed and animation variables, but sometimes the delta time fluctuates too much from frame to frame. Say the game runs at 100fps, so dt = 10ms; the next frame it might be 16ms, then 9ms, then 30ms. This is usually the case, especially in windowed mode, because other processes are running in the background and no two frames take exactly the same time to render, so the delta time changes drastically every frame.

    There are ways to smooth out this problem, however, without using hi-res timers. One is to average out the dt: accumulate the dt values in an array of 30 or so, average them, and use the average to scale the animation. This way you have a smooth number that only changes when there is a big shift in FPS. Not the best way, but it works well.
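
    A rough C/C++ sketch of that averaging (the buffer of 30 matches the suggestion above; the names are illustrative):
    Code:
    // Smooth the per-frame delta time with a simple moving average over
    // the last N frames.
    class DeltaSmoother {
    public:
        DeltaSmoother() : total(0.0), index(0), filled(0) {
            for (int i = 0; i < N; ++i) samples[i] = 0.0;
        }

        // Feed in the raw delta (ms) each frame; get back the average.
        double update(double rawDt) {
            total -= samples[index];     // drop the oldest sample
            samples[index] = rawDt;      // store the newest
            total += rawDt;
            index = (index + 1) % N;
            if (filled < N) ++filled;    // only average what we have so far
            return total / filled;
        }

    private:
        static const int N = 30;
        double samples[N];               // ring buffer of recent deltas
        double total;
        int index;
        int filled;
    };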
     
  16. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    Timing issues on Windows really are a mess, but there must be some games out there that run OK on your laptop, because I've heard of the method of using timeGetTime/GetTickCount to recalibrate the hi-res timer before.

    Charles Bloom and Jon Watte, specifically, have brought up this issue a few times that I've heard of.

    Charles has a pretty good timer class here:

    http://www.cbloom.com/misc/

    It uses GetTickCount for the calibration, not timeGetTime. Generally timeGetTime is more accurate, but for the purposes of just calibrating the highres timer, it may not be worth it (GetTickCount generally gives results faster, even if they might be less accurate).

    For those of you just using timeGetTime, you may want to at least look into timeBeginPeriod and use it to set the timer to as good a resolution as possible, since the default differs between versions of Windows and is really crappy on some systems by default (pre-2000 Windows NT, so maybe nobody here cares). On those Windows NT systems timeGetTime has a 5 ms resolution, so if you call it 5 times in 5 ms you could get the same return value every time. timeBeginPeriod can usually override that and get it down to a 1 ms resolution.
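
    A minimal C/C++ sketch of that (1 ms is the usual request; error handling kept to the bare minimum):
    Code:
    // Request 1ms resolution from the multimedia timer for the lifetime
    // of the game, and give it back on the way out.
    #include <windows.h>
    #include <mmsystem.h>   // link against winmm.lib

    int main() {
        // Ask for the finest resolution; some systems may refuse.
        bool gotTimer = (timeBeginPeriod(1) == TIMERR_NOERROR);

        // ... run the game loop, timing frames with timeGetTime() ...

        if (gotTimer)
            timeEndPeriod(1);   // must match every successful timeBeginPeriod
        return 0;
    }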
     
  17. Pyabo

    Original Member

    Joined:
    Jul 27, 2004
    Messages:
    1,315
    Likes Received:
    0
    Sounds like an operating system issue to me. :p
     
  18. Larry Hastings

    Original Member

    Joined:
    Jul 27, 2004
    Messages:
    213
    Likes Received:
    0
    ltimer

    Howdy. I realize this thread is a bit dead, but this is my first time reading the Indie Gamer boards after a vacation of several months. And it wasn't very dead, and I have something to say, darn it!

    I wrote a reasonably clever timer class for Win32 a couple years ago: ltimer. You can find it here.

    ltimer currently uses a two-stage process. When your program first starts up, ltimer runs in "calibration" mode: it writes down the current QPC() and RDTSC values. If you ask it for the current time, it'll compute it based on QPC(). Then, when two seconds have elapsed (according to QPC() and its frequency as reported by QueryPerformanceFrequency()), it switches to using RDTSC.

    Though, truth be told, my game actually uses ltimer in "safe" mode, where it's really just a wrapper around QPC(). FUD about SpeedStep is what made me switch. On the other hand, QPC() isn't perfect either. See this hoary old ghost story, the ominously-titled Microsoft Knowledge Base article Q274323. (And, on the handle... was a hook!)

    Since then, I've been meaning to rewrite ltimer to handle all these problems. My current thinking: after the "calibration" mode, compute the current time using RDTSC, but sanity-check it against QPC(). If they fall out of sync, use a third timer (GetTickCount() or timeGetTime()) to decide which of the two is more likely to be right; whichever it is, advance ltimer's return value by that amount and then keep both new values.

    Mr. C, do you think that approach would work well on your smokin' hot new PC?

    Edit: An additional thought for my rewrite of ltimer.

    Once per second (or, perhaps, per user-defined interval), recompute the cycles/sec for RDTSC, and if it changes "drastically" (defined by some tunable percentage), it's presumably because SpeedStep kicked in. So "recalibrate" it from the QPC() value. You don't have to worry about QPC() and RDTSC failing at the same time because QPC() only fails on specific chipsets when under heavy PCI load. (My caution in ltimer about recalibrating being a "bad idea" is because of drift, but SpeedStep changes in CPU speed would be much bigger and more obvious than that. A change of +-10%, say, could not be caused by "drift".)
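
    Roughly what that periodic check could look like in C/C++ (a sketch only, using the MSVC __rdtsc intrinsic; the 10% threshold stands in for the tunable percentage):
    Code:
    // Once per second, re-estimate RDTSC cycles/sec against QPC() and
    // recalibrate if it jumps by more than a tunable percentage.
    #include <windows.h>
    #include <intrin.h>     // __rdtsc (MSVC)
    #include <cmath>

    struct TscCalibration {
        double cyclesPerSec;
        unsigned __int64 lastTsc;
        LARGE_INTEGER lastQpc;
    };

    // Seed 'cal' with an initial QPC/TSC reading and cycles/sec estimate,
    // then call this about once per second. Returns true on recalibration.
    bool recheck(TscCalibration& cal, const LARGE_INTEGER& qpcFreq) {
        LARGE_INTEGER qpc;
        QueryPerformanceCounter(&qpc);
        unsigned __int64 tsc = __rdtsc();

        double secs = (qpc.QuadPart - cal.lastQpc.QuadPart) / (double)qpcFreq.QuadPart;
        double measured = (tsc - cal.lastTsc) / secs;   // cycles/sec over the interval

        cal.lastTsc = tsc;
        cal.lastQpc = qpc;

        // Ordinary drift is a tiny fraction of a percent; a jump of 10% or
        // more presumably means SpeedStep changed the clock speed.
        if (std::fabs(measured - cal.cyclesPerSec) > 0.10 * cal.cyclesPerSec) {
            cal.cyclesPerSec = measured;
            return true;
        }
        return false;
    }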

    This would mean that when SpeedStep screwed with the CPU speed, you'd only see the game behave badly for half a second on average. Mr. C, does your PC flit in and out of full-speed on a constant basis, perhaps even many times per second, or is it a more large-scale thing where it might slow down for five seconds then speed up again?
     
    #18 Larry Hastings, Apr 3, 2005
    Last edited: Apr 3, 2005
  19. princec

    Indie Author

    Joined:
    Jul 27, 2004
    Messages:
    4,873
    Likes Received:
    0
    It slows for 5 secs, then back to normal, ad infinitum. Now here's the weird part: I changed two things and it's right as rain now. Firstly, I just use timeGetTime, as advised by the master of cheap hackery Mike Hommel ;) Secondly, I changed my game loop delay timer from:
    Code:
    while (time < frameTime) {
        time = Sys.getTime();
        Thread.yield();
    }
    
    which busy loops on a yield() to this
    Code:
    while (time < frameTime) {
        time = Sys.getTime();
        Thread.sleep(1);
    }
    
    The second version is not 100% accurate per frame compared to QPC but over a few frames it's pretty much spot on. In truth I don't know how it fares on DOS boxes yet but seeing as DOS boxes don't generally run HT chips I expect I can safely check for that situation and use the old code again.

    Cas :)
     
  20. 20thCenturyBoy

    Original Member

    Joined:
    Sep 23, 2004
    Messages:
    178
    Likes Received:
    0
    My CPU is an AMD64 3000+, which has "cool 'n' quiet" technology. It needs a driver from AMD, but what it does is reduce the CPU clock and voltage from 1800MHz/1.4V to about 900MHz/1.0V while idling. The driver monitors CPU activity and bumps the CPU speed back up when it gets busy. This is instantaneous and never noticeable. In fact I play loads of games on this system and have never noticed any framerate issues, including with all of Cas's games. Is Intel SpeedStep different from this?
     
