>20%
16.67%, actually (NTSC games are 20% faster than PAL games, which makes PAL games 16.67% slower than NTSC ones).
>You can't reliably get vsync (and you generally can't get it at all, in windowed mode)[...]
Vsync works in windowed mode with opengl on nvidia and ati cards. But I've often seen a fixed tear line in the upper third of the screen with ati setups. I don't really know the reason for that. I guess you mixed that up with java2d a bit. Well, it's of course true that you cannot tell whether enabling (trying to enable) vsync actually worked. With nvidia cards the current state can be queried, but that's the only exception. Other vendors don't think it's necessary to follow the spec in that regard. Ah well... backing it up with a frame cap doesn't hurt anyway, and it's certainly more robust.
edit: Oh, and even if you don't port it over to consoles, you still might get some PAL drag. I'm using an ati card right now, and if tv-out and vsync are enabled I cannot exceed 50fps (the Hz of the primary display doesn't make a difference). Pretty annoying, isn't it?
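For what it's worth, the frame-cap fallback mentioned above can be as simple as sleeping until the next frame boundary of a high-resolution timer. A minimal sketch (class and method names are mine, not from any particular library):

```java
// Sketch of a frame-rate cap backing up vsync: sleep until the next
// 1/hz-second boundary, measured with the hires timer System.nanoTime().
public class FrameCap {
    private final long frameNanos;
    private long nextFrame;

    public FrameCap(int hz) {
        frameNanos = 1_000_000_000L / hz;
        nextFrame = System.nanoTime() + frameNanos;
    }

    // Call once per frame, after rendering: blocks until the next boundary.
    public void sync() {
        long remaining = nextFrame - System.nanoTime();
        if (remaining > 0) {
            try {
                Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        nextFrame += frameNanos;
    }
}
```

This way the game never runs faster than the target rate even when the driver silently ignores the vsync request.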
Pedant Actually I've never used Java2D for anything other than GUIs so I don't have a clue how it works. But even nvidia cards lie about whether vsync is enabled or not because the user can force it off in the control panel. A lot of hardcore 3d FPSers do that for some unknown reason, but the end result is a game that runs at 1000fps instead of 60 which renders it unplayable. Cas
Better let the folks at Epic know they're doing it all wrong with Unreal Engine 3 then. And pretty much every other retail game I know of. I'm pretty sure Dead or Alive uses Cas' method though. Seeing as you can't play the game over here unless you have a tv that will do 60hz.
Fortunately 99.99% of TVs do 60Hz automatically so they can sell them in other markets without adjusting the design. It all goes out the window when you start doing things in 3D or over a network of course. But a great place to start is a 60Hz logic loop and 60Hz render loop Cas
>Actually I've never used Java2D for anything other than GUIs so I don't have a clue how it works.
Lies. Wasn't the first incarnation of Puppytron a j2d game? I remember your rantings about its random "no sir, I won't accelerate that" behaviour. And I also remember that you chimed in in one of those j2d/windowed/vsync hackery threads, something about some odd dx call for that kind of thing. (Well, it's like 3 years ago... excuse the lack of details.)
>But even nvidia cards lie about whether vsync is enabled or not because the user can force it off in the control panel.
Last time I checked (~2 years ago) it worked perfectly fine with all 4 settings. Setting the swap interval to 1 and then obtaining the current value gave me:
always on - 1
on by default - 1
off by default - 1
always off - 0
And setting it to 0 and getting the current value gave me:
always on - 1
on by default - 0
off by default - 0
always off - 0
The same with an ati card always resulted in 0.
>Fortunately 99.99% of TVs do 60Hz automatically[...]
So far I haven't seen a TV which can't do pal60 (aka PAL-60/525 aka PAL-M). However, with tv-out it's a different thing. My card for example can't do that (it's a shame really), but there are some indicators (ati tray tools) that this isn't generally true for ati cards.
Since Unreal Engine 3 (presumably) has network play support, it probably uses method #2 or #4. (Either that, or it has a problem with network games becoming desynchronised. Or a pure client-server architecture for network games, but this seems unlikely due to the latency involved.) Nonetheless, if I had a refund for every timing bug I encountered in a commercial game due to use of #3, I would have significantly more money than I do right now.
No. Unreal Engine 3 uses a delta method - trust me. The fact that some coders struggle with delta timing doesn't mean the method is bad. It's just bad coding - you can get timing bugs no matter which method you use if you're a bad coder. And another note: network games getting desynchronised? All network games are desynchronised now. Can't remember the last network game that required it to be in sync.
I think he means he can't remember the last FPS that was synced. Most RTSs are synced, so Tribal Trouble is not a surprise.
Delta-time is an inherently flawed technique. A good coder can work around a flawed design to some extent, but a competent coder will not choose a flawed design in the first place. "Desynchronised" in the sense of "the game state on one computer is (significantly) different from the game state on another computer"? People in a deathmatch getting kills on their local computer while their opponent successfully dodged the attack on their own computer (without retroactive correction)? I'm going to assume that you don't actually mean that.
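To make the "inherently flawed" point concrete: with variable timesteps, explicit Euler integration gives different results for the same amount of simulated time depending on how that time is sliced, so two machines (or two runs) with different frame rates drift apart. A minimal illustration, not taken from any engine:

```java
// Explicit Euler free-fall: integrate one second of gravity as `steps`
// equal slices. The final position depends on the slice size, which is
// exactly what happens when the frame rate varies under delta timing.
public class EulerDemo {
    static double fall(double totalTime, int steps) {
        double dt = totalTime / steps;
        double y = 0.0;       // position
        double v = 0.0;       // velocity
        final double g = -9.81;
        for (int i = 0; i < steps; i++) {
            v += g * dt;
            y += v * dt;
        }
        return y;
    }
}
```

One big 1-second step lands in a very different place than a thousand 1 ms steps, even though the simulated time is identical.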
You keep saying it's inherently flawed, yet the majority of retail games use this technique. Cutting-edge development like Unreal Engine 3 does - if it's so wrong then why haven't they dropped it? Why are they writing one of the most advanced game engines, which they'll license for a truckload of cash incidentally, using a technique which you say won't work? You said a problem with the delta system is network games becoming desynchronised. Network games aren't really synced any more; everyone sees something slightly different. So why would a delta system not work? We're past the stage where you send keypresses to each of the machines playing.
I suspect most network games have a fixed server update rate with a variable client update rate. Obviously there must be sync problems involved, but it looks like they get taken care of without anyone noticing.
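One common shape for that arrangement (a sketch of the general idea, not any particular engine's code): the server simulates at a fixed tick rate, and the client, rendering at whatever rate it manages, interpolates between the two most recent server snapshots so nobody notices the mismatch:

```java
// Client-side interpolation between two server snapshots. All names are
// illustrative; tickNanos would be the server's fixed tick length.
public class Interpolation {
    // Blend factor alpha in [0,1]: 0 = previous snapshot, 1 = latest.
    static double lerp(double prev, double next, double alpha) {
        return prev + (next - prev) * alpha;
    }

    // Where the client should draw an entity's x position, given the time
    // elapsed since the previous server tick arrived.
    static double renderX(double prevX, double nextX,
                          long nanosSincePrevTick, long tickNanos) {
        double alpha = Math.min(1.0, (double) nanosSincePrevTick / tickNanos);
        return lerp(prevX, nextX, alpha);
    }
}
```

Halfway between two 50 ms ticks the client draws the entity halfway between the two snapshot positions, and it simply clamps at the latest snapshot if the next one is late.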
Basically your argument amounts to "everybody is doing it", while failing to address my technical arguments at all. Large game studios do all kinds of stupid things, and commercial games regularly ship with serious bugs. Even mainstream software development is generally ahead of the game industry in terms of development practices. It clearly does "work" in the sense that you can write a more or less playable game despite the flaws of the technique. I hold software engineering to a higher standard than that.
Rainer, people have addressed some of your points already in this thread. It's great that you strive for this higher standard of software engineering, but you can really only say that if you are in a tiny sandbox where the simulation is all that matters. Soaking up CPU time to the point of inefficiency and/or buffering a crapload of frames could be seen as "fundamentally flawed" in the right context. I.e. everyone is making compromises, even you. Since it sounds like you are obviously more experienced than I am with this sort of thing (I'm serious; not being patronizing or sarcastic), can you give your opinion on the method PeterM/Fabio suggested in this thread? It sounds like the way to go, given my needs. I welcome any caveat warnings.
For BreakQuest I used #2 for physics/collisions, #3 where possible - for example all the particle code (also used integer math here) - and finally a low-frequency timer (10 fps) for game events that don't need more than that. Objects can be dynamically added to/removed from a bunch of lists and are processed as required (preFrame, postFrame, highFrequency, lowFrequency, ...). I'm sticking with this scheme.
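A rough sketch of what such a multi-rate scheme might look like (all names invented here; BreakQuest's actual code surely differs) - objects register in per-phase lists, and the main loop visits each list at its own rate:

```java
import java.util.ArrayList;
import java.util.List;

// Objects live in phase lists; the loop runs preFrame/postFrame work every
// frame and the lowFrequency list only every N frames (e.g. 10 Hz at 60 fps).
public class Scheduler {
    interface Obj { void process(); }

    final List<Obj> preFrame = new ArrayList<>();
    final List<Obj> postFrame = new ArrayList<>();
    final List<Obj> lowFrequency = new ArrayList<>();
    private int frame = 0;

    void runFrame(int framesPerLowTick) {
        for (Obj o : preFrame) o.process();
        // ...physics/collision step and rendering would happen here...
        for (Obj o : postFrame) o.process();
        if (frame % framesPerLowTick == 0) {
            for (Obj o : lowFrequency) o.process();
        }
        frame++;
    }
}
```

Adding or removing an object is then just a list operation on the phase(s) it cares about.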
Perhaps if we merged this thread with "Who is who" it would gain authoritativeness. ;^) I am... so I claim... "Ouch!"
I'm interested in using either #1 or #2 for my current game. It's 2D, and I hate all this delta time nonsense. I'm quite sure #2 would run very smoothly; I had planned on updating my game physics in 1ms increments, so I estimate anywhere between 10 and 17 game physics updates for every render. So for each game render the number of times I call the game physics would be something like (in the case of a 60Hz screen refresh): 16,17,17,16,17,17,16,17,17,17,16,17,17,16,17,17, .... Because of the small (1ms) time slice for each physics update, the movement won't look choppy. I had also thought about doing some (physics-related) things only once every 4 or 8 calls to the physics update function, and the calculation of which animation frame to render could be done once per frame update (render). I'm also curious about option #1... What's your experience been with #1? If the player has his monitor running at 80Hz, say, would the game just run a bit faster? If it was set at 120Hz, would it run twice as fast? Or am I missing something? For me, this is the core issue with #1: getting my game to run at the same speed no matter what the monitor frequency. Also, what's the reliability of forcing the monitor frequency to 60Hz? The issue of Windows freezing up or stalling doesn't bother me too much. I wouldn't want to set up a situation where my game stops rendering for, let's say, one second (while Windows 'hiccups'), and then the action suddenly jumps forward one second, with the player having no control or interaction with the game for that one second. That's just ugly. I would prefer the game to just halt for a second while Windows gets its act back together again. Also, if your game goes on Xbox Live Arcade, would there be a problem with using #1? Thanks.
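The 1 ms scheme described above is usually written with a time accumulator, and the 16/17 pattern falls out of it naturally. A minimal sketch of that part of the loop (names are mine):

```java
// Fixed 1 ms physics steps driven by an accumulator (method #2): each
// frame's real elapsed time is banked, and whole 1 ms steps are drained off.
public class FixedStepLoop {
    static final long STEP_NANOS = 1_000_000L; // 1 ms physics step
    private long accumulator = 0;

    // Returns how many physics steps to run for a frame of the given length;
    // the fractional remainder carries over to the next frame.
    public int stepsForFrame(long frameNanos) {
        accumulator += frameNanos;
        int steps = 0;
        while (accumulator >= STEP_NANOS) {
            accumulator -= STEP_NANOS;
            steps++; // updatePhysics() would be called here
        }
        return steps;
    }
}
```

Feeding it 1/60-second frames gives runs of 16 and 17 steps, exactly the alternation sketched above, because the leftover fraction of a millisecond is never thrown away.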
I've used various strategies over the years, including just using the vsync on consoles to provide timing which is fine if you can guarantee your code will run in a frame. But for general work I use delta timing. As long as you are sensible, the results for most game code will be more than sufficient. All game coding is about compromise anyway, there seems little point in getting on your high horse about one aspect of it. For the most part in games if it looks right, that is usually good enough.
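The "as long as you are sensible" part usually means clamping the delta, so that a one-second stall can't turn into one huge physics step. A sketch of that precaution (the cap and units are illustrative, not from any particular codebase):

```java
// Delta-time movement with a clamped delta: a long hiccup is treated as a
// single capped step instead of teleporting the object across the level.
public class Mover {
    static final float MAX_DELTA = 0.1f; // cap steps at 100 ms
    float x = 0f;
    float speed = 100f; // units per second

    void update(float deltaSeconds) {
        float dt = Math.min(deltaSeconds, MAX_DELTA);
        x += speed * dt;
    }
}
```

With a 16 ms frame the object moves the expected 1.6 units; after a 5-second stall it only moves the capped 10 units, and the game effectively pauses instead of lurching.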
Been using #1 since the dawn of time. The monitor can nearly always be switched to 60Hz. I ask afterwards whether the monitor is indeed running at 60Hz, and if it says yes, I turn vsync on as well. Lastly I cap the frame rate with a hires timer at 60Hz either way, which means drivers can't lie to me about the monitor rate and get away with it. The great bit about #1 is that a game loop looks like this:
Code:
while (!finished) {
    for (Entity e : entities) {
        e.tick();
    }
    for (Entity e : entities) {
        e.render();
    }
    swapBuffers();
    syncTo60Hz();
}
No crazy floats, no deltas, no worries. Like I say, I'm an idiot and can't cope with complicated code when I just want some sprites to whiz around in a reliable sort of way. Cas