This is on a quad-core, 8GB RAM, 1GB video card machine - it's no slouch - but the subtle difference was enough to make my task impossible.
It doesn't stop at gravity either. Virtually everything in the game world is moving via acceleration. If you were going to handle this consistently, you would need to apply the same correction to every single physics calculation for every single object in the game world.
You are correct that you would have to profile it to determine how much of an issue it would be, but it sounds like potentially a lot of deadweight to add to the game. Especially since, when games slow down, it is often due to having a lot of objects in the game world simultaneously.
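For concreteness, this is roughly what that per-calculation fix would mean for plain gravity (a minimal C++ sketch; the Body struct and the numbers are just illustration). Plain Euler gives a position that depends on the step size, and the corrected version needs an equivalent closed form for every other accelerated quantity in the game:

```cpp
#include <cstdio>

struct Body { double p = 0.0, v = 0.0; };  // position, velocity
const double g = -9.8;                     // constant gravity

// Semi-implicit Euler: the result depends on the step size.
void euler(Body& b, double dt) {
    b.v += g * dt;
    b.p += b.v * dt;
}

// Adding the 0.5*g*dt^2 term is exact for constant acceleration,
// so the result no longer depends on the step size.
void corrected(Body& b, double dt) {
    b.p += b.v * dt + 0.5 * g * dt * dt;
    b.v += g * dt;
}

int main() {
    Body a, b, c;
    for (int i = 0; i < 60; ++i) euler(a, 1.0 / 60.0);      // 1 s in 60 steps
    for (int i = 0; i < 30; ++i) euler(b, 1.0 / 30.0);      // 1 s in 30 steps
    for (int i = 0; i < 30; ++i) corrected(c, 1.0 / 30.0);  // 1 s, corrected
    printf("euler@60: %f  euler@30: %f  corrected: %f\n", a.p, b.p, c.p);
}
```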
Unless your game is running like garbage this will not matter. You are talking about a dt of something like 1/30 or 1/60 of a second.
Also, it's very important to people's perception of performance to maintain a consistent frame rate. The variations are much more noticeable than the absolute rate. Finally, on lots of hardware you can't actually display at, say, 59Hz. You'll either be at 60 or 30, and will oscillate between the two in a most annoying way.
You are already storing all these values in floats, which are inherently inexact. Also, if the dt becomes very large you have bigger problems than gravity anyway.
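To put a number on that point, a tiny sketch of the drift floats already give you (assumed numbers; just accumulating a 1/60 s step in single precision for one simulated hour):

```cpp
#include <cstdio>

int main() {
    // 216000 ticks of 1/60 s should sum to exactly 3600 s, but both the
    // stored step and every addition are rounded, so the total drifts.
    float t = 0.0f;
    for (int i = 0; i < 216000; ++i)
        t += 1.0f / 60.0f;
    printf("accumulated: %f s (exact: 3600 s)\n", t);
}
```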
No, they don't.
Game logic should run at a constant 60 fps or more, even when the graphics framerate is lower: http://gafferongames.com/game-physics/fix-your-timestep/
But yes, it's an interesting article, and this method allows for a bit more accuracy.
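For anyone who hasn't read it, the core of the loop from that link looks roughly like this (a sketch; update(), render(), and the fixed frame count are stand-ins for real game code):

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;
const double dt = 1.0 / 60.0;  // fixed simulation step

void update(double step) { /* advance game logic by exactly `step` seconds */ }
void render(double alpha) { /* draw, interpolating by `alpha` in [0,1) */ }

int main() {
    double accumulator = 0.0;
    auto previous = Clock::now();
    for (int frame = 0; frame < 120; ++frame) {  // stand-in for the real loop
        auto current = Clock::now();
        accumulator += std::chrono::duration<double>(current - previous).count();
        previous = current;
        while (accumulator >= dt) {  // simulate in fixed steps, regardless
            update(dt);              // of how fast frames are rendered
            accumulator -= dt;
        }
        render(accumulator / dt);    // leftover fraction, for interpolation
    }
}
```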
And with a constant delta, if you choose it right, it can be easier to get collision detection right. You can possibly get away with checking whether two things are colliding with each other right this tick, instead of checking whether they might have run through each other between ticks. With a variable delta, if you want to get it right, you might have to check where things were half a frame ago and such, and at that point it might be easier to just check more often instead.
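Something like this, in other words (a 1D sketch with a made-up Box type): with a well-chosen fixed dt, the cheap per-tick overlap test is enough, and the second function shows the condition under which that stops being true:

```cpp
#include <cstdio>

struct Box { double min, max; };  // 1D boxes for brevity

// The cheap per-tick check a fixed, well-chosen dt lets you get away with.
bool overlaps(const Box& a, const Box& b) {
    return a.min < b.max && b.min < a.max;
}

// Tunneling becomes possible once an object covers more than the obstacle's
// size in one tick; that's when you need the "where was it half a frame
// ago" machinery instead.
bool mayTunnel(double speed, double dt, const Box& obstacle) {
    return speed * dt > (obstacle.max - obstacle.min);
}

int main() {
    Box wall{10.0, 10.5};
    printf("tunnel risk at 60 u/s, dt=1/60: %d\n",
           mayTunnel(60.0, 1.0 / 60.0, wall));  // 1.0 > 0.5 -> risk
}
```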
And I think you should probably ask the same question about variable deltaTime. Assuming that coupling rendering and game logic is not some best practice that you should default to, why would you want variable deltaTime? (There may or may not be some good answers to that, and maybe it totally depends on the game and so on.)
One good argument for variable dT is when game logic is heavy enough that simulation itself may be a major performance bound. If a machine running fixed dT cannot perform all the computation in the allotted time, the game falls behind schedule, and how do you resolve having a wall clock 10, 100, or more frames ahead of simulated time when you can't keep up? In that regard variable dT degrades more gracefully.
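Worth noting that fixed-dT loops usually blunt that failure mode by capping the catch-up work per frame; a sketch (the 0.25 s cap is an arbitrary choice), trading falling ever further behind for visible slow motion:

```cpp
#include <algorithm>

const double dt = 1.0 / 60.0;      // fixed simulation step
const double maxFrameTime = 0.25;  // arbitrary cap on catch-up per frame

// Simulate as much pending wall-clock time as the cap allows and return the
// remainder. On a machine that can't keep up, backlog beyond the cap is
// dropped, so the game slows down instead of spiraling behind the clock.
double consume(double pending) {
    pending = std::min(pending, maxFrameTime);
    while (pending >= dt) {
        // update(dt);  // hypothetical fixed-step update
        pending -= dt;
    }
    return pending;
}

int main() { return consume(1.0) < dt ? 0 : 1; }
```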
Sure, it makes some things easier, but it comes at some cost.
Like, you probably wouldn't believe the number of people playing Quake Live at 7 fps.
Plus there are a lot of things you don't know; maybe he has the minimum requirements but is running a lot of background processes because he installed a bunch of things he doesn't use.
For the vast majority of use cases the timestep is fine, and you are just creating deadweight by doing this. Plus, I guess you are now doing tunneling checks as well, creating further deadweight for the rare occasion when someone has framerate issues.
I am looking at this in the sense that I would actually implement this in a game I make, and I would not, because the upside is basically "if the game is already fucked, I want it to be maybe not as fucked, while still having big issues with pretty much every other component of the game logic", versus a bunch of deadweight when the game is running correctly, which should be 99.5% of use cases.
Am I adding assumptions here? I guess so but they are real ones for real game developers. Unless you are doing something very unusual these would be your concerns.
This is not an academic exercise for me like it maybe is for you and most of the commenters it seems like.
My original point was about that statement of yours, which is purely conceptual; you skewed the discussion to address the other part of your comment (or you thought that was the point but didn't say so in your answers).
But anyway, one assumption is that every game should have a fixed timestep. In many single-player games (real single-player games) you don't want that, because it is preferable that the user sees the ball coming at the player rather than having it magically appear behind the player, even if that is "timely correct"; the ball/interface going slow is a lot less frustrating than losing without realizing why. PC platformers with (virtual) high-speed movement come to mind as a common example of this.
Even multiplayer games suffer from this; in most online FPS games, if the server suddenly slows down, all the players start experiencing lag and everyone looks like they are "teleporting". With a non-fixed timestep you could slow everyone down, so the problem becomes a lot less frustrating: the players still have complete control and understanding of their in-game character, just a bit slow-mo until the server speeds back up to normal. Teleporting should be used only to sync with the server when the client's connection is the one having trouble updating.
I can't think of a framework that doesn't have a fixed timestep, except maybe GameMaker or some of the simpler frameworks. Maybe some of the HTML5 stuff is non-fixed-timestep, but that is a field that is just coming into its own right now.
None of the bigger 2d or 3d physics engines today that I know of use a non-fixed timestep.
In platformers with high-speed movement, fixed timesteps are even more important because of tunneling. You can often account for tunneling in some of the better physics engines, but it is sloooow.
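For reference, this is roughly what those tunneling-aware checks do instead (a 1D sketch; in 2D/3D it becomes a per-pair time-of-impact solve, which is where the slowness comes from):

```cpp
struct Interval { double min, max; };  // 1D obstacle for brevity

// Swept check: instead of "do these overlap right now?", ask "did the
// moving point cross the interval at any time this tick?", which catches
// hits even when both endpoints are outside the obstacle.
bool sweptHit(double p0, double p1, const Interval& iv) {
    double lo = p0 < p1 ? p0 : p1;  // span covered during the tick
    double hi = p0 < p1 ? p1 : p0;
    return lo <= iv.max && hi >= iv.min;
}

int main() {
    Interval wall{10.0, 10.5};
    // Moving from 9 to 12 in one tick: endpoint checks miss the wall,
    // the swept check catches it.
    return sweptHit(9.0, 12.0, wall) ? 0 : 1;
}
```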
In FPS games there is a lot of interpolation going on, with the server correcting player and object positions. If the server lags, you and everything else are going to teleport no matter what. The server is not even accepting your control input at that point, so either you are going to desync or you are going to teleport when the server tells you your real position. Generally in FPS games only critical physics objects are server-corrected anyway, and those are usually a small fraction of the physics objects. For example, if an enemy fires a rocket at you, that is not usually server-corrected. If you get hit in the server's calculations then you got hit, whether or not you think you dodged it client-side.
My point is that if the server hiccups, your jump being slightly off is not the big problem. The big problem is that the controls are unresponsive. Also, when the server hiccups it is rarely the actual fixed timestep server-side that is your problem; it is the internet. If your computer hiccups then it is not updating positions from the server, so all critical objects are going to teleport. Could there maybe be slightly less teleporting on non-player-controlled critical physics objects? Maybe, but if you are losing that many timesteps then things are going to jump around on your screen no matter what, because your framerate is like 5.
Why would server lag affect FPS? I can only see how it would affect the reported positions of the other players, because you aren't getting updates to changes in their movement. Server lag should never affect your FPS.