
Fix Your Timestep - replyifuagree
https://gafferongames.com/post/fix_your_timestep/
======
zubspace
The mixing of unfixed and fixed timesteps leads, in my opinion, to one of the
greatest headaches in game development. Unity, and Godot for that matter,
advise you to put all physics-related scripts in a FixedUpdate method and game
logic scripts inside Update.

But it is never that simple. You have to process your input in Update and
forward actions to FixedUpdate yourself if physics objects are affected, and
this alone leads to numerous tricky problems.

And then everything goes downhill quickly if you have physics-related objects
which also affect your game logic. What to do? Process everything in
FixedUpdate, always? Well, say goodbye to immediately processed input.

Additionally, don't forget that you need to interpolate or extrapolate
rigidbodies, otherwise you won't see them move smoothly if the physics world
updates at 60 Hz and your screen at 144 Hz. And as soon as you do that, the
representation on the screen does not always match the physics world, which is
really bad for fast gameplay.

And if you do something wrong, your whole physics simulation can explode.

I haven't found a nice and simple solution to all of this that works for all
cases, even though I really tried [1], and I believe many games suffer from
delayed input handling, micro-stuttering, or unreliable physics due to mixing
fixed and unfixed timesteps.

[1] [https://www.zubspace.com/blog/smooth-movement-in-unity](https://www.zubspace.com/blog/smooth-movement-in-unity)
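The Update-to-FixedUpdate forwarding described above is usually done with a small buffer: sample input every frame, queue it, and drain the queue at the next fixed step. A minimal language-agnostic sketch (illustrative names, not the Unity API):

```python
from collections import deque

class InputBuffer:
    """Collects input events per frame; drained by the fixed-rate update."""

    def __init__(self):
        self._pending = deque()

    def on_frame_input(self, action):
        # Called from the variable-rate update as soon as input is seen,
        # so nothing is lost when several frames pass between fixed steps.
        self._pending.append(action)

    def drain(self):
        # Called once per fixed step; returns everything queued since the
        # last step, in the order it arrived.
        actions = list(self._pending)
        self._pending.clear()
        return actions
```

The tricky problems the comment mentions start here: an action sampled mid-frame is only acted on at the next fixed step, which is where the perceived input delay comes from.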

~~~
tgb
Why do they recommend game logic in the non-fixed update?

~~~
OskarS
A good example is moving the camera: let's say you have a camera set to follow
a particular character (in some fancy way), but you only update it in
FixedUpdate(). If your framerate is faster than the physics update,
FixedUpdate() will not be called every frame, and your camera will get a
visible stutter (it will essentially be "temporally aliased" with the physics
updates). Therefore, camera movement updates should always go in Update().
This kind of reasoning applies to many "game logic" things.

There's a flip side to that, though: if the things in the world are physics
objects (especially your main character), and they only update in
FixedUpdate(), then even if the camera moves smoothly, the objects will
stutter. Hence, you need to do some kind of interpolation of their
FixedUpdate() state as well.

The point is: this stuff is really, really hard to get right. There is no
simple right answer.

~~~
badsectoracula
On the other hand, if you update the camera per frame you'll end up with the
shotgun bug in Deus Ex: The Fall when running with vsync turned off (see Total
Biscuit's video on the game as an example of all sorts of bugs that exist
because of timing issues from updating stuff per frame - note that TB didn't
realize that).

If you want to follow a character in a fancy way (i.e. not locked), update the
camera's target position in fixed steps and perform smooth interpolation
between the last position and the current position for the frames between the
updates.
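That scheme can be sketched in one dimension (a minimal sketch; `alpha` is the fraction of a fixed step elapsed since the last physics update, exactly as in the article's interpolation):

```python
class SmoothedCamera:
    """Follows a target updated at fixed steps; interpolates for rendering."""

    def __init__(self, position=0.0):
        self.prev_target = position
        self.curr_target = position

    def fixed_update(self, character_position):
        # Runs at the fixed physics rate: remember the last two targets.
        self.prev_target = self.curr_target
        self.curr_target = character_position

    def render_position(self, alpha):
        # Runs every frame: blend between the last two fixed-step targets,
        # so the camera moves smoothly even at framerates above the tick rate.
        return self.prev_target + (self.curr_target - self.prev_target) * alpha
```

Note this trades one fixed step of latency for smoothness, which is the "representation does not match the physics world" caveat from the parent thread.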

------
klodolph
Physics simulation is really interesting. I had the fortune of working on a
physics simulation engine in industry (although the company culture was
dysfunctional, that’s another story), and I often hand-roll my physics
simulation code in games that I make as a hobby, or use off the shelf engines
like Unity’s or P2.

The timestep naturally has to be small enough that you can get good results
with values that change quickly. The simulator I worked on professionally used
adaptive step size. For these applications where accuracy is paramount, the
adaptive step size gets you there with a smaller computational budget. For
games, you often want a very _repeatable_ physics simulation. For example, if
the player can normally jump onto a 3.5m ledge, and the step size changes,
maybe the player can now jump 3.6m, or maybe only 3.4m. This can frustrate the
player (or it can be exploited in speedruns).

I’d also say that for most games, physical accuracy is very rarely a useful
tool. That’s just my experience. 90% of the time, I just want velocity,
momentum, and collisions. If you are using an existing engine like Unity, this
means, _among other things,_ that you probably don’t want to make everything
in your game a Rigidbody. The physics simulator is great for things like
knocking crates over and making characters ragdoll, but for your characters
in-game, try doing the physics yourself in a custom character behavior instead
of using Rigidbody.

You might be surprised that a ton of examples from the Unity site are done
like this.

~~~
thedirt0115
There's a billiards simulator/game[1] that uses event prediction instead of
step-based physics for realistic and reproducible results. It solves the
equations of motion to find the next event, animates to that point, and
continues until nothing is moving. This is feasible since there are only 16
balls and 1 cue tip, but I imagine that as computational power continues to
increase, the scale of such simulations will increase too. I'm excited to see
what's next :)

[1] - I read a paper that specifically talked about step-based vs event-
prediction for billiards simulation, but I can't find it now :/ Virtual Pool 4
might use the system for its physics, but don't quote me on that!
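In one dimension the event-prediction idea reduces to something like this (a toy sketch, not from the paper: two equal-mass balls on a line with an elastic collision, solving for the exact collision time instead of stepping):

```python
def next_collision_time(x1, v1, x2, v2, radius):
    """Exact time until two equal balls on a line touch, or None."""
    # Balls touch when the gap between centers shrinks to 2 * radius.
    rel_v = v1 - v2                  # closing speed (positive if approaching)
    gap = (x2 - x1) - 2 * radius
    if rel_v <= 0 or gap < 0:
        return None                  # moving apart (or already overlapping)
    return gap / rel_v

def advance_to_collision(x1, v1, x2, v2, radius):
    """Jump the simulation exactly to the collision and resolve it."""
    t = next_collision_time(x1, v1, x2, v2, radius)
    if t is None:
        return None
    x1, x2 = x1 + v1 * t, x2 + v2 * t
    # Elastic collision, equal masses: the balls exchange velocities.
    return (x1, v2), (x2, v1), t
```

No timestep appears anywhere, which is why the results are reproducible: the same initial conditions always yield the same event times.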

P.S. - Unity has a pretty handy built-in CharacterController class that lets
you move around while respecting collisions -- you just need to add your own
Gravity. Also, for more advanced character movement, Catlike Coding has a
great series of tutorials:
[https://catlikecoding.com/unity/tutorials/movement/](https://catlikecoding.com/unity/tutorials/movement/)

~~~
klodolph
That’s sort of the way things are done in industry.

One strong example is in mixed-signal electronic simulation. Your digital
portion of the system describes digital outputs in terms of digital inputs and
gate delays, more or less. Changing the digital inputs results in a queue of
events.

So you run the analog portion of the system until the pending digital event,
process the digital event, and then continue the analog simulation.

------
donatj
> The problem is that the behavior of your physics simulation depends on the
> delta time you pass in. The effect could be subtle as your game having a
> slightly different “feel” depending on framerate

Is THIS why the physics in GTA San Andreas felt different on each console, and
WAY different on PC? I have always suspected something like this but never
knew for certain.

~~~
fxtentacle
Most likely not. To make cross platform multiplayer possible, they practically
have to enforce the same physics timesteps on all platforms.

What most people perceive as a different feel is the input to display latency.
On a TV with a controller that can easily be 4x higher than with a mouse on a
PC screen.

In the early days, Microsoft did a famous study of how strongly input latency
affects the way people use office software. If I recall correctly, 50 ms of
additional latency would lead to significantly less exploratory behavior.

And I think we all know that intuitively from the web. When a bloated
newspaper site needs 5 seconds to load, you start to consciously consider if
you really want to click that link, as opposed to just checking out everything
slightly related on Wikipedia.

~~~
WillKirkby
GTA San Andreas (2004) did not have cross-platform multiplayer, so OP's
assessment may be accurate.

~~~
rypskar
2004 is a long time ago. I don't remember threading being discussed anywhere
when I started learning game development at a hobby and university level at
that time. Probably because multiple cores weren't a thing, so threading would
mostly add complexity and cost performance. Core 2 was released in 2006
[https://en.wikipedia.org/wiki/Intel_Core_2](https://en.wikipedia.org/wiki/Intel_Core_2)

IIRC, all the places I learned from told you to throw everything into the game
loop and remember to use the delta since the last frame in all calculations.

------
raxxorrax
This is interesting and explains behavior I often encountered. I am no game
developer, but I dabbled with different combinations of render and physics
engines. While the times when game speed was dependent on framerate are
probably over, this disentanglement can be a problem for physics simulations.

In many cases rendering is done mainly on the GPU while physics is often run
on the CPU, although that isn't necessarily true anymore either. But I am sure
you can break quite a few games if one component is disproportionately slower
or faster than the other.

I wonder if physics engines might profit from mechanisms we know from video
compression, like P- and I-frames, so that the errors from the interpolation
suggested as a fix in the article could be corrected. If I understand it
correctly, and there is a need for such a correction mechanism, that is.

------
michae2
I read this article for the first time a few months ago while creating a
rudimentary sound engine. The article makes a lot of sense, but I struggled to
map the concepts to sound generation. I initially tried to replace the render
step from the article (rendering a single display frame) with rendering a
single fragment of sound buffer (called a period in ALSA, the unit of transfer
to the sound card). My first mistake was ignoring the interpolation part of
the article. The output was very choppy whenever the sound card transfers were
too coarse.

And then I learned that I can ask the sound card how much sound buffer it
wants, instead of always transferring a fixed amount. In other words, the rate
of the main loop did not need to match the rate of transfers to the sound
card.

Now I've rewritten the main loop as a series of inner loops, first running a
variable amount of simulation at simulation rate, then rendering a variable
amount of sound buffer at output sample rate ( _not_ fragment transfer rate,
like I was doing before). This seems to be working better.

I still need to think about the spiral of death. Perhaps an upper limit on
simulation would be enough?

(Any advice welcome!)
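Re the spiral of death: an upper limit is indeed the usual guard. Clamp how much time one frame is allowed to feed into the accumulator, so a slow frame triggers at most a bounded number of simulation steps (a sketch; the cap of 5 steps is arbitrary):

```python
DT = 1.0 / 60.0   # fixed simulation step
MAX_STEPS = 5     # arbitrary cap; beyond it the simulation lags real time

def consume(accumulator, frame_dt, simulate):
    """Advance the simulation, bounding work per frame to avoid the spiral."""
    # Never accept more than MAX_STEPS worth of real time in one frame;
    # the excess is deliberately dropped rather than fed back in.
    accumulator += min(frame_dt, MAX_STEPS * DT)
    steps = 0
    while accumulator >= DT and steps < MAX_STEPS:
        simulate(DT)
        accumulator -= DT
        steps += 1
    return accumulator
```

The trade-off: under sustained overload the simulation runs slower than wall-clock time, but each frame's cost stays bounded instead of growing without limit.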

------
dom96
As someone who's been learning to develop games mostly from the fundamentals,
I've learned the hard way that this is really important. If you try to run
your game updates at the same time delta as your last frame draw, then you're
going to have a really bad time debugging anything. A fixed timestep is a
must.
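The fixed-timestep pattern the article advocates boils down to an accumulator loop; a minimal sketch (function names are placeholders):

```python
DT = 1.0 / 60.0  # fixed simulation step

def run_frames(frame_times, simulate, render):
    """Drive a fixed-step simulation from variable per-frame times."""
    accumulator = 0.0
    t = 0.0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Consume the accumulated real time in fixed-size steps.
        while accumulator >= DT:
            simulate(t, DT)
            t += DT
            accumulator -= DT
        # The leftover fraction of a step becomes the interpolation
        # factor the renderer uses to blend between the last two states.
        alpha = accumulator / DT
        render(alpha)
```

The simulation always sees the same `DT`, so bugs reproduce deterministically regardless of how fast the machine renders.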

~~~
jeffreygoesto
This is true. For all simulations.

~~~
kranner
Not true for non-interactive simulations, some of which, e.g. those used in
systems biology, choose the time step to the next event from a probability
distribution. That is a huge speed-up compared to a fixed-time-step simulation
in which you randomly choose whether or not an event occurs in the current
time step.

[https://en.wikipedia.org/wiki/Gillespie_algorithm](https://en.wikipedia.org/wiki/Gillespie_algorithm)
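For a single reaction channel, the core of the Gillespie idea is just sampling an exponential waiting time and jumping straight to the next event (a minimal sketch for pure decay; `rate` is the per-molecule reaction propensity):

```python
import math
import random

def gillespie_decay(n0, rate, rng=random.random):
    """Simulate decay events A -> 0, jumping directly to each event time."""
    t, n = 0.0, n0
    times = []
    while n > 0:
        propensity = rate * n
        # Waiting time until the next event is exponentially distributed
        # with parameter equal to the total propensity.
        t += -math.log(rng()) / propensity
        n -= 1
        times.append(t)
    return times
```

No time is wasted on steps where nothing happens, which is exactly the speed-up over the fixed-step variant.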

------
jrockway
I get the impression that most games are using some sort of hybrid approach.
They run the game simulation at a fixed "tickrate", and then add some hacks on
top to make the visuals match the physics in certain areas. For example, your
mouse position and whether or not you fired your weapon can be updated between
simulation iterations, to make shooting and killing smoother.

~~~
meheleventyone
It's not simple in either direction. You have some events where responding
faster is considered better, most notably detecting input, and some where you
need a consistent frame rate to avoid issues (e.g. simulating drag is really
sensitive to framerate and will behave quite differently as a game slows
down). A fixed timestep also causes the game to slow down if it can't hit the
desired update rate.

FWIW neither Unity nor UE4 use a fixed timestep for their default update and
tick.

------
tracker1
My only real experience with this type of logic has been with training
simulations, where you expressly don't want real time, just real-ish events
that happen in an appropriate order. If you're an incident commander, then you
aren't going to wait 5-10 minutes for additional units to "arrive" in a
training simulation (60-90 seconds is good enough). Same for waiting for
ladder companies to check a building, etc.

I find real-time games very interesting technically and conceptually, just no
real interest in working on them. Understanding the technical approaches can
be very enlightening for a number of different use cases.

------
nurbel
Jonathan Blow has an interesting video on issues with framerate and how it
was handled in Braid at
[https://www.youtube.com/watch?v=fdAOPHgW7qM](https://www.youtube.com/watch?v=fdAOPHgW7qM)
. As someone working on web stuff, I always find it fascinating to see all the
complexity that arises in other domains, in games in particular.

~~~
hunterloftis
Me too! I went deep into that rabbit hole and ended up with a presentation on
things webdev can learn from gamedev:

[https://youtu.be/avwDj3KRuLc](https://youtu.be/avwDj3KRuLc)

------
aaanotherhnfolk
The only other optimization I would add to this is that you don't need to use
the same 'consumer' rate for all parts of your game loop. Collision detection
you probably want to be tight so that fast-moving objects don't warp through
thin objects like walls. But enemy AI may be more tolerant of larger chunks.
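That idea amounts to draining one frame clock into several accumulators at different fixed rates (a sketch; the two rates here are arbitrary):

```python
COLLISION_DT = 1.0 / 120.0  # tight: fast objects need fine steps
AI_DT = 1.0 / 10.0          # coarse: enemy decisions tolerate big chunks

def run(frame_times, step_collision, step_ai):
    """Feed variable frame times into two fixed-rate consumers."""
    acc_col = acc_ai = 0.0
    for frame_dt in frame_times:
        acc_col += frame_dt
        acc_ai += frame_dt
        # Each system drains its own accumulator at its own rate.
        while acc_col >= COLLISION_DT:
            step_collision(COLLISION_DT)
            acc_col -= COLLISION_DT
        while acc_ai >= AI_DT:
            step_ai(AI_DT)
            acc_ai -= AI_DT
```

Each subsystem still sees a fixed step, so both stay deterministic; only their budgets differ.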

~~~
bogwog
The only way to reliably prevent fast-moving objects from moving through thin
walls (or even thick walls) is to use continuous collision detection. Even if
you step the physics engine at a ridiculously high rate (say, 1000 frames per
second), objects will _still_ pass through each other if they're moving fast
enough.

~~~
jfkebwjsbx
I guess most games cap velocities for most objects if not all, so I doubt that
is an issue.

~~~
bogwog
That's not practical. If you cap velocities, you'll also need to enforce a
minimum size for all objects. If you don't do this, it will still be possible
for objects to pass through each other. So you can either have really small
but really slow objects, or really large but really fast objects.

What most games do is enable continuous collision detection for important fast
objects only, like the player or projectiles.
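The difference shows up even in one dimension: a discrete test only inspects the end position, while a swept (continuous) test checks the whole path travelled during the step (a toy sketch, point vs. a thin wall):

```python
def discrete_hit(x, v, dt, wall_lo, wall_hi):
    """Discrete test: checks only the end position, so fast motion tunnels."""
    new_x = x + v * dt
    return wall_lo <= new_x <= wall_hi

def swept_hit(x, v, dt, wall_lo, wall_hi):
    """Continuous test: checks whether the swept path crosses the wall."""
    new_x = x + v * dt
    lo, hi = min(x, new_x), max(x, new_x)
    # The path is the interval [lo, hi]; a hit is any overlap with the wall.
    return hi >= wall_lo and lo <= wall_hi
```

The swept version catches the crossing no matter how large `v * dt` is, which is why it's reserved for the objects where tunneling actually matters.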

------
Shryyyk
Gaffer on Games was an invaluable resource when I was at University - I
remember this article fondly.

------
phkahler
If I'm not mistaken, the first arcade game to fully decouple timestep and
frame rate was I, Robot in 1984. No physics, but the rasterization time was
variable enough to require this.

------
blamestross
I figure the major reason they don't just run DEVS under the hood is that they
don't know about it?

------
jeffreygoesto
This has a lot in common with how you design safety systems by the way ;-)...

