
A closer look at frame times: why FPS can mislead - ezrakewa
https://www.4alltech.com/2020/05/a-closer-look-at-frame-times-why-fps.html
======
gentleman11
It’s hard to write a unity game where there aren’t large frame time spikes
periodically. This is even in scenes where there is almost no code running,
just from turning the camera too quickly. In unreal, I haven’t encountered the
same issue. So, for some scenes, the frame time averages might make unity look
pretty good, but then it just has these massive drops (sometimes over half a
second), again, with almost no scripting running.

Maybe I am just being petty. I am mostly just mad about unity revoking asset
store usage rights post-purchase as a cash grab (before feb your freelancers
could use your assets - now you need to buy them additional licenses for each)

~~~
ReactiveJelly
Is it GC pauses?

I know .NET's GC is supposed to be pretty good, but Unity is still using the
Mono implementation, right?

I can barely stand C++, but I'll put up with it to avoid long GC pauses or
dynamic typing. (JS)

~~~
vertex-four
Unreal’s C++ code (specifically, anything deriving from UObject) is garbage
collected as well.

~~~
rowanG077
Really? Why is that, in the age of smart pointers? Can't you just allocate
with a smart pointer and have it cleaned up automatically once it's no longer
used?

~~~
vertex-four
Because you want garbage collection to happen at specific, well-defined times
(i.e. in the slack between frames), in a separate thread - not in the middle
of rendering a frame on the main thread when you happen to drop something out
of scope.

~~~
rowanG077
Still, I can't imagine GC is the best solution for this. You could easily
build a memory allocation strategy that hands out shared_ptr-like handles
whose cleanup happens at well-defined times instead of whenever they go out of
scope for the user.

------
cks
This is something we deal with regularly in game development: tracking down
spikes and fixing them.

I’m not sure how much sense it makes to measure them in the context of video
card reviews as the article alludes to.

In my experience frame time spikes are more often a result of software rather
than hardware. It would make more sense in the context of game reviews I
think.

------
Pulcinella
I feel like Bloodborne is what put “frame time” on the map for a lot of
gamers. Game is usually pretty good about maintaining an average 30 fps at
1080p, especially on the pro with boost mode, but has very bad frame timing
that boost mode does nothing to fix. It also never got a Pro patch,
unfortunately.

~~~
highmastdon
In my opinion, gaming at 30 FPS is extremely unpleasant, especially in
shooters. This is partly due to the up-to-33.3 ms delay between moving the
mouse and the change being processed and shown on screen. Compared to a
reaction speed of ~200 ms, that adds roughly a 17% delay. At 144 FPS (6.94 ms
frame time) you lose only about 3.5%, which makes the perception much
smoother.

~~~
wing-_-nuts
I understand PC players get a better gaming experience, but looking at the
average specs of someone on /r/nvidia, my god do they pay a premium for it.

I'm ok with 30 fps outside of VR, as long as it's stable and everyone's on an
even playing field (i.e. same hardware).

The next gen consoles will target native 4k60hz, which is nice. Consoles are
fantastic value for money, and you're guaranteed to be able to run the newest
games even at the end of a console's release cycle, which can be 8-10 years.

~~~
QuixoticQuibit
Next-gen consoles definitely aren’t targeting 4k60 as a standard. A few big
launch titles are coming in at 4k30. Really unfortunate but not surprising
considering the limitations of the hardware. A DLSS-like solution would’ve
worked wonders on the consoles.

