On consoles you used to get a vsync interrupt: a reliable signal, an event you could handle very precisely from your game software.
Nowadays when I program games, I can't get a promise that a frame event will fire. The best I get is a "well, this code may run, unless the operating system needs those cycles".
So, is skipping a whole frame or halving the frame rate really the best that is possible? Why is it either/or? What are operating systems, hardware vendors, and driver writers doing about it?
The problem is that you can only start sending a frame to the display at 16.67ms intervals. If you miss that deadline, you can either swap buffers now (causing tearing) or swap buffers at the next vertical blank interval (which results in an effective 30Hz refresh rate). Those are the only two options current displays support. There's nothing you can do on the computer side to get around these constraints, because the limitation comes from display connections like VGA, HDMI, etc. that only deal in fixed refresh rates. Trying to drive such a connection with a variable refresh rate would be like changing the baud rate of a serial link on the fly without any way to let the receiving device know about the change.
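To make the two options concrete, here is a minimal sketch in C of the decision a renderer faces when a frame misses the vblank deadline. The function names (present_immediately, wait_for_vblank_then_present) are hypothetical stand-ins for whatever swap call a real platform exposes, not an actual API.

```c
/* Sketch of the only two choices a fixed-refresh display leaves you when a
 * frame misses its deadline. The platform hooks below are hypothetical
 * placeholders, not real API calls. */

#define FRAME_BUDGET_MS 16.67  /* one refresh period at 60 Hz */

typedef enum { SWAP_TEAR, SWAP_WAIT } swap_policy_t;

/* Hypothetical platform hooks -- replace with your real swap-chain calls. */
static void present_immediately(void)          { /* swap now, mid-scanout */ }
static void wait_for_vblank_then_present(void) { /* swap at the next vblank */ }

static void finish_frame(double render_time_ms, swap_policy_t policy)
{
    if (render_time_ms <= FRAME_BUDGET_MS) {
        /* Made the deadline: the swap lands cleanly on the next vblank. */
        wait_for_vblank_then_present();
        return;
    }

    /* Missed the deadline: the only two options the display gives you. */
    if (policy == SWAP_TEAR)
        present_immediately();           /* visible tear line somewhere on screen */
    else
        wait_for_vblank_then_present();  /* previous frame stays up an extra
                                            refresh: effectively 30 Hz */
}

int main(void)
{
    finish_frame(19.2, SWAP_WAIT);  /* e.g. a frame that took too long */
    return 0;
}
```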
This is an interesting point. It's noted elsewhere in the thread that if you miss the deadline but send what you have drawn so far anyway, with typical modern rendering software/hardware you'd get an incomplete frame, with holes, missing layers, and so on.
But that is not what tearing is. Tearing is when you've missed the deadline, finish the rendering to completion anyway, and then swap the buffer in the middle of a vertical scan. What you're saying is that you can do that, or wait until the next vertical scan before swapping the buffer. And furthermore, that can result in either a single skipped frame or the game dropping down to a 30fps mode, perhaps based on some running statistics about frame render durations.
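For the "based on running statistics" part, here is one possible heuristic, sketched in C under assumed names: keep an exponential moving average of recent frame render times and, if it creeps past the 60Hz budget, lock presentation to every other vblank. set_swap_interval and the thresholds are illustrative, not from any particular engine.

```c
/* Hypothetical frame-pacing heuristic: track a running average of render
 * durations and switch between 60 fps and a 30 fps cap with some hysteresis
 * so the mode doesn't flap. All names here are placeholders. */

#include <stdbool.h>

#define BUDGET_60HZ_MS 16.67
#define EMA_ALPHA      0.1     /* weight of the newest sample */
#define HYSTERESIS_MS  1.5     /* margin required before going back to 60 fps */

static double frame_time_ema_ms = 0.0;
static bool   capped_at_30fps   = false;

/* Placeholder: tell the platform to present every Nth vblank. */
static void set_swap_interval(int vblanks) { (void)vblanks; }

static void update_frame_pacing(double last_frame_ms)
{
    /* Running statistic: exponential moving average of render durations. */
    frame_time_ema_ms = EMA_ALPHA * last_frame_ms
                      + (1.0 - EMA_ALPHA) * frame_time_ema_ms;

    if (!capped_at_30fps && frame_time_ema_ms > BUDGET_60HZ_MS) {
        capped_at_30fps = true;
        set_swap_interval(2);  /* present every second vblank: steady 30 fps */
    } else if (capped_at_30fps &&
               frame_time_ema_ms < BUDGET_60HZ_MS - HYSTERESIS_MS) {
        capped_at_30fps = false;
        set_swap_interval(1);  /* back to every vblank: 60 fps */
    }
}
```

The hysteresis margin is the important design choice: without it, a game hovering right around 16.67ms per frame would oscillate between the two modes, which looks worse than either.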
I'm reminded of a discussion by Carmack (was it him? or am I having a brain fart) about mitigating the tearing and framerate problem by rendering per scanline instead of per frame.
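The rough idea behind that kind of "racing the beam" approach, as I understand it, is to render the frame in horizontal slices and finish each slice just before scanout reaches it, rather than swapping whole frames. A very loose C sketch under assumed names (raster_line, render_slice, the slice count) follows; real implementations need low-latency access to the raster position, which most desktop APIs expose only indirectly.

```c
/* Conceptual sketch of per-slice ("race the beam") rendering. Everything
 * here is hypothetical: raster_line() and render_slice() stand in for
 * platform-specific scanout queries and front-buffer drawing. */

#define SCREEN_LINES    1080
#define NUM_SLICES      8
#define LINES_PER_SLICE (SCREEN_LINES / NUM_SLICES)

/* Stub: a real version would query the current scanout line from the driver.
 * Returning SCREEN_LINES here just keeps the sketch from spinning forever. */
static int  raster_line(void)            { return SCREEN_LINES; }
static void render_slice(int first_line) { (void)first_line; /* draw LINES_PER_SLICE rows */ }

static void render_frame_racing_the_beam(void)
{
    for (int slice = 0; slice < NUM_SLICES; slice++) {
        int first_line = slice * LINES_PER_SLICE;

        /* Stay one slice ahead of the beam: start drawing this slice once
         * the beam has entered the slice above it, so the pixels are as
         * fresh as possible but still finished before scanout arrives. */
        while (raster_line() < first_line - LINES_PER_SLICE)
            ; /* spin; a real implementation would sleep or yield here */

        render_slice(first_line);  /* draw directly into the front buffer */
    }
}
```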