

The Problem of Vsync - blackhole
http://blackhole0173.blogspot.com/2011/09/problem-of-vsync.html

======
Ogre
This is one of the things the Amiga did really well. Among its specialized
hardware was the Copper, which was a very simple processor whose clock was
tied to the video clock. You could use it to schedule blits (using the
Blitter, another special bit of hardware whose specialty was moving
rectangular bits of rasters around) and other operations (like moving hardware
sprites around) immediately after the CRT beam passed the location where it
was being drawn. So with careful planning, you could get all your draws done
without tearing and without having to resort to double buffering.
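
From memory, a Copper list was just a stream of two-word WAIT and MOVE
instructions; something like this sketch (treat the exact encodings as
approximate, it has been a while):

    #include <exec/types.h> /* UWORD */

    /* Sketch of a Copper list: wait for the beam, then poke a register. */
    UWORD copperlist[] = {
        0x9601, 0xFFFE, /* WAIT until the beam reaches scanline 0x96 */
        0x0180, 0x0F00, /* MOVE 0x0F00 (red) into COLOR00 (reg 0x180) */
        0xFFFF, 0xFFFE  /* WAIT for an impossible position: end of list */
    };

The same WAIT mechanism is what let you schedule blits and sprite moves right
behind the beam.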

I recall hearing that the availability of the video clock was also the reason
the Video Toaster was an Amiga-exclusive piece of hardware.

I miss that stuff. But not all the time; double and triple buffering are
certainly much easier. It was also quite easy to do double buffering on the
Amiga, and I think it was probably much more common than the bare-metal Copper
approach. But for people who really wanted that level of control, it was
there.

~~~
CJefferson
The Amiga is a fascinating example.

The various processors did give the Amiga amazing performance, but they also
made it very difficult to update the Amiga hardware as time went on, because
apps were tuned to the exact number of cycles that would occur before X or Y,
and would break if anything changed.

There were many examples of Amiga games breaking if you added even a RAM
expansion, as it threw off the timings; programs expected to know exactly how
the hardware was laid out.

~~~
Flow
The programs that had those problems also probably jumped directly into the
KickStart ROM and expected it to be version 1.2. When KickStart 1.3 came out,
quite a few games broke.

I know of only one case where some German demo coders used CPU/chip timing to
multiplex sprites to get more than 8 sprites per row.

Most of the timing bugs were of the kind where the CPU got faster (perhaps
because code ran from RAM unaffected by custom-chip cycle stealing, or because
a 68020 or better was installed) and the program neglected to wait for the
Blitter to be ready.
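
Waiting correctly was cheap; a sketch of the bare-metal version
(graphics.library's WaitBlit() does the same, including the dummy read that
works around an old chip bug):

    #include <hardware/custom.h>

    static volatile struct Custom *const custom =
        (volatile struct Custom *)0xDFF000; /* custom-chip register base */

    static void wait_blit(void)
    {
        (void)custom->dmaconr;           /* dummy read: old chip-bug workaround */
        while (custom->dmaconr & 0x4000) /* spin while the blitter-busy bit is set */
            ;
    }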

"Fast-RAM" was common enough after a few years to this not being a huge
problem.

AmigaOS, and the Amiga as a whole, was a thing of beauty IMHO.

------
scoopr
This is an issue I've been interested in lately. I've been prototyping a
yet-another-GL-context-lib (like SDL or GLUT), and this is one of the issues I
wanted to focus on.

On Apple platforms there are CVDisplayLink (OS X) and CADisplayLink (iOS),
which trigger a callback on the vsync interrupt; that should give rock-solid
framerate stability with minimal CPU usage.
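
The OS X side looks roughly like this (a sketch with error checking omitted;
note the callback fires on a dedicated display-link thread, not the main
thread):

    #include <CoreVideo/CoreVideo.h>

    /* Sketch: called once per display refresh, right around vsync. */
    static CVReturn on_vsync(CVDisplayLinkRef link,
                             const CVTimeStamp *now,
                             const CVTimeStamp *output_time,
                             CVOptionFlags flags_in,
                             CVOptionFlags *flags_out,
                             void *user_data)
    {
        /* render the next frame here, or wake the render thread */
        return kCVReturnSuccess;
    }

    void start_display_link(void)
    {
        CVDisplayLinkRef link;
        CVDisplayLinkCreateWithActiveCGDisplays(&link);
        CVDisplayLinkSetOutputCallback(link, on_vsync, NULL);
        CVDisplayLinkStart(link);
    }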

I've yet to find similar solutions on other modern platforms, which is a
little disappointing, though there is hope that vsync swapping is implemented
a little more sanely elsewhere. Also, DWM (the Windows compositing window
manager) has some more knobs to tune; I don't know if they help.
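
One of those knobs, for what it's worth, is DwmGetCompositionTimingInfo from
dwmapi; a sketch (I'm only guessing it's useful for frame pacing):

    #include <windows.h>
    #include <dwmapi.h> /* link against dwmapi.lib */

    /* Sketch: query the compositor's timing info, e.g. the refresh
       rate as a ratio and the QPC time of the last vblank. */
    void query_dwm_timing(void)
    {
        DWM_TIMING_INFO info;
        info.cbSize = sizeof(info);
        if (SUCCEEDED(DwmGetCompositionTimingInfo(NULL, &info))) {
            /* inspect info.rateRefresh, info.qpcVBlank, ... */
        }
    }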

For the input lag, handling messages in a different thread than the rendering
should work, but it doesn't help with battery drain.

Also, the input that arrives as messages (keys, mouse) carries timestamps,
though I'm not sure whether they are granular enough. So while the input might
arrive with some lag, you should at least be able to simulate the game state
so that it makes sense with the inputs you've been given.
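
On Windows, for instance, you could grab that timestamp in the window
procedure; a minimal sketch (GetMessageTime() reports milliseconds since boot,
at roughly timer-tick granularity; the enqueue step is hypothetical):

    #include <windows.h>

    /* Sketch: record when a key event actually happened so the
       simulation thread can apply it at the right point in game time. */
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
    {
        if (msg == WM_KEYDOWN) {
            LONG when = GetMessageTime(); /* timestamp of the last message, in ms */
            (void)when; /* enqueue (wp, when) for the simulation thread here */
            return 0;
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }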

I wonder, do media players on Windows work by pure luck, or do they attempt
some kludges to keep the framerate stable without burning the CPU/battery?

~~~
exDM69
Please give a link to your GL context library. I've written one too
(<http://libwm.googlecode.com> - use the latest tag, not the master branch).

OpenGL infrastructure is so damn complicated, internally and externally. It
seems to be impossible to write 3D graphics that work well in a modern
environment when our APIs were designed in the 1990s.

~~~
scoopr
If you remember, we've talked over IRC =)

I actually tried at first to improve upon libwm, but didn't get very far with
it. There were some design points I disagreed with, and you seemed to have
stopped updating it, so I started prototyping in my own codebase. I wanted
some simplicity, and to make it easier for various FFI bridges, so I went with
plain C.

The code is too raw, so I won't link to it, but I'll definitely make some
noise when I think it is in a usable state.

~~~
exDM69
The FFI point was a factor in why I have not used or updated libwm recently.
libwm is C++ with constructors, destructors, and exceptions, and the semantics
of C++ make it a poor fit for FFI. Some interpreter-based languages can manage
it somewhat (e.g. Python), but in a compiled language like Haskell it becomes
a bit painful.

Another factor was that libwm supports a lot of legacy APIs, like GLX 1.0. I
wanted to support my girlfriend's crappy Intel-based MacBook because at the
time I was sometimes stuck with it as my only computer.

Anyway, the moral of the story is: don't write system-level libraries in C++
unless you want to spend the rest of your life writing C++.

------
iam
I'm surprised that there's no kind of real-time-ish callback when the screen
is about to update, giving the game a last chance to update its buffer.

Or, better yet, there should be a way of saying "use this buffer the next time
you update the screen" (and don't block during the vblank).

------
daedhel
I regret that DirectX won over OpenGL.

~~~
oconnore
OS X + iOS + Android + Linux + WebGL + Windows vs. Windows

And you are calling the game now?

~~~
keeperofdakeys
From the point of view of the quantity of PC games that use DirectX, OpenGL is
losing. WebGL, as you mentioned, may really be a turning point. If Internet
Explorer implements it natively, then the OpenGL stack on Windows may become
much stronger. If game manufacturers then started using OpenGL, DirectX would
finally start to die (although it would still have the Xbox).

~~~
msbarnett
So from the point of view of only games that run on the only platform that
supports DirectX, OpenGL is dying?

You don't think that's maybe a problematic sample to draw conclusions from?

I mean, from the point of view of iOS and PS3 games, DirectX is non-existent.
Taking these kinds of sub-samples of the total game market produces degenerate
results.

~~~
cwp
He's saying there are more games on Windows than on the platforms that use
OpenGL. I don't know if that's true or not, but it's a reasonable argument. It
doesn't really matter how many platforms implement OpenGL if nobody uses it.

~~~
msbarnett
> It doesn't really matter how many platforms implement OpenGL if nobody uses
> it.

Except that it is used, extensively; it's just not used much on Windows.
Therefore, looking at Windows games and proclaiming that OpenGL is dying/dead
is a really silly thing to do -- OpenGL ES knowledge is an extremely hot
commodity among iOS and Android devs, etc.

------
MostAwesomeDude
Is this a problem on Linux as well? See, those of us developing graphics
drivers view vsync as the best thing since sliced bread for its ability to
make our composited desktops and movie players tear-free. If GLX can't
properly do this for people, I'd be interested in hearing what game developers
want, since we hear from them so infrequently.

~~~
blackhole
5hoom is correct - the problem is the technique. Simply put, if you don't want
any input lag, you have to IMMEDIATELY render to the screen, which by
definition will cause tearing. The windows DWM does not have this problem, but
judging from a large number of presentation API functions I toyed around with
in the DirectX9 Ex, they are probably using a very low level flipping
technique available to the operating system to eliminate the input lag. I have
not been able to find anything concerning how they manage this and if I could
utilize it for a fullscreen game, nor have I investigated whether openGL
implements a working triple buffer system (mostly since I can't use it...
yet). The real question here isn't whether it affects Linux, its a question of
whether Linux provides an alternative technique (presumably via openGL) that
can be utilized by a game developer.
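
For reference, the knob I'm describing on the OpenGL side is the swap
interval; a sketch assuming the WGL_EXT_swap_control extension and a current
GL context:

    #include <windows.h>
    #include <GL/gl.h>

    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    /* Sketch: interval 0 presents immediately (minimal lag, tearing);
       interval 1 waits for vblank (no tearing, up to a frame of lag). */
    static void set_swap_interval(int interval)
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
        if (wglSwapIntervalEXT)
            wglSwapIntervalEXT(interval);
    }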

~~~
caf
The parent is asking what those alternative techniques would ideally look
like, from a game developer's point of view.

