
Accuracy takes power: one man’s 3GHz quest to build a perfect SNES emulator - skryl
http://arstechnica.com/gaming/2011/08/accuracy-takes-power-one-mans-3ghz-quest-to-build-a-perfect-snes-emulator/
======
T-hawk
_Or consider Air Strike Patrol, where a shadow is drawn under your aircraft.
This is done using mid-scanline raster effects, which are extraordinarily
resource intensive to emulate._

So what's really going on here is that the emulator must emulate not only the
SNES hardware, but also the _television_. Video game emulators have had to
deal with this for a long time, to varying and increasing levels of accuracy.
Televisions (especially analog CRTs) have quite a bit of emergent behavior in
processing the display input that is not easily captured and replicated by
your typical frame buffer. Interlacing is a major such phenomenon; most
emulators still simply treat the 60 fields per second as 60 distinct frames
rather than interlacing them. (And younger players are used to seeing the
games that way, never having played on original console and TV hardware.)
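For illustration, "weaving" two successive fields into one interlaced frame, the way a CRT effectively presents them, can be sketched like this. The dimensions and pixel format are invented for the example; they are not the SNES's actual output:

```cpp
#include <array>
#include <cstdint>

constexpr int W = 8, H = 8;  // full-frame size; each field holds H/2 lines

using Field = std::array<uint8_t, W * (H / 2)>;
using Frame = std::array<uint8_t, W * H>;

// Even field supplies scanlines 0, 2, 4, ...; odd field supplies 1, 3, 5, ...
Frame weave(const Field& even, const Field& odd) {
    Frame out{};
    for (int y = 0; y < H / 2; ++y)
        for (int x = 0; x < W; ++x) {
            out[(2 * y) * W + x]     = even[y * W + x];  // even scanlines
            out[(2 * y + 1) * W + x] = odd[y * W + x];   // odd scanlines
        }
    return out;
}
```

Treating each field as a full frame, as most emulators do, skips this step entirely.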

The ultimate example of this effect occurs in emulating games that originally
used a vector CRT. An emulator writing to a raster frame buffer simply can't
replicate the bright, sharp display of a real Asteroids or Star Castle or
Battlezone machine.

TV behavior even goes beyond electronics. Consider the characteristics of the
phosphor coating and the persistence time between refreshes. Some games made
use of effects where that characteristic mattered, so if you want to emulate
that with high fidelity, yes that will take a lot of CPU cycles.
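A crude first-order approximation of that persistence might look like the sketch below; the decay factor is pulled out of thin air rather than measured from a real tube:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Each displayed pixel is the brighter of the incoming signal and the
// previous frame's value faded by a fixed decay factor. The 0.5 constant
// is purely illustrative; real phosphor decay is nonlinear and per-color.
std::vector<uint8_t> phosphor_blend(const std::vector<uint8_t>& incoming,
                                    const std::vector<uint8_t>& previous,
                                    double decay = 0.5) {
    std::vector<uint8_t> out(incoming.size());
    for (std::size_t i = 0; i < incoming.size(); ++i) {
        uint8_t faded = static_cast<uint8_t>(previous[i] * decay);
        out[i] = std::max(incoming[i], faded);
    }
    return out;
}
```

Even this cheap version has to touch every pixel every frame; doing it with physically measured decay curves is where the CPU cycles go.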

~~~
aardvark179
That particular problem isn't a case of emulating the television, but rather
accurately emulating the console's video hardware and its interactions with
the rest of the system. If one were simply interested in emulating the
television's behaviour then you could construct a frame buffer based on the
visible sprites and postprocess that (possibly in conjunction with several
preceding fields).

If the console allowed sneaky things to be done on each raster line (like
changing the colours) then constructing that frame buffer becomes considerably
more resource intensive, as it must now probably be done line by line with the
correct timing wrt. the rest of the emulation.

If you could pull tricks mid scanline (presumably through careful timing after
an interrupt) then the problem will be a whole lot worse, though I'd guess it
can be reduced by recording changes to the relevant hardware registers along
with timestamps in the emulation so that the timing of your scanlines'
construction becomes less of an issue.
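That timestamped-register idea might look something like this sketch; the register numbers, reset values, and cycle units are all made up for illustration:

```cpp
#include <cstdint>
#include <vector>

// Log every hardware-register write with its master-clock timestamp, so the
// scanline renderer can later ask what a register held at any given cycle,
// instead of being called in lockstep with the CPU.
struct RegWrite {
    uint64_t cycle;   // master-clock time of the write
    uint16_t reg;     // which hardware register was written
    uint8_t  value;
};

class RegLog {
    std::vector<RegWrite> writes_;  // assumed appended in chronological order
public:
    void write(uint64_t cycle, uint16_t reg, uint8_t value) {
        writes_.push_back({cycle, reg, value});
    }
    // Latest value written to `reg` at or before `cycle`.
    uint8_t value_at(uint16_t reg, uint64_t cycle,
                     uint8_t reset_value = 0) const {
        uint8_t v = reset_value;
        for (const auto& w : writes_)
            if (w.reg == reg && w.cycle <= cycle) v = w.value;
        return v;
    }
};
```

The linear scan is obviously the naive version; a real emulator would index by register or binary-search the timestamps.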

~~~
T-hawk
You're correct, this particular problem can be handled with sufficiently
sophisticated frame buffer logic. I was generalizing from that to other
concepts where emulating the television or its signal processing would be
required.

I'll give you another example. On the Atari 2600 game console, the vertical
sync is _software controlled_. The software is responsible for enabling the
vertical sync pulse. This can be done 60 times per second as standard -- or
you could play tricks with it. Suppose you strobe it at a different or even
irregular rate. On an analog TV, the picture starts rolling vertically. That
breaks way outside the sandbox of a framebuffer, with signal being displayed
in overscan areas and during the normally-blank retrace interval, resulting in
ghosting effects. (No commercial game did that, but it's been done in tech
demos, and conceivably a horror game could do it intentionally for mood.) To
produce that same behavior on framebuffer-based hardware, you need to emulate
or at least approximate the workings of a TV's vertical sync logic, none of
which appears in the console itself.

(I know this from experience, I wrote an Atari 2600 game:
<http://www.dos486.com/atari/> )
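A toy model of what the TV-side logic would have to track, with purely illustrative constants (a real set's sync oscillator and AFC loop are far messier than this):

```cpp
#include <cmath>

// If the console strobes vertical sync at other than the nominal field
// period, the picture rolls by the difference each field. This models only
// that rolling offset, nothing else about the sync circuit.
class VerticalHold {
    double roll_ = 0;                        // current vertical offset, in lines
    static constexpr double kNominal = 262;  // NTSC scanlines per field
public:
    // Called once per software-issued sync pulse, with the scanline count
    // elapsed since the previous pulse.
    void sync(double lines_since_last) {
        roll_ = std::fmod(roll_ + (lines_since_last - kNominal), kNominal);
        if (roll_ < 0) roll_ += kNominal;
    }
    double roll() const { return roll_; }    // where the frame starts drawing
};
```

Strobe at the nominal rate and the offset stays put; strobe a line late or early each field and the picture crawls.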

~~~
Keyframe
I recently bought 'Racing the beam' book which goes into Atari 2600 VCS
details and programming -
[http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&...](http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=11696)
I recommend this book to everyone interested in this topic.

------
sp332
Lots of conversation from when this was posted 9 months ago:
<https://news.ycombinator.com/item?id=2864531>

------
gouranga
Perhaps we should start using CardBus and PCI-E based FPGA cards and go for
hardware simulation rather than software emulation?

Performance is far easier to achieve there.

The devices aren't exactly expensive either.

~~~
evan_
if you're going to add hardware you might as well just get a real SNES from
eBay and use a memory cart.

~~~
Wilduck
The issue is that buying an SNES becomes decreasingly viable as time goes on
while using an FPGA becomes increasingly viable. I don't know if we've reached
the point where the balance shifts yet, but presumably we will at some point.

~~~
gouranga
Also, the medium on which the software is delivered is not the problem either.
It's just information, after all. The problem is accurately emulating the
hardware in software, which is more easily achieved by simulating the hardware
in hardware :)

~~~
factorial
The SNES is now close to 25 years old, and the older the hardware gets, the
more serious the issue of "bit rot" will become. Among NES collectors, it's
not rare to find people lamenting not being able to play their original
cartridges anymore. Those cartridges don't last forever. Thus, emulation
serves an important archival purpose, and without it, those games may be lost
forever.

~~~
rikthevik
Aren't there a bunch of knock-off systems coming out of Asia these days? I
thought I saw a box that could play NES, SNES and Genesis original
cartridges down at my local games shop.

~~~
voltagex_
IIRC these clones may not be able to play all games - especially ones with
add-on chips (Super Mario RPG is the only example I can think of right now)

------
neverm0re
What I find interesting is that most attention is being spent on BSNES when
the subject of cycle accuracy and proper emulation of hardware comes up, like
this is a new or novel discovery. BSNES is hardly the first to aim for this
sort of goal, nor are its efforts as close to completion as those of others.
The NES scene in particular is now down to the level of breaking out
scopes to measure response times on real NES hardware to get the sort of
information they need to further push their level of accuracy up. Even
'obscure' systems like the MSX have had this sort of cycle-accurate push in
emulation.

It's no longer the 90s and people shouldn't even have to mention NESticle
existed in an article unless they're that out of touch with trends of the last
decade.

~~~
byuu
> What I find interesting is that most attention is being spent on BSNES when
> the subject of cycle accuracy and proper emulation of hardware comes up,
> like this is a new or novel discovery.

I never intended to convey that this is a new idea, sorry. I do believe my
cooperative threading model is a new concept _in emulation_ , but it's still a
ridiculously old one in computer science.
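Flattened to its bare scheduling idea, the model looks roughly like this; real implementations use coroutines/fibers so each chip keeps its call stack between switches, which this sketch deliberately omits, and the names and step sizes are invented:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Each chip keeps its own clock; the scheduler always steps whichever chip
// is furthest behind, so no chip ever runs ahead of a communication partner.
struct Chip {
    std::string name;
    uint64_t clock;            // time this chip has simulated so far
    uint64_t cycles_per_step;  // how far one step advances its clock
};

std::vector<std::string> run(std::vector<Chip>& chips, uint64_t until) {
    std::vector<std::string> trace;  // order in which chips were stepped
    for (;;) {
        Chip* behind = &chips[0];
        for (auto& c : chips)
            if (c.clock < behind->clock) behind = &c;
        if (behind->clock >= until) break;
        behind->clock += behind->cycles_per_step;
        trace.push_back(behind->name);
    }
    return trace;
}
```

With a 3-cycle "CPU" and a 1-cycle "PPU", the scheduler interleaves them so neither drifts ahead of the other by more than one step.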

> The NES scene in particular ...

... is not as rosy as it seems. Having recently written an NES emulator, I can
tell you that they're far from completion. For just one example, all of those
mapper chips are basically a big unknown. Those chips have ways of detecting
scanline edges to simulate IRQs and split-screen effects. This is done by
monitoring the bus for certain patterns from the cart-side. And the details of
this stuff? Completely unknown. Neither Nestopia nor Nintendulator attempts to
simulate this: they just have the PPU -tell- the mapper when a scanline edge
is hit. I could be wrong, but I believe I'm the first to even attempt to have
the mapper detect scanlines by monitoring the bus.
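In the spirit of how such a chip could do it (the exact filtering behavior is precisely the unknown part; the 16-cycle debounce window below is a guess for illustration): PPU address line A12 is low during background pattern fetches and high during sprite pattern fetches, so under typical PPU configurations it rises once per rendered scanline.

```cpp
#include <cstdint>

// Cart-side sketch: infer scanline edges purely from PPU bus traffic by
// counting rising edges on address line A12, ignoring rises that follow
// the previous one too closely. The debounce window is an invented constant.
class ScanlineDetector {
    bool a12_prev_ = false;
    uint64_t last_rise_ = 0;
    int scanlines_ = 0;
public:
    // Feed every PPU bus access: the address and the cycle it occurred on.
    void bus_access(uint16_t ppu_addr, uint64_t cycle) {
        bool a12 = (ppu_addr & 0x1000) != 0;
        if (a12 && !a12_prev_) {              // rising edge on A12
            if (cycle - last_rise_ > 16)      // filter rapid toggles
                ++scanlines_;
            last_rise_ = cycle;
        }
        a12_prev_ = a12;
    }
    int scanlines() const { return scanlines_; }
};
```

The point is that the mapper never sees "scanline" as a signal; it has to reconstruct it from bus patterns, which is why emulating this faithfully means emulating the bus.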

And we're talking chips that are dozens of times less complex in the _worst_
case than some of the SNES coprocessors.

> It's no longer the 90s and people shouldn't even have to mention NESticle
> existed in an article unless they're that out of touch with trends of the
> last decade.

The important part of that article is that SNES emulation was (and largely
still is) in its NESticle phase, which is what bsnes was created to move past.
Unfortunately, just as the jump from NESticle (25-50MHz) to Nestopia (800MHz
required) was huge, so is the jump from ZSNES (200MHz) to bsnes (2-3GHz). But
this time, that jump runs into the wall of what most computer users actually
have. People didn't notice Nestopia's requirements because everyone has at
least a 1GHz processor these days; bsnes was not so fortunate.

So the article was more about explaining what that level of overhead is
required for.

------
leeoniya
source: <http://byuu.org/bsnes/accuracy>

this is also awesome, posted a while back:
<http://byuu.org/articles/emulation/snes-coprocessors>

------
tluyben2
There are subtle bugs and issues with timing, but the worst, for me, is that
the games are not accurate. I play a lot of 80s games on original hardware,
but also (when i'm not at home) on my laptop/ipad/android in emulation. Games
I have been playing for almost 30 years are locked solid in my mind; every
enemy, path, _timing_ has been set in my brain. I can play those games blindly
on original hardware. But not on emulators. For people playing these kinds of
things for the first time, this is not an issue; for me it's not being true to
the original. The quirks which were in there are supposed to be in there.
A horizontal shooter that had visual/audio glitches in the original game,
because the end boss was simply too big for the poor Z80 + VDP to render fast
enough, is suddenly smooth in the emulator, so you lose the edge you had from
attacking it while the computer was struggling. More fun or not is not the
issue here; correct
emulation is important to preserve all millions upon millions of carefully
crafted assembler instructions on the platform of choice. Until we have this
working well, I'll keep buying old computers for peanuts just to make sure.

------
normchan
follow up to that story here: [http://www.tested.com/news/44376-16_bit-time-capsule-how-emu...](http://www.tested.com/news/44376-16_bit-time-capsule-how-emulator-bsnes-makes-a-case-for-software-preservation/)

~~~
kristofferR
That's a great article, thanks!

------
zxoq
What worries me is if future generations will be able to enjoy the games of
today. Will it ever be feasible to emulate a PS3 to the level demanded here?
Will my grandchildren in 50 years be able to play GTA 6 on a PS4 emulator?
Processing power does not appear to scale to allow this, and there will barely
be any of today's consoles still alive by then (also, I doubt 2060s television
sets will have HDMI input).

~~~
mappu
Not to mention DRM.. it's a serious problem.

~~~
drivebyacct2
Not for the current generation of _console_ optical media. In the future it
will be a rotten travesty, as no one will ever physically own their games.

Though DLC is already locked up, as future consoles' content is likely to be.

------
radarsat1
Hm. The reason synchronization is needed is because the emulator must simulate
the chip behaviours over time, and ensure that they act as if they are all
behaving in real-time lock step. Might an alternative approach be to simply
simulate each chip _independently_ , in parallel, without synchronization, but
"somehow" ensure that their behaviour is exactly corresponding to real time,
so that time does not need to be "faked" by synchronization?

Yes, this is probably a recipe for disaster, and I have little idea what
mechanism could be used to ensure time accuracy, but just a thought. (Perhaps
an RTOS?) I also wonder what would be possible with FPGAs, whether
programmable logic might provide a better approach to emulating these chips in
synchrony.

------
Whodi
Great find. This is the most in-depth article I've read on this subject. I
actually use this as an open-ended interview question at the game company
where I work. The PlayStation 2 has a bunch of quirks with floating point
numbers because it doesn't follow the IEEE standard - for example, floats
don't become infinity when they overflow, they just get clamped to the maximum
possible float. Now you can't use your FPU for the tens of thousands of
floating point calculations that are happening per frame. Sure, your processor
is faster, but it's fighting with one arm tied behind its back.
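One way to patch just that overflow quirk on an IEEE-754 host, as a sketch (the real chip also lacks NaNs and denormals, which this ignores):

```cpp
#include <cmath>
#include <limits>

// Where an IEEE multiply overflows to infinity, clamp to the largest
// representable float instead, preserving sign.
float ps2_mul(float a, float b) {
    float r = a * b;                       // host IEEE-754 multiply
    if (std::isinf(r)) {
        float maxf = std::numeric_limits<float>::max();
        return std::signbit(r) ? -maxf : maxf;
    }
    return r;
}
```

Wrapping a check like this around every operation is exactly the "one arm tied behind its back" cost: the host FPU's result can't be used directly.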

------
formic_
I think the author is wrong.

It doesn't take a 3GHz machine to emulate a Ricoh 5A22.

Look how Nintendo has the Virtual Console working on a 730MHz PowerPC - the
Wii.

~~~
artmageddon
I don't think you're understanding the point of the article. Yes, emulators
have been around for quite a long time, but none of them are _perfect_, which
is why this guy undertook to create an emulator that requires a 3GHz CPU.

The Wii isn't going to be around forever: those will start to fail in a number
of years. In addition, not every single NES game is available on the Wii
virtual console, anyway.

~~~
JonnieCache
The emulators on the wii virtual console are not accurate at all, not in the
way that byuu is talking about. Also the games get heavily patched for that
purpose, as well as the emulator being patched per-game.

~~~
artmageddon
I think I may have only bought the Donkey Kong Country series of games for the
Wii VC. It worked well enough, but I imagine that there were indeed issues
with the emulation on the Wii. I didn't do any research on it so I couldn't
comment on that either.

The one thing I realized after the last few years is that you can't count on
game companies to do backwards compatibility forever. Microsoft
(understandably) gave up on patching the 360 over and over again to expand the
original Xbox library of playable games on the 360, and Sony stopped selling
the PS3 models that had the PS2 hardware directly on-board(not to mention
"Other OS" support, but I digress).

~~~
tedunangst
Back compat is only useful in a business sense in that consumers don't have to
decide between new console and no games or old console and lots of games. Once
the new console has lots of games, that simply becomes the only option.

