
Booting the Final GameCube Game - phire
https://dolphin-emu.org/blog/2016/09/06/booting-the-final-gc-game/
======
starshadowx2
I think the best (funniest) part of this is that some crashes are now emulated
correctly.

>Easier access to memchecks means that Dolphin can accurately emulate well
known crash glitches in games without Dolphin itself crashing!

~~~
rocky1138
Normally, an emulator strives for accuracy. But since there are so few (if
any) Wii and GameCube games still coming out, the library is fixed. So I think
it might be a better choice to hard-code fixes for these known crashes in
specific games rather than emulating the crashes accurately.

~~~
lazerwalker
That's a philosophical question more than a pragmatic one: even if they can
manually hard-code crash fixes for specific games, is the role of an emulator
to provide cycle-perfect emulation or to provide the best possible game
experience?

~~~
hiddenkrypt
It depends on the philosophy that guides the emulation author. For example,
the creator of bsnes believes in accuracy over everything. He's proud that his
emulator will show the same bugs as the console when other emulators don't.

~~~
DSMan195276
I would also add that practicality definitely comes into play. For example,
there's a GBC emulator for the GBA that is written entirely in ARM assembly
because that was the only way to make it run fast enough. There is a certain
amount of accuracy that was sacrificed to make it work, but the fact that it
actually runs games at playable speed is notable in-and-of-itself. It probably
could have been done more accurately, but that doesn't _really_ matter to most
people if the games run at completely unplayable speeds, accurate or not.

~~~
derefr
One thing I always wonder about low-resource-usage emulation projects for old
consoles with fixed game libraries (e.g. GBA emulating GBC), is why you can't
just throw some heavy ahead-of-time analysis at the problem of statically
recompiling the _whole game library_ for the native destination architecture.
Why write an "emulator" at all, when you know exactly what code you want to
run, and know exactly what code that code should translate into when run on
the target CPU?

I guess a potential problem would be that this requires shipping an (IP-
infringing) copy of the recompiled game library with the emulator. But it
doesn't have to: instead of a library of ROMs, you could just ship a library
of hint files that turn the static-analysis step of the recompilation into a
few seconds of work, such that you still need an input ROM and the dest-ISA
program is only generated in memory.

------
blondie9x
People often complain about the legality issues surrounding emulation and ROMs
in general. One thing to remember about the proliferation of the ROM and emu
scene is convenience. People nowadays tend to appreciate having less physical
stuff and more digital content: streaming, Steam, etc. If Nintendo created
some sort of digital game marketplace, odds are good the emu and ROM scene
would deteriorate. The fundamental issue is the scene itself and how gamers
nowadays want to consume content.

Also important to remember is that oftentimes the older consoles and
cartridges no longer function well; the condition of the physical hardware
starts to deteriorate and, unfortunately, things wear out.

~~~
CobrastanJorji
Fun story about the legality problems. Way back in the day, I worked for a
company that created a "play old videogames" service. Our lawyers went out and
hunted down the owners of old Commodore 64 titles and such. According to the
internal rumor mill, some of these companies required us to convince them that
they really were the owners of these games due to a long series of acquisitions
before they'd agree to sell us the rights to the games.

Anyway, once we had the rights, we of course asked for a copy of the game.
Well, naturally, they didn't have one. And several of these old systems had
some form of copy protection. Well, fortunately, the 1980s demo scene had led
to cracked copies of these games being easily downloaded (bonus fun fact: I
may be the only person in history who clicked 'agree' on a 'you must have
written permission from the publisher to download this ROM' and meant it). But
of course the crackers had included crack intros which we certainly couldn't
use. So we ended up including the cracked version, with the cracks, and just
initializing the games to a memory state just after the crack intro played.

Without those crackers cracking the game, we might never have found a copy
that we could have published.

~~~
delroth
Nintendo is also known for distributing "pirate dumps" of their NES games as
Virtual Console titles.
[http://forums.nesdev.com/viewtopic.php?t=4412](http://forums.nesdev.com/viewtopic.php?t=4412)

~~~
tekklloneer
I read that as them just using the same file format as the piracy community,
so I don't see where the source of the dumps is specified.

~~~
phire
It's entirely possible that they used existing ROM dumping software on their
own cartridge which would have appended the header.

Not likely, but possible.

~~~
tekklloneer
Or they prepended the header themselves as whoever developed the emulator used
the de facto standard.

------
ponco
I love that this project documents its development so thoroughly and
entertainingly. I first used Dolphin 5+ years ago (craving some Double Dash)
and even then I remember being intrigued by the under-the-hood stuff.
Congratulations to them for reaching this milestone.

------
daenney
I love stuff like this:

> Because the CPU can't directly map the auxiliary RAM to the address space
> due to a missing hardware feature, the game has to read or write to an
> invalid memory address to invoke an exception handler. This exception
> handler would then use Direct Memory Access (DMA) to move data from the
> auxiliary RAM into a game designated cache in physical memory.

So essentially they're leveraging exception handlers to implement part of the
game, treating the exception as "desired" behaviour.

~~~
monocasa
I mean, your swap file does the exact same thing.

------
grenoire
Dolphin development logs NEVER disappoint.

------
kylek
>> Secondly, the GameCube only has 24MB (and some specialized areas) of RAM
across a 4GB address space, meaning most memory addresses have no RAM backing
them!

Is this common with consoles or anywhere else? I knew that there's very little
RAM on consoles, but is there a reason for having the large address space
available? (possibly convenience?)

~~~
kbsletten
I don't know for sure, but I believe that memory addresses are generally
represented as an unsigned integer of one of a few sizes. 24MB is well over
the maximum value for a 16-bit integer (approx. 65K) so they simply used the
next larger size -- a 32-bit integer (max size approx. 4B) -- and figured that
most of the address space being un-mapped wouldn't be a problem. This is
common in most computers because the addressable memory grows exponentially
with each additional address bit.

~~~
rocky1138
Yes, this reminds me of older computers. My Atari ST 1040 has something like
2MB of RAM but I'm sure the addressing is 32-bit, meaning that there are
addresses that it can't ever reach.

In fact, doing some research shows that 8 bits of the address are ignored on
the Motorola 68000, leaving a 24-bit memory address.

~~~
coldpie
I remember reading on HN that developers would use those unused bits to carry
more data around. There was so little RAM available that the unused bits in
pointer addresses were useful.

~~~
serge2k
That type of thing is still done today.

For example, in V8 they use the bottom 2 bits of pointers for other things.
They get away with it because objects are 4 byte aligned.

~~~
jburgess777
Last time I looked, the current 64-bit Intel CPUs only used the lower 48 bits
for userspace virtual addresses. This leaves 16 bits available if you want to
tag your pointers with some additional information. This can be useful for
saving space or for using them with an 8-byte cmpxchg.

This kind of trick is easier to get away with if you know exactly which CPUs
your software is going to run on. Otherwise your software could break horribly
when a new generation of CPU comes around.

------
raverbashing
I wonder if all this craziness in the SW game is an attempt at copy protection
or something else.

~~~
glaberficken
From reading the article I gather the intention was performance optimization
on the GameCube's limited hardware not copy protection.

~~~
gambiting
I have an interesting anecdote about GC development - I know of a game that
almost released on GC that would break the drive on every console it was
played on - a developer responsible for the loading code used an undocumented
method to increase the read speed from the optical drive, so that data would
stream in sufficiently fast for the game. Unfortunately, that was more than
the drive was designed for, and it would actually die after a few hours of use
that way. QC reported some of their test consoles dying, but Nintendo ruled
they must have been defective, and replaced them - and the game actually went
to manufacturing like that. The whole release was halted a week before it was
due because someone spoke to the guy and asked how he got the data streaming
to work quickly enough.

You could do something similar with PS1, PS2 and PSP - use undocumented
methods to overclock the CPU and make them run quicker.

~~~
Sniffnoy
Crash Bandicoot did something like that, didn't it?

~~~
msl
Crash did not really do anything weird with the CD drive, it just read from
the disc a lot. So much, in fact, that a single playthrough would hit the disc
more times than the drive was rated for (the drive was rated for some 70,000
reads, but completing the game required something like 120,000). [1]

[1] [http://all-things-andy-gavin.com/2011/02/06/making-crash-bandicoot-part-5/](http://all-things-andy-gavin.com/2011/02/06/making-crash-bandicoot-part-5/)

------
AceJohnny2
I'd love to see some of those games' original devs chime in on the how and why
of their hacks revealed by the emulation community.

It'd be a wonderful kind of postmortem.

------
chris_wot
It would be funny if they released an SDK and someone wrote new games for the
GameCube...

~~~
wolrah
As with most console homebrew communities, this is already a thing.

[https://github.com/devkitPro/libogc](https://github.com/devkitPro/libogc)

It doesn't seem like it's been heavily used on GC, but the Wii has a decent
homebrew scene.

Nothing really compares to the OG Xbox though, starting off in console modding
on that platform really ruined me for everything else.

~~~
bri3d
It's worth noting, though, that most OG Xbox homebrew (that I know of, at
least) was made using leaked official SDKs (XDK), not open-source
reverse-engineered SDKs.

IMO the PSP's homebrew community was the best without relying on leaked SDKs -
the open source PSPSDK is very good and was used in lieu of leaked SDKs by
almost all homebrew authors.

