QEMU  is an emulator used to run programs for one machine on another.
N64  is the Nintendo 64 games console.
SGI Indy  was a desktop SGI workstation from 1993.
UNIX: Actually Unix. Not an acronym. https://en.m.wikipedia.org/wiki/Unix
"In 1970, the group coined the name Unics for Uniplexed Information and Computing Service (pronounced "eunuchs"), as a pun on Multics, which stood for Multiplexed Information and Computer Services."
More specifically, from one processor architecture to another. E.g. running, on your desktop (usually an x86-based architecture), a Linux operating system designed and compiled for a Raspberry Pi (ARM-based architecture), despite the incompatible architectures. In this case they're running software designed for the same processor family the Nintendo 64 was targeting, which so happens to also have run a Unix OS known as IRIX.
edit: oh, says it right there in the Wiki
Nintendo is a little mysterious when it comes to what their actual tooling was, but I remember Donkey Kong Country being the first time I read they were using SGIs (or at least the studio "Rare" was).
It's somewhat surprising they used the Indy for developing Mario 64 – I always got the sense that it was somewhat lightweight in performance compared to the Indigo, but a very cool machine either way.
The Nintendo 64 had a MIPS R4300 chip, and the SGI Indigo also used MIPS R-series chips: the early ones had an R4000/R4400, the later ones R8000 and up. I can only speculate that by using an SGI, you could run some of your non-N64-specific code locally and debug faster.
The original PSX had an R3000 chip, but Sony opted for BSD: their devkit ran on FreeBSD PCs, and you built the code there and ran it on an actual PSX device. Cheaper...
BSD would've been a strange choice, as the PlayStation 1 debuted on December 3rd, 1994; FreeBSD 1.0 came out just 13 months earlier.
Some of the PlayStation 2 "TOOL" devkit machines ran Red Hat, which was a bit more mature by 2001.
The PlayStation 3 and 4, though, both run NetBSD and FreeBSD code under the hood.
Also, BSD != FreeBSD: 4.3BSD Net/1 (the first BSD released under the BSD license instead of containing AT&T code) was released in 1989.
By the time the Indy came out, the Indigo2 had replaced the Indigo, and I suspect a midrange Indy was a good match for a midrange Indigo1 (at much lower cost).
Nintendo made an N64 dev board for the Indy, essentially an N64 on a GIO board, complete with an adapter card to connect controllers.
The joke always was that the Indy was the Indigo without the go :-)
But it was a decent enough machine to develop on, you didn’t need the 3D stuff if you spent all day in Emacs or compiling. Whereas an Indigo was really targeted at say CAD users.
The Indy had XZ graphics available, which I believe were the same as the top Elan option available on the Indigo (4 GEs)
"We get the inside story on the legendary Rare with an all-star panel - David Doak (GoldenEye), Chris Marlow & Shawn Pile (Conker's Bad Fur Day), David Wise (Donkey Kong Country series) and Kevin Bayliss (Battletoads/Killer Instinct)"
What does this mean?
I think of QEMU as emulating hardware... What exactly is being emulated here?
Do you have a link to a comprehensive guide on doing this by chance? I was thinking tomorrow I’d just launch an arm instance in AWS and figure it out but I have a dual Xeon workstation at work (windows) that I might try as well.
You can skip the first part about creating the image, since presumably you already have one. So the process for me is something like:
    apt-get install qemu qemu-user-static binfmt-support
    cp /usr/bin/qemu-arm-static ~/rpi_mnt/usr/bin
    systemd-nspawn -D ~/rpi_mnt /bin/bash
Sometimes qemu shows an error saying some operation isn't supported, but this hasn't broken anything yet for me, even after I did a whole Raspbian Stretch -> Buster upgrade this way.
Now, if your ARM binary was compiled to look for libc at /lib/libc.so, but /lib/libc.so is the host's x86 libc, then that obviously won't work; and the easiest way to get the libraries all sorted out is to use a chroot with OS install of the target architecture. If you do go the chroot route, you need to make sure that `qemu-ARCHITECTURE` is statically linked, because it won't have access to the x86 libraries it needs to run after the chroot(2) call happens (which is why most normally-dynamically-linked distros have a "qemu-user-static" package in addition to their normal "qemu-user" package).
But with a multiarch scheme like Debian's, where all libraries get installed to /lib/ARCHITECTURE-TRIPLET/ instead of /lib/, it should be possible to install all of the appropriate target libraries on the host system without a chroot! You "should" just need to configure APT to let you install packages built for that architecture. (I haven't actually tried this; I'm not a Debian user, but I am envious of their multiarch support.)
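For what it's worth, the standard Debian multiarch configuration looks roughly like this (untested by me here; armhf chosen purely as an example target):

```shell
# Tell dpkg/APT about the foreign architecture, then install the
# target's libraries alongside the host's. They land in
# /usr/lib/arm-linux-gnueabihf/ rather than clobbering the host's /lib.
sudo dpkg --add-architecture armhf
sudo apt-get update
sudo apt-get install libc6:armhf libstdc++6:armhf
```

The `package:architecture` syntax is how APT distinguishes foreign-arch packages from native ones.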
Notable because the sgi indigo had a MIPS R3000A CPU.
Not quite. It means that they got QEMU to emulate IRIX's syscall layer on Linux. So you can run, let's say, a MIPS IRIX binary on x86 Linux without having to emulate the entire machine.
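For the ordinary Linux-on-Linux case, upstream QEMU's user-mode binaries already do exactly this; running an IRIX binary needs this project's patched QEMU, but the mechanism is the same (the sysroot path below is illustrative):

```shell
# qemu-user translates the guest's MIPS instructions and forwards its
# syscalls to the host kernel -- no full-machine emulation involved.
# -L points at a sysroot containing the target's dynamic linker and
# shared libraries.
qemu-mips -L /usr/mips-linux-gnu ./hello-mips
```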
Wine impersonates OS calls (including syscalls) but does not perform emulation on the binary itself. Wine can only run Windows applications written for x86, not Windows applications written for Itanium.
This appears to be running both hardware emulation on the supplied binary (which is what VMware/KVM/VirtualBox etc. do) as well as Wine-like OS impersonation.
I made up the word "impersonates" for what Wine does just to avoid confusion. It's not a word that's used in the literature AFAIK, although perhaps it (or a word like it) should be.
One thing I know (and can be seen in this repo) is that SM64 emulates a version of the NES/SNES "Object Attribute Memory", as a pure-software ring-buffer. (I'd love to know whether that carries on to later titles like Galaxy, 3D World, NSMB(U), Mario Maker, etc.)
You can trace the evolution of "LiveActor" all the way through until it ends up in Super Mario Odyssey.
Sunshine - https://github.com/shibbo/Corona/blob/master/include/actor/T...
Galaxy 1 - https://github.com/shibbo/Petari/blob/master/include/Actor/L...
Odyssey - https://github.com/shibbo/OdysseyReversed/blob/master/includ...
This architecture was so successful it ended up as the basis for all new Nintendo game development, so Breath of the Wild, Pikmin 3, Splatoon, and Mario Maker all use this new "Actor Library", or "al".
I have not looked at NSMBU, but NSMBWii uses a different core structure originally developed (as far as I know) by the Zelda team. I think it's mostly phased out these days, as is the set of "egg" libraries developed by the Mario Kart: Wii and Wii Sports teams.
I mean, you're right, it's not a literal implementation of OAM in the sense of controlling the same things OAM controls. I was speaking kinda metaphorically.
NES/SNES OAM was useful for reading back entity physics data (because it gave objects X/Y position registers), which meant that developers (incl. Nintendo themselves) often chose to rely on the OAM-object "components" of an entity as the canonical handle for tracking the entity in the game physics (rather than having a table somewhere in work RAM of separate "physical" components for entities). Games like SMW literally just index a table of actor behaviors off the OAM-object's name-table data; what an entity "is" from the game's perspective is determined by what it currently looks like!
Since the OAM had a finite size, this reliance on OAM for tracking entities forced games into a structure where entities' lifetimes are coupled to the lifetime of their OAM-object representations. Which meant that every NES/SNES game relying on OAM to track entities needed an algorithm for dynamically allocating OAM-object slots to entities; and so, for evicting entities if OAM was exhausted. (Level design was done with a hard eye for avoiding OAM "thrashing" by keeping entities spaced apart, but the system still needed to be able to handle the case where mobile entities ended up following you and piling up.) Which brought into existence the common OAM LRU cache-eviction algorithm—i.e., the practice of "despawning" the oldest off-screen entities when new on-screen entities need OAM slots.
This determined a lot about the design of these NES/SNES games. It made mobs in these games into things that would lose their state whenever they were scrolled "far enough" off the screen; which in turn forced a design where—rather than a level just running a "start script" that would spawn entities at initial positions, tracking them in RAM from then on—you instead had to adopt a hybrid approach where entities had both an OAM-object representation, and also an associated "spawner" (usually existing just as static level-data in ROM, though sometimes coupled to a bitflag tracking destroyed spawns) that would trigger [re]spawning for the entity.
SM64 is essentially "emulating OAM" in the sense that it assigns entities handles in a fixed-sized buffer, and then uses a very OAM-like logic (basically, "memory pressure" on this buffer) to decide when entities should be de-spawned; and then uses spawners to recreate entities that have been de-spawned due to this memory pressure (meaning that most entities don't "exist" until you get close enough to them.)
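A minimal sketch of that OAM-style pressure mechanism, purely for illustration (class names like `EntityPool` and `Spawner` are invented here, not SM64's actual identifiers, and real eviction logic prefers off-screen entities):

```python
class Spawner:
    """Static level data: knows how to re-create its entity on demand."""
    def __init__(self, kind, x, y):
        self.kind, self.x, self.y = kind, x, y
        self.alive = False  # is our entity currently in the pool?

class EntityPool:
    """Fixed-size slot buffer; entities are evicted under memory pressure."""
    def __init__(self, size):
        self.size = size
        self.slots = []  # oldest first

    def spawn(self, spawner):
        if spawner.alive:
            return
        if len(self.slots) >= self.size:
            # Memory pressure: evict the oldest entity. Its state is
            # lost (OAM-style); only its spawner can bring it back.
            evicted = self.slots.pop(0)
            evicted.alive = False
        self.slots.append(spawner)
        spawner.alive = True
```

With a pool of size 2, spawning a third entity silently de-spawns the first; approach it again later and its spawner re-creates it from scratch.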
SM64 didn't need to do things this way; the N64 has enough RAM to track all the entities in every SM64 map at once, IIRC. They chose to impose this constraint artificially, in order to continue to build SM64 levels according to the design philosophy they had "discovered" due to the original constraints of the OAM system.
Later games in the Mario series, if-and-when they choose to have this de-spawn/re-spawn tracking feature†, are essentially "pretending to have OAM", but not really emulating it the way SM64 does. For example, Mario Maker de-spawns entities when they're scrolled sufficiently far off the screen, in a way that mimics OAM sufficiently well that re-spawning and enemy spawner semantics still work—but which isn't really an OAM-like system, in that there's no static buffer with memory-pressure causing de-spawning (and in fact, as long as the entities are willing to squeeze into one visual screen, existing entities will never be forced to de-spawn.)
† You could get a very interesting analysis of the way Nintendo probably internally divides/project-manages the Mario games, by just determining which titles "emulate" OAM the way SM64 does; which titles loosely mimic OAM, like Mario Maker; and which titles don't even bother with de-spawn/re-spawn tracking at all, but instead have persistent physical entities that just "go quiescent" when they're out of sight. (IIRC there's no Mario title that uses the fourth option—pure view-frustum culling of distant models that continue to "tick" while culled.)
I don't know how the SNES worked, but AFAIK most NES games did not track objects in this way. Instead, the game engine maintained its own buffers containing object state and copied necessary information to OAM every frame.
OAM only stored graphics state for the rendering hardware, which is not a convenient form for the game engine for a number of reasons. For instance, objects are nearly always composed of several OAM sprites placed next to each other, objects that are not visible during a given frame are not present in OAM, and a single animated object can switch between so many different graphical forms that it would be complicated to identify which object corresponds to a graphics tile from OAM. Additionally, OAM doesn't have extra room for non-graphical object state (like behavior timers or velocity information).
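For concreteness, a sketch of the NES OAM entry format being described: 4 bytes per hardware sprite (Y position, tile index, attribute flags, X position), 64 entries total, with no room for any gameplay state. The helper names below are invented:

```python
import struct

# NES OAM: 64 hardware sprites, 4 bytes each (256 bytes total).
# Per-sprite byte layout: Y position, tile index, attributes, X position.
# Note there is nowhere to store velocity, timers, or HP -- only what
# the PPU needs to draw the sprite.

def pack_oam_sprite(y, tile, attr, x):
    return struct.pack("4B", y, tile, attr, x)

def unpack_oam_sprite(data):
    return struct.unpack("4B", data)

# A 16x16 game object (a "metasprite") needs several adjacent 8x8
# OAM sprites, which is why one object maps to multiple OAM entries:
def metasprite_2x2(x, y, base_tile, attr=0):
    return b"".join(
        pack_oam_sprite(y + dy, base_tile + i, attr, x + dx)
        for i, (dx, dy) in enumerate([(0, 0), (8, 0), (0, 8), (8, 8)])
    )
```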
Nintendo is quite clearly second-to-none on the design/creative end, how much does that translate to the technical aspect of game development? Speaking purely in terms of software.
I find this particularly interesting in the context of a company that appears to retain many of the same programmers today as they did 30 years ago, when software development was obviously much different.
(Side-note: I've always wondered how the mini-games in IS's WarioWare series work—whether each game is entirely custom code, or whether they've come up with some sort of DSL for specifying reflex games. If the latter, I would bet that that has a decent genealogy too.)
FTFY. I don't think the RPG elements of SPM should be ignored; the game plays very differently to any of the other Mario platformers.
It may not be to everyone's tastes but to simplify the matter for the sake of a quick jab is hugely unfair, especially given it has one of the most touching stories in the Paper Mario canon.
Picture a background jobs system like Sidekiq/Resque. Imagine that one worker node of this jobs system had a fixed-size ring array of jobs it had taken. Now imagine that you could push new jobs onto a specific node, and that the worker node responded by not just overwriting one of the filled slots of the local jobs set, but actually ACKing said job to drop it from the global job-queue system. It's destroying a real entity with persistent global identity in order to reclaim the slot that the local representation of that entity takes up.
That's what OAM is, when combined with the design pattern I'm talking about. It's a ridiculous system that'd never fly in a business; but it happens to work for games, where you control the world such that you can make the world hold "reminders" for the state you destroyed.
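The analogy can be sketched roughly like this (hypothetical classes, not Sidekiq's or Resque's actual API; jobs stay pending until ACKed, at-least-once style):

```python
class GlobalQueue:
    """Jobs here have persistent global identity until ACKed."""
    def __init__(self, jobs):
        self.pending = list(jobs)

    def ack(self, job):
        # ACKing removes the job from the global system entirely.
        if job in self.pending:
            self.pending.remove(job)

class WorkerNode:
    def __init__(self, queue, slots):
        self.queue = queue
        self.slots = slots
        self.local = []  # fixed-size local job array, oldest first

    def take(self, job):
        if len(self.local) >= self.slots:
            # The "ridiculous" part: to free a local slot, destroy the
            # oldest job's global identity -- like an entity evicted
            # from OAM, only a "spawner" elsewhere could bring it back.
            victim = self.local.pop(0)
            self.queue.ack(victim)
        self.local.append(job)
```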
"Miyamoto: We were using the Super Mario 64 engine for Zelda, but we had to make so many modifications to it that it's a different engine now. What we have now is a very good engine, and I think we can use it for future games if we can come up with a very good concept. It took three or so years to make Zelda, and about half the time was spent on making the engine. We definitely want to make use of this engine again."
I've noticed spillover effects into Japanese gamers as well -- people being suspicious of or derisive about mods, even when they're perfectly legal and the game has built-in mod support (looking at you Monster Hunter World).
My (Japanese) girlfriend is on the very conservative side of the spectrum there and absolutely hates it when I bring up any kind of modding, and so do her friends -- the culture of "authorial intent is king" is very strangely strong for a culture that also appreciates and enjoys doujin.
N64 emulators are all pretty bad (inaccurate, use a decent amount of resources), and upscaling is relatively expensive; at least, way too expensive for an RPi to handle.
It will work fine as an emu at 240p though
Usually after you get source releases to games, you get people that port them to different platforms. Like how we had Doom on iPods and Kodak digital cameras.
I haven't tested N64 games on an RPi personally, but I imagine it would have no trouble with them, and there seem to be several retro-gaming projects that involve N64 games and use the RPi.
Pi format is still fun to tinker with and I encourage you to get one if you're at all interested. The 3b+ just wasn't the right tool for the job in my case. I haven't tried the pi 4, however.
Like Panzer Dragoon Saga.
Of course you can reverse an optimized binary: just launch IDA and start exploring to get an idea of the work involved. Doable, but of course harder.
Nintendo still has the source to SM64.
I’m saying this same decompile process could indeed be done to any released game where the source code is lost, because that is effectively what happened in this case.
On the technical level, you are correct in that in both scenarios the end result would be the same, as you are going from compiled code to decompiled code.
What I believe the parent is saying, is that applying this to Panzer Dragoon accomplishes more (on the human level), because devs of that game don't have the original source code anymore, while Super Mario 64 devs do.
> It would be great if this could be done...
codesushi42 would have said (emphasis mine):
> It would be great if this would be done...
Not that I think it was incorrect as it was, just a little ambiguous, I guess.
Also, I would agree more with your point if the parent said "It could be great if this could be done..." instead of "It would be great if this could be done". The first "would" seems to indicate to me pretty clearly that the parent was talking about a request rather than ability.
in the case of something like the glass of water, "could" makes the sentence more indirect, and more polite.
the original post is "it would be great if [huge task undertaken by unspecified persons] could be done". this native speaker would not attempt to polite-ify a request for something like that (and i don't think other native speakers would either), so the original post can't be making a request. it is expressing a hope that the thing is possible. mburns (reasonably) then explains that it is possible. then codesushi42 sort of goes on the rails, and i can't figure out what they're attempting to convey at this point.
Context is important. Why bother decompiling a game if you have the source already? Of course I meant decompiling games for which there is no source code available on any machine. Nintendo has the source for SM64.
What a ridiculous load of pedantry.
A misunderstanding occurred. The misunderstanding was clarified, acknowledged, and explained. I'm not sure it contributes anything to make accusations of pedantry.
That's why I said I didn't think it was incorrect, but merely ambiguous. The use of "could" could also be interpreted as talking about the physical ability to perform the action. That was precisely how mburns seemed to have interpreted it. My comment was merely trying to clarify your explanation with a simpler version that tried to eliminate the ambiguity that was probably the source of the confusion.
I got an upvote for that comment. Maybe it was them and it did work.
> "It could be great if this could be done..."
That sounds like one wouldn't be sure if it would be great or not. I don't think anyone meant or interpreted that.
Old style: "If I had tried, I would have succeeded."
New style: "If I would have tried, I would have succeeded."
The extra "would" style used to be restricted only to adding strong emphasis, as in "if you would just LISTEN to me...". Slowly, this extra "would" has crept into other areas, like replacing the subjunctive as in the example:
Old style: "It would be great if this were done."
New style: "It would be great if this would be done."
The new style is "incorrect" English as of a couple of decades ago, but its usage is increasing. It still sounds terribly wrong to my ear, but what determines whether grammar is "correct" is the way in which people actually speak.
Disks fail. Machines are thrown away. Who knows how it happened, but it isn't a rare event, sadly.
But not much has changed, I guess it's hard to make progress in a month.
> This is evidence of a removed second player, likely Luigi.
> This variable lies in memory just after the gMarioObject and
> has the same type of shadow that Mario does. The `isLuigi`
> variable is never 1 in the game. Note that since this was a
> switch-case, not an if-statement, the programmers possibly
> intended there to be even more than 2 characters.
And more results when searching for "luigi":
(The additional characters in the DS remake were horribly unbalanced, so I wonder if the earlier implementation would have been better...)
I love these instructions!
Also, I'd love to see this converted to a native executable. I wish Nintendo would actually allow that, although I'm sure they wouldn't.
I cannot figure out the right keywords to find it again, but you may be able to if you are interested.
EDIT: Even though I can't find the video anywhere (I promise it existed!), from https://warosu.org/vr/thread/5644072
"To answer your questions, yes: This is a full source code which can be recompiled with modern toolchains (gcc: ive done this already) and even target other platforms (PC) with quite a bit of work. There already exists some proof of concept wireframe stuff."
That's a shame!
Copyright is not purely literal, especially when it's copyright of computer code...
Part of what makes this game such a watershed moment for 3D gaming is how the controller was designed to maximize its potential.
To this day Mario 64 is one of the best games ever made.
Also there are N64-to-USB "controller converters" available on e.g. Amazon, and iNNEXT has a retro 64-bit N64 controller on Amazon as well.
Don't know if any of these will work for you; at least the iNNEXT one is mentioned in the RetroPie wiki.
Am I being downvoted because I don't have an optimal solution to some problem?
Most of the other "assembly" files are for data, like the level scripts. It's not assembly of machine code.
You could try to put those into C, but you're not gaining much--assuming that it's even something that can be represented in C without a bunch of fancy compiler specific tricks. You'd be better off creating a DSL or a custom program suite, which is probably what Nintendo was doing 25 years ago.
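As a toy illustration of the "level scripts as data" idea, here is a minimal bytecode-style interpreter. The opcodes and layout are invented for this sketch; SM64's real level-script commands differ:

```python
# Toy level-script interpreter: the engine walks a flat array of
# commands, the kind of thing those "assembly" data files encode.
# Opcode numbering and operand layout here are made up.

SPAWN, SET_MUSIC, END = 0, 1, 2

def run_level_script(script):
    state = {"objects": [], "music": None}
    pc = 0
    while pc < len(script):
        op = script[pc]
        if op == SPAWN:            # SPAWN, kind, x, y
            _, kind, x, y = script[pc:pc + 4]
            state["objects"].append((kind, x, y))
            pc += 4
        elif op == SET_MUSIC:      # SET_MUSIC, track
            state["music"] = script[pc + 1]
            pc += 2
        elif op == END:
            break
    return state
```

A DSL or tool suite would then be whatever generates these command arrays, rather than hand-writing them as assembler directives.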
More than likely SM64 was written in C with some performance-critical parts in ASM (like mode 7 and some of the OAM stuff other threads talk about).
Let me just say this. Although not a complete restructuring, it's a TON more readable than a bog-standard decompilation of the ROM. This is something you'd know if you spent 10 minutes reading it.
Mods! Much easier to modify source code than a compiled binary.
What I am saying is that it doesn't surprise me people do these projects, what surprises me is that enough people care about them for it to make the front page of HN.