How I Fixed a 10-Year-Old Guitar Hero Bug Without the Source Code [video] (youtube.com)
383 points by adamnemecek on Sept 24, 2017 | 61 comments

An amazing technical accomplishment, and in addition, I love this teaching style. Doesn't assume any level of knowledge of reverse engineering, but moves quickly enough to be a teaser, not a tutorial. Visuals are painstakingly put together, with supplementary information flashed on the screen. A sense of humor throughout. If I were teaching a CS class, this would be a great thing to play.

This also demonstrates how much "human working memory" reverse engineering requires. No matter how much the tools try to assist, you need a tremendous amount of context to be able to make sense of what you're seeing. Really impressive work.

Man do I feel stupid every time I watch something like this

Don't. Been there, done that, it's counterproductive. You have to realize that the OP has probably put hundreds to thousands of hours into reverse engineering (it's a skill just like any other), which is why he can do something like this quickly. But note that what he's doing isn't next-level stuff. It's like being impressed with someone playing "Für Elise" (i.e. a task that anyone who's spent even a couple of months on it will be able to accomplish).

Check out https://godbolt.org/, you'll learn some assembly.

As an example, I once reverse engineered encryption for data files in a game. Took me about three weeks to determine which encryption algorithm was being used, and what the key was (including learning some ARM along the way).

But I was fortunate in that I could run the game in the debugger and see what was happening; the path from reading the data to decrypting it was rather simple to figure out, and the algorithm was pretty easy to spot. If I were to reverse engineer the actual asset formats, I'd imagine it would take months, as you need to know how everything is tied together in order to make sense of what is happening. In the case of Guitar Hero, this guy has clearly spent an enormous amount of time reverse engineering everything.

Yet at the same time, I struggle to think of one project or area of study where I have invested enough time to wow onlookers. Maybe someone who doesn't know that electricity exists would be impressed, but that's a significantly lower bar.

Does anyone have any recommendations for YouTube channels that focus on reverse engineering?

Fantastic, entertaining editing for what can sometimes be a dull topic to watch!

Fast paced and captivating. And the surprise ending, which turns from sweet to bittersweet, after what must have been weeks and weeks of work.

i wish i loved something or someone like this fella loves his guitar hero.

I wonder how some people have this sort of passion too.

Captivating! Anyone have other recommendations for blogs, YouTube channels, etc. about reverse-engineering cool stuff?

This video [0] has definitely made the rounds here before, discussing how to complete a level in Super Mario 64 with less than 1 full A press and in the process discovering parallel universes in the map.

[0] https://www.youtube.com/watch?v=kpk2tdsPh0A

I found this video on hacking old Thinkpad keyboards onto new Thinkpad models interesting. Required a lot of reverse engineering due to how the firmware in the Thinkpad maps keys:


This was fascinating and I have to admire the guy's reverse engineering skills. Very nice tool set up and crazy screen resolution!

Thank you for posting this. It was very interesting to watch. I love seeing other areas of CS that I don't get to touch in my daily job.

Another reason to hate DRM. Not only does it not work, it makes custom fixes harder.

I must admit my initial reaction was "What - Guitar Hero is 10 years old already?"

Very nice video.

Not just Guitar Hero, Guitar Hero III. Guitar Hero is almost 12 years old.

Guitar Hero is 10 years old?!

That's Guitar Hero 3, released in 2007, so yes, 10 years old. The original Guitar Hero is a couple of years older (released in 2005).

Ah sorry, I was thinking about Rocksmith.

Loved this!

I'd like to see more of this on HN: actually interesting technical content. The video is amazingly well put together. I don't know if it is sped up or not; my only experience with Windows nowadays is through my non-technical relatives' computers, which I'm asked to fix. But I was surprised at how quick Windows appears to be in this video.

But what bothers me is the bug itself. It is shocking to me that someone set a hard limit on a pool and didn't add any code for replenishing it. I would just never do such a thing.

One of the first programming environments I used was MicroWorlds LOGO. It had a cheaper version which allowed a maximum of 200 turtles, and a more expensive version that differed only in that the number of turtles was unlimited. My father got me the cheaper one, and ever since I've had a religious hatred of arbitrary limits in programs. But even if you didn't have this experience, I'd have a hard time respecting any programmer who would have left that pool unreplenished and called it a finished project. I accept that we all make mistakes and write bugs, but that wasn't a mistake; that was consciously cutting corners out of laziness and not even bothering to handle failure gracefully.

This limit is anything but arbitrary. There are a few things to keep in mind:

- This code was written for the Xbox 360, which only has 512MB of memory.

- Because the game was made for a console, it needs to go through the notorious Xbox certification process. This involves running the game for hours at a time at maximum load to ensure no memory leaks exist and that the game runs at a consistent 60 fps.

- Games are not the same as long-living applications. They are made on a project basis. There are constant trade-offs where good enough is the correct decision.

- This is a port, so there's care to be taken in reducing the footprint of changed code.

It's unfortunate that this particular bug wasn't found before release but I think the root cause wasn't necessarily this pool size being low but instead insufficient testing.

I worked on an Xbox title many years ago, when I was still a bit inexperienced, and one of the requirements was that it had to run the attract mode for a week without fucking up. The other guy working on it was even worse than me, so no prizes for guessing who ended up sorting that out. And what a huge pain it was!

There was nothing awfully wrong with the code, sure, much of which had been through numerous PC games, and a lot of which was shared with the PS2 version... but finding all the stuff that wasn't awfully wrong, just a bit wrong, and wrong enough to cause problems after running it for 3+ days solid, was surprisingly painful. But I figured it out. Even if I'm not the sharpest tool in the box, I'm persistent, and I've been served well over the years by my ability to bring this to bear on whatever problem I'm facing. So I did that, and it all got fixed, and it shipped like that.

Once it got signed off, I started working on another Xbox title that was nearing completion. But by the time that one reached the certification stage, about 3 months later, the requirement had mysteriously changed to just 2 days! I wonder why...

That sounds like a pretty painstaking process; I would've been a bit deflated to find the requirement more than halved after you were done! BTW, what is meant by "attract mode"?

Attract mode is where a game cycles between front end, intro video, rolling gameplay demo, and so on: https://en.wikipedia.org/wiki/Glossary_of_video_game_terms#A

(I expect what they had in mind was using it in shops, either running in the shop window, or as part of a counter display, and so on, and that drove the requirement that it had to run for a week.)

Holdover from the arcade days where games were competing to 'attract' players.

Ahhhhh makes sense, thanks!

Oh yeah. Hard limit pools are the bread and butter of console engines. You want to preallocate everything because fragmentation will murder you.

Source: ex-gamedev who's shipped titles on the X360, PS3 and PSP.
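For anyone curious what this pattern looks like in practice, here's a minimal sketch (all names hypothetical, not from any actual engine) of the kind of fixed-capacity pool being described: every slot is preallocated up front, acquire/release never touch the heap, and exhaustion is an explicit, visible condition rather than creeping fragmentation:

```cpp
#include <array>
#include <cstddef>
#include <new>

// Minimal fixed-capacity object pool: all slots preallocated at startup,
// so there is no runtime heap activity and no fragmentation. Acquire and
// release are O(1) via an intrusive free list threaded through the slots.
template <typename T, std::size_t N>
class FixedPool {
public:
    FixedPool() {
        // Chain every slot onto the free list.
        for (std::size_t i = 0; i + 1 < N; ++i)
            slots_[i].next = &slots_[i + 1];
        slots_[N - 1].next = nullptr;
        free_ = &slots_[0];
    }

    T* acquire() {
        if (!free_) return nullptr;    // pool exhausted: caller must handle it
        Slot* s = free_;
        free_ = s->next;
        return new (&s->storage) T();  // placement-new into the preallocated slot
    }

    void release(T* p) {
        p->~T();                       // destroy in place, return slot to the list
        Slot* s = reinterpret_cast<Slot*>(p);
        s->next = free_;
        free_ = s;
    }

private:
    union Slot {
        Slot* next;                                  // valid while on the free list
        alignas(T) unsigned char storage[sizeof(T)]; // valid while acquired
    };
    std::array<Slot, N> slots_;
    Slot* free_ = nullptr;
};
```

The unhandled `acquire() == nullptr` case is roughly the failure mode the thread is discussing: the pool runs dry and nothing replenishes it or fails gracefully.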

Interesting. Yes, fragmentation is a big problem on memory-constrained systems, so the memory pool approach is quite a nice solution. But did you use caching allocators, where memory could be allocated and marked as purgeable so it could be reclaimed when memory gets low? Just curious. By the way, do you know any good resources that cover memory management on consoles?

The problem with making pools purgeable is that you've just moved your memory fragmentation problem up a level in the stack, to the allocator. Now when you need to bring in a ~30MB level asset, you might have one of those cells sitting right in the middle again.

For reference, we usually split the PSP's memory 8MB/24MB between System and Video+Audio, so you'd have to fit everything in under 8MB. For stuff that was highly dynamic, we'd use Lua in a fixed block of 400KB. Usually that was enough to hold all the game state, and it could be reset from level to level.

The best public information I've seen on this stuff comes from Mike Acton; he's got lots of great material on Data-Oriented Design, which touches on this (and also on why caches are so friggen important).

Ah, that's very interesting, thanks for the explanation. I see what you mean about the fragmentation: when the cached blocks drop out, they may not necessarily be usefully reused, so you end up with memory looking like Swiss cheese again. Thanks also for the reference; I will go check that out now. This whole question of memory management on consoles sounds like a fascinating area for me to read up on! Cheers!

> It's unfortunate that this particular bug wasn't found before release but I think the root cause wasn't necessarily this pool size being low but instead insufficient testing.

Unless I'm misinterpreting the video, the bug only occurs if you modify the game to add extra custom songs. The bug won't occur with the original unmodified game data. As with so many things in tech, it works just fine for the intended use case.

You misinterpreted, yes. Custom setlists and downloaded songs are a big part of the intended use case of Guitar Hero.

Custom setlists were most definitely not a big part of the intended use case, as they were hacked in after release by modders.

Downloaded songs were part of the intended use case on the lead platform, but no official DLC ever came out for the PC version to make use of it.

Having a setlist with 200+ songs was probably never even tested.

This is a console game, and consoles are always starved for memory since it's such a huge BOM cost so they put as little in as they can get away with. This ends up screwing the console game developers who have to start accounting for every single byte.

So there are a lot of cases in games where they just run a script over all their assets and whatnot to figure out how much memory will be needed, then allocate fixed pools at startup. These programmers would love to just throw that stuff into a vector, but given its growth strategy you can end up wasting half your memory. Even worse, every element added to the vector might now cause it to resize, and when that happens during gameplay it can blow your 16 ms frame target. A fixed pool is as fast as you are going to get: no random chance your game is going to stutter, and no waste.
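To make the trade-off concrete, here's a small illustrative sketch (function name is mine, not from any engine) that counts how many times a growing std::vector reallocates its buffer, detected by watching the data pointer move:

```cpp
#include <cstddef>
#include <vector>

// Count buffer reallocations while pushing `n` elements one at a time.
// With reserve_upfront == true (the console-style approach: size it once
// at startup), the loop itself should never trigger a reallocation.
std::size_t count_reallocations(std::size_t n, bool reserve_upfront) {
    std::vector<int> v;
    if (reserve_upfront)
        v.reserve(n);               // one allocation, no mid-frame surprises
    std::size_t reallocations = 0;
    const int* data = v.data();
    for (std::size_t i = 0; i < n; ++i) {
        v.push_back(static_cast<int>(i));
        if (v.data() != data) {     // buffer moved: a reallocation happened
            ++reallocations;
            data = v.data();
        }
    }
    return reallocations;
}
```

Each of those mid-loop reallocations copies every element already stored, which is exactly the kind of unpredictable cost that can land in the middle of a 16 ms frame.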

Here is a great talk on this:


Thanks, I learned something, though if they really were accounting for every byte, I'd say that function call obfuscation really didn't help! But I understand what you're getting at.

It's possible that's only in the PC version. I don't see anything on Google about Securom on xbox, so this seems likely.

You have to bear in mind that it's not really a bug in the original game. This "bug" only affects people modding the game beyond the original design, and would never have caused a problem otherwise.

I'd agree that it's sloppy coding, but I can also see it from the original developers' point of view. Why check something you know isn't going to change?

I agree with you but I wouldn't call it sloppy coding.

> how quick windows appears to be in this video.

Your relatives probably have 2GB of memory and an i3 CPU or something similar.

Don't forget the spinning, humming & clicking rust they use rather than an SSD. One errant kid hits it too quickly and shatter goes those platters!

Have you ever taken apart a hard drive? I did and was shocked to find the platters were so strong that I had a hard time bending them with two hands.

A side note: a lot of "newer" HDDs (less than 15 years old, I'd guess?) have ceramic/glass platters that shatter instead of bending, so be careful to avoid hurting yourself if you try this.

But knock one while it's spinning and it would be game over for hundreds of sectors.

...and possibly the read/write heads.

SSDs have many advantages over rotary drives, but reliability is not one of them.

I've yet to see any reliable evidence that SSDs are less reliable than hard drives. Plenty of anecdotes, but those also exist aplenty for hard drives. Personally I've had multiple hard drives fail on me, but no SSDs.

At my work I have to get a new SSD every 6-12 months because they just fail after writing around ~100TB to them, or if they don't straight up fail they become slow and start giving you read errors. I can write even a petabyte to a normal HDD and it will still work fine, but SSDs have a finite number of writes, and if you do what we do, you're going to run out within months rather than years.

You might want to consider trying another manufacturer's disks next. None of the SSDs tested by The Tech Report failed anywhere near that early [1].

[1] http://techreport.com/review/27909/the-ssd-endurance-experim...

They are always either Samsung 840 or 850 PRO, so according to that test they should last much longer than they do. But they don't. I suspect it's the nature of what we're doing that's killing them: it's almost all random writes, very little sequential writing. Tech Report did sequential writes, so each cell was only written once per pass, while when I write 1TB of data to my drive, the majority of it might be focused in the same area. Sure, worn cells will get remapped, but the drive will run out of spare cells to relocate to after a while.

If you're using Samsung in Windows, make sure you turn on RAPID mode. Also, don't run any SSD almost full, it kills your speed.

Killing PROs in 100TB is an aberration.

I'm running 2x1TB in RAID 0, so Samsung Magician doesn't let me enable RAPID mode, it just says it's unsupported. And yeah, I never run them near full, usually about 200-300GB are completely free.

Besides, RAPID mode seems to destroy performance if you care about the number of operations per second (and I do, very much; I'm writing millions of files to the drive every day).


I've seen early model SSDs fail (eg 60GB and 120GB consumer SSDs years ago), but an 850 Pro will keep on going for a long time.

That thinking is a little dated IMO, but even 6 years ago:


(where SSDs' 'hotness' is more important than their 'craziness')

Platter Shatter would be a good name for a chiptune metal band

Come on, downvoting Abraham, it's a guitar hero thread!

>but I was surprised at how quick windows appears to be in this video.

SSD + disabling animations go a long way toward increasing "speediness"

Yep, I've recently set up a PC with an i5 750, so literally the first-gen i5: 4GB + Win 10, but running off an old 500GB hard drive. The machine was nearly unusable; things were taking minutes to launch. I ended up installing an older 60GB SSD, and now it's almost as usable as a modern computer: Chrome starts within seconds of clicking the icon, and everything is nice and responsive. I bet we could keep a lot of older computers going just fine if their HDDs were replaced with SSDs.
