“What Went Right and What Went Wrong”: Analysis of 155 Postmortems from Gamedev (microsoft.com)
95 points by winterismute on March 26, 2016 | 49 comments



The old post-mortems in Game Developer magazine were written almost in template form.

Went right: 1) Great team/culture. 2) Everyone put forth the effort (scheduled crunch). 3) Some amazing tech work or artwork or design work, etc.

Went wrong: 1) Didn't anticipate x/y/z. 2) Burnout (remember that crunch?). 3) Bad/lazy scheduling and/or capacity planning.

Kudos to the people who did this at MS, but you're looking at tainted data. Most game developers know what really went wrong and do not publicly broadcast it for fear of losing future contracts or publisher trust. In the end the postmortem becomes a sounding board for 'shoutouts' and cheerleading instead of raw, unbiased feedback.

I've been to all-hands post-mortems like that, and they can get really, really ugly.


I've been to an all-hands post-mortem where the director of engineering had to say "anyone with an opinion besides [name of director of engineering] sucks".

Said director of engineering literally watched us blow milestone after milestone, yet somehow the knowledge that we were going to miss our commits (and not by a little) never propagated upwards to the CEO. Either that or the CEO didn't want to listen; it could be any combination of the two.

Either way, the engineers were pissed.

Incredibly poor milestone management and a failure to triage didn't end up in our fucking learnings document somehow.


Exactly this. Only one game I've shipped had a Gamasutra postmortem but that one was so generic that it could have been applied to any other game, ignoring the specific issues that project had.


What are the real reasons for successes and failures? You left me hanging...


People making mistakes, of one type or another. In these post-mortems, though, mistakes are rarely attributed to any individual, or even a team.


> 9. CONCLUSIONS

>

> We find that we were able to identify both best practices and pitfalls in game development using the information present in the postmortems. Such information on the development of all kinds of software would be highly useful too. Therefore we urge the research community to provide a forum where postmortems on general software development can be presented, and practitioners to report their retrospective thoughts in a postmortem.

>

> Finally, based on our analysis of the data we collected, we make a few recommendations to game developers. First, be sure to practice good risk management techniques. This will help avoid some of the adverse effects of obstacles that you may encounter during development. Second, prescribe to an iterative development process, and utilize prototypes as a method of proving features and concepts before committing them to your design. Third, don't be overly ambitious in your design. Be reasonable, and take into account your schedule and budget before adding something to your design. Building off of that, don't be overly optimistic with your scheduling. If you make an estimate that initially feels optimistic to you, don't give that estimate to your stakeholders. Revisit and reassess your design to form a better estimation.


There's some world-shaking insight.

The best piece of advice I can give is to never, ever give an estimate to stakeholders until you've worked on the estimate first. And don't finish an estimate until you've actually built some small proofs of concept.

Your estimates will still be wrong, but they'll be much more accurate than some off-the-cuff number that your stakeholders will be building their various plans around.

It all sounds straightforward, but the hardest discipline for me as a developer is to keep from saying things in stakeholder meetings that make my stakeholders happy in the moment but that ultimately need more thought and effort. It's something I struggle with in every such meeting, with varying degrees of success.


I think this is a symptom of a fear of "I don't know." If I could change one thing in dev cultures, I would instill a healthy respect for the sentence:

"I don't know, but here's $how_I'll_find_out and here's $when_I'll_have_an_answer."


Exactly right. Too often I've seen people commit to a schedule where the milestones were of unknown difficulty. When cutting through a new jungle, you have no idea what is between you and your goal, so really all you can say is that the best you can hope for is X if this jungle happens to be like previous ones.

This is especially true in startup situations where you are learning technologies that may themselves not be complete, with people who are learning their own new things, to achieve a result which is only hypothetically possible. It drove my CFO nuts but rather than commit to a schedule for a big deliverable I would walk backwards from the end point and say, "These are the stops between where we are, and where we are going. We measure our progress by getting to each stop, but like a subway map there isn't a known amount of time between stops, only what the stops are." And he would come back with "well we only have money to get to this date, will it be done by then?" and then we would talk about the uncertainty between each of the milestones. At some point you can reach a common understanding of what the unknowns are and how discovering their value will inform on the difficulty of the next step.

That said, I've met managers who just say "Oh we will be done by %x date." and then basically worked the problem the same way I have.

If you're agile you can break down the intermediate stops as sprints but estimating the backlog is still the killer step.


It's not agile to list out 32 milestones on the way to completing a project and estimate them up front. It's agile to list 32 steps and estimate the first step, which you will still get wrong.


Skimming it, there seem to be much more specific details and insights in the body of the analysis.

The conclusion does not accurately reflect that... it reads more like a set of recommendations than a takeaway or a real summary.


Turns out software estimates are hard. Who knew?


I found the Game Outcomes Project to be much more rigorous (and anonymous) about what practices help and hurt gamedev:

http://gamasutra.com/blogs/PaulTozour/20141216/232023/The_Ga...


If you compare the conclusions, this article's research, while obvious, is closer to reality than the one that concluded crunch literally "makes games worse", despite the fact that every single game that made a significant critical/financial impact had a period of crunch.


> despite the fact that every single game, which made a significant critical/financial impact had a period of crunch

This argument doesn't necessarily disprove the notion. It's plausible to suggest that whatever impact they had was diminished to some degree.


It sure doesn't. My concern is that in this research they correlate the amount of crunch with the quality of a game and show that "better" games have statistically less crunch. Since I know that the games considered successful in the industry all involved a lot of crunch, it sounds like they use their own criteria for quality, which have nothing to do with the common meaning.


The recommendations at the end of the PDF seem to have been written by Captain Obvious. They are so generic they could apply to about ANY project, even ones unrelated to gamedev.


A friend of mine who just left full-time work in the game industry pointed out to me that it is 10 years behind on software development best practices. Churn and burn (as at many startups) is a daily reality.


I got into professional game development in the last few years and found that the state of good, modern software engineering practices in the game world is years behind, say, web engineering.

Games are still all closed source. Game engineers prefer to sell their components for a pittance (for example, on the Unity Asset Store) instead of collaborating on GitHub. They really, really hate writing tests. There's a deep reliance on manual testing. They still have a "ship" culture ("who cares about the code, as long as we ship by the deadline?"), disregarding the fact that games run live for years now. Multiple managers actively fought me on doing code reviews (I was new-ish, I wanted my code to be reviewed). I saw and worked on games that had no codified version control branching strategy. No coding standards. Multiple issue tracking systems. A pathetic grip on sharing code among projects. Four implementations of a state machine in one game. It goes on.
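
For what it's worth, the duplicated-state-machine complaint is about exactly the sort of small component that could live in one shared, reviewed, tested place. A minimal sketch of such a reusable machine (names invented for illustration, not taken from any particular engine or the games I worked on):

  #include <string>
  #include <unordered_map>

  // A tiny reusable finite-state machine keyed by string names.
  // Hypothetical sketch only; a real one would add enter/exit hooks,
  // hierarchical states, serialization, etc.
  class StateMachine {
  public:
      void AddTransition(const std::string& from, const std::string& event,
                         const std::string& to) {
          transitions_[from + "|" + event] = to;
      }
      void SetState(const std::string& state) { current_ = state; }

      // Returns true if the event caused a transition.
      bool HandleEvent(const std::string& event) {
          auto it = transitions_.find(current_ + "|" + event);
          if (it == transitions_.end()) return false;
          current_ = it->second;
          return true;
      }
      const std::string& Current() const { return current_; }

  private:
      std::string current_;
      std::unordered_map<std::string, std::string> transitions_;
  };

One shared copy of something like this, with tests, beats four divergent copies scattered across a codebase.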

I'd like to think it was just my employer's problem, but from talking to people who have been in games for a long time, it's endemic to the industry.

I'm now out of the industry :)


I still remember suggesting code reviews to my dev manager in 2003-ish. He didn't know what they were, and scheduled a meeting where he could review my performance, instead. Awkward.


If your friend was not very senior (10+ years experience), I would not believe what he says, since he probably did not have enough experience to judge the situation.

I think the idea that the game industry is "behind" other fields is kind of comical, given that games are some of the most complex software in the world, and big game teams have only a few hundred people on them, and meanwhile something relatively trivial like Twitter has 4000 people. It's true that game teams don't do a lot of Agile or TDD or whatever the next buzzword is, but that is because those things are mostly superstition and obviously don't work when you start attacking hard problems.

So if you are someone a few years out of school who learned TDD it is easy to say "games are behind, they don't do all the new stuff!!" while being unaware that almost all the new stuff is bogus cargo-cultism anyway.

I do agree that the game industry engages in unhealthy levels of crunch that are to its long-term detriment, but this is mostly an orthogonal issue to software engineering practices.


I was in the games industry from 1999 to 2012 (and then left to go to Twitter!) and I think his friend is absolutely right, the games industry shoots itself in the foot repeatedly by believing that it's a special snowflake full of "hard problems" that no-one else tackles and continually (and badly) tries to reinvent the wheel. The universal culture of deathmarches and the denigration of mainstream software engineering and project management techniques are not in any way coincidental.

(I loved the Witness, btw.)


If only game studios could raise money on negative profit margins instead of being forced to deliver milestones to be able to run regular operations... Every day I'd be writing unit tests and doing standups in between code reviews and updating my GitHub profile.


If you're comparing them to VC-funded startups, games studios run on negative profit margins until the game is released!


Some might, but it's not a common business model. Most game studios work for publishers, and they sell the game continuously in the form of milestones. Even a studio owned by a publisher has to continuously deliver or it will be shut down. Twitter, which I am comparing to, just gets money on the promise of future profits. Correct me if I am wrong, but I don't believe it has any software deliverables it has to meet every quarter in order to make payroll.


Twitter, now, sells advertising, which is 90% of its revenue. While it was VC-funded, which is really what you should be comparing it to as that's the prevalent stage of a startup, it raised money from VC funds by hitting non-contractual milestones in terms of MAU/DAU increases. I don't think that's particularly different to the milestone model that's prevalent in games - a third party pours money incrementally into the development of a product contingent on the mutually agreeable evolution of the product.

Unless you're in the habit of delivering RC-quality milestones (which is afaik unheard of in AAA development), the marginal value of a milestone is negative right up until the last one.


I am sorry, it seems that my point is completely lost behind snark. Let me put it straight then.

The argument in this thread is that the games industry is somehow behind the times and not using the best practices used at places such as Twitter. The question is: how do you know your practices are the best, and not the other way around?

Game studios go out of business if they don't deliver quality software on time and on budget. I know it first-hand, since I have been through several studio closures. The practices game studios use are tested through natural selection - if they fail at delivering software, they are out of business. To make things more interesting there is also competition: a good game decimates sales of the worse games released around the same time. And if the sales go below the projected ROI, it's game over. So it's not enough to be good enough to survive; you need to be better than the competition.

Trendy web companies don't sell software. They sell services, and the quality of the software they use is secondary. E.g. if I wrote a Facebook clone that was 100 times faster, used 10 times less memory, and was built with 1/1000th of Facebook's staff, it would not threaten Facebook. People would not close their accounts and move to my network just because I have better software. A web company is fine as long as its website runs semi-reliably.

So how come the battle-tested practices of the games industry are so bad compared to the practices of an industry that mostly sells advertising? What criteria do you use to compare?


Games are more like web products than you think, I'd say. If you come up with a game stack (3D engine, netcode, the works) that vastly outperforms a game like League of Legends, that does nothing to threaten them either.

I'm not siding with this notion that Twitter (or webdev in general) is ahead, mind you, but it's fair to say that neither websites nor games are about selling software. They're more about entertainment.


You are right, F2P games are a lot like the web: they sell a service, not software. Interestingly, these companies look a lot like web firms: thousands of people working on a product that barely changes and, I've heard, using the same programming practices. This is not what people mean when they talk about the games industry.


> something relatively trivial like Twitter has 4000 people

There is more to Twitter than the web app you type a dozen words into.

I don't know much about them (including whether 4000 is accurate), but presumably there's a lot to work on regarding international near-realtime messaging infrastructure, high availability, machine learning, sentiment analysis, smart advertising systems, yadda yadda. 4k does seem like a lot, but I doubt their suits are interested in needlessly hiring people to sit around picking their noses.


I would argue that crunch is at the root of good software development practices.

I've spent time in the industry and it's behind in practices, wages, and quality of life. There may be some smart, motivated people, but the industry as a whole has not grown well.


TDD might not apply as well to games as it does to the kind of software big corporations run for 15 or 20 years. It makes more sense when you're working with old legacy code and want to add features to it. Unit tests are definitely not an all-around solution, but they have their use cases.


> It makes more sense when you're working with old legacy code and you want to add features to it.

No!!

Testing is not merely about future-proofing (although that's a really nice benefit); it's about proving that your code actually works now. By writing tests and verifying that they pass, you can be sure that the code actually does what you think it does. Furthermore, you can ensure that when other parts of the code change, your code still does what you think it does, which might be an issue tomorrow, next week, or in ten years.

* Note: testing does not prove that your code is correct, only that it satisfies the conditions of the test. As with all engineering, testing is only as effective as the person implementing it.


Unit testing makes a lot of sense for game engine code, but less so for game-logic code, which tends to be highly volatile and specified only qualitatively.
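
Engine math, for example, has a precise, quantitative spec, so plain assertion-style tests are cheap to write and keep passing as the implementation changes underneath. A purely illustrative sketch (not from the paper or any real engine):

  #include <cassert>
  #include <cmath>

  struct Vec3 { float x, y, z; };

  // Returns the input scaled to unit length; leaves zero vectors unchanged.
  Vec3 Normalize(Vec3 v) {
      float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
      if (len == 0.0f) return v;
      return {v.x / len, v.y / len, v.z / len};
  }

  // The spec is exact, so the test both documents the behaviour and
  // catches regressions when the implementation is later rewritten.
  int main() {
      Vec3 n = Normalize({3.0f, 0.0f, 4.0f});
      assert(std::fabs(n.x - 0.6f) < 1e-6f);
      assert(std::fabs(n.y) < 1e-6f);
      assert(std::fabs(n.z - 0.8f) < 1e-6f);

      Vec3 zero = Normalize({0.0f, 0.0f, 0.0f});
      assert(zero.x == 0.0f && zero.y == 0.0f && zero.z == 0.0f);
      return 0;
  }

"Does the boss fight feel fun?" has no such spec, which is why game-logic code leans so heavily on playtesting instead.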


A lot of bugs in games boil down to asking: if the state is X, Y, Z, does A actually execute?


Maybe you would say that most web apps, no matter the complexity of their business logic, are "not hard", but I've seen TDD work quite well and be practiced by senior (10+ years) programmers to great success.


Well, they weren't that obvious to those 155 projects, it seems...


I would assume they were (to most of them), but knowing and doing are two very different things (e.g. everyone knows smoking is bad for you, and yet people still do it or start doing it).

So to me the far more interesting and helpful question to answer is: are there more efficient ways to keep people from making the same mistakes over and over, other than repeating what everyone already knows, over and over?

Identify the forces that drive people away from best practices most often and give us ideas and tools to tackle them.


The best approach is to apprentice with someone who knows not to make them. If you can't do that, check in with someone regularly who:

  1. Won't sugar-coat feedback,
  2. Has shipped games,
  3. Has a healthy sense of humility, and
  4. Is genuinely concerned about your success.

I really like sitting on advisory boards in this role, but my secret to being good at it is that I regularly ask for feedback from others who are smarter than me at the things I don't get (which is most things).

In the long run, it's like most things. It takes practice.

To the criticism that the game industry is behind the rest of the software industry in software engineering practices, I do believe the rest of the software industry can eat something phallic. :)

I'd like to see them pull off what we have to: about 10X the competition, massive game assets to juggle, a fraction of the budget, financing, and exit options, a tough frame-rate budget to meet (even more exacting now with VR), everything GPU-ish (when was the last time you wrote a shader, Mr. Website Developer?), and people with a huge range of technical skills requiring direct access to the source/asset repo.


Something about your last paragraph rubbed me the wrong way. Sorry.

> about 10X the competition

Steam added 1500 games in the first 7 months of 2015. Crunchbase had over 8000 investments in 2015. So I don't think that's true, even accounting for a Christmas peak.

> juggling massive game assets

Lots of data goes into games, lots of data comes out of websites. The difference to my eyes is that the web industry has multiple, principled, well-understood frameworks to deal with the data problem, and the games industry has... what? Maybe the state of the art has moved on in the last 5 years, but when I left it was all "well it's just a DAG, how hard can it be to write a tool to recursively build it".
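
And to be fair, the naive recursive version really is short; the pain is everything around it. A hypothetical sketch (all names invented, not any studio's actual tool):

  #include <functional>
  #include <stdexcept>
  #include <string>
  #include <unordered_map>
  #include <unordered_set>
  #include <vector>

  // Asset name -> assets it depends on (which must be built first).
  using DepGraph = std::unordered_map<std::string, std::vector<std::string>>;

  // Naive recursive build: post-order walk of the dependency DAG,
  // building each asset once and detecting cycles along the way.
  void Build(const std::string& asset, const DepGraph& deps,
             std::unordered_set<std::string>& done,
             std::unordered_set<std::string>& inProgress,
             const std::function<void(const std::string&)>& compile) {
      if (done.count(asset)) return;
      if (!inProgress.insert(asset).second)
          throw std::runtime_error("dependency cycle at " + asset);

      auto it = deps.find(asset);
      if (it != deps.end())
          for (const auto& dep : it->second)
              Build(dep, deps, done, inProgress, compile);

      compile(asset);  // in reality: texture compression, mesh export, ...
      inProgress.erase(asset);
      done.insert(asset);
  }

What the hand-rolled tools tend to lack is everything this loop doesn't show: incremental rebuilds, caching, distribution, and a canonical record of what was actually built.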

> a fraction of the budget, finance, and exit options

Being willing to work for less and with fewer resources is not a sign of good engineering practice. I don't know if I'd say it's the opposite but it's at best irrelevant.

> a tough frame rate budget

Not very different to needing to meet tough latency/error rate SLAs, except instead of failing TRC (or getting your exec producer to stare down MSFT/Sony), you'll just go broke. Oh, and you don't know what the SLA is, ever, so it's a continuous process of tradeoff.

> all issues GPU-ish (when was the last time you wrote a shader, Mr. Website Developer?)

Ignoring that some "website developers" (many of whom are not "Mr" - something else that the games industry does not do well at) do write shaders in the form of GPGPU optimization of ML tasks, obviously tasks differ. Let's not pretend that banging out something in Cg is PhD-level shit. It's not harder than writing a Spark job, for sure.

> having people with a huge range of technical skills requiring direct access to the source/asset repo.

You have Mr. Blow upthread sneering at Twitter for employing 4000 people to build something "trivial" but you think that each of those people has identical programming chops in every bit of the code they touch? If there's a single thing that argues against your idea it's this - the rest of the industry has become pretty good at modularizing areas of the business and auditing changes to them with code review, build/asset/deployment tagging etc. The last games company I worked at, someone stayed at the office until 5am before going on holiday for a week and sent an email that effectively said "I've rewritten quite a bit of the engine, you'll probably need to fix the build because I didn't test anything, see you in a fortnight". People gasp when gmail/twitter/facebook is down for half an hour but I've worked in places where there _wasn't even a canonical build_ that could be broken.


"Let's not pretend that banging out something in Cg is PhD-level shit."

Almost everything you implement in CG is "PhD-level shit". Implementing techniques from SIGGRAPH papers in your game engine is usually non-trivial. Game engines/games are some of the most complex systems in software engineering; the complexity can't be compared to web apps at all.


I think by CG you mean computer graphics? Cg was nVidia's shader language that compiled to HLSL or GLSL. Writing shaders is sometimes serious business but more often than not it's just as drudgey as writing CRUD pages.


Empirical research often turns up "no duh" results.

It also, often, doesn't.

In both cases, there is value in publishing.


>The recommendations at the end of the PDF seem to have been written by Captain Obvious.

Doesn't that simply mean that this research confirms other people's experience/gut feeling?


It simply means the research is not insightful.


PDF is on the right, for anyone else wondering what to do about an abstract that doesn't contain any spoilers.



In the past, Gamasutra has also published data indicating correlations between "biggest problems/biggest successes" and marketplace success.

The takeaways I remember from that: a good producer matters, a team of at least three people does better than one or two, and experience shipping stuff matters.


Apparently they didn't achieve any insights suitable for mentioning in the summary.



