John Carmack discusses the art and science of software engineering (2012) (uw.edu)
145 points by dshankar on Sept 13, 2015 | 25 comments



Very good essay. However, Carmack's view that there is very little quantified info about what works and what doesn't overlooks the work of software engineering practitioners such as Capers Jones, who has long studied projects quantitatively and published detailed statistics on the effectiveness of different development approaches. His book "The Economics of Software Quality" is an excellent summary of his findings across thousands of projects.

Reading it is like the first time you run a profiler: you realize that what you thought you knew is true in part, but that other factors you hadn't counted on have an outsized effect you'd misestimated.


It's shocking how immature software engineering is as an industry. There is a ton of work that hasn't been done and a ton of information that we just don't have, but even so there is a stark lack of adoption of already well-known and well-documented best practices. We very much are not trying our best.


I agree with you, but I also think we need to understand why this is happening. We know how to build safe factories, yet people keep getting hurt in them, and once in a while they collapse entirely. If you go to emerging economies, or even just compare northern versus southern Europe, you'll see that there is no inherent reason why things can't be more sophisticated other than culture and, possibly, the status quo. It's overwhelmingly that the sophistication in knowledge, across the board from worker to lawmaker and everyone in between, doesn't keep up with the demand backed by economic incentives.

The widespread solution proposed for software seems to be making software engineering more rigorous. While I agree with that to some extent, it still largely avoids the problem. Not only do we know that even the best programmers currently can't consistently make programs that avoid catastrophic failure (e.g. compromising data), but what we currently demand from software doesn't fit well into the field of engineering. John Carmack hasn't made a leading game in a decade, because technology isn't enough anymore. It's games like Minecraft, creative with technical problems, and Call of Duty, almost cinematic entertainment, that are pushing the boundaries.

I think to a large extent we missed the opportunity to build a good foundation in computing: the Internet in terms of decentralization and security, software in terms of process and robustness, and probably many more examples I don't know about. Partly because the field (and demand) is moving so fast, but also because of the culture in software. I'm always surprised how conservative (not in the political sense) people in software are, and how we collectively seem to agree, despite slogans like "software eats the world", that it's still a niche thing.


The reasons are somewhat complicated, though some are straightforward. There is a ton of money in software, and a ton of people who want to be devs. Which means that even if a company makes a lot of mistakes they can still be profitable. And even if they burn out or drive away a ton of talent there is still no end of bodies willing to try to do the work, they just may not have the same level of skills.

I think rigor is an important part of the "solution" but in a way that's too easy. It's simplistic to just say "do better", which is sort of what a call for rigor is. Ultimately I think it's not rigor but just seriousness which is called for. Right now the industry is downright delusional and avoiding even talking about some of the fundamentals of software engineering as a practice. So many people want to feel like they're on the bleeding edge, rather than struggling through the mire (which is the reality). We have to get to the stage where people are talking about these things first before we get to the stage where we can do anything about them.

Right now the industry is in rough shape. And the industry doesn't realize this, mostly because there's still so much money (and money is a salve for many wounds) and because not long ago the industry was in crisis, so we're still seeing improvement from that state. Merely producing anything of value from a software project is still a big deal, and not a sure thing. Moreover, shipping any major product or service without relying on "crunch mode" or a death march is a rarity in the industry these days.

But we can do better. Not just with rigor but with how people use their time and do work. The norm is still too close to everyone running around in firefighting mode all the time, while technical debt blooms and burnout looms. A better mode is to work at a slower pace, get the right work done at a higher quality level, and still maintain good development velocity because you aren't slowed down by your technical debt all the time. There are some, small, pockets of the industry where that's the norm, but it's still a hard sell. At the end of the day the default is still to fall back to the ridiculous notion that software dev is like factory work and you get more bang for your buck by just adding more bodies and by having more butts in seats (or fingers at keyboards) for more hours per week.


I've run into the same opinion from other developers: "there is very little quantified info about what works and what doesn't". In my experience they are looking for the wrong answers. They want an answer for how to design software: how to define the entities, classes, functions, etc. Quantitative software development discusses less of design, IMO, and more of organizational factors, e.g. how a company's organization and processes impact quality. This is not what practitioners expect. Yet this is where the science of software development is. Thanks for the book reference! An economic perspective on software quality sounds enlightening.


Now I am curious as to whether there has been any work in quantifying (or at least studying) the impact of processes vs engineering (in the sense of design and development) on quality and long term maintenance of software.

My google-fu is clearly failing me.


Thanks for the reading suggestion!

That said, it seems that his last practical experience in the field was more than two decades ago ( https://en.wikipedia.org/wiki/Capers_Jones ).

A criticism of that sort is available to other folks as well--I feel compelled to ask, "Interesting results, but what have you shipped recently?"


Don't you think this is a bit like not listening to coaches or sports medicine experts because they haven't been professional players in two decades?


Well, when software engineering is as good for one's personal social life as being an athlete, perhaps I'll buy your argument. :)

More seriously, I see what you're saying--that said, there is a huuuuge difference between how we develop software now versus even five years ago, much less ten or twenty.

Github. Stack Overflow. Agile methods (ugh). Lean methodologies. Various testing methodologies. Changes in the funding structure of companies. Order of magnitude increase (comfortably) in debugging tools for web and network applications. The entire single-page browser app movement. The entire GPGPU movement. Truly convenient and open collaborative coding platforms.

The field, and the people in it, have changed a great deal, and we've learned a lot. I'm increasingly skeptical of professors of professional software engineering who don't, as a day job, professionally engineer software. This applies to other folks--and I mean this nicely--like Uncle Bob or Martin Fowler as well.

Folks like Carmack who have been pushing the envelope and getting scarred the hard way for 20 years are probably simply better sources of learning.


TDD and agile are both about 15 years old, and both built on well-established pre-existing development patterns. Github doesn't change many of the fundamentals of how software improves in quality or how development works.

What I've observed in the industry has been a shocking lack of self-awareness about application of best practices, and decidedly half-assed attempts to improve. The state of the industry is not dev shops on the bleeding edge of the best tools, best methodologies, best policies, etc. It's almost universally a tale of failing to even get to "ok" in terms of well-known best (or even better) practices. There are a ton of dev shops that don't do any code reviews at all, and formal code reviews (which have been shown to be one of the best methods for improving code quality) are almost unheard of. Even at places that take testing seriously or do TDD they still typically don't do it very well.

That's why folks like Uncle Bob and Martin Fowler continue to have so much traction in the industry, because simply following good advice that was old 20 years ago is still a huge step forward for the average, or even above average, dev shop in the industry today.

Saying "we've learned a lot" just tells me that I shouldn't take you seriously, because clearly we haven't. Software development is still a shambles. Security is still a nightmare. Performance is still a nightmare. Quality and robustness are still a nightmare. Work/life balance is still a nightmare.

Books like "The Mythical Man Month" (which is 40 years old!) still hold a ton of lessons that have yet to be taken to heart by the majority of the industry.


"but what have you shipped recently?"

Looks like he has shipped 14 books and at least 36 papers on software engineering methodologies.


Why is that relevant?

It's really only relevant when someone is passing down personal wisdom based on their own experiences (like John is).

Capers is conducting and providing research findings and so we should care about his research abilities, and the evidence he presents. His personal software development abilities aren't important.


I think that it is completely reasonable to question the applicability of any operations research conducted by people who aren't themselves intimately embedded in the current practice of the fields they're commenting on.

I believe that because I find there to be a lot of dynamics in software development that are seemingly completely unaccounted for in the literature. I admit that this may be because I'm unfamiliar with the current edge of the research being done, but then again, research does tend to lag rather far behind the state-of-the-art, and our field moves quickly.


"and our field moves quickly."

Is that really true in terms of organisational structure, communication patterns, team sizes, management structure, or best practices?

Tooling has improved significantly across the board but, for the most part, it has improved and refined existing patterns and processes rather than enabling radically different/novel practices.

The two major differences, IMO, over the last 15 years have been in 1) remote work and 2) the ops space.

On 1) Better tooling has really made remote work viable in many places it didn't use to be. And many companies are starting to take advantage of it.

On 2), the real improvement here is the power shift from ops to development. When I first started working in software I was frequently blocked from experimenting by the ops team. These days I'll fire up a few dynos, try out my idea, and then, if it works, bring the ops team on board. The workflow itself has significantly changed.


I feel compelled to ask, do you think the fundamentals of software engineering have changed so much?


Of course!

First of all, we still haven't settled on a set of common abstractions and tools for the industry--so much reinvention happens that most projects can almost be considered bespoke every time. There is no common and simple way of estimating development time. There is no common and effective way of accrediting engineers. There is no commonly-understood legal liability and responsibility for software engineers. Frankly, any schmuck who decides to smoosh jQuery libraries together can claim to be just as much of an engineer as a high-end PhD or a person with decades of experience--and the nature of the work is such that the schmuck may deliver more value than their more experienced counterparts.

That said, the technology on which we deploy has changed a great deal in the last forty years. Orders-of-magnitude increases in processing speed, secondary storage, and memory have made a lot of formerly unthinkable approaches commonplace. The problems we face in terms of networked systems and security concerns are far different and more ubiquitous than they've ever been.

If the fundamentals of software engineering haven't changed, we shouldn't take ourselves seriously at all.


That doesn't speak of advancement, it speaks of stagnation for the most part.

And sure, technology has changed, some ways of doing things are different, but the core software engineering fundamentals haven't changed that much.

Take a classic book that is definitely showing its age, Code Complete, and look at what that book has to say about what the fundamental nature of software projects is: managing complexity. That's still true and will always be true, that's what software is all about. As I pointed out in another comment, you could go back even further to a book like The Mythical Man Month and find tons of lessons that are still applicable today. Because despite all the technological advancement that's happened, the fundamental practice of software engineering has not advanced much. "Agile" is just a way to make sure that a team is able to do anything at all and actually ship something that works and is not miles away from what the customer wants. Agile isn't an advanced technique, it's a rudimentary technique to rescue potentially failing projects/teams.

And security? Security is more important than ever. But what does that mean? That means that dev shops follow, oh, say 10% of the most important security best practices instead of 1% or 0% as they might have done 20 years ago. And it means you have to protect against a few specific kinds of attack vectors that were previously unknown. But that's not a sea-change in the way things are done, it's just incremental changes to details.


"I would like to be able to enable even more restrictive subsets of languages and restrict programmers even more because we make mistakes constantly."

This seems to be very much the direction Rust is going in.
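As a small sketch of the kind of restriction Carmack is asking for: Rust's borrow checker statically forbids whole classes of mistakes, so the error surfaces at compile time rather than in production. The commented-out lines below are an illustrative example of code the compiler would reject (the error code shown is E0382, "use of moved value").

```rust
fn main() {
    let data = vec![1, 2, 3];

    // Shared, read-only borrows of `data` are allowed:
    let total: i32 = data.iter().sum();

    // But the compiler rejects use-after-move outright, e.g.:
    //     let moved = data;
    //     println!("{:?}", data); // error[E0382]: borrow of moved value: `data`
    // The mistake never makes it into the shipped binary.

    println!("total = {}", total); // prints "total = 6"
}
```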


> It’s about social interactions between the programmers or even between yourself spread over time.

There's nothing quite like looking at some code and saying "who in the hell did this?", then looking at the blame and realizing it was you, years ago. Learning from those moments is super valuable.


I call that guy "Yesterday Joel" ... I'm constantly cursing that imbecile in normal conversation. Today Joel hates that guy. But Today Joel really looks up to Tomorrow Joel and hopes someday to be just like him.


Tomorrow's his day to shine!


A great talk, pithy and full of insights gleaned from a lot of experience. Thanks for posting it.


>And it’s nice to think where, you know, we talk about functional programming and lambda calculus and monads and this sounds all nice and sciency, but it really doesn’t affect what you do in software engineering there, these are all best practices, and these are things that have shown to be helpful in the past, but really are only helpful when people are making certain classes of mistakes.

We're looking for the right set of constraints on the programming activity based on patterns of common mistakes.


I find that best practices are a bit like design patterns. They are emergent in that a successful team will undoubtedly be using many of them. However, they are not something you can prescribe directly. Just like you can't use design patterns like ordering off of a menu (I'll have a bowl full of factories, a side order of singletons and a large bridge, please), you can't pick a handful of "best practices", sew them together and expect that, in itself, to make you successful.

Being aware of best practices gives you the ability to move in that direction when you see opportunities. It is a mistake, in my opinion, to try to force the opportunities by dictating best practices (something that is done far too often in software process engineering). The best practice you choose depends a lot on the situation and the people involved.

Having said that, I have all too often been in situations where I'm thinking, "I know how to be successful if we could do X, Y, and Z. I have absolutely no idea how to be successful doing the things we are doing now." It's tempting to try and force people's hands, but I have never seen it work. It's usually much better to search for an A, B, and C that will work instead. Not always easy, but I suppose it's how you demonstrate that you deserve the big bucks ;-)


You've described exactly how I feel about these things.



