
John Carmack discusses the art and science of software engineering - gb
http://blogs.uw.edu/ajko/2012/08/22/john-carmack-discusses-the-art-and-science-of-software-engineering/
======
HeyLaughingBoy
_With the NASA style development process, they can deliver very very low bug
rates, but it's at a very very low productivity rate_

I wonder how many non-developers understand this. I, along with the rest of my
team, am trained in PSP
(<http://www.sei.cmu.edu/library/abstracts/reports/00tr022.cfm>) and TSP
(<http://www.sei.cmu.edu/tsp/>) and we use it in our day-to-day development.

It definitely helps us keep our defect rate below one bug/kLOC but it's an
_expensive_ process that results in very low LOC/day productivity. If very low
shipped bug counts are very important to your organization, great. But most
businesses these days seem to care more about having a _usable_ product than
about a _perfect_ (or close to it) one. Especially if it's on the Web, where
you can do multiple releases per day.

As an industry, we really need to bear in mind that different business domains
need radically different approaches to software engineering.

~~~
pnathan
Can you follow up with some information about the (P|T)SP experience?

I looked into it about a year ago and thought it was a ridiculous amount of
overhead, and the blurbs about the initial data Humphrey used to create it
were not persuasive. The "take our class" ads were not encouraging either.

So if it's working for you in actual development, I'd LOVE to hear more about
what it does/doesn't do for you.

~~~
HeyLaughingBoy
First, it helps to understand that even before moving to PSP/TSP, we were
already in a process-heavy regulated Medical Device development environment,
so it wasn't a big change. My understanding is that many teams starting with
TSP didn't have much of a process to begin with.

The good: PSP encourages a high level of developer responsibility for quality.
So you use a checklist to review your code before running the unit tests. You
record every defect you find and if applicable, use that information to make a
better checklist. Every team has a TSP-trained Coach to guide the process,
answer questions, and keep the team on track. The metrics generated from the
process are analyzed weekly to see if the team is on target, if quality is
where the predictions say it should be, and if there are any roadblocks.
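To make the recording concrete, the shape of a personal defect log is roughly
this (a minimal Python sketch; the field names are my own choosing, not the
actual PSP forms):

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Defect:
    phase_injected: str   # e.g. "design", "code"
    phase_removed: str    # e.g. "code review", "unit test"
    defect_type: str      # e.g. "off-by-one", "missing check"
    fix_minutes: int

# A personal defect log accumulates over many tasks...
log = [
    Defect("code", "code review", "missing null check", 5),
    Defect("code", "unit test", "off-by-one", 20),
    Defect("design", "unit test", "wrong interface", 45),
]

# ...and the weekly analysis asks questions like: which defect
# types keep escaping code review? Those become checklist items.
escaped_review = Counter(
    d.defect_type for d in log if d.phase_removed != "code review"
)
print(escaped_review)
```

The point of the exercise is the feedback loop: the defects you keep missing
in review are exactly what your next review checklist should target.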

The bad: it can be a _major_ change to how you are used to working. The data
collection, while as automated as possible, is annoying. The constant emphasis
on tracking time on various stages of fixing a bug/adding a feature adds a
noticeable amount of friction to your workflow. While it's not Waterfall, TSP
is definitely not Agile. Its entire focus is on predictability of output. It's
an attempt to take what works well for Manufacturing and apply some of that to
software dev.

In short, TSP/PSP is a good idea at heart for those types of development where
initial product quality is critical or where you may never have a chance to
fix a defect. This is not the case for most instances of modern software
projects.

------
adastra
I've gotten to know John a little bit, and I have to say it's a strange
feeling to have a conversation with a person of his off-the-charts
intelligence. I consider myself to be pretty smart-- was known as "the math
whiz" in high school, went to top engineering universities and did very well
there, have a couple of (minor) entrepreneurial accomplishments, etc. And yet
talking to Carmack I feel like I'm talking to someone who is a full two
standard deviations beyond me in raw intelligence horsepower. It's a pretty
sobering and humbling experience.

Part of it is that he really does spend 8+ hours per day coding, every
weekday, and has done so for 20 years. You'd think his experience level there
is about as high as you can get, so it's always cool to hear him talk about
the new things he's still learning at his work. I have to wonder if there's
anyone else in the world who has both his raw ability _and_ all those man-
years of programming experience. It seems like most successful technical
people end up doing management and business.

There's a couple of things people probably don't know about Carmack. For one,
he can talk intelligently on a lot of different topics. A lot of nitty-gritty
aerospace engineering, as well as the history of the space program and NASA
for example. He's also up to speed on the latest across a wide range of
technology, including things like cleantech.

Second, he has a pretty good sense of humor and can be quite funny. Which is
surprising, I think, just because he spends so little time (effectively zero)
being traditionally social, which you'd think would be necessary for getting
good at making people laugh. But in conversation he has a pretty good sense
of comedy and timing.

An example from his twitter feed that I clipped a while back:

<https://twitter.com/ID_AA_Carmack/status/167739644853747712> "Adding film
grain, chromatic aberration, and rendering at 24 hz for film look is like
putting horse shit in a car for the buggy experience."

------
nightski
He is definitely right about the social aspect of software engineering. But I
think he sells a lot of the tools short. For example, he brings up things like
Monads, Lambda Calculus, and whatnot - but then immediately dismisses them as
not affecting what one truly does.

But I think this really misses the point. In our industry it is really easy to
disguise oneself as a professional (or even just someone who knows what they
are doing), without really knowing much of anything. Meaning, our focus as an
industry has been on making the simple things as simple as possible (i.e.
scripting/dynamic languages, code generation, frameworks).

But what I see happening in the Haskell space for example (and even further in
languages such as Agda) are attempts to distill things down to their elements.
To find the true semantics behind a problem. This not only helps by producing
cleaner and more readable code, but it also helps with communication.

I really do believe software is a scientific (and mathematical) exercise. The
problem is most of industry does not treat it as such, and hence we end up in
the mess we are in.

~~~
rprospero
While I love what Haskell has done for me as a programmer, I don't think that
it's finding the true semantics behind our problems. I rather think that it's
finding a different basis. To take an analogy, monads and lambda calculus are
like Fourier series. For many classes of problems, they produce a cleaner,
simpler understanding of the solution. I'd hate to try and solve a boundary
value problem with just Taylor series. However, while expressing linear
functions with Fourier series is possible, it's less clear than the Taylor
series and you're more likely to mess it up.
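To make the analogy concrete (my own example, not part of the parent comment):
the linear function f(x) = x is its own one-term Taylor series, yet its
Fourier series on (-π, π) is an infinite sum that is much easier to get wrong:

```latex
f(x) = x \qquad \text{(Taylor series: done in one term)}

f(x) = 2 \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} \sin(nx)
\qquad \text{(Fourier series on } (-\pi, \pi) \text{)}
```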

In the same way, monads, arrows, and recursion are great ways of describing
many classes of programs. Additionally, they help with communication when your
problem is a monad or an arrow. However, certain classes of problems are
better described under other paradigms than being forced into the functional
one.
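For readers who don't know Haskell, the "your problem is a monad" case looks
roughly like this Maybe-style pipeline, sketched here in Python with helper
names of my own (Haskell gives you this pattern, plus type checking, for
free):

```python
# A Maybe-style pipeline: each step may fail by returning None,
# and `bind` short-circuits the rest of the chain when one does.
def bind(value, fn):
    return None if value is None else fn(value)

def parse_int(s):
    try:
        return int(s)
    except ValueError:
        return None

def reciprocal(n):
    return None if n == 0 else 1.0 / n

def run(s):
    return bind(bind(s, parse_int), reciprocal)

print(run("4"))    # 0.25
print(run("zero")) # None: parse failed, reciprocal never ran
print(run("0"))    # None: division guarded
```

When the problem really is "a chain of steps that may fail," this shape
describes it cleanly; when it isn't, forcing it into this mold obscures more
than it reveals, which is the parent's point.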

This comes back to Carmack's point. It's important to know Haskell, since it's
distilled computation down to a set of elements which are useful for
describing a large class of problems. Being able to communicate these
solutions is important. However, other paradigms are less error prone and do
communicate solutions more clearly on other classes of problems.

~~~
nightski
I agree, but do not boil down Haskell to Monads. The most interesting
abstractions are the ones you write. The only contribution Haskell has here is
a flexible type system. It does not give you anything for free per se.

Rather, what I am saying is that I see a trend in the Haskell community where
developers strive to find the best semantics for a problem and don't just
stop at the first solution they arrive at because it works.

See all the conversations on pipes vs. conduit if you want an example.

------
adjwilli
I program mostly in Objective-C nowadays, but I started professionally with
PHP. When I first started Objective-C, I found it really constricting in
comparison. In PHP you can do things a thousand ways, most of them terrible. In
Objective-C, specifically with Cocoa, things are a lot more rigid and
prescribed. I found this frustrating, but love it now. It's made my PHP better
too when I do occasionally go back. It forced me to think more about
architecture. I also understand why my CS dept taught us Scheme first, not
Java.

To link that back to the post, this is the type of constriction he's talking
about to make better programmers. Cocoa and Objective-C restricted me to only
writing at least halfway decent code. With PHP, because of its flexibility,
you're free to get things done quickly, but in a terrible way. Sure, with PHP
you can do things right too, but it takes a lot more self-discipline and also
a priori knowledge.

Sorry to post yet another rag on PHP.

~~~
Androsynth
It's not ragging when it's constructive with a thoughtful argument.

------
musashibaka
I would suggest watching his entire talk on youtube:
<http://www.youtube.com/watch?v=wt-iVFxgFWk>

~~~
wwwtyro
I was glued to the screen for the entirety of that talk. That guy is
completely engrossing to listen to.

------
cobrausn
Running your code through static analysis can be eye-opening. And just like
when you opened your eyes for the first time... you'll probably cry.
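If you want a taste without installing anything, a toy single-rule analyzer
fits in a few lines of Python's stdlib `ast` module (real tools run hundreds
of far subtler checks, so take this only as the flavor of the thing):

```python
import ast

def find_eq_none(source):
    """Toy static check: flag `x == None`, which should be `x is None`."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Compare) and any(
            isinstance(op, ast.Eq) for op in node.ops
        ) and any(
            isinstance(c, ast.Constant) and c.value is None
            for c in node.comparators
        ):
            warnings.append(node.lineno)
    return warnings

code = "x = f()\nif x == None:\n    print('empty')\n"
print(find_eq_none(code))  # [2]
```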

~~~
davidcuddeback
Static code analysis has a long-term benefit as well as the more obvious
short-term benefit. That is, it teaches us to be better developers as we
strive to have the static analysis catch fewer issues in our code the next time
[1]. I used static analysis to improve my style for C, Python, and most
recently Ruby [2].

I think it had a lasting effect on my personal coding habits. But every once
in a while, I will use the tools on my new code and it still finds things. I
would probably benefit from being more persistent in using these tools.

[1]: This assumes that the issues caught by your static analysis tool are
valid concerns, which, in my experience, they tend to be.

[2]: Some static analysis tools that I've used with Ruby are reek, roodi,
flay, and flog. Reek and roodi report code smells. Flay reports structural
similarities (opportunities for refactoring). And flog estimates the
complexity of your methods.

~~~
wh-uws
what tool did you use for ruby?

~~~
davidcuddeback
I mentioned that in my second footnote.

------
iandanforth
The quest for perfection may be futile.

DNA is also code, and it's full of bugs. That code lives for hundreds of
thousands of years, if not millions.

Biological processes offer the suggestion that your system can be functional
in the face of constant failures and random variations in behavior.

Biology can even offer a very high reliability rate. While we get sick all the
time, and people are born with all sorts of genetically disadvantageous
traits, many key processes are mind-bogglingly reliable. (No sight vs. no
sense of touch: compare the rates of blindness to the rates of congenital
analgesia type 2.)

While the math behind CS offers tantalizing guarantees of reliability, the
reality of software development and developers delivers reliability far
lower.

I think it is a fascinating thought experiment to imagine a development
process where, instead of writing any code, all you write is tests (or
feature descriptions), and you let the code adapt to the environment you've
defined.
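A toy version of that thought experiment, assuming nothing more than random
search (nowhere near a real program synthesizer, but it shows the
tests-as-environment idea):

```python
import random

# The "specification" is only a test: we never write the answer,
# we let a search adapt a candidate until the test passes.
def spec(f):
    return all(f(x) == 3 * x + 1 for x in range(10))

def synthesize(seed=0):
    """Randomly search coefficients (a, b) of f(x) = a*x + b."""
    rng = random.Random(seed)
    for _ in range(100000):
        a = rng.randint(-10, 10)
        b = rng.randint(-10, 10)
        f = lambda x, a=a, b=b: a * x + b
        if spec(f):
            return (a, b)
    return None

print(synthesize())  # (3, 1)
```

The search space here is trivially small; scaling this up to arbitrary
programs is exactly the hard part the thought experiment glosses over.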

~~~
CamperBob2
_The quest for perfection may be futile._

Agreed, and I think it's easy to observe that fact with nothing more than your
DNA example. In biology, perfection will always be outcompeted by "good
enough."

------
pnathan
One of the things I've been persuaded of is that software writing is
fundamentally a non-scientific activity. Outside of the time and space
constraints for a given subsystem, the craft of software is almost entirely
subjective and limited only by mental capacities and flaws.

------
zxcvbn
Richard Feynman wrote[1]:

"We could, of course, use any notation we want; do not laugh at notations;
invent them, they are powerful. In fact, mathematics is, to a large extent,
invention of better notations. The whole idea of a four-vector, in fact, is an
improvement in notation so that the transformations can be remembered easily."

What he said about mathematics, I think it applies even more to programming.

[1] The Feynman Lecture on Physics, Volume 1, Chapter 17

~~~
6ren

      At PARC we had a slogan: "Point of view is worth 80 IQ points." It was based on a
      few things from the past like how smart you had to be in Roman times to multiply two
      numbers together; only geniuses did it. We haven't gotten any smarter, we've just
      changed our representation system. We think better generally by inventing better
      representations; that's something that we as computer scientists recognize as one of
      the main things that we try to do.
    

Alan Kay [http://billkerr2.blogspot.com.au/2006/12/point-of-view-is-
wo...](http://billkerr2.blogspot.com.au/2006/12/point-of-view-is-worth-80-iq-
points.html)

~~~
taybin
According to this, multiplication wasn't so bad:
[http://turner.faculty.swau.edu/mathematics/materialslibrary/...](http://turner.faculty.swau.edu/mathematics/materialslibrary/roman/)

~~~
btilly
Actually real Romans did multiplication on an abacus. Reading from Roman
numerals to/from an abacus is an incredibly natural operation.

In Europe the disappearance of abacuses was directly tied to the rise of
Arabic notation.

------
figglesonrails
Actually, I think one of his strongest points is about making mistakes --
everyone does them and everyone makes the most amateur of them. If you haven't
tried running SCA on your code, try it sometime. I've got a ridiculous ego,
but when you get hundreds of warnings on your code, you realize just how
imperfect you can be and how impossible it is to be cognizant of all things at
all times. This is why teams are almost always better, and I'd prefer to
review and be reviewed by someone else -- I'm my own biggest blind spot.

------
VMG
If he wrote a book about programming, I'd buy it in a heartbeat. Too bad that
even he probably is still figuring everything out.

~~~
bsphil
The more you know, the more you realize you don't know.

That would be a great book though.

~~~
whalesalad
Yep.

Also, the smarter you are the more you tend to doubt yourself. Whereas less
intelligent people tend to have more confidence in what they're doing.

Bites me in the ass all the time.

~~~
seoguru
The more you know, the more you know that you don't know. As you throw logs on
the campfire, the perimeter of darkness grows.

~~~
matwood
_As you throw logs on the campfire, the perimeter of darkness grows._

I like that one. I've always thought of knowledge as a sphere: its surface is
where what you know touches what you don't. As your knowledge grows, so does
the sphere, and with it your awareness of what you do not know.

------
prawks
I look forward to the QuakeCon keynote every year. Anyone who engineers
software should really listen to Carmack.

Also, this article's kind of silly... there's almost no discussion? Just watch
the video.

EDIT: I guess it's not so much an article as a sharing of info. Still. Watch
the video.

------
benthumb
As much as I respect John Carmack, I have to say that I'm a little
disappointed that he is rehashing this meme of software development not being
a science... OK, wonderful, the state of the art in his shop doesn't rise to
the level of being a consistently reproducible, measurable process, but that
doesn't mean that this is a permanent condition or that it's an insurmountable
one.

~~~
rapala
I didn't get the same impression that you got. To me, Carmack is saying that
the biggest thing to focus on is how to enforce the known best practices. That
we need to understand the social aspects of software development well enough
to reap the benefits of improvements on the engineering side. There is science
involved in that, but it is mainly stuff from the social sciences.

------
Arjuna
Interesting comments on Mac and Linux platforms [1]:

 _"Other interesting sort of PC-ish platforms, we have... the Mac still
remains a viable platform for us. The Mac has never required any charity from
id, all of those ports have carried their own weight there; they've been
viable business platforms.

I actually think that the Mac is going to become a little bit more important
for us. Interestingly, we have a ton of people that use, like Macbooks at the
office, but we don't have any really rabid, OS X fanboys at the company that
drive us to go ahead and get the native ports out early.

But, one of my pushes on the greater use of static analysis and verification
technologies, is I pretty strongly suspect that the Clang LLVM sort of
ecosystem that's living on OS X is going to be, I hope, fertile ground for a
whole lot of analysis tools and we'll wind up benefiting by moving more of our
platform continuously onto OS X just for that ability to take advantage of
additional tools there.

Linux is an issue that's taken a lot more currency with Valve announcing Steam
for Linux, and that does change, factor, you know, changes things a bit, but
we've made two forays into the Linux commercial market, most recently with
Quake Live client, and, you know, that platform just hasn't carried its weight
compared to the Mac on there. It's great that people are enthusiastic about
it, but there's just not nearly as many people that are interested in paying
for a game on the platform, and that just seems to be the reality. Valve will
probably pull a bunch more people there. I know absolutely nothing about any
Valve plans for console, Steam-box stuff on there; I can speculate without
violating anything.

One thing that also speaks to the favor of Linux and potential open source
things is that the integrated graphics cards are getting better and better,
and they really are good enough now. Intel's latest integrated graphics cards
are good. The drivers still have issues. They're still certainly not going to
blow away somebody's top of the line SLI system, but they are completely
competent parts that are delivering pretty good performance.

And one of the wonderful things is that Intel has been completely supportive
of open source driver efforts, that they have chipset docs out there, and they
work openly with community to develop that, and that's pretty wonderful. I
mean, anybody that's a graphics guy, if you program to a graphics API, use D3D
or OpenGL, you owe it to yourself at some point to go download the Intel
chipset docs. There's hundreds of pages of them, but you really should read
through and see what happens at the hardware level. It's not the same
architecture that Nvidia and AMD have on there, but there's a lot of
commonalities there. You'll grow as a graphics developer to know what happens
down at the bit level.

Another one of those things, if I had more time, if I could go ahead and clone
myself a few times, I would love to be involved in working on optimizing the
Intel open source drivers there.

So, it's enticing, the thought there that you might have a well-supported,
completely open platform that you could deliver content through the Steam
ecosystem there. It's a tough sell on there, but Valve gets huge kudos for
having the vision for what they did with Steam, sticking through all of it.
It's funny talking about Doom 3, where we can remember back in the days when
they're like, 'Well, should you ship Doom 3 on Steam, go out there, make a
splash?' ... I'm like, 'You're kidding, right?' That made no sense at all at
that time, but you know Valve stuck with it and they're in a really enviable
position from all of that now.

It still seems, probably crazy to me that they would be doing anything like
that, you know, but, it's something that's not technically impossible, but
would be really difficult from a market, sort of ecosystems standpoint."_

[1] <https://www.youtube.com/watch?v=wt-iVFxgFWk#t=44m28s>

~~~
Lockyy
The statement that people are less willing to pay for things on the platform
is kind of refuted by how the Linux demographic almost always pays nearly
double what people on OS X pay, and triple what people on Windows pay, for
the Humble Bundles.

~~~
simonh
And yet he's stating facts based on actual experience doing it.

~~~
ondrasej
These two facts are not necessarily in contradiction. At this moment in the
current Humble bundle, the average price paid by Linux and Mac users is
significantly higher than the average price paid by Windows users, but the
total income from Windows is still way higher than the income from Linux and
Mac combined, just because of the raw numbers of users.

It looks to me like there are some Linux users who don't mind buying games
(and do not mind paying premium for that), but the majority is not interested
in buying games at all, regardless of the price.

Moreover, my own experience with buying games is that (digital distribution
aside) it was practically impossible to get Linux versions of games (well, at
least in Czech rep., where I used to live). And if there was a Linux version
at all, using it meant buying a box with the Windows version and then patching
it. So, even though I use Linux for work, I used to play games almost
exclusively in Windows (and then on XBox, which made things even easier).

------
aantix
Software development is usually about creating some sort of competitive
advantage. And a fundamental tenet of competitive advantage is
differentiation.

Standard implementations/algorithms/patterns are commodities and are purposely
so.

------
dllthomas
The talk itself is well worth the 3 hours.

------
jcdavison
I just finished the devbootcamp program, so as a person who was thrown into
OO programming with no real CS underpinnings, it is interesting to hear him
talk about the social component. It is also pretty interesting to hear him
talk about making mistakes and the need to just get things done versus
optimization.

------
jberryman
I don't want to be an ass, but this seems like a lot of vague contradictory
rambling. Maybe I missed some big ideas in the parts I skimmed?

~~~
rapala
I was also a bit confused. He said that stuff like monads is nice, but not
that useful in the end. On the other hand, he seemed concerned about how to
enforce best practices, among which I'd count monads.

