
No Silver Bullet (1986) [pdf] - ayberkt
http://worrydream.com/refs/Brooks-NoSilverBullet.pdf
======
DenisM
To this day, this piece from 1987 remains one of the most needed pieces of
wisdom:

 _I still remember the jolt I felt in 1958 when I first heard a friend talk
about building a program, as opposed to writing one. In a flash he broadened
my whole view of the software process. The metaphor shift was powerful, and
accurate. Today we understand how like other building processes the
construction of software is, and we freely use other elements of the metaphor,
such as specifications, assembly of components, and scaffolding.

The building metaphor has outlived its usefulness. It is time to change again.
If, as I believe, the conceptual structures we construct today are too
complicated to be specified accurately in advance, and too complex to be built
faultlessly, then we must take a radically different approach. Let us turn
to nature and study complexity in living things, instead of just the dead works
of man. Here we find constructs whose complexities thrill us with awe. The
brain alone is intricate beyond mapping, powerful beyond imitation, rich in
diversity, self-protecting, and self-renewing. The secret is that it is grown,
not built._

~~~
MalcolmDwyer
Cool quote. Maybe now we "grow" software, in the sense that any application
consists of probably thousands of interconnected systems/libraries/frameworks
that work together in a way that is organic or emergent.

------
elihu
I wonder if people don't take the wrong lesson from this essay. There might
not be any single 10x productivity-enhancing technology, but there have been
many modest improvements. I think it's fair to say that a good programmer in
2015 is probably 10x more productive than a good programmer in 1995.

Someone reading this today might be tempted to dismiss tools they just don't
understand as "not a silver bullet". However, even modest improvements are
worth striving for, and if you only adopt what is "industry standard", you'll
always be behind your early-adopter peers. (Or at least, those of your early-
adopter peers who have good enough taste to distinguish a good, useful
technology from the next shiny new thing that isn't ready for production and
may never be.)

Things that have been a boon to me include getting into Linux back in the
early days when it was new-ish, the arrival of Google, learning Haskell,
learning to use source control (first svn, now git), and the availability of
things like Stack Overflow and Wikipedia. If I had to go back to using the
tools of the mid-nineties (Borland Turbo C++ on Windows 3.1, NCSA Mosaic and a
14.4kbps modem, 33 MHz processors, 8 megs of RAM, etc.), I could still sort of
get stuff done as a programmer, but really we've come a long, long way since
then.

~~~
narag
" I think it's fair to say that a good programmer in 2015 is probably 10x more
productive than a good programmer in 1995."

I don't know about fairness, but that's definitely wrong. I was around then,
and the most I could believe is that we are at 2x.

~~~
elihu
I think it depends on what you're doing, and what counts as productivity. I
think lines of code per developer per day haven't gone up much, but it's a lot
easier to deliver much more complex products now than it was then, in part due
to the large volume of readily accessible library code. There's less need to
write special-purpose code for everything, because someone probably wrote the
generic parts already.

There are some software domains that haven't changed much at all since the 90s
or earlier. Systems programming is still done largely in C and C++, and modern
kernels work basically the same way they did then with a few minor
differences. Scripting has changed dramatically, web frameworks have changed
dramatically, business applications have changed quite a bit. C# is noticeably
different from Java, which is different from C++ or Cobol or whatever it was
people used to write business software in back then.

------
osullivj
And here's Brad Cox's rejoinder...

[http://virtualschool.edu/cox/pub/NoSilverBulletRevisted/](http://virtualschool.edu/cox/pub/NoSilverBulletRevisted/)

Planning the Software Industrial Revolution is also a classic...

[http://virtualschool.edu/cox/pub/PSIR/](http://virtualschool.edu/cox/pub/PSIR/)

------
pron
I remember how hopeful I was, as a young programmer, that this technique (OO,
RAD, "Components") or that language (Scheme, ML -- those two were the
"ancient secret knowledge" that we "rediscovered" and our bosses had
"overlooked" -- C++, Ada) would change programming forever and make software
development completely different, and how angry I was at the experienced
developers who told me that most approaches would fail, and that the best few
would yield significant, but evolutionary rather than revolutionary, advances.
Now I'm the one saying this to others...

I think that two specific advances did end up yielding significant (though
evolutionary) progress since that text was written: automatic garbage
collection, and automated testing practices. Both were viewed with skepticism,
the latter with some derision, but they have both helped us out of a real
crisis of the software industry in the nineties, when too many projects just
couldn't get off the ground (or, rather, continuously crashed).

~~~
slacka
Yes, the world would be a better place if young minds all stopped dreaming and
accepted the wisdom of the elders. There's nothing left to explore. </sarcasm>

~~~
pron
Boy, are you a pessimist! That's not at all the message. The message is about
working towards progress rather than jumping from one fad to the next with
blind faith that something will "save" us. It's a call for technological
progress rather than an alchemist's semi-religious search for a way to turn
lead into gold.

~~~
slacka
My first reaction after reading the paper was similar to yours: that I had
fallen for the OOP hype train. Later, when my boss cited it as an excuse to
iterate on our aging platform instead of attempting a redesign, my view of it
began to change.

It's true many technologies have been overhyped, but it's far too early to
throw in the towel. A little over-optimistic, youth-fueled zeal has led to
some of our greatest advances. I think Bret Victor sums up my views best here:

[http://worrydream.com/dbx/](http://worrydream.com/dbx/)

~~~
DonaldFisk
Great talk. Alan Kay and others have said similar things.

We've squandered all the gains the hardware people have made on slower, more
bloated software, resulting in computers whose response times are no faster -
and sometimes actually slower - than their equivalents were 30 years ago.

And I don't see much youth fuelled zeal going into solving the problem, or
even coming out with much in the way of innovation which might help. Instead,
people are excited about "new" languages like Golang and Rust (which doesn't
even have a garbage collector!), and rebranded copies of BSD and Linux.

People are incredulous when I tell them that while the trailing edge - people
entering COBOL into IBM mainframes on punched cards - has advanced
considerably, the leading edge has regressed. As Philip Greenspun put it:
"These days, most former Lisp programmers are stuck using Unix and Microsoft
programming environments and, not only do they have to put up with these
inferior environments, but they're saddled with the mournful knowledge that
these environments are inferior."

And don't get me started on AI.

~~~
kibwen

      > Rust (which doesn't even have a garbage collector!)
    

I think you need to take an actual look at Rust if you think that the lack of
garbage collection is somehow a design flaw. :P You're free to deride the
language as "just" an improvement on the state of the art of systems
programming rather than an attack on the fundamental underlying problem
(which is that the underlying systems themselves are tremendous
clusterfucks). But not only is that problem outside the domain of a
programming language; it is also the case that merely ignoring the status quo
out of personal dissatisfaction does not succeed in moving the world forward.

~~~
DonaldFisk
"The reasonable man adapts himself to the world; the unreasonable one persists
in trying to adapt the world to himself. Therefore all progress depends on the
unreasonable man." -- George Bernard Shaw

AIUI Rust grew out of a dissatisfaction with C++, and it does have a few good
features, such as type inference. But since Java was released, it's been
unthinkable for a high-level language to require programmers to do their own
memory management. If Rust is aimed _solely_ at systems programming, to be
used the way Algol 60 was used to implement MCP, PL/I to implement Multics, C
to implement Unix, and Oberon to implement Oberon, then the lack of a garbage
collector is understandable. But you can never guarantee your product will be
used as intended: I've encountered people who think C is an acceptable choice
for _everything_.

I'm not merely dissatisfied with the current status quo: I'm actively doing
something about it. IMO we're doing practically _everything_ wrong, from
hardware upwards. Fixing it requires systems very different from those in
widespread use today.

~~~
dbaupp
As you say, Rust is aimed at being a C++ replacement mainly for systems
programming, and having a garbage collector is a hindrance there. Sure, people
will use it for not-strictly-systems-programming things, but that's on their
head: they evaluate the trade-offs (e.g. whether they can do without a GC) and
choose Rust. Compromising the core goal (memory safety without garbage
collection) by including a pervasive GC, just because some people might make a
slightly silly choice and use Rust where some other tool is better, would be
just _wrong_. It would make Rust inappropriate for the places where there is
essentially no other choice like it, putting it into a class with many other
alternatives (e.g. Haskell, OCaml, D, etc.).

In any case, on one hand you complain about slow, bloated software, and on the
other about Rust not having a GC. The unpredictability of a GC is a major
component of bloat in a lot of software. Don't get me wrong: it is possible to
write sleek software in managed languages with GCs, but for many tasks this
generally requires fighting against the GC, with things like object pools and
buffers (basically reimplementing the standard techniques from non-GC'd
languages), and, of course, it is definitely possible to write bloated
software without one.

Rust is designed to make it easier to write code without a GC, by adopting
many of the advantages that garbage collectors/managed languages bring to the
table (and more, e.g. static protection against iterator invalidation).
There have been a lot of people from Ruby, Python, JavaScript (etc.)
backgrounds adding lower-level/more-control programming to their toolbox via
Rust, something that was too daunting previously. This means that they can
write software (or at least, the sensitive parts of their code) that doesn't
suffer from the overhead/bloat of the managed languages.
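The static protection against iterator invalidation mentioned above can be sketched with a toy example (my own illustration, not from the thread):

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    // This would NOT compile: the loop borrows `v` immutably while
    // `push` tries to borrow it mutably (error[E0502]), so the classic
    // invalidated-iterator bug is rejected before the program ever runs.
    // for x in &v { v.push(*x); }

    // The idiomatic pattern makes the read and write phases explicit:
    let doubled: Vec<i32> = v.iter().map(|x| x * 2).collect();
    v.extend(doubled);

    assert_eq!(v, vec![1, 2, 3, 2, 4, 6]);
    println!("{:?}", v); // [1, 2, 3, 2, 4, 6]
}
```

The equivalent C++ `for (auto x : v) v.push_back(x);` compiles fine and invalidates the iterator at runtime; here the check happens at compile time.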

Lastly, Rust's memory management is nothing like "manual memory management" in
C (or historical C++). I mean, it compiles down to essentially the same thing
at runtime, but what the programmer writes is very different. The combination
of lifetimes, destructors, and generics means that it is difficult to do it
wrong: memory leaks are rare, and the compiler will tell you about any
dangling pointers, etc.
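A minimal sketch of what "the compiler will tell you about any dangling pointers" means in practice (the names here are my own, hypothetical):

```rust
struct Buffer {
    data: Vec<u8>,
}

impl Drop for Buffer {
    // Destructor runs deterministically when the owning `Buffer` goes out
    // of scope: no GC pause, no explicit free().
    fn drop(&mut self) {
        println!("freeing {} bytes", self.data.len());
    }
}

// The (elided) lifetime ties the returned reference to the borrow of `buf`,
// so the reference cannot outlive the Buffer; using it after the Buffer is
// dropped is a compile error, not a runtime crash.
fn first_byte(buf: &Buffer) -> Option<&u8> {
    buf.data.first()
}

fn main() {
    let buf = Buffer { data: vec![42, 7] };
    assert_eq!(first_byte(&buf), Some(&42));
} // `buf` dropped here; prints "freeing 2 bytes"
```

At runtime this is the same allocate/free pattern a C programmer would write by hand; the difference is that the bookkeeping is checked by the compiler rather than by the programmer.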

------
ageyfman
Open source is surely a silver bullet that increased productivity 10x at
least, for most of us.

------
carsongross
I now believe there is at least one silver-alloy bullet: simplicity,
painstakingly maintained and violently defended by well-paid developers.

~~~
the_af
Everyone agrees with simplicity when stated in such general terms, but in
practice, simplicity _where_?

Note that simplicity in one piece of software often leads to complexity in
another. For example, a language with a minimal set of instructions may be
considered "simple", but building software with it can be complex and
unwieldy. A minimalist API may be elegant and concise, but lead to headaches
when attempting to use it. Etc, etc.

~~~
carsongross
I don't think there is a good objective definition of simplicity, since
a large component of what I'm getting at is subjective and context-dependent,
and therefore involves "appropriateness". API design is different than app
building is different than container building, etc. This is why good
developers are so important.

Regarding APIs, I like them to be layered:
[http://devblog.guidewire.com/2008/10/05/api-design/](http://devblog.guidewire.com/2008/10/05/api-design/)
so that code built on top of them can be as simple as possible.

~~~
the_af
Agreed. That was my point, that simplicity is subjective, so while people seem
to agree that simplicity is desirable, they often disagree about _where_ to
simplify (which often comes with trade-offs).

------
ScottBurson
I believe Brooks will eventually be proven wrong about AI and Automatic
Programming. Granted, he has not been proven wrong yet.

It is interesting, perhaps even a little surprising, that reasoning about
programs has proven to be one of the most intractable challenges for AI. Why,
to take a simple-sounding example, when my program does something wrong, can I
not simply ask the machine why it did it? Why do I have to go in with a
debugger and find the problem myself? This wouldn't require solutions to any
of the familiar bugaboos of AI: it doesn't need a massive database of facts
about the world, it doesn't require visual image recognition, it doesn't
require natural language understanding, it doesn't require robotics or
simulated emotions or any of those things. It just requires logical reasoning.
Can't computers do that?

Well, no, it turns out, they can't do it, not in its full generality. Despite
all the things it doesn't require, general logical reasoning is still almost
AI-complete, which is a cute way of saying we still don't know how to make
machines do it.

And fully general reasoning is required to solve the problem. We have lots of
static analysis algorithms that can answer specific kinds of questions about
programs, but we're still not very close to being able to answer an
_arbitrary_ question, even fairly simple ones. The spaces we would have to
search are just too big; the branching factors are too high.

I think there is hope on the horizon, though. The so-called automated
reasoning systems that have been built over the last few decades -- also
called theorem provers -- have mostly not made use of any machine learning
techniques. These two technologies are starting to be combined, and machine
learning is of course a very hot area right now. I believe that eventually we
will have Automatic Programming worthy of the name as a result of this
combination.

------
crpatino
This is required reading for every promising young hacker. The more promising,
the more required.

~~~
slacka
I couldn't disagree with you more. This paper offers no insight, only excuses
to accept that software will always be buggy. If anything, I'd encourage young
engineers to explore technologies that have not been explored to their fullest
potential. Computer Science is a very young field, and many unexplored paths,
such as flow-based programming or some automated programming method that
hasn't been dreamed up yet, may lead to far more reliable software.

I dread to think what today's CPUs would be like if an electrical-engineering
version of this had been required reading for every EE in the 70s. We should
just accept buggy CPUs because of the complexity of designing CPUs with 7
billion transistors.

~~~
crpatino
> This paper offers no insight, only excuses to accept that software will
> always be buggy.

The paper does not address the problem of buggy software, though it is
related. It addresses the problem that software is complex.

And the insight it offers is that not all complexity is equal: there is
accidental complexity (programming is hard because the way we have done things
historically is not yet optimized) and essential complexity (programming is
hard because we are modeling complex stuff, and it is inherently hard for
human minds to juggle all the relevant details).

The excuses you complain about are really examples where people confuse one
with the other. For example, you cannot solve the problem of requirements
gathering with better compilers, because the problem is not technical: it is
that stakeholders are playing political games in the background, so if left
to their own devices they'll give you vague, confusing requirements now, with
the tacit intention of leveraging those when the inevitable struggles come in
the future.

> I dread to think what today's CPUs would be like if an electrical
> engineering version of this was required reading for every EE in the 70s.

I am sorry to inform you that an EE version of this will be direly needed in
the near future. As Moore's law keeps hitting more and more fundamental limits
of physics, we need engineers who are able to work on the problems that matter
and make hardware that gives us the best possible performance given the
available constraints. It would not do to have each generation of young EEs
start yet another cargo cult every 5 years or so, wasting valuable resources
in the attempt to pack twice as many transistors per wafer.

