
No Silver Bullet: Essence and Accidents of Software Engineering (1987) - mmphosis
http://www.cs.nott.ac.uk/~pszcah/G51ISS/Documents/NoSilverBullet.html
======
throwawayjava
Developing web applications today is at least a twofold gain over what it was
10 (edit: oh, more like 15 I guess) years ago, and that mostly has to do with
available tooling (browsers, frameworks, etc.). The same is true for many other
areas of software. We do still spend the same amount of time building things,
but the things we build are far larger than what we would've built 10 years
ago. That, however, is a different phenomenon, and it's not the one that
Brooks lays out in NSB.

It's also worth noting that the Brooks thesis is somewhat vaguely defined,
because it's never clear what is meant by "productivity". For example,
if we increase the quality of software by an order of magnitude (e.g., by
completely eliminating huge classes of once-common vulnerabilities), is that
an improvement in productivity?

------
arikrak
> The techniques used for speech recognition seem to have little in common
> with those used for image recognition, and both are different from those
> used in expert systems. I have a hard time seeing how image recognition, for
> example, will make any appreciable difference in programming practice.

Machine learning has now made order of magnitude changes in solving many of
the "Essential difficulties" so that the same ML techniques can be used to
solve a wide range of problems without requiring the specific details to be
spelled out.

Despite this, "Accidental Difficulties" have actually grown in some areas, as
large companies build even more complex software with various testing and
performance requirements.

------
marktangotango
Although dated, this essay ought to be required reading for everyone involved
in software development IMO.

Story time: about 15 years ago I was at Big Dumb Inc on a team dedicated to a
specific customer. Our group manager thought it would be a great idea to have
a different member lead the weekly staff meeting every week. This is how
valuable the meeting was (/s it was a waste of everyone's time). When my turn
came around, I printed off copies of this, handed it out, and assigned it to
be read, and said "that's all for this week". Some people actually read it,
but I'm pretty sure that stunt contributed to souring the relationship
between my boss and me.

------
Peaker
This essay is an opinion piece: It claims that little accidental complexity is
left and most software complexity is essential. It does not really establish
its claim that the complexity is essential, yet the entire premise rests on
that.

I find that proposition preposterous: I believe almost all complexity we deal
with in software is accidental.

~~~
syncsynchalt
And yet 30 years later the predictive claim has held up.

~~~
Peaker
Did it?

We are making many kinds of software easily an order of magnitude faster than
we did 30 years ago.

~~~
syncsynchalt
The claim is about a single decade, though, so it'd be compared to 2007, not
1987. The other claim is that we won't see a 2x gain in software every 2
years.
Keep in mind this was in the era of AI promises, when this kind of claim
seemed less bizarre.

Incidentally, what software did you have in mind? I'm curious if you mean
something like rails (stand up an API in ten minutes), or machine learning.

~~~
Peaker
I mean both UIs (not specifically rails) and machine learning, and emulators,
and CRUD, and even software like compilers, which were already well-researched
by then. These days a compiler can be whipped up by an expert in a day or two
(for small languages).
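To give a sense of scale for that last claim, here's a minimal sketch of a
"small language" compiler (arithmetic expressions compiled to instructions for
a toy stack machine, in Python; all names and the language itself are
illustrative, not anything from the essay or the thread):

```python
import operator
import re

def tokenize(src):
    """Split source like "1 + 2 * (3 + 4)" into number and operator tokens."""
    return re.findall(r"\d+|[-+*/()]", src)

def parse(tokens):
    """Recursive-descent parser: builds ("num", n) / (op, lhs, rhs) trees,
    with the usual precedence (*, / bind tighter than +, -)."""
    pos = [0]

    def peek():
        return tokens[pos[0]] if pos[0] < len(tokens) else None

    def eat():
        tok = tokens[pos[0]]
        pos[0] += 1
        return tok

    def atom():
        if peek() == "(":
            eat()
            node = expr()
            assert eat() == ")", "expected closing paren"
            return node
        return ("num", int(eat()))

    def term():
        node = atom()
        while peek() in ("*", "/"):
            node = (eat(), node, atom())
        return node

    def expr():
        node = term()
        while peek() in ("+", "-"):
            node = (eat(), node, term())
        return node

    return expr()

def compile_expr(node, code=None):
    """Post-order walk of the tree emits stack-machine instructions."""
    if code is None:
        code = []
    if node[0] == "num":
        code.append(("push", node[1]))
    else:
        op, lhs, rhs = node
        compile_expr(lhs, code)
        compile_expr(rhs, code)
        code.append(("op", op))
    return code

def run(code):
    """Tiny stack-machine interpreter for the compiled instructions."""
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.floordiv}
    stack = []
    for kind, arg in code:
        if kind == "push":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[arg](a, b))
    return stack[0]

result = run(compile_expr(parse(tokenize("1 + 2 * (3 + 4)"))))
print(result)  # 15
```

Obviously a production compiler is a different beast, but the lexer / parser /
code-gen / VM skeleton really does fit in well under a hundred lines, which is
the point: the techniques are so well understood by now that for a small
language the whole pipeline is routine.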

