

Programming Laws and Reality: Do We Know What We Think We Know? - alok-g
http://www.drdobbs.com/architecture-and-design/programming-laws-and-reality-do-we-know/240166164

======
ggchappell
I'm intrigued by the one law (unless I missed something) that he says
definitely no longer applies:

> Wirth's Law

> Software performance gets slower faster than hardware speed gets faster.

> This law was formulated by Niklaus Wirth during the days of mainframes and
> seemed to work for them. However, for networked microprocessors and parallel
> computing, the law no longer seems to hold.

Certainly it holds for _some_ things. My first computer (late 1970s) started
up in less than a second. A decade later I had to wait perhaps 20 seconds.
Today waiting a full minute for a computer to get to a usable state is not at
all unusual. On the other hand, we have Google, which, in a fraction of a
second, can do things undreamed of only a few years ago.

This leads to what I think is an interesting line of inquiry. What kinds of
software does this law still apply to? Why does it apply there and not
elsewhere? And what can we do about it?

~~~
pjmlp
It applies to the bloat in software development nowadays with layers on top of
layers that seldom add any real value to the product.

~~~
seanmcdirmid
That is definitely the myth, but where is the non-anecdotal evidence that this
statement is actually true?

~~~
mpweiher
"Spending Moore's Dividend" was a 2009 CACM article (paywalled):

[http://cacm.acm.org/magazines/2009/5/24648-spending-moores-dividend/fulltext](http://cacm.acm.org/magazines/2009/5/24648-spending-moores-dividend/fulltext)

Microsoft Research has a version of the article (not exactly identical):

[http://research.microsoft.com/pubs/70581/tr-2008-69.pdf](http://research.microsoft.com/pubs/70581/tr-2008-69.pdf)

I also discuss this in my (unfinished) book, with lots of data:

[http://www.amazon.com/Objective-C-Performance-Tuning-Developers-Library/dp/0321842847](http://www.amazon.com/Objective-C-Performance-Tuning-Developers-Library/dp/0321842847)

The whole book is / will be (well, the manuscript is) full of examples of
techniques that are 2x, 10x, or even 100x slower than necessary but are now
"recommended practice", often for no discernible benefit.

~~~
seanmcdirmid
Nothing in the Larus article states that extra software overhead is just
wasted without any real benefit; we do get benefits from it:

* Increased functionality

* Increased programmer productivity (programmers are often more scarce than available CPU cycles)

* The bottlenecks have changed

The thesis being challenged in grandparent (pjmlp):

> It applies to the bloat in software development nowadays with layers on top
> of layers that seldom add any real value to the product.

implies that we are just being wasteful for no real benefit, but we are
obviously getting more functionality in, or writing more software via,
increased programmer productivity. Now that power is more of an issue for
mobile, we are seeing programmers be more thrifty again, but that is not
without a cost in productivity.
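
A rough sketch of that trade-off (a hypothetical Python example, invented for
illustration): the convenient version is one line but materializes the whole
file in memory, while the thrifty version streams it at the cost of more code.

    def word_count_convenient(path):
        # One line, but reads the entire file into memory at once.
        return len(open(path, encoding="utf-8").read().split())

    def word_count_thrifty(path):
        # More code, but constant memory: process one line at a time.
        total = 0
        with open(path, encoding="utf-8") as f:
            for line in f:
                total += len(line.split())
        return total

Neither is wrong; which one counts as "bloat" depends on whether memory or
programmer time is the scarce resource.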

------
ggreer
While this list is certainly educational, remember that most of these "laws"
are heuristics based on personal experience.

The discipline of software development is about as rigorous as parapsychology.
To learn what we _really_ know about software, I recommend _Making Software:
What Really Works & Why We Believe It_[1], _The Leprechauns of Software
Engineering_[2], and browsing It Will Never Work in Theory[3].

And no matter what languages, testing frameworks, or methodologies you use,
always remember:

 _Every "bug" or defect in software is the result of a mismatch between a
person's assumptions, beliefs or mental model of something (a.k.a. "the map"),
and the reality of the corresponding situation (a.k.a. "the territory")._[4]

If you want your map to reflect the territory accurately, you'll need more
than a few "laws" handed down as gospel.

1\. [https://www.amazon.com/Making-Software-Really-Works-Believe-ebook/dp/B004D4YI6G/](https://www.amazon.com/Making-Software-Really-Works-Believe-ebook/dp/B004D4YI6G/)

2\. [https://leanpub.com/leprechauns](https://leanpub.com/leprechauns)

3\. [http://neverworkintheory.org/](http://neverworkintheory.org/)

4\. [http://lesswrong.com/lw/2rb/why_learning_programming_is_a_great_idea_even_if/](http://lesswrong.com/lw/2rb/why_learning_programming_is_a_great_idea_even_if/)

~~~
Patient0
That's what struck me: the use of the word "laws" to describe what are, at
best, "principles" seems particularly inappropriate and pretentious.

I have the image of a self-titled "software engineering consultant" trying to
gain prestige (and funding) by using lots of big words (i.e. bullshitting).

~~~
dvogel
Pretty sure the word choice was for effect -- like Murphy's Law.

------
DougMerritt
> Ward Cunningham's technical debt concept is a great metaphor, but not such a
> great metric. Technical debt omits projects canceled due to poor quality.
> Since about 35% of large systems are never finished, this is a serious
> omission.

In my experience, projects large and small are typically cancelled because
executive management changes its mind, not because of poor quality.

Certainly there are lots of famous examples of projects that crumble under
their own weight, but I really don't think it's anything like 35%.

Business goals change (for better or for worse), market conditions change,
egos change, etc.

One cultural difference seems to be that a CEO of a Fortune 500 company who
writes off a billion dollar investment due to a change in business strategy is
a visionary who makes brave decisions, but anyone who writes off a
project/division due to a billion dollars in technical debt is going to be
tarred and feathered for blundering.

------
saraid216
...where is this purported data?

------
mclean
Good list. If only project management people read this, maybe then the sky
would be brighter on the programming horizon.

------
icefox
Google doesn't seem to know about "Hartree’s law" outside of this article...

------
eekee
Article seems nice, but without data it's rather hollow.

