

Too many layers of abstraction -- worthless crap (2007) - acqq
http://blogs.msdn.com/b/ricom/archive/2007/02/02/performance-problems-survey.aspx

======
praptak
I choose point 3: not _too many_ layers of abstraction but rather a _bad
abstraction model_, which in effect results in 1 (a bad algorithm). Not because
an inexperienced programmer wrote it, but because the architecture makes an
efficient algorithm _impossible_ to implement.

The best example that comes to mind is the "n+1 selects" problem in bad ORMs:
if your ORM hides the power of SQL joins behind a crappy abstraction, then you
need n selects for what could be achieved with one.
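A minimal sketch of that pattern, using Python's stdlib sqlite3 with a made-up customers/orders schema. The "lazy loading" is simulated by hand here rather than by a real ORM, but it issues exactly the queries a lazy-loading ORM would:

```python
import sqlite3

# Hypothetical schema for illustration: customers and their orders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 20.0), (3, 2, 5.0);
""")

# N+1 pattern: one query for the orders, then one more per order to fetch
# its customer -- what a lazy-loading ORM does behind the scenes.
queries = 0
orders = conn.execute("SELECT id, customer_id, total FROM orders").fetchall()
queries += 1
for _, customer_id, _ in orders:
    conn.execute("SELECT name FROM customers WHERE id = ?", (customer_id,)).fetchone()
    queries += 1
print(queries)  # 1 + n = 4 queries for 3 orders

# The same data in a single query with a join:
rows = conn.execute("""
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchall()
print(len(rows))  # 3 rows, 1 query
```

With n orders the first approach costs n+1 round trips; the join costs one, and the database gets to plan the whole retrieval at once.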

------
protagonist_h
In my experience it's not "over-abstraction" per se that causes
performance issues. Rather, it's that people who work at the upper levels of
abstraction don't understand how the layers below work, e.g. a web developer
not understanding what SQL their ORM library generates and how the database
executes that SQL. Abstractions are supposed to hide details at a certain
level; in practice, however, you still need to understand how things work at
the lower levels, as Joel Spolsky's famous Law of Leaky Abstractions states.
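One small, concrete leak, sketched with stdlib sqlite3 and a made-up `users` table: the query text (and any ORM code above it) is identical in both halves, but the execution underneath is completely different:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(i, "u%d@example.com" % i) for i in range(1000)])

query = "SELECT id FROM users WHERE email = 'u42@example.com'"

# Without an index, SQLite scans every row to answer the query.
scan_plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(scan_plan[0][-1])   # e.g. "SCAN users"

# Add an index and the very same query becomes a lookup. Nothing in the
# calling code changed -- the performance detail lives in a lower layer.
conn.execute("CREATE INDEX idx_users_email ON users (email)")
index_plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(index_plan[0][-1])  # e.g. "SEARCH users USING INDEX idx_users_email (email=?)"
```

An ORM hides the SQL, and the SQL hides the plan; understanding the layer below is what tells you whether you just wrote a scan or a lookup.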

I've seen that problem frequently in the world of enterprise software, where
you often have many layers developed by different groups of developers.
Developers at level N treat level N-1 as a black box, which often leads to
performance issues.

------
watmough
If #1 is pervasive throughout an application, e.g. badly done SQL code, then
it can be a pain to get performance up, but with #2, there may be no way to
get the codebase back under control, or bring new developers on at a
reasonable pace.

I think the consensus is that #1 is manageable, but #2 is a potentially
project-threatening issue.

It's a shame that 'You Ain't Going to Need It' hasn't permeated all areas
yet, but with many people working at cost-plus, billed hours tend to be
maximized on the worst projects. It's a shame there's so little mechanism
for being paid more for doing a great, undramatic job than for a 50%
over-budget cluster-flock code-dump that people hate.

------
ntoshev
Well, #2 should be "the code is a mess", which is sometimes the result of too
many layers of abstraction, sometimes of improper abstractions, and sometimes
of a lack of abstractions. In my experience novices underengineer, while
experienced coders tend to overengineer.

~~~
acqq
No, it's not "a mess." The people who write it follow a lot of rules. The
problem is that following rules just "because it should be done so" and
applying them uncritically leads to very weird things. (^) I believe you
haven't seen what a lot of programmers working under a bad lead can produce in
a year or two out of simple goals. Really experienced programmers don't
"overengineer."

(^) It looks like: "public class RecursiveFactorialImplementation implements
FactorialAlgorithm" from: <http://news.ycombinator.com/item?id=2143310>
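The quoted pattern, rendered as a Python sketch. Only the two class names come from the linked thread; the factory is my own addition in the same spirit:

```python
from abc import ABC, abstractmethod
import math

# Ceremony for its own sake: an interface, an implementation class, and a
# factory, all to compute a factorial.
class FactorialAlgorithm(ABC):
    @abstractmethod
    def compute(self, n: int) -> int: ...

class RecursiveFactorialImplementation(FactorialAlgorithm):
    def compute(self, n: int) -> int:
        return 1 if n <= 1 else n * self.compute(n - 1)

class FactorialAlgorithmFactory:
    @staticmethod
    def create() -> FactorialAlgorithm:
        return RecursiveFactorialImplementation()

result = FactorialAlgorithmFactory.create().compute(5)
print(result)  # 120

# What the problem actually needed:
print(math.factorial(5))  # 120
```

Every rule followed here (program to an interface, hide construction behind a factory) is defensible in isolation; applied uncritically, they turn one line into twenty.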

~~~
ntoshev
I agree with you, I just say there are other similar sources of avoidable
complexity.

------
acqq
The results of a survey:

[http://blogs.msdn.com/b/ricom/archive/2007/02/08/performance-problems-survery-results.aspx](http://blogs.msdn.com/b/ricom/archive/2007/02/08/performance-problems-survery-results.aspx)

------
wisty
Interesting - a lot of people are saying that #2 has become more common with
time.

------
Silhouette
I would give a lot to work only on projects where there were too many levels
of abstraction. It's a very real problem, I agree, but at least it implies
that the people working on the project understand the concept of abstraction
and are trying to build a large code base with some kind of systematic design.
The most common alternative seems to be having little deliberate software
architecture at all, and that is _much_ worse.

------
pointyhat
I see #2 (over-abstraction) daily. People pick an "architecture" and almost
religiously apply it without concern as to whether or not it's too complicated
for the job at hand, which it usually is.

