
In Defense of Not-Invented-Here Syndrome (2001) - aycangulez
http://www.joelonsoftware.com/articles/fog0000000007.html
======
phaedrus
Many years ago, when I was first learning game programming, I insisted on
implementing everything myself, including low level 3-d math and writing to
video memory manually. Now, I'm working on a new game and I've created a
script-binding technology that makes it very easy for me to incorporate third
party C/C++ libraries into the game's scripting engine, like the way the Borg
assimilates technologies. So I've had experience at both extremes of NIH.
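To give a rough sense of what such script-binding looks like, here is a
minimal sketch in Python using ctypes. This is an illustrative stand-in, not
the commenter's actual C/C++ binding technology; it uses the C standard
library's abs() purely so the example is self-contained.

```python
# A hedged sketch: exposing a function from an existing C library to a
# scripting layer, in the spirit of binding third-party C/C++ code into
# a game's scripting engine. Uses libc's abs() for self-containment.
import ctypes
import ctypes.util

# Locate and load the platform's C library (e.g. libc.so.6 on Linux).
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature so ctypes marshals arguments correctly.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

def script_abs(x: int) -> int:
    """Wrapper that the scripting side calls as an ordinary function."""
    return libc.abs(x)
```

In a real engine the wrapper tables would be generated per library, which is
what makes incrementally "assimilating" new libraries cheap once the binding
layer itself exists.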

Here's what I've found: using someone else's library is NOT faster than
implementing the code yourself, if all you need is to accomplish the task at
hand. The code you write for yourself does no more than you need it to, and
you implicitly understand how it works. When you're trying to use someone
else's library to accomplish a simple task, you have to learn all the
library's concepts tangentially related to what you want to do, and then you
can spend days tracking down a silly bug because you misunderstood some small
detail of what the library is doing (i.e., the library violated the principle
of least astonishment). At best, it takes THE SAME time to learn and debug
code with a dependency as it does to implement it yourself, provided you're a
good programmer.

But I still recommend learning to reuse code from elsewhere. The benefit of
hooking up to and learning a third-party framework or library comes after the
task at hand: once you've made the initial investment, incrementally enabling
other features the library provides costs less and less compared to
reimplementing them yourself, because the initial investment of learning and
debugging the foreign code begins to pay back.

So you have to look at it as an investment that loses money in the short term
but has the potential to earn it back with interest in the long term.

But here's the most important reason to drop NIH and embrace code reuse: Do
you want to push the boundaries of what is possible? If you envision
program-space as the set of all possible programs, it must be clear that
there are programs in that space that lie outside what one team or one man can write by
himself in a lifetime. If all you're doing is accomplishing one task at hand,
by all means make the cost-benefit analysis and choose reuse or not
pragmatically. But if you're interested in pushing the boundaries of the state
of the art, then to reach the "interesting programs" that lie in program-space
outside the boundaries of what one man can write in his lifetime, you must do
it by extending the work of others who have gone before you, just as all
progress in science is accomplished.

~~~
jasonkester
You're right. It amazes me how people are so willing to drop 3rd party code
into a project to accomplish the most trivial things.

You can put together a class to do, for example, ASP.NET URL Rewriting in a
single sitting if you want to. And the thing you end up with will do exactly
what you need and nothing else. It's like 50 lines of code. And still Junior
Dev Jimmy will go find something on somebody's blog that sorta does it and
pollute your project with it.
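For a sense of scale, here is a minimal sketch of the kind of single-purpose
URL rewriter the comment describes, in Python rather than ASP.NET; the rules
and target paths are made up for illustration.

```python
# A tiny, do-exactly-one-thing URL rewriter: map pretty public paths to
# internal handler paths with a few regex rules. Rules are hypothetical.
import re

REWRITE_RULES = [
    (re.compile(r"^/articles/(\d+)$"), r"/article.py?id=\1"),
    (re.compile(r"^/users/(\w+)$"), r"/user.py?name=\1"),
]

def rewrite(path: str) -> str:
    """Return the internal path for a public path, or the path unchanged."""
    for pattern, replacement in REWRITE_RULES:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return path
```

Dropped in front of a request handler, something this size does exactly what
the project needs and nothing else, which is the whole point being made.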

It adds up quickly, and if you're not careful you'll find yourself with two
dozen random pieces of CodeProject wackiness interacting in unexpected ways
and throwing you into maintenance mode.

In my experience it's just not worth it.

If you're bringing in a 3rd party library, make sure it's for something big
and complicated that you just don't want to deal with. Credit Card processing,
Amazon Web Services integration, that sort of thing. Otherwise, take half a
day and roll your own. Your life will be a lot nicer.

~~~
_delirium
> If you're bringing in a 3rd party library, make sure it's for something big
> and complicated that you just don't want to deal with.

It should also imo be one that appears to have a decent track record of active
maintenance. If it doesn't, sooner or later you're going to be the one
maintaining it, which might not turn out to be much easier than having rolled
your own. I've seen more than one project that eventually _did_ roll their own
once their unmaintained 3rd-party library bit-rotted so much that it wasn't
worth saving.

~~~
phaedrus
Absolutely!

That's one of the top criteria (right up there with type of license) when I
evaluate whether to include a library in my game: is this library actively
maintained?

Incidentally, it's one of the things that worries me about the pthreads for
Windows library: it hasn't been updated in quite some time.

------
jasonkester
_When you're working on a really, really good team with great programmers,
everybody else's code, frankly, is bug-infested garbage_

There are two classes of code that this seems to apply to 95% of the time:
Other People's Javascript, and Web Templating Systems.

Bad Javascript is understandable, since any Joe with a copy of notepad and
"Teach Yourself Web Programming in 21 days" can (and will at some point) write
a crappy DHTML menu system and throw it out on the web. Junior Dev Jimmy will
find it and drop it into your project without asking anybody, and suddenly
it's your job to deal with it (or rewrite it, which is generally the only
thing to be done. Extra credit if you open source it and inflict it on the
next generation.)

Fortunately, the jQuery guys have managed to defy this rule, and jQuery is
doing a good job of fixing the issue. Still, "No 3rd Party Javascript" is a
good mantra if you're running a dev shop worth its salt.

Web Frameworks are the fun one. Usually they're the embodiment of the Second
System Effect, where a bunch of devs who've done exactly one project in one
framework sit down and figure out what their Dream System would look like, and
end up reinventing either Classic ASP or Cold Fusion, but in a comically
terrible way. Often they'll even get traction with it.

See FuseBox for the canonical example of taking the joke too far. Yikes!

~~~
statictype
_where a bunch of devs who've done exactly one project in one framework sit
down and figure out what their Dream System would look like, and end up
reinventing either Classic ASP or Cold Fusion, but in a comically terrible
way._

We have our own webframework that grew over time and I don't feel bad about
it. It's custom-built for our apps, so it does exactly what we need and
abstracts exactly the feature set we need. It's not more powerful than any
existing framework and is certainly not for everyone, but it suits our needs
perfectly.

I don't see writing custom frameworks or templating systems as any different
from writing a DSL.

~~~
jasonkester
_We have our own webframework that grew over time and I don't feel bad about
it._

Nor should you. You invested some time and put together something that exactly
fits your needs. Chances are you're moving faster on it than some random new
team using Rails, because you're so familiar with it and starting a new
project involves simply copying large chunks out of a half dozen previous jobs
and taping them together.

The key is that you didn't try to build your thing out into a platform and
release it as open source.

------
cperciva
Another reason for NIH: If something breaks, you have the in-house expertise
necessary to figure out why.

Of course, this breaks down as soon as people leave the company -- but for an
early startup where the founders are expected to stick around, it's much
faster to get something fixed if the author is within arm's reach.

~~~
joe_the_user
If you have sound documentation and design methodologies, then even if
someone leaves, you should be in good shape.

Whether that's going to be the case is another question, of course.

~~~
rapind
This might be true in some rare cases, but in general I disagree with it.

The people who originally designed it, no matter how well documented it is,
are much more likely to understand it and be able to extend it. Tests help
too; however, losing your initial developers, if they were any good, will
always carry a high cost. You can get by, sure, but the difference in
comprehension is pretty large.

I think there are a couple of things that cause this difficulty: 1) I don't
enjoy writing reams of documentation, and I bet most other developers don't
either. 2) I don't enjoy reading reams of documentation before I can dig in,
and again, I bet most other developers don't either.

~~~
jsharpe
This is why it's really important to do code reviews. Not just so that any
egregious practices are eliminated before they are inflicted on the codebase,
but also so that more than one person knows what's going on, and what the
original intent was, with every piece of code in the codebase.

------
canterburry
If the Excel team's compiler and widgets were so much better, why wasn't the
rest of MS re-using theirs instead? To me, this particular example does not
argue for building it yourself; it just demonstrates that re-use should have
gone in the other direction.

I do agree with the basic premise that if an available solution does not meet
expectations, the benefit of re-use does not always outweigh the inadequacies,
thus supporting the business case of building it yourself.

HOWEVER...

...there are far too many organizations who think their needs are unique and
different when they really aren't. I think that is a more common reason why
companies re-invent the wheel, only to realize after the fact that their
needs really weren't so special and they could have bought off-the-shelf
instead.

What cracks me up even more is when a company realizes that an in-house-built
system could be replaced with an off-the-shelf package: then they have to
weigh their in-house maintenance costs against going out and buying an
existing package. Buying quite frequently comes out cheaper over time, so
right after they build their shiny new enterprise app, it's discontinued and
replaced by a bought system.

~~~
RodgerTheGreat
"If the Excel team's compiler and widgets were so much better, why wasn't the
rest of MS re-using theirs instead?"

Because Microsoft has cultivated a combative internal environment in which
different projects fight for resources and maintaining a library for other
silos to use would only serve as a liability. Source control (which, big
surprise, MS handles via many in-house tools) would also become even more of a
nightmare than it is already.

~~~
bad_user
> _Source control (which, big surprise, MS handles via many in-house tools)_

Why is it a surprise? They sell developer tools. They sell infrastructure.

Actually it was kind of shitty on their part for not using Visual SourceSafe.

~~~
snprbob86
Right on: Visual SourceSafe SUCKED. Microsoft used/uses a heavily customized
internal version of Perforce called Source Depot. The entire company is being
transitioned to the actual product, Team Foundation Server, but the
transition is very slow and very painful.

~~~
faragon
In my opinion, they deserve that pain: TFS SCM is a big pile of shit (Visual
Studio is OK though, when using it with Git/Hg/SVN).

<http://whygitisbetterthanx.com/>

~~~
dfox
TFS SCM is Subversion.

~~~
faragon
Unfortunately, it is not. TFS has source control built-in, which is slow and
buggy.

<http://en.wikipedia.org/wiki/Team_Foundation_Server>

------
api
"What these hyperventilating "visionaries" overlooked is that the market pays
for value added."

THIS.

Best quote in the article, and sums up the problem with a lot of trendy recent
business BS.

~~~
ataggart
Joel is correct, but there are cases where the "trendy couple sipping
Chardonnay in their living room outsourcing everything" is in fact adding
value: namely, their entrepreneurial insight in recognizing that a market is
in disequilibrium, that the means to meet current demands are already
available but have gone unnoticed. In fact, having such insights is what
defines an entrepreneur.

I would recommend that anyone interested in the economic theory of
entrepreneurship read Israel Kirzner's seminal work, "Competition and
entrepreneurship."

<http://books.google.com/books?id=OFNVW3N9hUUC>

~~~
jsharpe
Exactly. Coordination of different systems _is_ added value. It's like
Mechanical Turk: Amazon isn't actually providing anything except a connection
between those who need work done and those who are willing to do it.

------
snprbob86
I read this years ago, but reading it again is incredibly interesting now that
I use GitHub. GitHub has changed the way I think about dependencies.

I'm no longer afraid to take a dependency and I'm no longer afraid to just
start making changes to those libraries. It's like the best of both worlds:
everyone else's code becomes my code. I've got "Not Under Version Control Here
Syndrome".

(This breaks down miserably if you are shipping boxed software and have _hard_
dependencies on other things on disk which are out of your control, like Excel
does)

~~~
skybrian
I can imagine this works great until you create so many patches that you spend
all your time upgrading your dependencies. (Or just stop upgrading.)

~~~
jsharpe
I think the point is that GitHub makes it easy to contribute your patches
back to the main project, so you don't have to spend time reapplying them to
newer versions of the dependency.

------
CapitalistCartr
I think it varies from person to person, and subject to subject. I find that
if I can write the code in a two-digit number of lines, I'm better off doing
it myself. Not by making stupid concatenated long lines, of course. I used to
write for OS/390 and got used to the 80-character line limit anyway.

If it feels like it'll take X time to write, it'll be twice as long, due to
unforeseen stuff, and then X time again to make it work, debug, etc. If it
takes a thousand lines of code, it's worth it to look around. If it's between
a hundred and a thousand, it's fuzzy.
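The rule of thumb above amounts to simple arithmetic: a gut estimate of X
becomes 2X of writing (due to unforeseen stuff) plus another X of making it
work and debugging, i.e. 3X overall. As a sketch:

```python
def realistic_estimate(gut_hours: float) -> float:
    """Triple a gut-feel estimate: twice as long to write due to
    unforeseen stuff, then the original estimate again to make it
    work, debug, etc."""
    writing = 2 * gut_hours    # "it'll be twice as long"
    debugging = gut_hours      # "then X time again to make it work"
    return writing + debugging
```

So a task that feels like a day is, by this heuristic, really three days, and
that total is what should be weighed against the cost of learning someone
else's library.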

None of this negates the value of off-the-shelf solutions. It's a matter of
time spent making my own, versus learning someone else's stuff. I can learn
anything thoroughly; it's a matter of time and motivation.

------
xsmasher
Libraries and abstractions are good to the extent that you can _trust_ them -
there are some libraries that I trust more than my own code, and others I
wouldn't touch with a 10-foot pole.

On topic, Excel was always more solid and reliable than Word or PowerPoint,
and it turns out that was no accident.

------
api
<http://c2.com/cgi/wiki?JunkyardCoding>

<http://c2.com/cgi/wiki?RubeGoldbergMachine>

I've seen that before. NIH can be pathological, but so can anti-NIH.

------
fmora
I read this and most of his articles around 2007. Some of the best articles
I've read. Unfortunately, he seems to have said most of what he was going to
say, so he has stopped writing. Oh well, too bad.

