
There's no shame in code that is simply "good enough" - Brajeshwar
http://blog.phiz.net/theres-no-shame-in-good-enough
======
breckinloggins
This actually causes "coder's block" in me when I start a new project. If I
have this cool app or site idea, it often never gets off the ground because I
spend too much time trying to create some architectural framework that I
assume I need.

This is REALLY bad for me in Cocoa/Cocoa Touch apps. Objective-C is so verbose
and writing classes is so "mechanically cumbersome" (having to write .h and .m
files and duplicate stuff, for example) that I sometimes lose interest before
I ever really get off the ground.

The trick I use to beat this tendency is to totally eschew good software
engineering principles at first. Typically all of my code lives in main.m (and
I usually start out in CodeRunner rather than XCode). Interfaces are done
sloppily in Interface Builder or entirely in code. This really helps me get
started.

Before long, of course, I have this unwieldy spaghettified mess of a single
source file. By that time, though, I have some stuff WORKING, so doing some
refactoring into a good architecture for the code I have while keeping the
working stuff working is kind of fun.

~~~
HeyLaughingBoy
I find that the limited free time (wife, kids, etc.) I have available for side
projects forces me to prioritize. If I gold-plate everything, nothing will
ever get done, so I simply don't write code that's not absolutely necessary.

I will code _very_ thin "frameworks" if it becomes obvious that it will save
time over the long haul, but there is no "this might be useful someday" code.
If it doesn't move the project forward, it doesn't get done.

If your most precious resource -- time -- is limited, it forces ruthless
prioritization.

~~~
seanp2k2
Yeah, this is basically my approach too; my first priority when "banging out
an idea" is /get it working/ and make a "minimum viable product" (or tool,
utility, script...whatever). If it's awesome, I might consider re-writing it,
learning from the mistakes or architectural flaws of my first design. Note
that the very first revision can usually be dropped in favor of a paper design
of the thing, where you can visualize the flaws before writing the first line,
and correct them before you start typing.

There are also things I have that technically work but are ugly and are never
getting fixed. I learned from them and it's fun to look back at how much
better I got over time. I could probably go back and turn everything into nice
classes and remove big blocks of commented-out code, but...it'd be pointless.
I'd rather work on something new and do it better the next time. Tackling
exciting new problems is what makes coding interesting for me, and I'd hate to
lose that "just" to be proper.

I like to think of my lang dirs in my homedir (ruby, c, python, js, php, etc)
as language-specific "coding sketchbooks" where I'm developing recipes that I
might borrow from later. It's kind of similar to being a chef, I suppose --
you finely craft and refine dishes for your day job ("staging/production"),
maybe you're lucky enough to get some paid time to work on your ideas ("dev"),
and you dedicate one night a week to culinary experiments on your own time,
maybe with friends or family ("passion").

I feel like a good programmer is like a good chef, and if you lose the
passion, you mostly stop getting better.

~~~
systemizer
It's important to get your hands dirty and do a bottom-up approach, but it is
equally important to think top-down as well. What I mean is: don't be ignorant
of the "gold plate" while you are creating the minimum viable product, but be
flexible enough that it doesn't get in your way.

------
adaml
Joel Schindall, an EE at MIT, tells a good story about this:

The ship date for a chip he was managing was just a few weeks away when he got
a call from a supplier informing him that a key component would be delayed.
Worried, he went to one of the engineers who designed the chip and told him of
the problem.

The engineer was not concerned; "I thought they might not deliver that
component, so I left extra space around it where we can add these additional
parts that do the same thing."

Joel was happy, but surprised, and he took a closer look at the chip. "But you
didn't leave extra space for other components on the chip."

"Yeah," the engineer replied, "I just didn't think those would be a problem."

One of the harder jobs in engineering and design is anticipating problems.
That engineer took the time to think carefully about the problems that might
arise with each of the components on the chip, prioritized the risk, and only
spent the time to really carefully architect the parts of the circuit that
were most likely to be trouble.

It seems that software is the same way; the best coders will assess the risk
of all the code they write, and only spend their time and energy to protect
against the problems that are most likely to arise. That leads to code that is
better than "good enough" but is not over-architected. Of course, that
intuition is really hard to develop and impossible to perfect.

------
mgkimsal
"Would an engineer design a small, single lane bridge for a rural
Northumberland village so that it could support the weight of a thousand
double decker buses? No. So why do we, as software engineers try to do exactly
this? That day will never come."

It comes every time a boss or client says "oh, and now we need it to do XYZ.
And you can't rewrite or start over. We need it tomorrow. Build on what you
have - reusable software is our goal." and so on.

No one is going to go to a bridge engineer _after_ a two lane bridge is up and
running, and that took 2 years to build, and say "ok, add 4 more lanes by next
week. Oh, and you have no budget". No one in their right mind would ever dream
of doing that. But with software we face it all the time.

Basically, many of us are conditioned to have to extend on top of whatever our
first iteration is, so we try to make the first iteration extendable. I'm not
saying it's _right_ or _good_ , but I've fallen into this trap too many times
in my career and seen it happen to too many other people to think it's not a
contributing factor.

~~~
fragsworth
I can tell you from first-hand experience, though, that if you cannot scale
immediately when the time comes, your business can drastically suffer for it.

~~~
mgkimsal
I don't disagree, but a few points spring to mind. And in my earlier post, I
wasn't really thinking about 'scaling', but new functionality/features.

Occasionally "immediately" really does mean "in the next 5 minutes".
Oftentimes it actually means more on the order of days or a couple weeks.
"Scaling" can mean different things, and anything beyond "scaling" web
requests alone will likely mean a shift in business operations that
can't/won't happen overnight either.

In _most_ situations I've been in, people request something but the rest of
the business unit really isn't ready to handle the change. "We need the new
data reports _now_ ", but when you go to make the change you realize the other
company that pulls the data to process it is on vacation and won't be back for
two weeks, and if you make the change now, it'll break everything.

~~~
fragsworth
Yeah. I was only referring to scaling in the sense that the number of users
increases. Often, it is easier to implement an application by ignoring
scalability, and you can get it out the door quicker.

However, this can be a terrible mistake if your users are not resilient to
downtime or lost data (e.g. in the case of Facebook games). Even just a few
hours of downtime or slowdown can cause a significant, permanent drop in
users.

------
rmason
Always keep in mind the great Larry Wall quote:

"Always write code as though it will be maintained by a homicidal, axe-
wielding maniac who knows where you live ..."

There are two ways to write poorly maintainable code. One is poorly organized
spaghetti code and the other is overly architected code.

Larry Wall also once said, "a programmer can write in assembly using any
language."

~~~
andyjenn
Never realised Larry Wall was a Young Ones fan.

------
steve8918
The key I've found is not worrying about how beautiful or clever the code is,
but how _maintainable_ it is.

If the code is designed properly, the only requirement I have is that I can go
back into my code and change it easily to what I now need it to do. It should
be malleable like silly putty. If I can do that easily, without requiring
massive rewrites, then this means that the code can change as my requirements
change, and that to me is a good design, and "good enough" code. So don't
worry about optimizing too early, but make sure the code can be optimized
easily when you need it to be.

If a small change in my requirements forces massive rewrites, then it means
I've coded myself into a corner, and I've done a poor job designing the code.

~~~
ams6110
This has to be tempered, though, with some consideration of how likely it is
that the requirements will change. Of course we might say "requirements always
change" but there are definitely situations where you know with a high degree
of certainty that you are writing something that will be used once or a few
times and never again, or something small enough and well defined enough that
the requirements _can't_ change much. If you worry too much about
maintainability on code that will never be maintained, you're wasting time.

------
newyear2012
Agreed with much of this!

However I disagree with "that's the nature of software engineering. You never
stop learning and evolving."

The reason: burnout. My guess is that at least 10% of developers begin to lose
enthusiasm for coding after a few years and then at some point either change
jobs, become managers, or just have very little motivation to learn having
seen the futility of it all. They may be forced to continue to learn, but may
do so at a slow pace.

Why? You write code and after years or less, it can be thrown away or unused
without much of a thought. You see that many of those driving projects really
don't have some sort of higher purpose, and other than some perceived business
need, most of it is just "wouldn't it be nice".

I feel that it is sick for a person to continue blindly learning new
technology just for the sake of it. You need to have a reason. Jobs was not my
favorite person in the world, but one thing he did right was to believe in
what he was doing and why he was doing it. Without this, any evolution is
worthless.

~~~
devs1010
I agree with this for the most part. It's one reason I have recently fallen in
love a bit with RAD frameworks; personally I have been using Spring Roo,
which, from what I understand, is somewhat similar to Ruby on Rails but for
Java. Basically, when working on a project, I want to take the shortest route
to accomplishing the end need. Maybe it won't be immediately scalable to
thousands of users or fully optimized, but as long as you don't paint yourself
into a corner, that can come later. I have noticed a lot of other developers
are so caught up in the minutiae that they can't see the forest for the trees;
they care more about endless iterations, writing test cases, etc. than just
delivering something that works.

~~~
chipsy
Agreement; the more I code the more I want to find "quick iteration" solutions
and minimize deliberate engineering of complexity.

I know a few coders who have spun their wheels for a decade or possibly longer
because they're still idealizing wheel reinvention. The reinvention is the
_easy_ part - after all, someone already did it, so you're just learning what
they did, "the hard way." What's hard is learning to leverage the ecosystem as
much as possible while bringing in original ideas; as in entrepreneurship,
there are no tutorials for that.

~~~
devs1010
Yep, I just found myself saying something like this the other day when talking
with another engineer about the issues at the company I currently work for.
It's my belief that modern web application development is primarily about
leveraging existing frameworks and libraries to deliver results; failure to
take advantage of an existing resource can make the project much more complex
to maintain and waste tons of effort programming aspects that don't
concentrate on the actual domain problem being solved.

------
thangalin
A college teacher of mine quipped to his students that if the software is
inefficient, upgrade the hardware. I have flip-flopped on this mentality over
the years. Part of me longs to write code the way Charles Lindbergh knocked
together the Spirit of St. Louis.

<http://www.charleslindbergh.com/history/sec/>

Throughout the design of his aircraft, Charles knew that his life hung on the
details. For example, rather than use an all-metal design, much of the
exterior was cotton fabric (to reduce weight). His design was pragmatic,
practical, and brilliant.

For the most part, software developers do not create systems where the
modularity, efficiency, and stability are paramount to the success of a
business or the safety of the people who use the programs. Developers often
create systems for data entry and data analysis. It is the data that allows a
business to take flight, as it were.

You can replace system front-ends in a fortnight. Dirty data, however, can
skew results and impart inflexibility in the system. Bad data can ground a
business. These days I care about the software, as my mind reminisces about
the minimalism and beautiful design imparted upon the Spirit of St. Louis. Yet
I care much more about the quality of the database and the cleanliness of
data.

------
j45
I've often noticed how software is similar to hardware in one way.

Software and hardware are both destined to hit a limit, in any current
configuration, no matter how it's built or put together.

Whatever computer we buy, it has a limit. The day will come that the
capability we originally had will not be able to power what we need. We decide
how much (and how far) to invest into the future to stay on a machine. This
can be a benefit sometimes, other times, not. Sometimes we need a computer to
be good enough to do a certain task, other times not.

Building software has a similar shelf life. All software, no matter how it is
or isn't architected, will have its limits. This can be a benefit sometimes,
other times not. When those limits are hit, you'll have to deal with it: throw
more horsepower at it, or refactor.

If there's code that isn't updated often and doesn't need to be super
performant, good code will be the same as great code.

When starting a new project, I find myself more and more asking the questions:

- How long will I need this codebase to do what it does?
- Will the codebase grow?
- How soon/often will it grow?
- Will additions be trivial / non-trivial?

Most often I now just start with an ultra lightweight MVC framework to keep my
coding semi-organized and primed to re-factor, but not much more. I have a set
of scripts that will initialize an entire project how I like and I can quickly
start hacking on a new project/idea in a few minutes.

The less I obsess over every small architectural detail, and the more I let my
decent habits take care of being reasonably kind to my future developer self,
the more I find myself having fun while being responsible.

~~~
seancron
Would you mind expanding on the MVC framework and scripts that you use to
initialize an entire project?

It sounds like that setup would really cut down on the mental overhead needed to
start a new project, and also cut down on obvious errors (typos, forgetting to
add something, etc).

~~~
j45
Sure, what language do you like to work in?

~~~
seancron
Any of these:

- Python
- Java (Android)
- HTML/CSS/JS

If a different language works better, feel free to use that instead. Thanks.

------
JoeAltmaier
Write code, expecting to iterate. Organize it with large strokes and plenty of
wiggle room.

Then if it's 'good enough', leave it at that.

If it needs work later, anybody can do it - you didn't make it too dense,
compact, or concise to easily reinterpret or diagnose.

It's not just a good idea, it's pretty much an obligation if you are paid to
create it. Keep personality out of it, complete it on time and under budget,
move on.

------
Spoom
I still experience the drive to "over-architect" a solution every so often,
especially when starting a new project. For me, it tends to be a result of
thinking, "oh, I can add that additional feature with little impact to the
timeline of the project." I have to force myself to remember the mantra:
You're Not Gonna Need It. More often than not it ends up being right.

<http://c2.com/cgi/wiki?YouArentGonnaNeedIt>

------
16s
Nice read. I really appreciate his comments about over-engineering to the
point that nothing useful actually gets accomplished. I've seen a lot of that.

And really, the over-abstraction is wasted time. I've seen things added
because "we might need them some day". These things take days or weeks to add
and then are never used by customers, even after many years have passed.

The over-abstraction seems to be a drug to some developers. They can't stop
doing it.

~~~
trusche
Not to mention that code that is only there to provide future-proofiness can
be a huge hurdle to understanding an existing codebase for any developer new
to a project. The hours spent on trying to understand code that doesn't seem
to make sense, only to be told that "we put that in just in case we needed
xyz, but it's actually not used anywhere", can be incredibly unproductive and
frustrating.

------
DrinkWater
Good article. I experienced this transition myself. Small, easy projects
unfold into super-huge constructs in your head, and you lose passion and
productivity.

------
lhnz
"Perfect is the enemy of good enough; good enough is the enemy of all." [1]

There's no shame in code that's 'good enough', but I think there's a danger in
this article of missing an important point: you can't classify your output if
you don't know its context and goals. If you're coding something that
definitely won't be used again, then make it 'good enough' for that use case;
if you need it to be extended by others over the next few months, then make it
'good enough' for that use case; and if you have a contractual obligation to
get it out to the client 'now', then stop thinking and start doing!

But paralysis is costly. If you lack the required knowledge to take a sensible
decision, then take a gamble: never hold up making a low-risk choice because
you don't know enough. The opportune time to learn what the right choice was
is once you've made it and can evaluate the outcome with data.

The other thing to note is that complexity on its own isn't bad,
complicatedness is [2]. The knowledge you've gained shouldn't be making your
code more complicated. You should spend your time on creating an arsenal of
simple solutions to complex problems. That is perfection (and you will never
reach it.)

Optimize your time by evaluating after instead of planning before; over time
create simple solutions to complex problems and use these as shortcuts to act
decisively.

[1] <http://paulbuchheit.blogspot.com/2007/04/perfect-is-enemy-of-good-enough-and.html>

[2] <http://usabilityworks.org/2006/12/13/simplicity-complexity-and-complicatedness/>

~~~
cdmoyer
I was confused about your point until I clicked the first reference. You made
a slight misquote that changes the meaning of the whole quote.

Good enough is the enemy of _at_ all.

~~~
lhnz
Oh, oops. Thanks for that! I actually miswrote it ages ago when I first read
it. When I went to riff off it today I knew that I had liked the quote but I
couldn't quite understand why. I told myself, "Ah Seb, this makes sense,
you're just too tired to understand it!" ;)

Cheers.

------
singular
Herein lies the art of programming - writing clean, elegant code that isn't
overwrought. It's hard, but the key is to keep in mind that complexity is the
enemy and simplicity is the aim.

Easier said than done :)

------
RandallBrown
I was working on a project with another, more senior developer, and I remember
saying we should refactor some part of the code or design it differently
somehow, and he said, "Why? It already works."

The only argument I could really come up with involved too many coding
buzzwords to be taken seriously and we moved on to the next task.

That's kind of stuck with me. Whenever I'm thinking about a code change or
"clever" design I just try and see if there's a justification beyond something
involving words like "abstract" or "cohesion".

------
medius
The trade-off between building quick functionality and flexible, maintainable
code is a business decision. Engineers at a startup cannot sit down and design
a system for months so that it will work for 10 million users. However, they
also cannot patch their code, have it create problems within a month, and then
have to redesign it all over again.

Once your product vision is clear, you can try to see as much into the future
as you can. You know where your product can/will go based on what problem you
are trying to solve. Once you get into "what-if" territory, you know you have
ventured too far.

Therefore, I truly believe that engineers should be aware of the business
needs and the product roadmap/vision to make such decisions. Engineers can
then decide where (and how much) flexibility should be added. Most of the
future-proofing is done for scenarios that exist only in the imagination of
the engineers. They should know what _can_ be possible and what
cannot. No system can be designed to handle all scenarios without adding
untold complexity.

------
codeonfire
There is absolutely shame in "good enough". Mainly that created by weaselly
coworkers trying to get ahead and establish technical superiority in the eyes
of the non-techie management. These people never write any code themselves but
are the first to start WTF'ing really loudly when someone else completes
something. Usually the volume and banality of things programmers complain
about identify their technical competence, with the bottom of the barrel being
complaints about whitespace and formatting. "WTF! Bob put TWO spaces instead
of ONE after method names all over the ENTIRE file! whaaah!" Yes, it
becomes this petty with some people.

So a smart programmer will strive for 'quick perfection', establish respect in
some other way to counter this, or if they really want to sink that low, fight
back with similar tactics. Smart programmers can also create review traps if
they can guess what colleagues will attack them on.

It's also good if you beat up another programmer on the first day so the others
know not to mess with you ;).

~~~
kalendae
enjoyed reading your comment, but if one is drawing too many similarities
between their work environment and prison, it could be time to find/create a
new work environment.

------
rosariom
This is the experienced developer's dilemma: "to engineer or not to engineer",
with engineering usually turning out to be over-engineering. A more experienced
developer friend of mine would always tell me: "build for today's
requirements". I try hard to fight the design/architect voices in my head that
always want to imagine this made up future where we will need X or else we
cannot go live. These voices usually only stall real actual work, instill
fear, and serve very little to no purpose.

I once had an argument with a Wall Street Java developer who was made the
"lead" of one of our team projects. He decreed that every single class have an
interface so that we could be generic and not tightly couple any of the
components to concrete classes. I agreed that interfaces made sense in some
instances, i.e. where methods with the same signatures can be implemented by
different classes, but not every single class needs an interface (if that is
the case, just go with a beautiful dynamically typed language like Python and
avoid the code bloat). He got management on his side and we went off and built
an over-engineered Straight Through Processing solution.

It was a sheer nightmare to debug, and the code bloat made me scream one day
when we had a serious production issue. Even our manager (who finally had to
look at the code once, when most of us were out on vacation, to answer some
user questions) was flabbergasted at the amount of code he had to read through
in order to answer the most trivial of questions.

One extreme example was an interface for trade references. Our trade
references were always strings with a date and some numeric value concatenated
to them. The "engineer" decided that we needed an interface for this and added
one interface and one concrete class for our trade references. I told him that
all classes needing a trade reference could just have a String instance
variable named tradeReference or something like that, and he went on to give
me a design pattern lecture. We argued for nearly 20 minutes about this silly
thing as he kept insisting that the future was unknown, so we had to
future-proof the code against unforeseeable changes. When he said this I asked
him to remove the crystal ball plugin he had in Eclipse for predicting the
future and get real. He got angry, and we had a team call to waste yet another
hour of developer time discussing this. In the call I mentioned that our trade
reference scheme had not changed in 8 years and was unlikely to change... I
lost the debate anyway. The ratio of interfaces to concrete classes across
most of the code base was largely 1:1, which did not justify this code bloat
approach.
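A hypothetical Java sketch of the contrast described above (all names and the
reference format are invented, not the actual codebase): the
interface-plus-concrete-class route versus the plain String field that today's
requirements actually call for.

```java
// The over-engineered route: an interface and a concrete class wrapping
// what is, in practice, always a concatenated date + numeric string.
interface TradeReference {
    String value();
}

class SimpleTradeReference implements TradeReference {
    private final String value;

    SimpleTradeReference(String value) {
        this.value = value;
    }

    @Override
    public String value() {
        return value;
    }
}

class OverEngineeredTrade {
    private final TradeReference tradeReference;

    OverEngineeredTrade(TradeReference tradeReference) {
        this.tradeReference = tradeReference;
    }

    String reference() {
        return tradeReference.value();
    }
}

// "Build for today's requirements": the scheme hasn't changed in 8 years,
// so a plain String field carries the same information with far less code.
class Trade {
    private final String tradeReference; // e.g. "20120104-000123" (format invented)

    Trade(String tradeReference) {
        this.tradeReference = tradeReference;
    }

    String reference() {
        return tradeReference;
    }
}
```

If the scheme ever does change, introducing a type at that point is a routine
refactoring; until then the interface adds reading overhead and buys nothing.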

Experienced developers (at least I think) seem to have these crystal balls in
their heads or IDEs and usually try to be clairvoyant when it comes down to
building a product. We need to get out of the business of over-engineering
and just do as my friend said: "build for today's requirements". It is called
software for a reason: it is soft. It can change (most likely will), can be
refactored, redesigned, and/or incrementally made better or more abstract to
accommodate changes. I am in no way saying no design, just limit it and get to
work. A successfully built product is more satisfying than the imaginations of
your head and the "perfect" engineering/scaling solution that never
materializes. Users will like you, you will like you, and the team will get an
adrenaline boost with each and every release, keeping the spirits high. Remove
the crystal ball plugin from your head/IDE, stop trying to be clairvoyant, and
be a developer.

~~~
dan00
There's a reason why this happens more often in languages like Java. Because
of the verbosity of the language, it's just painful to rewrite anything, even
if the current solution isn't all that over-engineered. So there might be a
greater tendency to over-engineer at the beginning, just to avoid rewriting,
which in the end isn't possible anyway.

I'm a full-time C++ developer, which might be a bit better in this regard than
Java, but not by much, and a hobby Haskell programmer, and one of the greatest
things about Haskell is its brevity. It makes rewriting a lot less painful, so
you don't avoid it as much.

~~~
Sandman
I disagree with your point that the verbosity of Java is the reason for over-
engineering. The refactoring capabilities of modern-day IDEs help immensely
with reducing the amount of work one has to do to make syntactic changes over
the whole codebase. So the argument that developers tend to over-engineer when
coding in Java to avoid any pains that may arise because of its verbosity
doesn't hold, IMO.

I think that the over-engineering happens simply because it's a pain to do
serious refactoring when working on large enterprise software in general,
never mind what language it's written in. The sad truth of our profession is
that the customer requirements may change quickly and drastically, requiring
us to rewrite large portions of our code, and very often we find ourselves
thinking "If I only engineered it that way instead of this way, I wouldn't
have so much trouble right now". This is why we strive to create the most
robust, flexible solution that will be able to handle any future customer
requirement. So we basically turn our code into a framework that, we hope,
will allow us to respond to change quickly. Unfortunately, we can never
predict _everything_ that the users might want, so this whole approach falls
down like a house of cards when a user requirement comes in and we need to
change a large portion of the code. I believe this is true for a sufficiently
large app written in any language, Haskell included.

~~~
dan00
"The refactoring capabilities of modern-day IDEs help immensely with reducing
the amount of work one has to do to make syntactic changes over the whole
codebase. So the argument that developers tend to over-engineer when coding in
Java to avoid any pains that may arise because of its verbosity doesn't hold,
IMO."

Refactoring tools might be nice and will help you here and there, but there's
a difference in the abstraction abilities of a language like Java compared to
a language like Haskell.

It's not only about the amount of code, but also about the complexity of the
code, when building abstractions.

Yes, a refactoring tool might help you deal with the complexity, but it's
still there and makes things more difficult.

I never understood the point of using a less capable language and then using a
tool to compensate for it, e.g. automatically generating code for it.

~~~
Sandman
_I never understood the point of using a less capable language and then using
a tool to compensate for it, e.g. automatically generating code for it._

I definitely agree with you on this one :). Sure, it's better to use a
language that lets you have less complexity even as your codebase grows quite
large. You mentioned Haskell. Since I don't have any experience with it, what
do you think is the reason that it's not used very often for building large
enterprise applications (or maybe it is, and I just don't know about them)?

~~~
dan00
"Since I don't have any experience with it, what do you think is the reason
that it's not used very often for building large enterprise applications (or
maybe it is, and I just don't know about them)?"

Well, I don't know that it's even clear why other languages are used for
enterprise software.

I don't think their technical (or whatever) superiority was the main reason.
Sometimes it seems that all that's needed is to push a language into the
mainstream with a lot of marketing and then just let it go.

At some point there are more libraries for a language, most people use that
language, and universities are teaching it, and that becomes the main reason
to use it.

Java might have been there, pushed into the mainstream, at the right time,
with the right features, which made it less complex and less error-prone
(garbage collection, no memory pointers) to use compared to C/C++.

But perhaps there's something about "object orientation", as it's implemented
in Java, that makes it easier for people to grasp, judging by all the hate
about those strange Scheme/Lisp courses in universities; or perhaps people are
just already too used to other languages.

On Haskell: at the beginning it looks very strange, especially compared to
languages like C/C++, Java, or C#, but I think most of that felt strangeness
is a matter of habit, because the mainstream languages aren't that different
from one another.

I don't think that learning Haskell was much harder for me than learning to
program in C++ or Java. Sometimes people seem to forget the challenges they
had when they learned programming for the first time.

------
goblin89
I think it all boils down to solving well-defined tasks with minimum possible
effort and time spent, and keeping in mind that the end goal is the product,
not the code.

For me, the understanding happened in similar order:

\- First, as a beginner, I solve all problems with minimum effort possible.
The goal is the product.

(It's not real programming, but working with a CMS that sometimes involves
writing code.)

\- Then I see the way to make much cooler and more “custom” products—with a
web framework. In order to be able to do that, however, I need to start doing
real programming.

\- Learning programming, I find that what I was writing before was pure crap.
I also forget that the product is the end goal, and care instead about writing
code.

\- Lots of LOC but few finished projects, until I discover that code actually
doesn't matter much. Instead, other stuff does: speed, communication,
measurement.

\- Learning to make and deliver products with minimum possible effort—that's
where I am now.

------
olliesaunders
I think what this article is referring to is the importance of retaining
perspective of the goals of the project. An experienced programmer might have
the ability to program to a very high standard but he should reserve that for
times when it is justified—not forgetting that practice or curiosity alone is
occasionally adequate justification.

You can go further with this thinking and suggest that maybe it isn't even
useful to ask “is this good enough?” Instead, ask “is this sensible given
xyz?” or “will this be worth doing?” and forget about what it means to be good
enough, or about assessing the quality of your work in absolute terms.

------
ww520
I would rather do "forgettable" development. It means that I don't have to
worry about it afterward and can forget about it. I don't have to go back to
fix tons of bugs. The software should run by itself as much as possible. It
takes much less time to develop over the entire life cycle of the software,
not just the initial sprint.

One corollary is to use simple and proven technology and libraries.

Another corollary is to have a crash-friendly design, i.e. software that can
crash at any time and recover at the next restart.

Third is to make software self-configuring, with no manual configuration.
That makes operation and scaling very simple.
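The "crash-friendly" corollary above can be sketched as a checkpoint-and-recover
pattern: persist state atomically, so a crash at any instant leaves either the
old checkpoint or the new one on disk, never a torn one. A minimal Python
sketch (the file name and the counter-style state are made up for
illustration, not anything from the comment):

```python
import json
import os

STATE_FILE = "counter_state.json"  # hypothetical checkpoint path

def load_state():
    """Recover the last checkpoint on startup; fall back to a fresh state."""
    try:
        with open(STATE_FILE) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {"processed": 0}

def save_state(state):
    """Write the checkpoint atomically: a crash mid-write leaves the old
    file intact, so the next restart still sees a consistent state."""
    tmp = STATE_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
        f.flush()
        os.fsync(f.fileno())       # make sure the bytes hit the disk
    os.replace(tmp, STATE_FILE)    # atomic rename on POSIX and Windows

if __name__ == "__main__":
    state = load_state()           # resumes wherever the last run stopped
    for _ in range(10):
        state["processed"] += 1
        save_state(state)          # safe to kill the process at any point
```

The design choice is that recovery is just the normal startup path: there is
no special "repair" mode, which is what makes the software forgettable in the
sense ww520 describes.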

------
ChrisNorstrom
For the longest time I felt guilty for using short INLINE CSS on certain
elements because of the whole "markup and styling must be kept separate".

My reasoning was that maintainability was SOOO much easier. Usually I use it
on tiny, simple elements that have only a small amount of unique CSS (maybe a
unique background for each) and that repeat a lot on one page (but no other
pages). So I really don't want to create 20+ unique IDs in the CSS and triple
the code size, all for what?

------
Avshalom
It's a nonstandard definition of optimization in many of these cases, but I
still feel like this all falls under "premature optimization is the root of
all evil".

Also, importantly, the full(er) quote is apparently "say about 97% of the
time, premature...", because even Knuth knew that sometimes you should design
for that herd of buses.

------
kaffeinecoma
Please forgive this comment which is totally unrelated to the article, but
does anyone else have trouble scrolling this page in Chrome on OSX? I often
encounter pages that give Chrome fits, but I can't quite figure out what's
causing it.

~~~
duopixel
It must be the comments; they are lazy-loaded a la Disqus, so you start
scrolling and it stutters as you scroll.

------
wr1472
In fact be proud of code that is good enough.

------
chj
my guideline is to make things happen with as few lines as possible.

~~~
jarek
You must love Perl.

------
its_so_on
One way to think about this is to treat code as a "consumable" good instead of
a "durable" one.

There is an essential difference between a paper cup and a glass.

Say you are throwing two outdoor parties a year, but otherwise will not serve
more than 8 people. The best solution, if you start with nothing, is to buy a
set of dishware for 8, and then throwaway plastic cups and plastic knives
twice a year ad hoc.

Code can be seen similarly. There are one-off solutions that are no better
than a paper cup: you can't iterate on them (like washing a plastic cup)
because they start to fall apart.

Then there is glassware. It's more expensive, but durable.

So, one approach is to look at your resources and your immediate and expected
future needs, and realize that better code is more durable but more expensive,
and that there is nothing wrong with "consumable" code that you can't wash
more than once or twice before it becomes "a mess".

Because "code is forever", we tend to think of it as not being consumed once
it has served its purpose, but given the nature of engineering, thinking of it
in exactly those terms is, in my humble opinion, quite appropriate.

Once you realize this difference, you can make strategic investments into
durable and consumable code. You usually can't fortify a paper cup, nor turn a
plastic cup into a glass one, though, so often this is a decision to make
several times over the lifetime of your "household"! :)

For personal use, if you have very little money (time/resources) there is
nothing wrong with starting with paper, buying plastic and then glass or
ceramic, spending, overall, three or four times as much money as if you had
just bought a beautiful antique set of dishware for yourself to begin with.
Often, though, that is not the real situation: realistically, you could "do
without" for a while, and then buy a durable good you won't replace.

These are difficult investment decisions for households, individuals, and
companies.

Don't discount renting, either! In this case, that could be analogous to
licensing someone else's software.

~~~
rue
> _Say you are throwing two outdoor parties a year, but otherwise will not
> serve more than 8 people. The best solution, if you start with nothing, is
> to buy a set of dishware for 8, and then throwaway plastic cups and plastic
> knives twice a year ad hoc._

Not really. Only if you've no time to look for good dishware (surprise
party?), can't afford it this instant (or fall into the common pit of not
doing the math for the long term), or absolutely do not have space for the
“extras”, which is fairly rare.

To extend the metaphor, this “disposable” code tends to end up in unexpected
places and stick around polluting the ecosystem forever.

Or you end up like some people, using disposable stuff every day and probably
having trash all over the house to show for it.

…

That said, disposable code is fine. The metaphor isn't exact, but does point
to some things to be cautious of.

~~~
sliverstorm
Plastic cups and plastic knives don't break like your fine china though. At
parties, this is a good thing.

~~~
its_so_on
Right, but forget fine china: some people prefer to give their guests real
plates, and do so when they have 10 guests. But they don't have 100 plates, so
when 100 guests show up on their lawn, they use disposable ones. The reason
doesn't matter. See my comment above about why it's hard to think of code as
being a consumable good AT ALL (for any reason).

My point wasn't really about tableware, it was about disposable versus durable
goods.

~~~
Natsu
That then points to a reasonable solution: use disposable stuff for rare
events but have a good set for everyday use.

~~~
its_so_on
i guess i was being incredibly unclear then because that was my whole point.
:/

a lot of code is for rare disposable events and doesn't need to be built like
a ship; more like a paper airplane.

