
You're a Bad Programmer. Embrace It. - Garbage
http://java.dzone.com/articles/youre-bad-programmer-embrace
======
JaredRichardson
Hi all. I'm a long time lurker, but when I saw my article made the front page,
I made an account. :)

I don't think that using tools makes us "bad"... I think pretending that we're
"good" when we really need the tools makes it much more difficult for us to get
past our egos and really look for what we need. There are so many great
practices and tools available, but most developers I've encountered stop
looking because things are "good enough". I was hoping to encourage people to
try something new. Something a "good" programmer would never need, but a "bad"
programmer would desperately want.

In regards to knowing the API versus using an IDE... spend some time with a
Ruby developer. Someone who's not used to having great IDE support. They've
been forced to learn the language. The difference in their productivity level
is amazing. Contrast them with a developer who only learned Java or .NET
and was using Eclipse or Visual Studio from day one. Those developers tend to
not learn the underlying APIs and I think it hurts their ability to know what
functionality and libraries are available to them.

Anyway, it's very cool seeing an article I wrote on HN!

------
edw519
Wait, what? OP claims that we are bad programmers because we use too many
tools:

 _We can't remember all the methods our code needs to call, so we use
autocompleting IDEs to remind us._

and then recommends using tools to become better programmers:

 _These tools flag common mistakes... really common mistakes._

I know I'm in the minority because I've had this debate with almost every
programmer I know, but I do not use an IDE for exactly this reason. I want the
"firmware" that knows how to assemble building blocks of code to be in my
brain, not in my computer. To me, writing code is like driving a stick shift:
for much of it I don't even think; it's unconscious competence. Can you
imagine driving a car with auto-complete, color coding, or drop down menus? To
this day, I get pissed off when something doesn't compile the first time,
usually because of a typo.

There's a time and a place for every tool. An IDE or test suite is like a back
brace or crutches; they are indispensable when you really need them, but they
only weaken you when you're healthy. I guess I draw the line in a different
place from OP (and almost everyone else I know).

~~~
netmau5
Why is there such massive nerd-rage when it comes to people using IDEs versus
what amounts to text editors with scripting toys?

IDEs are fantastic and I am a more productive programmer for using one. I work
with a pretty big codebase, and just having excellent cross-file navigation and
usage searches is a big time-saver. Color coding allows me to establish a
visual understanding of more code at a glance which is especially useful when
I didn't write it. Integrating with source control directly from your editor
gives you fine-grained organization when you are working on multiple changes
in parallel. Task managers (e.g., Mylyn or Atlassian), in conjunction with
source control integration, significantly reduce the overhead time that is
required to perform workflow-related management during the day. And despite my
hacker pride, I love having good debugging tools around as they are wonderful
at aiding in maintenance.

Code completion saves you time and accelerates learning. Maybe I'm in the
minority here, but I simply cannot understand why autocomplete makes you so
bad. Once your team grows to beyond 4-5 programmers, you're going to be seeing
plenty of code you didn't write yourself. A knowledge of naming conventions
and design patterns allows you to use autocomplete to help find the behavior
you want before turning to documentation or duplicating it yourself. At the
very least, it helps me type faster because I am typing less and it helps me
avoid spelling mistakes. Autocomplete is a search tool and a typing assistant;
it does not create a black hole in your brain that sucks away any memory of
what you have seen or done in the code.

If you find a function that is useful, stop and find out what it does. Why are
you calling it if you don't know what it does? If you don't remember it after
the first time, you probably will have it committed to memory by the third.
And who cares if you don't remember it? Do we really need to know every API of
every library we use? No, we used it as a tool to get the job done. We have
precious few brain cells to spend on memorizing an ever-changing dictionary;
let the computers do what they do so well.

I use Textmate and occasionally vim when the situation arises. They are
excellent tools for certain types of work. But when I'm working on a feature
in a big codebase, there is no place like my IDE. It takes care of all of us
average programmers at average companies.

~~~
anatoly
The boring answer is that there is no particular nerd-rage directed against
IDEs. There are many nerds who prefer IDEs and there are many nerds who prefer
text editors. The modern culture encourages us to self-identify as rebels
breaking away from uniformity. Thus your attention focuses on people putting
down IDEs, while a vim user's attention focuses on people putting down text
editors.

I don't use IDEs, because I find myself somewhat less productive in them.
Syntax coloring, to me, is a fantastic feature, and I use it in a text editor.
Integration with source control and usage searches are things I like to do in
a separate window, because I want them to be deliberate enough - I think
they'd break my concentration more easily if I didn't have to context-switch
to do them. I dislike autocomplete for a similar reason - it creates the
impression of my thought moving in tiny jerks all the time - even if it isn't
really true, the impression itself is a nuisance. I don't think that the
typing speed gains from autocomplete could really matter to me. The time spent
actually typing lines of source code is a very small portion of my most
productive days.

All of these things are not true for some programmers I know who feel at home
in IDEs, some of them my betters.

------
Maro
I'm a good programmer. Embrace it!

I used to think I was a bad programmer, and I was. But ever since I started my
startup, I've been concentrating on one project in one language, instead of
ten different things in four different languages. My co-founder and I found and
agreed on a practical subset of the language (C++), and we write great code.
I'm still a "bad programmer" in the sense that if I wrote Javascript code,
it'd not be great, but now I know that the key to success is to concentrate on
one thing for a long time --- and it's actually possible to be a "good
programmer" if you do that. Okay, I still don't do enough testing, but in our
case (distributed DB) the long-term smart solution is to hire a test
engineer.

~~~
kls
You, sir, get a gold star. The big reason that projects fail is "technology
soup": everyone has to get their pet project or technology into the project.
It gets worse the bigger the projects get. When you get to the enterprise,
where the big projects are, you get governance and a bunch of design by
committee, where people who are not writing the code tell the people who are,
"here is what you are going to use".

It is funny, but every time I walk into a project that is doing "fad"
programming, I see failure. When I walk in and see a bunch of guys heads down,
iterating and releasing the most crucial features first, the project always
seems to succeed. Maybe the solution is less magic technology dust, not more.

~~~
hammerdr
"heads down, iterating and releasing the most crucial features first"

That's the key.

The "number" of technologies involved is moot. Every piece of software is
built upon thousands of layers of technology. Adding one or more ingredients
to the soup doesn't necessarily ruin it.

Often the simplest way to solve a problem is to duct-tape two different items
from different technology stacks just well enough that it works for you. In
that simplest case, you may be duct-taping 2 technologies together with a 3rd.

~~~
kls
I disagree somewhat. One of the reasons the web is a battleground of corpses
that were once projects is that it became a convoluted mess. The bar to be
proficient required HTML, CSS, XML, (JSP, ASP, PHP, pick one), generally some
frameworks (taglibs, Struts, Tiles), SQL, web servers, application servers,
database servers, CMS servers that double as a bad application server; the
list sprawls out from here.

Developers had to master all of this just to put a page together. Technology
soup can bloat to the point where it is unmaintainable. When you get into the
enterprise you can add MQ, LDAP, SAP, PeopleSoft, and a slew of others that
require integration. SOA and ESBs are doing a lot to clean it up and give a
single implementation technology, but it is still a mess.

Jaguar once had the distinction of being the automobile with the most moving
parts; they also had the distinction of being one of the most unreliable
vehicles. Technology for technology's sake is always a project killer. I have
seen it too many times.

When I take over a troubled project (one of my specializations is troubled
project rescue), one of my first efforts is to reduce the amount of technology
load the project carries. That, coupled with a focus on MVP, saves a good
number of projects.

------
SanjayUttam
While this doesn't necessarily negate the "we can't remember syntax"
argument...I remember back before mobile phones were rampant, I was _really_
great at remembering phone numbers. People besides my parents were regularly
surprised at how I'd remember numbers I'd seen or read just once. Then I got a
mobile phone.

I think I probably have less than 5 numbers memorized (those of my immediate
family). In essence, using an IDE encourages us not to memorize syntax.

------
j_baker
Am I the only one who feels the whole egoless programming thing has gone too
far? Yeah, ego is a bad thing. But self-esteem and confidence aren't. I'm not
afraid to say that I feel I'm good at what I do.

That doesn't mean I'm never wrong or that I don't make mistakes. It just means
I'm proud of my abilities.

------
devmonk
Great post! Some suggestions that were new to me, too, although you're mixing
Java and Ruby in your examples.

I agree with everything except the DSL stuff at the end. People have mentioned
on HN recently, and I believe it is true, that Cucumber, etc. may be something
customers get involved with in the beginning, but then they lose interest in
it. Sure there are a number of other things that vary from place to place like
politics, conformance to standards, etc. But, customers typically just want to
see results (in a reasonable amount of time). If you spend time on good
design/UI/UE (read "nice looking and easy to use"), it is fast, it doesn't
break, and you've tweaked it to their needs, you're set.

~~~
JaredRichardson
I've spent more time in Ruby and Java than other languages. My background is
leaking through. :) Feel free to post any other tools to the comments on the
article.

Re: DSLs... I don't think the DSL specs always stay in front of the customer
(they sometimes do), but they're a valuable tool. Whether it's a developer, a
technical customer, or a non technical customer, you've provided a much
simpler way to show what the system does. That's a huge win.

I also find that developers who spend time in DSLs tend to write much cleaner
code. Once you get used to that level of abstraction, you use it in other
places.

------
reynolds
I can't imagine using autocomplete or even an IDE. I remember when I first
started I was writing C++ on Windows in Visual C++. My learning only
accelerated once I moved off Windows and stopped using autocomplete. Once I
started working on Linux and in vim, I didn't even consider looking back.

I like to think I'm a minimalist when it comes to my tools, so obviously my
experiences and insights are very different from someone working in an IDE. I
prefer the command line to a gui any day.

------
tomlin
I've always believed that I am a pretty terrible programmer.

I'm cool with that because I know I am gradually becoming better and better
each day.

I invoke 3 core principles/ideals:

* No matter how good you are, someone is always better than you. Yes. Always.

* Surround yourself with people smarter than you, rather than shun them egotistically. Intelligence is contagious.

* Admit fault, show humility to others who do the same.

------
weego
Does anyone still believe there are really any groundbreaking insights to be
made in these kinds of self-loathing articles?

Shouldn't there be more "You know what, sometimes just having something work
is pretty much awesome" articles, instead of everyone fooling themselves that
every programmer in the world needs to be a mini-genius?

------
ndl
A big theme in programming is to get as much complexity out of one's head and
into the computer as possible. In this regard, I consider the IDE an extension
of the high level language. Whether I use one depends on what I'm coding in.
I'm usually fine with gedit when coding Python. In Java, the IDE helps cut
through the verbosity.

There is some truth to this article - humans are much worse than machines at
simple, regular processes. Manufacturing eventually figured out how to divide
up labor along the assembly line and automate as much as possible to make the
process regular. Software is comparatively in its infancy. Maybe we will see a
similar progression as companies replace the artistry of software with more
standardized, repeatable processes. I probably won't be coding then.

Kind of reminds me of <http://xkcd.com/378/>

------
twymer
The article starts off a little depressing, telling the reader that humans are
bad at programming and we have no hope. At least it finishes by telling us we
just need to use a bunch of cool toys to look smarter.

------
argv_empty
_We can't remember all the methods our code needs to call, so we use
autocompleting IDEs to remind us._

I thought memorizing all of the details of these was only important in school,
and even then only at exam time.

~~~
points
Remembering things is _VERY_ important. It means you have an instant grasp of
the code base, the APIs available to you. It means you can solve problems much
quicker than someone who only has a limited memory, or uses auto-complete all
the time.

~~~
kenjackson
Remembering the _right_ things is very important. I know the code base far
better now with an IDE than I ever did without one. Because it lets me focus
on what is really important in the code.

I get to focus on code structure and algorithms; I can quickly drill into the
code and back out. I can in a few seconds see all uses or all definitions.

The thing that makes IDEs great, IMO, is when I'm looking at code that isn't
mine. The ability to quickly abstract structure from other people's code is a
godsend.

------
TamDenholm
Personally I don't use an IDE; I use gedit because it does what I want: colour
coding, auto indentation, and a file browser pane that supports (S)FTP (but I
suppose that's actually the OS). That's all I'll ever want or need.

If you want to use an IDE, then by all means go ahead, but if you're trying to
hire me, don't make me use it, and don't make me use an OS I don't like
either.

Everyone's different and they've got their own way of working. Just let them
do it their way and they'll be more productive; they'll achieve what you're
after.

------
JaredRichardson
I've posted a follow up article. You're a Bad Manager. Embrace It.
<http://agile.dzone.com/articles/youre-bad-manager-embrace-it>

I'd appreciate any help spreading the word. I'm much more involved with the
technical sites than the managerial ones.

------
jister
A short bio at the bottom of the article says:

"Jared Richardson works at Logos Technologies As a recognized expert in the
software industry..."

Take note of the word EXPERT. He claims developers are bad programmers, yet he
is an expert. It's contradictory. Or maybe the right word in the bio should be
WORST? :P

~~~
JaredRichardson
Nice catch! It's just marketing though. Don't read too much into that...

------
Tichy
I am struggling with that: how to market myself as a consultant, knowing that
all code sucks?

~~~
patio11
"I made X several million dollars." Why talk about code?

~~~
Tichy
If I knew how to make X several million dollars, I would probably also know
how to make myself several million dollars. I agree that if I had made X
several million dollars, it would be a good hook.

~~~
patio11
So you know my whole shtick is engineering marketing outcomes, right? I'm
working for a client right now, doing SEO/conversion optimization/metrics/etc.
It is quite reasonable to expect that I will increase their business by 5%. (I
quoted a much, much higher number than that.) I could do that for myself, too,
no problem.

Step #1 to making myself a million doing it: grow a business to where it gets
$20 million in sales annually. Step #2: a successful A/B test.

Or, alternatively, make $CLIENT a million and get paid handsomely, without the
"build a business for a decade" step.

~~~
Tichy
Good idea, and I'd actually like to get into that area of work. Will study
your articles as a starting point :-)

------
known
You're a good programmer if you _reuse_ code.

------
jasonlotito
It reminds me of the saying (probably butchered here) that the more we learn
about a topic, the more we realize how little we really know.

------
kahawe
This simple and completely true statement never ceases to fascinate me:

> In short, as an industry, we all stink.

We really do stink, and as an industry we are nowhere near a state where there
are any real standards and best practices in place like, e.g., when building a
skyscraper or a road. Those are pretty damn complicated tasks, and there is so
much knowledge, experience, and best practice involved; everyone in the
construction industry (should..) knows them, and when it is done you can have
it checked and certified that it was well built and won't crumble the next
day.

I don't think anything like this even remotely exists in IT. We have a lot of
RFCs describing protocols and whatnot, but nobody can really objectively
certify your serious-business software as well-built, or verify whether you
applied even the smallest best practices or common-sense guidelines, because
there are so many almost religious wars being fought over completely minor
advantages and disadvantages which simply do not matter on a global scale.

And far too many really, really bad programmers just get away with their mess
of horrible, insane code.

~~~
mechanical_fish
We need to remember that computers are young. Really, really young. The
invention of the first-ever electronic digital computer is still an event
within living memory.

This can be hard to remember. Especially when you yourself are young, you're
used to thinking of things that are even _slightly_ older than you as eternal.
(When you reach my age, and you are beginning to have colleagues who literally
weren't alive during formative periods of your early career, you get a deeper
appreciation for the fact that people -- including yourself -- have these
horizons.) For example, I did graduate work on lasers, and I built my work on
top of a great deal of earlier work, so I tend to think of lasers as things
that have been around for a long time. But the laser isn't very old. It turned
fifty this year. Many of my friends are older than the laser. Many of its
early pioneers are still wandering around.

Look up the early years of engineering. Read certain chapters of the excellent
_Structures, or Why Things Don't Fall Down_ , or for a more personal view
check out Chapter 20 of Mark Twain's _Life on the Mississippi_ :

<http://www.classicreader.com/book/2886/21/>

The early history of tall buildings is the history of towers falling down. The
early history of modern bridges is the history of bridges collapsing. The
early history of steam-powered transportation is the history of boiler
explosions, especially in the USA, which had a reputation for quick-and-dirty
mechanical hacks as far back as the early 19th century. These situations took
_decades_ to change... or longer. Sometimes an order of magnitude longer.

And, crazy ideas about the Singularity notwithstanding, technology proceeds at
a human pace. Right now, we're still at the point where substantial portions
of the world population don't even have access to a computer (although the
smart phone promises to change that). Very few people know how to program at
any level. And that is the major issue facing programmers today, as it has
been since the microcomputer was invented in the 1970s: There is more value in
spreading the temporary hacks around more widely and cheaply than there is in
inventing more permanent and solid stuff. Until we can meet the insatiable and
_growing_ demand for poor implementations of 1970s-era computing technologies,
there's really no time or money for anything else. This condition is probably
temporary, but when I say "temporary" I may still be talking on the scale of
decades.

~~~
demallien
No. Although your point about the youth of the system involved being a
critical determinant of its reliability is correct, you make an error in
naming that system as "computers".

"Computers" are reliable these days. We rarely have design problems with the
actual hardware (ok, sometimes we have higher than desired failure rates, but
most of the time, computers these days are very reliable). They function
pretty much as we expect them to. As proof, they are so reliable that when we
have a bug in our programs, we no longer even think to point the finger at the
hardware. Instead, we point the finger at the software (though in too many
cases devs still tend to point the finger at _other_ people's code, when they
should look at their own code first - it is the youngest, and hence least
reliable, as a general rule).

The problem is software. I seem to recall pg writing an essay about this - we
aren't making copies of something that already exists, we're making _new_
things, things that have never even been dreamt of before. Of course we're
going to have problems. I've watched in amazement as Rails has gone from a
not-particularly-reliable framework to something relatively solid and flexible
in a matter of years. MacRuby has gone from a dream to one of the best ruby
implementations out there over the same sort of time frame. The list could go
on and on.

But getting reliability out of a system takes time. As a rough heuristic, I
think you could say that you need about a year of catching and fixing bugs in
a product from the time that you stop adding features for every year that the
product took to develop. If you add features, the new parts won't be right
until another year has gone by (although hopefully unit tests should prevent
regressions in the existing code).

Of course, in a shop with sloppy coding standards, you may never get a stable
product out the other end, but I think that most competent engineers these
days know the techniques that will give a reliable system: verifying inputs
to APIs, unit tests, documentation, specs.
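To make those last two techniques concrete, here's a minimal sketch of
verifying inputs at an API boundary plus a unit test guarding the contract.
The function and names are hypothetical, and Python is used just for brevity:

```python
import unittest

def withdraw(balance, amount):
    """Return the new balance after withdrawing `amount`.

    Inputs are verified up front so bad calls fail loudly at the API
    boundary instead of corrupting state somewhere downstream.
    """
    if isinstance(amount, bool) or not isinstance(amount, (int, float)):
        raise TypeError("amount must be a number")
    if amount < 0:
        raise ValueError("amount must be non-negative")
    if amount > balance:
        raise ValueError("amount exceeds balance")
    return balance - amount

class WithdrawTest(unittest.TestCase):
    """Unit tests pin down the contract against regressions."""

    def test_normal_withdrawal(self):
        self.assertEqual(withdraw(100, 30), 70)

    def test_rejects_overdraft(self):
        with self.assertRaises(ValueError):
            withdraw(100, 150)

# Run with: python -m unittest <module_name>
```

Nothing exotic, which is rather the point: the techniques are cheap and well
understood, even if not universally applied.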

~~~
RyanMcGreal
>We rarely have design problems with the actual hardware

Hard drives, still the main data storage hardware in most computers, are
_utterly_ unreliable.

~~~
demallien
No, you're missing my point. Consumer hard drives are unreliable because
consumers would rather save a few dollars than pay for a reliable system.
That's one end of the spectrum. The other end is, say, a bank, where databases
are backed up remotely in conjunction with high-reliability RAID drives to
achieve a very high level of reliability.

It's not that we aren't capable of designing systems that are reliable, but
rather that we don't value the reliability as much as it costs. And that's
because even a crappy consumer drive is 'good enough' for most people,
unreliability and all. If it's not, we put in some sort of a back-up system to
mitigate the risk.

In the early days of computers, that just wasn't so. If you're old enough, you
probably remember having a C64 game on a cassette that you just couldn't get
to load. Or a floppy disk that became corrupted because it got heated to over
50°C and some bits got jumbled. Go back even further when only prototypes of
these devices existed in labs, and the situation was even worse. Well, most
software is more akin to the prototype in the lab than a product on a shelf.
It hasn't been thoroughly tested, it is still under development, we don't
understand the limitations of the system, etc etc etc.

