Signs that you are a bad programmer (yacoset.com)
450 points by okal on Oct 19, 2011 | 168 comments



I wish articles like this would namespace their assertions by telling us what they mean by "good" or "bad", so we could avoid the perennial echo chamber debate that goes like this:

    A: GOOD MEANS SHIPPING AND PLEASING YOUR CUSTOMERS
    B: NO YOU FOOL! GOOD MEANS WRITING CLEAR CODE THAT OTHER HACKERS CAN READ
    C: NO YOU FOOLS! GOOD IS A HAPPY MEDIUM BETWEEN BOTH OF THOSE THINGS
    D: DEBATE IS HARD, LET'S GO SHOPPING!
God forbid we agree on what words mean before we talk about them...


Except this is sort of a cop-out, isn't it? In the sense that, for any type of artist, there are good artists and bad artists, but while the good artists may be highly differentiated, bad artists share a common set of deficiencies.

The article is almost in the 'you might be a <foo> if <bar>' style (the most common instantiation I am familiar with is 'you might be a redneck if <insert denigrating attribute of redneck>'). And in the spirit of that humor I liked it.

That being said, if you ask me as a hiring manager what I consider 'good' programming, it's 'readable/understandable', it does what it is supposed to do, and it has tests.


Anna Karenina would have you believe that there are more ways to fail than to succeed, but I would argue the opposite in most cases involving creative endeavors:

Bad programmers are all alike; every good programmer is good in his or her own way.


Props just for the Anna Karenina reference :-)


"Good" already has a meaning, you can't arbitrarily redefine it. You have to actually figure out what the most "good" tradeoff is between shipping and writing maintainable code.


Good has a meaning, but it is a meaning that is highly context-sensitive.

When talking about "Good cars", for instance, what makes a good car for a mother of 5 is likely different from what makes a good car for a single person who is very worried about the environment, and both of those are different from a "Good car" for drag racing.


Yes, and likewise someone can be a very good web programmer while being a crappy embedded programmer. But there are common factors involved: likewise, a car that doesn't run probably isn't a good car regardless of your requirements. (OK, maybe you're a hobbyist who likes working on cars when they break down, or maybe you enjoy having broken-down cars on your lawn for decoration. But by the same token, maybe you're a hobbyist who enjoys the IOCCC.)

Defining the word "good" is a philosophical rabbit hole for sure, but if someone comes out and says "a good programmer is someone who writes maintainable code", and someone else comes out and says "a good programmer is someone who writes fast code", and someone else comes along and says "a good programmer is someone who writes code that pleases the customer", you're still having a meaningful discussion about which "good" is better, so to speak.


"Good" is also arguably subjective and relative to the subject's perceptions of "bad"(and vice-versa).

So of course a "good" tradeoff to one programmer might be a "bad" one to another.


That's a "good" point.


I laughed, you bastard.


I'm not bad. I'm just drawn that way.


Thanks dude... everybody has their own evaluation criteria in their mind (which are not disclosed), and people start calling things good or bad. Most of the debates end up being a very unscientific way of evaluating something!


Not sure this relates 100%, since the article is mostly talking about your understanding of concepts.

If programmer A understands recursion and programmer B does not, programmer A will write better code regardless of deadlines.


The opposite approach is much simpler. There's only one sign that you are a great programmer:

Clients and fellow programmers are still happy with your work two years after you've delivered it.

(Of course: "still" implies that they were happy at delivery, which includes actually shipping working software in a timely fashion.)


"Clients and fellow programmers are still happy with your work two years after you've delivered it."

Disagree. The first system I wrote kept the client happy for years. It was a mess, but it was literally millions of dollars better than not having any system.

If what you mean is something like "focus on doing work of long-term value, and consider programming skill as such an input", then I'd agree with that.


"and fellow programmers"

Did your client ever hire programmers to maintain and improve the first system you wrote? Were those programmers happy with your work?

I guess if no one is ever going to maintain your code it doesn't matter as much. I wouldn't ever be so bold as to predict ahead of time whether that's going to be the case.


It also assumes those fellow programmers are also good programmers.

Can a mediocre programmer tell the difference between mediocre code and good code?

To the original point, the happiness of the client and other developers is a good indicator, but not the only one.


That's true. It's not so much about other programmers being passive critics of your code as it is about the maintenance programmer down the line being able to understand what your code is doing and either fix bugs or add features to it.

The maintenance programmer should be competent, but shouldn't have to be brilliant to understand what's going on (except in the rare case that your code actually is doing something brilliant). If the maintenance programmer doesn't understand a basic language feature, it's his fault if he doesn't understand your use of it (be it ternary operators, blocks, list comprehensions, regular expressions...). If the maintenance programmer does understand ternary operators but you do something like nest them four layers deep, it's probably your fault if he gets lost.
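
For concreteness, a hypothetical illustration of "four layers deep" (the function name and thresholds here are invented); it compiles fine, but the next maintainer pays for it:

  /* hypothetical: a conditional nested four levels deep with the ternary operator */
  int grade(int score) {
      return score > 90 ? 4
           : score > 80 ? 3
           : score > 70 ? 2
           : score > 60 ? 1
           : 0;
  }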


  Can a mediocre programmer tell the difference between mediocre code and good code?
Maybe not directly, but they can probably tell how well they understand the code and how hard it is to change. Not as well as a good programmer, since they'll have more trouble with both regardless, but somewhat.


My bad; thanks to philwelch and saraid for flagging the error. The resulting discussion is interesting, as is 13 upvotes for a post based on a misreading :-). FWIW, my favorite bit on programmers understanding each other's work is Yegge's Done and Gets Things Smart.


That was an AND, not an OR.


This article stands out from many that point out common errors. This one also provides great* tips on how to improve -- even a "contrapositive" link: http://www.southsearepublic.org/article/2024/read/what_makes...

* Disclaimer: maybe I've seen so many "you're doing it wrong" posts that I see something in the opposite direction and get too excited.

But then, that's what makes something like PG's "Great Hackers" (http://www.paulgraham.com/gh.html) rare. It takes a lot of things: education, some innate ability, good habits, work environment -- including a good team...

Innate ability isn't a huge factor, but writing wonderful code definitely isn't for everyone. So then, what is to be done with all the mediocre code being produced? I don't see the world overrun by it, long term: code is too expensive. But to get there, something definitely needs to be done.


This is the only metric that matters. Ability to achieve. Everything else is a tool.


This is a ridiculous assertion. I believe it's the kind of attitude that keeps our profession from gaining respect among other professional groups.

I've cooked meals for people that were impressed and happy; does that make me a great chef? Perhaps the people I fed are easy to please, have simple tastes, and spend most days eating unflavored oatmeal.

If I'm capable of changing a car's spark plugs, and it continues to run well, does that make me a great mechanic?

People having low or incorrect expectations does not change the overall quality of something you create or deliver.


I wonder if the test is closer to one that writers face: you're probably a decent writer if people value your work enough to read it. Likewise, whatever else one can say of programmers, you might be a reasonably decent one if someone else values your work enough to use it. "Reading" and "using" in this case might be similar.


Well, I wonder. As the neighborhood bookclub's pain in the ass (I did not seek the vocation; it found me) I see quite a number of books with great blurbs, and even the occasional award, that I think are very sloppily done. A large number of best sellers seem to scratch an itch widely but briefly felt. Will anybody read _The Da Vinci Code_ in 2025?


I would definitely stipulate that the clients applied to this metric need to be "clueful". You might give them exactly what they asked for, only to have them discover that what they asked for wasn't really what they wanted; "clients from hell" might ask you to re-write it for free, or at least be cross with you for not reading their minds. In any case, they won't be happy.

Ignore those.


I'm not sure being a great programmer (as in being great at the craft) is the same as shipping on time & being successful. It's quite possible to write crappy code but still have a usable product that satisfies the customer much more than the previous pile of crap.


I think the very first judgement you should make about a piece of software is whether it works or not. After all the user will never care about anything else.

That being said, if you are in a team and you know you all will have to extend or maintain the code at some point, then discussion of coding style, program features, choice of data structures, etc, is healthy. These things are important for a growing application, getting them right will make the software more robust and make maintenance more efficient.


That could also be a sign that you work in an industry that doesn't change very much.


I suspect that people who don't "get" pointers (#4 in the article) actually have a much harder time with the pointer declaration and manipulation syntax in C than actually understanding how pointers work and what they let you do.

For instance, in C the * character is used both to declare a pointer and to dereference one and they can be stacked to dereference nested structures. Then & references a variable memory location but is also used in method signatures to change the semantics of calling a function into pass-by-reference. To complicate & further, & is often seen alongside const declarations, which are a whole other thing that people have to keep in their heads.

On top of all of this, * and & have their own operator precedences and associativities that you have to memorize, which is a whole discussion about binary versus unary operators and precedence tables. And I didn't even mention [].

I really don't think that people fail to comprehend pointers because their mental models need updating nearly as often as they just haven't fully internalized all the hoops that C makes you jump through to mess with pointers.
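
To make those hoops concrete, here is a small hypothetical sampler (every line is legal C; the difficulty is in the notation, not in the idea of "a value that holds an address"):

  #include <stdio.h>

  int main(void) {
      int x = 7;
      int *p = &x;            /* '*' in a declaration: p is a pointer to int  */
      *p = 8;                 /* '*' in an expression: dereference p          */
      const int *cp = &x;     /* pointer to const int: *cp can't be assigned  */
      int *const qp = &x;     /* const pointer to int: qp can't be reseated   */
      (void)cp; (void)qp;     /* silence unused-variable warnings             */

      int a[3] = {1, 2, 3};
      printf("%d\n", *p);                /* prints 8                          */
      printf("%d\n", *(a + 1) == a[1]);  /* [] is sugar for * and +: prints 1 */
      return 0;
  }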


> For instance, in C the * character is used both to declare a pointer and to dereference one and they can be stacked to dereference nested structures.

This problem can be solved by thinking of * as always dereferencing a pointer. So:

  int *a; /* "*a" is an int, thus a is a pointer to one */
  int *b, *c; /* *b and *c are both ints */
This also has the nice side effect of explaining why good C programmers write the * next to the variable name, and why multiple variables declared on the same line each require their own *.

> Then & references a variable memory location but is also used in method signatures to change the semantics of calling a function into pass-by-reference. To complicate & further, & is often seen alongside const declarations, which are a whole other thing that people have to keep in their heads.

Now it sounds like you're talking about C++ moreso than C. In C, & is pretty much always the "address-of" operator (except in contexts where it's clearly logical- or bitwise-AND).

From your post, it seems that the problem is more one of poor teaching, poor explanations (metaphors), or poorly-designed extensions (C++) as opposed to shortcomings of C.


I second the remarks about poor teaching (though C type syntax does suck hard). The main problem probably comes from the fact that ordinary variables are themselves an indirection of sorts, and programming courses do not make that clear (we tend to confuse the variable and the value it holds). Excerpt from my article on assignment[1]:

[The pervasive use of the assignment statement] influenced many programming languages and programming courses. This resulted in a confusion akin to the classic confusion of the map and the territory.

Compare these two programs:

  (* Ocaml *)        │    # most imperative languages
  let x = ref 1      │    int x = 1
  and y = ref 42     │    int y = 42
  in x := !y;        │    x := y
     print_int !x    │    print(x)
In Ocaml, the assignment statement is discouraged. We can only use it on "references" (variables). By using the "ref" keyword, the Ocaml program makes explicit that x is a variable, which holds an integer. Likewise, the "!" operator explicitly accesses the value of a variable. The indirection is explicit.

Imperative languages don't discourage the use of the assignment statement. For the sake of brevity, they don't explicitly distinguish values and variables. Disambiguation is made from context: on the left-hand side of assignment statements, "x" refers to the variable itself. Elsewhere, it refers to its value. The indirection is implicit.

Having this indirection implicit leads to many language abuses. Here, we might say "x is equal to 1, then changed to be equal to y". Taking this sentence literally would be making three mistakes:

(1) x is a variable. It can't be equal to 1, which is a value (an integer, here). A variable is not the value it contains.

(2) x and y are not equal, and will never be. They are distinct variables. They can hold the same value, though.

(3) x itself doesn't change. Ever. The value it holds is just replaced by another.

The gap between language abuse and actual misconception is small. Experts can easily tell a variable from a value, but non-specialists often don't. That's probably why C pointers are so hard. They introduce an extra level of indirection. An int* in C is roughly equivalent to an int ref ref in Ocaml (plus pointer arithmetic). If variables themselves aren't understood, no wonder pointers look like pure magic.
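
A minimal C-only sketch of that rough correspondence (the variable names are just for illustration):

  #include <stdio.h>

  int main(void) {
      int v = 1;      /* an ordinary variable: one implicit indirection to a value ('int ref')  */
      int *p = &v;    /* a pointer variable: one more level ('int ref ref')                     */
      int **pp = &p;  /* pointer to pointer: yet another level ('int ref ref ref')              */
      printf("%d %d %d\n", v, *p, **pp);  /* all three print 1 */
      return 0;
  }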

[1]: http://www.loup-vaillant.fr/articles/assignment


Thanks for the informative reply and the excellent blog post!


I'm actually astonished by the hero-worship surrounding the inventor of C. It's got so many embarrassing flaws, or had at any rate; it's been patched and bandaged for decades.

Sure, say that hindsight is 20/20, but I knew C was busted long before I learned anything about objects or monads etc. The syntax is goofy and backward, as you say (I thought that the first time I ever saw a C program, decades ago); its promise as an expression-based syntax was botched and mortgaged, resulting in generations of hacks; its fundamental expression evaluation syntax was broken to begin with, and as patched it's cumbersome and baffling at times. </rant>


I found the "Alternative careers" hilarious.

My version:

  Your code sucks.
  Alternative careers: "Do you want fries with that?"
OP's version:

  1. Inability to determine the order of program execution
  Alternative careers: Electrician, Plumber, Architect, Civil engineer
  
  2. Insufficient ability to think abstractly
  Alternative careers: Contract negotiator, Method actor
  
  3. Collyer Brothers syndrome
  Alternative careers: Antique dealer, Bag lady
  
  4. Dysfunctional sense of causality
  Alternative careers: Playing the slot machines in Vegas
  
  5. Indifference to outcomes
  Alternative careers: Debt collection, Telemarketing
So the real test of a good programmer is one who can write a routine that crawls the source code of a bad programmer and tells them what they should really be doing.


I have to wonder if someone who can't determine order of execution should be an electrician or plumber either -- those aren't static systems!


I guess the point is that there's no "privileged" position in these systems which changes with time, like the point of execution in programming.

What is the proper name for the bit of code that's currently being executed? I've suddenly realised that it's a concept you think about constantly when coding, but it's so much a part of the scenery, so to speak, that you never consciously consider it.


At the hardware level it's called the program counter, which is a register containing the address of the current operation the processor is running. After the operation is finished, it increments itself and moves to the next step in the program.

Edit: just realized I didn't really answer the question; I believe what you're actually talking about is a statement.

I usually jump at the chance to talk about stuff from my old EE days :)
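
A toy sketch of that idea (not any real instruction set; the opcode names are invented): pc below plays the role of the program counter, indexing the instruction being fetched and executed.

  #include <stdio.h>

  enum op { PUSH1, DOUBLE, PRINT, HALT };

  int main(void) {
      enum op program[] = { PUSH1, DOUBLE, DOUBLE, PRINT, HALT };
      int acc = 0;
      int pc = 0;                      /* the "program counter" */
      for (;;) {
          enum op cur = program[pc++]; /* fetch the current instruction, then advance */
          switch (cur) {
              case PUSH1:  acc = 1;             break;
              case DOUBLE: acc *= 2;            break;
              case PRINT:  printf("%d\n", acc); break;  /* prints 4 */
              case HALT:   return 0;
          }
      }
  }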


The program counter isn't the only element of that "what the program is doing now" state though. The context in which the current instruction is being executed makes a world of difference.


And the ability to save all of that into a variable so that you can pass it around to other parts of your program means that your programming language supports Continuations.


inclusive of that, the best term is probably "machine state" or similar.


I consider the "context" the scope. What is the scope in which the current statement is being executed...


On the amd64/i386 platforms, the Instruction Pointer CPU register points to the executed instruction. Its name is IP for 16-bit code, EIP (extended) for 32-bit code, or RIP (register) for 64-bit code.


Current instruction or active instruction (at the processor level)

Current statement or active statement at the code level


That's an interesting point.

I know that when I'm in the flow coding, there is a sort of Tick:the-program-does-this, Tock:the-program-does-that stepwise execution of the code in my head.

There are many answers as to what the hardware is actually doing with the instructions, but more generally, what exactly is the name of an imaginary discrete state of a program that currently only exists in a coder's head?


> but more generally, what exactly is the name of an imaginary discrete state of a program that currently only exists in a coder's head?

This is what I'm trying to get at. There isn't one is there? It's weird.


What is the proper name for the bit of code that's currently being executed?

I would have used 'Cursor' or 'Program Cursor'. But I'm now convincing myself that that's a database term.


Top of the stack?



Program counter?


I'm glad that as a purely functional programmer I don't have to worry about order of program execution so much.


Unless you are using a magical computer, you still have to worry; there are resources you must manage that depend on the underlying OS and von Neumann architecture.

An example that bites Haskell programmers is space leaks due to too much partial evaluation and not enough full evaluation. Other problems include keeping a file open for too long and leaking the fd. An ideal computer has infinite space and no files. A real one... nope :)

The advantage of functional programming languages is that you get to write part of your program as though you have an ideal lambda calculus evaluator. But at some point, you still have to realize that there's a grey box under your desk that doesn't know what a lambda or fixed point is. (But this is better than programming languages that say "hey, you've got a heap, a stack, and some registers! well, so long!")


Yes. Notice my choice of words. "Not so much" vs "not at all."


In fairness, "so much" fits the "weasel words" category (https://en.wikipedia.org/wiki/Weasel_word). You and jrockway placed different values on the phrase because it is a "numerically vague expression". I admit to making similar expressions. :)


What a condescending piece of junk. It serves absolutely no other purpose than to make programmers who do grok everything mentioned feel good about themselves.

Worse yet, programmers who could actually benefit from an article like this (i.e. programmers who shouldn't be programmers) won't understand it. E.g. "(Functional) Creating multiple versions of the same algorithm to handle different types or operators, rather than passing high-level functions to a generic implementation" - a bad programmer won't understand what is meant here, so he will simply skip over it and not get the hint.

Finally, I find the notion that you can't be a good programmer if you've never really used Lisp laughable.


The actual quote about Lisp is: "5. Lisp is opaque to you". I don't think the criterion is whether you used Lisp or not. He probably meant: "could you understand Lisp if you tried? If you can't, then that is a red flag".


Actually I found this article quite amusing. I've done almost every single one of the things he mentions. Perhaps these are universal mistakes programmers make as they are learning? Anyway, I am pleased I'm not the only one.


I think that if somebody just skips over programming-related things that they don't understand, it might be indicative that they are destined to be bad at programming.


I paused for a bit when I read, "A programmer can't acquire this kind of knowledge without slowing down." It seems that many businesses that rely on programmers are unwilling to give them the time to be better.

I have seen coders with CS degrees from top-tier schools believe that they couldn't code anymore when placed in environments that do not mentor them, that treat them as horses to be run until their backs break.

If you feel this way, take a part-time job and focus on becoming a better coder. If you are a business that does this... no wonder you waste money on recruiters.


I'm in this position now. I'm not a great developer, and I love learning new tricks, but because I'm billed hourly, am constantly interrupted, and get brought in to useless meetings, I'm left with zero time to learn.


The other problem with this is that you never have the opportunity to improve the business, either. Often, management wants the rank and file to improve processes ("because they know them best") but it's hard to find the time for that when you're already working full time (and more) on your deliverables.


No doubt. In fact, this is exactly what happened to me today. I'm supposed to have great ideas, but when I try to implement them, I'm told that I need to focus on paying the bills. I don't know how people manage this long-term.


By jumping ship and landing at a less-dysfunctional workplace.

If their business plan doesn't include time for process improvements, it's a failure. Leave now. Don't try to make up for their mistakes. They won't see it (positively), let alone reward you for it.

Do good for your old co-workers by recruiting the good ones into the more-functional company.


This. Don't ask me to make process improvements when you are expecting me to be billable 115% of the time.

Fortunately, my group typically allows for 10%-20% of my time to be used for either training (learning new tech/languages) or process improvements (mostly internal tools). Other groups at the company don't have it so lucky, and those are the groups that desperately need improvements.


Depends on what process you are talking about. If improving a process will hurt the delivery time of a major project, costing the company money, then is it really right to call the company a failure?

Every company/division/organization has priorities, and while there may be inefficiencies in their processes, they may be a minor cost in relation to the other priorities. (I can't believe I actually just played devil's advocate/defended this.)


No, I stand by it. If a company doesn't include time for process improvement they have failed. Maybe they haven't tanked yet but the first bend in the road will finish them.

This doesn't mean they have to stop and fix every non-problem, but if there's no time budgeted to fix what comes up, the first real problem will ruin everything.


I think you'll still learn much from osmosis, anyway.


I wrote this years ago to get it out of my system, and after Infogami went to the big web host in the sky I ported it and my other nonsense to Google Sites. The slightly updated version of the same article is now here:

http://www.yacoset.com/Home/signs-that-you-re-a-bad-programm...


Just FYI, I love your articles. I routinely forget who wrote them and where to find them again, then fail to find the link in my bookmarks, so I spend a half hour Googling phrases I vaguely remember being included in the essays ("how to deliver without everything getting fucked up", nope. "how to write code without shit exploding", nope. "how to..."). I think they reside in some kind of cognitive null zone in my brain. So, thanks for linking them again!


The problems described by the article are certainly real problems I've seen, and they seem like indications of someone who indeed hasn't taken the time to deeply understand the programming process. But the term "bad programmer" and the general attitude of the article seem deeply mean-spirited.

I would rather work with an actual "bad programmer" than with the kind of person who'd spend their time thinking up "alternative careers" for these people.


In my experience just about everybody fancies themselves a great programmer and considers anybody who doesn't know (insert semi-random list of skills) to be a shitty programmer.

I kinda think it's like the Roth test for obscenity. I can't really explain what makes a shitty programmer, but "I know it when I see it."


I don't know about that.

Most people I've worked with over the years have understood their limitations and would never consider themselves great, with the possible exception of half a dozen or so folks, who truly are great.

There have, of course, been exceptions, too.


My opinion is probably tainted by my own working experiences. I've run into various mediocre IT pros who have been at it for a while and possibly are trying to appear smart for purposes of job security.

I've only had the pleasure of working with a handful of what I considered truly great programmers. I don't think a single one of them considered themselves to be "great" even though I thought they were.


I get what you're saying and the programmer community can be a mean and snarky place, but I'm not sure there is too much ill-will in the article.

Thinking that you are better than another person just because they haven't focused on your own domain of knowledge is arrogant and delusional, and in my experience many of the truly great are humble in the extreme (think Dunning–Kruger effect).

That said, a bad programmer can be a big problem if you are a victim of their work. The same could be said for a bad taxi-driver or chef.

I think the article is more in the spirit of making fun of bad practices than character assassination, and that's something I can support :)


I spent about an afternoon on it.


"You're not reading this article."

But seriously, it probably never occurs to bad programmers that they should be finding articles like this.

Perhaps a better title/approach would be "A Bad Programmer Taxonomy, And How You Can Help."


Sounds quite arrogant to me. And it misses the biggest problem: not finishing stuff.


I must admit I'm bad at this. However, in my defense it comes down to: bad management.

I'm forever being told to work on the next MOST IMPORTANT THING EVER only to be told after a few days that I need to work ON THE NEXT MOST IMPORTANT THING and that the MOST IMPORTANT THING EVER that I was working on is no longer important at all.

le sigh


"Hey, FuzzyDulop, we're a day before deadline but I've been thinking, we could do SUPER BRILLIANT IDEA THAT CHANGES EVERYTHING and it'd be amazing! What do you think?"

...

"You can forget about your deadline if you want that in, boss."

* skip forward a week *

"Hey, FuzzyDunlop, we're a day before deadline but..."


Dealt with this from an asshat running a semester-long school project. Solution: Drafting my own spec doc, sharing w/ rest of team, asking for comments, saying it would lock in two weeks.

Followed that doc for the next month or two.

Of course, the asshat was ticked off when the semester was about to close and tried to edit the doc (for the first time). I just said no.

I consider it a success even though Those Kids who contributed nothing got As and I got a B+ despite working quite a bit.


And then getting chewed out for missing the original deadline. Usually from the boss who just dropped the SUPER BRILLIANT IDEA THAT CHANGES EVERYTHING in your lap.


We get the sort of unsure, passive aggressive criticism, full of ifs and maybes. The sort that actually makes you feel worse because the boss appears to be personally disappointed.

Get chewed out proper and at least you know it's just because jobs have to be done.


Ha! Old memories. We usually skipped forward 3 months and changed everything again.


> "Bulldozer code" that gives the appearance of refactoring by breaking out chunks into subroutines, but that are impossible to reuse in another context (very high cohesion)

I feel like this might give people the wrong idea. Surely some amount of cohesion is desirable. Also, I'm not quite sure how one "gives the appearance of refactoring" without actually refactoring. Whether it's useful or not may enter into it, but I would usually consider the breaking out of chunks into subroutines refactoring.


I once dealt with code that looked like this pattern (sorry for the giant vertical size of this post):

  DO_F() {
    A;
    B;
    C;
    D;
    B;
    C;
    D;
    E;
    F;
    A;
    B;
    C;
    D;
    G;
    H;
    I;
    J;
    A;
    B;
    C;
    H;
    I;
    // a bunch more
  }
So I suggested, "hey perhaps you should break this into functions" and got back:

  DO_F() {
    A;
    B;
    C;
    D;
    B;
    C;
    D;
    E;
    F;
    A;
    state.a = a;
    state.b = b;
    // ... for other state
    return DO_F2(state)
  }
  DO_F2(state) {
    B;
    C;
    D;
    G;
    H;
    I;
    state.a = a;
    //...
    return DO_F3(state);
  }
  DO_F3(state) {
    J;
    A;
    B;
    C;
    G;
    H;
    I;
    // a bunch more
  }
When I suggested blocks such as B; C; D; or G; H; I; could be functions, and perhaps a loop could be useful too, the guy looked at me like I was trying to do voodoo and got really defensive about how that couldn't possibly work.

I think that is what the OP meant by bulldozer code...
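
For what it's worth, the suggestion amounted to something like this, in the same placeholder notation as above (a sketch of the direction, not the actual code):

  DO_BCD() { B; C; D; }
  DO_GHI() { G; H; I; }

  DO_F() {
    A;
    DO_BCD();
    DO_BCD();  // or a loop, if the repeat count can vary
    E;
    F;
    A;
    DO_BCD();
    DO_GHI();
    J;
    A;
    B;
    C;
    H;
    I;
    // a bunch more
  }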


Not only that, the purpose of refactoring isn't simply reuse, but also readability. If you break out chunks that aren't reusable, but enhance the readability of the code, then you've likely improved the code.

Given the choice between reusability and readability I'd say that readability should usually win (although it is a false choice).


I would contend that if you're having code readability issues, it'd be better to fix it by sprucing up your whitespace and comments, instead of saying "Well, let's chunk it into more functions" and have method(), which returns method1(), which returns method2(), which returns method3()...

Chunking for the sake of itself doesn't seem like a viable strategy to me.


When you add a new-line or two into a section of code, you're usually saying:

"These two blocks of code do something different"

That's what is normally called a method. If it's only for one class then by all means make it private and name it well, so the caller remains readable.


I agree that that strategy's pretty feeble, but IMO it's still better than e.g. a thousand-line function. If nothing else it forces you to break out the parameters explicitly, instead of just having a thousand lines of intermingled state transitions.


Chunking out in more functions is good in itself, because there's usually a well defined interface for what can get in and out of a function.

Whitespace and comments can suggest that some pieces of code don't interact that much, but chunking into functions lets the compiler prove it.


Fully agree, and I want to add that it is also a requirement for writing good, simple, small unit tests. Clean code means a function should only do one thing, and that's the thing you can test.


I find this is one of the things I struggle with the most. I definitely "refactor" code by moving it to a subroutine sometimes purely for readability's sake. I also try not to stress out about how reusable something might be until I actually NEED to reuse it. Otherwise it often seems like "premature optimization" trying to generalize a routine that I'm not even sure will ever get used anywhere else.


In my experience, the drive to make things reusable from the start ends up delaying projects to a point where they might just not get done.

I also refactor for the sake of readability (and urge my co-workers to do the same). Of course, this might mean I'm a bad programmer and I shouldn't ever get 6 levels deep in logic in the first place...


Interestingly, his enumerated set in the "Bad Programmer" list is almost exactly what I was grilling for when I asked people to implement stupid algorithms on a whiteboard and then step through them, debug them, etc.

I somewhat disagree with:

> 6. Cannot fathom the Church-Turing Thesis

Truly understanding the Church-Turing thesis probably requires more computability theory than the two-week overview even top CS students and otherwise great programmers get in their ABET-required Discrete Math course. It would be great for more people to have an understanding of what a computable function is and isn't, because it would make it easier to explain why we don't have more fancy-pants type systems around (inference is undecidable for most interesting ones), but it certainly isn't a requirement for great or even good programmers.


> Truly understanding the Church-Turing thesis probably requires more computability theory than the two-week overview even top CS students and otherwise great programmers get in their ABET-required Discrete Math course.

Mathematicians may have more luck in their training. But that depends on your chosen speciality.


A lot of these make sense, but I would caution against being discouraged if you show any of these symptoms. Like another front page article that says IQ is not static, I think this also definitely applies to programming ability.

I hope this isn't used by some to push the mantra that 'not everybody can code'.


Yeah, I think the title should have been "Signs that you are a mediocre programmer", which would have been less discouraging. I'm sure most good programmers have started out making many of these mistakes. At least it has a "Remedies" section, so it's not a total downer.


I completely agree. Several of the mistakes listed, particularly thinking in sets, were things I had issues with until that "aha" moment when the paradigm shift in my thinking about the problem occurred.

On the other hand, I doubt the author was trying to discourage bad programmers, but rather bring common deficiencies or mistakes up to help programmers avoid them. To me, many of these are signs that you are a new or inexperienced programmer, not a bad one (although if you've been a programmer for a decade and are still making them, you may have something to worry about).


Especially symptom #2 - "Executing idempotent functions multiple times (eg: calling the save() function multiple times "just to be sure")"

That could just be OCD rather than bad programming.


I would say that's OCD causing them to program badly. That there's some underlying condition behind the erroneous pattern does not excuse its presence in the code base.


My first thought was, "wow, infogami is still up?" I guess I left it in a good state. :)


Looking at the homepage, it says that the most popular pages were converted into static form, and now they're trying to sell the domain.


Yeah, we did that a long time ago. I'm just surprised it is still running! I don't think I even told the other reddit guys about infogami or where it runs or anything.


I guess I can't log-in and add a link to its new home, eh?


Nope, it's static only. But if you ask the reddit guys real nice, they might be willing to edit the file manually for you.


To me, everything in that article can be addressed with a little experience.

My one indicator that someone is going to be a bad programmer actually has little to do with technical skill sets, but rather personality. As my sanity check, I make sure to examine the open-mindedness (willingness to research and use unfamiliar tools) of programmers. More often than not, you find programmers who are very comfortable in given languages, environments, tools etc, but once you take away their comfort blanket, they keep reaching back for it.

An extreme example would be a scenario such that a programmer who knows C very well decides to do a decent amount of parsing in C instead of researching better languages for the job such as python or perl.


"You seriously consider malice to be a reason why the compiler rejects your program"

Of course it's malice.


Someone has never dealt with incompetently written scripting languages or tragically wrong documentation for the same. Sufficiently advanced incompetence is indistinguishable from malice.

I remember having to test exactly how a bunch of different predicates actually worked, when the documentation thereof was a complete lie and the predicates just evaluated to false instead of creating an error when fed data they supposedly accepted (but actually couldn't).


The compiler in many simple cases could have tried to repair the program, like attempting to insert (or remove) a missing (or extra) ';' or '}', for example. While it may not be malice, it is definitely an absence of good will.

Note: I did work on a real product which had a parser with this style of repair implemented, and the approach is also briefly mentioned in the dragon book.


It's possible that such automatic repair would create a bug. Much safer to have a human check and ensure that the ; goes in the right place.

A better point is that the error messages from many compilers can be very obscure. This was particularly true 10-15 years ago.


Yeah, if a compiler only compiles the code you actually wrote, it's still your fault. If a compiler tries to infer what you meant and compiles that instead, then it might genuinely be the compiler's fault when something breaks. Plus, the original code should still be fixed instead of relying on the compiler to compile the same broken code the same "right" way each time. We spent the last 20 years trying to convince web developers of that idea.

Clang does something like this. If it notices a simple enough syntax error that it can correct, it still generates the error message, but it makes the correction internally and continues the compilation far enough to generate useful error messages about the rest of the code. But it doesn't emit output from the repaired code or change your source.


I don't need an article to tell me I'm a bad programmer.


The OP may be being ironic/sarcastic to a certain degree but I'm always wary of these types of articles.

"You are a bad programmer". How would it make you feel to hear that? Probably not open to further suggestions on how to improve.

While I agree that as programmers we should always strive to be getting better, these types of put-downs and belittling don't foster a culture of communication and trust.

Personally, I'm most receptive to hearing that I could improve when it's coming from someone I trust and respect. If a random person or poster on the Internet sent me this article and rubbed it in my face, I'd probably write them off as a major a-hole. And, I might miss out on a great opportunity to learn.

It reminds me of this post: http://news.ycombinator.com/item?id=2322696 - I read it and at the time thought it was great. But, sadly, not much has changed and this post is sad evidence of that fact.


I saw at least a half-dozen habits that have cost me I-don't-want-to-think-how-much time. And at least a half dozen I'm still doing. I was dimly aware of them as repeated frustrations, but identifying them makes it a lot easier to target them for deletion.

I'm going to mark this thing up, add "haven't crossed any of these off in the past year", and postpone my judgment until next fall.


Why is "Performing atomic operations on the elements of a collection within a for or foreach loop" a symptom of mediocre programmer "inability to think in sets"?

How should one do that if you're "thinking in sets"?


That particular symptom is built under the assumption that your language supports map/filter/reduce or some equivalent functionality. These are generally preferred when available because there are fewer places to make mistakes.
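
Even in a language without built-in map/filter/reduce, the same idea applies: write the loop once, in a generic helper, and pass the per-element operation in. A hypothetical C sketch (the names are made up):

  #include <stddef.h>
  #include <stdio.h>

  /* one generic "apply f to every element" routine, instead of an
   * ad-hoc loop at every call site                                 */
  static void map_ints(int *xs, size_t n, int (*f)(int)) {
      for (size_t i = 0; i < n; i++)
          xs[i] = f(xs[i]);
  }

  static int dbl(int x) { return 2 * x; }

  int main(void) {
      int xs[] = { 1, 2, 3 };
      map_ints(xs, 3, dbl);                       /* "do this to the whole set" */
      printf("%d %d %d\n", xs[0], xs[1], xs[2]);  /* prints 2 4 6 */
      return 0;
  }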


my ruleset is much more liberal

signs that you are a good coder: you enjoy coding, you enjoy a good piece of code

signs that you are a bad coder: you don't enjoy coding, you can not appreciate beautiful code

if you enjoy coding it just doesn't matter if anyone else thinks that your code sucks. you will get better. if you hate your job (programming) then your code stinks and will get worse.

in my career i went from good to great to bad to "i don't code anymore" to good


How long have you worked as a professional software developer? Have you ever worked at a large company? There are plenty of people with a decade of experience who enjoy writing code who are just bad at it. Some of these people work on software that is much more important than Facebook.


I suspect that the #1 sign is that you go trolling for self-assurance in articles like this, or worse, by writing smug, random articles like this.

The best possible signs either way will be in your career so far.


A sign that I found missing from the list: you are totally devoted to a particular programming language and believe it's a silver bullet. All good programmers that I know understand that languages are just tools and all have tradeoffs of some sort.


I'm pretty sure I saw the "everything is a nail" type listed, which might well cover the folks that think Technology A is the be-all and end-all.


Good point. Being overly attached to a language means you end up choosing the wrong tool for the job at some point.


I'm a student in the third year of my CS degree and I constantly worry about not being a good programmer. There are things on that list I need to work on.


I'm presently repairing a project that violates nearly every one of these principles. Literally, it's like they had a checklist made from this post and went down it to ensure they did everything wrong.


I think this stuff comes from someone who has a colleague with these "symptoms", and is either frustrated because they can't educate them into better techniques, or is a subordinate with no power.

In any case it's pretty unforgiving.

I have had a colleague who fits the bill here, but actually I just let them get on with their work and I got on with mine. Sure, it was hard when he delivered something that was excruciatingly poorly conceived and executed and I had the task of taking it to the next stage. But you know, this isn't a life-and-death business, he was a really nice guy, and I wasn't going to rock the boat. The next project, I was working with someone else and everyone was happy.

Contrast that with making an issue of it, and you might find yourself being labelled as a trouble maker, even if you did have the best intentions for the development.

If you run the show however, don't stand for it, because it's your business, and it calls for the best at all times.


Is it odd that any tense of the word "ship" does not appear in this article?


Believe it or not, sometimes we code for purposes other than selling stuff.


But you still have something to ship, even if it's just to your buddies on github.


What you give to someone else may not be code -- it might just be the code's output.


That is still called shipping.


It's probably written by a CS professor.


I guess "ends justify the means" would be another article.


Hilarious, but some of the alternative careers I don't agree with. Particularly the first example of order of execution.

If you can't grasp that and decide to be an architect, I wouldn't want to drive on your bridge, work in your building, or live in your home. Same for the other options!


You may be able to grasp the instantaneous distribution of forces in a building or charges in a circuit without needing to know the order that they deploy. While a computer itself is an example of a circuit where the order of execution matters, a radio is not.

Order-of-execution is not perceived the same way as cause-and-effect, even though they're both the same thing. It's a damn weird thing that I've seen people do. A little bit like how artists can comprehend how the position of an arm has an effect on the way bones, fat and muscle are distributed over the rest of the body, but can't understand why they got smacked with an overdraft fee when their average daily balance would have covered it easily.


How do I know that list is a little off? Using it as a metric, I am mediocre-to-good, when I know for a fact I haven't yet reached mediocre.

For example, I grok pointers, but I use them so infrequently I make foolish errors and have to stop and block everything out.


In my experience you have nothing to fear: such thinking and self understanding makes you a better programmer than many I have worked with. The worst usually think they are way better than they are, and the better ones usually know where they can improve.

I have worked with people who do some or most of these, and let me tell you, the experience is not pleasant -- frequently turning into "Oh, you changed that code, I can't maintain it anymore, objects are too tricky" (when refactoring bulldozer functions and dicts of lists of dicts of tuples (some of which contained dicts) into a few classes in python) or "My code was working, why would you change that? I can't be responsible for looking at it anymore!" (when the spec changed and we needed to check for extra error conditions).

Basically, what I'm suggesting here is that being a bad programmer is less a function of skill at any particular moment than of attitude and approach. Skills that don't change, continued lack of understanding, and laziness are all symptoms of a bad programmer, who won't improve. I'd rather work with a neophyte who wants to improve over a veteran who is static any day.


" I'd rather work with a neophyte who wants to improve over a veteran who is static any day."

So true.


> eg: the "Delete Database" button is next to "Save", just as big, has no confirmation step and no undo

That was common practice all over the Web for over a decade, it only stopped recently (on most sites). [Submit] [Reset] Argh!


If you are reading this article, you are not a bad programmer; you are a programmer learning to master your craft.


I'm starting to think that the majority of errors listed here are simply phases of the learning process we all go through.


I'll admit, when I was a C programmer, I was victim to "Dysfunctional sense of causality."

I remember taking a C class in my undergrad, nailing the code to perfection, and getting an F because I had written the code under a different architecture and compiled it with a different compiler (not specified in the prompt by any means). It ran well on my computer (and I proved it), but since it wouldn't run on my professor's computer or TA's computer, I still received an F.

So yes, I still maintain strong malice towards C :-)


Whooops... One of the symptoms is:

"(Functional) Manually caching the results of a deterministic function on platforms that do it automatically (such as SQL and Haskell)"

Isn't that what memcached is doing?


This article is ignorant bullshit. Speaking as someone who's been a professional programmer for a long time, and for a series of good companies. What's present may be technically correct (may), but most of the important stuff is absent. And as for alternative careers for people who are too dumb to program. Oh please.


I think one of the biggest errors a bad programmer makes is seeing 'bad' code everywhere. Sure, some code stinks but just because some code doesn't do something the way you would have done it doesn't make it bad.

If you do have to complain about some piece of code then do so in concrete terms:

- Presence of bugs

- Lack of error handling

- Performance issues (real ones, that cause real problems)

- Security issues

- Poor robustness

- Redundancy


I'd add:

- bad factoring

- overly complex or overly simplistic object model

- not extensible (although this itself is not strictly bad, as spec changes are pretty hard to predict; but sometimes it requires more effort to make something not extensible than to make it extensible, and those cases are a problem unless there is a good reason for it)

- abstractions that introduce more effort (factory factories...)


Your username doesn't aid in your argument against criticism of bad code. Just sayin.


Actually, that was a username I've used for 13 years. Currently I'm mainly developing in TCL professionally and Python personally.


It's in jest. There's something funny about the username "VBProgrammer" in a discussion about identifying bad developers, from the stereotypical POV.

Disclaimer: I started as a VB guy myself. Now I fall in the C# and Python camps.


> (OOP) Writing lots of "xxxxxManager" classes that contain all of the methods for manipulating the fields of objects that have little or no methods of their own

How about writing lots of xxxxController classes that manipulate objects that have few methods of their own?


This article is mean-spirited, too serious, and not well written.

Bad programmers I have known generally:

- Are oblivious and clueless. They just lack cleverness.

- They lack problem solving abilities and willingness to solve problems.

- They are usually slow witted.

- They are very conventional-minded - external approval means a lot to them.

- They are willing and ready to work too hard at programming for a result.

- They lack abstract reasoning skills.

And most bad programmers don't like Star Trek or science fiction, but I can't figure out why other than simple correlation with the mindset of a programmer.

The first set of six "real" causes pretty much encapsulates 99% of the mediocrity I've witnessed in industry in this field. Bad programmers usually have two or more of these problems. Even one of them is a serious problem if you are looking for a great developer.

All other badness, such as "meal ticket" thinking, usually radiates outward from these things.

I say this because if you lack the quickness and the desire to seek simple and elegant solutions, the work becomes like drudgery and your work reflects the other deficiencies that the article described.

A specific example: if you care too much what non programmers think (including your managers) and if you are a bit of a toady, you will not take the time to learn what is going on in-depth, and you will look for quick solutions. So it's important to strategically ignore your bosses from time to time and if you are trying to be politically popular, software development isn't a good place for this.

Another example: if you "want" to work very, very hard, you will not seek ways to simplify the code. You will instead cope with growing complexity and spaghetti.

I question the article writer's programmer instincts because he did not figure out a much more efficient and concise way of expressing the same things. Above all else, a real programmer figures out how to say it just once - not repeatedly cast in different ways.


Where does the "gets shit done" programmer fit into all of this, though? I'll admit that some of my code is confusing and not always "clean", but it does what it's supposed to do, and I'm able to get more "shit done" when I focus on the end-goal vision rather than the cleanliness of my code... which in a prototyping phase of product development is really rather meaningless anyway, since you don't even know yet if there's a market for your product/service offering.


> "Yo-Yo code" that converts a value into a different representation, then converts it back to where it started (eg: converting a decimal into a string and then back into a decimal, or padding a string and then trimming it)

The other day I was at a hack night and some guys were working on something written in Node. They were having all kinds of problems with an object, so they serialized it to JSON and loaded it back, and somehow it magically started working.


It's easier to fix the technology than to fix the users. I don't think in my wildest dreams I could walk away from a program I created and say the user of said program is bad. I would find a way of fixing it. Programs should serve users, not the other way around. Likewise, Programming Languages should serve Programmers. Clearly, we aren't properly being serviced if programming makes us constantly look bad.


Signs you're a bad writer:

- Your blog post only consisting of lists

- Using only -ing verb forms in your lists

- Not having an introduction in your article

- Not having a conclusion either

- Having only short 2-line paragraphs

- Lack of any style


The best part is at the very bottom... if you think it's TL;DR, you could have missed a lot of points.


I think the bad programmer won't see this post as they don't check Hacker News on a daily basis... :P


I wonder who wrote this article originally? Google finds about 9 identical copies:

https://encrypted.google.com/search?q=%22deficiency+a+progra...


Hacker News user clawrencewenham

http://news.ycombinator.com/item?id=3132533


Signs that you are a mediocre programmer

Adding columns to tables for tangential data (eg: putting a "# cars owned" column on your address-book table)

I thought this was actually good for performance in some cases? Why is this considered mediocre?


Another sign: not reading this article because you couldn't possibly be a bad programmer and therefore this article can't have anything to offer.


There are no bad programmers. Just bad programs.


There is one surefire sign you're a bad programmer: faced with a problem, your reaction is "pshaw, I see how I can solve this..."

Whereas a good programmer has the reaction, "uh-oh. I see how I can solve this..." And knows that this is just half the battle, and what he's getting himself into.


Oy, I used to do this all the time... till I started sitting down to solve some of those problems. That was about the time I started saying "uh-oh". I (think) I have moved on from this phase, fortunately.


Well, that's just experience. After a while, you lose your confidence that you are the new hawtness, and you start seeing all the ways things can go wrong... and ways you can avoid them.


oh my! I love when you recommend specific careers based on specific issues. You made my day! I can't stop laughing. "Bag Lady", seriously, looool


I agree with joe_the_user. I found the article snide and condescending. I suggest alternative careers for the author:

- Leader of a high school clique

- Movie reviewer who concentrates on who's "in"

- Coiner of demeaning nicknames

I wouldn't want to work with this person.


I agree too. I found it arrogant and very condescending.

But his alternative career section is humorous.



