What does it take to be a good programmer? (dimitrov2k.wordpress.com)
196 points by ingve on Jan 23, 2017 | 111 comments

My personal advice would be to follow the Ira Glass quote:

> Nobody tells this to people who are beginners, I wish someone told me. All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you. A lot of people never get past this phase, they quit. Most people I know who do interesting, creative work went through years of this. We know our work doesn’t have this special thing that we want it to have. We all go through this. And if you are just starting out or you are still in this phase, you gotta know it’s normal and the most important thing you can do is do a lot of work. Put yourself on a deadline so that every week you will finish one story. It is only by going through a volume of work that you will close that gap, and your work will be as good as your ambitions. And I took longer to figure out how to do this than anyone I’ve ever met. It’s gonna take awhile. It’s normal to take awhile. You’ve just gotta fight your way through.

That is, code a lot of stuff. And, while doing so, pay attention to how you feel. Sometimes it'll be exhilarating, or you'll be proud of what you've done. Other times, you'll be confused, or afraid, or pissed off about how something you wrote earlier is now unmaintainable or hard to make work. This is how you actually learn the insights behind all the dogmas, rules of thumb, heuristics, good practices, etc. That's how you develop taste - that intuition which will help you write good code.

Rarely do devs ever remember what it's like to be at the bottom. I've found this elitist attitude everywhere: devs will suggest completely unrealistic solutions to problems, pitched at such a higher level of development that they miss the objective completely. I once saw a Python+JS dev tell a newcomer to JS to just write all their code in Python and transpile down to JS. They completely missed the point that in order to properly do so, you first have to know what your JS is going to do and how it works.

It's even more upsetting when you see all these stories of people who say it only took them 'a year' to start making a good living. Typically they show the most arrogance, asserting that they have what they have on merit, when really the company that hired them just couldn't find anyone else with some specific knowledge of what they were doing, or the job description was wrong (as they always are) and attracted the wrong people. I don't consider these individuals to be developers of any usefulness outside their scope of work, because they simply haven't spent time outside of it. They are not problem solvers; they just look up guides online and make the same stuff everyone else makes.

Being a good developer takes time, it takes a lot of failure, and it takes a continued effort to learn.

This nails it in my opinion. Too often I see people assume that programming/coding is somehow science-based. It's not really; it's an art, and you need to be a creative person to be good at it. The science comes later (sometimes never, for some).

This is really interesting to me, how different people view programming at a conceptual level.

My current view (it changes every couple of years) is that as a skillset, programming is primarily based on planning and management, most akin to logistical management and orchestration. (we're just lucky our workers are CPUs and resources are memory/bandwidth vs workers you have to individually train, and physical resources that are permanently consumed)

The focus of a programmer's work, and the skills required include:

    - domain knowledge and understanding of the work being done  
    - developing use cases and modelling a solution, often built from well defined and understood 'components' and relationships between them (eg. factory/production, shipping/freight, warehouse/storage, storefront)
    - defining processes and procedures to deal with the various states throughout the project's lifecycle  
    - logic and control flow through these processes including synchronization and validation

The more I've looked at examples in areas outside CompSci, the more parallels I've drawn to general project planning and management skills, vs those more artistic or scientific.

In case it's unclear, I _am_ talking about writing code, and not software development in general.

I often think of my data model as a physical system, and its behavior in terms of real world systems. As a simple example, I visualize the function processing a FIFO queue as someone working a specific station in an assembly line factory: their job is to take incoming pieces of a known, expected type and do _something_.

I find this helps me "zoom in and out" mentally, grouping things in context with some understanding of what's going on internally, while still treating it as a "black box". (eg. you can still picture what's going on inside a "factory" when you're looking at it as a whole, with several inputs and the produced output)
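That assembly-line picture can be made concrete. Below is a minimal Python sketch of one "station": a worker function that pulls pieces of an expected type off a FIFO queue, does its one job, and passes results downstream. The function name, the doubling "job", and the `None` sentinel are all my illustrative choices, not anything from the comment.

```python
from queue import Queue

def station_worker(inbox: Queue, outbox: Queue) -> None:
    """One 'station' on the line: take incoming pieces of a known,
    expected type, do something, and pass the result on."""
    while True:
        piece = inbox.get()      # blocks until a piece arrives
        if piece is None:        # sentinel: end of the shift
            outbox.put(None)
            return
        outbox.put(piece * 2)    # the station's one job (here: doubling)

# Wire two queues to the station and run a small batch through it.
inbox, outbox = Queue(), Queue()
for piece in [1, 2, 3]:
    inbox.put(piece)
inbox.put(None)

station_worker(inbox, outbox)

results = []
while (item := outbox.get()) is not None:
    results.append(item)
print(results)  # [2, 4, 6]
```

The "black box" property falls out naturally: callers only see the two queues, never the station's internals, so stations can be rearranged or swapped like machines on a factory floor.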

> programming is primarily based on planning and management, most akin to logistical management and orchestration.

That's how I feel as well. Maybe it's a by-product of the software we work on? I work on IT management software and distributed control systems.

When working with control systems, the "science" part typically is in the analysis and testing phases (since we have to come up with models of physical systems). For coding itself, however, it is closer to what you describe.

> I often think of my data model as a physical system, and its behavior in terms of real world systems.

I do this too, especially when programming in Erlang. Software systems built as a large number of concurrent, message-passing actors/processes readily lend themselves to this "physical systems" analogy.

I graduated with a music degree and have been a professional programmer for twenty years... I have honestly never really seen programming as an art. Not in that sense. It's got some elements in common with different kinds of music, like how you choose to make it balanced and cohesive I guess, but to me it's kind of a pale shade of musical artistry.

If you're performing, it requires you to tap into some sort of "being in the moment" type of flow that is almost the opposite of deep-programming flow. You have to be aware of the present moment while also interacting with the present moment. Getting "into the zone" while programming, if applied to performing music, is a great way to lose listeners. And if you're performing with others, you'll screw up the ensemble.

If creating/writing, music requires a kind of delving-for-inspiration I almost never have to do while programming. In early 2016 I had to solve a programming problem that didn't fit any previous patterns and wasn't googleable, and I solved it in a kind of lateral-thinking flash of inspiration, so that's somewhat close. But it's still rare compared to normal programming practice... and even then, in programming those flashes are less personal.

Programming is enjoyable in the sense that it's like both solving puzzles/problems and sometimes tending a zen rock garden. You think, you solve, you clean. And to a degree, you express. I don't really think that clears the bar of artistic, but it is manifestation and it is satisfying and it is sometimes personal so that's good too.

There are multiple definitions of "art". And debates on how to define and classify "art" have been around forever. Where art thou, bro?..

The post you replied to seems to be using "art" in the very common sense the phrase "more art than science" uses: it's an art in that it requires some type of successful human judgment and subjectivity applied to a skill. If a fancy calculator can't give an official, correct solution (or no such calculator exists, or you don't have it), somehow that came to be called an art. Technically, "art" applied to a skill or craft may be the oldest definition of the word. Obviously, these things are gray areas that involve both art and science. See Donald Trump's book "The Art of the Deal" for more info.

Just a question if you see this: Why and how would "delving for inspiration" be an art or artistry? Creativity and inspiration don't seem to be the same thing. Where does the control and art come in when delving for inspiration?

Likewise, in music you have the notes on the page (a computer can even play them). You also have accents and dynamics etc. on the page that require a higher level of subjectivity but instructions/descriptions can be written. Is there another level above that? Is there a part to performing a piece of music that can't be written? Is there a name for the unwritable part?

Sure, I guess I meant to restrict my answer mostly to art-as-music.

Your first question: I think delving is more about artistry when it's personal, perhaps when one's own answer would be different than anyone else's. It's easy to be pedantic about this point, but it being personal is on the right track for me.

As for your second, I think part of the artistry of music is intrinsically related to the human element. In other words, let's say there were two recordings of performances, or even an original composition. One was composed/performed by a person who felt it as they went, and another was composed/performed by a computer program. Even though it's the same sequence of pitches and waveforms, and impossible to tell the difference from listening alone, I still see the human one as different from the computer one. And if someone says, ah, surprise, we told you it was human but it's actually computer-generated, I don't think that makes any kind of deep point; it's more just a dirty trick.

Anyway, the difference between those two recordings, that's the other level above that. I can better respect the human artistry that heard those sounds internally before manifesting them, than the computer program that generated them with AI and statistics.

I always thought programming is a kind of literature, because you write in a language for machines and humans at the same time, i.e. a mix of logic and inspiration.

It is absolutely a creative endeavor. The science and math are just the paintbrushes and canvas.

I moved from coding to teaching. It's amazing how they use math scores to predict how well someone will do at computing.

I was rubbish at maths, but fine at coding.

It's more art than anything.

More of a craft than an art. Sometimes it's more like woodworking, sometimes it's more like plumbing, but it's rarely anything like painting.

I have a friend who is an artist that spent the last 10 years working on his painting skills. I observed one interesting parallel between developing skills as a painter and as a programmer based on conversations with him.

Beginning painters tend to get overly focused on very small details without having a good understanding of how they are bringing the painting as a whole together. They spend too much time on these details, whereas advanced painters are able to move very quickly with faster strokes. It's pretty amazing to see a painting where the strokes don't make sense when you look at them one-by-one, but the overall effect truly reflects something from reality.

I found this to be true also with inexperienced developers. We focus on the latest trends, micro-optimizations, coding styles, and "small" details that are easier to master -- yet we struggle with the bigger picture things like delivering a product quickly.

Kind of like the saying "you can't see the forest for the trees". (You focus on the incidental details, but not the larger purpose)

On a slightly different note, I have several friends who are writers, and I have found programming to be very similar to writing a novel. You have to construct a coherent world that "compiles", e.g. the characters' interactions are consistent and make sense; each character has a role in the story like a component in a program would; simplicity and directness tend to be more effective than ornateness; and small changes in one part of a dialogue can have cascading effects across the rest of the storyline, much like a refactoring gone wrong.

Some years ago now, I came upon a lovely essay entitled "Hackers and Painters†". Here is how it begins:

    When I finished grad school in computer
    science I went to art school to study
    painting. A lot of people seemed
    surprised that someone interested in
    computers would also be interested in
    painting. They seemed to think that
    hacking and painting were very different
    kinds of work-- that hacking was cold,
    precise, and methodical, and that
    painting was the frenzied expression of
    some primal urge.

    Both of these images are wrong. Hacking
    and painting have a lot in common. In
    fact, of all the different types of
    people I've known, hackers and painters
    are among the most alike.

    What hackers and painters have in common
    is that they're both makers. Along with
    composers, architects, and writers, what
    hackers and painters are trying to do is
    make good things. They're not doing
    research per se, though if in the course
    of trying to make good things they
    discover some new technique, so much the
    better.

I think he got it right.


Paul has an entire book of his essays by the same title; it's a great read for all kinds of programmers. :)

I'd have to disagree; I view programming as an art similar both to music composition and very much to painting. This may require expanding your definition of painting from the classical sense to the Dada sense of found-object sculpture as a form of painting/making-your-own-paint (can't find the citation, but there is a good Marcel Duchamp paper about making your own paint that I was trying to find -- but I'm late for work!).

The source is "Ira Glass on Storytelling".


Will you permit a couple more quotes apropos taste?

"Good judgement is the result of experience and experience is the result of bad judgement." - Mark Twain


"To write a good program takes intelligence, taste, and patience. You are not going to get it right the first time; experiment!" - Bjarne Stroustrup

Finishing things.

So many coders don't do that. It's really hard to just say "Done." and click 'release'.

Then wait for the criticism.

That's a great quote. Thanks for that.

A great quote, and a great interpretation at the bottom.

> And, while doing so, pay attention to how you feel.

Absolutely. Excellent point.

Big smile on face after reading this.

Well, I've been programming 30+ years.....

I think one of the main things that makes a good programmer is someone who cares about the final product. Ironically, this doesn't necessarily mean you have to be super good at writing code. I have seen people with what seems like basic coding capability very methodically build some really nice software, because they are very focused on the "end", not the "means". You can see the thoughtfulness of what has been designed in. Sometimes the quest to write "good code" interferes with writing good software, because the emphasis becomes getting good at the means to an end rather than the end itself.

Not to say improving your coding capability isn't of value, but it's done so you can create even more fantastic things.

I've been programming for 10 years and it's distressing how inadequate and incompetent I sometimes feel. I like to write it off as impostor syndrome, but who knows, maybe I am just bad and surviving because mediocrity is what the industry wants.

On the other hand, the pleasure that a good piece of code in a beautiful software gives is something I can perhaps relate to an opiate high (as a metaphor, never tried honestly :) )

I find it more like the feeling of having sorted out my wardrobe (folded things and thrown out old clothes, I hate doing this despite feeling better afterwards). Or a mix of that and say building a good looking Lego MOC from a limited set of pieces. But only for some things - I have several self-contained python scripts that give me this feeling. Sometimes fixing a bug in a clean way.

Other things - Android apps for example - tend not to give me this vibe. They feel more nebulous and "unfinished", like it's a miracle they work at all and will break at any moment in ways I couldn't understand.

The work environment can mess with your mindset to care, to the point of being told not to: you just need to achieve X and move on, and you never get to make things in a way you care about. So you may decide personally to focus on the means, trying different languages, learning new frameworks and techniques, but never quite being in an environment where you can build something you care about. Interviewing people, you often hear stories around this: it's a motivation to find a new job, or they end up with side projects / open source which they care about and start wishing they could do full time. They just want a space where they can build things that matter to them.

Some places get it, some don't, some partially get it. One of the classic things I have seen is "managers" that say they want you to care but don't really care themselves.

Interestingly a number of years ago I went through a program called "Better by Design" which the NZ government sponsored for local businesses, see:- http://www.betterbydesign.org.nz/why-design-matters/ and a big part of the message about good design was really all about trying to make people care at every level of the company.

I have also been programming for over 30 years. I agree with you when the problem is small but all the rules change when it is bigger. If you are not up to it then it does not matter how much you care, and I do want people who care working on stuff, but at scale if it is not really good then the chances of failure are much higher.

even at scale, it's the same thing, even more so.

Just remember I'm not saying caring vs coding skills... I'm saying that caring, from when you are a noob through your entire growth as a programmer, will define you as a good programmer as opposed to an average one. An average one will also pick up coding skills and be... ok. Average programmers will be inspired by good programmers and want to emulate them: "wow! that person did some awesome stuff with React! Let's do React! It is all things awesome!" But then you find that while it certainly has some things you appreciate, you end up with average React code, because you adopted the framework and not the mindset. My observation is that good programmers start... and continue with a mindset. "It works" is never quite the goal; it needs this quality of being "good": good code, good design, good UX, which we won't try to define, but which more importantly is driven by caring about the outcome.

I guess that's more my reflection on my observations, what I have observed is having taken on people out of uni ( or crossed into programming from math / science / engineering ), and seen them develop over the years I have this sense of "good programmer" from the beginning, it's not once they reach some level of coding ability. They always do good things with their current capabilities. They then carry on and improve those capabilities and do even more good things.

Usually they methodically build a very nice nightmare, but of course 30+ years is long enough to see a couple of exceptions. I've been at it for 15+, and half of that in 'consulting', which is when you write business code right here, right now, after a 10-minute discussion. These are really two different styles, but I feel what you mean. 'Good code' is a hard thing to define, but for me it is code that fully covers all aspects of the initial set of intents (I intentionally leave purpose and profitability out, because they may not be of value in some cases). If the intents are unclear, we can't really evaluate anything in between. If it was a very dirty hack for a hard task, and we made it intentionally, it is still good code. Thus you can't tell good code from bad without knowing its history and the rationale behind it. Code doesn't fail; projects do.

Usually... I find many people can make nightmares for all kinds of reasons. But for me, usually, people who have a strong sense of what they want, and a strong sense that they want it to be "good", but only basic coding skills, don't end up with nightmares. Possibly slightly strange things, but they tend to adopt coding advice in a considered manner and start growing their coding capability. The biggest problems come when such a person is dealing with frameworks whose concepts are beyond their current coding capability.

I think we see at the other end of the spectrum with the frameworks, languages and software we really like, they are done by people who really want their end result to have this quality of "good" and are very considered. They leverage their coding ability to great effect.

(Lua and its implementation are a good example of this.)

I think when we see your more run of the mill business app where you spec -> implement, often you don't see good programming, often because no one super cares. It's just average programming. However, when someone starts thinking, well, applications like this should be super easy to put together and really starts to care that they are easy to do, then you'll see some good programming :)

I call it honesty. To be a good programmer you have to be honest about what you are doing. Smarts won't help.

Shocking to me that there are so many articles on this. The strategy to becoming a good programmer is the same strategy to become good at anything else.

1. Identify what it means to be good

2. Work endlessly towards that aim

3. Deliberately practice

4. Decide to refine strengths or squash weaknesses. Do this as frequently as possible. Make sure you have an accurate assessment of what your strengths and weaknesses are. (5) and (6) can help with this.

5. Consult the advice of those more experienced. As frequently as possible

6. Allow your work to be publicly scrutinized. Do not work privately, the feedback loop should be as tight as possible to inform (1)

7. Repeat (1) - (6) as frequently as possible

Do that and you will become better. Guaranteed. Note, this won't necessarily make you the best. However, it will make you better. Continue this process until you're as good as you desire. There, of course, are diminishing returns after a while.

>1. Identify what it means to be good

You'll never get a solid definition for a good programmer, and that's the problem. If we knew that, we would have a list of things to learn and teach and we could easily make that a standard interview test before giving people a job. And no one would need to write any more "10 things you need to know as a programmer" articles.

If you ask other programmers, you will get a bunch of answers. If you ask the guy who cuts your checks, a good programmer is someone who can solve their problems and make their business more efficient.

The guy that cuts the checks doesn't know if the programmer is sacrificing long term maintenance for short term problem solving.

This is a good generalized process for getting better at anything, but it's still awfully subjective and squishy once you try to apply it to a given domain. How do you `Identify what it means to be good`? How exactly do you `Deliberately practice`? How do you assess your strengths and weaknesses? How do you find someone more experienced? How do you know that experience has given them good lessons?

Answering all those questions for a given domain is how we have so many articles like this. Everyone has their own take, and it can be useful to look at others' so you can compare it to your own.

Step five is, in my experience, one of the fastest ways to get better. Very solid list of steps.

Works for life, really.

I think there's a personality trait that's really useful in programming, which is to be pessimistic in the short term and optimistic in the long term.

What I mean by that is that you don't expect some thing you just wrote to work the first time, and maybe not the second or third, and so on. But you expect that you'll eventually figure it out and it will work.

A short-term optimist would become discouraged by their expectations not being met over and over, while a long-term pessimist would not be sustained through the hard parts by imagining how happy they'll be once their thing works.

I think programming either repels people who are short term optimists or long term pessimists, or the process of learning to code transforms them into short term pessimists and long term optimists.

I'm reminded of something Spolsky wrote:

>The difference between a tolerable programmer and a great programmer is not how many programming languages they know, and it’s not whether they prefer Python or Java. It’s whether they can communicate their ideas. By persuading other people, they get leverage. By writing clear comments and technical specs, they let other programmers understand their code, which means other programmers can use and work with their code instead of rewriting it. Absent this, their code is worthless. By writing clear technical documentation for end users, they allow people to figure out what their code is supposed to do, which is the only way those users can see the value in their code. There’s a lot of wonderful, useful code buried on sourceforge somewhere that nobody uses because it was created by programmers who don’t write very well (or don’t write at all), and so nobody knows what they’ve done and their brilliant code languishes.

This boils down to: write code for your peers, not for yourself. I always try to remember that, and write code that will be easily understood by others, rather than the cleverest possible way.
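A contrived Python illustration of "for your peers, not the cleverest way" (both function names and the example task are my invention): the two functions below do the same thing, but the second one a reviewer can verify at a glance.

```python
# Clever: dense, leans on a dict-ordering trick a reviewer must stop and recall.
def dedup_clever(xs):
    return list(dict.fromkeys(xs))

# For peers: spells out the intent, easy to verify and to modify later.
def dedup_readable(items):
    """Return the items with duplicates removed, keeping first-seen order."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

assert dedup_clever([3, 1, 3, 2, 1]) == dedup_readable([3, 1, 3, 2, 1]) == [3, 1, 2]
```

(To be fair, `dict.fromkeys` is idiomatic to experienced Python readers; "clever" is relative to your actual audience, which is the point.)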

Personally I think the most important thing is a thirst for knowledge. There are plenty of people I finished school with that were much better engineers than me out of school, but they don't learn in their free time and they've stagnated. If you love to learn and you put your time into it, you will continually improve. With time you will run laps around those who don't.

Exactly. If you don't have passion to do something, it's really hard to get better at it.

Obligatory Randall Munroe quote that hits the nail on the head: "I never trust anyone who's more excited about success than about doing the thing they want to be successful at."

If you're not passionate or enjoying what you're doing, you're probably wasting your time.

Here's my take:

- You need to think a certain way. You need to be able to see how to translate a problem into something a machine can execute. You need to be able to switch from the macro to the micro smoothly.

- You need a certain level of proficiency with the tools you're using. Those can be programming languages/OSes/libraries/other software.

- You need experience. Having seen different approaches to solving various problems. Which ones worked. Which didn't. Having tried a few different approaches, different languages.

- You need domain knowledge. If you're working in a certain field you need to know the state of the art. You need to be able to read papers in the field. You need to be able to apply whatever techniques are required for the domain.

- You need to be a problem solver.

These are sort of basic table stakes for being a good "programmer".

In most real world settings you need to have some more on top of that:

- You'll need to be a good communicator. You will need to interact effectively with many different people, e.g. your boss, a product manager, junior developers. It's a team effort.

- You need to understand users/customers. You need to be able to gather requirements, ask the right questions, and deliver something useful.

- You need to understand the life-cycle of software. How long will it be in use? What sort of maintenance work will it need. How will it evolve?

- You need to have some business understanding. Is what you're doing creating business value?

- You need to appreciate the right trade-offs. Sometimes a crappy solution sooner wins over a perfect solution too late. And sometimes you should wait and have a perfect solution. That's just one example of a trade-off but our job is full of them.

- You need perseverance. Sometimes things take longer than you think. Some bugs take a lot of work to chase down.

Algorithm knowledge too

For the most part I was including this under domain knowledge. You mostly are concerned with algorithms that apply to the domain you're working in.

Rarely, assuming you can read the docs enough to understand what pre-built library to use.

And assuming that A) the docs for the libraries do exist, and B) that, if the docs do exist, they don't look like the output of a million coked-up monkeys, that was then run through an automatic javadoc generator.

It's just staggering how shitty a lot of documentation is. I can forgive a one-person github repo for skimping on it, but I get a little irate when I'm using APIs from Fortune 500s and have to resort to decompiling because their docs are missing, incomplete, incorrect, or otherwise hot garbage.

Switch to Perl or Python if that stuff bothers you. I find the libraries there are generally of decent quality. JavaScript, not so much.

I've been programming professionally for the last 6 or so years, I've been in 4 teams across 3 companies. The reason this question generates so many answers with such great diversity is that the answer really depends on where you work and what you consider success to be.

At the first company, a good software engineer was a person who could work effectively inside the team and get through the requirements for code review in a timely manner. At another job, a good programmer was someone who could figure out the requirements by themselves and have a working product to show the boss in a timely manner. At my current company, a good software developer is someone who has a strong grasp of their language of choice, can deliver features to a Product Owner, and can grab a project by the horns and lead from an architectural or design perspective.

None of these answers are wrong, they are just reflections of the engineering cultures they are born from.

Based on my experience a good programmer is someone that understands what success means in the culture they are currently in and can execute on that. Long term a good programmer is someone that continues to learn popular skills, products, frameworks, tools and can continue to follow the path that best suits them, whether that be a startup, large corp, freelance, or entrepreneurship.

It's all about opening doors for yourself to walk through.

I wish everyone were forced to program everything on a machine with the resources of a Pentium I-class processor and say 64MB RAM.

Learn to conserve your resources, learn to eschew that which is not needed, and for crying out loud learn to not write bloated stuff. Even today, most programs I use on Windows 7 were written back in the days of Windows 95/98 and they still work faster than any current modern implementation you can find, and they're smaller.

Every programmer should be given ZX81 with 1kB of memory to start with.

And after that, the maximum hardware they should be allowed is what I mentioned. Unless you're crunching massive data sets, you don't need 64-bit (and most programs don't, excepting games and of course databases!)

Just tell them to do microchip programming, like on an Arduino. Though, the idea of 'it works faster' in comparison with today's software is not really valid, as the expectations and requirements of software have gone up since 2000. Of course Firefox 2.x was fast and not memory hungry! It didn't have to:

  - support x64
  - support new web tech like WebGL, WebRTC, Crypto API, all CSS options, the latest JavaScript specs
  - update rendering engine for CSS3
  - support retina/hiDPI
  - handle webpages that are big in size + have loads of javascript libraries + multiple advertisement parties
  - update to a modern UI
  - implement new cryptos for HTTPS
  - ship with lots of rejected certificates
  - still support old systems
  - do all this in a secure way

And then we would only have a fraction of the current tools and innovations, because we would all waste an enormous amount of time doing tedious memory and resource management.

While modern tools and frameworks may sometimes feel bloated, they allow us to develop and iterate fast. I know I would've finished fewer projects if I had to program with performance and size in mind. For a lot of projects it just doesn't matter that much. It's more important that the job gets done.

"While modern tools and frameworks may sometimes feel bloated, they allow us to develop and iterate fast"

And insecurely, at the same time. The majority of vulnerabilities today come from shoddy programming practices taught by the newest languages and frameworks.

> I divide and conquer and get things done.

I had the same insight after many years, like the author - that every problem you do can be broken down into smaller problems, which can be worked independently and tested independently. Why it wasn't obvious to me early on I have no idea. I guess I never developed the intuition to see the efficiency of the method. Did anyone else among you have the same experience?

I remember early on in my programming education having subroutines pounded into my head. I learned them grudgingly, because nothing that I'd written so far was complex enough to need them. It was always simpler to just write my <100 line program without any functions/subroutines.

Things got more complex. Named subroutines started seeming logical to encapsulate certain behaviors. Eventually, it became natural to think "I wish this part of the program were separated out and implemented elsewhere, so that I could treat it as a solved problem." That's about the point where a divide-and-conquer approach started occurring to me, at least in a limited context. That context expanded over time.
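That progression can be sketched with a toy example (the task and function names are invented for illustration): a small reporting job decomposed so that each piece is a solved problem on its own, and the top level reads like the problem statement:

```javascript
// Hypothetical example: "summarize some CSV-ish records" broken into
// small functions that can each be understood and tested in isolation.
function parseRecords(lines) {
  return lines.map(line => {
    const [name, amount] = line.split(",");
    return { name, amount: Number(amount) };
  });
}

function totalAmount(records) {
  return records.reduce((sum, r) => sum + r.amount, 0);
}

function formatReport(records) {
  return `${records.length} records, total ${totalAmount(records)}`;
}

// The top level is just the composition of solved sub-problems.
function report(lines) {
  return formatReport(parseRecords(lines));
}

console.log(report(["alice,10", "bob,5"])); // prints "2 records, total 15"
```

Each piece can be swapped out or tested on its own, which is exactly the "treat it as a solved problem" feeling described above.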

I definitely had that same experience. I've been writing code like that for about 15 years now and it's only started changing over the past two years or so.

I think it stems from a few reasons: 1. I have a tendency to fix small problems that only need a one-time fix. So, lots of code, but disposable and messy. 2. HS programming classes lacked a lot of the meta stuff, like refactoring, working in groups, etc. 3. Most of my coding years have been spent in a terminal where a combination of awk, sed, grep, xargs, etc. works perfectly fine. A non-trivial number of problems can be solved that way, but it hits a wall pretty quickly.

For the past few months, I've been writing a lot of Clojure and have found that it fits my mental model significantly better than Python, C, etc. What's interesting is that all the other languages I code in have dramatically improved ever since Clojure "clicked". Hell, even my bash-fu has become cleaner, terser, faster and more bug-free. Funny how that works.

Hey, I'm currently using Clojure too and I feel the same way!

I think the reason it feels more natural is its emphasis on process instead of things. You think of computation as a series of transformations, and that makes it easy to establish causal relationships among events and thus helps you reason better about the system. (I think Steve Yegge first complained about Java being a land of nouns.)

I think the failing of OOP for me is that I never got what an object is! To me, an object always felt like a weird plane in an n-dimensional space (because an object contains other objects), and it was hard to reason about that. For example, my first reaction (at least in my latent consciousness) to abstract classes was: what is it abstract about? If it is abstract about a bunch of things that are only related because of the flow of code, I guess I wasn't smart enough to crack that effectively.

I agree, but I have an alternate intuition. Humans are only able to understand anything through decomposition to smaller problems, working out those problems, then recomposing to a whole again. This is evident not just in software development, but in everything.

At some point you have to ask yourself: If this "decomposition" hammer is the only tool we have to use in the world, it's no surprise that everything looks like a nail.

> At some point you have to ask yourself: If this "decomposition" hammer is the only tool we have to use in the world, it's no surprise that everything looks like a nail.

Agreed! I think this is why it was not intuitive to me. I was (reasonably) good at math in school, but I never tried to decompose the problem. I always tried to get a feel of the problem by looking at it from various angles and get the crux of the problem. That's the best I can say about my thinking method then.

That makes sense. The thing about decomposition is that there is more than one way to do it (a toolbox with multiple hammers). So even if you decompose, you have to look at multiple angles before you choose a decomposition.

That is only easy if the overall architecture has already been taken care of by someone else.

You are essentially describing 'test driven development', and many of us have either been taught the methodology, or stumbled on to it.[1]

[1] https://en.wikipedia.org/wiki/Test-driven_development

No, abc_lisper is describing decomposition of the problem. That may or may not be implemented in a TDD environment; it may not even involve tests at all.

abc_lisper specifically said: "every problem you do can be broken down into smaller problems, which can be worked independently and tested independently".

Unless you delete or ignore "and tested" from that post, abc_lisper's approach definitely involves tests.

Testing does not automatically imply automated tests.

I do break down problems and follow the same kind of process, yet my goal is less automated testing than it is raw flexibility, ability to recombine pieces of code at will and move fast.

Chasing automated tests, even if it supposedly yields this kind of composite constructs is in my opinion taking an interesting side-effect of good craft for an end in itself.

Then of course, saying the code can be tested in isolation doesn't imply coding is driven by tests.

TDD as an end in itself is, in my opinion, like all idols in the history of cults: the death of critical thinking and progress.

I stand corrected; what abc_lisper said definitely involves tests. That's not necessarily TDD, though; see my post at https://news.ycombinator.com/item?id=13465459 for the difference.

I agree that abc_lisper did not 'exactly' describe test driven development, which is why I said he 'essentially' described test driven development. I am a believer in test driven development, yet I often write a bit of application code before I write the tests; I acknowledge that probably makes me a pariah, but I still keep tests (and TDD) in mind.

Meh, I think TDD has value, but I don't think a methodology should become a religion. Write a bit of application before your tests, write most of the tests before the code, you're good to call it TDD in my book. (Of course, being non-religious about methodology may make me a pariah, and not just on TDD; but I can't work up the energy to care...)

I agree decomposing and testing go hand in hand, but it wasn't even apparent to me that decomposition is the way to go. I guess I didn't know it was possible to design solutions to problems while breaking them down. Btw, I discovered (he hee) testing of modules independently very shortly after I found out about decomposing.

I'm training my nephew; he's living with me while going to college in Philadelphia. He's been getting really frustrated hearing platitudes like this from his professors.

I've basically been mentoring him and having him read https://github.com/braydie/HowToBeAProgrammer , Comp Science with Python, and a couple books on linux and C to help with some lower level stuff.

He's been struggling a lot with C and Linux concepts. I've been there to help, and now I have a strong opinion that C and Linux would give a lot of programmers a better understanding of their programs and computers.

I agree that it would help with a better understanding. I can't help but wonder why aspiring sysadmins and programmers DO NOT have immediate interest in such things? To me it just 'made sense' and 'clicked' almost immediately. Maybe I learned from a different angle, but it seems obvious to me to want to learn these things (C and Linux fundamentals, I mean).

I don't know for sure, but I noticed a lot of his class notes were written for Windows and Mac users. Also a lot of it was focused on Java and Python. I wonder if C and Linux classes became higher-level electives.

A good programmer is a good problem solver. But it also requires technical knowledge or ability to acquire it, as well as a good ability to communicate ideas.

Now, you can solve problems in many ways. They can be solved in the best interest of the programmer, or in the best interest of the company.

If I have a sales team, and give them a goal to sell $100, and they make a loan for $100 on behalf of the company and give me $100, that's probably not a good sales team. Let alone a 10x sales team.

But in software, technical debt (equivalent to buying with a loan) is something done frequently and some people mistakenly perceive it as being a good programmer. Usually because they cannot perceive the build-up of accrued work.

> never be dogmatic about anything

Mostly agree with that with the exception of learning new things. Sometimes it helps to be dogmatic about things if your goal is to learn something new.

A prime example is TDD. When my goal is to build a working piece of software, I tend to do TDD for some parts and write tests post facto (or not at all) for other parts. When learning TDD it helped to force myself to be dogmatic.

Similarly, if you're learning functional programming in a language that isn't strictly pure, it's helpful to be dogmatic about your approach and force yourself to do things the functional way, while in the real world introducing limited impurities may be better.

"Write unit tests, they could prove to be invaluable, especially when you introduce changes to your codebase."

I always read this... What exactly is a unit test? What parts of my software should be tested? What are some examples of good unit tests and the code that is tested?

A unit test is something that calls the code you wrote, and makes sure that the responses are what they should be. It's often (though not always) written using a testing framework, like jUnit or CxxTest. These frameworks handle some of the tedium of writing the tests, but not all of it - testing is tedious to do, and there's not much way around it.

What parts should be tested? Everything that can be conveniently tested, and some distance past that. The goal is "every externally visible behavior of a class"; that is, everything that matters to a caller. For example, let's suppose I have a "sorted container" class. I'd want to test that it returns stuff in sorted order, maybe with two or three items, and with lots (hundreds or thousands). I'd want to test that it handled duplicates correctly. I'd want to test that it worked correctly with only one item, and with zero. I would not want to test whether it was internally implemented with a vector - that's not something that the caller cares about. So if the sorted container is changed from a vector to a tree, the unit tests should run unchanged.
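As a rough sketch of those sorted-container tests (the SortedContainer class and the assert helper are invented for the example; in practice a framework like jUnit or its JavaScript equivalents would supply the assertions):

```javascript
// A deliberately naive sorted container -- the point is the tests,
// which only exercise externally visible behavior.
class SortedContainer {
  constructor() { this.data = []; }
  add(x) { this.data.push(x); this.data.sort((a, b) => a - b); }
  items() { return this.data.slice(); }
}

// Stand-in for a test framework's assertion.
function assertItems(expected, container) {
  const e = JSON.stringify(expected), a = JSON.stringify(container.items());
  if (e !== a) throw new Error(`expected ${e}, got ${a}`);
}

// Zero items, one item, several items, duplicates -- and no test
// ever peeks at this.data, so swapping the vector for a tree later
// would leave every test unchanged.
assertItems([], new SortedContainer());

const one = new SortedContainer();
one.add(7);
assertItems([7], one);

const many = new SortedContainer();
[3, 1, 2].forEach(x => many.add(x));
assertItems([1, 2, 3], many);

const dups = new SortedContainer();
[2, 1, 2].forEach(x => dups.add(x));
assertItems([1, 2, 2], dups);
```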

Just last week, I was breaking apart a singleton into three related singletons. "Related" means that they each need a reference (or pointer) to at least some of the others. Turns out I coded a recursive constructor loop. A unit test pointed that out to me.

I've seen unit tests catch race conditions. Catch that I was calling a pure virtual method in a base class destructor. Plus of course all a whole bunch of the usual regular bugs.

"Unit" is not well-defined. However, it usually means a function/method. Being a unit test also implies some explicit setup for when that function is used.

The easiest examples of unit tests are math functions, since they're so well-behaved.

Here is a square root function in Javascript:

    function sqrt(n) {
      var guess = n/2;
      // Newton's method: refine the guess until it's close enough
      while (Math.abs(guess*guess - n) > 0.00001) {
        guess = (guess + n/guess)/2;
      }
      return guess;
    }

    assert_equals(4.0, sqrt(16));
There is nothing to set up because this code doesn't depend on a database, or an HTML input field, or the phase of the moon.

The call to assert_equals there is a unit test. (Also, keep in mind that assert_equals is a glorified if statement)

"well, what isn't a unit test??"

Since we can call anything we want a unit, the answer is "whenever we say we aren't doing a unit test". But a more practical answer is: whenever you're testing the interaction between two or more units (where a unit is a function/method).

If you're making a video game, you might have enemies who can jump, and guns that can fire. It would be a unit test to check if enemies jump correctly, and it would be a unit test to see if guns fire correctly. It wouldn't be a unit test to see if guns fire correctly from a jumping enemy, since this is the combination of two pieces.
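That distinction can be sketched with made-up Enemy and Gun classes (both invented here purely for illustration):

```javascript
// Hypothetical game pieces.
class Enemy {
  constructor() { this.y = 0; }
  jump() { this.y = 1; }  // airborne
  land() { this.y = 0; }
}

class Gun {
  constructor() { this.ammo = 2; }
  fire() {
    if (this.ammo <= 0) return false;  // can't fire when empty
    this.ammo -= 1;
    return true;
  }
}

// Unit tests: each piece checked in isolation.
const e = new Enemy();
e.jump();
console.assert(e.y === 1, "enemy should be airborne after jump");

const g = new Gun();
console.assert(g.fire() === true && g.ammo === 1, "firing spends ammo");

// Integration test: the combination -- a jumping enemy firing a gun.
const e2 = new Enemy(), g2 = new Gun();
e2.jump();
console.assert(g2.fire() && e2.y === 1, "gun fires while enemy is airborne");
```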

Another way to (sort of) define a unit test is to say "how does this behave assuming that the rest of the system is correct"? That is, we aim to check the behavior of our code in isolation from the rest of the system.

This is important when something is wrong, because if we know what is behaving correctly, we don't have to waste time checking if it's the cause.

TDD is like object-oriented programming. Nobody can agree how precisely it should work in the real world, but "you'll know it when you see it". And you're 50% likely to embrace it, and 50% likely to decide it's utter crap.

TDD is more than having unit tests. TDD means that the first thing you write are the tests.

Why does that matter? It means that you're writing a user of your class before you write the class. That means that the interface of the class gets designed from the mindset of a user, not an implementor.

More: The interface gets designed by someone who wants to be able to test all the externally-visible behavior of the class. If you can't test it, you have to think about re-designing the class interface - often by breaking it up into smaller classes. I've seen this in practice; the net effect of this is better class design (plus thorough tests).
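A minimal sketch of that test-first rhythm, with an invented Stack class: the test is written first, against the interface a caller would want, and only then is the class written to make it pass:

```javascript
// Step 1: write the test first -- it pins down the interface from the
// caller's point of view, before any implementation exists.
function testStack() {
  const s = new Stack();
  console.assert(s.isEmpty(), "new stack starts empty");
  s.push(42);
  console.assert(!s.isEmpty(), "stack with an item is not empty");
  console.assert(s.pop() === 42, "pop returns the pushed value");
  console.assert(s.isEmpty(), "stack is empty again after pop");
}

// Step 2: write the simplest implementation that makes the test pass.
class Stack {
  constructor() { this.items = []; }
  push(x) { this.items.push(x); }
  pop() { return this.items.pop(); }
  isEmpty() { return this.items.length === 0; }
}

testStack(); // step 3: run it; once it's green, refactor freely
```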

That said, I'm considerably stronger of a proponent of tests than I am of TDD. If you don't like the TDD approach, still write tests. Write lots of them. They'll often save you when you make subtle mistakes in your next set of changes. (Nothing like fixing a bug, running the tests, and getting informed of all the implications of your change that you forgot about.)

You see, I'm in that 50% of people who don't particularly like TDD. Even though I tried it, honest to god, several times.

> That means that the interface of the class gets designed from the mindset of a user, not an implementor.

From my experience, something entirely different happens. The class gets designed from the mindset of a third party - the tester. Which is, I believe, a mindset different from the user. If all you're testing is the externally-visible behaviour, fine. You're pretty much treating the class like a library. But if you start injecting stuff into the class to test if some other dependent services got called properly, etc. - and especially if you start designing the interface around such testability, then I believe it'll lead to bad, unreadable code. In particular, I believe adding complexity to the class for the sole purpose of making it easier to test is a code stink.

TDD taken to the extreme prescribes that you should only ever write the dumbest possible code that makes the current tests pass. If you need more complicated behaviour, you first have to write tests for it. But this quickly gets out of hand if your project is meant to do anything more complicated than being a simple CRUD layer, because test complexity rises in lockstep with production code complexity. I've seen cases where people blindly following the test->code->refactor cycle created tests that were themselves isomorphic to the algorithm they were implementing, which makes one ask: where is the code that tests whether the tests themselves are implemented correctly?

So personally, while I like tests (particularly regression tests), I just can't make myself follow TDD.

Well, when I did it, I was the tester, and the user, and the coder, and I switched back and forth frequently - maybe in five minute increments. But I thought differently when wearing each of those hats. I could see the difference between ways of thinking in the design of the code.

> But this gets quickly out of hand if your project is meant to do anything more complicated than being a simple CRUD layer...

I've done things a lot more complicated than a simple CRUD layer. TDD held up just fine.

> I've seen cases when people blindly following the test->code->refactor cycle created tests that themselves were isomorphic to the algorithm they were implementing...

Well, blindly following any methodology is likely to get you in trouble, one way or another. The problem is blindly following, not that they're following TDD.

That said, I did TDD, and now I don't. It worked well for me, I loved the kind of code that came out of it, and I still don't do it any more.

Recently I've come to completely disagree with that wisdom.

You should have thoughtful integration tests. Unit tests are only useful when you have an obvious independent "unit" with semantics that are completely independent from the rest of your code. If so, you test those.


If you want to make sure your functions work, write unit tests.

If you want to make sure your application works, write integration tests.

It is just an automated test of a piece of code, normally of a function. Personally, I prefer to write automated system tests, i.e. tests that test the entire system. For example: a test that plays every single level of your game.

It helps to be obsessive. Program in your spare time. Don't be one of these people that just learns what they need to know to do their job and then stops learning.

I've been programming professionally for 6 years and I'm learning as much today as when I started.

You don't need to be obsessive or program in your spare time to be a good programmer. It's great to continue to learn indefinitely. It's also great to leave work at work and build a life rich with a diversity of interest and activity. You can do both.

Simplicity. And when you think you have a good simple plan of attack, simplify more.

I have rarely had problems from being too simple, and when I do, they are easy to fix; but over-complication (over-generifying) is always a beast to fix.

I agree with this: the simplest solution is often the best, and then you work on it until it is elegant. For comparison, in the visual arts I use the same line of thought when I am turning a sketch into a full painting.

I have read a really good book on this subject:


I think it is only available in German.

First and foremost, add value to business by solving a business problem. Once you can do that, follow the DRY and KISS principles.

I've noticed developers who can quote scripture (best practices), but can't solve a non-trivial problem.

Time, intelligence, resourcefulness, curiosity, stubbornness, ideally being physically next to programmers more competent than yourself, TIME

Adderall or something like that could also be a plus...

If you have a choice between 500 lines of easy to understand, dumb code or 50 lines of abstract elegance, go with the easy to understand, dumb code.
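A small contrived instance of that tradeoff (both versions are invented for the example): the two functions below count word frequencies identically, but the plain loop is far easier to step through when it misbehaves:

```javascript
// "Clever": a dense one-liner; hard to debug when a count comes out wrong.
const countClever = words =>
  words.reduce((m, w) => ({ ...m, [w]: (m[w] || 0) + 1 }), {});

// "Dumb": plain and explicit -- every step is obvious in a debugger.
function countDumb(words) {
  const counts = {};
  for (const word of words) {
    if (counts[word] === undefined) {
      counts[word] = 0;
    }
    counts[word] += 1;
  }
  return counts;
}

console.log(countDumb(["a", "b", "a"])); // { a: 2, b: 1 }
```

Both are correct here, but when the requirements shift, the boring version is the one a teammate can safely modify.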

I call that distinction the difference between an elegant and a clever solution. Clever being what you mention.

If the programmer can't tell the difference when implementing, then go for a simpler version, as you suggest.

- Elegant: Nice, clean, well thought out.

- Clever: Hackish, terse, too hard to understand for whatever benefit it brings.

Ten times as many places for bugs to hide.

In my opinion, "be humble" isn't a winning strategy.

Don't be so arrogant that you stop improving or fail to understand mistakes, but by all means don't be humble. Be proud of who you are and the work you've done. You work in an industry where 90% of people don't understand what you do. Chances are your own boss won't understand the actual value you're providing.

So don't be humble. "There's always someone better than you out there" -- sure, but that person's not here, right?

I think that's crossing the wires of humility and confidence. You can be confident but humble. Confidence is your ability to come up with solutions and feel like they are the right solutions, and that they are good or excellent solutions. Humility is your ability to come up with solutions and realize that (1) they might not be the _absolute best_ solutions, (2) some solutions that look like they're better are actually better (and identifying these).

I think it interacts well with the concept of “strong opinions weakly held”. Have a strong opinion that you've built a great thing, but hold it weakly so that in the face of clear arguments or evidence that it is in fact not great, or that another solution is better, you can admit it and adjust. This, to me, is the right balance of confidence and humility---in programming, as in life.

You can be confident but humble.

I agree with you in principle, but again, I don't think that's a winning strategy.

A humble, competent woodworker creates a piece that's beautiful, visually rich, has a solid, sturdy feel to it, and can be appreciated by most. The software engineer, who checks in a fix on the back-end service that pre-empts several bugs that had yet to be discovered, goes relatively unappreciated.

So can one be humble and confident? Yes. Will it be understood and appreciated from the outside? Probably not. Is it a good strategy? Probably not.

I'd say a person has three options:

1) work for people who are technically competent enough to understand the true value of your work. In my experience, this is pretty rare out in the real world.

2) become a salesman and sell yourself at all times. This seems fundamentally opposed to dominant personality types in software.

3) stop being so humble and be proud of your work. This seems like the most logical choice to me.

Our industry is rife with imposter syndrome, legends of the 10x engineer, impressions that everyone else is succeeding flawlessly thanks to polished blog posts and GitHub portfolios [1], and endless pontificating on things that often boil down to fashion and code style. I don't think we need anything telling us to be more humble.

You may be a completely average engineer writing CRUD software for some local bank. Yeah, maybe that guy blogging about his work at Google would do a better job than you. But he's not here, and you're the best your company has found. You're valuable and capable. Be proud of what you've accomplished!

[1] - people have written recently about how Facebook causes negative feelings since you seem to see others only at their best. I think this is normal folks finding out what tech types have experienced for decades, but nobody has ever talked about it.

While “pride” is often used as the opposite of “humility”---which I think is maybe what you're getting at here---I don't think pride is always the opposite of humility. In particular, I think the word can mean many things, and the meaning you're using isn't the one that's often used as the opposite of humility.

Fundamentally, I agree that you should be proud of what you've done, even if 10X awesome dude at Google could have done better. For one thing, 10X awesome dude at Google is surrounded by other Google folks, and by 15+ years of Google technology. There's being aware that there are better possibilities out there---indeed, being interested in them and even thirsty to learn more about them---and there's being embarrassed about what you've done. You can be interested in the crazy things some people do and want to learn more about those things while still working at a smaller, more limited scope and being proud of what you achieve with what you've got. The latter is how you get confidence, the former is how you get humility.

In short, I think we're probably saying the same thing. When you build a thing, you should be proud of that thing---unless you deliberately cut corners for some other tradeoff, in which case you should be cognizant of why you did it and satisfied with the tradeoffs you made. But that doesn't mean you have to think there's no better way, nor even that you have to avoid looking into those better ways. Often, I think these things (thinking there's no better way, not looking for improvements) are what people mean when they refer to folks who aren't humble. Certainly, that's what I mean.

> You can be confident but humble

So many people miss this, unfortunately.

Failing to be humble eventually leads to arrogance. This advice is counterproductive. The part regarding being proud of who you are is completely tangential. I also highly doubt that "90% of people" in the tech industry don't understand what a "good programmer" does.

The best advice is just to become as good as you can by trying as much as you can:

1. Deliberately working

2. Learning from people you perceive to be better.

3. Keeping up with trends.

4. Mastery of the basics/fundamentals.

5. A razor sharp focus on what really matters -- value in the context of the business/domain.

> 5. A razor sharp focus on what really matters -- value in the context of the business/domain.

This is a tricky part, and you have to ask yourself explicitly - what do you want to work on? Becoming a better programmer, or a better employee/entrepreneur? Because business does not optimize just for good, quality work. It optimizes for earning money. Which involves things like:

- doing good, quality work which can be sold

- doing good enough work, minding the essential tradeoff between quality and business realities

- doing shitty, half-assed work and having a good sales team sell it anyway

- doing useless shiny things that don't really help the intended audience, but help your business to get their money

If the needs of your business are from the latter two groups, being a good programmer goes directly against your career success in that place. Unfortunately, it is my belief that most of the jobs in our industry fall into those two groups. From the POV of everyone else but programmers, the code is only means to an end.

I want to emphasize it, because a lot of advice for programmers tells you about the "razor sharp focus on what really matters -- value in the context of the business/domain", and I found that for some reason, it never resonated with me. It took me a while to understand why - it's because this advice comes from a business mindset, where making money is the goal and the means to it are tangential. I have the opposite mindset - for me, it's the product that matters, not how to make more money off it. Realizing that those are two different worldviews has helped me stop feeling inadequate just because I couldn't bring myself to enjoy the business value. It also helped me understand the POV of management and what they expect from me.

There are people who are very opinionated and also "appear" to be right almost all of the time. But in practice, I've found that this is an illusion and that there are long-term repercussions for being too invested in ideas or technical philosophies.

I think that the "be humble" criterion makes sense because it is directly correlated to the awareness that there is no single right way to do something. Changing requirements can pull the rug from underneath you and turn a good idea into a bad idea.

Software engineering is a bit like economics; a lot of people think that they understand it, but in reality nobody does.

> "There's always someone better than you out there" -- sure, but that person's not here, right?

Great response to a devil I've had on my shoulder for a long time.

> be kind and realize you might be wrong

this is what the author means by "humble", and I think it's damn true

Being proud of your work and being humble are not mutually exclusive.

The current state of software development has been bothering me. Having spent several years in the software industry, I don't find the motivation to write good code, or to design elegant fixes for bugs. I have worked in large MNCs as well as small companies and I see the same trend everywhere: the management does not care about code quality or elegance. They just want to ship things ASAP. As a result, even if a few developers care about code quality and elegance, the majority of developers don't. They just want to put together working software, no matter how it is achieved, and be done with their job.

Now, even if the minority of developers have coded a few modules that look absolutely elegant and neat, it interacts with the majority of the software that is complex, ugly and sometimes bloated. On top of that, in many teams, developers often touch each others' code and modules to make bug fixes, enhancements, etc. which means all code (whether currently elegant or not) tends towards ugly.

How, under these circumstances, do you maintain your motivation to write good, clean and elegant code after having worked about 10 years in the software industry?
