
Some things I've learnt about programming - jgrahamc
http://blog.jgc.org/2012/07/some-things-ive-learnt-about.html
======
AngryParsley
Programming may be a craft, but researchers have published tons of studies
about this craft. Many of these studies contradict anecdotal evidence. For
example, copying code isn't as bad as you might think:
<http://www.neverworkintheory.org/?p=102>

Another example is TDD. People espouse the benefits, then some study comes
along (<http://www.neverworkintheory.org/?p=139>) saying the benefits are
largely illusory and that code reviews are more effective.

Instead of listening to the experts _at_ programming, listen to the experts
_on_ programming. Read some studies about the effectiveness of various tools
and methods. Try new things. Programming is a craft, and like many crafts it
contains significant amounts of dogma passed from teacher to apprentice.

~~~
bambax
> _Instead of listening to the experts_ at _programming, listen to the
> experts_ on _programming_

Wow. What strange advice. All it takes to be an expert _on_ anything is
labeling yourself as such. I'm an expert _on_ a lot of things.

Being competent _at_ something, on the contrary, means actually going there,
doing the stuff and learning the craft.

If the guy trying to teach me mechanics doesn't have dirty hands and nails,
I'm not very interested in what he has to say.

~~~
Confusion
If the guy doesn't know how he does it, what he has to say will not be very
illuminating. To be able to teach, you have to understand why you are doing
things the way you are doing them. It's surprising how many people are good at
what they do, but still don't know why they do things a certain way, have not
considered other ways and noticeably slow down when they leave their comfort
zone.

------
defdac
"You have to ask yourself: "Do I understand why my program is doing X?"" "I
almost never use a debugger." "And if your program is so complex that you need
a debugger [simplify]"

To me, being afraid of a debugger is like being afraid of actually knowing
exactly what is going on - being lazy and just reading logs and guessing what
might have gone wrong, instead of letting the debugger scream in your face all
the idiotic mistakes you have made.

I would argue that using the debugger is being lazy in an intelligent way,
instead of spending hours reading endless logs trying to puzzle together logic
the debugger can show you directly.

~~~
wpietri
If you are spending endless hours puzzling things out, you already have no
idea what's going on. So sure, go for the debugger if that helps. But that's
not what he's talking about.

Like him, I use a debugger rarely. Not because I'm opposed; they're great when
they work. But it means I don't understand what my software is up to. Which
for me is a sign of design and code quality issues. Or just ignorance. Both of
which are solved by working to clean things up.

~~~
icebraining
Cleaning up is fine and commendable if you're working on a piece of software
from scratch.

My code is often just a class or two plugged into a behemoth. I know exactly
what my code is doing, but not exactly how it's being called by the platform,
or what responses it's getting.

~~~
wpietri
Sure. In my view, that's another context where I don't know what's going on.
Ergo, sometimes I give in and use a debugger.

------
bambax
A lot of those things apply to many human activities.

I don't build bridges but I would be very surprised if an architect described
his work as "pure science and no craft at all" (how would it be possible,
then, to build beautiful / ugly bridges?)

I do a little woodworking and have many tools; friends sometimes look at my
shop and ask if I really need all that -- yes, I do. In the course of a
project you get to use many different tools. You can work around a missing
one, but it takes exponentially longer to work without the exact tool. (Same
thing with photography.)

I'm learning to fly, and the most important word regarding human factors is
"honesty". The way to fly is not to avoid mistakes, it's to detect them and
minimize the consequences; if you feel you can do no wrong you'll eventually
kill yourself.

------
vbtemp
> 0\. Programming is a craft not science or engineering

Unless, of course, you are software engineer-ing.

Flight guidance-and-control systems, among many other things, are precisely
engineered software systems. In a world of web-apps and mobile-apps, people
tend to forget this kind of software exists.

Sure, working on your web app, writing some JQuery widgets, or coding up some
python scripts is a craft.

~~~
MichaelGG
Can you explain more about how high-quality engineered software projects do
not rely on programmers' experience and intuition, but instead follow
formulaic rules and achieve excellence?

From my limited reading, it seems that most "mission critical" software is
achieved by applying a lot of resources, especially testing, to the project.
Not to mention having a very well-defined (and relatively unchanging?) problem
space.

Surely, if there were engineering principles that enabled folks to reliably
create high quality software, we wouldn't see the horrible failure rates
across all sorts of software projects.

~~~
pnathan
I work in the area of critical infrastructure software.

There's no special software engineering sauce that gets used. But there is a
dedicated commitment to careful code review, and exhaustive testing, as well
as a _very_ rigorous process for defect handling. We have a dedication from
the top down to ensure our stuff doesn't break our customers. We don't have a
"software pirate ninja rock star" culture, we have a culture of careful work.

Doesn't mean null pointers don't get accessed, or that weird code doesn't show
up. It just means that hey, we worked real hard to ensure that these issues
are minimized. Quality is a journey, and every day we have to work on it.

This is a great article about how this sort of stuff is done and the kind of
culture that you want to cultivate. It's a bit dated, but still solid.

<http://www.fastcompany.com/magazine/06/writestuff.html>

~~~
ajross
This is mostly semantics, but in general attributes like "dedicated
commitment" are hallmarks of a "craft" practiced by individuals. "Engineering"
tasks are things like "processes" and "models". So really, I think you're
agreeing more than disagreeing.

And the "software pirate ninja rockstar" thing is a shameless strawman.

~~~
pnathan
Processes are nothing without people dedicated to following them. Models are
unreliable without taking care to craft a reliable system.

I don't really consider disciplined programming to be a branch of engineering.
While I don't have a sophisticated metaphysics of code, it seems that there is
an essential ontological difference between "engineering" a software system
and "engineering" a bridge or a chemical process.

Richard Gabriel once suggested the idea of a MFA in software, and I think that
he is onto something.

<http://www.dreamsongs.com/MFASoftware.html>

~~~
Daniel_Newby
I have done both hardware and software engineering, and I see no difference.

It is difficult to put into words, but I would say that the heart of
engineering is the discipline of understanding how and why something is
useful, as distinguished from feelings or hopes about its utility.

An MFA in software is pretty much the opposite of engineering. Engineering is
not a matter of taste or opinion, it is about creating such hard sparkling
truths that opinion would be superfluous.

------
StavrosK
I never noticed until now, but I never use a debugger either. An employer
urged me to start using one while I was working on their code, but it's just
not natural for me. If I need to debug a crash, I just print the 1-2 variables
of the state involved, see what's wrong, and fix it.

Is there anyone who uses a debugger for more than inspecting state?

EDIT: I guess people using lower-level languages and building more involved
applications use debuggers much more extensively.

~~~
Xion
Sometimes the logging is not enough, and you cannot isolate the case to
prepare appropriate unit test.

Just a few days ago I had a strange problem with the order of imports in
Python at the boundary between my code and an external library (Celery).
There were import hooks involved, but they didn't seem to be executed
properly in certain conditions. I could reproduce the problem quite reliably,
but I needed to pinpoint the exact import (inside Celery itself, mind you)
that was causing it. pdb (the Python debugger) was indispensable while
solving it.

On the other hand, though, it was probably the first time in many months that
I used pdb for more than 5 minutes, and for something more complicated than
checking why a particular test fails.
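
For anyone curious, here's a rough sketch of one non-pdb trick for this kind
of problem (this isn't my actual fix, and the module name is just a stand-in):
temporarily wrap `__import__` so you can see the exact order in which modules
get pulled in.

```python
import builtins

# Record every import as it happens, so the offending one can be
# pinpointed without stepping through each import in a debugger.
_real_import = builtins.__import__
imported = []

def logging_import(name, *args, **kwargs):
    imported.append(name)
    return _real_import(name, *args, **kwargs)

builtins.__import__ = logging_import
import json  # stands in for the suspicious third-party import
builtins.__import__ = _real_import

print(imported[0])  # → json
```

Once you know which import misbehaves, pdb is still the right tool for
stepping into it.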

~~~
StavrosK
I agree, you certainly have to trace execution flow somehow, and a debugger is
the best way to do that. However, my realization was more that the debugger
wasn't useful for the vast majority of cases. Until now, I somehow thought I
used a debugger a lot, and that it was a big part of my debugging (after all,
how do you debug without a debugger?).

This post made me realize that 95% of the time I fix errors from just the
stack trace, 4.9% from logging, and 0.1% from actually debugging.

------
Tashtego
"I'm not young enough to know everything"

Having recently started mentoring/managing the first really junior engineer on
our team (self-taught, <1 year programming experience), boy does this ring
true. Luckily I'm of the temperament to find the "advanced beginner" stage of
learning more funny than annoying.

I think it's possible to understand as little about your code when using
loggers as when using debuggers, so I have a hard time agreeing with him
there. I think his general point about having tools and knowing when to use
them applies just as much to that as it does to language, so he contradicts
himself.

------
asto
A large part of the post can be rewritten as "don't be lazy".

1\. Don't be lazy and just do something that works without taking the time to
learn why it works.

2\. Don't be lazy and just stop when you have something that works. Go through
the code again and see if you can make it better.

4\. If you find yourself writing the same thing twice, don't be lazy and carry
on, put the code in a single place and call it from where you need it.

Or at least that's how I see it. I do all of the things I shouldn't do,
largely because doing things the wrong way is so much easier!

Edit: Rather than rewritten, I meant "falls under the general category of".
The article was great!

~~~
HarrietJones
Though point 4 should be rewritten as "If you find yourself writing the same
thing twice, DO be lazy and carry on, put the code in a single place and call
it from where you need it."

~~~
asto
There is usually an upfront cost of time and/or effort. The choice comes down
to copy-pasting what you need everywhere or put effort into making a generic
function that you can put in one place. If you are working against an
impossible deadline, option 1 seems very enticing. In the long run, you are
right, the effort is much less to maintain a well thought out codebase.

~~~
HarrietJones
I don't really understand why people think that copy pasting takes less time
than making a new function.

Your function may need some refactoring to make sense, but just cutting that
block of text out and then calling it in a function can usually be done in
seconds.

Copy Pasting: Copy the code. Paste in the new place. Change variable names to
fit the new place.

New Function: Cut the code. Place inside a function declaration. Create calls
to the function in the old and new places.
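
In toy form (all names and numbers invented for illustration), the two end
states look like this:

```python
# Before: the same pricing block pasted in two places.
def price_for_book(base):
    total = base * 1.08           # add tax
    return round(total * 0.9, 2)  # apply 10% discount

def price_for_dvd(base):
    total = base * 1.08
    return round(total * 0.9, 2)

# After: cut the block once, place it inside a function declaration,
# and call it from both places.
def discounted_price(base, tax=1.08, discount=0.9):
    return round(base * tax * discount, 2)

def price_for_book_v2(base):
    return discounted_price(base)

def price_for_dvd_v2(base):
    return discounted_price(base)

print(price_for_book(10.0), price_for_book_v2(10.0))  # → 9.72 9.72
```

The "after" version is barely more typing, and the next change to the pricing
logic happens in exactly one place.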

~~~
Wilya
It doesn't usually happen like that for me. More likely, I stumble on a point
where I have to reuse part of some old code, but with slightly different
goals, or with some steps added/removed in the middle.

Copy-pasting and just dropping the useless parts is easier than making a new
function, because you have to think about how to make the new function apply
to both cases (the easiest way is "well, I'll just put a switch in the
parameters and add if statements", but it doesn't really lead to better
code).

------
zethraeus
It's nice to see number 5.

>Be promiscuous with languages

>I hate language wars. ... you're arguing about the wrong thing.

It's easy to take this for granted, but it's a concept that is very important
to stress to new coders. If you spend too much time focusing on one language,
you run the risk of the form becoming the logic. This is a dangerous place
where your work can be better analogized to muscle memory than to logical
thought.

At least in a college environment, I think this lack of plasticity causes
discomfort with different representations of similar logic - and so flame wars
abound.

------
mustafa0x
> Programming is much closer to a craft than a science or engineering
> discipline. It's a combination of skill and experience expressed through
> tools

You seem to be implying that the latter statement doesn't apply to the
disciplines of science and engineering. "Skill and experience expressed
through tools" is highly important in both watchmaking _and_ bridge building.
I would advise anyone who says otherwise to reconsider.

I understand your point, but why create a hugely false dichotomy between a
craft discipline and the science and engineering disciplines?

\---

I strongly concur with points 2 and 6.

~~~
HarrietJones
Bridge Building vs Software Engineering
<http://www.codinghorror.com/blog/2005/05/bridges-software-engineering-and-god.html>

There is something different between software and other engineering methods.
I'm not sure he expresses it properly, but it's definitely there.

~~~
ObnoxiousJul
The core of programming is creativity.

No two developers develop the same way, and even though there are some
obvious approaches, there seldom exists an absolute best way for all cases.

It is the thesis of The Mythical Man-Month (F. Brooks), and I do like this
theory since its corollary, the "no silver bullet" syndrome, is quite
accurate.

The essence of programming is creativity, thus no tools can improve software
productivity in its essence.

The problem with school is that studious, dull students with no imagination
think they're worth something in programming by incanting mantras of
pseudo-tech gibberish. They have a $90K loan, no gift, and they pollute the
ecosystem because otherwise they'd become hobos. At least most of them are
hired as Java, C++ or PHP developers, where they fit best.

~~~
TeMPOraL
> The essence of programming is creativity, thus no tools can improve software
> productivity in its essence.

Completely untrue. Creativity is a fragile thing, and anything that stands
between you and expressing your thoughts may break the creative process
altogether. Good tools can also enhance the process[1], for example by
allowing you to see the thing you're working on in real time (see e.g. Bret
Victor's "Inventing on Principle").

Also, software productivity is a function of both creativity AND being able to
turn the idea into reality efficiently. Good tools do a great job on the
second part.

[1] I do have the feeling though that most of the creativity still happens on
paper and/or whiteboard, not inside computer programs.

~~~
ObnoxiousJul
Most painful bugs are in the conception, thus at the paper level.

And most breakthroughs are also in simple, efficient conception.

Delivering is what most people call craft.

I do pride myself on delivering; however, any monkey coder can deliver.

------
einhverfr
These are all golden lessons that people who _think_ about writing code
generally learn.

One thing I would add though is that there are many times when there is time
pressure and a kludge works. The right thing to do here is to document that it
is a kludge so that if/when it bites you later you have a comment that
attracts your attention to it.

"I don't understand why this fixes the problem of X but this seems to work" is
a perfectly good comment. It's great to admit in your comments what you don't
know. (That's why questions relating to commenting are great interview
questions IMO.)
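
As a hedged illustration of what such an honest comment might look like (the
client and the flaky behavior here are entirely made up):

```python
import time

class FlakyClient:
    """Invented stand-in for a service that returns nothing on the
    first call -- the kind of behavior you work around before you
    understand it."""
    def __init__(self):
        self.calls = 0

    def get(self):
        self.calls += 1
        return [] if self.calls == 1 else ["report"]

def fetch_report(client):
    # KLUDGE: I don't understand why the first call sometimes comes
    # back empty, but retrying once after a short pause seems to work.
    # Remove this when the root cause is found.
    data = client.get()
    if not data:
        time.sleep(0.05)
        data = client.get()
    return data

print(fetch_report(FlakyClient()))  # → ['report']
```

The comment is what turns a silent time bomb into something a later reader
(possibly you) can recognize and investigate.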

Finally, I think it's important in the process of simplification to
periodically revisit and refactor old code to ensure it is consistent with the
rest of the project. This should be an ongoing gradual task.

Anyway, great article.

------
bromagosa
The debugger part doesn't look generic enough to me. As a Smalltalk
programmer, I can only say usage of debuggers depends _a lot_ on which
language you code in.

In Smalltalk, you practically live inside the debugger. Also, if you are an
ASM programmer, the debugger is indispensable.

------
10098
> I almost never use a debugger. I make sure my programs produce log output
> and I make sure to know what my programs do.

I used to do precisely that. Sprinkle code with log messages, recompile and
run. When I finally learned how to use gdb, my debugging productivity
increased tenfold.

I mean, just the ability to stop your program at any given point gives you an
enormous advantage. You can not only examine the local state of your program,
but also you can see how the state of systems outside of your program (e.g.
database) changes, and all of this without polluting the code with tons of
useless debug messages.
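
The same workflow exists in interpreted languages too. As a rough
illustration (in Python rather than gdb, since gdb sessions don't paste
well; the function and values are invented), `sys.settrace` can emulate a
one-shot breakpoint that captures local state at a chosen point without
adding any log lines:

```python
import sys

captured = {}

def trace(frame, event, arg):
    # Fires on every function call; when we reach the function of
    # interest, snapshot its arguments -- roughly the "break here,
    # print locals" cycle a debugger gives you, minus the log lines.
    if event == "call" and frame.f_code.co_name == "update_balance":
        captured.update(frame.f_locals)
    return None

def update_balance(balance, delta):
    return balance + delta

sys.settrace(trace)
update_balance(100, -30)
sys.settrace(None)

print(captured)  # → {'balance': 100, 'delta': -30}
```

A real debugger adds the interactive part on top: you can change your
hypothesis mid-session instead of re-running.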

Often when I had new ideas during bug hunts, to test my hypothesis without a
debugger I had to go back and add new logs, then recompile, then run (and
make sure it reached the same state as before!) - lots of wasted time. With a
decent debugger it's as easy as typing an expression.

And I don't think debuggers lead to lazy thinking. The process of finding the
problem is the same whatever method you use - you analyze the code, have an
idea about what could be wrong, change one thing, then see what happens.
Debuggers just make it easier.

------
HarrietJones
Good article, though I don't agree with it all.

It is harder to grow software than it is to initially build it. Preconceptions
bite you on the ass, data structures don't allow for new features, side
effects multiply.

You don't need to learn the layers. In fact, if you're learning all the
layers, you're probably an ineffective coder. This is not to say that you
shouldn't investigate the layers or have a poke around them. But software's
about reuse, and reuse is about reusing other people's work via known
interfaces without worrying overmuch about what goes on under the hood.

I'm actually more of a debugger than a profiler, and as much as I'd like to
believe that my way is as valid as his, I suspect that he's probably right on
this and I'm probably wrong.

~~~
kenbot
I find preconceptions bite ass way harder when you try to design big things
all at once. If you grow them you can pivot and refactor more easily while
they're small.

~~~
sp332
Growing sucks with large codebases. I worked with about a dozen people on a
huge app that would have been much easier to build if we had had a better
idea of what we were building when we started, instead of making stuff up as
we went along.

------
raheemm
#7. Learn the layers. Is this even possible anymore? Seems like with apis,
frameworks, there are layers upon layers just within code. Then you have the
OS, the hardware, the network layer (whoa! 7 layers right there!)...

~~~
gaius
Well, you are referring to Wheeler's Corollary to Lampson's Law, but it
doesn't have to be like that. It's actually less work to just write the
program you want in the base language, than trying to shoehorn it into a
framework, or wrestle the framework around your code.

I've recently rediscovered the joy of writing GEM apps in m68k ASM...

~~~
someone13
For people's information:

"All problems in computer science can be solved by another level of
indirection." - Butler Lampson

And, from what limited sources I could find:

"[above] ... Except for the problem of too many layers of indirection" - David
Wheeler

~~~
gaius
It's a shame Wheeler isn't better known. He _invented the subroutine_ back in
the '50s. Calling a subroutine used to be called a "Wheeler jump" (relevant to
#4 in TFA).

~~~
pnathan
His group with the EDSAC did some absolutely tremendous and groundbreaking
work.

Wilkes, Wheeler, and Gill wrote a book, "Preparation of programs for an
electronic digital computer", that pretty much describes the core software
engineering precepts we use today - in 1951.

Gill wrote "The Diagnosis of Mistakes in Programmes on the EDSAC", which gives
a good snip of what they were doing at the time. I think that it can be
obtained for free online.

~~~
gaius
We need to educate people that there was computing before the advent of Web
2.0. There are riches in the past to be mined by those who will take their
blindfolds off.

------
MindTwister
9\. You count from 0

------
plg
Is "learnt" a real word?

~~~
tymekpavel
Yes. It is used instead of "learned" in nearly every English-speaking country
outside the U.S.

------
cryptide
>>a printf that's inserted that causes a program to stop crashing.

Huh?

~~~
salgernon
printf can/does have side effects. Perhaps one of the parameters to the call
is obscured by a macro, which has some other effect. Or perhaps stdout was
redirected and the printf was required to prevent line buffering from hanging
a downstream consumer. The point is, if it seems very unlikely that such and
such could cause a problem, it probably isn't the true cause. (printf itself
isn't the fix, it is just involved in the fix.)

~~~
cryptide
interesting, thanks for the explanation.

------
alainbryden
I like that you started numbering at 0 - very apropos.

------
wissler
Taking personal pride in not using a debugger is a bad idea. Sometimes it's
the right tool for the job, and if your picking it up makes you feel dirty,
you're only handicapping yourself.

~~~
wpietri
But sometimes it isn't the right tool for the job. In particular, it enables
you to deal with a confusing, poorly factored code base. The right tools there
are the ones that help you clean the mess up, rather than making the mess more
tolerable.

~~~
wissler
If you have a rational argument that proves that debuggers are only useful on
poorly designed code then I would certainly be interested in hearing it. It is
true that the worse the code the more frequently you need to debug, but that's
a quite different proposition.

~~~
wpietri
I'm not saying they're only useful there. But we both agree that they're very
helpful in understanding a bad codebase. That's because they make bad code
easier to handle. Which for some people removes the incentive to clean it up.
Basically, they use debuggers as crutches.

------
pjmlp
Quite interesting

