

Lost skills: What today's coders don't know and why it matters - jfruh
http://www.itworld.com/it-managementstrategy/190213/lost-programming-skills

======
schrototo
_Kris Rudin, Senior Developer and Associate Partner at digital marketing
agency Ascentium says, "One 'lost skill' that I see all the time with new
developers -- how to debug without an integrated debugger. When I started
programming (in 1986, just after punch cards, using a mainframe & dumb
terminal), we didn't have any IDEs and debuggers, we had to put trace
statements in our code to track values and execution.

"Today," says Rudin, there are occasionally times with you can't use the
integrated debugger in your IDE (usually with some weird web application
frameworks and server configurations), and younger programmers are at a loss
as to what to do, and resort to hack-and-slash coding to try to randomly fix a
bug, using guesswork. Me, I just calmly put in some code to display output
values on the web page, find the bug, and fix it."_

Oh come on. What he's describing isn't some "lost skill". If you can't figure
out how to use print statements for debugging, you're not much of a programmer
in the first place.

Most statements in that article describe really basic, common sense stuff.

~~~
narcissus
I work a lot in PHP, so it's kind of funny that I find that new developers in
PHP are exactly the opposite: most of them don't know how to use IDE debuggers
with PHP. Instead they have print statements all through their code and they
end up veering away from any form of OO or 'complex' code, as it becomes hard
to 'debug' this code when there's a problem.

Don't get me wrong: 'debugging by echo' is useful at times, but I would say
that 9 times out of 10, a proper debugger is more valuable. In my case. YMMV.
etc.

~~~
bsaunder
I find analyzing trace statements to be far faster than setting conditional
breakpoints and stepping through a debugger.

I can quickly run a test, grep/skim through the trace output and ignore/drill
down into the detailed minutiae that are irrelevant/relevant to the problem being
investigated.

I also tend to write my trace statements in an easily parsable format so that
if I need to I can write another program to analyze what happened and find the
problem. Writing a full-on program to process log files happens less often
than chaining several unix commands to find the needle in the haystack
(usually the program happens when I need to thread together several widely
separated trace lines).
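For a concrete picture, here is a minimal sketch of that kind of greppable
trace format (Python, with invented field names; a real format would match
whatever the program under test needs):

    import time

    def trace(tag, **fields):
        # One event per line, fixed prefix, key=value pairs: easy to
        # grep for a tag, easy to split when a later script needs to
        # thread together widely separated lines.
        pairs = " ".join(f"{k}={v}" for k, v in sorted(fields.items()))
        print(f"TRACE {time.time():.6f} tag={tag} {pairs}")

    def enqueue(queue, item):
        trace("enqueue", qlen=len(queue), item=item)
        queue.append(item)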

~~~
cube13
It really depends on the program architecture and what issue you're debugging.

For most single-threaded or simple multithreaded programs, it's easier just to
throw a print statement where you need it and analyze that. Even with a large
data set, a simple grep will probably get you what you need.

When you're debugging a complex, multithreaded program, on the other hand, an
external debugger is much more valuable, because you can break at the exact
point where things go wrong, and examine the entire program's behavior at that
point. Debugging a race condition with prints can be pretty difficult,
especially when the printing changes the timings of the threads.
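Here is a minimal sketch of that failure mode (Python; the counter is
hypothetical, but the lost-update pattern is the classic one):

    import threading

    counter = 0

    def worker(n):
        global counter
        for _ in range(n):
            tmp = counter      # read
            # A print() here often *hides* the bug: stdout locking
            # serializes the threads and the interleaving vanishes.
            counter = tmp + 1  # write back, possibly clobbering another
                               # thread's increments

    threads = [threading.Thread(target=worker, args=(1_000_000,))
               for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)  # typically well under 4000000: updates were lost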

~~~
scott_s
I've debugged my fair share of parallel programs (multithreaded and
distributed), and I've used a mix of both techniques. The advantage to print
statements is that they give you an execution _trace_. It's easier for me to
reconstruct the sequence of events - which is not trivial in a parallel
program. What I give up is full knowledge of any one instant. And that's what
a debugger gives me.

It's a trade-off, but I usually start with traces.

------
skrebbel
I'm sorry, but an article that starts with an argument from authority like the following:

> _Bernard Hayes, PMP (PMI Project Mgmt Professional), CSM (certified Scrum
> Master), and CSPO (certified Scrum Product Owner)._

makes me not want to continue reading much further.

To clarify: you become a CSM and a CSPO by taking a 2-day course each (well,
ok, there's an online exam, but you can look up the answers on Wikipedia and
pass). PMP is the only certification in that list that takes some effort, but
it's totally unrelated to software and, thus, to most of this article's
subject matter.

------
kstenerud
This smacks of a "kids these days" article.

I remember similar articles in the 80s and 90s bemoaning how "programmers
these days" didn't know how to use a protocol analyzer or logic probe, or
didn't know that xor made for a faster register clear operation (or moveq for
68k fans), or any other number of esoteric trivia that, while useful in
context, did not usually contribute significantly towards a programmer's
ability to get the job done.

My first debugger was an in-circuit-emulator for a Z80. It was the size of a
small television set, had a crappy UI, and limited functionality. Today's
debuggers can be hosted on the system itself, and have become so powerful that
most people don't know how to use them to maximum effect (myself included).
IDEs check your code as you type. No more writing something in VI, compiling,
tracking down the cryptic error messages your compiler spat out and trying to
figure out where the REAL error is because the compiler is dumb. You're
shielded from the ugliness underneath, and for 99.9% of cases that's more than
enough.

Do "kids these days" really need to know the sound of a hard drive dying? The
last drive I heard going bad was in the 90s. Since then drives have become so
quiet that you'd need a stethoscope to even hear the arm thrashing (which is
why I use RAID). And how useful is the knowledge that you can open up a frozen
drive and spin it up with your finger going to be as disks are replaced by
SSDs?

We live in the future, where things have gotten a LOT better. Does the new
batch of developers really need to know assembly language? After moving to
"fluffy" languages, I have only twice found the need to use it (once to
disassemble a stack dump
from a JNI crash, and once to monkey patch a buggy device driver). Twice in
all my years of using Java, Python, PHP, Objective-C, Scheme, COBOL, VB,
and C#. Was it damn handy to have the right skill at an opportune time? Hell
yeah. Does EVERYONE need this skill? Hell no.

How about bit packing? Memory has become so cheap and plentiful that even
routers come with 16MB or more. Beyond low level networking and peripheral
protocols, what use is there in packing up bits and coming up with clever
encoding schemes that make for complicated (and potentially buggy) codec
routines? Saving one byte in a packet header is hardly the triumph it once
was.
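For anyone who never had to do it, here's a sketch of what's being retired
(Python, with an invented one-byte header layout):

    def pack_header(version, has_payload, channel):
        # version: 2 bits, flag: 1 bit, channel: 5 bits -> one byte
        assert 0 <= version < 4 and 0 <= channel < 32
        return (version << 6) | (int(has_payload) << 5) | channel

    def unpack_header(b):
        return b >> 6, bool((b >> 5) & 1), b & 0x1f

    assert unpack_header(pack_header(1, True, 17)) == (1, True, 17)

Compact, clever, and exactly the kind of routine the bugs like to hide in.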

All the "kids these days" need to know is their algorithms, profiling,
debugging, multithreading issues, and how to write well structured,
maintainable code in their paradigm of choice. The rest is usually industry
specific, and can be learned as-you-go.

Don't worry about the kids. The kids are alright.

------
quanticle
>Other low-level skills that today's engineers aren't getting, according to
Carl Mikkelsen: "programming tight loops, computational graphic approximations
like a circle algorithm, _machining cast iron_, designing CPU instruction
sets, meeting real-time constraints, programming in the inefficient zone -
recovering some of the last 30% of potential, and analog design, such as audio
amps."

What, do they also expect us to cut, polish and etch our own silicon wafers as
well?

~~~
mechanical_fish
For the record: Everyone who calls themself an engineer should have tried to
machine cast iron (or some kind of metal; better to start with aluminum) at
least once. And everyone should know how silicon is made, hands-on if possible
(which it generally isn't; fabs are expensive).

Why? Because these things are awesome, and form the foundation of our culture.

Now, having said that: No, having hands-on experience with these things
doesn't really help with programming. ;)

~~~
Ixiaus
I think these things _do_ help with programming in a manner that is probably
hard to measure. _Making something_ has an effect, one I find hard to describe, on
my ability to think logically and creatively (in more areas than just
programming, but it so happens that I spend most of my time programming).

For those interested, here's a series of books on how to build your own
machine shop from scratch (by building your own foundry, casting parts,
etc...): <http://www.lindsaybks.com/dgjp/djgbk/series/index.html> (I'm not
affiliated with Lindsay Books in any way)

~~~
mechanical_fish
I don't disagree. But the problem with this argument is that it's hard to find
any creative or imaginative activity that _doesn't_ potentially help you with
programming.

Tinkering? Cooking? Music? Writing? Reading? Puzzles? Even _sleeping?_ It's
all potentially good.

------
ColinWright
It quotes our own bensummers:

        Ben Summers, Technical Director at ONEIS, a U.K.-based
        information management platform provider, points out
        that "habits learned when writing web applications for
        14.4kbps dial-up telephone modems come in rather handy
        when dealing with modern day mobile connections. When
        you only had a couple of Kbytes per second, and latencies
        of a few hundred milliseconds, you were very careful to
        minimize the size of the pages you sent, and just as
        importantly, minimize the amount of back and forth with
        the server."
    
        With today's mobile connections, says Summers, "the
        latency is much worse than using a telephone modem
        connection, and that's compounded by error rates in
        congested areas like city centers. The fast 'broadband'
        headline speeds are pretty irrelevant to web applications.
        It's the latency which determines how fast the response
        time will feel, and tricks learned when phone modems
        ruled the world come in awfully handy. As a bonus, when
        someone uses your app on a fixed connection, it'll feel
        as fast as desktop software!"
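Summers' latency point is easy to check with back-of-envelope numbers (the
values below are illustrative, not measurements):

    # time ~= round_trips * latency + bytes / bandwidth
    latency, bandwidth, page = 0.300, 2_000_000, 500_000  # s, B/s, bytes
    for round_trips in (2, 8):
        t = round_trips * latency + page / bandwidth
        print(round_trips, "round trips:", round(t, 2), "s")
    # 2 round trips: 0.85 s; 8 round trips: 2.65 s. The 0.25 s transfer
    # term barely matters -- the back and forth dominates.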

~~~
bensummers
I was shocked to learn that I was an "industry veteran and/or seasoned coder".

~~~
weaksauce
Out of curiosity, was this particular quote taken from HN, a more formal
interview, or a blog posting that you wrote?

~~~
bensummers
The author put out a request for contributions, and somehow it ended up in my
inbox. I thought, "why not?". I just proposed a topic, and when the offer was
accepted, wrote a short bit of text.

I'm curious as to the effect it will have, if any. So far it's resulted in two
hits to our web site, neither of which explored any further than the home
page.

~~~
weaksauce
Interesting. One of the hits to your blog was me looking to see what your
credentials might be, so sorry to inflate them by a factor of two.

~~~
bensummers
How did my credentials look?

(Two more hits since the last comment!)

------
dspillett
One thing I find that coders who have not been through a formal course like
university (and even some who have...) lack is an understanding of basic
complexity scaling issues.

And by basic I mean not understanding the different performance implications
of a table scan, an index scan and an index seek in an SQL query plan. And
also why "it takes ages first time, but is quick after that (when everything
is already in RAM)" is usually not acceptable (every time could be the first
time around if the query isn't run often or RAM is limited).
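The same scaling gap is easy to demonstrate outside SQL; a rough Python
analogy, with a list standing in for the table and a dict for the index:

    rows = [{"id": i, "name": f"user{i}"} for i in range(1_000_000)]

    def table_scan(target):  # O(n): touches every row, on every query
        return next(r for r in rows if r["id"] == target)

    index = {r["id"]: r for r in rows}  # built once, like a DB index

    def index_seek(target):  # O(1) per query thereafter
        return index[target]

"Quick after the first time" just means the scan cost is hiding in a cache;
it is still paid in full by every cold query.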

Some of the stuff that article lists is just not needed at all by a coder,
really. Some are strictly hardware issues. Others are oddly specific:
"programming tight loops" is part of the complexity theory thing:
understanding how a process will behave at relevant scales and optimising
accordingly.

~~~
Ixiaus
I agree with this - but it is more than that; I've also undertaken a
self-study of machine fundamentals. That is something almost all web
application programmers lack (those that don't come from EE/CS, that is). By
machine
fundamentals, I mean how instructions are executed on the processor - how
memory works - how different kinds of work can either be optimized for the CPU
and the CPU's cache or how it can be optimized for memory - what the
difference is between a 64-bit bus and a 32-bit bus - what's so special about
PCI-Express and the AGP buses - how I/O _actually works_ and what random seeks
on the disk are (yay for SSDs).

The biggest gap, IMHO though, is a lack of knowledge about big O and why
quadratic behavior can be bad, what it is, &c... That goes hand-in-hand with a
lack of knowledge in algorithms. Why is bubble-sort considered bad? What's a
generator? Why does everyone keep saying to use xrange() in python? Why is it
bad to use list concatenation?
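Those questions have concrete answers. The concatenation one, for instance
(Python):

    out = []
    for x in range(10_000):
        out = out + [x]  # copies the whole list every pass: roughly
                         # n^2/2 element copies in total -- quadratic

    out = []
    for x in range(10_000):
        out.append(x)    # amortized O(1) per element -- linear

And xrange (plain range in Python 3) is the same idea applied to memory: it
yields values one at a time instead of materializing the whole list.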

------
zwieback
_"Assembly language, interrupt handlers, race conditions, cache coherence,"
says Jude Miller, long-time system consultant and industry curmudgeon._

This is what I do every day at my job and there are other people who know how
to do this, plenty of them. Most of them are EEs, though, and consider SW/FW
development their secondary job.

It isn't easy to hire embedded programmers. Graduates with computer
engineering, EE, or CS degrees will have some knowledge, but it's experience
more than anything that will help you acquire these skills. When I look at
resumes I look for experience building small circuits, Arduino or PIC. If I
don't see anything like that or if the skill list starts with Java, PHP, ...
it would be kind of unfair to expect detailed knowledge about how to use
"volatile" in C or how to use a scope to find race conditions.

~~~
jmaygarden
I agree with all your points, but just want to add that I've seen very ugly
code from EE types (I'm an EE). An outstanding board design and audio/video
engineer I worked with avoided loops whenever possible because they "made the
code confusing." However, his FPGA code was immaculate. I chalked it up to
a parallel versus sequential thought process that mirrors the differences
between hardware and software.

~~~
numeromancer
_I chalked it up to a parallel versus sequential thought process that mirrors
the differences between hardware and software._

I don't think this is an essential difference. Programming has preferred the
sequential models of computation, but more as a fashion than a necessity. Now,
Turing's empire wanes as Church's empire waxes, and those parallel elements of
computation that the hardware has hidden from software for so long can no
longer be sequestered in silicon.

------
arohner
"I see poor understanding of the performance ranges of various components"

I'm totally guilty of this. I write new code on new hardware, and have very
little intuition for how fast it should go. Is 10k ops a second good? 1M? I
just don't know. Of course, then I pull out the
profiler, and think about my algorithm, but it takes a lot of second-guessing
to decide how close to the limit I am.

For example, I was writing some Clojure code to write to a SQL database. I'm
relatively new to the JVM stack. I was writing to the DB at 1MB/s. I thought
"well, that's not great, but not bad. Maybe after network traffic and DB
constraints, and writing to a laptop disk drive, I suppose that's alright".
No: I replaced the JDBC connection pooling driver, and the same code now
writes at 8 MB/s.

It'd be nice if there were a web resource for general guidelines on what it
takes to max out hardware. Basically, benchmarks for real-world tasks.
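Absent such a resource, a tiny harness at least calibrates your own stack; a
sketch, with a local file write standing in for the real task:

    import time

    N, ROW = 100_000, b"x" * 100
    start = time.perf_counter()
    with open("bench.dat", "wb") as f:
        for _ in range(N):
            f.write(ROW)
    dt = time.perf_counter() - start
    print(f"{N / dt:,.0f} rows/s, {N * len(ROW) / dt / 1e6:.1f} MB/s")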

~~~
thirdstation
"It'd be nice if there were a web resource for general guidelines on what it
takes to max out hardware. Basically, benchmarks for real-world tasks."

Hear, hear!!

I had the same thought when reading the first two pages of the article. I'd
love to be able to better intuit performance (or heck, troubleshoot slow
systems - which I do more often). The problem I encounter is a lack of
accurate and understandable information about the underlying hardware and the
various layers between my program and the hardware (especially important for
me lately as more of my stuff runs in a VM).

It seems like you need to be lucky and find a mentor willing to teach this
esoteric material.

------
KirinDave
It's unfortunate the folks who edited this couldn't differentiate the real
aces from the people pining for the days of the waterfall model.

------
nadam
I love speed optimization more than what I am mostly paid for at my workplace
(usual business applications). I am also better at what I love. Unfortunately,
no one really wants to pay me to optimize the hell out of an algorithm. They
want me to maintain their boring Java enterprise applications. I've searched
for such tasks here on Hacker News too; no one was interested. Not a single
company. So I don't think it is a skill which is really in demand today.

~~~
mattmanser
Try embedded systems programming; speed is usually pretty crucial there.

I've met a guy who runs a company of about 6 programmers doing this, has more
work than he can handle and has difficulty finding good enough programmers. I
think they're mainly C++ but were recently trying to find a C# guy.

So it's in demand, but you've got to know where to look.

~~~
bradfa
Any info on who this guy is or what the company is?

As an example of embedded programming, do some timing critical work with
microcontrollers and you'll find all sorts of fun optimization problems.
Recently I had an algorithm that took 13 microseconds to execute but needed to
do it in 11 (there was another interrupt coming!). I got to have a good time
with the debugger, understanding optimization levels used by GCC, reading lots
of assembly, and playing with a logic analyzer. It's quite fun, actually.

------
jswinghammer
I would be happy if more than 1/10 interview candidates managed to pass my
most basic programming questions. I consider my standards to be too low for
the type of jobs I interview people for, and I'm still routinely disappointed.

~~~
kahawe
How basic are those questions if you don't mind me asking for details?

~~~
Afton
I've resorted to starting with "find the greatest int in an array of ints".
And yes, I'm routinely disappointed.

~~~
bostonpete
Easy -- quicksort! :-)

~~~
Yeroc
I'm curious why you'd choose an algorithm that on average takes O(n log n)
comparisons, with an O(n^2) worst case, versus simply iterating in O(n)?

Unless there are other requirements, I don't see why you'd suggest sorting the
elements.
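For the record, the straightforward answer is a single pass:

    def greatest(xs):
        best = xs[0]  # assumes a non-empty array
        for x in xs[1:]:
            if x > best:
                best = x
        return best   # Python's built-in max() does the same scan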

~~~
bostonpete
I certainly wouldn't choose that algorithm. I thought the correct solution was
obvious enough that I could joke about it...

------
ominous_prime
This still comes back to the basic guideline: "learn your fundamentals". You
don't need to learn them all (designing CPU instruction sets??), but you do
need to know the primary layers you interact with. A web programmer generally
doesn't need to be aware of the machine code generated by his application, but
should be knowledgeable in networking (layer 3 and up), caching, databases,
and so on.

~~~
StavrosK
I'm a web programmer, and, although it wasn't _necessary_ per se, designing
and implementing a CPU was one of the most fun things I ever did.

------
ff0066mote
I was caught off guard by the first line, where the author uses PHP as one of
the examples of an environment out-of-touch with hardware issues.

I started to learn how to program in PHP. Back then there was a sentiment that
PHP and similar high level scripting languages weren't real programming.

With the web so ubiquitous today, I didn't think that sentiment had survived,
but here it is.

~~~
ominous_prime
The fact that you may be an exception doesn't disprove the general case.
_Many_ self-taught programmers (which PHP brought a rush of) lack programming
fundamentals. They are blissfully unaware of the implications of the Von
Neumann architecture, and don't understand how their actions translate down
the various layers that support their code. If this was written 10 years ago,
the author would have likely cited Perl, or VB.

------
mvanga
I agree with some of the points the author was trying to make, although I
don't think he really expressed them with the right arguments.

I think our culture is, and always will be, based on exploration and
innovation, and this is simply moving to higher levels of abstraction today.
There is nothing wrong with this.

However, I personally am not satisfied with simply being able to use an
abstracted interface. I have a strong curiosity about how things work under
the hood, about tinkering with something to make it do new things, and even
about trying to rewrite things in simpler forms.

I think a different kind of hacker evolves when you have a basic understanding
of the entire technology stack. This breed is inevitably going to fade with
the increasing complexity of this entire stack (breadth and height) paired
with the current speed of innovation.

In the end, we can lament all that we are losing or work towards everything
that lies ahead unexplored :)

------
ColinWright
Single page, faster loading: <http://www.itworld.com/print/190213>

------
numeromancer
People like this author and the people whose complaints he propagated
convinced Socrates that the Oracle was right--because he was the only person
who was aware of _his own_ ignorance.

------
jjm
I'd like to add that a huge issue I saw in big Corp was when people/teams
used FOSS and _didn't_ look at the source.

Just like the phrase rtfm, there should be rtfs.

~~~
prodigal_erik
I do see <http://www.catb.org/jargon/html/U/UTSL.html> every so often.

------
alexk7
I disagree. If you are able to achieve your goals without going low-level,
please do it. If you need to understand low-level stuff to get the job done,
please start learning. But please don't fall for the "every programmer should
know this" meme.

------
MaxPresman
Feels like the author of this article (and the people he quotes) are super
bitter about the "new generation". Perhaps things did not work out quite so
well for them, but there is no need to blame it on the kids : /

------
grantismo
Today's hardware isn't as much of a bottleneck as it used to be. That's the
simplest explanation for fewer programmers with low-level knowledge.

------
samlevine
Obligatory:

<http://www.pbm.com/~lindahl/mel.html>

------
ia
wow, an entire article to complain about increasing abstraction and
specialization. tldr; get off my lawn.

------
nollidge
tl;dr: cite anecdotes, generalize them to an entire generation of coders,
wring hands.

------
cafard
listening to disk drives? Betcha that works great with cloud-based storage...

------
mcantor
How is this hacker news?

------
kahawe
Those are, by and large, not useless skills to have, but most of them have
nothing to do with software engineering or just "coding". Morse code and
listening to my hard drive??? If the data is worth anything to me, I should
have backups anyway, and I should not have to worry about what sound my hard
disk arm makes, especially considering that modern hard drives might not have
an arm at all.

And I think they left out the most basic and fundamental skill or
understanding: there are no silver bullets. A lot of programmers nowadays seem
to religiously follow whatever new language is being hyped and try to fit
their problems and tasks to the language instead of the other way around... a
bit of critical thinking and seeing a bigger picture than "thisandthat is THE
SH*T (right now)!!" would work wonders.

In the real world, real coders could not care less how god-like the latest
scripting languages or NO-SQL-but-relax-data-maps are, because chances are
very good that my customers use Java and Oracle or a few other big names, and
since they are paying me, who am I to push religious plugs about the latest
fads on them?

Bottom line is: real coders (should) just know enough about computers,
hardware, software and networks to make educated decisions and guesses and
typically they don't care that much which language they are getting paid to
develop in... pretty much all languages "suck" in the way that they ALL have
their short-comings and it is up to the engineer to understand them and work
with them.

~~~
arethuza
You do realise that once upon a time pushing Java was pretty much a "religious"
activity and indeed the same applies to SQL. All technologies have got to
start somewhere....

Edit: I know, I was an evangelical Java fanatic from about '95 to '00 or so.

~~~
true_religion
While I wasn't working at that time, I agree.

Relational databases were once considered "academic" because of their
mathematical underpinnings. What was considered practical was flat files or
graph databases.

