
The demise of the low level Programmer. - wqfeng
http://www.altdevblogaday.com/2011/08/06/demise-low-level-programmer/
======
SoftwareMaven
I have conflicting feelings about this. The low-level programmer is
disappearing because the abstraction layers are getting better, which means
the complexity we deal with is "how do I get a 30-machine cluster to map-
reduce efficiently" instead of "how do I optimize these instructions so I
don't blow out my L1 cache".

But...all abstractions in computing are leaky, and not having an understanding
of lower levels of abstraction _will_ come back to bite you.

The simplest example is floating point: how many pennies have been
misappropriated because a developer used a float instead of a fixed-decimal
number? If you understand how the underlying floating-point abstraction
leaks, you would never use an IEEE float for anything requiring exact decimal
values; if not, you feel comfortable using the native type that appears able
to represent decimal values. You don't need to know exactly how the float is
represented to achieve that.
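A small Python sketch of that leak (not from the article; the amounts are made up):

```python
from decimal import Decimal

# Binary floats can't represent 0.10 exactly, so cent-level
# arithmetic drifts: a thousand 10-cent charges miss 100.00.
total = sum(0.10 for _ in range(1000))
print(total == 100.0)               # False: the float sum is slightly off

# A fixed-decimal type keeps the same arithmetic exact.
exact = sum(Decimal("0.10") for _ in range(1000))
print(exact == Decimal("100.00"))   # True
```

The developer never needs to know the bit layout of an IEEE double, only that decimal fractions don't round-trip through it.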

The real trick is understanding how much of each layer of abstraction you
need to know. A generalist needs to know enough about computer architecture,
the network stack, security, etc., that he knows when he either needs to
consult an expert (Google often counts ;) or when he is OK to rely on that
abstraction; but if you try to learn too much, well, you aren't a generalist
anymore.

Great generalists are, by definition, great at figuring that out. As a result,
they are incredibly valuable because they are really good at just getting
stuff done.

~~~
gaius
As an aside, this is one of IBM's secret tricks - hardware accelerated money
datatype
[http://www.ibm.com/developerworks/data/library/techarticle/d...](http://www.ibm.com/developerworks/data/library/techarticle/dm-0801chainani/)
. No other major vendor has this.

~~~
cwzwarich
It's not exactly secret. Most "big iron" architectures had packed decimal
support in the past, and IBM pushed to get decimal floating-point added to
IEEE 754-2008.

~~~
gaius
Well obviously, since they have a web page about it :-) But "secret" in the
sense that not all devs seem to be aware of it yet, if anyone is still using
classical floats (or even ints) for handling money.

------
subwindow
I don't think that knowledge of low-level systems programming is ever going
away. The order in which we learn it is merely changing. In the past, everyone
had to understand the absolute basics _first_ because the higher level
abstractions were very leaky. Over time the abstractions have gotten better,
performance has become less of a problem for most cases, and low-level systems
are no longer something everybody _needs_ to know in order to get by at the
higher layers of abstraction.

However, that doesn't mean that people aren't learning it anymore. They're
just starting with a broad set of high-level knowledge and _drilling down_
into the parts that are important to them. For instance, recently I've been
toying with compact string representations of big integers (using bit
shifting: how topical).
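Not the poster's actual code, but a minimal sketch of that kind of bit-shifting trick: packing a big integer into a compact byte string and back, assuming a plain base-256 encoding:

```python
def pack_int(n):
    """Pack a non-negative integer into a compact big-endian byte string."""
    out = bytearray()
    while True:
        out.append(n & 0xFF)   # peel off the low 8 bits
        n >>= 8                # shift the rest down
        if n == 0:
            break
    out.reverse()              # most significant byte first
    return bytes(out)

def unpack_int(data):
    """Rebuild the integer by shifting each byte back in."""
    n = 0
    for b in data:
        n = (n << 8) | b
    return n

big = 2**200 + 12345
assert unpack_int(pack_int(big)) == big
assert len(pack_int(big)) == 26    # 201 bits fit in 26 bytes
```

(Python's built-in `int.to_bytes` does the same job; the hand-rolled loop just shows the shifting.)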

The end result is that you end up with developers who have broad knowledge of
the higher layers of abstraction but incomplete and specialized knowledge of
the lower layers. That seems perfect to me.

~~~
ajross
The knowledge _can't_ go away, because someone needs to write the systems:
sure apps might move to managed/interpreted environments exclusively, but
those interpreters and language runtimes need to be written first. And they
run on a kernel, with drivers for hardware that needs firmware.

I don't lose much sleep at night worrying about my job security as a "systems"
programmer. If anything, my skill set is becoming underrepresented and hard to
staff.

~~~
ZephyrP
They just can't write LKMs like they used to.

------
adrianhoward
The low level developer hasn't gone. They're just a smaller proportion of a
much larger population of developers.

That makes them harder to find, but they're still there. I wouldn't be
surprised to find out there are actually more of them than there were 20
years back.

The border of where "low level" begins is also shifting. I, for one, am glad I
no longer have to hand assemble my own multiplication code :-)

~~~
olalonde
In China, there are a lot of low level programmers. I guess it has to do with
the fact that electronics are all manufactured here. If you live in Silicon
Valley/Seattle, where most startups are building web apps, it's not surprising
to see few low level programmers.

~~~
jbester
They still exist in the US; they are working "embedded" now. Embedded has made
a huge jump in the last 15 years from custom OSes or no-OS to now COTS
RTOSes/Linux. Even the embedded guys lament the newer guys can't debug an ISR
anymore...

~~~
SteveMoody73
I'm an embedded developer myself and while the original article is aimed at
game development it is just as valid for embedded development.

I've come across many situations where saving a few extra bytes of RAM here
and there and saving a few instructions during an interrupt routine have been
vital.

~~~
yummyfajitas
It's even valid at much higher levels, sometimes. Hadoop gives you high levels
of abstraction, but you can still get major performance gains with small
adjustments to file formats.

[http://www.chrisstucchio.com/blog/2011/mapwritable_sometimes...](http://www.chrisstucchio.com/blog/2011/mapwritable_sometimes_a_performance_hog.html)

------
gommm
One book I'd recommend for this is Michael Abrash's Graphics Programming Black
Book. Of course it's a bit dated but there's a lot of knowledge to be learned
from it and it has good discussions about why and when to optimize...

Plus some of the more dated chapter can be seen as a good history lesson :-)

[http://www.gamedev.net/page/resources/_/technical/graphics-p...](http://www.gamedev.net/page/resources/_/technical/graphics-programming-and-theory/graphics-programming-black-book-r1698)

~~~
angersock
I cannot recommend this book highly enough. It's got a lot of great insight
into how to reason about low-level issues, but Abrash (who is now a programmer
at Valve, I think; first in the credits for Portal 2) isn't just a cranky
assembly hacker. He recognizes when problems are better solved at a higher
level, and tells the reader as much in several amusing chapters.

Also, the man's a fucking beast:
[http://www.mobygames.com/developer/sheet/view/developerId,21...](http://www.mobygames.com/developer/sheet/view/developerId,213/)

------
madrox
In every technology stack, there's always some layer below which is
unnecessary to understand and above which is essential to writing good code.
Where you think that cutoff belongs is usually the layer at which you,
yourself, understand it.

I mean, we live in a world where we have programming languages that COMPILE TO
JAVASCRIPT.

That said, I applaud this guy's OCD fascination with low level optimization.
It's only because of people like him that higher level programmers can even
exist.

~~~
cwzwarich
> In every technology stack, there's always some layer below which is
> unnecessary to understand and above which is essential to writing good code.
> Where you think that cutoff belongs is usually the layer at which you,
> yourself, understand it.

It's important to be able to work nominally at the N'th level of abstraction
but be able to jump to level N - 1 if need be. Computer systems are full of
leaky abstractions, and if you work long enough they will probably impact you
at some point.

~~~
dalke
I read that as a statement of arrogance. We are not überprogrammers able to
delve down to all levels of the system.

When was the last time an Excel macro developer needed to know semiconductor
physics in order to debug a fault lying in the silicon layer?

When has a Ruby on Rails programmer ever needed to know quantum
electrodynamics to track down a hard disk error?

Abstractly you may consider that knowledge to be useful, but concretely,
there's a reason people specialize - each of those domains takes years to gain
competence, and there's a good probabilistic argument against everyone being
"able to jump to level N - 1" all the way down.

~~~
marcusf
I doubt the poster meant it as a recursive statement. Rather, if you work in
Java you should know how the JVM works, if you work in Javascript you should
have a good grasp of how the browser works, etc.

~~~
dalke
madrox wrote: "In every technology stack, there's always some layer below
which is unnecessary to understand", which is a fully recursive statement.
cwzwarich's comment is in the form of an inductive argument. Why should I not
see it as also being recursive?

Most people can't "jump" to a level lower than they are used to working. They
have some idea of the overall concepts, yes, but it takes time to get up to
speed. Most Java people have some ideas of how the JVM works, but few can jump
to (say) the JVM's memory allocator code (even when available) in order to fix
problems.

In any case, does a Java programmer need to know chip layout design in order
to program, or do you agree that that is "some layer below which is
unnecessary to understand"? Yes, someone, somewhere, might have had a problem
in Java code caused by noise leaking across from one circuit to another, but
you can't seriously expect that most people will be able to jump there if
needed.

------
Luyt
It's possible to descend to lower levels, if given enough curiosity. Some
people just want to know how things work under the hood.

I can remember starting to program our brand new PET-2001 in BASIC (at 13
years old or so). After a while I became bored with the sluggishness of BASIC,
and the cramped memory space. Only very small BASIC programs would fit. But
there was a new challenge! I wanted to discover what made the PET really tick
-- how did it understand BASIC? Then I learnt about ROMs, RAMs and the 6502
CPU, and how to program it in machine language. It was similar to programming
programmable calculators, which I'd done since I was 9. But the 6502 worked
much faster, and had a rich instruction set, compared to those old
calculators; and a much better screen, with graphics!

Ever since I was a kid, I always wanted to look inside toys to see how they
worked, for example my 'Brummkreisel'[1] spinning top. At 6, I wanted to open my
hamster to look inside, but my mother wouldn't allow it... (Otherwise maybe
I'd be a surgeon nowadays ;-)

[1]
[http://en.wikipedia.org/wiki/File:Brummkreisel_BW_2011-08-12...](http://en.wikipedia.org/wiki/File:Brummkreisel_BW_2011-08-12_20-37-51.JPG)

------
akgerber
I graduated in 2010 with an ECE degree and learned all of this. Essentially all
of that material is covered in CMU's 15-213 "Introduction to Computer
Systems", which is taken by a huge proportion of the undergrad CS & ECE
program.

It looks like the book from the course, which covers all of that material, is
pretty widely used, too: <http://csapp.cs.cmu.edu/public/adoptions.html>

However, my first job out of college was doing low-level optimization at a
cellular silicon provider, but now I work on the web since it seems like most
hardware companies are bigger and have more suburban locations (neither of
which is my preference), and if you're working on hardware you can't work
remotely. Hardware companies also aren't necessarily the best software
organizations.

------
scott_w
There's a certain irony that the author laments the loss of low-level
developers, yet the site doesn't work without JavaScript.

I like that he provides some links for further reading, and I'll bookmark the
page for that reason alone.

Besides those links, his diatribe seems to be misplaced. For many
applications, it's not necessary to know the difference between floating and
fixed-point.

Perhaps one day, it will no longer be necessary to care about the difference
between decimal/integer/floating-point, as the compiler/interpreter will know
how and when to use the "right" one. And I will welcome it. I've seen
inexperienced developers use float(val) when counting money, and no amount of
training will stop the next generation of new programmers from making the
same mistakes.

~~~
lucian1900
I disagree about automation. Compilers should never silently lose information.
Lisp had it right a long time ago: default to fixed point + real fractions and
only allow floating point math when explicitly requested.

~~~
scott_w
I think, with current languages, that would probably be the right way to do
it. It's not beyond the realms of possibility for a language to try and do the
right thing (with sane and predictable rules of course), and let the user
override it with their preference.

The hardest part would be getting those sane defaults, otherwise everyone will
override them anyway, and you're back to square one or worse.

~~~
lucian1900
Sane defaults shouldn't be hard at all. Just like in Python you have to do
Decimal('1.0') now, we could just as well have to do Float(1.0) to override
the default.

Clojure has with-precision [1] and Common Lisp has defaulted to fixed point,
fractions and explicit ^float type annotations.

[1] [http://clojuredocs.org/clojure_core/clojure.core/with-precis...](http://clojuredocs.org/clojure_core/clojure.core/with-precision)
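To make the contrast concrete, here's a small Python sketch using the stdlib `decimal` and `fractions` modules; the `Float(1.0)` opt-in above is hypothetical, so this only shows the exact types the comment argues should be the default:

```python
from decimal import Decimal
from fractions import Fraction

# Binary float: silently inexact.
print(0.1 + 0.2 == 0.3)                                      # False

# Explicit exact types, opted into by hand today:
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))     # True
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True

# Lisp-style: a third stays a real fraction instead of decaying to a float.
third = Fraction(1, 3)
print(third * 3 == 1)                                        # True
```

Under the proposed defaults, the literal `0.1` would produce the exact type and the first line's surprise would disappear.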

------
gaius
Because programmers no longer grow up on 8-bit systems programming in assembly
language; they jump straight into Java or whatever - they've never gotten
closer than 10 layers away from the actual CPU, it's all a magic black box to
them.

~~~
cageface
The mind can only take in a window of information of a certain maximum width.
Abstracting away the lower levels is crucial in order to free the attention
for the higher level questions.

Software is solving much more difficult problems than ever before thanks to
this evolution to higher abstractions.

~~~
kstenerud
Yes, and everything's fine until something blows up. Then you need to know
what's going on at a lower level to be able to fix it.

Case in point: I once had a co-worker show me a Java crash report from a crash
that nobody could figure out. Every now and again, the server, Tomcat, JVM and
all, would just die, spitting out a stack trace that went into native land.

One really nice thing on the crash report was a full register dump, stack
address dump, and a dump of the memory around the PC. So I disassembled the
memory, figured out that it was calling memcpy when it died, and lo and
behold, one of the register operands was 0! A comparison of the stack
addresses showed that it was dying in a library used for biometric fingerprint
based authentication.

The vendor kept insisting that the problem was on our side (we had a JNI
library of our own that called their library, but I desk checked it twice and
concluded that it was fine), so I disassembled the whole library and traced
what it was doing. It turns out that certain kinds of incomplete fingerprint
scans wouldn't trigger a rejection at the high level, but would cause a
rejection at the lower levels (almost... it would clear the pointer to the
data but would then return TRUE instead of FALSE). The library would then
carry on its merry way passing a null pointer lower and lower until it died in
memcpy, taking the JVM with it.

~~~
cageface
I think it's valuable for most teams to have one member that can understand
raw crash dumps but this is an age of specialization and I'd rather have the
rest of my team each learning some other niche.

~~~
kstenerud
Agreed. However, I do believe that each person should have a basic
understanding of at least one level below where they're working.

------
hobin
The low level programmer is obviously not gone, as other people have already
mentioned. Rather, what I think most of us who've been through a few decades
of development can see is that the hobbyist has become a different kind of
person. Years ago, all the hobbyists would go "Oooh! I _finally_ managed to
print ASCII on the screen. Ok, now let's spend another day trying to get it
working in color!" In 2012, there is (in most cases) very little point in
doing this, and the 'hobbyists' we see now are mostly building websites 'n
stuff.

------
martincmartin
There are lots of jobs for low level programmers outside of games too. I've
worked for ITA Software (mentioned by PG in Great Hackers), Endeca (recently
bought by Oracle for $1B, and we had nothing to do with photo sharing), have
been wooed by robotics companies, and now work on file system caches with a
peer-to-peer component.

~~~
pagekalisedown
I have the impression that new grads aren't attracted to low-level programming
because of the greater proportion of higher level programming jobs.

I think low level jobs need more visibility. Perhaps a monthly "Who's Hiring"
thread for low-level jobs would help.

~~~
cube13
I think the main issue is that a lot of CS programs don't really teach these
subjects. The top-rated CS programs teach them (I learned all of the subjects
listed in TFA while I was an undergrad at UIUC), but I think a lot of CS
degrees are just Java or C# programming training now.

~~~
diek
Computer Science programs that are taken seriously are certified by ABET,
which requires computer architecture and operating systems courses that cover
most of the topics he lists.

Many employers (specifically Intel, but I've seen many others) specify an
ABET-accredited BS program in education requirements for job openings.

------
sgaither
I looked at the title and thought "good riddance", assuming it meant mediocre
maintenance-only programmers were being replaced by automated
programming/systems, just as low-level lawyers can be replaced with document
scanners/databases.

But systems programming seems to be an essential role that uptime-concerned
operations will always want on retainer...

------
kalleboo
The low-level programmer isn't gone, but over the past decade we've seen the
meteoric rise of the high-level programmer.

For instance, working on web apps, when an extra server costs $400/month (4
developer hours or less), what value is there in spending days hand-optimizing
an algorithm (into a form that'll be harder to debug later)?

Another example: writing mobile apps. On a phone today you have 1 GHz of
power, but still a very limited interaction model. It seems basically only
games actually need to optimize for performance.

~~~
yummyfajitas
_...what value is there in spending days hand-optimizing an algorithm...?_

Reducing latency?

Throwing servers at a latency problem doesn't usually help (unless a server is
overloaded).

------
Vlaix
I've always felt attracted to low-level systems and I'm currently one of many
students in Comp. Sci. For the moment all my love goes to all kinds of bare-
metal stuff, optimization tricks, etc. I'd like to go all the way through
with it, but more abstracted development (although the term seems improper to
me, since being able to grasp a machine's ways is a valuable effort of
abstraction itself) seems like what is and will be expected of me.

So I'm asking: career-wise, are the optimization ghettos (video games,
embedded systems, critical systems, etc.) all that one can expect when
refusing to code and manage projects around buzzwords, or is there actual
innovation/change and a shiny future to come for bit nerds?

~~~
georgieporgie
I think the only people I've seen using these sorts of low-level techniques on
a regular basis were electronic engineers who were writing embedded code
(specifically: for Wacom-style tablets). Then again, I've never worked with
games programmers, but my friends who are games programmers don't seem to come
near this level.

------
jmspring
I started life mostly doing low level stuff - and still do as needed. But, it
has been about 4 years since needing to do any assembler (which was for
windows mobile). Embedded systems still need some of this.

That said, my day to day life lives more in the realm of scripting/interpreted
languages and occasionally diving into C/C++. Even in C++, though, frameworks
like Boost make C++ seem higher level than it is.

Today, we see a need for people who can work on Cobol, and companies providing
Cobol=>Java bridges. I suspect the lower-level programmer may occupy a
similar niche in a few years...

------
kazuya
Your low level is my high level.

Even assembly code appears high level when you are looking into how the
electrons run through silicon and copper. But it's not realistic to cover all
the stack down to the atoms. So at which layer can you stop digging?

For a programmer, the processor architecture (including the instruction set
architecture and the bus architecture) level would be fine. It is the
strongest abstraction barrier and you rarely need to look below it.

That said, if you are working on embedded systems, knowledge of digital
electronics is almost a must, and some analog bits really help.

------
krollew
A wise rule says "Don't tweak efficiency until needed". Since we have fast
CPUs/GPUs, and faster ones come out all the time, tweaking efficiency is less
often needed. I think this rule is sometimes overused, but what can I do? :P
Anyway, there are still people who know a lot about internals even if they
don't use it at work. :) Currently I code in Perl and can still tell you
something about paging internals. By the way, we had a quite nice class on it
at university. :)

~~~
vonmoltke
1) My problem with this attitude is it leads to, as gaius said elsewhere in
this thread[1], "requiring a 2Ghz quad-core with 8G RAM to edit a simple
document". Developers should not be relying on improvements to their users'
hardware to boost their software's performance, at all.

2) My (soon to be former) work is in signal processing software for military
applications. Our customers do not want to buy and upgrade hardware just
because some new algorithms don't fit. This creates a situation like the
console world (as klez mentioned in his reply) and the mobile device world,
where hardware is basically fixed for some period of time for a given user.
You have to pay attention to the efficiency of code in these circumstances if
you want to expand the capabilities of your applications in any meaningful
way.

[1] <http://news.ycombinator.com/item?id=3827096>

------
speednoise
Meanwhile, half of Hacker News is implementing DCPU-16 compilers. The low
level programmer is still flourishing, there's just a lot more people
programming.

------
aswanson
It looks like several people have interest in this as a career path.
Personally, I'm tired of it and am trying to move up the stack. But for those
interested in terms of a career path: <http://www.eetimes.com/design/embedded>

------
andyfirth45
I'm surprised this article is still circulating after the hits it received
last August. If anyone would like to discuss the subject further or needs some
help breaking into a role like this then please feel free to ping me directly
through linkedin/facebook

\- andy firth (the author)

------
gte910h
Low level programming _of PCs_ perhaps.

There is more embedded code and applications out there than ever before, and
tons of mobile devices that need similarly low level code.

------
sedachv
I'm surprised no one has mentioned the story of Mel yet:
<http://www.pbm.com/~lindahl/mel.html>

Low-level programmers have been dying ever since the first assembler was
invented.

~~~
jff
That's because the story of Mel has probably already been mentioned at least
20 times this week :-)

------
ZephyrP
Come, come

Up and coming low-level programmers still exist. I just turned 20 this past
January and I've been doing x86 at least once a week for 5 years now. In
fact, my meal ticket -- IDA Pro -- sits whirring away as I type this,
calculating the relocations of some PIC in our favorite browser.

While it may be true that the majority of people with assembly knowledge are
slaving over heap sprays more than swizzling pointers or tackling alignment
issues these days (or any other "real" low-level work), all in all, auditing
isn't half bad, and I suppose the markets agree.

I guess, to that end, there's going to be no shortage of low-level junkies as
long as kids still think hackers are cool. Kids are going to be attracted to
the cyberpunk glam of hackers (or, naturally, Angelina Jolie's smoking body in
1995). Of course, as it goes, if you want to be any good at that sort of
thing, you have to write memory corruption bugs, and if you're writing memory
corruption bugs then low level is a necessity.

So while it may be true that producing anything save VX crypters in assembly
is a dead end, the deep knowledge is immensely rewarding, and while it's true
you might not be able to erect castles, you can perform some deep machine
magicks that will make a few people's heads turn.

You unlock many things that many programmers simply dream of -- the ability to
reverse applications, to tweak some 'interesting', decidedly non-optimal
optimizations (LEA is always faster!) and to find cleverly exploitable bugs
in applications that you don't have the source to.

You can optimize in ways that other programmers, no matter how skilled in any
other language, simply cannot. A recent and related anecdote is my database (
zv.github.com/artifact ), which once upon a time was extremely slow during
the mkey exchange when it diffed the hashes against its own: it was copying
every hash and comparing it to every other hash it had received thus far (for
when quadratic time is too fast). Normally this would have been a few-hours-
long job to rewrite some pithy external library, write tests, and somehow
implement speedy string manipulation in Erlang. Nope: HiPE allows you to treat
an object file as a module as long as it has the correct export interface. My
humble database is now an order of magnitude faster on convergence (this may
speak more to the performance of string processing in Erlang than to any
personal prowess, but still).

So, in conclusion Mr. Andy Firth, it's probably true that the vast majority of
people who you are interviewing aren't very good, because hey, that's the
central limit theorem at work!

However, I assure you, somewhere out there, there's a boy and his linker,
learning low level the hard way. If I were you, I'd hit up security and
reversing/crackme forums, you're practically guaranteed to find a wizard or
two.

That's the Circle of life :)

------
tkahn6
For what it's worth, these topics are taught in university to a certain extent.

I took a class last semester where we had to write an assembler and virtual
machine for MIPS (in C), and had exams on branch prediction, instruction
pipelining, and program performance as impacted by different types of caches.

But really, there is a lot of information to absorb. Intel's been doing heavy
research on processors for decades. It's one thing to be familiar with the
concept of branch prediction and instruction pipelining, it's another thing to
know enough to actually have a [positive] impact on the performance of a
program.

~~~
mjn
I can't remember where I saw it, but there was some mildly controversial talk
from someone in industry to that effect at a CS conference, chastising
universities for teaching their students about how CPUs work by using a model
based more or less on 1980s RISC machines, which in his view was not very
relevant to how a modern CPU works, and gave the wrong performance intuitions.
I don't remember if he proposed a solution, though. In computer-engineering
majors you solve that by having multiple courses, which work their way up to
the modern complexities, but in a one-semester architecture course as part of
a CS major, there isn't infinite room.

~~~
klez
If someone could remember what mjn is talking about, I would really appreciate
a link to the talk.

------
ranit8
Previous discussion: <http://news.ycombinator.com/item?id=2857422>

