
How to Succeed as a Poor Programmer - Impossible
https://psgraphics.blogspot.com/2019/09/how-to-succeed-as-poor-programmer.html?m=1
======
overgard
The "Avoid Learning Anything New" advice is insane. (Well, practically all of
it is, but that one really stands out).

I think the exact opposite advice is far better: never assume the way you know
how to do something is best, and always be on the lookout for what others are
doing that might be better.

Here's the thing about learning: the more you learn, the easier learning the
next thing becomes. You form links, insights on relationships, new concepts
that apply to old things, etc. If this guy thinks learning is such a burden,
it's probably because he refuses to learn anything in the first place.

If he thinks he's a poor programmer, it probably has less to do with innate
ability and everything to do with the attitudes he gives as "advice" in this
blog.

~~~
geowwy
He is right though – new tech nearly always disappears.

Example: Learning CoffeeScript was a total waste of time. Learning jQuery
helped me for a few years, but now jQuery is basically useless to me.

Based on past experience, I strongly suspect the same will happen with React,
Rust and a bunch of other new exciting tech. There are countless examples
besides the ones I mentioned.

But on the other hand, the time I put into mastering SQL or Unix will probably
continue benefiting me for the rest of my career. Even C will continue to
benefit me, even though it'll only ever be a small part of my job.

So I would modify his rule: _Avoid Learning Anything New – Learn Something
Old_

~~~
factsaresacred
> _Learning jQuery helped me for a few years, but now jQuery is basically
> useless to me._

'A few years' is a solid return on investment. Besides, the reason jQuery is
now useless is that JavaScript now has those capabilities, so your skills
were 'grandfathered' into ES6/ES7.

Learning an applicable skill is almost never useless.

~~~
geowwy
Yeah, I'm not saying it wasn't useful at the time, just that it's not useful
now. As opposed to learning SQL or shell scripting, which will probably be
useful for my entire career.

~~~
rwnspace
I'm convinced of 'learn old > learn new' in general, but I think there are
some interesting edges: the older something is, the larger the gap (maybe
chasm) will be between 'basic competence' and 'venerable expert'. Also, the
more likely it is that all the common problems you'll face have already been
posted about on the internet, and that the quality of tutorials and
explanations will be superlative.

While the payoff of learning old > new is typically much higher (search: Taleb
Lindy effect), I think matching your learning to the 'human api' is more
important. For me, learning is very emotional: when I feel a sense of
curiosity and intrinsic drive to know, I'll follow my nose, spend my time
where it takes me. I want to spend this kind of energy in a certain way.

When I find myself with an instrumental cause or external need to know, it's
most likely going to be because it's something typical, something old, in
which case learning about it and being useful with it will require less of my
spirit and drive to crack.

New language/tech fanatics tend to pressure through both sources (a la "look
at our ingenious design breaking paradigms" and "it's so good your boss will
want you to work in it (well, soon)"). Often the former argument is stronger,
so it appeals to your curiosity - in which case you're best off searching for
the useful kernel, followed by a swift exit in order to preserve your sense of
discernment. Should you return there, provoked by your general interest, that
ought to be your indicator of importance. How hyped you felt that one
afternoon you learned about it after reading HN comments is likely not.

On the other hand, the 'should learn' brigade will tend to target the latter
source of pressure (employability & centrality). If you find yourself feeling
resistant to this and force yourself into it anyway, you'll easily burn out
and douse your curiosity for the day. I've lined up learning resources for
languages I find fundamentally dull multiple times, and made only very
shallow dives into them.

When choosing what to learn, your built-in heuristics will tend to serve you
much better than either a long list of common 'shoulds' or a tangle of
overhyped 'musts'. Let natural forces do their thing in shaping what things
will be presented to you: avoid paying much attention to the loud people where
marketing, shills and zealots tend to roam.

------
jugg1es
This is such a bizarre post. If you have the self-awareness to admit this
about yourself, you are probably not a poor programmer.

Poor programmers do things like allowing an incoming request to spin up
unlimited concurrent threads. Poor programmers erroneously throw exceptions
on any operational deviation - even if it can be handled without error.
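
For illustration, the usual fix for the first one is a bounded pool instead
of a thread per request. A minimal Python sketch (handle() and on_request()
are made-up stand-ins for the real handler and entry point):

    from concurrent.futures import ThreadPoolExecutor

    def handle(request):
        ...  # the application's actual work goes here

    # Bad: an unbounded thread per request, e.g.
    #   threading.Thread(target=handle, args=(request,)).start()
    # Better: a fixed-size pool caps concurrency; excess requests
    # queue up instead of exhausting memory and the scheduler.
    pool = ThreadPoolExecutor(max_workers=8)

    def on_request(request):
        pool.submit(handle, request)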

Most importantly - poor programmers do not learn from their mistakes and are
unable to see that they are poor programmers.

~~~
Hermitian909
Why should being aware of how lacking you are as a programmer have much
bearing on whether you're good or not?

When I started programming seriously I had a mentor who gave me a long list of
my shortcomings. It's taken _years_ of hard, dedicated work to rectify most of
those problems and I still have a huge distance to cover to be as good as I
want to be.

Being a good programmer is a skill set acquired with hard, concentrated
effort over time, not just a good attitude with some self-awareness.

~~~
disjhshava
In general, people tend to be overconfident in their beliefs and too slow to
incorporate new information. There is experimental evidence to back this up
(pdf warning:
[http://www.researchgate.net/profile/Baruch_Fischhoff/publica...](http://www.researchgate.net/profile/Baruch_Fischhoff/publication/230726569_Knowing_with_certainty_the_appropriateness_of_extreme_confidence/links/00b4952b854b29281c000000.pdf)).
I think this applies to engineers as well as anyone else.

------
human20190310
It's a great, succinct post. _Deeply_ uncool. A programmer should be modest
about their skills, skeptical about new-new things, eschew bullshit, and
terrified of dependencies. I buy the whole thing.

Moreover, I trust the advice of someone who rates themselves poorly more than
someone proclaiming that they're a hotshot.

------
albertoCaroM
ALAN. Avoid Learning Anything New

Terrible advice and mindset. It's good to learn for pleasure or curiosity.
This way, you can enjoy reading SICP, Effective Java, Code Complete... Or you
can use a new system: Linux, Mac, Android, iOS. Doing so shapes your thoughts
and mind; you learn new ways, practices, and patterns. You don't have to use
them just because you learned them, but even so, they will be useful to you.

~~~
keithnz
It's not terrible advice. You mentioned SICP; funny thing is, years ago when
I put someone on to SICP, the next thing you know, we started getting these
weird-ass recursive functions in our production code... Also had similar
issues with Design Patterns: all kinds of overly engineered class structures
started popping up. On the side of ALAN, I used to work with a guy who did a
lot of machine vision research; he coded everything the way he knew how for
years, and he was super productive doing it. I didn't really like the code.
But he got stuff done. Having said all that, learning stuff is still good;
practicing stuff on non-critical code is the next step.

Taking some lessons from BJJ (Brazilian Jiu-Jitsu): when you compete, you go
in with your A game - the things you have practiced, the things you have made
work over and over again, your high-percentage moves. Over time, you add
things into your A game as you gain experience in making those techniques
work. You may occasionally find yourself in an odd situation in which some
"new" technique is screaming to be used, and you might try it out. But
usually, when you encounter a situation you don't really know, you start
working at changing the problem into something you do know, then throw your
A game at it.

I think programmers should understand what their A game is.

~~~
overgard
You lost me at "next thing you know in our production code we started getting
these weird ass recursive functions...". I don't mean to be mean, but
recursion is pretty fundamental to a lot of algorithms, and it sounds like the
SICP coder was just writing stuff more advanced than you were comfortable
with.

~~~
Lutger
Might also be an inappropriate application of recursion in a language where
loops are more common. I've seen people who learn new stuff go overboard with
clever code in production that is less readable for their peers.

Recursion shouldn't even be considered advanced though, just like classes and
first class functions. I'm all for making code as boring as possible, but to
me understanding at least these concepts is a requirement of entering the
profession of software development.

~~~
albertoCaroM
Sometimes recursion is the right solution. For example, a backtracking
solution is simpler with recursion than without it.
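
For illustration, a minimal backtracking sketch in Python (finding subsets of
positive numbers that sum to a target), where the recursive version is hard
to beat for clarity:

    def subsets_summing_to(nums, target, chosen=(), start=0):
        """Yield every subset of nums[start:] summing to target.
        Assumes nums contains only positive numbers."""
        if target == 0:
            yield chosen
        for i in range(start, len(nums)):
            if nums[i] <= target:
                # choose nums[i] and recurse; backtracking is implicit,
                # since 'chosen' is rebuilt on each call
                yield from subsets_summing_to(
                    nums, target - nums[i], chosen + (nums[i],), i + 1)

    print(list(subsets_summing_to([2, 3, 5, 7], 10)))
    # [(2, 3, 5), (3, 7)]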

------
robbya
Two things jump out to me:

> Only learn something new when forced

I think there is a balance between always doing things in a new way and
always doing things as you've done them before. When engineers are pushed too
hard on deadlines, some will avoid learning new things as a short-term
approach for quick delivery. If you're in that environment, you aren't going
to grow.

> Avoid linking to other software unless forced. It empirically rarely goes
> well.

Source? The rapid growth of npm, rubygems, and other ecosystems suggests
otherwise.

I was hoping this would talk about how to support your co-workers (code
review, culture, cohesion) or how to succeed at non-engineering tasks other
'good' programmers may overlook.

~~~
BigJono
Learning is pretty simple; the golden rule is "Don't learn new tech, learn
how to solve new problems".

Learning how to use something like Vue when you already know how to use React
(or vice versa) is stupid because they both solve the exact same problem in a
reasonably similar way with a reasonably different API.

A better example might be something like Postgres and DynamoDB, since it can
go either way. If your problem is 'I need a database for a CRUD app' then
learning the second one is stupid because they both solve that problem just
fine. But if your problem is 'I have a complex use case, my data is in a bad
format for the one I'm using and I'm taking a huge hit in performance' then
learning the other one is a reasonable choice and probably not a waste of
time.

Basically whenever you take the time to learn something, make sure you're
getting something out of it in terms of end results. It feels good to just
learn more of the same tech, and if the API is different enough it'll feel
like you're making progress, but you're probably not.

~~~
soulofmischief
> Learning how to use something like Vue when you already know how to use
> React (or vice versa) is stupid because they both solve the exact same
> problem in a reasonably similar way with a reasonably different API.

Settling on Mithril as my front-end library was a long journey from framework
to framework. If I had stopped at Vue or React, I'd be much worse off for it.

Really, if I had stopped at the first front-end library/framework I used in
web dev, I'd still be using PHP. Sometimes you need to move on, and if later
asked about your technical choices in a professional environment, you need to
have a professional answer that comes from wide experience.

~~~
BigJono
> Really, if I had stopped at the first front-end library/framework I used in
> web dev, I'd still be using PHP.

Well, I did specifically give an example of two techs that do similar things
that might be worth learning if they're different enough that one solves a
problem the other doesn't.

I don't know anything about Mithril, but if you're right that you'd be 'much
worse off' then that would be a similar case, no?

~~~
soulofmischief
One could argue that it's not "different enough", seeing as it ostensibly
functions the same as React from a bird's-eye view. However, it takes
learning both to understand the strengths and weaknesses; an uneducated
assessment wouldn't be enough. All I'm saying is that if you're curious, find
out. Don't drag your company into it, but learn in your free time. It doesn't
actually take all that long to pick up a framework.

------
gabrielblack
> Only learn something new when forced

This could be devastating, especially in a small or medium-sized company. I
know of successful companies (I mean companies with successful products) with
a C/C++ stack they considered "good enough", so they didn't change anything:
the C++ standard, the architecture, the structures. Often that line of
conduct was supported by management that viewed any change or improvement as
a cost. The result is always the same: one day they wake up and realize that
the "product" is a pile of crappy legacy code. I know of some cases. One of
those companies was bought by a bigger company that asked them to bring the
stack up to modern standards, with disastrous results, because the
programmers weren't skilled enough to port the code base to modern
standards/architectures. In another case, the owners sold the entire division
to another company that was more interested in the clients than in the
product; after checking the status of the code base, the buyers hired a group
of consultants who rewrote everything in Java, with results that you can
imagine.

~~~
ensiferum
Just because something is old or uses an old framework or an old standard
doesn't make it crap. Crappy code is crap, but its age doesn't make it crap.
Code doesn't rust or deteriorate by itself.

For example, say you write good code now using C++14. If it's good code now,
it will continue to be good code 20 years later. There's no property that
adds bugs or "crappiness" to the code as the years go by. The invention of a
new "C++40" standard doesn't obsolete old code or turn it into "crap".

~~~
gabrielblack
Age does make it "crap", but not in the way you think. I can speak about C++.
C++ code written to an older standard, let's say pre-C++11, that is 20 or
more years old is difficult to maintain, because the new generation of
programmers is generally not interested in legacy code or in learning old C++
standards, and skilled programmers don't want to be relegated to being
eternal maintainers of old projects. For this reason, projects die of
asphyxiation. We have that good code base in COBOL, OK, but the point is: who
cares? "Who" is what matters when you need new programmers. Besides, language
considerations aside, the architecture is important: we live in a world of
microservices, and old code not updated to modern paradigms can become
"crap", even if it's the most elegant code.

------
sixtypoundhound
This is probably really bad but I totally empathize (and agree?)

KISS => everyone should do this

YAGNI => 95% of the shit I add is ignored (and I've got fairly objective proof
that I can develop good projects)

ALAN => Master SQL, one scripting language, and how to use Stack Overflow.
You'll be the most useful dev on the team.

I agree with most of the rest.

Bonus point: never be afraid to tell a business person that what they want is
an exceptionally bad idea. It usually is.

------
anaphor
I don't think most of this applies only to "poor programmers".

The advice about KISS is something many "good" programmers would benefit from
following. Likewise, I've seen good programmers recommend using arrays as your
first attempt at a data structure as well (e.g. Jonathan Blow,
[https://www.youtube.com/watch?v=JjDsP5n2kSM](https://www.youtube.com/watch?v=JjDsP5n2kSM)
)

------
gigatexal
I’ve been in a rut lately. Missing obvious things, shipping less than my best
code. It’s come to my attention that I’m not as good at programming as I am
at crafting database queries and tuning them, but that’s such a small niche,
given how easy it is to pick up SQL, that I’m having an existential crisis.

So I’m working on getting better and getting more confident.

~~~
her_tummy_hurts
You could make a career out of just SQL. I know my company could use a decent
SQL programmer. We’ve got hundreds and hundreds of procs written by SQL
amateurs.

~~~
denton-scratch
Correct. Tuning SQL queries can deliver performance improvements of several
thousand percent, in exchange for just a few hours' work. EXPLAIN is your
friend.
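
For anyone who hasn't tried it, here's a self-contained sketch using Python's
built-in sqlite3 module (the table and column names are made up; the same
idea applies to EXPLAIN in Postgres or MySQL):

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(i, i % 1000) for i in range(100_000)])

    query = "SELECT * FROM orders WHERE customer_id = 42"

    # Without an index: the plan is a full table scan.
    print(db.execute("EXPLAIN QUERY PLAN " + query).fetchall())

    db.execute("CREATE INDEX idx_customer ON orders (customer_id)")

    # With the index: a cheap index search instead.
    print(db.execute("EXPLAIN QUERY PLAN " + query).fetchall())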

------
denton-scratch
I totally agree with this stuff.

I was a professional programmer (now retired), and not a very good one. I'm
familiar with Dunning-Kruger; I've worked with good programmers and bad ones,
and I can tell the difference. Very good programmers are few and far between.

I noticed that most of my colleagues were keen to learn new shit, like new JS
libraries, new languages, new source-code management systems and so on. I
think I lost interest in newness (for its own sake) about 15 years ago; I got
turned over one time too many by a vendor that decided to withdraw support for
a programming language that I had committed myself to.

I recommend retiring from programming. You don't have to keep up with the
young whippersnappers any more, you can carry on coding in bash, you don't
have to use git, Docker, or weird NoSQL systems. I realise that some of this
new-fangled stuff is better than FORTRAN or VB6 or whatever; but learning a
new programming system every 6 months is a total waste of time and effort. Get
to be good with a few useful tools, then concentrate on people skills.

Or give it up completely, and learn cooking, or drumming, or interior-
decorating.

I think there may be some rationale behind ageism in software development. For
the first 25 years or so I got better at it, but I think after I turned 50 I
started getting worse. Or at least, I got better slower. It took me longer to
learn new tricks.

But I really think that some of those new tricks were not worth learning - for
example, you can stuff Node and that ridiculous dependency system where the
sun don't shine. JS is a very clever language; but cleverness isn't always
best.

~~~
glloydell
> you can stuff Node and that ridiculous dependency system where the sun don't
> shine.

You're really taking embedded computing to the next level

------
noonespecial
If you ALAN, you can't KISS. You won't know what the simplest thing is.
You'll end up with miles of nested "ifs" instead of 4 lines of recursion, and
a gazillion "else ifs" instead of a case statement.

He's got one thing right. He's a poor programmer. "Success" must be mighty
loosely defined here.

------
rdiddly
I was all curious to read about _monetarily poor_ programmers. Instead, this.
I agree with part of it. The goddamned arrays really trigger my Refactoring
Legacy Code PTSD though. Any time you make an array, you need to make sure
that damn thing is big enough. Do you even know in advance how big that
bastard needs to be? You could make it just plenty big, I suppose, like an
asshole. Just allocate 10,000 4-byte blocks like that shit just grows on
trees. Then change it to 15,000 the first time someone has 10,001 things and
crashes your shitty shit. Also any time you look at your shit you're gonna
need a goddamned shitty index. A whole 'nuther variable! Get some generics ya
dufus!

~~~
Tade0
Former monetarily poor[0] programmer here. AMA

[0] $7k per annum take-home-pay.

------
Noumenon72
> If you are bad at programming, you are still programming, something that
> very few people can do.

That's heartening. I bet a very bad doctor or lawyer can still do some good
somewhere too, as long as they don't convince themselves they are better than
they are.

------
benboughton1
I think the best way to make it as a 'poor programmer' is to be a
domain-specific 'poor programmer'. I am bad at nearly all my programming in
all my projects, but probably just as efficient at getting jobs done as a
'good programmer', because the feedback loop is fast. I can hack away until
something works and then sometimes polish it off once it functions. I suggest
working with really popular tools; there is usually a solution written up
online, easily found through Google. Such are the times.

------
manmal
Please don’t store everything in arrays. If your language supports tuples,
use them. If it doesn’t, at least use a struct if possible. Basically:
constrain the possible input and output space of your functions to the
smallest possible set of values. That way, you reduce the room for errors
significantly. If you use arrays everywhere, many errors might even go
unnoticed until an edge case occurs.
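
For illustration, a minimal sketch of the difference in Python (the User type
here is made up):

    from typing import NamedTuple

    # Loose: the meaning of each slot is implicit, and nothing stops
    # a caller from passing a list of the wrong shape.
    row = ["ada", 36, True]

    # Constrained: fields are named and typed, so the input space of
    # any function taking a User is exactly what it claims to be.
    class User(NamedTuple):
        name: str
        age: int
        active: bool

    def greeting(user: User) -> str:
        return f"Hello, {user.name}"

    print(greeting(User(name="ada", age=36, active=True)))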

------
oneepic
A very interesting post, since it poses a very conservative, tried-and-true
approach to programming productively. I know a lot of programmers tend to
arrive at those very same conclusions in their careers. That said, I think
the article and most of the commenters here don't necessarily conflict with
each other; every piece of advice can apply or not, depending on the
circumstances.

On a larger scale, it's remarkable that a lot of the general programming
advice given today is more of a heuristic/guide than a tautology, but many
people still assert that their advice is The One Right Way To Do Things. It's
important to understand that the vast majority of all this advice comes from
real examples of what worked and what didn't. So, perhaps the best thing one
can do as a programmer, in any domain, with any technology, whether you're a
great programmer or a poor programmer, is to listen to it all with zen and a
grain of salt.

------
aledalgrande
I dislike this mindset too. For any other profession it would be crazy, but
for programming it's fine? What happened to apprenticeship and mastering the
craft? Why is it fine to be considered a fool if you are not an expert
programmer from the start? Should a blacksmith never make a sword because at
the beginning they can only make nails?

------
kissgyorgy
This guy is poor because he is lazy as hell. The most important thing about
programming is constantly learning new ideas, tools, and paradigms, solving
new problems, improving as a person, and learning as much as possible. If you
don't do these things, and don't like to think or solve problems, you will
indeed stay poor.

------
viburnum
Kids, trust your elders on this one, the author is correct.

------
dagw
With regards to "Avoid Learning Anything New" I would be very surprised if
Pete Shirley follows that advice when it comes to things actually relevant to
his job. People who avoid learning anything new don't get to be senior
researchers at Nvidia or have the following publication list:
[https://scholar.google.com/citations?hl=en&user=nHx9IgYAAAAJ...](https://scholar.google.com/citations?hl=en&user=nHx9IgYAAAAJ&view_op=list_works&sortby=pubdate)

I guess what he's trying to say is Avoid Learning Anything New if it's
ancillary to your job, and instead focus on your core competences.

------
alephnan
> Finally, let the computer do the work; Dave Kirk talks about the elegance of
> brute force (I don't know if it original with him). This is a cousin of
> KISS. Hardware is fast. Software is hard.

Yeah, I don’t know. The cloud bill is going to be expensive.

~~~
aurelwu
It is also terrible advice with regard to energy consumption.

------
Tade0
One problem I have with 1. and 2. is that they tend to escalate.

Example: "We're doing a combo-box component, but to keep it simple let's not
support multiselect."

Retrofitting such a feature when You Eventually Do Need It (YEDNI?) is
neither pleasant nor simple.

My take is that one's skill level is not the most relevant thing - we have
code reviews to deal with exactly this problem.

What ultimately matters is whether you're adding or subtracting value.

I've worked with people who were _aggressively incompetent_. As in: they had
bad ideas and were insistent on implementing them, even going as far as
bypassing the regular review cycle.

------
29athrowaway
ALAN (avoid learning anything new) is bad advice if taken literally.

Learning fundamental knowledge such as programming paradigms, data
structures, and algorithms is something that will likely not become obsolete
on a year-by-year basis. You should totally learn this if you have the time.

Now, memorizing every API in a framework that is likely going to change in 6
months is probably not going to be very useful in 5 years (but it can be
beneficial for achieving your short-term goals and moving your career
forward).

It's not about "avoid learning anything new", it's about being tactical about
what to learn.

------
markus_zhang
std::vector everything!! TBH I feel anything beyond a simple
array/vector/stack/BST/queue is pretty advanced and should only be touched by
advanced programmers...

------
wyclif
"Programming is rather thankless. You see your works become replaced by
superior ones in a year. Unable to run at all in a few more." ~ why the lucky
stiff

------
segmondy
This post is making the case for a specialist; it's just worded poorly.
There's a case to be made for such folks, and there's a case to be made for
generalists... and if you're super smart with tons of memory, you can be a
T-shaped person. It doesn't matter which path you choose; you can thrive in
any of them.

------
rr-geil-j
I wish he had just used the word 'bad' instead of 'poor'. I thought this was
about cash-strapped programmers.

------
oytis
ALAN might be good advice, but software is such a miserable place unless you
can learn and apply new stuff. My boredom would get me if I did things the
same way every time.

UPD: just realized who the author is. He _does_ learn a lot of new things,
just not in software development, because it's not the focus of his career.

------
michannne
Rather than Avoid Learning Anything New, I'd say avoid implementing anything
new. Options always come in handy: you don't need to implement every new tech
you learn about, but knowing it exists can make the difference between an
impossible feature and a possible one.

------
mutant_rvalue2
There is no formula for succeeding at anything. The more people who know
something, the less it's valued. It's just random; accept it. Just a tech
flavor of random. You can always be a salesman, but you can't be a good
salesman and a good scientist.

------
KirinDave
I wonder if the author has considered the idea that it's not a lack of effort
or some inherent cognitive gap that keeps them from being "good" programmers,
but perhaps these beliefs instead?

~~~
noughtme
I think it’s time. Peter Shirley decided to specialize in computer graphics,
leaving the programming to people who would implement his ideas in libraries,
and the production developers who use those libraries.

~~~
KirinDave
Interesting, but does that change anything though?

I had a famous cryptographer professor. He refused to learn anything past
Pascal. He freely admitted he was no longer a programmer and that he shouldn't
be doing it.

That seems like a different message from the messages presented here.

------
BrissyCoder
Most of this seems like good advice for all programmers.

Except for the ALAN thing. I think that should be modified with a few
qualifications - don't learn anything you will use fewer than 10 times, or
something like that.

------
em3rgent0rdr
If "Be aware that most coding advice is bad." is true, then how confident
should I be that this "How to Succeed as a Poor Programmer" coding advice is
good? ;)

~~~
djmips
Yes.

------
blue_devil
> I discuss how to be _an asset to an organization_ when you are not a good
> programmer.

Sounds like it's advice on how to sandbox yourself in the interest of the org.

------
rukuu001
I assumed it was satire

------
codr7
Diligent practice?

------
known
Brutally true :)

------
the_cat_kittles
I love this. This is a _good_ programmer. Sure to piss off a lot of people on
here who think that because they have big salaries they are good at
something.

------
bsder
> 4. Make arrays your goto data structure.

The others are arguable. But, in 2019, this is flat out bad advice.

Your "go to" data structures should be hash tables about 70% of the time and
vectors about 30% of the time. In 2019, memory and CPU are so stupidly
abundant that the abstraction costs nothing in 99.9% of all cases. The
programmer gain for not having allocation, dereference, fencepost, and
invalidation errors is enormous.

But, then, this is hardly surprising advice from someone who only learned
about a "scripting language" in 2015. The rest of us realized that those silly
"scripting languages" were better than C++ for 90+% of our problems way back
in 1995.

And anyone who has used a "scripting language" realizes _extremely_ quickly
just how stupidly useful hash tables are.
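
The canonical example is counting things, which is a one-liner with a hash
table and a small project with fixed arrays. A Python sketch:

    # One pass, no preallocation, no fencepost or overflow risk.
    counts = {}
    for word in "the cat sat on the mat".split():
        counts[word] = counts.get(word, 0) + 1
    print(counts)
    # {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}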

~~~
banachtarski
> Your "go to" data structures should be hash tables about 70% of the time and
> vectors about 30% of the time.

I literally just wrote a comment elsewhere about how hash tables are obscenely
overused and cause measurable performance degradation in many situations.
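
Both positions are easy to check for your own sizes and key types rather than
argue in the abstract; a quick Python sketch (which side wins depends
entirely on the workload):

    import timeit

    small_list = list(range(8))
    small_set = set(small_list)

    # Linear scan over a tiny contiguous array vs. a hash lookup.
    print(timeit.timeit(lambda: 7 in small_list))
    print(timeit.timeit(lambda: 7 in small_set))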

~~~
bsder
That may be.

However, the number of times I see people hit a bug because they fenceposted
or flat out overflowed a fixed size array _VASTLY_ outnumbers the times I have
seen people have to redo their underlying data structure because it just
wasn't fast enough.

~~~
banachtarski
We're just in different fields. The author of the post is a graphics
engineer, as am I. We drink to celebrate when we shave off a millisecond
haha.

~~~
Impossible
I've shaved 10 microseconds in large refactors and been told "that's great,
ship it". Other fields might say that wasn't nearly enough to justify the
code churn (the performance vs. "programmer productivity" argument). A
millisecond is a lifetime and can mean the difference between shipping and
not shipping in some products.

~~~
banachtarski
Odd that we feel differently about hash tables then.

~~~
Impossible
I didn't mention how I feel about hash tables (I'm not OP), but I think we
feel the same? I definitely agree that arrays (potentially vector-style
growable arrays) should be a programmer's go-to data structure, both for
simplicity and performance. For my use cases I'd probably swap the array and
hash table proportions relative to OP (70% array and 30% hash), but really
it's probably more like 90/10. Computing hash functions and resolving
collisions can be very fast with the right hash table implementation, but
it's absolutely not free. There is a reason why Lua tables can implement
array-like access and memory usage, even if from a language standpoint a Lua
table looks like a hash table with some fancy features.
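
To illustrate the idea (a toy sketch, not Lua's actual implementation): keep
a dense integer-keyed prefix in a flat array and spill everything else to a
hash map.

    class HybridTable:
        """Toy hybrid container: dense keys 1..n live in a list
        (the array part), everything else in a dict (the hash part)."""

        def __init__(self):
            self._array = []  # values for keys 1..len(self._array)
            self._hash = {}

        def __setitem__(self, key, value):
            if key == len(self._array) + 1:  # extends the dense prefix
                self._array.append(value)
            elif isinstance(key, int) and 1 <= key <= len(self._array):
                self._array[key - 1] = value
            else:
                self._hash[key] = value

        def __getitem__(self, key):
            if isinstance(key, int) and 1 <= key <= len(self._array):
                return self._array[key - 1]
            return self._hash[key]

    t = HybridTable()
    t[1], t[2], t[3] = "a", "b", "c"  # stored contiguously
    t["name"] = "lua-ish"             # stored in the hash part
    print(t[2], t["name"])            # b lua-ish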

~~~
banachtarski
Ah, didn't realize you weren't OP. I think any programmer who operates in the
realm of microseconds would understand both the hash-function overhead and
the coherency issue. I'm not a Lua programmer, but I remember studying the
LuaJIT source pretty extensively.

