
Five Pervasive Myths About Older Software Developers - gacba
http://www.lessonsoffailure.com/developers/pervasive-myths-older-software-developers/
======
akadien
What's hilarious about this myth is that I learned to program in NeXTStep
using Objective-C well over a decade ago and am enjoying writing iPhone apps.
But I hear younger programmers bitch about how hard it is to write code in
Objective-C, manage memory, etc.

~~~
jrockway
_But I hear younger programmers bitch about how hard it is to write code in
Objective-C, manage memory, etc._

That's because we like to create applications, not program computers. Memory
management is programming for the sake of programming; it doesn't do anything
user-visible. The computer can do it automatically, so why pass it off to the
human?

(Also, C does have automatic memory management. Ever "free" anything you
allocate on the stack? I thought not.)

~~~
akadien
Uh, Q.E.D.?

~~~
jrockway
"Old people like unnecessarily performing the same exact task a number of
times, and enjoy spending time with obscure debugging tools when they forget
to do it right in one place?"

OK, sure, I agree.

~~~
rajeshamara
Comments like this show immaturity. Just because a platform manages memory for
you doesn't mean you can create a million objects. If you program in C++ or C,
you come to appreciate these aspects, and that helps you write better code.
Satellites designed in the 1970s are still traveling to the far edges of the
solar system even though their processing power and memory are less than your
cell phone's. Designing software without considering these aspects invites
disaster; witness satellites designed today crashing in the orbits of other
planets.

~~~
jrockway
OK, but I feel like I'm not reading the same comments you (and the voters)
are.

Why is allocating your own memory in Objective-C "better" than letting, say,
GHC, do it for you? I may be a young whippersnapper standing on your lawn, but
my programs run as fast or faster and are more reliable. And it takes less
time to write them.

~~~
misterbwong
I'm a young whippersnapper standing on the lawn too, but there's something to
be said for knowing the underlying workings of code before using it.

Here's the disconnect I'm seeing: Allocating your own memory in Objective-C is
not necessarily better. But all other things equal, a programmer that
understands how GHC manages memory is better than one that does not. You're
arguing the utility of this knowledge, as it pertains to your usual use case,
is minimal. This is not the same as saying that the utility of this knowledge
is minimal.

~~~
jrockway
_But all other things equal, a programmer that understands how GHC manages
memory is better than one that does not._

Sure, but I never said this. The original post, now many levels up, said "kids
bitch about how hard it is to manage memory". It is hard. That's why people
bitch about it, and don't want to do it.

I say that it's conceptually very simple. But in practice, it's very difficult
to put the code that does it in the right place yourself. (It doesn't sound
difficult, but most C programs either leak memory or use memory they didn't
allocate, so clearly it is difficult.) There is a reason why C++ has things
like the Boost smart pointer library (and even auto_ptr). It's because memory
management is for computers to do, not for humans to do.

I think people have trouble realizing that a _human_ can know both "high-
level" and "low-level" things, but that a _programmer_ needs to stick to one
level or the other -- don't write low-level concepts (memory allocators) in
your high-level application program. The same person can write both parts, but
the programmer writing the high-level app shouldn't be worried about the low-
level details ("of course it works"). When you mix the levels of abstraction,
you write bad code.

(I wrote a toy language once. I found that writing a garbage collector was
much easier to get right than manually managing my own memory. When writing
the collector, the details of memory allocation were all I needed to think
about. When writing the high-level application code on top, all I had to think
about was my high-level application. That's the whole point of abstraction,
and it's what C-level languages fail to provide.)

------
strlen
It's always a red flag for me when an engineering organization of
statistically significant size (which does exclude YC start-ups, so don't view
this as a judgment) has no older engineers. It isn't just the age
discrimination angle. It says something about technological identity and
mission: they aren't solving a problem that has a barrier to entry (or they
aren't going to be able to get past that barrier).

As someone pointed out in the "when an engineer goes to Google, a start-up
dies" thread, there's great value in having people with vast technical
knowledge and experience: they'll be able to bring _new concepts_ vs. merely
_new technologies_ to the table.

Being a founder is a different matter (the risk factor is much more
significant once there's a family), but I don't see a reason to not hire older
engineers (all else being equal). Disclaimer: I'm <30 myself, but benefited
tremendously (earlier in my career) from contact with older engineers.

------
neilk
I'm not sure how to feel about this. In 1995 I was hacking scripting languages
and MySQL... and that's what I'm doing in 2010. Hardly anything of substance
has changed.

The one thing that's different is my attitude.

Sometimes I feel that I know too much about programming to be employable as a
programmer any more. So much of our industry is about foolish waste. Half of
all projects fail. Most of the other ones are rotten ideas that make life
worse for everyone. Few managers (or markets) really have the patience to
build good software, or the maturity to listen to advice about what might be
better.

When I was in my 20s, I was dumb enough to commit crazy overtime hours to
making projects work, even if they were ultimately doomed and stupid. And this
served me well in the job market.

In this environment, experience may not be an asset. It seems to me that the
ideal programmer, in an economic sense, is someone who obediently writes
unmaintainable code quickly and is unaware of the future pain they are
creating for themselves and their customers, or is unaware that all their work
is likely to be forgotten in a few months.

~~~
strlen
> In 1995 I was hacking scripting languages and MySQL... and that's what I'm
> doing in 2010

I think that's the problem (and I'm not a technology bigot: you can do
s/scripting languages/J2EE/ and s/MySQL/Oracle/). Doing simple things may pay
well at some point, but there's no technical career growth path.

There's also no security: there's no experience-, education-, or
intelligence-based barrier to entry. In the end, there's nothing to
differentiate a 40-year-old CRUD-screen developer from an 18-year-old one. You
can't say the same about a search-relevance expert or a kernel hacker (on the
education and domain-knowledge angle).

~~~
marshallp
Depends on how you think of CRUD.

CRUD is actually just what were called 'expert systems' in the 80s. Once you
realize that, you can see that it is actually the most interesting type of
code.

The real problem with programmers is that they don't specialize in a specific
type of application and become domain experts in it. Choose a field, such as
accounting, HR, or finance, and become an expert in it throughout your career,
so that you develop the most featureful CRUD application of that type in the
fastest time (or base a startup around it).

Knowing Java, PHP, SQL, etc. is like an accountant claiming he just knows
bookkeeping. A really valuable accountant is someone who might be a specialist
in taxes for medium-sized manufacturing firms in California, for example.

~~~
giardini
"CRUD is actually just what were called 'expert systems' in the 80s."

No. CRUD was a programming style that arose from relational database
technology. CRUD = (Create, Read, Update, Delete) corresponds to the SQL
INSERT, SELECT, UPDATE, DELETE statements. CRUD was popularized by Oracle. For
example, IIRC Oracle's program generators used the acronym CRUD as part of
their specifications.

"Expert systems" in contrast, were a specific outgrowth of AI technology in
the 70's and '80's and were defined loosely as systems that mimicked human
expertise.

There is no necessary conceptual overlap in the two terms.

~~~
marshallp
That's a simplistic understanding that ignores the actual technology. Expert
systems were nothing more than logic programming dressed up to seem all "ai".
Prolog is backward chaining, while the CLIPS expert system shell is forward
chaining. Today they are referred to as business rules engines, which can
again be forward or backward chaining. SQL databases are also in fact logic
programming, basically Prolog without recursion and iteration, but with PL/SQL
and T-SQL to make them Turing complete. Also, with triggers they are forward
chaining too. The main difference between CRUD and expert systems is
essentially that CRUD is multiuser and ever-changing, and so is actually a
better version of classical expert systems.

------
cromulent
Many people start in coding, but realize they are bad at it and look for
opportunities to move into other areas (e.g. management). When I worked for
large companies, many of my managers would list programming among their early
roles. You could tell, or they would admit, that they were no good at it,
didn't enjoy it, and got out as soon as they could.

The ones that are left are more likely to be good, simply because the poor
ones with better opportunities have left.

Edit: "better opportunities" from their point of view, of course.

------
acid_bath
For the older crowd on HN: How much of ageism is real, and how much is in your
head?

I only ask because all the places I've worked at have had a lot of older
programmers, at least in their mid 30s up to mid 50s being the typical range.
Then of course there's the slew of not-quite-programmers-anymore who still
make tech decisions (managers).

Then there's the very frequently uttered "Well he doesn't really have enough
experience but we need to fill this position ASAP" and "He worked for Corp XYZ
for 5 years?! Get him now!"

I'm not saying it's tough to be a younger programmer because it's not, but
strictly IME (internet companies) older devs are a rare and valued commodity.

Now that I think about it, could it be because most of the senior devs don't
have much web-related experience, due to it being relatively new?

~~~
msluyter
I've yet to notice anything, and I'm pretty old, but I also look much younger
than I am, so I'm probably not a good data point. I think attitude plays a
huge role here. I don't act like I know it all or have nothing to learn,
because I'm relatively new to development (officially), having done QA before
that. If you're curious and willing to learn, it's unlikely that you'll be
written off as inflexible and closed-minded. (AKA, "age is a state of mind.")

Regarding the article, I wish it had some actual empirical evidence to back
its claims.

------
dkarl
Age-related decline is, according to my observations, caused by declining
interest in the nuts and bolts of programming. When people lose interest, they
start relying on high-level abstractions even when those abstractions are
inaccurate or inadequate. Or they rely on concepts they've learned in the
past, without learning what's new about the new technology. It looks exactly
like intellectual laziness, but "laziness" is not the right word, because it
is an unavoidable consequence of boredom.

Losing interest in technology is like tires losing their grip on the road. You
keep going in the same direction, no matter which way you point the wheels.

Often age-related decline is masked by working in a static technical
environment that a person has already mastered. It's like sliding
uncontrollably down a straight road. Then, when the person needs a new job,
they find that they haven't learned anything new in five years, are hopelessly
out of date, and can't muster enough interest to learn anything new. The road
finally curved, and they're in the ditch.

Personally, I've come to value my ability to be interested in the technical
details of computer systems. It's embarrassing sometimes, but it's the key to
my livelihood. (I've gone from being ashamed of it to occasionally wishing I
had more of it.) I know I can't really be satisfied without an aspiration, or
at least a connection, to grander things (fame, the mysteries of the universe,
or boatloads of cash, depending on who you ask) but now I treat my geekiness
as an asset instead of a distraction.

------
fauigerzigerk
Experience is of course a double edged sword. It helps good judgement in some
cases but does the exact opposite in others as the underlying dynamics change.

It's interesting that some patterns do not seem to change very quickly though.
I'm reading "This Time is Different. Eight Centuries of Financial Folly" right
now. It's a rather dry empirical study of debt crises. The conclusion is that
these crises have certain patterns that recur again and again, but each time
there are people pointing at new factors convincing themselves that this time
is different and loads of debt are OK.

So it would appear that a minimum of eight centuries of on-the-job experience
would be appropriate for any banker.

But even if, as in this case, experience would lead to the right conclusions,
I would argue that sometimes it's folly that actually creates value and
innovation, even if 3/4 of it is later destroyed in a crisis. The dot com
bubble is a prime example of that.

The real myth is that older developers work based on experience. Sometimes
experience is just a convenient excuse for laziness. Younger developers find
other excuses for that.

I have 20 years of programming experience, but I'm still struggling with the
urge to jump on each and every new fad out there, at least when it comes to
things like programming languages and paradigms. What I can do far better
than any recent college grad is explain, in a very reasoned, professional and
well-informed way, why it is absolutely critical to use that fancy new
language or paradigm now ;-)

------
nzmsv
If the older programmers are discriminated against, why is it that almost any
job posting I see has minimum 10 years experience as a requirement?

~~~
dasil003
Because you must be between 28 and 32.

------
kabdib
I'm pushing 50, and doing well. While I can see getting pigeonholed into the
embedded system niche, I feel that I can still learn new stuff.

My father-in-law was hacking C until he retired at 67.

Keep reading, learn something really new every year. Best continuing education
I've found is access to the ACM journals online.

------
BillGoates
Above 40, good programmers probably are rare, but that doesn't have much to do
with age; it's more a generational difference. Those 40 and below are the
gamer generation, starting with the kids who grew up with the first popular
home computers (VIC-20/C64/ZX Spectrum).

Someone able to write C64 assembler games/demos 25 years ago will still be a
great programmer today, even if he hasn't touched a computer since then. The
basics haven't changed that much; nothing that cannot be learned within a few
weeks.

Older programmers are probably less likely to jump on every new hype. For them
it's just the nth way to do the same thing, only without their
already-collected or self-written libraries and tools. That could be a
disadvantage when looking for a new job.

~~~
ahi
I agree, but there is more to it. A 50-year-old programmer in 1995 likely
finished their formal education in the late sixties. What are the odds that
their education was actually in a software-development-related field? As an
industry and a source of employment, software development grew a lot faster
than the sources of relevant education. In the recent past, a lot of the
ageism may have been due to the older generation of developers just being the
guy in the office who could program. I think the ageism will wane a bit as
more and more of the older programmers have the same foundation as the kids
just coming out of school.

------
ulrich
The article is not very accurate. Age and experience don't necessarily
correlate. There are a lot of programmers in their mid-twenties with more than
10 years of experience. On the other hand, a lot of 40+ workers have worked
with only one kind of technology the whole time, or were promoted into
management positions 10 years ago and know shit about what's going on today.
Don't get me wrong: older developers with the same passion for computers as
today's kids are extremely valuable. But they are also very rare.

~~~
janm
Exactly: There are people who have repeated the same year of experience thirty
times in a row ...

------
ori_b
I'm young, and I have to say that I learned the most from - and often most
enjoyed working with - the older programmers. Some of them were the best
developers I've known, able to work magic with the keyboard.

All I can say is that I'd have no problem hiring an older developer if I were
in a position to, and I have trouble understanding ageism. As always, you do
have to be careful to pick the motivated ones, but that isn't a problem that
varies significantly across age groups.

------
ww520
It's not really age discrimination per se but a lack of adequate compensation.
It's well known that good developers are 10x more productive than average
ones. Older developers are usually pretty good, since they have survived this
game for so long and have all the experience that comes with it. But a
developer with 15 or 20 years of experience rarely makes more than 2x or 3x
what someone just starting does. Older developers just quit after this
realization.

~~~
akadien
That, or they get pushed into management with the lure of higher $
multipliers. By then it's too late when they realize that management sucks and
the lure of $$$ was a siren song. It takes great courage to attempt to go back
the other direction.

------
MrSafe
I'm in my mid 20s and this is actually a concern of mine. I love designing and
building software systems and I would be happy to build them into my 40s. But
people may wonder why I'm forty and still "just a programmer". I hope to have
a stable business by then and won't have to deal with this problem.

~~~
brlewis
I'm 41 and programming is more fun for me than ever. Don't worry about people
wondering about you; it's worth it. Just make sure you have enough money to
give your kids a good education.

~~~
mechanical_fish
I'm going to call this out as an important general principle:

 _Don't worry about people wondering about you._

~~~
Maro
Or, as Feynman said, "What do you care what other people think?"

[http://www.amazon.com/What-Care-Other-People-Think/dp/B000S3...](http://www.amazon.com/What-Care-Other-People-Think/dp/B000S39V2C/ref=sr_1_2?ie=UTF8&s=books&qid=1266966223&sr=8-2)

------
ojbyrne
Countering "myths" with your own generalizations seems like a bad idea. The
part at the end - "Young is not necessarily bad. Old is not necessarily good."
is probably the best takeaway.

~~~
hello_moto
So far, young has been bad in my experience (especially the entitled
generation) and the old have been great mentors for me. Usually the bad old
guys are the sys-admins who turned into IT directors, or the C programmers who
scored a dream job (high pay with low responsibilities and demands).

~~~
brlewis
Your brain naturally wants to make associations like that, but resist. You
might find great mentors who are young, too.

~~~
hello_moto
I can only hope for that.

Seeing from yesterday's buzz regarding how hard it is to find a good
developer, one can only hope...

------
chrisbennet
To some, career advancement means getting out of coding and into management.
To me, that would be like switching from _driving_ the race car to managing
the race team. I'd much rather be driving, thank you very much...

------
moron4hire
When I program "because I love it", I want to be programming on projects that
I actually love. That typically means "not the ones at work". Getting out of
programming as a career doesn't mean I'm going to stop programming. On the
contrary, it means I'll probably end up doing _more_ of the kind of
programming that I love, because I'm not burned out on programming by the end
of the day.

------
moron4hire
This sounds more like "programmers versus cheese-head managers", not "old
programmers versus new programmers".

