Hacker News

The odd thing is that hiring older workers is generally a great idea. From an organisational perspective, programming teams don't scale very well, so hiring less skilled people is generally a terrible long-term strategy. Personally, I have worked with a few hundred programmers and I can only name 3 that were competent before 30.

Outside the valley you find plenty of really competent programmers, mostly in their late 40s, along with a few people who are nearing retirement age and who, while often gruff, tend to blow your mind. As in "we stopped testing his code years ago." Or just calling someone team _. As in: you could get 15-30 people working on a project, or, if you're lucky, Bob.

The software industry has been growing rapidly in recent decades, and the growth has been concentrated in a few geographic areas. So employers in those areas (especially Silicon Valley, and also smaller hubs like Seattle and Boston) have to recruit a lot of their workers from other regions, as their job openings grow faster than the "organic" growth of the local worker population.

Theory: This importing of workers has negative effects on age and gender diversity, since it means a higher proportion of jobs go to people who can easily relocate for work. That includes young people (who are less likely to have kids and/or spouses) and men (who more often receive the primary or larger income in a household). I'm sure you can think of other things that are advantages or disadvantages when it comes to moving to California for a job.

There are ways to test whether this effect exists and how large it is. For example, you could compare age and gender diversity across industries. The theory predicts that age/gender imbalance is correlated with geographic concentration of employment growth for each industry. You could also see whether these imbalances are worse within industry hubs, compared to companies in the same industry outside the hubs. (Though the effect can also spread outside the hubs if it influences a culture that gets exported to the rest of the industry.)

I'll bet geographers or sociologists have published research on this subject, though I don't know enough to search for it effectively. (I found some vaguely related papers on Google Scholar, but nothing yet with data that would directly answer these questions.) There are some historical parallels of course, like gold rushes and resource extraction booms that cause new industries to grow suddenly in a particular area.

After some searching, "Disruptive decisions to leave home: Gender and family differences in expatriation choices" seems relevant, and you can crawl citations from there.

Another interesting aspect to rapid growth is to note where the influx of new developers is coming from.

The need for developers is so strong that all the major tech companies are devoting resources to recruiting as soon as developers graduate.

That dynamic could account for a reasonably large part of the age imbalance in the industry. If there were a similar-sized pipeline of recently trained developers with a more diverse age range than university students, the story would likely be different.

Another benefit (I might be selling myself here, watch out) is that older programmers have lived through more issues. Be it management, technical, HR, etc. We have seen more. And some of us know how to steer clear of such issues, and when to interject.

A side effect of this is that they don't freak out when stuff goes wrong. They've seen a lot of stuff go wrong in their lives, and they know how to work through it to a solution.

That's not always what employers want. I've been reprimanded for not visibly freaking out enough during a server outage.

They won't necessarily know what the solution is. But that's okay.

You will see the following in experienced, good consultants as well. A lot will go wrong on a gig that is not anticipated during the initial scoping of the project. The inexperienced (often but not always correlated with youth) will "freak out". Whether it is a passive flip out like analysis paralysis or a more overt one, the end effect upon management/clients is still unpleasant for them.

Even when I was much younger in the field, I observed that some of the older staff tended to have an imperturbability and unflappability about them even when presented with unexpected and unwelcome challenges (read: emergencies). They often didn't know what the solution was, but had enough taste and experience to know where to start and to start Getting Shit Done. This meant picking a likely direction of troubleshooting and just starting to walk a breadth-first search of causes, then a depth-first on anything turning up from the breadth-first that looked promising. Easy to describe in these antiseptic terms from the comfort of your keyboard; hard for many younger developers in the heat of the moment when there is a frantic manager beating them about the ears that the company is about to go under if they don't fix this "right this damned second!".
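That breadth-first-then-depth-first walk can be sketched in a few lines of Python. Everything here is a hypothetical stand-in, not anyone's actual tooling: the toy cause tree and the `looks_promising` / `confirms` predicates represent the diagnostic judgment a real troubleshooter supplies.

```python
from collections import deque

# Toy diagnosis tree: each node is a possible cause; its children are
# more specific causes underneath it. (Purely illustrative data.)
CAUSES = {
    "outage": ["network", "database", "app"],
    "network": ["dns", "firewall"],
    "database": ["connections exhausted"],
    "app": ["bad deploy", "config drift"],
    "dns": [], "firewall": [], "connections exhausted": [],
    "bad deploy": [], "config drift": [],
}

def diagnose(root, looks_promising, confirms):
    """Walk causes breadth-first; when one looks promising,
    dig depth-first under it until something is confirmed."""
    queue = deque([root])
    while queue:
        cause = queue.popleft()
        if looks_promising(cause):
            # Depth-first follow-up on the promising lead.
            stack = [cause]
            while stack:
                candidate = stack.pop()
                if confirms(candidate):
                    return candidate
                stack.extend(CAUSES.get(candidate, []))
        # Keep sweeping broad categories in BFS order.
        queue.extend(CAUSES.get(cause, []))
    return None
```

For example, if only the "app" branch looks promising and only "bad deploy" is confirmable, `diagnose("outage", ...)` sweeps network and database at a shallow level, then drills into the app branch and returns "bad deploy".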

This points out another observation I've made about managing younger developers. Do not mistake their energy level and enthusiasm for the degree of emotional and verbal abuse it requires to engage them. It might be different for other industries and positions (I doubt it), but too many times I've seen people take the easy way out and subscribe to the Management by Berating school. Also called the Management by Complaining/Screaming/Browbeating school. You can mention the dire effects some emergency has to the younger crowd to establish context, but move on from it and don't use it as a way to urge them on to fix something faster. Most younger developers simply do not possess the life experience to easily tamp down the anxieties they feel and set them aside long enough to dispassionately fix the problem at hand if there is management at their desk anxiously wanting blow-by-blow updates.

This doesn't mean you simply dump the problem in the younger developers' laps and walk away, either. Neither extreme is good for your team or organization. Lead by example with a calm approach to working the problem, either make decisions on where to start or support your senior leads with their educated guesses with where to start (and if you don't know where to start and don't have senior leads to turn to, you've got bigger problems), triage the effects with mitigation tasks, assign bite-sized tasks to the less experienced developers as your judgement advises, create a big picture view for yourself with the best information available of where your team stands with the problem, clarify the picture as you go along, and run interference from the rest of the organization's managers who Manage by Dysfunction.

A side note for younger developers. The only partial shortcut to that unflappability I've had experience with is to secure yourself 2-3 years of living expenses as soon as you can. It helps you achieve perspective, especially when a manager yells you'll lose your job over something. It isn't a replacement for experience, taste, judgement, whatever you want to call it, but it does help a lot with compartmentalizing the emotional component of facing down stressful situations at work, which takes most employees a decade or two to learn, and some don't learn it at all.

>I have worked with a few hundred programmers and I can only name 3 that were competent before 30.

Are you sure you are not committing the same ageist judgement here?

There are two kinds of reverse discrimination:

One is an active reverse discrimination that is attempting to tilt the balance back (often enforced with some kind of law or policy). This has all kinds of well-known problems (that arguably are larger than the benefits).

The other is a more natural reverse discrimination that comes from collecting the missed opportunities of your competitors. If your competitors reject candidates for frivolous reasons, then it's easier for you to hire qualified people. I think this is what the parent comment was doing, and it doesn't strike me as a problem.

If you are a founder and aware of this bias, it is a big advantage. But if you're a competent guy in your 50s who got laid off, it's a big disadvantage. Free market economics rarely works like the textbook says, and the man (or God forbid, a woman) looking for a job is the one searching for the needle in the haystack of (1) attractive companies (2) that don't discriminate against older employees.

Depends on what one defines as "competent" and the problem space. A lot of the colleges attended by the thirty-and-under set I've talked with don't have a required path in CS that teaches systems and architecture, let alone C (or assembler). So, if one expects such knowledge from someone with a CS degree, that could be an issue.

The tooling accessible to people today, as opposed to those who started 10 years prior, is also extremely different. Running Windows? Trumpet Winsock and understanding win.ini, config.sys, and autoexec.bat were important. On Linux, a lot of people wasted a lot of time getting sound and video drivers (as well as xf86config files) to work right; most just work fine now. All of this required a level of digging that meant knowing more about the system than you may have wanted.

Not saying your point isn't valid, but it does depend on what one's definition of "competent" is.

That works both ways, though -- I went through college from 2000-2004, and aside from some brief forays into other languages (Ruby and Perl, from what I remember) we were taught exclusively in C. I had some VB experience (self-taught) from middle school, but in my day job we exclusively use C and Ada ... and I'd never really had reason to learn another language. A few years back, I decided that I should learn some new languages for a) fun, and b) the diversity that I'd need if I ever wanted out of my industry. I taught myself Java, and then the Android framework. Then C# and VB .net. What really forced me to start learning new stuff was teaching - I started teaching as an adjunct at a local college after work (the only favor that getting my MS has ever done me), and then I had to learn stuff to fit with their curriculum. Anyway, my point is that it's just as easy for older people to be oblivious to new tech as it is for young people to have a complete lack of understanding of fundamentals. I actually teach binary number formats, boolean algebra, virtual machine code, etc. to my students just to make sure that they're not completely unprepared ... and I will say that your point is valid - many of them have had no prior exposure.

He's just making an observation and a possible correlation (X competent people, 3 younger than 30), not a causation (X people were incompetent because they were younger than 30).

One of the issues that is unpopular to discuss is inertia in constraints. Experienced people have learned a set of platonic causations that they apply to pattern-matched situations. When they come into a new situation, if they match patterns that don't quite fit, they'll apply old solutions to situations where they are not appropriate.

Ender's Game is a good allegory for this.

It's good to have diversity and meritocratic process for decision making. It's good to have different perspectives, but recognize that sometimes bad experience is worse than little experience.

> One of the issues that is unpopular to discuss is inertia in constraints.

Prove it with some research. The current research shows that people really don't make their best contributions to their field before 35-45.

And, to be fair, I see more young people who attempt to hit the screw with a hammer because a hammer is the only tool they have.

>> The current research shows that people really don't make their best contributions to their field before 35-45.

Look at someone like John Carmack - before he was 35 he had done Commander Keen, Wolfenstein 3D, Doom, and Quake 1-3. Guys like Steve Wozniak were basically out of the business by the time they hit 35. Linus released Linux when he was 22 and started git when he was 35. I doubt guys like Marc Andreessen have written much code since they hit 35, either.

And Federico Faggin who headed the development team for the 4004 was 30 at the time. William Kahan who was instrumental in the Intel 8087 was about 45. I can go on.

So, basically what you're saying is that young people can churn out software where you can undo your huge screwup, but if you have to get it right the first time like hardware, then you need some experience.

But what if it's an awesome shiny new hammer based on the latest impulse-delivery fads? Plus, screwdrivers are boring.


You are the one with a "platonic causation", namely, older = rigid thinking.

Ender's Game is a children's book, written to appeal to children. Note that we don't actually have 9-year-olds making decisions in reality.

A competent person would understand why the solution did or didn't work the previous times, and thus why it would or wouldn't this time.

What you're saying is that some people don't learn from their experiences. You should focus on finding out if they have. That's worth way more than hoping that this young kid will.

Smarts and the wisdom of experience together are, as kids say... gangster. It's only when a person gives up either learning or trying and accepts the status quo that their ability to make a big impact can be questioned, and that can happen at any age, or sometimes never for a lucky few.

>Experienced people have learned a set of platonic causations that they apply to pattern matched situations.

Good developers do use pattern matching. But good developers just use it as a way to come up with ideas, and they use their experience to judge the ideas.

There's no such thing as "bad experience", just poor reasoning skills. And mostly people will improve their reasoning skills over time. Unfortunately, not always.

Experienced developers have more patterns to apply, and they have more sophisticated patterns that can be applied using differential diagnostics. An inexperienced developer might not have any pattern to apply, or may naively apply the wrong one. Sometimes that leads to re-inventing the wheel, sometimes that leads to solving the wrong problems. Errors of the latter sort are favorites for senior developers to come in and clean up.

Not so much. I'm 35, started programming at about 10 on my TI99-4A, typing in games from magazines, etc., the usual stuff.

No way does a 40-year-old guy have more experience than me in general programming, unless he was working at Microsoft or IBM back then, and there are surely fewer than 10,000 of those guys in the world. Of course there are 40-year-olds who may have way more experience than me in a specific language or technology.

My point is that if computer tech is only 30 years old, then you won't find programmers with more than 30 years of experience, no matter their age.

Computer technology is about 60 years old, though. There were people doing some pioneering work in the 50s and 60s, and the modern hacker culture really has its roots in the 1970s, just prior to the rise of personal computing.

There was lots of stuff going on with the DEC minicomputers, LISP machines, early UNIX systems, and big-iron mainframes long before most of us were even born, let alone getting our starts on TI-99s, Apple IIs, and Commodore 64s. And let's not forget that the people who were creating those early home-computer platforms were themselves all from the previous generation.

Would you turn people like Steve Wozniak or Al Alcorn or Ken Williams away from your new startup? You'd be a fool if you did. The basic architecture of computers hasn't changed at all since they did their most important work, and the cognitive skillset that makes a great programmer is the same as it's always been.

In fact, I'd expect that people who cut their teeth on early systems, and had to work within narrow constraints, without a big stack of abstraction layers and frameworks, would be all the more adept at writing robust and efficient code than those who learned on modern tools. And I'd bet, with Moore's law slowing down, and with better-optimized code becoming more advantageous in building scalable systems, older programmers will soon be in higher demand.

The work in the late 40s was interesting, too. First, you had specialized systems, then Eckert, Mauchly, and von Neumann started working on stored program computers. They had to figure out what opcodes were needed and how they were going to be implemented. Jean Jennings Bartik's book (http://www.amazon.com/Pioneer-Programmer-Jennings-Computer-C...) is a great first-hand account of all the work (and drama) from back then.

(Not trying to short-change Bartik here either -- she did a lot of the work on the ENIAC instruction set when they converted it to a stored-program computer)

The fact that you can name from memory the people who worked in the 40s means that the size of the computer science community back then was tiny.

He didn't say that those were all of the people working on computers in the 40s.

Exactly. Just like how history doesn't celebrate the names of everybody who worked for Thomas Edison, there were hundreds if not thousands of people toiling away in anonymity underneath these giants.

Edit: And many more "giants", of course…

Assuming Ada Lovelace was the first programmer, programming as a discipline is more like 160 years old.

She was certainly the first Computer Scientist, well that's what I was told in uni!

She was definitely the first person to realize the potential that Babbage's Analytic Engine had, particularly outside of just calculating things. Babbage's design, if I remember correctly, was basically an improvement on a design he initially made to calculate trajectories for artillery teams, and most of his thoughts on what to use his computer for were "calculate (thing)". Lovelace wrote on how the computer could be programmed to solve more complex problems.

I think that it would be a mistake to say that computer science has truly been around for 160 years, though. A few people (there was also an Italian who was interested in Babbage's work, although I'm not sure what his contributions ended up being, if any) does not a field make. Any progress was more or less put on hold until the early 20th century, when mathematicians started working on what you could calculate or construct in a finite number of steps, and you didn't get (untyped) lambda calculus and Turing machines until 1936, which is probably the best place to truly mark the start of computer science as a field. (And then you got early devices that were sort of primitive mechanical computers in the late 30s and early 40s as part of the Bletchley Park cryptography work by the Brits.)

A very large debate around the turn of the century was whether mathematics that you couldn't specifically construct in a finite (or countable) manner was valid, which became particularly heated after Cantor's set theory work (showing that the real numbers are uncountable) and then things like Russell's paradox (showing contradictions in Cantor's naive set theory if you allowed sets that contain themselves). I'd argue (without firm, researched proof that it was definitely the intent and case) that the spirit of early computer science (lambda calculus, Turing machines, etc.), which was concerned with what you could and could not compute with a finite algorithm, came from those sorts of debates. (See finitism, intuitionism, constructivism, etc. for parts of this; traces of it remain in modern-day mathematics in some people's concerns about whether the Axiom of Choice is a valid or reasonable axiom to have.)
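For readers who haven't seen it, the contradiction Russell found fits on one line. Let $R$ be the set of all sets that do not contain themselves:

```latex
R = \{\, x \mid x \notin x \,\}
\quad\Longrightarrow\quad
R \in R \iff R \notin R
```

Asking whether $R$ contains itself yields a statement equivalent to its own negation, which is why naive unrestricted set comprehension had to be abandoned.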

I think you're mixing up a couple of stories there, but it doesn't really matter that much. Just to fill in the details (and because the alternative is for me to start responding to work emails, which I just don't feel like doing this morning): Babbage was concerned about the quality of mathematical table books, which were used for calculations before we had modern calculators, back when slide rules weren't mainstream. The figures in them were computed by hand, and some were so bad there was an error on every page, which made them useless for scientific work and dangerous for industrial work. The Difference Engine was designed to print out pages of figures using a little printer attachment.

ENIAC was initially put to work calculating artillery trajectories, once the programmers figured out the bug where the shell kept going after it hit the ground (oops!).

>No way does a 40-year-old guy have more experience than me in general programming

I started programming toys in 1979. [1] I started programming "modern" computers in 1981. I was selling games when I was 15 in 1983. My "real" professional experience started in 1987 with a contract to write a video game from a major publisher.

So yes, it's not hard for someone to have "more experience" than you in general programming.

[1] I had one of these: https://en.wikipedia.org/wiki/Big_Trak

OMG THANK YOU! I had one of these as a kid, but couldn't remember the name! We used to have competitions to have it go from the living room, out to the kitchen, through the table legs, and back again. I was 5 years old, and wouldn't have another programming opportunity until 3 years later when we got a C64.

Ha ha, I'm 42 and always wanted a BigTrak too. I finally bought one a couple of years ago when they remade them. It sucks, though, that it does not have the dump wagon accessory.


>My "real" professional experience started in 1987 with a contract to write a video game from a major publisher. >So yes, it's not hard for someone to have "more experience" than you in general programming.

Do you know how many people were doing that? You are way overestimating the number of people in the world with your experience.

> over estimating the amount of people

You said "No way a 40 year old guy have more experience than me". To disprove an absolute statement you only need one example.

That said, a significant fraction of my friends from those days were doing similar things. I won't deny that I'm particularly good at programming, but all of those friends could also claim more experience than you.

My grade school had a Big Trak. Loved that thing. I saw there was a new version coming out that works with a phone app and thought about getting it for my great-nephews. But there's so many interactive toys out there like it I have to wonder if they'd think it was cool.

I'm 35 and I was doing OOP via HyperTalk and AppleSoft BASIC, with a dose of 65C816 assembler, before you picked up your calculator. Also, TIs were toys compared to HPs.

30 years ago, computers were actually pretty common. The school I went to had two rooms full of computers and were teaching programming to every child as part of the maths lessons. Software was being talked about everywhere.

I've heard a number of these comments recently drastically underestimating how long computers have been around.

Yeah, maybe I should have said 50 instead of 30. I forget we are in 2014; my mind is still in the 90s. Miscalculation.

You think you have more experience than 40 year olds because you were typing in games when you were a kid? How about the people working when you were a kid? I don't know, this is sort of an incredible thing to say.

Maybe I'm not that good at English, because I believe I was clear that there obviously are people with more experience than me, but there are few. IT in the 70s and 80s was an extremely small field.

> I'm 35, started programming at about 10 on my TI99-4A, typing in games from magazines, etc., the usual stuff.

> No way does a 40-year-old guy have more experience than me in general programming, unless he was working at Microsoft or IBM back then, and there are surely fewer than 10,000 of those guys in the world.

You're talking about 1989, right? In 1983, there were 443,000 computer programmers and 276,000 "Computer systems analysts, scientists" in the United States alone (source: http://www.census.gov/prod/2/gen/96statab/labor.pdf).

Damn, that's a lot. I reckon I had my numbers a little wrong.

"My point is that if computer tech is only 30 year old"

I'd be really interested as to how you get that figure...

From my ass, basically. It was an example. It works for any other kind of tech.

The TI99-4/a was discontinued in 1983 as I well remember because it happened not long after I spent my savings from my paper route to buy one.

If you were using one at age 10 and are now 35 that would have been 1989/1990 or so... there were no magazines writing about it by that time. You had some old issues of the TI-99er or something?

I did the same – cut my teeth on a 4A well into '94 or so (~9 by then) using old magazines. It doesn't really matter what's "current" when you have no-one but yourself to learn with.

I beg to differ - I distinctly remember getting TI magazines well into the '90s. IIRC, there was one called Micropendium that kept publishing until 1999.

Started programming on an Apple II, which I still own, in 1983, aged 11. I'm 42 now, so that's 31 years. You'd need to have started at 4. Plus, by the mid-eighties it's not like this was that rare. My school had rooms of computers (BBC Micros, later PCs) and the kids all had ZX Spectrums, Vic20s, and Commodore 64s. There are tons of people in their 40s like that. Most CompSci and Physics undergrads who were my contemporaries at college - so also now in their 40s - followed similar paths.

So it makes little sense to suggest that it's somehow a crazy idea that someone 5-15 years older than you (in their 40s) might have a longer track record working with computers.

Really, I first coded at 13 in the early 70s at school, and started professionally in '79 (Fortran), so that's 34 years.

My father was programming before you were born...

Working in a bank, not for an editor...

Did I say that people with more experience than me do not exist?

"computer tech is only 30 year old"

Well, you are "old". Yup, over thirty, you have missed the boat. Welcome to the new ageism.

FWIW, I'm 47 and was programming assembly and microcode in my youth. So I have 10 years on you. At this scale, I doubt the difference matters much. Because you are old, too.

My point is that there are VERY VERY FEW people like you and they are expensive.

Also, you have about 30% more experience than me, I don't think it's a small amount.

TI99? You pussy. I'm 40, and when I was 10 I used to help my Dad build homebrew Z80 machines. My first programming was writing a small OS in Z80 asm. I helped him build an awesome Z280 (that was a rare chip) transputer.

I'm sorry, what? If a guy has been doing a job for longer than you then he can't possibly have more experience than you? Isn't "experience" "doing"?

Did you read my entire comment before pressing reply or only the first three words?
