His thesis that technology hasn't changed is false in any case. E.g. while the fundamentals of algorithms haven't changed, that sort of knowledge is less central than it used to be; for better or worse, software development seems to be evolving into something where you glue together building blocks made by other people rather than writing everything yourself.
I look around at my last two workplaces and have yet to meet more than the occasional mid-40s professional in a position of technical ownership. Most of them have moved into management (paper pushers for the most part), architecture (whatever that means), or moved out (to whatever it is they do, hopefully something interesting).
Those who did move into management or architecture positions were not necessarily more technically proficient than those who moved out, nor did they have a history of accomplishments that identified them as suitable for those positions. In most cases it was just a matter of who one was acquainted with. Put simply, the technical field is not a meritocracy beyond a certain watermark.
I have yet to hear a cogent argument on why folks in their late 40s or early 50s are not adequately represented in programming positions, startups or otherwise. What is unique about programming as a practice that makes it unsuitable for professionals beyond a certain age?
In most professions with similar age barriers and early exits (sports comes to mind immediately), one can expect to be compensated handsomely for the short while that one holds the position; this is not the case with programming.
Leaving aside the argument that most programming positions do not require any education because they are no longer "creative" functions but merely functions of component "assembly", and as such do not require the experience gained from years of practice, what changes does the industry need to undertake to effectively use the experience of folks who have worked hard to gain it?
I certainly do not have the answers.
Because programming is concerned with "details", it was considered clerical work from the beginning. Look at the indignities suffered by Jean Bartik and the female staff that were assigned to program ENIAC. Do you think they would have assigned women at the time if they had any idea of the intellectual challenge of programming?
Since programming is considered low level work, it's always been a smart move for programmers to find their way out of it into management or whatever. Relying on the good will of employers has always been a dubious proposition (except maybe during the American post-war boom—a unique period in world history), so it stands to reason that most programmers would be looking to move up and out.
In the case of sports you have 10,000 people that aspire to every single paid position. If programming were that way, where vast swaths of people practiced daily from age 6-21 in the hope of snagging one of the glorious few spots, then you could expect pay to scale accordingly. As it is, I think the economics of being a startup founder are a bit closer to professional athlete.
On the aspect of practice, why is practice from ages 6-21 (15 years) any different from practice from ages 15-45 (30 years)? The answer may lie in what you say: "programming is considered clerical work". That being the case, it becomes the responsibility of those currently in the profession to make sure that those who are in the process of making a choice are adequately aware of it. Though it would make me sad if that were truly the case, I do not believe it to be.
Another factor is the rate of change in technology. Mere experience may be an obstacle there. When you start knowing nothing and pick a technology to learn, you naturally tend to pick the newest one. Whereas for an old person, learning new technologies is something you do out of duty as much as eagerness. An old person learning new technologies is keeping current. A young person learning the same new technology is learning to build stuff, which is much more exciting.
I may have been searching badly, but I couldn't really find anything to support this idea, at least in terms of a cognitive deficit. The following was the best discussion about mathematicians and age that I found:
- Mean age of best contribution by mathematicians is 38.8 (based on counting references to research by historians and biographers). This is roughly the same for other SET researchers.
- The Fields Medal cannot be awarded to mathematicians over the age of 40, which skews the mean age of mathematicians receiving their field's most prestigious award.
- As mathematicians get older, they take on more administrative work, have families, etc., which reduces the time and energy they have to devote to research. (I'm reminded of how Andrew Wiles basically shut out the world while proving Fermat's Last Theorem.)
- Rapid expansion of the field meant an increasing proportion of mathematicians were young, thus more discoveries were by young people.
- Math is a field where someone can make a great contribution without having acquired a large body of knowledge.
- Young math prodigies get a disproportionate share of attention.
Geez, you're right. Wikipedia:
The Medal also has an age limit: a recipient's 40th birthday must not occur before 1 January of the year in which the Fields Medal is awarded. As a result some great mathematicians have missed it by having done their best work (or having had their work recognized) too late in life.
Clearly someone needs to endow a better math prize. You know, one that is for the best math, not the best math by a specific sort of person.
Firstly, mathematicians don't peak at an early age; this is just one of the many pieces of nonsense Hardy spouted. And I say this as a non-mathematician under 25 who just enjoys the history of science; I have no horse in that race.
There are many examples of people who not only contributed work at a late age (> 30, some near or over 40) but did so while outside the academic community: George Green, Faraday (physics), and Grassmann. Then there are those such as Weierstrass who started late, or those who produced an amazing work later, some in addition to earlier success, e.g. Poisson, Bayes, Fourier, or Hamilton, whose quaternions came quite late. And then there were the Eulers, who were just non-stop greatness.
Next, although maths is a subject where some topics require very little background knowledge, the vast majority of this low-hanging fruit (knowledge-wise) is gone. It was much easier to sketch out the core of an algebra while still in jail and have it be new and profound 200 years ago than it is today. As the amount of knowledge increases, I expect the mean age of precocious contributions to increase. Knowledge and even experience will increasingly become a hard minimum.
Finally, you imply that only young people want to build things and have enthusiasm. I think an eagerness to learn and build great works is something you never lose. At least I hope so, otherwise growing old sounds like a very miserable prospect to me. And while the examples of old fellows not keeping up abound, I think this is simply a philosophical state that can be consciously guarded against.
However, the argument can be made that we need experienced folks to evangelize what is good and productive in the current context, and the more of them the merrier.
The inability or lack of desire to learn something new is a detriment to the individual and should apply equally to all professions. It does not adequately answer the question: given the choice of two programmers equally skilled in new technology X, why is there a preference for the 21-year-old over the 45-year-old?
While the economic arguments (lack of family responsibilities, allowing longer hours, and lower cost in both pay and benefits) apply in the case of startups, they do not address the industry as a whole.
Peaking at an early age would be an excellent reason: it would imply that the experience gained from practice is not sufficiently relevant, which I cannot immediately judge and which is possibly correct in the context of technology. I am inclined to think it is more related to the inability of those responsible for hiring to recognize ability outside of their own limitations.
I always thought that your essay on not being able to recognize a better programming language than the one you already know (I forget the title) is more generally applicable.
If I couldn't get a job programming I'd do it for free. Heck, I'm working on a startup so I basically am doing it for free. It's the thrill of making something other people find useful.
From the people I know and have worked with, the older the programmer, the less likely that attitude has survived.
An extraordinarily small percentage of mathematicians ever make any significant contribution to the realm of knowledge, as is true in most fields. It is unfortunate that such a small sampling is then used to judge the pursuit as a whole.
They are the exception. Further, it's just as possible that those very, very few would have led long lives of contributions, but the early success corrupted and waylaid their focus.
We see the same sort of observations in the field of software engineering, where a remarkably small sampling of Torvaldses and Carmacks is used as a baseline, when they're anything but. With all due respect, the technology behind most startups (many of the YC examples included) almost qualifies as puerile. There are remarkably few that represent any depth or significance, although granted there are some great fundamental ideas and some excellent dedication.
This is a lot simpler than it seems. Highly experienced, working programmers are rare because there were far fewer programmers twenty years ago compared to today. So of course they are now far outnumbered by newcomers.
This happens in any booming field, in fact any booming demographic.
Perhaps the hip young trendy 20-somethings at startups wouldn't dream of discriminating against somebody black or gay, but they don't feel comfortable telling a 50-year-old what to do. Or worse, finding that they hired a 50-year-old who knows more than they do.
Bear in mind that programming is still a relatively new field. There are simply more young programmers out there. We see ageism more because it's an option. You can start a startup consisting only of young people.
I predict that as the field matures, it will become less of an issue. Companies won't be able to ignore older programmers (or at least will be less able to ignore them) because there will be more of them.
Bear in mind that programming is still a relatively new field.
I'm not sure. I've worked with several /really good/ old guys. Even with this virtualization and 'cloud' stuff, a lot of *NIX knowledge that was relevant in the 70s is still relevant now. It's just the hairball of bash scripts you used to need to manage many servers at once has become more standardized. You still need guys who actually understand what is going on.
For all positions, and /especially/ technical positions, I think government workers tend to be older than private-sector workers... it makes sense, as the way pensions pay out means that you get better pension contributions from your later years of work, and the same goes for healthcare.
Really, for old folks, the private sector can't (or won't) touch those kinds of benefits. I think the thing of it is, the private sector will pay more for youngsters (that is, if you don't care about risk) while the public sector will pay more for old folks, so it doesn't surprise me that you don't see many old programmers in the private sector.
All of my parents work for the government; my stepfather as a .NET programmer (retired). There were years where I was making more than them, but their ending salaries were pretty comfortable, and upon retirement, they get free health insurance /and/ a very significant percentage of that salary for the rest of their lives.
I don't have the temperament for government work, but really, as you get older, if your choice is individual contributor work in the private sector vs. individual contributor work in the public sector, staying in the private sector past the age of 45 or so is just plain irrational.
I've thought long and hard about this, and the idea is vaguely terrifying. I told myself that if my business wasn't supporting me by the time I was 30, I'd start preparing for public sector employment. I'll be 30 this October, and so far, it looks like I've made it, but only just.
Because the public sector has such silly low starting wages for technical people, they don't have that much home-grown programming experience. An experienced private sector programmer is going to have a relatively easy time scoring a public sector job when it's time to settle down for his last 15-20 years before retirement; so I think what we see here is the natural inclination people have to seek out the highest paying jobs they can get.
In fact just by using that single word you've strengthened his argument.
Concerning technology, Dave has a specific thesis in mind, that I think has at least a little merit. Quite simply, large technology companies push "innovation" that has little actual value to the public at large. They do this to create barriers to entry so that smaller companies who devote less than 100% of their resources to keep up with the technology that the big companies champion (i.e. they devote some resources to sales, marketing, creating a business) get marginalized.
Most engineers (like those in every other profession and occupation) are mediocre. They look for job security in incomprehensibility. I once had a programming partner who refused to document his work. I asked why, after pleading with him repeatedly to leave a trail behind him. "Job security." At another place, the programmers had a slogan: "comments are for sissies." Same idea.
In this world of mediocrity there are a very small number of gems, people who work for the user, who strive to make their tech work better for people. That's a skill that develops over the years, you get better at it every decade, because you know more about people. When you're in your 20s you don't even have a clue about yourself.
And most of you commenting here are the mediocre kind of programmer (if you're programmers at all). The ones who are questioning the broad conclusions are the ones I'd want to work with, and I don't care how young or old they are. What I care about is if their minds are at work and if they can relate to other people as equals despite superficial differences like gender, race, age.
Well, you know what I say about people who complain about "codewords"; provide a better word or shut the fuck up.
The thing with Dave Winer is that I've never read his stuff and thought "What a fascinating, energizing insight this fellow is sharing. I must read more!"
Instead, I'm left thinking he comes off as such a bitter, grouchy sourpuss who's really impressed that once upon a time he invented RSS. I mean, no kidding, look at the post previous to this one:
"And everyone was hogging every inch of the road, including my bits of the road. Lots of near-misses, cars driving in the bike lanes, and people walking and riding four a-breast."
Oh. So your blog's raison d'être truly is just to grouse about everything. That's cool, but you generally have to be really funny to make that work in the long term.
I don't think his age has anything to do with the fact that people don't want to work with him. I think people don't want to work with him because by and large, the things he has to say are neither fun nor interesting.
The "X is exactly the same as Y" is one, I think. I got in an argument with someone who asserted that Twitter was exactly the same as the old Unix utility 'talk'. Of course, how do you distinguish between obviously boring reinventions and stuff that really will have an impact?
Our judgments of how intelligent people are usually come from how many relevant details they are able to see that we would otherwise have missed. So someone who asserts that Twitter is exactly the same as 'talk' is dumb. Someone who asserts that Twitter is exactly the same as 'talk', except that it's on the web, and it can broadcast to multiple people, and it's limited to 140 characters, and the social norms surrounding its use are different, and it's frequently used to share URLs that are automatically shortened, and that this practice is controversial because it breaks the decentralized nature of the web, sounds slightly more intelligent.
For instance, a web client that allows you to use IRC protocol is generally Just-The-Same-As-IRC. Putting it on the web doesn't do anything really different, except maybe that one doesn't have the inconvenience of installing IRC software.
But let's say that you had a really good help channel for your website, but that users of your website didn't typically find their way to IRC. Putting an IRC client right in the web page itself, configured to go straight to this help channel, is suddenly more of an innovation.
I don't know how to phrase this right, but there is a knack of noticing material distinctions, rather than useless ones. Maybe it's in noticing when certain limitations have been removed or added.
Maybe the real problem is that we're often unaware of how limited our world is, and what is doing the limiting. Perhaps the only way to notice when limitations have been removed is by sheer luck or that mysterious force called "creativity".
You couldn't have a mind that was constantly probing every tool for every limitation. Could you?
Knowing when a distinction is material frequently depends on context, goals, and experience. As Louis Pasteur said, "Chance favors only the prepared mind." I would be a little more generous, but still he's essentially right.
Isn't this called "engineering"?
Interesting question whether you could auto-detect closed mindedness though. One sign might be when people stop being surprised, or realizing that some earlier theory of theirs was wrong (which amount to the same thing).
Bear in mind that he is taking a very high-level view here, and think about what he's saying again. Are not JSON and XML, fundamentally, simply better ways to put structured data into text files with formats everyone can agree on? Sure, they are better approaches than what preceded them (CSV? random ad-hoc formats?), but fundamentally, if I said the previous sentence to a programmer who had gone into a coma in 1985 and just now woke up, wouldn't they understand it pretty quickly?
Now, of course he's exaggerating, and venting a bit as well. But for a programmer with a lot of experience, these innovations don't seem that special. Good ideas, sure, but not enough to make everything you've learned obsolete.
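To make the "structured data in a text file" point concrete, here is a minimal Python sketch serializing the same record both ways; the record and its field names are invented purely for illustration:

```python
import json
import xml.etree.ElementTree as ET

# A made-up record: a dict with a string field and a list field.
record = {"name": "Ada", "languages": ["C", "Scheme"]}

# JSON: the dict maps directly onto the text format.
as_json = json.dumps(record)

# XML: the same data, expressed as nested elements.
root = ET.Element("record")
ET.SubElement(root, "name").text = record["name"]
langs = ET.SubElement(root, "languages")
for lang in record["languages"]:
    ET.SubElement(langs, "language").text = lang
as_xml = ET.tostring(root, encoding="unicode")

# Both round-trip back to structured data.
assert json.loads(as_json) == record
assert ET.fromstring(as_xml).find("name").text == "Ada"
```

Either way, the underlying idea (agreed-upon text encodings of nested data) would be perfectly legible to that 1985 programmer; only the syntax is new.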
In a previous discussion I mentioned my first (and last) extremely unpleasant interaction with Dave Winer and someone mentioned I was a member of a large club (http://diveintomark.org/archives/2003/04/21/whats_your_winer...). That might have more than a little to do with his inability to get offers in new startups.
I'll also note that anyone who says there's no difference between C and an OO language doesn't know what he's talking about. Or in this case C and a language that's largely Scheme semantics with a C style face. And then there's compilation vs. interpretation. Feh.
But the OO part is a big deal (at least to the extent it is used by BrowserScript). Learning a new language is generally fast. Learning a new paradigm like OO or functional programming is generally not.
Its curly-brace behavior, however, is pretty much exactly like C.
Then, when everyone laughed at him, he silently edited his rant to appear a bit more sane... It's not that he wouldn't support the JSON API; it's that he couldn't, because the programming environment he wrote doesn't have a JSON library.
It's 2010. Would you hire a programmer who can't parse JSON?
Edit: Even better: would you hire a programmer who, when asked to design a JSON API, refused to do so and instead yelled at you about the superiority of XML-RPC?
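For context on how low the bar is: in most languages parsing JSON is a few lines with a standard library. A minimal Python sketch, with a payload invented for illustration:

```python
import json

# A hypothetical API response (the fields are made up for this example).
payload = '{"status": "ok", "results": [{"id": 1, "title": "hello"}]}'

# One call turns the text into ordinary dicts and lists.
data = json.loads(payload)

print(data["status"])               # ok
print(data["results"][0]["title"])  # hello
```

That's the whole task; an environment without even this capability is a real handicap.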
Many of my contemporaries have climbed the management or technical ladders to the point where they have not practiced the skills needed in a startup for years. They are managing fifty or more people. They are architects at large corporations. They are not designing products and writing code.
A smaller group of my contemporaries are set for life financially. They are no longer working or are only pretending to work.
Of those that are still actively working, I know that some of them have ratcheted up their lifestyle to match their high compensation. It would be a major adjustment for them to live on a startup salary.
There might be ageism in the startup world, but I think it's definitely the case that the pool of eligible and interested 50-somethings is small.
I personally have not experienced ageism. I was 20 years older than everybody else at my last startup.
Maybe he should learn how to build LAMP server farms if he wants to work in startups doing them. There's no reason he can't start his own and work with people who aren't ageists.
Part of his salary would probably have to go on the marketing account but I'm sure there are wiser ways to spend that money in a startup.
I would be more OK with having him in a technical advisory role or something similar. That would give me most of the publicity value at a much lower price. (I would then of course also make sure his experience was utilized to the fullest extent possible, probably as a mentor to the tech team.)
Not sure he would like that, though.
Let me know when you're willing to take startup pay (real startup pay, not $100K instead of $130K). Most 50 year-olds that I talk to aren't.
Unless your name is Steve Jobs, when you choose (d) hire someone for far below market and compensate them with employee level equity, you're basically looking for a sucker. Most engineers realize this way before they turn 50.
Pretend this was not written by him, because I don't want this to come off as attacking anyone.
Reading just that alone, my reaction is, "Well, why SHOULD they?"
Isn't having coding skills the same as having made a living as a writer? I don't see novelists complaining that no one has "invited" them to write books. They do it without any money up front in the hopes someone will like it and publish it (some self-publish in e, these days).
I thought being able to code was like a ticket to mint money, if you had a great idea and the willingness to push at it. Isn't that how everything on the Net came about? Maybe I'm just missing something here. But the fact remains if you sit around waiting for an "invite" to anything, you're bound to be waiting a very, very long time.
I'd rather be doing something than waiting any day.
we chuck our experience, wholesale, every ten years or so... These are gratuitous reinventions.
Even when these statements seem to be true. Don't train yourself to think like this. It isn't going to help.
Yes, people -- especially the young, but the old as well -- go over the same ground a lot in software. That's called practice.
It's partly a generational thing: New generations need to learn from mistakes by making some of those mistakes; the process of having your elders lecture you about the mistakes is great, and can save you valuable time, but it is imperfect. Some things you just have to experience for yourself.
But it's mostly an evolutionary thing. We reinvent because the environment keeps changing, especially in computing, which has been rocked by epochal changes over the course of the last forty years. (Take out your iPhone and compare it to the machines and networks in use in 1970. Then compare the owners of modern smartphones to the owners of machines and networks in 1970.) The reinvented thing never turns out quite the same as the original, and those differences -- many of which look like accidents, some of which are in fact accidents -- are usually where the progress can be found. The new tool fits the new problem better than the old tool because it evolved in the presence of the new problem.
If we hadn't reinvented the wheel, we'd be using stone tires.
Reinventing is not always a mistake... And sometimes you have to make a lot of mistakes to make progress.
I know I've thought that this ageism thing could be a factor. Hell, I was nervous "admitting" I am 43 in another post and when I signed up for startup school (like it's wrong or something). But the thing is, because I don't think about age, and because I continue to learn and grow, it's never come up. It probably helps that I played a sport at a near-pro level and know how to "train", and consequently look 34 instead of 43; but the reality is we influence ourselves more than we realize (it's an internal mental-game thing). Bottom line: if you want it bad enough, you make it happen.
I think it's a treasure trove of information as to what worked and what didn't, especially if one can cross-apply those lessons across knowledge domains and areas of technology.
I think one reason is that people of that age are mostly the ones who even remember various useful solutions to problems and tradeoffs that have come up in the past. CS is strangely bad at digitizing its history: I can get a philosophy journal issue from 1890 on Jstor, but many CS journals and proceedings from as recently as 1980 are still dead-tree only. And many are still relevant, e.g. the V8 js engine is strongly based on 1980s Smalltalk compiler research. A handful of important papers (say, by Turing) have been piecemeal scanned and put online by interested people, and a handful of university classes cover historical aspects, but overall I don't think history transmission is taken particularly seriously in CS circles. So, it ends up being more of a living-memory sort of deal, where if you want knowledge from the 1970s/80s to influence your designs, you have to hire people who personally remember it.
One reason may be that most current web startups just don't do work that is difficult or innovative enough to warrant that level of experience and cost. You don't need 20 years' experience to create Reddit, for example, and even though a veteran team might have avoided picking the wrong language initially or storing users' passwords in plaintext, these didn't really hurt Reddit that much.
There's also the employee's side. In my late thirties, I already find it hard to get excited about self-indulgent social-location-based nano-blogging apps aspiring to get mentioned in Wired. When you look for something that's either really challenging or tries to solve a real problem, your range of choices is greatly limited.
The arrogance of this statement is just mind boggling. Why exactly does the author think he's orders of magnitude more employable than other engineers of his age group?
Just go do it. Then the argument is moot.
Startup entry costs get lower and lower everyday. All of those zillions of barely-useful web apps that everybody is funding provide a nifty base to launch and begin interacting with a market. It's really not the same business environment as it was just ten years ago. That's not meant as an argument for age discrimination, just an acknowledgment that things have changed. The best guy to find is some graybeard who kept the best of the old stuff and is also learning the new stuff.
Unfortunately the temptation in tech is to rest on your laurels and try to coast on using natural intelligence as opposed to continuously learning new things. Sometimes this gets kids right out of college -- take the corporate 9-5 job and the punch-the-clock mentality. Sometimes it takes longer. But once this attitude begins the logical conclusion is blog articles like this.
I think if I were 25 and launching a startup, hiring somebody twice my age really might feel a little bit weird.
Maybe this alone could explain it - if people only like to hire people who are younger than themselves, there would be fewer places willing to hire the older you got.
I suppose that when I'm 50, it won't feel quite as weird to work for people younger than me, but the bigger the gap, the weirder it's going to feel.
But since not a single one has done so (at least publically), something else must be stopping them.
That is not true today in Silicon Valley, because nobody knows who wrote the code one is currently using.