Hacker News
Why innovators get better with age (nytimes.com)
97 points by mitmads on March 30, 2013 | 46 comments



I think the idea that the "true innovators" are a bunch of kids with no experience is massively counter-productive. If you look at the real innovators in computing (not Zuck) over the last 50 years, you'll see that the ripest period seems to be 30-40. Larry Ellison was 33 when he founded what became Oracle. Bardeen was 39 and Brattain 45 when they invented the transistor. Bill Hewlett and David Packard were 26 and 27 when they founded HP, but the company achieved its real successes during the war when they were in their 30s.

The reason I say it's counter-productive is that it tends to upend a very fruitful social structure: younger people learning how to innovate under the direction of older, experienced people. Ken Thompson and Dennis Ritchie didn't invent Unix one day between their college classes. They joined Bell Labs after getting advanced degrees, worked on a system (Multics) implemented by older, more experienced people, and gained the expertise they needed to innovate. You can't really develop expertise as a young kid, and expertise is usually a prerequisite for real innovation.

And to tie in to the organizational management angle in the article: who would you rather have in your organization? John Carmack circa 1991 (when he co-founded id Software), or John Carmack circa 2013?

If this seems counter-intuitive in the context of the current Silicon Valley youth worship, ask yourself: what are the young kids at Twitter, Facebook, etc., really building? The answer is: lifestyle and entertainment products. Without demeaning the value of those products, it's not a contentious assertion that young people have significant insight into lifestyle and entertainment as an industry, but that doesn't mean they're particularly innovative.


This is also true for women - in my life many of my friends who are over 40 are getting their PhDs and doing incredible research. Also see composers and painters: so many of the great works in history were done by older people. Perhaps because I studied classics, I always assumed that I would only start doing my 'real' work after the age of 40, and I'm personally really enjoying building my career with the next 30-40 years in mind.


"expertise is usually a pre-requisite for real innovation". May Be. But many a times, being naive (and inexperienced) may make one to see and think differently.


You rarely get serious innovation out of people who are naive and simply "think differently." As pointed out in the article, while Dirac and Einstein were outliers and making substantial contributions to science at 26, they had both pursued extensive education leading up to that point.

Are there people who innovate out of whole cloth? Maybe. I can't think of any offhand but I'm sure they exist. But they're not representative. Representative are people like Larry Ellison, who built an innovative company like Oracle (it was cutting edge in its time) by leveraging substantial industry experience and theoretical developments.


Cutting edge can be mildly innovative. Facebook is somewhat innovative on my scale; as far as I know they came up with the timeline and tagging people. But for me the really innovative ones are Kickstarter, Twitter, StackOverflow. Companies that went blue ocean and literally created a new thing or changed culture in a significant way, because of a fundamental idea, and not simply critical mass (like Facebook).


StackOverflow was a slightly better implementation of an old idea. GPS was innovative and game changing.


And GPS was based on work funded by the DoD - an organization not known to be a worshipper of youth, at least in their R&D corners.


"Representative" is a dangerous term to use when you're already talking about distant outliers.


Making lots of money is not the same thing as innovation. Apple started as a somewhat naive computer kit company whose sales took off, but was far from a stable company. NeXT and Pixar were both more innovative, not to mention how much better run Apple was after Jobs's return.


I've read some articles in the past that argue the people who see and think differently (and create something new and great) are the people who enter the field with a background in another typically unrelated field. Like, I don't know, professional fly fishing and network security.


Or it might cause you to see the same things everyone before you saw, and not know the difference.


At the age of 25 I was a long-haired pot smoking rocker dude who drank Budweiser and played Sega Saturn when I wasn't working night shift at the factory.

I didn't hit my stride until the age of 35, when I self-published my first book after four agonizing years of writing it. I'm 42 now and kicking out better, more innovative stuff than ever before. My latest project is INDOCTRINATOR: http://indoctrinator.com but I've developed other successful products.

So yeah, as my own experience would suggest, people do get more creative with age. It has to do with mastery of skills, an accumulation of knowledge, and the alchemist's ability to convert these elements into amazing things to offer the world.


> My latest project is INDOCTRINATOR: http://indoctrinator.com but I've developed other successful products.

"INDOCTRINATOR is an email program that reveals the secrets to becoming obsessed over anything, even something you hate."

Pass.


Very poorly argued points. There is no case made that innovators actually get better with age, nor indeed a decent definition of an innovator. Apparently, best-selling authors and directors are the best example of innovators that the author could come up with. Skip.


I don't know if they're poorly argued. I found the Nobel Prize statistics and the study from the Northwestern professor fairly compelling myself. (Link from the article: http://www.kellogg.northwestern.edu/faculty/jones-ben/htm/Ag...).


But, as pointed out by others, PhD/academia dynamics are heavily skewed by the fact that you have to do a PhD, then work as a postdoc for ages, before you get to run your own experiments. If you don't let anyone do anything but work on other people's experiments until they're 35, of course you won't get many great innovators below 30.

If anything, the fact that the average age of academic innovators is 38 rather than 60 means that there is a large preponderance of innovations in those few years between 30 and 38, to balance out the fact that most academics in a position to innovate are older than 38.

However, that's still using a fairly loose definition of innovator... arguably, people rewarded by Nobel prizes are not innovators, they're inventors. They've come up with something wholly new, and rightly deserve much credit for it. Innovators, however, are often those who take something new invented by someone else and actually bring it to the wider world.

The classic example of an innovator is, of course, Steve Jobs - he didn't invent any of the stuff Apple is known for, whether mp3 players, smartphones, tablets, or even the original Apple II (invented by Woz), but damn was he good at getting it market-ready and then getting people to know about it.


I trust you will learn with age not to make dismissive claims about an article you didn't finish reading.


I read the whole article. My conclusion was that it was weak. But thank you for the condescending comment.


Sorry. When I read the word "skip" I couldn't resist making the condescending older guy joke. I think the tension between the generations is good for us and keeps us on our toes. We could both learn a thing or two from each other.


"The directors of the five top-grossing films of 2012 are all in their 40s or 50s" - What does this prove? Are they innovators? "But there is another reason to keep innovators around longer: the time it takes between the birth of an idea and when its implications are broadly understood and acted upon. This education process is typically driven by the innovators themselves." - I don't fully agree with this. 'typically' is a broad word.


Well, that in particular proves that very, very few people are going to be trusted with a project that has a $100 or $200 million budget when they're 25 and have no track record. Those are the movies that top the box office charts.

I dunno that you can draw much of a conclusion from that.


I agree with the fact that producers will not invest $100/$200 mn in a 25-year-old. That's the nature of the business. No connection with innovation.


Doesn't every human activity with a large dependence on knowledge and experience get better with age?


This raises the question of whether doing meaningful work in a field will take more time, given that it takes a decade or two to get up to speed with current research.


The premise of the article is simple. Companies may lose money in the long run by cutting older workers and hiring younger workers in their place to save money. There's an obvious case to be made for the experience and wisdom that come with age for some older workers. A blanket policy of bias towards the young is going to lose that. Companies should be smarter about it and realize that some older workers do get better with age. They should try to hire boy wonders and keep some wise heads around. The author mentions a problem without naming it, which is the Peter Principle, where workers are promoted until they reach a level where they are no longer effective.


I like the theme, but indeed this article misses out on so many opportunities to make a good case. The innovators it is picturing are the "corporate innovators", which is different from the young kids disrupting markets altogether. The former are evolutionary; the latter really shine when they're revolutionary, which often needs a new company to take over instead of a big corp reinventing itself.

But both are great drivers of human progress. If we only developed by leaps and bounds, revolutions, we'd be hard pressed to advance at all. I like the concept of slow hunches. These are the ideas that you breed in your head over years, decades. They often need to meet other slow hunches other people have been breeding to really shine. This is a slow innovation that startup culture completely misses out on. It's the foundation of scientific research, but it's also very much directly (albeit slowly) applicable to business. I have a few slow hunches of my own, which have been evolving over the last 5 years (I'm 23). They have spawned little ideas and projects already, but the main branches keep pivoting and growing because they're far from concrete enough yet to be market-tested or MVP-built.

I feel my best innovation comes about from deliberate mixing of areas of knowledge. And it seems to me every week I have a new interest. I want to understand painting, poetry, design, writing, statistics, politics and a lot more. There's all this breadth I don't yet have, and I feel that's what makes good innovators, they're generalists, and criss-cross the DNA of different areas to create new mutations all the time. Most suck. In this sense, I'll be so much better at 50.

It's also why I don't see myself calling software development my career in 20 years. I feel like I want to build a career that ages well, and though surely I'll be a better developer at 40, many market dynamics will be playing against me in the field of tech. If it's even relevant anymore in 2030. Maybe robot code-monkeys will do it, at least CRUD and interface design, much better.

I want to be a writer. I'm using article writing as a platform for all my expression and creativity. Want to understand something better? Write as best as I can about it, then edit, cut, edit. Like when it's said that you should always write all software as if it were open source (commenting, modularity, extensibility, docs), I write my thoughts as if they were published. I want to hone the craft, and eventually, as the decades pass, have a respected career for writing insightful articles where I wouldn't for writing old-man's code (though I'm pretty sure I'll actually pay my bills with software still).


I have a whole bunch of these slow-moving projects - it's my modus operandi really. A recurring idea surfaces during mental downtime, is improved on, then dismissed. Repeat this until it has taken on a life of its own, then sit down and plough through the whole implementation stage in one sitting. It's hard to quantify or even explain properly, but it works for me. So for a certain type of creative endeavour, yes, more time = more potential for innovation.


One problem: What is innovation?

Some innovation can and does happen from a position of almost complete ignorance. Other innovation requires years of study, domain experience and the benefits of a multi-disciplinary background. These are vastly different things. The former could be the domain of the younger crowd. The latter, almost by definition, belongs to those with more candles on their cake.

Nothing wrong with either scenario.


Amazingly content-free article. Disappointing read.


I recently read another article on HN that said innovation got worse with age and responsibilities like kids, etc. Does anyone remember it?

I think it depends on what industry you are in, but I have seen old guys build bad ideas too.


The word innovator might be ill-suited to support the argument that mastery generally takes time.

For example, physicists and mathematicians do their best work in their early twenties. Arguably they get worse with age.


I was curious as to whether this was true, so to test it out, I looked at Hilbert's Problems[0], which are widely considered some of the most important problems in Mathematics in the 20th century. Of the solvers whose ages I could find on Wikipedia, the median age was 30.5 and the mean was 29.4. (For Hilbert's 10th problem the solution is listed as the joint work of 4 people. I treated them as one person with an average age of 42.5)
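For anyone who wants to repeat the exercise, here's a rough sketch in Ruby of the calculation described above. The ages in the arrays are placeholders, not the actual solver ages pulled from Wikipedia:

    # Sketch of the age calculation; placeholder values, not the real data.
    individual_ages = [28, 30, 31, 34]   # one entry per solver with a known age
    joint_team_ages = [35, 40, 45, 50]   # the four joint solvers of the 10th problem

    # Treat the joint solution as a single "person" at the team's average age.
    ages = individual_ages + [joint_team_ages.sum / joint_team_ages.size.to_f]

    mean   = ages.sum / ages.size.to_f
    sorted = ages.sort
    median = if sorted.size.odd?
               sorted[sorted.size / 2]
             else
               (sorted[sorted.size / 2 - 1] + sorted[sorted.size / 2]) / 2.0
             end

    puts "mean: #{mean.round(1)}, median: #{median}"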

This obviously isn't statistically significant, but it lends some weight to the myth that Mathematicians "die young."


This is great research, thank you.


You see the effect in math because math requires the sharpest, most complex pure thinking. Other major creative efforts (business, movies) involve collaboration and social/organizational structures that benefit from years of relationship-building and reputation-building. Or years of drudgery, if your innovation is something like the world's first dictionary.


These articles remind me of the Malcolm Gladwell book (Outliers?) asking whether this is all just genius or right idea, right time.


I totally agree, I am 37 now and within the next two years will change programming and math forever. :-)


As a 32-year-old trying to break into Silicon Valley and feeling the strong effects of subtle (and not so subtle!) age discrimination, I found the headline and the hypothesis encouraging and really wish I could get behind it. However, the evidence in this piece is lacking and unconvincing.

He cherry-picked a few examples of industries where people are required to pay their dues before the system allows them to make a contribution. In science one must earn a PhD as an ante into the game. Then they must earn a reputation and tenure before they are allowed to fully devote themselves to making major breakthroughs with their research. Early to mid-30s is roughly the age when one would be afforded that luxury, for the few who make it that far, so it makes perfect sense that people would make their discoveries at 38.

With film it's a similar story. Directors must first go to film school, then fetch coffee for directors, then work their way up the ranks on other people's projects and then catch a series of extraordinary breaks before they are given the opportunity to direct other people's ideas before they are finally given the freedom to truly do what they want. If they achieve that freedom by their 50s, they are one of the chosen few.

Being an author works much the same way.

Anybody with a text editor can write code that changes the world. So it's not the same. Does that mean young people are better at hacking than old people? No, not automatically. Nothing can be proven from all of these examples other than the relative barriers to entry in a given field.


> With film it's a similar story. Directors must first go to film school, then fetch coffee for directors, then work their way up the ranks on other people's projects and then catch a series of extraordinary breaks before they are given the opportunity to direct other people's ideas before they are finally given the freedom to truly do what they want. If they achieve that freedom by their 50s, they are one of the chosen few.

LOL no. I have worked in film for the last decade and I never went to film school. If you have talent and you're not an asshole, then it's one industry where you can pretty much write your own ticket just as fast as you can learn your craft. It helps, a lot, if you have or know people with money. But that's true of just about anything.

> Being an author works much the same way.

To be an author you just start writing. You don't need to get a degree in English or a blessing from a senior writer.

You are suffering from a bad case of a disease I am sometimes afflicted with, which is permission syndrome. You don't need other people's approval to be innovative. I find it particularly odd that you invoke film as an example, because I sometimes think that if I had gotten into that field when I was younger and more brash (instead of balancing it with family considerations and some other factors), I'd probably have gone farther and faster.


I might just be missing something, but I don't really see the distinction between film, writing, and coding. Sure, there are power structures which expect certain things. But you can also ignore those power structures, and many people do. You can put code on GitHub, or produce indie films. What's interesting is that even in that case people tend to be older in a number of fields. The film Primer, which was made on a $10,000 budget with no studio support, and is probably the best film about time travel in existence, was made by Shane Carruth when he was 32, and not because he spent a decade going to film school or paying any other kinds of film-dues before that.

With novels there is even less of a formal dues-paying apparatus. Nobody cares what degrees you have, and there is no way to move up in the ranks by spending 10 years making coffee for someone. Yet there are not many successful novelists in their 20s. Same with programming: you could be successful in your 20s, it's just uncommon.


You think people getting their PhDs in physics are just anteing up to the game? The fact of the matter is that innovation is cumulative, and it takes time and education to get up to speed on what's been done so you can incrementally improve it. That's where innovation comes from. Not some kid with a text editor.

Look at it another way: what game-changing algorithms are coming from kids with no experience and just a text editor?


How much of the game-changing behaviour in technology is coming from algorithms? To be honest, I think the biggest changes are coming from new data, and new social systems over which to build technologies.


I don't understand how being an author would work the same way. It's such a solitary pursuit... I don't understand how that would be dependent on fetching coffee, kissing rings, etc.


You really feel a strong sense of age discrimination?

I feel it has to do with relevancy more than anything. I'm 35, a Rails consultant with several open source gems, and a speaker at the upcoming RailsConf.

I don't say this to toot my own horn; I say this because my extensive software engineering background enhances my interactions with the latest tool sets. I've seen a ton of projects go incredibly right and wrong and want to share those opinions and offer guidance. Those hiring me (and my boss) are generally younger, but my feeling is that they actually look up to me.


>>my extensive software engineering background

Obviously that trumps everything. When I said I was looking to break in, I meant I have limited experience and skills.


I understand now.

My best advice: create something open source. If front end is your strength, create a few sample layouts and provide the HTML/CSS. If JavaScript is your thing, create a nice jQuery widget that others could utilize. If Ruby is your thing, create a gem (a rough sketch of a minimal one is below).

Once you've done something like that, be sure to promote it. Post it to RubyFlow, the appropriate subreddit (reddit.com/r/ruby, /r/javascript).

Get a few "stars" for the library to show that others are interested in what you do. Lastly, be sure to include it on your resume. Top of your resume. Front and center.


That's really helpful actually. Sounds like fun too, I have a few ideas already. Thanks!




