I'm happy for anyone who is productive past normal retirement age, but it's important to be aware of what happens to our minds as we get older. I think the examples in the comments are exceptions that prove the rule. The reason for the dearth of older hackers is the same reason there are few people running marathons at age 60 or 70: As we age, our bodies and minds degrade. Exercise, nutrition, and (probably) drugs can slow the decline, but we don't yet have the technology to turn back the clock.
The most depressing graph I've seen is figure 1[1] in Images of the Cognitive Brain Across Age and Culture[2]. It shows how our cognitive abilities decline soon after we reach maturity. Starting in our 20s, we lose about 6 IQ points per decade; more in our 70s and 80s. That means someone in the top 1% in high school (IQ 135) would be down to average intelligence by the time they were in their 80s.
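To make that arithmetic concrete, here's a back-of-the-envelope sketch in Python (a toy linearization of my own; the actual curve in the figure steepens in the 70s and 80s):

    # Toy linear model of the decline: ~6 IQ points per decade from age 20.
    # The constant rate is a rough average, not the paper's actual model.
    def projected_iq(iq_at_20, age, points_per_decade=6):
        decades = max(0, (age - 20) / 10)
        return iq_at_20 - points_per_decade * decades

    for age in (20, 40, 60, 80):
        print(age, projected_iq(135, age))
    # 20 -> 135, 40 -> 123, 60 -> 111, 80 -> 99: roughly average, as above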
On the bright side, the decline in raw cognitive horsepower is offset by gains in knowledge. In fact, knowledge more than offsets it in most disciplines. Our productivity usually peaks in our 40's and declines much more slowly than one would expect[3].
Still, if you want to keep building cool stuff when you're older, it's important to prepare now. The best thing you can do is stay healthy and active. To return to the marathon analogy: A 55-year-old might not set a world record, but with the right training, nutrition, and possibly performance-enhancing drugs, they can beat >95% of people half their age.
Finally, to everyone mentioned in this thread: Well done! I hope to follow your example.
1. Time just seems to move sooo fast. When I was 6, 30 minutes seemed like forever. Now, as I close in on 40, 30 minutes feels like a handful of breaths. I can easily spend 2 hours on something and feel like I made no progress because the time felt so short. Even into my 20s, I felt like I could crank out tons of things hour after hour. Now the work of a week feels like the work of a day.
2. I think this happens because I take in more space-time at once than I used to. Hyper-small details I used to obsess over now blend into an entire scene. I'm gulping space-time rather than sipping it. I think it's because I have much more experience and knowledge than I used to, so I just automatically filter out most things. "Bigger picture" isn't just a phrase to me anymore. I've largely stopped thinking in hyper-local ways and started thinking more strategically: in terms of systems rather than components, in terms of aggregate behaviors rather than individual behaviors, and so on.
I have to consciously focus my attention down onto small, local concerns when it used to just happen. When thinking about business ideas, I don't think as much about smaller concerns like the technology stack, but about where I can take the entire idea over the next 5-10 years. Ideas grow like trees in my mind, stretching out over a decade without much effort, but looking at an individual leaf (which used to be easy) is exhausting.
It's given me a lot more understanding of what my parents are going through as they age, things I never really understood as a precocious child but that make perfect sense now. I don't know exactly what they're experiencing, but now that I roughly understand the trajectory of my own mind and thinking, I can kind of see how they're arriving at where they are.
With regard to point #2, do you feel the same if you put yourself into some completely novel situation? E.g., say you've never been diving and you go diving for half an hour: does it pass by in the blink of an eye, or does it take as long as you once remembered?
Is your perception of time like this because you're just so used to most things you do in your daily life that you don't notice the details, or is it some innate change in the brain as we get older?
It's just personal observations, pop-psychology...so YMMV.
But I like to travel overseas. I find that the stranger the environment, the more enjoyable the trip. I think it's because the details, the things I can't readily filter out, all come flooding in again.
A week in Seoul is a totally different kind of experience for me than a week in NYC, even though they have lots of superficial similarities.
But I also notice that my brain seems to spend more time subconsciously analogizing what I'm seeing rather than learning new things from whole cloth like when I was young. It's like the filtering mechanism is working in overdrive and knows it can't just toss things away (it can't make a value judgment yet), so its first pass is to look for similarities so it can start to make those judgments.
"So this kind of place is like a department store, but also like a fleamarket..." my inner monologue goes.
After a few days in a new country, I'm usually just mentally exhausted and look for some familiarity, something I don't have to work for.
After a week or two, I'm usually comfortable enough in a place that this feeling goes away and everything starts to look "normal" again, meaning my filter is locally tuned and working at normal efficiency.
Just to add to your second point, one of the reasons that time moves faster is that, in a sense, it is moving faster. When you're 10, a whole year increases your life by 10%. That's a huge percentage of your life. By the time you're 60, that year is a much, much smaller percentage. Thinking of it like that, it's not surprising that 10% of your life seems longer than ~1%.
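To put rough numbers on that intuition (a toy model of my own, assuming a year's subjective weight scales with its share of your life so far):

    # Toy model: a year's subjective weight ~ the fraction of your life it adds.
    def year_share(age):
        return 1 / age

    for age in (10, 20, 40, 60):
        print(f"age {age}: one year = {year_share(age):.1%} of life so far")
    # age 10: 10.0% ... age 60: 1.7%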
I can't relate to this. The fact that you have lived for 23 rather than 10 years might make a single year feel less significant in retrospect, but there is no obvious connection to how you feel about that year when it is actually happening.
But the same thing applies to days, or even hours. While every microsecond is objectively the same length, each second is, in retrospect, a slightly smaller fraction of your life than the one before it, and so feels like a faster second. The same goes for each hour, each day, and so on.
Thinking more about this: if you're running a marathon, you're concerned not with how far you've come but with how far is left. Such a view of life isn't likely to be very productive. Still, I think my point holds for your marathon example: 1 meter after the start is very different from 2 meters after the start (or, say, 50 meters vs. 100), but the difference between 1000m and 1001m (or even 1000m and 1050m) isn't really perceptible without some kind of aid.
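Here's a quick sketch of that, assuming (as Weber's law roughly says) that a difference only registers when it's a noticeable fraction of the distance already covered:

    # Relative difference between two marks on the course.
    def relative_diff(a, b):
        return abs(b - a) / a

    for a, b in [(1, 2), (50, 100), (1000, 1001), (1000, 1050)]:
        print(f"{a}m vs {b}m: {relative_diff(a, b):.1%}")
    # 1m vs 2m: 100.0%, but 1000m vs 1001m: 0.1%, far below anything
    # you'd notice without a marker on the course.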
"The reason for the dearth of older hackers is the same reason there are few people running marathons at age 60 or 70: As we age, our bodies and minds degrade."
That is an overgeneralization that could lead to needlessly depressing conclusions. Research is pointing in directions that are much more encouraging and that vary tremendously from field to field. If you are interested in this subject, you'll definitely want to see the summary at the following link, which, among other things, says that the output of scientists appears to peak in their 40's and decline only in their 70's (assuming, I would guess, that they choose to continue pursuing their vocation):
0. bit.ly/1kzXKIz
To help maintain your brain power as you age, research suggests that the best thing you can do is, perhaps counterintuitively, physical exercise.
There are a couple of issues though. One is that this decline coincides with a decline of challenges in most people's lives. That's a cultural thing and doesn't have to be that way for every individual.
The other one is the definition of intelligence. Take the speed component for instance. It greatly affects the IQ score, but does it equally affect our ability to come up with interesting hacks?
In other words, average IQ score does not equal individual problem solving ability.
Unfortunately, it's pretty clear that we get dumber, not just slower. If you look at the first paper or the graph I linked to, you'll see that the researchers measured many aspects of cognition. Processing speed, working memory, and long-term memory all decline at similar rates. Only vocabulary stays constant or improves. Imaging also shows that our brains shrink as we age.
Another good bit of evidence is the third citation: On Age and Achievement. It shows a peak in our 40's, followed by slow decline. In that paper, they find the best model of this curve uses two factors: cognitive ability and knowledge, with the former decreasing and the latter increasing. Different professions have peaks earlier or later depending on how much they favor knowledge vs. fluid intelligence.
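As a toy illustration of that two-factor model (the curve shapes and constants below are my own illustrative guesses, not the paper's fitted parameters), the product of a slowly declining fluid ability and a saturating stock of knowledge naturally peaks in mid-career:

    import math

    # Toy two-factor model: productivity = fluid ability x accumulated knowledge.
    def ability(age):
        # fluid ability: declines by 1% of its peak per year after age 20
        return max(0.0, 1.0 - 0.01 * (age - 20))

    def knowledge(age):
        # accumulated knowledge: grows after 20 with diminishing returns
        return 1.0 - math.exp(-0.08 * (age - 20))

    peak_age = max(range(20, 81), key=lambda a: ability(a) * knowledge(a))
    print(peak_age)  # 44 with these constants: a peak in the mid-40s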
This is exactly the sort of evidence you'd expect to see if our brains slowly degraded like the rest of our bodies. I wish it were otherwise, but wishing doesn't make it so.
Again, how are standard components of IQ tests related to hacking ability? Are speed and retention of random items the only or even the most important variables?
How much of the average declines documented in these studies is related to cultural factors (people are allowed to stop using their brains as they get older) as opposed to individual potential?
I don't doubt for a second that certain cognitive functions degrade with age, but we don't know by how much they degrade in those who keep using their brains.
And we don't know how the things that do get better with age, like discerning patterns that only emerge after looking at a lot of instances, affect our ability to solve complex problems.
It may be the case that the slowing down isn't just offset by gains in knowledge, but caused by it. That has some exciting implications for the quality of results from older hackers!
"A series of simulations show how the performance patterns observed across adulthood emerge naturally in learning models as they acquire knowledge."
Actually, that's one area I don't know much about. Since we can't yet easily change our genes, I haven't looked into it much. I'd bet they play a big role. For example, people probably express (or are sensitive to) different amounts of telomerase and BDNF.
How do you account for criticisms of IQ? I think anecdotes are permissible here since we are looking for counter-examples: I know people who scored in the 110s and 120s on IQ tests in school and who are now, in their 30s, very prolific researchers. Conversely, I also know a few people who scored very high on school IQ tests (140+) who haven't amounted to much, professionally, later in life.
Other counter-examples to the general trend of 'age-related mental deterioration' include several greats in creative fields (e.g. Paul Cezanne, Robert Frost, and Virginia Woolf) [1] whose best work came later in life. All said and done, it is hard to quantify success and definitively relate quantifiable functions of the brain to 'success' and 'creativity'. An extreme example in the realm of pop-psychology is Maurice Ravel, whose most famous work 'Bolero' is thought to have been the result of frontotemporal dementia. Strictly speaking, Ravel shouldn't have amounted to much after his brain started deteriorating, but the dementia directly underlies the repetitive rhythms that make 'Bolero' a creative masterpiece.
While physical exercise (and a few mental 'exercises' such as bilingualism) has been shown to be supremely important in stemming age-related mental decline, I find your comment to be a bit too pessimistic.
This brings up an interesting question: how do statistical results relate to self-actualization and motivation? When presented with statistics like those in your comment, one can either give up on making lifestyle changes, resigned to 'inevitable aging', or look at them as a motivating factor to remain an exception.
The authors of your reference 2 state this in the conclusions: "Importantly, these findings also suggest that neurobiological aging does not always lead to neurocognitive decline in a uniform manner, and that external experiences can modulate and perhaps alleviate some of the neural effects of aging in the brain."
[1] Bruce A. Weinberg & David W. Galenson, 2005. "Creative Careers: The Life Cycles of Nobel Laureates in Economics," NBER Working Papers 11799, National Bureau of Economic Research, Inc. http://www.nber.org/papers/w11799.pdf?new_window=1
The use of IQ is usually criticized in the context of comparing one person to another. What we're talking about here is the effect for a fixed individual. If all else is equal, is higher IQ more advantageous? Probably.
This is one of the best comments I've read on HN, and I really appreciate you writing it. You are, however, using "exception that proves the rule" incorrectly: the phrase refers to an exception that implies a rule in the cases not excepted (a sign saying "free parking on Sundays" proves parking normally costs money), not to counterexamples that somehow confirm a generalization.
> were these healthy people or were the sick averaged in?
Unless you have a magical cure-all and were wondering whether to bother taking it, that's not really relevant. You're going to get sick as you get older.
1. https://lh4.googleusercontent.com/_gxYAfFM1cj0/S6hXmZ4qtjI/A...
2. http://cdn.intechopen.com/pdfs-wm/36842.pdf
3. http://resources.emartin.net/blog/docs/AgeAchievement.pdf