When I was young, I often wondered why adults seemed so "stupid" when it came to learning things, and I do think this is a big part of it. As we get older, it's necessary to make a conscious effort to pay attention to details of our environment, and to be acutely aware of the great uncertainty found all around us, even in everyday life. This is the difference between a person who becomes a "stupid" adult, and one whose mind is still sharp as a razor at age 70.
I have never met a razor-sharp person at 70 who isn't showing signs of reduced learning in new situations. They learn new things faster than most people their age, but still more slowly than younger people.
What they have is a large store of crystallized intelligence (the Gc factor) that they can apply successfully to learning new things. This usually makes them great people to learn from and be around.
I suspect that if we didn't expire, the gravity of this accumulated cognitive perspective – habits of thinking, thought patterns, overfitting – would become an intellectual disability even for the best of us.
One example I've observed of this consolidation process is that I'm far more prone to accidentally using homophones of a word than I used to be. I know the difference if I'm focusing on it, but while typing I'll accidentally use e.g. 'there' instead of 'they're'. When I was 16 I never, ever made this kind of mix-up.
We are "designed" to exist only for a limited time. The patterns are everywhere, including in the way the brain works.
You think you see reasons why an overly long-lived human would encounter difficulties because of limitations you perceive, but your observation set is so limited that your conjectures must be treated as conjectures, not as solid understanding!
The limitations on learning we observe in older humans could easily be due to a huge number of factors, almost all of which are extremely solvable. The one mentioned here - that crystallized intelligence, an advantage in some ways, becomes a liability in this case - is not immediately solvable in my view.
But when I discuss radically increased lifespans for humans - and I discuss it often, because I want to see it happen and am already participating in a startup whose success I believe will hasten it - I often hear people giving reasons much like yours for why my efforts are not worthwhile. Honestly, seriously, people are using justifications like yours to avoid investing in efforts like mine, and it's maddening to hear such an unsure, ephemeral hypothesis - and I honestly regard the conjectures you've made above to be among the most uncertain hypotheses in the history of humankind - used as a reason to deprive concrete efforts of concrete investment.
Absolutely maddening, especially because I don't even necessarily disagree, but if the actual pattern for humans is that after, say, 1,000 or 100,000 years, we've got too much cruft as a part of us to continue, we'll never know unless we investigate it, and we can't investigate it if everyone just blindly trusts in silly design pattern hypotheses!!!
I am 100% for research into radical life extension. I am 100% pissed off by those who oppose (or are indifferent to) this research for whatever reason. I totally hear ya.
But I am also a realist. We have been optimized by evolution for spreading the more successful genetic sequences, nothing more. Once the machine has served that purpose, its subsequent fate is inconsequential. That's been the high level trajectory of the blind optimization process so far, for billions of years.
The human brain is absolutely optimized to peak early and struggle mightily at the game of Spread Your Genes. It's chock-full of subroutines that kick in and help it play the game and hopefully win. Once that's been statistically achieved (by the end of your 20s, according to our default genetic programming), the same subroutines sort of don't care about it anymore and slowly begin to fade out.
That's not to say this is acceptable, or that we should just lie down and take it. I've very little doubt that a solution could be engineered to stop and eventually reverse the aging process. But we're unlikely to succeed if we don't acknowledge the challenges that litter the path to this goal.
We forget. A lot. As a normal course of thinking. And that's OK.
Old habits (good and bad) do die, with a bit of effort, if they aren't being constantly reinforced. If you make it a point to continually refine and improve how you think (trying to disabuse yourself of invalid ideas), then you are setting yourself up for a better future.
I agree that, in keeping with all efficient system structures, we seem to be built so that all of our systems fall in a heap at roughly the same time.
Likewise, as the economy grows, the space between "dots" goes up, resulting in more fragile systems. We have to innovate things into the economy to make those less fragile. Parts go obsolete and the supply chain weakens, year by year.
It'd be interesting to know whether or not actual formal training in uncertainty changed all this. By "uncertainty", I mean the term as it is used in information theory.
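For concreteness, "uncertainty" in the information-theoretic sense is Shannon entropy. A minimal sketch (my own example, not part of the comment above):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the information-theoretic measure of
    uncertainty over a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit); a heavily loaded coin much less so.
assert entropy([0.5, 0.5]) == 1.0
assert entropy([0.99, 0.01]) < 0.1
```

The point of formal training would presumably be to get people comfortable quantifying how unsure they are, rather than collapsing everything to "I know" or "I don't know".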
As I watch my age cohort... age, I think it's more that people just get bored with it all and lose interest in favor of other things. It's easier for me to think that becoming established just leads to complacency, and I for one don't wanna do that.
There seems to just be more awareness since you are forcing yourself to learn to deal with new environments.
I wonder if there's been any kind of study on frequency of moving or travel on the brain.
Have you ever noticed how sometimes when you read a really interesting book, and you learn some really key insights about the world, suddenly, you get a burst of new ideas? As though a flood of neurotransmitters has just been released?
Maybe that has health effects related to dementia.
I'm wondering if that relates to your anecdote, and how much traveling vs. learning languages helps.
This means most adults will never ask the simple questions, and will pretend to understand something complex quickly or to know it already. It's a dangerous mindset to develop, but it's difficult not to.
As most on here are programmers how many times have you been in a meeting or technical discussion where you are judged for asking a "simple" question?
We need to stop the judgement of those that ask "simple" questions.
I grew up with people locked into this mindset and it completely fucked me. Still trying to get over the knee jerk reactions I get to certain pieces of info, and getting over using hyperbole all the time. Have been slowly unravelling the politics that come with that train of thought.
I think part of what helped was getting interested in something completely unrelated. I was pretty much just a programmer, but I've been getting into drawing and other art, and it's given me that sense of curiosity back and let me learn new things even about programming topics.
This is also my theory on why "time flies by when you get older". We lack original experiences, so one day melds into the next. When each day is an adventure, of course it seems long.
One year is 1% of a hundred-year-old's life; one day is 50% of a two-day-old baby's.
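That proportionality is easy to make concrete. A tiny sketch of the idea (the function name is my own, purely for illustration): a fixed interval "feels" shorter as it becomes a smaller fraction of the life lived so far.

```python
def felt_fraction(interval_days, age_days):
    """Fraction of total lived time that an interval represents."""
    return interval_days / age_days

# One day to a two-day-old is half of everything they've experienced...
assert felt_fraction(1, 2) == 0.5
# ...while one year to a hundred-year-old is about 1% of their life.
assert abs(felt_fraction(365, 100 * 365) - 0.01) < 1e-9
```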
I also think adults have fewer major bookend-type experiences. As a kid I could bookend every year with summers off and out of school. College was similar, but I rarely took summers off, so a lot of college is melded together. As an adult, there are fewer natural bookends. Marriage, kids, and a new job are obvious ones, but they don't come every year, and not everyone has them. I try to take two large vacations each year to new places. Doing this has slowed time down.
See, that's just why you have to put effort into making sure you experience new and original things in life over time. If you make yourself do stuff, you can look back on even weekends and think, "Wow, a whole lot of time passed then!"
In fact, let's see...
* Friday night, went out to a new pub with fiancee. Great dinner and beer.
* Saturday, particularly the afternoon: did some last-minute costume shopping with visiting friends, watched Futurama together in the evening, a proper New England thunderstorm finally hit our area for the first time in the summer.
* Sunday: went swimming at the pool in the gym for the first time, fiancee was sad in the evening.
* Yesterday: worked all day, had more responsibility dumped on me, went home and got incredibly frustrated with my side-project's precision numbers going down rather than up. Eventually derped around on the internet for the evening, then realized I was measuring the precision of the whole joint distribution rather than of single observables.
* Today: talked to Chabadnik friend from grad-school, tested out measuring the precision of single features, at work now. Going to go swimming again tonight for exercise.
I agree: do things. Let's see how you get on with a family :-)
Older people are more often fulfilling the above premises, and thereby the above conclusion.
This means: Worse learning performance is not a consequence of age per se.
The actual cause of this high-level change could be lower-level age-induced differences in neurotransmitter levels or in neuronal function and connectivity. As the article's summary points out, the actual biological basis needs to be investigated in further research.
It's also important to note that this study found a plausible explanation for changes in learning performance under specific circumstances. This is a very important qualification.
Learning in general is believed to involve many different interrelated mechanisms. It's well established that many different cognitive abilities (perhaps most importantly working memory) diminish with age, and some of this degradation is likely related to natural cerebral shrinkage (a healthy 75-year-old has a 15% smaller brain than a 25-year-old).
> Why do older adults fail to represent sufficient levels of uncertainty? One possibility is that representing appropriate levels of uncertainty requires a cognitive and/or biological resource that decays across healthy aging. One obvious candidate for such a resource is working memory capacity. ...
> Another possibility is that older adults fail to represent sufficient levels of uncertainty because they have an aversion to uncertainty or the mental effort required to represent it. ...
> The crucial factor limiting uncertainty representations in older adults could, and at some level must be, biological in nature. One candidate for such a limiting factor is norepinephrine ...
> Although it is tempting to link age-related changes in representing uncertainty to reduction of a single neurotransmitter, several alternative biological accounts exist. ...
1. I think I know nothing.
2. I try to learn hard to get better.
3. goto 1.
One thing I wonder, though, is how they can ever adjust for the strong possibility that older adults just can't care about a make-believe test the way younger people might.
My observation, as I age, is I have a harder and harder time getting interested in hypothetical scenarios, or taking on someone else's agenda as being super important.
So I really wonder if older adults just test worse than younger people, simply because they care less about the test.
If that's true, the effect wouldn't hold in the real world. Except that older people can still seem less smart, when in reality they just think whatever you're trying to get them to learn is not important in the grand scheme of things.
But I am also sure that maintaining novelty in my life, whether through education or travel, is key to maintaining healthy schemas in my mind to help understand the world.
From my experience teaching older people when I was younger, and myself no longer being 20 anymore, the challenge is figuring out which assumptions to question.
Let's say I'm reading a math proof, and don't really understand a statement. If I pretend I understand it well enough and keep going it might make sense, or maybe I'll get further lost. I feel like I had a better sense both of how not to get bogged down, and when to slow down when I was younger.
The property of neoteny - the retention of juvenile features in adults - is one of the things that humans are particularly noted for, and the capacity to learn is one of those features. So we could well still be evolving the ability to learn later in life.
The other unknowable here is environment. It is a lot easier to learn things via the internet than it was even 10 years ago, and it is also a lot more important to keep learning in many jobs. So the really interesting experiment will be to take today's 70-year-olds, who had to be content with evening classes if they could find the time to learn new skills, and compare them with today's 20-year-olds, in 50 years' time.
In my peer group at least, the difference between those who never opened a book after they left school (a sad, but surprisingly large number of people even with university educations), and those who kept reading, is huge - which also supports the findings.
edit again - i'm a little slow today.. you're right, it might just be an accidental side-effect, sure.
"The problem with the world is that the intelligent people are full of doubts, while the stupid ones are full of confidence". (Who said Java?)
The same notion is related to the concept of "the beginner's mind" popularized by D. T. Suzuki and S. Jobs. Packers call it "thinking out of the box". J. Krishnamurti calls it 'freedom from the "known"' (people call all kinds of nonsense "knowledge").
Children learn so quickly and efficiently because they are not habitually pattern-matching against personal experiences, cultural conditioning, and popular memes the way most adults do, but are still building their map of the world out of so-called primordial awareness, or the Buddha Nature.
But "the map is not the territory".
And finally - “Trust those who seek the truth but doubt those who say they have found it.”
I strongly recommend it for engineers over 40.
The idea was that it asked open-ended questions, ranging from "What is the GDP of the USA" to "How many days are in a lunar cycle" or "How many symphonies did Beethoven compose", and would not only present a handful of options for the answer but also ask you to self-rate how accurately you thought you knew the answer.
The idea was not to measure the actual responses, but to look at how well people know whether they know things.
I found it very interesting, and it reminded me of this: I wonder how measures on that scale relate to learning agility.
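One standard way to score that "knowing whether you know" skill is the Brier score: the mean squared gap between stated confidence and actual correctness. This is my own illustration, not necessarily how that quiz scored it:

```python
def brier_score(confidences, outcomes):
    """Mean squared difference between self-rated confidence (0..1) and
    actual correctness (1 = right, 0 = wrong). Lower is better; a
    well-calibrated person scores low even with many wrong answers."""
    pairs = list(zip(confidences, outcomes))
    return sum((c - o) ** 2 for c, o in pairs) / len(pairs)

# Overconfident: claims near-certainty but is wrong half the time.
overconfident = brier_score([0.9, 0.9, 0.9, 0.9], [1, 0, 1, 0])
# Calibrated: admits 50% confidence and is right half the time.
calibrated = brier_score([0.5, 0.5, 0.5, 0.5], [1, 0, 1, 0])
assert calibrated < overconfident
```

Note that both hypothetical test-takers got the same number of answers right; only the honesty of their self-ratings differs.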
"When you know a thing, to know that you know it, and when you do not know a thing, to know that you do not know it -- this is knowledge."
I like this one better :-)
1) Being scrupulously honest with oneself.
2) Re-examining cached thoughts and opinions when using them.
To some degree this is good. We don't want to go around testing every chair we come across to make sure it's good to sit in. But it has its downsides as well.
I'm not even particularly heavy. I'm 180#, which is pretty much spot-on for my height.
"Age differences in learning emerge from an insufficient representation of uncertainty in older adults"
The biggest thing you can do for yourself, as best I can tell, is to admit you don't know, and then go digging for info rather than just accepting that you don't know (or worse, going on to pretend you do).
What the study measured was that older people tended to be worse than younger people at adjusting their predictions after making small errors (theoretically because of lower uncertainty levels), but better than young people at adjusting predictions after making large errors (theoretically because of higher surprise sensitivity).
What the study did was fit each person with one of the curves in figure 2, the curves being functions of how much a person was willing to adjust their new prediction ("learning rate" value on the Y axis) in response to seeing different amounts of error in their previous prediction ("relative error" value on the X axis).
- In the "normative" case (best case), a person has a perfect S curve, where they don't adjust their predictions too much after small errors, but greatly increase willingness to swing their predictions after errors reach a threshold.
- In the "surprise insensitive" case, where a person is unable to be surprised by large swings in data and update their predictions accordingly, the steep rise in their S curve is flattened out. These people are bad at learning after large errors.
- In the "low hazard rate" case, where a person is able to be surprised by large swings in data, but their threshold for surprise is too high, their S curve is shifted to the right. These people are bad at predicting after moderate errors, but fine at predicting after small and large errors.
- In the "low uncertainty" case, where a person is too sure of themselves at low error levels, the S curve is depressed at the left end. These people are bad at learning after small errors, but good at predicting after medium and high errors.
- In the "reduced PE" case, where a person isn't good at understanding magnitudes of prediction errors at all, their S curve is vertically compressed, and their predictions are worse across the board, at low, medium, and high error levels.
Figure 6 shows the outcome of the experiment, with age being correlated with higher "uncertainty underestimation" ("Unc") and higher "surprise sensitivity" ("SS") in the fitted curves.
This happens because older people are worse at learning from small changes in data, but can compensate somewhat by being more willing to change their predictions after large swings in data.
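A minimal sketch of the kind of shape being fitted (the function name and parameter values are my own illustration, not the paper's): a logistic S-curve mapping relative error to learning rate, where each case above corresponds to distorting one parameter.

```python
import math

def learning_rate(relative_error, floor=0.1, ceiling=1.0,
                  threshold=0.5, steepness=10.0):
    """Logistic S-curve: how much to adjust a prediction given the
    relative size of the previous error.

    floor     -- learning rate after small errors
    ceiling   -- learning rate after very surprising errors
    threshold -- error level where willingness to update jumps
    steepness -- how sharp that jump is
    """
    s = 1.0 / (1.0 + math.exp(-steepness * (relative_error - threshold)))
    return floor + (ceiling - floor) * s

# "Normative" curve: small updates after small errors, big updates after big ones.
assert learning_rate(0.1) < 0.2
assert learning_rate(0.9) > 0.9
```

In these terms, flattening `steepness` gives the "surprise insensitive" curve, shifting `threshold` right gives "low hazard rate", lowering `floor` gives "low uncertainty", and compressing `ceiling` toward `floor` gives "reduced PE".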
"Insufficient uncertainty" might be a reasonable explanation for this, but it's easy to imagine other possible explanations as well. Maybe "insufficient attention" or "insufficient caring" could be factors, with older people maybe being more willing to stick to rough predictions without sweating the details. It would have been interesting if the study tried to measure self-confidence / certainty levels more directly, instead of just relying on fitting to a theoretical model.
e.g. if there is a small error, older people can convince themselves that they were right and the world is wrong. (My reasoning was correct, but the answer was wrong by chance. Therefore I won't change my reasoning).
However, younger people are less able to rationalize these small errors, forcing them to accept the conclusion that they are wrong.
"learning deficits observed in healthy older adults are driven by a diminished capacity to represent and use uncertainty to guide learning"
From a Bayesian, or normal machine learning, perspective: older people have collected more evidence about the world, and thus would normally reduce their learning rate, or belief update amount, for best performance. In this example, though, video games represent a different environment, and the older adults have little evidence about them. However, they fail to have a higher learning rate. Looks to me like an evolutionary adaptation that does not work anymore; the environment did not change that quickly before.
To paraphrase: older people learn more slowly, and this parallels the lesser weight of new evidence when you have more accumulated evidence in Bayesian beliefs; or the reduction of learning rate in machine learning systems.
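The analogy can be made concrete with a running-mean estimator, where the effective learning rate is 1/n and shrinks as evidence accumulates (a standard incremental-mean identity, not something from the study):

```python
def update(mean, n, observation):
    """Incremental mean: equivalent to an update with learning
    rate 1/n, which decays as evidence n accumulates."""
    n += 1
    lr = 1.0 / n                       # learning rate shrinks with experience
    mean += lr * (observation - mean)  # small step toward the new observation
    return mean, n

mean, n = 0.0, 0
for x in [10, 10, 10, 10]:
    mean, n = update(mean, n, x)
assert mean == 10.0
# After lots of accumulated evidence, a same-sized error barely moves the estimate:
mean2, _ = update(10.0, 1000, 0.0)
assert abs(mean2 - 10.0) < 0.1
```

The study's point, in this framing, is that a new environment should reset n (raise the learning rate back up), and that is exactly what the older adults fail to do.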
Even simple things like automatically switching your mind off because you find video games less interesting than a blackboard (which is the case for me) have to be taken into account.
This is really the same as intelligence tests that make assumptions about the cultural environment.