Age differences in learning from an insufficient representation of uncertainty (nature.com)
200 points by danieltillett on Aug 16, 2016 | 88 comments



As people get older, their minds tend to filter out most things around them as a consequence of familiarity. Hence, they become oblivious to details of their environment. They lose their sense of wonder and their minds become less active and eventually stale.

When I was young, I often wondered why adults seemed so "stupid" when it came to learning things, and I do think this is a big part of it. As we get older, it's necessary to make a conscious effort to pay attention to details of our environment, and to be acutely aware of the great uncertainty found all around us, even in everyday life. This is the difference between a person who becomes a "stupid" adult, and one whose mind is still sharp as a razor at age 70.


>adult, and one whose mind is still sharp as a razor at age 70.

I have never met a razor-sharp person at 70 who is not showing signs of reduced learning in new situations. They learn new stuff faster than most other people their age, but they still do it more slowly than younger people.

What they have is a large store of crystallized intelligence (the Gc factor) that they can apply successfully to learning new things. This usually makes them great people to learn from and be around.

I suspect that if we didn't expire, the gravity of this accumulated cognitive perspective – habits of thinking, thought patterns, overfitting – would become an intellectual disability even for the best of us.


I dunno, I think it's an exponential process whereby each year you 'consolidate down' your total current knowledge by X% to make room for new data. You keep the broad strokes of the least recently used knowledge but lose the fine details.
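
In symbols (my formalization of the hypothetical X%, not the commenter's math): the fraction of fine detail surviving n years of consolidation is

    (1 - X)^n

so at, say, X = 5% per year, only about half the fine detail survives 14 years, since 0.95^14 ≈ 0.49.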

One example I've observed of this consolidation process is that I'm way more prone to accidentally using homophones of a word than I used to be. I know the difference if I'm focusing on it, but if I'm typing I'll accidentally use e.g. 'there' instead of 'they're'. When I was 16 I never, ever made this kind of mix-up.


> I suspect that if we didn't expire, the gravity of this accumulated cognitive perspective – habits of thinking, thought patterns, overfitting – would become an intellectual disability even for the best of us.

We are "designed" to exist only for a limited time. The patterns are everywhere, including in the way the brain works.


I call bullshit! And it's not because of the teleological implication you're making - it's because you're overfitting a pattern - limited existence - where you have no data to observe the correctness of your conclusions!

You think you see reasons why an overly long-lived human would encounter difficulties because of limitations you perceive that they'll encounter, but your observation set is so limited that your conjectures must be treated as conjectures, not as solid understanding!

The limitations on learning we observe in older humans could easily be due to a huge number of factors, almost all of which are extremely solvable. The one mentioned here – that crystallized intelligence, an advantage in some ways, becomes a liability in this case – is not immediately solvable in my view.

But when I discuss radically increased lifespans for humans – and I discuss it often, because I want to see it happen and am already participating in a startup whose success I believe will hasten it – I often hear people giving reasons much like yours as reasons why my efforts are not worthwhile. Honestly, seriously, people are using justifications like yours to avoid investing in efforts like mine, and it's maddening to hear such an unsure, ephemeral hypothesis – and I honestly regard the conjectures you've made above as among the most uncertain hypotheses in the history of humankind – used as a reason to deprive concrete efforts of concrete investment.

Absolutely maddening, especially because I don't even necessarily disagree, but if the actual pattern for humans is that after, say, 1,000 or 100,000 years, we've got too much cruft as a part of us to continue, we'll never know unless we investigate it, and we can't investigate it if everyone just blindly trusts in silly design pattern hypotheses!!!


Hold on a sec, I'm not at all in the group you think I am in. I've also put the word "design" in between double quotes for a reason; please be a kind reader and re-interpret it the way it was meant to be perceived. Online discussions have a way of erasing subtle nuances, let's try and resist that trend, shall we?

I am 100% for research into radical life extension. I am 100% pissed off by those who oppose (or are indifferent to) this research for whatever reason. I totally hear ya.

But I am also a realist. We have been optimized by evolution for spreading the more successful genetic sequences, nothing more. Once the machine has served that purpose, its subsequent fate is inconsequential. That's been the high level trajectory of the blind optimization process so far, for billions of years.

The human brain is absolutely optimized to peak early and struggle mightily at the game of Spread Your Genes. It's chock-full of subroutines that kick in and help it play the game and hopefully win. Once that's been statistically achieved (by the end of your 20s, according to our default genetic programming), the same subroutines sort of don't care about it anymore and slowly begin to fade out.

That's not to say this is acceptable, or that we should just lie down and take it. I've very little doubt that a solution could be engineered to stop and eventually reverse the aging process. But we're unlikely to succeed if we don't acknowledge the challenges that litter the path to this goal.


Assuming you are running on well-maintained hardware (your brain), I don't think there's any set limitation to how well your intellect will function after a long period of time.

We forget. A lot. As a normal course of thinking. And that's OK.

Old habits (good and bad) do die, with a bit of effort, if they aren't being constantly reinforced. If you make it a point to continually refine and improve how you think (trying to disabuse yourself of invalid ideas), then you are setting yourself up for a better future.


To fend off the hysterics which the word 'design' might cause, I'd maybe phrase this as "evolution hasn't optimised us to last longer than ~80-90 years."

I agree that, in keeping with all efficient system structures, we seem to be built so that all of our systems fall in a heap at roughly the same time.


Moral of the story: learn as much as you can from uncertain situations now, because you may have less ability to when you get older.


As an Olde Pharte(tm), I'm pretty sure most things are less certain year by year. To mangle a metaphor from physics, the Universe is expanding; we're all like dots on a balloon being inflated, and the space between the dots gets larger.

Likewise, as the economy grows, the space between "dots" goes up, resulting in more fragile systems. We have to innovate things into the economy to make those less fragile. Parts go obsolescent and the supply chain weakens, year by year.

It'd be interesting to know whether or not actual formal training in uncertainty changed all this. By "uncertainty", I mean the term as it is used in information theory.

As I watch my age cohort... age, I think it's more that people just get bored with it all and lose interest in favor of other things. It's easier for me to think that becoming established just leads to complacency, and I for one don't wanna do that.


Anecdotally, I've noticed a very big difference in elderly people who have lived in several places, especially abroad, versus those who have mostly spent their life in one home. They are usually much more lucid and easier to communicate with. Less prone to social miscues, that sort of thing.

There seems to just be more awareness since you are forcing yourself to learn to deal with new environments.

I wonder if there's been any kind of study on the effects of frequent moving or travel on the brain.


Relevant: Bilingualism delays age at onset of dementia, independent of education and immigration status

http://www.neurology.org/content/early/2013/11/06/01.wnl.000...


I think you might be able to generalize that to say that feeding the brain novelty keeps it from degenerating, by forcing it back into learning mode.

Have you ever noticed how sometimes when you read a really interesting book, and you learn some really key insights about the world, suddenly, you get a burst of new ideas? As though a flood of neurotransmitters has just been released?

Maybe that has health effects related to dementia.


I've heard that speaking multiple languages is negatively correlated with dementia and Alzheimer's.

I'm wondering if that relates to your anecdote, and how much traveling versus languages each helps.


Maybe it's because more intelligent people tend to travel more and intelligence is correlated with decreased rates of dementia?


I think sometimes the bravado of knowledge can play into this. As people become adults, they like to think they are very knowledgeable and don't want to come across as "stupid".

This means most adults will never ask the simple questions, and will pretend they understand something complex quickly or already know it. It's a dangerous mindset to develop, but it's difficult not to.

As most on here are programmers how many times have you been in a meeting or technical discussion where you are judged for asking a "simple" question?

We need to stop the judgement of those that ask "simple" questions.


>This means most adults will never ask the simple questions, and will pretend they understand something complex quickly or already know it. It's a dangerous mindset to develop, but it's difficult not to.

I grew up with people locked into this mindset and it completely fucked me. Still trying to get over the knee jerk reactions I get to certain pieces of info, and getting over using hyperbole all the time. Have been slowly unravelling the politics that come with that train of thought.


Interestingly, as I get older, I find myself asking more questions. Where once I considered myself an "expert" I now feel like I know so very little.

I think part of what helped was to get interested in some completely unrelated thing. I was pretty much just a programmer, but I have been getting into drawing and other art, and it's given me that sense of curiosity back and let me learn new things, even about programming topics.


I gave a 10-year-old girl a unicycle for her birthday last week. I can never put into words the look on her face when she made, only an hour later, her first three cranks in a row before falling off (it took me 2 days to get that far in my early 30's). It is this type of experience that makes growing old worth the pain: to see the world through their eyes as though it were all new again (paraphrasing Virgil).


Uhm, her center of gravity and power-to-weight ratio is also significantly different than yours.


Yes but not in the little girl's favor. Unicycling favors taller people and taller seat-to-ground height. The larger the distance, the easier it is to balance and maneuver.


This is a very good point. I have observed that people tend to forget the physical differences between being small and being tall. Not everything is proportional [1].

[1] https://en.wikipedia.org/wiki/Square-cube_law


This isn't the worst thing in the world, but I hope it'll be helpful to know that the "Uhm" can come across as condescending. It seems to suggest that it's obvious or that someone not considering it is stupid.


Uhm, yeah


I love spending time with my kids for this. They see these details and bring them to my attention. Often when they do, I resolve to try to see as they do, but it's hard when you have seen most of "it" before. I envy them.

This is also my theory on why "time flies by when you get older". We lack original experiences and so one day melds into the next. When each day is an adventure then of course it seems long.


I always assumed that time flies by for adults because as we get older, a given unit of time is an ever-smaller percentage of our entire life.

One year is 1% of a hundred-year-old's life; one day is 50% of a two-day-old baby's.
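
One way to formalize that proportional intuition (my sketch, not something from the article): if the subjective weight of a moment at age t scales as 1/t, then the subjective time experienced between ages a and b is

    ∫ from a to b of dt/t = ln(b/a)

so under this model the decade from age 10 to 20 feels as long as the twenty years from 20 to 40.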


I remember reading this in some article, and also remember some kind of debunking of it from a neuroscientist. I tried to find both just now and wasn't really successful, but I'm stating it here so that if anyone else wants to take a shot at finding them, they know they exist.


But the root cause of that is the same as for the effect described in the paper. As you're older, your mental machinery filters more.


> We lack original experiences and so one day melds into the next. When each day is an adventure then of course it seems long.

I also think adults have fewer major bookend-type experiences. As a kid I could bookend every year with summers off and out of school. College was similar, but I rarely took summers off so a lot of college is melded together. As an adult, there are fewer natural bookends. Marriage, kids, and a new job are obvious ones, but they don't come every year and not everyone has them. I try to take two large vacations each year to new places. Doing this has slowed time down.


Yes, I had not thought of it like that before, but it makes sense. Similar to "a change is as good as a rest" – although the summer holidays are a great rest!


>This is also my theory on why "time flies by when you get older". We lack original experiences and so one day melds into the next. When each day is an adventure then of course it seems long.

See, that's just why you have to put effort into making sure you experience new and original things in life over time. If you make yourself do stuff, you can look back on even weekends and think, "Wow, a whole lot of time passed then!"

In fact, let's see...

* Friday night, went out to a new pub with fiancee. Great dinner and beer.

* Saturday, particularly the afternoon: did some last-minute costume shopping with visiting friends, watched Futurama together in the evening, a proper New England thunderstorm finally hit our area for the first time in the summer.

* Sunday: went swimming at the pool in the gym for the first time, fiancee was sad in the evening.

* Yesterday: worked all day, had more responsibility dumped on me, went home and got incredibly frustrated with my side-project's precision numbers going down rather than up. Eventually derped around on the internet for the evening, then realized I was measuring the precision of the whole joint distribution rather than of single observables.

* Today: talked to Chabadnik friend from grad-school, tested out measuring the precision of single features, at work now. Going to go swimming again tonight for exercise.

Do stuff!


Agreed. I've moved continents and I'm learning a new language, having had two kids (well my wife had them).

I agree: do things. Let's see how you get on with a family :-)


Other studies posted here have mentioned that LSD can increase activity in the brain that is similar to what is seen in a child's brain. Does this mean then that LSD or other drugs could be used to increase learning potential in the elderly?


TL;DR: If you think you know everything already (or that you don't need to learn something), then you learn less well.

Older people more often fulfill the above premises, and thereby the above conclusion follows.

This means: Worse learning performance is not a consequence of age per se.


This is not an accurate summary of what the study actually found. The article proposes a high-level explanation for how one aspect of learning changes with age.

The actual cause of this high-level change could be lower-level age-induced differences in neurotransmitter levels or in neuronal function and connectivity. As the article's summary points out, the actual biological basis needs to be investigated in further research.

It's also important to note that this study found a plausible explanation for changes in learning performance under specific circumstances. This is a very important qualification.

Learning in general is believed to involve many different interrelated mechanisms. It's well established that many different cognitive abilities (perhaps most importantly working memory) diminish with age, and some of this degradation is likely related to natural cerebral shrinkage (a healthy 75-year-old has a 15% smaller brain than a 25-year-old).


I interpreted the paper slightly differently: The older you get, the harder it is to tell whether you understand something. The authors are not sure exactly why:

> Why do older adults fail to represent sufficient levels of uncertainty? One possibility is that representing appropriate levels of uncertainty requires a cognitive and/or biological resource that decays across healthy aging. One obvious candidate for such a resource is working memory capacity. ...

> Another possibility is that older adults fail to represent sufficient levels of uncertainty because they have an aversion to uncertainty or the mental effort required to represent it. ...

> The crucial factor limiting uncertainty representations in older adults could, and at some level must be, biological in nature. One candidate for such a limiting factor is norepinephrine ...

> Although it is tempting to link age-related changes in representing uncertainty to reduction of a single neurotransmitter, several alternative biological accounts exist. ...


lol, is this why impostor syndrome exists?

1. I think I know nothing.

2. I try to learn hard to get better.

3. goto 1.


:-) I think it's the first time I see an answer with a "goto" that didn't get downvoted!!!


Thank you; great summary, much clearer than the paper itself.


The truth is, most people seem firmly in the grip of illusory competence at most times, and it's not a shock that as life progresses that becomes crippling. I'd be shocked if people somehow magically changed in old age, rather than just distilling who they've always been.


Why would you be shocked? It's a medical fact that a lot changes biologically as we age, and it's also a medical fact that biology can greatly affect the mind. There's nothing magical about senescence.


Clearly not the kind of change that I was referring to, in context.


That is not really clear to me. Biological changes can give us depression, irritability, mania, sluggishness, dementia and many other less-than-desirable mental states. What kind of change were you referring to that is fundamentally different from the things we know biology can do to a person?


Change in relation to the illusory confidence I mentioned in the first part of my sentence.


This article is really thought provoking.

One thing I wonder, though, is how they can ever adjust for the strong possibility that older adults just can't care about a make-believe test the way younger people might.

My observation, as I age, is I have a harder and harder time getting interested in hypothetical scenarios, or taking on someone else's agenda as being super important.

So I really wonder if older adults just test worse than younger people, simply because they care less about the test.

If that's true, the effect wouldn't hold in the real world – except that older people can seem less smart when, in reality, they just think whatever you're trying to get them to learn isn't important in the grand scheme of things.


Your comment is kind of buried but I feel like this is what has happened to me as I've aged. Where once I was consumed by the depth of a problem I now find that I am more consumed with the breadth of it. In that sense I tend to consider the practical outcomes more than I used to.

But I am also sure that maintaining novelty in my life, whether through education or travel, is key to maintaining healthy schemas in my mind to help understand the world.


nobody seems interested in looking at the thing from the other perspective - why was it an evolutionary advantage for older people to develop this uncertainty-blindness? i would guess that perhaps with age the importance of making a decision rises, even if not the best decision. perhaps it's an evolutionary adaptation to reduce analysis-paralysis in developed adults and improve their self-sufficiency? in any case, it's interesting how we all jumped to "fixing" it. a cute reflection of the culture here.


A wealth of experiences can be a source of wisdom, and a good basis for decision making. Instead of rejecting that advantage in all situations, I think people are exploring ideas about how to learn when prior assumptions should have less bearing.

From my experience teaching older people when I was younger, and no longer being 20 myself, the challenge is figuring out which assumptions to question.

Let's say I'm reading a math proof, and don't really understand a statement. If I pretend I understand it well enough and keep going it might make sense, or maybe I'll get further lost. I feel like I had a better sense both of how not to get bogged down, and when to slow down when I was younger.


That may be asking the right question the wrong way round.

Neoteny – the retention of juvenile features in adults – is one of the things humans are particularly noted for, and the capacity to learn is one of those features. So we could well still be evolving the ability to learn later in life.

The other unknowable here is environment. It is a lot easier to learn things via the internet than it was even 10 years ago, and it is also a lot more important to keep learning in many jobs. So the really interesting experiment will be to take today's 70-year-olds, who had to be content with evening classes if they could find the time to learn new skills, and compare them with today's 20-year-olds, in 50 years' time.

In my peer group at least, the difference between those who never opened a book after they left school (a sad, but surprisingly large number of people even with university educations), and those who kept reading, is huge - which also supports the findings.


There doesn't need to be an evolutionary advantage for everything (e.g. vestigial features) – this "uncertainty-blindness" could simply never have been exposed to evolutionary pressure.


Very few 70+ year olds are reproducing, so evolutionary advantage doesn't make sense in that age group; they have already reproduced (or not) by then.


Historically, the aged sat around the fire and taught the young, which encouraged their genes' propagation. Since humans became social, it's not been all about reproduction.


this is all extremely speculative, of course! but i would say that anything that affects your thinking can pose an evolutionary advantage or disadvantage. (to also answer a peer comment here:) including when you are past your reproductive age, since human beings are social beings, and our evolution cannot be regarded purely individualistically. to give an extremely stupid, but i hope illustrative example, a society where the old people go extremely stupid and start killing people around them is going to have a hard time ;)


It's an interesting question, but many of the changes that occur with old age could simply be side effects of optimizing for performance before and during reproductive age and need not be evolutionarily advantageous per se.


isn't that just saying the same thing, but in different words? :) better reproduction is the ultimate evolutionary advantage. edit - ok, not "the ultimate", but i guess you know what i mean.. ;)

edit again - i'm a little slow today.. you're right, it might just be an accidental side-effect, sure.


Or maybe fixing the brain after too many years just becomes harder (less viable) than creating a new brain (a baby). Just like old cars are disposed of instead of fixed... Or, in programming terms: just like rebuilding from source code is more viable than directly patching a binary.


Causation is a strong claim here. Correlation has been known for ages.

"The problem with the world is that the intelligent people are full of doubts, while the stupid ones are full of confidence". (Who said Java?)

The same notion is related to the concept of "the beginner's mind" popularized by D. T. Suzuki and S. Jobs. Hackers call it "thinking out of the box". J. Krishnamurti calls it 'freedom from the "known"' (people call all kinds of nonsense "knowledge").

Children learn so quickly and efficiently because they are not habitually pattern-matching against personal experiences, cultural conditioning, and popular memes the way most adults do, but are still building their map of the world out of so-called primordial awareness, or the Buddha Nature.

But "the map is not the territory".

And finally - “Trust those who seek the truth but doubt those who say they have found it.”


There is a learning to learn course on Coursera https://www.coursera.org/learn/learning-how-to-learn

I strongly recommend it for engineers over 40.


I strongly recommend it for everyone – especially for those workaholics and one-(brain)-hemisphere individuals who think that knowing only one narrow area is OK.


On a related note, does anyone else remember a study (I think linked here) that measured certainty / uncertainty in individuals?

The idea was that it asked open-ended questions, ranging from "What is the GDP of the USA?" to "How many days are in a lunar cycle?" or "How many symphonies did Beethoven compose?", and it would not only present a handful of options for the answer but also ask you to self-rate how accurately you thought you knew the answer.

The idea was not to measure the actual responses, but to look at how well people know whether they know things.

I found it very interesting, and this article reminded me of it: I wonder how measures on that scale relate to learning agility.


I'm certain that there's a learnable technique to cultivate uncertainty.


"Defamiliarization or ostranenie (остранение) is the artistic technique of presenting to audiences common things in an unfamiliar or strange way in order to enhance perception of the familiar."

https://en.wikipedia.org/wiki/Defamiliarization


The wisest man knows that he knows nothing.


"Master, what is knowledge?"

"When you know a thing, to know that you know it, and when you do not know a thing, to know that you do not know it -- this is knowledge."

I like this one better :-)


Is it even possible to know if there are more known unknowns than unknown unknowns?


How could I know?


I think it is (should be) as simple as:

1) Being scrupulously honest with oneself.

2) Re-examining cached thoughts and opinions when using them.


Most won't get past step one.


Always keep a Beginner's Mind.


I think that many times we don't give enough credit to how amazingly lazy our brains are. Whatever they can generalize, they generalize. They don't track images, they track edges and movement (I oversimplify, of course). They can hear garbled speech in a crowded room and somehow figure out what the words are -- even if those aren't the actual words.

To some degree this is good. We don't want to go around testing every chair we come across to make sure it's good to sit in. But it has its downsides as well.


Maybe I don't need to test everything to see if it is a chair, but I definitely test every chair before I sit in it. I learned that I should do that after breaking a few chairs, and a bench.

I'm not even particularly heavy. I'm 180#, which is pretty much spot-on for my height.


This headline (on yc) is missing a key verb from the actual paper's headline, making it confusing. The actual title is:

"Age differences in learning emerge from an insufficient representation of uncertainty in older adults"


Yes. Unfortunately, the YC limit of 80 characters prevented me from including the full title. I hope the change is not too confusing and captures the intent of the original title.


Actually, it was confusing, and I was searching for this comment alone. I wouldn't have clicked on it if it hadn't been for the points it already had.


So does one have to remain open to learning about everything, including subjects one doesn't find interesting, or can you simply continue to wonder and learn about things that actually interest you? Do you have to force yourself to learn things, as we were "forced" to during our early education? For example, a developer who is annoyed/bored with re-learning frameworks/languages/concepts takes up a hobby they find interesting, where they are still hungry for knowledge. Say, homebrewing :)


So, basically my constant saying to myself "I don't understand this" and digging in deeper until I do plus feeling like there's so much more to learn is... a good thing.


Seems so. I've been doing something similar my whole life myself.

The biggest thing you can do for yourself, as best I can tell, is to admit you don't know, and then go digging for info rather than just accepting that you don't know (or, worse, going on to pretend you do).


I'd say, "reminding myself I don't understand this." Like you say, you can always dig deeper and it's almost always rewarding.


Another TL;DR: Making mistakes is a great way of learning. To learn from mistakes, the first step is to identify that you have made an error. The second step is to correct it. As we grow older, we become worse at the first step, i.e. identifying the error. Without identification, there can be no correction and no learning.


So does evoking uncertainty allow for an overall learning advantage when it comes to age in more complex scenarios?


I don't think the study supports many of the conclusions being drawn in the comments here.

What the study measured was that older people tended to be worse than younger people at adjusting their predictions after making small errors (theoretically because of lower uncertainty levels), but better than young people at adjusting predictions after making large errors (theoretically because of higher surprise sensitivity).

What the study did was fit each person with one of the curves in figure 2, the curves being functions of how much a person was willing to adjust their new prediction ("learning rate" on the Y axis) in response to seeing different amounts of error in their previous prediction ("relative error" on the X axis).

- In the "normative" case (best case), a person has a perfect S curve, where they don't adjust their predictions too much after small errors, but greatly increase willingness to swing their predictions after errors reach a threshold.

- In the "surprise insensitive" case, where a person is unable to be surprised by large swings in data and update their predictions accordingly, the steep rise in their S curve is flattened out. These people are bad at learning after large errors.

- In the "low hazard rate" case, where a person is able to be surprised by large swings in data, but their threshold for surprise is too high, their S curve is shifted to the right. These people are bad at predicting after moderate errors, but fine at predicting after small and large errors.

- In the "low uncertainty case", where a person is too sure of themselves at low error levels, the S curve is depressed at the left end. These people are bad at learning after small errors, but good at predicting after medium and high errors.

- In the "reduced PE" case, where a person isn't good at understanding magnitudes of prediction errors at all, their S curve is vertically compressed, and their predictions are worse across the board, at low, medium, and high error levels.

Figure 6 shows outcome of the experiment, with age being correlated with higher "uncertainty underestimation" ("Unc") and higher "surprise sensitivity" ("SS") in the fitted curves.

This happens because older people are worse at learning from small changes in data, but can compensate somewhat by being more willing to change their predictions after large swings in data.

"Insufficient uncertainty" might be a reasonable explanation for this, but it's easy to imagine other possible explanations as well. Maybe "insufficient attention" or "insufficient caring" could be factors, with older people maybe being more willing to stick to rough predictions without sweating the details. It would have been interesting if the study tried to measure self-confidence / certainty levels more directly, instead of just relying on fitting to a theoretical model.


Maybe older people are better at rationalizing low errors.

e.g. if there is a small error, older people can convince themselves that they were right and the world is wrong. (My reasoning was correct, but the answer was wrong by chance. Therefore I won't change my reasoning).

However, younger people are less able to rationalize these small errors, forcing them to accept the conclusion that they are wrong.


I knew that


Synopsis: A certain age group is better at video games.


Nice snark, but one can extract more interesting insights.

"learning deficits observed in healthy older adults are driven by a diminished capacity to represent and use uncertainty to guide learning"

From a Bayesian, or normal machine-learning, perspective: older people have collected more evidence about the world, and thus would normally reduce their learning rate, or belief-update amount, for best performance. In this example, though, video games represent a different environment, and the older adults have little evidence about them; however, they fail to adopt a higher learning rate. Looks to me like an evolutionary adaptation that does not work anymore; the environment never used to change this quickly.

To paraphrase: older people learn more slowly, and this parallels the lesser weight of new evidence when you have more accumulated evidence in Bayesian beliefs; or the reduction of learning rate in machine learning systems.
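
As a toy version of that parallel (my sketch, not anything from the paper): an online running-mean estimator has exactly this property, because the weight on each new observation shrinks as 1/n with accumulated evidence. That is optimal in a stationary world, but sluggish right after the environment changes:

    def running_mean(observations):
        # Online mean estimate; the weight on each new observation
        # (the "learning rate") is 1/n, shrinking as evidence accumulates.
        estimate = 0.0
        for n, x in enumerate(observations, start=1):
            rate = 1.0 / n  # shrinks with accumulated experience
            estimate += rate * (x - estimate)
            yield estimate, rate

    # In a stationary stream this converges nicely, but when the data
    # source switches (a "new environment", like an unfamiliar video
    # game), the now-tiny learning rate makes the estimate adapt slowly.
    history = list(running_mean([5, 5, 6, 5, 20, 21, 19, 20]))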


I'm wondering to what extent this effect can be attributed to the illusion of competence, i.e. with age, some meta-learning program in our brains concerned with managing novel data simply becomes lazy, because (1) adults really are able to handle most of the situations they are confronted with, and (2) they are also expected to be able to deal with any situation. That might simply lead to a deterioration of this sort of novelty detection. Adults tend to automatically explain away things they don't understand (possibly to save face) while the young examine them. With increasing age, these kinds of automatisms might simply override or degrade the learning strategies that are required to learn quickly.


Not very insightful. If you're used to video games, you have an inherent advantage over any "new" problem that is posed in terms of a video game.

Even simple things like automatically switching your mind off because you find video games less interesting than a blackboard (which is the case for me) have to be taken into account.

This is really the same as intelligence tests that make assumptions over the cultural environment.


I've always thought of this as precompiled vs. interpreted; i.e., the more experience someone has, the more hard-wired their responses to familiar scenarios, which makes experienced people faster at reacting to familiar situations, but less adaptable should a different response be required, or should some previously unseen variation in the initial situation arise.



