
The digital native is a myth - okket
http://www.nature.com/news/the-digital-native-is-a-myth-1.22363
======
jccalhoun
I teach at a college and I see firsthand how terrible these "digital natives"
are at technology. Not all of them, of course, but enough of them to make me
laugh when I hear someone say young people are a lot better at tech than old
people.

What is happening is that there are now different sets of skills: touchscreen
skills and keyboard skills. It isn't even that they are particularly good at
using phones (students say they don't see email announcements because they
don't check email; when I ask why they don't set up their phones with the
university email, they say they don't know how), but that they use phones more
and have less experience with laptops and desktop computers. While setting up
for presentations I see them struggle to log into email or Google Slides. They
type "google" and hit enter to go to google.com, then search for YouTube, then
at YouTube they search for the video they want to show. Strangely, when they
cut and paste a URL they almost always right-click and select "paste", then hit
return, and almost never use "paste and go", which is right below it in the
context menu. They don't know Ctrl+F.

So if "digital natives" means anything, it doesn't mean that they are
automatically good at tech.

~~~
freehunter
As someone who has worked helpdesk before, I can tell you all of those things
are widely prevalent in every generation. Old people, middle-aged people, and
young people alike all google "Google" and then google "YouTube" to get to a
video. Everyone sucks at logging in. Everyone finds a way that works for them
and doesn't know any alternatives.

The real answer is, computers suck. Seriously, computers are effing awful
machines, all developers are hackers, and the only reason things haven't
changed is that we constantly lie to ourselves that we're okay with the world.

Passwords suck: no one can remember them, we have way too many of them to
remember, and every damn site needs its own special password requirements. On
top of that, every site needs its own _username_ requirements too, so there
are two things to forget. And email is the universal password reset, so you
want to make sure you use a good password there - but you never really log
into your email, do you? The password is saved. So when you get logged out
somehow, good luck remembering the password.

And don't get me started on keyboard shortcuts. At least Windows is
consistent. Quick question: what's the keyboard shortcut on a Mac to go to the
beginning of the current line? Ah, trick question: it's different for every
god damned application! And some applications don't even allow it without
highlighting the entire line!

There's a reason "digital natives" are great at using phones and choose to use
them more. iOS and Android were built from scratch and could learn from all
the mistakes of the old systems. In the process, they came up with a ton of
their own mistakes (3D Touch and long press each do different things? Really?
Also, TouchWiz is just a bad idea), but consistency and focus are major wins
here.

Computers suck and people get scared. Only having one app on the screen at
once, only having one way to do things... that's what people like. It's not
perfect, but man, computers suck. Anything that makes them suck less is a huge
win.

~~~
copperx
> what's the keyboard shortcut on a Mac to go to the beginning of your current
> line? Ah trick question, it's different for every god damned application!

It's Control-A in every app!

~~~
raverbashing
Or Cmd-Left

(or Up if it's a one-line text field)

~~~
copperx
Sure, but ain't nobody got time for getting the right hand out of the home
row. Emacs keybindings everywhere is one of the things that keeps me using OS
X. As a bonus, remapping Caps Lock to Control like a proper UNIX keyboard is a
few clicks away.
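
For anyone curious how far that goes: the Cocoa text system ships with the
Emacs-style Ctrl-A/Ctrl-E/Ctrl-K bindings built in, and it reads extra
bindings from ~/Library/KeyBindings/DefaultKeyBinding.dict. A minimal sketch
of such a file, assuming you also want Emacs-style word motion (the selectors
are standard NSResponder actions; which bindings you add is personal taste):

    {
        /* "~" means the Option key in this format. */
        "~f" = "moveWordForward:";    /* like Emacs M-f */
        "~b" = "moveWordBackward:";   /* like Emacs M-b */
        "~d" = "deleteWordForward:";  /* like Emacs M-d */
    }

Apps pick the file up the next time they launch.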

------
kator
The article makes sense. That said, I have two grandchildren who are five and
three. They interact with iPads and phones like they're "normal" life. They
talk to their Alexas and Siri like they're people. They ask the TV to change
channels and ask Mom why nothing happens when they swipe the TV screen.

My oldest grandson had the funniest argument with Siri: she kept saying "I
don't understand you" and he would say "why? You're talking to me just like
I'm talking to you."

I agree we are still humans, and no shocking, net-new, amazing multitasking or
concentration skills will come out of the "digital generation". That said,
these kids are growing up on the shoulders of giants and will expect things to
just work. If anything, that could make things harder, because they may not
want to learn how to make new things work.

All this said, my grandmother lived through the horse and buggy, radio, TV,
color TV, computers, and men on the moon. She used to say the more things
change, the more they stay the same - because people are basically the same no
matter when they were born.

~~~
greggman
I'm probably being a Luddite here, but I saw my friend's kid talking to Alexa
and I couldn't help but think it won't be too long until kids don't need
parents much. If they have a question, they'll ask Siri. If they want help
with their homework, they'll ask OK Google. If they want someone to read them
a story, they'll ask Alexa.

On the one hand, that will be great: a partner who never gets tired no matter
how many questions you ask.

On the other hand, it's a brave new world. Especially when, as the computer
gets better, they might get used to the partner that always does their
requests and get upset with the real people who don't?

I'm sure it will all be fine somehow.

~~~
eludwig
As a parent who's was "Why"ed to death by two curious kids in the pre-google
era, I can tell you that I would have killed for "someone" to take on a part
of that burden! ;)

>>I couldn't help but think it won't be too long until kids don't need parents
much

I think you meant need as pure "question answerers," but we do a few things
for the kids that Siri and Alexa wouldn't have the stomach or, for that
matter, the arms for. Oh, also: love!

~~~
mythrwy
When the answer can't be found perhaps the voice assistant can say "Because I
said SO!".

------
donkeyd
I think the term 'digital native' is too ambiguous. There are two things that
could be considered being a digital native: one is technical, the other is
more functional.

On the technical side, I completely agree with this article. Most people my
age (late 20s), and especially younger people, don't have a clue about the
insides of their devices or how the internet actually works. They don't seem
to be interested in a lot of this stuff either, and they tend to think that
everything that has to do with their devices is complex.

On the 'functional' (for lack of a better word) side, however, they seem to
know exactly what to do on the internet. They know how to get attention on the
internet, what their peers are interested in, and how to get as many likes as
possible. They live on the internet, live stream everything they do, and take
lots of selfies. This seems to be the reason why we're starting to see lots of
younger kids becoming YouTube famous or starting viral marketing companies.

The last part, to me, is where people really become 'digital natives' instead
of 'technical natives'. Also, because most devices are so hard to open, most
kids won't even get to see the technical side anymore, unless they want to
build a gaming PC.

~~~
robteix
> They know how to get attention on the internet, what their peers are
> interested in and how to get as many likes as possible.

Is it specific to an age group, though? My mother is in her 60s and she seems
to have no problem getting 100+ likes on FB every time she posts a pic of her
grandkids. She knows what _her_ peers want to see.

~~~
spondyl
Her peers could be the type to use the like button as a form of basically
saying "Yes, I have seen this"?

------
TeMPOraL
Compare to an excellent report from the trenches:
[http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-
co...](http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-computers/).
A very good read.

~~~
ef4
The observations in that post are helpful to illustrate the problem. But I
can't agree with the author's conclusions.

There's another way to read every single one of his examples: as evidence of a
design failure. For example:

> A message box opens up, but the kid clicks OK so quickly that I don't have
> time to read the message.

Users are so bombarded by unhelpful dialog boxes that many are trained to
dismiss them as fast as possible. It becomes subconscious; some users
literally can't see the dialogs anymore.

> I hand back the laptop and tell him that it's infected. He asks what he
> needs to do, and I suggest he reinstalls Windows.

This is obviously not the user's fault. It's the responsibility of the system
designers to make it hard to subvert. iOS is now ten years old with no major
malware outbreaks -- it can obviously be done (through means that the author
doesn't approve of, like application signing).

> I take the offending laptop from out of her hands, toggle the wireless
> switch that resides on the side, and hand it back to her.

Poor design decision. Normal users in 2017 basically never want no-network;
dedicating the cost of a physical switch to it is doubly bad.

My point is that the author needs to stop lamenting that the hoi polloi have
breached the gates and start taking seriously the question of how to design
general purpose computing that actually serves most human beings. "Use Linux"
is not, by itself, a remotely helpful suggestion. His "Mobile" suggestion is
indistinguishable from satire.

~~~
vacri
> _iOS is now ten years old with no major malware outbreaks -- it can
> obviously be done_

If even a company with $200B in ready cash on hand, humongous design and
manufacturing resources, and full control over every part of the vertical
stack can't make a _general purpose_ operating system with these values, what
hope do the rest of us have? iOS is extremely limited in what it allows you to
do. If it weren't, then how come MacBooks still sell?

I've been a tutor at the tertiary level before. Some people just can't be
fucked putting in _any_ effort to get what they want. Fuck them; they get to
be called incompetent - don't shift the blame. If someone could barely drive a
car and smashed it into every tenth lamppost, you'd call them an incompetent
driver. Or another example: I can't cook for shit. I can follow a recipe, but
the result isn't going to be very good. But you would never argue that it's
the recipe-writer's fault that I couldn't make a good soufflé.

It's reasonable to expect users to meet developers at least partway here;
reasonable to expect _some_ learning curve for an incredibly complex, capable
device.

------
j9461701
> And paper co-author Paul Kirschner, an education researcher at the Open
> University of the Netherlands in Heerlen, happily describes himself in his
> academic work as a “windmill-fighter”. But whereas Don Quixote aimed against
> solid walls, the digital-native assumption, on closer inspection, does seem
> illusory. It is certainly no giant.

To fight windmills, as in the English idiom 'to tilt at windmills', means to
attack non-existent enemies out of a delusional sense of righteousness (this
is where the word "quixotic" comes from). In the story, Alonso Quijano read so
many romantic novels about knights and chivalry that he went insane, renamed
himself Don Quixote, and set off to vanquish windmills he imagined were giants
and to swear fealty to an innkeeper he imagined was a lord (all for the honor
of a lady love who doesn't actually exist).

The paper's authors being the Don Quixote figure doesn't make sense, as I
doubt they were trying to insult themselves. A more apt analogy would be that
the companies trying to grapple with a new generation of digital natives are
the windmill-fighters (fighting an enemy that doesn't exist), while the
paper's authors are Sancho Panza, desperately trying to get them to see
reason.

~~~
hyperdimension
I learned, and always thought, that the phrase "tilting at windmills" meant,
aside from what you wrote about "attacking... out of a delusional sense of
righteousness", something more like a futile task: something that wouldn't
succeed regardless of your efforts. Because of that, I interpreted the quote
not as describing someone who invents enemies out of nowhere, but rather
someone who is attempting something more-or-less hopeless or mostly impossible
(telling others that "the digital native is a myth").

They're close, but to me there's a distinction between the two.

~~~
j9461701
Don Quixote is a pathetic, vainglorious egotist with mental problems, whose
attack on the windmills is just one of his many feats of remarkably willful
stupidity. He isn't fighting a good battle that is unlikely to be won; he's
inventing battles out of nothing so he can feel special and important, when
really all he's doing is being irritating and getting beaten up by inanimate
objects (the windmill's sails). He's to be laughed at or pitied, not respected
or admired. That was something of the point: take a classic knight in shining
armor from romance stories, put him in real life, and watch what a loser he'd
be.

This might be a case of pop culture failing to capture the essence of a
reference, with the misunderstanding gaining higher prevalence than the true
meaning. Calling someone a 'nimrod' is another example. Nimrod was,
unironically, one of the greatest hunters in the Bible - smart, swift, strong.
But Bugs Bunny called Elmer Fudd that, and Fudd was an idiot. So anyone not
familiar with the biblical character assumed nimrod == idiot, rather than Bugs
calling Fudd that sarcastically.

------
doozy
The digital native is not a myth, he's just going extinct.

Back in the day the difference between a computer user and a computer
programmer was, at best, blurry. You had to know how to program to use a
computer.

And hardware and operating systems were complicated things you had to study;
they weren't the appliances of today that "just work".

I think the turning point was around the time Windows XP and Mac OS X were
released. The difference in skills between those raised before that time and
those raised since is mind-blowing.

So the digital native exists, you just won't find one without grey hair.

~~~
KGIII
This reminds me of a story I like to share. You can have the short version.

I touched my first computer in the early seventies. It was from HP; it took
punch cards, mag strip cards, and keyed entry, and could output to a plotter,
TV, or LED display. It was horrible. I hated it.

Late seventies, early eighties: I now have to use a terminal connected to a
remote computer, and I have a computer at home that required additional memory
just to use lowercase letters. It was horrible. I hated it.

Late eighties, it hasn't improved. No, no it has not. I still have to dance a
strange dance and wait for my work to be completed. Usually, that meant
waiting for my work to be completed in another part of the university. If I
wanted to search the 'net, those searches went out to people - who'd send me
an answer back in as long as 72 hours. It was horrible. I hated it.

I pretty much hated computers until the 2000s. Oh, I used the Infernal Beasts
of Spite and Retribution - but it was adversarial. We're on friendly terms
now, more like old enemies too tired to continue fighting but with mutual
respect.

I don't like programming, tweaking, or fixing. I will, but I don't like it. In
fact, I use Linux because she's a familiar beast and I pretty much don't ever
have to screw with it. It does what I want and stays the hell out of my way.

That's the short version. It usually contains more details and a whole lot
more vulgarities. As a parting comment, it has been wonderful to live through
these changes and advancements.

------
microcolonel
Today's computers seem to try very hard to draw a distinction between
consuming content and being productive. These tasks tend not to share any
skill sets at all. My much younger sister was a wizard at binge-watching
YouTube videos on iOS at ~6. She used most of the features at her disposal:
subscriptions, autoplay, voice search... but despite all that practice, she
never really took to the keyboard, and she is functionally incapable of typing
a URL in a timely fashion or structuring a text search for something full of
homophones. She basically just gives up if it won't work through voice search.

I got into computers as an interest because on Windows 95, 98, 2000 for
WorkGroups, and other contemporary Windows versions (we were not affluent),
code of some sort was just a step or two away from using the machine at all.
Most computers today (iPhones, WinRT, etc.) don't even have shell scripts on
the desktop or a JavaScript console in the browser. Operating systems in the
late '90s and early 2000s gave the impression that there were consistent
underlying concepts like _files_ and _processes_, or at least _tasks_. You
used to be able to market your dual-core processor as being able to run more
tasks simultaneously without trouble; nowadays you just tell the consumer it's
"faster", and they don't understand why the same app runs slower on their new
phone than on the last.

~~~
freehunter
To be fair about the keyboard thing, when I was born computers were programmed
through punch cards, and I never learned how to use them. They switched to
keyboards pretty quickly after that.

So, to a 6-year-old who can't use a keyboard... well, I don't see keyboards
disappearing _quite_ that quickly, but it's obvious that voice interaction is
going to feature quite heavily in the future of computing.

~~~
skummetmaelk
The difference is that keyboards are better than punchcards in every way.
Voice is inferior in almost every way.

~~~
soared
Voice:

\- hands free

\- quicker for short queries ("set a timer for 5 minutes")

\- quicker when precision is not necessary (voice to text can be quicker than
typing)

\- doesn't require extra hardware

\- innate/intuitive skill, doesn't need to be learned/practiced

\- no need to point + click. Like terminal, input what you need and go
directly there

\- accessibility is 100x better (kids, elderly, disabled)

~~~
khedoros1
Voice:

\- socially embarrassing

\- prohibitively imprecise (when is precision _not_ necessary??)

\- low work, but also low quality

\- currently popular methods require some outside company handling your audio

\- also require a good network connection, which shouldn't be treated as a
given

These are all related to my own situation and preferences, of course, and I'm
sure that there are some people for whom your list of positives vastly
outweighs my list of negatives. No one's served well by assuming that everyone
has the same requirements.

~~~
freehunter
I want to make it clear that I'm not specifically targeting you for this
comment, it's more of a meta comment on the whole subject we're discussing.

I find it interesting how people tend to get stuck on the current forms of
technology and then become outraged when anyone dares challenge them. Who
needs Windows? I boot straight to DOS. Windows N sucks, I'm going to stick
with Windows N-1. Blackberry phones are useless and pretentious, I'm going to
stick with Nextel. The iPhone is stupid, I need a physical keyboard. Bluetooth
is awful, I need my headphone jack. The iPod is worse than a Nomad. The iPad
is just a big iPhone. And now, apparently, voice control sucks, I'm happy with
my on-screen keyboard.

And as history shows us in nearly every one of those cases, we tech guys are
typically the worst judges of how well a new technology will fare in the
market and of what consumers actually want.

~~~
khedoros1
> And now, apparently, voice control sucks, I'm happy with my on-screen
> keyboard.

Nah, on-screen keyboards suck, I'm happy with my non-mobile devices (when
possible). Bluetooth _is_ an expensive pain in the ass, though ;-)

I'm used to my desires not meshing well with the way that markets go (and I'll
bet that other like-minded techies are too). Out of the specific things I
mentioned with voice recognition, most of them will change with technological
improvements and shifts in societal norms. Change is inevitable.

Honestly, I'll still find a way to do what I want, even if most other users
are happy to have tech go in a different direction. I'm not deluded enough to
think that what I want is what anyone else wants.

------
627467
I've seen my niece (when she was just under 4 years old; she's 5 now) open her
dad's YouTube app on an Android phone, hit the voice button, and shout "Peppa
Pig" to find what she wants.

We were shocked to see it.

She's a native Spanish speaker, and she was not as fluent in her own native
tongue as some other kids her age. Her parents don't know how to use voice
search. No one around her does. I only use it for setting alarms, and never
_in front of anyone_ (I'm self-conscious about it).

~~~
sidlls
On the other hand, I've got kids near that age, was born before 1980, and I
cheerily use voice commands and many other technology artifacts of the day
just the same as I'd do any other thing in a given social context - and much
better than my kids do (thanks to the benefit of having lived a number of
years and the experience gained from that).

I'm not sure why anyone is shocked to see someone using modern technology
that's made available to them. How do you figure humans learn? We build on
what was learnt before, which requires a degree of familiarity.

~~~
627467
Fair questions. In this particular case, I doubt she has seen anyone use that
particular technology.

But I guess she has seen her family use WhatsApp audio messages all the time
and made the analogy herself when she saw the "mic" icon on YouTube.

It is also a testament to the ease of use of some of these tools.

------
gaius
The BBC like to describe older people as "digital immigrants" to contrast with
"digital natives" but seem to not quite comprehend that the "digital
immigrants" _actually built all this stuff_ whereas "natives" merely use it.

~~~
copperx
I don't see anything wrong with the terms. It's just like in the real world.

~~~
Terr_
Consider the difference between "indigenous natives" and "second-generation
immigrants."

Technical professionals are arguably the indigenous natives, present "before
it was cool" and working to coerce an inimical environment into something
suitable and appealing to people.

~~~
resf
Consider Dubai. Built by immigrants, used by natives.

~~~
Terr_
Considered in many ways an exception and unusual occurrence.

------
perakojotgenije
No news here. This excellent blog post is from 2013: Kids can't use
computers... and this is why it should worry you[1]

[1][http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-
co...](http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-computers/)

~~~
mixmastamyk
Thanks, that post was the first thing I thought of, but couldn't remember the
details.

------
joaorico
There's a slight confusion in this thread, which many are rightly pointing
out, as there's a bit of context missing from the article.

The thing is, the term 'digital native' has been _heavily_ used as a way of
saying that children who grew up surrounded by tablets, TVs, and the internet
actually _learn differently_ from the so-called digital immigrants.

For an example, see these quotes by Marc Prensky cited in the literature
thousands of times:

‘today’s students think and process information fundamentally differently from
their predecessors’ [1]

‘Our students, as digital natives, will continue to evolve and change so
rapidly that we won’t be able to keep up’ [2]

‘Our young people generally have a much better idea of what the future is
bringing than we do’ [2]

‘In fact they are so different from us that we can no longer use either our
twentieth century knowledge or training as a guide to what is best for them
educationally’ [2]

Now, this might immediately sound extremely suspicious to anyone with some
background in the research on how humans learn, but it's a common myth all
over the learning world: 'Oh, I can't learn from books, I need interactive
videos and apps; that's why I have bad grades.' Computers are actually an
excellent tool for learning if applied correctly, but that is not what is
usually being implied in these situations.

There's a big confusion between how much children are familiar with using
technology from a consumer's point of view, and the natural human ability to
learn. The latter hasn't changed.

The debunking of 'digital natives' is not new either [3].

By the way, for anyone interested, I recommend Hattie and Yates' book on
learning [4] as a great, very balanced, modern introduction.

[1] (cited 18,000 times!) Prensky, Marc. "Digital natives, digital immigrants
part 1." On the Horizon 9.5 (2001): 1-6.

[2] (cited 1,000 times) Prensky, Marc. "Listen to the natives." Educational
Leadership 63.4 (2005).

[3] (cited 2,700 times) Bennett, Sue, Karl Maton, and Lisa Kervin. "The
'digital natives' debate: A critical review of the evidence." British Journal
of Educational Technology 39.5 (2008): 775-786.

[4] Hattie, John, and Gregory C. R. Yates. Visible Learning and the Science of
How We Learn. Routledge, 2013.

------
mtl_usr
I stopped believing in the "digital native" buzzword after spending some time
with some older relatives.

Because nobody makes good old-fashioned cellphones that can only make calls,
they were all upgraded to touch-screen smartphones. A lot of them came to me
for directions on performing the most basic tasks. They were interested in
downloading apps, getting the news from their devices, managing pictures, and
generally browsing the internet from them.

At first I explained every step of the process, and I noticed they would all
mechanically memorize the steps without really understanding any of them. So I
changed my approach. I told them that, software-wise, whatever happens to
their device I can repair it, and that instead of waiting for me to have a
minute to go over the steps with them, they should try things themselves, and
if something bad happens they should come to me.

This has changed their behavior dramatically. They are now more confident
using their devices, knowing that someone can get them out of trouble should a
problem arise, and they try new features by themselves. An example is when
they enabled voice control and started to play with it.

I like to separate users into two categories that I consider more realistic
than "Digital Natives" and so-called "Digital Immigrants": those with an
"active" posture toward technology and those with a "passive" one. The active
crowd search for solutions themselves and learn by exploration, while the
passive crowd will pretty much only do as directed and only what they have
been taught with clear explanations. I've noticed that this pattern of
refusing to explore new tools and platforms is not closely related to age:
many young "Digital Natives" would get extremely confused when exposed to new
tech. Even stuff as basic as an old-school car GPS (not their usual phone)
would confuse them.

------
kensoh
I was born in 1981, so according to the article I've just made the cutoff to
be a native? What I do observe is this (an over-generalization, but it seems
true to me):

Pre-internet era - people know little but do a lot, and deeply.

Post-internet era - people know a lot but do little, and shallowly.

~~~
dmmc
I was born in 1980 and, having grown up with computers from an early age, I
totally agree with pervasive internet access being the next dividing line.
Computers allowed us to do many things we couldn't do before, but the internet
took communication and information sharing to another level.

------
jrnichols
Working in tech support anywhere would have shown you this. The supposed
digital natives aren't born web-savvy... they're the ones that just expect
everything to work flawlessly and "just make it work."

They don't understand how or why something works, and they don't care.

And get off my lawn. :)

------
sexydefinesher
Excellent, the nerd aristocracy's grip on IT stands unchallenged.

~~~
delinka
And how would it ever be challenged? The moment some non-nerd, non-aristocrat
becomes conversant with the machine, gains power over it, molds it to her
whims, she has become the nerd aristocracy.

The nerd aristocracy's grip will forever remain unchallenged.

~~~
chongli
_The nerd aristocracy's grip will forever remain unchallenged._

Yeah. This story reminded me of the Star Trek: TNG episode _Schisms_. There is
a scene when Riker, Troi, and LaForge are on the holodeck trying to recreate
the aliens' surgical table from memory. The way the characters interact with
the computer to accomplish an extremely complex task -- build a model of a
table from fragmented dreams -- is remarkable.

The scene showcases how natural and comfortable they are at talking to the
computer. Though the episode has its detractors, I'm glad they made it. I can
imagine the characters regularly using the holodeck to model and design all
sorts of objects in the course of their normal duties. It demonstrates that
the power of their tools goes way beyond silly period-piece virtual reality
games.

The twenty-fourth century nerd aristocracy no longer needs to use plastic
keyboards and text editors from the 1970s. That doesn't stop them from being
sophisticated users of their technology. _Programmers_, even.

------
wallflower
When my niece was very young, she would try to swipe framed pictures on the
wall.

It's not about being a digital native, but about having expectations of
interactive objects.

------
ZenoArrow
When most people talk about digital natives, they aren't talking about
learning styles.

The idea that you can dismiss the concept of digital natives just because they
don't actually learn differently is a little wacky. "Digital native" implies
familiarity and comfort with the digital technology one grew up with, not
anything else.

~~~
nkuttler
So, just like every generation before them that was comfortable with the world
they were born into. Got it.

~~~
ZenoArrow
Yes, exactly. It doesn't say anything more profound than that.

------
pzh
Digital natives? Are people still using this term? So the people who invented
the Internet and most of modern technology are called "digital immigrants"?
How can one take lazy journalists who still use these terms seriously?

~~~
khedoros1
> So the people who invented the Internet and most of modern technology are
> called "digital immigrants"?

Consider if a community designed a new language. By definition, no one in the
community would be a native speaker. Couples start having children and
teaching the children that language. Those children will be the first native
speakers. The best of the non-natives would still have more technical
knowledge of how the language works than the average native speaker. That's
the situation that we're looking at here.

------
nitwit005
I suspect it'd be more accurate to say that many of them know quite a bit;
it's just that there's no overlap with what schools want them to know.

I'm sure a younger version of myself would have completely failed a basic test
of computer literacy. And yet I set up DOS extended memory, configured
hardware settings for the Sound Blaster, and calibrated a joystick, all so I
could play TIE Fighter.

The only skill I think a school might have valued was that I learned a tiny
bit of BASIC to fiddle with that QBasic game where the gorillas threw bananas
at each other.

------
duckingtest
It turned out to be a misconception because computers are astronomically
easier to use now. If you ever had to configure a config.sys entry for a
specific game, you know what I mean.
------
mahyarm
I think it's somewhat like cars. One generation never used cars; another grew
up alongside cars, when changing your own oil was a standard thing people did;
and now cars are better and people don't know much about them anymore.

------
bitwize
"Digital natives" is a codephrase used by employers to mean "we only want to
hire young people; if you're over 30 you won't be considered". It's dogwhistle
discrimination.

------
Kenji
My sister is a teacher and confirms it: while kids play around with their
smartphones all the time, they are remarkably ignorant about the basics of
computing. They are users, watching videos and composing texts, but when it
comes to drivers (okay, who in today's world installs drivers themselves?), OS
installations, or the BIOS - and also simpler things like proper backups or
Excel formulas - they are completely clueless.

> _People born after that date are the digital natives; those born before are
> digital immigrants, doomed to be forever strangers in a computer-based
> strange land._

That's a ridiculous notion anyway. If you put in time and effort, you can
learn pretty much anything, and computers and programming are no different, no
matter the age. And if you do not put in time and effort, you always remain
ignorant. It really is that simple.

~~~
kazen44
> That's a ridiculous notion anyway. If you put in time and effort, you can
> learn pretty much anything, and computers and programming are no different,
> no matter the age. And if you do not put in time and effort, you always
> remain ignorant. It really is that simple.

I tend to agree, but younger people have the 'advantage' of being immersed in
computers from a very young age. Children tend to pick things up more quickly
and easily than adults (this is especially true with language; learning a
second language as a child is far easier than doing the same as an adult).

Also, I think it's unfair to judge people on their knowledge of computing when
they are not in the computing field.

Most people don't know how the gearbox in their car works, yet they are still
able to use a car just fine.

~~~
Kenji
> Also, i think its unfair to judge people on their knowledge about computing
> when they are not in the computing field.

Yes, but you wouldn't call people who don't know how the gearbox works
"mechanical natives" or something like that just because they grew up with
cars. You acknowledge that there are people, like mechanics, who know that
stuff because they learned it, and other people who do not because they did
not learn it. It has nothing to do with generations and so on.

