I teach at a college and I see firsthand how terrible these "digital natives" are at technology. Not all of them, of course, but enough of them to make me laugh when I hear someone say young people are a lot better at tech than old people.
What is happening is that there are now different sets of skills: touchscreen skills and keyboard skills. It isn't even that they are particularly good at using phones (students say they don't see email announcements because they don't check email; when I ask why they don't set up their phone with the university email, they say they don't know how). It's that they use phones more and have less experience with laptops and desktop computers. While setting up for presentations I watch them struggle to log into email or Google Slides. They type "google" and hit enter to go to google.com, then search for YouTube, then at YouTube search for the video they want to show. Strangely, when they cut and paste a URL they almost always right-click, select paste, and hit return, and nearly never use "paste and go", which is right below that in the context menu. They don't know Ctrl+F.
So if "digital natives" means anything, it doesn't mean that they are automatically good at tech.
I think you're seeing the "Generation Z" effect among your students. As a Millennial, I've noticed it too. Millennials are really the only generation that "grew up" with digital technology and the internet. We had to debug viruses, worry about floppy disk compatibility, learn how to change out failed hard drives, etc. if we simply wanted to use technology at all. Whereas "Generation Z" (those too young to remember 9/11) grew up with nothing but highly polished devices like the iPad and MacBook. There was a short window from the early '90s to the early 2000s where you could really call someone a "digital native", IMO.
Notice the slight difference between "grew up with" and "grew up alongside". The term "digital native" was coined specifically with the former in mind: those too young to remember offline. It's a misunderstanding when "digital native" is associated with widespread tech skills; that's more a thing of the home-computer generation.
"A kid puts his hand up. He tells me he's got a virus on his computer. I look at his screen. Displayed in his web-browser is what appears to be an XP dialogue box warning that his computer is infected and offering free malware scanning and removal tools. He's on a Windows 7 machine. I close the offending tab. He can't use a computer."
"Ask them to reinstall an operating system and they're lost. Ask them to upgrade their hard-drive or their RAM and they break out in a cold sweat. Ask them what https means and why it is important and they'll look at you as if you're speaking Klingon.
[...]
How the hell did we get to this situation? How can a generation with access to so much technology, not know how to use it?"
Why the hell do we expect kids to know about things nobody ever told them exist? That seems to me like the more reasonable question. Kids don't know how the human body works until we tell them, and they don't know how to read unless we show them, so why would we expect them to know how to reinstall an operating system when no one has ever used the words "operating system" in front of them?
If no one ever shows RAM to a kid, the kid won't be able to act on a request to change it. That is the default state; a kid knowing a thing is a state achieved by somebody teaching the thing to the kid.
Do you think the average driver is more knowledgeable about their car than the average computer user is about their computer (for the sake of argument, let's limit both to the same age group)?
Computers are now appliances rather than power tools.
I don't think they are more knowledgeable, I don't expect them to be, and I am not surprised that driving schools teach driving rather than how cars work. Nor am I outraged that most people are not experts in everything and don't have deeper knowledge of my profession.
I think that all kids should be taught enough about computers to know it is not magic, that it is learnable and actually easy once you want to do it (and you don't actually need to be a genius to learn it, as many believe).
However, I really don't think they all need to know Linux administration.
If I remember that article correctly they weren't saying that kids should be able to know these things, rather, that parents insist that they do because they are "digitally native".
I've heard something similar about Baby Boomers and motor vehicles. They grew up alongside the development of cars and had to learn the basics of how a car works, whereas later generations grew up with more complicated cars.
This is definitely true of my father. When he was young owning a car meant working on cars, especially for a teenager who couldn't afford to take it to the shop every couple of weeks.
Cars are so reliable and take so little maintenance nowadays, he marvels at them. OTOH they're also very unfriendly to the shade-tree mechanic. The engines are covered by plastic cowling panels. Everything is packed in so tightly it's a nightmare to change anything. And it's all controlled by proprietary black-box computer systems.
This is a very apt analogy. Listening to stories from my dad about his early vehicles makes me both jealous and grateful. He was forced to learn how to do all kinds of component repairs and be prepared to do them on long road trips, because cars would just occasionally break down. I'm jealous that I was never forced to pick up all of those hands-on mechanical skills, but I'm grateful that I don't realistically have to worry about constantly repairing my car to have transportation.
I think people are overreacting to simple facts of life: if a technology is mature and the services around it plenty and cheap, there's no point wasting your time learning the ins and outs of it.
I don't see what the problem is if someone doesn't know how to add a RAM stick just because I do, the same way I don't feel compelled to know how to replace a light bulb to save myself a tenner
"[Some of us chose] to debug viruses, worry about floppy disk compatibility, learn how to change out failed hard drives, etc. if we simply wanted to use technology at all"
Now, "Millennials" is a pretty uselessly broad term, so depending on what age range you are actually talking about, there is slightly more truth to this. If you are talking about someone who is 29-34 today, then they would have been early adopters of these technologies, and it did require a high level of expertise. However, this was a very small portion of the population.
If you are talking about 22-25 year olds, then computers and the internet had become more common in households and had been de-skilled. I know plenty of people who have used and owned computers most of their lives and have never debugged a thing.
Nerds seem unable to understand that most people use computers like any other tool. They do not obsess over it, seek to understand how it works, or otherwise care.
Like a microwave, they buy a new one if their current computer "breaks." Like a light fixture, if it malfunctions, most people hire a repairman instead of doing it themselves.
This has always been true for the mainstream users of computers. It only seems more absurd now because 16-19 year olds have had smartphones and other technology luxuries their whole lives, and so we expect them to have invested more interest into truly learning how to use them and how they work.
But availability of a technology does not have much of a connection to proficiency of said technology.
Ironically, I am sure most of the people here who whine about how little other people understand computers have a similar lack of understanding of other objects they use every day. How many people in programming/IT know anything about plumbing or car maintenance?
No matter the generation, the majority of kids did not have to debug viruses, nor did they have the opportunity to do anything advanced. Only a small subset of families encouraged or made such things possible (and only for a subset of their children).
I've never heard anyone call generation Y people that were too young to remember 9/11. I thought gen y _was_ millennials. That being said I do agree with what you're saying I just found it confusing.
My understanding was that Gen Y originally referred to older millennials. At least, as a younger millennial, that's what I always heard them called growing up, until the millennial term got solidified. Meanwhile, Gen Z is the name for the people younger than the millennial cohort until someone coins a catchier one.
As an aside, I always thought that the age range for millennials was about twice as broad as it should be. I don't really find myself identifying with the older members of the cohort.
Yeah, being born in 1978 I was definitely not Gen X growing up. Gen Y floated around a bit and seemed to refer to me, but once Millennial gelled it seemed to cut off at 1980 or later, and now they want to call me Gen X. I don't really identify with either, though.
Hah. It started with GenX (my label) because of some book with the same name. I read it and didn't relate to it at all. It's something the news likes to do .. generalize an entire generation of Americans. Usually to write unsavory articles about how crappy they are. We got the same treatment.
I think the fundamental confusion is that risk-taking resembles competence.
I believe that's what drives comments like "kids these days are so good at technology". When archetypal Aunt Tilly looks at somebody younger using a computer, she sees that they do not hesitate when navigating around, quickly clicking on everything that appears.
However, that doesn't always mean they know what they are doing. They're simply much more comfortable trying and hoping that it won't end badly. (Or at least not badly in a way that they can recognize, until someone tells them that their computer is full of malware.)
I always had the same perception growing up. When an older relative asked me to fix the TV settings or something, I had no idea what I was doing. I just clicked things that sounded reasonable until something worked out.
It's almost impossible to mess one of these devices up for good (other than malware), but I think most people who are not very technology literate see them as precariously balanced items that crumble at the slightest touch.
Yes, bricking them is hard. But as we grow older we like to have things as we recall them, as that reduces cognitive load. And TVs etc. are crap at reverting to the last good state (as defined by the user, not the OEM or some automagical system).
Also, the last thing you want is for it to be stuck in some "useless" state when it is needed...
As someone who has worked helpdesk before, all of those things are widely prevalent in every generation. Old people, middle aged people, and young people alike all google "Google" and then google "Youtube" to get to a video. Everyone sucks at logging in. Everyone finds a way that works for them and doesn't know any alternatives.
The real answer is, computers suck. Seriously, computers are effing awful machines, all developers are hackers, and the only reason things haven't changed is because we constantly lie to ourselves that we're okay with the world.
Passwords suck, no one can remember them, we have way too many of them to remember, and every damn site needs its own special password requirements. And on top of that every site needs its own username requirements too, so there's two things to forget. And email is the universal password, so you want to make sure you use a good password there, but you never really log into your email, do you? The password is saved. So when you get logged out somehow, good luck remembering the password.
And don't get me started on keyboard shortcuts. At least Windows is consistent, quick question: what's the keyboard shortcut on a Mac to go to the beginning of your current line? Ah trick question, it's different for every god damned application! And some applications don't even allow for that without highlighting the entire line!
There's a reason "digital natives" are great at using phones and choose to use them more. iOS and Android were built from scratch and could learn from all of the mistakes of the old systems. In the process, they came up with a ton of their own mistakes (3D touch and long press each do different things? Really? Also TouchWiz is just a bad idea), but consistency and focus are major wins here.
Computers suck and people get scared. Only having one app on the screen at once, only having one way to do things... that's what people like. It's not perfect, but man computers suck. Anything that makes them suck less is a huge win.
>The real answer is, computers suck. Seriously, computers are effing awful machines
It's easy to brush off the millions of things that computers allow us to do that we couldn't do before because of some terrible UIs.
Computers suck, the internet sucks, but the pony express and cargo boats sucked worse. I can get a message across the world in about a second. No applause for that though. No praise for the things we can now do. We're just too eager to find things we wish we could do, and blame our current technology for not knowing our dreams.
Yeah I covered that in the “we constantly lie to ourselves” part. Computers have enabled us to do wonderful things. Like criticize people we don’t know for using software in a different way than we do. Yet we, as the creators of this software, have not only enabled them to use it “incorrectly”, we’ve actively encouraged it by saying things like “it’s not the computer’s fault it doesn’t know your dreams”.
If only those pesky users would just read the source code, they’d understand why this site works when you put www in front of it but doesn’t work without it! And if they want it changed, well it’s open source! They should send a pull request!
> what's the keyboard shortcut on a Mac to go to the beginning of your current line? Ah trick question, it's different for every god damned application!
Sure, but ain't nobody got time for getting the right hand out of the home row. Emacs keybindings everywhere is one of the things that keeps me using OS X. As a bonus, remapping Caps Lock to Control like a proper UNIX keyboard is a few clicks away.
I am sympathetic, to a point - mainly because it is possible to improve interfaces, there's no physical constraint preventing that (there's also no guarantee that a different interface will work equally well for all users, of course). But I also would suggest it's reasonable to expect a certain level of basic understanding. Imagine someone pumping gas into their radiator, because the two holes were the same size. We don't design cars to prevent that, but we also don't expect people to make that mistake.
"what's the keyboard shortcut on a Mac to go to the beginning of your current line? Ah trick question, it's different for every god damned application!"
I haven't seen any macOS applications that don't accept Ctrl-A (or any other Emacs-style shortcuts, while we're at it). Maybe I just haven't looked hard enough.
I think a consistent history-rewind button for every app would take a lot of the anxiety about breaking software out of this and allow people to experiment. Oh, you rearranged every single button in Excel off the edge of the screen? The history button takes you back to the state before you started.
After all, we do that too with git and containers.
To do this we need to start modeling our applications using state-based design. Redux is a good push from engineering in this direction, and Elm is also great for this idea. The most important part of your boring business application is the application's state and the user's state. Take the time to model it, and to model how to move it forward in time with actions, as in the sketch below.
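To make that concrete, here is a minimal sketch, in plain C++ rather than Redux or Elm themselves, of the idea: a toy "append text" app (entirely made up for illustration) whose state is only ever produced by folding a log of actions, so a universal history-rewind button is just "replay fewer actions":

    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    struct State { std::string text; };

    struct Action {
        enum Kind { Append, Clear } kind;
        std::string payload;
    };

    // Pure reducer: the next state depends only on the previous state and the action.
    State reduce(State s, const Action& a) {
        switch (a.kind) {
            case Action::Append: s.text += a.payload; break;
            case Action::Clear:  s.text.clear();      break;
        }
        return s;
    }

    // Replaying the first n actions reconstructs the state as it was n steps in,
    // which is all a history-rewind button would need to expose.
    State replay(const std::vector<Action>& log, std::size_t n) {
        State s{};
        for (std::size_t i = 0; i < n && i < log.size(); ++i) s = reduce(s, log[i]);
        return s;
    }

    int main() {
        std::vector<Action> log = {
            {Action::Append, "hello "},
            {Action::Append, "world"},
            {Action::Clear,  ""},
        };
        std::cout << replay(log, 2).text << "\n"; // "hello world" (before the Clear)
        std::cout << replay(log, 3).text << "\n"; // "" (the current, post-Clear state)
    }

The specific framework doesn't matter; once every change is an explicit action over an explicit state, undo, audit logs and "take me back to before I broke it" fall out almost for free.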
There are things people should have to think about and there are things people should just be able to do without effort. I don't want my doctor having to spend brain power figuring out how to use his new software; I want him to use his entire brain to diagnose my disease. Programmers should think. Users should just do. If users have to think about how to use their software, the developer failed at their job.
> If users have to think about how to use their software, the developer failed at their job.
Under that definition, I don't think that there's ever been a successful developer. I'd set the bar more at the ecosystem level. Once a user learns how an ecosystem works, the software in it should be as intuitive as the complexity of their tasks allow. Anything more seems like making perfect the enemy of good.
Developer and user definitions of "suck" are completely different.
But [thread] OP is correct. UIs absolutely do suck.
The proof is that after fifty years of UI design, there's still no such thing as a standard system/technique/anything for assessing the quality, reliability, and effectiveness of a user interface.
There's been almost no effective research into cataloguing the most common kinds of user error OR developer error and designing systems that minimise those errors.
There's a lot of tribal and religious noise about operating systems and languages, but virtually none of it is based on empirical research.
> ...quick question: what's the keyboard shortcut on a Mac to go to the beginning of your current line? Ah trick question, it's different for every god damned application! And some applications don't even allow for that without highlighting the entire line!
I realise this isn't really the crux of your post, and I broadly agree with you that computers still suck. But the answer to this question is CMD + left arrow. I have no idea what applications you're using where this doesn't work.
I think we take for granted a lot of assumed knowledge that the younger generation simply isn't taught or hasn't been required to learn.
The other day one of my coworkers, a senior engineer in his 40s, sent a simple C++ program he had written to dump some data into a CSV file to another coworker (a new graduate in her early 20s) via email. I was CC'd. The email body simply said:
Usage:
convert.exe [infile] [out.csv]
She stopped by my desk later in the day and I mentioned in passing, "How did that program work out for you?" She then complained that she couldn't work out how to supply the program with the arguments; double-clicking the file did nothing for her. It turns out she had never used a command line in her life.
I was shocked. I grew up on the command line, and it boggles my mind to think there is an entire generation that has been raised on nothing but a GUI. This is quite a smart person with a (non-software) engineering degree, mind you.
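For anyone who has only ever double-clicked programs, the missing piece is that the two bracketed things in that usage line are just positional arguments the shell hands to the program. A hypothetical sketch of the receiving end (the real convert.exe surely did more than this, and the file names below are made up):

    #include <cstdio>

    int main(int argc, char* argv[]) {
        if (argc != 3) {
            std::fprintf(stderr, "Usage: %s [infile] [out.csv]\n", argv[0]);
            return 1;
        }
        const char* infile  = argv[1];  // e.g. data.bin
        const char* outfile = argv[2];  // e.g. out.csv
        std::printf("would convert %s -> %s\n", infile, outfile);
        return 0;
    }

So the intended invocation was something like "convert.exe data.bin out.csv" typed at a command prompt; double-clicking the exe runs it with argc == 1, the usage message flashes by, and the console window closes, which would explain why nothing appeared to happen.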
In MS land it's very common for developers to be unwilling or unable to use the command line. The two groups where this is most notable are the younger devs and the ones that come to development via the access/vb classic path. To them if something can't be done in visual studio then it can't be done.
One of our code tests for hiring is writing a very simple version of grep with only a few flags like case sensitive, recursive and inverted. Of the 10 or so people that kind of completed it, not a single person seemed to understand that multiple flags could be specified (in any format).
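For context, the flag handling that tripped people up is tiny. A minimal sketch of the general shape (not our actual test or a full solution; recursion over directories is left out for brevity) that accepts flags separately ("-i -v") or combined ("-iv") and filters stdin:

    #include <cctype>
    #include <iostream>
    #include <string>

    int main(int argc, char* argv[]) {
        bool ignore_case = false, invert = false;
        int i = 1;
        // Consume leading flag arguments; each may bundle several single-letter flags.
        for (; i < argc && argv[i][0] == '-' && argv[i][1] != '\0'; ++i) {
            for (const char* c = argv[i] + 1; *c; ++c) {
                if (*c == 'i') ignore_case = true;
                else if (*c == 'v') invert = true;
            }
        }
        if (i >= argc) {
            std::cerr << "usage: minigrep [-iv] pattern < input\n";
            return 2;
        }

        auto lower = [](std::string s) {
            for (char& ch : s) ch = (char)std::tolower((unsigned char)ch);
            return s;
        };
        std::string pattern = argv[i];
        if (ignore_case) pattern = lower(pattern);

        std::string line;
        while (std::getline(std::cin, line)) {
            const std::string hay = ignore_case ? lower(line) : line;
            const bool match = hay.find(pattern) != std::string::npos;
            if (match != invert) std::cout << line << "\n"; // -v flips the test
        }
        return 0;
    }

Once you see that combined flags are just a loop over characters, "in any format" stops sounding exotic; that's roughly the realization the exercise is meant to provoke.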
FWIW, he should have written a simple WinForms or WPF app instead. It probably would have taken just as much time as parsing the args[] array with proper input checking. I know command line because that's all there was for a time. Windows command line is pretty horrible compared to *nix.
I work in a school and have found it's really easy to spot "Digital Natives" by the method they use to capitalize letters.
They will almost always use Caps Lock instead of the Shift key, so rather than momentarily hold down the Shift key and press the letter they want capitalized, they will press the Caps Lock key, type the letter and then press the Caps Lock key again.
Considering that there is only one Caps Lock key compared to two Shift keys I am not quite sure why they do it this way since their hand has to travel further when capitalizing letters on the right-hand side of the keyboard.
There's definitely some research on this, and from memory they confirm your observations. The older generation actually understands what might be inside your computer far more than the newer generation. The newer generation is so detached from what they are actually using, they have no idea.
As for this 'millennial' business, I think you can quite easily divide the cohort into conscious before they had access to the internet and vice-versa. This obviously varies between geographic regions, socio-economic status and a number of other factors, but from personal experience, there is a massive divide between the two.
This is why I have never enjoyed the black box that Windows is, and that Linux is being turned into in the name of "ease of use".
DOS and earlier Linux had a more transparent and layered setup. But these days we have stuff reaching all the way from the office suite to the GPU, for reasons I can only consider eye candy.
Exactly. When I taught at the university I divided them into mouse skills or no mouse skills. The keyboard was easy to learn, and touchscreens and touchpads were not a thing back then.
The mouse-skilled people were way more advanced and picked up the taught skills much faster in the practical courses, so I had to tune the program to give the novices hands-on help and use only non-mouse-favoring questions in the exam. The beginners and the women were equally good at the academic skills, but the share of women still dropped from 50% down to 20% for social reasons. In the end, though, in the office, the numbers were equal again, even more than 50%, so I was happy.
I would have figured it the other way round. To me it seems that the mousers are the people who do not really learn what they are working with and only click by rote. Those who know how to operate by keyboard, on the other hand, also know the software, as they had to discover the shortcuts somehow.
Also true. The shortcut-skilled people were then followed by people who could make their own shortcuts. At the end were those who taught themselves simple macros to automate things.
I sometimes think that perhaps the myth of the digital native generation arose because an insanely high percentage of the population is so illiterate with computers that they think that just because the kids are spending all of their time in front of screens it means that those kids automatically "understand" computers when in fact they are just learning very specific applications of computers.
I also think it's a bit of a cop out on their part (not sure that it's the right word). The computer illiterate are so turned off by the idea of trying to understand computers that they will gladly throw compliments at any kid just so that the kids will shut up about anything relating to computers. "Oh ho ho, I'm so old you know I don't know anything about them computers and phones and what have ya. But you kids, you grew up with these things, so to you they are second nature."
Edit: The sibling commenter about millennials vs. Gen Z has a good point. Still, I think it mostly applies to those of us who made an effort to learn, just like the people who grew up before that with home computers.
In my experience, the sweet spot for a generation being good at tech came about in the 80s and 90s, when home computers were super cheap, but didn't do much. To make them do something, you had to read the manual, a large chunk of which was about coding.
The result of that is you had people as early as elementary school being exposed to programming through Commodore, TRS80s, Apples, etc. Being exposed to something so young tends to give you more understanding over time. Of course, it wasn't for everybody.
Today, computers are more abundant than ever, but they already do stuff. They've, for the most part, become appliances. You have to actively seek out programming, whereas before, the computer didn't do much without it.
Possible explanation for typing "google" instead of "google.com" - it's because autocomplete makes typing out the full URL unnecessary. I'd be willing to bet the students were running on muscle memory.
The article makes sense. That said, I have two grandchildren who are five and three. They interact with iPads and phones like they're "normal" life. They talk to their Alexas and Siri like they're people. They ask the TV to change channels and ask Mom why nothing happens when they swipe the TV screen.
My oldest grandson had the funniest argument with Siri, she kept saying I don't understand you and he would say "why you're talking to me just like I'm talking to you."
I agree we are still humans and no shocking net-new amazing multi-tasking or concentration skills will come out of the "digital generation". That said these kids are growing up on the shoulders of giants and will expect things to just work. If anything, that could make things harder because they may not want to learn how to make new things work.
All this said my grandmother lived through horse and buggy, radio, TV, Color TV, Computers and men on the moon. She used to say the more things change the more they stay the same. Because people are basically the same no matter when they were born.
> My oldest grandson had the funniest argument with Siri, she kept saying I don't understand you and he would say "why you're talking to me just like I'm talking to you."
Jaron Lanier challenges the Turing test's original intent, asking whether we as humans are lowering our human capacities so machines can pass the test. [1]
There is some interesting interaction between the Turing test and Poe's law. Put simply, it's hard to distinguish a malfunctioning AI from a troll and / or an idiot.
I'm probably being a Luddite here, but I saw my friend's kid talking to Alexa and I couldn't help but think it won't be too long until kids don't need parents much. If they have a question they'll ask Siri. If they want help with their homework they'll ask OK Google. If they want someone to read them a story they'll ask Alexa.
On the one hand that will be great: a partner who never gets tired no matter how many questions you ask.
On the other hand, it's a brave new world. Especially since, as the computer gets better, they might get used to the partner that always fulfills their requests and get upset with the real people who don't.
Watching John with the machine, it was suddenly so clear. The Terminator would never stop. It would never leave him. It would never hurt him, never shout at him, or get drunk and hit him, or say it was too busy to spend time with him. It would always be there. And it would die to protect him. Of all the would-be fathers who came and went over the years, this thing, this machine was the only one that measured up.
Ah. Very apt. That's actually one of the most under-rated lines from the film and such a great scene. For some reason, I had a strong emotional reaction reading this.
As someone who more or less grew up with the web (first computer at 12, finished "High School" in 2012, nearly 23), I can tell you that we've already been living in it.
There were a lot of questions growing up, about particular hobbies for example, whose answers I would just find online, as no one in my small town had any similar interests.
Same deal for homework, e.g. algebra that my parents weren't able to answer, as neither had finished high school themselves nor gone on to STEM-ish careers.
Learning how to properly search is pretty much one of the most valuable skills you can have!
I think talking to Alexa is wonky on the surface, sure, but it's just changing the way in which you access the same information.
I'm more or less in the same age bracket as you (graduated high school in 2015) and you're spot on about the utility of computers for research and the importance of learning how to effectively use a search engine (this is a skill most of my peers lack).
That said, there's a huge difference between getting your first computer at 12 and getting your first computer at 3 (or at least largely unmonitored access to one). It remains to be seen how interaction with technology at such a young age affects social development and maturity, whereas technology's impact on our lives occurred after that development had mostly finished.
It's a bit late for that. Education used to be something parents and close community did. Now parents don't teach much to their children, and most of education is done by institutions of the state (or church in some places).
This was done to avoid heresy, insurrection and rebellion, but it has wider implications.
Computers are just another tool in this change. In fact, computers could even help parents to regain a bit of the privilege to provide education to their children.
As a parent who was "Why"ed to death by two curious kids in the pre-Google era, I can tell you that I would have killed for "someone" to take on a part of that burden! ;)
>>I couldn't help but think it won't be too long until kids don't need parents much
I think you meant need as pure "question answerers," but we do a few things for the kids that Siri and Alexa wouldn't have the stomach or, for that matter, the arms for. Oh, also: love!
I found giving an answer that has some kind of final or really helpful/obvious consequence helps a lot. A reason instead of an explanation (that's tricky, don't mock your child).
Eg:
Instead of:
- Why does it rain ?
- Well, you see, there's that thing about clouds that goes through different pocket of air and...
> if they have a question they'll ask siri. if they want help with their homework they'll ask okgoogle. if they want someone to read them a story they'll ask alexa.
This has almost always been the case. Have you met a single millennial?
I'm confused by this. People tell me I'm one of these "millennials". I think I was 26 when the first, pretty useless, version of Siri came out, and 29 when Alexa came onto the scene. I can't mesh that with it having "always been the case" that millennials asked Siri and Alexa for help with things.
I think different people consider millennials to consist of different age ranges. Mostly, I see people use the term to mean "young adults of today", primarily people in their 20s.
After some quick searching, it appears the starting birth year for millennials can be considered to go as far back as 1977. Having personally been born in 1980, I've never characterized myself as what I hear a "millennial" to be, and the people I know in my age range feel the same way about themselves.
Yeah, we older millennials are super salty about being labeled as such. But professional demographers and sociologists say you are a millennial (at least some of them do; 1980 is borderline), so to the extent that these are defined groups that mean anything, you are one.
I suspect that not wanting to be labeled a millennial is one of the defining characteristics.
How true. I'm a boomer. I'm told that I'm responsible for many ills, hold certain political views, and can't use a computer. None of which are true, but here we are. My generation had the sexual revolution, I'm so left I make Europeans blush, and I used my first computer in the early seventies.
But, people like their boxes and generalizations, or so it seems.
The voice interaction aspect may be new, but books and later the web were my go-to sources when I grew up, not my parents.
So to me it does feel like those "new" problems are not so new after all. Technology is just changing the mode of interaction, not the fundamental dynamic that lets kids entertain themselves without their parents.
I think the term 'digital native' is too ambiguous. There are two things that could be considered being a digital native; one is technical, the other is more functional.
On the technical side, I completely agree with this article. Most people my age (late 20's) and especially younger ones, don't have a clue about the insides of their devices, or how the internet actually works. They also don't seem to be interested in a lot of this stuff. They also tend to think that everything that has to do with their devices is complex.
On the 'functional' (for lack of a better word) side, however, they seem to know exactly what to do on the internet. They know how to get attention on the internet, what their peers are interested in and how to get as many likes as possible. They live on the internet, live stream everything they do and take lots of selfies. This seems to be the reason why we're starting to see lots of younger kids becoming YouTube famous, or starting viral marketing companies.
The last thing, to me, is the part where people really become 'digital natives' instead of 'technical natives'. Also, because most devices are so hard to open, most kids won't even get to see the technical side anymore, unless they want to build a gaming PC.
> They know how to get attention on the internet, what their peers are interested in and how to get as many likes as possible.
Is it specific to an age group, though? My mother is in her 60s and she seems to have no problem getting 100+ likes on FB every time she posts a pic of her grandkids. She knows what _her_ peers want to see.
I see your point, though I'm not sure most younger people really know how to get 'attention' on the internet. They can certainly share stuff like social media sites easily. But actually promoting it is another thing entirely.
The vast majority of young people involved in them struggle to promote their work on sites like YouTube and Twitch. There are definitely more young people than old getting popular there sure, but it's most likely not the majority.
You can't pay rent in likes and retweets, but you can pay rent in dollars earned by you getting your company likes and retweets. More and more companies are hiring "social media experts" (often interns, if my anecdotal observations are anything to go by) for exactly this reason.
If Arby's and Wendy's are any measure of success then Social Media Expert is code for "Someone who can take the latest memes and apply them to our business".
What the post seems to be getting to is a separation into three age bands, characterized by "computers as a mystery" (born roughly before 1970, with a very wide variance to account for different levels of exposure pre-1995) , "computers as tools" (born before 1990?) and "computers as media" (younger than that). Even people not technically inclined at all had to jump through so many hoops when they wanted to play a game, write a report or check their email once a week that they appeared near-magical to their parents. Some people seem to expect "digital natives" (the "computers as media" generation) to continue that, but we kind of knew this would not happen when the playstation pushed desktop gaming into a niche, right? I don't think that there is anything wrong with the term "digital native", except for some misled expectations some people associate with it.
Just to throw this out there, I was born in 57 and there are really good odds that you have been impacted by my code in one form or another.
Trivially related, I'd like to preemptively apologize for some of that code. I am not actually a programmer, I'm a mathematician. I programmed out of need, until I was able to hire qualified professionals. Sadly, I did not stop there. I am a pretty horrible programmer.
The “1970“ is certainly not a strict threshold, but I do think that it can legitimately be seen as a bit of a watershed regarding out-of-field computer skills. (sorry, I can't consider a mathematician who wrote impactful software truly out-of-field, no matter how blessed with humility)
Take, for example, brick-and-mortar architecture students. Most of those born in the mid-1970s would at one point have put a lot of research into speccing out their personal CAD machine, learned how to apply cracks for the piracy countermeasures of the day, and designed multiple generations of less and less inadequate personal backup strategies. Their peers ten years older? None of that DIY; most would probably have jumped straight from paper to professionally supported workstations. Their peers twenty years younger? Walled-garden app stores.
I am definitely out of band. While I did some EE coursework, I've taken zero CS courses. I have a Ph.D., but took one single course in C. I only programmed because I had to. I only did IT because I needed to.
I don't know where the barrier is. I can explain how a CPU works, the various protocols for some networking, etc... Those are what I'd say are at a superficial level. I haven't even built my own system - for years.
My son knows a bit, though his older sister knows less of the technical aspect - and she's a physician. She is a bit older than him. I'm not sure if it is strictly an age thing, I guess.
The observations in that post are helpful to illustrate the problem. But I can't agree with the author's conclusions.
There's another way to read every single one of his examples: as evidence of a design failure. For example:
> A message box opens up, but the kid clicks OK so quickly that I don't have time to read the message.
Users are so bombarded by unhelpful dialog boxes that many are trained to dismiss them as fast as possible. It becomes subconscious, some users literally can't see the dialogs anymore.
> I hand back the laptop and tell him that it's infected. He asks what he needs to do, and I suggest he reinstalls Windows.
This is obviously not the user's fault. It's the responsibility of the system designers to make it hard to subvert. iOS is now ten years old with no major malware outbreaks -- it can obviously be done (through means that the author doesn't approve of, like application signing).
> I take the offending laptop from out of her hands, toggle the wireless switch that resides on the side, and hand it back to her.
Poor design decision. Normal users in 2017 basically never want no-network, dedicating the cost of a physical switch to it is doubly bad.
My point is that the author needs to stop lamenting that the hoi polloi have breached the gates and start taking seriously the question of how to design general purpose computing that actually serves most human beings. "Use Linux" is not, by itself, a remotely helpful suggestion. His "Mobile" suggestion is indistinguishable from satire.
> iOS is now ten years old with no major malware outbreaks -- it can obviously be done
If even a company with $200B in ready cash on hand, humongous design and manufacturing resources, and full control at every part of the vertical stack can't make a general purpose operating system with these values, what hope do the rest of us have? iOS is extremely limited in what it allows you to do. If it isn't, then how come macbooks still sell?
I've been a tutor at the tertiary level before. Some people just can't be fucked putting in any effort to get what they want. Fuck them; they get to be called incompetent - don't shift the blame. If someone could barely drive a car and smashed it into every tenth lamppost, you'd call them an incompetent driver. Or another example: I can't cook for shit. I can follow a recipe, but it's not going to be very well done. But you would never make the argument that it's the recipe-writer's fault that I couldn't make a good souffle.
It's reasonable to expect users to meet developers at least partway here; reasonable to expect some learning curve for an incredibly complex, capable device.
once when I was working support, I needed a client to read back to me the contents of an error box - it could only be one of two boxes. She said the content was something else ("the files are corrupt"), which wouldn't happen in that part of the workflow (and wouldn't make sense at that point anyway). I got her to read back the error dialogue word-by-word. She still just made it up ("the. files. are. corrupt."). It wasn't until I got her to read out the error message letter by letter that I found out what it actually said and was able to help her...
This was kind of painful to read. The author's tech arrogance is extreme. For example, how was the lady who came in supposed to know the wifi password, the proxy settings or that youtube was blocked?
You've misread the article. He isn't complaining about her not knowing the proxy setting. He's far from being arrogant - he's complaining about how people look down their nose at 'support' while being clueless about the things they need support with.
He's not complaining that she doesn't know that the proxy blocks streaming video, he's complaining that she doesn't know that her powerpoint presentation uses streaming video and what the ramifications of that are.
Read the article again and this time actually look at what he's complaining about, not just the tech keywords in the story. His problem is with her demeanor, not her skill.
The lady did not know where the WiFi icon is or where the proxy settings widget is. The latter is understandable; the former just shows utter and complete uselessness.
> And paper co-author Paul Kirschner, an education researcher at the Open University of the Netherlands in Heerlen, happily describes himself in his academic work as a “windmill-fighter”. But whereas Don Quixote aimed against solid walls, the digital-native assumption, on closer inspection, does seem illusory. It is certainly no giant.
To fight windmills, as per the English idiom 'to tilt at windmills', means to attack non-existent enemies out of a delusional sense of righteousness (from the last bit is derived the word quixotic). In the stories Alonso Quijano read so many romantic novels about knights and chivalry he went insane, renamed himself Don Quixote, and set off to vanquish windmills he imagined were giants and swear fealty to an innkeeper he imagined as a lord (for the honor of his lady love who doesn't actually exist).
This paper's authors being the Don Quixote figure doesn't make sense, as I doubt they were trying to insult themselves. A more apt analogy would be the companies trying to grapple with a new generation of digital natives are windmill fighters (fighting an enemy that doesn't exist), while the paper's authors are Sancho Panza desperately trying to get them to see reason.
I learned and always thought that the phrase "tilting at windmills" meant, aside from what you wrote about "to attack...out of a delusional sense of righteousness," more like a futile task or something that wouldn't succeed regardless of your efforts. Because of that, I interpreted the previous comment not as someone who is creating enemies out of nowhere against oneself, but rather someone who is doing something (telling others that "The Digital Native is a Myth") that is a more-or-less hopeless, or mostly impossible task.
They're close, but to me there's a distinction between the two.
Don Quixote is a pathetic, vainglorious egotist with mental problems, whose attack on windmills is just one of his many feats of remarkably willful stupidity. He isn't fighting a good battle that is unlikely to be won, he's inventing battles out of nothing so he can feel special and important when really all he's doing is being irritating and getting beaten up by inanimate objects (the windmill sail). He's to be laughed at or pitied, not respected or admired. That was something of the point, take a classic knight in shining armor from romance stories, put him in real life, and watch how much of a loser he'd be.
This might be a case of pop culture failing to capture the essence of a reference, and the misunderstanding gaining higher prevalence than the true meaning. Calling someone 'nimrod' is another example. Nimrod was, unironically, one of the greatest hunters in the bible - smart, swift, strong. But Bugs Bunny called Elmer Fudd that, and Fudd was an idiot. So anyone not familiar with the Bible character thought nimrod == idiot, rather than Bugs calling Fudd that sarcastically.
The digital native is not a myth, he's just going extinct.
Back in the day the difference between a computer user and a computer programmer was, at best, blurry. You had to know how to program to use a computer.
And hardware and operating systems were complicated things you had to study, they weren't the appliances of today that "just work".
I think the breaking point was around the time Windows XP and Mac OS X were released. The difference in skills between those raised before that time and nowadays is mind blowing.
So the digital native exists, you just won't find one without grey hair.
This reminds me of a story I like to share. You can have the short version.
I touched my first computer in the early seventies. It was from HP, took punch cards, mag strip cards, keyed entry, and could output to a plotter, tv, or LED. It was horrible. I hated it.
Late seventies, early eighties, I now have to use a terminally connected computer and have a computer at home that required additional memory, just to use lowercase letters. It was horrible. I hated it.
Late eighties, it hasn't improved. No, no it has not. I still have to dance a strange dance and wait for my work to be completed. Usually, that meant waiting for my work to be completed in another part of the university. If I wanted to search the 'net, those searches went out to people - who'd send me an answer back in as long as 72 hours. It was horrible. I hated it.
I pretty much hated computers until the 2000s. Oh, I used the Infernal Beasts of Spite and Retribution - but it was adversarial. We're on friendly terms, more like old enemies too tired to continue fighting but with mutual respect.
I don't like programming, tweaking, or fixing. I will, but I don't like it. In fact, I use Linux because she's a familiar beast and I pretty much don't ever have to screw with it. It does what I want and stays the hell out of my way.
That's the short version. It usually contains more details and a whole lot more vulgarities. As a parting comment, it has been wonderful to live through these changes and advancements.
Today's computers seem to try very hard to draw a distinction between consuming content and being productive. These tasks tend not to share any skill sets at all. My much younger sister was a wizard at binge-watching YouTube videos on iOS at ~6. She used most of the features at her disposal: subscriptions, autoplay, voice search... but despite all of that practice, she never really got to using the keyboard, and is functionally incapable of typing a URL in a timely fashion, or structuring a text search for something full of homophones. She basically just gives up if it wouldn't work through voice search.
I got into computers as an interest because on Windows 95, 98, 2000 for WorkGroups, and other contemporary Windows versions (we were not affluent), code of some sort was just a step or two away from using the machine at all. Most computers today (iPhones, WinRT, etc.) don't even have shell scripts on the desktop or a javascript console in the browser. Operating systems in the late '90s and early oughts gave an impression that there were consistent underlying concepts like files and processes or at least tasks. You used to be able to market your dual-core processor as being able to run more tasks simultaneously without trouble, nowadays you just tell the consumer it's "faster", and they don't understand why the same app runs slower on their new phone than the last.
To be fair about the keyboard thing, when I was born computers were programmed through punch cards, and I never learned how to use them. They switched to keyboards pretty quick after that.
So to a 6 year old who can't use a keyboard... well I don't see keyboards disappearing quite that quick, but it's obvious that voice interaction is going to feature quite heavily into the future of computing.
Right, but voice input will likely always suck as much for computers as it does for human beings. The keyboard, even the touch keyboard, is a better way to input any reasonably complex string. If people spend any time together that isn't recorded by Google, then they will have inside jokes, puns, portmanteaux (this word was not in my dictionary), and words that Google/M$/Apple simply haven't heard enough of to recognize. In an office environment, it is absolutely unacceptable for everyone to be using voice input to compose documents, comments, and emails.
I’ve worked at companies who have said it would be absolutely unacceptable for everyone to have a computer at their desk. I’ve worked places where they said it would be absolutely unacceptable for everyone to have an internet connection at their desk. I’ve worked at places where everyone is constantly on the phone at their desk, and somehow they make it work.
And I just noticed I’m breaking my rule of never responding to the kind of troll who would put “M$” to mean Microsoft so I’m going to end here. I’m sorry for wasting my time.
Geez, that's a pretty restrictive rule. M$ is for folks who remember just how many people got sued by Microsoft back in the day. No matter how tame they seem today, they have been positively aggressive in the past. It ultimately hurt them as much as it hurt the community, but they conducted themselves poorly.
As for those other things, those are matters of personal distraction. It may not seem acceptable to have a computer or an internet connection at the desk, but that's a matter of self-discipline. Everyone talking all the time around you is something you can't overcome simply by disciplining yourself.
Voice:
- socially embarrassing
- prohibitively imprecise (when is precision not necessary??)
- low work, but also low quality
- currently popular methods require some outside company handling your audio
- also require a good network connection, which shouldn't be treated as a given
These are all related to my own situation and preferences, of course, and I'm sure that there are some people for whom your list of positives vastly outweighs my list of negatives. No one's served well by assuming that everyone has the same requirements.
I want to make it clear that I'm not specifically targeting you for this comment, it's more of a meta comment on the whole subject we're discussing.
I find it interesting how people tend to get stuck to the current forms of technology and then become outraged when anyone dares challenge it. Who needs Windows, I boot straight to DOS. Windows N sucks, I'm going to stick with Windows N-1. Blackberry phones are useless and pretentious, I'm going to stick with Nextel. The iPhone is stupid, I need a physical keyboard. Bluetooth is awful, I need my headphone jack. The iPod is worse than a Nomad. The iPad is just a big iPhone. And now, apparently, voice control sucks, I'm happy with my on-screen keyboard.
And as history shows us in nearly every one of those cases, us tech guys are typically the worst judge of how well a new technology will fare in the market and what consumers actually want.
> And now, apparently, voice control sucks, I'm happy with my on-screen keyboard.
Nah, on-screen keyboards suck, I'm happy with my non-mobile devices (when possible). Bluetooth is an expensive pain in the ass, though ;-)
I'm used to my desires not meshing well with the way that markets go (and I'll bet that other like-minded techies are too). Out of the specific things I mentioned with voice recognition, most of them will change with technological improvements and shifts in societal norms. Change is inevitable.
Honestly, I'll still find a way to do what I want, even if most other users are happy to have tech go in a different direction. I'm not deluded enough to think that what I want is what anyone else wants.
I don't see a distinction with punch cards, as punch cards were just an added step between keyboard and input device. You still used a keyboard to input, although more slowly and with a more limited character set.
Unless, of course, you had keypunch operator minions available. I preferred to do it myself, as it would take me longer to hand-print the contents into those little boxes.
I've seen my niece (when she was just under 4 years old; she's now 5) open her dad's YouTube app on an Android phone, hit the voice button, and shout "Peppa Pig" to find what she wants.
We were shocked to see it.
She's a native Spanish speaker, and she was not as fluent in her own native tongue as some other kids her age. Her parents don't know how to use voice. No one around her does. I only use it for setting alarms and never _in front of anyone_ (I'm self-conscious about it).
On the other hand I've got kids near that age, was born before 1980, and I cheerily use voice commands and many other technology artifacts of the day just the same as I'd do any other thing in a given social context, and much better than my kids (thanks to the benefit of having lived a number of years and the experience gained from that).
I'm not sure why anyone is shocked to see someone using modern technology that's made available to them. How do you figure humans learn? We build on what was learnt before, which requires a degree of familiarity.
I agree with your questions. In this particular case I doubt she has seen anyone use that particular technology.
But I guess she has seen her family use WhatsApp audio messages all the time and established the analogy herself when she saw the "mic" icon on YouTube.
It is also a testament to the ease of use of some of these tools.
At what age can one really understand the downside of their body of queries, sites visited, videos viewed, facial images (often geocoded) etc. being stored, analyzed, and profiled with the goal of productizing them?
It's a bell that can't be unrung, and it's at least as invasive and pervasive as what Orwell envisioned... but most participate voluntarily.
I'm of the view that the best we can do is restrict the information available to these companies, and only to the extent we can control it directly, and only to the extent that one can tolerate various disadvantages in so doing.
For example I don't upload images of any photos I take from my smartphone and I don't use social networks at all. However my spouse does, my friends do, and they all refer to me and pictures of me they've taken and there is nothing I can do to stop it. I can ask them to not do that and to minimize it if they must do it anyway but that's the extent of my control as far as that goes.
I could be completely disconnected from the internet, but my view is the disadvantages there, due to the aggregate mass populace's use of it, are too much for me to tolerate. So I use a popular browser and engage in various online activities.
I don't know if that's a sign of her generation not being ashamed of using voice commands in public as much as it is a sign that kids have no sense of shame. I may be proven wrong as the years go on, but it's hard to make predictions about the future of a 4 year old.
The BBC like to describe older people as "digital immigrants" to contrast with "digital natives" but seem to not quite comprehend that the "digital immigrants" actually built all this stuff whereas "natives" merely use it.
Consider the difference between "indigenous natives" versus "second-generation immigrants."
Technical professionals are arguably the indigenous natives, present "before it was cool" and working to coerce an inimical environment into something suitable and appealing to people.
There's a slight confusion in this thread, that many are rightfully pointing out, as there's a bit of context missing in the article.
The thing is, the term 'digital native' has been heavily used as a way of saying that children that grew up surrounded by tablets and TVs and the internet actually learn differently from the so-called digital immigrants.
For an example, see these quotes by Marc Prensky cited in the literature thousands of times:
‘today’s students think and process information fundamentally differently from their predecessors’ [1]
‘Our students, as digital natives, will continue to evolve and change so rapidly that we won’t be able to keep up’ [2]
‘Our young people generally have a much better idea of what the future is bringing than we do’ [2]
‘In fact they are so different from us that we can no longer use either our twentieth century knowledge or training as a guide to what is best for them educationally’ [2]
Now this might immediately sound extremely suspicious for anyone who has some background on the research of how humans learn, but it's a common myth all over the learning world. 'Oh I can't learn from books, I need interactive videos and apps, that's why I have bad grades.' Actually computers are an excellent choice for learning if applied correctly, but this is not what is usually being implied in these situations.
There's a big confusion between how much children are familiar with using technology from a consumer's point of view, and the natural human ability to learn. The latter hasn't changed.
The debunking of 'digital natives' is not new either [3].
By the way, for anyone interested, I recommend Hattie and Yates' book on learning [4] as a great, very balanced, modern introduction.
[1] (cited 18,000 times!) Prensky, Marc. "Digital natives, digital immigrants part 1." On the Horizon 9.5 (2001): 1-6.
[2] (cited 1,000 times) Prensky, Marc. "Listen to the natives." Educational Leadership 63.4 (2005).
[3] (cited 2,700 times) Bennett, Sue, Karl Maton, and Lisa Kervin. "The ‘digital natives’ debate: A critical review of the evidence." British Journal of Educational Technology 39.5 (2008): 775-786.
[4] Hattie, John, and Gregory C. R. Yates. Visible Learning and the Science of How We Learn. Routledge, 2013.
I stopped believing in the "digital native" buzzword after spending some time with some older relatives.
Because nobody makes good old-fashioned cellphones that can only make calls anymore, they were all upgraded to touch-screen smartphones. A lot of them came to me for directions on how to perform the most basic tasks. They were interested in downloading apps, getting the news on their devices, managing pictures, and generally browsing the internet from them.
At first I explained every step of the process, and I noticed they would all mechanically memorize the steps without really understanding any of them. So I changed my approach. I told them that, software-wise, whatever happens to their device I can repair it, and that instead of waiting for me to have a minute to go over the steps with them, they should try it themselves, and if something bad happens they should come to me.
This has changed their behavior dramatically. They are now more confident using their devices, knowing that someone can get them out of trouble should a problem arise, and they try new features by themselves. One example is when they enabled voice control and started to play with it.
I like to separate users into two categories that I consider more realistic than "Digital Natives" and so-called "Digital Immigrants". I refer to them as having an "Active" position toward technology and a "Passive" position. The active crowd searches for solutions themselves and learns by exploration, while the passive crowd will pretty much only do as directed and what they have been taught with clear explanation. I've noticed this pattern of refusing to explore new tools and platforms is not closely tied to age, as many young "Digital Natives" get extremely confused when exposed to new tech. Even something as basic as an old-school car GPS (rather than their usual phone) would confuse them.
I was born in 1981, so according to the article I've just barely made it in as a native? What I do observe is this (an over-generalization, but it seems true to me):
Pre-internet era - people know little but do a lot and deeply
Post-internet era - people know a lot but do little and shallow
I was born in 1980 and, having grown up with computers from an early age, I totally agree with pervasive internet access being the next dividing line. Computers allowed us to do many things we couldn't do before, but the internet took communication and information sharing to another level.
Working in tech support anywhere would have shown you this. The supposed digital natives aren't born web-savvy... they're the ones who just expect everything to work flawlessly and to "just make it work."
They don't understand the how and why something works, and they don't care.
And how would it ever be challenged? The moment some non-nerd, non-aristocrat becomes conversant with the machine, gains power over it, molds it to her whims, she has become the nerd aristocracy.
The nerd aristocracy's grip will forever remain unchallenged.
> The nerd aristocracy's grip will forever remain unchallenged.
Yeah. This story reminded me of the Star Trek: TNG episode Schisms. There is a scene when Riker, Troi, and LaForge are on the holodeck trying to recreate the aliens' surgical table from memory. The way the characters interact with the computer to accomplish an extremely complex task -- build a model of a table from fragmented dreams -- is remarkable.
The scene showcases how natural and comfortable they are at talking to the computer. Though the episode has its detractors, I'm glad they made it. I can imagine the characters regularly using the holodeck to model and design all sorts of objects in the course of their normal duties. It demonstrates that the power of their tools goes way beyond silly period-piece virtual reality games.
The twenty-fourth-century nerd aristocracy no longer needs to use plastic keyboards and text editors from the 1970s. That doesn't stop them from being sophisticated users of their technology. Programmers, even.
When most people talk about digital natives, they aren't talking about learning styles.
The idea that you can dismiss the notion of digital natives just because people still learn differently is a little wacky. "Digital native" implies familiarity and comfort with the digital technology one grew up with, not anything else.
I'm not really sure what viewpoint you're attributing to the article, but according to my reading, the article dismisses the "multiple learning styles" thing as dogma without evidence.
Digital natives? Are people still using this term?
So the people who invented the Internet and most of modern technology are called "digital immigrants"? How can one take lazy journalists who still use these terms seriously?
> So the people who invented the Internet and most of modern technology are called "digital immigrants"?
Consider if a community designed a new language. By definition, no one in the community would be a native speaker. Couples start having children and teaching the children that language. Those children will be the first native speakers. The best of the non-natives would still have more technical knowledge of how the language works than the average native speaker. That's the situation that we're looking at here.
Think of the earliest colonists everywhere: they created their colonies, but they weren't native to them. Their children were native to the colony, though.
I suspect it'd be more accurate to say that many of them know quite a bit, it's just that there's no overlap with what schools want them to know.
I'm sure a younger version of myself would have completely failed a basic test of computer literacy. And yet I set up DOS extended memory, configured hardware settings for the Sound Blaster, and calibrated a joystick, all so I could play TIE Fighter.
The only skill I think a school might have valued was that I learned a tiny bit of BASIC to fiddle with that QBasic game where the gorillas threw bananas at each other.
It turned out to be a misconception because computers are astronomically easier to use now. If you ever had to configure a CONFIG.SYS entry for a specific game, you know what I mean.
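For anyone who never had the pleasure, here is roughly what that looked like. This is a sketch from memory, not a recipe: the driver paths and the Sound Blaster port/IRQ/DMA values varied from machine to machine.
CONFIG.SYS:
REM Load the XMS driver and the upper-memory manager so big games
REM get enough conventional memory; NOEMS trades expanded memory for UMBs.
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
FILES=40
BUFFERS=20
AUTOEXEC.BAT:
REM Tell games where the sound card lives: port 220h, IRQ 5, DMA channel 1.
SET BLASTER=A220 I5 D1
Get one line of that wrong and the game would hang, stay silent, or complain about insufficient memory, which is exactly the kind of tinkering a locked-down tablet never asks of you.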
I think it's somewhat like cars. One generation never used cars, another grew up alongside cars when changing your own oil was a standard thing people did, and now that cars are better, people don't know much about cars anymore.
"Digital natives" is a codephrase used by employers to mean "we only want to hire young people; if you're over 30 you won't be considered". It's dogwhistle discrimination.
My sister is a teacher and confirms this: while kids play around with their smartphones all the time, they are remarkably ignorant about the basics of computing. They are users, watching videos and composing texts, but when it comes to drivers (okay, who installs drivers themselves in today's world?), OS installations, the BIOS, or even simpler things like proper backups or Excel formulas, they are completely clueless.
> People born after that date are the digital natives; those born before are digital immigrants, doomed to be forever strangers in a computer-based strange land.
That's a ridiculous notion anyway. If you put in time and effort, you can learn pretty much anything, and computers and programming are no different, no matter the age. And if you do not put in time and effort, you always remain ignorant. It really is that simple.
>That's a ridiculous notion anyway. If you put in time and effort, you can learn pretty much anything, and computers and programming are no different, no matter the age. And if you do not put in time and effort, you always remain ignorant. It really is that simple.
I tend to agree, but younger people have the 'advantage' of being immersed in computers from a very young age. Children tend to pick things up more quickly and easily than adults. (This is especially true with language: learning a second language as a child is far easier than doing the same as an adult.)
Also, I think it's unfair to judge people on their knowledge of computing when they are not in the computing field.
Most people don't know how the gearbox in their car works, yet they are still able to use a car just fine.
> Also, I think it's unfair to judge people on their knowledge of computing when they are not in the computing field.
Yes, but you wouldn't call people who don't know how the gearbox works "mechanical natives" or something like that just because they grew up with cars. You acknowledge that there are people, like mechanics, who know that stuff because they learned it, and other people who do not because they did not learn it. It has nothing to do with generations and so on.