The digital natives are not who you think they are (torh.net)
103 points by ingve on May 12, 2021 | 145 comments



There's something to be said about growing up using devices that are made for creation and not just consumption - the barrier to go from "this is a cool web page! How can I make one?" to actually making one is a hell of a lot higher on a phone than on a computer, and higher now than in the mid-90's (if only because the standards are so much higher).

But this guy is just being a jerk: "Digital natives my ass, all they do is stream videos on YouTube and Twitch."

Which is a shame, because it crowds out what I was hoping would be a more thoughtful discussion about devices created purely for consumption, or perhaps walled gardens, and how they don't afford people the same opportunity to learn as general purpose computers.

I wonder if early car enthusiasts felt the same about people who learned to drive when you no longer needed to know how to fix your car all the time.


I think you're applying your incomplete idea of what it means to create. Now phones/tablets let people say 'that video is cool, let's make one.' Same with pictures or other digital art. When I look at the creator options available in my pocket, I'm amazed.

If anything, back in the 90s having to learn to create web pages was a barrier to creation. Video? Forget about it. Digital cameras were just coming around, but were still super expensive. I have some old pics taken on my flip phone from the early 2000s and they are...really bad. Instead I had to spend hundreds of dollars on a completely separate device to be creative. And that device still didn't do video.


> Now phones/tablets let people say 'that video is cool, let's make one.'

The problem is that outside of certain subjects, video is a time-inefficient and low-information-density medium. So many things could be conveyed in fuller form, and in a way more respectful of people’s time, by using text. Unfortunately, the use of phones as default devices by many people today young and old discourages longform text, due to the limitations of tiny touch keyboards and screens. That is my own feeling of unease with the way tech is today.

In some niche travel scenes, I have seen a big decline in meaty, useful information since the early millennium, when a blogger generally had a laptop to type on. Sure, now we get glamorous video content, but that can’t be all there is.


As a counterpoint, I love video content. When I’m learning something, I’d much rather watch a talk on the topic rather than read an article on it, if I can. Especially when being first introduced to a topic, I will always reach for videos when they’re available. Conference talks by the authors involved are the best, if they’re good speakers.

I’m not sure why I prefer video content. I find my attention relaxes more when watching a talk than it does while reading. Like, I find it more effortful to concentrate on a long article in comparison to watching a talk. (Though usually 1.5x+ speed is a necessity). Also, good talks are generally better written than the average article, and good talks are easier to find than good articles are. Project readmes and webpages will say useless things like “This is our database in Go, here’s how you install it. It’s fast and reliable!”. But a good talk by the author will give me the background and the story. “So, here’s why Postgres wasn’t working for us for this problem we have at Netflix. We really wanted to solve our problem this way. So then we built this new thing to do that and it’s working great! Let me show you the weird parts we’re really proud of…”.

Text is great for reference. But I could listen to technology stories like that - telling the “why” of what people have made all day long.


I am the total opposite. The best way for me to learn is picking out the useful pieces of text I need right now and using them. Watching a video, my brain does not actually focus on every detail; it drifts off.


And let me add: I have been reading all my life and while I can skim through even a long text to decide if it's worthy of a longer read, this process is way harder with video.


Even if reading about new topics is harder than watching videos about new topics, it says nothing about the relative efficiency of learning from text or video.


The problem is that the ability to create that functionality - a robust video editor, let alone the sophisticated image processing - is poorly suited to the device itself. Art is intrinsically valuable, and I love shooting on my camera and my phone. But it doesn't really translate into a level of literacy with desktop systems to create whatever the next generation of computing might be.


I bet that the number (or even percentage) of people building those tools has increased too.


Creating high quality video or photos still poses a far higher barrier than simple text. The acquisition hardware (camera) is a trivial step, it's everything else that involves a huge amount of work.


It's a trivial step now. At no point did I say it wasn't a lot of work. But instead of having to create a web page to show off my creative works, I can focus on the work I want to do - take/edit/share pictures.

I just get annoyed when people call these devices consumption only. I could say the same thing about computers when many people only play games on them. These devices are what the person makes of them.


I agree with you. As a parent I witness how my child uses his technology. I do think it's worth mentioning that we might have biases; as technologists our world view is dominated by technology. My child has an iPad and I have a few computers and game consoles in the house. By observing me, my child sees how I use technology.


I agree with you personally. For me, it's easy to gather my thoughts into sentences and paragraphs, and I'd rather learn Markdown and Hugo (or LaTeX!) than edit a video.

But that isn't some kind of law of nature - it isn't true for everybody. There's a significant group of people who find it much easier to use video or voice and powerful tools have arisen to help them.

For example, the TikTok video editor gets people started quickly and lets them learn as they go. I'd really encourage you to watch a tutorial to see how easy and surprisingly powerful it is.


I also think it's much faster to get information back out of text than it is to watch a video.


But isn't the topic being discussed technology? Being a video creator or painter using Krita is being an artist, not a tech person.

There is no doubt that modern technology has enabled people to realise a lot of things they would never have been able to, such as creating a movie using tools available in your pocket.

However, the original post was about learning technology. I.e. understanding computers. The difference between being a mechanic and someone using a car to drive to the grocery store.

The problem is that the term "digital native" implies that the person understands how computers work. One might argue that the term actually should mean something else, but I argue that would make it an even less useful term. Do we call people who have driver's licenses "motorised vehicle natives"?


Matwood - This post speaks to how we often look at the 90s through rose-colored lenses and see the computers as better. As much as I'm a collector of older machines and the tech of the era, I can also admit that things were simpler, slower, and came with plenty of barriers to entry. Simple example: the first iPhone did not even have video capability. Right before the 2008 release, nobody was walking around with a video camera in their pocket. You'd have had to carry around a clunky and very expensive camcorder, and nobody was just taking videos, anywhere, on a short whim. Even cameras just a few years before that were terrible. Nostalgize all we will, the tech was terrible. ASCII warez and all that stuff was awesome. BBSes were awesome, but there is just far more for kids today than what we had.


Another way to look at it is that smartphones and social media let powerful political lobby groups and corporations get ideas and advertising even more directly into people's minds, so that they go forth and replicate them. Who actually starts the majority of trends, and which accounts are the most followed? Is it individuals with no other motive than to share their creativity? Even in music and art, the mainstream is full of more or less subtle manufactured content. This is propaganda, and we're getting more of it than ever before.

And no, I'm not blaming the kids who grew up with this and can't know any better (yet). It's the adults who are wrong. They're old enough to know better, but they're content with their positions, they're lazy and they're scared of taking risks by speaking up.


Yeah the creative possibilities have definitely increased. It’s a cycle though, the early podcasters and vloggers look at today’s crowd and can’t believe how easy they have it...”we literally had to run a cli tool to download our podcasts!”, “there was no YouTube let alone iPhones!”


Interestingly enough, there still is a high barrier to entry. Nobody wants grainy mobile pictures or low-audio-quality mobile clips. You'd still need a proper camera and a reasonably powerful computer to cut something that meets modern standards.


Oh come on, let's be honest. Open up Instagram and look at what people "create", most of it is memes and image macros. Not even comparable to, as an example, the first years of Deviantart (not what it has become today).


> "this is a cool web page! How can I make one?"

I sympathize with what's been lost, but what happened was people realized they didn't actually want to build a web page, they wanted to converse on a forum and share pics, and be entertained. Sure, you could publish anything you can imagine that's less than 4 MB on Geocities, but I'd rather have Wikipedia.


It's more like "That's a cool YouTube career, how do I make one?"


(Perceived) fame and fortune are pretty standard aspirations.


Fame and fortune aren’t the only reasons people try to make careers out of content; some (maybe even most) are just trying to support themselves by doing things they enjoy instead of sitting in a cubicle from 9-5 so they can afford to do those things on the weekend.

I find it pretty inspiring that we live in a time where someone who really loves doing, like, aerial yoga can reasonably make doing aerial yoga their job just by learning how to edit videos and engaging with other people who are also into that thing.


It's interesting that streamers and "influencers" seem to be replacing, or at least competing with, other pop culture and performing arts celebrities.

Though there doesn't seem to be a huge difference in practice between "youtuber" and "television presenter" other than the broadcast medium. One thing that youtube adds to television is an easy way to interact directly with viewers.


Yeah, I grew up in the same era as the article author, and I'm honestly much happier about the state of access to resources these days.

The modern "digital natives", as I'd categorize them, are the legions of young people getting started in programming through the vast number of rich environments that exist today.

Yes I remember the times upgrading my 486DX-33 to 8Mb of RAM, installing slackware from floppies, figuring out whatever dark magic was needed to run Duke Nukem 3D over IPX networks simulated over dialup, and heady and critically important discussions about the role of Amigas, coveting the barmaid in LoRD on the high school BBS, running unauthorized star wars MUDs on the high school computer network and getting dragged in front of school boards because you ran "nethack" once (in fairness it was likely also the pornography being accessed and distributed leveraging the school T1 lines as well but in my defence I was 14).

It was a time and a place, and there's certainly value in reminiscing. But these new kids will create their own time and place, and demanding that they somehow pay homage to my experience as being "more authentic" seems a bit pleading.

And to be honest we downplay the limitations of our times. Free and accessible development environments? Not on the standard desktop PC operating system (until DJGPP came along anyway, and even that had severe limitations for a while for building native windows apps). If you were a super nerd who went out of your way and downloaded an obscure little free unix system developed by some finnish guy named after a peanuts character, you got a compiler. And then if you scrounged around random docfiles spread out across dozens and dozens of random howtos, you could sort of learn how to program C, or python or perl.

I look at today's technical environment, available to the entry level student in some technical field, and the available breadth and depth of tooling, the amount of documentation and tutorials, all available for free, and it's amazing. If you're someone looking to build something interesting, there are a hundred more opportunities to do that today using accessible tooling than there ever were when I was a kid.


I don't think the author is disputing that; of course tech today is better than it was before. What he is saying is that, having learned to use computers as the computers themselves developed, he got a different, better insight into how they work and what potential they have, whereas today's youth use computers as a black box, not caring how they work inside, and thus not learning how to use them to their full potential.


I grew up at the tail end of the era in the article. Windows 95 was well established, and while CAT construction games still came with controllers to strap over the keyboard or I had to learn how to troubleshoot the game port to get SimCopter to work, I never had to learn IRQs or use dial-up.

And I admit having a sense that I am not as knowledgeable as I could be about using these insanely powerful devices to their fullest potential. The demoscene can make amazing things happen with teensy amounts of code. Amazing!

So I started watching the Ben Eater making-your-own-breadboard-computer series, and am now following his 6502 series so I can eventually convert my Model M to a self-contained, battery-powered DIY Alphasmart word-processor-to-SD-card contraption, because why not?

So I too agree that the tiktokers are the kids these days for now, but I more heartily agree that some of them will discover something new and amazing on that there YouTube. Maybe they'll learn something we've forgotten and leave all of us in their dust with what they do with it!


You must have missed QBasic, which was freely available and included with MS-DOS 5 and later. Pretty powerful for how easy it was to use. Delorie's tool was nice, but nothing compared to Borland's or Watcom's.

Then again, a bit earlier you had BASIC included on every microcomputer, and hardware documented way better than random PCs glued together with duct tape.


Hi, author here.

First: I must say I was surprised that my little rant made it to HN, but obviously some found it interesting, and it did spark a discussion.

Secondly: Yes, I do come off as a jerk. English is not my native language, and my Norwegian dry humour may not have translated as well as I had hoped (which is not an excuse). Also, at the age of 38, I do not really have that much experience with what today's youth does, with the exception of kids between 5 and 10. And that is not a fair comparison anyway.

I have not yet read all the comments here, because I fear some of them might be a bit personal, while my "attack" was mostly a generational one. As I said earlier, this is the first time I've gotten a real audience on my blog, and I was not prepared at all for this kind of reaction.

But I value your feedback, and I see your point. Next time I will try to look a bit past the "rough" language and see topics from different angles. Because the world I grew up in is gone, and we are still making progress, so obviously things are going the right way.


Probably the language. You guys aren't the digital natives; you are more like the digital pioneers. You weren't born into digital land like today's digital natives - you came over on an analog boat, so to speak, and with your axe you built a digital country.

It's the kids who grew up in digital land that we should consider the digital natives.

2 cents from the peanut gallery!


I think the line about streaming is less of the author being a jerk and more of the author resetting the dialogue concerning the average tech know-how of Gen Z relative to previous generations. Of course for every IT geek in the author’s generation, I’m sure there are at least two in Gen Z, which the author could have acknowledged.


Author here.

You are right. My intention was not to come off as a jerk, nor did I expect anyone to read my blog post, to be honest.

My point, which kind of got lost in the end, was to show the big difference between my generation and the next.

I have heard so many people praising today's kids as technical wizards because they master the gesture/finger-based interface so much better than the generation before me. The term "digital natives" I got wrong, big time.

In hindsight I could have spent some more time polishing my idea before publishing.


I think this is because you are thinking as a programmer/something.

Folks now have web page editors in their phones/tablets. Drawing apps, etc.

When I was 16, yeah, I want to make a webpage about WWE/F, so I had to spend days learning table based layouts and how to ftp files to a server (and find one).

Now I (my son) can open his phone and he has various tools to do so. He can share/post pictures of someone doing a full nelson and write about it, and get it up on the web in minutes vs days.

Just because he didn't have to learn 45 different incantations at the command line, doesn't mean he isn't creative, he just doesn't need to slog through programming to do it. (He is learning python though)

Think about it as frameworks: when we started, we needed to learn it all to make any kind of decent page. Now most folks here would use bootstrap/material ui/whatever, or react/rails, vs. writing all the CSS by hand, fighting with all the problems it brings, configuring perl scripts to be able to send an email from it, etc etc.


True - I am thinking of it as a programmer. And I wasn't entirely clear. You can create lots of wonderful things on tablets, phones, locked-down computers, etc. but it relies on someone else making the tools. On my computer I can use a program, and then make a program just like it. On a tablet I can use an app, but if I want to create an app I need to use a computer. It's one layer of abstraction removed.

The comments here have been a helpful reminder that creativity is very much alive and well! I do worry about how many people would think "This app is cool! I want to learn how to make them" but stop because the device for consumption is different from the device for creation.

Really dating myself here but the very first programs I used were ones from a magazine I typed in to a little TRS-80 pocket computer from Radio Shack myself (I'm not 100, really, I just started young and had an encouraging grandpa), and in the process realized "hmmm I could change this line and get a different output, cool!". I think that's about the lowest possible barrier between consumption and creation.

On the other hand, I learned very, very slowly compared to now, in large part because it meant getting a magazine or reading a book, and having nobody to ask when things weren't clear. On the whole it probably is easier, now that I think about it, but only if you're motivated. Maybe that's OK - people who want to can still learn to rebuild a transmission, even if most drivers don't.


A few things that would help this issue:

1) A mobile-first IDE - an app and a responsive website (that you can share links to) with a great interface for mobile that makes use of strengths like the touch screen, while getting round the weaknesses of screen size and lack of physical keyboard by not relying on typing out full words and not using the convention of having entire files open in a tab (e.g. a single function or code block on screen at a time). Preferably with accounts you can use to log in from whichever device suits at any given point.

2) Games and other apps that are made to be "modded" and that link to that IDE. Modding is a big way that PC gamers get into coding these days - it's the art of making add-ons to games. If this were a common thing with mobile games and apps, you wouldn't just have the PC gamer crowd modding.

3) If programming and development communities were easier to find from a more diverse set of social networks, more people who enjoy expressing creativity with people would find out that they can create this way as well.


Not sure what a TRS-80 is as I don't think we had them here, but I had an old Spectrum and did the same :)

But there are tools on mobile/web to create those things also. I've been researching and trying to get something that is intuitive to work on tablets to program web apps (it started web based, but now I'm moving it to mobile as I think it matches the visual paradigm better).

There are various tools for kids to learn to program creating games (Scratch is probably the most well known) but other visual programming apps also.


Agree. Plus, anyone who has gotten into streaming knows that there is a learning curve which introduces the streamer to all sorts of hardware and software. The best streamers often invent specialized solutions for their needs.


I am hopeful we will see a creative renaissance in technology in our children. You can do more than consume with your phone today. Our children have iPads, which are hybrid computers, at a young age and are learning to access information more readily. They are general-knowledge whizzes and are learning to do more than just consume. The software is better, and with cloud services the average child will probably never see a data center or build a PC; however, they most definitely will use Airtable and probably build an app in the cloud. The world for them will have opportunities at higher levels of abstraction. At least I hope that is the case.


>I wonder if early car enthusiasts felt the same about people who learned to drive when you no longer needed to know how to fix your car all the time.

I don't think you need to go back to early car enthusiasts. A lot of people would argue that at least being able to do basic car maintenance is a life skill everyone should have. I don't necessarily agree but I do think it's useful to understand at least the basics of how a car operates.


The car is a black box. Gas goes in, vroom comes out. If the box stops working, I pay someone to fix it.

You might argue I'm not car literate, but I'm not sure that is particularly insightful, as the type of specialized knowledge is not actually relevant to doing almost anything with a car that people want to do.

Similarly, while coding is great and all, I'm not sure that not being able to code is the same thing as not knowing how to use a computer effectively.


> The car is a black box. Gas goes in, vroom comes out. If the box stops working, I pay someone to fix it.

It would be a huge problem if you're an aspiring mechanic or anything related, but that's an increasingly niche role, so we aren't worried about kids not having mechanical literacy. Many kids are aspiring to work in roles where computer literacy is vital and becoming more so every day. The ones that lack the fluency to do things like automate the repetitive parts of their jobs will gradually not be adding enough value to be gainfully employed.

Not being computer literate will have a huge impact on their careers, not being car literate won't (for most).


I won't argue. And I grew up at a time when cars were less reliable (and my first clunker certainly wasn't). I do think some basic things like changing a tire, jumping a battery, and at least knowing how to check fluid levels are useful. You don't always have something go wrong where you can easily just call for help.


Yeah, except in new cars you don't have access to the battery (they are starting to make the engine 'protected'), and even changing a tire is rarer now, as many manufacturers are shipping a repair kit instead of a spare wheel. Most also have roadside assistance (or your insurance does), which makes it easier.

Most people want a car to go from A to B. If every few years (I've only had one blown tire in 20 years of driving) they have to call someone to help them, it makes much more sense than spending the time to learn how to do it themselves. Same with computers: they don't want to know how the bytes get into RAM, they want to play Doom or write a school paper. There is no need for them to learn all that.


In such cases though I always have a nagging suspicion they are fixing things I don't need or gold plating the service. It definitely happens to some people.

So, I always feel better with a small amount of understanding to arm the BS detector.


Totally agreed. I'm going to be a lot less charitable than you, because to me this is some pretty obvious gatekeeping and I don't have patience for it.

This attitude was around for a while when Linux started getting easier to use (I feel like it's gotten a bit better, but maybe I'm just a part of better communities now). It's the same attitude that came up occasionally in game dev around Unity. It's pretty predictable and pretty tiresome. I don't think the author is a bad person or that they hate kids, but I think this kind of attitude is something that shows up regularly in technical communities and it's worth forcefully stamping out.

This article as it's written isn't interested in education, it's purely inwardly focused on describing how hard the author had it growing up, and how rewarding it was, and how great they turned out, and how everyone else who didn't have that same experience is a poser. It's purely designed to put younger generations down and denigrate them rather than reach out to them in any kind of thoughtful or meaningful way.

The author isn't proposing any solutions. Forget solutions, they're not even identifying problems. There's nothing of substance in this post other than bragging. No mention of how proprietary hardware incentivizes lock-in. No mention of how laws have changed. No mention of how software gets written today and how our toolkits affect accessibility. No mention of the rise of SaaS and how that affects people's ability to modify the programs they run. No mention of education challenges. Just nothing at all.

The only reason this blog post exists is because the author is mad that some kids are getting more attention than they did. And while it's worth talking about increasing barriers to creation, the author doesn't seem to be equipped to do so, and their targets of ire (streamers and content producers, arguably some of the more technically involved youth communities out there today) are poorly chosen.

> "If anybody is a digital native, it is me. I did not just grow up with computers, I grew up alongside them."

We get it, you're very smart. I'm super proud of you for installing Windows from a floppy disk. Do you want a medal? Should we all clap for you? Round up and scoff at the people who didn't appreciate your generation enough?

Notice what this article never says. It never says that a reduced hacker ethos in younger generations is a problem. Its primary concern is not that younger kids aren't engaged enough with technology, or that they're not hacking their devices. The primary complaint this article raises is that younger kids are called digital natives, a title that the article is concerned they don't deserve.


You wrote a critique of this blog post that's actually longer than the blog post itself, but you failed to notice the obvious joking tone the author used, so the whole thing sailed right over your head. He's clearly just doing a humorous "get off my lawn" bit about the term "Digital Native". Relax.


There's nothing wrong with consumption devices, but they are not computers. You are using a computer if and only if you are programming; otherwise you are using a glorified television or a glorified typewriter. There's nothing wrong with that. The vast majority of people want exactly that. We shouldn't fight it, nor should we believe we are better or smarter.

But that's not tech literacy.


> You are using a computer if and only if you are programming

Maybe my definition of programming is quite narrow, but I wouldn't consider using CAD software, spreadsheets, or (physics, space, etc) simulation software as programming, yet I would consider the ability to use them a degree of computing or tech literacy. I would certainly consider them very far from only being a consumption device.


The thing is that computers are useful for so many things that I find even just attempting to define what is "tech literacy" difficult and bound to endless discussion, and well the result is this blog post and the following debate here.

It's as if you claimed "desktop literacy" for the noble tasks that can be performed on a (e.g. wooden) desktop - traditionally, let's say, copying evangelical scriptures - and were angry at those ignorant young folks who don't give a fuck about the Bible but find writing novels cool. And let's not even talk about the peasants who merely use their "desktops" to cook and eat (while in the background a pen maker listens to the copyist's rant with a small smirk).

At this point the vague "tech literacy" term is not useful anymore, and the problem is just that more precise terminology is needed to communicate efficiently.


All of those are heavily algorithmic in nature, so understanding algorithmic complexity makes it easier to understand how and why the software behaves the way it does.

For example, joining together many separate 3D objects is extremely slow if one just selects all and does a union, since the algorithm needs to check all pairs of objects for overlaps in 3D space.

That may not be immediately apparent to non-programmers, but to programmers, it will feel like an intuitive consequence of the problem and the inevitabilities of its solution.
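As a rough sketch of that pairwise blow-up (a hypothetical illustration, not how any particular CAD package actually implements union), a naive approach that tests every pair of objects for bounding-box overlap does n*(n-1)/2 checks, so the work grows quadratically with the number of objects:

    from itertools import combinations

    def boxes_overlap(a, b):
        # Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)).
        (amin, amax), (bmin, bmax) = a, b
        return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

    def naive_union_checks(boxes):
        # Count the pairwise overlap tests a naive "select all and union" would do.
        checks, overlapping = 0, []
        for i, j in combinations(range(len(boxes)), 2):  # n*(n-1)/2 pairs
            checks += 1
            if boxes_overlap(boxes[i], boxes[j]):
                overlapping.append((i, j))
        return checks, overlapping

    # 1,000 objects already means ~500,000 overlap tests before any real geometry work.
    boxes = [((i, 0, 0), (i + 1.5, 1, 1)) for i in range(1000)]
    checks, pairs = naive_union_checks(boxes)
    print(checks, len(pairs))  # 499500 checks, 999 overlapping neighbours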


How many people are at the intersection of (a) being able to competently use CAD software or specialized software for physics or engineering and (b) being unable to write a simple script?

Literacy isn't about reading all the time, it's about being able to read if you want/need to. Same with computer literacy.


Essentially all the older non-software (i.e. electronic, mechanical, aerospace) engineers I know.


> There's nothing wrong with consumption devices, but they are not computers. You are using a computer if and only if you are programming, otherwise you are using a glorified television or a glorified typewriter.

Looks at all the creation-related tools on his iPad and iPhone, none of which involve programming

You sure about that?


That's.... a bizarre frame I don't think I've ever heard before.

Does this mean that my laptop undergoes some sort of miraculous transformation when I type :wq and open my browser?

What if I'm watching video in the background while coding - does my laptop then enter some superposition-state?


I don't see how this is strange, I think I failed to explain myself.

The defining feature of books is that you can read them, and books are useful because they have that property. You are literate if and only if you can read a book, regardless of whether you are reading a book at the moment.

The defining feature of computers is that they are arbitrarily programmable machines, and computers are useful because they have that property. You are computer-literate if and only if you can program a computer, regardless of whether you are programming a computer at the moment.


Wait, hold on though. You just jumped from reading to writing.

You are literate if you can read an arbitrarily chosen book. You are an author if you can write a book. Similarly, you are computer-literate if you can accomplish tasks in an arbitrarily chosen program and maintain a system. You are a computer-author if you can write a program.

We use literacy to describe understanding, not creation. What you're claiming is the equivalent of saying that modern generations can't be book-literate unless they're writing fan-fiction.

I would argue tech-literacy means understanding the common language of tech/UX today and being able to use common technology. Someone is tech-literate if they can "read" technology and are comfortable with common interface conventions, terminology, system maintenance, etc... Anyone who sets up a streaming platform using OBS, who manages a community using admin tools on tech platforms, who figures out how to tune games that they're playing to accommodate recording without dropping frames or de-syncing audio, who sets up microphones and figures out how to balance audio using mixers, who edits the result and uploads it to Youtube -- they would fall very squarely into that category of literacy. They clearly know how to "read" software, even complicated programs like video editing tools.


A book is more or less only useful for reading. Well I suppose I can stack some books if I want to elevate something on my desk but pretty much.

Whereas a computer can do many different things. Yes, it's because it's an arbitrarily programmable machine but if I choose to use software written by others rather than programming it myself I'm not sure why that's a lower use.

In fact, I can program but rarely do so. Usually I'm using a computer to do tasks like writing, working on photos, etc. My day to day use is sort of irrelevant to the fact that I can do some programming. So I guess I'm not computer-literate.


So if someone who can't read picks up a book, it's actually not a book...?


But isn't it though? Does someone literate in film need to make movies? Why can't someone be literate in tech without writing html?


It is the simulacrum of tech literacy.

Just like an Office 365 subscription is a simulacrum of a horde of calculators (people) with mass-produced calculators and typists on mass-produced typewriters, which were simulacra of noblemen scholars, pens (mass-produced simulacra of quills) and papers (mass-produced simulacra of expensive parchment)… which themselves were simulacra of prehistoric humans making cave paintings.

So do you really feel the need to tabulate, calculate and write today?


What an odd form of gatekeeping. To what end?


I am specifically not attempting to 'gate-keep'. You don't want to code? Don't. I'm not going to force it down your throat.


You're gatekeeping who qualifies as "using a computer", and using a really odd definition to do it.

Now, the rest of your point is, who cares? Do with that thing whatever floats your boat, it's fine if you don't program. That point is valid. But when "that thing" is a computer, and they're "doing" something with it, that qualifies as "using a computer". At least by the definitions the rest of us are using.


That's just a debate on the semantics of the sentence "using a computer". Maybe your definition is better, I don't particularly care.

The point I'm trying to make is that I perceive a very clear conceptual difference between "using a computer to program" and "using a computer to do other things", and I would tend to believe an effort to improve computer literacy should attempt to point people to the former rather than the latter.

Do you believe those things to be the same? I have a strong intuition that they aren't, but I'm very willing to hear a counterpoint.

Once again, I'm specifically trying to avoid any kind of value judgment.


My counterpoint to

>I would tend to believe an effort to improve computer literacy should attempt to point people to the former rather than the latter.

is why? I mean sure. If they want to become programmers, then they need to learn how to use a computer to program but I'm not sure why that's any more about "computer literacy" than lots of other tasks.

I use computers for lots of things on a day to day basis, including many "creative" tasks, and almost none of those involve programming. Programming is a specific way that you can use a computer and it may imply deeper knowledge of the underlying system than making a video, but so what?

For that matter I could equally argue that a pure front-end developer isn't really computer literate because they maybe don't understand how kernel schedulers, security models, processes, interrupts, etc. work. Oh, and how about TLBs, cache eviction policies, dynamic resource allocation, etc. at the CPU level?


Because that's the fundamental, distinctive, unique thing about computers. That's the reason we bother with computers in the first place.

There's certainly nothing wrong with using a computer for other stuff, also. Or even exclusively, not everyone needs to be a programmer, that's for sure. I would argue, however, that algorithmic thinking/basic scripting is a very important piece of human knowledge, is it really more esoteric than Latin or Greek, which are routinely taught to high-school students?

> For that matter I could equally argue that a pure front-end developer isn't really computer literate...

There are different levels of literacy. A 7-year old and a PhD in English literature are both literate, but not at the same level. Same goes for computer-related knowledge, it's a bottomless pit, like any other interesting field.


Another instance of "middle-aged tech nerd makes fun of kids these days"

I get it, you're not too old to be good at computers, but I don't see the need to put down the younger generation and act like their tech knowledge doesn't extend beyond "video streaming".

Acting like kids are computer wizards is an exaggeration, but acting like young people know less about computing than this older generation is simply wrong.


I have taught kids for 20 years.

Young people (on average) know less about computing than they did.

The author of this article says "floppy disks, or 'the save icon' as the young kids would recognize it" - except that they often wouldn't. Kids don't have any experience of saving and loading files.


> Young people (on average) know less about computing than they did.

They know less about desktop computing.


And they know more about cultural computing. In the same way they'd be clueless when faced with a DOS prompt, many older people are clueless about how Insta etc work.

You could argue that's not really computing, and in a very obvious sense it isn't. But it's certainly a form of application development, for a very specific kind of application.

And the up side is that - unless AI takes over - people who can do nuts and bolts infrastructure computing will be rarer and in even higher demand than they are now.


I do disagree with the article: there are lots more opportunities for kids to deep dive into almost any field of tech that they choose to.

But, on the other hand, I've met teenagers who lacked confidence even using a browser to navigate to an unfamiliar website. They've learned how to navigate particular apps as silos.

This reminds me of my parents and grandparents' generation using computers. Comfortable with what they know and use regularly; no concept of what happens outside of that narrow window.


Kids these days know less about cars than they used to. Lots of them drive, but they couldn't service the engine.

Same energy.


Tangent.

I grew up with cars I could (and did) service. I still remember how to break down a drum brake, and adjust it so it worked again when re-assembled. I've used grease zerks on the auto I drove. I've helped troubleshoot a carburetor.

Can't do that anymore, and it's only partially because the components don't exist; mostly it's because everything is so much more complicated that you have to specialize in it to get much done.

Tangent wrap-up.

Maybe these "digital natives" have simply decided to specialize in something other than computers (and cars). Not all of them: I work with several folks who are decades younger than me, and they're perfectly comfortable with packet captures and bare metal databases.


My great grandma played Klondike Solitaire with actual cards. How low-level is that?

Yup, this article is some world-class curmudgeoning. Most kids my age weren't programming at 7, so I had free rein on the Commodore PET in the school library. Same thing goes for kids of today. Most are passive consumers; some will tear their world apart and figure out what makes things tick.


> Can't do that anymore, and it's only partially because the components don't exist, but because everything is so much more complicated you have to specialize in it to get much done.

As a tuner, this reads strangely. I've rebuilt my ABS disc brakes and ECU tuning has been a thing for decades. People are building 300hp Saabs in their garages with open source software.


> I grew up with cars I could (and did) service. I still remember how to break down a drum brake, and adjust it so it worked again when re-assembled. I've used grease zerks on the auto I drove. I've helped troubleshoot a carburetor.

Maybe not the same thing, but about two years ago, my car's check engine light came on. Dealer quoted me $2400 to fix the issue (ABS pump failed because of low voltage on system power). About a month ago, some friends and I spent a Saturday replacing the ABS pump. $300 ebay part, $20 worth of tools, $10 worth of brake fluid, $50 worth of OBD-2 readers to confirm what the problem was, and $50 for lunch for my friends.

I think it's not impossible to fix cars today. Maybe you won't be tuning an engine to get more power out of it, but it's 100% possible to tear down your brakes, brake systems, and lots of other stuff if you know where to look. I'm not super technical with cars, but if you forced me to it, I could change out my car's brake pads given two days.

This stuff isn't easy, and the tools are more computerized now, but if you want to understand modern cars, the info is out there. Buy your car's maintenance manual. It'll probably cost $30, but you'll have more information about your car than you could ever possibly learn. Every circuit, every resistor, every bus line, every hydraulic line.

But I think you're right that there is a class of people who work at a level above the tech. And I think that's what "digital native" usually refers to. Something less like a software engineer, something more like a societal engineer (I don't want to use the term "social engineer" because that's a loaded term in the security world). For them, tech is a platform to be used for alternate ends - followers, and likes, and clout and such.

I think it's a good thing that not everyone needs to be able to troubleshoot DHCP issues in order to get online - though I know that my friends and I got really good at that in high school. The fact that we enable people to operate at the level of societal engineering (as opposed to forcing everyone to operate at the level of software engineering) speaks to the maturity of tech. And that, at least, should be comforting to those who use tech to create tech.


Desktop computing had core skills that could be developed, skills that applied across different programs.

On mobile systems, the balance of power feels very different. Apps are each their own immersive experiences, each picking their own widgets, toolkits, styling, paradigms. Many apps are a thin shell over far-off closed services we could not understand if we tried.

Mobile has waged a very advanced war on general purpose computing. The value of understanding the OS, of understanding "computing", has gone way, way down in this far more black-box environment. The user has been treated like an idiot, protected endlessly from themselves, and the technical underpinnings deeply, deeply masked over.

Computing is being destroyed, especially on mobile. The desktop is one of the few places one has any chance to learn about computing in any meaningful way.


I wish this thread would focus more on this, and less on how poorly the article expresses it. The very concept of "mobile" as some sort of distinct platform is an attack on general purpose computing. Refer to a smartphone as a "computer", and people will look at you strangely. The entire thing is a highly successful exercise in escaping the decades of cultural norms surrounding general purpose computing.

Until you can make a first class "mobile app" on a mobile, it's a system for manipulating people.


Which is a problem when you get to work in the real world; in some cases I have seen a lot of basic knowledge missing.

Of course it's muggins here who has to perform the task of putting it all back together - the emotional labor, to use a modern term.


The business world will all be app-based once Gen-Z starts founding companies.


I've already seen some of that: various agencies sharing work on numerous different "sharing platforms", and these are not interns, these are supposedly big-name agencies working for major brands.


And I don't see any merit in being proficient in using any touchscreen device, which have been proven intuitive enough for a toddler or even an ape to understand. So you can tap, drag, scroll, swipe? Impressive digital nativity.


Yeah... I agree, and would argue further that there is a specific kind of mistake being made in this post that I see many in tech make...

That of believing that living lower down the stack makes you superior somehow...

Here's an example of someone I think is clearly fucking amazing but arguably lives much higher up the stack than this guy...

https://www.google.com/amp/s/www.theverge.com/platform/amp/2...



The complexity today is so great that understanding the entire system requires a vast amount of knowledge. The Commodore 64 was notable for being totally described down to the hardware and bit level in a 1 inch thick book. People who grew up with that could fully understand that little device.

With a modest amount of effort, you could totally understand an AMPS phone, the last generation of analog cellular. It was a two-way FM radio controlled by a tiny CPU, no more complex than a Commodore 64.

Understanding a modern phone is a huge job. Even at a general level. There's a 90s data center worth of compute power in there. About four radios. A GPU. A 6-axis inertial guidance system. Several cameras. A rather excessive amount of software. A web browser, which is itself overly complex. Voice codecs. Voice recognition. Quite possibly a machine learning system.

That's a lot to understand. It is also not useful to the end user to understand it.


I suspect a better measure of technical proficiency is the ability to shape the technology to reflect your own needs. That could happen at a much lower level on the Commodore 64 since it was a much simpler machine, yet that should not invalidate the proficiency of a JavaScript programmer simply because they work at a much higher level.

Put in other terms, even those who absorbed the Commodore 64 programmer's guide didn't truly understand the machine. They may have known how to write software for it and even how to exploit the various chips, but that manual only provided surface level details about the electronics.


> Understanding a modern phone is a huge job.

I’ve tried (well maybe not phones, but other things). There’s a lot of stuff and large chunks of it are either undocumented, do not provide documentation unless you are in a contract with the company or are very superficially documented. On top I usually ended up finding I could not fully comprehend how A worked without comprehending B which required C which required D, etc.


Is there anyone alive who completely understands a latest generation iPhone down to the bit level? It would be impossible for someone outside of Apple to have the access necessary. But even if everything was completely open source would it be within the realm of human ability to understand something so vast and complex in the way that it was possible to understand a Commodore 64?


Some speculation here: there is a certain percentage of people who are interested in technology for its own sake in any generation, and that percentage is relatively stable. So yes, I agree that treating kids as computer wizards is an exaggeration and claiming that they are ignorant is wrong.

That being said, the "digital natives" claim really rubs me the wrong way even when it is confined to the subset of people who are genuinely interested in computers. It leaves the impression that the younger generation has little to learn from the older generation. The flow of knowledge should be going in both directions.


It's probably that older people growing up with computers needed to do something with them (they could not just stream Twitch), and hence when you had a computer, you were what today would be called a power user.

Today, just millions of normal people use computers, tablets, phones, appliances every day and they get along fine - but they are not power users, since you can spend your time on the computer with almost zero knowledge about it.

A concrete example would be that it's totally fine today to not know, what a file or folder is - because all you see is a feed or an album.


For example, to run your game, you might very well have needed to go into your computer and make changes to your autoexec.bat and config.sys files.
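For anyone who never had to do it, those tweaks looked something like this (a generic sketch from memory - the exact drivers, paths, and settings varied per machine and per game): squeeze out conventional memory in config.sys, load drivers high in autoexec.bat, reboot, and hope the game's setup program could find the sound card.

    REM config.sys - memory managers, load DOS high, enable upper memory blocks
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    FILES=30
    BUFFERS=20
    DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001

    REM autoexec.bat - load CD-ROM extension and mouse driver into upper memory
    @ECHO OFF
    LH C:\DOS\MSCDEX.EXE /D:MSCD001
    LH C:\MOUSE\MOUSE.COM
    SET BLASTER=A220 I5 D1 T4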

Do a lot of kids (above a certain age) actually not know what a file or folder is? Doing perfectly ordinary day-to-day stuff which is not at all technical in nature, I find I have to save and open files all the time.


It's a common problem when people have tied up their sense of self-worth in tweaking with janky, broken stuff, rather than on the things you can create with that. See also bitter, mediocre photographers who revile folks for being "fauxtographers" because they're making a living from photography on Instagram instead of fucking around with expensive lenses and Photoshop.

My kids know a lot less about the internals of a tech stack than I do, but they create a lot more things with the tools at their disposal than I do. Unlike the author, I think they're in a better place than I am.


Gen Y / Millennials usually know a fair bit about computers. They grew up as internet-connected desktops became a mainstream thing.

Gen Z usually seem more like Boomers who can maybe do wonderful things with the video-telephone thingy but don't know a lot about it.


Talk to kids about cryptocurrency mining and suddenly you have teenagers who understand Merkle trees and zero-knowledge proofs. They understand mining rigs and GPUs, which is about as ground-up as anyone in the PC generation ever got. They also use Raspberry Pis, there are real advantages to learning reverse engineering (Hackaday is evidence of this), and I'd even say Ethereum is the new NetBSD.

The abstractions we spent the last 30 years on (operating systems, protocols, etc.) will fade like musical genres. There were literally people who spent 5+ years of their lives learning OS/2, SCO, VMS, Novell, and others. I'm optimistic about kids.

Also, there are demographic issues. Consider that aptitude is Pareto distributed, which means the population lull between millennials and the younger zombie-apocalypse generation creates a smaller total sample of people in that range, so you don't encounter as many upper-percentile people in that cohort because there just aren't as many of them. Any minority demographic is necessarily going to appear less exceptional because even if the distribution is the same (Pareto), you're going to encounter more of the long-tail cohort and fewer of the very tiny exceptional elite in that selection. This guy is on about a group of kids who have the same aptitude distribution as older folks, but since there are fewer of them than the mega-generations like X, Y, and M, he's ignoring their exceptions and focusing on their long tail.


> Talk to kids about cryptocurrency mining and suddenly you have teenagers who understand Merkle trees and zero knowledge proofs.

Yeah, an extremely small subset of nerdy kids. When people speak of "digital natives", they mean the population at large. Nerdy kids are nothing new.


There are a lot of teenagers who are interested in cryptocurrency, but I think relatively few of them are actually mining themselves, and an even smaller subset of them understand much of the cryptography at work.

I'm speaking from experience; I mined some Dogecoin in high school, but at that point I was just barely learning for-loops.


It took myself a while to admit, and I get why this is an unsatisfying take for a community as heavy on creatives as HN: there are digital native makers and digital native consumers, and the latter is utterly dominating the young population to the point of relegating the makers to the fringe.

The age of making tools and coming up with workarounds in the face of limitations has come to an end as a mainstream computing activity. A person using a computer in the 90s had to know a lot of technical details and was far more likely to need this knowledge in order to accomplish something. A person growing up today does not, in general, have this need: they can go their whole lives without ever needing to know anything "technical" by virtue of never brushing up against serious limitations that seem surmountable with personal expertise.

Before judging what's good and bad about this, other people have rightfully pointed out the exact same shift in other technology/tools sectors, such as the car industry. The most positive way I can frame this shift in respect to computing is:

In the beginning, the tool itself mattered the most. Its composition, its setup, its ever-evolving but inspectable properties. For many, the tool itself was the purpose. Overcoming its limitations and improving it was the purpose.

By now, the tool itself is no longer relevant. It only matters what you can do with it. The works made with the tool are center stage. Whatever limitations the tool still has are accepted as unchangeable and/or not worth contemplating. Spending time on the tool itself is considered, at best, a weird hobby.


The knowledge of making and maintaining these tools is still incredibly important and should be handed down through the generations (while also inventing innovations to make more efficient and powerful tools).

But what many older generations in tech who constantly lament the "new generation of tech-illiterates" ignore is that there are actually quite a few people (both old and young) who understand this and are dedicating themselves to learning, teaching, and creating low-level mechanisms and systems (such as the people in the Handmade Hero Network, the young new Linux hackers contributing to the kernel, and hobbyists creating their own game engines). And it's a grave disrespect against them to say that nobody actually cares about anything. If you really are worried about this, don't just complain and cry about it, be the change you want to see in the world.


I was starting to write a reply about how much I agree with you, but then I came to the second half of your post.

> And it’s a grave disrespect against them to say that nobody actually cares about anything.

I think you might have completely misread my intent. For starters, if you're a HMH watcher, I invite you to tell Casey how much society cares about software quality, you're probably going to trigger a well-deserved rant.

When I say that as a society we now have fewer low-level tools developers per capita, that's not supposed to be an insult to you personally, nor is it necessarily a lamentation on the state of affairs.

> If you really are worried about this, don’t just complain and cry about it, be the change you want to see in the world.

I have no idea where this criticism connects with what I wrote or meant. I did not mean to "complain and cry". My GP post was set against the backdrop of the article, and my disagreement was specifically with the article.


Sorry about this, my intent for this criticism was not towards you, it was towards other people around HN who generally go around and generalize the young generation for being ignorant.

Maybe I should have written this in a different place...


> When it comes down to the meat and potatoes, they do not know jack shit.

Don't forget to tell the kids to get off your lawn after you're done shaking your fist at that cloud.

I find it pretty difficult to believe that, in the age of universal instant access to technology information at every level of abstraction, "kids these days" are somehow less knowledgeable about computers than they were in the author's glory days of, presumably, the early-to-mid 80's - "I grew up with the 286, 386, 486 and all the other x86’es."

If I had to, I'd bet that just as many (or just as few, depending on your perspective) "kids" get their hands dirty (as it were) with the guts of computing as ever have - and having written this out, my assertion sounds absurdly conservative. How is it possible that more kids don't know more?


I'm a late millennial, born in 93. I've met a few of the vintage computing nerds in my travels, and am constantly impressed by the depth of their knowledge.

I wouldn't say that young people are bad at _using_ a computer, or even building/servicing them, but we don't understand the technology in the same way the older generation had to. "Computing" just isn't a hobby the way it used to be.


> I've met a few of the vintage computing nerds in my travels, and am constantly impressed by the depth of their knowledge.

Sure, I've met a few greybeards too, and I'm happy to agree that they have tremendous depth of knowledge about computing that most of us millennials (myself included) lack. But keep in mind that they have a depth of knowledge about computing that most of their generational peers lack as well!

Additionally, your statement that "we don't understand the technology in the same way the older generation had to" is just as true if you reverse it: the older generation doesn't understand the technology in the same way WE have to. For my part, I find it pretty difficult to look at the obvious technical aptitude of somebody like, say, Alyssa Rosenzweig or Mike Stewart - and then conclude that the younger generation doesn't "understand the technology."

Perhaps the problem here is that we're trying to compare multiple unknowns. What counts as "impressive depth of knowledge" for a generation? Is there anything objective one could use to say "actually yes, this generation really does know less than that one?" Because to me it sounds like the argument being made is "the fact that they don't understand what we went through is proof they know less than us" - not that they know differently but that they know less - and I don't buy that argument.


While the author's point is probably valid to some degree, it's not really what digital native means. [1] It was popularized by an education consultant so understanding it primarily in the context of consuming media and interacting with computers of all types is likely the right lens. It's not really connected to being able to program and having a lot of under the hood computer knowledge.

[1] https://en.wikipedia.org/wiki/Digital_native


I've thought the same thing. I was lucky enough to grow up with computers. I matured as they did. This allowed me to understand the hardware and software fundamentally. It seems like a lot of that knowledge has been abstracted behind the well polished machines that exist today. It makes it hard to really learn what these devices are actually doing.


I grew up in that era as well - but I realised it doesn't matter. I used to have to compile my own kernels and fumble around with downloading and installing the nVidia drivers, and now my distro deals with that for me. I can go look if I want to, but I don't need to.

And you know what, it's better that way. The tech scene and civilisation move forward because those abstractions let us achieve more with the same amount of time.

I fully embrace the notion that one of the problems with the well polished machines is that access to the underlying layers is actively blocked. I think this is a very real problem.

However, if you want to learn programming, electronics, operating systems or any other level of the stack there is a plethora of options available to you. Most people simply have other interests.


Yeah. I suppose that is right. For me, that was an ideal way to learn, but that doesn't mean there aren't different/better ways. And don't get me wrong, I'm all for these layers of abstraction helping simplify complex systems and allowing for progression. In fact, that's a big part of my career now.


Fortunately, these abstractions are horribly broken, so you get to peek behind the curtain on the regular.


That's still not what "digital natives" means. Yes, sure, there's millenial nerds like myself that had to actually know a bit about computers to use them. That's not the point.

Digital natives were born after the inception of the internet. If they don't want to use it to learn about computers, or hobbies, or ancient history, that's their prerogative. But I bet the percentage of young folks enriching their minds is about the same nowadays as back then; it's just that now they are outnumbered by general users. No, you don't get 1337 skillzors by osmosis; that's not the point of digital natives. The point is they were born with greater intellectual opportunities than any prior generation.


> The point is they were born with greater intellectual opportunities than any prior generation.

Nope, they weren't. This is the least intellectual generation of the last 60 years or more. There's nothing "intellectual" about following the latest popular trends on Facebook or Twitter.


That’s not what they said. The parent said they’re born with more intellectual opportunities than other generations. In other words, they have more content to fulfil whatever intellectual curiosities they want.

> There's nothing "intellectual" about following the latest popular trends on Facebook or Twitter

This just comes off as smug and self-serving bs.


I think there absolutely are more intellectual opportunities. On the flip side, I also believe the absolute number of distractions has increased by an amount that seems to dwarf the productive activities.


Fluency in digital culture is a separate skill from expertise in digital technology. Both are valuable, but the latter is what "digital natives" generally refers to.


“Oregon Trail” generation here. I’m more tech literate than my older and younger family members. But I also made tech my career so I’m not sure if this is a fair anecdote.


Same, same, and same.

I think there's a tendency for anyone who halfway gets how computers work to end up in tech, because halfway getting how computers work pays much better than almost anything else that's anywhere near as easy to get into. About the only reason not to is if you can't stand sitting in front of a screen reading and typing arcane crap all day (a sentiment I very much sympathize with).

If you're an Excel whiz and kinda understand how to navigate a filesystem and how that relates (and how it doesn't) to what you navigate to in a web browser, then you're not that far from being a Javascript jockey, which pays better than most mid-tier or lower office work, even entry-level. So mid-tier office workers, and others at or below that pay level, are usually not much more than barely competent at using computers, because if they were better they'd rarely stay in those roles.

I don't find that younger generations are much more inclined to "get" computers in that way than, at the very least, Gen X and Millennials are/were. And the main thing driving younger kids to learn actually-useful computer skills seems to be the same as before: PC video games & modding. Dunno what will happen if those stop having a unique appeal compared with mobile and console games. Probably universities will have to get used to most of their best CS candidates not having already done half the university's job for them, in self-directed free time from ages 8-18.

I do get the sense that a hell of a lot of kids are better at video production and photography than in our generation, but the tools they have for that are so much better, easier to use, and cheaper, and there are more incentives (YouTube and such) for them to at least give it a try, that it'd be surprising if that weren't the case. Smartphones, free YouTube how-tos, free non-terrible video & photo editing software, and much faster computers, completely changed the accessibility of getting hands-on with those things. You can do more experimentation with, say, lighting, in 10 minutes now than you could in an hour with your folks' old tape camcorder.


I spent way too many hours in a darkroom in high school and college. While it was fun, I'd never want to go back to all the time spent messing with chemicals.

Video of course was even worse. In fact, I was somewhat into film at the time but the overhead of dealing with heavy tape camcorders and editing was more than I could deal with. I'd definitely have done more if I'd had access to even an iPhone and Final Cut Pro X.

Good photography and good video are still hard of course. But so many barriers have been removed.


Seriously, the amount of practice and skill-improvement a kid could get in an afternoon of trying out angles, cuts, and shot-movement, just trying to imitate some director or DP they like until they're getting similar results, all on their iPhone, is incredible.

Just a little more work and they can post a highlight reel of their practice session, to be viewed by anyone in the world. After a few times doing this, the work might even be looking pretty good, with nothing more than a smartphone for the entire process.

Repeat a few afternoons for various other aspects of technique, and you'll have had more and better practical experience at some aspects of production and editing than people who'd worked at it for months or more in the 80s or 90s. The learning feedback loop is so tight now.

I mean, damn. That's cool.


Oh, and by the way, they can watch a video on YouTube in which someone does a detailed walk through of Tarantino's framing and angle choices in a scene.


OK, Digital Native, go create some content edited to the quality and style of something like https://www.tiktok.com/@happykelli [1]

Content Creators today on TikTok and Youtube are as talented and multifaceted as some of the most talented musicians of previous generations. And they're using the tools of the day.

Don't be condescending because they don't know the low-level abstractions that are no longer relevant to them.

They are pushing the envelope and doing advanced shit with the best tools available to them JUST LIKE YOU USED TO.

[1] I feel like I might even have to use this example for anyone whose understanding of TikTok is just "pick an effect from a dropdown list then make funny faces at the camera until the ML algorithm generates something awesome".

Most of the videos of this person are completely impossible to create natively through the app. They require a clever combination of built-in filters, offline video post-processing, and extremely precise editing. Plus some awesome dancing.


I think the way “digital native” is used, it means someone for whom the technology is so pervasive that they don’t think about how it is implemented and spend more time thinking about how it can be used.

Many times, it is not the people who developed the technology who are aware of all the ways it can be used, but rather the people for whom the technology was a given.


The first time I heard the phrase digital native was about ten or so years ago from a then 22-year-old guy in IT. I get what he was saying; most of his life he had digital devices and internet access so it was more ingrained in his psyche. He was trying to use it as a differentiation between him and the rest of us old farts. Of course he did not have enough context to realize that folks older than him were no less digital native with our own computers, video games and slide rule calculators.

At the end of the day it is just a term millennials used to market themselves in the tech community. I am sure the kids born in the 2010s will come up with their own differentiations (first always connected generation maybe?).


I'm a millennial and have never heard anyone younger than my parents use this phrase.

I've always hated it for similar reasons: the "digital natives" are mostly just used to being slaves to corporate software publishers/authors and don't understand the freedom that a properly configured PC provides.


While I feel similarly, this strikes me as a nearly perfect example of No True Scotsman.

https://en.m.wikipedia.org/wiki/No_true_Scotsman


As a commenter on the article stated and summed up better than I've seen here in these comments (maybe Sam is one of you here), a native doesn't at all mean someone who knows the ins and outs of today's modern computers. A native is simply someone born right into using it. These kids may not know anything about how their phone chips work, or how their school laptops work, but they had cheap computers in their pocket and laptops at school, where us older generations didn't have any of this stuff growing up. "Digital native" means they are born into it and everyone around them is using it. It has nothing to do with being born in the 1970s and having used computers as the odd guy out in the mid-1980s. That's cool and all, but it's not what makes a digital native.


The phrase "digital native" is silly, like "petrol native" to describe the 1900+’ers. Being exposed to apps won’t make the next generation superior. But having parents from the previous generation surely will, without any doubt at all.


Key quote:

> If anybody is a digital native, it is me. I did not just grow up with computers, I grew up alongside them.

Yes, this. I grew up and grew with technology as a member of the Oregon Trail Generation. Of course, kids these days will learn to operate with the latest and greatest abstractions and will continue to do awesome things. The folly is calling the whole generation "digital natives." Just because you are exposed to (consumption-based) technology, it does not mean you understand it at some foundational level. It is similar with cars - I've driven cars all my life, but working on them? I'm terrible at it.


I don't think "digital native" was ever intended to mean or imply "skilled in computer science", and more "takes digital technology for granted (and therefore thinks differently)".

For example, if you ask where someone lives for the purpose of meeting them there, an analog native might answer in terms of driving directions. This would never occur to a digital native, who would instead answer with an address under the assumption that you'll plop it into your GPS.


Coming from using dialup with a US Robotics Sportster 56K modem and the Mosaic browser on Win 3.1 with Winsock, I am glad my kids have the iPad.

I learned a lot, but things are now far more complicated yet easier for ordinary people to use.


I think millennials are the first and last generation of digital natives. That is the group that had to deal with computers when they were not yet polished and hiding everything, but that still grew up and adopted the modern way of using them. A good question for this group is: what use of technology wouldn't they be able to pick up if needed? What new app or skill? Of course, young people using the apps and social networks know what is going on there and how to use them, but that isn't really the key. The key is being able to do that and understand computing at some level.


Nothing I read in this article has anything to do with someone being a "digital native". I read digital native as digital-first. In this case "digital native" rings true. Hell, reading this rant, I caught myself thinking along the lines of: pshh-- 286? What, you didn't breadboard your first PC and clear your EPROMs with a UV light? The feeling that the youth knows less than the olds is something I have heard for as long as I have been able to understand the concept. If this were really the case, we would all know nothing by now.


The author has a point. I disagree with it fairly strongly however.

"Digital Native" doesn't meant built digital, it means uses digital means first.

I get it's a slippery concept. YouTube creator sounds like a narcissistic ploy. It is, to a minor degree, but it's creating content for others to use.

So to map an analogy, this guy is complaining about people using the roads he builds and knows how to build, and fails to recognize the farmer, the opera singer, the teacher, the Cloud Engineer, and so on that use those roads regularly.


I'm in the same generation as (or maybe even slightly older than) the author of that article.

And at work I tend to see a lot of the "digital natives" who can't understand how the black box was built. They put together things they don't really understand, and they really like to throw around a lot of lingo to deal with what I suspect is a bunch of imposter syndrome.

There are really fairly few people who understand how it's built and will drill down until they find and fix the problem.


Yes. Today's "digital natives" just mean they know how to open Netflix on their phones and make a tiktok video.

Not dissing those activities, but it's not technical knowledge per se. Ask them what an IP address is, or how they would go about replacing a hard drive, and you get empty stares.

I've been hearing from some companies/hiring managers that candidates in entry level jobs have difficulty with basic Word/Excel usage.


> If anybody is a digital native, it is me. I did not just grow up with computers, I grew up alongside them.

How is that not misunderstanding the meaning of native? Google's first definition for native is "a person born in a specified place or associated with a place by birth, whether subsequently resident there or not." Growing up alongside computers is another way of saying they weren't with you since birth.


I’m always reluctant to generalize. If I remember correctly, most people from our generation (the author's and mine) were pretty bad at using computers. Computer geeks were a small minority.

I see the same thing in my son’s class. There’s a couple of kids who can mess around with HTML and JavaScript and programmable LEGO. But they’re all well versed in social media in the same way we could all type on a T9 keyboard.


If you haven't, make sure you teach your kids proficiency with desktop systems and the command line.

That way they will be lords instead of serfs


Pluralization are not what you think them is.


I think the most generous reading of this article is something like: The general public falsely believes that most people in their 20s are much more proficient with technology than even nerds in their 40s and 50s, simply because they have had access to computers and smartphones most of their lives.


I kind of miss the days when everyone was chatting on the phone, ICQ, and MSN, and you needed to be online at a certain time to reach certain people. Nowadays my brother, who is a digital native, just chats with his friends all day long on Discord, which is something that I could just never do.


Something that's missing in all of this is why the author was into computers. They obviously were, and running a homelab is as good a hobby as any, but to what end? Kids aren't learning this because why would they, when all they want to do is watch TikTok videos?


This is confusing knowing how to use technology with knowing how technology works under the hood. I can drive a car, but I'm a shit mechanic, and I only vaguely know how to go about designing an engine.


All that comes to mind is how people who displayed deep technical knowledge online, back in the 90's, were often misanthropic, tactless stereotypes who loved hacking because they wanted power.

To the extent that people who are now in their 30's and 40's "learned computers", it was mostly to get the printer working, load a game or bypass school security. Young people today do the same things, just using slightly different mediums. If you started earlier you might have learned more about your 8-bit micro's hardware, if you started later, more about servers and 3D graphics. It's just stuff.


> were often misanthropic, tactless stereotypes who loved hacking because they wanted power.

It's not like the SV techbros of the 2000s and 2010s are any different from that POV. We used to complain about the power wielded by the local BOFH sysadmin, but that's nothing compared to present-day "tech" monopolies.


Old man yells at children to get off his lawn



