Just spitballing here, but I'd expect peak tech-savviness to have occurred in the 80s and early 90s. If you were old enough to want to play resource intensive games released in the run-up to Win95, you'd sometimes need to free up memory by messing around with config.sys and autoexec.bat. Good times...
I think it's simply that modern machines are (generally) so reliable that you never have to get under the hood to figure out what broke until you (rarely) do and then you have no conceptual framework for trying to resolve what is broken.
His troubleshooting process is handing it to me and telling me "it's not working".
I refuse to fix it unless he watches and pays attention to how I'm fixing it, slowly he's starting to understand that none of this stuff is magic.
I don't care if he has no interest in computers generally, but I think it's important that he realises the devices/technology he interacts with aren't just magic black boxes.
He was interested in what I was programming the other day though, so I told him we could build a website together as long as he chose the subject, wrote what he wanted to put on it, and wrote the code under my supervision. Inevitably it's going to be about Fortnite, but I'll take whatever wins I can get.
Generally I'll answer his questions with questions. We had a discussion about IP addresses the other day (though he had no idea what an IP address was) because he was curious how one computer "talks" to another, so I asked a bunch of questions. "Assuming you had lots of computers, how would you tell one from the other?" "I'd number them." "OK, so what if you wanted to replace a computer but keep talking to it as if it was the old computer?" "I'd make it so the numbers could be changed for each computer." "OK, so you have millions of computers with millions of numbers, how would you know which number went with which computer?" "Well... I'd name them, but in a way that I could say this name belongs to this number." I was proud: he pretty much figured out DNS without knowing DNS was a thing.
So then I showed him the config panel for a web host and pointed out that his names and numbers were an actual thing.
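The name-to-number scheme he reinvented can be sketched in a few lines of Python (the names and addresses below are made up for illustration; real DNS is of course a distributed hierarchy, not one table):

```python
# Toy sketch of the DNS idea: names map to numbers, and the
# number behind a name can change without the name changing.
registry = {}  # hypothetical name -> address table

def register(name, address):
    registry[name] = address

def resolve(name):
    return registry[name]

register("dads-server", "192.0.2.10")
print(resolve("dads-server"))   # talk to it by name: 192.0.2.10

# Replace the machine: the number changes, the name stays.
register("dads-server", "192.0.2.99")
print(resolve("dads-server"))   # 192.0.2.99
```

The addresses use the 192.0.2.0/24 range reserved for documentation, so nothing real is being pointed at.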
You have the same problem with cars. When I was 18 I was able to disassemble and reassemble my car and basically fix anything which broke. I even replaced the bodywork one time after a crash.
Today nobody even changes their summer/winter tires on their own anymore, let alone changing oil or renewing brakes.
You could also say that the machines became so reliable and so complicated that most people don't need to look under the hood, and if they do need to, it's too complicated anyway.
However, if you somehow get into new areas, it feels similar. The smart home tools of today, for example, are very basic and require a lot of troubleshooting.
I'm a bit older and didn't touch a computer keyboard (actually a teletype) until late in high school and didn't even do much with computers in college. (I'm an engineer by training but not CS.) I was working when I got into PCs, initially as a way to do some engineering calculations and project tracking on my job.
I agree that although consumption is the norm, the options and information available for those who want to go under the hood is pretty amazing today.
I remember having a hard time searching for information when I was young, as it was pre-internet and I was reliant on my parents on driving me places like the library to access information.
Nowadays, if you want to learn the ins and outs of something, you can order books online or watch informative video tutorials made by people who have been in the tech field for a significant amount of time.
Desktop computers still exist, and you can still order the parts to make your own. You can even buy cool glass cases where the parts are exposed and all light up.
Raspberry Pis are essentially free, and can be used for all kinds of interesting stuff. Do that.
See also Razer Synapse, the Elgato Stream Deck, and programmable keyboards, all of which still exist.
Phones are not the only computers.
OTOH, my dad did a significant amount of major renovation work with the fixer-upper house I grew up in.
A lot of it comes down to choosing how you spend your time and what skills you choose to develop. Because you probably can't do everything yourself.
The difference is Tesla won't sell you technical manuals, which are reserved for their chain of authorized service shops.
See, e.g. this thread: https://teslamotorsclub.com/tmc/threads/does-teslas-parts-an...
E.g. Helm is the official publisher of OEM technical manuals for at least 23 vehicle makes--the same manuals used by their respective authorized service shops. Something happened circa 2012-2013 that caused most of these manufacturers to stop allowing direct purchase of their service manuals by the general public, including Acura, Lexus, and Honda being the 3 that I've kept a close eye on over the years.
The people using computers seem to be the same people who mindlessly click on anything with an OK button on it. Then their entire bank account is drained and they cry innocence.
Anecdotally, several people who I rode with 20–25 years ago who were >50 at the time were aggressive and reckless drivers, much more than anyone I have ridden with recently. My general impression is that such aggressive driving used to be much more common than today. Even if they had fast reaction times and good control of steering, I would hardly call it “good driving” for public streets.
If you want to see this style of “good driving” try riding in taxis in China or some other developing country. Dudes with amazing steering and reaction times shaving every possible second off by pulling risky probably illegal stunts.
The only difference is that vehicle manufacturers have convinced ignorant consumers that technology has advanced so far that only a "professional" can do it, i.e. those who hold a copy of the official technical manuals.
I'm a Linux guy, but why do you think people buy Macs?
My zines, if you're interested.
A bit of this lives on in porting e.g. newer versions of Android to hardware which has been left behind by the manufacturer without a shred of documentation or care. If and when one or more of the alternative mobile operating systems get a foothold this might open another avenue for this type of hacking.
I'm not sure the two compare so well. Kids that fiddle with Linux do so out of curiosity. For the other kids, the closest you'll ever get to config file fiddling might be when they're forced to use Linux -- which could happen, for instance, when their techie parent refuses to have a Windows box or a Mac in the house.
By contrast, take any household with a Windows box and a few kids. The latter will want to play games on the former sooner or later.
All the complaints about kids toys being too dumbed down have been played out before.
Edsger Dijkstra complained that Americans were so rich that American so-called computer scientists built and played with computers too much, instead of doing the computer science that the Europeans did.
I also miss messing around with hexadecimal editors to cheat in games.
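The classic trick can be sketched in Python. The save-file layout here is entirely invented for illustration (a 4-byte little-endian gold counter at a known offset); the point is the byte-level patching you'd otherwise do by hand in a hex editor:

```python
import struct

# Hypothetical save file: an 8-byte header, then a 4-byte
# little-endian "gold" counter at offset 8.
save = bytearray(b"SAVEv1\x00\x00" + struct.pack("<I", 250) + b"\x00" * 8)

def read_gold(data, offset=8):
    # Interpret 4 bytes at the offset as an unsigned little-endian int.
    return struct.unpack_from("<I", data, offset)[0]

def write_gold(data, value, offset=8):
    # Overwrite those 4 bytes in place, like typing hex digits in an editor.
    struct.pack_into("<I", data, offset, value)

print(read_gold(save))      # 250
write_gold(save, 999999)
print(read_gold(save))      # 999999
```

Finding the offset in the first place was half the fun: save, change the value in-game, save again, and diff the two files.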
It's been many years. Computers have gotten far more complex and reliable. My uncle knew how to rebuild his car engine, I don't. Same thing.
Now I'm grown up, and cars have changed a lot. I have a tesla, and the only thing I can really service on it myself is the washer fluid. But it's also more reliable and doesn't need constant tinkering.
This was what came to mind reading this. I was of the "tinker" generation with computers, which have changed a lot since then. More reliable, more user friendly. I don't expect my children will have the same relationship with them I did, just like I didn't have the same relationship with cars my parents did. But maybe they will be the tinker generation of something else...
I can only talk about Windows, which is what I use extensively. I noticed that there are many things that I could do in Windows 7 (sometimes using non-Microsoft tools, like Open Shell) and that are not possible any more in Windows 10.
There are only two things that genuinely preclude owners from servicing their own vehicles (Tesla or otherwise): a lack of will, and vehicle manufacturers refusing to sell technical manuals to anyone other than authorized service shops.
Sure technology has appreciably advanced in 30 years, but the fact that technical details and know-how are scarce commodities in the age of information is not a generational coincidence.
Of course, this has always been the case; people just accepted (or didn't understand) the risks. The capacitors in an old TV could pack quite a charge. Working on a roof has its risks. Early chemistry kits let kids blow stuff up. Apparently it was more accepted back then.
That being said, I'd posit that having a set of technical manuals at your disposal would dramatically change this perception to the detriment of certain lucrative service markets. Everything breaks; the real question is who has the capacity to administer repairs, and at what cost?
Computers, for the majority of people, are now principally consumption devices, with some small amount of guided typing of tweets and chats (which I think I would argue is effectively consumption still, in much the way that I'd consider someone speaking on the telephone to be a consumer of that service).
I would not expect someone who watched a lot of movies to be good at making them; I would not expect someone who reads a lot of books to be a whizz when it comes to typesetting and printing; I would not expect someone who eats a lot of sandwiches to be competent at baking bread; I would not expect someone who watches Netflix and reads tweets to be good at configuring or programming the computer they're using to do it.
I (as a computer scientist) have never heard people actually working in tech refer to themselves as that.
As the linked article suggests though, it's not clear it's a particularly useful term. The digital immigrants are often more tech savvy in at least many respects than the natives because they once had to be. And most of the people I work with in tech have had absolutely no problem fully incorporating the rideshare, social media, video/ music streaming, and mapping apps into their everyday lives even though many first encountered them as adults.
Are there age-based differences in media consumption etc.? Sure. But that's much more about generational preferences and habits than having grown up before the Web.
Meanwhile an electrical engineering professor may be capable of designing and diagnosing it, and know about all of the nasty design pitfalls like "if you try to make the cache too low impedance for low latency it can lose its DRAM state" or "wiring it this way will add noise to your analog signal", but have trouble operating the menu, or be unable to use a tablet with the ease of a toddler.
It is possible for both to learn the other side of course.
It seems arbitrary, but I think the reason is that non-technical users always feel most comfortable sitting at the exact level of abstraction of the interface, without translating it into what it actually is in computing terms. A folder is something you recognise: you put files in it, and there's even an icon that looks like a folder. A file extension is the 'type' of the file and controls the program that will open it. That's all you need to know.
I'm not sure those non-technical users have got less technical with time. I just think the abstraction has moved up a level: from the OS and file storage being something they had to think about regularly, to apps now mostly handling storage and files for you and presenting the data through their interface (i.e. the concept of a file being separate from the app that uses it is going away somewhat).
The people that were dealing with folders and file extensions may seem now in retrospect to be more low level, but they weren't. That was just the interface/abstraction available to them at the time. I don't think they had a better model of what was going on underneath than someone today who just taps app icons etc.
So I'm a tech savvy parent. Yet I've noticed some age differences. My kids are "quicker" at picking things up. They can notice a tiny detail on a big screen, and are more likely to recognize and remember the meaning of an icon, whereas I have to hover over it with my mouse pointer and hope for a tool tip to pop up.
So things like GUIs will seem more intuitive to them, when the real advantage is their ability to quickly distinguish a bunch of tiny abstract symbols.
That's exactly what the article above is saying. The kids aren't learning any problem solving skills. They may be better at following social media trends, knowing the latest memes, and finding the latest hot new app, but that has nothing to do with technical aptitude.
The fact you show no interest in it is more a reflection that you got through the dreck of surface-level vapid entertainment all kids are subject to, came out the other side, realized "hey look, there's stuff that's a lot better / more interesting than that", and unsurprisingly gravitated towards that over the next iteration of entry-level disposable culture targeting kids.
That’s how you become an old person who knows nothing about tech: you keep being good at your tech and don’t consider the stuff you aren’t good at to be tech. Eventually that other tech is all there is, and you are now your dad.
* Here's What Happens When an 18 Year Old Buys a Mainframe - YouTube || https://www.youtube.com/watch?v=45X4VP8CGtk
The other is that most devices are increasingly geared towards consumption and less towards experimentation and creation. On one hand, I used to do crazy things with assembler, with direct access to ports and interrupts, that no current OS would allow. On the other, the shift towards phones has greatly changed (and IMHO, mostly reduced) what we do with our computers.
The good news is that the info is out there for the ones who do want to learn; we didn't have that privilege. Still, these are a minority, and in my experience, the average 18-year-old preparing right now to start a degree in CS (or whichever career centered around programming) knows way less about computers than we used to at their age.
I started out learning how to program in the Windows 95 era to make games. I learned to use inline assembly to set the video mode, and then to write directly to video memory at 0xA0000.
You can't do that anymore on modern Windows even if you knew how. Today if you wanted to start game programming, you'd probably read a bunch of articles recommending Unity, which doesn't even easily support C++ FFS.
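For anyone who never saw it, the old mode-13h style of direct framebuffer access can be mimicked in plain Python: the 320x200, 256-color screen was just a flat array of bytes, and plotting a pixel was arithmetic on an offset. This is a pure simulation for illustration; on DOS the bytearray below would literally be the memory starting at 0xA0000:

```python
# Simulated 320x200, 256-color framebuffer (mode 13h layout),
# one byte per pixel, rows stored back to back.
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT)

def put_pixel(x, y, color):
    # Row-major layout: each row is WIDTH bytes long.
    framebuffer[y * WIDTH + x] = color

put_pixel(160, 100, 15)                 # a pixel at screen centre
print(framebuffer[100 * WIDTH + 160])   # 15
```

That one multiply-and-add was most of what you needed to know to start drawing; everything else (lines, sprites, fire effects) was built on top of it.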
Doesn't it sound more important to have an IDE + fluid building so people don't have to care about the details?
This is the biggest difference between then and now. In the 70s and 80s computing was all about details, all the way down to the hardware, which most developers understood at the register level.
8-bit amateurs grew up with simple toy machines. They learned enough detail to allow a natural transition to professional development on mainframes and minis - far more detail and more complexity, more of an OS to deal with, but with a recognisably similar outline.
Now most development is more like LEGO - clip moving parts to other moving parts in a slightly precarious way and hope nothing breaks.
It's a completely different way of thinking about computing - kit and cookbook based, with less space for original creation and problem solving. This is partly because the details are hidden and can't be changed, but also because the ethic of experimentation and creativity based on deep domain knowledge driven by curiosity isn't the same.
The whole point of the many abstractions between the user and the hardware is so they don’t have to care about the details.
Yes, the abstractions mean people can do more without caring, which in turn means more people can use them, but it comes at a cost too.
It doesn’t come for free and ultimately other things might be more important.
You can call it a business decision, but that basically means the tech people are doing the work and making our execs rich without actually creating anything that's more beneficial to humanity compared to the computer systems of 10 or 20 years ago.
Think back to the computers of the early 1980's. It was quite easy for someone to pick up BASIC and create simple programs. Their understanding of programming will be basic and their understanding of the machine will reflect that. That is quite similar to today.
Someone who whetted their appetite will move on to more advanced concepts. With early computers, that typically meant learning about the hardware or learning about algorithms. You either had to squeeze more performance out of your software or you needed to extend your reach beyond the language (and libraries). Today's programmer is, more likely than not, going to dive more deeply into libraries or learn more about their programming language. Yes, there will be exceptions based upon the domain that interests them or what they wish to gain from their learning.
Let's say that they decide to dive deeper into how the computer works. In the case of an early personal computer, the task was relatively easy. From a programmer's perspective, almost everything can be viewed sequentially and shuffling data across a bus. If they were more interested in electronics, it was relatively easy to see what was going on by looking at a circuit diagram or the board itself coupled with reading (admittedly more difficult to obtain) datasheets. With that knowledge, it was possible to modify the existing hardware and practical to build new hardware.
The notion that we are abstracting our understanding further and further from the workings of the machine, while letting that limited understanding decide what we can do with it, isn't exactly a new one. My first real encounter with that idea was from one of the pioneers of digital computers, who described the machine as fundamentally physical and analog. In other words, making the physical world digital is in some sense an abstraction in its own right.
Granted, the article's author had relatively little to say on the fronts of programming and electronics design. It was more of a critique about younger people using software with very little understanding of its features or how it works, the sort of thing that can get you into trouble when things are not working as expected. While I disagree that this problem is specific to this generation, it is a solid reminder that the stereotype that younger people are better with technology is just plain wrong.
Certainly younger generations will always be more accustomed to interfacing with technology (this will never change). But that doesn't mean they understand it.
I mean, I suspect as much from my own experience, but I'm curious to know if there's actual research out there.