The thing is, the new generations don't have to work as much to make things work in their day-to-day lives. Most use iOS devices, which are dead simple and straight up don't allow anything complicated (it just works), and Windows has come a long way.
Not so long ago things were breaking almost daily; you had to dig into things to make them work, and as you did that you were learning how to fix things, how to research issues, &c.
I remember struggling with torrent clients, cracked games, dodgy drivers, unrecognised CDs, building my own PC, &c. Then in uni I was toying with Linux distros; I probably spent more time hacking around to make them work than actually studying, but it taught me a lot of things, things that were useless on the spot but are helpful in the long run.
> I remember struggling with torrent clients, cracked games, dodgy drivers, unrecognised CDs, building my own PC, &c. Then in uni I was toying with Linux distros; I probably spent more time hacking around to make them work than actually studying, but it taught me a lot of things, things that were useless on the spot but are helpful in the long run.
You do realize only a tiny tiny subset of kids were like that back in the day, right?
>You do realize only a tiny tiny subset of kids were like that back in the day, right?
Not if you grew up in some post-communist/Eastern European country in the '90s to '00s and wanted to play pirated games on the family PC. Except for the Linux part; no kids really cared about that.
All that tinkering to obtain pirated games and make them work made a lot of kids proficient in PCs and Windows internals. Almost every 15-year-old back then knew at least something about .RARs, ISOs, .DLLs, .cfg and .ini files, IPs, network settings, registry settings, Direct Connect, CD burning, anti-viruses, binary patches, driver installs, etc.
I remember some article from about 15 years ago arguing that the software industry and computer proficiency in Eastern Europe blew up so quickly, despite the region being impoverished compared to the richer West, because kids there had to fight with complex systems to play videogames, whereas kids in richer countries would just plug a cartridge or disc into the console and hit PLAY.
Not every kid back then was into tinkering and hacking; we only wanted to play games. But those of us who were went into tech as adults, and the childhood experience of PC problem solving definitely helped.
I think part of it is that current technology is mostly interacted with via software, without worrying about how it is actually implemented. The article mentions not knowing that a paper to be copied needs to be placed upside down so it faces the copier glass. It's fun to laugh at that, but younger people aren't making that mistake "because they are stupid"; they see it as just a weird user-interface fail. If you think about how a copier works internally, obviously it can only copy things facing the glass, but if you are not used to thinking about the hardware, that just seems user-hostile.
And, in fact, it depends. I have a scanner at home that is basically a camera with an LED light that takes a picture--and in that case, the page obviously goes print side up. Obvious? Well, sort of, but you need some mental model of how things work.
I have a great deal of sympathy for anything printer/scanner related. This Oatmeal comic comes to mind [1].
However, I have heard anecdotally that recent university class intakes for CS have been encountering the death of the desktop metaphor as a problem. I'm aware there are more instances of students (who have chosen to do CS) not knowing the first thing about hierarchical file management, folders, etc. It's definitely possible to get through secondary education using only a phone or perhaps a tablet. It's then totally unsurprising when people (other than PC gamers) have very little intuitive understanding of desktop computing.
I don't know how much of a problem this really is, as people are generally fast learners, but it's a weird thought as someone who has used a "desktop" (laptops mostly) computer every day since about age 12.
I would caution you of rose-tinted glasses. Even during "our time" when computer literacy seemed universal, in reality we were just a small handful of nerds and the vast majority of the world didn't computer.
We're over a fifth of the way through the 21st century now, and the vast majority of the world still doesn't computer while we still remain a small handful of nerds. The only difference is the generational turnover in the workforce, with more of us nerds getting into the field wondering why nobody computers.
Yeah, there's been this common belief that "you kids are good at computers" throughout my life. Yes, the kids aren't afraid of computers in the way that some people in my parents' generation are, but there's still a massive spectrum of computer ability within each generation, and it corresponds more to mindset than age.
People who are good at computers have good mental models of "systems" and aren't afraid to experiment. That's pretty much it. The individual pieces of knowledge come and go.
An Apple interface evangelist wrote a book years ago (Tog on Interface). It's not really specific to Apple, and you can take or leave the Myers-Briggs data he used to bolster his argument. But it's almost certainly directionally correct that the engineers developing the software have a much better mental model of how the whole thing works and interacts than many of the users of that software.
That's a very valid point. I remember having to help my less tech-savvy friends constantly at university.
You're absolutely right that it's never been universal in the workplace either. I have experienced endless incidents of people fiddling with projectors etc.
I think my point is more that it's strange to me that people are ending up in CS programs without knowledge that certainly felt universal during "our time". I don't know how much it's a real issue though, as if you've selected CS as a course, you'll probably get up to speed fairly quickly.
Or they switch majors if they decide it's not for them.
As I was writing the other day, I'm not sure why CS as a technical discipline should uniquely require more than a standard high school curriculum in order to major in at university. And the fact is that very few high schoolers have learned much computer science even if they've hacked around with computers and Python a bit.
> You're absolutely right that it's never been universal in the workplace either. I have experienced endless incidents of people fiddling with projectors etc.
I like to think of myself as a pretty tech-savvy person (I have held director-level positions at FANG companies), but having interviewed again recently, I still end up resorting to quitting and reconnecting when there's no sound on my Google Meet/Zoom meeting. With 20 years of experience I have no real idea how to start debugging the problem. Maybe the problem wasn't with the folks struggling with the projector?
Yeah. I can assure you I've been to many events where you end up with a cluster of very technical people who present regularly and an AV tech struggling to get a projector working with someone's laptop. It's much better than it used to be but you still have random failures.
The desktop metaphor is still my favorite way of doing computers. Android is still heavily influenced by it; they hide it just enough, mostly with the hideous "data lives in apps" idea that's more the fault of cloudmania than of Android itself.
An interesting effect, though the article can't avoid calling young adults "lazy" even when that contradicts the rest of it. I would say that office printers are complicated, and people usually learn how to use them by watching other people in person. Also, I would hope this leads to less crap being printed in general.
It mentions apps a lot, implying that users use apps because they are simple and can't be bothered to do anything complex. I would point out that Apple and Google spend billions and billions of dollars to get users to stay in the walled gardens.
To be fair, scanning from a mobile device has almost zero discoverability. Few mobile OSes ship with a builtin "Scan" app in the same way that they all have a "Camera" app.
What modern societies need is a tax on feature non-discoverability: essentially a financial incentive for tech companies to ensure that 100% of their features are meaningfully useful to their users and that users can actually find them. At that point, whatever is economical would fall out naturally, without extended arguments about the "right" or "best" way to do X.
> Few mobile OSes ship with a builtin "Scan" app in the same way that they all have a "Camera" app.
Android[0] and iOS[1] both feature official apps (and in the case of iOS, the app is included in the default install) that can scan documents using the device's camera.
>Few mobile OSes ship with a builtin "Scan" app in the same way that they all have a "Camera" app.
What is the problem? I've had many systems where you just send a picture to an email address, and from there to a server, where it got OCR'd and converted to a PDF(A) or whatever, then sent back to the sender as an attachment or link and archived.
If I emailed myself a JPEG from my phone I would absolutely have to look up the best way to convert it to a PDF. I don't know off the top of my head as I normally use a PDF scanning app.
Thinking about it, it's actually a bit odd that this isn't a common native phone function, as capturing PDFs from paper is an incredibly common thing for people to need to do these days.
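For what it's worth, the conversion itself is tiny if you happen to have Pillow around -- a minimal sketch, with hypothetical file names (scan.jpg, scan.pdf):

    from PIL import Image  # Pillow, assumed installed (pip install Pillow)

    # Hypothetical file names, purely for illustration.
    img = Image.open("scan.jpg").convert("RGB")    # PDF can't embed alpha, so force RGB
    img.save("scan.pdf", "PDF", resolution=150.0)  # writes a one-page PDF of the photo

Of course that does none of the cropping, de-skewing or OCR that a proper scanning app does, which is presumably why the dedicated apps exist.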
I've been working in an office environment for over a decade now, and every time I start at a new office, figuring out the scanners/printers is always a hassle. This really isn't a Gen Z thing so much as a scanners-and-printers-suck thing.
> "How would they know how to scan something if they’ve never been taught how to do it?"
I'm not surprised that came from a professor of education. Leave it to a teacher to protect you from bullies by saying out loud to the whole class exactly what your bullies were whispering to mock you.
Snark aside, in all seriousness, this seems like an article about nothing. Young people are going to get a little bit of snark from older people who are jealous of their youth and hipness, then they'll learn how to use the copier, and nothing of note will have happened.
There isn't going to be a follow-up article in twenty years about law firms going out of business because all the workers who can send a fax have retired. People who can get a printer to print double-sided and collate will not be working into their 60s at inflated wages like Cobol programmers. It's just a bit of fluff.
If I remember right, there is a movie famous among programmers where a group of developers becomes so frustrated with a printer that they brutally destroy it with a baseball bat.
It's not just a Gen Z thing. I used to look after printers in a past job, and the little shits are still unreliable bastards to deal with.
At the current $office, there is a team that works full time making sure the printers are working; even so, they still fail to actually print properly a quarter of the time.
Printers, despite (or perhaps because of) being literally the first computer peripheral, are still utter bastards to use.
It's been nearly 20 years since I declared "I give up on printers, they are a failed, corrupt industry." Best decision of my life, as they literally are sent from hell.
My printer (a TS8220) is fine. I'm not sure why everyone hates them. Like, in a big office a few will be bad, and they'll accumulate wear if used all the time, but a home inkjet is fine unless you let the nozzles dry up, and they seem to stay fine for months.
I sometimes work with students too, and the level of general tech knowledge is very low indeed.
I understand not knowing how to use a copy machine, since they don't have one at home. But not knowing how to change notification settings in apps, or how to turn off spammy notifications in general (on Android), is something completely different. They have the device in their hands for many hours per day, the spammy notifications come daily, and yet they don't even try to figure out how to disable them.
Privacy settings, again, are something they have no idea how to use or set.
Stuff like scanning multipage documents into a single PDF (there are many, many apps to do that) is a revelation to many.
And I'm not talking about 14-year-old kids here, I'm talking about engineering students.
> For most people, the cost to figure that out is higher than the cost of ignoring notifications and default settings.
It's not, because they don't try to evaluate the time savings realized by disabling those notifications, because they don't possess the critical thinking skills to do so.
Or, some updates were pushed to the OS and they gave up, because it reset their settings.
That's not usually what I've seen in offices. Instead, it's some utterly brain-dead naming scheme that the IT department came up with, where the only sensible part is that, usually, the printer's name has the floor number in the name, but aside from that it's just some random alphanumeric crap that's completely unhelpful for actually finding the damn thing. The best way of figuring out where to print is to take 30 minutes walking around your floor, looking for publicly-accessible printers, and writing down their names, and then going back to your PC and looking for those.
And yeah, in some places you really have to use the printer's control panel to find out the printer's IP address.
They ARE complicated and this is compounded by the fact that you almost never need to use one-- until you REALLY need to use one. At that point it becomes a race to print to the right printer (because they're almost never named in a logical way). Or worse, add a printer that the IT department installed months ago and you had ignored until you needed to use it. And finally, diagnose problems and fix jams which can be an interesting exercise with varying degrees of frustration depending on the model of the printer and its condition.
There are a lot of comments here saying that printers and scanners are complicated! Really? More complicated than your microwave, fridge freezer, television? More complicated than your car?
Printers and scanners aren't complicated, it's just that we don't really need to use them much anymore... until we have to. 99 times out of a hundred I don't need to scan because I can just take a picture on my phone of any hard copy document. I don't need to print because people are happy to receive an electronic document by email.
If I never had to drive again, I'm pretty sure I'd forget how to operate a car after a few years. If I'd never operated a washing machine in my life, I'm pretty sure I'd struggle to put on that first wash. Every time we have a power cut, and my oven refuses to work, because the clock is not set, I just push buttons randomly until it works. Instructions for resetting the clock are just not worth remembering, as it only happens once a year or so.
There's nothing complicated about these devices. I used to bootleg VHS tapes by hooking up two player/recorders. That's a skill that's completely useless now. These devices are just becoming obsolete, so we don't bother learning how to use them. I'm pretty sure I'd be unable to start and drive a Model T Ford, and I don't see any shame in that.
> More complicated than your microwave, fridge freezer, television?
Absolutely. None of those tends to crap out just because you looked in its direction!
> There's nothing complicated about these devices
Printers and scanners have had atrocious UX for as long as I've lived - going from driver installation, to the accessory software, to poor hardware build quality and questionable decisions by the makers that reek of planned obsolescence, poor regard for consumable economy, and other nefarious ploys. This applies to both the home and professional (office) markets in my experience.
It's only in recent years that I've seen some printers or scanners where I've said to myself: "huh, you know what, this one ain't half-bad".
Printers and scanners are massively more complicated than my microwave. It has two physical dials and a hefty spring-loaded button to open the door. It doesn't speak USB or any network protocol, it doesn't have ink cartridges or paper to replace, it doesn't link up to a computer or smartphone…
>> There are a lot of comments here saying that printers and scanners are complicated! Really?
So I recently bought a small printer - I think an Epson WF-110 - hoping to connect it to my Linux computer. I was worried because it came with a driver disk for Windows and macOS but made no claims of Linux compatibility. Once I figured out how to get it on my Wi-Fi network (easy), it just appeared on Linux and I was able to print to it with ZERO setup on that end. Easy, right? Nope. After sitting unused for a few weeks, it became unresponsive. I tried turning it off, but the power button doesn't seem to do anything. Did I mention it's wireless and battery powered for total portability? Unplugging it didn't turn it off. So I left it unplugged during a 2-week business trip to see if it will power up and function again when I get home, assuming the battery will have died by then.
TLDR: Older guy couldn't turn off a printer that stopped responding...
Well, my microwave never releases lovecraftian horrors instead of warming my food, whereas my printer regularly produces ancient runes instead of my documents.
It's about time some engineers step up and build a modern, functional and repairable printer. I don't care that it's not financially feasible. Just make it feasible. How are people still buying printers from companies that spend all their ink cartridge money on marketing and more obsolescence?
Or just buy a cheap laser printer, connect it via USB, and the toner cartridge (and drum) will probably last you years with doing just the occasional printout.
My 12-year-old cheap HP Color LaserJet printer, which I paid 99€ for, is still going strong, on the original toners... which just tells you how much I actually use it. The colour printing is slightly off, but I never do much of that anyway. And it printed today's shipping labels acceptably. Not perfect, but good enough.
Amazing, actually... A bit earlier I thought it had finally broken when I failed to print, but the USB cable just wasn't connected...
I rarely need color; if I want photos I'll send them off to be printed anyway, which is basically never. I'd probably buy a color laser printer today anyway just because they're pretty cheap.
I have the space and I don't print a lot. But it's sometimes convenient and I sure don't want to drive 20 minutes to Staples every time I actually need to print something.
I actually consigned my color inkjets--including a large-format HP photo printer--to purgatory (aka my attic) because the ink dried out and I'd have been out over $100 for a refill.
I think Stallman wanted to do that but somehow ended up with the FSF and GNU instead, which AFAICT didn't significantly affect the quality of printers either.
While the "incompetent youth" trope is trite, I do feel we're past "peak learning curve" on systems for youth.
I think photography is an even better example than scanners: so many things that were once super hard are now literally point-and-click. However, what was once an easy growth path now requires a large upfront investment, which most people pass on.
The unfortunate casualty is the hobby-ness of the occupation. If doing things requires so much premeditation, people start treating it as a job/chore, and lose an opportunity for joy. There's nothing wrong with that, but it is a shame.
> The first time he had to copy something in the office didn’t exactly go well. “It kept coming out as a blank page, and took me a couple times to realize that I had to place the paper upside-down in the machine for it to work.”
I had never thought about it until now, but from a UX perspective would a copier be better if you could scan face-up instead of face-down? Regardless, I can imagine the disappointment of not being able to scan a butt.
>I had never thought about it until now, but from a UX perspective would a copier be better if you could scan face-up instead of face-down?
You can. Any decent office printer does exactly this: you put a stack of papers in the auto document feeder and press a button and it scans all of them, double-sided if you want. Even simple photocopiers were doing this back in the 80s and 90s.
On the other hand, it should be reasonable to figure out, if one were to stop and think about how it must work.
You have transparent glass on one side and an opaque white cover on the other... By the usual understanding, you can see through one and not through the other.
Being able to navigate touch interfaces made for general consumption does not necessarily mean knowing how the underlying devices operate.
See, as an example, the article posted here some time ago in which a uni professor laments the fact that his students do not know of the existence of a file system. Files live "in" applications.
How is this different from other generations that struggle with scanners? I’ve worked for almost 30 years and there’s always been a sizable portion of the office who can’t work scanners and copiers.
About 5 years ago, I was in a budget planning meeting and a senior level colleague (probably about 65) was advocating for hiring contractors to print out her meeting materials for her. She would have spent about $75k/year to print, collate, scan and organize paper. I showed her how to do these things through the print settings as our printer would do all that automatically.
I think printers are complicated because they can do a lot. Many people don’t want to go through training and don’t want to spend time going through and testing lots of nested properties. I don’t think it’s a gen z, or millennial, or gen X, or baby boomer, or lost generation thing.
Printers are complicated because they never work, their UI leaves a lot of guess work, you have to have some office lingo training to use them, and who knows if what the menu offers will work. Printers are special in that they’ll offer to copy in color but then you realize it was set up as grayscale by the vendor, or this one didn’t have the X option but the driver lets you do it, it’s the shittiest device and no one is stepping up to make one that doesn’t suck because people buy them anyway.
Given how much money Xerox poured into UI research in the 1970s, it's quite amazing that copiers still have terrible user interfaces. They did all that research, all that work, and the entire industry just forgot or decided that it didn't care.
To be fair: copiers are weird. They exist at a weird intersection of mechanical machinery and computers. I do feel like much could be fixed with better software, but that's not really interesting once the device has shipped. E.g. we have an IPv6-only office network. The manufacturer of our copier/printer did claim to support IPv6, but it had only ever been tested once, eight years earlier. No regression tests were ever performed. It's a little beside the point, but it illustrates the indifference towards the actual software embedded in these things.
For me the biggest issue is the lack of understanding of files and folders. Google Docs is used by, I'd say, most people under 25 today. It sucks; it teaches people to rely on search, and not even Google is good at finding stuff on that level. We use Google Docs at work, and as a result no one is able to find anything, partly because it's tricky to organize things in folders unless everyone is on board with that approach.
>For me the biggest issue is the lack of understanding of files and folders. Google Docs is used by, I'd say, most people under 25 today. It sucks; it teaches people to rely on search, and not even Google is good at finding stuff on that level.
It pains me to say this, but honestly? I think it's an elegant solution to a problem that doesn't want answers.
The world at large has never, and I mean never, understood files and folders. Most people, both in the 20th and 21st centuries, create and leave all their files straight on the desktop. Teaching people about file structures never works because they don't care, far as they're concerned they have more important stuff they need to be dedicating their brain cells to.
So the wiser(?) computer nerds giving up on trying to teach file structures, and bending the knee to what the world at large wants (a single place to store data and a quick way to find things), is an elegant solution.
> The world at large has never, and I mean never, understood files and folders.
That doesn't mean we shouldn't use an analogy to organize information. Most people don't understand the Dewey Decimal system, but that doesn't mean libraries of the future should abandon it.
Files and folders are at least teachable and correspond to many successful life patterns (e.g., kids put their toys in their cubby and their friends put theirs in their own cubby, which makes the toys easier to find the next day at school).
I would like to see more work done on graphs of folders instead of just hierarchies. When Google Docs first started, you could have lots of labels and organize the same file into multiple "folders." But I think they did away with this when they switched to Drive and had to represent everything as a folder structure.
I think an easy way would be to have lots of bins and link them together with lots of edges that could be hierarchies, topics or subjects. If I have to project it onto a file system, then I'd use lots of symlinks to put the same file in multiple places; a rough sketch of that is below.
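A minimal sketch of that symlink projection, with hypothetical bin and file names (and assuming a filesystem that supports symlinks):

    import os

    # One real file, linked into several topic "bins" so it shows up in all of them.
    real_file = os.path.abspath("archive/notes.pdf")  # hypothetical source file
    topics = ["taxes", "house", "2024"]               # hypothetical bins

    for topic in topics:
        os.makedirs(topic, exist_ok=True)             # each bin is just a folder
        link = os.path.join(topic, "notes.pdf")
        if not os.path.lexists(link):                 # skip if the link already exists
            os.symlink(real_file, link)

Deleting a link only removes the file from that bin; the real copy in archive/ is untouched, which is roughly the graph-of-folders behaviour I'd want.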
I tend to avoid too much Desktop pollution except to maybe keep something I'm referring to a lot. However... While I used to maintain a pretty granular file folder hierarchy, these days I mostly just dump things into an &Archive folder or one of a few other fairly generic folders and, yes, rely on search/knowing the approximate date.
And I don't really have trouble with finding my own stuff. It's finding docs that have been shared with me in part because things I really need are mixed in with a zillion agenda docs and presentations from some meeting I didn't really care about.
> It sucks, it teaches people to rely on search, and not even Google is good at finding stuff on that level.
It's weird how bad Google search is within Drive. Yesterday I was searching for my monthly expense tracker and couldn't remember where I put it. I searched for "form" and nothing came back. I searched for "expense form" and nothing came back. I sorted by recent and couldn't find it. I pulled up a list of all forms and saw it, as I had titled it "Monthly Expenses Tracking Form." I cursed Google, wondered briefly how search fouled up, and went about my day.
Thinking back now, I guess they're doing exact whole-word matching, so searching for "expense" won't find "expenses", but I'm not sure why it didn't return anything for "form" (a toy illustration of the guess is below). I guess I can start embedding SEO spam at the bottom of my own personal forms so I can find them.
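Just to make that guess concrete -- this is not how Drive actually indexes anything, only the whole-word idea in miniature:

    # Toy illustration: exact-token matching misses "expenses" for the query
    # "expense", while a looser substring match would find it.
    title = "Monthly Expenses Tracking Form"
    tokens = title.lower().split()

    def exact_token_match(query: str) -> bool:
        return query.lower() in tokens            # whole-word comparison

    def substring_match(query: str) -> bool:
        return query.lower() in title.lower()     # looser comparison

    print(exact_token_match("expense"))   # False: "expenses" != "expense"
    print(substring_match("expense"))     # True
    print(exact_token_match("form"))      # True, so the miss on "form" is still odd

Which would explain the "expense" miss, but makes the "form" miss even stranger.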
> Printers are complicated because they never work, their UI leaves a lot of guess work, you have to have some office lingo training to use them, and who knows if what the menu offers will work.
Why is the Office Space printer scene such a hit?
...We have all had experiences where we worked ourselves to the bone to complete a document/spreadsheet/presentation/program, only to experience extreme stress when trying to deliver it because the crummy printer wouldn't do its job...
I am skeptical of the lemma that "[Gen Z's] formative tech years were spent using software that exists to be user-friendly."
The only place this lemma is used explicitly is in the central argument:
"They may be digital natives, but young workers were raised on user-friendly apps – and office devices are far less intuitive"
The examples given in the article evidence software that exists to be easy to use, particularly to access content, but not to be user-friendly in and of itself:
"They grew up using apps to get work done and are used to the ease that comes with Apple operating systems."
"...apps like Instagram and TikTok are so easy to use that younger people expect everything else to be a breeze, too. When it’s not, they’re more likely to give up."
"'It takes five seconds to learn how to use TikTok,' [Simon] said. 'You don’t need an instruction book, like you would with a printer. Content is so easy to access now that when you throw someone a simple curveball, they’ll swing and they miss, and that’s why Gen Z can’t schedule a meeting.'"
These distinct purposes seem to be conflated in the article. The article also seems to conflate user-friendliness with intuitiveness, and asserts that desktop computing and office devices are less intuitive.
A critical aspect of user-friendliness that is absent in the article is freedom: "that the users have the freedom to run, copy, distribute, study, change and improve the software" [1]
The difference is clear when you substitute "easy to use" for "powerful and reliable" in this quote: [2]
"The idea that we want software to be powerful and reliable comes from the supposition that the software is designed to serve its users. If it is powerful and reliable, that means it serves them better. But software can be said to serve its users only if it respects their freedom. What if the software is designed to put chains on its users? Then powerfulness means the chains are more constricting, and reliability that they are harder to remove. Malicious features, such as spying on the users, restricting the users, back doors, and imposed upgrades are common in proprietary software..."
The article gives an example of a user killing a laptop by repeatedly accepting a pop-up without reading it. I found this passage from the article interesting:
"Dell used its own survey of respondents between the ages of 18 and 26 to find that 56% of respondents said “they had very basic to no digital skills education.” A third of them said their education had not provided them “with the digital skill they need to propel their career”. What they know comes from the apps they use on their own time, not the tech supplies at Office Depot."
I'm curious if some of this would be better explained as user behavior being optimized for easy to use but unfree software.
> Not so long ago things were breaking almost daily; you had to dig into things to make them work, and as you did that you were learning how to fix things, how to research issues, &c.
> I remember struggling with torrent clients, cracked games, dodgy drivers, unrecognised CDs, building my own PC, &c. Then in uni I was toying with Linux distros; I probably spent more time hacking around to make them work than actually studying, but it taught me a lot of things, things that were useless on the spot but are helpful in the long run.
It's a bit like car drivers: they all know more or less how to drive a car, but they have absolutely no idea about the internals, not even the most basic stuff. Or bicycles: https://twistedsifter.com/2016/04/artist-asks-people-to-draw...
familiarity != knowledge/understanding