Hacker News

I know I'm ancient for saying this but....

""" led managers to prioritise digital natives for open roles, believing they are more adaptable than Gen Xers """

I'm an old millennial, and there's truth to the joke that we have to help our parents AND our kids fix their printers. I have found the complete lack of fundamental computer skills in younger candidates ("what is a directory? Why would I ever need the terminal?") to be a challenge.




Kids in our time used to sit in front of computers, nowadays they are glued to their phones and phones have neither printers nor directories... (don't 'well, actually' please)


I think the “printers” or “directories” thing is a red herring.

The problem is that computers could be configured. There were commands and scripts that you could run to fix things. So basically there was something to troubleshoot. You learned how to figure things out.

Phones today have almost no configurations, and there are no scripts or commands to run to fix things that might be broken.

There’s absolutely nothing for kids to learn other than switching it off and on again.


Sure, because you've been limited to a high degree. Apple showed the world you can treat your customers like thick-headed babies and they'll be thankful. Android is following suit, slowly eroding the freedom we once had and the invitation to do more than the default.

Look at this, for example https://developer.android.com/about/versions/10/behavior-cha...

Why? Let me handle permissions. Don't choose for me whether I can chew steak or not.

End of the day, we're trained to be idiots. School teaches us nothing. Products treat us like babies, strengthening idiocy. I mean, if a device breaks, we call the service guy and don't try and repair it ourselves, right? Because "changing a battery is life threatening. It must be outlawed." (- Apple) People are convinced, by ads and media campaigns, that all this is good. People are convinced they are using luxury devices.


My more cynical take is that Apple fans have lives and they don't live for computers, as I do.

They want the PC (a 'smart' 'phone') to get out of their way, and if it doesn't work, they'll just go do something normal people do, like a platonic dinner party or a drinking party.


> normal people do, like a platonic dinner party or a drinking party.

Hopefully more food pictures & selfies will emerge


I'm struggling to think of a reaction besides "what a textbook case of Stockholm Syndrome."

We certainly agree that there has been an erosion of computer skills among the younger generation(s) raised on appliance-like computing devices. No argument there. This is a loss. Full stop.

Where we might differ is the question of whether or not this is a good thing in the larger context. The fact that everybody had to become subject matter experts in personal computing just to be effective with a personal computer in the 1980s and 1990s was a byproduct of the fact that personal computing was incredibly buggy and user-unfriendly for multiple decades. If everything in life were as balky as computers were, we'd never get anything done as a freaking human race. How many f'ing hours did we waste as a human race just getting printer drivers to install?

I mean, think about all the hours you spent just getting printers to work and PCs to connect to the internet back in the day. It was rewarding in its own way, and it led to careers for many of us.

Now imagine if everything was like that. Imagine if you couldn't use your car, the HVAC system in your house, or kitchen appliances effectively without hundreds of hours of ad-hoc experience gained through troubleshooting.

For those that want to get their hands dirty, there's Linux, and craploads of free development tools even on the Mac. And for those that don't give a crap, computing devices function pretty well as appliances these days.


I agree completely. I'm in my 40s and, at this point, on this topic, I feel like Cypher in the Matrix. Plug me back in and I don't want to understand any of this shit anymore.


I want to understand it but only when I am making something that requires me to understand it

I don't want to understand it when I just want to like, print out my car insurance card or do some mundane life task


> Phones today have almost no configurations, and there are no scripts or commands to run to fix things that might be broken.

I agree. To put it in somewhat starker terms: it might be your device, but we have rapidly eliminated ownership of the devices. It's not even all the War on General Purpose Computing; there are so many factors here: an expectation that apps and the OS just work and are simple, the need for security sandboxing, and most of all, IMO, the shift of more computing/resources into the cloud. Connected services where the app is a thin client to far-off systems are a huge shift away from the ownership model of computing we briefly had.

It's hard for me to imagine what exactly might trigger a resurgence of the Personal Computing model. But computing, for the time being, is no longer ours, no longer humanity's; its heart has effervesced up into the cloud.


I think “configured” is a red herring.

Computers were and are an abstracting interface to functionality. Historically the abstraction leaked frequently, and the user had to make various adjustments.

It could be argued that nowadays the leaking has been reduced to an extent that users can fully embrace the abstraction. That is not quite right; counterexamples include "after two years of browsing my computer became slow, need a new one" and "I can't use that website, it is full of popup adverts".


> Phones today have almost no configurations

My friends’ kids are plenty hacky on their phones. Not in an 80s sense. But in the configurations, with knowing various apps and websites through which to uniquely interact with the world, et cetera.


Yeah exactly. They know how individual websites and apps work, but have very little insight into how the actual device and OS works. The browser is the OS, basically.


> have very little insight into how the actual device and OS works. The browser is the OS, basically

As many of us have little clue how the hardware works, at the fundamental level, today. I am analogising the hacking we’re being nostalgic about with this higher-level hacking they’re doing today.

It’s still hacking, and it’s incredibly sophisticated. It’s just that the layer that was at the forefront decades prior is no longer today. Someone has to maintain it, but it’s not where most opportunity is.

This is a recent view of mine. But LLMs have set a cap on the relative value of pulling up a terminal.


That’s the same for basically everything. There’s a bit of writing that highlights how modern life is so complex that no one actually knows the full process of how a modern mass-produced pencil is made. It’s just millions of people in their own bubbles, each working on the little piece of the supply chain they understand.


Also compare opportunities to open things up and see how they work... 80s, 90s, now...


It’s not the end of the world though. My parents (who only knew how to use a TV) could and did learn how to use a computer. My kid (who only knows how to use a phone) can definitely learn how to use a computer. These aren’t mystical devices that only (current) 30-55 year olds will ever know how to use.


You’re explaining why the kids are bad at computers… I agree.


I think there was a sweet spot when your home computer didn’t have internet. You’d “use” the computer and there would be very little to do other than snoop around, fiddle with settings and try to make it do something interesting. When my computer is internetless today it’s useless. It was the same then but somehow I spent hours fiddling with it. It was the same for even less curious people. If you got a CD with some software and it wasn’t working right you had to just fiddle with it until it did. You couldn’t google the answer. You learn a lot through trial and error.


Generally these stereotypes about various generations are highly questionable (as any stereotypes, of course). My nickname includes my birth year, so I'm even older than you, but I still grew up with computers and know more about them than many of the kids whose parents put an iPhone/iPad in their hands as soon as they could hold it...


There is a disturbing number of young software engineers who have never installed or used Linux, and who don’t want to learn Python or PowerShell for their job.

They stick to what they know (usually C# or Typescript) and they have problems if they need to use a new tool or language.


Most juniors where I work spend their days delivering nothing and telling me how shit everything is, how they already know everything, and how, if they could just use some other language or framework, they’d be productive. After all that, they go generate a bunch of crap with Copilot that never comes close to landing in production, and most other mid and senior engineers just quietly do their work for them.

Some of these people are in their 40s and are new to the industry, so I'm not just picking on "younger" people here. It's just a different attitude. I think the barrier to entry is getting higher, as we have to know a lot more these days.


When I was in university most people didn't know and didn't want to learn how to do that, so a couple of buddies and I helped the rest of the folks with those needs.

We were in an EE/Telco degree where we had to program in different languages and configure plenty of different software, so knowing how to solve that kind of problem was relevant to the kind of work they were going to do in the future.

The reality is that most people are (or were?) in those degrees as a mere formality to get a good-paying job, and their interests are a secondary matter.


This is the same story that I lived in the late '90s. My buddies and I would install every different distro and run through an LFS build, and that has turned into a very niche embedded-systems expertise that is exactly where the industry is now. Every system has Linux, QNX, or Android. The university was so small that when we went to the IT department to "register" our computers for the network, the IT guys were excited that we were running Linux.


It was always so.

24 years ago, I had to do my computing A-level[0] in Visual Basic because that was the only language the teacher knew.

[0] UK thing: https://en.wikipedia.org/wiki/A-level


A few times I've had to push juniors into a new language by having them do something very simple in a system written in a language they hadn't learned, and the result was they found it easier than expected.

At least for juniors, I'm pretty sure the problem for a lot of them is they remember how much effort went into learning their first programming language and are assuming the second one will be just as much effort. They haven't realized how much of that effort went into general concepts that do translate to other languages, making it easier to learn more.


> At least for juniors, I'm pretty sure the problem for a lot of them is they remember how much effort went into learning their first programming language and are assuming the second one will be just as much effort. They haven't realized how much of that effort went into general concepts that do translate to other languages, making it easier to learn more.

Programming languages have become ecosystems/biotopes. In this sense, learning another programming language is about thinking how things are done in (often subtly) different ways in the other language, learning about the whole culture surrounding it, etc. There aren't so many transferable principles.

Because many programmers are not willing to put in this effort when transferring to a different language, a lot of bad code is written by programmers who "never properly learned the way of thinking in this language/framework" and who thus write, say, "idiomatic Java code" in C++ or "idiomatic Go code" in JavaScript.


You're having trouble thinking at the level I'm talking about: variables, conditionals, loops, functions, etc.

All of these were new concepts to us at some point. I'm talking about people who are new enough to easily remember the effort that went into learning those basic concepts on top of learning the syntax for their first language.


> You're having trouble thinking at the level I'm talking about: variables, conditionals, loops, functions, etc.

Exactly my point: just to give some examples concerning your keywords:

- In many cases, for loops (with an index) became unidiomatic. Instead, foreach or range-based loops became the idiomatic way.

- In C#, in many cases, the idiomatic way is to use LINQ. The old style was to write LINQ queries as query expressions, but that fell out of fashion. Now you typically use the method-based query syntax instead.

- conditionals: there actually do exist programmers who argue that "if" is a code smell when you do object-oriented programming (google "ifless programming" or "anti-if programming"), since the idiomatic way in (class-based) object-oriented programming, when what should happen depends on some state, is to write a method. This line of thought has not become mainstream, though. On the other hand, for a related reason, Smalltalk, a programming language that takes object-oriented programming much more seriously than most other languages, has no "if" as part of the language; see for example https://www.quora.com/If-if-is-an-object-in-Smalltalk-how-is...

- functions: functions have now often become stateful (coroutines, methods, polymethods, closures, ...). On the other hand, there exist programmers (often from functional programming) who argue that functions as they exist in imperative languages are not "pure enough".
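To make the first point concrete, here is a minimal sketch of the same transformation written three ways. Python is used purely for illustration (the variable names are made up), and the comprehension stands in for the role C#'s method-based LINQ syntax plays, e.g. a `Select` call:

```python
words = ["ada", "basic", "cobol"]

# Old style: index-based for loop (now considered unidiomatic)
upper_indexed = []
for i in range(len(words)):
    upper_indexed.append(words[i].upper())

# Foreach / range-based style: iterate over the values directly
upper_foreach = []
for word in words:
    upper_foreach.append(word.upper())

# Declarative style: a comprehension, roughly analogous to
# words.Select(w => w.ToUpper()) in method-based LINQ
upper_comprehension = [word.upper() for word in words]

# All three produce the same result; only the idiom differs
assert upper_indexed == upper_foreach == upper_comprehension == ["ADA", "BASIC", "COBOL"]
```

All three versions compute the same thing, which is exactly the parent comment's point: what shifts between language generations (and between language communities) is which of these forms reads as "proper" code.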


You're still thinking at too high a level. I'm talking about "what is a function", not "all the ways functions can be used".


If you consider "what is a function?" to be a question without much subtlety, you end up with something that basically everybody who paid a little bit of attention in 7th-grade math class already knows.


And this just reinforces what I'm talking about: People who have done this for a while have forgotten what it was like.

I was a teacher's assistant in introductory programming classes for several years in college and I assure you people new to programming do have to put in effort for all the basic things I listed above, not just functions.


That is nothing new. I've met and worked with plenty of old software engineers who learnt C++ 30 years ago and don't want to learn anything new (not even modern C++) for their job.


Yes, and they didn't even "learn" C++98 or C++03 very well! Templates are a mystery to them.


Thank god. ;D

Also, you don't need anything other than C++. >:D It's the one lang that can do it ALL! But not taking the time to learn C++11+ is demeritable.


Anecdotally, I don’t think anyone who isn’t interested in computers has ever had an easy time setting up a printer.


I haven't had issues using a printer from my Mac in about 25 years. I haven't tried from Windows, but I still needed to find and download a driver when I last tried from Linux.


I bought a black-and-white laser printer 20 years ago and, much like you, I haven't had a single problem since.

My parents, in the meantime, have probably gone through 6 or 7 printers. For a truly cursed printing experience, here's my advice:

* First of all, make sure it's an inkjet. They give the best colour print quality, you see.

* Second, always be on the lookout for a good deal. Why buy a $400 printer when you can buy a $50 one? As far as you know it'll be junk in 2-3 years anyway.

* Third, get a wifi printer that doesn't have a screen, and make sure your wifi router doesn't support WPS. If the printer comes with an app, make sure it no longer works on your dated smartphone. Install the printer at the absolute edge of your wifi coverage.

* Fourth, make sure it's a multifunction printer/scanner/copier, and always install the manufacturer's full suite of software. You want to be able to access all the features, don't you?

Just follow these simple steps and before you know it, you'll have empathy with the many people who say printers suck.


Ubuntu has worked out of the box with every printer I've thrown at it since 2008 or so when I switched to Ubuntu. Literally tens of disparate printers from all manner of expensive laser printers to cheap no name ink jets. Both using USB and network interface.


Same here. My Mac and iPhone find my WiFi printer with ease. The only issues I have are printer-related: WiFi disconnects and waking from sleep.


It’s been less than 25 years, but I haven’t had issues printing from a Linux desktop in a long while. It’s as smooth as my apple devices


I haven’t used a printer in ages but I just got a Brother laser printer last month. I plugged it in and pressed the wifi button on the printer, then I pressed the WPS button on the router and it connected. The printer then instantly showed up in the printer list on my MacBook.

I don’t know how it works and I didn’t need to. It just worked. Optionally I could have also plugged the printer in with a usb cable which I assume also just works.


I think I have the same laser printer, the Brother HL-L23600W. Windows? No problem. Chromebook? No problem. Mac mini? No problem. Rocky Linux 9? Problem. I added the driver using the GUI and it finds the printer, but when I send a print job it just sits there pretty, doing nothing. Not even a print queue is displayed. I re-installed the driver several times without luck. I tried CUPS. No dice. I believe the origin of open source was a guy who wanted to code a printer driver for his own system, and the printer company gave him the code and said "go for it" because they did not have time for that. So no, the Brother HL-L23600W laser printer won't 'just work' on Rocky Linux 9 (kernel 5.14.0-162.18.1.el9_1.x86_64). The only gadget I own that just works is my Nintendo Switch.


Brother laser printers are such a sublimely simple experience. Quite honestly a contender for the best piece of computing-adjacent technology I have ever owned.


The miracle of mDNS and how it pissed off a bunch of Active Directory admins.


Perhaps so, but nonetheless the chronology of this hassle can be divided into two eras: our own ("USB") and that horrific prior era (LPT1:, COM1:, COM2:, ...)


Same. I used to be able to tell old people to go ask a kid to troubleshoot their e-mail software, but it seems to stop at my generation.


> complete lack of fundamental computer skills in younger candidates

Outside elite software engineering (as in the writing-papers kind of engineering), isn’t this becoming more an art than a skill with LLMs?


No, because your boss still wants you to do some report in Google Docs (and no, they can't use that either), or work with a spreadsheet in Excel, or print a PDF, etc.


> your boss still wants you to do some report in Google Docs (and no, they can't use that either) or work with a spreadsheet in Excel, or print a PDF

This is in a different category from printer debugging or launching a terminal.


Yes, but Gen Z struggles with those simpler tasks as well.


> Gen-Z struggle with those simpler tasks

Not in my experience. I’m sure every generation has its winners and duds. I'm just saying Gen Z’s winners (by being better networked) seem to have a skillset uniquely honed at X’s weaknesses, much as Millennials seemed almost uniquely crafted to compete against their Boomer counterparts (by being more efficient).


Good for you, but it's not the experience a lot of people have: https://www.reddit.com/r/NoStupidQuestions/comments/18mu8w3/...


Agree with this.

I had to teach Excel to apprentices when they joined. Just because they are digital natives doesn’t mean they can actually use business tools.


> just because they are digital natives, doesn’t mean they can actually use business tools.

As I explained in https://news.ycombinator.com/item?id=39789906 "digital natives" refers to the fact that this is the first generation for which (rather consistently available) internet was something that they grew up with from very early childhood on, i.e. they "never saw the 'old world without consistently available internet' in their life".

So, quite by definition, it does not imply that digital natives can use common business tools.


For what it's worth, I have spent much of the last several years undoing someone's untested pile of garbage in Excel and making clean, tested code with reproducible results in DBT...



