
What's worse is the number of other posts with the top comments being vociferous defenses of these companies as if they needed these people to defend them or cared one iota for their welfare. It feels like it's Stockholm Syndrome on a mass scale.

I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens. They scream outrage when they're not allowed their tiny little dance-jig operators provided courtesy of an abusive regime.

Sure, I recognize I may now be nearing obsolescence, attached to such silly outmoded principles as "ethics", but if we're all just selling ourselves out constantly to the angry god-machine of the id, is this really the future we want for our daughters and sons?

> I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens.

Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

I recommend Jonathan Blow's talk "Preventing the Collapse of Civilization"[1] for an unsettling view of how far we have already traveled down that path.

[1] https://www.youtube.com/watch?v=ZSRHeXYDLko

Gen Z can get real general-purpose computers with a development environment for cheap. The new Raspberry Pi 400 is a modern take on home computers like the C64, but 15 times cheaper. Pi 400 full kit: $100, C64: $595 (1982) / $1576 (inflation adjusted for 2019).
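The price comparison is easy to sanity-check with quick arithmetic; a sketch in Python (the ~2.65 CPI multiplier from 1982 to 2019 is an approximation I'm assuming, not a figure from the thread):

```python
# Rough check of the C64 vs. Pi 400 price comparison above.
# The 1982 -> 2019 CPI multiplier is an approximate assumption.
c64_price_1982 = 595
cpi_multiplier_1982_to_2019 = 2.65
pi400_kit_price = 100

c64_adjusted = c64_price_1982 * cpi_multiplier_1982_to_2019
ratio = c64_adjusted / pi400_kit_price
print(round(c64_adjusted), round(ratio, 1))
```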

Computers are not just getting cheaper; educational material is easier than ever to find.

Because modern computers are much more complex and do much more than before, you don't do much bare-metal programming anymore, but then again, nothing is stopping a kid from buying a microcontroller and playing with it. It's cheap and information is plentiful. I would even say that it is easier for Gen Z to play with electronics than it was for millennials.

But that unprecedented availability of computers didn't turn everyone into a nerd. Just because someone has a smartphone doesn't mean they have any interest in computing. When computers were limited and expensive, only those who had some affinity with them had them; now everyone has them, so mechanically, you can't expect the same average level of interest.

> The new Raspberry Pi 400

Learning about the Raspberry Pi is nice, but that doesn't help preserve the institutional knowledge needed to build system/low-level features in a modern OS. I'm not talking about teaching kids to program; I'm concerned about preserving in the future the knowledge of how to create all of the complex tech we use today. Many civilizations throughout history collapsed after they lost the technical knowledge on which their civilization depended.

> Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore

Yes, that's the problem! Unless those skills are actively used and passed on to the next generation of engineers, that knowledge decays. Part of the reason you don't see a lot of bare metal programming anymore is due to the knowledge decay that has already happened!

> nothing is stopping a kid from buying a microcontroller and playing with it

This article is about how those same kids are being stopped from learning the complex systems we currently use.

> didn't turn everyone into a nerd.

Nobody is trying to turn everyone into a nerd. I'm talking about making sure the nerds of the future have the ability to learn about the tech they use, so the ability to understand and make that tech isn't lost. Locking down the OS into an "appliance" that cannot be inspected or changed is a direct attack on the ability to learn.

The problem with low level programming is just the general trend of specialization.

The modern computer is the pinnacle of human civilization, a thing so complex that it requires thousands of people to fully understand how it works. Low level programming is just a small link in the chain of complexity. From mining the materials, to producing silicon of incredible purity, to designing circuits at quantum scale, to CPU design and finally the guy who places the frame that will display cat videos. So if you argue that no one knows how to program the bare metal (that is not that bare anymore), one can argue that the knowledge of how to make logic gates from transistors or the metallurgy of copper traces is getting lost.
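As a toy illustration of one link in that chain, every basic logic gate can be derived from NAND alone; a minimal sketch in Python (function names are my own):

```python
# Building the basic gates out of NAND alone, as a toy model of
# one layer in the hardware abstraction stack.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

def xor_(a: bool, b: bool) -> bool:
    # Standard four-NAND XOR construction.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))
```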

Maybe fewer people will know how to program on the bare metal. But think about it: most people have been unable to hunt and gather for thousands of years. That is a great deal more worrisome than not knowing how to program computers, and yet human civilization has been thriving.

The important thing is that it is still accessible, and it is.

What you're saying was truer in the '00s when 'scopes, logic analyzers, etc. were professional instruments costing thousands and FPGAs were inaccessible to all but the most dedicated and wealthy hobbyists. Today, with adequate hobbyist-grade instruments and dev boards available in the $100-200 range pretty much anywhere you look, really, the situation for low level engineers and developers is better than it's been for years and, in some ways, even better than the '70s heyday of homebrew computers.

The problem is more the lack of interest. Low level systems development is hard, a computer engineering degree is hella hard, and all the hype and easy money was in following the webdev career path starting from the late '90s dotcom boom.

It was always a tiny minority of the general population that had real knowledge of and capability for creating the technological equipment widely used by said general population. In the same vein, there are no fewer people capable of low-level/OS-level programming than before, and you can't expect every proficient user, even a power user or e.g. the now numerous web developers, to be at this level of ability.

What changed is that the very things capable of eliciting interest in programming also offer overpowering content consumption functions with huge, never ending catalogs of games, movies, videos, short funny clips etc.

As computing and "content" proliferate, the uncompetitiveness of creation, esp. symbolic creation such as programming, is increasing. At some point, broadening of access no longer offsets this effect, and the talent pool may start to shrink even if capability and permeation are a million times higher than they once were.

I think Arduino has us partially covered there, allowing kids to get into electronics easily with some wires, sensors, and C code.

And for cobbling together a computer out of microelectronics there's CollapseOS https://collapseos.org/

It's also a fait accompli in education where Python reigns supreme as the primary teaching language.

There is nothing wrong with teaching Python as an introduction to programming methods and data, then diving deeper. My son started with Python in his Junior year of high school and as a senior is now learning Java.

It is just top down learning, instead of bottom up. For programming, I think either works.

I just happened to learn the other direction because of the year (early 1980s). On CP/M, you had a macro assembler right there, and it took me a while to get my hands on a Basic interpreter.

My university course in 1980 started with Pascal and Fortran along with Niklaus Wirth's "Algorithms + Data Structures = Programs".

Excellent book! Although written with Pascal in mind, it became really useful to me. I always brought it to work along with the K&R 2nd Ed. and the famous TCP/IP networking book by W. Richard Stevens.

I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn. Now, most people get computers where those things are forbidden. You still can get "programmable" computers, but you have to do it explicitly. Before, it was implicit. So there's a filter that some people won't pass and won't learn, and the pool of people shrinks.

>I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn.

That's a rose-colored view that maybe reflects a small window in time.

But take the 80s. First of all that computer was maybe $5,000 in today's money. (Yes, there were some limited alternatives like the Sinclair and Commodore 64 that were cheaper.) Long distance telephone calls were expensive as were the big BBS systems like Compuserve. Any software you purchased was costly. A C compiler was hundreds of dollars. (Turbo Pascal at $50 was an innovation.)

Perhaps more to the point, most people didn't just have a computational device of any sort. So the fact that, if you had one, you could use it for a lot of different purposes is sort of irrelevant.

My experience, being born in 1992, is that as a kid I could toy with QBasic for instance (and then with Dev-C++ a bit later), as well as go peek & change stuff in random files (hello Red Alert save files :D). Can't really do that with an iPad.

I was a QBasic kid as well, but I think it was pretty clear to me even at the time that QBasic was a mickey mouse environment. You couldn't get into 32-bit mode, or dynamically allocate memory with pointers, much less blt sprites or backgrounds around at a usable speed.

I'm not saying it's exactly equivalent, but there's certainly a perspective which could argue that a) the free JavaScript interpreter available on every device is more featureful and interesting than anything available to us 80s and 90s kids, b) low-cost SBCs running free open source operating systems and compilers are more accessible and better supported than the commercial compilers that few could have accessed back then, and c) the overall ramp from curious/creative gamer kid to capable hacker is much smoother now than it was then, with a lot of interim options between JavaScript and commercial environments in which to gain comfort and achieve interesting results (thinking stuff like GameMaker and RPGMaker, modding communities for existing games, hacking/homebrew/porting communities for consoles, etc).

I get the argument that many kids are growing up today without necessarily touching a "real" computer, just a tablet. That said, as someone who got into PCs quite early on, I'm a bit skeptical that the world we live in today where you can assemble a working computer based on a Raspberry Pi with loads of open source software for probably about $100 somehow is less accessible to a kid who wants to hack around on computers.

I think the key with the Pi is that it needs to find its way into the house for some other reason than just to be an educational aid. Positioned that way, it will be about as interesting to most kids as a plate of cold spaghetti. I'm thinking here of the book I was given as a teenager on developing ActiveX components, because it was an enterprise-y thing, when what I really wanted was a book on DirectX, for making games.

But yeah, if the Pi shows up as part of an IOT system, or as a TV/streaming box, or to play retro games on, or whatever, then it's there and it's available to be tinkered with; and from my limited experience, basically none of those use cases will run on their own without at least a little bit of tinkering. :) Even my little Linux-running Powkiddy emulation handheld has probably consumed about as much of my tinkering time as it has my retro gaming time.

Right. So quite a bit later than the period I'm talking about. Late 90s/2000s period is arguably when commonly-used hardware was most accessible/hackable.

Eh, I mean the kids who are passionate about how systems work will find a way to dig deep. I don't think this is going away with locked-down systems, regardless.

The only thing it will do is make it easier for the general public to work with machines, and the masses never cared to begin with; to be frank, that's a good thing. They are less likely to install malware and cause trouble for themselves.

Plus you're forgetting that these companies still need engineers to build and maintain their infrastructure so it's not like the knowledge is going to disappear, never mind the fact that the corporations heavily rely on OSS.

I am a developer, I use a Mac... so I don’t understand how it’s forbidden to program. macOS even comes with Python, Ruby and PHP preinstalled.

The lockdown and control of software distribution.

Great, you can write a script, but its checksum is then sent over the net and verified to not be malicious, and if the service on the other side is experiencing lag... well... gotta wait, and wait.

To the point where all choice is removed from the operating system. So much for root access.
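The kind of check being described can be sketched in a few lines; this illustrates hashing an executable before a (hypothetical) remote lookup, and is not Apple's actual notarization protocol:

```python
import hashlib

def script_digest(path: str) -> str:
    """Hash a script's bytes the way a gatekeeper-style check might,
    before hypothetically consulting a remote verification service."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```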

It's not about the ability to write programs, but the ability to write programs that influence your base system in a meaningful way. Yes, you can write programs, but can you write a different /bin/init for your system? I'm not sure you can (either because Apple will not let you, or your lack of ability to do things like setting up a system to be used). Maybe you can, we're wrong about this, and more power to you. But it's quite likely that you can't.

$100 isn't cheap for a curious kid with no income, especially if they're not already sold on it as an interest.

There's a big difference between buying a computer explicitly for the purpose (even if it's cheaper), vs being able to play around with the hardware that you already have, for no new monetary investment at all.

But computing tools are cheaper now than they have been in ages.

You can buy the Atmel DIP version of the chip in an Arduino really cheap, and build what is basically an Arduino on a breadboard. Then get a USB programming adapter (again, cheap) to get it running.

Then you can get inexpensive USB logic analyzers that are plenty capable of monitoring I2C and SPI buses, and learn how all that works.

None of that existed in the 1990s. It simply wasn't there.

The price and the availability of tools are so much better today. You don't even need to buy any books; it is all online, with community members jumping in all the time to help.

Get involved in the local community board. Present to the library council that this is a good investment for education. Purchase 20, lead a community class to teach the next generation of students.

If a kid with a phone but no money wants to know if programming is something they might like to study, and somehow the school also can’t afford a Pi, I’d point the kid at something like https://js.do/

The problem is that's not really how people get started. Hello world is how you get started in a university intro course, not in your parents' basement.

What happens there is that you have a pile of photos that are all in one folder and you want to organize them into separate folders by date. A problem you're actually having, not a homework assignment. Which is possible to do automatically, because they all have metadata with that information in it. So you find a python script somebody wrote to do that, and it works! Then you realize the script is just a text file and you can modify it to sort using geotags or something else instead, and now you're down the rabbit hole. But only if it's easy to see and modify the code that runs on the device you actually use.
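A minimal sketch of the kind of script described above, using file modification times from the standard library as the date source (real photo EXIF dates would need a third-party library such as Pillow; the function name is made up):

```python
import os
import shutil
from datetime import datetime

def organize_by_date(src_dir: str, dest_dir: str) -> None:
    """Move files from src_dir into dest_dir/YYYY-MM/ folders,
    keyed on each file's modification time."""
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        stamp = datetime.fromtimestamp(os.path.getmtime(path))
        folder = os.path.join(dest_dir, stamp.strftime("%Y-%m"))
        os.makedirs(folder, exist_ok=True)
        shutil.move(path, os.path.join(folder, name))
```

Swapping the mtime lookup for a geotag lookup is exactly the kind of one-line change that starts the rabbit hole.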

Not everyone learns the same way or for the same reasons.

For example, I learned to read at the same time I learned to code, and from the same source: I had the unusual combination of not just the right birth year, but well-off parents and two elder siblings who had gotten bored of their Commodore 64 just as I was reaching reading age.

Back then my coding was, naturally, 95% transcribing from the manual, but it was fun. One of the few ones I can remember reasonably well this far removed from the experience was a sentence generator that had a list of nouns, verbs, etc and picked from them to build grammatically correct gibberish.
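That kind of BASIC-era toy translates directly to a few lines today; a sketch of such a generator in Python (the word lists here are made up):

```python
import random

# Tiny word lists standing in for the ones typed in from the manual.
NOUNS = ["dog", "wizard", "computer", "river"]
VERBS = ["eats", "chases", "admires", "builds"]
ADJECTIVES = ["green", "sleepy", "enormous", "shiny"]

def sentence(rng: random.Random) -> str:
    """Pick one word from each list to form grammatically
    correct gibberish."""
    return (f"The {rng.choice(ADJECTIVES)} {rng.choice(NOUNS)} "
            f"{rng.choice(VERBS)} the {rng.choice(ADJECTIVES)} "
            f"{rng.choice(NOUNS)}.")
```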

Hello world is how a lot of us started in C with the K&R book, wherever we were. Most of us didn't want to organize photo folders, we wanted to make the computer display something cool or play a song or something, and usually we had to start with something elementary when tackling a new language, especially a complicated one like C.

I can still remember being 11 years old and meticulously keying in the hello world program on my dad's aging Tandy Model 16, then saying 'cc hello.c' and watching as the compiler chugged through the tiny program for a minute or two. (It #includes stdio.h, so there was more code to go through than it seemed, plus there was a linking step. And not much actually happened, because Unix C compilers are silent if there are no errors.)

When I ran a.out and saw "hello world" appear, I was over the moon. It was like clearing 1-1 in Super Mario Bros. for the first time. Like I had surmounted the first obstacle and was thus ready for all the challenges that lay ahead.

I don't know why a raspberry pi is supposed to be a good device to learn programming other than the fact that it forces you to use Linux. Do kids really ssh into their raspberry pi from their phones? Or do they really buy an expensive screen, expensive cases and expensive peripherals adding up to $200?

I also doubt that raspberry pis are actually the type of device that schools are interested in. What they'd want is the TI-84 equivalent of a laptop. It should be easy to reset the laptop to factory conditions and it would only run a few specific preinstalled programs (one of which would be a barebones IDE, probably for Java). To me it feels like the raspberry pi fails at all of these. You have to mess with hardware and then mess with the software. A school provided Chromebook with student accounts managed by the school would be way more practical.

However, if you actually want people to learn programming organically, it's much simpler to just get a Windows laptop and use that as the main device.

I too have doubts about the real-world suitability of the Raspberry Pi in formal education environments, but certainly the business of resetting it _is_ a solved problem; every kid is issued their own SD card, and can flash it back to the starting image whenever they want.

It all depends on what you want to learn.

On a RPi, with a standard laptop on the side (Windows/Mac/Linux, doesn't matter), and some basic installed tools, you can pull down and configure Alpine Linux and run a bare bones IoT Linux platform. If you want to get really gritty, you also configure and build U-Boot as a primary loader.

Once you get past that point, you pull Docker into your Alpine build and start running containers.

Stepping through that full process will teach you research (because you are going to be reading a lot about how to pull that off), it will teach you about Linux kernel configuration. It will teach you to be at least comfortable with git.

There is a lot you can learn on a Raspberry Pi, cheaply, that only involves plugging in an ethernet port and power supply, and never seeing anything on a terminal.

> The computers are not just getting cheaper, educative material is easier than ever to find. [...]

Being accessible and being accessed are two different things.

I'll make a more general parallel.

Never in history was such an enormous amount of audiovisual media about the decades preceding a generation's birth produced, whether during those decades or later. And this astonishing amount of written documents, audio, and film, both fiction and non-fiction, live takes, documentaries, is readily available: often immediately, often gratis.

And yet this generation is disconnected from the previous ones in an unprecedented manner. And yet this generation seems to be the most ignorant about how stuff was just a couple decades before them. I've never seen such a break in the flow. The material is present but not reached or understood.

The Raspberry Pi is not open hardware, and is actually part of the problem, not the solution.

Yes and no. At least it created a modest tinkering culture while the mainstream went with smartphones. I wonder if RISC-V will help us get past ARM* or if RISC-V SoC vendors will try to create their own walled gardens, proprietary interfaces, and GPU/codec support.

* = The issues with the Raspberry Pi are both with Broadcom's closed-source hardware (there are other tinkering boards with more open hardware) and of course with the license model of ARM.

I agree there are other 'tinkering' boards with more open hardware, but the Pi has the advantage of a huge community of very nice supporters. Maybe even as nice as the community over on arduino. The biggest problem with Broadcom on the Pi is the videocore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.

As far as the ARM license goes, very few people are ever going to spin their own hardware, and at least the ARM programming API/ISA is very well documented. We'll see if that continues under NVidia.

Certainly MOS, ST, National, Zilog, Motorola, Intel, IBM, DEC (etc., etc.) CPUs were always closed source. That did not prevent anyone from learning about computer systems and programming then, and it won't stop them now.

That said, I doubt Apple will ever make the M1 available for purchase. Which is where this thread started.

>> The biggest problem with Broadcom on the Pi is the videocore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.

Avoiding the black hole does not solve the part of the problem where you are financially supporting closed hardware and disincentivising the Raspberry Pi Foundation from doing the right thing.

This is the same problem we had with TiVo: you could get the operating system, tinker with it, and run it. Just not on your TiVo. What good is having the source if you can only run it on toy hardware? You need to be able to alter the code on the tools you use so you aren't beholden to a company that drops warranty and support within 5 years and may never fix the problems you have even while it's supported.

I just made the C64 <-> Pi 400 comparison to someone last week. It is completely analogous.

Pair that up with an Arduino, and maybe wiring, and you can have hardware interaction with a Linux desktop on the 400.

I just helped a young Arduino user with reading an analog pot yesterday. There are still young people out there playing with hardware, and it is way easier today than it was for me in 1984.

Then go over and look at the people configuring and compiling custom versions of Marlin for their 3D printers.

Or the nerds designing flight computers for solid fuel model rockets.

Wouldn’t it be great if you could route the mac’s internet through the raspberry keyboard, strip the surveillance automatically, and continue to use the internet like normal?

So you see the non-free-and-open Raspberry Pi as a real development environment. That's nice.

> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

Gen Z here. I tried to give manufacturers the finger and play with systems-level software on modern PCs, and ended up retreating after realizing, among other things, just how much modern computers are not for hackers. Everything feels like an "undocumented" mess (by which I mean the documentation is likely only accessible to OEMs and those with large business deals with them). Even when it is available, I'm too scared to touch it. I found something I wanted to hack around with, until I saw the license agreement I had to accept in order to access the docs. I couldn't parse the legalese, especially regarding open source works, and I'm not consulting a lawyer to figure out whether I can post some project I might hack together online under an OSS license. The few things that do get open sourced are incredibly half-assed, and you can still tell they're designed for corporations, not hackers (e.g. edk2).

Conversely, I have an old IBM XT-compatible luggable sitting in my closet. The manual for that has the full BIOS source listing. Nowadays I mostly just hack around with old non-PC devices, but for the most part computing just isn't fun for me anymore.

As other commenters have said, just picking up an open source *NIX is a great start.

If the lack of open hardware bothers you, there are several non-x86 architectures that are fully open source. Chief among them is RISC-V[0]; SiFive[1] is taping out RISC-V processors to real hardware, selling dev boards, and stuff like that. These are real, competitive, Linux-capable CPUs, although widespread adoption has not happened yet.

On the more minimalist front, the ZipCPU[2] is simpler (though it has less software support), and its author writes some really great blog posts about HDL[3] (Hardware Description Language, how these CPUs are defined).

You might also enjoy developing for very small embedded systems. I like AVR CPUs such as the ATMega328P (of Arduino fame) for this purpose. Although not open-source, they have quite good and thorough datasheets. AVRs are very small 8 bit systems, but as a result, you can fit most of the important details in your head at once.

If you want to talk more about these topics, feel free to get in touch with me. You can find my contact info in my profile, and on my website (also listed in my profile).

0 - https://riscv.org/

1 - https://www.sifive.com/

2 - https://github.com/ZipCPU/zipcpu

3 - http://zipcpu.com/

Most of the issues you list should not be a problem if you use a Linux distro on a fairly standard pc/laptop.

Pretty much all inside it is open for you to look into and change as needed.

At the OS level, sure. It's still running on a completely non-free, hardly hackable platform, though, riddled with all sorts of backdoors and other curiosities. I'm aware of things like coreboot/libreboot, but support is even more limited there, and porting involves a deep understanding of the x86 platform, in my brief experience.

You don't need any of that to understand how x86 or the OS works. Realistically, you can do all of that learning with QEMU or a VM and not be restricted by the hardware at all.

At some institutions they teach how OSes work by having students implement Pintos, a full x86 OS.

In my first year I had to build a full Raspberry Pi emulator from scratch and then code an assembler for it. These programs would then run natively on the Raspberry Pi. People wouldn't even touch the hardware until much, much later in the project, when the emulator was confirmed to be working.

I disagree with the view that you need fully FOSS hardware to understand a platform. You can do a lot with a VM/QEMU.

It really depends on what you want to do. A Linux distro on x86 is IMHO a good start, but of course it's not the only option: there are now some reasonably open ARM devices (including an ARM laptop) from Pine, and there is the very open Talos workstation running on POWER, with RISC-V showing up everywhere.

Why do people make these kinds of comments? Talos isn’t showing up anywhere, and even during the Kickstarter it was more than $3000 for just a motherboard.

As an example of what is available ? And these do get used - for example as a desktop: https://m.youtube.com/watch?v=ktSwuF32ywM

Also they seem to have released a cheaper micro ATX board: https://www.raptorcs.com/BB/

> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

Kids today aren't as skilled with computers as I would've expected years ago. I feel the app culture stunts deeper learning.

I'm aware that I am looking at this through the lens of self-bias, but I have definitely found that much of what I know about computers came from being able to (and sometimes needing to) poke around and figure out how to do something.

I am not a software developer and have little formal education in computer software or hardware. With that said, I've picked up enough just by growing up with Commodore, DOS, Linux, etc. to at least have the basic understanding needed to research or just figure out solutions to most common problems.

I work in media production and IT in an educational setting, and while I'm far enough along the Dunning-Kruger curve to know just how little I know, the sort of things I see from students (grad and post-doc, at that) definitely give me pause sometimes. Often, the mention of basic things like how to change a setting or search for a file in macOS or Windows is met with glazed eyes or signs of discomfort. Apologies like "I'm not really tech savvy" follow any talk of something more complex than pointing at pictures on a screen or doing anything outside of the web browser.

And yes, I do understand that the very nature of these students' specialization in other fields can mean they haven't had the need or opportunity to learn much about computing. I just feel like I only picked it up because it was how you got things done or because it was there for poking and prodding when I got bored or curious.

I think in the end it's not just that I have a personal connection to computing and think everyone should share my interests. It's more akin to a basic competency that opens a lot of doors and prevents you from being taken advantage of.

My analogy is usually along these lines: if your job requires you to drive a car, you don't need to be a mechanic or be able to build a car...but you should at least know the basics of how to operate and maintain a vehicle beyond just gas pedal, brake pedal, and wheel.

10 years ago when I started working I feared the younger generation and their enthusiasm and energy. Then I had to teach a few of them how to ssh and that fear was dissolved. Most of these kids will not cut it in heavy reading/comprehension jobs.

In my experience the up-and-coming generation are far, far better at branding than my own generation ever was. Talking to my younger colleagues I generally get a deep feeling that they know what they are talking about, are thoughtful and make good decisions.

But the thing is, that's just branding: they aren't actually much more competent or thoughtful than my generation was at that age, they just seem like it. When I look at the code they write, it is as bad as the code I wrote. They have startling gaps in their knowledge and experience, just as I did (and no doubt still do — there is always something to learn!).

The thing that worries me is that until one gets to more objective measures, they really do seem more competent and trustworthy — which means that others are more likely to trust them, which is likely to lead to more bad decisionmaking at scale.

I wonder, though, if there is really a difference at all. Maybe my generation actually seemed more competent to our betters than we really were, too!

As an "older" coder I get what you're hinting at, but your view of Zoomers in tech from your high horse seems very narrow and entitled.

I'm sure there are Zoomers out there who can code rings around you (and me).

Not really meant as a hard judgement, just sharing my experience. I am 100% sure there are folks at all age levels who can be and are better than me. I have definitely worked with some.

I think the parent just meant the surprise at the lack of enthusiasm, based on the ubiquity of the technology/device.

Perhaps it is considered more of a commodity now?

> Perhaps it is considered more of a commodity now?

It is and it should be. What the white beards all too often forget is the fact that the generation before them would say the exact same thing about them (lack of enthusiasm, lack of interest and knowledge, etc.).

This is a story as old as time:

> The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households.

attributed to Socrates (469–399 B.C.)

How many '80s kids were into ham radio or actually building computers (Ben Eater-style: https://www.youtube.com/c/BenEater/videos ) instead of toying with micros or basically playing Lego with PC components? Same difference.

Today's computers have reached a level of tight integration and complexity that simply cannot be grasped by a single individual anymore.

People keep whining about this when it comes to computers, but happily accept the very same thing in other commodities like cars, clothes (!), highly processed food (!), hygiene and cleaning products (soaps, hair gels, rinsing agents, ...), utility grids, etc.

It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.

> It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.

That is perhaps the most poignant take on this I have come across in a while. Thanks, that got the cogs turning in... different directions. I do need to ease up. :)

I have to disagree with you about cars.

There has been a significant increase in demand and cost for pre-owned vehicles that do not have integrated entertainment systems and drive-by-wire systems.

In the used truck market, I have seen vehicles triple in value over the last decade.

The issue with cars and trucks is the ability to repair them (not necessarily by yourself - just in general), not interest in the technology, though.

The problem is the pipeline. Schools that bought iPads for "digitalization" will produce learning outcomes that are significantly worse for understanding tech applications and information technology, because the principles get obscured by fancy UI. Never mind that an iPad is hardly different from their phones.

A problem is that teachers would require extensive training before they could even try to teach kids something they didn't already know.

I do apologise if this sounds condescending: what are you on about?

I do agree that the majority are limited in terms of technical knowledge, but I think that was always the case. It's always a (driven) minority that is responsible for tech advancements, preservation, and magic.

I wish I were 14 in this day and age, with this much information, how-tos, cheap (and expensive) tech, and microcontrollers (when was the last time you checked places like AliExpress and Git hosting sites?).

We live in a potential golden age for technology education:

- Hardware is ridiculously cheap and powerful and widely available

- Inexpensive educational computers such as the BBC micro:bit and Raspberry Pi are widely available along with associated educational materials; these devices are easily interfaced to the real world, and can often be programmed at a very low level or a high level as desired

- Overall, robotics and real-world computing is cheaper and more accessible than it has ever been

- FPGAs are ridiculously cheap and powerful and allow you to explore processor and digital hardware design

- Nearly every computer, smartphone, or tablet contains a web browser that can be programmed in JavaScript, WebAssembly, and dozens of other languages that compile to JavaScript or WebAssembly; web apps can display 2D, 3D, and vector-path graphics, generate sound, and interact with all sorts of input devices

- Nearly every computer, smartphone, or tablet has a Python implementation as well

- Free and open source software is widely available

- Free and open source game engines are widely available

- Games are often moddable and many contain built-in level editors

- Source code of almost any type of software you can imagine, from operating system kernels to compilers and libraries to databases to web browsers to scientific computing to games and productivity software, is widely available

- Billions of people have internet access

- Tons of free and often high-quality educational materials are available from universities, Khan Academy, and many other sources on YouTube, on GitHub, and all over the web

- Many books including textbooks and tutorials are available for free as PDFs or readable on the web


The materials are out there, but the challenge is empowering people to find and make use of them.
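To make the "programmed at a very low level" point concrete: with nothing but a stock Python install, a kid can replicate the core of the Ben Eater-style exercise mentioned elsewhere in the thread — building an adder out of logic gates. This is a hedged, purely illustrative sketch, not something from the thread itself:

```python
# A 4-bit ripple-carry adder built from simulated logic gates:
# the classic "how does a CPU actually add?" exercise, done in
# software instead of breadboards. Needs only stock Python.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add two bits plus a carry bit; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers bit by bit, like a ripple-carry circuit."""
    carry = 0
    result = 0
    for i in range(4):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result, carry  # final carry is the overflow bit

total, overflow = add_4bit(0b0101, 0b0110)  # 5 + 6
print(bin(total), overflow)  # 0b1011 (i.e. 11), no overflow
```

From there it's a short hop to wider words, subtraction via two's complement, or porting the same gates to a $5 microcontroller — the on-ramp really is that cheap now.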

Omg, I can't upvote this enough. I have been telling principal/director-level engineers this for almost a decade. We shouldn't make the hiring bar so high that it requires a PhD, but rather hire on aptitude and hunger for knowledge and feed them. So much of being a senior+ in tech is knowledge sharing, which we don't do enough of. I'm guilty of this too! I value my personal time and don't want to spend it building training materials.

The whole “playing” with system level software is how I became an engineer in the first place!

It's also extremely apparent when you suggest a new way of doing things and older folks look at you like you're crazy or, worse, actively block it because "we've always done it this way". Those youngsters on those tiny screens have big ideas from a different perspective, and we should acknowledge that and learn what they know.

Back to topic: I think the maturity of software engineering has enabled a lot of what we see today, driven by the macro cash grab of capitalism. There are things that stand firm beyond it, but more than 80% of software written is logical duplication of software already written (for another company, perhaps).

I'm guessing on the percentage, but it's important to know that global overall LOC is really just rehashing the same stuff we've been doing (albeit in maybe a slightly more novel way) to get business value.

Less than 20% is moving the stick forward.

> We shouldn’t be making hiring bar so high that it requires a PhD but rather hire on aptitude and hunger for knowledge and feed them.

As a self-taught developer with just a few years of experience, I appreciate seeing this sentiment. I have this burning desire to be a great engineer and to learn as much as I can, but I'm not exactly sure what I should be doing or learning.

Do you have any advice as to activities, books, or other resources that you think would benefit a young engineer who would like to learn how to do things right?

Learn as much as you’re enthusiastically able!

I’m on year 20 of this amazing ride. I have another 20, possibly 40 in me. I love it!

I was self-taught. The trick is to find something you’re super super interested in and learn as much as you can from books, youtube, Google, blogs, other members of HN.

I was really into games and wanted to make Quake-style FPS games. I started with art. Went to college for graphic design and hated it, so I dropped out. Worked on games while I did crap jobs. Eventually made a few small ones and got really into web development. Got a new job...

I can’t really tell you what books without knowing your interests.

As an old dude, I agree with you completely. We push way too much on the degree/school thing in HR, long before we even get to interview people. For most positions, I would rather have an enthusiastic learner with a BS, or even an associate's, and have them learn how it is done from experienced staff. If they really like it and want to pursue more college, fine. If they want to dive in and learn by doing, fine.

I am in my retirement position now, having stepped back to staff engineer, because I don't want to spend any more time stuck in meetings. The best principals, the best directors, all came up through the ranks. Yes, some of them returned to college for a few years before coming back, but the best ones just came in as enthusiastic greenhorns and dove straight in.

I have worked long enough now (36 years in this field, and this is my third 'career') to have seen people I mentored grow into director level positions, and that is perhaps the most rewarding thing there is.

> I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy

Aren't there more kernel developers now than ever before?

It's not the case that we're struggling to produce technically skilled software engineers.

Thanks, that was a really good watch!

> I worry for Gen Z because they're tiny-mobile-device native.

Gen Z here. I learned most of my basic computer skills by playing / modding games, editing videos, and having no friends. Gradually snowballed into writing scripts to make things easier, learning more programming languages and eventually getting books on more formal CS topics. I hope my kids can have a similar experience.

The ugly and maladjusted among us will always find a way ;)

On a related note, game consoles seem to be going the same direction, focusing on services and downloadable content or games over physical media (and even the games that are physical seem to have a large online/multiplayer component). After a console generation or two, these services get taken offline as they are no longer profitable, leaving a portion of your system useless.

I can go back and play old games I grew up on any number of platforms thanks to emulation or buying old physical media. This generation won't be able to do the same in 10-30 years.

Also, it seems a lot of games are only meant to be enjoyed as distributed multiplayer experiences, not really as a single-player ‘quest’.

> What's worse is the number of other posts with the top comments being vociferous defenses of these companies

I am slowly coming to think that some of those commenters, who fiercely defend any and all action taken by Apple to limit the freedom of its users, are paid by marketing and PR firms that scan forums to oppose any opinion not in line with Apple's vision of a closed system.

I understand the sentiment, but how many people were ever going to understand their systems at that level? I'm sure Gen Z will have their own tech equivalent of Roy Underhill telling them to put down the Ikea bookshelf and grab an axe and a log, and that the number of people who respond then will be about the same as now.

Buying or not buying a device from a company is a choice. In this particular case, the greatest threat is that devices that have always been used by power users are a step away from denying us superuser privileges on them, giving us a moderator instead, like on an iDevice.

At the same time, Gen Z has way more choice of and exposure to "computers", so worrying about them sounds very dismissive of their prospects. Sure, adoption of computers, even ones that fit in your bag or your pocket, is much greater nowadays, which leads to a far wider range of usages, for better or for worse.

In the end, the same could be said about the Edison generation and electricity. I barely have a clue how electricity networks operate. Can I experiment with them? Sure. Do I need to, just to power on my computer? No.
