I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens. They scream outrage when they're not allowed their tiny little dance-jig operators provided courtesy of an abusive regime.
Sure, I reflect that I may now be nearing the curve of an obsolescent person attached to such silly, outmoded principles as "ethics", but if we're all just selling ourselves out constantly to the angry god-machine of the id, is this really the future we want for our daughters and sons?
Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.
I recommend Jonathan Blow's talk "Preventing the Collapse of Civilization" for an unsettling view of how far we have already traveled down that path.
Computers are not just getting cheaper; educational material is easier than ever to find.
Because modern computers are much more complex and do much more than before, you don't do much bare-metal programming anymore. But then again, nothing is stopping a kid from buying a microcontroller and playing with it. It's cheap and information is plentiful. I would even say that it is easier for Gen Z to play with electronics than it was for millennials.
But that unprecedented availability of computers didn't turn everyone into a nerd. Just because someone has a smartphone doesn't mean they have any interest in computing. When computers were limited and expensive, only those who had some affinity with them had them; now everyone has them, so mechanically, you can't expect the same average level of interest.
Learning about the Raspberry Pi is nice, but that doesn't help preserve the institutional knowledge needed to build system/low-level features in a modern OS. I'm not talking about teaching kids to program; I'm concerned about preserving in the future the knowledge of how to create all of the complex tech we use today. Many civilizations throughout history collapsed after they lost the technical knowledge on which their civilization depended.
> Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore
Yes, that's the problem! Unless those skills are actively used and passed on to the next generation of engineers, that knowledge decays. Part of the reason you don't see a lot of bare metal programming anymore is due to the knowledge decay that has already happened!
> nothing is stopping a kid from buying a microcontroller and playing with it
This article is about how those same kids are being stopped from learning the complex systems we currently use.
> didn't turn everyone into a nerd.
Nobody is trying to turn everyone into a nerd. I'm talking about making sure the nerds of the future have the ability to learn about the tech they use, so the ability to understand and make that tech isn't lost. Locking down the OS into an "appliance" that cannot be inspected or changed is a direct attack on the ability to learn.
The modern computer is the pinnacle of human civilization, a thing so complex that it requires thousands of people to fully understand how it works. Low level programming is just a small link in the chain of complexity. From mining the materials, to producing silicon of incredible purity, to designing circuits at quantum scale, to CPU design and finally the guy who places the frame that will display cat videos. So if you argue that no one knows how to program the bare metal (that is not that bare anymore), one can argue that the knowledge of how to make logic gates from transistors or the metallurgy of copper traces is getting lost.
Maybe fewer people will know how to program on the bare metal. But think about it: most people lost the ability to hunt and gather thousands of years ago. That is a great deal more worrisome than not knowing how to program computers, and yet human civilization has been thriving.
The important thing is that it is still accessible, and it is.
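For what it's worth, the transistors-to-logic-gates link in that chain is still easy to play with in software. A toy sketch (not real circuit simulation): model each transistor as a voltage-controlled switch, take NAND as the primitive gate, and build everything else from it.

```python
# Toy model of the transistor-to-logic-gate step. In CMOS, a NAND
# gate's output is pulled low only when both series pull-down
# transistors conduct; every other gate here is built from that one.

def nand(a: bool, b: bool) -> bool:
    # Output is low iff both pull-down transistors conduct.
    return not (a and b)

def not_(a: bool) -> bool:
    # Tie both inputs together and NAND becomes an inverter.
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    # De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(a, b, and_(a, b), or_(a, b))
```

From here it is a short hop to half-adders and ALUs, which is roughly the path the Ben Eater-style breadboard builds walk in hardware.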
The problem is more the lack of interest. Low level systems development is hard, a computer engineering degree is hella hard, and all the hype and easy money was in following the webdev career path starting from the late '90s dotcom boom.
As computing and "content" proliferate, the uncompetitiveness of creation, esp. symbolic creation such as programming, is increasing. At some point, broadening of the access no longer offsets this effect, and the talent pool may start to shrink even if capability and permeation is a million times higher than it was.
And for cobbling together a computer out of microelectronics, there's CollapseOS.
It is just top down learning, instead of bottom up. For programming, I think either works.
I just happened to learn the other direction because of the year (early 1980s). On CP/M, you had a macro assembler right there, and it took me a while to get my hands on a Basic interpreter.
That's a rose-colored view that maybe reflects a small window in time.
But take the 80s. First of all that computer was maybe $5,000 in today's money. (Yes, there were some limited alternatives like the Sinclair and Commodore 64 that were cheaper.) Long distance telephone calls were expensive as were the big BBS systems like Compuserve. Any software you purchased was costly. A C compiler was hundreds of dollars. (Turbo Pascal at $50 was an innovation.)
Perhaps more to the point, most people didn't just have a computational device of any sort. So the fact that, if you had one, you could use it for a lot of different purposes is sort of irrelevant.
But yeah, if the Pi shows up as part of an IOT system, or as a TV/streaming box, or to play retro games on, or whatever, then it's there and it's available to be tinkered with; and from my limited experience, basically none of those use cases will run on their own without at least a little bit of tinkering. :) Even my little Linux-running Powkiddy emulation handheld has probably consumed about as much of my tinkering time as it has my retro gaming time.
The only thing it will do is make it easier for the general public to work with machines, and the masses never cared to begin with. To be frank, that's a good thing: they are less likely to install malware and cause trouble for themselves.
Plus you're forgetting that these companies still need engineers to build and maintain their infrastructure so it's not like the knowledge is going to disappear, never mind the fact that the corporations heavily rely on OSS.
Great, you can write a script; its checksum is then sent over the net and verified to not be malicious, and if the service on the other side is experiencing lag... well, gotta wait, and wait.
To the point where all choice is removed from the operating system. So much for root access.
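Concretely, the checksum in question would be something like a SHA-256 digest of the script's bytes; the remote verification service is the part you can't control. A minimal sketch of the local half only (the service and its API are hypothetical):

```python
# Compute the kind of content hash a remote verification/notarization
# service would check a script against before allowing it to run.
import hashlib

def script_checksum(source: bytes) -> str:
    """Return the hex SHA-256 digest of a script's bytes."""
    return hashlib.sha256(source).hexdigest()

if __name__ == "__main__":
    print(script_checksum(b"echo hello\n"))
```

Any one-byte change to the script produces a completely different digest, which is exactly why the gatekeeping works, and why it is impossible to "tinker" past.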
There's a big difference between buying a computer explicitly for the purpose (even if it's cheaper), vs being able to play around with the hardware that you already have, for no new monetary investment at all.
You can buy an Atmel dip version of the chip in an arduino really cheap, and build basically what is an arduino on a breadboard. Then get a USB programming adapter (again, cheap) to get it running.
Then you can get inexpensive USB logic analyzers that are plenty capable of monitoring I2C and SPI buses, and learn how all that works.
None of that existed in the 1990s. It simply wasn't there.
The price and availability of tools are so much better today. You don't even need to buy any books; it is all online, with community members jumping in all the time to help.
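To give a feel for what that logic-analyzer learning looks like: the first byte of every I2C transaction packs a 7-bit device address plus a read/write bit, and decoding it by hand is a classic first exercise. A small sketch (the captured byte values are made-up examples):

```python
# Decode the address byte that starts every I2C transaction: the top
# seven bits are the device address; the lowest bit is 1 for a read
# and 0 for a write.

def decode_i2c_address_byte(byte: int) -> tuple[int, str]:
    address = byte >> 1                      # upper 7 bits
    direction = "read" if byte & 0x01 else "write"
    return address, direction

if __name__ == "__main__":
    # 0x91 = 0b1001_0001 -> address 0x48, read transaction
    print(decode_i2c_address_byte(0x91))
```

Once you can do this on paper, the waveforms on the analyzer stop being noise and start being conversation.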
What happens there is that you have a pile of photos that are all in one folder and you want to organize them into separate folders by date. A problem you're actually having, not a homework assignment. Which is possible to do automatically, because they all have metadata with that information in it. So you find a python script somebody wrote to do that, and it works! Then you realize the script is just a text file and you can modify it to sort using geotags or something else instead, and now you're down the rabbit hole. But only if it's easy to see and modify the code that runs on the device you actually use.
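A sketch of that kind of script, using filesystem modification times rather than real photo metadata (reading the EXIF "date taken" tag would need a library such as Pillow; the "photos" path is just an example):

```python
# Sort the files in one folder into YYYY-MM subfolders by date.
# Uses modification time as the date source; a real photo script
# would read the EXIF "date taken" tag instead (e.g. via Pillow).
import shutil
import time
from pathlib import Path

def sort_by_date(folder: Path) -> None:
    for photo in folder.iterdir():
        if not photo.is_file():
            continue
        taken = time.localtime(photo.stat().st_mtime)
        dest = folder / f"{taken.tm_year:04d}-{taken.tm_mon:02d}"
        dest.mkdir(exist_ok=True)
        shutil.move(str(photo), str(dest / photo.name))

if __name__ == "__main__":
    target = Path("photos")  # example path
    if target.is_dir():
        sort_by_date(target)
```

The rabbit-hole moment is exactly as described: swap the `st_mtime` line for a geotag or camera-model lookup and you are suddenly programming.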
For example, I learned to read at the same time I learned to code, and from the same source: I had the unusual combination of not just the right birth year, but well-off parents and two elder siblings who had gotten bored of their Commodore 64 just as I was reaching reading age.
Back then my coding was, naturally, 95% transcribing from the manual, but it was fun. One of the few ones I can remember reasonably well this far removed from the experience was a sentence generator that had a list of nouns, verbs, etc and picked from them to build grammatically correct gibberish.
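Something in the spirit of that old program fits in a few lines of modern Python (the word lists here are made up):

```python
# Grammatically correct gibberish, Commodore 64 style: pick one word
# from each part-of-speech list and glue them into a sentence.
import random

ARTICLES = ["the", "a"]
NOUNS = ["dog", "computer", "wizard", "sandwich"]
VERBS = ["eats", "debugs", "admires", "launches"]
ADJECTIVES = ["purple", "sleepy", "recursive", "enormous"]

def gibberish() -> str:
    words = [
        random.choice(ARTICLES),
        random.choice(ADJECTIVES),
        random.choice(NOUNS),
        random.choice(VERBS),
        random.choice(ARTICLES),
        random.choice(NOUNS),
    ]
    return " ".join(words).capitalize() + "."

if __name__ == "__main__":
    print(gibberish())
```

On the C64 this would have been a few dozen lines of BASIC with DATA statements; the idea, a fixed grammar template filled from word lists, is identical.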
I can still remember being 11 years old and meticulously keying in the hello world program on my dad's aging Tandy Model 16, then saying 'cc hello.c' and watching as the compiler chugged through the tiny program for a minute or two. (It #includes stdio.h, so there was more code to go through than it seemed, plus there was a linking step. And not much actually happened, because Unix C compilers are silent if there are no errors.)
When I ran a.out and saw "hello world" appear, I was over the moon. It was like clearing 1-1 in Super Mario Bros. for the first time. Like I had surmounted the first obstacle and was thus ready for all the challenges that lay ahead.
I also doubt that raspberry pis are actually the type of device that schools are interested in. What they'd want is the TI-84 equivalent of a laptop. It should be easy to reset the laptop to factory conditions and it would only run a few specific preinstalled programs (one of which would be a barebones IDE, probably for Java). To me it feels like the raspberry pi fails at all of these. You have to mess with hardware and then mess with the software. A school provided Chromebook with student accounts managed by the school would be way more practical.
However, if you actually want people to learn programming organically, it's much simpler to just get a Windows laptop and use that as the main device.
On a RPi, with a standard laptop on the side (Windows/Mac/Linux, doesn't matter), and some basic installed tools, you can pull down and configure Alpine Linux and run a bare bones IoT Linux platform. If you want to get really gritty, you also configure and build U-Boot as a primary loader.
Once you get past that point, you pull Docker into your Alpine build and start running containers.
Stepping through that full process will teach you research (because you are going to be reading a lot about how to pull that off), it will teach you about Linux kernel configuration. It will teach you to be at least comfortable with git.
There is a lot you can learn on a Raspberry Pi, cheaply, that only involves plugging in an ethernet port and power supply, and never seeing anything on a terminal.
Being accessible and being accessed are two different things.
I'll make a more general parallel.
Never in history has such an enormous amount of audiovisual media been produced about the decades preceding a generation's birth, whether during those decades or later. And this astonishing amount of written documents, audio, and film, both fiction and non-fiction, live takes and documentaries, is readily available: often immediately, often gratis.
And yet this generation is disconnected from the previous ones in an unprecedented manner. And yet this generation seems to be the most ignorant about how stuff was just a couple decades before them. I've never seen such a break in the flow. The material is present but not reached or understood.
* = The issues with the Raspberry Pi are both with Broadcom's closed-source hardware (there are other tinkering boards with more open hardware) and, of course, with the license model of ARM.
As far as the ARM license goes, very few people are ever going to spin their own hardware, and at least the ARM programming API/ISA is very well documented. We'll see if that continues under NVidia.
Certainly MOS, ST, National, Zilog, Motorola, Intel, IBM, DEC (etc.) CPUs were always closed source. That did not prevent anyone from learning about computer systems and programming then, and it won't stop them now.
That said, I doubt Apple will ever make the M1 available for purchase. Which is where this thread started.
Avoiding the black hole does not solve the part of the problem where you are financially supporting closed hardware and disincentivizing the Raspberry Pi Foundation from doing the right thing.
Pair that up with an arduino, and maybe wiring, and you can have hardware interaction with a Linux desktop on the 400.
I just helped a young arduino user with reading an analog pot yesterday. There are still young people out there playing with hardware, and it is way easier today than it was for me in 1984.
Then go over and look at the people configuring and compiling custom versions of Marlin for their 3D printers.
Or the nerds designing flight computers for solid fuel model rockets.
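The math in that pot-reading exercise is a nice first lesson in itself: an AVR's 10-bit ADC returns a count from 0 to 1023, which maps linearly back to a voltage. A sketch of the conversion, assuming a 5 V reference (the usual Arduino Uno default):

```python
# Convert a 10-bit ADC reading (0-1023), as returned by an Arduino's
# analogRead() on a potentiometer, into a voltage.

def adc_to_volts(reading: int, vref: float = 5.0, bits: int = 10) -> float:
    if not 0 <= reading < (1 << bits):
        raise ValueError("reading out of range for this ADC")
    # Full-scale count maps to the reference voltage.
    return reading * vref / ((1 << bits) - 1)

if __name__ == "__main__":
    print(adc_to_volts(1023))  # full scale -> 5.0 V
    print(adc_to_volts(512))   # roughly mid-travel -> ~2.5 V
```

Turning a raw count into a physical quantity like this is the gateway drug to sensors, control loops, and eventually those rocket flight computers.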
Gen Z here. I tried to give manufacturers the finger and play with systems-level software on modern PCs, and ended up retreating after realizing, among other things, just how much modern computers are not for hackers. Everything feels like an "undocumented" mess (by which I mean the documentation is likely only accessible to OEMs and those with large business deals with them). Even when it is available, I'm too scared to touch it. I found something I wanted to hack around with, until I saw the license agreement I had to accept in order to access the docs. I couldn't parse the legalese, especially regarding open source works, and I'm not consulting a lawyer to figure out whether I can post some project I might hack together online under an OSS license. The few things that do get open sourced are incredibly half-assed, and you can still tell they're designed for corporations, not hackers (e.g. edk2).
Conversely, I have an old IBM XT-compatible luggable sitting in my closet. The manual for that has the full BIOS source listing. Nowadays I mostly just hack around with old non-PC devices, but for the most part computing just isn't fun for me anymore.
If the lack of open hardware bothers you, there are several non-x86 architectures that are fully open source. Chief among them is RISC-V; SiFive is taping out RISC-V processors to real hardware, selling dev boards, and stuff like that. These are real, competitive, Linux-capable CPUs, although widespread adoption has not happened yet.
On the more minimalist front, the ZipCPU is simpler (though it has less software support), and its author writes some really great blog posts about HDL (Hardware Description Language, which is how these CPUs are defined).
You might also enjoy developing for very small embedded systems. I like AVR CPUs such as the ATMega328P (of Arduino fame) for this purpose. Although not open-source, they have quite good and thorough datasheets. AVRs are very small 8 bit systems, but as a result, you can fit most of the important details in your head at once.
If you want to talk more about these topics, feel free to get in touch with me. You can find my contact info on my profile, and on my website (also listed in my profile).
0 - https://riscv.org/
1 - https://www.sifive.com/
2 - https://github.com/ZipCPU/zipcpu
3 - http://zipcpu.com/
Pretty much everything inside it is open for you to look into and change as needed.
At some institutions they teach how OSes work by having students implement Pintos, a full x86 OS.
In my first year I had to build a full Raspberry Pi emulator from scratch and then code an assembler for it. These programs would then run natively on the Raspberry Pi. People wouldn't even touch the hardware until much, much later in the project, when the emulator had been confirmed to be working.
I disagree with the view that you need fully FOSS hardware to understand a platform. You can do a lot with a VM/QEMU.
Also they seem to have released a cheaper micro ATX board: https://www.raptorcs.com/BB/
Kids today aren't as skilled with computers as I would've expected years ago. I feel the app culture stunts deeper learning.
I am not a software developer and have little formal education in computer software or hardware. With that said, I've picked up enough just by growing up with Commodore, DOS, Linux, etc. to at least have the basic understanding needed to research or just figure out solutions to most common problems.
I work in media production and IT in an educational setting, and while I'm far enough along the Dunning-Kruger graph to know just how little I know, the sort of things I see from students (grad and post-doc at that) definitely give me pause sometimes. Often, the mention of basic things like how to change a setting or search for a file in MacOS or Windows is met with glazed eyes or signs of discomfort. Apologies for how "I'm not really tech savvy" follow talk of anything more complex than pointing at pictures on a screen or doing something outside of the web browser.
And yes, I do understand that the very nature of these students' specialization in other fields can mean they haven't had the need or opportunity to learn much about computing. I just feel like I only picked it up because it was how you got things done or because it was there for poking and prodding when I got bored or curious.
I think in the end it's not just that I have a personal connection to computing and think everyone should share my interests. It's more akin to a basic competency that opens a lot of doors and prevents you from being taken advantage of.
My analogy is usually along these lines: if your job requires you to drive a car, you don't need to be a mechanic or be able to build a car...but you should at least know the basics of how to operate and maintain a vehicle beyond just gas pedal, brake pedal, and wheel.
But the thing is, that's just branding: they aren't actually much more competent or thoughtful than my generation was at that age, they just seem like it. When I look at the code they write, it is as bad as the code I wrote. They have startling gaps in their knowledge and experience, just as I did (and no doubt still do — there is always something to learn!).
The thing that worries me is that until one gets to more objective measures, they really do seem more competent and trustworthy — which means that others are more likely to trust them, which is likely to lead to more bad decisionmaking at scale.
I wonder, though, if there is really a difference at all. Maybe my generation actually seemed more competent to our betters than we really were, too!
I'm sure there are Zoomers out there who can code rings around you(and me).
Perhaps it is considered more of a commodity now?
It is and it should be. What the white beards all too often forget is the fact that the generation before them would say the exact same thing about them (lack of enthusiasm, lack of interest and knowledge, etc.).
This is a story as old as time:
> The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households.
attributed to Socrates (469–399 B.C.)
How many 80s kids were into HAM radio or actually building computers (Ben Eater-style: https://www.youtube.com/c/BenEater/videos ) instead of toying with micros or basically playing Lego with PC components? Same difference.
Today's computers have reached a level of tight integration and complexity that simply cannot be grasped by a single individual anymore.
People keep whining about that when it comes to computers but happily accept the very same thing in other commodities like cars, clothes(!), highly processed food(!), hygiene and cleaning products (soaps, hair gels, rinsing agents, ...), utility grids, etc.
It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.
that is perhaps the most poignant take on this i have come across in a while. thanks that got the cogs turning in... different directions. i do need to ease up. :)
There has been a significant increase in demand and cost for pre-owned vehicles that do not have integrated entertainment systems and drive-by-wire systems.
In the used truck market, I have seen vehicles triple in value over the last decade.
A problem is that teachers would require some extensive training first before they can even try to teach kids something they didn't know before.
I do agree that the majority are limited in terms of technical knowledge, but I think that was always the case. It's always a (driven) minority that is responsible for tech advancements, preservation, and magic.
I wish I were 14 in this day and age, with this much information, how-tos, cheap (and expensive) tech, and microcontrollers (when is the last time you checked places like AliExpress and GitHub?).
- Hardware is ridiculously cheap and powerful and widely available
- Inexpensive educational computers such as the BBC micro bit and Raspberry Pi are widely available along with associated educational materials; these devices are easily interfaced to the real world, and can often be programmed at a very low level or high level as desired
- Overall, robotics and real-world computing is cheaper and more accessible than it has ever been
- FPGAs are ridiculously cheap and powerful and allow you to explore processor and digital hardware design
- Nearly every computer, smartphone, or tablet has a Python implementation as well
- Free and open source software is widely available
- Free and open source game engines are widely available
- Games are often moddable and many contain built-in level editors
- Source code of almost any type of software you can imagine, from operating system kernels to compilers and libraries to databases to web browsers to scientific computing to games and productivity software, is widely available
- Billions of people have internet access
- Tons of free and often high-quality educational materials are available from universities, khan academy, and many other sources on youtube, on github, and all over the web
- Many books including textbooks and tutorials are available for free as PDFs or readable on the web
The materials are out there, but the challenge is empowering people to find and make use of them.
The whole “playing” with system level software is how I became an engineer in the first place!
It’s also extremely apparent when you suggest a new way of doing things and older folks look at you like you’re crazy or worse, actively block because “we’ve always done it this way”. Those youngsters in those tiny screens have big ideas, from a different perspective, and we should acknowledge that and learn what they know.
Back to topic, I think the maturity of software engineering has enabled a lot of what we see today due to the macro cash grab of capitalism. There are things that stand firm beyond it, but over 80% of software written is logical duplication of software already written (for another company, perhaps).
I’m guessing on the percent but it’s important to know that global overall LOC is really just hashing out the same stuff we’ve been doing (albeit in maybe a slightly more novel way) to get business value.
Less than 20% is moving that stick forward.
As a self-taught developer with just a few years of experience, I appreciate seeing this sentiment. I have this burning desire to be a great engineer and to learn as much as I can, but I'm not exactly sure what I should be doing or learning.
Do you have any advice as to activities, books, or other resources that you think would benefit a young engineer who would like to learn how to do things right?
I’m on year 20 of this amazing ride. I have another 20, possibly 40 in me. I love it!
I was self-taught. The trick is to find something you’re super super interested in and learn as much as you can from books, youtube, Google, blogs, other members of HN.
I was really into games and wanted to make quake-style fps games. I started with art. Went to college for graphic design and hated it so I dropped out. Worked on games while I did crap jobs. Eventually made a few small ones and got really into web development. Got a new job...
I can’t really tell you what books without knowing your interests.
I am in my retirement position now, having stepped back to staff engineer, because I don't want to spend any more time stuck in meetings. The best principals, the best directors, all came up through the ranks. Yes, some of them returned to college for a few years before coming back, but the best ones just came in as enthusiastic greenhorns and dove straight in.
I have worked long enough now (36 years in this field, and this is my third 'career') to have seen people I mentored grow into director level positions, and that is perhaps the most rewarding thing there is.
Aren't there more kernel developers now than ever before?
It's not the case that we're struggling to produce technically skilled software engineers.
Gen Z here. I learned most of my basic computer skills by playing / modding games, editing videos, and having no friends. Gradually snowballed into writing scripts to make things easier, learning more programming languages and eventually getting books on more formal CS topics. I hope my kids can have a similar experience.
The ugly and maladjusted among us will always find a way ;)
I can go back and play the old games I grew up with on any number of platforms, thanks to emulation or buying old physical media. This generation won't be able to do the same in 10-30 years.
I slowly tend to think that some of those commenters, who fiercely defend any and all action taken by Apple to limit the freedom of its users, are paid marketing and PR firms who scan forums to oppose any opinion that is not in line with Apple's vision of a closed system.
At the same time, Gen Z has way more choice and exposure to "computers", so worrying about them sounds very diminishing to their prospects. Sure, adoption of computers, even ones that fit in your bag or your pocket, is much greater nowadays, which leads to a far wider range of usages, for better or for worse.
In the end same could be said about the Edison generation and electricity. In the end I barely have a clue how electricity networks operate. Can I experiment with it? Sure. Do I need to if I just want to power on my computer? No.