What I remember from these early times:
1) we had only one television at home, so typing in programs required access to the TV.
This is why I spent a lot of time analyzing programs BEFORE typing them in, since I didn't have much time at the keyboard.
This probably taught me a lot!
2) I always wanted to improve the programs I typed, so I spent a lot of time optimizing them. This also proved useful later ;-)
3) Some programs included mysterious hexadecimal characters.
I tried to find some documentation about that.
It was difficult, because the information was scarce, and there was no Internet.
One day I had an Aha! moment, and that was the day I discovered the 6502.
This was useful, since I went on to write quite a lot of games in 6502 assembly, and it got me my first job in the game industry in 1985.
4) In France, there was a beautiful newspaper called Hebdogiciel. It contained programs for all kinds of computers. I tried to convert these programs to my computer, and this also gave me pointers to handle conversion between Basics. My first job was about converting Basic programs between various computers (Thomson TO7 <> Exelvision).
5) everything was so new and exciting! Nowadays, I don't feel this kind of excitement. Everything is so easy to put in place. At the time, we only had 48 to 64 kilobytes of memory.
Everything was a challenge. The computers were not designed to write games, but games were doable.
I'm probably the same age as you (started in 1982 on a Philips P2000) and feel the same. What wrecks me every time is the thought how many new things we discovered and did in a timespan of just a few years back then (the lifespan of various home computer models). Today, 10 years pass and all we really get is somewhat faster CPUs, slightly bigger memory, better graphics. And it's not worth it to discover any specific tricks for any platform (like you could on home computers to get more colors / bigger screen etc.).
5) I don't agree, I still find so much excitement today in what is going on around us. Sure, we have so much more power, but there is still challenge today. I'll admit it's not the same, but there are still a lot of challenges.
I created a series (pardon the plug) to make the software I had wanted to have when I started: simple, understandable, frugal, almost elegant.
Maybe, it will bring some nostalgia back!
Dragon, Oric, Commodore, Amstrad, Sinclair and BBC.
I already had a Sinclair ZX81.
The article really captured the exploration and discovery of that era.
On a more amusing note, my mate wrote a utility which we put on a floppy disc. We would go to shops and boot it, and it would copy all the ROMs. Good times.
I have fond memories of this computer. I absolutely loved its keyboard.
The problem is that the manual contained only the set of instructions, without any explanation.
My Aha! was understanding how it worked.
I did write a couple of games on Oric/Atmos, but quickly changed to Atari ST (I had one of the first 520ST with its developer manual).
There were many publications like that, and they often contained broken code. I sometimes wondered if it was on purpose, but it was actually a great opportunity to go beyond just typing in the code and running the program: you had to understand the code and debug it.
We had Input magazine, the Capital's newspaper computer section on Friday, Crash and from our neighbors Microhobby, which helped to improve my Spanish as well. :)
You got dropped right into the interpreter. There was not already a GB of OS loaded that you had a hard time learning. All the code that was there was what you wrote (or copied) yourself.
The processor & interpreter were about as fast as you could think, so it was easy to follow and step through (mentally). Moving down to assembly was the logical next step, and since the programs were small, it was easy to learn & memorize.
After years of coding yourself, you'd stumble on the first OS, which consisted of the most rudimentary libraries that one could basically read & remember.
Years later still, the first rudimentary networking picked up. Slow and not business critical so again easy to experiment with. By the time I connected the first commercial network to "the internet" downtime of email for less than 24 hours was not even noticed.
I do not envy the kids who nowadays stand zero chance of ever learning the complete stack of code running on any modern device. From what I see, they are all "stuck" on top of a GUI with only the slightest idea of what happens between their mouse and the actual hardware (and even that is often not hardware anymore).
Petzold's CODE is also really good going from logic gates to microprocessors to assembly language.
I still wish you could buy something like a Pi board that has just an interpreter and compiler on it as well as a textbook and you implement a simple version of a file system, text utilities, task manager...etc.
I graduated from a CS program (BS) a few years ago, and the projects I'm still most proud of existed "lower" in the stack: implementing FAT16, writing compilers/interpreters (I still have my Brainfuck interpreter!), playing around with paging in MULTICS, and the like.
These were all very toy-like (with good reason: a semester is only so long), but since I enjoyed them I've found myself asking things like "I wonder how the process scheduler/virtual memory manager in OS X is handling [whatever I'm doing at the moment]? What does my stack look like right now? How are all of these threads communicating with each other, what are they saying?"
You can occasionally see this in the console when something is going wrong, but usually not when the system is operating normally. When the OS is handling a heavy load brilliantly, that's kind of when I'm most impressed, and therefore most interested.
(maybe this kind of procedural output would be dreadfully boring or unreadable due to the complexity of a modern system if I actually saw it, I don't actually know of course.)
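The Brainfuck interpreter mentioned above is a nice illustration of just how small these toys can be: the whole language fits in one screenful of Python. This is only an illustrative sketch, not the commenter's actual code; the 30,000-cell tape and 8-bit wraparound are the conventional choices.

```python
def run_bf(program: str, stdin: str = "") -> str:
    """Interpret a Brainfuck program; return everything it prints."""
    # Pre-match brackets so [ and ] can jump in O(1).
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i

    tape = [0] * 30000          # the traditional 30,000-cell tape
    ptr = pc = inp = 0
    out = []
    while pc < len(program):
        c = program[pc]
        if c == ">":
            ptr += 1
        elif c == "<":
            ptr -= 1
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256   # 8-bit cells wrap around
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".":
            out.append(chr(tape[ptr]))
        elif c == ",":
            tape[ptr] = ord(stdin[inp]) if inp < len(stdin) else 0
            inp += 1
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]      # skip the loop body
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]      # jump back to the matching [
        pc += 1
    return "".join(out)
```

For example, `run_bf("++[>+++<-]>.")` multiplies 2 by 3 in a loop and prints the character with code 6.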
The possibly-incorrect impression I get of the performance-analysis related areas of the kernel is that they're a bit siloed (in the same way that X11 is siloed, and only a very small group of people look at it), which may render it a bit functionally academic. (It's newer code, though, so there's less chance of eccentricity, FWIW.) Isolation does have benefits - with less chaos to keep up with and less bus factor, the code is changing less and the maintainers have more mental bandwidth to post on mailing lists :), so you have more opportunity to get a good understanding of what's going on.
I don't have the same low-level experience you do, but I do share the same interest in wanting to understand "what's really going on" - and I incidentally want to make a Linux system monitor tool that vacuums up as much information as the kernel is willing to make available to it. `htop` and friends surface a very caricatured picture from maybe 1-10% of the data the kernel has to offer at any given moment.
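For a tool like that, the starting point is just text files under /proc. A minimal Python sketch of the scraping involved; the key names match the real /proc/meminfo format, but the parser itself is only illustrative:

```python
def parse_meminfo(text: str) -> dict:
    """Parse /proc/meminfo-style 'Key:   12345 kB' lines into {key: value}."""
    info = {}
    for line in text.splitlines():
        key, sep, rest = line.partition(":")
        parts = rest.split()
        if sep and parts and parts[0].isdigit():
            info[key.strip()] = int(parts[0])   # values are in kB where a unit is given
    return info

# On a live Linux system you would feed it the real file:
#   with open("/proc/meminfo") as f:
#       mem = parse_meminfo(f.read())
#   used_kb = mem["MemTotal"] - mem["MemAvailable"]
```

The same pattern (open a /proc file, split into fields) covers /proc/stat, /proc/[pid]/status, and friends; the real work in such a tool is deciding which of those thousands of fields to surface.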
Unfortunately these days you can have simplicity or you can have a web browser but not both. And without a web browser a computer feels extremely limited.
I was going to reference the same thing. Particularly the text Elements of Computing Systems.
Many Espressif MCU boards come with Lua already installed, and some other boards are supported by eLua.
I think building a FORTH would be educational, but I've never seen a tutorial (including JonesForth) that can take you from zero to Forth. Everyone just says it is easy. Maybe it is more or less obvious with more Assembly background?
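For what it's worth, the two load-bearing ideas in any Forth (a data stack plus colon definitions) do fit in a page of code. Here's a toy sketch in Python rather than assembly, so it sidesteps exactly the part JonesForth teaches, but it may make the "it is easy" claim more concrete. Everything here (word names, the `forth` function) is my own illustration, not any standard API:

```python
def forth(source: str) -> list:
    """Evaluate a tiny Forth-like program; return the final data stack."""
    stack, words = [], {}
    # A handful of primitive words operating on the shared data stack.
    prims = {
        "+":    lambda: stack.append(stack.pop() + stack.pop()),
        "*":    lambda: stack.append(stack.pop() * stack.pop()),
        "-":    lambda: stack.append(-stack.pop() + stack.pop()),  # a b - => a-b
        "dup":  lambda: stack.append(stack[-1]),
        "drop": lambda: stack.pop(),
        "swap": lambda: stack.extend([stack.pop(), stack.pop()]),
    }
    def run(tokens):
        it = iter(tokens)
        for tok in it:
            if tok == ":":                 # colon definition:  : name body ;
                name = next(it)
                body = []
                for t in it:
                    if t == ";":
                        break
                    body.append(t)
                words[name] = body
            elif tok in words:
                run(words[tok])            # execute a user-defined word
            elif tok in prims:
                prims[tok]()
            else:
                stack.append(int(tok))     # anything else is a number literal
    run(source.split())
    return stack
```

So `forth(": square dup * ; 4 square")` leaves `[16]` on the stack. A real Forth replaces the Python call stack with a return stack and threaded code, which is where the assembly background starts to matter.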
Skia used to include a really simple Forth implementation written in C++. According to https://groups.google.com/forum/#!topic/skia-discuss/joAyAl1... it was intended for straightforward/simple scripting, and for some reason promptly got deleted shortly after being brought up.
[In case you haven't noticed yet, the official Skia repo is at /google/; I've linked to /servo/'s obviously-old copy above, but there are more than enough forks and clones for the code to be findable elsewhere too.]
I kind of like this implementation because it's both incredibly context-specific and generalized all at the same time, and it isn't Bare Metal Implementation #79,513. And it's C++, too, although this is not (overly) abused.
I also started back when things were simple (BBC B with interpreted BASIC, no GUI).
Having transitioned to modern development, I think the bare-bones system was good for self-learning. But for the last 2 decades I've worked on systems where many layers were hidden from me (Windows, RDBMS, Salesforce etc) and most of the time it doesn't matter. I very rarely need to learn anything about low-level Windows stuff. With SQL Server you need to learn a bit about query plans and how to influence them, and where to put indexes. But mostly, you can ignore lower layers and just let them get on with what they do. And that's the whole point, isn't it?
One huge exception to this is the modern web stack, if you think of the browser as the bare metal, with HTML and JS and CSS as some sort of machine code. On top of that you have Content Management Systems and PHP and so, so many JS frameworks. And whatever you do, soon enough you need to get down to the HTML/JS/CSS level. So that's the modern equivalent to our experience of the old days - no matter how many frameworks and CMSs we throw at the web, everyone still has to know their HTML/JS/CSS to get anywhere.
Perhaps one day we will escape from that and people will reminisce about angle brackets and escaping ampersands.
However to have an understanding of things from top to bottom is definitely a benefit. People get silo'd in their various layers and lack comprehension of much that's going on below (or above) them.
It can make you a more rounded developer, to have a good idea of the whole stack, even if it's not always in-depth and some of the mental models used aren't strictly true.
Someone needs to know how it all works, someone needs to build the kernels, the low-level libraries, the compilers, network protocols server programs etc etc.
You can certainly have a great career without them, I wouldn't dispute that.
You’re right, my generation definitely missed a lot, but I hope some day to work through NAND2Tetris. I didn’t study CS either, but a good CS/EE program would really take you through the first steps. No matter what, you can’t keep a book from a scholar. There appear to be many resources to learn this stuff if one has the opportunity. Combine NAND2Tetris with the x86-64 assembly on Ubuntu book I saw here yesterday and you should be golden, right?
Yes; IBM System 360/370, for example, was a marvel of engineering and design, with a remarkable history; so were the first microprocessors, such as MOS Technology's 6502. The best part was, you could know all of it, if you wanted, down to the transistor level - the geek's Holy Grail. With modern chips/systems you cannot any longer, not even with the Raspberry Pi.
I wish IBM's revolutionary architecture lived on in our PCs; good engineering pays off, and today's software could be more sane, as people would be learning from the giants rather than wasting time on a massive scale trying to reinvent the wheel.
(and don't even get me started on WS/REST over HTTP over SSL over IP).
Learning stack by stack, one generation at a time, let us old graybeards digest the whole thing. But starting from scratch now? Not even with a ten-foot pole.
I missed out on all the stuff you mentioned and really want to learn it, but I am grateful that I get to use all this stuff that’s developed. Instead of spending time learning the above listed fundamentals, which would have been revelatory in its own way, I have the opportunity to use Java on Kubernetes to build and operate globally whatever Internet software dream comes to mind. It’s not the end all, and I am crippled by lacking fundamentals, but it’s so much more than before and I am grateful to have the opportunity to have these tools.
It's just that I have that warm, reassuring feeling that if I have to dig down, I'll be able to. Too bad I almost never have to; abstraction works way too well :-)
I'm sure one can be a very good programmer without knowing the assembly stuff underneath. Even myself, I don't know exactly how a CPU works (I mean, I get the logic of it, but I wouldn't be able to make one from scratch) :-)
Around when the Voodoo cards came along (1996?) things got really interesting again. Also about any keyboard that shipped was decent still, Silicon Graphics and Macs (even the beige ones) were really nice looking but expensive machines. Had a SCSI drive hooked up to a PowerBook 1400, that was pretty much as cool as having an external Thunderbolt SSD now.
The ASUS P2B of course was amazingly stable for its time; however, if you had a dual Celeron 300A @ 450MHz you were really the king. Dial-up modems got upgraded to T1 lines or cable modems. That was a huge speed boost.
So yes the 90's were nice, but just no comparison to what happened between '77 and '87 (the year the Amiga 500 got introduced).
They might be using a modern laptop, but their screen is hardly different from those beige UNIX terminals with green phosphor screen and VT100 keyboard that I had to use the first couple of semesters.
Even AT&T later moved into it with Plan 9 and Inferno, with ACME.
All to be found on Interlisp-D, Smalltalk, Mesa XDE, Mesa/Cedar environments.
Yes - local, very local.
And to be fair, they are right. There was a level of UART programming and video logic control that I never understood, because they required knowledge of analog circuitry that was beyond me.
As I grew older, I realized that every generation has people who say that about the next one. Sure, car engines in my great-grandfather's day were so simple that anyone could take one apart and really learn how they work. But I prefer the benefits of modern engines.
BTW, the first real OSes were in the 1960s. I do not think you started with computers in the 1950s. And how is it that "all the code that was there was what you wrote (copied) yourself" when you were "dropped right into the interpreter"? - who wrote the code for the interpreter?
Typically you had 16K of ROM BASIC; if you were really lucky, on a half-decent machine (BBC or so) you would have something more akin to today's BIOS in another 16K. So there was some code there, just not easily accessible or replaceable.
I tend to agree with the feeling, but, on the other hand, only the really worthy will choose to dig in and really understand how a computer works.
Separating the wheat from the chaff.
As someone who was there and did that, I want to refute that assumption :) I typed in many programs, and there's not a lot to learn, because most of them consisted of many pages of DATA lines and a small loop that loaded those machine instructions into the home computer's memory and started the program with a USR call (please note that I'm making this explanation many years after the fact). I guess there are not many teenagers who are able to debug this kind of program by looking at the actual opcodes.
Sure, you could learn a lot from typing in regular BASIC programs, but those weren't the most interesting games, as far as I remember. The most productive learning experience was the interactive and exploratory programming shown in the article.
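For readers who never saw one, the typed-in listings being described boiled down to the BASIC idiom FOR I=0 TO N: READ B: POKE START+I,B: NEXT, followed by a SYS or USR to jump into the bytes. A Python re-enactment, with bytes that are genuine 6502 machine code (LDA #$01 / STA $D020 / RTS) chosen purely for illustration:

```python
# Python re-enactment of "FOR I=0 TO N: READ B: POKE START+I,B: NEXT".
# The bytes are real 6502 opcodes -- on a C64 this routine would set the
# border colour register at $D020 and return.
DATA = [169, 1, 141, 32, 208, 96]   # LDA #$01, STA $D020, RTS

memory = bytearray(65536)           # pretend 64K address space
START = 0xC000                      # a commonly free load address on a C64

for offset, byte in enumerate(DATA):
    memory[START + offset] = byte   # the POKE

# Magazines often printed a checksum so a single typo could be caught
# before you ran (and crashed) the thing:
checksum = sum(DATA) % 256
```

One mistyped DATA value and the checksum (or the machine) told you something was wrong, which is exactly why debugging these by opcode was a teenager-filter.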
But still, I learnt a lot by modding BASIC games I typed in from books, then later learnt a lot of assembly as I tried to mod those assembly-language programs I typed in as hex codes. You'd print the whole program out using a good ol' dot-matrix printer (took a while... not because there was that much code, just that printers were slow...) and then puzzle over the assembly trying to work it all out, drawing lines trying to work out all the jump sequences etc. Later my Dad and I had a lot of fun hacking tape-based games so we could put them on our awesomely huge (doubly huge if you cut a notch and flipped them over) and speedy 5 1/4 inch discs.
That varies a bit by platform. Most 8-bit machines had relatively poor BASICs and no assembler, so BASIC code could be a mess of line-number targeted GOTO/GOSUB statements and anything that needed to be faster than the higher-level language interpreter could manage needed to be pre-assembled and just pushed into the relevant RAM locations by a BASIC loop as you describe.
I had the luxury of learning on Acorn machines, initially an Electron then a BBC Master Series. Their variant of BASIC had useful features which implementations on other 8-bit devices lacked: long variable names (IIRC the C64's BASIC allowed long names but only used the first two characters, so PersonName and PermitCount would clobber each other; BBC BASIC respected up to 40 characters of a name), named procedures and functions, and a built-in multi-pass assembler. This made BASIC code easier to read, understand, and modify (if written using those features, obviously; some magazines had content targeted at multiple architectures and stuck to the lowest common feature set), and made it possible to see any machine-code logic so you had a hope of pulling it apart usefully too. I learned a lot starting from nuggets in type-in demos and using them as starting points to experiment and read a little deeper, not just about those specific machines but about computer architecture more generally. I don't think I'd have the same level of competence or knowledge that I have now were it not for that start.
On the Beeb most listings were BASIC with all the logic there and DATA statements just for in-game assets such as sprites or room descriptions in a text adventure. All the Usborne books are online now; check them out: https://www.raspberrypi-spy.co.uk/2016/02/usborne-releases-1... Many happy memories inside!
Most of the assembly and other DATA/POKE stuff is less than relevant nowadays as hardware has improved and a lot of magic is already done in firmware.
The form of "type it in, verify and run; explain mistakes and correct solution" is still didactically valid.
I'm pretty sure there are programming language interpreters out there with simplicity of BASIC, including BASIC itself.
On MS-DOS, you had a variety of commands built in, and extensibility via BASIC (or more) too. Programs could be written as batch files.
On Unix-likes, the primary interface was the shell (sh, csh, ksh, or later POSIX sh), with a set of built-in commands. If you wanted something more advanced, programming languages and interpreters were easily available, typically pre-installed.
It was the main interface: magic words. We have regressed to pictures now, and most extensibility has been hidden, removed, or made very indirect.
Hardware doesn't come with programming manuals anymore as it is deemed too complex (yeah right) and expensive.
It would be great if your GPU came with a description and software to write and execute shaders, for example. Or if your PC came with instructions on how to programmatically reboot it, start it, or access its sensors. How to access EFI services. What the machine code looks like. Etc.
It doesn't come with manuals because it is considered a trade secret. Commodore had to lay the hardware bare because it wasn't fast enough to program in a high-level language, nor did the high-level languages fully support its hardware. The hardware of today is so fast that nobody thinks anything of programming it through a driver or a shared-object library in a high-level language. It's still wrong not to lay the hardware registers and their functionality bare (yes, NVIDIA, have you forgotten your SGI roots?).
It takes time to produce useful documentation though.
I've read a forum discussion where an AMD employee, when asked whether it was possible to have a GPU's firmware without DRM (or something like that), answered that it would probably be commercial suicide. (OK, I'm a little tired right now and I'm not recalling all the details of the conversation, but I can try to dig it up... ^__^;)
25 years later I bought the real thing.
The code was out in the open and decipherable. Great way to learn in that particular environment.
Atari with their player-missile graphics made immediate sense to me though.
One of the things that mentor emphasized was writing a disassembler soon after becoming familiar with a given CPU.
However, that would be after the fact. Some teens got closed computers, like Atari machines. By today's standards, they are wide open. Back then, they were mysterious, but capable, with spiffy graphics and sound chips. The key thing was how users were nudged toward or away from the guts of it all.
Other teens got, say an Apple 2. On that one, you got schematics, a commented ROM listing, and the whole machine was open, software driven. It screamed, "hack me!" The whole damn computer was a clever hack!
Again, my peer group were Apple people, and were writing machine language based on the snippets found in the ROM, along with great tutorials found in the grocery-store magazines. (Which is where I got my first disassembler, BTW. Thank you, COMPUTE!)
This stuff varied extremely widely. One town would have that spark of learning, a particular school, or club maybe. Another one, nothing. Just gaming, the usual.
Circling back, yeah. Those DATA statements were known to be programs. Lots of us hand-disassembled them and either wrote them down (I have a couple in some papers somewhere, I am sure) or, my preference, added REM statements with assembly code in them for later. Oh, how I would have loved BBC BASIC!
The Apple was cool: it had a monitor, a mini-assembler, and a disassembler. I wrote a lot with it. Still a mess compared to what I understand the BBC had. I want to go and run one.
Other machines varied, as did people and what books and or other resources were available to them.
However, because I was young and still had free memory in my brain, typing in those POKE data parts made the patterns stick. When I bought my first asm book, I finally understood what those patterns of opcodes meant. I never programmed in assembly mnemonics on the Z80; I always used direct opcodes, and still do. It is faster for me.
The most advanced one I found was a football (soccer) simulator that worked like a text-adventure game, showing you the development of the match by printing the sportscaster's words at regular intervals. Being able to understand it and modify it was a milestone in my growth as a future programmer.
Anyway, I learned a few things from those listings early on.
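Those memorized opcode patterns can be made concrete. The byte values below are genuine single-byte Z80 opcodes; the lookup itself is just my sketch of what "reading hex like assembly" amounts to:

```python
# A tiny table of real Z80 opcodes -- the kind of patterns that keep
# recurring in hex listings until they stick in your head.
Z80_OPCODES = {
    0x00: "NOP",
    0x3E: "LD A,n",    # followed by one immediate byte
    0x76: "HALT",
    0xC3: "JP nn",     # followed by a 16-bit little-endian address
    0xC9: "RET",
}

def disasm(byte: int) -> str:
    """Name one opcode byte the way you'd eyeball it in a hex dump."""
    # Anything not in our tiny table is shown as a raw data byte.
    return Z80_OPCODES.get(byte, f"DB {byte:02X}h")
```

After enough typed-in listings, spotting C9 at the end of a routine or C3-lo-hi jumps became as automatic as reading words.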
I pity sometimes the young people who are studying computer science nowadays. I studied CS in the 1990s. Compared to a modern curriculum, my courses looked very basic: five years on functional programming, compiler construction, networking, databases, etc. No P2P, cloud computing, mobile applications, IoT, etc.
Now, most CS studies have to rush through the basics in three years, followed by two years where the students have to learn all the tools and techniques that they need for their professional career. I had an entire course ("Advanced topics in databases") on the efficient implementation of indexing and query execution for databases. Today's students have to learn in the same time: a shortened version of the old course PLUS nosql, column-oriented DBMS, DHTs etc.
If your university is good. I took a computer science bachelor's over the past three years in the Netherlands. In my case, I fear that I learned about as much about databases as your basic course taught you, and maybe even less. In particular, the implementation of databases wasn't even touched on. I definitely learned nothing whatsoever about nosql DBs, column-oriented DBs, DHTs and other cool things, though the existence of the first two was hinted at.
Luckily I can usually figure out what I want to know without help of a teacher, but there's a definite difference in level of difficulty and amount of content between universities today.
That's probably one of the most valuable things I learned at university.
I was just curious, because not all countries do it.
It's 1983 and your dad decides to buy a personal computer. The C64 is too expensive at $595, so he buys a VIC 20 (introduced at $295, probably $200 by '84).
You plug it in to the television downstairs and, after mentally tuning out the hum from the RF converter, you start to enjoy gems like GORF and Radar Rat Race (https://m.youtube.com/watch?v=1LRkON9XTOk).
You try to read the included user manual, with its helpful computer chip themed cartoon mascot explaining things like strings, but none of it makes sense because you're eight. You and your sister spend hours reading aloud and typing in bytecode listings from then-popular computer magazines. Imagine spending an hour transcribing hex codes, only to type the run command and have it crash.
If, on the off chance you found the errors and got the program to run, you'd find it was wildly over-hyped in the description. You didn't want to waste your work, so you'd save it to the tape drive that used audio cassette tapes.
A few days later, you'd try to load the program from the cassette tape and find that it was corrupted. I never once got a saved program to load from that thing, it only successfully read commercially published software.
The nostalgia is largely inaccurate. It was an era of immense frustration. And I never saw any C64 television ad, probably because we only had three television stations and no market to speak of for personal computers.
Incidentally, there's no way the kids in Stranger Things would have had those high end walkie talkies. They'd have had the crappy ones that only work to maybe 100 yards and emit static nonstop.
One long-form magazine program I tried never worked right, even after finding and fixing some bugs myself. But we neighborhood kids had hours upon hours of laughs plugging our names and bad words into the sentence generator program's DATA lines.
This is basically how/why I learned how to program. I had a knock-off Apple II clone (Franklin Ace 1000), and I'd type in BASIC games. I was terrible at typing and often made mistakes. I learned the basics of debugging when trying to get those games working. In a lot of cases, just getting the game working was far more fun than actually playing it.
There was a progressive complexity that happened back in the late 70s and early 80s such that people alive today who still code and learned back then have taken the ride from machine language to multi-gigabyte stacks.
We just kept adding stuff and having to make sure we could be functional in all of it. Not an expert, but functional. It was standard practice on my commercial programming teams to decide what everybody wanted to learn on a new project before starting. (And these were high-paying projects. We always left with happy customers).
People were jack-of-all-trades. Most everybody was. You had to be.
What am I seeing resonate, at least as far as I can tell? The inability to understand what the hell is going on and work with it. If you've gotten a C++ compiler compiling a hellacious codebase in DOS, a Rails configuration ain't nothing.
I see what are supposedly senior programmers walk a bit off the happy path on a framework and they're lost. Not only are they lost, they are insecure, afraid, embarrassed. There's nothing wrong with these people. There's something wrong with the way we're training and staffing them.
Fifteen years ago I was still coding commercially, having a blast. Talking to a recruiter one day about various projects, she said, "You know, you're one of the last true general consultants."
There may be ten thousand of us. Beats me. But her general appraisal was correct. There is a drastic and complete change between the way coders used to relate to technology and the way they do today. It's not tech. It's mindset.
They largely focus on buzzwords and where you went to school. Seems like they even often fail the companies they work for by designing poor matches and wouldn’t recognize experience like you’ve noted as being worth anything. Then again, maybe I’m just in a weird bubble here. And I’m just generalizing—they’re not all the way I’ve described, but many (possibly most) are.
How many years experience of "the XML" do you have?
I'm like, well, uh, that depends. 95% if you include `import sklearn` and basic numpy and pandas operations. 0.1% if you consider 'real programming'. 'So what %?' ...
That aside, there is a real problem in the tech industry when it comes to developing talent. Universities teach mostly theory and rely on the industry to teach best practice and 'production-level coding', but once graduates get to the industry, they get no mentorship.
Companies don't want to spend time training someone who will leave in 2 years. However, if everyone ran solid mentorship programs, the entire industry would benefit.
If you want the simplicity of the Basic interpreter, just fire up a Python console, and you are in a much more comfortable position to learn programming and computer science than you were back then. But it is still a long way to get to someplace useful. In the 80s, by owning a VC-20 and programming Basic and Assembler, I was pretty much already at the edge of something new and powerful.
I think that in the 1980s, the threshold of meaningful was a lot lower. It wasn't just that you needed less of a stack to be able to do things; it's that so much less had already been done.
And those things hadn't been done because they were too hard to do with the existing tools. But we suddenly had a new, more powerful set of tools, and we could do a bunch of things that couldn't be done before. (I remember, in high school, moving from punched cards, where it took a week to see the results of a change, to BASIC on a TRS-80, where it took a minute. That changed the world quite a bit...)
In some ways that is true; in many ways that is false. As recently as ~2005, if I didn't know Linux and wanted to learn it, I needed to purchase hardware. Installing and messing with binaries/configurations ran the risk of making a machine inoperable. Today, it's very easy to spin up something in AWS or a VM. Want to experiment with TensorFlow on a large GPU box? $4/hour on demand.
Also, we have lots of APIs and libraries we can just import, and I think that's a great thing. Writing a website where a login was needed, but wasn't ultimately what you wanted to explore, used to be hard; now it's gem install devise. Making a basic website look pretty enough? Bootstrap. Want to compare random forests, xgboost, and linear regression for predicting home prices? from sklearn import ...
I know all these things aren't doing 'new' things, but they enable a person to not worry about the details and do something cool. I think the volume of interesting stuff we see coming from blogs and articles on Hacker News is a testament to that.
If you had kids do that now, they would probably say, "It's not a real computer program, it's just text!"
I don't and I grew up in the era of 8 bit machines and kilobytes of RAM etc. I fully recognize that it was fun to have those constraints and we learnt a lot about dealing with them (using less memory, using fewer instructions) but I don't buy that that really matters for most programmers today. They'll have other things to worry about: e.g. debugging distributed programs.
Sure, if you want to do microcontroller work then that sort of thing is useful, but literally nothing stops a "Full Stack" programmer picking up an Arduino and programming it and learning something new.
I love playing with those environments (http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-...) but every day I see people with different experience of computing from me and I don't feel that I have an advantage over them: they often know about things I'm totally ignorant of. It's true that I'm very good at debugging horrible low level things, but it's also true that I'm not good at imagining the state of a system with hundreds of micro-services.
I feel you. Thinking across multiple levels of abstraction is one of the hardest things in engineering, especially if, over the years, you've overspecialized in one specific stratum.
The final report is available here:
I haven't kept up with subsequent research the Institute has produced. I do know they had an aversion to producing software artifacts and were far more concerned with the written reports (which in some senses restricted the ability of amateurs like me to play with the interesting output by the group). I did play with OMeta (a meta-parser) and the COLA / Id code - which was enlightening!
(Core emulation by Thomas Skibo, interface and IO enhancements by yours truly.)
10 PRINT "* ";
20 GOTO 10
It is not a blessing but a curse. If you understand the computation you will spend so much time wondering “WTF is this (modern) computer doing” when it fails to do something simple that you know an older machine with a tiny fraction of the power could do easily.
Not to mention, bloat finds a way. See e.g. smartwatches. There was Pebble, which managed to create a good, bloat-free piece of equipment. But the market was soon flooded by watches running... Android. Even navigation units, bike computers and other sports devices run Android now.
There’s no actual need for a page to be 10 MB to deliver 5 KB of text and a 50 KB JPEG. Especially not on mobile.
Nevertheless it ran BASIC and my friend and I learned to program on that machine. Subsequently we honed our skills on the Apple IIs at school, and later by hacking on the school's PDP-11. In 1981 I built a simple Z-80 computer, roughly equivalent to the TRS-80 (note: "80"). At that time doing something like this literally caused newspaper reporters to come interview you. It was nice to learn these things sort of organically, albeit perhaps not optimally; I've made my living in software development and to this day have never taken any class in programming.
Of course that moment of novelty really was brief. Once the IBM PC came out in 1981, computers proliferated rapidly and they no longer seemed so special. Nevertheless I do think that our "Generation X" had sort of lucky timing with computer culture, since we were also just reaching working age when the Internet revolution hit (my first job out of grad school came from an ad which literally said "the Internet revolution is here and you can be part of it"...a small part to be sure but still part!)
But anyway every time period has its pluses and minuses! If you want to know how it really felt to grow up during that time, the truth is that we envied the 60's generation hugely and thought that everything we had was just kind of a pale imitation of what they did (for example, music). Take a look for example at the book called "Generation X", which is pretty dystopian, and really did express how many people felt. There's always something new happening!
I will say this though. I disagree with one word in the author's text. He said the machines of the time were "constrained." They weren't constrained at all. They had limitations just like today's machines do. But we weren't aware of the limitations because we weren't time travelers coming from the future with more computing power in our pockets than most corporations had in 1983.
In fact the exact opposite was true. The computer industry was bursting at the seams. It was exploding. It was not constrained at all.
Then they cloned the IBM PC and the rest is history. Every machine you're reading this post on descended from those.
It's interesting how, rather than coming with an instruction manual nowadays, it's assumed that everybody knows how a PC (including phones) works, especially as they are more complicated now - just hidden beneath a veneer of GUI.
As the author of the article said, "In 1983... home computers were unsophisticated enough that a diligent person could learn how a particular computer worked through and through."
OTOH, the experience of learning C and having to actually write video and network drivers (or their barest elements) because there wasn't a web you could download a library from...yeah, that probably actually did make me a better programmer. Having had the experience of writing a little bit of actual assembler, even just the old "MOV AX 10; INT 13" (am I remembering that right?) does give me a sense of connecting with the machine more deeply than someone who grew up with the internet.
When I started in the mid to late 80's as a child, we had a Commodore 128 and it was Basic or nothing. I also remember typing in those program listings, getting annoyed when they didn't work, realising I'd transcribed something wrong, and then, once it was working, making changes.
We then moved on to a 386 and I gravitated towards QBasic (and later TurboBasic, because you could use the mouse and compile to an exe).
With my son, I ended up searching high and low, umming and ahhing, and in the end I found the spiritual successor to those computer magazines with program listings in a Python for Kids book that takes them step-by-step (a little like the Commodore manual did) introducing concepts and eventually getting you to build a game.
I think a lot of the wonder about computers that seems to have waned is because we're all older and not looking at it through the same lens. True, computers are far more ubiquitous now, but learning that you can tell one what to do and are only limited by your imagination is still incredibly powerful to a child.
Today, the set of building blocks is unlimited. The challenge is to find the right building block for what you want to do, using the infinity of resources available to you (all the APIs, libraries, Google, GitHub, StackOverflow, blog posts). You're drowning in complexity, even if what you want to program is straightforward. And you are constantly mindful of writing idiomatic code, always wondering if there's a better way to do the same thing, which is normally a good thing but can be time-consuming and get in the way of the fun part, which is the problem solving.
I'm a bit confused because you compare that experience to modern programming as a software developer. I think you are thinking of the difference across 40+ years, rather than the difference of kid vs. adult.
I mean, if you were programming on a Unix machine in 1983, or a VMS machine, or an IBM mainframe, you would have far more than a single manual.
The point I was trying to make was about the distinction between programming with a limited set of building blocks vs programming with a large and open-ended selection of libraries. They don't require the same skill set, and one seems more fun to me than the other.
Mechanical engineering is a lot harder: there are so many options to consider, new options come along all the time so you're always falling behind, and cost plays a much larger and - I think - more boring role in the process.
I think that is analogous to the point you are making with computers.
Almost all games and programs in the 8-bit world were written in assembly. The BASIC interpreter was just used as the bootloader to load and run these programs; BASIC was in fact the shell interface and the toy language, but the assembly programs essentially sold the computers.
If you wanted to do something serious on these computers, it would have to involve assembly because of BASIC's slowness. And there were newbie books about assembly language, assembly libraries, and tons of assemblers and debugging utilities available everywhere.
No, you could do all kinds of useful, productive things in BASIC, which were still orders of magnitude faster (and also still more accurate) than manual processes you were replacing.
Try to use the EFI shell? Oh wait, your machine doesn't even have one. U-Boot is not even programmable...
Luckily there are many JS C64 emulators online, like this one: https://virtualconsoles.com/online-emulators/c64/ so it's a breeze to get started.
And of course her first program was
10 PRINT "HELLO PETRA"
20 GOTO 10
She also tried Scratch and liked it, but to me, in order to write code you need to learn to "write" it.
I remember for one of these platforms, there wasn't really a great commercial release scene - but there was a great home-hacker scene, with type-ins from magazines and so on ... and I remember having quite a small library of routines and utilities, saved on cassette tape, that could do various things - fast scrolling, tape copying, UNDELETE commands, and so on. But now these things are lost to time (still out there on my cassette collection, wherever it is these days) .. and we have to re-create them.
So now one of my favourite aspects of the hobby is the reconstruction of all the 'cool utilities and stuff' that made the platform great in the 80's. It's not so easy! For some of the obscure platforms, we really have to dig deep .. fortunately though, 8-bit computer magazines seem to have been pretty well preserved on the Internet. It's just now a matter of going through them, spotting the gems for the obscure platform, and re-typing it all in, lol. ;)
But that said, having a variety of 8-bit machines at hand is really a special treat. I always wanted a Spectrum machine, and now it's finally affordable. ;P
The site contains all the listings from the Hebdogiciel magazine, re-typed and ready for download. The magazine was a must for all French-speaking computer kids back then.
No doubt other readers of this thread will find it useful.
My friend also had a Commodore 64, where even starting a game required knowing some commands (LOAD "$",8 to read the disk directory, LIST to display it, then LOAD your game). Computing and coding/interfacing with the computer was more involved but simple as well, and led to lots of fun learning to code and create.
Between these two machines (Apple II and Commodore 64) I fell in love with coding and games. I never had an Amiga, which was a bummer, but these were enough to inspire kids to be creators with code.
In high school my teacher Mr Isles was big into the internet and media computers, we were watching TV on a computer, playing games like Scorched Earth, making games in pascal while browsing the web, it was amazing and moving fast.
Flash had the same fun factor as those in the late 90s to around 2006-ish, before the mobile phone took over. Flash communities were very special for both designers and developers; it was powerful that either could create games, interactives, experiences and Amiga-like demoscenes.
Even when mobile truly arrived in 2007, when the iPhone and smartphones upped the game, I was blown away when OpenGL was on it, and I knew immediately that it was a new handheld gaming market that I had to get into. Mobile existed at that point and I was making games on Windows Mobile, but everything changed with iPhone/Android in the ability to create. That fun creation market is still going on today; now we are on to fun interactive tools like augmented reality and location-based games.
There are really inspiring innovations happening all the time in each generation but it does seem like the fun platforms and really interesting ones are led by gaming, or apps today, and areas approachable by designers and developers alike to create interesting games, apps, interactives and the platform makes it fun.
I think it is really important for platform designers/developers to make their platform approachable and simple, and to reduce complexity, so that it can attract people interested in creating. I think an engineer's job is to create simplicity from complexity: to design platforms smartly, with senior skill, for the junior. In some areas today we are failing at that due to heavy specialization. Every truly successful platform that really hit and moved innovation/creation forward did exactly that.
(I started on the PET and later the C64, and had so internalized this approach that I was baffled when I first encountered systems where this wasn’t the primary approach.)
I think there’s actually a lesson here for API design today: the power of the “everything’s-a” approach. Then it was everything’s a memory location. But generally the everything’s-a approach allows for a flat abstraction space with low cognitive overhead and high inherent/automatic composability, leading to short learning curves and high productivity.
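The "everything's a memory location" model fits in a few lines. A toy sketch in Python: the PEEK/POKE names mirror the BASIC keywords, 53280 is the real C64 border-colour register ($D020), and everything else here is illustrative:

```python
# Toy model of the flat "everything's a memory location" machine.
# Screen, sound, I/O -- all of it was just bytes at known addresses,
# and PEEK/POKE were BASIC's window onto that single address space.
class Machine:
    def __init__(self, size=65536):
        self.mem = bytearray(size)  # one flat 64 KB address space

    def peek(self, addr):
        return self.mem[addr]

    def poke(self, addr, value):
        self.mem[addr] = value & 0xFF  # bytes only, like the real thing

m = Machine()
m.poke(53280, 0)  # on a real C64, POKE 53280,0 turns the border black
print(m.peek(53280))
```

Two primitives cover the whole machine, which is exactly the low-cognitive-overhead, high-composability property described above.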
"Today, (almost) everyone knows how to use a computer, but very few people, even in the computing industry, grasp all of what is going on inside of any single machine."
An astute observation: I see a lot of complexity in IT today which obviously comes from not having a clue how the hardware functions and how it's efficiently programmed. The complexity, performance and resource hits grow with every layer of abstraction. Convenient for those who write software, very bad for users who then needlessly suffer.
I could walk into a computer store at the mall in the early/mid 1980's, spot a model of microcomputer I had never encountered before, and type:
10 PRINT "Oh no! Something is going to blow! Run!"
20 GOTO 10
Why did programming get so complicated?
I think drilling is something we do in almost all technical skill development (music, art, athletics, vocational) and I wanted to bring the same thing to my CS courses - so I started requiring typing exercises as one of the weekly assignments. These aren't just "typing a for loop 10 times"; they come with additional context to give students something they could use as a template for their programming exercises. To combat copy and pasting, I just made the code an image. In my first link, you'll see an example of using a regular expression to validate addresses. After completing this, the students would then be required to complete some Q&A exercises as well as traditional programming exercises where they needed to design functions that validate phone numbers, (a limited scope of) email addresses, and Social Security Numbers. The objective of that week was to get them familiar with regular expressions, not with finding a StackOverflow link that teaches them how to implement regular expressions.
As the article says, this is what we did in the 80's. That doesn't make it better, it just makes it how things were done "back in the day..." However, K. Anders Ericsson states that early specialization is often a key determinant of future mastery, and that deliberate practice refines areas where an individual struggles and may be unenjoyable (see my older comments on grit/perseverance). Likewise, syntax errors are one of the first problems novices face. By completing typing exercises, the learner does not need to worry about using problem-solving skills, which they may still be struggling with, just the correctness of the typed characters. Thus, typing exercises give the learner deliberate practice resolving a simple but major issue. Additionally, typing exercises remove students' ability to just "copy and paste" before using example code. With syntax errors mostly resolved, the student can then focus on problem-solving rather than on where the semicolon should go.
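The kind of validator the students end up writing can be sketched briefly. A hypothetical example: the exact formats required in the course are assumptions here, and these patterns deliberately skip edge cases (e.g. unbalanced parentheses, SSN range rules):

```python
import re

# Assumed formats for illustration only -- not the course's actual spec.
PHONE_RE = re.compile(r"\(?\d{3}\)?[ -]?\d{3}-\d{4}")  # 919-555-1234 or (919) 555-1234
SSN_RE = re.compile(r"\d{3}-\d{2}-\d{4}")              # common AAA-GG-SSSS layout

def is_valid_phone(s):
    # fullmatch requires the whole string to match, so no ^/$ anchors needed
    return PHONE_RE.fullmatch(s) is not None

def is_valid_ssn(s):
    return SSN_RE.fullmatch(s) is not None

print(is_valid_phone("(919) 555-1234"))  # True
print(is_valid_ssn("123-45-678"))        # False: last group too short
```

Typing something like this by hand, rather than pasting it, is exactly where the syntax-error practice happens: one stray backslash or brace count and the pattern quietly stops matching.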
I currently have a SIGCSE paper under review, but the gist of the paper is that students who voluntarily completed typing exercises performed better in their class than students who did not. The students may have just been more motivated, and that may be why they scored higher, so there is a limitation to my study. I could require it, but then designing a control group that would receive the same amount of learning would be difficult as well.
 https://research.csc.ncsu.edu/arglab/projects/exercises.html (the Heroku link is currently down as I've recently made changes to the live version)
 https://www.youtube.com/watch?v=hHebXvoHue0 (Rener Gracie is a character, but listen to those first few minutes)