Learning BASIC Like It's 1983 (twobithistory.org)
284 points by fcambus 9 months ago | 160 comments

I'll share my own experience, since I started computer programming in 1982 (on a 6502 based computer called Oric-1), then worked in the game industry for 18 years.

What I remember from these early times:

1) We had only one television at home, so typing in programs required access to the TV. This is why I spent a lot of time analyzing programs BEFORE typing them in, since I didn't have a lot of time to type them. This probably taught me a lot!

2) I always wanted to improve the programs I typed, so I spent a lot of time optimizing them. This also proved useful later ;-)

3) Some programs included mysterious hexadecimal characters. I tried to find some documentation about that. It was difficult, because information was scarce and there was no Internet. One day I had an Aha! moment, and I discovered the 6502 that day. This was useful, since I went on to write quite a lot of games in 6502, and it got me my first job in the game industry in 1985.

4) In France, there was a beautiful newspaper called Hebdogiciel. It contained programs for all kinds of computers. I tried to convert these programs to my computer, and this also gave me pointers to handle conversion between Basics. My first job was about converting Basic programs between various computers (Thomson TO7 <> Exelvision).

5) Everything was so new and exciting! Nowadays, I don't feel this kind of excitement. Everything is so easy to put in place. At the time, we only had 48 to 64 kilobytes of memory. Everything was a challenge. The computers were not designed for writing games, but games were doable.

> 5) everything was so new and exciting ! Nowadays, I don't feel this kind of excitement. Everything is so easy to put in place.

I'm probably the same age as you (started in 1982 on a Philips P2000) and feel the same. What wrecks me every time is the thought how many new things we discovered and did in a timespan of just a few years back then (the lifespan of various home computer models). Today, 10 years pass and all we really get is somewhat faster CPUs, slightly bigger memory, better graphics. And it's not worth it to discover any specific tricks for any platform (like you could on home computers to get more colors / bigger screen etc.).

4) "In France, there was a beautiful newspaper called Hebdogiciel." <== I agree with you... what a magazine... I learned so much from it, without the Internet.

5) I don't agree, I still find so much excitement today in what is going on around us. Sure, we have so much more power, but there is still challenge today. I'll admit it's not the same, but there are still a lot of challenges.

The lack of access to documentation was hard, very true, but then it also meant reverse-engineering was the only way to proceed, and from a certain angle, that was a good thing.

Actually, the documentation was amazing in retrospect. The Sinclair Spectrum user manual, which started with how to connect the computer to the TV, then proceeded to teach BASIC, had, in the appendix at the back, the most fascinating table. IIRC, it was a combined ASCII and Z80 opcode table, from which I pretty much taught myself assembler (and committed to memory the most common opcode hex encodings - and I still remember some of them to this day!).

This! I still have fond memories of reverse-engineering the screen-buffer encoding on my Amstrad CPC 464 by poking values directly into the buffer and watching what happened on the display (i.e. my TV). Somehow got hold of the Z80 datasheet for opcodes - but working out the relative jumps with pen and paper was a slow process. Upgraded to an Amiga around the time I realized that assemblers had to be a thing...
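For anyone curious what that pen-and-paper work involved: the Z80's JR instruction is two bytes, and its displacement is a signed 8-bit value measured from the address of the *next* instruction. A small sketch of that arithmetic in Python (the function name is mine):

```python
def jr_displacement(instr_addr: int, target_addr: int) -> int:
    """Compute the one-byte displacement for a Z80 JR (relative jump).

    The displacement is measured from the address of the next
    instruction (instr_addr + 2, since JR is two bytes long) and is
    stored as a signed 8-bit two's-complement value.
    """
    delta = target_addr - (instr_addr + 2)
    if not -128 <= delta <= 127:
        raise ValueError("target out of JR range")
    return delta & 0xFF  # encode as an unsigned byte

# Forward jump: JR at 0x8000 to 0x8010 -> displacement 0x0E
assert jr_displacement(0x8000, 0x8010) == 0x0E
# Backward jump: JR at 0x8000 to 0x7FF0 -> 0xEE (i.e. -18)
assert jr_displacement(0x8000, 0x7FF0) == 0xEE
```

Doing this by hand for every backward branch, in two's complement, is exactly the kind of thing that makes you appreciate an assembler.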

Yes, I was coding in hand-assembled hex machine language for a couple of years before I had access to an assembler. These days, when an IDE drops into assembly and all my 20-something colleagues duck for cover, I rub my hands together in glee...

I was always bummed when listings had huge fields of DATA statements. In my mind, it was a little bit like cheating. Like using cake mix instead of ingredients.

I created a series[1] (pardon the plug) to make the software I had wanted to have when I started: simple, understandable, frugal, almost elegant.

Maybe, it will bring some nostalgia back!

[1] https://latenightsnack.io

I remember the Oric1. I used to hang out in the local computer shop, reading all the manuals.

Dragon, Oric, Commodore, Amstrad, Sinclair and BBC.

I already had a Sinclair ZX81.

The article really captured the exploration and discovery of that era.

On a more amusing note, my mate wrote a utility which we put on a floppy disc. We would go to shops and boot it, and it would copy all the roms. Good times.

I started with the successor of the Oric-1, the Oric Atmos. I had a similar experience as yours, except we had multiple TVs. Also, I don't know if my father bought it separately or if it came along, but on top of the standard manual in french (we're french), we also had a book which contained, among other things, the full 6502 instruction set (in english), which was very valuable. IIRC this is what it looked like: http://www.48katmos.freeuk.com/oricman.jpg

I have fond memories of this computer. I absolutely loved its keyboard.

Yes, I had a similar manual (I'm french too).

The problem is that the manual contained only the set of instructions, without any explanation. My Aha! was understanding how it worked.

I did write a couple of games on Oric/Atmos, but quickly changed to Atari ST (I had one of the first 520ST with its developer manual).

Even before Hebdogiciel (or maybe at about the same time?) there was "Jeux et Strategies", and it came also with BASIC listings for adventure games. Mind you, I don't recall ever getting one to work...

Ah, “Micro Adventurer” was awesome for the same things in English. Cross-platform BASIC code; I did a lot of work based on an article about more natural Adventure command grammars. So much fun!

I came across an old copy of this when I was in grade school. Early aughts. Finding those books may well have put me on the path to where I am now (embedded hardware eng)

> Mind you, I don't recall ever getting one to work...

There were many publications like that, and often they would contain broken code. I sometimes wondered if it was on purpose, but it was actually a great opportunity to go beyond just typing the code in and running the program: you had to understand the code and debug it.

Similar experience, started coding around 1986 with a Timex 2068.

We had Input magazine, the Capital's newspaper computer section on Friday, Crash and from our neighbors Microhobby, which helped to improve my Spanish as well. :)

Hebdogiciel.... Totally!

I cannot agree more. I started when a computer did not have an OS, let alone 1,000 applications.

You got dropped right into the interpreter. There was not already a GB of OS loaded that you had a hard time learning. All the code that was there was what you wrote (or copied) yourself.

The processor & interpreter were about as fast as you could think, so it was easy to follow and step through (mentally). Reversing to assembly was the logical next step, and since the programs were small, it was easy to learn & memorize.

After years of coding yourself, you'd stumble on the first OS, which consisted of the most rudimentary libraries that one could basically read & remember.

Years later still, the first rudimentary networking picked up. Slow and not business critical so again easy to experiment with. By the time I connected the first commercial network to "the internet" downtime of email for less than 24 hours was not even noticed.

I do not envy the kids who nowadays stand zero chance to ever learn the complete stack of code running on any modern device. From what I see, they are all "stuck" on top of a GUI with only the slightest idea of what happens between their mouse and the actual hardware (and even that is often not hardware anymore).

I haven't read it yet, but I think Nand2Tetris was meant to address this.

Petzold's CODE is also really good going from logic gates to microprocessors to assembly language.

I still wish you could buy something like a Pi board that has just an interpreter and compiler on it as well as a textbook and you implement a simple version of a file system, text utilities, task manager...etc.

I find myself asking the same things when it comes to the lower-level software side of things. I find this area fascinating as well, but it's all so "invisible" in modern systems.

I graduated from a CS program (BS) a few years ago, and the projects I'm still most proud of existed "lower" in the stack: implementing FAT16, writing compilers/interpreters (I still have my Brainfuck interpreter!), playing around with paging in MULTICS, and the like.

These were all very toy-like (with good reason, a semester is only so long), but since I enjoyed them I've found myself asking things like "I wonder how the process scheduler/virtual memory manager in OS X is handling [whatever I'm doing at the moment]? What does my stack look like right now? How are all of these threads communicating with each other, and what are they saying?"

You can occasionally see this in the console when something is going wrong, but usually not when the system is operating normally. When the OS is handling a heavy load brilliantly, that's kind of when I'm most impressed, and therefore most interested.

(maybe this kind of procedural output would be dreadfully boring or unreadable due to the complexity of a modern system if I actually saw it, I don't actually know of course.)

`perf top` on Linux, and doing a (really) deeeep dive on the perf and BPF APIs, may be of interest.

The possibly-incorrect impression I get of the performance-analysis related areas of the kernel is that they're a bit siloed (in the same way that X11 is siloed, and only a very small group of people look at it), which may render them a bit functionally academic. (It's newer code, though, so there's less chance of eccentricity, FWIW.) Isolation does have benefits - with less chaos to keep up with and a smaller bus factor, the code changes less and the maintainers have more mental bandwidth to post on mailing lists :), so you have more opportunity to get a good understanding of what's going on.

I don't have the same low-level experience you do, but I do share the same interest in wanting to understand "what's really going on" - and I incidentally want to make a Linux system monitor tool that vacuums up as much information as the kernel is willing to make available. `htop` and friends surface a very caricatured picture from maybe 1-10% of the data the kernel has to offer at any given moment.

One of the original concepts for the Pi was an Atmel-based system that booted into BASIC.

Unfortunately these days you can have simplicity or you can have a web browser but not both. And without a web browser a computer feels extremely limited.

Yeah, this. The simplicity of early computers came with a cost: they couldn't really do very much. You can recapture the essential parts of that simplicity today by, for example, firing up a Python interpreter, which is just as easy to learn as BASIC and much more capable. You can live inside a Python prompt for a very long time before you really start to hit the limits. (Or, even better IMHO, inside a Lisp prompt :-) But the fact of the matter is that the technology that makes modern web sites possible is complicated, and there is just no way around that.

Re: Nand2tetris

I was going to reference the same thing. Particularly the text Elements of Computing Systems.


Not sure what to make of its downloadability, but I found a PDF: https://archive.org/details/TheElementsOfComputingSystems_20...

This seems like a nice and open platform to experiment with; strictly speaking it's not what you are looking for, but it can be tailored to the task: https://wiki.odroid.com/odroid_go/odroid_go To me it also looks like a great gift for tech-oriented kids.

Many Espressif MCU boards come with Lua already installed, and some other boards are supported by eLua. https://github.com/whitecatboard/Lua-RTOS-ESP32 http://www.eluaproject.net/

It might not exactly be the same as operating real hardware, but I have tried to address this by creating a small VM that is simple to use, understand, and modify[1].

[1] https://latenightsnack.io

Idk why it would have to be dedicated hardware that already has an interpreter/compiler on it. You could make your own mini-OS. It's relatively simple to make something that boots in qemu, and once you've done that... you can do anything!

The point is to grok the entire system: Hardware, OS, compiler, interpreter...etc.

I think building a FORTH would be educational, but I've never seen a tutorial (including JonesForth) that can take you from zero to Forth. Everyone just says it is easy. Maybe it is more or less obvious with more Assembly background?
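For what it's worth, the core of a Forth really is small: a data stack, a dictionary of words, and a way to define new words with `: ... ;`. A toy sketch in Python (no return stack, no memory, nothing like a real Forth, but it shows the shape):

```python
def forth(source: str, stack=None):
    """A toy Forth-style evaluator: a data stack, a word dictionary,
    and colon definitions. Returns the stack after execution."""
    stack = stack if stack is not None else []
    words = {
        "+":    lambda s: s.append(s.pop() + s.pop()),
        "*":    lambda s: s.append(s.pop() * s.pop()),
        "-":    lambda s: s.append(-s.pop() + s.pop()),  # second - top
        "dup":  lambda s: s.append(s[-1]),
        "drop": lambda s: s.pop(),
        "swap": lambda s: s.extend([s.pop(), s.pop()]),
    }
    tokens = source.lower().split()
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == ":":                      # colon definition: : name body ;
            end = tokens.index(";", i)
            name, body = tokens[i + 1], tokens[i + 2:end]
            words[name] = list(body)        # store body for later expansion
            i = end
        elif tok in words:
            w = words[tok]
            if callable(w):
                w(stack)
            else:                           # user-defined word: splice body in
                tokens[i + 1:i + 1] = w
        else:
            stack.append(int(tok))          # anything else is a number
        i += 1
    return stack
```

For example, `forth(": square dup * ; 4 square")` leaves `[16]` on the stack. A real Forth adds a return stack, memory, immediate words, and usually a threaded-code inner interpreter, which is where tutorials like JonesForth pick up.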

Okay, here's an oddball answer: https://github.com/servo/skia/blob/master/forth/

Skia used to include a really simple Forth implementation written in C++. According to https://groups.google.com/forum/#!topic/skia-discuss/joAyAl1... it was intended for straightforward/simple scripting, and for some reason promptly got deleted shortly after being brought up.

[In case you haven't noticed yet, the official Skia repo is at /google/; I've linked to /servo/'s obviously-old copy above, but there are more than enough forks and clones for the code to be findable elsewhere too.]

I kind of like this implementation because it's both incredibly context-specific and generalized all at the same time, and it isn't Bare Metal Implementation #79,513. And it's C++, too, although this is not (overly) abused.

> I do not envy the kids who nowadays stand zero chance to ever learn the complete stack of code running on any modern device.

I also started back when things were simple (BBC B with interpreted BASIC, no GUI).

Having transitioned to modern development, I think the bare-bones system was good for self-learning. But for the last 2 decades I've worked on systems where many layers were hidden from me (Windows, RDBMS, Salesforce etc) and most of the time it doesn't matter. I very rarely need to learn anything about low-level Windows stuff. With SQL Server you need to learn a bit about query plans and how to influence them, and where to put indexes. But mostly, you can ignore the lower layers and just let them get on with what they do. And that's the whole point, isn't it?

One huge exception to this is the modern web stack, if you think of the browser as the bare metal, with HTML and JS and CSS as some sort of machine code. On top of that you have Content Management Systems and PHP and so, so many JS frameworks. And whatever you do, soon enough you need to get down to the HTML/JS/CSS level. So that's the modern equivalent to our experience of the old days: no matter how many frameworks and CMSs we throw at the web, everyone still has to know their HTML/JS/CSS to get anywhere.

Perhaps one day we will escape from that and people will reminisce about angle brackets and escaping ampersands.

> most of the time it doesn't matter.

It's true.

However to have an understanding of things from top to bottom is definitely a benefit. People get silo'd in their various layers and lack comprehension of much that's going on below (or above) them.

It can make you a more rounded developer, to have a good idea of the whole stack, even if it's not always in-depth and some of the mental models used aren't strictly true.

It's a benefit, but what's the cost/benefit of the time it takes to learn that, vs maybe learning something else?

As someone who's coded everything from web front ends to embedded systems with no OS and manual memory partitioning, via mainframes, network management tools and all sorts of other things, I'd say the benefit is pretty high!

Someone needs to know how it all works; someone needs to build the kernels, the low-level libraries, the compilers, network protocols, server programs, etc.

You can certainly have a great career without them, I wouldn't dispute that.

Yeah you’ll never step in the same water twice, but I was born in 85 and I don’t miss 90s hardware much, not even nostalgically. Do you miss 70s hardware? I mean in terms of what you can practically get out of it, not how nice the keys felt. Yeah I missed seeing and learning from a lot of advancement unfold, but progress has been massive and I do enjoy my computer-video-phone global communication device. What an amazing thing to have seen in a lifetime. As a kid I did not expect there to be videophones.

You’re right, my generation definitely missed a lot, but I hope some day to work through NAND2Tetris. Furthermore, I didn’t study CS, but a good CS/EE program would really take you through the first steps. No matter what, you can’t keep a book from a scholar. There appear to be many resources to learn this stuff if one has the opportunity. Combine NAND2Tetris with the x86-64 assembly on Ubuntu book I saw here yesterday and you should be golden, right?

> Do you miss 70s hardware?

Yes; IBM System 360/370, for example, was a marvel of engineering and design, with a remarkable history [1]; so were the first microprocessors, such as MOS Technology's 6502 [2]. The best part was, you could know all of it, if you wanted, down to the transistor level - the geek's Holy Grail. With modern chips/systems you cannot any longer, not even with the Raspberry Pi.

I wish IBM's revolutionary architecture lived on in our PCs; good engineering pays off, and today's software could be more sane, as people would be learning from the giants rather than wasting time on a massive scale trying to reinvent the wheel.

[1] https://www.amazon.com/IBMs-Early-Systems-History-Computing/...

[2] http://visual6502.org/

Yeah, but the problem is that it takes time. Learning assembly takes time ('cos you need to understand the whole system at the level of machine code : bios calls, memory protections, interrupts, DMA, graphics cards, sound card, I/O,...). Then you have to go up a level into the OS stuff (hello file systems, processes, threading, paging). Then once you get that, you have to learn how a (extreme example) Java VM connects to all of that (welcome GC, optimization, compiler, byte code). then, ah finally, something one can easily understand : Java code :-)

(and don't even start me on WS/REST over HTTP over SSL over IP).

Learning it stack by stack, one generation at a time, let us old graybeards digest the whole thing. But now, starting from scratch? Not even with a ten-foot pole.

Yeah it’s not the same but it’s amazing starting from scratch at a new baseline. Every generation of humans has had to do that with everything.

I missed out on all the stuff you mentioned and really want to learn it, but I am grateful that I get to use all this stuff that’s developed. Instead of spending time learning the above listed fundamentals, which would have been revelatory in its own way, I have the opportunity to use Java on Kubernetes to build and operate globally whatever Internet software dream comes to mind. It’s not the end all, and I am crippled by lacking fundamentals, but it’s so much more than before and I am grateful to have the opportunity to have these tools.

Well, admittedly, I don't use all that knowledge all the time. Like many around here, I'm super satisfied being able to write a very complex SQL query knowing that: I won't be constrained by memory, I don't have to think about why the transaction will work, why my connection to the DB will work, or how the SQL engine will optimize the damn thing. I just have to contemplate something that is almost business code and produces almost business results.

It's just that I have that warm, reassuring feeling that if I have to dig down, I'll be able to. Too bad I almost never have to; abstraction works way too well :-)

I'm sure one can be a very good programmer without knowing the assembly stuff behind it. And even myself, I don't know exactly how a CPU works (I mean, I get the logic of it, but I wouldn't be able to make one from scratch) :-)

Well, the start of the '90s was still Amiga time, which was the biggest leap in home computing ever, but that really happened in the '80s. Then it was followed by the dark ages of PC gaming catching up while people played on dumb consoles.

Around when the Voodoo cards came along (1996?) things got really interesting again. Also, just about any keyboard that shipped was still decent. Silicon Graphics machines and Macs (even the beige ones) were really nice looking but expensive. Had a SCSI drive hooked up to a PowerBook 1400; that was pretty much as cool as having an external Thunderbolt SSD now.

The ASUS P2B of course was amazingly stable for its time, but if you had a dual Celeron 300A @ 450MHz you were really the king. Dial-up modems got upgraded to T1 lines or cable modems. That was a huge speed boost.

So yes the 90's were nice, but just no comparison to what happened between '77 and '87 (the year the Amiga 500 got introduced).

The way I see many devs on Emacs and vim doing development is hardly any different than when I got to university in early 90's.

They might be using a modern laptop, but their screen is hardly different from those beige UNIX terminals with green phosphor screens and VT100 keyboards that I had to use the first couple of semesters.

Vim/Emacs are local maxima in terms of productive UX; they'll still be here in 20 years. What's different might be the amount of compute those environments control. A keychord or three can easily trigger a big rebuild on a computer somewhere in the network.

I disagree; for me the local maximum in terms of productive UX is the vision of the XEROX PARC workstations, partially implemented in OS X/iOS, Android and Windows development workflows.

Even AT&T later moved into it with Plan 9 and Inferno, with ACME.

I did some amount of Android and Windows development; what particular pieces of the PARC vision do you have in mind?

IDE based development, GUI designers, REPLs, graphical debuggers, edit-and-continue, component based frameworks, structured data command line (PowerShell), apis to interact with GUI apps from devenv, dynamic configuration of running apps.

All to be found on Interlisp-D, Smalltalk, Mesa XDE, Mesa/Cedar environments.

> local maxima

Yes - local, very local.

For a while I thought the same. Then I starting meeting more people from the generation before me, who complained about how the kids nowadays (meaning me and my generation) didn't really know how the machines worked. The kids only wrote code, but since they didn't build and maintain the hardware, they didn't grok the full stack.

And to be fair, they are right. There was a level of UART programming and video logic control that I never understood, because they required knowledge of analog circuitry that was beyond me.

As I grew older, I realized that every generation has people who say that about the next one. Sure, car engines in my g'grandfather's day were so simple that anyone could take one apart and really learn how they worked. But I prefer the benefits of modern engines.

BTW, the first real OSes were in the 1960s. I do not think you started with computers in the 1950s. And how is it that "all the code that was there was what you wrote (copied) yourself" when you were "dropped right into the interpreter"? - who wrote the code for the interpreter?

> There was not already a GB of OS loaded that you had a hard time to learn. All the code that was there was what you wrote (copied) yourself.

Typically you had 16K of ROM BASIC, if you were really lucky on a half decent machine (BBC or so) you would have something more akin to todays BIOS in another 16K. So there was some code there, just not easily accessible or replaceable.

> From what I see, they all are "stuck" on top of a GUI with only the slightest idea of what happens between their mouse and the actual hardware (and even that is often not hardware anymore)

I tend to agree with the feeling, but, on the other hand, only the really worthy will choose to dig in and really understand how a computer works.

Separating the wheat from the chaff.

> If you wanted to play one of those games, you had to type in the whole program by hand. Inevitably, you would get something wrong, so you would have to debug your program. By the time you got it working, you knew enough about how the program functioned to start modifying it yourself.

As someone who was there and did that I want to refute that assumption :) I have typed in many programs and there's not a lot to learn because most of them consisted of many pages of DATA lines and a small loop that loaded those machine instructions into the home computer's memory and started the program by a USR directive (please note that I made this explanation many years after the fact). I guess there are not many teenagers who are able to debug this kind of program by looking at the actual opcodes.

Sure, you could learn a lot from typing in regular BASIC programs, but those weren't the most interesting games as far as I remember. The most productive learning experience was interactive and exploratory programming, as shown in the OP article.
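The type-in pattern described above, pages of DATA lines plus a small loop that POKEs the bytes into memory before a USR/SYS call, can be re-created in Python for illustration (the load address and the tiny 6502 routine here are made up for the example):

```python
# A modern re-creation of the classic type-in pattern: the "program"
# is really a machine-code payload held in DATA lines, plus a short
# loop that READs each byte and POKEs it into memory.
DATA = [0xA9, 0x41, 0x8D, 0x00, 0x04, 0x60]  # LDA #$41 / STA $0400 / RTS

memory = bytearray(0x10000)   # a 64 KB address space
START = 0xC000                # hypothetical load address

for offset, byte in enumerate(DATA):   # the FOR ... READ ... POKE loop
    memory[START + offset] = byte      # POKE START+offset, byte

# the original listing would now do SYS/USR(START) to run the routine
assert bytes(memory[START:START + len(DATA)]) == bytes(DATA)
```

Which is exactly why there was so little to learn from these listings: all the logic lives in the opaque byte values, and the visible BASIC is just a copying loop.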

As someone who was also there and did that, I want to partially refute the refutation of that assumption. There were many, many pure BASIC games that you typed in and could learn from. But it became more and more common to use assembly language via data blocks to get the best from that beastly hardware, and it got more common to get listings that were pure hex codes; you'd enter the program with a special tool (which you also typed in from a printed listing) that included a checksum to make sure you typed each line correctly.

But still, I learnt a lot by modding BASIC games I typed in from books, and later learnt a lot of assembly as I tried to mod those assembly language programs I typed in as hex codes. You'd print the whole program out using a good ol' dot-matrix printer (took a while... not because there was that much code, just that printers were slow...) and then puzzle over the assembly trying to work it all out, drawing lines trying to trace all the jump sequences etc. Later my Dad and I had a lot of fun hacking tape-based games so we could put them on our awesomely huge (doubly huge if you cut a notch and flipped them over) and speedy 5¼ inch discs.

As someone who was also there and did that, I want to reconcile your positions. When you have a look at the archival issues of Your Commodore magazine[0], you'll find both meaningful BASIC code that you can learn from as well as DATA instructions that were mostly useless for us at that time.

[0] https://archive.org/details/your-commodore-magazine

> I have typed in many programs and there's not a lot to learn because most of them consisted of many pages of DATA lines and a small loop that loaded those machine instructions

That varies a bit by platform. Most 8-bit machines had relatively poor BASICs and no assembler, so BASIC code could be a mess of line-number targeted GOTO/GOSUB statements and anything that needed to be faster than the higher-level language interpreter could manage needed to be pre-assembled and just pushed into the relevant RAM locations by a BASIC loop as you describe.

I had the luxury of learning on Acorn machines, initially an Electron then a BBC Master Series. Their variant of BASIC had useful features which implementations on other 8-bit devices lacked: long variable names (IIRC the C64's BASIC allowed long names but only used the first two characters, so PersonName and PermitCount would clobber each other; BBC BASIC respected up to 40 characters of a name), named procedures and functions, and a built-in multi-pass assembler. This made BASIC code (if written using those features, obviously; some magazines had content targeted at multiple architectures and stuck to the lowest common feature-set) easier to read, understand, and modify, and made it possible to see any machine code logic so you had a hope of pulling it apart usefully too. I learned a lot starting from nuggets in type-in demos and using them as starting points to experiment/read a little deeper, not just about those specific machines but about computer architecture more generally. I don't think I'd have the same level of competence or knowledge that I have now were it not for that start.
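The clobbering mentioned above is easy to demonstrate with a toy variable table that, like that BASIC reportedly did, keys entries on only the first two characters of a name (the class is a hypothetical sketch, not real interpreter code):

```python
class TwoCharVars:
    """Simulates a BASIC interpreter that keys variables on only the
    first two characters of their names (as the C64's BASIC did)."""

    def __init__(self):
        self._vars = {}

    def set(self, name, value):
        self._vars[name[:2].upper()] = value  # truncate to two chars

    def get(self, name):
        return self._vars[name[:2].upper()]

v = TwoCharVars()
v.set("PersonName", "Alice")
v.set("PermitCount", 7)          # "PE" again -> silently clobbers PersonName
assert v.get("PersonName") == 7  # the bug described in the comment
```

With a 40-character key instead of a 2-character one, the two assignments would stay distinct, which is why BBC BASIC listings were so much more readable.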

I remember that Polish computer magazines had a validation program. It was a shorter program that you typed in once and ran. Then, when typing in the long "game" code from the magazine, the screen displayed a two-letter checksum for each line of code. The two-letter checksum was also printed in the code listing in the magazine, so that you could quickly compare them to find any line that was typed wrong.
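The actual algorithms those validation programs used varied from magazine to magazine; here is one plausible sketch of such a two-letter per-line checksum, position-weighted so that transposed characters are caught too:

```python
def line_checksum(line: str) -> str:
    """Fold a line of code into a two-letter (A-Z) checksum, in the
    spirit of the magazine validation programs. The published
    algorithms varied; this is just one plausible scheme."""
    # weight each character by its position so transpositions change the sum
    total = sum(ord(c) * (i + 1) for i, c in enumerate(line))
    return chr(ord("A") + (total // 26) % 26) + chr(ord("A") + total % 26)

# identical lines agree, a one-character typo is flagged
assert line_checksum('10 PRINT "HELLO"') == line_checksum('10 PRINT "HELLO"')
assert line_checksum('10 PRINT "HELLO"') != line_checksum('10 PRINT "HELL0"')
```

You'd compare the two letters on screen against the pair printed in the margin of the listing, line by line.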

> As someone who was there and did that I want to refute that assumption :) I have typed in many programs and there's not a lot to learn because most of them consisted of many pages of DATA lines and a small loop that loaded those machine instructions into the home computer's memory and started the program by a USR directive

On the Beeb most listings were BASIC with all the logic there, and DATA statements just for in-game assets such as sprites or room descriptions in a text adventure. All the Usborne books are online now, check them out: https://www.raspberrypi-spy.co.uk/2016/02/usborne-releases-1... many happy memories inside!

For more '80s microcomputer BASIC nostalgia, there's also INPUT magazine (https://en.wikipedia.org/wiki/Input_(magazine)). I spent hours and hours as a kid poring through those magazines. Archive.org has the entire series as PDFs - https://archive.org/details/inputmagazine. They were really good at explaining core programming concepts, and the illustrations were also amazing.

BBC BASIC had a built-in inline assembler, which I think also made things easier for people typing in at home. Even if the program dropped to assembly language for speed, the instructions were still human-readable rather than DATA statements full of raw hex.

You could actually do that right now in a better way.

Most of the assembly and other DATA/POKE stuff is less relevant nowadays, as hardware has improved and a lot of magic is already done in firmware.

The form of "type it in, verify and run; explain mistakes and correct solution" is still didactically valid.

I'm pretty sure there are programming language interpreters out there with simplicity of BASIC, including BASIC itself.

On MS-DOS, you had a variety of commands built in, and extensibility via BASIC too (or more). Programs could be written as batch files. On Unix-likes, the primary interface was the shell (csh, ksh, or later POSIX sh), with a set of built-in commands. If you wanted something more advanced, programming languages and interpreters were easily available, typically pre-installed.

It was the main interface, magic words. We have regressed to pictures now and most extensibility has been hidden, removed and made very indirect.

Hardware doesn't come with programming manuals anymore, as it is deemed too complex (yeah right) and expensive. It would be great if your GPU came with a description and software to write and execute shaders, for example. Or a PC with instructions on how to programmatically reboot it, start it, or access sensors. How to access EFI services. What the machine code looks like. Etc.

"Hardware doesn't come with programming manuals anymore as it is deemed too complex (yeah right) and expensive."

It doesn't come with manuals because it is considered a trade secret. Commodore had to lay the hardware bare because it wasn't fast enough to program in a high-level language, nor did the high-level languages fully support their hardware, whereas the hardware of today is so fast that nobody thinks anything of programming it through a driver or a shared object library in a high-level language. It's still wrong not to lay the hardware registers and their functionality bare (yes NVIDIA, have you forgotten your SGI roots?).

It is rarely the case that publishing relevant performance characteristics and APIs really exposes anything. AMD is perfectly fine with exposing the programming manual and most of the setup code - that is not where the really important secrets lie...

It takes time to produce useful documentation though.

I would also guess that RIAA/MPAA are somewhat responsible with the secrecy to enforce the DRM.

I've read a discussion in a forum where an AMD employee, asked whether it was possible to have a GPU's firmware without DRM (or something like that), answered that it would probably be commercial suicide. (OK, I'm a little tired right now and not recalling all the details of the conversation, but I can try to dig it up... ^__^;)

Hasn't that changed in recent years? I've seen manuals for recent AMD GPU instruction sets.

I remember that Amiga programming manuals had to be additionally bought from Commodore.

But they were available, and most of us used the optical character recognition scanned version on the LSD doc disk.

25 years later I bought the real thing.

The hardware reference manual was published by Addison-Wesley. Gave me a little nostalgia ping when I noticed the Stroustrup C++ book I bought recently was also published by them.

One of the advantages of the TI-99/4A was that the (Extended) BASIC community didn't use (couldn't?) the PEEK/POKE commands prevalent on the Commodore 64.

The code was out in the open and decipherable. Great way to learn in that particular environment.

Wow - your comment tripped a memory in me. I learned on a TI and made heavy use of CALL CHAR to redefine the bit pattern for characters to draw on screen. I had a hard time switching to Commodore Basic because it wasn't clear to me how to accomplish similar things there.

Atari with their player-missile graphics made immediate sense to me though.

This depends on the teen. In my peer group, at the time, a few of us being nudged by a mentor or two, went right after the machine code.

One of the things that mentor emphasized was writing a disassembler soon after becoming familiar with a given CPU.
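That exercise still holds up. Here's a minimal, hypothetical sketch in Python of the kind of table-driven disassembler a beginner might write for a handful of 6502 opcodes; the three encodings in the table (LDA immediate = $A9, STA absolute = $8D, RTS = $60) are the real ones, and $0400 happens to be the start of C64 screen memory. Everything else is simplified.

```python
# A deliberately tiny 6502 disassembler: opcode -> (format string, operand bytes).
TABLE = {
    0xA9: ("LDA #$%02X", 1),   # load accumulator, immediate
    0x8D: ("STA $%04X", 2),    # store accumulator, absolute
    0x60: ("RTS", 0),          # return from subroutine
}

def disassemble(code):
    """Yield one mnemonic string per instruction in `code`."""
    i = 0
    while i < len(code):
        fmt, n = TABLE.get(code[i], (".BYTE $%02X" % code[i], 0))
        i += 1
        if n == 1:
            yield fmt % code[i]
        elif n == 2:                     # 6502 addresses are little-endian
            yield fmt % int.from_bytes(code[i:i + 2], "little")
        else:
            yield fmt
        i += n

# LDA #$41 / STA $0400 / RTS: put 'A' in the C64 screen's top-left corner.
print(list(disassemble(bytes([0xA9, 0x41, 0x8D, 0x00, 0x04, 0x60]))))
# -> ['LDA #$41', 'STA $0400', 'RTS']
```

Extending the table opcode by opcode is exactly how you end up internalizing the instruction set, which is presumably why the mentor recommended it.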

However, that would be after the fact. Some teens got closed computers, like Atari machines. By today's standards, they are wide open. Back then, they were mysterious, but capable, with spiffy graphics and sound chips. The key thing was how users were nudged toward or away from the guts of it all.

Other teens got, say, an Apple II. On that one, you got schematics, a commented ROM listing, and the whole machine was open and software-driven. It screamed, "hack me!" The whole damn computer was a clever hack!

Again, my peer group were Apple people, and we were writing ML based off the snippets found in the ROM, along with great tutorials found in the grocery-store magazines. (Where I got my first disassembler, BTW. Thank you COMPUTE!)

This stuff varied extremely widely. One town would have that spark of learning, a particular school, or club maybe. Another one, nothing. Just gaming, the usual.

Circling back, yeah. Those DATA statements were known to be programs. Lots of us hand-disassembled them and either wrote them down (I have a couple in some papers somewhere, I am sure); my preference was to add REM statements with the assembly code in them for later. Oh, how I would have loved BBC BASIC!

The Apple was cool: it had a machine-language monitor, a mini-assembler, and a disassembler. I wrote a lot with it. Still a mess compared to what I understand the BBC had. I want to go and run one.

Other machines varied, as did people and what books and or other resources were available to them.

It depends on the system, though; the C64 had a very limited BASIC, so without assembly you could not do very much. Other systems, like the MSX, had a pretty advanced BASIC, and many of the programs you would type in were mostly or entirely BASIC.

However, because I was young and still had room in my memory, typing in those POKE data parts made the patterns stick. When I bought my first assembly book, I finally understood what those patterns of opcodes meant. I never wrote mnemonic assembly on the Z80; I always used direct opcodes, and still do. It is faster for me.
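For anyone curious what "direct opcodes" means in practice, here's a hedged Python sketch (a hypothetical helper, not anyone's actual tooling) hand-encoding a tiny Z80 routine byte by byte. The three opcodes used (LD A,n = 0x3E, LD (nn),A = 0x32, RET = 0xC9) are the real Z80 encodings; the example address is the ZX Spectrum's screen start, picked purely for illustration.

```python
# Hand-assembling Z80 the way you would when you know the opcodes by heart:
# write the bytes directly instead of mnemonics.
def z80_store_byte(value, address):
    """Return machine code for: LD A,value / LD (address),A / RET."""
    return bytes([
        0x3E, value & 0xFF,                  # LD A, n
        0x32, address & 0xFF, address >> 8,  # LD (nn), A  (little-endian)
        0xC9,                                # RET
    ])

code = z80_store_byte(0x41, 0x4000)  # put 'A' at 0x4000 (Spectrum screen start)
print(code.hex())                    # -> 3e41320040c9
```

Once a handful of encodings like these are memorized, a hex listing reads almost as fluently as mnemonics do.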

This is true, but only partially. Maybe my memories start 1-2 years earlier, when more than half the games came as BASIC listings, and you could find some quite complicated and interesting ones among them.

The most advanced one I found was a football (soccer) simulator that worked like a text-adventure game, showing you the development of the match by printing the sportscaster's words at regular intervals. Being able to understand and modify it was a milestone in my growth as a future programmer.

In my experience there were plenty of both types of listings around (that is, both open BASIC listings and more inscrutable ML in DATA lines). My systems (PET and C64) had a built-in disassembler, so it wasn't necessarily that hard to comprehend the code anyway. And a lot of the DATA was just sprites and other graphics.

Anyway, I learned a few things from those listings early on.

And as importantly, people traded tapes, floppies, etc. "If you were an avid gamer, you became a good programmer almost by necessity" is pretty ludicrous.

> I think the people that first encountered computers when they were relatively simple and constrained have a huge advantage over the rest of us

I pity sometimes the young people who are studying computer science nowadays. I studied CS in the 1990s. Compared to a modern curriculum, my courses looked very basic: five years on functional programming, compiler construction, networking, databases, etc. No P2P, cloud computing, mobile applications, IoT, etc.

Now, most CS studies have to rush through the basics in three years, followed by two years where the students have to learn all the tools and techniques that they need for their professional career. I had an entire course ("Advanced topics in databases") on the efficient implementation of indexing and query execution for databases. Today's students have to learn in the same time: a shortened version of the old course PLUS nosql, column-oriented DBMS, DHTs etc.

> Today's students have to learn in the same time: a shortened version of the old course PLUS nosql, column-oriented DBMS, DHTs etc.

If your university is good, that is. I did a computer science bachelor's over the past three years in the Netherlands. In my case, I fear that I learned about as much about databases as your basic course taught you, and maybe even less. In particular, the implementation of databases wasn't even touched on. I definitely learned nothing whatsoever about NoSQL DBs, column-oriented DBs, DHTs, and other cool things, though the existence of the first two was hinted at.

Luckily I can usually figure out what I want to know without help of a teacher, but there's a definite difference in level of difficulty and amount of content between universities today.

> Luckily I can usually figure out what I want to know without help of a teacher

That's probably one of the most valuable things I learned at university.

Doesn't the Netherlands publish a national ranking of university quality, allowing a better-informed choice of where to apply?

They do, and this university came out pretty high up, though probably not at the top. My reason for choosing it, though, was that they provided a maths-CS double bachelor programme. In my experience, their maths programme is of very high quality, so that makes up for it.

Sure, there are always multiple factors when choosing where to apply.

I was just curious, because not all countries do it.

*It was just last week that you saw the Commodore 64 ad on TV. Now that MASH was over, you were in the market for something new to do on Monday nights. This Commodore 64 thing looked even better than the Apple II that Rudy’s family had in their basement.*

Alternative version:

It's 1983 and your dad decides to buy a personal computer. The C64 is too expensive at $595, so he buys a VIC 20 (introduced at $295, probably $200 by '84).

You plug it into the television downstairs and, after mentally tuning out the hum from the RF converter, you start to enjoy gems like GORF and Radar Rat Race (https://m.youtube.com/watch?v=1LRkON9XTOk).

You try to read the included user manual, with its helpful computer-chip-themed cartoon mascot explaining things like strings, but none of it makes sense because you're eight. You and your sister spend hours reading aloud and typing in machine-code listings from then-popular computer magazines. Imagine spending an hour transcribing hex codes, only to type the RUN command and have it crash.

If, on the off chance you found the errors and got the program to run, you'd find it was wildly over-hyped in the description. You didn't want to waste your work, so you'd save it to the tape drive that used audio cassette tapes.

A few days later, you'd try to load the program from the cassette tape and find that it was corrupted. I never once got a saved program to load from that thing, it only successfully read commercially published software.

The nostalgia is largely inaccurate. It was an era of immense frustration. And I never saw any C64 television ad, probably because we only had three television stations and no market to speak of for personal computers.

Incidentally, there's no way the kids in Stranger Things would have had those high end walkie talkies. They'd have had the crappy ones that only work to maybe 100 yards and emit static nonstop.

I had a VIC-20, was about 12 and took to it like a duck to water. The tape drive worked flawlessly, though it did take ten minutes to find and load a program.

One long-form magazine program I tried never worked right, even after finding and fixing some bugs myself. But we neighborhood kids had hours upon hours of laughs plugging our names and bad words into the sentence generator program's DATA lines.

Imagine spending an hour transcribing hex codes, only to type the run command and have it crash

This is basically how/why I learned how to program. I had a knock-off Apple II clone (Franklin Ace 1000), and I'd type in BASIC games. I was terrible at typing and often made mistakes. I learned the basics of debugging when trying to get those games working. In a lot of cases, just getting the game working was far more fun than actually playing it.

The tape drive may have been too close to the TV. RF power from some old tube TVs could interfere with the tape drive. I used to move the tape drive as far as possible from the TV when loading/saving.

Sounds to me like you had a broken tape drive.

I've been doing a lot of tech blogging lately, playing around with various F# tools and toy projects, seeing what resonates with the community.

There was a progressive complexity that happened back in the late 70s and early 80s such that people alive today who still code and learned back then have taken the ride from machine language to multi-gigabyte stacks.

We just kept adding stuff and having to make sure we could be functional in all of it. Not an expert, but functional. It was standard practice on my commercial programming teams to decide what everybody wanted to learn on a new project before starting. (And these were high-paying projects. We always left with happy customers.)

People were jacks-of-all-trades. Most everybody was. You had to be.

What am I seeing resonate, at least as far as I can tell? The inability to understand what the hell is going on and work with it. If you've gotten a C++ compiler building a hellacious codebase under DOS, a Rails configuration ain't nothing.

I see what are supposedly senior programmers walk a bit off the happy path on a framework and they're lost. Not only are they lost, they are insecure, afraid, embarrassed. There's nothing wrong with these people. There's something wrong with the way we're training and staffing them.

Fifteen years ago I was still coding commercially, having a blast. Talking to a recruiter one day about various projects, she said "You know, you're one of the last true general consultants"

There may be ten thousand of us. Beats me. But her general appraisal was correct. There is a drastic and complete change between the way coders used to relate to technology and the way they do today. It's not tech. It's mindset.

Just to your point about the recruiter’s assessment— I’ve never met a modern tech recruiter who can make that kind of remark.

They largely focus on buzzwords and where you went to school. It seems like they often fail the companies they work for by making poor matches, and they wouldn't recognize experience like you've noted as being worth anything. Then again, maybe I'm just in a weird bubble here. And I'm just generalizing; they're not all the way I've described, but many (possibly most) are.

"Just to your point about the recruiter’s assessment— I’ve never met a modern tech recruiter who can make that kind of remark."

How many years experience of "the XML" do you have?

"What versions of .Net were you using at each of these jobs?" "What percentage of time were you using SQL databases?" "On a scale of 1-10, how well do you know Java?" and my personal favorite...."Are you willing to do a contract role for less than your salary? We have many candidates go long term."

Oh at least twice the lifetime of the technology. I'm a veritable expert.

As a data scientist, I would struggle to answer "what % of your work involves programming in Python?"

I'm like, well, uh, that depends. 95% if you include `import sklearn` and basic numpy and pandas operations. 1% if you only count 'real programming'. "So what %?" ...

That aside, there is a real problem in the tech industry when it comes to developing talent. Universities teach mostly theory and rely on the industry to teach best practices and 'production-level coding', but then once graduates get to the industry, they get no mentorship.

Companies don't want to spend time training someone who will leave in two years. However, if everyone ran solid mentorship programs, the entire industry would benefit.

There are good and bad recruiters in every age. Back in the day they would ask you if you knew MVS or OS/390, and if you said no to that, they'd proceed to ask if you knew TSO.

The problem is not that it was easier in the 80s to start programming (that would be the same as saying it was easier to study medicine in the 1800s), but that today it is much harder to get to the point of doing something meaningful.

If you want the simplicity of the BASIC interpreter, just fire up a Python console, and you are in a much more comfortable position to learn programming and computer science than you were back then. But it is still a long way to someplace useful. In the 80s, by owning a VIC-20 and programming BASIC and assembler, I was pretty much already at the edge of something new and powerful.

> The problem is not that it was easier in the 80s to start programming (that would be the same as saying it was easier to study medicine in the 1800s), but that today it is much harder to get to the point of doing something meaningful.

I think that in the 1980s, the threshold of meaningful was a lot lower. It wasn't just that you needed less of a stack to be able to do things; it's that so much less had already been done.

And those things hadn't been done because they were too hard to do with the existing tools. But we suddenly had a new, more powerful set of tools, and we could do a bunch of things that couldn't be done before. (I remember, in high school, moving from punched cards, where it took a week to see the results of a change, to BASIC on a TRS-80, where it took a minute. That changed the world quite a bit...)

> "but that today it is much harder to get to the point to do something meaningful."

In some ways that is true; in many ways it is false. As recently as ~2005, if I didn't know Linux and wanted to learn it, I needed to purchase hardware, and installing and messing with binaries and configurations ran the risk of making a machine inoperable. Today it's very easy to spin something up in AWS or a VM. Want to experiment with TensorFlow on a large GPU box? $4/hour on demand.

Also, we have lots of APIs and libraries we can just import, and I think that's a great thing. Writing a website where a login was needed, but wasn't ultimately what you wanted to explore, used to be hard; now it's `gem install devise`. Making a basic website look pretty enough? Bootstrap. Want to compare random forests, xgboost, and linear regression for predicting home prices? `from sklearn import ...`

I know all these things aren't doing 'new' things, but they let a person skip the details and do something cool. I think the volume of interesting stuff we see coming from blogs and articles on Hacker News is a testament to that.

I think the point is that in 1983, if you wrote a program that read in some input, did some calculations and printed output, you had created a program that resembled much of the professional software out there.

If you had kids do that now, they would probably say "It's not a real computer program, it's just text!"

I think that still misses the point a bit, but is closer. The availability of more ready-to-use software means that even though it's much easier to do something meaningful or useful with the available programming tools, it's much harder to find an application where programming it yourself is the most efficient way to achieve that utility. So a lot more is needed, most of the time, before approaching a problem with programming is a net benefit independent of developing programming skills.

I think the people that first encountered computers when they were relatively simple and constrained have a huge advantage over the rest of us

I don't and I grew up in the era of 8 bit machines and kilobytes of RAM etc. I fully recognize that it was fun to have those constraints and we learnt a lot about dealing with them (using less memory, using fewer instructions) but I don't buy that that really matters for most programmers today. They'll have other things to worry about: e.g. debugging distributed programs.

Sure, if you want to do microcontroller work then that sort of thing is useful, but literally nothing stops a "Full Stack" programmer picking up an Arduino and programming it and learning something new.

I love playing with those environments (http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-...) but every day I see people with different experience of computing from me and I don't feel that I have an advantage over them: they often know about things I'm totally ignorant of. It's true that I'm very good at debugging horrible low level things, but it's also true that I'm not good at imagining the state of a system with hundreds of micro-services.

>I'm not good at imagining the state of a system with hundreds of micro-services.

I feel you. Thinking across multiple levels of abstraction is one of the hardest things in engineering, especially if, over the years, you've overspecialized in one specific stratum.

This is why the "STEPS Toward the Reinvention of Programming" (a.k.a. 20k Lines of Code) Project was important. It was spearheaded by computing luminary Alan Kay with help from some very impressive researchers.

The final report is available here: http://www.vpri.org/pdf/tr2012001_steps.pdf

I haven't kept up with subsequent research the Institute has produced. I do know they had an aversion to producing software artifacts and were far more concerned with the written reports (which in some senses restricted the ability of amateurs like me to play with the interesting output by the group). I did play with OMeta (a meta-parser) and the COLA / Id code - which was enlightening!

For all those willing to play around a bit, here's a low entry-level access to Commodore BASIC, a web-based PET 2001 emulator. You may write programs in your favorite text editor and load them per drag-and-drop, and even export any screen contents as text. (Also, all the special characters are accessible by a virtual keyboard.) Manuals are found in the programs/download section.


(Core emulation by Thomas Skibo, interface and IO enhancements by yours truly.)

Thanks for this... Took a minute or two to re-create my very first program:

10 PRINT "* ";

20 GOTO 10


I think the people that first encountered computers when they were relatively simple and constrained have a huge advantage over the rest of us

It is not a blessing but a curse. If you understand the computation, you will spend so much time wondering "WTF is this (modern) computer doing" when it fails to do something simple that you know an older machine with a tiny fraction of the power could do easily.

Like the crazy latency from pressing keys to letters appearing on the screen. I see it's much faster on HN than on other websites, but not hardware-interrupt fast. Why in the world does it have to be USB?

USB isn't really the problem, it's all the code that feels it has to get involved in handling the event. Lots of websites (e.g. Facebook) invoke all sorts of background network activity on keypress.

You might be surprised, but I remember an article comparing various computers and input methods, and it turns out that USB adds significantly more latency than a hardware interrupt (PS/2). Software plays a role, but even e.g. vim on a modern machine vs. an old-school computer shows a noticeable difference. I haven't experienced this myself, but I do know of writers who use computers from the 80s for this exact reason (George R. R. Martin being one example). I have also heard from other people who have tried this that the QoL improvement is pretty big.

I think we need to take it one step at a time. First, let's kill the modern web, then continue to get rid of other layers of bloat on modern computers, and when all the low-hanging fruits have been plucked, we can do something about hardware protocol latency...

Agreed. But how do you kill the modern web? And how do you go about killing all the bloat underneath? Do you expect microsoft to fix windows? Or redhat to abandon systemd? Or wayland? Or the countless other bloated pieces of the puzzle? Honestly at this point unless we go back and remake everything from the start the interests are probably too big to be dealt with.

You can never go back, just as you can never step in the same river twice; the only way this can happen is if by doing so it enables something radically different that wasn't previously possible. Just as Windows was never really "defeated" by Linux or MacOS, but is gradually being rendered less relevant by smartphones.

Which I guess again boils down to what I've always been saying: we need to find a way to make hardware manufacturing cheaper. You control the hardware, you control everything. We need more people controlling hardware in order to get a more healthy market.

The example 'pjc50 gave isn't one of just different hardware, it's one of a different class of hardware. I don't think letting people make hardware cheaper is going to help much now (you can make hardware prototypes cheap enough these days).

Not to mention, bloat finds a way. See e.g. smartwatches. There was Pebble, which managed to create a good, bloat-free piece of equipment, but the market was soon flooded by watches running... Android. Even navigation units, bike computers, and other sports devices run Android now.

By hardware I mean the actual silicon itself. OpenGL is so complex and patent-encumbered that I highly doubt anyone can make a new implementation at this point.

But how do you kill the modern web?

There’s no actual need for a page to be 10 MB to deliver 5 KB of text and a 50 KB JPEG. Especially not on mobile.

Yeah, but the GP's question - the one that's really hard - is how, not why.

I am painfully aware. But that doesn't prevent webpages from actually doing so. I have no idea what the motivations are as I'm not a web developer.

I think this is the article you're talking about: https://danluu.com/input-lag/

Yup that's it. Bookmarked, thanks.

How else will advertisers know every permutation of a comment I went to type out?

You are right that it was a special and exciting feeling learning these things when these machines were brand new. When my friend's family bought a TRS-80 in 1978, it was like some kind of alien artifact that fell from space; we were utterly fascinated by it even though its capabilities were almost absurdly minimal. It had only 4K of RAM, and even that took several minutes to load from the cassette tape storage.

Nevertheless it ran BASIC, and my friend and I learned to program on that machine. Subsequently we honed our skills on Apple IIs at school, and later by hacking on the school's PDP-11. In 1981 I built a simple Z-80 computer, roughly equivalent to the TRS-80 (note: "80"). At that time doing something like this literally caused newspaper reporters to come interview you. It was nice to learn these things sort of organically, albeit perhaps not optimally; I've made my living in software development and to this day have never taken any class in programming.

Of course that moment of novelty really was brief. Once the IBM PC came out in 1981, computers proliferated rapidly and they no longer seemed so special. Nevertheless I do think that our "Generation X" had sort of lucky timing with computer culture, since we were also just reaching working age when the Internet revolution hit (my first job out of grad school came from an ad which literally said "the Internet revolution is here and you can be part of it"...a small part to be sure but still part!)

But anyway every time period has its pluses and minuses! If you want to know how it really felt to grow up during that time, the truth is that we envied the 60's generation hugely and thought that everything we had was just kind of a pale imitation of what they did (for example, music). Take a look for example at the book called "Generation X", which is pretty dystopian, and really did express how many people felt. There's always something new happening!

It was definitely fun to go to the computer lab in high school and play with/program the Commodore PETs. These were the 4K models with the chiclet keyboard.

I will say this though. I disagree with one word in the author's text. He said the machines of the time were "constrained." They weren't constrained at all. They had limitations just like today's machines do. But we weren't aware of the limitations because we weren't time travelers coming from the future with more computing power in our pockets than most corporations had in 1983.

In fact the exact opposite was true. The computer industry was bursting at the seams. It was exploding. It was not constrained at all.

Then they cloned the IBM PC and the rest is history. Every machine you're reading this post on descended from those.

I would have been 6 around 1990 when I first discovered BASIC, on our Amstrad CPC464. Found the manual filled with code, and that cemented my love of programming.

It's interesting how, rather than computers coming with an instruction manual nowadays, it's assumed that everybody knows how a PC (including phones) works, even though they are more complicated now - just hidden beneath a veneer of GUI.

You can have the full experience today using the Colour Maximite board: https://www.youtube.com/watch?v=XQA8lowEKOo. The MMBasic environment it uses is surprisingly powerful and convenient - it even includes a full-screen text editor.

Now, this sort of experience can be recreated with Arduino, micro:bit and other educational microcontrollers and nanocomputers. Raspberry Pi doesn’t count: too complicated.

You can recreate it with Raspberry Pi, though it's no longer the best platform for it. It can natively run RISC OS Open, which can be set up to drop you straight into a BBC BASIC prompt as in the old days. And there was plenty of software printed in paper listings form for 32-bit Acorn computers if you want the traditional experience. The issues of Acorn User magazine from 1988 onwards will give you plenty to be getting on with... http://8bs.com/aumags.htm

The RISC OS people have one already set up to run BBC Basic standalone: https://www.riscosopen.org/content/sales/risc-os-pico

My thoughts exactly: the Arduino is about the same level of complexity as 1980s 8-bit machines.

As the author of the article said, "In 1983... home computers were unsophisticated enough that a diligent person could learn how a particular computer worked through and through."

And an ESP32 is already at the level of a PCW 1512, just more powerful.

I can't decide how much I buy this argument. I was born in 1980, and started programming Logo and BASIC very young (around 5 and 7 years old respectively). Almost everything this post describes is familiar to me: typing in code from books, learning PEEK and POKE, etc. I just don't think much of that matters: whether you learn BASIC on a stupid terminal that can't do much else, or in a super simulated terminal in a web browser seems kind of irrelevant.

OTOH, the experience of learning C and having to actually write video and network drivers (or their barest elements) because there wasn't a web you could download a library from...yeah, that probably actually did make me a better programmer. Having had the experience of writing a little bit of actual assembler, even just the old "MOV AX 10; INT 13" (am I remembering that right?) does give me a sense of connecting with the machine more deeply than someone who grew up with the internet.

On the first hand again though...living through the late 90s and most of the 2000s was crap. The time of Java. And the worst kind of JavaScript. I pretty much stopped coding altogether until the web had matured as a platform a bit, by the early 2010s.

dude make a new account. this one is hellbanned.

I've recently been introducing my son to programming, and it was so much more difficult just to choose the best starting language.

When I started in the mid to late 80's as a child, we had a Commodore 128 and it was Basic or nothing. I also remember typing in those program listings, getting annoyed they didn't work, realising I'd transcribed something wrong, then when it was working making changes.

We then moved onto a 386 and I gravitated towards QBasic (and later TurboBasic because you could use the mouse and compile to an exe)

With my son, I ended up searching high and low, umming and ahhing, and in the end I found the spiritual successor to those computer magazines with program listings: a Python for Kids book that takes them step by step (a little like the Commodore manual did), introducing concepts and eventually getting you to build a game.

I think a lot of the wonder about computers seems to have waned because we're all older and not looking at them through the same lens. True, computers are far more ubiquitous now, but learning that you can tell one what to do, limited only by your imagination, is still incredibly powerful for a child.

Not BASIC, but fantasy consoles like PICO-8 try to bring back that feeling. PICO-8 has a 32K limit, and all graphics, sound, and code are memory-mapped, so you can PEEK and POKE if you want to hack old-school. It boots directly into a Lua interpreter.
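The memory-mapped model is easy to sketch. Here's a hypothetical Python miniature of the idea (PICO-8's real API is Lua's peek/poke; the 0x6000 screen-memory base matches PICO-8's documented layout, but the class and sizes here are simplified assumptions):

```python
# Sketch of the memory model fantasy consoles expose:
# one flat byte array, with peek/poke as the only primitives.
class Memory:
    def __init__(self, size=0x8000):     # 32 KiB address space
        self.ram = bytearray(size)

    def peek(self, addr):
        """Read one byte."""
        return self.ram[addr]

    def poke(self, addr, value):
        """Write one byte, wrapping values to 0-255 like 8-bit hardware."""
        self.ram[addr] = value & 0xFF

mem = Memory()
mem.poke(0x6000, 7)        # 0x6000 is where PICO-8's screen data starts
print(mem.peek(0x6000))    # -> 7
```

Because graphics, sound, and code all live in the same array, "hacking" a sprite is just poking the right addresses, which is exactly the 8-bit home-computer experience being recreated.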


The nature of programming has changed so much since then. Back then, you could get going by knowing a very limited set of building blocks (BASIC keywords, all described in a single manual). The challenge was to build something meaningful out of those building blocks. Programming could be hard, but never too complex. Problem solving at its purest. That's what got me hooked!

Today, the set of building blocks is unlimited. The challenge is to find the right building block for what you want to do, using the infinity of resources available to you (all the APIs, libraries, Google, GitHub, StackOverflow, blog posts). You're drowning in complexity, even if what you want to program is straightforward. And you are constantly mindful of writing idiomatic code, always wondering if there's a better way to do the same thing, which is normally a good thing but can be time-consuming and get in the way of the fun part, which is the problem solving.

"Back then" means programming as a kid, on a hobbyist's microcomputer, yes?

I'm a bit confused because you compare that experience to modern programming as a software developer. I think you are thinking of the difference across 40+ years, rather than the difference of kid vs. adult.

I mean, if you were programming on a Unix machine in 1983, or a VMS machine, or an IBM mainframe, you would have far more than a single manual.

You're right, I guess I'm comparing against both those axes at the same time, since I'm relating to my own experience as a kid 30 years ago vs as someone working in tech today. Indeed, that's not really a fair comparison.

The point I was trying to make was about the distinction between programming with a limited set of building blocks vs programming with a large and open-ended selection of libraries. They don't require the same skill set, and one seems more fun to me than the other.

It is easy to get started with Lego, and fun to use. Those who are really good can build some impressive objects. The limitations of the parts make it easy to find challenges to overcome.

Mechanical engineering is a lot harder, there are so many options to consider, new options come along all the time so you're always falling behind, and cost plays a much larger and - I think - more boring role in the process.

I think that is analogous to the point you are making with computers.


I started on a Commodore PET 2001 in 1978, and wrote a horse racing game for my parents: three horses, ASCII characters randomly advancing across the screen 0, 1, or 2 spaces, with odds at the start and payoffs calculated at the finish. My parents loved it! (OTB and racetrack fans.) I then went on to a VIC-20 (1981), an Amiga 1000 (1985), and eventually a 386 PC, a Power Mac (I got Minix or some variant running on it), and then Linux, Macs, and PCs thereafter. The BASIC Stamp in 1997 really brought back old coding for me, with the addition of hardware tinkering. My fondest memories are the horse racing game, and plotting the 4 main moons of Jupiter on my VIC-20's thermal printer. Figuring out how to program those two programs started me on coding. Funny, I have always coded, but was only an employed coder for two years at one job. I program for myself, and for work when needed (mainly technical computing).

BASIC was my first love. I was 10 when I got my first computer, a Tandy TRS-80. It came with an instruction manual with example BASIC programs, and I could save my programs with an external tape deck. Something awoke in me, and from then on I knew my calling. I still remember the first command instruction: SOUND 39,20

I recently found a copy of the Commodore 64 programming manual at a garage sale. Of course, I had to buy it. Compared to the *nix paradigm with heaps and stacks, it's very, er, basic. The manual explains that you are just loading registers with instructions when you program. No heap, no stack, no memory management.

You're loading memory with instructions and data, registers with operands, and yes, there's definitely a stack that's used heavily.

There’s this naive notion that programming on these 8-bit microcomputers was based on BASIC. The speed of these Z80 chips was 1-3 MHz.

Almost all games and programs in the 8-bit world were written in assembly. The BASIC interpreter was just used as the bootloader to load and run these programs; BASIC was in effect the shell interface and the toy language, but the assembly programs were what sold the computers.

If you wanted to do something on these computers, it had to involve assembly because of BASIC's slowness. And there were newbie books about assembly language, assembly libraries, and tons of assemblers and debugging utilities available everywhere.

> If you wanted to do something on these computers, it had to involve assembly because of BASIC's slowness.

No, you could do all kinds of useful, productive things in BASIC, which were still orders of magnitude faster (and also still more accurate) than manual processes you were replacing.

Today, you cannot even boot a PC without looking up documentation on the internet from five consortiums. Even then it might not work, because specifics are left undefined.

Try to use the EFI shell? Oh wait, your machine doesn't even have one. U-Boot is not even programmable...

Thinking about how to introduce my 6-year-old to the world of programming, I decided she is going to write her first code like I did: using BASIC on a C64.

Luckily there are many JS C64 emulators online, like this one: https://virtualconsoles.com/online-emulators/c64/ so it's a breeze to get started.

And of course her first program was

  20 GOTO 10
exactly 33 years after her dad made his first.

She also tried Scratch and liked it, but to me, in order to write code you need to learn to "write" it.

I've been quite into retrocomputing for a few years, and have recently started to amass a collection of my favourite machines from the 8-bit era .. and one thing I'm really intrigued by is the 'lost tech' of these machines.

I remember for one of these platforms, there wasn't really a great commercial release scene - but there was a great home-hacker scene, with type-ins from magazines and so on ... and I remember having quite a small library of routines and utilities, saved on cassette tape, that could do various things - fast scrolling, tape copying, UNDELETE commands, and so on. But now these things are lost to time (still out there on my cassette collection, wherever it is these days) .. and we have to re-create them.

So now one of my favourite aspects of the hobby is the reconstruction of all the 'cool utilities and stuff' that made the platform great in the 80's. It's not so easy! For some of the obscure platforms, we really have to dig deep .. fortunately, though, 8-bit computer magazines seem to have been pretty well preserved on the Internet. It's just a matter of going through them, spotting the gems for the obscure platform, and re-typing it all in, lol. ;)

But that said, having a variety of 8-bit machines at hand is really a special treat. I always wanted a Spectrum machine, and now it's finally affordable. ;P

If you are looking for 80's listings, please check out http://www.hebdogiciel.free.fr/

The site contains all the listings from the Hebdogiciel magazine, re-typed and ready for download. The magazine was a must for all French-speaking computer kids back then.

Thanks for that - as an Oric-1/Atmos fan, I'm already aware of it, though. ;)

No doubt other readers of this thread will find it useful.

Wow, what a throwback - this was me, but I had already cut my "basic" teeth on the Commodore VIC-20, quickly upgrading to the C-64 as soon as I could get someone to drive me to the nearest city to buy one with my meagre earnings. I'm still fumbling my way through "programming" today...

In 6th grade there were Apple IIs at school, and I was part of a group of honors kids that got to code BASIC on them. I made a Tron disc game; it wasn't good, but the ability to create a game, render block graphics using text, and code in BASIC was magic.

My friend also had a Commodore 64, where even starting a game meant knowing some commands (LOAD "$",8 to read the disk directory, LIST, then LOAD your game). Computing and coding/interfacing with the computer was more involved but simple as well, and led to lots of fun learning to code and create.

Between these two machines (Apple II and Commodore 64) I fell in love with coding and games. I never had an Amiga, which was a bummer, but these were enough to inspire kids to be creators with code.

In high school my teacher Mr. Isles was big into the internet and media computers; we were watching TV on a computer, playing games like Scorched Earth, and making games in Pascal while browsing the web. It was amazing and moving fast.

Flash had the same fun factor from the late 90s to around 2006-ish, before the mobile phone came out. Flash communities were very special to both designers and developers; it was powerful that either could create games, interactives, experiences, and Amiga-like demoscenes.

When mobile truly arrived in 2007 and the iPhone and smartphones upped the game, I was blown away that OpenGL was on it, and I knew immediately that this was a new handheld gaming market I had to get into. Mobile existed before that, and I was making games on Windows Mobile, but everything changed with iPhone/Android in the ability to create. That fun creation market is still going on today; now we are onto fun interactive tools like augmented reality and location-based games.

There are really inspiring innovations happening all the time in each generation, but it does seem like the fun, really interesting platforms are led by gaming (or apps today), and by areas approachable by designers and developers alike, where the platform makes it fun to create interesting games, apps, and interactives.

I think it is really important for platform designers/developers to make their platform approachable and simple, and to reduce complexity, so that it can attract people interested in creating. An engineer's job is to create simplicity from complexity: to design platforms smartly, with senior skill, for the junior. In some areas today we are failing at that due to heavy specialization. Every truly successful platform that really hit and pushed innovation/creation forward did exactly that.

Actually, the memory mapped approach of the C64 is an abstracted API. It’s so immediate and intuitive, though, I can see why you might not think of it that way.

(I started on the PET and later the C64, and had so internalized this approach that I was baffled when I first encountered systems where this wasn't the primary approach.)

I think there’s actually a lesson here for API design today: the power of the “everything’s-a” approach. Then it was everything’s a memory location. But generally the everything’s-a approach allows for a flat abstraction space with low cognitive overhead and high inherent/automatic composability, leading to short learning curves and high productivity.
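A toy sketch of that flat surface, in Python. The one real detail here is the C64's border-color register at address 53280 (POKE 53280,2 turned the border red); the rest of this little machine is invented for illustration:

```python
# Toy model of an "everything's a memory location" machine.
# Only the border-color address 53280 is borrowed from the real C64;
# everything else is invented for illustration.

class MemoryMappedMachine:
    BORDER_COLOR_REG = 53280  # the C64's VIC-II border-color address

    def __init__(self, size=65536):
        self.ram = bytearray(size)
        self.border_color = 0  # state normally owned by the video chip

    def poke(self, addr, value):
        """One flat write API; devices hang off specific addresses."""
        self.ram[addr] = value & 0xFF
        if addr == self.BORDER_COLOR_REG:
            self.border_color = value & 0x0F  # color registers are 4-bit

    def peek(self, addr):
        """One flat read API for everything."""
        return self.ram[addr]

c64 = MemoryMappedMachine()
c64.poke(53280, 2)       # the BASIC idiom: POKE 53280,2
print(c64.border_color)  # 2
```

The appeal is exactly the flatness: there is nothing to learn beyond peek and poke, and composition is automatic because every device speaks the same two-word vocabulary.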

"I think the people that first encountered computers when they were relatively simple and constrained have a huge advantage over the rest of us.

Today, (almost) everyone knows how to use a computer, but very few people, even in the computing industry, grasp all of what is going on inside of any single machine."

An astute observation: I see a lot of complexity in IT today which obviously comes from not having a clue how the hardware functions and how it's efficiently programmed. The complexity, performance and resource hits grow with every layer of abstraction. Convenient for those who write software, very bad for users who then needlessly suffer.

This is literally how I got hooked on computers in the 80s. That and a soldering iron.

+1 for the soldering iron. There was nothing quite so stressful as the first time you plugged in the prototype relay controller you'd built on Veroboard out of salvaged parts and a couple of 74-series logic chips. The whole thing was connected directly to the CPU via the edge connector, and if you didn't have flyback diodes and enough decoupling then every time the relay switched, the computer would crash with a psychedelic random bitmap on the screen and you'd wonder if it would ever work again... Ah, the nostalgia!

Same. I think I typed my first game into my 2nd-hand ZX Spectrum in '84. I remember trying to save it to tape, and the tape recorder chewed the tape...

I remember seeing those DIY kits on electronic stores.

A magic feature of BASIC was that you didn't need to learn different editors to get started. It was based on line numbers, so the line number was how you added, changed, and deleted lines.

I could walk into a computer store at the mall in the early/mid 1980s, spot a model of microcomputer I had never encountered before, and type:

   10 PRINT "Oh no! Something is going to blow! Run!"
   20 GOTO 10
Then scurry off while snickering.
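The editing model is simple enough to sketch: a program is just a map from line number to statement, and typing a line updates that map. A rough Python model of the convention (illustrative only, not a real interpreter):

```python
# Sketch of BASIC's line-number editing model: the program is a mapping
# from line number to statement text. Typing a numbered line adds or
# replaces it; typing a bare line number deletes it.

def enter_line(program, text):
    num, _, rest = text.partition(" ")
    if rest:
        program[int(num)] = rest     # add or overwrite this line
    else:
        program.pop(int(num), None)  # bare number: delete the line

def listing(program):
    return [f"{n} {stmt}" for n, stmt in sorted(program.items())]

prog = {}
enter_line(prog, '10 PRINT "HELLO"')
enter_line(prog, "20 GOTO 10")
enter_line(prog, '10 PRINT "OH NO!"')  # retyping line 10 replaces it
enter_line(prog, "20")                 # deletes line 20
print(listing(prog))                   # ['10 PRINT "OH NO!"']
```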

Can we replicate this experience on modern PCs using VMs?

This brings back so many memories from being about 12 and figuring out BASIC on my dad’s PC.

Why did programming get so complicated?

I own 10printhelloworld.com, but unfortunately 20goto10.com is only available for a whole heap o' money. :(

This is brilliantly written. Thanks and kudos to the author!

This is my current research [1]! In 2015, I was at PyCon's Educational Summit when I thought about integrating some of what we do in martial arts into CS: drilling moves to use in sparring/randori. Sparring/randori is a high intensity activity that requires fast problem solving skills which rely heavily on muscle memory due to the speed involved. Additionally, forcing a beginner to spar is one of the fastest ways to make them quit [2]. I think this is one of the reasons why CS has a high dropout rate: we are asking students to "spar" (problem solve) too early or incorrectly, and as a result they quit because they hate feeling like failures. Instead there should be some level of drilling before getting "thrown to the wolves" (as I used to tell my students) to build their confidence and understanding. I don't think traditional small/large scale programming exercises fully tackle this problem.

I think drilling is something we do in almost all technical skill development (music, art, athletics, vocational), and I wanted to bring the same thing to my CS courses, so I started requiring typing exercises as one of the assignments for the week. These aren't just "type a for loop 10 times"; they include additional context (for example, the link below shows regular expressions for addresses) to give students something they can use as a template for their programming exercises. To combat copying and pasting, I just made the code an image. In my first link, you'll see an example of using a regular expression to validate addresses. After completing this, the students would then be required to complete some Q&A exercises as well as traditional programming exercises where they needed to design functions that validate phone numbers, (a limited scope of) email addresses, and Social Security Numbers. The objective of that week was to get them familiar with regular expressions, not with finding a StackOverflow link that teaches them how to implement regular expressions.
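As a concrete illustration of that sort of drill, here is a rough sketch in Python; the patterns and function names are my own guesses at the shape of the exercise, not the actual course material:

```python
import re

# Illustrative validators of the kind described above. These patterns
# are deliberately simple sketches, not the actual course exercises.

PHONE_RE = re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$")  # US-style
SSN_RE = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def is_valid_phone(number):
    return bool(PHONE_RE.match(number))

def is_valid_ssn(ssn):
    return bool(SSN_RE.match(ssn))

print(is_valid_phone("(919) 555-0142"))  # True
print(is_valid_phone("555-0142"))        # False: no area code
print(is_valid_ssn("123-45-6789"))       # True
print(is_valid_ssn("123456789"))         # False: dashes required
```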

As the article says, this is what we did in the 80's. That doesn't make it better, it just makes it how things were done "back in the day..." However, K. Anders Ericsson states that early specialization is often a key determinant of future mastery, and that deliberate practice refines areas where an individual struggles and may be unenjoyable [3] (see my older comments on grit/perseverance). Likewise, syntax errors are one of the first problems novices face [4]. By completing typing exercises, the learner does not need to worry about using problem-solving skills, which they may still be struggling with, just the correctness of the typed characters. Thus, typing exercises give the learner deliberate practice at resolving a simple but major issue. Additionally, typing exercises make students actually type out example code rather than just copy and paste it. With syntax errors mostly resolved, the student can then focus on problem solving rather than on where the semicolon should go.

I currently have a SIGCSE paper under review, but the gist of the paper is that students who voluntarily completed typing exercises performed better in their class than students who did not. The students may simply have been more motivated, and that may be why they scored higher, so there is a limitation to my study. I could require it, but then designing a control group that would receive the same amount of learning would be difficult as well.

[1] https://research.csc.ncsu.edu/arglab/projects/exercises.html (the Heroku link is currently down as I've recently made changes to the live version)

[2] https://www.youtube.com/watch?v=hHebXvoHue0 (Rener Gracie is a character, but listen to those first few minutes)

[3] https://en.wikipedia.org/wiki/Practice_(learning_method)#Del...

[4] https://dl.acm.org/citation.cfm?id=2677258
