Every time I consider the simplicity of computing in the early 90s (when I was a kid) I have a pang of nostalgia.
We've made some wonderful things together as an industry but the author reminded me of something poignant in his last paragraph.
When I was younger I used to strive to keep my computer running as long as possible. I relished the idea that in the future, it would never need to turn off. Now I live a life where I seem to be hoping to do the exact opposite.
> Every time I consider the simplicity of computing in the early 90s (when I was a kid) I have a pang of nostalgia.
Nostalgia is a very subjective thing. Today, people fondly reminisce about their first BASIC application or the first program they wrote that opened a network socket. In the future, the old hats may look back longingly on setting up their first multi-server cluster or modifying an app to take advantage of parallelism or a GPU. Nostalgia isn't going anywhere.
Personally I think nostalgia is a proxy for a generalized desire for simplicity. Most people don't long for the days of swapping out 10 different floppy disks to install an operating system, or of messing around with IRQ/DMA and UART settings to get a modem configured. It's the simple things (like BASIC) that are remembered and missed.
I think nostalgia is more complicated than a simple desire for simplicity, though that may be part of it. It's not just that things were simpler back then; as children we didn't understand the complexity. Like people who don't want politics in media/art even though it has always been there; they just didn't notice it as kids.
Maybe something like a cheap Chromebook that boots into something like BASIC where it's easy to write simple PRINT and GOTO based programs, draw stuff on the screen, and beep and boop the sound card.
It's having access to simple and easily composable basic elements, like Legos, that made computing so accessible back in the day. Computer gaming was my big motivation.
So, this simple platform also needs a decent collection of fun and hackable games, and other useful applications.
I like this idea. Sounds a bit like the PocketCHIP but scaled up to a normal size laptop.
You could actually implement this using low-end Chromebooks as a platform, just flash them with whatever simplified OS you are using. It would be a great tool for teaching programming basics.
Reminds me of the Pharo Smalltalk VM or Lisp Machines. There is something powerful about a whole OS using a single language and allowing you to interact with and alter everything.
I'd agree with that. I want a new DOS really badly. Not something with DOS's archaic limitations, of course, but a modern version of a system that simple.
I mostly remember everything being expensive. Today if I want to learn a new language I just download it and start working. Back when I started you had to put your hand in your pocket for everything.
I was able to write long novels as a child in DOS programs. I was never able to focus on word processing in a windowed environment. Full-screen “distraction-free” writing utilities attempt to recapture the simplicity of the old days... but it still just isn’t the same as WordPerfect.
The Raspberry Pi offers a tonne more since it is trying to be everything to everyone, at least in the domain of computer education. I agree that this obscures the obvious place to start, yet it also has some benefits. For example, there are Python libraries for GPIO. While interfacing electronics projects with a 1980s-vintage computer was certainly possible, it did not afford such ease.
On the flip side is RISC OS on the Raspberry Pi. The OS is pretty much stuck in the 1990s with some throwbacks to the 1980s. For example: it is easy to drop into BASIC, then write programs as you would in the early days of personal computers.
> While interfacing electronics projects with a 1980s-vintage computer was certainly possible, it did not afford such ease.
One of my favorite examples of interfacing 1980s micros with electronics is the 8-Bit Guy's[0] series on driving LCD character displays from the Commodore 64's user port[1].
My point in bringing up the Pi was that it feels closest to the computers I had growing up. Tinkering was and is the point. With most computing devices today, that is not the case.
For me the nostalgia is for something a little deeper: developing under resource constraints. Does anyone else feel that resource limitations pushed us toward good software design and code? Or am I off my rocker?
I still write all my software like that and am able to do so because of the programming language choices - shell and AWK, running on a true UNIX®️. My programs are full-blown applications compliant with all POSIX command line standards, yet they are only a few KB in size and use maybe several hundred KB of memory. They run so fast I never need to translate them into a compiled language.
Even when I write in C, I still write the program to be as small and as fast as possible, although sometimes I sacrifice memory consumption for that, but I'm solving a different set of problems in those cases.
I won't name the industry out of privacy paranoia, but I do large scale IT technical architecture and lots and lots of classic system engineering (requirements gathering, writing specifications, implementing those, writing manual pages for the software I develop). The reason I'm able to do that is due to massive off-shoring at the client where I consult: I am one of the handful of people left in "a high cost location" who can debug, link, compile and package software correctly, thereby making large and complex software easy to install and JustWork™️. Management is for the most part unaware of all of my software running underneath until some major application goes live and then I give them a presentation on what is ticking underneath.
This comes from my days of assembler coding of intros and cracking software on the Commodore64 and the Amiga: one learns the value of writing small, fast code very quickly and that programming and code design style stays with one for the rest of one's life.
With its vast memory, processing power and built-in tools like AWK, UNIX®️ makes that style of programming small, fast software even easier. I spent decades studying what all UNIX®️ and its tools can do and have to offer though.
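As a sketch of the small shell-and-AWK style of program described above (the log format, field positions, and filename are illustrative, not the commenter's actual tools):

```shell
#!/bin/sh
# Sum the byte counts (field 10) per HTTP status code (field 9)
# from an Apache-style access log -- a typical tiny AWK "application"
# that needs only a few KB and runs fast enough to never need compiling.
awk '
    { bytes[$9] += $10 }
    END {
        for (code in bytes)
            printf "%s %d\n", code, bytes[code]
    }
' "$@"
```

Invoked as `./sumbytes access.log` or fed from a pipe; the whole "application" is the script itself plus a manual page.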
I keep a 2001 era Pentium III laptop around for when I'm feeling nostalgic or the need for distraction-free writing. The keyboard is excellent even by desktop standards, much less for a laptop; the 4:3 screen is better suited for writing than my 2K main display, and it runs OpenBSD with Fluxbox which is about as distraction-free as one can get and still have a stacking window environment. It has one USB port, no built in comms other than a dialup modem (I have WiFi and Ethernet CardBus devices for performing updates and syncing files), and it can house two batteries if I take out the CD-ROM drive, making it a great portable word processor.
It won't play any games beyond the occasional Solitaire, it isn't powerful enough to edit modern video files (or even play them), and it's atrocious on the modern Internet unless I use a text only browser. But it is my favorite writing machine, and I'll likely never get rid of it.
My computer in the early 90s was a Mac LCII running System 7. It was no “simpler” than a computer I have now. I would be running three or four apps and it would be multitasking (cooperative but still). Instead of a web browser, I would have a dialup terminal or AOL.
This will sound like HN's initial dismissal of Dropbox's business idea, but for distraction-free writing I use a non-WiFi Raspberry Pi with Emacs on a framebuffer (no X). I never turn it off.
Sure, it's not very portable, but it isn't $600 either. And I version control with git and push occasionally to GitHub for backups.
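That backup workflow is plain git; a minimal sketch (the remote URL and filenames are illustrative):

```shell
#!/bin/sh
# One-time setup in the writing directory
git init
git config user.name  "Writer"                 # skip if configured globally
git config user.email "writer@example.com"
git remote add origin git@github.com:example/writing.git   # illustrative URL

# After a writing session: snapshot locally, push offsite occasionally
printf 'Chapter one...\n' > chapter1.txt
git add -A
git commit -m "Writing session, $(date +%F)"
# git push origin master    # run whenever a network link is handy
```

Since git keeps full history locally, the machine can stay offline for weeks and still push everything in one go.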
> More sophisticated than MacWrite, Apple’s word processor, the program is still extremely basic—the only reason I chose (Microsoft) Word was so I could open the file on my modern Mac to edit and file it.
That's a pretty interesting sentence: modern Macs can't open a document created with 1980s MacWrite, but they can open one created with 1980s Microsoft Word, by using a modern version of Microsoft Word.
"I glance past the small form of the Macintosh and ponder that idea while this file saves. The drone of the disk and the fan seems to lull into a resonant frequency, with the desk, the chair, and my body and brain connected to them."
Whenever my normally-silent MacBook needs to spin up its fans (compilation, DB restores, or any mid-sized 3D render), I love the audible reminder of how unfathomably hard it's working, and (by extension) what cool stuff is going on behind the scenes.
Meanwhile, from the everything-old-is-new-again department, note the F-keyless Mac keyboard ... and a Mac line largely bifurcated between underpowered-yet-expensive all-in-ones and multi-thousand-dollar modular systems priced to fleece the employers of creative professionals.
> By today’s standards the machine is a dinosaur. It boasts a nine-inch black-and-white display.
That was pretty ancient-seeming back then. At least, as a kid, I couldn't fathom what they were thinking. It made the old Apple II seem better, and certainly anything from Commodore blew it straight out of the water. That was my impression then anyway.
The 512x342 resolution was crisp compared to the Apple II's 280x192, in part because the Mac had a built-in display, while the Apple II was typically connected to the family TV set using an RF modulator. I remember seeing the Mac screen for the first time. The experience was like seeing your first HDTV images; you could never truly see the older technology the same way again after that.
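A quick back-of-the-envelope check of those pixel counts bears out the difference:

```shell
# Total pixels on each machine's display
echo $((512 * 342))    # Macintosh
echo $((280 * 192))    # Apple II hi-res

# The Mac pushed roughly 3.3x as many pixels, on a sharp built-in CRT
awk 'BEGIN { printf "%.2f\n", (512 * 342) / (280 * 192) }'
```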
The Atari ST had a similar high-resolution, high-contrast black-on-white monitor designed for text display. If Commodore sold anything similar, I never saw it.
> The version I have retailed for $3,900, or about $8,400 in 2019 dollars.
Had a similar Mac SE with 1MB RAM and 20MB HD. From the university store it was student priced under $2500 with the standard ADB keyboard (iirc $130 and normally not included!), an Imagewriter II printer ($600), Microsoft Word 4 ($300), and SuperPaint ($100). Still pricy but just to give folks an idea on the retail markup on Macs back then. Plus the dealers often quoted more than sticker.
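Taking the article's figures at face value, the inflation multiplier works out to about 2.15x, which puts that student price in perspective too:

```shell
# $3,900 in the late 1980s ~= $8,400 in 2019, per the article
awk 'BEGIN { printf "%.2f\n", 8400 / 3900 }'            # the multiplier

# Applying the same multiplier to the ~$2,500 student price
awk 'BEGIN { printf "%.0f\n", 2500 * (8400 / 3900) }'   # still over $5k today
```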
I was programming an ESP8266 yesterday and noticed I felt acute distress at the program size: a whopping 250 kilobytes, with only 40% left. This was a latent memory from the 1970s, when 250 single bytes was plenty, enough for almost anything.
256 kilobytes was called a double mega on MSX systems and was used mostly in very large games. They were cartridges, mostly from Konami, that held that whopping amount of game. Strange to think that most web pages are larger than that now.
I have one myself. But it is important to know its limitations -- most notably a maximum of 32K (which, like on a modern smartphone, is both RAM and storage). So you are at best going to be writing very short documents. Which is why it was very popular among newspaper journalists writing columns in the field -- they would use a modem to transfer their column (often only a few paragraphs) to their editor.
It’s not magic. Some of us get the appeal of vintage machines, but it’s not magic that these systems work. I have a large room full of them that I use daily.
> This [power supply and hard drive being loud] is the experience a computer user would have had every time she booted up her Macintosh SE, a popular all-in-one computer sold by Apple from 1987 to 1990.
A lot of that is from the hardware being 30 years old. Fans and drives get louder.
There were so many nice computers with operating systems sporting cool features, and the author picked the least interesting one. We've already seen this type of "look, I wrote text on my old Apple" blog post so many times. This is terribly sad.