Ask HN: Why does current interest in retro computing focus on the early 80s?
38 points by amichail 5 days ago | 85 comments
Will future generations remain more interested in this time than in any other?

I believe the main reason for the focus on these earlier decades is the openness of both the hardware and the software of the time. Manuals published by computer manufacturers contained detailed schematics of all the circuits and assembly listings of the BIOS and system programs. As a hobbyist I can build my own SBC based on these schematics and probe physical pins on the chips in order to debug the board. As things became smaller and more integrated, chips started including more functionality and closed-source firmware, and integrating them into your own designs became increasingly difficult.

I think it's even simpler than that. The cohort with the largest amount of spare cash flow and moderate amounts of hobby time right now (roughly 35 to 55 years old) remembers that era of computing fondly from their childhood. It's the same reason junk cars from the 1950s were all the rage in the 1990 to ~2010 era. Whatever the cohort currently richest in spare cash flow remembers positively from its youth gets a resurgence.

Brings to mind that 1984 video of Stanley Kubrick discussing the quality of computer manuals: https://www.youtube.com/watch?v=tlsZoZLlwC8

Today it's hard enough getting OEMs to explain what their features do to anyone smaller than an OS vendor, much less how they work.


Others have answered why not later, but why not earlier is also a question. Earlier computers were a lot more work. CPUs weren't single chips; they were collections of dozens or hundreds of ICs. Before that they were thousands of individual transistors; Ben Eater has a YouTube series on designing and building such a machine. Before that came vacuum tubes, of types that are no longer produced and are rarely found in working condition. Before that came electromechanical systems with machined cams, cogs, wheels, lots of relays, etc. So the earlier in computing history you go, the more expensive and difficult it gets.

Early 80s parts are a bit of a sweet spot: low price, enough challenge to be interesting but not so much as to be frustratingly difficult, and an interesting turning point in computing history, as the first single-IC CPUs became popular.


I'm fascinated by the early PDP-11 range - and even if I saw one readily available, reasonably priced, and close at hand (shipping is not trivial for those machines), I don't know if I'm actually competent enough to handle a machine where normal configuration duties include changing the wire-wrapped backplane!

That's part of the fascination too though.


Ownership? It was the personal computing revolution, after all.

I think that's a big draw to 80s computers. It was the first time most people knew someone that had a personal computer.

Yeah, exactly this. I've been toying with the idea of making a computer out of individual transistors, but building something capable enough to run an interactive CLI would probably take years, thousands of dollars, and some nontrivial engineering.

Hooking up a 68k to some peripherals? Quite doable. Building something out of 7400 ICs? Tricky, but if your expectations aren't too high definitely possible. Transistors, vacuum tubes, mechanical stuff? You better be really dedicated to it!


> Hooking up a 68k to some peripherals? Quite doable.

I did that for my project in the microprocessor lab class I took in college in the early '80s. It is indeed quite doable [1][2]. It was a bit scary, because the 68k was quite new and Motorola only gave the school a small number of samples. I was told that if I fried it or broke it by not being careful enough when removing it from a socket I would not be given another one. I could not afford to buy one either. And I was taking this class in the last term of the school year as a senior, so failing it would mean not graduating.

> Building something out of 7400 ICs? Tricky, but if your expectations aren't too high definitely possible.

I considered doing that. EdX has a version of MIT's 6.191, "Computation Structures", which goes through how logic gates work at the MOSFET level on the chip, then covers combinational and sequential logic, followed by how to design a 32-bit RISC processor, which you then build and test in a logic-level simulator. (You don't do all of the processor; you are given pre-defined modules for a RAM bank, a 32 x 32-bit register file, and ROM.) That's the first two parts of the course. The third part adds pipelining and caching to the processor.

I took the first two parts at EdX and afterwards seriously considered actually building my processor out of 74xx series chips.

My parts list came to 350 chips. And that's not counting whatever parts I'd need for the RAM, register file, and ROM. That's way too big for my breadboard! :-)

My ALU design includes a shift unit that can do left or right shifts or rotates from 0 to 31 bits in one cycle and that uses around 90 chips. I could drop that, and change things so that shift and rotate instructions trap, and then emulate them in software. That cuts the chip count down to around 260.

Still too big for me. Even changing from 32 bits to 16 bits, or even 8 bits, would be too big, and so the idea to build it was discarded.
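An aside on why the shifter alone eats so many chips: a 32-bit barrel shifter does a variable 0-31 bit shift in log2(32) = 5 fixed stages, each stage a full row of 2:1 muxes that conditionally shifts by a power of two. Here's a minimal C sketch of the staged idea - illustrative only, not any particular course's design:

    #include <stdint.h>

    /* Staged (barrel) left shift: five fixed stages, each shifting by a
       power of two when the corresponding bit of the amount is set. In
       hardware every stage is a row of 32 two-input muxes, which is why
       supporting shifts, rotates, and both directions piles up chips. */
    uint32_t barrel_shl(uint32_t x, unsigned amount) {
        for (unsigned stage = 0; stage < 5; stage++)   /* 5 = log2(32) */
            if (amount & (1u << stage))
                x <<= (1u << stage);                   /* by 1, 2, 4, 8, 16 */
        return x;
    }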

[1] https://imgur.com/Ts9wcfW

[2] https://imgur.com/3D4rvdC


I think it is a form of nostalgia. But the early 80s was a fast-changing time for tech, where new and much more powerful hardware was coming out almost every day.

Plus, tech was not controlled by large corporations as it is now. Back then there were many small hardware/software vendors competing and in many cases helping each other advance. Now everything is vanilla and controlled by Large Corps.

Me, I miss the days before GUIs; I always thought plain-text interfaces were, and still are, better in many cases than point-and-click.


> But the early 80s was a fast-changing time for tech, where new and much more powerful hardware was coming out almost every day.

This. It was an amazing time. Like, it was a trip to read the ads in the monthly computer magazines. There would be several each month that would be something I never even dreamed of doing with a computer. Groundbreaking advances every month, over and over.

That feeling - that revolutionary advances were being made all the time - was amazing while it lasted. While it lasted. A lot of people may want to feel that again.

But I think you can't go home again. From the vantage point of 40 years of (slower) advances later, those things look pretty primitive. It won't feel the same.


> But the early 80s was a fast-changing time for tech, where new and much more powerful hardware was coming out almost every day.

I feel like the nineties are more like that. Deep into the '80s, most people would probably still get something with a Z80 or MOS 6502/6510. If you were rich, maybe you got a 4.77 MHz XT somewhere in the '80s. The 80286 and later the 80386 were out of reach for most consumers.

From the late '80s to the late '90s, things changed rapidly. The 386, 486 (remember going from SX to DX?) and Pentium all came within reach for normal households. At the start of the '90s most people would still have a PC speaker, but within 1-2 years everyone got Sound Blasters (or clones). Around 1992 or so, CD-ROM drives started to become popular. In 1996, 3D games went to the next level as 3dfx blasted onto the market. People started getting internet at home.

In the '90s, if you got a PC, it would already be outdated in 6-12 months. This was certainly not the case in the '80s (or the '00s, '10s, or '20s).

On the software side there were huge shifts as well. Outside Apple, PCs and home computers were mostly command-line driven in the '80s. But in the early to mid-'90s graphical interfaces took over the world (Windows 3.x becoming a major hit), true multitasking became more important (OS/2, Windows 95, Windows NT), etc.


>Plus, tech was not controlled by large corporations as it is now. Back then there were many small hardware/software vendors competing and in many cases helping each other advance. Now everything is vanilla and controlled by Large Corps.

Yes. My mother’s first computer was made by a company in Northern Kentucky called Thoroughbred Computers. It was tremendously expensive but locally made.


> "Plus, tech was not controlled by large corporations as it is now."

IBM would beg to differ. They were monopolists in a way that Microsoft and Google are only pale imitations of. "Nobody ever got fired for buying IBM" and the term FUD were coined to describe them. The entire x86 hardware ecosystem has its roots in IBM's dominance through the original IBM PC hardware architecture.

One reason the '80s-'90s are interesting to the retro enthusiast is that it was the era with the most diversity and competition in home computing. By 2000 or so, the IBM PC standard had steamrollered every other competitor, leaving only the Mac occupying a small niche.


When I went to college in the late '00s I was able to find several other Linux users. The counterculture to the Windows/Mac duopoly was alive and well. In some ways we won by being accepted as mainstream for certain niches. Between WSL, the Steam Deck and Docker, Linux has seen more adoption than I was expecting.

I've always assumed that Linux's server-side dominance was fated by providing an Apache web server for zero monetary cost. IIS was several years behind and cost money.

I can't imagine Facebook running on IIS in 2004, for example.


Computers haven't really changed much since the 1990s; I mean, I have been running Linux since 1994. Even in the late 1970s the DEC VAX had an architecture basically similar to modern computers.

What else could you be into? Computers before the IBM 360 sucked. Personally I have some nostalgia for the 1970s PDP-11 and PDP-10, but not enough to really put a lot of time into setting up emulators. (The PDP-11 has the awkward limitation of a 16-bit user address space; running RSTS/E you basically get a BASIC environment (or something like a CP/M environment) which is just slightly better than what you get with a 16-bit micro. I've used installations where 20 people could enjoy that at once, but for an individual hobbyist you might as well use or emulate a micro... But those GIGI terminals beat the pants off the Apple ][ and such.)

Mass producing 8-bit micro hardware has tough economics today (can’t beat RasPi) but it is possible to make a computer of that era with mainly or entirely discrete components. (Even a 6502 or display controller!). Forget about making an IBM PC clone or an Amiga. (An FPGA can get there if you have enough $$)

If I wasn't already overloaded with side projects I'd like to deeply explore the 24-bit world of the eZ80 - like the old 360, a place where micros never really developed a comfortable ecosystem. You could have more memory than the IBM PC in a machine that is easier to code for; back in the day you could run Linux with X Windows, or Win 95, in 16 MB of memory, so such a system could kick ass. The AgonLight 2 has a programmable display controller based on the ESP32, and it should be very possible to make a more capable sprite-based graphics system than most contemporary game consoles had, maybe even as good as the coin-op systems that supported games like Street Fighter.


It represents a unique time in the history of computers -- an important gap between the mainframe era that came before it and the standard PC and internet eras that came after it. The wide variety of inexpensive and incompatible machines offered at that time has never been seen before or since.

This maverick era produced all kinds of innovation that simply doesn't happen today with one-size-fits-all computers that are controlled remotely by operating systems and cloud services. Modern machines bear no resemblance to '80s home computers except in their purity as computing machines - Turing machines with simple I/O and programming.

Once you find out "how it all works" and that all computers are the same, you develop a certain interest in how we got there, and perhaps a wistfulness for what things might have become. Home computers of the early '80s are where that answer lies.


Yep. You had:

1) Completely stand-alone machines with just enough expansion to then get some form of connectivity (or added data storage)

2) Self-contained, instant boot systems without an "OS" as we know it today (instant on is much more fun than it might seem), and that you explored until you could learn literally what every single register/port did.

3) Single-box systems with almost everything under or behind the keyboard, that you had to plug into a TV to get anything done with.

There was a purposefulness and immediacy to using those machines that is absent from today's bloated, over-connected, highly distracting and ADHD-inducing systems.

It was, in short, a more civilised time :)


Also, what came after this era was either boring (standard x86 PCs) or too complex to easily maintain over decades, e.g. Suns or SGIs with failing [UW]SCSI drives that are hard to replace these days.

Firstly - is it really true that this is the main focus? I see a bunch of interest in the beige box era PCs as well.

Assuming it's true, though, then I would imagine there are several contributing factors:

The 80s is when computing arrived for the masses - and most of those masses were children at the time; the first computer I owned was a ZX81 and I was 9 years old when I got it. That lends it powerful nostalgia value. For later generations computers were likely more part of the background.

That generation of people is also now entering their late 40s or 50s. They probably have some income (especially if they got into IT) and their outgoings are likely tailing off - if they have kids then those kids are leaving the nest or have already. So there's spare cash to spend on all the bits and pieces that they couldn't afford back when they owned them the first time!

It's all far enough in the past that you can see it through rosy spectacles. RAM pack wobble, slow tape loading, limited memory and primitive graphics all become features instead of limitations.

Then for younger generations who are getting into this the above points mean that there's a background of somewhat knowledgeable people to propagate information about these machines.

Add on top of that the limited nature of the machines, meaning that one can have a complete-ish understanding of them (or the illusion of one). That's always been appealing.

Personally I find the 1970s minicomputers far more fascinating! But my dad worked with some of those and I adore Unix culture so I'm probably atypical.


There is a 30-year lag in collectibles; people in their 50s tend to buy what they coveted in their late teenage years. From what I understand, this trend is also present in the car market.

That's my bet; and so this 30-ish (I'd give it +/-5) year lag is the same reason that, when Back to the Future came out in the mid-80s, they traveled back in time to the mid-50s; why Forrest Gump centered on the 60s during the 90s; why That 70s Show was popular during the early 00s; why the now already decade-old Stranger Things was set in the 80s; and how 90s nostalgia has become "a thing" semi-recently.

Happy Days was big, and was set in the 50s while being from the mid-70s... but I'd argue it could have been the early 50s, the show was on for over a decade, and AFAIK it wasn't an immediate hit (though it also didn't have the Fonz at first). FWIW, there was also a big nostalgia boom in the 70s that was focused on the 40s (see the video linked below), so I would just lean on +/-5 years as to why we see little discrepancies.

https://youtu.be/l0ZXItfw5r4

"[A behavioral scientist we asked about this] determined that we are not really all that happy about the present, we are terribly uncertain about the future, and when we talk about the old days we always refer to them -- I suppose because of our collective memory -- as the good ol' days; and, I guess, really, when you go back, you really can select the better parts of the good ol' days... like the music." -- a radio executive in the 70s


I don't think it is that simple. For example, take a look at the Connections Museum in Seattle: despite most of the technology being 50-100 years old, a decent bunch of their volunteers are in their 20s! Some people just love the technology itself for what it is.

And honestly, I don't think there's a lot to be nostalgic about with semi-modern tech. The hardware is a bit faster now, but looks and works the same as a machine from 15 years ago. Windows Vista is almost 20 years old by now, and is functionally virtually identical to Windows 11: some stuff got moved around and the whole UI got a dash of paint, that's about it. Games like GTA, CoD and FIFA got flashier graphics, yet are identical gameplay-wise. Want to play something novel like Portal? Just install it on your 2024-era machine via Steam.

People aren't going to be nostalgic about 2000s or 2010s era computing for the same reason very few people are nostalgic about toasters.

There's definitely nostalgia around MP3 players, iPods, or early smartphones, though! Those were still novel, had a lot of variation, and were genuinely world-changing to people.


Any lag at all helps the value of collectibles, especially stuff that deteriorates like cars and cards.

The cars that a current fifty year old saw driving around in their late teenage years were mostly hot garbage. There's a lot for a person to be nostalgic about when it comes to the eighties, but not cars.

As an 80s kid, I want a Lamborghini Countach, a Ferrari Testarossa and a DeLorean. My computer collectible would be a Pentium i586 in glass mounted on the wall; to me it represents the inflection point when we had enough compute for anything.

I realized after some time that rather than putting a Pentium in glass, I would celebrate it by framing a card that had

4195835 / 3145727

or some other manifestation of the fdiv bug on it. Funny stuff.
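For reference, those are the canonical FDIV test operands. A quick C sketch to try on your own machine - hedged in that the printed digits depend on your FPU and print precision:

    #include <stdio.h>

    /* The classic Pentium FDIV test division. A correct FPU gives
       4195835 / 3145727 = 1.3338204491..., while a flawed Pentium
       famously returned about 1.3337390689, wrong from the fourth
       decimal place onward. */
    int main(void) {
        double x = 4195835.0, y = 3145727.0;
        printf("%.15f\n", x / y);
        return 0;
    }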


Nice flex. I once wanted a hat with 0x5F3759DF on it; but it would make me a total poser since I don't understand the algo.
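For anyone curious, that constant is the magic number from Quake III's fast inverse square root - roughly the following, modernized here with memcpy in place of the original's pointer-punning:

    #include <stdint.h>
    #include <string.h>

    /* Quake III-style fast inverse square root. The magic constant turns
       the float's bit pattern into a cheap first guess at 1/sqrt(x);
       a single Newton-Raphson step then refines it. */
    float q_rsqrt(float number) {
        float x2 = number * 0.5f, y = number;
        uint32_t i;
        memcpy(&i, &y, sizeof i);      /* read the float's bits */
        i = 0x5F3759DF - (i >> 1);     /* magic initial estimate */
        memcpy(&y, &i, sizeof y);
        y = y * (1.5f - x2 * y * y);   /* one Newton-Raphson step */
        return y;
    }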

I can respect that.

I’ve got a counterfeit Athlon XP on the wall because it amuses me.


Who said anything about coveting cars from the 80s?

The person I was responding to.

They said they end up wanting what they wanted in their teenage years, not what was new then.

I think it's because the '90s, which are far more interesting, are a continuum. The PC is partially backwards compatible over decades, so it is hard to isolate any single segment, unlike the 8-bits, which were mostly one-and-done. You can pick out some points, like the end of DOS and the beginning of Windows 95/98, but after that it kind of blurs. Tech got better and old stuff kept mostly working, so you really cannot just stick yourself at a single point.

Not that there aren't opportunities, like the peak of DOS gaming with a Gravis UltraSound and some Sound Blaster, or early 3D gaming.


8-bit computers are fully comprehensible; they're something you can build on a breadboard from components with straightforward datasheets. They're a perfect learning tool.

I don't think future generations are going to be very interested in tinkering with a C64 or an Apple II, but the 6502 will live on for a very long time.


> I don't think future generations are going to be very interested in tinkering with a C64 or an Apple II

There's a decent bunch of Gen Z'ers absolutely fascinated by the DOS 3.11 / Windows 95 era, and almost certainly also the C64 / Apple II. Both the hardware and the software from that time period were still novel, innovative, and plain weird. On the other hand, it is very clearly just a computer: you can hook it up to a network or play games on it.

It brought us Microsoft Bob; who wouldn't want to poke around with that?


Windows 95 isn't that alien compared to Windows 10; the usage is almost the same.

With RetroZilla you can even comment on HN.

An Amiga or a classic Mac, OTOH, has a weirdness point.


>I don't think future generations are going to be very interested in tinkering with a C64 or an Apple II, but the 6502 will live on for a very long time.

I think it will be the exact opposite.


Future generations are somehow going to find a way to tinker with C64 and Apple II without involving the 6502?

There are a dozen 6502s in your vehicle.

I don't think you can give one reason, because everything feeds everything else, and then there's a network effect of popularity, of sorts.

But one factor I think not mentioned yet is the film industry: I think (aside from feed-in popularity) the 80s gets outsized attention in TV & film because it's before modern technology was quite so prolific - plot lines that don't work with mobile phones, the web, etc.

And then that feeds people's interest too. Like, you can be born after WarGames or Ferris Bueller's Day Off for example, watch them, and then it's appealing because it's different and interesting, and you want more.


It's because the IBM PC was released in 1981 and, over the next several years, as prices came down, steadily overran pretty much every other competing platform, taking all the variety out of the market from a historical perspective. If you're interested in 40-year-old computers you could have a collection with a Z80 machine, a 68k, a 6502, a TMS9918A, or an 8088, and that's just variety in CPU architecture, and just some of the popular ones. Everything else was the wild west, too. Go backward too many years from there, though, and the home and small business computer manufacturing industry just isn't as big, and specimens that aren't just glorified calculators become an order of magnitude harder to find. You have to put up a lot of cash comparatively to get the fun part of the hobby - owning and using them yourself, over just reading about them, which you can do for any machine in any era - plus they're harder to service.

If you're collecting 20 years later, though, things had gotten so much more standardized and developed that it feels almost like a different hobby. Most all of what you'll buy will be some variant of the HP versus eMachines dichotomy: either an expensive IBM PC compatible using all the common standards, with a high-tier x86 that mostly does what they all do but faster, and maybe an add-in specialty card, or a cheap IBM PC compatible with some of the common standards, some things shaved off for cost, a low-tier x86 processor that's just more frustrating than your fast one, and a motherboard covered in cheap components that you have to solder in replacements for before it even works again.

I've painted a bit of a skewed picture here, but not by much. You can still collect later computers, and people do, but it's understandable that most people are drawn to the "cambrian explosion" of the whole line of history, no? Variety is the spice of life, and plenty its staple food to be spiced.


It is probably due to the age demographics. People who are nostalgic for those years of the microcomputer revolution are both (1) relatively large in number compared to earlier cohorts in computing and (2) still relatively young and active. Many are also heading into retirement about now.

Then there is the practicality. The machines from that era were small. If you get your hands on the actual hardware, you can have it at home, set up in a small nook somewhere. Not so easy with some IBM 709 or something.

The stuff is easy to work on, and many components are still easily available.


Because it’s what a significant number of current developers grew up with. It’s also the first time we saw computers represented in movies etc.

In the future I bet we will see a wave of nostalgia around Windows XP (in fact we’re already seeing it)


I realize you said the early '80s, but the '80s in general encompassed a ton of progress: from the 8-bit Apples, Commodores and Ataris, to the IBM PC and the clone wars, to the rise of the 16-bit Mac, Amiga, and ST. Even the 386 was available by 1986. There were also many high-end 32-bit workstations (Sun, NeXT, SGI...). Tons of variety in hardware.

The '90s were more about software (NT, Linux) and connectivity (dialup Internet going mainstream, the first home broadband connections, etc.). Hardware felt mostly incremental: faster CPU, more RAM.


The Pentium and its successors ate Unix workstations in the '90s.

Also, you forgot 3D for the masses, plus multimedia: realtime audio and video on PCs.

Even Linux got 3D hardware acceleration with DRI through Mesa 3D.


At a guess, because that’s roughly the time most people currently in a mid-life crisis consider their childhood?

My guess is “retro” moves on roughly as fast as we do.


As someone born in 2006, I can say that while that may be a contributing factor, I and many others my age are simply intrigued to look back at what the devices and software we use today descended from.

Although, perhaps it is the accessible creations of nostalgic mid-lifers that provides the foundation for our curiosities?


We fund the cost of your exploration with our nostalgia and retirement dollars. It's our pleasure to share with you what made our heyday great to us.

Well, “great”… I still remember the taste of hairspray.

Yep, so expect that in 2040 “retro” will mean the iPod and 1st gen iPhone. A working blue and white Mac G4 will be worth a fortune.

Folks aren't going to like it, but this is the answer. It might not be a mid-life crisis, but the current crop of us were all born in the 80s.

Came here to say much the same thing, and it's very sad that you're getting downvoted for such an innocuous statement. Many new retirees and empty-nesters (or those approaching such milestones) first got into computing then. That creates a unique convergence of time, money, skills, and inclination to re-create the machines of that era. I'm sure that's not the whole story but surely it's a contributing factor.

16-bit computing was the domain of inspired individual efforts. 32-bit computing brought about "communities" that overshadowed the individual geniuses contributing to them.

Right now there's an extra boost because of the "that's what I used as a kid" factor. In the future the DOS era will still be studied, as that time when collaborative development had not yet been invented.


Agreed, but small nitpick: the early ‘80s was mostly 8 bit, eg the C64, Apple II, MSX, etc.

It was possible to acquire a more or less complete idea of how these machines worked. My C64 had documentation that detailed every memory location, register and port right down to the hardware. It made them really compelling to ‘80s nerds like me.


>overshadowed individual

>collaborative

Say what? ITS, Emacs, MacLisp, Macsyma on top of MacLisp, Arpanet, SUPDUP and WAIS too. Remote process debugging and hooking, a MacLisp close to the future GNU Emacs's Elisp, the proto-internet, AI, video terminals...

https://github.com/PDP-10/its/blob/master/README.md

http://www.hakmem.org/

These have been built collaboratively, for sure.

GNU and Emacs brought that spirit back. Well, at least they tried:

- Open Adventure

- Emacs

- Elisp (and CLISP/SBCL with Common Lisp)

- GDB (a far cry from DDT at ITS, but this is Unix, not ITS)

- Maxima. Code from old Macsyma will still work as-is, except for the Gnuplot-calling parts, but the plot itself will be 100% the same.

Have fun.


> extra boost because of the "that's what I used as a kid" factor

I've made this observation several times: the people buying these things who are young enough _to have never used them in the first place_ outnumber us dinosaurs who have used them by an order of magnitude or more.

I suspect it's just a matter of fashions like everything else.


60s/70s computing was dominated by mainframes and dumb remote terminals for them, wasn't it? There's not much you can do on your own with that tech, and I'm just guessing here, but it's nowhere near as available anymore. Plus, of everyone I've met that has done programming with punch cards, nobody said they missed it.

Because pixel games are an interesting aesthetic that's still pleasing now, whereas 3D games with ten polygons are the same as what we have now, just looking much, much worse.

We still get games with "pixel" aesthetics, and they look just as good as Zelda did back in the day. We don't get low-poly games like Tomb Raider 1.


>low poly

There are, such as Post-Apocalyptic Petra

https://bad-sector.itch.io/post-apocalyptic-petra

http://runtimeterror.com/tech/petra/

On pixel art, I remind you that late NeoGeo games such as Garou look a thousand times better than most 32-bit games, even Crysis. If anything, we need to head into the 64-bit peak of 3D gaming, something with true realtime raytracing.

The peak of 3D art from 32-bit machines is maybe Crysis and some 2010 games, where the OS's limits (even with PAE) show up in the architecture, yielding far less data managed per cycle, even with GPUs helping out. These should be compared to the peak of 8-bit CPUs doing pixel art, such as the Game Boy Color running Cannon Fodder, which was a masterpiece.

The peak of pixel art from the 16-bit era (and 24/32-bit with the NeoGeo) should be compared to maxed-out 64-bit machines with almost a TB of RAM and GPUs able to host Windows 10 in VRAM, if they could.

The early PSX games should be compared to the early OG Game Boy games such as Mario Land, or maybe the Atari 2600 because of the novelty of the architecture.


> There are, such as Post-Apocalyptic Petra

I wasn't claiming a complete lack of low-poly games; my argument still stands with a few of them existing.

I agree with your comment, which was my point: 80s games were beautiful.


No, not '80s games; they pale against the SNES's top-notch pixel art in JRPGs and the fighting games from the NeoGeo.

Late '80s games, maybe; but even late PSX games look good enough - just check Ridge Racer 4.


It's funny you say that, since it seems like early 3D games are seeing their own retro revival. Quite a few collectathon platformers based on Super Mario 64, Banjo-Kazooie and Spyro the Dragon are being created by indie devs at the moment, and quite a few horror games are taking on a Silent Hill/early Resident Evil aesthetic too. Heck, you could even say the deluge of mods and ROM hacks for games of that era are evidence of nostalgia for the period too...

You might even say we've gone past that era, given that PS2/GameCube/Xbox games seem to be getting a fair bit of love from people who grew up with them recently, whether through mods based on them or indie games inspired by them.

But maybe retrocomputing as a whole is a bit different there.


On the other hand, get to something like the Atari and, well, how many new things are emulating that?

Yeah, the Atari was too early to look good, so nobody wants to play those simplistic games.

Draw a graph: write all the decades since computers were invented on the x axis, and on the y axis "how many computers were sold that year and still work well enough to be resold in 2024". Don't waste time looking it up, just best-guess it. Now draw a vertical line where you define the word "retro" to apply, and erase everything to the right. It's a subjective word, so there's no wrong answer. Now re-ask your same question and look at your drawing.

Personally, I see the most interest in "Windows 98 era" retro computing (Winamp, Age of Empires 2, millennium aesthetic), and rarely see 80s stuff anymore, but the people whose interests I follow are probably younger. 10 years ago I would have said the 90s, and 10 years before that, the 80s, so perhaps you're 20 years older than me? Just a guess.


I think it will remain the time period with the most interest, similar to how electric guitar and amp enthusiasts view the 60s.

For electric guitars, the 60s wasn't the decade where they were invented (30s) or the decade where they found their "final form" (50s), it was when they entered the cultural zeitgeist.

Despite the popularity of 50s guitars/amps, and later decades slowly rolling into "retro" status, the era of the Beatles, The Who, Hendrix, the Rolling Stones, etc. will always be the most popular.

A curious part of this is a sort of generational nostalgia transfer, where (for example) the popular bands from the 90s had this nostalgia of the 60s, so they used old guitars and amps and were influenced by 60s music which caused their fans to have the same view.


1. Recapitulation of our childhood is particularly strong in my generation. For example, even after the Transformers movies, we obviously thought it wasn't enough like our memories, so we made Bumblebee.

2. Software for some 80s systems is ubiquitous.

3. Sound Blaster hell/IRQ hell is a real thing, and DOS is more frustrating than fun.

4. Prior-generation computers were made for work, and fun was a rare side effect of letting off steam.

5. Since Windows 2000 modern computers run the same software, so your retro is limited to beige boxes with flaky capacitors.

6. Non-Wintel 90s+ computers are interesting and are indeed becoming a focus of retro computing. PowerPC Macintosh, Acorn, BeBoxes, NeXT, SGI, even HP and DEC RISC computers are finding enthusiasm.


The 8086 is from the late 70s. The x86 architecture would still take 10 more years to start dominating the market. Recent processors don't differ too much in terms of instructions, so those processors are basically just simplified versions of what we currently have, which understandably doesn't pique too much interest. Therefore the early 80s, still rife with other architectures, is more attractive.

I wonder whether, now that the market is slowly shifting to ARM and RISC-V architectures, we might see a similar trend 30 years from now, with people starting to procure x86 chips with similar interest.


Because '83/'84 was the golden era of personal computers. Even though the 8-bits were very limited, they were affordable.

Imagine: sales of Commodores and consoles reached millions annually.

What's also important: fortunately, the 8-bits were good enough, and (mostly) fun enough, for the tech level of that time (VHS, AM radio).

Previous computers were extremely expensive, so they were limited to military/science or business. Later computers became a boring commodity.


You can more or less fully understand an entire 80s home computer: the CPU, the memory access, the peripherals, and the graphics subsystem (…if you can call it that). Operating systems were simple by modern standards. Hardware is relatively cheap and emulators are reliable. There was also a wealth of shareware available for many use cases, now much more easily available than when I ran a public-access BBS for it.

IMHO the constraints of these systems make them still quite fun to code for, just like solving sudoku puzzles can be fun.

I found this amazing live-coding session [1] by lftkyro on YouTube showing how to build a live music pattern editor on the C64.

[1] https://youtu.be/ly5BhGOt2vE?si=1EzOnELSb5fd-dqA


The terms "retro" and "vintage" are a sliding window. In 20 years, computers from the 1990s and 2000s will be retro. There are also practical matters like price, availability and size - older computers from the early 1970s tend to fill entire rooms, whereas those from the 1980s onward fit on your desk.

As someone who has been programming since the 1970s, I'll have to admit that I don't get this interest in the early 80s.

Anyone answering this should be required to mention their age before sharing their half-baked theory.

I’m 41 and the reason for this is that people born in the mid-1970s to late-1980s are old enough and wealthy enough now to have nostalgia for that time period and the time to pursue that hobby.


Firstly, it was the golden age where computers that you could own personally became capable enough to run decent games etc, but still be completely understood by a single person.

Also, people who were teenage hackers in that era are now middle-aged and beyond, and nostalgia.


It’s because the folks who are the middle-aged adults with disposable income in their 40’s and 50’s used to have computers from the 80’s.

Although one of the benefits of them is that you can still play with and program for them (and even on them). They're simpler and more immediate.


8-bit 6502 tech is so radically awesome for the specs. So much creativity was required to make things work amid the heavy constraints. Like, basic NES/Famicom games had 2K RAM, 8K program code and 8K graphics.

The smallest NES games had 16 KiB program ROM and 8 KiB graphics ROM. Super Mario Bros. had 32 KiB program ROM. Some games were as large as 1 MiB.

No, future generations will be interested in collecting technology from their own youth. There is already huge interest in old iPods, for example. Even original iPhone models are a collectible item now.

Well, the 60s and 70s were mostly mainframes, a bit big for home tinkering. Babbage's difference engine would be an impressive project.

So basically, the 80s is the earliest easily accessible period.


A C-64 fits on your desk easily. A PDP-11 or a Cray-1 does not. This surely isn't the entire answer, but I suspect it does play a role.

No time other than the 80s-90s will attract such interest and excitement. Since the eighties were the beginning, they are full of non-uniform and extraordinary solutions that are still used in games and programs. Now everything is flat and the same; even the devices are uniform and identical.

It's been like this since the late 90s; "retro" basically means microcomputers.

Because people who were kids in the 80s are now entering middle age.


