
1980 Atari 800 ad: “It will never become obsolete” - bookofjoe
https://www.bookofjoe.com/2014/07/1980-atari-800-ad-it-will-never-become-obsolete.html
======
themodelplumber
Quick rant, downvote away, but I thought I'd share something that's really
going to be important to the future of tech: obsolescence in technology is a
completely broken mental model. As a communications tool it mainly serves the
purpose of justifying new development and economic growth. The obsolescence
model is based on a breadth-first estimation of technology (what's out there,
what's going on, how have things changed out in the tech world, what else is
available) rather than a depth-first estimation (what's in here, what can this
baby do, wow what a miracle in this way or that one).

By using the word "obsolete," the communicator is being--intentionally or
unintentionally--intellectually shallow. Even if the word serves their
purpose, they are also wrong in a big way.

The fact is, this machine still does what it started out to do, technically
speaking. If you look up the dictionary definition of obsolescence though,
you'll find some really terrible news: "no longer useful." An Atari 800 is no
longer useful? That's arguable, to say the very least.

The best way to communicate obsolescence with accuracy, precision, and
rationality is with a reference point, like a market: "This tool no longer
serves its original target market with the efficacy that it once did." This is
much more precise. However, it opens the door to another depth question: Which
markets or sub-markets might it _still_ serve, and even speak to rather
loudly?

IMO the discourse of obsolescence is a curse on modern civilization. We are
missing out on some huge _qualitative growth_ lessons until we learn to think
in a more nuanced way about how technology serves us all.

~~~
liveoneggs
If an old computer can be 100% emulated by a new computer using 1/1000th of
the electricity then the old computer should be archived as software and
physically recycled/eliminated.

~~~
tom_
Your laptop probably draws, what, 15W. Your old computer probably does _not_
draw 15kW! - yes, I'm sure there will be old computers that do, and more, but
since this article is about the Atari 800, I'm going to assume the discussion
is about that sort of old computer, rather than something the size of a room
that required a reinforced floor and your own personal account manager at the
electricity company.

Modern hardware that I've got:

1. Mid-2015 13" MacBook Pro, mostly idle, battery charged, screen fully
dimmed: 11W (16W with screen at full brightness)

2. My 2011 i7-2600K desktop: ~100W idle (packed away right now - but it's
about that)

I happen to have my old Acorn junk set up right now, so here's some
measurements from that:

1. BBC Master 128: 12W

2. BBC Micro model B: 25W

And some other Acorn peripherals. Add these figures to either of the above:

1. Acorn 6502 second processor: 5W

2. 5.25" disc drive: 2W (when in use)

3. Ancient early-80s monitor of unknown provenance: 50W

4. Acorn-branded CM8833: 85W

Looks like my favourite Acorn-branded monitor is actually pretty bad :( -
however, it's not actually drawing a huge amount more than the 2 x 27" monitor
setup I have with my MacBook Pro, even though the pixels per watt aren't quite
as good.

In fact: the overall power draw for the BBC Master 128 setup vs the MacBook
Pro setup, both period-authentic, both (if not cutting edge for the time) up
to the task of a good range of period-authentic workloads, is about the same!
- and even if I was certain 1000x was an overestimate, this is still a bit of
a surprise...

~~~
mokus
If you factor in that your MBP could probably comfortably emulate 100 6502s in
real time, 1000x isn’t quite as severe an overestimate. Get the GPU in the
game and, even with terrible warp utilization and cache efficiency, you still
might be able to hit that 1000x for purely academic purposes on a desktop card
with HBM (maybe 2500 real-time-emulated 6502s on a 250W system).

Move to the microcontroller world, though, and 1000x is a vast underestimate.
The main core in a current project of mine runs at 24 MHz and draws around 1.2
mW without even entering sleep. This system could probably emulate a Master
128 just fine in real time, for about 1/10,000 the power.
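
For a rough sense of scale, the figures in this subthread can be put side by
side. This is just back-of-envelope arithmetic using the wattages quoted above
(the Master 128 figure comes from the measurements further up), not a
benchmark:

```python
# Rough power ratios for real-time 6502-system emulation, using figures
# from the thread: a BBC Master 128 draws ~12 W; a 250 W GPU desktop is
# guessed above to emulate ~2500 6502s; the microcontroller draws ~1.2 mW.

master_128_watts = 12.0

gpu_system_watts = 250.0
gpu_emulated_cores = 2500
watts_per_emulated_core = gpu_system_watts / gpu_emulated_cores  # ~0.1 W

mcu_watts = 1.2e-3

print(master_128_watts / watts_per_emulated_core)  # ~120x per emulated core
print(master_128_watts / mcu_watts)                # ~10,000x for the MCU
```

So the GPU guess lands well short of 1000x per core by this naive division,
while the microcontroller comfortably clears it - which is exactly the shape
of the claim being made.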

~~~
tom_
My MBP can probably comfortably emulate 150 x 1 MHz 6502s in real time, but
your average 6502-based system is a system, not just a 6502, and the Acorn
stuff is no exception. There's a lot more to emulating it than just dealing
with the CPU.

The Master 128 runs at 2MHz. For each emulated microsecond you need to
(broadly speaking) do at least 2 CPU updates, 2 VIA updates, and 16 bitmap
bits/1 teletext glyph. 24MHz = 24 cycles per emulated µsec :( - good luck even
squeezing the video output into that, let alone the other stuff. Rather you
than me!

(This also ignores handling the FDC, ADC, and ACIA. The devices accessed via
the VIA will also take a bit of time to emulate...)
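
To make the arithmetic concrete, here's a minimal sketch of that cycle budget.
The work items and counts are taken straight from the rough breakdown above;
everything else is illustrative:

```python
# Back-of-envelope host-cycle budget for emulating a 2 MHz BBC Master 128
# in real time on a 24 MHz microcontroller core.
host_cycles_per_us = 24  # a 24 MHz core gives 24 cycles per microsecond

# Work items per emulated microsecond, per the rough breakdown above:
work_items_per_us = {
    "cpu_updates": 2,    # 2 MHz CPU -> 2 updates
    "via_updates": 2,    # the VIAs tick alongside the CPU
    "bitmap_bits": 16,   # or 1 teletext glyph
}
total_items = sum(work_items_per_us.values())  # 20 items

print(host_cycles_per_us / total_items)  # ~1.2 host cycles per work item
```

At roughly one host cycle per work item - before the FDC, ADC, and ACIA even
enter the picture - the budget clearly doesn't close.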

------
GlenTheMachine
Meanwhile, yesterday one of the top HN articles was on a brand new port of
Super Mario Brothers to the Commodore 64. There's more new good software
coming out for 8-bit machines today than there was in 1986 when I bought mine.

Depends on what your definition of "obsolete" is, I suppose.

~~~
Florin_Andrei
> _Depends on what your definition of "obsolete" is, I suppose._

I mean, the paintings at Altamira are still admired today, 36 thousand years
after they were made.

~~~
doctorpangloss
I agree. It's pretty rich that engineers are mocking cultural products for
obsolescence.

~~~
c0vfefe
Rather, mocking products that claim they're immune to it.

------
noobermin
On the other hand, I don't know about other people, but I tire of the constant
change of things that don't need to be changed. We in society have a large
number of issues we need to confront, some vital to the continued survival of
our species. Instead of confronting those, we find our tools degrading by
means that companies like Apple claim are not planned obsolescence, or a once
working phone slowing to a crawl from software bloat with every update of
Android, etc., etc.

There is entropy, and then there are social actors whose material interests
clash with a life of sustainability, adding to the drama we already must face
in society.

~~~
m463
I liked the opening of "The Gods Must Be Crazy", in particular when they say
_" But he didn't know when to stop."_

Here's the part of the script comparing the simple lives of the native bushmen
of the Kalahari desert to "civilized man":

    
    
        These Bushmen have never seen a stone or a rock in their lives.
        The hardest things they know are wood and bone.
        They live in a gentle world, where nothing is as hard as rock, steel or concrete.
    
        Only miles to the south, there's a vast city.
        And here you find civilized man.
        Civilized man refused to adapt himself to his environment.
        Instead he adapted his environment to suit him.
        So he built cities, roads, vehicles, machinery.
        And he put up power lines to run his labour-saving devices.
    
        But he didn't know when to stop.
    
        The more he improved his surroundings to make life easier...
        ...the more complicated he made it.
        Now his children are sentenced to years of school, to learn...
        ...how to survive in this complex and hazardous habitat.
        And civilized man, who refused to adapt to his surroundings...
        ...now finds he has to adapt and re-adapt...
        ...every hour of the day to his self-created environment.

~~~
projektir
This largely relies on the idea that pre-civilized life was preferable. And,
for some groups, it was. All the women enjoying close to zero sexual freedom
and dying in childbirth might disagree, though.

~~~
m463
wait, hold up.

The point was "knowing when to stop".

Civilization creates these juggernauts that start moving and when they reach
their goals their momentum carries them past need and reason.

How many laws do we need? Do we need more roads (or lanes)? Should taxes
increase? In percentage!? Do we need to "add value" to every product ever? Do
kids need more homework? Do you need another iPhone?

~~~
perl4ever
I never bought my first iPhone. I am, however, not at the point at which I
feel like the world should stop turning. I'm more like "OK, I am capable of
selecting the good things from the bad, and letting go of the novel tech I
don't like". Perhaps that is a product of being middle-aged and I will feel
differently when I am 80.

Douglas Adams wrote: “I've come up with a set of rules that describe our
reactions to technologies: 1. Anything that is in the world when you’re born
is normal and ordinary and is just a natural part of the way the world works.
2. Anything that's invented between when you’re fifteen and thirty-five is
new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you're thirty-five is against the natural order of
things.”

I like to look at classic car ads, and I saw one for a car from the 60s or
whatever, and I forget the exact phrasing, but it said it had had a modern
type of stereo installed, which had AM/FM and a CD player. Which I thought was
funny, because that kind of pinpoints the assumed age/generation of a buyer.
Going by Adams, someone who thinks car CDs are new, but good, could have been
up to 35 in 1985 when the first car CD players came out. Which is exactly the
age to remember mid to late 60s cars as desirable as a teenager. But now they
are close to 70 years old, and that's why the prices of such cars are trending
down.

I'm not sure that it's inevitable that you hate new things as you get older.
Maybe it's more that people feel insecure about their ability to choose which
new things they like, because it becomes a disorienting flood.

------
jlv2
Our Atari 800 is still hooked up and operational. My kids (now all in their
20s) came home with college friends and showed them some of their favorite
games on it (like JumpMan! and M.U.L.E.).

~~~
cowmix
OMG.. M.U.L.E. still might be my favorite video game.

~~~
bhauer
M.U.L.E. might be the most well-conceived and well-executed game for
replayability and enduring enjoyment given the hardware limitations of the
8-bit era. It had/has the advantage of allowing up to 4 human players to
participate, which keeps it fresh even today in the same way board games from
40 years ago can remain fresh. Yes, the AI players are easily defeated, but
human opponents made M.U.L.E. a true delight.

------
scottlegrand2
It's 2019 and 1979's Star Raiders is still an amazing game.

------
hellofunk
I still use my 800. Not every day, but probably three times a week. It's a
great personal e-rolodex. I also like to paint with it. The games are just as
fun as anything current, and the power usage is low.

------
sneakernets
I remember an episode of Computer Chronicles where a man claimed 8-bit
computers should be as relevant today as they were when first released,
because "every program is a Turing machine, essentially."

His main argument was that the memory and CPU limitations helped curtail
software feature bloat. When asked about the possibility of advanced
multimedia features becoming more desired among consumers, he shrugged it off,
saying an expansion peripheral would do the job just as well as buying a new
machine, without having to learn a new machine entirely.

Now that we have cartridges for the C64 that allow for massive data storage
and 32+ Megs of RAM, maybe he was right. We could have all been using C64s or
Amigas with a ton of expansion cards/carts installed.

~~~
stordoff
> His main argument was that the memory and CPU limitations helped curtail
> software feature bloat

I think there's some truth to this - if you have a limited amount of RAM, you
can't just throw RAM away and do the easy thing. You have to be careful with
it. The flip side of that is of course one man's feature bloat is another
man's valuable feature, and some things just _won't_ fit in the limited amount
of RAM.

~~~
sneakernets
Seems to be the "Usable, Extendable, Efficient - pick two" scenario at its
core. I remember very well a word processor on the C64 (geoWrite, I think?)
that would use its own disk as swap to get around memory limits. I eventually
bought a memory expansion just to get around this, which wasn't cheap in the
slightest.

And that's just for text! It's stunning to me how inefficient text can be when
every byte counts.

------
ilaksh
I really like the designs of the old computers. They are so interesting.
That's why I'm making a libretro frontend that loads 3D models
[http://vintagesimulator.com](http://vintagesimulator.com)

------
tcbawo
I love the quality of the keyboards in these old machines. Is there anyone out
there building all-in-one keyboard computers w/ Model M-quality keyboards?

~~~
_red
A friend and I were speaking of this recently. The current trend for "all-in-
ones" is to have "monitor computers"; however, practically speaking, it would
be better to have a "keyboard computer" that you can change every few years
and keep the investment in your nice 2K monitor.

~~~
paulmd
but then if you spill your drink you've ruined a $2000 computer instead of a
$50 keyboard...

~~~
bovermyer
Then stop drinking near your computer.

~~~
paulmd
"just stop using your PC the way you're accustomed to so that you can use my
new product", what a winning sales pitch. It's like if instead of making his
phone not suck, Steve Jobs had told people to just stop carrying their keys
around in their pockets.

I mean, drinking soda and gaming is practically a meme at this point, so
goodbye to all those customers. And anyone else who wants a beverage while
they edit photos or do their taxes.

~~~
bovermyer
I was talking specifically to _you_, not all people who drink beverages at
their computer.

I'm not trying to sell anything. I'm just concerned for the longevity of your
computing hardware.

------
bane
There's an absolutely fantastic podcast called "ANTIC The Atari 8-bit Podcast"
[1] I listen to. The show started off a bit slow, a normal old-guys-talking-
about-old-hardware affair. But somewhere along the way they started hunting
down people involved with Atari or Atari products back in those days and have
started amassing an impressive collection of sometimes incredible interviews.
It's a treasure trove of oral history of the late 70s and early-to-mid 80s
computing movement.

I don't know how they find these folks, but the interviews can be absolutely
awesome. Sometimes people whose names I never knew, but whose companies or
products I absolutely remember, give detailed background on the goings-on of
early computing in a way that's just really amazing. There are hundreds of
interviews now.

Their non-interview shows are mostly about the three hosts talking about the
_modern_ Atari 8-bit scene. New hardware, new software and so on. Devices for
using the internet, SD cards and such on these ancient machines.

I never had one of these boxes, and only ever saw one a couple times, but I've
grown to deeply respect the thought and engineering that went into these
machines.

1 - [https://ataripodcast.libsyn.com/](https://ataripodcast.libsyn.com/)

------
jbuzbee
Still have mine. And what I learned from it hasn't become obsolete yet. Those
were the days of writing everything in assembly language, trying to cram your
program into what now seems like a tiny amount of memory. I recall that memory
was so tight that I'd use one byte to hold two different variables. Multi-
tasking? Yep. We had vertical and horizontal blank interrupts that you could
use. Magic times making that machine "sing"!

------
melansoncholia
Pictured with a cassette recorder and some sort of dot matrix toaster oven

~~~
rolph
Yes, I see it, and I remember long sessions of group geeking and gaming.

[https://bookofjoe.typepad.com/.a/6a00d8341c5dea53ef01a73dea5...](https://bookofjoe.typepad.com/.a/6a00d8341c5dea53ef01a73dea5917970d-pi)
[IMG]

I recalled the MAME thing and the idea of installing a modern PC platform in a
retro arcade cabinet. I think an old borked Atari cabinet with a hard drive
dock in place of the ROM port would be a cool-looking place for a gaming
[MAME-ing] PC.

------
chasil
If I am correct, the 6502 in the Atari 800 was also used in the BBC Micro.

The BBC Micro itself was used to design the first ARM, and the 6502 was a
strong influence on that RISC design.

[https://www.theregister.co.uk/2012/05/03/unsung_heroes_of_te...](https://www.theregister.co.uk/2012/05/03/unsung_heroes_of_tech_arm_creators_sophie_wilson_and_steve_furber/)

The Atari 800 is in some ways an ancestor of the ARM platform, though ARM
itself adapts wildly in efforts to maintain a dominant position in mobile
computing (e.g. the Thumb variable-length "RISC" instruction set, massive
64-bit architecture changes).

~~~
duskwuff
> the 6502 was a strong influence on that RISC design.

How do you figure that? The 6502 was itself based on the Motorola 6800, and is
a pretty typical accumulator architecture. It could hardly be any more
different from the ARM.

The _development_ of the ARM architecture was carried out on 6502-based
systems -- but only because they were conveniently available, not because
there was anything special about that processor that guided development.

~~~
tom_
The 6502 accesses memory on every cycle, predictably, so you can have memory
that's clocked at the same rate and you're getting 100% utilization. Compare
to the Z80 and 68000, for example, which access memory only intermittently,
and not reliably at any N-cycle boundary. There are a couple of bits about
this in this interview with Sophie Wilson, one of the ARM designers:
[https://www.computerhistory.org/collections/catalog/10274619...](https://www.computerhistory.org/collections/catalog/102746190)
(a good read)

I'm sure I read somewhere that the 6502's low interrupt latency was also an
influence (the 8-bit Acorn stuff was IRQ-heavy, no DMA), but clearly it wasn't
in the above PDF, so I've no idea where this came from :( - but this might not
be 6502 influence necessarily, as the Z80's IRQ latency looks OK, maybe just a
desire not to fuck it up like the 68000.

ARM SBC is like 6502 SBC, just an ADD with inverted input... so... maybe that
counts?!

The 6502 influence is indeed not all that visible in the programming model or
instruction set, but I don't think this is a bad thing.

~~~
duskwuff
> The 6502 accesses memory on every cycle, predictably, so you can have memory
> that's clocked at the same rate and you're getting 100% utilization.

On the flip side: it also means that the processor can't run any faster than
memory, which is not great.

Anyways, the 6502 uses a rather narrow 8-bit data bus with no cache, so
instruction fetch takes up the majority of memory accesses -- in fact, I'm not
sure you can get _under_ 50% of memory accesses used for instruction fetch
without pathological instruction sequences. It's incredibly inefficient by
modern standards; even 68000 was an improvement in this regard.
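
As a rough illustration of that fetch-dominance point: on the 6502 every cycle
is a bus access, so total cycles equal total accesses. The instruction mix
below is hypothetical, chosen just to show the shape of the argument; the
per-instruction cycle counts are the documented 6502 timings:

```python
# Illustrative estimate of what fraction of 6502 bus accesses go to
# instruction fetch. Each entry: (mnemonic, instruction bytes fetched,
# total cycles == total bus accesses). The mix itself is made up.
mix = [
    ("LDA abs", 3, 4),   # 3 fetched bytes + 1 data read
    ("STA abs", 3, 4),   # 3 fetched bytes + 1 data write
    ("LDA zp",  2, 3),
    ("INX",     1, 2),
    ("BNE rel", 2, 3),   # taken branch, no page crossing
]

fetch_accesses = sum(f for _, f, _ in mix)   # 11
total_accesses = sum(c for _, _, c in mix)   # 16

print(fetch_accesses / total_accesses)  # 0.6875 -- well over half is fetch
```

Even this tame mix spends more than two-thirds of its bus traffic on fetch,
consistent with the "can't get under 50% without pathological sequences"
observation.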

------
karmakaze
I miss the closeness to the machine that programming an 8-bit computer
embodies. There was a brief explosion of Flash games that might have been
similar but at a higher level. Now we have JavaScript+HTML+CSS which seems so
abstracted. I feel a bit sorry for newer generations that do not get the
chance to learn not just software but the hardware to go with it. Even the PC
era, up to SVGA cards, still had that. Everything's more powerful now,
including the tools to make software, which is better for making larger works,
but something was lost. At least indie game devs have kept aesthetics and a
focus on gameplay alive.

------
fred_is_fred
LOGO on an Atari 800XL at elementary school was my first coding language. I
mainly typed in stuff out of a spiral-bound book of cool patterns you could
make. I can only assume the title of that book was "Stack Overflow"

~~~
mimixco
LOGO (which is really Lisp) on a TI 99/4A changed my life. It was where I had
that first epiphany that software is just "castles in the sky." It can be
anything you want.

------
takk309
I seem to remember a similar sticker on an E-machines PC that I had as a kid
in the mid 90's.

~~~
patient_zero
Considering they got busted for using refurbished parts in new machines, a
lawyer might argue that the sticker was telling the truth! After all, what is
already obsolete may never become obsolete.

------
patricklorio
I suppose that's true if you keep adapting its purpose. It probably provides a
decent experience as a keyboard attached to a modern desktop.

------
ken
I once saw a TV commercial for a non-name-brand PC with one of the first
x86-64 CPUs. It seemed decent enough but it wasn't _that_ much faster than
32-bit machines of the day, and I doubt it had enough RAM slots to break 4GB,
anyway. They repeatedly claimed that 64 bits meant it was "the last computer
you would ever need to buy".

~~~
JohnFen
Back in those days, claims that a system will never become obsolete were
pretty common as marketing nonsense.

I think that particular lie (and it was always a lie) came up so often because
technology was advancing at an incredible rate, and it made people reluctant
to buy computers because they knew that it was going to be obsolete in less
than a year. The old "joke" -- funny because it was true -- was that a
computer was obsolete the minute you took it out of the box.

The pace of hardware change has slowed considerably since then -- now advances
are more along the lines of "the same thing, but faster and/or smaller".
Worthy advances, to be sure, but nothing that makes last year's machines
garbage.

------
musicale
That Atari 800 probably works fine today.

Since it's not always (ever?) connected to the internet, malware is much less
of a problem.

Games probably work fine as well, as they are stored on rugged cartridges and
don't depend on cloud services, internet-connected DRM, or game servers which
tend to be shut down after a year or two.

------
jnaina
Dad bought an Atari 800 for me when I was 16. Became a master pirate, with a
collection of over 1,000 games and programs. Modified my Atari 810 with a
Happy Archiver chip to copy protected disks and then ‘free’ them by modifying
the bad sector disk protection code and releasing them.

------
Theodores
Obsolete for what?

I really like the aesthetics of the Atari 400/800 and not for the logo or the
case design but just for the colours. The browns/yellows/oranges are quite
inspiring for a bit of modern day UX.

I know they didn't sell these machines for people to steal the colour scheme
from 40 years later, but I really like lots of aspects of the design,
including the colour of the units.

The retro tech scene I find a little weird. It is surprisingly strong, and
there are YouTube videos of someone changing the capacitors on these things
and getting views in the millions. It is as if retro computing is the new
model railway hobby or 'Airfix' of our times.

------
aquamo
I couldn’t afford the Atari 800 but my 400 still works today. Just played some
Missile Command, Star Raiders, and Miner 2049er on it a few months ago. I
learned BASIC and 6502 on mine so will always have a soft spot for them.

~~~
charlesism
You could have picked one up after the video game crash. I think the XL
machines wound up selling for around $100 in the mid 1980s.

------
usermac
So let me think... Things I've used to store data in my lifetime: punch cards,
cassette, whatever that big, open, orange multi-platter with a plastic,
removable cover was (I think on a VAX in ~1985 - anyone know?), 8" floppy,
5.25" floppy, spinning hard disk, SSD, flash drives (many types), 3.5"
'floppy' (or hard? Sony), more I'm sure...

~~~
lokedhs
I think they were just called Dick Packs.

[https://en.wikipedia.org/wiki/Disk_pack](https://en.wikipedia.org/wiki/Disk_pack)

~~~
lokedhs
I promise that was a real typo, and it's too late to edit.

------
shortoncash
A friend had one of these and his dad tried to convince my dad to buy one. I
honestly don't know what my friend did with this thing other than play a few
games. I wonder what kind of analysis of the stock market they could even do
on this old computer.

~~~
CWuestefeld
Well, it's where I taught myself a good portion of my early programming
skills. That's got some value.

------
JulianMorrison
Not the same computer, but, with a serial cable and USB adapter, a RAM module,
a flash EPROM and a bit of software, a Cambridge Z88 from the same era is a
nice ultra-long-battery-life laptop for taking notes or writing uninterrupted.

------
atlanta90210
Commodore 64 > Atari 800

~~~
timbit42
The Commodore 64 has better sound, and its graphics are more flexible,
putting colors in more places with more and larger sprites. This is great for
games, and the C64 would have been just as successful as a game console,
perhaps even more successful as one.

The Atari 800 is better in virtually every other respect. It has an actual OS
instead of a KERNAL. The SIO bus influenced the design of modern USB, and it
is better than the IEC bus: it is not crippled to tape speed like the IEC bus
is, and it supports a tape drive and modem in addition to floppy drives and
printers. It has a 256-color palette and more video modes. The 6502 CPU is
faster. The DOS is easier to use, with a menu instead of arcane commands. The
BASIC is friendlier, with sound and graphics commands, although slower. It can
autoboot from tape and disk.

I had a C64 as a teen, but today I appreciate the Atari 800 much more. It's an
amazing system, and its designers went on to design the Amiga, which was
effectively a 16-bit version of the Atari 800 built around the same
three-custom-chip design.

The C64 was more popular because it was less expensive and better for playing
games, but it wasn't a better computer.

~~~
karmakaze
Good tech summary. I grew up on Atari 8-bit and loved the technical details
that got out. The C64 owners I met tended to hack less. I pretty much learned
to code disassembling ROM and AtariDOS and making custom patched versions (and
of course games). I would get into the zone with the macro assembler editor
like I was meditating and say "I'm in medit" if anyone interrupted me.

Bill Budge's Pinball Construction Set blew my mind and I probably became a
programming tool developer because of it.

------
Timothycquinn
This marketing blunder does not touch the best of all time: Harvey's (a
Canadian fast food chain) had a bold advertisement stating they only use "100%
real processed cheese". But, at least Harvey's was not lying ;)

------
jasoneckert
I still have mine. So yeah, pretty accurate.

------
poundtown
i remember getting an apple II+ and my neighbor had this...u could see the
envy in his eyes!

~~~
timbit42
LOL! There is no way an Atari 800 owner would be envious of an Apple II owner.
The Atari 800 is superior in so many ways.

------
plg
I really miss keyboard like that

------
TheTruth1234
Christ ... that brings back memories ...

~~~
timbit42
It sounds like you haven't used an 8-bit computer for a long time. You're
missing out. They're all the rage now; there are new games coming out for them
every week.

~~~
TheTruth1234
I did a Google search with fair results - but can you point me in a decent
direction for my re-introduction? Thanks

