
“There Is No Reason for Any Individual to Have a Computer in Their Home” - sohkamyung
https://quoteinvestigator.com/2017/09/14/home-computer/
======
gregmac
> To understand the mindset of this period it is important to recognize the
> distinction between a computer terminal and a free-standing computer. Some
> experts believed that individuals would have terminals at home that
> communicated with powerful remote computers providing utility-like services
> for information and interaction. These experts believed that an isolated
> computer at home would be too under-powered to be worthwhile.

Considering that the majority of the general population's use of "computers"
really consists of interacting with remote and cloud services (Facebook,
Netflix, etc.), this is a valid argument today. While today's equivalents of
terminals (phones, tablets, and even PCs) have a significant amount of local
computing ability, for the most part they really are just used as fancy
terminals. Gaming is maybe the one mainstream application where local compute
is important.

The other quote that stuck out to me was this:

> “The personal computer will fall flat on its face in business because users
> want to share files and want more than one user on the system,”

~~~
simonh
> Gaming is maybe the one mainstream application where local compute is
> important.

There are a ton of things our smartphones and tablets do that rely on very
high levels of local computing power.

Biometric authentication such as face and fingerprint recognition is a heavily
compute-intensive application. It needs to be done locally in order to be
secure. Face ID is just nuts.

Modern smartphone photography is heavily compute-intensive, from HDR to
optical zoom to image editing and markup. Just look at what Apple is doing
with portrait mode and lighting effects. I love using time-lapse and slow-
motion video modes. What would Instagram and Snapchat be without locally
applied filters?

Note taking, address book and time management. The notes apps on modern mobile
devices are multi-media wonders with integrated text, image and handwriting
recognition built-in. Calendar management, alarms and notifications all rely
on local processing even if they do make use of cloud services. Without local
smarts those services would lose a huge chunk of their utility.

Document and media management. I read eBooks and listen to podcasts. My ebook
reader has dynamic font selection, text size adjustment and will even read my
books to me. Managing media on the device is essential as contemporary
networks are still nowhere near good enough to stream everything all the time.
My podcast app has sound modulation and processing options built in to tailor
the sound to my tastes and needs, including speed up, voice level adjustment
and dead-air 'vocal pause' elimination in real-time, all on-device and
adjustable at my fingertips. That's serious sound studio level stuff in my
pocket.
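
To give a rough sense of what that kind of processing involves, here's a
minimal sketch of dead-air trimming over raw PCM samples. The threshold and
window sizes are illustrative only, not what any real podcast app uses:

    import numpy as np

    def trim_dead_air(samples, rate, threshold=0.01, max_pause_s=0.5):
        """Drop near-silent stretches longer than max_pause_s.

        samples: mono float32 PCM in [-1, 1]; rate: samples per second.
        """
        win = int(rate * 0.05)                  # analyse in 50 ms windows
        kept, silent = [], 0
        for start in range(0, len(samples), win):
            chunk = samples[start:start + win]
            rms = np.sqrt(np.mean(chunk ** 2))  # loudness of this window
            if rms < threshold:
                silent += len(chunk)
                if silent <= int(rate * max_pause_s):
                    kept.append(chunk)          # keep a short, natural pause
            else:
                silent = 0
                kept.append(chunk)
        return np.concatenate(kept)

A real app does this continuously on the decoded stream, alongside resampling
for speed-up, which is what makes it a genuine compute load rather than a
gimmick.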

Playing digital video files, whether downloaded or streamed. So what if we've
had it since the '90s? It's still proper computation, especially with advanced
modern codecs. VOIP and video calling, even between continents, have also
become everyday and absolutely rely on powerful local number crunching.

These things have become so everyday that we hardly notice most of them, but
without serious on-device processing power, some of which would have been
beyond $10k workstations just 15 years ago, none of this would be possible.

~~~
com2kid
> My podcast app has sound modulation and processing options built in to
> tailor the sound to my tastes and needs, including speed up, voice level
> adjustment and dead-air 'vocal pause' elimination in real-time, all on-
> device and adjustable at my fingertips.

Everything short of the vocal-pause stuff, an old Sansa will do for you,
running a tiny embedded C-based runtime.

> These things have become so everyday that we hardly notice most of them, but
> without serious on-device processing power, some of which would have been
> beyond $10k workstations just 15 years ago, none of this would be possible.

The modern photo and video stuff is super cool and uses a ton of processing
power. Likewise for high-quality video streaming: decompression is hard work,
though it's often offloaded to hardware these days.

Everything else my desktop of 15 years ago handled just fine, with 256MB of
RAM.

Heck my smartphone in 2006 with 64MB of RAM did most of this stuff.

The one thing that has remained constant across a decade of smartphone
development is the Pandora app getting force-closed all the time.

~~~
DigitalJack
"Everything else my desktop of 15 years ago handled just fine, with 256MB of
RAM."

I hear this sort of argument a lot. But I never hear of anyone actually doing
it, that is, working on a 15-year-old computer for a while. I imagine with
Linux it might not be too bad.

~~~
com2kid
> I hear this sort of argument a lot. But I never hear of anyone actually
> doing it, that is, working on a 15-year-old computer for a while. I imagine
> with Linux it might not be too bad.

The Internet has made this infeasible. Everything has moved online, and 90% of
online resources have grown very bloated, though Wikipedia would probably
still work well.

Security is another problem. Windows XP would work fine if you can find
15-year-old hardware that still functions (hardware rot is a real thing), but
the number of zero-days out there is far too high.

All that said, I wonder if any Shoutcast streams still exist. I'm using
YouTube Live for the same thing nowadays (orders of magnitude more CPU being
used!).

Coding would work fine, except that the old IDEs don't support newer C++
standards, and I'm accustomed to my niceties in life, but if you can ignore
that, life would be fine for writing code. Modern web platforms wouldn't fare
as well: writing the code would work OK, but the resultant webapps wouldn't
run that well.

~~~
gh02t
> All that said, I wonder if any Shoutcast streams still exist. I'm using
> YouTube Live for the same thing nowadays (orders of magnitude more CPU being
> used!).

Shoutcast is still around, at least as a protocol via Icecast. I dunno if it's
still being used for widespread public streams, but I know people still use
MPD + Icecast to stream their personal music collections.

~~~
yeahsure
Most of the FM stations in my city (in Argentina) have a Shoutcast stream you
can listen to online.

------
ctdonath
Those not around (or aware enough) back then don't grasp how staggeringly
little computing power there was, and how enormous the effort was to do even
that paltry processing. Sure there were "desktop computers", but they barely
did anything; a computer sophisticated enough to do anything meaningful in
one's home would have required an entire room devoted to it - and still
wouldn't do much.

Today's Lightning cable literally contains more processing power than a "home
computer" from the early 1980s.

"Document processing", today a simple matter of scanning a paper at >300dpi
and running OCR, was a huge undertaking just to identify & fragment documents
(varying extreme compression of components based on differing legal importance
of preprinted content vs handwritten content vs legally-binding signatures)
precisely because there was so very little space to store anything in. One
common image today could easily overwhelm an entire storage unit back then.
(Tangent: I believe this is a major & widely-disbelieved source of much of the
consternation surrounding Obama's birth certificate).
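
For a rough sense of that "one image could overwhelm a storage unit" claim
(the figures are mine, not the parent's): a 300 dpi colour scan of a
letter-size page, uncompressed, against a 5 MB disk pack of the era.

    # Back-of-envelope only; the 5 MB figure is an assumed period disk pack.
    width_px  = int(8.5 * 300)              # letter-size page at 300 dpi
    height_px = int(11 * 300)
    raw_bytes = width_px * height_px * 3    # 24-bit colour, no compression

    disk_pack = 5 * 1024 * 1024
    print(raw_bytes / 1e6)                  # ~25 MB for a single page
    print(raw_bytes / disk_pack)            # ~5 disk packs for one scan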

The notion of utility-level remote computing was necessitated by the fact that
computing problems of real practicality required such scale - not just raw
capacity, but the complexity needed to squeeze maximum usability out of such
under-powered hardware - that doing it at home was unthinkable.

Nobody expected computing power would increase, per volume & dollar, so
incredibly much. Some 40 years later, processors are about 10,000x faster,
displays 10,000x more detailed, and memory & storage & networking 10,000,000x
greater. I remember my brother taking all day to download a large book; today
we use as much data for 1 web page.
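
To put rough numbers on that last bit (the figures are my assumptions, since
the comment gives none): a plain-text book over a late-1970s 300 bit/s modem
versus a present-day ~2 MB web page.

    # Illustrative figures only.
    book_bytes = 1_000_000                    # a longish plain-text book
    modem_bps = 300                           # acoustic-coupler era line speed
    print(book_bytes * 8 / modem_bps / 3600)  # ~7.4 hours - most of a day

    page_bytes = 2_000_000                    # a commonly cited average page weight
    print(page_bytes / book_bytes)            # one page ~2x the whole book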

"These experts believed that an isolated computer at home would be too under-
powered to be worthwhile."

It was. Think: modern insulated automated-HVAC 2000 sq ft home, vs mud hut.
They couldn't comprehend what was to come; most today can't comprehend where
we came from.

~~~
compiler-guy
One of my favorite examples:

The parent comment above would have been too big to store in main memory for
one of these computers.

The comments on this thread wouldn't fit into secondary storage.
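
To make the first claim concrete (with an assumed machine, since none is
named): the parent comment runs to a couple of kilobytes of text, while a base
Altair 8800 shipped with 256 bytes of RAM.

    # Rough figures; the comment length is an estimate, the RAM size is the
    # stock Altair 8800 configuration.
    comment_bytes = 1_800
    ram_bytes = 256
    print(comment_bytes / ram_bytes)        # ~7x the machine's entire memory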

It's also useful to remember that he was talking about his then-present, not
forever and ever. He didn't say, "and there never will be."

~~~
ctdonath
Nicely put. It's hard to get the orders of magnitude across; that's a good
example.

One could argue that the "didn't say 'never'" was implicitly understood. What
I don't think most people grok is the then-incomprehensible scale of the
change; those loading multi-ton 5MB storage units onto airplanes couldn't
imagine half a terabyte on your fingernail.

[https://imgur.com/uqKOX](https://imgur.com/uqKOX)

[https://ae01.alicdn.com/kf/HTB1nA6XRpXXXXX1XXXXq6xXFXXXy/200...](https://ae01.alicdn.com/kf/HTB1nA6XRpXXXXX1XXXXq6xXFXXXy/200PCS-LOT-MICRO-SD-Card-32GB-64GB-128GB-256GB-512GB-1024GB-CLASS-10-Memory-Card-TF.jpg)

------
ender89
George W. Mitchell's quote was pretty much spot on considering the context of
the time: he was basically describing an internet-connected PC communicating
with servers somewhere that do the real computing. Obviously even the cheapest
netbook now has more computing power than anything he was thinking of at the
time, and your average gaming computer is vastly more capable than the home
terminal he was envisioning, but the fact of the matter is that we're moving
more and more towards the thin-client model, where the majority of your
computing is done in the cloud. Hell, I'm sitting in front of a MacBook Pro
that could tear your face off, but I rely on the web and a remote server to
run my word processing software.

~~~
digi_owl
It hits me, as I dig through the comment pile, that what has changed is that
while compute power has grown, storage has downright exploded.

The power of Google et al is not the compute, it is the storage. The amount of
data that Google has on tap and sifts through is downright staggering, and
clearly not something home users could possibly store locally.

~~~
thomastjeffery
Especially considering the petabytes of tape they use.

------
gnicholas
Funny how we're now at a time where we're moving past computers. I thought
this was going to be about how you don't need a computer at home anymore
because you can do so much on tablets, phones, watches, and TVs.

When the Apple Watch with LTE was announced, I wondered when people would
start to have a watch but no phone. When the iPad got LTE, senior execs
shifted to them and raved about the experience. It worked for them because
they mostly just did email and a few other things, unlike worker bees who need
a traditional computer OS. I wonder if a similar thing will happen with
higher-ups forgoing phones for watches.

My guess is that in the next 3-4 years, we'll start seeing people talking
about how amazing it is to go around without a phone. Perhaps a tablet for
doing "real work" and a watch for everything else.

~~~
6ak74rfy
I immediately got excited about the idea of replacing my phone with a watch,
until I learned about the constraint of needing a phone to set up the watch.
That shows the phone isn't completely replaceable right now, but I guess Apple
will fix that in a year or two. I'll definitely get rid of my phone then.

~~~
gnicholas
Yeah, even though I don't currently have an Apple Watch (go Pebble!), I
wondered whether this would be feasible now. You could just choose to leave
your phone at home all the time, but you'd still have to pay for the wireless
plan. Currently, carriers charge $10 extra for the Watch, but you can't yank
coverage for the phone. They use the same phone number, so you still have to
have both.

~~~
valuearb
My iPhone is a three-year-old 6 Plus. I have no problem leaving that giant at
home and relying on a watch. I'd hope it would bring more attention back to my
day, and help break me of the habit of going to my phone when bored.

The watch would do everything I need, texts, phone calls, email alerts, and I
can even use it for navigation. While it would be better not to pay the $10 a
month surcharge, that's a fairly small price to pay if it's an improvement in
my daily experience.

~~~
gnicholas
Yep! And one issue Apple will have to deal with is: if people routinely leave
their phones at home, will they pay $600+ for new phones every 2 years? My
guess is that Apple will not be too eager for that scenario...

~~~
valuearb
Apple isn't dumb enough to think they have a choice. When Android finally
figures out watches as well as Apple has, a hundred watch makers will start
chipping away at the phone market. Apple has always been smart enough to try
to cannibalize its own products if cannibalization was inevitable.

~~~
gnicholas
What do you think the ETA is on a good, LTE-enabled Android watch? I feel like
nothing out now is as good as the original Apple Watch, and it took them 2
more years to get the LTE stuff worked in there (and even then, the battery
life isn't as good as Series 2).

And to be clear, I'm not a fanboy who sees the Apple Watch as the gold
standard — I don't have one and probably won't purchase the Series 3.

------
alphapowered
An interesting talk by Ken Olsen on his history with DEC:
[https://www.youtube.com/watch?v=GNBS0I1h42k](https://www.youtube.com/watch?v=GNBS0I1h42k)

Sadly, the names DEC and Digital barely resonate with the average under-30
developer. In fact, I'd guess that DEC/Digital on a resume not only counts for
much less than AirBnB or some other trendy, frivolous startup, but probably
even hurts, given how prevalent ageism is in our industry.

------
paulgerhardt
We laugh at these quotes and yet we keep making them.

"640K... nobody would ever need that much memory"

"No wireless. Less space than a nomad. Lame."

"Well this is an exceptionally cute idea, but there is absolutely no way that
anyone is going to have any faith in this currency."

It seems inevitable that despite knowing better we will keep putting our
collective foot in our mouth. I'm ok with that. The takeaway is that limiting
predictions aren't useful.

~~~
ajross
> "640K... nobody would ever need that much memory"

The Gates quote was "640k ought to be enough for anybody", and it's been
_wildly_ misinterpreted by basically everyone.

The IBM PC had already picked the 8088, which had a 20-bit addressing
architecture regardless of whether 1MB was "enough" or not. The issue
Gates was addressing was whether or not DOS's choice of 640 as the barrier for
loadable program memory management was enough given that it was only about
half the space in the machine. And he wasn't really wrong. The time frame
between DOS programs bumping up against 640k and "DOS" programs wanting to use
the multiple-megabyte flat spaces available on the 80386 (which couldn't
possibly have been architected for in software in 1981) was like three years.

If you want to ding DOS, complain not about the 640k limit but about the
complete lack of architecture and discipline in the _high_ 384k. Really no one
ever got this space under control, and it was a bucketful of undetectable
device I/O hazards and too-clever-by-half hackery well into the 90's.
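
For anyone who wasn't around, the arithmetic behind those numbers is simple
(nothing here beyond what the comment already states):

    address_bits = 20                       # 8088 physical address lines
    total = 2 ** address_bits               # 1,048,576 bytes: the whole 1 MB space
    conventional = 640 * 1024               # DOS's limit for loadable programs
    upper = total - conventional            # 393,216 bytes: the 384K left for video
                                            # RAM, ROMs and device I/O windows
    print(total, conventional, upper)       # 1048576 655360 393216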

~~~
paulgerhardt
The quote was "When we set the upper limit of PC-DOS at 640K, we thought
nobody would ever need that much memory."

"640K... nobody would ever need that much memory" was just a contraction of
that.

[https://quoteinvestigator.com/2011/09/08/640k-enough/](https://quoteinvestigator.com/2011/09/08/640k-enough/)

------
yial
I guess some of the ideas were “half” right of the future to come, considering
how many of our devices now make use of “cloud” computing.

------
rektide
But less and less do we have personal computers in the home: the PC is being
converted into a dumb terminal, a screen and keyboard, with all the real
brains on the other end of the wire.

------
jandrese
The computers they were talking about would be analogous to the big iron
supercomputers of today. The quote "There is no reason for any individual to
have a supercomputer in their home" is much closer to reasonable.

It does show a lack of foresight on their part, but it's not as completely
wrong as it looks on the surface.

~~~
AnimalMuppet
True, but... for that era, a "supercomputer" was a Cray 1. How many Cray 1's
do you have in your home? That is, how many devices do you have that can do
160 MFLOPS? For many HN readers, it may be in double digits.
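
One crude way to check on any machine with Python and NumPy handy (a rough
floor, not a proper benchmark; results vary enormously by device):

    import time
    import numpy as np

    n = 1000
    a, b = np.random.rand(n, n), np.random.rand(n, n)

    start = time.perf_counter()
    a @ b                                   # roughly 2 * n**3 floating-point ops
    elapsed = time.perf_counter() - start

    print(2 * n**3 / elapsed / 1e6, "MFLOPS vs the Cray-1's 160")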

~~~
digi_owl
Then again, said Cray-1 was a liquid-cooled beast that acted as much as a
piece of furniture as it did a computer.

What blew everything up was how fast Intel and "friends" could get ICs to
shrink.

But even then, we now have vast warehouses that we access remotely, which are
mostly there to store metadata that is sifted and sorted on command.

------
tudorw
As an adjunct, here's a journalist in 1979 struggling to comprehend Ted
Nelson's vision that the computer would become a creative tool for the masses:
[https://www.youtube.com/watch?v=RVU62CQTXFI](https://www.youtube.com/watch?v=RVU62CQTXFI)

------
drtse4
From the same period, a radio interview where Ted Nelson tries to convince,
unsuccessfully, a skeptical radio host that home computing has a future:
[https://www.youtube.com/watch?v=RVU62CQTXFI](https://www.youtube.com/watch?v=RVU62CQTXFI)

------
dredmorbius
"Unless you are very rich and very eccentric, you will not enjoy the luxury of
having a computer in your own home."

- Ed Yourdon, _Techniques of Program Structure and Design_, 1975

Yourdon had been at DEC, but left the company prior to 1970.

~~~
contingencies
Awesome quote. Added to
[http://github.com/globalcitizen/taoup](http://github.com/globalcitizen/taoup)

------
microcolonel
This is, in large part, still a popular sentiment in Japan.

------
wglb
Failures in imagination can be deadly for businesses.

------
caf
To me the amazing thing isn't that DEC pooh-poohed the idea of the PC, but
rather that IBM was agile enough not to!

~~~
digi_owl
The PC was really an outlier.

Not only was it an IBM product, but beyond the BIOS everything was off the
shelf (even the OS).

So once the BIOS was cloned, and the clone defended as clean in court, they
promptly tried to reassert control with the PS/2 and spiraled into
irrelevance.

------
zzzeek
There is no reason for any individual to have a high energy particle
accelerator in their home.

35 years later...

------
gumby
Why didn't the investigator contact Dave or Gordon, both of whom are alive and
well?

------
mongol
For these quotes I always wonder if we put them in the right time context. Is
it fair to assume that the quote was meant to mean "never", or was a time
frame implied, such as "during this decade"?

------
milansuk
There is one big reason and that's privacy!

------
amelius
And 640kb ought to be enough for anyone :)

------
Koshkin
This would have been true today if people no longer felt the need to buy
desktop computers. (They still do, to play games.)

------
known
"There is no such thing as an underestimate of average intelligence" \--Henry
Adams

