
Bill Gates: “I never said '640K should be enough for anybody'” (1996) - ElectronShak
https://groups.google.com/forum/#!msg/alt.folklore.computers/mpjS-h4jpD8/9DW_VQVLzpkJ
======
myself248
This has always annoyed the piss out of me. It wouldn't have been Bill's or
Microsoft's call to make, in the first place. The hardware memory map is not
set by software.

The 640K limitation derives from the 1MB address space of the IBM PC, and as
the name implies, IBM did the hardware design. They did it around a particular
Intel chip, which had a 1MB address space. IBM could've put in hardware
support for bank switching (as some EMS add-in cards later did), they
could've used a chip with more than 20 address lines, they could've done a lot
of things.
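
For the curious, a minimal sketch (in C, just illustrative) of the 8086/8088
segment:offset arithmetic that produces the 1MB ceiling; 640K is simply what
was left below the reserved upper 384K:

    #include <stdio.h>
    #include <stdint.h>

    /* On the 8086/8088, a 20-bit physical address is (segment << 4) + offset,
     * so at most 2^20 bytes (1 MiB) are visible. Everything from A000:0000 up
     * was reserved for video memory, adapter ROMs, and the BIOS, leaving
     * 640 KiB of "conventional memory" for programs. */
    static uint32_t phys(uint16_t segment, uint16_t offset)
    {
        return (((uint32_t)segment << 4) + offset) & 0xFFFFF; /* 20-bit wrap */
    }

    int main(void)
    {
        printf("Total addressable:      %u bytes\n", 1u << 20);
        printf("Start of reserved area: 0x%05X (640K)\n", phys(0xA000, 0));
        printf("Top of address space:   0x%05X\n", phys(0xF000, 0xFFFF));
        return 0;
    }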

But they didn't. IBM wasn't designing a mainframe-killer, they were designing
a personal computer. It was competing with 16k and 64k 8-bit machines, and the
first IBM PCs shipped with 64k and later 128k of RAM. Using the top 384K for
peripherals and allocating 640K for programs must've seemed insanely generous
at the time. But whoever made that decision, it was on the hardware side, not
anyone at Microsoft.

~~~
onion2k
_But whoever made that decision, it was on the hardware side, not anyone at
Microsoft._

Bill Gates' famous comment isn't really a decision though. As it's usually
cited it's just an opinion - '640Kb should be enough for anyone' means "I
don't think any programs will need more than that!". If someone at IBM decided
that there should be 640Kb of available RAM for programs it's _believable_
that Bill might have simply been agreeing with them.

The main thing that's annoying about the quote is that it's trotted out as an
example of how Bill was wrong, as if being wrong is something terrible that he
should be ashamed of decades later. That's nonsense. Being wrong is fine so
long as you change your mind when you understand that you are wrong.

~~~
13of40
At one of the Microsoft company meetings in the middle to late 2000s (I recall
it was at Safeco Field) he claimed that when IBM was developing the PC he
tried to convince them to use an MC68000 instead of the 8088. He said going
with the 8088 set the industry back ten years. Assuming he wasn't making the
story up, it's hard to imagine him making that quote or even agreeing with it.

~~~
dep_b
> He said going with the 8088 set the industry back ten years.

That's what it always felt like as an Amiga user. Before DOOM there wasn't
much I liked on the PC.

Release Amiga 1000: July '85. Release DOOM: December '93.

~~~
13of40
Similar story with me. I got an Amiga 1000 and did a fair bit of assembly
coding on it, then ended up writing some 16-bit x86 assembly for school later
on. Being used to having sixteen 32-bit registers, then all of a sudden having
to use AX, BX, CX, and DX (and don't forget they all have slightly different
purposes!) was like being brutally shoved back into the '80s.

~~~
kbenson
Well, history has shown that neither RISC nor CISC was actually the better
choice, since both models more or less converged on a sort of hybrid design
years ago.

~~~
ch4ck
And what does this have to do with the Amiga?

~~~
kbenson
At a simplistic level, the difference between RISC and CISC processor design
boils down to having many registers and reduced instructions, or few registers
and extra specialized instructions.

What was being described is going from programming a RISC-like processor to
programming a CISC-like processor, and how constrained they felt after doing
so. It likely does feel more constraining (I don't really remember how I felt
about it back when I did it, but I also went the other direction, and only in
the context of classwork), but in the end, most people are programming a level
removed from that anyway.

There used to be quite a lot of arguments about what design was better (IMO
mostly fueled by Macs running a RISC processor and Windows running a CISC
processor, at least until Apple switched to x86). I find it slightly comical
that both designs ended up in a fairly similar place though (with RISC
processors adding extra instructions, and CISC processors adding more
registers, even if mostly just logical registers).

~~~
ch4ck
All of them (the IBM PC, Amiga, and classic Mac) used CISC CPUs...

~~~
kbenson
You're right that the Amiga was CISC, I was mistaken there.

The classic Mac was as well, but PowerPC-based Macs were RISC, which is what I
was recalling in my followup.

------
ckastner
Somewhat more extensive research into that quote:

[https://quoteinvestigator.com/2011/09/08/640k-enough/](https://quoteinvestigator.com/2011/09/08/640k-enough/)

~~~
tabtab
Perhaps the most damning quote: _" I have to say that in 1981, making those
decisions, I felt like I was providing enough freedom for 10 years. That is, a
move from 64k to 640k felt like something that would last a great deal of
time. Well, it didn't..."_ -Bill Gates

While he didn't use the exact words of the title quote, there are multiple
sources which indicate he was indeed surprised at how fast applications grew
in memory requirements/usage. Thus, I rate it as "partly true".

A lot of it was probably caused by the switch from hand-crafted low-level
code to libraries and compilers, which usually take up more RAM for comparable
features. While such tools sometimes can produce compact code, it's often not
considered economical by publishers.

------
nikanj
Linus Torvalds, on the other hand, did indeed say "Anybody who needs more than
64Mb/task - tough cookies" in the original Linux announcement email chain.

~~~
unmole
Well, Linus also said his kernel won't be big and professional like GNU's.

~~~
mcguire
Which it isn't. :-)

------
maxwell
"Modern operating systems can now take advantage of that seemingly vast
potential memory. But even 32 bits of address space won't prove adequate as
time goes on."

But 64 bits should be enough for anybody.

~~~
aogl
Moving beyond a 64-bit architecture will allow manipulation of memory address
spaces larger than 16 Exabytes. Even high-end servers today do not contain
more than 1TB of physical memory. It would take several decades, if not more,
for memory densities to grow high enough to reach the boundary that 64-bit
addressing imposes.

So yes, highly likely, at least for a little while...

~~~
stochastic_monk
Current x86 architectures only use 48 bits for addressing. That’s still enough
to address ~280TB of RAM. I expect that to change before upgrading to 128-bit
pointers.
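
For scale, a quick back-of-the-envelope check of those figures (a small C
sketch, nothing authoritative):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* 48-bit virtual addresses, as on current x86-64 parts */
        uint64_t bytes48 = 1ULL << 48;
        /* 2^64, the full 64-bit space (exactly representable as a double) */
        double bytes64 = 18446744073709551616.0;

        printf("2^48 = %llu bytes (about %.0f TB)\n",
               (unsigned long long)bytes48, bytes48 / 1e12);
        printf("2^64 = about %.1f EB\n", bytes64 / 1e18);
        return 0;
    }

That works out to roughly 281 TB for 48-bit addressing and about 18.4 EB (16
EiB) for the full 64 bits, which is where the "16 Exabytes" figure above comes
from.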

~~~
red75prime
128 bits would give vastly more addresses than any machine will ever need.
Such pointers are mostly a waste of memory space.

> I expect that to change before upgrading to 128-bit pointers.

Or the joke was wasted on me. Did you mean it will never be feasible?

------
jaysonelliot
Slightly off-topic, but it's cool to see David Mikkelson posting in that
Usenet group as "Snopes," as that's where he first used the name, even before
launching the urban myth-busting site in the '90s.

~~~
rconti
I had no idea Snopes was that old; that was my takeaway from this as well.
We've always assumed it wasn't an actual quote from Gates, but I only knew
Snopes as the website.

------
diego_moita
Wow... in a post denying that the comment ever happened, a lot of people here
still take it as a fact.

Now, what do you guys think about the "Al Gore said he invented the internet"
one?

Ah, the wonders of a post-truth world...

------
bryanrasmussen
3MB of RAM should be enough for anybody to get a ticket out of the Sprawl.

~~~
bitwize
Gibson's prose has a short shelf life. That's why he doesn't write future-set
stories anymore.

In the film of _Johnny Mnemonic_, the screenplay of which was penned by Gibson
himself, Johnny must act as a courier for 320 GB by using a brain implant --
in 2025. It's 2018 in the real world, and data smugglers today would probably
get farther by swallowing toy balloons with two or three microSD cards in them
than by going the whole invasive wipe-your-childhood-memories brain implant
route (if such a route were available).

~~~
notheruser
Not true that Gibson's prose has a short shelf life. Neuromancer is in my view
a timeless masterpiece, and Gibson's most recent book is set in the future.

~~~
InternetOfStuff
Well, Neuromancer's first line doesn't make sense any more :-P

That said, it's one of my favourite books.

~~~
mistersquid
I realize you say this somewhat in jest, but I want to respond to this because
the first sentence of William Gibson's _Neuromancer_ is paradoxically both
timeless and time bound.

> The sky above the port was the color of television, tuned to a dead channel.

That first sentence characterizes the novel's diegetic (in-narrative) setting
by giving the sky a property of an alternate medium (television). Unlike
gentler rhetorical assertions such as simile, the declaration that the sky
possessed a property of television suggests that the setting--like the intra-
diegetic setting of the (Moroccan) beach in which an unnamed adversarial AI
imprisons Case--is also a media construct in which the reader is imprisoned.

In other words, the media topology structuring the novel's outermost narrative
is a recursive formation of one medium inside another medium and this
structure is timeless. It's merely recursion.

However, fully understanding the literary (as opposed to narrative)
significance of such a structuration depends upon knowing what television was
and why any of its channels might be "dead" as opposed to "live".

A useful point of comparison can be found in William Shakespeare's _As You
Like It_ (II.VII.139-143)

    
    
      All the world’s a stage,
      And all the men and women merely players;
      They have their exits and their entrances,
      And one man in his time plays many parts,
      His acts being seven ages. [. . .]
    

The significance is timeless, drawing power from the assertion that life is in
fact lived on a stage. On the other hand, the description is also time bound
(as well as culturally circumscribed) depending as it does on the greatly
diminished (since Elizabethan England) medium of live performance.

EDIT: Fix predicate in first sentence. Formatting. Change "at" to "a" in
penultimate sentence.

~~~
taeric
I think the point is that my children will likely never know what a television
tuned to a dead channel looks like. I've almost forgotten, myself.

I say that as someone that finds that line rather nice.

~~~
SuperPaintMan
You reminded me of a post JWZ made a while back with a TV in his bar. It's
neat how these things can be embedded into our collective experience.

(Copy and paste the link, referrer shenanigans.)
[https://www.jwz.org/blog/2012/01/snow-crash-simulated/](https://www.jwz.org/blog/2012/01/snow-crash-simulated/)

~~~
taeric
I didn't realize newer televisions simulated the static. That is rather
amusing, in many ways.

Also, thanks for mentioning the copy/paste referrer nonsense. I didn't realize
they blocked links referred from here, and for a while I was wondering what
the reference was. :)

------
pure-awesome
I find it quite funny that there's a parallel discussion as to whether or not
everything in a modern circuit board is implemented with NAND gates.

------
jokoon
Meanwhile, today 16GB of RAM and 1TB of storage will soon not be enough for a
modest computer.

[https://en.wikipedia.org/wiki/Wirth%27s_law](https://en.wikipedia.org/wiki/Wirth%27s_law)

Although I'm sure hardware companies will find a way to push deep learning
techniques onto home computers, which will obviously require more and more
transistors.

~~~
sys_64738
Today in 2018 you need 16GB. Back in 1998 it was 16MB and 20 years before that
it was 16KB.

That should tell you how much you'll need in 2038.
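
Following that pattern (roughly a factor of 1000 every 20 years), a quick
tongue-in-cheek extrapolation in C:

    #include <stdio.h>

    int main(void)
    {
        /* 16 KB in 1978, 16 MB in 1998, 16 GB in 2018: ~1000x per 20 years */
        const char *unit[] = { "KB", "MB", "GB", "TB", "PB" };
        for (int i = 0; i < 5; i++)
            printf("%d: 16 %s\n", 1978 + 20 * i, unit[i]);
        return 0;
    }

So by that (entirely unscientific) trend: 16 TB in 2038.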

------
cipherzero
Out of curiosity, how did this chain end up on Google Groups? Where was the
original, and how was Google able to ingest it?

Thanks!

~~~
myself248
[https://en.wikipedia.org/wiki/Google_Groups#Deja_News](https://en.wikipedia.org/wiki/Google_Groups#Deja_News)

~~~
cipherzero
Thank you!

------
princekolt
This reply down the thread wins it for me:

> I always thought he was talking about his monthly bonus, not computer
> memory...

Ha!

~~~
trumped
soon after that, it became his daily bonus...

------
LoSboccacc
[https://books.google.it/books?id=2C4EAAAAMBAJ&q=%22nobody+wo...](https://books.google.it/books?id=2C4EAAAAMBAJ&q=%22nobody+would%22&redir_esc=y&hl=en#v=snippet&q=%22nobody%20would%22&f=false)

The fact that a direct quote doesn't exist doesn't mean that it wasn't the
sentiment at the time.

~~~
nolok
The fact that it was the sentiment at the time is one thing, but that doesn't
make it right or correct to attribute a quote to someone who didn't say it,
even if it goes along with the general sentiment, and even if that someone did
agree with said sentiment.

------
interfixus
Doesn't matter. We all know there is no reason anyone would want a computer in
their home.

~~~
maxwell
No one actually wanted a computer in their home.

They did want an interactive TV set.

~~~
interfixus
Oh, that explains why there's no market for more than five computers in the
world.

~~~
ddalex
That's an accurate description for certain values of the term 'computer'

------
known
I think Gates overestimated
[https://en.wikipedia.org/wiki/Swappiness](https://en.wikipedia.org/wiki/Swappiness)

------
Shaddox
Hah. Reading this is quite amusing. Tricksters and pranksters were just as
prevalent on the internet back then as they are now.

~~~
SmellyGeekBoy
"Tricksters and pranksters" AKA good old fashioned trolling. It's a shame the
modern definition of "troll" seems to have warped to include those who tell
people to kill themselves on social media etc.

------
jl6
Having watched a 4K movie, I can confidently say 640K should be enough for
anybody.

~~~
LogicalBorg
Well, let's be scientific about this. If Bill Gates is willing to write me a
check for 640K, I'll be happy to test his theory for him. ;-)

------
wheelie_boy
Yeah, I don't think it's a literal quote, unless it was in a very specific
context.

The quote was making fun of the 640K limitation of MS-DOS, and the implicit
reasoning behind that hard limit.

------
yuhong
Of course, this reminds me of the OS/2 2.0 fiasco.

------
pcunite
I have 16384000K of RAM and I could use more. However, my use case might be
different from someone browsing the web.

:-)

------
sys_64738
Sure, Bill. Of course you didn't say something like that as it would make you
sound totally clueless to future generations. You wouldn't want a howler of a
mistake like that to be your calling card, would you? So, yeah, of course you
didn't say it.

