
24 Gigabytes of Memory Ought to be Enough for Anybody - angrycoder
http://www.codinghorror.com/blog/2011/01/24-gigabytes-of-memory-ought-to-be-enough-for-anybody.html
======
patio11
There are a _lot_ of problems which the engineer in me wants to code around
that should really, really be solved by throwing money at them. Often, a
trivial amount of money.

I once did several hours of work trying to optimize my use of Redis to avoid
having to upgrade my VPS (I was nearing the limits of the physical memory at
1.5 GB). I even asked Thomas for advice on how to decrease memory usage. His
reply: "How much to the next tier?" "$30 a month." "Why are we having this
conversation?" And, of course, he was right.

~~~
joshu
better code (algorithms, performance, memory usage) is amortized across the
life of the rest of the system. fixed-cost upgrades will eventually be caught
up to. so it's not bad to think like that.

~~~
patio11
You can use the freed-up engineer time to write better code, which you can
also keep for the life of the system. It is not too hard to think of
things which you could do in a few hours that would pay for RAM upgrades in
perpetuity, even at BCC's relatively small scale. (Say, an A/B test which
resulted in a 1% lift on my AdWords landing pages.)

~~~
qq66
It's precisely BCC's small scale that makes spending engineering time on
performance optimizations expensive. At Google's scale it's worth several
thousand engineers' time.

~~~
wisty
At which point, you have several thousand engineers. It's a problem that
solves itself, as long as you have some kind of margins.

------
btmorex
"To me, it's more about no longer needing to think about memory as a scarce
resource, something you allocate carefully and manage with great care."

Once your data set doesn't fit in memory, it is indeed a scarce resource. And
24 GB is really not very much data.

~~~
bluekeybox
Some people say: "but the computers become faster every year," to which I
respond: "but the amount of data we throw at them grows every year at a
similar rate."

There will never be a substitute for good algorithms. In my field
(bioinformatics), thanks to next-gen sequencing technology, the data growth
actually outpaces Moore's law. I imagine the same is true for data collected
from social networks.

~~~
simonsarris
You have reminded me of that old line: "What Andy giveth, Bill taketh away."

(Referring to Andy Grove, Intel's CEO from 1987 to 1998 and still there in
some capacity, and to Bill Gates.)

------
mrb
Reasons why sometimes you cannot just throw RAM at a problem:

1. Your smarter competitor will _both_ throw RAM at the problem _and_ optimize
their code, and end up outperforming you.

2. Your quad-socket machine already has its RAM maxed out, and adding more RAM
(i.e. adding a second server) will require significantly more complex code to
distribute the workload across more than one machine. Optimizing your memory
usage is probably easier.

3. Increasing RAM is just not an option on a mobile platform (e.g. a
cellphone).

4. You will lose customers by upping the minimum RAM requirements for your
app.

5. Once you have added more RAM, you have no more tricks left in your bag to
quickly scale up in case of emergency. Instead, work on optimizing your code
while you have time.

------
ajg1977
"Algorithms are for people who don't know how to buy RAM" is one of those
soundbites that people seem to find clever, but really shows a startling lack
of understanding.

You can chuck as much RAM as you like at your problem, but it's not going to
help once your data set no longer fits in memory - and maybe way before that
if you're thrashing those piddly L3 and L2 caches.
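
A quick way to see the cache part of this, as a rough NumPy sketch (exact
numbers will vary by machine, and the fancy-indexing gather adds its own
overhead): summing the same array through a random index order touches exactly
the same bytes as the sequential order, but the scattered accesses blow the
caches and run noticeably slower, even though everything is "in RAM".

```python
import time
import numpy as np

n = 20_000_000                       # ~160 MB of float64, far bigger than any L2/L3 cache
data = np.random.rand(n)

seq_idx = np.arange(n)               # sequential, cache-friendly order
rand_idx = np.random.permutation(n)  # same indices, scattered order

for label, idx in (("sequential", seq_idx), ("random", rand_idx)):
    start = time.perf_counter()
    total = data[idx].sum()          # gather in the given order, then reduce
    print(f"{label:>10}: {time.perf_counter() - start:.3f}s (sum={total:.1f})")
```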

~~~
newt
_"Algorithms are for people who don't know how to buy RAM"... really shows a
startling lack of understanding_

Are you perhaps missing that it's a deliberate inversion of the more obvious
statement "needing more RAM is for people who don't know how to use
Algorithms" in order to make the point that RAM is cheaper than an engineer's
time?

~~~
nervechannel
But many _interesting_ problems can't be solved by throwing more RAM at them,
because RAM isn't the bottleneck.

~~~
alextgordon
Are you suggesting that _Jeff Atwood_ may be exaggerating?

------
cletus
Buying RAM as a substitute for algorithms is for people who don't work for
Google (disclaimer: I work for Google).

~~~
younata
Wait, so instead of spending a semester learning about algorithms and stuff, I
can go throw money at the problem?

Why am I in school?

~~~
jbrennan
Technically you are throwing money at the problem (or someone is, for you).

~~~
Keyframe
So, does buying hardware have better ROI than college?

~~~
wlievens
Sometimes yes, sometimes no, but you need a trained brain to spot the
difference.

~~~
rbanffy
It goes like this:

- turn the screw: US$ 5

- knowing which screw to turn: US$ 5000

------
agentultra
_Seriously?

I like your blog but please never write something like algorithms are for
people who can't afford more ram. Memory management and efficiency should
always be a goal, or else you end up with browsers using 2gb of ram with 5
tabs open, and more crap like that.

Yes, memory is cheap. That shouldn't promote terrible code. Vostok4 on January
21, 2011 12:09 AM_

That was my first thought.

I still have a laptop I bought in '07 with just 1 GB of RAM. It works fine and
I've never felt compelled to drop a grand or two every couple of years just so
that I can... what? Use more system resources to do the same things I always
do?

The problem I've been having with NOT keeping up in the tech arms race at home
is that this poor laptop can easily start paging memory with only a couple of
Gnome apps open and FF running. It used to be a pretty decent machine.

There are contexts where it's "cheaper" to throw hardware at the problem. I
don't think the desktop is one of those areas. Not everyone using them is an
engineer being paid over a hundred dollars an hour.

------
sliverstorm
_To me, it's more about no longer needing to think about memory as a scarce
resource, something you allocate carefully and manage with great care. There's
just .. lots_

As much more of a hardware guy myself, I've always been frustrated by this
path of thinking. Sure, it's convenient for you as the programmer if you don't
have to think about resources, but it always feels like the growth in hardware
performance is mostly consumed by more and more needless resource usage.

Always reminds me of how cars are getting more and more efficient, powerful
and clean-burning, and yet mpg is only barely creeping upwards.

Alternatively, an anecdote from my father: he was once tasked with upgrading
the capacity of a company NAS of a few hundred megabytes that was closing in
on max capacity. He arranged to bring on some 20x that capacity, thought
"well, this should do the job for a while", and it was filled in a matter of
weeks.

------
cloudwalking
As an aside: "You can pair an inexpensive motherboard with even the slowest
and cheapest triple channel compatible i7-950..."

Inexpensive motherboards are a terrible idea; they can cause _really_ weird
problems that look like issues with other components. I've built a few
computers and have since sworn never to buy a cheap motherboard again.

~~~
fhars
On the other hand, I know people who will never buy expensive server
mainboards again, as they are produced in such low numbers that bugs never get
ironed out or worked around in operating systems, unlike widely distributed
consumer hardware. (Cheap consumer mainboards still sound like a bad idea,
though.)

~~~
ComputerGuru
_Strong_ upvote. I'm one of those, but not in the server market.

3 years ago, I purchased an Asus Republic of Gamers motherboard for 400
dollars. It was the most expensive component in my PC, and I have hated it
every minute of every day since.

All my work is in bootloaders, but this motherboard has a terribly, terribly
buggy BIOS. Restarting the PC is not a fun thing to do with this 400 dollar
PoS.

Of course, the only reason BIOS upgrades failed to address this is that only a
few thousand of these boards were ever sold... and mostly to gamers rather
than hard-core computer engineers.

------
davidst
Rule of thumb: RAM size lags disk size by about ten years.

Ten years ago a typical workstation had around 32GB of disk storage. Today,
32GB of RAM in a workstation is perfectly ordinary.

Take the size of your local disk space today. Ten years from now that will be
the amount of RAM in your computer.

~~~
redthrowaway
Nope. Both RAM and disk are asymptotic. We're getting to the point now where
the laws of physics are preventing us from getting much smaller. Now, all of
this is only applicable to the current paradigm; an entirely new kind of RAM
could mean vastly higher ceilings, but if we keep making it the way we've more
or less been making it for decades, we'll hit a wall.

~~~
Devilboy
People have been saying that for decades. AMD, Intel et al. are still very
confident that Moore's law is good till at least 2020.

~~~
redthrowaway
We're reaching the scale of process where quantum effects are starting to
rule. I didn't say we wouldn't be able to keep expanding, simply that this
current model will run out of steam. You can't continue to get smaller ad
infinitum.

~~~
kevinpet
Ten years ago it was because the features were smaller than the light
wavelength used to print the mask (not sure of terminology here). There's
always _something_ that looks like an insurmountable barrier, but billions of
research dollars can work miracles. Intel spent $5B on R&D in 2009.

~~~
mturmon
And don't forget, that R&D is funded by RAM profits.

So, do your part, and upgrade soon. ;-)

------
cgart
I wouldn't say that 24GB is enough, no way. It always depends on what you use
your computer for. For a normal user who runs Windows and occasionally plays
games, 24GB might be more than enough. For almost any kind of developer it
might also be more than enough, since in the end that developer develops for
an end-user, who usually doesn't have much RAM.

However, I am a scientist, and my machine at work has 96GB RAM and 24 cores,
which comes to around 4GB per core - not that much anymore. In order to run
algorithms on big data and not have to bother about disk accesses (SSD or
not), more RAM is just crucial. My previous machine had only 8GB of RAM and it
was a big problem to stress algorithms with big data sets on it. So in my
case, there is never enough RAM ;)

~~~
klbarry
For the average consumer, I think 4GB is more than enough RAM; they don't need
anywhere near 24GB. I wonder what percentage of the computer market is
scientists who need the kind of massive processing power yours has?

------
fletchowns
What is the point of this article?

~~~
bigwally
Jeff Atwood likes to tell the world about all his computer upgrades.

His is bigger than yours.

~~~
buro9
Rubbish, mine is larger.

And whilst I really don't want to engage in this kind of dick waving, since
it's Jeff Atwood I will.

I have an HP Z800 (
[http://h10010.www1.hp.com/wwpc/us/en/sm/WF06a/12454-12454-29...](http://h10010.www1.hp.com/wwpc/us/en/sm/WF06a/12454-12454-296719-307907-4270224-3718645.html)
) at home with 48GB RAM currently and it's upgradeable to 192GB RAM.

Of course, having hardware that you're not using is just dumb... why have such
a large capex for home hardware if it's not required? I only have this beast
because I worked on a project in my spare time around distributed data. For
this I built 17 virtual machines with which to test my work. I calculated that
the opex of renting this in the cloud would exceed the cost of purchasing it
over the duration of the project. And because spending money on this kind of
hardware is still dumb, I then put it to work on a piece of software I sold to
a local consulting company. The result is that at least the machine I have has
been and is being used, and has paid for itself a couple of times over.

BTW, this computing power really helps: it builds Chromium in 25 minutes
rather than the hours most people experience.

Also... since I'm now thinking of the Blackbird ground speed check story (
[http://groups.google.com/group/rec.aviation.stories/browse_t...](http://groups.google.com/group/rec.aviation.stories/browse_thread/thread/b9e23a24a784a26a)
), someone else has got to step forward and obliterate my home machine with
some more impressive dick waving.

~~~
ximeng
What's the bottleneck on the Chromium build on your machine?

~~~
buro9
The way Visual Studio builds (it's a Windows machine at the moment) and CPU.

It pegs a couple of cores but the others have relatively low load. I've
followed the optimisation advice ( [http://www.chromium.org/developers/how-
tos/build-instruction...](http://www.chromium.org/developers/how-tos/build-
instructions-windows#TOC-Accelerating-the-build) ) but can't manage to get it
below 25 minutes.

The bottleneck certainly isn't disk or RAM.

~~~
kenjackson
Why does it only peg a couple of cores? I suspect Chromium is a pretty
substantial build.

------
SoftwareMaven
Unfortunately, just like roads[0], all that memory will be filled up and used.

[0] <http://en.wikipedia.org/wiki/Lewis-Mogridge_Position>

~~~
chaosfox
Unfortunately? I would never have bought it if it weren't going to be used;
good RAM is used RAM.

------
barrkel
i7 920, clock for clock, made a huge difference over my Core 2 Q6600 on the
desktop. I estimate it was between 2 and 3 times faster, both overclocked to
3.4GHz, depending on whether I was doing a build of the source tree, or
transcoding a video.

Definitely not "blah".

~~~
reitzensteinm
I have to ask, did anything else change at the same time? Although the i7 920
is a fearsome chip, and a great upgrade over the older 65nm Core 2 Quads,
anything approaching even 2x performance per clock I'd probably consider an
edge case. Sounds like there was probably another variable in there.

~~~
barrkel
I think a lot of it comes down to hyperthreading, actually. The 3x improvement
was in transcoding, which is a reasonably pure test of CPU and memory
throughput, rather than amount of memory (4GB -> 12GB), or I/O speed (the
transcoding was actually done with both input and output across a gigabit
network to my NAS).

Nailing down the improvement in build time to any single change would take
more time substituting things than I'd care to expend right now. The extra
memory for file cache no doubt helped quite a bit, as did the SSD, but
different parts of the build are I/O heavy, and other bits are CPU dominated.
My MacBook Air 13" (4GB RAM, SSD etc., but slow processor, running Windows 7)
isn't particularly fast at the build.

------
Figs
What are the power usage / heat production implications of that much RAM in a
system?

~~~
wmf
Pretty small; each DIMM uses less than 10W.

------
XlpThlplylp
I guess no one has attempted to run the Gaussian computational chemistry
algorithms on molecules with more than tens of atoms.

------
liquidcool
To me, this raises the question: why are higher-memory VPS instances so much
more expensive? I know they typically have more processing power, but the
pricing still seems disproportionate.

And I think the more exciting issue is the falling cost of SSDs. I was
recently reading about in-memory databases being much faster not just because
of faster access, but because they spend so much less CPU time working around
disk access issues. Can the whole OS be written that way? We are quickly
heading toward exciting times.

------
chaosmachine
$299 for 24 gigs of ram is ridiculous. I thought for sure that had to be the
per-stick price, or something. Prices sure have dropped a lot since the last
time I built a system.

~~~
cdavid
I was actually surprised as well - I bought a powerful machine like 16 months
ago, when i7 became available exactly to get a lot of RAM. At that time, 300$
got you 12 Gb.

Also, SO-DIM became quite cheap: it is really affordable to get 8 Gb in a
laptop (IMO, the difference between 4 and 8 is big because @ 4 Gb, running
more than one vm on mac os x is not so practical, whereas at 8 it is a no
brainer).

------
mhb
I could go for like a week without having to restart Firefox!

------
CrLf
It seems like a sound principle, in theory, to trade memory for programmer
time. After all, one can buy a lot of RAM for the cost of a programmer's work
day (even the expensive ECC RAM used on servers).

However, most people don't seem to be able to tell where that stops making
sense, and the end result is software that's slow no matter how much memory
you throw at it.

Anyone who has worked for any amount of time with "enterprise" software knows
what I'm talking about.

------
wildmXranat
But it's not. Period. As an application's dataset grows and yesterday's
presumptions blow through sane limits, you will revisit this topic. It happens
again and again.

Hypothetical example: the boss asks if you can fit several million products
into the database and keep them up to date by setting related prices on them.
Each synchronization task needs to be aware of product deltas: is it
available, sold out, on sale, and so on. Oh, and there are many suppliers and
catalogs. All the while you're running set intersections, differences and so
on over multi-million-item collections, you still need to ensure that access
to this data is consistent and sane. Access times of several hundred
milliseconds are not acceptable.

The obvious solution is to distribute the load, counting and updating, among
machines, and to serve publicly available data from high-performance in-memory
tables. At some point it all comes unravelled when you get a request for
custom sub-sets, where each one can be queried live by price, text, brand...

Memory-bound problems are just that. At some point you hit the wall. I think
planning for that impact is the best insurance a shop can make.
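
A minimal sketch of the delta step described above, in Python; the SKU naming
and set sizes are made up for illustration, and a real system would carry
prices and availability and keep this in a proper store rather than plain
sets:

```python
# Yesterday's and today's catalogs for one supplier, modeled as sets of SKU
# strings. Multi-million-item sets like these, multiplied by suppliers and
# catalogs, are exactly where the RAM pressure comes from.
yesterday = {f"sku-{i}" for i in range(0, 2_000_000)}
today     = {f"sku-{i}" for i in range(1_000, 2_005_000)}

added     = today - yesterday    # newly listed products
removed   = yesterday - today    # sold out / delisted
unchanged = today & yesterday    # still present; prices may need refreshing

print(len(added), len(removed), len(unchanged))
# -> 5000 1000 1999000
```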

------
nickpp
What "unexciting i7 platform upgrade" is he talking about? SandyBridge is out
but it is DUAL channel, not triple. So 16Gb, not 24...

------
jacques_chester
For contrast, I am working on an honours project proposal to develop a
blogging system (yes, I know, but I have a novel twist) and I plan to target
it at 64MB VPSes -- partly because at that level it's price-competitive with
shared hosting.

~~~
jerf
For "price competitive with shared hosting" you ought to be able to a lot
better than that? I pay 19.95 a month for what is technically a variable-RAM
instance (pay-per-resource), but I use somewhat more than 64Mb and don't come
even close to reaching my actual limits.

~~~
jacques_chester
What I'm driving at is that a lot of people use shared hosting to run a blog
because for a long time it has been the cheapest option (Dreamhost charges
$7.95/mth, for example). VPSes may overtake it as the preferred option.

I predicted this happening in a piece I posted in 2008.

[http://clubtroppo.com.au/2008/07/10/shared-hosting-is-
doomed...](http://clubtroppo.com.au/2008/07/10/shared-hosting-is-doomed-and-i-
have-the-graphs-to-prove-it/)

And the HN discussion. <http://news.ycombinator.com/item?id=241952>

~~~
jerf
Ah, a tighter definition of "competitive with". That's fair. I've been around
long enough that a VM image for $20/month is still a bit mindblowing when I
think about it too much. :)

------
peterwwillis
This reminds me how some companies still run old web apps on huge server farms
made of boxes with 4GB of RAM. They're basically building in a limitation so
the programmers can't try to gain performance boosts via memory caching even
if they wanted to.

Also consider that though the article indicates ~$900-1000 for a full 24GB
system, it may be only ~$3000 for an S5520SC-based system with 96GB. I suppose
one would be good for horizontally-scaled web caches and the other for big
databases or whatever funky app might make tons of random seeks over a large
dataset.

------
fosk
With 24 gigabytes of memory I could even avoid using the HDD.

~~~
djhworld
but what if you have a power cut?

------
lini
Since most applications are still 32-bit, most users are still OK with 4GB.
Case in point - games on PCs. I upgraded my video card at least 3 times in the
last 3 years but the main memory of the PC has always been 4GB and I have no
problem with any game.

Sure I am really glad that my work PC has more than that, so I can open
several IDEs with large solutions and not worry about memory, but for home - I
still don't see a need to upgrade.

------
leon_
> To me, it's more about no longer needing to think about memory as a scarce
> resource, something you allocate carefully and manage with great care.

Why am I not surprised to hear that from Mr. Atwood?

Yeah, 24GB is nice and stuff, but the "not care about memory" part rings my
alarm bells. I give him 2 or 3 months of careless coding till we see a "64GB
of RAM ought to be enough" post...

