
Tell HN: RAM Requirements are going through the roof - babebridou
There have been a few talks about how Moore's law is coming to an end, about how you can now build servers with close to a TB of RAM, how you can now scale clusters of servers and achieve great metrics when it comes to raw data storage or memcaching, and how incredible feats in both parallelization and miniaturization will reverse the downhill trend for high-end computing.<p>Then there's the desktop. My desktop PC, for instance: a great six-month-old work machine with 8 cores and 16GB of RAM. "This ought to be enough for all my needs", I thought. The Bill Gates in me has been right so far, but unfortunately this won't be enough for long.<p>Yesterday I took a look at my memory usage, out of curiosity: 11.4GB. Whoa. I had a look at the breakdown.<p><pre><code>  - Chrome : ~3GB
  - Firefox : ~1.5GB
  - Java (eclipse) : ~1.2GB
  - Rest: tons of various work-related apps.
</code></pre>
I'm of course responsible for letting this accumulate over several days, but still: a third of my RAM taken up by web browsing tabs? Chrome on its own clogging more than twice as much as Eclipse?<p>What worries me now is how hard we are getting hit by RAM-hogging web pages. Since I began writing this post, my freshly restarted Chrome browser with 9 open tabs (Hacker News (x2), Coding Horror, Google Search for "Mac osX Lion review 6 months", MoPub Ad Service monitoring, Google Analytics Visitors Overview, Android Developer Console, Twitter, Gmail) is taking up roughly 500MB of RAM. That's insane. On my 2010 Macbook Pro with 4GB of RAM, this means 20% of my overall caching capacity is taken up by my core web needs. Needless to say, I can't use my macbook anymore.<p>Last summer I was working on a little Java experiment - a cross-platform 3D labyrinth. I wanted the overall memory and data footprint to be as low as possible so it could be played on a low-end Android phone with really fast loading, yet I wanted to keep the game space as big as I could, so I designed my own dedicated data structures overnight. I made the following calculation: in 500MB of raw, uncompressed data, I could store the description of an area roughly equivalent to a map of Europe with a resolution of 2.5 meters per pixel.<p>I'm not claiming any feat here, just stating the obvious: web programmers are doing something that's definitely not cool for our current RAM budget. We have forgotten any sense of measure. It's unnecessary for a Twitter feed following 175 people for an hour or so to claim a 70MB RAM footprint. You're not the only nor the worst offender, Twitter. Then I had 26 new tweets to display; I clicked, and the footprint suddenly grew to 76MB. 26 tweets = 6MB.
We're talking about 140-character tweets. Let's be generous and multiply that amount by 100 to take into account the tweeters' profiles (which, as we all know, are all 10,000-character essays), and we get a total of 364,000 new characters, which end up claiming more than 6 million bytes in RAM. That means the RAM impact of adding 26 new tweets to a webpage is at the very least 10 times higher than it could be, and probably more like a thousand times too high.<p>Like I said, I'm only using Twitter to make a point. I mean no offense to Twitter's web devs - actually, I'm myself a very poor web developer - and I'm pretty sure the blame could also be put on Google Chrome instead. But I remember a day in 2002 when my Internet Explorer was trying to load a 4MB webpage from the hard drive, causing a RAM footprint above 40MB and a failure after 20 minutes of waiting in front of a white screen. Back then I was merely a junior consultant working on QA, and I was the one to tell the devs that they were definitely doing something <i>not cool at all</i> for the user's computer. True, we were showing rather complex and impressive amounts of "data" in our webapp with much simpler but oh so wrongly implemented "UI" (in that case, hundreds of unnecessarily nested table anchors), but I can't help thinking back to those times when it was simply impossible to make a product that was <i>not cool</i> for the user's computer, because the computer would refuse to run it at all. We're way past that line today. My Twitter tab has now garbage collected some data. It's back to a 73MB RAM footprint. There are 32 new tweets to show. I click them, and the footprint bumps back up to 78MB. Meanwhile, my overall Chrome footprint is now showing 550MB private memory.
That's 50MB for two clicks on Twitter and ~4500 characters in a Hacker News submit form.<p>Moore's Law nowadays drives the computing power requirements of software rather than the computing performance of hardware, and this is killing us.
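
The post's tweet arithmetic can be sketched quickly. The 1-byte-per-character figure and the generous 100x profile multiplier are the post's own assumptions, and the 6MB jump is taken at face value:

```python
# Back-of-envelope check of the post's tweet arithmetic (assumptions
# from the post: 1 byte per character, a generous 100x multiplier).
TWEETS = 26
CHARS_PER_TWEET = 140
OVERHEAD_MULTIPLIER = 100           # profiles, markup, metadata, etc.

raw_chars = TWEETS * CHARS_PER_TWEET * OVERHEAD_MULTIPLIER
observed_bytes = 6 * 1000 * 1000    # the ~6MB footprint jump observed above

print(raw_chars)                    # 364000 characters
print(observed_bytes / raw_chars)   # ~16x more RAM than the raw text needs
```

Even with the generous multiplier, the observed footprint is more than an order of magnitude above the raw data, which is the post's point.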
======
Lewton
Chrome is designed to be memory hungry. It doesn't NEED the memory, but if
it's there it sure as hell will use it to try and speed up your internet
experience.

What's the point of having a tonne of RAM only to let it sit idle?

I "only" have 4 GB of RAM and Chrome is taking up 1.2 GB of that, even though
I have 12 windows open with ~5 tabs in each.

~~~
kmm
"Idle" memory is used by the operating system as cache. On my computer, 800
MiB of my RAM is in use by programs, but 1.7 GiB of my 2 GiB RAM is used in
total, by programs, buffers and cache.

The problem is that an extension like "1-Up for Google+" doesn't need 20 MiB
of RAM just to play a sound and display a green mushroom, for a total of about
150 KiB of assets and maybe a few KiB of code.

~~~
jiggy2011
I've never used "1-Up for Google+", but are you sure it's really only 150 KiB
worth of assets? If you are measuring file sizes, you are looking at
compressed data, which is useless to your computer in that form.

You'd be surprised how big uncompressed data can get, compare a .BMP with a
.PNG of the same image for example.
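
To illustrate that point (the image dimensions here are made up): a bitmap's in-memory size depends only on its dimensions and color depth, never on how well its file format compresses.

```python
def raw_bitmap_bytes(width, height, bytes_per_pixel=4):
    """In-memory size of an uncompressed bitmap (e.g. 32-bit RGBA)."""
    return width * height * bytes_per_pixel

# A full-HD image that might be only a few hundred KB as a PNG on disk:
print(raw_bitmap_bytes(1920, 1080) / 1024 / 1024)  # ~7.9 MiB uncompressed
```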

~~~
kmm
I had uncompressed data in mind, but it was only a rough guess. I'll do a more
in-depth calculation.

2 seconds of uncompressed sound, at 44.1 kHz, in stereo and 8 bits per sample
is 170 KiB. 17 images of on average 24 by 24 pixels at 32 bits per pixel is 68
KiB. Three other, bigger images give another 40 KiB of uncompressed data. A
total of ~280 KiB. There are a few KiB of HTML, but I don't know whether and how I
should count that. Even if we take into account headers, padding, objects and
higher quality data (e.g. 16 bits per sample), I assume all assets will still
account for less than half a megabyte.
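
Redoing that arithmetic in Python, with the same assumptions (2s of stereo 8-bit 44.1 kHz audio, 17 icons of 24x24 at 32 bits per pixel, and the comment's 40 KiB figure for the three larger images). The icon figure comes out somewhat lower than the comment's estimate, but the conclusion is the same: well under half a MiB of assets.

```python
KIB = 1024

sound  = 2 * 44100 * 2 * 1    # 2 s, 44.1 kHz, stereo, 8-bit samples
icons  = 17 * 24 * 24 * 4     # 17 icons, 24x24 px, 32 bits per pixel
bigger = 40 * KIB             # the comment's figure for the 3 larger images

total = sound + icons + bigger
print(total / KIB)            # ~250 KiB of uncompressed assets, total
```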

Now that I look at the code, I see it's including a bundled version of jQuery.
I don't know how big the representation of Javascript code is in memory, but
29 MiB seems a bit much (correction, while I was typing this, it went to 31
MiB).

It might seem disingenuous to single out this extension, but truth be told,
all my extensions use a similarly excessive amount of private memory.

~~~
jiggy2011
Most likely it's the JavaScript; it may also be holding syntax trees and the
like in memory.

If it causes another library to load that nothing else was using, that could
count, especially if it's a different version of jQuery.

If two processes are both using the same library and it is linked into their
address spaces, whose memory usage does it count against?

~~~
kmm
Neither, as I am only looking at private memory, which is by definition not
shared. Interestingly, most of my extensions have 0K shared memory, which
implies they're all keeping their own version of JavaScript libraries in their
own memory space. That might explain most of the problem. It doesn't seem easy
to solve a problem like this, without reinventing the shared library and the
accompanying "DLL hell".

~~~
sirclueless
And in fact the memory isolation between javascript execution contexts is one
of Chrome's most touted security features.

------
betterth
RAM is managed by the operating system. You may use 11/16 GB at once, but
someone could browse the internet with the same tabs at 3 or 4 GB and be
fine. Hell, they could at 2 GB and be fine.

The operating system plays a huge role in memory usage, many modern ones cache
indiscriminately, preload most used applications, etc etc

Comparing RAM usage is almost apples and oranges these days, unless we're
talking similar setups.

~~~
daed
To add to this, I think it's important that OP knows this is a _good_ thing. I
built my computer just over 3 years ago when DDR2 prices bottomed out, so I
have 8GB, and this thing is still more computer than I need. I boot up, load
all my applications into RAM, and everything is super snappy - because RAM is
fast. Prior to this rig I found myself compelled to upgrade at least every 3
years because things would get slow. Times change.

------
jiggy2011
I would say that it's not so much that RAM Requirements are going through the
roof as it is that the _benefit_ you can get from having more RAM is going
through the roof.

Having said that, many of the languages we are using, such as JavaScript, are
not exactly storage efficient. Combine that with the fact that many people who
would probably not have considered themselves "programmer" enough to write
native apps in C++ back in the day are now writing things in JavaScript and
putting them online.

I often see people "benchmark" things like operating systems or browsers based
on RAM usage (people coming to the conclusion that 32bit Win XP is more
"efficient" than Win7 for example) but I think it is far more important to
measure swap file usage as this is the thing that actually hurts performance.
AFAIK there is no performance hit in overwriting a location in memory that
already has a value stored, but there is a huge penalty for hitting a disk. So
I'd be quite happy to have all my RAM utilized all the time if I have 0
swapping.

First of all, if you are using a 64-bit platform, your pointers are going to
be twice as big as they used to be, but you get to access so much more memory
that this issue solves itself.

Also IIRC Windows 7 does things like arrange your frequently used data into a
contiguous area of the disk and then reads all that stuff into VM at bootup.
This will result in higher RAM usage (due to many pages being loaded into
physical RAM) but better overall performance since if other stuff needs the
RAM more urgently it can either swap the cache back to disk or overwrite the
cache entirely.

Let's say you have 8GB of RAM and are watching an HD video which is 7GB
uncompressed, and the rest of your system only really _needs_ 1GB of your RAM.
Surely it makes sense that, whilst you are watching the video, you can use
spare CPU cores to uncompress the rest of the video and put it into RAM.

I think it's more the case that computing will always expand to fill the
amount of memory available, bearing in mind that even the cost of swapping is
dropping dramatically with SSDs, etc.

~~~
randallsquared
_Let's say you have 8GB of RAM and are watching an HD video which is 7GB
uncompressed, and the rest of your system only really needs 1GB of your RAM.
Surely it makes sense that whilst you are watching the video you can use spare
CPU cores to uncompress the rest of the video and put it into RAM._

That's not clear to me. Why do something ahead of time that can be done faster
than needed on the fly anyway? You're basically assuming that when system
resources are needed for something else halfway through your video, processing
power will be more useful than space.

~~~
jiggy2011
True, with video decompression that is probably the case.

I'm sure there are lots of examples where doing complex calculations whenever
you have spare CPU, and swapping the results out to disk as needed, would be
preferable to contending with whatever else needs the CPU at the time.

I think the key is how the OS deals with memory contention issues rather than
just minimizing memory usage.

Someone with a better understanding of OSes than me could probably clarify.

I guess the performance hit is when you write a page of RAM to the disk,
perhaps there are some cases where you don't bother to do this if memory
contention is high?

~~~
keeperofdakeys
Really, you don't want to randomly waste CPU cycles for nothing, like pre-
decompressing video. It just increases the power usage and heat of the
computer. It would also make it harder to skip around, since the program needs
to finish the portion it is working on.

One thing people often don't realise is that there isn't really 'wasted' RAM:
your OS usually uses the rest to cache things, especially files. File caching
can greatly speed up disc reads and writes. (The following only applies to
Linux; I'm not sure about other OSes; all three have very different memory
models.) When memory usage is getting high, a few things can happen (depending
on whether you have swap). The file cache is preferentially cleared first, but
swapping pages to disc is also readily considered. An interesting thing is
that swapped pages aren't deleted from swap straight away, only if they are
modified (so swap usually doesn't decrease in size). There is also the Out Of
Memory killer, which can kill applications using too much memory. It really is
an interesting area. You can read more at
<http://tldp.org/LDP/tlk/mm/memory.html>. Just remember all OSes are
different.
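
On Linux, the free-versus-cache split described above is visible directly in `/proc/meminfo`; a minimal parsing sketch (the sample text here is made up, real files have many more fields):

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style text into {field: KiB}."""
    fields = {}
    for line in text.splitlines():
        name, value = line.split(":")
        fields[name] = int(value.split()[0])   # values are reported in kB
    return fields

# Hypothetical figures; on a real system, read open("/proc/meminfo").read().
sample = "MemTotal: 2048000 kB\nMemFree: 150000 kB\nCached: 1200000 kB"
info = parse_meminfo(sample)

# "Free" understates what's available: the page cache is reclaimable.
print(info["MemFree"] + info["Cached"], "kB effectively available")
```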

------
cperciva
Are your numbers the VM size, or the RSS?

On the laptop I'm writing this from, I've got a Chrome process with 827 MB of
VM size -- but only 62 MB of pages actually in use. The rest is large memory-
mapped regions consisting mostly of untouched pages -- which makes sense,
because on a 64-bit machine, virtual address space is practically free.
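
On Linux, both numbers can be read from `/proc/<pid>/status` (field names as documented in proc(5)); a quick sketch for the current process, Linux-only:

```python
def vm_figures(pid="self"):
    """Return (VmSize, VmRSS) in KiB from /proc/<pid>/status (Linux only)."""
    size = rss = None
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmSize:"):
                size = int(line.split()[1])
            elif line.startswith("VmRSS:"):
                rss = int(line.split()[1])
    return size, rss

size, rss = vm_figures()
# Reserved address space (VmSize) can dwarf the pages actually resident (VmRSS).
print(size, rss)
```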

~~~
babebridou
Figures are taken from the "private" column of chrome://memory-redirect/

Chrome says: "This is the best indicator of browser memory resource usage"

~~~
cperciva
Ok, that should be the RSS -- unless it's broken on your platform, which is
entirely possible, of course.

~~~
davidcuddeback
When I visit chrome://memory-redirect/, I see this message:

    (Bug: We seriously overcount our own memory usage: Issue 25454.)

Issue 25454: <http://code.google.com/p/chromium/issues/detail?id=25454>

------
kmm
I agree that this is completely getting out of hand. Memory was (and still is)
very cheap, but we are reaching a limit. The problem is that there is little
we can do about it. I doubt that Mozilla and Google are unaware that their
browsers consume disproportionate amounts of memory. But developing a web
browser isn't easy, and I assume a huge memory footprint is unavoidable. I've
often wondered whether the problem lies not with the web developers nor with
the browser programmers but in the expressiveness of the HTML+CSS+JavaScript
combination. It allows an enormous diversity of websites, but storing an
exhaustive set of metrics for each element on a page can add up to quite a lot
of data.

I wouldn't say your macbook is useless though. I have only one computer and
it's a 2008 laptop with 2 GB RAM. And I cope very well, although I experience
the occasional lockup (I don't use swap).

~~~
zdw
With memory prices as low as they are right now, you should upgrade if
possible.

8GB of DDR2/3 from a good vendor has been less than $50 for a while now.

~~~
kmm
I've considered it, but I wonder whether I wouldn't be better off buying a
whole new laptop instead of patching this one up. That's a lot of money,
though, and this one works fine, so I keep postponing it.

But really, buying more RAM is only a temporary solution. I sometimes feel
like the memory demands of web browsers are out of line and I wonder what
causes that.

------
whiskers
A lot of apps will take more memory than they need so they can keep more data
"live" and be more responsive when they need to perform actions.

If you ran the same set of apps on a 2GB machine you'd get different numbers.

~~~
babebridou
Still I fail to see _why_ this "live" data takes so much space.

Besides, I'll have to take your word on that, since my 4GB macbook pro has
become completely unusable due to constant swapping, though that could be
completely unrelated.

~~~
vrdabomb5717
I'm at the same point as you are. I have 8 GB of RAM in my mid-2009 MacBook Pro
and usually have one browser running with a number of tabs. I quickly run out
of RAM and go to swap, and the only solution I've found so far is to use the
_purge_ tool that came with Xcode and manually free up the "inactive" RAM that
seems to just sit there.

------
joestringer
For those with the RAM to spare, this seems like sensible behaviour -- If
there are ways to speed up your web browsing by using otherwise idle RAM, then
great!

Unfortunately, the flipside is that if you're using a computer more than a few
years old, then anything more than your set of 'core' sites tends to slow your
computer to a halt as your browser causes everything to be constantly swapping
in and out of memory.

------
brador
PROTIP: Use noscript to take RAM requirements back to 2007.

I'm on a 2GB Laptop with XP typing this and the fan is silent. SILENT. The
second I turn noscript off it overloads, then of course, crashes. Some scripts
are running constantly in the background, with click tracking and even mouse
motion tracking becoming more commonplace.

~~~
chokma
Also, using the RequestPolicy addon on Firefox will dramatically cut down on
stuff loaded from third-party websites (including Google Analytics, the
Facebook Like button, and ad and tracking services of all kinds).

~~~
Barnabas
For Chrome users on the dev channel, try <http://silentorbit.com/kiss/>

------
keenerd
Turn off flash and javascript. That should reduce your browser's memory use by
90%. And use Opera. I can easily fit 200 tabs in under a GB.

~~~
IgorPartola
Or turn off your computer completely to have 100% free RAM. This advice is not
as helpful as it seems if you are, let's say, a JavaScript or Flash developer,
or if you want to use anything more advanced than a blog.

~~~
keenerd
I use a second browser just for JS and flash. It never has more than 3 tabs
open and is quite light on memory usage.

~~~
IgorPartola
That does not address lots of use cases. It also introduces an inconvenience
that is pretty much a no-go for regular Joes out there. The problem is that
an average user is now forced to buy a new computer every 3 years because
Flash and JavaScript take up too much RAM. Sure an above average user might go
through the inconvenience of setting up two browsers and learning what
JavaScript and Flash are, but your standard spherical bear will not.

As for developers, matters are worse. For one, I have the pleasure of working
with (debugging, deploying, writing) JS and Flash code. I would love to use
the extra 2 GBs of RAM for running 3 more virtual machines to do more testing,
but I cannot.

The solution is not to bury our heads in the sand, but to fix the memory leaks
and/or memory requirements. As the OP points out, there is no reason that a
web page that can be encapsulated in half a meg of HTML needs to take up
hundreds of megs of RAM once parsed and running.

------
notatoad
Your operating system uses the RAM available to it. You see 11.6GB of RAM
usage because you have 16GB of RAM. My laptop has 4GB, and short of running
virtual machines I've never wanted for more.

------
Tichy
The apps simply try to make the best use of the memory available, for example
they could use it for caching. It would be bad if they weren't using up all
the memory available.

That said, there might be the occasional rogue app or web site, which should
be uninstalled or avoided.

~~~
hollerith
>there might be the occasional rogue app or web site, which should be
uninstalled or avoided.

How would I know if a web site is "rogue"?

~~~
babebridou
I guess "rogue" is a term for both malicious and poorly coded sites. As a
rule of thumb, if scrolling isn't perfectly instant, then a site is "rogue". I
found a recent example of an unintentionally rogue website: it was a tumblr
blog using a preview service that would extract OpenGraph metadata and display
images as a caption to the link. One of the links was an article on a news
website where the og:image was incorrectly pointing at a 3MB JPG. The whole
tumblr blog would be totally unresponsive to scrolling due to a 50x50 image
with a 4096x2048 src.

This was absolutely unintentional and involves at least two intermediates over
which the original blogger had no control whatsoever.

Link to the page in question: <http://fh2012.tumblr.com/page/6> It so happens
it's the tumblr blog of the current favorite to the French Presidential
election, and it was the first post so I noticed it :P

~~~
hollerith
Thanks.

I would have preferred a web in which scrolling (and certain other operations,
like the Back button) was always instant regardless of how the page is coded,
and consequently I did not have to watch out for rogue pages or sites.

ADDED. Or at least a web in which there were fewer ways for a page to go
unresponsive, which would have made it easier for readers and writers to
anticipate responsiveness problems.

------
pm24601
You are correct to call out applications that are indifferent to memory usage.
The commenters who answer "RAM is cheap", "your time is worth more than the
RAM", etc. are missing critical points:

1) RAM is a finite resource.

2) Managing excessive RAM usage (even without swapping) requires OS effort.

3) Indifferent memory management is a sign of sloppy coding and usually hides
a memory leak.

4) Casual memory usage does not fly in the mobile, power-constrained world
that is now upon us.

#1: RAM is finite

A computer with 1 terabyte of memory, all but 1 GB of it consumed by a memory-
indifferent program, is going to behave poorly. The OS spends a lot of time
trying to find free space every time the program or the OS needs new memory
for any reason.

#2: Managing excessive RAM usage is not free

The OS is constantly shuffling stuff around in memory, updating pointers, etc.
If the OS has more active memory to manage, then the CPU spends more time on
this overhead work, time that is not available to actually run the program.

#3: Excessive memory usage = sloppy programming (and programmers)

Whenever I have found code that is memory-indifferent, I have found bugs
beyond memory usage:

1) Massive HTTP session size. On a web server, if each session takes "only" 2
megabytes, one high-end server can handle about 4000 users (8 GB of session
data) before bogging down.

2) Memory leaks: temporary data created but never released to the OS.

#4: Mobile

Memory uses power. Remember this is a key reason why Flash is dead on mobile.
Adobe could never reduce the memory/power hungriness of Flash.

As a Java developer, I routinely constrain the JavaVM's memory during testing
to just above the target footprint. I want to know if there is a memory issue
before it blows up in my face on a busy day.
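
The JVM equivalent of that practice is the `-Xmx` flag; a Python analogue is sketched below, using `resource.setrlimit` to cap the process's address space (Unix-only, and a hard cap, so treat this as an assumption-laden sketch rather than a drop-in test fixture):

```python
import resource

def cap_memory(max_bytes):
    """Cap the address space so runaway allocations fail fast (Unix only)."""
    _, hard = resource.getrlimit(resource.RLIMIT_AS)
    limit = max_bytes if hard == resource.RLIM_INFINITY else min(max_bytes, hard)
    resource.setrlimit(resource.RLIMIT_AS, (limit, hard))

# Constrain a test run to just above the target footprint, e.g. 1 GiB.
# Allocations past the cap raise MemoryError instead of silently bloating.
cap_memory(1024 ** 3)
```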

------
AndrewDucker
I have 25 tabs open in Firefox right now, and that's taking up 600MB. So,
about 24MB/page. Seems a bit high, but considering the number of images and
amount of JS, not so high that I'm panicking about it.

------
sirclueless
Try to think about this from an optimizer's standpoint. Here are some
excellent reasons why giant memory footprints are a good thing:

1) Freeing memory is expensive. In order to do it properly in a running
process, you need to check that each piece of memory is not in use, and that
typically requires traversing large graphs. It's only something you want to do
when absolutely necessary. (Hint: it's not necessary when you have 16 GB of
RAM.)

2) RAM is really fast. Things in memory can be read in a few CPU clock cycles.
On disk, you need to wait for a plate of metal moving at 50 mph to swing by.

3) It's totally free! It's not like unused RAM can enter power-saving mode or
anything. If your RAM usage isn't at 100% you are wasting valuable space. This
isn't a checkout lane where lower usage is a win for everyone. It's like an
everlasting candy fountain: it all rots if you leave it, so get the whole
neighborhood in for a piece.

So honestly, what are you complaining about? So Chrome wants to cache hour-old
pictures just in case you hit the back button 45 times. What's the big deal?
YOU HAVE 6,600,000,000 FREAKIN' BYTES LEFT! You could probably fit Project
Gutenberg in there. If and when you run out of memory, and pages take 15
seconds to load because your hard drive is swapping like crazy, then you have
a problem. But you won't have that problem. And just because Chrome is taking
up 3 GB at the moment doesn't mean it won't be a nice citizen when the
operating system starts to cry about low memory, which it won't do because YOU
HAVE 6 MORE GIGABYTES.

~~~
inconditus
Certainly all true points. However, for those without the luxury of 16GB
(I'm running on an old machine with 3GB of RAM), Chrome consistently freezes
the computer due to the resource hogging. I've learned my lesson and switched
to Firefox.

~~~
hiptobecubic
"I've learned my lesson and switched to Firefox."

Good luck with that.

------
hythloday
Instead of complaining that your RAM is 75% allocated, you should be
complaining that 25% of it is unused. (Imagine buying a processor and noting
that it never went above 75% utilization.) Memory deallocation isn't free
(it's often not even cheap), and it's completely wasted work if the
application is closed. In addition, for every allocation that's _not_
unnecessarily deallocated, you don't waste cycles redrawing bitmaps, reparsing
text, redecompressing images, and so on.

------
viraptor
While I agree in general, I think you're pointing at the wrong target here.
It's not that "web programmers are doing something that's definitely not cool
for our current RAM budget". The javascript is loaded and probably jitted to a
smaller space than its text (depends on the number of paths, etc - I'm
ignoring lots of stuff here), the actual text passing through the wire is also
pretty small. So where's the memory going? All the supporting stuff...

There's the whole javascript framework with it's own allocator, memory pools
which will stay around for a long time. Some of the browser elements use the
same kind of vm to actually display the UI to you, adding some more mem usage.
Then there's the whole page which had to be read, parsed into a tree,
processed to create appropriate model for display (all stages are preserved in
an editable way). This has to be a live model for the whole time, since
javascript might need to interact with it - you can't just flatten it into a
screenshot and display that. Now the images - they also need to be loaded and
decompressed for display. Also the text isn't that simple - there's the whole
engine behind loading fonts, analysing each letter / pair / triple / ... for
special spacing rules to be applied. There are multiple glyphs, font variants,
sizes, etc. to rasterise. That text needs to be distributed properly inside of
the DOM model generated above. That text flow actually affects the model again
- they're linked, so they get processed again (text metrics need to be
recomputed on each page resize, which may make some elements longer/shorter).
Then the theme goes on top of this - since the application displays itself,
the needed libraries / theme elements will be loaded too. But of course not
all browsers use the theme from OS for displaying the form elements, so
another layer needs to be loaded here. .... I could go on forever about
elements which are needed for rendering the page.

So yes - lots of that could be much lighter. Lots of that could be made into
modules or pulled some levels higher to make them more general. Lots of that
might be just a result of sloppy coding noone ever got to fix.

But just remember that what you're actually using is a system on top of a
system on top of a system on top of... If you used gopher on a real 80x25 term
you'd use kilobytes. If you used first web browser with no image support and
no scripting, you'd be under megabytes. But you don't - you're looking at a
realtime rendering of 10 different models coming together and interacting with
each other with next to no lag. And it all probably renders with hardware
support to give you a very smooth scrolling experience. So no, web developers
themselves have almost nothing to do with this whole mess. Unless they made a
silly mistake and are leaking memory like crazy... they're not the ones to
look at first.

(I simplified in many places, not all facts are relevant to every OS/browser
combination, etc. just trying to make a point here)

~~~
nirvdrum
Shouldn't it be the responsibility of the Web dev to realize this and work to
minimize it? In a way, it's not much different than building on top of any VM.
And here it definitely impacts the UX. I've certainly closed down tabs from
sites that I know are bogging my system down.

~~~
devs1010
Yes, it should be, but in reality the UI usually seems to get the least
attention for optimization of this sort. For one, I think it tends to have
people working on it who aren't necessarily hardcore programmers; they are
more designers who do some JavaScript. Also, companies are much more concerned
with not wasting memory on their own servers; they figure that if it's just
taking up load on a client machine, it's not as important. It's a bad
approach, and I will often refuse to use a site if it has bad client-side
performance or seems like a memory hog, but that's just the way things are now.

------
willvarfar
Your reasoning is technically flawed.

That aside, I recently hired the cheapest VPS I could find: 128MB RAM. I could
not run apt-get; I had to upgrade to 256MB.

~~~
babebridou
Like I said I'm in no way a browser or web expert. I only have a notion of
what could potentially fit in a certain amount of memory, and when I look at
webpages I'm at least an order of magnitude off, in the wrong direction. This
is why I'm worried. I sure hope I'm completely wrong!

~~~
willvarfar
had to blog:
[http://williamedwardscoder.tumblr.com/post/16634123844/128mb-ought-to-be-enough-for-anybody](http://williamedwardscoder.tumblr.com/post/16634123844/128mb-ought-to-be-enough-for-anybody)

To be honest, I don't really address where your 3GB for Chrome is going. All
those memory figures you see should be taken with a pinch of salt. Reserving
address space, using it, and such are not so straightforward to untangle.

Your OS sadly doesn't offer anything like
[http://williamedwardscoder.tumblr.com/post/13363076806/buffcacher](http://williamedwardscoder.tumblr.com/post/13363076806/buffcacher)

~~~
viraptor
Similar to your caching FS trick, it would be cool if Cleancache was exposed
to the userspace somehow. (<http://lwn.net/Articles/386090/>)

~~~
willvarfar
Yes, the official blessing of a mechanism for caching would open the door to
browsers using it.

I like file semantics because it gives us access control and an API that is
straightforward to use

------
silasb
To get better performance, the operating system will use as much of the RAM as
possible. Why only use 4GB when you have 16GB? Cache as much as possible.
Remember locality from CS and how it relates to caching; this is exactly why
your OS does this.

~~~
ulvund
Exactly, OP does not understand the role RAM plays in computers.

It is disappointing to see the link get so many points in a community where
you would expect the baseline knowledge of technology to be above average.

~~~
adnan_wahab
surely op knows what he's talking about, he even wrote his own data structure.
it must be the moores' law. i read on wikipedia it uses too much memory...

------
nyellin
It isn't a problem with Twitter or Gmail, it is a bug in Chrome.

Chrome leaks memory. Check memory usage again after force-quitting Chrome and
relaunching it. I have to restart Chrome daily to regain 3-5 gigs of swap.

------
nextweek
I hate these kinds of questions about RAM usage. OS developers and browser
developers are smarter than you give them credit for. They will do garbage
collection when memory becomes tight. You only slow yourself down by clearing
RAM you might still need while you still have free RAM.

Browsers have sub-second back buttons; how much pre-rendered RAM space do you
think that takes up?

~~~
babebridou
> Browsers have sub second back buttons, how much pre-rendered ram space do
> you think that takes up?

I'm sure the os & browser devs are all good and great. I'm sure the web devs
are all fine and accurate; that the subject is a complex one and we're already
all doing the best we can.

But that's not my point: I'm worried that one day we'll run out of RAM because
we're designing our webapps as if client RAM was no longer a finite resource,
we simply forgot it even existed in the first place.

------
Havoc
Pretty sure the browsers follow a similar approach to Windows: if the memory
is there, it gets filled on the off chance that it shaves a few ms off
somewhere because the needed item is already in memory.

I think whiskers is right though: it depends on the box. On my PC (2 gigs) the
browsers rarely make it to 1 gig.

~~~
easp
It's one thing for the OS to be making this sort of decision, since it has a
global view of resource needs, a single application does not.

Actually, I don't even get the sense that Chrome has an adequate view of its
own resource needs. My wife often finds her browser and computer sluggish
because Chrome is using so much memory for background tabs.

------
hogu
what stack are you guys running?

I'm running Ubuntu 11.10 with the awesome WM. I have Chrome open with 11 tabs,
plus Thunderbird, Pidgin and Rhythmbox running, and a number of terminals.

I'm using 1 gig of RAM according to htop.

~~~
icebraining
Here: Debian Sid with awesome, Firefox with 12 tabs right now but open for
days, a bunch of terminals, MySQL, mpd and a torrent client: 473MB of actually
used memory.

------
lshevtsov
What are you panicking about? A gig of RAM costs less than a fast food meal
nowadays.

EDIT: and the cost of programmer hours required to _save_ a gig of RAM is
orders of magnitude more.

