There has been a lot of talk about how Moore's law is coming to an end, how you can now build servers with close to a TB of RAM, how you can scale clusters of servers and achieve great numbers for raw data storage or memcaching, and how incredible feats in both parallelization and miniaturization will reverse the downhill trend for high-end computing.
Then there's the desktop. My desktop PC, for instance. It's a great six-month-old work machine, with 8 cores and 16GB of RAM. "This ought to be enough for all my needs", I thought. The Bill Gates in me has been right so far, but unfortunately it won't be enough for long.
Yesterday, out of curiosity, I took a look at my memory usage: 11.4GB. Whoa. Here's the breakdown:
- Chrome: ~3GB
- Firefox: ~1.5GB
- Java (Eclipse): ~1.2GB
- The rest spread across tons of various work-related apps.
I'm of course responsible for letting this accumulate over several days, but still: a third of my RAM taken up by web browsing tabs? Chrome on its own hogging more than twice as much as Eclipse?
What worries me now is how hard we're getting hit by RAM-hogging web pages. Since I began writing this post, my freshly restarted Chrome browser with 9 open tabs (Hacker News (x2), Coding Horror, a Google search for "Mac osX Lion review 6 months", MoPub ad service monitoring, Google Analytics Visitors Overview, the Android Developer Console, Twitter, Gmail) is taking up roughly 500MB of RAM. That's insane. On my 2010 MacBook Pro with 4GB of RAM, that's about an eighth of my total memory taken up by my core web needs. Needless to say, I can't use my MacBook anymore.
Last summer I was working on a little Java experiment: a cross-platform 3D labyrinth. I wanted the overall memory and data footprint to be as low as possible, so the game could be played on a low-end Android phone with really fast loading, while keeping the game space as big as I could, so I designed my own dedicated data structures overnight. I made the following calculation: in 500MB of raw, uncompressed data, I could store the description of an area roughly equivalent to a map of Europe at a resolution of 2.5 meters per pixel.
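I won't reproduce the actual structures here, but to give a rough idea of what "dedicated" means, here is a minimal sketch of the general approach (purely illustrative, not the real labyrinth code; the 2-bits-per-cell figure is an assumption for the example): pack each cell of the world grid into a couple of bits inside a flat long[] instead of allocating an object per cell.

```java
// Illustrative sketch only: a dense grid storing 2 bits per cell
// (say, wall flags) packed into a flat long[] array.
public final class PackedGrid {
    private static final int BITS_PER_CELL = 2;
    private static final int CELLS_PER_WORD = 64 / BITS_PER_CELL; // 32 cells per long

    private final long[] words;
    private final int width;

    public PackedGrid(int width, int height) {
        this.width = width;
        long cells = (long) width * height;
        this.words = new long[(int) ((cells + CELLS_PER_WORD - 1) / CELLS_PER_WORD)];
    }

    public void set(int x, int y, int value) {
        long cell = (long) y * width + x;
        int word = (int) (cell / CELLS_PER_WORD);
        int shift = (int) (cell % CELLS_PER_WORD) * BITS_PER_CELL;
        words[word] = (words[word] & ~(3L << shift)) | ((long) (value & 3) << shift);
    }

    public int get(int x, int y) {
        long cell = (long) y * width + x;
        int word = (int) (cell / CELLS_PER_WORD);
        int shift = (int) (cell % CELLS_PER_WORD) * BITS_PER_CELL;
        return (int) ((words[word] >>> shift) & 3);
    }
}
```

At 2 bits per cell, 500MB already buys you around two billion cells; the figures above obviously require something sparser and cleverer than a dense grid, but the point stands: when you count bytes deliberately, you get a lot of world for very little memory.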
I'm not claiming any feat here, just stating the obvious: web programmers are doing something that's definitely not cool for our current RAM budget. We've lost any sense of measure. There's no reason a Twitter feed following 175 people, left open for an hour or so, should claim a 70MB RAM footprint. You're neither the only nor the worst offender, Twitter. Then I had 26 new tweets to display; I clicked, and the footprint suddenly grew to 76MB. 26 tweets = 6MB. We're talking about 140-character tweets. Let's be generous and multiply that amount by 100 to account for the tweeters' profiles (which, as we all know, are all 10,000-character essays), and we get a total of 364,000 new characters, which end up claiming more than 6 million bytes of RAM. That means the RAM impact of adding 26 new tweets to a webpage is at the very least 10 times higher than it needs to be, and probably more like a thousand times too high.
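For the skeptical, here is that back-of-the-envelope math as a tiny throwaway program (the 6MB figure is simply what Chrome's task manager showed me between the two clicks; the 100x multiplier is a deliberately generous guess, not a measurement):

```java
// Back-of-the-envelope check of the tweet numbers above.
public class TweetFootprint {
    public static void main(String[] args) {
        long tweets = 26;
        long charsPerTweet = 140;                 // raw tweet text
        long generousMultiplier = 100;            // profiles, metadata, markup
        long estimatedChars = tweets * charsPerTweet * generousMultiplier; // 364,000
        long observedBytes = 6 * 1000 * 1000;     // ~6MB of observed RAM growth

        System.out.println(observedBytes / estimatedChars);           // ~16x too big
        System.out.println(observedBytes / (tweets * charsPerTweet)); // ~1,648x too big
    }
}
```

Even with the generous 100x allowance, the observed growth is more than an order of magnitude larger than the data being displayed; measured against the raw tweet text alone, it's three orders of magnitude.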
Like I said, I'm only using Twitter to make a point. I mean no offense to Twitter's web devs (I'm a very poor web developer myself), and I'm pretty sure the blame could also be put on Google Chrome instead. But I remember a day in 2002 when my Internet Explorer was trying to load a 4MB webpage from the hard drive, causing a RAM footprint above 40MB and a failure after 20 minutes of waiting in front of a white screen. Back then I was merely a junior consultant working in QA, and I was the one telling the devs that they were definitely doing something not cool at all for the user's computer. True, our webapp was showing rather complex and impressive amounts of "data" through a much simpler but oh-so-wrongly implemented "UI" (in that case, hundreds of unnecessary nested table anchors), but I can't help thinking back to those times when it was simply impossible to ship a product that was not cool for the user's computer, because the computer would refuse to run it at all.

We're way past that line today. My Twitter tab has now garbage collected some data; it's back to a 73MB footprint. There are 32 new tweets to show. I click, and the footprint bumps back up to 78MB. Meanwhile, my overall Chrome footprint is now showing 550MB of private memory. That's 50MB for two clicks on Twitter and ~4,500 characters typed into a Hacker News submit form.
Nowadays, Moore's Law seems to govern the computing power requirements of software rather than the performance of hardware, and this is killing us.
What's the point of having a tonne of RAM only to let it sit idle?

I "only" have 4GB of RAM, and Chrome is taking up 1.2GB of that, even though I have 12 windows open with ~5 tabs in each.