Memory use seems like a really pointless thing for users to care about. There's a widespread intuition that using RAM is inherently bad and low memory usage is a worthwhile goal, but it doesn't really make sense. The key performance goal is minimising latency, which is best served by using as much RAM as possible without having to swap out.
RAM is now fantastically cheap. 16GB of DDR3 costs less than $120 in either DIMM or SODIMM format, so there's little reason not to load out your machine with far more memory than you'd practically need.
I'd like software to use more memory, not less - for the vast majority of the time, most of my RAM is unused and my disk is idle, so why not use an aggressive strategy of pre-emptive caching?
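The "aggressive pre-emptive caching" idea could be sketched as a toy read-ahead cache: on each block read, speculatively pull the next few blocks into memory while free RAM allows. This is purely illustrative -- real OS page caches and readahead heuristics are far more sophisticated, and all the names and sizes here are made up.

```python
# Toy sketch of aggressive read-ahead caching (illustrative only).
class ReadAheadCache:
    def __init__(self, backing, window=4, max_blocks=1024):
        self.backing = backing          # block index -> bytes
        self.window = window            # how far ahead to prefetch
        self.max_blocks = max_blocks    # cap so we never push into swap
        self.cache = {}

    def read(self, block):
        # Serve from cache when possible, then prefetch the blocks a
        # sequential reader is likely to ask for next.
        if block not in self.cache:
            self.cache[block] = self.backing[block]
        for ahead in range(block + 1, block + 1 + self.window):
            if ahead in self.backing and len(self.cache) < self.max_blocks:
                self.cache.setdefault(ahead, self.backing[ahead])
        return self.cache[block]

backing = {i: b"x" * 512 for i in range(100)}
cache = ReadAheadCache(backing)
cache.read(0)
print(sorted(cache.cache))   # blocks 0-4 are now resident
```

The point is the cap: as long as prefetching backs off before causing swapping, the "wasted" RAM is free performance.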
I have 8GB. There was a time when 1GB was more than enough. Now you're saying I should have 16GB -- and you're right!
This is Not A Good Thing. This is people throwing every last feature -- even if it has a pile of 800KB of DLLs/SOs attached -- into every app, with no regard for memory.
I find my system thrashing almost daily at this point. I need to upgrade. Two years ago, running predominantly the same software, 4GB was just barely not enough, and I upgraded to 8GB. Two years before that, cut the requirement in half again. And again, and again, until you get back to the 1MB of RAM I had in 1990.
There has been a trend to do exactly what you say you want for 30+ years. And you know, some of that extra memory has been used to great effect -- we can do a ton more now than we could in 1990. But a lot of it is wasted, just Because It's Easy.
And that wasted RAM means pointless upgrades every two years; 99% of the bloat at this point is for something I don't need or want. The ONLY valid reasons to need 16GB+ are if you're actually using that much data in one place -- a server that needs to scale, for instance, or video editing, or maybe a game that uses crazy amounts of data and/or video.
That the same apps are bloating 2x every two years -- and as a result get slower, since hard drive and memory speeds certainly aren't doubling every two years -- is really unacceptable laziness on the parts of companies and developers in general. It's pretty much the opposite of a Good Thing.
Python and Ruby are on average about 40x slower than C. Fifteen years ago, this relegated them to the status of "scripting languages" that were unsuitable for Real Work. Today, there are still plenty of important applications where every clock cycle matters, but many more where that performance difference is absolutely irrelevant.
We have come to understand that developer time is vastly more valuable than computing resources in the overwhelming majority of applications - the only real disagreement is how many orders of magnitude of inefficiency we consider tolerable. Even the most resource-constrained applications are using higher-level languages and computationally inefficient designs, because it's much easier to spec a faster chip than a smarter dev team.
Most applications aren't getting significantly slower. Web browsers are a perfect example of this: almost every aspect of their performance has improved vastly over the past few years, but as a result we're building much more sophisticated web apps and doing things in the browser that were once purely the domain of highly optimised native apps. It's a near-perfect example of Jevons Paradox. Improving the efficiency of a process reduces the cost of the output, which leads to increased demand for the inputs to that process.
My current computer has 700,000x more RAM than my first computer. If 99% of that memory is wasted, I'm still doing pretty well.
Funny that you should post this: the next day, on the front page of Hacker News, there was an article about a Ruby app taking 30 servers, each running at 50% CPU utilization -- and Rails would still hit 100% utilization and crash the cluster when usage spiked -- while Go could run the same load on a single server while barely touching the CPU (5% was typical). That's greater than a 40x speed improvement -- closer to 300x, going from 50% on 30 servers down to 5% on one.
>We have come to understand that developer time is vastly more valuable than computing resources
I get that, but still, 30 servers is pretty expensive. Having a full time engineer just to manage the cluster and write load balancing code is pretty expensive. Having down time because of a usage spike is pretty expensive. I'd rather pay for the smarter dev team myself, but I'm an engineer, so I'm biased.
Though fundamentally I wasn't talking about servers above. I was talking about desktops, and mostly Windows desktops. And having to install Python because that's what one developer prefers, and .NET because of another preference, and Perl because of a third, and Ruby because of a fourth, and Java because of a fifth...where does it end?
My own "scripting language" of choice is Lua. It's TINY, it's faster than all of the above (except, under some circumstances, Java -- but give Mike Pall another year or two and I bet LuaJIT will be beating Java in all the benchmarks instead of just many of them), and using it gives developers that productivity boost you're rightly saying is important.
And if you need more speed than Lua can give you, Go is a good option, if it comes to that. But as you point out, not every app needs every cycle.
Shared Objects are not too bad, since they are shared between all processes that use (the same version of) them. In fact, the apparently worst case of linking them "into every app" is only slightly worse than having them appear in one app each. Assuming they are not all in the working set of every app at once, the overhead is small. Great OS technology!
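The arithmetic behind why shared objects are cheap can be sketched with a back-of-envelope calculation (all numbers here are made up for illustration): the library's read-only code pages are mapped once system-wide, and only per-process writable data is duplicated.

```python
# Illustrative cost model for N apps using one shared library
# (numbers are invented, not measured from any real system).
def total_cost(apps, code_mb, private_mb, shared=True):
    # With sharing, the code pages are resident once; without it,
    # every app carries its own copy of the code.
    code = code_mb if shared else code_mb * apps
    return code + private_mb * apps

print(total_cost(20, code_mb=8, private_mb=0.5, shared=True))   # 18.0 MB
print(total_cost(20, code_mb=8, private_mb=0.5, shared=False))  # 170.0 MB
```

With 20 apps linking an 8MB library, sharing turns ~170MB of bundled copies into ~18MB -- which is why the "linked into every app" worst case is only slightly worse than one app each.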
I think the main technical problem with code/feature bloat is the increased likelihood of bugs with every additional line of code.
The problem isn't when everyone is using the same .so/.dll. THAT would be a dream by comparison.
The problem is that one app wants the Mono/.NET runtime, another wants Python, another is using Java, another is using a huge pile of Visual Basic libraries...and then every app ECOSYSTEM has its own pile of dependencies and libraries. And on Windows, to avoid DLL-hell, everyone pretty much ships their own version of libraries to ensure compatibility (and if you're on Mac, the story is pretty much the same there -- apps don't share anything but OS services -- it's only on Linux where you can rebuild everything that people dare share versions, and even then another commenter just pointed out there are bugs with that too).
Java/Eclipse is probably my current worst offender, and Firefox is still greedy (though it's gotten better -- and Chrome is just completely broken on my system right now for whatever reason). Both are actually worse than the (bloated) Visual Studio that I need to run to develop Windows apps. And then of course there are the various other apps I'm running just to do basic development.
Eclipse makes Visual Studio look lean by comparison -- but if I want to develop for Android, I have to work a lot harder to get an alternate environment ALMOST working as well as they've got Eclipse configured to work.
Sounds like you're on Linux or Mac, and not doing as much GUI development as I choose to do. Looking right now, Eclipse (the GUI from hell) alone is using 608MB of memory. I know it has a lot of useful features, but 608MB is a crazy amount of memory for a fancy text editor.
I'm also using Notepad++ with tons of plug-ins, and it takes up <10MB. Eclipse does more, but really, does it do THAT much more?
And SKYPE is taking 90MB to sit idle. I use Skype to chat with business clients, so I can't easily just toss it, even though Pidgin is using 21MB to connect to 4 different chat networks simultaneously and Skype only connects to one...
The list goes on. And I need to upgrade to 16GB soon (though looking at the memory usage right now, shortly after a reboot, I'm "only" at 4.1GB, so it's not critical yet). Sigh.
Actually, I encounter the problem of mismatched .so versions constantly when running new programs or libraries that are built from source. It would be nice if developers would figure out how to use libtool (and whatever the Windows equivalent is) so that this doesn't become a regular exercise in frustration.
8GB of RAM is about $50 right now. If you have to replace that every 2 years, are you really complaining about the cost of $25 per year on your $1,000 computer system? It's a rounding error.
Chrome, in particular, is generally used for far more tabs (I have like 30 open right now on a MacBook Air) and increasingly complex web sites every year. It's easy to look at memory usage and complain, but the fact is that our usage and the size of the pages we're loading are also rising rapidly. If you loaded Chrome from two years ago and used it how you work today, you might not find the memory footprint that much better than the current version.
If hogging lots of RAM makes my program legitimately run better, then that's what it should do! Chrome runs a process per domain (or something like that) for security and stability reasons. That adds a lot of overhead for lots of tabs. But that's OK because there's actually a reason for using all that RAM.
>If hogging lots of RAM makes my program legitimately run better
It objectively doesn't make your program run better to waste RAM.
If RAM usage goes up 2x every two years but RAM and hard drive speed go up 20% in the same time period, then using more RAM when you don't need it is making programs more sluggish to start and to suspend to disk, and it's making computers draw more power while sleeping (to keep that much RAM alive -- the D in DRAM means every bit in the RAM needs to be refreshed continuously).
Objectively? That's a dangerous word; I hope you've got a better argument than this one. Hard drives are being rapidly replaced by SSDs, so their speed just jumped _way_ up (especially start-up and random access times, which affect sleeping and waking performance). 2 or 3 orders of magnitude, not 20%.
SSDs are great. But I have exactly one hard drive bay in my laptop, and it currently holds >580GB of data. Some of that is bloat, but MOST of it is real data; I have lots of raw video, source art for games, music, and many folders of compiled object files. I'm a game developer, so I use a lot of hard drive.
From what I've read, an SSD has an expected life of about 1-2 years. Aside from not being able to afford $2500 JUST for an SSD to hold my data, I can't fathom paying that much every 1-2 years as the drives die. Not to mention downtime and potential lost data (between the back-up and the failure).
I'd love to have an SSD. In a year or two, I probably will, as the costs will likely drop.
But until then, yes, the performance has objectively dropped.
Even if you don't have replaceable RAM, you're still replacing your computer at some regular interval. It just means you have to buy the computer with more RAM than you would if it wasn't upgradeable (which was true anyway.) That's part of the cost of buying a computer where you can't upgrade the RAM.
I'd love to be able to upgrade my laptop RAM like that. Laptop CPU and motherboard manufacturers are still quite stingy about the maximum RAM they support. This limit is the only reason I had to buy a new laptop last year, and why I will likely need to buy another one in a few more years. A laptop sold today should support up to 2x64GB DDR3 RAM, but typical Ivy Bridge laptops only go up to 2x16GB, and a year ago it was max 2x4GB or for higher-end 2x8GB. The laptop I replaced had max total 3.2GB (you could put in 2x2GB DDR2 but the system would only see about 3.2GB, so I stuck to 2GB+1GB). Its 2007 Core 2 Duo is still blazingly fast, and its disk was easy to upgrade to an SSD.
One can't easily buy a 64GB DIMM today, but Moore's Law is swift and mighty.
Total page size as measured by the http archive project is currently growing 20% per year on average. Any benefit to be had by chrome's speed-ups is lost many times over by web designers bloating their pages. The web is becoming slower to use.
What are you guys talking about? My main machine has 1.25GB of RAM and is 10 years old.
Sure I don't use it for modern gaming and sure I've delegated tasks to much stronger uni machines a few times in the past but other than the occasional hw problems due to age it still works and is mostly enough...
This is precisely the motivation behind the renaissance of the "simple / command-line-driven software movement" that you see sprouting up from the corners of the Arch Linux users, Slackware, and elsewhere.
I used to carry around a laptop with 128MB RAM. In 2010-2012, I had 1GB and was happy with it, but I had to adapt:
- Try a tiling window manager. You'd be surprised how many resources drawing those window decorations takes; dropping them not only saves RAM but also helps you focus better.
- Try a lightweight browser like LuaKit, surf, etc. Most of them are WebKit-based anyway, so there aren't many problems. Plus, a keyboard-driven web browser is a hoot.
- Try using emacs/vim instead of Word or OpenOffice. Thanks to LaTeX, I haven't touched a huge office suite in ages.
- Instead of having lots of programs open in the background, why not try some command-line equivalents? If you're not on Windows, you have access to a world of utilities: music players, todo lists, file managers, etc.
Now I'm on an 8GB machine, and even under heavy load, I don't think I've ever come close to reaching that limit in the six months I've owned this box. (I don't have any swap space either)
Completely disagree. We've seen too much complacency in recent years when it comes to memory usage. Developers think it's ok for their applications to bloat out and have a huge memory footprint. I'd rather applications were developed in such a way that they were more memory efficient.
I had a recent incident in which Chrome was consuming about 7GB of memory, just because I had about a hundred tabs open! It was really, really insane. This reckless disregard for memory usage caused real performance problems, and it wasn't apparent what was happening until things started OOM'ing. (Swapping is really just as bad as an OOM -- the computer's practically useless if it's thrashing due to low memory.)
Your ideal situation of having more memory used will, in many cases, lead to degraded performance.
There's no way you could effectively work with 100 tabs open, and that is surely a smell of inefficiency. For any given task, 5 tabs is probably close to what's actually _required_, with around 10 being the upper limit.
Granted, you could have multiple tasks ongoing in parallel, but are you really working in parallel at that point? If not, save the tabs and return to them later. I really doubt you are visiting all 100 tabs on a regular basis so as to merit they stay open at all times.
I've seen way too many people use their browser as a sort of todo app, by virtue of "I'll open a tab for this so I don't forget to do it later" -- hell, I do this too -- but that's not at all the intended usage of a browser, and so naturally, it sucks at it.
What can I say? It worked for me. I was doing a lot of research and things I'd read would lead to more searches of tangential topics that I'd want to revisit in the coming hours. This method of working fit my mental model and I was flitting between tabs as necessary, not just opening them up with the hope that I'd read them sometime.
Also, if I need to click on a whole bunch of links that I know I'm immediately going to evaluate, it's faster and easier to click on 50 links in one go than to click, evaluate, close, find my old position, click, evaluate ...! I'm someone that likes to batch tasks because I find it more efficient.
Opening that many tabs isn't something I do every day. I've only got seven open right now, which is normal-ish, but sometimes I need it and I know I'll need it again soon.
(On a side note, I can't really figure out why, but I hate the term "smell" -- it triggers a visceral reaction, and I wish it'd die a horrible death!)
You don't open 100 apps on your computer and then complain about memory usage. Running 100 tabs at the same time, all of which may be running live JS processes, is about the same. It's a perfectly valid usage pattern but it's also going to eat a lot of memory. It's not like driving with your parking brake on, but it's like loading your car up with its maximum passenger capacity and then complaining that it's accelerating more slowly.
It's possible, but there are always competing goals - speed, security, reliability, features, memory usage. When capacity on one is growing quickly (and cheaply), it's not surprising it doesn't get the same priority as some of the others. In particular, I'm not sure how many extra users you'd get by reducing memory footprint versus adding features.
But this has been discussed ad nauseam, and it's unproductive to keep complaining about it. The fact should be obvious, at least by now, that sandboxing uses more resources. If you're fine with that, great! If not, do something to fix it or use a different browser.
Originally you explained the problem as if you always had 100+ tabs open, but now it sounds like an occasional occurrence. I was mainly speaking to if you always had 100+ open -- as if that was your normal workflow.
I do understand at times you want to open many more than that for a limited time, but those situations (at least in my experience) are typically very short-lived and don't represent my ordinary common usage pattern.
[and yes, smell is supposed to invoke that reaction! a code "smell", or process "smell", etc. is something that stinks, and it should disgust you, and you should work to clean up the 'smell' ASAP! :)]
> Originally you explained the problem as if you always had 100+ tabs open, but now it sounds like an occasional occurrence. I was mainly speaking to if you always had 100+ open -- as if that was your normal workflow.
... because if you do it only occasionally it uses less memory? :-/
Tabs have replaced bookmarks as the means of saving stuff for later. Why is that? Because bookmarks require a lot more mental effort to organize and take more effort to delete when you're done with them. Bookmarks also take a lot longer to load than a tab (which is usually loaded already).
Very good point, and it touches on a potential piece I'd like to write sometime. It used to be that bookmarks were like a collection of good books -- easily filed away, organized, and still as good when you opened them a year later as when you last touched them.
But not anymore. They change. There are tons of them. And organization sucks -- FF isn't as bad as Chrome and IE, but the general paradigm of shortcuts needs to be re-evaluated. I don't like web-based services (for many reasons), and I've yet to find a fitting extension that works for me.
So yeah, I can see how this usage pattern has arisen, but I think the answer is ultimately up to innovation that has sadly yet to occur.
I agree. I regularly have hundreds of tabs open in Chrome, and while I do regularly save things to Chrome bookmarks, I treat it more as a place where I can put something that I won't look at again but there's a 1% chance I'll desperately need to find it at some point in the future and won't be able to find it by Googling.
Also interesting that Chrome never added an option to sync bookmarks to the Google Bookmarks web service.
If the majority of people are using a tool in a particular way, it becomes the tool's responsibility to meet that need effectively. Just because you didn't anticipate the ways that real people would use your technology doesn't mean you get to shrug off responsibility for its shortcomings because you didn't foresee them.
I agree it sounds a bit excessive (though I go up to 50 tabs myself often enough), but how is telling him that his way of using tabs is inefficient in any way relevant to a discussion about browsers' memory usage??
I hope you didn't really try to say "there's nothing wrong with browsers' memory usage, as long as you don't stress them too much" ? :) :)
> There's no way you could effectively work with 100 tabs open
I'm sure there's no way you could effectively work that way, but imagining that what's true for you is true for everyone is a massive smell of ignorance of other workflow styles and cognitive abilities.
Once the favicons disappear, though, I usually open a new window, or group windows by subject matter. I've tried to alter my browsing habits over the years to ease the load on my computer, but I've just come to accept that I am one of those people.
This tool is either the best thing that's ever happened, or it's about to exacerbate my problem 10 fold.
Good point, we'll have to see. I have the same problem as the GP (though up to 50 tabs usually), and indeed "bookmark all" + "close all" doesn't really help: if you know you'll never review those bookmarks, you tend not to take that step in the first place.
As someone else itt already said, a better solution would be an improved bookmarking system that I can use to just file away a tab, knowing that it'll be there when I need it, as well as offering the option of, I dunno, popping up for review in a couple of days or at a specific time maybe.
One thing I do not understand is why modern browsers, apart from bookmark folders, do not offer tags as well. They were a hugely successful way of ordering things back in the day, and they have all but disappeared for some reason. And they would be tremendously useful in bookmarks (to me). And del.icio.us is dead, plus I don't want a web service to keep my bookmarks; it's too slow.
I'm not sure if I understand your comment. Firefox does allow you to organize bookmarks by tag, and it even lets you use the tag collection as a bookmark folder (meaning that you can add the tag to the bookmark bar and it behaves like a normal bookmark folder.) You can even add different links to different bookmark folders and give them the same tag, so that they appear under the original bookmark folder and under the same tag category, or tag folder if you drag the tag to the bookmark bar. If you decide to remove the tag folder from the bookmark bar, the tag will still exist.
Well, if you have 100 tabs open, you might be using the wrong browser. Chrome is obviously not designed to work with that many tabs, either from a UI standpoint or, apparently, a memory use standpoint.
However, I don't really see why there needs to be a strict limit on the number of tabs you can have open before your browser starts trashing all your memory. This is a problem that can be solved by designing your software to deal with that use scenario, either by pausing unused pages and serializing the resources to disk or by disposing of the unused pages entirely without removing the tabs. You'd want a pref for this, but it seems entirely reasonable to me.
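The "pause unused pages and serialize them to disk" idea could look roughly like this -- a toy sketch, not Chrome's or Firefox's actual mechanism, with all names invented: keep only the N most recently used tabs live, and serialize the rest.

```python
# Hypothetical tab-suspension sketch: an LRU set of live tabs, with
# everything else serialized out (here via pickle, standing in for
# "suspended to disk"). Purely illustrative.
import pickle
from collections import OrderedDict

class TabManager:
    def __init__(self, max_live=3):
        self.max_live = max_live
        self.live = OrderedDict()   # tab id -> page state, in LRU order
        self.suspended = {}         # tab id -> serialized state "on disk"

    def touch(self, tab_id, state=None):
        # Activate a tab, restoring it from its serialized form if it
        # was suspended, then evict the least recently used live tab
        # once we're over the limit.
        if tab_id in self.suspended:
            state = pickle.loads(self.suspended.pop(tab_id))
        elif tab_id in self.live:
            state = self.live.pop(tab_id)
        self.live[tab_id] = state
        while len(self.live) > self.max_live:
            victim, vstate = self.live.popitem(last=False)
            self.suspended[victim] = pickle.dumps(vstate)

mgr = TabManager(max_live=2)
mgr.touch("hn", {"url": "news.ycombinator.com"})
mgr.touch("docs", {"url": "docs.python.org"})
mgr.touch("mail", {"url": "mail.example"})
print(list(mgr.live), list(mgr.suspended))   # "hn" has been suspended
```

The tab strip stays intact either way; only the memory cost of a tab changes depending on whether it's in the live set.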
There isn't a strict limit; it's just not a viable use case for current browsers.
Maybe what we need is for Chrome/FF to treat a very high tab count as a supported use case and handle it better. When you have so many tabs that it's not easy to find the 1 in 100 you're looking for, we'd need a new find tool; possibly the browser should build a client-side tf-idf index and make the whole working set (workspace) searchable. You see what I mean? Until the browser changes to support this use case, the usage will be clunky and fraught with problems.
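A client-side tf-idf index over open tabs could be sketched like this -- a toy over tab titles only, where a real browser would index page text too; all the titles and scoring details are illustrative.

```python
# Toy tf-idf search over open tabs (titles only, illustrative).
import math
from collections import Counter

def build_index(tabs):
    # tabs: tab id -> title. Returns per-tab term frequencies plus
    # inverse document frequencies for scoring.
    tf = {tid: Counter(title.lower().split()) for tid, title in tabs.items()}
    df = Counter(term for terms in tf.values() for term in set(terms))
    idf = {t: math.log(len(tabs) / df[t]) for t in df}
    return tf, idf

def search(query, tf, idf):
    # Score each tab by summed tf-idf of the query terms, best first.
    words = query.lower().split()
    scores = {
        tid: sum(counts[w] * idf.get(w, 0.0) for w in words)
        for tid, counts in tf.items()
    }
    return sorted((tid for tid in scores if scores[tid] > 0),
                  key=lambda tid: -scores[tid])

tabs = {
    1: "Chrome memory usage discussion",
    2: "LuaJIT performance benchmarks",
    3: "Reducing browser memory footprint",
}
tf, idf = build_index(tabs)
print(search("memory", tf, idf))   # tabs 1 and 3 match
```

Even something this crude beats eyeballing 100 squeezed tab stubs, which is the point: the problem is tooling, not the tab count itself.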
The Firefox tab bar is naturally more usable than the Chrome tab bar with large numbers of tabs because it scrolls rather than trying to fit them all in the bar. (I sometimes wonder whether squeezing tabs to the point of absurdity was actually a conscious decision by the Chrome developers to limit the number of tabs people can open and thus memory use.) There are a lot of Firefox extensions that add additional tools to manage large numbers of tabs, of which Tree Style Tabs is perhaps the most popular. There are not as many Chrome extensions, since Chrome's UI is not as extensible.
I agree. I'm a tab hoarder with hundreds of tabs open in Chrome, where every tab stays visible and I remember where things are based on their place among the others. I never do that with Firefox's scrolling tabs: Firefox putting a min-width on each tab and hiding the rest makes it impossible to find where that other page I looked at 5 minutes ago went.
How do you suggest to handle the fact that there are many websites which break the "back" button these days? I've gotten to the point where I open a new tab practically every time I follow a link. Not to mention news sites where you need to open all of the interesting articles that you see before the page updates and they go away.
Tabs are not a replacement for bookmarks. The idea is not to open an ever-increasing number of tabs, but rather to bookmark and file things into folders if you need 100 concurrent research tabs. It's just the wrong way to use the browser.
It would be like me arguing that driving my car with the parking brake on makes it get hot, of course it does, it's user error.
Well, no. If you've got all those tabs open and you're flitting between their contents, it's appropriate. It's not the wrong way to use the tool, it's a way to use it that's perfectly appropriate in some cases.
There's no reason I should be fighting against my browser because my optimal workflow causes it to chew through memory!
No. When the number of open tabs is 10-20, yes, sure, but when that number reaches 100, you're doing it all wrong. It's not about how much memory it consumes -- that's irrelevant; if you need more memory, just go to crucial.com, it's dirt cheap. It's about using the tools properly.
I can't argue the position for this workflow better than I already have. I refute the idea that I'm "doing it all wrong". No, I'm not and this was the only efficient way to get what I needed to do done.
My laptop's motherboard can't handle more than 8GB, so I don't even have the option of putting more RAM in it as it already has 8GB. Also, I shouldn't have to keep throwing in memory to handle applications whose coders feel it is fine and dandy to gobble memory with reckless abandon. Why should I bear the financial cost of a poorly designed application?
I'm most certainly using my tools properly whether you agree or not. I'm not bitching about 5,000 tabs causing excessive memory usage, I'm complaining about 100 which is not that many during research that requires going from page to page, or opening lots of links without having to remember where I was at in each part of my stack; I've got less working memory than my computer!
It wasn't the only efficient way to get what you were doing done. It's ridiculous to say that a hundred tabs is a reasonable use case. The very fact that you ran out of memory tells you that. If you really need 100 concurrent tabs (which you don't), then you need a different computer for your research. However, you're using the browser incorrectly. Why not save the text you want to a document, or put it into a spreadsheet?
10 tabs yes, 100 tabs no!! Stop arguing -- even your computer's telling you the same thing :) You're just being belligerent and acting like a spoiled child. Your browser can't do it, so maybe you should find a different way, as I've suggested.
I also regularly have over 100 tabs open and I find your suggestion that my workflow is wrong is insulting.
Maybe you don't need to have 100 tabs open.
Maybe you lack the imagination to see scenarios where having that many tabs open makes sense.
It is also possible that you are aware of some great/better way to solve similar scenarios than me, but I guess you are keeping it a secret, since I regularly evaluate new plugins and extensions, and also search Google for tips and tricks on how to do it better. (And to date I haven't found a way.)
Sure, I could work with fewer tabs. Hell, everything could be done with only 1 tab in 1 window (and I used to in the 90s), but it's not as efficient as what I do now.
"Maybe you lack imagination" - maybe you lack computer skills, it certainly seems so. There are plenty of ways you could avoid having 100 tabs open. However, if you feel like that's something you need then stop whining and buy a machine with enough RAM for your usual use case.
The ideal number of tabs roughly corresponds to the human stack size, plus a couple for ongoing apps like Gmail. If you need more than that, you should consider a better mechanism than the browser. Since you haven't found any, I posit that it's you who lacks imagination. If you found my comment insulting, then I am sorry.
Whatever the bright-line test for "a lot" is, there's no compelling reason each tab should be consuming about 70MB. Under Chrome, each tab runs in its own process, which is certainly a contributing factor. I appreciate this is for stability and security, but it's not the best solution to the problem and has some serious drawbacks, i.e. horrible memory usage.
I end up with quite a few tabs open myself. I find if I bookmark a tab, add it to Pocket, or copy the URL into my notes that I'm far more likely to go back and actually read it. Before I would keep tabs open for weeks "just in case".
I can relate -- I do the same, keeping tabs open so I can read them later. Most of them end up unread weeks later. I've never used Pocket. Presumably it's similar to Read It Later or Instapaper? Perhaps that is indeed a better approach.
I don't see how it's a developers' fault that you opened 100 tabs in Chrome. 70MB per tab is really not that much, I don't think browser memory usage has actually gone up by much compared to other applications. I think the fact that it was usable up to that point is a testament to its design, though I personally think Firefox handles large numbers of tabs better overall (UI wise).
In theory you'd be correct. But some motherboards, even when they're not operating under the constraints of 32 bit addressing, can't handle more than a certain amount of memory. In my case, my laptop's motherboard can only handle a maximum of 8GB.
I agree with you, but 7GB for 100 tabs sounds like something unusual happened.
I usually have around 200-350 tabs open (yes, I'm hoarding, I'm aware of it). I usually terminate their processes and respawn them when I need them (their state is not lost), but IIRC, when I don't, I get around 3GB for 200 tabs.
It is not about money, it is about sloppy programming. Sure you can use more memory, but not at the cost of disrupting everything else on the system. For example most of the netbooks / ultrabooks or whatever you call them (including macbook air) used to come with 4GB at most. Upgrading them is a pain, and having a memory hog is definitely an issue in those cases.
Memory is cheap in the first world. I live in Argentina, which is not a particularly poor or undeveloped country, and my 8GB of RAM are almost impressive. I know of no one who has 16GB, and a few of my friends have 8. Most people I know of have between 1GB and 4GB, with 2GB being by far the most common configuration.
I live in 'the first world' and I can't afford more RAM. $50 is far too much for me to spend at the moment. (A need for more RAM is trumped by my powerful urge to eat.) Not to mention working at a corporate job where the computers are older than my children and every ounce of saved RAM is a blessing.
Most people in the States have whatever their computer came with. Even computers with user-serviceable RAM can usually only address twice as much RAM as the machine came with. So, if last generation's hardware comes with nMB RAM, and this generation's comes with 2nMB RAM, owners of last generation's hardware can usually only upgrade their RAM once.
It's a bit silly, because RAM and storage seem to be the only things most people need more of on a daily basis. I suspect 5 year old CPUs would meet most people's needs just fine.
Remember not everyone can afford the latest beast of a machine that can accommodate 16GB RAM, some people still have laptops that can hold a maximum of 4GB by design limitation; it's more likely to be those people worrying about Chrome's RAM usage.
I suspect the power penalty of RAM is made up for by the fact that you don't have to hit the hard disk to get that data back (firing up the HDD will suck a lot of juice). It also appears power draw is pretty much tied to the chip, so a 4GB and an 8GB module use about the same amount of power.
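The tradeoff above can be made concrete with a back-of-envelope sketch. The wattages below are rough assumptions for illustration only, not measured figures:

```python
# Rough energy comparison: keeping data cached in RAM vs. re-reading
# it from a spun-up hard disk. Both wattages are assumed ballpark
# values, not measurements.

RAM_EXTRA_W = 1.5     # assumed extra standing draw of the cached data's RAM
HDD_ACTIVE_W = 8.0    # assumed draw while the disk spins up and seeks

def cache_hit_energy_j(seconds):
    """Data already in RAM: only the standing RAM cost over the interval."""
    return RAM_EXTRA_W * seconds

def cache_miss_energy_j(seconds, disk_busy_s):
    """Cold read: the RAM cost plus the disk's active burst."""
    return RAM_EXTRA_W * seconds + HDD_ACTIVE_W * disk_busy_s
```

Under these assumed numbers, even a two-second disk burst costs more energy than ten seconds of keeping the data cached, which is the commenter's intuition in miniature.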
And buying a decent new machine to put that 16GB in costs some more. And migrating all your stuff over to that new machine takes time.
And then everyone will continue to write code without regard to memory usage, and we'll be complaining that Chrome 45 locks up 19GB per tab, and people with your mindset will say it doesn't matter because 1TB RAM chips are cheap enough at $400 to just upgrade.
Just having loads of stuff using RAM takes more time to manage. I would not have imagined in 2013, with 16GB of RAM and a >2GHz multi-core processor, that I would still get system hangs and beach balls in my OS, but I do.
My UI lets me press b to do a substring search on the url/title, which makes managing 100+ tabs trivial. And I often need to have a lot of tickets/revisions/forum threads/stack overflow questions open in tabs at the same time for my work.
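The substring search described above is simple to sketch. This is a hedged illustration of the idea, not the actual browser feature; the tab structure and function name are assumptions:

```python
# Sketch of a "press b, type a substring" tab switcher: filter open
# tabs by a case-insensitive substring match on URL or title.
# Illustrative only, not a real browser API.

def match_tabs(tabs, query):
    """Return tabs whose URL or title contains the query, case-insensitively."""
    q = query.lower()
    return [t for t in tabs
            if q in t["url"].lower() or q in t["title"].lower()]

tabs = [
    {"url": "https://news.ycombinator.com/item?id=1", "title": "Chrome memory"},
    {"url": "https://stackoverflow.com/q/42",         "title": "Java heap size"},
    {"url": "https://bugs.example.com/ticket/7",      "title": "Ticket 7: leak"},
]
hits = match_tabs(tabs, "stack")  # matches the Stack Overflow tab by URL
```

With an index like this, a few keystrokes narrow hundreds of tabs down to a handful, which is why the commenter finds 100+ tabs trivial to manage.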
Don't project your UI limitations onto other people please :)
I have a recently purchased Mac Mini which came with 4GB of RAM. Whenever I happen to access a certain Flash-heavy page (http://www.lequipe.fr/direct_foot.html, for the curious), Firefox just instantly eats up all the available memory, everything freezes, and I have to manually restart the machine. (I know the article is actually about Chrome, but I'm trying to also make the point that memory usage counts.)
When your primary workstation is a laptop, which won't accept more than 8GB of RAM, and you have to use certain Java based desktop apps, everywhere you can save memory is useful.
I agree it should not be an issue, but when the usual short-sighted budget issues are the primary problem, one has to use other tricks:
"You mean the 3 year old machine you work on for 12 hours a day is no longer good enough? Sorry, that does not fit the depreciation model. Also, we spent all the budget on new MacBook Airs for Sales, because they really needed new web browsers."
Software should be optimized for the most common use-case -- not the outliers. Most people usually buy computers in standard configurations (which typically means 4GB to 8GB of memory) or use company provided equipment and they aren't going to be happy with being told "just buy another 8GB of memory -- it's only $120" as a solution to the slowness of their browsing!
I agree with you that everything should be in memory, but I wouldn't call saving memory pointless. The cost of RAM isn't the issue here... we all know RAM is cheap! However, the problem in my case is that my laptop caps at 16GB. I'm also using a RAM drive and no page file (for obvious reasons), and all temp/cache files are also in memory. I'm working on an app that does some heavy lifting in the browser. At times I have tons of reference material open: examples for third-party controls, sites with Flash or controls leaking memory, etc. Memory usage in Chrome was ~8GB.
I could certainly benefit from reduced memory usage while still being able to go back to all those tabs. My overall system usage was 15GB, and with any more stuff I risk losing everything (since I'm not using a page file).
This might be a very specific use-case, but I do see a lot of value in the extension. (Now, only if I could further hack my machine to use 32 GB :>)
First world perspective. Any globally-oriented FOSS app like Chromium or Firefox should work to reduce hardware requirements as much as possible to provide the best possible performance on devices like the Raspberry Pi and MIT's $100 laptop.
And even in the first world there are a lot of miserly corporations that penny pinch on things like RAM.
If software that wants to target either of those markets can be made more hardware efficient, it should be.
It's hard to both target the leading edge of the market and also serve minimal-spec hardware targets. Seems like it would make more sense to have a different branch for those targets, being done by someone with a strong business interest in making it work.
The price of RAM is not the only (or main) barrier to getting more memory. My previous work desktop had 2GB of RAM. Getting it upgraded (from 1GB) took three months (after the upgrade had been approved by the budget holder).
I was fortunate to get a new work PC recently which has 4GB of RAM. Getting it upgraded (even if I bought the memory myself) would likely take another three months of requests and I suspect would be limited to a maximum of 8GB anyway.
This is the logic of someone willing to adapt to the environment around him. While the best solution would be more efficient programming, consumers are best served by maxing out their RAM and working with what they're given.