My desktop has 24GB of RAM, yet Firefox crashes on a regular basis when it bounces off its 2GB limit, usually when many image-heavy pages are opened, especially simultaneously from a folder of bookmarks. The single simplest fix seems to be offering a 64-bit binary as an option, given that most of the work appears to be done already.
The alternative is to break up the browser into multiple processes, much like Chrome. That would, however, reduce the distinctiveness of Firefox and be one less reason to keep using it. Though the reliability benefit of multiple processes would probably win out in the end.
But sooner or later - sooner if various GL things ever get proper uptake - individual web pages are going to need more than 2GB to display and run. The bullet will need to be bitten, even in a multi-process model.
Firefox.exe is indeed large address aware (I just checked), but I had never seen it use more than 2GB of address space. It usually hovers around 700M of private memory and 1.5G of address space.
However, I just opened a few image-intensive pages and monitored the process; address space climbed to 2.4G or so. It's been a while (perhaps 6 weeks) since I last saw a crash; maybe it's been fixed recently.
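For anyone who wants to check the large-address-aware flag themselves without Visual Studio's dumpbin: the bit lives in the Characteristics field of the PE file's COFF header. A minimal sketch in stdlib Python (the function name is my own; the offsets and the 0x0020 flag value are from the PE/COFF format):

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # PE/COFF Characteristics flag

def is_large_address_aware(path):
    """Read a PE executable's COFF header and test the LAA bit."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":
            raise ValueError("not a PE file (missing MZ signature)")
        f.seek(0x3C)                        # e_lfanew: offset of the PE header
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":
            raise ValueError("not a PE file (missing PE signature)")
        coff = f.read(20)                   # 20-byte COFF file header
        characteristics = struct.unpack_from("<H", coff, 18)[0]
        return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

Point it at your installation, e.g. `is_large_address_aware(r"C:\Program Files\Mozilla Firefox\firefox.exe")`.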
I run into the 2G limit very frequently too. I usually shut down my home and work systems only once a week, and Firefox likewise runs continuously from the first day of the week.
Over the course of a week, as I open more webpages, the virtual memory consumption rises continuously until it hits the 2G limit and the process dies. For the last few hundred MB before the limit, it's usually pretty unresponsive.
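That steady climb is easy to watch for yourself. A minimal sketch, Linux-only (it polls VmSize from /proc; on Win32 you'd watch the same counter in Process Explorer or perfmon instead), with the function names and polling interval being my own choices:

```python
import os
import time

def vm_size_kb(pid):
    """Return the process's virtual size (VmSize) in kB, read from /proc."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmSize:"):
                return int(line.split()[1])  # value is reported in kB
    raise ValueError(f"no VmSize for pid {pid}")

def watch(pid, interval=60):
    # Print a timestamped reading once a minute; Ctrl-C to stop.
    while True:
        print(f"{time.strftime('%H:%M:%S')}  VmSize: {vm_size_kb(pid)} kB")
        time.sleep(interval)
```

Run `watch(<firefox pid>)` in a terminal and you can see how close to the 2G ceiling a browsing session gets before things go bad.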
I'd be glad if, as a user (not a developer), I could set that option somewhere.
P.S. I am a developer, but over time you learn that building unfamiliar codebases is not a trivial exercise. On Win32, I'm just happier downloading vendor-provided binaries. On Linux, I can be both happy and frustrated, building my own binaries with Gentoo.