The alternative is to break the browser up into multiple processes, much like Chrome. That would, however, reduce Firefox's distinctiveness and remove one reason to keep using it. Still, the reliability gains of a multi-process model probably outweigh that in the end.
But sooner or later - sooner if WebGL and the like ever get proper uptake - individual web pages are going to need more than 2GB to display and run. That bullet will need to be bitten even in a multi-process model.
I'm impressed that you've gotten a browser that big on normal pages. I've only gotten it on ridiculously huge test pages or from memory leaking over time.
However, I just opened a few image-intensive pages and monitored the process, and saw its address space climb to about 2.4GB. It's been a while (perhaps six weeks) since I last saw a crash; maybe it's been fixed recently.
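For anyone who wants to watch the same number: Task Manager's default view shows working set, which understates this; what actually hits the 2GB wall is total reserved-plus-committed address space. Here's a rough C sketch (my own, not anything Firefox ships) that tallies it with VirtualQueryEx; pass Firefox's PID, and compile it as 32-bit so the walk covers the same range a 32-bit target sees:

    #include <stdio.h>
    #include <stdlib.h>
    #include <windows.h>

    /* Sum the reserved and committed regions of another process's
       address space.  This total, not the working set, is what runs
       into the 2GB (or ~3-4GB with large-address-aware) wall. */
    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <pid>\n", argv[0]);
            return 1;
        }
        DWORD pid = (DWORD)strtoul(argv[1], NULL, 10);
        HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
        if (!h) {
            fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
            return 1;
        }

        MEMORY_BASIC_INFORMATION mbi;
        unsigned char *addr = 0;
        SIZE_T used = 0;
        /* VirtualQueryEx fails once addr passes the highest user
           address, which ends the walk. */
        while (VirtualQueryEx(h, addr, &mbi, sizeof mbi) == sizeof mbi) {
            if (mbi.State != MEM_FREE)        /* reserved or committed */
                used += mbi.RegionSize;
            addr = (unsigned char *)mbi.BaseAddress + mbi.RegionSize;
        }
        printf("address space in use: %lu MB\n",
               (unsigned long)(used >> 20));
        CloseHandle(h);
        return 0;
    }

The distinction matters because address space can be exhausted by fragmentation long before physical memory runs out, which is why these crashes can look like out-of-memory on a machine with plenty of RAM free.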
Over the course of a week, as I open more web pages, virtual memory consumption rises steadily until it hits the 2GB limit, at which point the process kills itself. It usually becomes pretty unresponsive during the last few hundred MB before the limit.
I believe you're referring to the linker option for large address awareness.
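Specifically, with MSVC it's the /LARGEADDRESSAWARE linker flag, which sets a bit in the PE file header's Characteristics field; Microsoft's editbin tool can flip the same bit on an already-built binary. For the curious, here's a rough sketch that reports whether an exe has it set (assumes a well-formed PE; error checking mostly trimmed):

    #include <stdio.h>
    #include <windows.h>

    /* Report whether IMAGE_FILE_LARGE_ADDRESS_AWARE is set in a PE
       file -- the bit the /LARGEADDRESSAWARE linker flag controls. */
    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <file.exe>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        IMAGE_DOS_HEADER dos;
        DWORD sig;
        IMAGE_FILE_HEADER hdr;

        fread(&dos, sizeof dos, 1, f);
        if (dos.e_magic != IMAGE_DOS_SIGNATURE) {
            fprintf(stderr, "not an MZ executable\n");
            return 1;
        }
        fseek(f, dos.e_lfanew, SEEK_SET);  /* e_lfanew -> PE signature */
        fread(&sig, sizeof sig, 1, f);
        fread(&hdr, sizeof hdr, 1, f);
        if (sig != IMAGE_NT_SIGNATURE) {
            fprintf(stderr, "no PE signature\n");
            return 1;
        }
        printf("%s is%s large address aware\n", argv[1],
               (hdr.Characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
                   ? "" : " not");
        fclose(f);
        return 0;
    }

With the flag set, a 32-bit process gets up to 3GB of address space on 32-bit Windows booted with /3GB, and a full 4GB on 64-bit Windows.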
I'd be glad if, as a user (not a developer), I could set that option somewhere.
P.S. I am a developer, but over time you learn that building unfamiliar codebases is not a trivial exercise. On Win32, I'm just happier downloading vendor-provided binaries. On Linux, I can be both happy and frustrated building my own binaries with Gentoo.