
It seems many comments missed the point. The article is not about how bloated modern software is, how many useless features and programs are wasting CPU cycles on pointless jobs, etc. (Yes, modern software is bloated; for that reason I use the MATE desktop on a minimal Gentoo installation. But this is not what the article is about.)

It describes how a web browser, a piece of software with extremely high inherent complexity, interacting with the memory allocator of the operating system, another piece of software with high inherent complexity, combined with a rarely used Gmail feature, can trigger complex and complicated interactions and cause major problems due to hidden bugs in various places. This type of apparently "simple" lockup requires "the most qualified people to diagnose".

These problematic interactions cannot be avoided by running fewer "gadgets" on the desktop environment; they can be triggered and cause lockups even on a system that is otherwise performing well. Installing a Linux desktop doesn't solve this type of problem (even though this specific bug doesn't exist there).

The questions worth discussing are: why and how does this happen? How can we make these problems easier to diagnose? What kind of programming language design can help? What kind of operating system or browser architecture can help? How can we manage complexity and the problems that come with it, and what are its implications for software engineering and parallel programming? Etc.

From another perspective, bloated software is also an on-topic question worth talking about. But instead of the talking points of "useless programs wasting CPU cycles" or "install minimal Debian", we can ask questions like "does _ALL_ modern software (browsers, OSes) have to be as complex as this?", "what road has led us to this complexity?", "what encouraged people to make such decisions?", "can we sometimes return to a simpler software design?" (e.g. a vending machine near my home, trivially implementable with BusyBox or even a microcontroller, now comes with a full Windows 7 or Ubuntu desktop! Even advertising screens run Windows 8, and sometimes BSoD, despite all they need to do is show a picture. The same goes for modern personal computers.), or even "was Web 2.0 a mistake?" (and yet here we are on Hacker News, one of the fastest websites in the world!). These topics are also interesting to talk about.

I get what you're saying, but it seems you already have a place you want to go and are using the article to get there -- much like the other commenters.

While these things are important, to me the critical phrase in the article is this: ...It seems highly improbable that, as one of the most qualified people to diagnose this bug, I was the first to notice it...

My system hangs when I'm typing text all the time. Reading this article, this tells me that 1) it probably hangs for tens of millions of other people, and 2) nobody has the time or money to do anything about it.

That sucks. Additionally, it appears to be a situation that's only gotten worse over time (for whatever reason).

You can look for potential answers, as you point out. More important, however, is the fact that nobody is aware of the scope of these problems. Millions of hours lost, the situation getting worse, and there's nobody hearing the users scream and nobody (directly) responsible for fixing things. In my mind, figure those things out first, and then we can start talking about specific patterns of behavior that might limit such problems in the future.

tl;dr Who's responsible for fixing this and how would they ever know it needs fixing? Gotta have that in place before anything else.

We've all been trained by web apps that stuttering, jaggy rendering, and hangs are normal and expected. I've railed against web apps for a decade now, mostly to deaf ears. They are so broken, unperformant, and unreliable that in an earlier time they would never have been released with problems like those.

But now desktop apps have the same issues. And it's not going back to where we were. So I guess we'll get used to it.

Jittery rendering, hangs, and high-latency are deal breakers in Virtual Reality. I think things will get better as a matter of necessity for VR and AR, and then "veterans" from the field will bring back "new ideas" about performance and user experience.

Or, if you prefer your optimism a bit more dystopian-flavored, some megacorp will come around with a walled garden whose user experience is just so good that users will flock to use it, and the rest of the industry will have to adapt to compete.

In either case, I don't think getting used to it is our only choice :)

Except in VR's case, they just ask you to purchase a $1000 kit, and if you experience stuttering they just shrug and suggest upgrades.

> a situation that's only gotten worse over time (for whatever reason)

The answer is ever-increasing complexity, no?

Complexity is always the enemy, but only if you have to deal with it. My car engine is very complex and I don't ever think about it.

I don't want to over-state this, but it's a hell of a lot more important than people think, mostly because it attacks you in little bits here and there. It's never a direct assault. We are creating a consumer society in which we're becoming slaves to the tech. It entertains us, it connects us, it remembers for us, it guides us. All of that might be fine if that's your thing. But there are dark sides too. These kinds of hidden bullshit annoyances are one of the dark sides.

The root of the darkness is this: if you steal just a few minutes per day here or there with hung-up text editors and such, how many man-years of productivity are you stealing from mankind?
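As a back-of-the-envelope sketch of that question (every input below is an assumed number chosen purely for illustration, not a measurement):

```python
# Rough estimate of productivity lost to UI hangs at scale.
# All inputs are assumptions for illustration only.
minutes_per_day = 5              # assumed minutes lost per user per day
users = 1_000_000_000            # assumed number of affected users
days_per_year = 365

minutes_lost = minutes_per_day * users * days_per_year
work_year_minutes = 2000 * 60    # ~2000 working hours per person-year
person_years = minutes_lost // work_year_minutes

print(f"{person_years:,} person-years lost per year")
```

With those assumed inputs, it comes out to roughly 15 million person-years of working time lost each year. The point isn't the exact figure; it's that a few stolen minutes multiplied by a huge user base is never small.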

I really think we need to go back to the metal and start designing systems with separate fault-tolerant systems dedicated to being humane to the users by invisibly handling the kinds of things that keep wasting huge parts of our collective lives.

Or, as you said, we could just keep adding complexity. That's always the answer, right? sigh

Complexity is always the enemy, but only if you have to deal with it

I think that we got to where we are now exactly because of the addition "but only if you have to deal with it". Software consumes many useless cycles exactly because developers on all layers shift their responsibility to deal with complexity to other layers (either the CPU, or the downstream developer). Sometimes it's because they have to, but most of the time it's simply because there's too much distance between producing and consuming developers.

being humane to the users

"users" all too easily implies end-users only. I'd add that developers also need to be humane to downstream developers (circle of influence and all that), including better API design and better instrumentation hooks. But that latter would be adding complexity :(

> Complexity is always the enemy, but only if you have to deal with it. My car engine is very complex and I don't ever think about it.

That's a bad example. My BMW E90 has a gasoline direct injection system, which was state of the art when the car was made 9 years ago. It is very complex and the parts it is made from are very expensive. The BMW specialist tells me the misfire my car has will therefore cost more than £2500 to fix, and even then they'd be guessing.

It would be better if car engines were simpler, like they used to be before they had to start passing artificial emissions tests that don't measure the impact for the whole life-cycle of the vehicle.

It's actually a very good example. I drive a 1978 GMC pickup. Once a year I drop it off at a place and the guy does maintenance. That and adding gas are all that I do.

When I travel, I get a rental that somebody else worries about.

These things are as complex as we will tolerate. I love new vehicles and have a blast driving them while traveling, but frack if I want to have to update and reboot my car. What kind of living hell is it where everything we touch is complex like this?

This is ridiculous. I have a pretty new car (2015), with GDI, and I sure as hell don't have to "reboot" it. It's been perfectly reliable. The infotainment system does have a few issues, but it has nothing to do with the driveability of the car (the engine, and other critical systems are not tied to it).

Modern cars are FAR more reliable than anything made in 1978, this is a simple fact proven by mountains of data. Cars last far longer than they used to; you can easily go 200k or 300k miles with basic maintenance, and I'm sorry, but despite what you might want to believe, that was just not the case in 1978.

And BMWs are terrible examples; those cars seem to be designed for expensive and necessary maintenance, so they can extract more profit from their owners. Japanese and American cars aren't like this.

A modern car is far safer, far more reliable, far more efficient, far more powerful, and far better for the environment. The added complexity only makes things better, in this example.

My memory may be faulty, but wasn't this one of the design goals of BeOS?

I wonder if there is a chance we could take another try at that.

That's the 80-20 economy, which keeps growing, hand in hand with economies of scale. If tens of millions of users are less than 20 per cent, so be it; nobody cares. What matters is the remaining 80 per cent.

We could ask a lot of questions, but at the end of the day, system complexity increases the likelihood of so-called "system accidents" for any type of system, not just software.

One of the most effective measures to combat such issues is to... reduce the system's complexity. E.g. by not having another VM running on top of the OS just to read and write e-mail.

Since this won't happen any time soon, for various reasons, the only reasonable thing left for most of us to do is grab some popcorn and watch the software development world struggle to contain the mess we made, and fail at it.

It was eye-opening to me when the article mentioned a 2 TiB map being created and something about 128 MiB chunks. I'd just like to smack the person who thought that was a good idea. I can understand thoughts like "but the blocks won't actually be allocated" or some such, but you have to step back and say "WTF are you even doing with anything that large?" Control freak.

And yes, web browsers are becoming an OS on their own. I consider that a failure of the underlying OSes we have. Tabbed browsers are awesome, but they exist because OSes and standard desktops (GUI toolkits) didn't come up with decent ways to handle that. Browsers are also trying to implement fine grained access to resources - because our OSes haven't managed to do that for them yet. Memory management? I have no idea why you'd do that in ANY application software today. Actually there is a reason - people don't trust the OS or think they can do something better, but it ends up creating extra complexity. Complexity is categorically bad and should be avoided unless it's the only way to do something. Remember how X got its own memory and font management? Same thing.

It’s keeping track of 16k blocks of memory that way. If you changed it to 4MB you’d have to track and scan 512k.

We have a GIANT address space to play with. Why not use it?

They’re not actually using 128MB per function.
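The arithmetic behind that tradeoff is easy to sketch (using the 2 TiB reservation and the two chunk sizes mentioned in this thread; this only counts bookkeeping entries, since the reserved addresses aren't actually committed):

```python
# Number of bookkeeping entries needed to track a 2 TiB virtual
# address reservation at two different chunk granularities.
TIB = 2 ** 40
MIB = 2 ** 20

reservation = 2 * TIB

chunks_128mib = reservation // (128 * MIB)
chunks_4mib = reservation // (4 * MIB)

print(chunks_128mib)  # 16384 entries  (the "16k blocks")
print(chunks_4mib)    # 524288 entries (the "512k")
```

Coarser chunks mean 32x fewer entries to track and scan, at the cost of reserving address space in much larger units.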

> The questions worth discussing are

Interesting that you ask how to diagnose and manage the complexity, but not how to avoid it. Do we really need a more or less complete OS+VM (aka web browser) running on top of another OS (Windows etc.) to read e-mails?

Likewise, one could ask if we need all this to read any sort of text - news, articles, etc.

Or even to just play music or stream videos.

Completely agree. I realized this perspective immediately after I made the post.

Original post updated!

Agree. This kind of issue exists in any highly complicated system that needs to take memory security etc. into account. I don't think a lot of people realise that modern browsers are about as complicated as the operating systems they run on, and have about as many countermeasures for security problems as the OS underneath. Managing these systems' resources securely, without having them fight each other in the process, is an incredibly complicated and tedious job.

That being said, I do agree with a lot of people that there is a lot of bloat. But often this bloat is caused by a lack of understanding of the complexity of what they are building. If things like this were more generally known and understood, problems like this memory issue in a Google application would be rarer.

My own solution to my hatred of bloat is to write my own software from scratch. And before I complete that lifetime task, I feel it's unfair to complain at others who spend their entire lifetimes making the programs you use, just because they made them a little too bloated for you, for whatever reason...

As for your questions worth discussing (why/how this happens, and how to make it easier to diagnose), I think part of the answer is more people like the writer of this blog being so kind as to share their findings with us :)

> I don't think a lot of people realise that modern browsers are about as complicated as the operating systems they run on and have about as many countermeasures for security problems as the OS underneath.

There's an argument to be made that maybe 'security' isn't worth as much as the blogosphere thinks it is. Like everything else in life, it's a trade off, and because it is in their best interest the security "experts" do their best to sensationalize and promote paranoia and over-reaction to every little potential problem without regard for the cost, namely inconvenience and slow software.

What was the solution to Meltdown and Spectre again? Oh yeah, make everything slower, on the off chance someone will use a timing attack to slowly exfiltrate some information from memory that might be important. If you're a cloud host, that tradeoff is probably worth it; if you're a desktop user outside of an intelligence organization, it probably isn't, but you'll pay the cost nonetheless. 1% here, 2% there, no big deal, right? But it sure adds up. Do an experiment: install two VMs, one with Windows Server 2016 (or Windows 10) and one with Windows Server 2003 (or Windows XP). The 2003 (XP) VM will be so much more responsive it will freak you out, because you aren't used to it. How much of your life has been wasted waiting for windows to appear and start drawing their contents? What are we getting in exchange?

How many minutes would that highly responsive Windows XP install survive browsing the web before it's rendered useless by tons of malware?

How many 2005 era applications, print drivers, toolbars or screensavers, and whatever else was cool in 2005, can you install before the machine is as responsive as a 300 baud connection?

The XP era was probably peak crapware, with people running IE with 12 added toolbars and everything unusable. Often solved by buying a replacement machine because the old one had gotten so slow.

There's not much XP malware still actively spreading, but you're right: an XP machine at one point was like a public-domain colocated computer, to be abused by whoever.

Computer insecurity is costly and counterproductive. It helps criminals, and occasionally oppressive regimes, walk us backwards, mess up lives, and mess up businesses. I don't think privilege escalation and encryption-key theft should be taken lightly. Abusable things get abused.

There's plenty of crapware today, hell, Microsoft forces some of it on you in the default install. The same rules more or less still apply: it's risky to install crap from random untrusted sources. I still have an XP machine I use all the time at work because it has a real physical serial port for talking to some equipment with. It hasn't been a problem.

As I wrote in https://news.ycombinator.com/item?id=17775303, the industry's strong reluctance to clean up and refactor core tech is a huge cost generator. Eventually we have to accept that "move forward" is not the way.

Money is not an excuse, because browsers/OSes/languages are already HUGE money losers.

And the debugging exercise he went through was insane! I wouldn't have had a clue how to even begin tracking down a performance glitch like that, one that results from an interaction between a very complex program and the OS.

I should have grown out of it by now, but I still dream of a Star Trek future, and I've developed a guideline (mostly in jest) for how I think about systems. It goes something like this: if the starship Enterprise turns out to run Linux, or windows, or the driver I work on, or Electron, or whatever I have in mind, and all the problems we see now show up on the monitors on the bridge, how would I feel about that?

Sometimes it's ok, sometimes it's not. I tend to wish that we could get a lot better at building systems, but that involves a number of difficult problems that people far smarter than I have been thinking about for far longer than I've been alive.

The future in my head doesn't have so many systems developed by accretion, but maybe that's how it has to be (for now).

We're still pretty early in computing history. It makes sense for things to be built by accretion the first time around.

I agree, and really appreciate your post. One thing I can't get over is: what happens when everything is written in Javascript and runs in a browser? Clearly, we are not far from that now, which means the stack looks like this:

    Javascript code
    - - - - - - - - 
    Javascript interpreter
    - - - - - - - - 
    Browser (doing display things if not more)
    - - - - - - - - 
    Operating system
Running an app in a browser is cool, but the complexity is huge. Running a compiled app on the OS removes two layers of complexity from this situation!
