Microsoft Has First Major Impact on Chrome (thurrott.com)
147 points by adrian_mrd on Aug 17, 2019 | 114 comments



Microsoft's biggest Chromium impact is moving input handling off the main thread, which has empirically been a massive success in Edge.

This could give Chrome lower input lag than native GUIs, if that's not already the case.

They are also working on many cool things: OpenXR support, for example. They are improving the Chrome DevTools (and their integration with VS Code), landing many performance-related fixes, and massively modernizing the Windows APIs used by Chrome, e.g. better sensor performance through new Windows 10 APIs, plus the smart use of a Kalman filter for sensor readings (see the sketch below).

They are also helping Chromium support Windows on 64-bit ARM.

A real-time view of what they are doing: https://chromium-review.googlesource.com/q/author:*.microsof...
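Since the Kalman filter mention may sound abstract: the core scalar update is tiny. A toy TypeScript sketch (illustrative only, no process noise; Chromium's actual sensor code is more involved):

    // One measurement-update step of a 1-D Kalman filter.
    // est/errEst: current estimate and its variance.
    // meas/errMeas: new sensor reading and its variance.
    function kalmanStep(est: number, errEst: number, meas: number, errMeas: number) {
      const gain = errEst / (errEst + errMeas); // how much to trust the new reading
      return {
        est: est + gain * (meas - est),   // blend the estimate toward the measurement
        errEst: (1 - gain) * errEst,      // uncertainty shrinks after each update
      };
    }

Feed it a stream of noisy readings and the estimate converges while jitter gets smoothed out.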


> Microsoft's biggest Chromium impact is moving input handling off the main thread, which has empirically been a massive success in Edge.

Is that why Edge sometimes queues up UI actions while pages are loading? Having Stop and Back and whatnot wait to be processed until whatever's stuck gets unstuck isn't exactly what I had in mind.


> Microsoft's biggest Chromium impact is moving input handling off the main thread, which has empirically been a massive success in Edge.

Do you mean this would be their biggest impact? The link you posted below is about Edge 15, not Chromium.

But in Chromium, the scheduler should already handle input off the main thread when possible.

Otherwise, agreed that this article is weirdly focused on a single, somewhat underwhelming change when there have already been a bunch of changes.


> Microsoft's biggest Chromium impact is moving input handling off the main thread, which has empirically been a massive success in Edge.

I'm confused by these statements (not just yours but in the article as well) about how well Edge performs; I have not seen anything like that on the Windows 10 systems I am forced to use (for work). On those systems Edge takes forever to load, forever to open a new tab, and will randomly make the mouse cursor disappear when it's doing something it considers more important than responding to my input.


Neither the Chromium-based nor the non-Chromium-based version of Edge behaves like that. It seems like your work machine has something impeding performance.


The Chromium-based Edge hasn't been released outside Insider/Beta previews yet, so the GP's experience isn't relevant to the current Chromium-based builds.


> It seems like your work machine has something impeding performance.

I thought it might be that, but as it happens we had to get a Windows 10 machine at home for a particular purpose, and Edge on it is also slow (though perhaps not quite as slow as on my work machine). Both are Dell laptops, so it's possible that Dell adds some extra OEM stuff that hurts performance.


I'm on a Lenovo ThinkPad X1 and Edge definitely doesn't feel slow to start here (it's instant), and while pages are maybe a bit slower to load than in Firefox Quantum, I wouldn't call Edge slow. I can't compare against Chrome though; I don't use it because it chews through battery.

I think I have seen that disappearing mouse issue though, and mostly I use Edge when I double-click on a PDF file and forget I don't have a PDF reader installed...


Not slow for me either. It's my second browser besides FF, and it has far better text-to-speech. Performance is hard to compare, since my usage patterns differ widely between the two. But my biggest complaints would be tabs sometimes crash-and-reloading and some websites not rendering (bugs.chromium.org is an example).

I wonder how annoying the switch to Chromium will be :/


FWIW I always thought of Edge as slow but just recently I picked up a Surface Go. It's not a fast computer... but somehow, Edge is pretty fast on this machine.


Every single version of Edge I have used has had inexplicable random pauses and jitter in GUI responsiveness. If they do some "clever" side-thread processing and queueing of inputs, I can imagine where this comes from. It was the number one thing preventing me from giving Edge more than a cursory try.


Chromium-based Edge is still on the Insider channel, so if these are work systems, you've probably only used their own engine (is/was it Trident?).


EdgeHTML + Chakra. Trident was the old IE engine.


Trident was IE's rendering engine.


For my games, both Edge and IE9 have the fastest canvas rendering of all the browsers. Just saying.


Well, your experience is a data point; maybe it was not bug-free. Anyway, I was referring to this: https://www.google.fr/amp/s/blogs.windows.com/msedgedev/2017...


If only Microsoft made it that easy to move input handling off the main thread when writing programs for Windows. MFC was the bane of my existence when I was starting out.


Be careful what you wish for. I have watched people in contexts like these think that threads are magic performance-enhancing pixie dust, and then be unable to reason through the subsequent bugs.


async/await helps with that, but like with any other abstraction, if you use it without understanding how it works and what it actually does, it’s bound to come back and bite you at some point.


How exactly?

The use case for a perf improvement with async/await is IO-bound work, e.g. instead of burning a thread waiting for IO to complete, use IO completion ports (on Windows), blah blah.

If your work is CPU-bound, you can't async/await it away magically.
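A toy TypeScript sketch of the difference (hashAllThePixels is a hypothetical CPU-heavy function):

    declare function hashAllThePixels(p: Uint8Array): number; // hypothetical, CPU-bound

    // Awaiting IO frees the thread/event loop while the OS does the work...
    async function loadUser(id: number): Promise<unknown> {
      const res = await fetch(`/api/users/${id}`); // non-blocking wait on IO
      return res.json();
    }

    // ...but marking CPU-bound work async doesn't make it non-blocking:
    // whatever thread runs this is still pegged for the whole computation.
    async function stillBlocks(pixels: Uint8Array): Promise<number> {
      return hashAllThePixels(pixels); // async adds nothing here
    }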

Maybe I'm misunderstanding you?


Yeah, I'm also confused. Your main thread should be able to run completely asynchronously from worker threads. With async, you're constantly hitting a memory barrier while the worker thread completes the promise and the main thread then draws the result. Ideally, the worker thread should be able to natively queue its task for non-blocking repaints.

Side note: this is why I'm a huge fan of Qt. They really did a fantastic job building that framework for C++ to make it safer.


async/await is about non-blocking, not IO specifically. You can use async to avoid blocking the current thread while IO completes, or to avoid blocking while some CPU work completes (in this case on another thread, while the UI stays unblocked).

It's similar to using an event loop architecture for all UI actions.
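For example, a browser-side TypeScript sketch (assumes a hypothetical worker script at ./heavy-work.js that posts back one result):

    // Wrap a Web Worker round trip in a promise so the UI thread can
    // `await` CPU-heavy work without being blocked by it.
    function runOnWorker<T>(input: unknown): Promise<T> {
      return new Promise<T>((resolve, reject) => {
        const worker = new Worker("./heavy-work.js"); // hypothetical worker script
        worker.onmessage = (e: MessageEvent) => { resolve(e.data as T); worker.terminate(); };
        worker.onerror = (err) => { reject(err); worker.terminate(); };
        worker.postMessage(input);
      });
    }

    // Usage from the main thread, which stays free to paint and handle input:
    //   const digest = await runOnWorker<number>(pixels);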


I was specifically referring to input handling off the main thread, from GP’s original complaint. IO only.


That was already possible back when Windows 8 was introduced.

You just need to code with UWP/WinRT and async/await.


Help me confirm I understand the status quo.

My laptop with 32 GB of RAM, running Chrome and streaming Twitch or YouTube, is currently writing the stream to my SSD before rendering it?

That sounds like very low-hanging fruit in terms of battery gains. Is there any argument for why it's been this way?


It gets better: Chrome does cache the YT stream, but if you ever decide to rewind to an already-watched part, it will forcefully re-fetch the data from the server.

All Chrome does is burn through your SSD's write cycles.


Why does it refetch?


    Date: Sun, 18 Aug 2019 17:20:24 GMT
    Expires: Sun, 18 Aug 2019 17:20:24 GMT
Additionally, YT's token scheme makes the resources change URL every time you refresh (encrypted key), and on top of that they force-expire in ~5 hours.
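With Expires equal to Date, the response is stale the moment it arrives, so a conforming cache has to revalidate or refetch. A simplified TypeScript check (real HTTP caching also considers Cache-Control, which takes precedence over Expires):

    // Freshness lifetime implied by the Date/Expires pair quoted above.
    function freshnessMs(headers: Headers): number {
      const date = Date.parse(headers.get("date") ?? "");
      const expires = Date.parse(headers.get("expires") ?? "");
      return expires - date; // 0 here, i.e. stale immediately
    }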

I just tried a random 1080p@60 video and was able to rewind about 4 minutes back without refetching; more than ~6 minutes resulted in the progress bar resetting (the thin line showing the already-downloaded part) and re-downloading once again, despite 100 MB of it having been burned onto my SSD for no apparent reason. And I don't have to mention the ~10 GB of free RAM just sitting there doing nothing.


This change doesn't affect YouTube: https://bugs.chromium.org/p/chromium/issues/detail?id=984851...

which I understand means YouTube will still write to SSD...


Probably because Chrome was built not to assume enormous amounts of RAM.


My point is: when the resource is available, why can't it be used intelligently? Heck, shouldn't you just let the OS call the shots and simply say, "I've got ephemeral data for you to keep warm"?


You aren't wrong, but this is also how you get users who think that Chrome uses all your RAM even if you have a huge amount...


Sure, if the OS has an API for that. If it doesn't, then the Chrome developers have to figure out what to do.


Isn't the API just storing it in RAM and letting the page file do its thing?


How do you make sure that the video doesn't push something far more important to the pagefile? Does Windows have a way of telling it the priority of an item in memory, or do you have to trust a caching algorithm?


In a 64-bit-only world, that isn't a bad strategy. I would guess a lot of this is just legacy from having (or having had) to support 32-bit OSes.


Is that why it's using ~150 MB per open tab while Opera 12.x used 15 MB on the same websites? :) Let's not kid ourselves.


I regularly see Chrome tabs >1 GB. What am I running in them? Jira.


You are not alone, and Jira is not the only offender.

Maybe we should introduce some kind of certification for developers of publicly available software?

Or point out the climate issues: the waste, multiplied by the number of users, means X tons of CO2 released per year, in addition to annoying the users.

To add insult to injury, it also means your users will have to replace their hardware more frequently than otherwise necessary.

I'm only partially joking here.

I'm so tired of machines that should be 500-1000 times more powerful than my old PC from the nineties still being slow, and I blame it mostly on antivirus software (on Windows) and on misused web technologies (on all platforms).


Jira may not be the best example. It's slow, crufty garbage of a web product; possibly the slowest, cruftiest website I visit each day.


Slack and Google Docs are also pretty bad.


And I'll raise you ServiceNow.


It sure seems capable of gobbling it all up (Linux, 16 GB RAM, which is no longer an enormous amount..)


Compiling a game engine (Unreal) seems to use less RAM than browsing the web...


The linker, though ...


Lol, Chrome routinely takes up 100+ MB per tab.


Lol


What I find fascinating about MS's decision to adopt Chromium is their willingness to take the hit and learn a new code base from scratch (and not even their own, moreover). Usually inertia wins at big companies, but not in this case. Regardless of the final outcome, I'm really impressed by the engineering effort behind this decision.


Regardless of the meme-ish behavior about MS being incompetent at software, they're actually aces at it. If not the best entity in the world, they definitely know software. If anyone can pivot and manage a team to learn a codebase as large as Chromium at the drop of a hat, it's Microsoft. I only invest in index funds, but I'd place my bets on Microsoft over FANG, zero hesitation.


"Best entity in the world."

Would love to see some data or citations behind these bold claims.


I have to believe they want to build a Blink DLL into the OS, so all apps that need Blink/WebKit can call the OS implementation instead of bundling a copy of Chromium. It would make sense to build Node into the OS as well.


This is why I love Arch Linux. You can already do this with Chrome, Firefox or other browsers:

https://wiki.archlinux.org/index.php/Profile-sync-daemon

The impact is pretty good but not as significant as I had hoped.


You can probably do something similar on Windows by moving the profile directory to a RAM disk.


The great thing about Arch Linux is that someone has already come up with the steps to do it and put it in a convenient place for others to find. Arch Linux is more than just an OS, it's a community of power users.


This is for profiles and not the Chrome cache though, if I'm reading correctly.


Correct, but they have links for storing the cache in tmpfs as well.


Ah, OK. Interestingly, another low-tech solution (for avoiding most profile writes and all cache writes) is to just use incognito mode for browsing. It uses an in-memory cache directly.


Not saving data to disk means it has to be kept in RAM, and Chrome will take even more of it. That's fine if you have an expensive system with tens of gigabytes, but what if you have only 4 or 2 GB?

Also, why save energy when you are on AC power most of the time?

I remember that watching a long YouTube video on WinXP could take as much as 400 MB of RAM. Are they going to increase memory consumption further?


If you run out of RAM, virtual memory means excess pages will just get written to/read from disk, usually. Which is... not really different from what it does now, I think?


Swapping is mostly random-access: writing small blocks here and there. An HDD can do about 100 seeks/sec, which gives speeds of less than 1 MB/s, while sequential reads/writes run on the order of tens of MB/s.
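Back-of-envelope, assuming 4 KiB pages and one page moved per seek:

    // ~100 seeks/s on a 7200rpm HDD, 4 KiB transferred per seek:
    const pageBytes = 4 * 1024;
    const seeksPerSec = 100;
    const mbPerSec = (pageBytes * seeksPerSec) / 1e6; // ≈ 0.4 MB/s effective throughput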


I would assume that in a low-memory scenario it would be paging to disk, and we'd be back where we started in the first place.


It's not the same situation. If you write/read video sequentially, you get speeds of tens of MB/s; with swapping you get random access at < 1 MB/s on an HDD. So having a limited memory buffer for video is better for the whole system.


A Raspberry Pi can have 4 GB of RAM. Cheap Android phones come with 4 GB of RAM these days as well.

At some point software needs to stop catering to minuscule percentages of users.


I'm currently on 4 GB (the RAM sticks slowly died one after another; I hope to replace the PC soon), and yes, Firefox taking up an entire GB is pushing it. Windows takes about 2 GB for itself, leaving annoyingly little for other tasks.

~15% of Steam users have 4 GB or less [0], and I doubt gaming PCs drag the average down. So no, the idea that only a "minuscule percentage of users" cares about browser RAM usage doesn't seem based in the reality we still live in.

[0] https://store.steampowered.com/hwsurvey


Feature phones are a big thing in developing nations, but then again, I don't know whether those are running Chrome or watching videos.


I hate Chrome.

But Firefox drains my battery faster than a VR simulation, a realistic crypto-mining rig, and a USB hair dryer combined. My MBP's fans kick on full blast when I start streaming videos in Firefox. And... worst of all, Firefox now crashes every time my computer goes to sleep -- this has been happening for about the last month. When it comes back up I have to sign back into about 20 sites... yes, password manager, but with 2FA on most of them it's just tedious.

Love for a browser that respects privacy may lose out to love for a browser that doesn't kill my battery.


There’s ongoing work that’s just (this week) starting to land to improve macOS battery life via CoreAnimation. That should help in the medium term.

As for crashing on sleep, that’s not normal behaviour. Maybe try refreshing Firefox via about:support? You’ll lose extensions etc but if nothing else it could help to narrow down the problem if it comes back.


Thanks for the heads up! I've got different extensions running in Chrome and Firefox, but according to Activity Monitor, Firefox's energy impact is 26.29, and Chrome's is 3.25. I've got Hacker News up in both, and that's the only tab that's open.

When I open a new video on YouTube, it spikes to 160-ish, then drops to 100, then eventually settles into the low 60s.

In Chrome it's similar... spiking to around 160 when the video starts streaming, but then it quickly drops down to the high 30s.

Keen on seeing what the changes bring! (=

I just did a reinstall. Hope it helps the crashing issue.

At the end of the day, I really don't trust Google. But... if it comes down to having an extra hour on my laptop it's a bit of a tough call.

Thanks!


Shouldn't they start by getting Chromium as standards-compliant as the current Edge? As of now, every browser supports SVG-in-OpenType except Chromium. When Microsoft proceeds with their shortsighted migration to the Chromium engine, there will be two browsers refusing to comply with this widely accepted standard. Microsoft, would you please at least try to minimize the push toward "web standards are whatever Google likes" that your decision will provoke? That is the truly worrisome aspect of this move.


If it's not caching to disk, doesn't that mean RAM usage will skyrocket? Chrome is already a memory hog; imagine it now with all of those videos in RAM...


If the content is ephemeral then it doesn't make sense to keep it on disk, and as long as Chrome doesn't lock the memory range, the OS will swap it to disk anyway if it needs the RAM for something else.


When I watched high-resolution videos on YouTube with Chrome, its process could take about 400 MB of memory or even more and swap out other processes.

> and as long as Chrome doesn't lock the memory range, the OS will swap it to disk anyway

Swapping in/out is random access and is very slow (< 1 MB/s) compared to linear reading/writing, unless you have an SSD.


Swapping contiguous regions is sequential, and Windows will preemptively write non-locked regions to disk so it can instantly discard them when needed. You can force this behavior by dropping working sets, which has the effect of "converting" used RAM to cached pagefile.

Of course, if you're actively using a program, it will swap out other processes; that's the whole idea.


It doesn't make sense to swap out other programs just to load video that will be immediately discarded after being displayed. It makes more sense to have limited download and playback buffers and to store most of the video data on disk.


> When I watched high-resolution videos on YouTube with Chrome, its process could take about 400 MB of memory or even more and swap out other processes.

I've noticed you've said this twice in this thread like it's a bad thing. This is exactly the behaviour I'd expect and want.

If I'm watching a video I'd want the system to do everything possible to make it as smooth as possible.


I've been running Firefox from tmpfs for at least a decade. (tmpfs has more useful properties than letting that RAM go to Linux disk caches instead, in this case.)

Firefox doesn't need all that much page/object cache, and videos usually seem to be streamed with transient buffering.


If you're playing a video, which is the use case in the article, it wouldn't make sense to keep everything in RAM. A one-minute buffer in either direction seems logical. I expect the implementation to be something like that.
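For what it's worth, sites built on Media Source Extensions can already do this kind of windowed buffering themselves. A rough TypeScript sketch (illustrative; a real player also has to handle the updating flag and quota errors):

    // Keep only ~60s of media behind the playhead; evict the rest.
    function evictBehindPlayhead(video: HTMLVideoElement, sb: SourceBuffer): void {
      const keepFrom = Math.max(0, video.currentTime - 60);
      if (!sb.updating && sb.buffered.length > 0 && sb.buffered.start(0) < keepFrom) {
        sb.remove(0, keepFrom); // asynchronously frees the old buffered range
      }
    }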


RAM is meant to be used. RAM should be 100% utilized.


Yes, but if your application assumes that it should use 100% of my RAM, it's being antisocial. Don't assume that I don't have more important software that could be utilizing my RAM in ways more important to me.

That's what drives me up a wall whenever people use the "free RAM is wasted RAM" argument; the assumption that RAM unused by your software is unused by me.


The operating system should be responsible for managing resources like RAM, and applications should be responsible for using the right resources. Caching video is absolutely a RAM thing, since it's expected to go away once the video is done. Caching downloads is a separate matter, since it's expected to stay once finished.

A good litmus test here is what you expect to happen if the machine reboots. For video, I think it's reasonable to lose all video cache, but I think downloads should be resumable. If it shouldn't be there on a reboot, use RAM. If it should, use disk.

There are certainly exceptions, but I think the general rule of thumb is useful.


Android found a kind of solution for this by aggressively killing processes (while saving their state). This seems to be effective because developers are not motivated to help competing applications; they want all the resources for themselves.


That's perhaps great on Android, but you can't/shouldn't write code for Linux/Windows/macOS, where such a system is not in use, and then protest "I'm not the one that sucks." Maybe those OSes should implement such a system, but until they do, the "wasted RAM" argument needs to end for anything other than specialty professional software that can reasonably assume itself to be the raison d'être of that computer (e.g. video editing software, CAD, build systems for large software projects, etc.).


Windows has such a system with UWP. I think Ubuntu Touch, or whatever the Ubuntu phone successor distribution is called, has one too.

And I imagine most systems programmers could define an API if they wanted to.

I think this is definitely a case where apps need to be antisocial and operating systems need to rein them in. The age of cooperative multitasking is over. Let the OS decide.


Harming users to coerce OS designers is a pretty shitty plan. Sorry if that's too blunt, but that's simply the way I feel.


I have 4 GB of RAM and most of the time it is utilized over 100%. Not a very pleasant experience with modern Electron apps and reactive HTML5 sites.


Spoken like someone who has truly zero understanding of how RAM usage relates to performance.

Repeat after me: high data locality reduces the number of cache misses, which drastically reduces memory latency, and it is only achievable through systematic conservatism in memory usage.


You're assuming this person understands data locality, or memory latency, or even memory in general.


Try running Office, Chrome and Teams on a constrained system, then say that again!


You seem to be ignoring a very important factor here: energy consumption. RAM usage has a direct effect on battery usage, and battery life is an important factor and selling point these days.


RAM _capacity_ has a direct effect on energy consumption; usage, not nearly as much, if at all.

Hardware has existed that can selectively put banks into a low-power mode, but I've never heard of any commodity OS that supported anything like that. It would require actively organizing physical pages to take advantage of, even if common memory controllers supported it.


Note that most of RAM's power consumption comes from reading and writing (a function of memory clock speed). Allocating the RAM itself is unlikely to meaningfully increase power draw, but accessing it will.


Are there any benchmarks around showing this? And how much extra power draw is it?


Does it? I was under the impression that all the installed RAM got refreshed all the time whether it holds useful data or not.


The state of bits in RAM has zero effect on energy usage. Caching things in memory allows for lower power consumption, though, by preventing duplication of effort by the CPU.

Edit: Originally said "zero effect on memory usage", which makes very little sense!


Surely not as bad as disk.


I tend to have dozens of tabs open at the same time. However, only a few of them usually have video elements, and Chrome could probably discard the media cache for all of them except the currently active one.

My assumption here is that the media cache is mainly used for uninterrupted playback in case my network is briefly unavailable (e.g. when on a mobile network).


So MS's plan is to put a rebranded Chrome browser on Windows? Will that keep people from downloading Chrome? Then what incentive will G have to keep improving Chrome?


The incentive for Google was always to have the web platform at feature and performance parity with "walled gardens" like app stores and native applications. Because that gets content on the web where it can be indexed, and monetised as search results etc.

Therefore, it doesn't much matter to Google if it's Chrome, or a rebranded Chrome, or Safari, or Internet Explorer that keeps getting better, as long as something does and people use it.

They will continue to work on Chrome because they support more than the Windows platform. They will also not want to become dependent on Microsoft again, especially since the rendering engine isn't as important to their mutual competition as the rest of the browser, such as the extension and plug-in architectures.


Good question. My guess is they saw most users installing Chrome or FF as the first or second thing on a new Windows install, and were tired of reimplementing Chrome features only to still look bad next to Chrome and FF with ever-changing web "standards".


Edge Insider is quite nice and it keeps me from having to use Chrome.


Because desktop Windows is nearly irrelevant?


"only" Relevant to 35% of the world

https://gs.statcounter.com/os-market-share


Surely you're being sarcastic?


Not compared to mobile.


I wonder if there's a feature request to completely pause all Chrome/Chromium processes when the browser windows are all in the background and low-power mode is on. I like to have lots of browser tabs and windows open all the time, and in my experience with Chrome on Mac, the processes from these tabs eat most of the battery if there's any JS running in them...

For now, I just manually pause all processes with Chrome in their name, and then un-pause them once I need a browser...


I use The Great Suspender - it pauses your tabs after a defined timeout period.


How do you manually pause a process on Mac?


SIGSTOP / SIGCONT
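Or scripted, e.g. a Node/TypeScript sketch (pkill -STOP -f 'Google Chrome' does the same from a shell):

    import { execSync } from "child_process";

    // Freeze every Chrome process; run again with "SIGCONT" to resume.
    // Note: pgrep exits non-zero (and execSync throws) if nothing matches.
    const out = execSync("pgrep -f 'Google Chrome'").toString();
    for (const pid of out.trim().split("\n").map(Number)) {
      process.kill(pid, "SIGSTOP"); // pause (swap in "SIGCONT" to un-pause)
    }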


Well, I hope this guy doesn't write a post for every little pull request a Microsoft employee sends to Chromium.


It's Paul Thurrott. If somebody high up at Microsoft takes a particularly satisfying dump, Thurrott will have a blog post about it.


You don't have to read it.


I don't see how it's a bad thing.


My first thought: They added Clippy to it.

https://www.youtube.com/watch?v=b4taIpALfAo

Junk comment, sorry.


I smell some kind of DRM coming.



