The name hangout_services suggests this is some old tech-debt hack intended to make developing Google Hangouts easier by giving that team a direct stream of telemetry. For those who have forgotten, Hangouts was the first app that did video calling in the browser using what became WebRTC. If you look at what this module is doing, it's exposing things like CPU/GPU/RAM usage and hardware details back to the app that it wouldn't normally have access to.
My guess is that Google will react to this Twitter thread by simply deleting it. Hangouts has been a dead product for a while; if their server-side code still uses it they can surely remove it, as presumably the Chrome team monitors WebRTC performance themselves in a multi-site way now, given the much wider usage.
No, this is used by Google Meet right now. Open the "Troubleshooting" panel in meet.google.com in Chrome, and you'll see live system wide CPU usage reporting :)
In principle, that's something that could be allowed without giving access "to" Google/the site owner—even allowing the site author to provide their own functions for formatting and drawing the values—and thus could be allowed for _any_ website. Designing and implementing it is a fun technical problem, so it's a wonder why it wasn't, considering the motivations of a typical programmer (and those at Google especially).
How would you make an API accessible on the browser side but prevent the return values from being sent to the server? Somebody would surely find a way to use it for user fingerprinting.
Edit: I guess if you only want to make a local debug tool, you could make it callable only from a completely isolated sandbox. Maybe?
> How would you make an API accessible on the browser side but prevent the return values from being sent to the server?
Create an API for starting a "performance-metrics visualization Service Worker", that takes two things from a page as input:
1. the service-worker script URL
2. the handle of a freshly-allocated WebGL Canvas (which may or may not already be attached to the DOM, but which has never yet received any WebGL calls.) This Canvas will have its ownership moved to the Service Worker, leaving the object in the page as only an opaque reference to the Canvas.
The resulting Service Worker will live in a sandbox such that it 1. doesn't have network access, 2. can receive postMessage calls, but not make them; and 3. doesn't have any write access to any storage mechanism. Other than drawing on the Canvas, it's a pure consumer.
Also, obviously, this special sandbox grants the Service Worker the ability to access this performance API, with the metrics being measured in the context of the page that started the Worker.
The Service Worker is then free to use the info it gathers from making perf API calls, to draw metrics onto the moved Canvas. It's also free to change how/what it's drawing, or quit altogether, in response to control messages posted to it from the page.
The page can't introspect the moved Canvas to see what the Service Worker has drawn. All it can do is use the Canvas's now-opaque handle to attach/detach it to the DOM.
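As a rough sketch, page-side usage of such an API might look like this (every name below is invented for illustration; no such API exists in any browser today):

```js
// Allocate a canvas and never issue any WebGL calls on it from the page.
const canvas = document.createElement('canvas');

// Hypothetical API: ownership of the canvas moves into a sandboxed worker
// that has no network access, no storage, and can only receive postMessage.
const perfViz = await navigator.startPerfVisualizationWorker({
  scriptURL: '/perf-viz-worker.js', // drawing logic supplied by the page author
  canvas,                           // transferred; the page keeps only an opaque handle
});

// The opaque handle can still be attached to / detached from the DOM.
document.body.appendChild(canvas);

// One-way control channel: the page can talk to the worker, not vice versa.
perfViz.postMessage({ metrics: ['cpu', 'gpu', 'ram'], theme: 'dark' });
```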
The worker could still send the data back to the page via side-channels.
For example, by using up resources like the CPU, the GPU, or RAM in timed intervals. The page would then probe for the performance fluctuations of these resources and decode the data from the pattern of the fluctuations.
IMO that shouldn't be part of the threat model. I could run an ad right now that consumes CPU in timed intervals and estimates CPU usage using a microbenchmark to communicate with js on other pages. This sort of fingerprinting and bits/minute side-channels are impractical to block. You'd have to give each origin its own CPU cores, cache partitions, etc
Sigh. You don't prune threats you can't control from a threat model, you document them so that the consumers and maintainers of the target of assessment can intelligently reason about the threats as the product evolves.
If a page can already deduce performance fluctuations all on its own, then you don't need a special access-limited performance API, do you? Just have the page do whatever you're imagining could be done to extract this side-channel info on the performance of the host — and then leak the results of that measurement over the network directly.
(I imagine, if such measurements done by pages are at-all distinguishable from noise, that they are already being exfiltrated by any number of JS user-fingerprinting scripts.)
A page can deduce performance fluctuations. It just needs to do the same calculation multiple times and measure the times.
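Concretely, a timing probe along these lines already works on any page today, no special API required (the iteration count here is arbitrary):

```js
// Time a fixed chunk of work; when the host CPU is busier, the same work
// takes longer, so repeated samples reveal coarse performance fluctuations.
function probeOnce(iterations = 5_000_000) {
  const start = performance.now();
  let acc = 0;
  for (let i = 0; i < iterations; i++) acc += Math.sqrt(i);
  return performance.now() - start; // milliseconds for a constant workload
}

setInterval(() => console.log('probe:', probeOnce().toFixed(1), 'ms'), 1000);
```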
The issue with the API is that it provides specifics about the CPU like "Apple M2 Max". If you give this info to a worker, the worker can encode it into a side-channel and send it to the page.
I imagine you could "solve" this (for a painful and pointless value of "solve") by 1. only allowing the Service Worker to do constant-time versions of operations (like the constant-time primitives that cryptographic code uses), and 2. not allowing this special Service Worker the ability to ever... execute a loop.
But at that point, you've gone so far to neutering the page-controlled Service Worker, that having a page-controlled Service Worker would be a bit pointless. If the Service Worker can only do exactly one WebGL API call for each metric timeseries datapoint it receives, then the particular call it's going to be making is something you could predict perfectly in advance given the datapoint. So at that point, why have the page specify it? Just let the browser figure out how to render the chart.
After some of the replies, I gave this a bit more thought, and I came up with an entirely-different design that (IMHO) has a much better security model — and which I personally like a lot better. It eschews webpage-supplied Arbitrary Code Execution altogether, while still letting the user style the perf-visualization charts to match the "theme" of the embedding page. (Also, as a bonus, this version of the design leans more heavily on existing sandboxing features, rather than having to hypothesize new ones!)
1. Rather than having the page-perf API be a Web API "only available on specific origins" or "only available to weirdly-sandboxed Service Workers", just make it a WebExtension API. One with its own WebExtension manifest capability required to enable it; where each browser vendor would only accept WebExtensions requesting that particular capability into their Extension Stores after very thorough vetting.
(Or in fact, maybe browser vendors would never accept third-party WebExtensions with this capability into their Extension Stores; and for each browser, the capability would only be used in a single extension, developed by the browser vendor themselves. This would then be analogous to the existing situation, where the Web API is only available on a first-party domain controlled by the browser vendor; but as this would rely on the existing WebExtensions capabilities model, there would be no need for a separate one-off "WebAPI but locked to an origin" security-model. Also, unlike with a "WebAPI but locked to an origin" capability, you could play with this capability locally in an unpacked extension!)
2. Browsers that want to offer this "visualize the performance of the current page" ability as a "thing the browser can do", would just bundle their first-party WebExtension that holds this capability [and the "access current tab" capability] as a pre-installed + force-enabled component of the browser, hidden from the extensions management view. (I believe that this is already a thing browsers do — e.g. I believe Chrome implements its PDF viewer this way.)
3. This WebExtension would consist, at its core, of an extension page (https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web...), that could be embedded into webpages as an iframe. This extension page would contain a script, which would use the WebExtension version of the page-perf API to continuously probe/monitor "the tab this instance of the extension-page document lives within." On receiving a perf sample, this script would then render the sample as a datapoint, in a chart that lives in some sense in the extension page's own DOM. There'd be no Service Worker necessary.
(Though, for efficiency reasons, the developer of this WebExtension might still want to split the perf-API polling out into a Service Worker, as a sort of "weak-reference perf provider" that the extension page subscribes to messages from + sends infrequent keepalive polls to. This would ensure that the Service Worker would unload — and so stop polling the page-perf API — both whenever the extension page's tab goes inactive, and whenever the browser unloads extension Service Workers generally [e.g. whenever the lid of a laptop is closed.] The page-perf API could itself be made to work this way... but it's easier to take advantage of the existing semantics of Service Worker lifetimes, no?)
4. But is there an existing API that allows a web-origin page to access/embed an extension page, without knowing the extension's (browser-specific) extension ID? Yes! Just like native apps can register "app intents", WebExtensions can register URI protocol handlers! (https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web...) The performance-visualization WebExtension would register a protocol_handler for e.g. ext+pageperf:// , and then a page that wants to render its own perf would just create an <iframe allowTransparency="true" src="ext+pageperf://..." /> and shove it into the DOM. The privileged-origin-separation restrictions for iframes, would then stop the (web-origin, lower-privilege-level) page from introspecting the DOM inside the (extension-origin, higher-privilege-level) iframe — accomplishing the same thing that the "make the Canvas into an opaque handle" concept did in the previous design, but relying 100% on already-standardized security semantics of existing browser features.
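To make that concrete, here is a rough sketch of the two halves, assuming Firefox-style `protocol_handlers` support and an invented `pagePerf` permission (neither the permission nor the `ext+pageperf` scheme exists today):

```js
// manifest.json of the hypothetical built-in extension (excerpt, shown as a JS object):
const manifestExcerpt = {
  manifest_version: 3,
  name: 'Page Perf Visualizer',
  permissions: ['pagePerf', 'activeTab'],   // 'pagePerf' is the invented capability
  protocol_handlers: [{
    protocol: 'ext+pageperf',
    name: 'Page performance chart',
    uriTemplate: '/chart.html?spec=%s',
  }],
};

// In an ordinary webpage, embedding the chart needs no extension ID at all:
const frame = document.createElement('iframe');
frame.setAttribute('allowtransparency', 'true');
frame.src = 'ext+pageperf://cpu+gpu';       // resolved by whichever extension registered the handler
document.body.appendChild(frame);
```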
---
But how would the webpage style the extension page to match itself? Well, that depends on what the extension page is doing to render the metrics.
An elegant assumption would be that the page-perf extension page is creating + incrementally updating an SVG DOM that lives embedded in the page's DOM — on each frame, appending new vector elements (e.g. bezier control-points) to that SVG's DOM to represent new datapoints in the time-series. (I believe e.g. Grafana's charting logic works this way.)
If this is the case, then all the parent page needs is a way to pass a regular old CSS stylesheet to the extension page, which the extension page could then embed directly into its own <head>. Just have the parent page construct a <style> Element, and then call something that hands it across to the extension page.
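One possible shape for that call, assuming the extension page accepts stylesheet text over postMessage and builds its own <style> from it (the message type and class names are made up):

```js
// `frame` is the ext+pageperf iframe from the earlier sketch.
const style = document.createElement('style');
style.textContent = `
  .pageperf-chart       { font-family: inherit; }
  .pageperf-series-cpu  { stroke: #e91e63; }
`;

// Hand the stylesheet across the origin boundary; the extension page appends
// it to its own <head>, so the page never touches the chart's DOM directly.
frame.contentWindow.postMessage(
  { type: 'pageperf:stylesheet', css: style.textContent },
  '*'
);
```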
(And the beauty of doing that is that once you embed the transferred <style> from the webpage into the extension page, the very same privileged-origin-separation logic kicks in again, preventing the extension page from loading insecure web-origin content from an origin not in the extension manifest's whitelisted origins. Which in turn means that, despite the webpage author being able to write e.g. `background: uri(...)` in their stylesheet, and despite the extension page doing nothing to explicitly filter things like that out of the stylesheet before loading it, the extension page would get to the "making a network request" part of applying that style, hit a security violation, and fail the load. Thereby neutering the ability of the page developer to use web-origin URL references within a stylesheet as a side-channel for communicating anything about the metrics back to them!)
---
And all of this would be easily made cross-browser. You'd need a little standalone "Browser Support for Embedded Performance Visualizations" spec, that specifies two things:
1. a well-known URI scheme, for each browser's built-in page-perf extension to register a protocol_handler for. (Since it's standardized, they can drop the "ext+" part — it can just be `pageperf://`.)
2. a fixed structure for the page-perf extension page's metrics-visualization DOM — i.e. a known hierarchy of elements with specific `id` and `class` attributes (think CSS Zen Garden) — so that the same webpage-author-supplied stylesheets will work to style every browser's own metrics-chart implementation. (This would seem constraining, but remember that WebComponents exist. Standardize a WebComponent for the metrics chart, and how styles affect it. Browser vendors are then free to implement that WebComponent however they like in their own impl of the extension. Very similar to how browser vendors are free to implement the internals of a new HTML element, actually — in fact, in theory, for maximum efficiency, browsers could even implement this WebComponent's shadow-DOM in their renderers in terms of a custom "internal" HTML element!)
Those hypothetical programmers at Google could start by doing a Manifest V4 that would be like V3 but actually useful and privacy-respecting. I’ll believe it when it happens.
Right, Meet is derived from the Hangouts codebase, but I still think they'll probably just delete it. Meet is a stable product; how valuable is this special privilege now?
This is interesting to me because you have all the right facts and are reasoning well with them. But, we end up at: "Yeah you're right it wasn't killed, just a rebrand, so they'll probably just delete the code for it"
I worked at Google, and I can guarantee ya people don't go back and change names in old code for the latest rebrand done for eyewash 4 layers above me. Not out of laziness, either, it just has 0 value and is risky.
Also, video conference perf was/is a pretty big deal (cf. the variety of sibling comments pointing out where it is used, from gSuite admin to client app). It is great on ye olde dev machine, but it's very, very hard on the $300 Wintel Chromebooks thrown at line-level employees.
FWIW, they shouldn't have hacked this in, I do not support it. And I bet they'll just delete it anyway because it shouldn't have been there in the first place. Some line-level employee slapped it in because, in the wise words of Ian Hickson: "Decisions went from being made for the benefit of users, to the benefit of Google, to the benefit of whoever was making the decision."
Sure, I was sloppy in my use of the term "dead". Hangouts the product/brand ceased to exist, Hangouts the codebase lives on. It was ever thus. I worked at Google too, y'know ;)
Heh, 100% agree. I switched to a Chromebook when WFH started because of it. It couldn't handle it on an external display, but at least it wasn't painfully bad.
This decision was to the benefit of users if it got videoconferencing off the ground before Zoom came along.
(I swear, sometimes I think the Internet has goldfish-memory. I remember when getting videoconferencing to work in a browser was a miracle, and why we wanted it in the first place).
Pretending you said something conversational, like: "is that quote accurate in this case? The API may have literally enabled the creation of video conferencing. I, for one, remember we didn't use to have it."
I see.
So your contention is:
- if anyone thinks a statsd web API, hidden in Chrome, available only to Google websites is worth questioning
- they're insufficiently impressed by video conferencing existing
If I have that right:
I'm not sure those two things are actually related.
If you worked at Google, I'm very intrigued by the idea that we can only collect metrics via a client-side web API for statsd, available only to Google domains.
If you work in software, I'm extremely intrigued by the idea that video conferencing wouldn't exist without a client-side web API for statsd, available only to Google domains.
If you have more details on either, please, do share
Scoping the data collection to Google domains is a reasonable security measure because you don't want to leak it to everybody. And in general, Google does operate under the security model that if you trust them to drop a binary on your machine that provides a security sandbox (i.e. the browser), you trust them with your data because from that vantage point, they could be exfiltrating your bank account if they wanted to be.
But yes, I don't doubt that the data collection was pretty vital for getting Hangouts to the point it got to. And I do strongly suspect that it got us to browser-based video conferencing sooner than we would have been otherwise; the data collected got fed into the eventual standards that enable video conferencing in browsers today.
"Could not have" is too strong, but I think "could not have this soon" might be quite true. There was an explosion of successful technologies in a brief amount of time that were enabled by Google and other online service providers doing big data collection to solve some problems that had dogged academic research for decades.
After your infelicitous contribution, you were politely invited to consider _a client side web API only on Google domains for CPU metrics_ isn't necessary for _collecting client metrics_.
To be perfectly clear: they're orthogonal. Completely unrelated.
For some reason, you instead read it as an invitation to continue fantasizing about WebRTC failing to exist without it.
(Worth noting: Google Hangouts predates WebRTC. I think a case can be made that big data collection of real users machine performance in the real world was instrumental for hammering out the last mile issues in Hangouts, which informed WebRTC's design. I'm sure we would have gotten there eventually, my contention is it would have taken longer without concrete metrics about performance).
A) Chrome sends CPU usage metrics, for any WebRTC domain, in C++
B) as described in TFA: JavaScript, running on allow-listed Google sites only, collects CPU usage via a JavaScript web API
There's no need to do B) to launch/improve/instrument WebRTC; in fact, it would be bad to only do B), given that WebRTC implementers as a whole are a much less biased sample for WebRTC metrics than Google's own implementations of WebRTC.
I've tried to avoid guessing at what you're missing, but since this has dragged out for a day, I hope you can forgive me for guessing here:
I think you think there's a _C++ metrics API for WebRTC in Chrome-only, no web app access_ that _only collects WebRTC on Google domains_, and from there we can quibble about whether it's better to have an unbiased sample or if it's Google attempting to be a good citizen by collecting data from Google domains.
That's not the case.
We are discussing a _JavaScript API_ available only to _JavaScript running on Google domains_ to access CPU metrics.
Additional color commentary to further shore up that there isn't some WebRTC improvement loop this helps with:
- I worked at Google, and it would be incredibly bizarre to collect metrics for improvements via B) instead of A).
- We can see via the rest of the thread this is utilized _not for metrics_, but for features such as gSuite admins seeing CPU usage metrics on VC, and CPU usage displayed in Meet in a "Having a problem?" section that provides debug info.
I also worked at Google, and this kind of telemetry collection doesn't seem surprising to me at all. I don't know if you are / were familiar with the huge pile of metrics the UIs collect in general (via Analytics). I never worked on anything that was cpu-intense enough to justify this kind of back-channel, but I don't doubt we'd have asked for it if we thought we needed it... And you'd rather have this as an internal Google-to-Google monitor than punch a big security hole open for any arbitrary domain to query.
JS is easier to debug (even with Google's infrastructure), and they have no need of everyone else's videoconference telemetry (which when this was added, would have been, iirc, Flash-based).
I believe this closed loop let Google learn things that informed the WebRTC standard, hence my contention it got us there faster. Unless I've missed something, this API had been collecting data since 2008. WebRTC was 3 years later.
I think you've misunderstood my question regarding "What would the alternative be?" I meant what would the alternative be to collecting stats data via a private API only on Google domains when we didn't have a standard for performance collection in browsers? We certainly don't want Google railroading one into the public (with all the security concerns that would entail). And I guess I'm just flat out not surprised that they would have dropped one into their browser to simplify debugging a very performance intensive service that hadn't been supported in the browser outside plugins before. Is your contention that they should have gone the flash route and done a binary as a plug-in, then put telemetry in the binary? Google had (And mostly still has) a very web-centric approach; doing it as a binary wouldn't be in their DNA.
It was just updated to an extension Manifest V3 version, and someone went to the trouble of adding some sort of field test ID mess for it on top of all the nonsense. Doesn't seem like anyone is planning to get rid of it anytime soon.
But the Git history of it is fascinating, starting at the initial merge that got it in, which went with the old-school trick of "just call X to explain why this is needed" to get your stuff merged. Then every non-trivial change ever made to it is inevitably auto-reverted due to some failure before being resubmitted; this must be the "unparalleled Google developer environment" in action - nobody can, or bothers to, run the tests on a piece of software this big. Half the commits are various formatting nonsense. Another third is my favorite - someone making a change to an extension API only to realize the fucking hangout guys sneaked an actual extension into the code base and they will have to update that one to reflect their change. I can feel their anger personally.
Unsure how it's reported back now, but I believe (it's been a while since I've dug in there) it's also exposed as a metric for Google Workspace administrators to monitor client perf during said calls as well.
Apart from the privacy violation, isn't this a clear antitrust issue? Google is misusing their browser dominance to give themselves an edge in the videoconferencing space.
Could it be that it tracks CPU / GPU usage etc to finetune what quality video streams to use? It's nonstandard but I can well imagine a native app would do this as well.
I might be able to shed some light on this (disclaimer: Xoogler).
I worked for a time on Google's internal videoconferencing platform, called GVC. This was in 2010-2011 at a time when a lot of the company's VC equipment was proprietary, specifically Cisco Tandberg units. These were expensive and would be expensive to roll out to thousands of meeting rooms.
Around this time a different team was developing Hangouts. It's been a while so my memory may be off, but I think it was called Google Meet at the time? Or maybe that was later? It's hard to keep track. I think Hangouts was the name adopted when Google+ came along and rolled Hangouts into its product offering.
There were different configurations of GVC but the most common were these All-in-One ("AIO") monitor/computer combos. It was a full Intel PC. So the GVC platform was a custom Linux distro. The system was designed so GVCs could talk to Google services, which was nontrivial, and so software updates could be rolled out. It kept old distros too in case one didn't boot. These GVCs had to be named and a whole bunch of other issues.
Additionally they needed support for various hardware like a touch panel to dial. Larger units required larger PTZ camera support and support for various microphones.
Anyway, Hangouts became the stack GVC was built on. This ultimately replaced virtually all Tandbergs and saved a fortune. This system was certainly still in use by 2017. I can't speak for later.
Monitoring was a part of all this. So when I see there are *.google.com specific APIs, we need to be sure we're talking about this accurately. Like can Google query any Chrome instance in the world? Or is it only from/to google.com? I don't know the answer and the Tweet doesn't specify.
But given the name hangouts_services and the domain restriction I consider it highly likely this is purely to support monitoring embedded Chrome for GVC. I could be wrong.
I don't think it's this. Tried a Meet call in Firefox just now. If you click the troubleshooting button, there's a CPU chart greyed out that says "try Chrome to see your CPU usage." Sure enough, in Chrome you can view how much CPU Meet is using (or maybe it's system-wide, idk); either way, I don't think it is available through regular APIs. Edit: Definitely system-wide, as confirmed with some `yes` background tasks.
P.S. The naming confusion always comes up. GVC is such a nice clear name.
This seems pretty unequivocal, then: they're clearly using this to provide additional functionality to their own applications (at least Meet) that other companies who don't control the browser can't match.
They can, if those other websites provide an accompanying extension for it. Are you aware that the Zoom website can use the Zoom extension to have that exact same type of interactions?
In fact, that’s literally how other videoconferencing websites operate and have been since forever (the dreaded cisco webex extension comes to mind). The only difference is that GMeet can be counted as a part of the browser itself, so it requires no additional extension.
> GMeet can be counted as a part of the browser itself, so it requires no additional extension.
That is the enraging part. If I install extensions, I am aware that they could send information and diagnostic stuff. Plain websites shouldn't be able to. Also (not 100% sure here), afaik Firefox tells me which information the extension gets, so there's even more awareness about the information sent.
Also, which VC websites are you talking about? Zoom doesn't need an extension, BigBlueButton and Jitsi don't need an extension, and afaik Teams also doesn't need one. People using WebEx certainly don't care about privacy, so actually I would leave that out. (Googled it, since I never use it: apparently WebEx also doesn't use an extension anymore, although I remember the old plugin that they required some time ago, that barely worked at all.)
Cisco is the only recent-ish video chat app I can think of that has required an extension.
The other I can remember is original Hangouts about a decade ago, cause at the time WebRTC wasn't commonly supported. Chrome was the only browser that didn't need an extension for that, but you know what, other VC apps were free to use Chrome's early WebRTC features too, and they later did.
Unequivocally, why can’t other video chat companies provide their own browser? They could presumably fork chromium and change a single string (if it’s really just “*.google.com”).
Obviously that’d go nowhere and no one would use it, but I can’t imagine this really matters to any competitor anywhere.
Think even bigger, why not reinvent the wheel every time every product is developed? Why have a common platform for anything?
If you think about it, what value is there in all these companies using the same roads to ship products? Can't they build their own? And is it really important that every business accept the same currency?
Yes, platform independence and shared universal access to common standards that consumers can consistently trust to provide similar experiences across products and ecosystems does admittedly reduce wasted development resources, increase competition, and makes the market more accessible to new businesses. And sure, I guess technically it reduces consumer confusion, and sure it benefits consumers by making products and services more interoperable. But who are we to say that any of that is good? /s
It's a nice story but it doesn't plausibly have any remote connection to this. I'm sure the people running the GVC platform with a custom Linux distro have some other way of reporting machine stats than literally "put our custom extension into every Chrome install ever".
> Like can Google query any Chrome instance in the world? Or is it only from/to google.com?
I'm sorry, I don't understand the distinction you're trying to make? Yes, it sounds like the API is exposed just to the content running on *.google.com, but that's still a lot like "Google can query any Chrome instance" (that visits their site, but that's ~100% given that even if you don't visit Google services, Chrome pulls in NTP content from Google by default).
I don't think this is being used maliciously, but it's still problematic if Google can troubleshoot problems this way, and their competitors in the same space can't. There's no API for Zoom, right?
That API is available to all extensions. So Zoom could create an extension that uses that API and get their users to install that extension and approve the permission for that API.
Is there a reason why Google can't get its users to install the extension and approve the permission for that API?
I would theorize the reason Google doesn't go through that process is that it's unrealistic to expect users en masse to do that, and the only way to get wide rollout would be to build it into a browser by default and then, for good measure, to hide the fact that it's installed -- something which, notably, Zoom can't do.
But I mean, if it's no big deal to get users to install an extension, then Google can stop bundling it by default and instead ask users to install it, right?
I never worked at Google.... but this doesn't track for me.
You're saying the reason the 'retail' Google Chrome has this bundled plugin is so Google can get observability on CPU usage on internal appliances for an internal video conferencing platform?
I'm reading the tea leaves here. I have no direct knowledge of the situation. I'm retelling a story that seems (at least to me) to be consistent with what little information is here. There's plenty you can criticize Google on but I'm not sure this qualifies. Let's not attribute malice without cause. Or even negligence. It's fair to ask if this needs to be here still and what it's for. Let's just not fly off the handle prematurely.
As for "retail" Chrome having this plugin, it would make total sense. Chrome is a massive codebase. Maintaining a fork is a significant amount of effort. It would be far easier to add APIs to Chrome and whitelist them for only google.com extensions/JS.
I'm not sure what these APIs are exactly and why they're there, but Firefox also does something similar. It has special APIs available only to Mozilla and/or Firefox domains, for things like installing extensions, or helping with first-run experience.
A blog post about it was shared here on Hacker news <12 months ago, but I'm having trouble finding it...
The APIs are public and documented, and the domain allowlist is both included in the UI and in about:config (save for the Android Play Store version, where they hide everything to make the browser pure garbage for whatever reason).
and I'm pretty sure devs would at least think about adding your domain by default if you ask nicely with a great use case on bugzilla.
What? You think that Mozilla devs would think about adding your domain to the whitelist of domains allowed to install extensions if you just asked nicely? That would be insane from a security perspective.
But that is for websites directly related to operating the browser, whereas chrome is exposing APIs used by unrelated google products such as google meet.
This could possibly also be a violation of anti-trust laws since it is using a monopoly in one market (browsers) to get an advantage in another (video conferencing).
It's pretty standard among browsers. The risk should be about equal to someone spoofing the domains that the browser downloads software updates from, and you can turn it off via prefs if you really don't want it.
Not really. Browser developers add lots of different website-specific hacks to make sites behave better in their browser. Mozilla actually used to do this a lot, when they were originally the underdog 20+ years ago and were trying to get people to switch to the Mozilla suite (and then Firefox), when the argument against switching was often related to websites not working or rendering properly in Mozilla/Firefox that behaved properly in IE.
(Note that this is not the same thing as a website developer adding browser-specific hacks to make their site behave better/worse in a particular browser.)
Those are APIs related to browser functionality and onboarding. They're not there to advantage one of Mozilla's other product offerings at the expense of similar products offered by other companies.
Yes and they should also be criticized for it. Mozilla isn't exactly known for caring about privacy even if their marketing wants you to believe otherwise.
As for the anti-trust aspect, here the market share matters and Firefox is insignificant in that regard.
Disclaimer: I work at Google, but not on Chrome or on these APIs.
I think the explanation is quite mundane. An example usage: open google meet, start an empty meeting (an “instant meeting”), click the “…” menu, click “troubleshooting and help”.
There’ll be plots of various stats, including CPU utilization. I think meet will also helpfully suggest closing tabs if your machine is overloaded during a meet call, too.
It’s very helpful, I check it from time to time.
Edit: now that I think about it, I’m not sure about the suggestion to close tabs is actually a thing. I’ve only actually used the stats view.
> There’ll be plots of various stats, including CPU utilization. I think meet will also helpfully suggest closing tabs if your machine is overloaded
This is not mundane at all, it's a perfect example of giving your product an unfair competitive advantage.
If Meet users are told why their meeting isn't working correctly but Zoom, Teams and Slack, Meet users are going to have a better experience that Zoom, Teams or Slack has no way of replicating.
No wonder every other meeting provider pushes you aggressively into using their desktop app, Google Meet's desktop app is just Chrome!
At least other video conferencing tools don't lag like Meet, so users don't need to debug ;) I think this has to do with all of them using H.264 while Meet uses VP8/9.
Having had the dubious pleasure of trying to use Meet from various Apple devices, Meet didn’t lag because Meet couldn’t produce video at all. Maybe I only operate at the wrong edge of the Meet ecosystem, but it did not compare well to anything else out there.
(I’ve tried Safari, I’ve tried the native app, and I’ve even tried the phone bridge (!).)
The implementation is what I was thinking of. I've also heard claims that VP9 is inherently slower to encode than H.264, but no idea if that's accurate. AVC/H.264 has very broad hardware support. For example, the 2019 MBP I'm using right now can't do hardware-accelerated VP9 encoding, but even 2011-ish MBPs can do H.264 acceleration in both directions. Intel's support matrix: https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video
AV1 looks like it's getting broader support, but it's still new. Zoom's release notes mention they'll use AV1 if the participants support it, and I don't see a similar note about VP8/9.
It's only in newer chips, but it's broader than VP8/9 was. Intel and Nvidia's newest chips support hardware en/decoding of AV1, while Nvidia never supported VP8/9.
> If Meet users are told why their meeting isn't working correctly but Zoom, Teams and Slack, Meet users are going to have a better experience that Zoom, Teams or Slack has no way of replicating.
I had to re-read this a few times; did you accidentally omit a word?
> If Meet users are told why their meeting isn't working correctly but Zoom, Teams and Slack aren't, Meet users are going to have a better experience that Zoom, Teams or Slack has no way of replicating.
I fully agree with you, though; it's anticompetitive for them to use Chrome to give their other products an advantage.
I believe this is the point, rather than being mundane. Other video conference tools are not able to offer this debugging option - which you have pointed out is useful.
You have a competitor, Zoom. They have an in-browser version. Can they use this API for troubleshooting performance issues? No? The European regulators might be interested in that.
Perhaps this is one reason why Meet performs well in the browser and Zoom doesn't, meaning Zoom users use the native app if they want reasonable performance (particularly with many people in the meeting).
Apparently, this system statistics API is generally available to extension developers. The underlying issue is that they bundle a hidden "Hangouts Service Extension" with Chrome. Zoom, Teams, etc. would have access to this API through a Chrome extension, although they wouldn't have the advantage of having that extension pre-installed.
Mind you, Zoom does everything in their power to steer users away from their web client and toward their ~~malware~~ desktop client, so I don't think they're too upset about the status quo.
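For reference, the extension-facing API in question is chrome.system.cpu, gated behind the `system.cpu` permission; an extension can approximate a usage percentage by diffing two samples, roughly like this (the wrapper below is only an illustrative sketch):

```js
// Requires "system.cpu" in the extension manifest's "permissions".
function sampleCpu() {
  return new Promise((resolve) => chrome.system.cpu.getInfo(resolve));
}

async function cpuPercent(intervalMs = 1000) {
  const a = await sampleCpu();
  await new Promise((r) => setTimeout(r, intervalMs));
  const b = await sampleCpu();

  // usage fields are cumulative tick counters per logical processor.
  let busy = 0, total = 0;
  b.processors.forEach((p, i) => {
    const prev = a.processors[i].usage;
    busy  += (p.usage.total - p.usage.idle) - (prev.total - prev.idle);
    total += p.usage.total - prev.total;
  });
  return total ? (100 * busy) / total : 0;
}
```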
This is a link to a five-year-old issue, when Zoom was a much smaller operation and their quality was far worse than it is now. They implemented a convenience feature in an incredibly stupid and insecure way, in 2019.
Do you have any evidence that the current Zoom client is "malware"?
Worse off. I have to use Meet for work, and I'm forced to run Chrome for Google's shitty software to run acceptably.
Edit: And for some reason I have a Chrome profile even though I never created one and never "logged into" Chrome. Another thing that's been forced on me by Google's product team.
Trying to be generous, the only reason I can think for why customers would be worse off is that Google is literally the only one who can be trusted with this kind of power. Not Zoom, not Microsoft, not the user whose data is being transmitted, etc.
But even that does not explain why the existence of the API was not disclosed. Do you agree that that looks bad for them?
Then there is the fact that Google is far from being a company people trust. They should be rushing to be transparent about their decision, if there is a good, persuasive reason for it. They could use the good press. Instead, they made a secret API that can read privileged system information, locked it so that nobody else could use it, and then never told anybody about it—all while claiming to be secure, and privacy-focused, and definitely not abusing their browser monopoly.
Microsoft Edge's source code isn't available. We don't really know what kind of stuff they patch in.
It's just that Google keeps most stuff in Chrome open.
To me it looks like Google wants to use some analytics on Google Meet to improve it (e.g. A/B testing on CPU consumption), or they just want to provide that interactive CPU% widget in it, but they don't think it would be a good idea to let just any website use it, as it could be used for fingerprinting (e.g. if you have two different sites open in separate tabs, they could detect this by co-operating to correlate the CPU% time series data, even if the connections are over Tor or proxies).
For non-Google services they provide a mechanism to do the same by having the customer install an extension with the correct permission.
That's the internal API used by the hangout_services extension. It's the extension itself that is undocumented.
A user might reasonably expect that web pages do not have access to the system.cpu API by default. And that's mostly true, but thanks to the pre-installed but hidden hangout_services extension, google.com does have access to this API. That's at least a little dubious.
"Of course we are the good guys, it's everyone else who is misguided when they get upset after finding out out about our awesome work."
Choosing to work for an advertising giant already calls into question either your ability to tell what "the right thing" is or your honesty in claiming to do it.
Except that is not what Google is doing. They have exclusive access to the one line that is preinstalled for all houses. Only they can use it. And if you want a different provider, you can't use that same line. You have to pay for the installation of a line from that provider with your own cash.
Huh, Chrome doesn't come preinstalled unless you are talking about ChromeOS.
I guess I just don't see the problem in a feature like this in a third party browser software that is completely optional to install and use and has lots of alternatives.
Yeah, so people with a choice of browsers might prefer it not be the one exposing exclusive APIs for its parent company, and it might affect that company's "we're not evil" image.
But yeah, having that built into your browser is a huge advantage over having to nag your users to install an extension, or worse, convincing the IT department that it's worth installing.
I agree it is very useful! This is also how I discovered this in the first place.
But that is not at all my point. The point is that google.com web properties have access to an API and a browser capability that is not available to its competitors. Google only allows reading CPU info for itself.
The reason the data is not available for everyone, is because it would be a huge tracking vector. Same reason we don't allow webpages to read the device hostname, or username, or Chrome profile name. Google exposes this to google.com because it trusts itself. That poses this antitrust issue though.
Changing Firefox's user agent to Chrome's results in a speed-up, because when the user agent isn't Chrome, YouTube checks for feature availability, whilst on Chrome they just assume it is all there.
Thanks for giving me a specific example. At this point in the evolution of web standards and their implementations, that kind of UA check is stupid enough that I can believe there is at least some level of intent behind not making the same assumptions about available features on other browsers.
This explanation was the first I read of what this actually does (yeah, yeah, I didn’t read the linked article first) and that’s a lot worse than I expected.
Yeah a whole lot of things really do seem mundane... once you have already accepted the fact you are tracked down to the cpu-percentage-usage-in-time level
This isn't a mundane explanation though: this is exactly the example Luca gives in the original thread. It's anti-competitive, because it's functionality only available to Google Meet. Google is using its browser monopoly to advantage its other products.
They are just trying to make their products better. Anti competitive behavior is generally perceived to be about doing things that put the company in question in a better position without improving the product.
Ask yourself the question - are customers better or worse off because of this?
That's not what anti-competitive means at all. Having APIs that Google Meet can use but competing products can't reduces competition, which makes customers worse off.
Anti-competitive behaviour doesn't need to mean other companies can't compete. It just means it makes it harder to compete, which this does. Having access to this data improves Google Meet, and evidently gives enough advantage to justify adding it to the browser. Other products don't have access to these APIs, so can't improve their products in the same way. Google has used their browser monopoly to give their other products an unfair advantage. That is anti-competitive behaviour, and is illegal.
"All they're doing is making their product better."
"Making your product better by privileging your own domains in the browser is the anti-competitive part."
"Come on, it's not like it's making their product better."
----
This really isn't complicated. Is this making Google Meet better? I would quote:
> danielmarkbruce: "They are just trying to make their products better."
Okay. So then Google Meet would be a worse product if they didn't have privileged API access over other apps. So... this does make it harder for those other apps to compete, unless you think that the quality of a product is somehow irrelevant for competition.
Sure, Google Meet still isn't winning, but who knows where they'd be in the market if they didn't privilege themselves.
You're saying that their product would be worse if they didn't do this, but also that it somehow doesn't matter because they're not the best product. Which has a similar energy to me cutting a loop out of a marathon and saying, "Come on guys, I only came in third. It's not cheating unless I come in first, everybody knows that. As long as I don't come in first I'm allowed to take shortcuts. Give me my third place medal that I definitely earned fairly, why is everybody mad about this?"
> The point of products is to provide value to customers.
This is idealistic, the point of a product is to provide value to the company.
And competition is not a by-product that exists by accident, it is the mechanism through which we get companies who are building things for their benefit to incidentally provide benefits to consumers.
Products are competitions. From a business point of view, the point is to win. From a social point of view, yes, obviously we want products to provide value to consumers. But don't make the mistake of assuming that Google (or any other company) has the same goals as society. Every business wants to be a monopoly.
----
> Misleading analogies don't illuminate.
Now, you may not like the analogy, but the general point here is exactly the same regardless of what analogy you use. I'll repeat:
> This really isn't complicated. Is this making Google Meet better? I would quote:
> > danielmarkbruce: "They are just trying to make their products better."
> Okay. So then Google Meet would be a worse product if they didn't have privileged API access over other apps. So... this does make it harder for those other apps to compete, unless you think that the quality of a product is somehow irrelevant for competition.
----
You can not in one breath argue that this is good because it made Google Meet better, and in the next breath argue that it's fine because it didn't impact the market. Those two ideas contradict each other.
And the fact that Google is so inept at product design that it can't capture the entire market even when it unfairly advantages itself does not mean that it is not unfairly advantaging itself or that it isn't causing harm. The Internet as a platform is better for both consumers and businesses when it is a common platform, not one that privileges specific companies. The Internet (and the market overall) is harmed by breaches of market rules regardless of the final outcomes, because each breach emboldens companies to attempt even more lawless stunts and destroys trust in the market.
I mean, seriously, call it whatever analogy you want, it's still awfully silly to argue that Google cheating to give itself a leg up over competitors is fine... because even with the advantage Google still couldn't build a good enough conferencing app to capture the entire market. That does not let them off the hook for cheating.
"It doesn't matter what the market effect was, only that Google engineers meant well" is certainly an argument, but it both contradicts the question you originally asked (are customers better off), and also (to be blunt) is a really heckin bad argument.
I'm just kind of blown away by the rapid shift from "this helped consumers", to "actually, no, the effect was minimal", to "actually, it doesn't matter if anybody was harmed, the result is immaterial." :)
"Google should not use a near-monopoly position in the browser to privilege it's own sites and services" is a very simple standard, and this really is not a complex case.
It only becomes complicated if you start trying to rephrase a simple principle as: "Google shouldn't privilege their sites unless they mean well, and then the result is immaterial, but no wait actually I didn't mean immaterial, I meant minimal, and anyway it's not like Zoom isn't still popular so-"
Or... Google could also just not ship invisible extensions as part of Chromium's build process that privilege Google-owned services with extra API access in direct contradiction to the principles of an independent Internet. Because the effect of casually breaking that contract isn't minimal. It does actually matter that the web be a neutral platform. If businesses expect that Google can get away with privileging Google platforms in the core browser, that perception and allowance of interference degrades the entire Internet as a commercial platform - and of course emboldens Google to go even further in the future.
I... what? Today I learned that the Federal Trade Commission is a figment of my imagination.
I'm sorry, your argument has devolved to the point where you're now saying that Chrome privileging Google sites isn't anticompetitive behavior because antitrust isn't a natural law? I can't believe I have to say this, but that's not the standard that the FTC or courts use.
Google is literally being sued right now for, in part, using browsers (both its own and others through browser deals) to privilege its own services. No, the FTC was not created for the purpose of the Internet, but that is not a thing that anyone said, and I very genuinely believe that you are smart enough to understand that I was talking about the provably false claim that a neutral Internet that doesn't exist to privilege Google is some fantasy that only tech nerds have, rather than a repeated principle in multiple current antitrust efforts by multiple governments around the world, including the US.
That being said:
> You keep moving the goal posts.
You're right, and I apologize. We weren't discussing whether or not antitrust was a natural right. That's off-topic, and that's on me. We started this out discussing, in your words:
> Anti competitive behavior is generally perceived to be about doing things that put the company in question in a better position without improving the product.
Which I hope at this point we've established is just straight-up wrong; that's just not an accurate evaluation of antitrust. Then of course you went on to say there was no effect, and that if there was an effect it didn't matter because "result = immaterial" (which is also absurd; even the most conservative, limited perspective on antitrust in the government does not say that the results of a company action aren't relevant to antitrust). And then you went on to imply that having a neutral platform on the web isn't something anyone should expect anyway, which... yeah, okay, absurdities aside, you're correct that now we're starting to get off topic.
The on-topic response to this as far as I can tell is: I don't think you understand what antitrust is, how it works, or why we have it, and everything you're saying here is absurd.
But it's very easy to get caught up in minute rhetorical debates: part of what's been wild about this conversation has been watching you make even more indefensible claims that you never had to make, just to avoid the appearance of one contradiction. And it's worth resisting that impulse, taking a step back from this and looking at the original questions: was anyone harmed? Is this anticompetitive behavior?
You say no, but you also say that no one should have any expectation of an Internet that exists independently of a single company's monopoly hold over its standards and APIs. So you're not really in a position to know if anyone was harmed because where competition on the Internet is concerned, you appear to reject an entire category of commonly understood harm.
So just taking a step back and looking at these ideas: are people overreacting over Google? Does this cause harm? Was the purpose of this to make people's lives better, or was it to privilege Google's services over competitors? If you believe absurd things about antitrust, competition, corporate intentions, and the Internet itself -- it doesn't really mean much when you say that Google's intentions are good and everyone is over-concerned.
If you tell people not to be concerned about the loss of a neutral, independent Internet, and it turns out you don't believe in a neutral, independent Internet, then... surprise, people aren't going to listen to you.
Materiality matters, in practice. This is a tiny thing.
The result of the action matters, in practice. Meet is an also ran.
The intent/motivation matters, in practice. They were trying to do the right thing, improve their product.
Using a dominant position in one market to promote/dominate in another market via distribution is where regulators tend to push cases. And people tend to (rightly, mostly) get upset. Improving the product via another product? Find me a list of cases.
Have you considered some people actually deal with anti trust on a day to day?
On whether platform APIs (like those in a web-browser) can be anti-competitive:
> Apple has used one or both mechanisms (control of app distribution or control of APIs) to suppress the following technologies...
[...]
On the need for neutral API access as a tool to increase competition:
> Messaging apps that work equally well across all smartphones can improve competition among smartphones [...]. Apple makes third-party messaging apps on the iPhone worse generally and relative to Apple Messages, Apple’s own messaging app, by prohibiting third-party apps from sending or receiving carrier-based messages...
[...]
On the suppression of APIs for third-party services:
> By suppressing key functions of third-party smartwatches—including the ability to respond to notifications and messages and to maintain consistent connections with the iPhone—Apple has denied users access to high performing smartwatches with preferred styling, better user interfaces and services, or better batteries, and it has harmed smartwatch developers by decreasing their ability to innovate and sell products.
[...]
On the use of privacy as an excuse to restrict 3rd-party APIs that are not restricted for 1st-party services:
> In the end, Apple deploys privacy and security justifications as an elastic shield that can stretch or contract to serve Apple’s financial and business interests.
[...]
If you need it stated even more clearly:
> Apple selectively designates APIs as public or private to benefit Apple, limiting the functionality developers can offer to iPhone users even when the same functionality is available in Apple's own apps, or even select third-party apps.
This is directly analogous to what Google is doing here. Shipping a by-default extension which takes advantage of a distribution channel (Chrome's list of default extensions) that is not available to 3rd-party developers. That extension grants Google access to a private API that benefits Google while limiting the functionality that third party sites can offer their users, and I quote: "even when the same functionality is available in [Google]'s own apps."
You do not know what you are talking about.
> Have you considered some people actually deal with anti trust on a day to day?
You're saying it's not a big deal while also saying it improves the product. The debug panel is apparent, but we don't know what else the API's data is used for. Maybe Meet uses it to improve performance too.
The goal or motivation is irrelevant; what they actually did and its potential effects are what matters.
I'm not sure why you seem to be having so much trouble with this concept. Google's majority-market-share browser gave their browser-based videoconferencing product a privilege and advantage that other browser-based videoconferencing products did not get.
That's it. You don't need to dig into their motivations or their intentions. It doesn't matter if it even "worked" or not; it is completely immaterial that Google is so incompetent that it can't even win when it has given itself the tools to play dirty.
Motivation is a basic concept in the legal world which goes a long way to deciding criminal cases, has an impact on civil cases and will certainly influence whether the DOJ brings a case and what the result is. It's also a basic concept used by humans.
Saying it is irrelevant shows a complete lack of understanding of the legal and regulatory environment in which businesses operate.
If so, the API has to be available to competitors. Maybe this is why Meet is in-browser while all the other ones work better in apps. Despite this, Meet still hasn't won because it's just not very good.
Yes, in the real world if something is so inconsequential, in practice gets left alone. It hasn't had any effect on the market. This isn't kindergarten, "THAT'S NOT FAIR!!" isn't valid unless it actually matters.
> are customers better or worse off because of this?
Worse. Google's Hangouts/Meet customers are better off, but everyone else is not. That's what anti-competitive behavior is: using your monopoly in order to advantage your own products at the expense of others.
Don't you find it hilarious how people who work or worked at Google happen to think that things Google does are "mundane", even when other people think they're outrageous? Hilarious coincidence, really. Can't stop laughing.
Yeah, crazy to think that Google of all companies would track people in unexpected ways :eyeroll:.
Your post is evidence that the scrutiny Google gets is actually helping matters. Companies, especially powerful ones, should default to not tracking personal data any more than necessary. I'm glad to hear that at least one department took that seriously.
Exactly. In a world with sufficient anti-trust and privacy enforcement Google would instill into their employees a fear of even thinking about pulling stunts like this. Instead we have Googlers and ex-Googlers running defence for it claiming they see nothing wrong.
In such a world, no one does anything without running it by the lawyers first, and then a months-long debate occurs around every single little move, and nothing gets done.
Tracking is very rarely useful to the application but can be useful to the company when the application isn’t profitable on its own. Google has demonstrated this before.
This is almost precisely backwards. Developers want a lot of telemetry for their applications, and almost all of it is totally useless outside of the need to debug or improve that application.
Ah yes, the good old "Sorry we accidentally violated your privacy and illegally disadvantaged competitors using our monopoly. But everyone is actually just doing the best they can, honest mistake. Teehee."
Sorry but at some point people will stop buying that shit and Google is well past it.
They should fix the situation. They try to do too many things at once.
But it's hard to give engineers leeway to just work on things and at the same time make sure the data collected can't be combined in a way that is "bad" on some dimension. If you think about the combination of hundreds of groups doing things... it's just a hard problem.
If you imagine running just Maps and email and YouTube, and you try to improve those products by instrumenting them in a pretty vanilla way... you just end up with a lot of data on people. Consider: person X watches videos about lung cancer treatment options. Person X also drives from location A to location B with some frequency. Location A is a residential address and location B is a hospital. You track both of these things so you can recommend videos they might like among the bajillion videos available, and so they can get in their car and immediately choose location B without having to enter the address. Person X has their name in their email account so it displays when they send an email. Person X's name is John Smith. Put it all together: John Smith, who lives at 12 Main Street, Chicago, has lung cancer. And when he stops driving that route and stops looking at cancer videos but is still active... he's been cured.
Even if you take the mundane explanation -- that this was just to allow Google engineers to troubleshoot user issues with Hangouts -- you still have a company using its market power to give its product (Hangouts) a benefit that no other product (Zoom, Teams, literally any other WebRTC video conferencing solution) gets to use.
I think the submission is a bit wrong in editing the title from the original. I understood it like this:
Chrome has a built-in extension that uses public Chrome APIs which are readily available to other Chrome extensions. The issue described is that this extension shares that information with Google's own domains when they communicate with it, while other websites can't do the same.
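For the curious, a minimal sketch of what that page-to-extension exchange looks like when run from the devtools console on a *.google.com page. The extension ID is the one quoted later in the thread; the "cpu.getInfo" method name is an assumption based on the tweet's examples, not something verified here:

```js
// Hedged sketch: "nkeimhogjdpnpccoofpliimaahmaaome" is the bundled Hangouts
// Services extension ID mentioned elsewhere in this thread; the "cpu.getInfo"
// method name is an assumption taken from the tweet's examples.
chrome.runtime.sendMessage(
  "nkeimhogjdpnpccoofpliimaahmaaome",
  { method: "cpu.getInfo" },
  (response) => console.log(response)
);
// On origins not listed in the extension's externally_connectable matches,
// pages don't get a chrome.runtime.sendMessage that can reach it at all,
// which is why non-Google sites can't make this call.
```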
For what it's worth, I'm on Brave (Chromium-based) and this also works there, so it appears to not necessarily be only Chrome but potentially any Chromium-based browser where this hasn't been specifically blocked or disabled.
Arguably, Chromium is more important to Google than Chrome is.
By controlling the engine, they have de facto control over the standards.
By keeping Chromium open, they incentivise forks. Most people who don't want to use Chrome because it's Google's or for some other reason will use one of the forks or Chromium-based browsers.
By keeping this illusion of choice, Google extends its control.
People are running it because they think Google is trustworthy, so publicizing abuses might be useful: it erodes that trust and lets people know they shouldn't run anything from that company.
There is potentially an innocuous and straightforward explanation for this. Imagine the browser comes with some functionality implemented as a google.com-signed web app (as opposed to the compiled/linked C++ that much of the older Chrome UI is built from).
It would be silly if that PWA-implemented browser code would need permission to access the system information, since it is part of the browser's functionality itself.
Another use case for a private API (that has long existed) is integration of the Chrome browser with Google-specific websites that provide core functionality, like the Chrome Web store, to allow for installation/removal of extensions from a web page.
I think it is a mistake to give a company like Google the benefit of the doubt. Consumer protection is a lot like security: we should theorize the worst-case scenario and assume the company is willing to work against consumers' interests if it serves its own.
Even if a mundane and reasonable explanation exists, that doesn't matter if there is also the potential to exploit it in a way that harms consumers' interests.
My examples are of core browser functionality, just implemented with a different toolchain (a web app instead of C++). Should the user be asked for permission for C++ to send an IPC to another C++ component? Should the Chrome Web store ask for permission to install extensions in Chrome?
Down-thread I see that this is being used for Google Meet functionality, for which I agree it should ask for the user's permission.
In any multi-process software (of which Chrome is on the more complex end of the spectrum), there are dozens if not hundreds of different IPC message types sent between different components of the software.
To prompt the user for every type of IPC would make the software unusable. No desktop software does that on any platform.
It would be like a car asking the user for permission for the steering wheel to access the front axle.
Yes, and that's the issue: Google is bundling an extension into Chrome that is unrelated to the operation of the browser itself, which gives another one of its products (Hangouts) privileged access that no other videoconferencing app has, at least not without the big step of getting users to manually install another extension.
Don't know about the rest, but ungoogled-chromium scrambles every occurrence of the string "google" in the code specifically to avoid things like this, so probably not.
I understood what the commenter was saying and the "security implication." I am mocking the thing that people want: to use Google Chrome but with the "googly" bits removed. It isn't super practicable. Just use Firefox.
Yes, Edge has it enabled. If you open Google Meet troubleshooting in Edge you can see your system CPU. If you try it in Firefox it says "Try Google Chrome to see your CPU usage".
Safari also has some Apple-specific features, like being able to show a special dialog for logging into other websites with your Apple account (which works differently from passkeys or password autofill), or the redirect-based flow they make other browsers go through.
Always wondered how it's implemented in JS. WebAuthn with proprietary arguments...?
Google has done this sort of thing before. My memory is fuzzy as to the details, but I think it was Native Client being allowlisted at the domain level to only work on Hangouts, or something like that.
I briefly worked on Internet Explorer in ages past. They would develop APIs with the Windows team for use in IE to give IE special features that other browsers couldn't implement.
So this is a lot like Microsoft using specialized formats or APIs in Windows that competitors cannot access, which was a problem throughout the 90s. The problem never went away - it has just changed appearance.
> So, Google Chrome gives all *.google.com sites full access to system / tab CPU usage, GPU usage, and memory usage. It also gives access to detailed processor information, and provides a logging backchannel.
So I guess the question becomes how quickly you can spoof this?
If you mean: can another domain trick Chrome into letting it access those APIs... probably not. It seems to be based on the browser extension architecture, which is already somewhat hardened, and I believe the extension's code isn't even loaded if you're not on a matching domain (though the typical protection goes the other way around: preventing extensions from accessing website data without permission).
The case here would just be spoofing a domain. There's another thread on this post pointing out that you would also need to inject a malicious root cert for HTTPS traffic, which is correct, but not impossible (and given some bad/lazy practices I've seen places use when they sign their own certs for internal infrastructure, not a far stretch).
If they can do that, they can spoof or proxy any website and collect your passwords, auth cookies, and anything else sent over the network. At that point, who cares if they can also see how much CPU you're using?
Over the years I've learned that trying to game out what malicious actors can do, and under which scenarios and conditions, isn't worth the effort: they are many, they know more than me, they have different goals than me, and I am one person. There are endless permutations of environments, additional weaknesses, scenarios, or particular sensitivities of information that you don't or can't consider, and some of them make a given attack really painful. In this case, maybe CPU usage, or aggregate changes in CPU usage, tips off an attacker about what someone is ramping up internally, which could be used for espionage or even timing attacks.
What I have learned in place of that is to plug holes and minimize attack vectors.
So if you can just trick someone into trusting a bogus root CA, take control of their DNS resolution, and get them to open an attacker-controlled domain in Chrome, then you can... use this API to get information about their current CPU utilisation.
Or anyone who controls your DNS resolution, which can happen through a number of paths (for example a local hosts file, possibly a router, or changing your DNS config, or how you obtain that config, so that it points at a malicious DNS server, etc.).
In what world does "system / tab CPU usage, GPU usage, and memory usage" mean "full access to the system"? Any Chrome extension can access this info easily; the point the tweet makes is that there's a built-in Chrome extension that shares this info with Google's own websites without any confirmation.
Is it really that easy? I just kind of assumed that devs could create subdomains under a dev domain like googdev123.com, but not google.com, until it was a fully-fledged product release.
Agree. I work at Google. I promise nothing happens quickly. It can take over a week to set up a new SQL database & client. Half coding (don't get me started on boq...) and half data integrity and criticality annotations for the data...
I don't know what setting up a new domain is like but I can't imagine it's something you "just do".
Only to leak your CPU/GPU utilization though as far as I understand it. Those can also be exposed in other ways by legitimate JS/WebGPU by measuring/profiling shader runs/etc.
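As a rough illustration of that point, here's a sketch in plain JS (no privileged APIs) that estimates CPU contention by timing a fixed busy loop. It only yields a coarse relative signal, nothing like the detailed per-process numbers the extension API returns:

```js
// Crude CPU-contention probe: time a fixed chunk of work repeatedly.
// When other processes load the cores, each run takes longer than the
// baseline measured at startup.
function probeOnce() {
  const start = performance.now();
  let acc = 0;
  for (let i = 0; i < 5e6; i++) acc += Math.sqrt(i); // fixed workload
  if (acc < 0) console.log(acc); // keeps the loop from being optimized away
  return performance.now() - start;
}

const baseline = probeOnce();
setInterval(() => {
  const elapsed = probeOnce();
  console.log(`estimated load factor ~ ${(elapsed / baseline).toFixed(2)}`);
}, 1000);
```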
So if you install your own certificate authority and then spoof the DNS it might be possible? Not so useful as an attack vector, but potentially useful for people who want to do fun things with the browsers they own.
This. I think Chrome now uses something called Certificate Transparency, but it has the same effect: it won't trust your own installed CA for google.com.
Google Docs is designed to not let you run arbitrary JS in a trusted (i.e. google.com origin) context, or else the author of any doc you visit could act as you on Google properties.
Firefox’s website “interventions” are documented on Firefox’s about:compat page. Each intervention has a link to more information about the website issue it’s fixing and a button to turn it off.
You can build Chrome without this by setting `enable_hangout_services_extension` to false. Of course, then none of the WebRTC stuff on google.com will work.
FWIW, I once tried and failed to compile Chrome. My machine didn't have enough RAM to compile it at the time, even though it was able to fully compile any other software I threw at it.
> So, Google Chrome gives all *.google.com sites full access to system / tab CPU usage, GPU usage, and memory usage. It also gives access to detailed processor information, and provides a logging backchannel.
Those things can absolutely be used to "improve" fingerprinting. I don't think it's fair to assume it's being used for that though, without any further evidence. But it certainly could be used for it.
Anyone have any further context? As it stands right now, it's just a random claim without any proof whatsoever. There is a link in another comment, but how is that related to the tweet?
Google doesn't need any extra help to track users who are browsing Google sites in the Google browser. It is probably instead anticompetitive functionality that lets Google sites work better in Chrome in ways that other sites can't replicate.
> Google doesn't need any extra help to track users who are browsing Google sites in the Google browser.
I would not be surprised if there are cases where this would let them track users they otherwise couldn't. Like someone running two isolated Chrome instances with separate network connections but on the same PC.
If it's really accessible from *.google.com, wouldn't this be simple to verify/exploit by using Google Sites (they publish your site to sites.google.com/view/<sitename>)?
Google spent billions muscling their way into their majority market share of web browsers, now they're going to keep on cashing out with unfair practices like these.
Last time I used Google Chrome, if you logged into a Google account while using Chrome, it automatically logged you into that Google account in the browser itself.
Isn't that implemented in a similar way to this?
I agree with the concerns about unfair competition, and I think this auto-login "feature" could also qualify as an example.
Chrome extensions can access CPU and GPU statistics via a public API that any extension developer can use. Google Chrome comes bundled with a hidden "Hangouts Services" extension, which has permissions for *.google.com (that extension's ID is the 'nkeimhogjdpnpccoofpliimaahmaaome' referenced in the examples). That extension has a function that will return those stats when called from JavaScript running on a webpage. As a result, any JavaScript running on *.google.com can get the current CPU and GPU utilization.
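For comparison, roughly what a third-party extension would have to ship to offer the same thing to its own site; example.com, the extension shape, and the message names here are placeholders, and the practical difference is that users have to find, install, and approve this one, while the bundled extension comes pre-approved:

```js
// background.js of a hypothetical third-party extension. Its manifest would
// declare the "system.cpu" and "system.memory" permissions plus
// "externally_connectable": { "matches": ["https://*.example.com/*"] }.
chrome.runtime.onMessageExternal.addListener((message, sender, sendResponse) => {
  if (message && message.method === "cpu.getInfo") {
    chrome.system.cpu.getInfo(sendResponse); // model name, core count, per-core usage
    return true; // keep the channel open for the async response
  }
  if (message && message.method === "memory.getInfo") {
    chrome.system.memory.getInfo(sendResponse);
    return true;
  }
});
// A page on *.example.com could then call
// chrome.runtime.sendMessage(EXTENSION_ID, { method: "cpu.getInfo" }, cb),
// much like google.com pages do with the bundled extension.
```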
It's an API that is available to extension developers. The wrinkle here is that Google bundles a hidden "Hangouts Services" extension in Chrome with permissions for *.google.com (the extension's ID is the "nkeimhogjdpnpccoofpliimaahmaaome" referenced in the examples).
And? Google uses Chrome to retrieve data about the user.
Every Chromium-based browser has 'hidden' APIs that are only accessible on certain domains. That's how the custom (read: closed-source) extensions work; "component extensions" are the mechanism normally used for this: https://chromium.googlesource.com/chromium/src/+/main/extens...
The domains whitelisted for these APIs cannot be scripted by user-installed extensions, so that a malicious extension can't inject a script and call the special API.
At Opera, we previously tried attacking the underlying implementation of how these 'hidden' APIs are exposed. Although we found a lot of Opera-specific issues, the Chromium logic seems sound, and a "bypass" that lets other websites access the API is unlikely. It also seems that the developer here was just a bit overzealous in allowing this API to be accessed from all google.com subdomains.
If you don't want to use Google to the extent of blocking all their domains, maybe it's time to change browsers? Why would you use Chrome if you don't trust Google at all? The main concern here is Google giving itself an unfair advantage, btw.
For anyone having trouble with the logic here, which seems like a lot of people in this thread for some reason:
[Google's browser] comes with [code] that [does things] in a default installation of [Google's browser] that [Google's competitors] can't do in a default installation of [Google's browser].
Didn't you leave out that [Google's browser] allows [Google's websites] to do things [other websites] cannot?
Ostensibly [Google's websites] are websites like any other, but [Google's browser] treats them differently. IIRC Mozilla does similar things for addons.mozilla.org, but Google's are broader, since they are not as clearly linked to browser functionality.
My reading of your comment was defensive of google, basically "google ships a software package so of course they should be able to do stuff others can't" where I tried to highlight "google ships a browser which should of course treat websites equally". Seems like I misunderstood the intent behind your comment.
I don't have a problem with the logic, I'm just not sure why I should care. I imagine Edge probably can do magic stuff on microsoft sites that it can't on the rest of the web too. It makes sense for the browser to have a higher level of trust for the company that makes it than it does for the wider web.
Because the Web is supposed to be based on open standards and browsers are supposed to be neutral platforms that implement the standards.
If you don't care about that, fair: nobody really cares that you don't care. What everybody cares about is what happens if some of the various antitrust agencies of the world start caring.
Some people remember when the government went after Microsoft for having secret APIs that only IE could use.
... but other people remember that in the time since, that entire Microsoft monopoly fiasco is held up as an example of bad prosecution, and we don't go after companies like that anymore.
>... but other people remember that in the time since, that entire Microsoft monopoly fiasco is held up as an example of bad prosecution, and we don't go after companies like that anymore.
This, but also, I still don't see anyone posting a compelling reason why I should care about this issue. The government and I don't necessarily have the same interests. Personally I don't care that google gives their own browser more access to my computer when I use google services, and if it improves my experience, I actually want that to be the case.
The bulk of the complaints about this just seem to be tattletale behavior you see from children, not any thought out complaints based upon an actual harm.
The actual harm is that other companies can't use that additional access. If they could they might be offering better products than Google to me and you. You might not care about better products, but I (and probably many people on this post) do.
Except allowing other companies to have that access results in a huge loss of security. I trust Google enough to install their browser; they already do what they want with my computer. I don't really want to evaluate whether I trust every other random website to have that access, or trust that they'll continue to be trustworthy at some arbitrary point in the future.
yes the resource monitoring and alerting on meet was instrumental to my decision to cancel all other software subscriptions and give all my money to google
I’m joking a bit, but I do think it’s important to articulate the purported harms of claims of anti-competitive behavior. What did they do with this capability? How did that harm their competitors? Is the actual conduct worth getting up in arms about?
People arguing that this is "just an extension" are ignoring the fact that extensions have special privileges compared to websites, and you would not want every website to have the full power of an arbitrary extension.
If it's "just an extension", make it available to all domains.
If correct, once you access a `.google.com` website, the browser makes available through JavaScript an API that allows querying a lot of information about all the open tabs (including, for example, your banking website, if open) and can send the collected information back to the "mother ship".
If true, and since a lot of people usually have a Google tab open, you can easily deduce what that means.
This definitely deserves to be investigated; for the moment, we only have a tweet.
> an API that allows querying a lot of information about all the open tabs (including, for example, your banking website, if open)
No. It uses the chrome.system.cpu API, which any extension can access, and which gives CPU and RAM utilization info about your tabs. It doesn't give anyone "a lot of information about all the open tabs", and it does nothing to expose your banking website...
The allowlisting going on here is that normally when you install an extension in Chrome it asks you to confirm the access to those APIs on the sites where the extension wants to run, but this one comes pre-confirmed from the factory. A quick GitHub search finds ~1000 manifest files that list system.cpu, possibly because that API is also in the boilerplate example chrome extension manifest.
That's still just as unfair, though. Google always has access to that information because their extension is preinstalled and you can't disable it, but other websites have no access to that information unless you go out of your way to install a third-party extension to do so.
OK. That's a point of view. I just thought it should be accurately described.
I think the idea that you would download a web browser from Google and then expect it to not be able to figure out what model of CPU it is running on is a bit weird, when you think it through. There are lots of features of Chrome that are only "available to Google"; for example, it will only download updates from Google, unless you've modified its source code.
Google would naturally have privileged access to the browser, but that doesn't need to mean they have secret privileged access to my computer's hardware
So the concern is that Google is making Hangouts better in a way that is hard for competitors to replicate? (And by "hard," I mean, "competitors have to ask users to install something," not hard in any HN-relevant sense of the word.) This forum sure has a lot of wanna-be Handicapper Generals. https://en.wikipedia.org/wiki/Harrison_Bergeron
If it's easy for users to just install the extension, then obviously there's no point in Google doing this in the first place. The fact that they bothered sounds like a pretty compelling argument for it being worth caring about.
Worth caring about why?? Because you think Google shouldn’t be allowed to make things better for their users? Like I said, you are so concerned about fairness you’re missing the point.
Worth caring about because they have a browser monopoly. It's not allowed for the exact same reason why in the olden days Microsoft wasn't allowed to add APIs to Windows that could only be used by IE, even if it made things better for IE users. It harmed users in the overall market by making it harder to compete.