Hacker News

Crazy. So any process can use MSCTF (Text Services Framework) to elevate itself to NT AUTHORITY\SYSTEM through UAC prompts or the logon screen, because Microsoft forgot to implement access control!

I wonder whether this can also be used to escape sandboxes, but don't have time right now to analyze that. If so, this will be even more dangerous.

Edit: Yup, it can be used for sandbox (AppContainer) escapes. The blog post mentions it; I just didn't notice it on first scan.

While I respect Google security team's 90-day vendor response policy in general, I think it is too reckless in this case. It too easily escalates previously "harmless" contained sandbox compromises into full system-wide ones. A lot of people are potentially now in harm's way.




I'm sure this isn't a case of "forgot to"; rather, it's likely a legacy feature/codebase that nobody at Microsoft understood well enough to realize the significance of the vulnerability.

That doesn't excuse it. However, given the amount of technical debt I'm sure exists in the Windows codebase... this does not surprise me.


Agreed. This isn't a simple patch; it will require significant re-engineering of a critical Windows component. 90 days is entirely unreasonable in this case. Releasing his ctftool was particularly reckless; it's privilege escalation in a bottle.


Reckless by Microsoft or the researcher? Because we don't know who has been exploiting this already, and providing the tool adds pressure on Microsoft to expedite a fix, while helping people who may have already been affected gain some insight into how they were compromised.


Perhaps the pressure could have been applied privately, e.g. the researcher getting visibility into achieved milestones. As long as Microsoft's progress remained reasonable, no release.

This issue is very time-consuming to fix and very easy to exploit, and it affects a large number of people, directly and indirectly: a full system compromise from an unprivileged process or sandbox.

Even if you don't personally use Windows, this might, for example, be used to compromise your data while it is processed somewhere else.

I completely agree there needs to be very strong pressure on vendors, and the 90-day deadline is a very effective tool for that. But there should be some alternative way to apply pressure in cases that take a long time to fix and would cause devastating collateral damage.

Any script kiddie can now start using this within hours or days.


> Perhaps the pressure could have been applied privately, e.g. the researcher getting visibility into achieved milestones. As long as Microsoft's progress remained reasonable, no release.

Perhaps you should read the thread describing the communication with Microsoft. It sounds to me like the issue was not just the complexity of the bug, but a failure of organization/communication on Microsoft's part.

https://bugs.chromium.org/p/project-zero/issues/detail?id=18...


Yeah, that doesn't look good for Microsoft. I thought they'd already learned their lesson, but maybe they need a periodic embarrassment to keep it fresh.


I believe taviso is familiar with Microsoft applying pressure privately.



The righteousness of your post reminds me of the climax of a Disney movie.

Google does this all the time, on purpose, like clockwork. They aren't the only ones out there looking for zero-days in their supply chain, but they're the only ones who ignore a vendor's disclosure policy and substitute their own.

For example: if Google finds a bug in your product, YOU get 90 days before they put you on blast in front of 8 billion people.

But if you find a bug in GOOGLE'S product and put them on blast YOU will find yourself in court.

Does anyone remember the last time Microsoft or Apple went looking for zero days to drop in public on their blog about Google? They don't. Because they're professional companies with better shit to do than stir the pot.

This has NOTHING to do with supply-chain security and EVERYTHING to do with putting heat on their competition. That is why PZ exists. If that weren't true, PZ would be looking at non-competing products with large user bases; WordPress comes to mind. But Google doesn't compete with WordPress, so they'll never focus on it.


Perhaps there should be more heat on the competition. Microsoft could have found this problem as well, even more easily, by reading their own internal documentation. They chose not to care.


> For example: if Google finds a bug in your product, YOU get 90 days before they put you on blast in front of 8 billion people.

> But if you find a bug in GOOGLE'S product and put them on blast YOU will find yourself in court.

Reference needed.


> For example: if Google finds a bug in your product, YOU get 90 days before they put you on blast in front of 8 billion people.

https://www.vice.com/en_us/article/7xqdxe/google-project-zer...

https://www.csoonline.com/article/2867534/microsoft-blasts-g...

Here are Google's many contradictory policies about disclosure. Notice the discrepancies...

https://googleprojectzero.blogspot.com/p/vulnerability-discl...

https://security.googleblog.com/2010/07/rebooting-responsibl...

https://sites.google.com/site/bughunteruniversity/nonvuln

https://www.google.com/about/appsecurity/reward-program/

https://www.google.com/about/appsecurity/

https://googleprojectzero.blogspot.com/p/vulnerability-discl...

> But if you find a bug in GOOGLE'S product and put them on blast YOU will find yourself in court.

Maybe a bit of a dramatization, but the point remains: if Google finds a bug in your product, your policy is moot; they follow theirs. If you find a bug in their product you are expected to follow their policy.

In Apple's case, their "security professionals" will make jokes about you on Twitter, and in Microsoft's case they just do whatever the fuck they want. "Your patches come out on Tuesday, huh? Well, that's 92 days, big guy! Tough break..."


> Maybe a bit of a dramatization

There's dramatization, and there's outright lying. Nothing will happen to you if you follow your own disclosure policy instead of Google's.

If you also want to participate in the program where you get paid by Google, then sure, you have to play by some of their rules. Similarly, nowhere does Project Zero say they expect to get paid if they don't follow the vendor's rules.


> If you find a bug in their product you are expected to follow their policy.

Is their policy more than 90 days? Yes? Fuck 'em; post everything everywhere.


It's not; from parent's own links: "We of course expect to be held to the same standards ourselves."


*impolite cough*

Great, then they should start pushing OS security patches out to devices instead of handing them to manufacturers and carriers and washing their hands of them.


They have started that. That's why more and more of Android has been moved to Google Play Services.

Coincidentally, that's one of the reasons why being denied use of Android is such an obstacle for Huawei, even though it's "open source".


You forgot to include "with pre-built exploit tooling that he explicitly said he spent a lot of time on."

Project Zero has done some great things and improved a lot of security, but this feels like a spiteful slap at a competitor. It's not as if Google is really vulnerable to the same kind of thing; they've long since shown that the security of older versions of their only real public OS is not their concern.


> You forgot to include "with pre-built exploit tooling built around what he explicitly said he spent a lot of time on."

No I didn't; I said everything and I meant everything. If anything, 90 days is overly generous to Google. If they can't get their shit together in three bloody months, fuck them. Of course, this is Google, so fuck them regardless, but this way you have obvious moral high ground.


> looking at non-competing products with large user bases

Like Linux? McAfee/Kaspersky/Malwarebytes/Symantec? LastPass? LG? Google's own software? Nvidia?


Given the complex nature of the fix, when do you think Microsoft should have started working on it?


This possibility arose 20 years ago, according to the article.


You could require a specific digital signature on your exe to register as a CTF service, and verify the thread IDs when clients connect.


To be fair, you should never install a native program that you don't 100% trust. This could presumably be combined with other exploits - a JS vulnerability that gives you control of Chrome, for example - but if you're regularly running untrusted software on Windows (outside of a VM) you probably have bigger problems.


IMO it's not that clear-cut. A VM is safer than a web page, which is safer than a NaCl plugin, which in turn is safer than a UWP app. But in the end you risk privilege escalation with anything that isn't airgapped.


Right, it's still a big deal, but it isn't a clear and present danger in and of itself for responsible users.


What makes you think so? How do you know how many exploits people are sitting on but not using because they lack a good sandbox-escape exploit?

Well, now they do have an escape. That previously useless IE/Chrome exploit might suddenly be able to gain SYSTEM privileges.

You can get compromised by such attacks no matter how responsible you are.


What about privilege escalation? I can take a lot of care with what I install, but what if someone else uses the same computer and isn't that careful?


There is no way to securely share a Windows install with untrusted users. Use real VMs.


I wouldn't necessarily dare to install untrusted software on a VM either.

VM escapes are a thing. VMs present a ton of emulated peripherals (SATA, Ethernet, audio, video/3D, USB host controllers, etc.), which is a lot of attack surface. There are still plenty of VM escape bugs to be found.



