Flame then installed 'WuSetupV.exe' with the description "Desktop Gadget Platform" and the text "Allows you to display gadgets on your desktop".
What's amazing is that Windows Update doesn't require explicit validation of an update-only certificate chain. It seems like any certificate from the Microsoft root can certify updates (!).
This is like finding out the zombies have made it into the compound.
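In X.509 terms, scoping a chain to updates would mean checking purpose restrictions (the Extended Key Usage extension) on every certificate, not just that the chain terminates at a trusted root. A toy sketch of the missing check, with certificates as plain dicts rather than real X.509 objects (all names here are illustrative, not Microsoft's actual certificate subjects):

```python
# Toy model of the missing validation: a chain rooted in a trusted CA
# should only be accepted for signing updates if every link is scoped
# to code signing (in real X.509, the Extended Key Usage extension).
# Certificates are plain dicts here; this is an illustration, not a parser.

TRUSTED_ROOTS = {"Microsoft Root Authority"}

def chain_valid_for_updates(chain):
    """Accept a chain only if it ends at a trusted root AND every
    certificate is explicitly scoped for code signing."""
    if chain[-1]["subject"] not in TRUSTED_ROOTS:
        return False
    return all("code-signing" in cert["purposes"] for cert in chain)

# A licensing certificate chained to the trusted root -- roughly what
# Flame abused: right root, wrong purpose.
licensing_chain = [
    {"subject": "Terminal Services LS", "purposes": {"license-server"}},
    {"subject": "Microsoft Root Authority",
     "purposes": {"license-server", "code-signing"}},
]

update_chain = [
    {"subject": "Windows Update Signer", "purposes": {"code-signing"}},
    {"subject": "Microsoft Root Authority",
     "purposes": {"code-signing"}},
]

print(chain_valid_for_updates(licensing_chain))  # False: purpose check fails
print(chain_valid_for_updates(update_chain))     # True
```

The point of the sketch is that trusting the root alone is not enough; without the per-purpose check, anything chaining to it can sign anything.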
I wonder how big a hole this is to fix. I also wonder, as many have, whether this was written by an intelligence agency and, if so, whether they had access to Windows' source code.
The awesome part is how they force you to download an additional signed WGA validator exe before the site coughs up the patch, which itself is signed. If I were the attacker, I'd definitely be MITM'ing this page.
How can you tell whether the fix was genuine? What if you were already infected by the Flame virus or similar that intercepts your Windows update traffic?
I hope Microsoft can deliver the revoked certificates in a trustworthy manner.
The idea of Microsoft willingly giving a government access to Windows source code does not make a lot of sense.
What could have happened, however, is that said "intelligence agency" first created malware to infect MSFT engineers' computers, got access credentials for the code repository, and then spoofed themselves as MSFT employees to download the source code. That is a lot more plausible considering what Stuxnet and Flame can already do (assuming they were made by the same "intelligence agency").
MSFT should really check the systems of their employees first.
It would be cool if the Wine project could use it for reference. Currently they are working by reverse engineering, AFAIK. I wonder if they are "qualified".
Probably not. Microsoft is 'sharing' their source code with select partners because it's in their best interest to be accommodating to organizations that are building complex systems on top of their product. Wine is closer to a competitor than a partner.
I think in cases like what Wine is trying to do (reimplement/reverse engineer) you want "clean" engineers, not "tainted" ones. So I think that having their devs look at the windows source code might be the last thing they want.
Compaq couldn't just copy IBM's BIOS to make their new machine guaranteed IBM-compatible; that would have been illegal, and easily proven by IBM.
Solution: Reverse-engineer IBM's BIOS. Compaq used two sets of programmers - one group analyzed the original code and made notes of exactly how it responded.
The second group took these notes, and wrote their own BIOS that performed exactly the same.
Fellow Belgian here; our ancestors were naive...
Our main line of defense, the fort d'Ében-Émael, had been built by German workers. The German army had their plans, and knew the weak point of the fort: its vast, undefended roof (it was used as a football field by the soldiers).
During a dark night, they landed with gliders on the roof, and manually set up shaped charges[1] to destroy the turrets, which were resistant to conventional bombs.
> I guess the good news is that this wasn't done by cyber criminals interested in financial benefit. They could have infected millions of computers. Instead, this technique has been used in targeted attacks, most likely launched by a Western intelligence agency.
They apparently were only thinking of monetary losses, but government malware on my computer is a lot worse than credit card malware. At least we know what credit card malware can do at best (or worst).
Apparently the other tricky bit is that Windows can be set to auto-configure network proxies (presumably for enterprise support), so the infected host pretends to be the source of auto-config info in order to direct the other systems to connect through it to get to Windows Update. At which point the infected system can substitute its own package, signed with the rogue certificate, so it will auto-install.
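The auto-configuration mechanism in question is WPAD: a client walks up its own DNS domain looking for a host named "wpad" and fetches a proxy auto-config file from the first one that answers. A minimal sketch of that devolution (hostnames here are made up for illustration):

```python
# Sketch of WPAD's DNS devolution: a client named pc42.dept.corp.example.com
# walks up its own domain looking for a host called "wpad" and fetches
# /wpad.dat (a proxy auto-config file) from the first one that answers.
# Any machine on the network that can answer for "wpad" therefore gets
# to choose everyone's proxy -- which is what Flame exploited.

def wpad_candidates(fqdn):
    labels = fqdn.split(".")[1:]          # drop the hostname itself
    urls = []
    while len(labels) >= 2:               # stop before the bare TLD
        urls.append("http://wpad." + ".".join(labels) + "/wpad.dat")
        labels = labels[1:]
    return urls

print(wpad_candidates("pc42.dept.corp.example.com"))
# tries wpad.dept.corp.example.com, then wpad.corp.example.com,
# then wpad.example.com
```

Because the lookup is unauthenticated, an infected host answering for "wpad" can hand every neighbor a PAC file pointing at itself as the proxy.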
Here we have an example of complexity arising from copy protection/licensing. It so happens that this complexity caused a security vulnerability which, when exploited on any one computer, affects close to a billion computers.
Is anyone else infuriated that a vulnerability like this exists in what is analogous to copy protection code?
In other words, if Microsoft had been spending more of their resources on making software work, instead of making software work only when you've proven you've paid for it, this particular issue would not exist.
But my point is that more teams needed access to signing keys because some of those teams were dedicated only to licensing issues. If they didn't need signing keys, those keys wouldn't have been compromised.
I suppose I can break it down another way.
Complexity introduces vulnerabilities.
Some complexity is necessary for the software to accomplish what the customer wants.
Some complexity is arguably necessary to protect the interests of the vendor. This is arguable because it varies between open source and proprietary software.
To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists. I don't think this is a controversial statement.
> But my point is that more teams needed access to signing keys because some of those teams were dedicated only to licensing issues. If they didn't need signing keys, those keys wouldn't have been compromised.
Cryptographically signed binaries are not used to manage licensing issues; they are used to make sure that no one intercepts your download and replaces it with a malicious binary. It is absolutely essential that computer programs are signed or delivered through a secure connection.
Losing your signing keys jeopardizes the entire system; new keys must be generated and securely distributed (which is hard).
> To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists.
Yes, that would be upsetting if it were true. But it isn't. The whole system is in place to protect you, the customer.
The issue was that Microsoft left behind the ability to sign code with a Microsoft certificate by mistake.
The entire reason the attackers could use the certificate was because Microsoft left behind that functionality in the suite of software that allows enterprise customers to license their instances of Terminal Servers.
I am not railing against cryptographic signing as a concept -- what happened was Microsoft played fast and loose with their certificate chain in order to provide their customers with a way to prove that they had paid for software.
The certificate chain could have been a lot cleaner if that licensing bit wasn't necessary.
Well... No, it'd be like Ubuntu requiring you to sign up for corporate licensing in order to use Ubuntu, then handing out misconfigured keys to corporate customers which, as an unfortunate side effect, could also be used to sign repos.
This is why Linux kernel source code is also signed cryptographically and so is their git repository (all tags are signed). They also employ a PGP-style web of trust instead of an SSL-style centralized certificate management.
The chances of someone slipping in a backdoor in kernel releases are very slim.
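The property that makes slipped-in backdoors hard is that a release tag only verifies if it was produced with the maintainer's key. The kernel uses PGP signatures (asymmetric keys); the HMAC below is a deliberate stdlib simplification that still shows the core behavior, with a made-up key and release name:

```python
import hashlib
import hmac

# Toy illustration of why signed releases are hard to backdoor.
# Real kernel releases use PGP (asymmetric) signatures; HMAC is a
# simplification that keeps the key property: without the signing key,
# you cannot produce a tag that verifies, and any change to the
# release breaks the existing tag.

SIGNING_KEY = b"maintainer-secret"  # hypothetical key, never on the download server

def sign(tarball: bytes) -> str:
    return hmac.new(SIGNING_KEY, tarball, hashlib.sha256).hexdigest()

def verify(tarball: bytes, tag: str) -> bool:
    return hmac.compare_digest(sign(tarball), tag)

release = b"linux-3.4.tar.xz contents"
tag = sign(release)

print(verify(release, tag))                        # True
print(verify(release + b"/* backdoor */", tag))    # False: any change breaks it
```

The flip side, which is exactly what happened to Microsoft here, is that if the signing key itself is compromised or misused, everything the attacker signs verifies cleanly.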
Yes, but someone can break into whatever server is hosting the source and add a backdoor without anyone noticing for quite a while. Do you go through all the code you compile? I can't think of any specific examples right now, but I remember this happening a few times.
You can rather easily see the deltas. It's not perfect, but it's at least possible. That's another line of defense Windows users lack (and, usually, don't care about).
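"Seeing the deltas" just means mechanically diffing two source releases and auditing what changed; in practice that's `diff -u old/ new/` or `git diff v3.3..v3.4`. A toy version with `difflib`, using an invented code snippet:

```python
import difflib

# With source releases you can diff two versions and audit exactly what
# changed. The function and the injected line below are made up for
# illustration; real auditing would diff whole release trees.

old = [
    "int auth(char *pw) {",
    "    return check(pw);",
    "}",
]
new = [
    "int auth(char *pw) {",
    '    if (!strcmp(pw, "magic")) return 1;  /* backdoor */',
    "    return check(pw);",
    "}",
]

# Keep only added/removed lines, dropping the diff headers.
delta = [line for line in difflib.unified_diff(old, new, lineterm="")
         if line.startswith(("+", "-"))
         and not line.startswith(("+++", "---"))]
print(delta)  # the injected line stands out as a single "+" entry
```

A binary-only distribution offers no equivalent: you can hash the blob, but you can't read the change.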
This sort of thing (certificate compromise) has happened with open source projects as well.
The downside of trusting certificates is that you're abstracting trust to a point that you (where you == the IT industry) stops questioning who can do what. You trust something absolutely, but that trust is based on lots of people whom you don't know doing the right thing without making mistakes.
Microsoft has over 90,000 employees, and no doubt some of those people were hired specifically to protect their software licensing. They're probably not pulling their top OS developers to work on this. So the idea that they should have been "spending more of their resources on making software work..." is not really valid.
In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.
I agree with you that they are not necessarily removing developers from other projects to work on licensing. However, a percentage of code in Windows is dedicated to validating licensing and preventing piracy.
> In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.
I agree with this point. For this reason, the point I'm trying to make is that reducing complexity is a key way to make software more secure. And when reducing complexity, we should look at what is in the user's best interest.
Code has bugs--that is an inescapable fact for the foreseeable future. Therefore, any code running on your computer makes you slightly less safe. Therefore, any code on your machine which is not accomplishing something in your best interests could be considered an unnecessary security risk.
I am arguing that it is not in your best interest to ensure that you bought a little certificate from Microsoft before your computer operates the way you have configured it to.
After all that, of course it is in your best interests to have Microsoft grow its war chest and be a healthy company that can employ tens of thousands of smart developers to write good code (which we choose to keep using year after year).
But, every time a product breaks because they have to make sure I'm not a criminal, and the way they do it is sloppy, I get upset, and I think it's rightfully so. Usually it is small, proprietary CMS or CRM software that drives me crazy. But today it's this issue.
I shouldn't have used the word "certificate." What I meant was "license."
The code that caused the certificate chain to be compromised is related to licensing Terminal Services on Windows Server. That is how their key management failed, and that is what I'm railing against--the fact that this ability to sign binaries with a legitimate Microsoft certificate has been hidden away in copies of Windows Server for years by mistake. And not for any direct customer benefit, but to employ a licensing scheme.
Oh look, another scaremongering and purposely misleading article from F-Secure. This is starting to become a regular thing isn't it; I guess the recession must have hit them particularly hard.
I actually feel Mikko & the folks at f-secure are very good at explaining things to people who aren't everyday "virus fighters," but I felt the exact same way you did (like they were scaremongering) when they said "The Nightmare Scenario."
I'm on the fence about this issue, honestly. Part of me feels like they believe passionately that we're stepping into new, dangerous territory. The other part of me indeed feels like this is great advertising for not only them, but their industry (And they're going to push it all they can).
But the fact of the matter is they admit they can't protect you, whoever you are, from these types of targeted attacks. I've seen well respected speakers from Defcon go back and forth with Mikko on Twitter about the efficacy of AV.
It sucks you're (the parent) being downvoted but this is an issue, what with the incredible amount of FUD that comes with every serious attack.
EDIT: My mistake in misspelling Mikko's name. Sorry about that.
And here come the shills, one or perhaps two at a time to defend the article from any critique. I'm afraid I've grown bored of this dance, entertaining as it may have been I've become listless.
Flame took advantage of WPAD, a little-known magical hostname (http://en.wikipedia.org/wiki/Web_Proxy_Autodiscovery_Protoco...) to do MITM attacks on the Windows Update servers.