Hacker News
The Microsoft Update mechanism has been used to spread malware (f-secure.com)
225 points by k33l0r on June 4, 2012 | 61 comments

The SecureList summary was much more detailed: http://www.securelist.com/en/blog/208193558/Gadget_in_the_mi...

Flame took advantage of WPAD, a little-known magical hostname (http://en.wikipedia.org/wiki/Web_Proxy_Autodiscovery_Protoco...) to do MITM attacks on the Windows Update servers.
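The danger of WPAD comes from how clients probe for the auto-config file: they walk up the domain hierarchy looking for a host named "wpad", so anyone who can answer for that name anywhere in the suffix chain wins. A minimal sketch of that hostname walk (behaviour simplified; the real resolution order also involves DHCP and varies by client):

```python
def wpad_candidates(fqdn):
    """Return the WPAD URLs a client probes, most-specific first.

    A client named pc1.corp.example.com asks for
    wpad.corp.example.com, then wpad.example.com -- so whoever
    answers for a 'wpad' name anywhere up the suffix chain can
    hand out a malicious proxy auto-config file.
    """
    parts = fqdn.split(".")
    urls = []
    # Walk up the domain hierarchy, stopping before the bare TLD.
    for i in range(1, len(parts) - 1):
        host = ".".join(["wpad"] + parts[i:])
        urls.append(f"http://{host}/wpad.dat")
    return urls
```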

Flame then installed 'WuSetupV.exe' with the description "Desktop Gadget Platform" ("Allows you to display gadgets on your desktop").

What's amazing is that Windows Update doesn't require explicit validation of an update-only certificate chain. It seems like any certificate from the Microsoft root can certify updates (!).
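To make the missing check concrete, here is a hypothetical sketch in Python of what update validation should require: not just that the certificate chains to the vendor's root, but that the leaf carries an update-signing purpose (an EKU-style field). The data shapes, function name, and OID below are illustrative, not the actual Windows Update implementation.

```python
# Placeholder OID standing in for an "update signing" extended key
# usage value -- illustrative only.
UPDATE_SIGNING_OID = "1.3.6.1.4.1.311.76.6.1"

def trusted_for_updates(chain, roots):
    """chain: leaf-first list of dicts with 'issuer', 'subject', 'ekus'.

    Returns True only if the chain anchors at a trusted root AND the
    leaf certificate is explicitly marked for update signing.
    """
    if chain[-1]["issuer"] not in roots:
        return False  # does not chain to a trusted root
    # The missing check: require the update-signing purpose on the
    # leaf, not merely membership in the vendor's root hierarchy.
    return UPDATE_SIGNING_OID in chain[0]["ekus"]
```

Under this rule, a Terminal Services licensing certificate that happens to chain to the Microsoft root would still be rejected for updates.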

This is like finding out the zombies have made it into the compound.

I wonder how big this hole is to fix. I also wonder, as many have, if this was written by an Intelligence agency and, if so, if they had access to Windows' source code.

My Windows already fixed it. http://support.microsoft.com/kb/2718704

The awesome part is how they force you to download an additional signed WGA validator exe before the site coughs up the patch, which itself is signed. If I were the attacker, I'd definitely be MITM'ing this page.

Zombies in the compound indeed.

How can you tell whether the fix was genuine? What if you were already infected by the Flame virus or similar that intercepts your Windows update traffic?

I hope Microsoft can deliver the revoked certificates in a trustworthy manner.

The idea of Microsoft willingly giving Windows source code access to governments does not make a lot of sense.

What could have happened, however, is that the said "Intelligence agency" first created malware to infect MSFT engineers' computers, obtained access credentials for the code repository, and then spoofed themselves as MSFT employees to download the source code. This is a lot more plausible considering what Stuxnet and Flame can already do. (Assuming they were made by the same "Intelligence agency".)

MSFT should really check the systems of their employees first.

Nah, anyone can get hold of Windows sources http://www.microsoft.com/en-us/sharedsource/default.aspx

Microsoft licenses product source code to qualified customers, enterprises, governments, and partners for debugging and reference purposes.

Drop 'em an email if you're curious.

It would be cool if the Wine project could use it for reference. Currently they are working by reverse engineering, AFAIK. I wonder if they are "qualified".

Probably not. Microsoft is 'sharing' their source code with select partners because it's in their best interest to be accommodating to organizations that are building complex systems on top of their product. Wine is closer to a competitor than a partner.

I think in cases like what Wine is trying to do (reimplement/reverse engineer) you want "clean" engineers, not "tainted" ones. So I think that having their devs look at the windows source code might be the last thing they want.


Also see http://oldcomputers.net/compaqi.html

Compaq couldn't just copy IBM's BIOS to make their new machine guaranteed IBM-compatible; this would be illegal, and easily proven by IBM.

Solution: Reverse-engineer IBM's BIOS. Compaq used two sets of programmers - one group analyzed the original code and made notes of exactly how it responded.

The second group took these notes, and wrote their own BIOS that performed exactly the same.

It makes sense to someone.

Microsoft shared source initiative


In 2003 MS gave China access to their source


It really was an unbelievable oversight to use the same certs in the Terminal Services activation system.

Quite a demonstration that even if you go to great pains to secure the code, if you aren't careful with your credentials it's all for nothing.

A very similar thing happened with France during WW2.

I always remind myself I'm not as smart as I think I am.

What happened with France?

I suspect it's a reference to 'the Maginot Line':


Oh yeah, sure, blame it on the Belgians!

Fellow Belgian here; our ancestors were naive...

Our main line of defense, the fort d'Ében-Émael, had been built by German workers. The German army had their plans, and knew the weak point of the fort: its vast, undefended roof (it was used as a football field by the soldiers).

One dark night, they landed with gliders on the roof and manually set up shaped charges[1] to destroy the turrets, which were resistant to conventional bombs.

Game over.

[1] http://en.wikipedia.org/wiki/Shaped_charge , damages seen from the outside: http://upload.wikimedia.org/wikipedia/commons/thumb/f/f6/Ebe...


Edited to reconcile my foggy memory with the historic truth...

I'd be willing to bet some young combat engineer said "but what about the roof?" and some old general replied "you'll never get a horse up there!".

It's worse than that.

The roof should have been mined, but the soldiers petitioned the hierarchy to keep their sports field...

They surrendered.

> I guess the good news is that this wasn't done by cyber criminals interested in financial benefit. They could have infected millions of computers. Instead, this technique has been used in targeted attacks, most likely launched by a Western intelligence agency.

You mean the bad news.

They apparently were only thinking of monetary losses, but government malware on my computer is a lot worse than credit card malware. At least we know what credit card malware can do at best (or worst).

Apparently the other tricky bit is that Windows can be set to auto-configure network proxies (presumably for enterprise support), so the infected host pretends to be the source of auto-config info in order to direct the other systems to connect through it to reach Windows Update. At which point the infected system can tamper with the update package, signing it so it will auto-install.
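The auto-config file is a JavaScript function, FindProxyForURL(url, host); the same decision logic is sketched below in Python for readability. An infected host serving something like this via WPAD steers only Windows Update traffic through itself while everything else goes out normally. The proxy address and hostnames here are illustrative, not taken from the actual Flame sample.

```python
ATTACKER_PROXY = "PROXY 10.0.0.5:80"  # illustrative attacker address

def find_proxy_for_url(url, host):
    """Python rendering of a malicious PAC file's decision logic."""
    update_hosts = (
        "windowsupdate.microsoft.com",
        "update.microsoft.com",
    )
    # Match the update hosts and any of their subdomains.
    if any(host == h or host.endswith("." + h) for h in update_hosts):
        return ATTACKER_PROXY  # MITM just the update channel
    return "DIRECT"            # all other traffic is left alone
```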

I saw the headline and thought "Oh (expletive) I let update run last night!", but it turned out to be the revoked cert update.

"Western Intelligence" agencies really seem to be good at mucking stuff up.

Here we have an example of complexity arising from copy protection/licensing. It so happens that this complexity caused a security vulnerability which, when exploited on any one computer, affects close to a billion computers.

Is anyone else infuriated that a vulnerability like this exists in what is analogous to copy protection code?

In other words, if Microsoft had been spending more of their resources on making software work, instead of making software work only when you've proven you've paid for it, this particular issue would not exist.

The same would happen on Ubuntu if someone stole the repository keys. This has nothing to do with copy protection.

But my point is that more teams needed access to signing keys because some of those teams were dedicated only to licensing issues. If they didn't need signing keys, those keys wouldn't have been compromised.

I suppose I can break it down another way.

Complexity introduces vulnerabilities.

Some complexity is necessary for the software to accomplish what the customer wants.

Some complexity is arguably necessary to protect the interests of the vendor. This is arguable because it varies between open source and proprietary software.

To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists. I don't think this is a controversial statement.

> But my point is that more teams needed access to signing keys because some of those teams were dedicated only to licensing issues. If they didn't need signing keys, those keys wouldn't have been compromised.

Cryptographically signed binaries are not used to manage licensing issues, they are used to make sure that no-one intercepts your download and replaces it with a malicious binary. It is absolutely essential that computer programs are signed or delivered through a secure connection.

Losing your signing keys jeopardizes the entire system; new keys must then be generated and securely distributed (this is hard).

> To me, it is upsetting when the code to protect the vendor's interests is where a critical security vulnerability exists.

Yes, that would be upsetting if it were true. But it isn't. The whole system is in place to protect you, the customer.

The issue was that Microsoft left behind the ability to sign code with a Microsoft certificate by mistake.

The entire reason the attackers could use the certificate was because Microsoft left behind that functionality in the suite of software that allows enterprise customers to license their instances of Terminal Servers.

I am not railing against cryptographic signing as a concept -- what happened was Microsoft played fast and loose with their certificate chain in order to provide their customers with a way to prove that they had paid for software.

The certificate chain could have been a lot cleaner if that licensing bit wasn't necessary.

Well... no, it'd be like if Ubuntu required you to sign up for corporate licensing in order to use Ubuntu, then handed out misconfigured keys to corporate customers which, as an unfortunate side effect, could also be used to sign repos.

Let's not forget that kernel.org itself was hacked. Nor that Firefox et al update themselves. People who live in glass houses...

This is why Linux kernel source code is also signed cryptographically and so is their git repository (all tags are signed). They also employ a PGP-style web of trust instead of an SSL-style centralized certificate management.

The chances of someone slipping in a backdoor in kernel releases are very slim.


But you can always get the source and build the package yourself. Can you do that with Windows?

Yes, but someone can break into whatever server is hosting the source and add a backdoor without anyone noticing for quite a while. Do you go through all the code you compile? I can't think of any specific examples right now, but I remember this happening a few times.

You can rather easily see the deltas. It's not perfect, but it's at least possible. That's another line of defense Windows users lack (and, usually, don't care about)
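Seeing the deltas can be as simple as hashing every file in a trusted tree and a suspect tree and comparing. A crude sketch of the idea (function names are mine; signed release tarballs and git history give you a much stronger version of this for free):

```python
import hashlib
import os

def tree_manifest(root):
    """Map each file's relative path to its SHA-256 digest."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                manifest[rel] = hashlib.sha256(f.read()).hexdigest()
    return manifest

def unexpected_deltas(trusted, suspect):
    """Paths whose contents differ or exist in only one tree."""
    return sorted(
        p for p in set(trusted) | set(suspect)
        if trusted.get(p) != suspect.get(p)
    )
```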

This sort of thing (certificate compromise) has happened with open source projects as well.

The downside of trusting certificates is that you're abstracting trust to a point where you (where you == the IT industry) stop questioning who can do what. You trust something absolutely, but that trust is based on lots of people whom you don't know doing the right thing without making mistakes.

Microsoft has over 90,000 employees, and no doubt some of those people were hired specifically to protect their software licensing. They're probably not pulling their top OS developers to work on this. So the idea that they should have been "spending more of their resources on making software work..." is not really valid.

In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.

I agree with you that they are not necessarily removing developers from other projects to work on licensing. However, a percentage of code in Windows is dedicated to validating licensing and preventing piracy.

> In fact, there is no company or software community anywhere that writes highly complex and bug free software. It's not possible.

I agree with this point. For this reason, the point I'm trying to make is that reducing complexity is a key way to make software more secure. And when reducing complexity, we should look at what is in the user's best interest.

Code has bugs--that is an inescapable fact for the foreseeable future. Therefore, any code running on your computer makes you slightly less safe. Therefore, any code on your machine which is not accomplishing something in your best interests could be considered an unnecessary security risk.

I am arguing that it is not in your best interest to ensure that you bought a little certificate from Microsoft before your computer operates the way you have configured it to.

After all that, of course it is in your best interests to have Microsoft grow its war chest and be a healthy company that can employ tens of thousands of smart developers to write good code (which we choose to keep using year after year).

But, every time a product breaks because they have to make sure I'm not a criminal, and the way they do it is sloppy, I get upset, and I think it's rightfully so. Usually it is small, proprietary CMS or CRM software that drives me crazy. But today it's this issue.

But having updates cryptographically signed and checked against a list of (presumably) trusted certificates is in my best interest.

The "bug" here was in key management, not in SW that's running in my computer.

I shouldn't have used the word "certificate." What I meant was "license."

The code that caused the certificate chain to be compromised is related to licensing Terminal Services on Windows Server. That is how their key management failed, and that is what I'm railing against--The fact that this ability to sign binaries with a legitimate Microsoft certificate has been hidden away in copies of Windows Server for years by mistake. And not for any direct customer benefit, but to employ a licensing scheme.

Let this be a lesson to you: run Windows Update frequently for maximum security.

Now available with new and faster bugs.

But does it refuse to install the malware if it arbitrarily decides that your Windows is not genuine?

So it's working as designed.

Cue the sound of a thousand palms hitting faces at Microsoft HQ.

You realize you already have to be infected with the Flame virus for this to work, right?

No, you just have to be sharing a network with a host already infected with Flame.

Oh look, another scaremongering and purposely misleading article from F-Secure. This is starting to become a regular thing isn't it; I guess the recession must have hit them particularly hard.

I actually feel Mikko & the folks at f-secure are very good at explaining things to people who aren't everyday "virus fighters," but i felt the exact same way you did (like they were scaremongering) when they said "The Nightmare Scenario."

I'm on the fence about this issue, honestly. Part of me feels like they believe passionately that we're stepping into new, dangerous territory. The other part of me indeed feels like this is great advertising for not only them, but their industry (And they're going to push it all they can).

But the fact of the matter is they admit they can't protect you, whoever you are, from these types of targeted attacks. I've seen well respected speakers from Defcon go back and forth with Mikko on Twitter about the efficiency of AV.

It sucks that you're (the parent) being downvoted, but this is an issue, what with the incredible amount of FUD that comes with every serious attack.

EDIT: My mistake in misspelling Mikko's name. Sorry about that.

Mikko, Mikko Hyponnen. Not "Mykko".

I believe it's Hyppönen, but Muphry's Law will probably strike me also ;)

I'm curious, could you please quote some of the points you thought were purposefully misleading in the article?

And here come the shills, one or perhaps two at a time, to defend the article from any critique. I'm afraid I've grown bored of this dance; entertaining as it may have been, I've become listless.

I have no connection to the antivirus industry. It was meant to be a legitimate question. Please reconsider answering it.

yeah, shills. sure. no way you're getting downvoted for spreading FUD and then explicitly refusing to back it up. Nope. Must be shills.

The dude's comment history is hilariously shilly.

I guess I'll just remain scared and misled then.

Frankly, I have no idea what you are trying to say or imply.
