Tim Cook’s refusal to help FBI hack iPhone is validated by ‘WannaCry’ attack (bgr.com)
450 points by BishopD on May 15, 2017 | 180 comments



The damages caused by US agencies* are easily underestimated. It's not just the equipment rendered useless and data lost due to their software. It's also the millions of hours spent by IT departments worldwide battling with infected computers and securing not yet infected computers. Damages are easily in the tens of billions of USD.

And then there is the loss of credibility of the US government and the companies involved.

*Edit: FBI changed to US agencies.


The recent hacks used NSA tech, the FBI had nothing to do with it. The OP is generalizing that un-closed vulnerabilities in the hands of any government agency are an unacceptable risk, which I agree with, but let's put the blame for the current issue where it belongs.


Agree and edited, although it really is two units under the same government - and that's where the blame belongs.


but let's put the blame for the current issue where it belongs

Was GP generalizing, or overspecifying? Because from where I sit the responsibility (for not notifying Microsoft) does appear to lie within the Departments of Defense and Justice, and I wouldn't be surprised if the Venn diagram for the split responsibility is based on a simple division of labor.


> It's not just the equipment rendered useless and data lost due to their software. It's also the millions of hours spent by IT departments worldwide battling with infected computers and securing not yet infected computers. Damages are easily in the tens of billions of USD.

You can't pin that on spy agencies. Criminals would have exploited the same holes eventually. Manufacturers and software developers could just start using better languages and better development practices instead of whinging every time their C/C++ software is trivially owned.


> You can't pin that on spy agencies. Criminals would have exploited the same holes eventually.

Officially, the NSA's role is not to sit on zero-day exploits. If it discovers zero-day exploits (either by accident or through active research), then it should follow industry practices for disclosing them so they can be patched[0].

[0] There are debates about what constitutes "industry best practices", but I think we can all agree that "telling nobody, including the software vendors, in the hope of being able to exploit it later" does not qualify.


Either NSA's role involves using zero-day exploits, in which case they shouldn't disclose them; or NSA's role doesn't involve using zero-day exploits, in which case it shouldn't be looking for them and would have nothing to disclose.

You might argue about whether the NSA should be in the 'hacking other people with 0-days' business or the 'making computers more vulnerable' business, but the NSA is definitely not in the business of making computers less vulnerable. Especially for computers located in other countries: they have no business helping to protect them - that's against their role; they're paid to facilitate the exact opposite goal.


> Either NSA's role involves using zero-day exploits, in which case they shouldn't disclose them; or NSA's role doesn't involve using zero-day exploits, in which case it shouldn't be looking for them and would have nothing to disclose.

Third possibility: their role involves securing domestic infrastructure, in which case they should be looking for vulns and getting them fixed. IIRC this actually is part of their statutory mandate, alongside offensive operations.

That being said, it still wouldn't have helped against WannaCry, because Microsoft blundered when disclosing the vuln: they patched newer versions of Windows while leaving the majority of XP installations unfixed. Even without the NSA leak, somebody might have reverse engineered the bug from the released updates - such things have happened in the past.


> Third possibility: their role involves securing domestic infrastructure, in which case they should be looking for vulns and getting them fixed. IIRC this actually is part of their statutory mandate, alongside offensive operations.

This sounds similar to what IAD does. The difference is that IAD focuses on securing systems against entire classes of vulnerabilities and attacks.

SID has the mission of gathering signals intelligence, so that is why it makes sense for them to utilize individual zero day vulnerabilities. They need to get into an adversary system, so vulnerabilities are required (when needed) to get that access.


How do you propose to solve this conundrum: the vulnerability is so bad that MS needs to release a fix even for unsupported OSes, and get people still running those unsupported OSes to install it, all without disclosing that there's a critical issue?


I'm not saying they shouldn't be disclosing that the patch is critical, I said that in the case of ancient bugs affecting all versions of Windows they should release all patches at the same time.


Correct me if I'm wrong, but isn't that what they did? Didn't they do it on the usual patch Tuesday?

I don't have any old versions of Windows installed, so I truly don't know how it rolled out there.


Wow, that's an old post. For currently supported versions I believe they released all patches at the same time, but XP was a problem: users without extended support contracts were left vulnerable, which facilitated the spread of WannaCry until Microsoft pushed updates to everyone. It's Microsoft's policy not to release fixes for unsupported versions to the public (this was an exception), and it leads to this kind of problem.


> but NSA definitely is not in the business of making computers less vulnerable.

Last time I spoke with the NSA, admittedly over a decade ago at a college recruitment event, they were in that business as well. So much so that they created a hardened version of Linux that a lot of us use today [0].

[0] https://en.wikipedia.org/wiki/Security-Enhanced_Linux


That looks like a false dichotomy to me. I don't see any reason why the NSA's role couldn't include protecting the nation's citizens. I mean that's the original purpose of government, no?


NSA's role could include protecting the nation's citizens or whatever other role the US government decides to assign to NSA, but it currently does not.

In addition to its main role of collecting foreign intelligence data (which includes breaking encryption and finding/making vulnerabilities), it does have a secondary mission of protecting US government systems. Hopefully that means they have ensured that US gov't systems have been patched against those vulnerabilities. For example, Microsoft patching the SMB vulnerability (including for WinXP) before it was made public was likely done after a report from the NSA, and would be consistent with such a goal. Having "private" custom patches that get installed on sensitive gov't systems but are not made available to the general public would also match this goal.

However, the relationship between the NSA and the main victims of WannaCry is at best neutral. For example, it seems reasonable that the NSA would want companies like Spain's Telefonica to stay vulnerable to various exploits - their mission explicitly includes obtaining intelligence from (among other sources) foreign telecommunications operators, so if they had the ability to protect Telefonica, doing so would go against the mission given to the NSA by Congress.


The purpose of the NSA is the same as the purpose of the military: To apply force in the digital arena, the same way the military applies force in the physical arena.

In the same vein, the NSA isn't about protection. That's not their role, just like the point of the military isn't protection.


NSA has dual roles, not only spying on others but also protecting U.S. resources. This creates this tension between keeping exploits secret or informing vendors and others. For a long time, security researchers like Bruce Schneier have advocated for splitting the NSA up, essentially one for offense and one for defense.


> the point of the military isn't protection

I'd argue that it is, if indirectly.


It's true. We're all safer thanks to the enormous success of our military. That's why it's valuable, just like the NSA.

So it leads to the same logical conclusions: If the point of the military is to apply force to an opponent, then the NSA has to have ways of applying force to a digital opponent. Disclosing vulnerabilities would be precisely the opposite of what they should do.


> We're all safer thanks to the enormous success of our military

Examples from the last 50 years?


It is espionage by the US intelligence community against US companies. A lot of these guys should be in prison.

I've repeatedly said on HN -- if US tech companies can't build the most secure systems possible, the US government itself will not be able to secure its own systems. Secure devices and communications are not a given, nor will they become easier.


Are you of the opinion that we should dismantle our military? Because that's what you'd literally be doing by sending NSA employees to prison.

They are military. It was a cyberweapon. That's the point of a cyber weapon. Russians stole it from the NSA. Yes, that's incompetence, but an attacker only has to get lucky once. It's much harder to defend a cyberweapon than a tomahawk missile.

Does that mean we shouldn't build cyberweapons? No. That's just how cyberweapons are. That's the reality we live in now, and the world will just have to deal with it. There's nothing to be done, is there? Not unless you're suggesting that computers should never be used for offensive capabilities, which from a game theory perspective means that country will be powerless in the digital arena.


If the people working for the NSA don't want to go to prison, then maybe they should stop committing treason by breaking our constitution and illegally spying on Americans through illegal programs.

Don't do the crime if you can't do the time.

Personally, I'd rather our military NOT be full of people who are willing to do this kind of thing.

If this discussion was about missiles, and the gov agency in charge of the missiles were using them on Americans, I'd say that they should go to jail as well.

You are right. These cyber exploits are weapons. And they should stop using those weapons on Americans, as a multitude of leaks have shown that they do, through mass spying and mass data collection programs.


It's not productive to reduce the argument to this extreme. There's no way for the NSA to do what the NSA does without searching for exploits and weaponizing them. That's the whole reason to have an NSA.


It's all good until they get caught; then they should go to prison like the rest of the criminals.


Lives too. Our already financially crippled NHS was hit extremely hard.

They are not allowed to upgrade automatically because their software will break. Upgrading the software isn't feasible when the funding gap is measured in billions.


I think you meant to refer to the NSA instead of the FBI.


Quite an incentive for an untimely and disproportionate response to the attackers, should they be located.

Edit: Umm, humans have formed armies and started wars over far less theft and far lesser threats to their populations than this. If the attackers are located, some piano wire around their necks is really cheap in comparison. And for this amount of damage, I do not doubt for a moment that it's a consideration by every single government spy organization.


If these hackers caused tens of billions of dollars worth of damage... I'm happy to see a rapid and decisive response severe enough to dissuade people from trying a stunt like this again.

Maybe leaving my front door unlocked was a bad idea, but you still robbed my house.


Or, you know, just improving their practices.


I'm generally in favor of everyone's computer working but I think it would be wonderful if the WannaCry malware infects the NSA to the point they're shut down for a week or two. I don't know if I'll call it karma but they should at least share some of the pain. If it hadn't been the health-care system, I would have been cheering for the compromise of any of the five-eyes partners as well.

Note to the NSA/CIA - I want to cheer for my government, to be proud to be an American (even the U.S. variety) and it would be wonderful if the U.S. could return to being viewed as "the good guys". I'm not sure that type of patriotism will ever return.


> it would be wonderful if the U.S. could return to being viewed as "the good guys"

You mean internally, or that there was ever a time when the external perception of the US was of "good guys"?

I mean, I can't possibly imagine the latter being the case. For instance, it's pretty common knowledge in South America that the US was involved in setting up bloody dictatorships there in the 60s and 70s.

Only if US citizens believe in their own soft power propaganda (e.g. Hollywood).

See [0], it gets pretty ridiculous after 1949.

[0]: https://en.wikipedia.org/wiki/United_States_involvement_in_r...

Edit (Relevant Carlin):

Israeli murderers are called commandos, Arab commandos are called terrorists. Contra killers are called freedom fighters. Well, if crime fighters fight crime and fire fighters fight fire, what do freedom fighters fight? They never mention that part to us, do they?


It concerns me more that the UK's healthcare system runs on a no longer supported operating system. Something that important should be kept up to date.

The UK's NHS computers were running Windows XP (which was EOL). http://www.telegraph.co.uk/news/2017/05/12/nhs-hit-major-cyb...

Otherwise, the patch to fix the security flaw was released back in March before it was a real problem. There will always be holes in software, it's important to have a system in place to patch them in a timely manner.


"only" 5% of the computers are running XP, which is an improvement driven through after a report given to Hunt last year about the risks of old IT.

It wasn't just affecting XP machines, it was affecting other machines if they hadn't been patched.

http://www.bbc.co.uk/news/uk-39918426

> We know there have been warnings before about IT security in the NHS - last summer a review said it needed looking at.

> But the problem is that over the last three years the capital budget - which is a ring-fenced fund used to pay for buildings and equipment - has been raided by the government to bail out day-to-day services, such as A&E.

> Last year a fifth of the capital budget was diverted.

> That, of course, makes it more difficult for trusts to keep their systems up to date.

EDIT: Here's a better post https://www.instituteforgovernment.org.uk/blog/nhs-cyber-att...

> One of the problems with digital government is reforming the technology infrastructure which underpins its services (‘legacy’). There has been much speculation about how the continued use of Windows XP operating systems within the NHS contributed to the cyber-attack. Although only 4.7% of NHS devices use Windows XP, these are spread across 90% of trusts. Computers that have not been updated with Microsoft’s latest software were susceptible to the ransomware. Meanwhile, NHS legacies are further complicated by the patchwork of contracts across trusts. This digital fragmentation is in keeping with the scale of fragmentation within the NHS itself.


> it was affecting other machines if they hadn't been patched.

My point is mission critical software should always be up to date. To go months without installing the update on important systems is unacceptable. It doesn't matter who originally released this exploit, exploits routinely become available. It's about fixing them in a timely manner when they appear.


> It concerns me more that the UK's healthcare system runs on a no longer supported operating system. Something that important should be kept up to date.

I was recently thinking that software producers wishing to discontinue support for a software product should have to open source that product version (or at least disclose the source to paying customers). That way others can continue to support it and backport patches if it's truly necessary.


Most software is based on previous renditions of the same software. This would never work in practice without giving up trade secrets.


And? Trade secrets have and deserve little legal protection. This will have one of two effects: either software products will gain longer term support because companies want to retain their secrets, or supporting old versions becomes so costly for them that they are willing to let the source go.

Either outcome seems better for customers. In fact, it's significant incentive for better development practices and good software design so these companies can maintain older versions more easily. Sounds pretty good all around, so where's the downside?


Call me crazy, but I don't think any government should be running closed source software for anything where there exists an open source alternative.


> I think it would be wonderful if the WannaCry malware infects the NSA to the point they're shut down for a week or two

Would it also be "wonderful" if the US Army "shut down for a week or two?" How about the Coast Guard?

As an American, NSA protects you, that is NSA's job. NSA's job is not to impress you or make you think they are good guys.


No ... I think I was pretty specific about including the agencies who operate without enough oversight, frequently flout the law (and ignore the constitution) and are eroding the rights of the people they're supposedly serving.

The military is completely different. The troops on the ground are generally following orders and highly disciplined. There are definitely occasions where I don't agree with the orders they're following - I can't possibly fault the soldiers. Adding the Coast Guard to the mix is borderline ridiculous. And you completely forgot the Red Cross - there are some very militant nurses involved.


Please provide references for these claims:

operate without enough oversight

This is your opinion. FISA Court exists for a reason, and is there to protect the rights of Americans.

frequently flout the law (and ignore the constitution)

No idea how you're getting to this assumption, unless you're taking clickbait at face value. There have been contested judgements regarding the legal justification for metadata collection. That hardly constitutes "ignoring the constitution." I'm also curious as to your claim of frequency.

eroding the rights of the people they're supposedly serving

NSA cannot collect information about American citizens, only FBI can. I believe NSA can only be involved with a specific warrant issued in response to an imminent or direct threat, but even then, I think it has to be FBI.

The military is completely different

This is the point I'm trying to make to you: NSA is the military. One more time. NSA is the military. Excuse the gross oversimplification, but NSA provides intelligence to other branches of the military so that they can function in the most informed way possible.

If you "can't possibly fault the soldiers" who are "on the ground" and "generally following orders," I'm going to assume that you are someone who supports keeping US soldiers as safe as possible. Yet you seem to delight in the notion of the NSA being shut down for a week, which would directly endanger those lives. How do you reconcile this?


I suppose these are now my opinions - they've developed over the last ten or so years as we've learned just how entrenched these supposedly controlled agencies are. The FISA court is described as rubber-stamping warrants etc. Read the news!

Your assertion that the NSA can't collect data about American Citizens would be great if it were true. There are pretty convincing arguments that they aren't abiding by at least the spirit of the law and that their lawyers are actively engaged in weasel-wording away the restrictions. I'm sorry but collecting the meta-data of American citizens is still collecting data (meta is a sub-class).

You've got a pretty strong opinion of the agency ... I'm guessing that you work for the PsyOps division. Or perhaps you're just a youngin' who hasn't been around the block quite enough times. And don't get me wrong - I've had a great life as a U.S. citizen. I gained my political awareness during the Reagan presidency and was at that point very proud of everything my country did - I didn't know how much was happening outside the view of the TV screen that I wouldn't be proud of - or how many other parts of our history could be questioned.

I'm certainly not advocating anything beyond having exactly this kind of discussion. It's not my fault that the NSA, CIA and U.S Government are viewed so negatively - they have the reputation that they deserve. I can understand why the rest of the world hates us but I hope we can decide to change our behavior.


> Your assertion that the NSA can't collect data about American Citizens would be great if it were true.

NSA is not allowed to intercept communications of American citizens. I should have been more explicit about that. I am simply trying to add context to what I believe is an erroneous perception that the NSA is a building full of gunslinging unchecked hackers that gleefully eschew US law so that they can... do what exactly?

> You've got a pretty strong opinion of the agency ... I'm guessing that you work for the PsyOps division. Or perhaps you're just a youngin' who hasn't been around the block quite enough times.

LOL. What does it say that you'd assume anyone who has a non-negative opinion of NSA is either working for them, or young and misinformed? Sorry to disappoint, but I am neither. I simply respect the truth, and misinformation regarding the US intelligence community is a cancerous detriment to the long-term personal safety of Americans. This is why I asked for references for your claims. But if you take nothing else away from our exchange other than the understanding that NSA is military, I'm happy.

> I'm certainly not advocating anything beyond having exactly this kind of discussion.

Good. I'm glad. See how it only takes a simple challenging of one's opinions to go from "I hope the NSA gets shut down" to "let's have a discussion?"


> I can't possibly fault the soldiers

You can. We all collectively decided at Nuremberg that "I was just following orders" is not a credible defence of one's actions.


> The troops on the ground are generally following orders and highly disciplined.

Because they have been disciplined; this isn't how militaries worked 10,000 years ago.

I agree with your parent that it probably is safer to work on disciplining the NSA (and others such) than hope that they will get pwned or do anything in that direction.


Let's accept your analogy. They're military and they let sophisticated weapons systems go to the enemy. If the US Army let someone steal a squadron of Apache helicopters that were used to level a few civilian towns, that particular base where the theft occurred (at least) would be on lockdown and under harsh review for some time. If someone wishes one of the targets hit with the stolen vehicles was the base from which the theft happened so that they shared in the misery they caused, what's wrong with that? Are people supposed to say, "Well, little Billy is dead and we'll never see him grow up, but at least the fools who let this happen are okay?"


What gives you the impression that NSA is not under harsh review?

There have been many instances reported publicly about US tech falling into enemy hands. I've included a few from recent memory. This is something that happens, and I share your concern. It is a bad thing that happens, among many bad things that happen in war.

But to optimize away from it, would be to sacrifice the technological edge that keeps Americans safe.

https://en.wikipedia.org/wiki/Iran–U.S._RQ-170_incident https://www.theguardian.com/world/2015/jun/01/isis-captured-... https://www.theguardian.com/world/2011/aug/15/us-helicopter-...


> As an American, NSA protects you, that is NSA's job.

No, it doesn't. It protects the status quo.


>and it would be wonderful if the U.S. could return to being viewed as "the good guys"

The window that this occurred in is much smaller than many people realize. You need to go back in time a little, but not too far back — also, only look at our actions abroad and mostly ignore how we treated certain people domestically. From the outside looking in, the US has mostly kind of been the "eh, okay I guess?" guys. Or the "good by contrast guys."


"The window that this occurred in is much smaller than many people realize"

Maybe the default position for all governments (and people?) should be "they did something good at this instant in time". It's kind of sad to view the world that way but probably safer than swallowing the revisionist explanations that are usually offered up.


I think it's helpful to overlay a lot of American history with a timeline of not-so-great things that were happening at the same time.

For example, the american revolutionary war was an important event... but also we had slaves (some who were forced to fight), it fractured many indigenous tribes, and the first presidential election 10 years later only allowed male landowners to vote.


I don't think you understand how corrupt security-theatre government reasoning works (see TSA)

1) We need funding to fight X.

2) If X happens anyway, we need more funding.

As long as there is a veneer of democracy over the outright theft of tax funds, there will be a way for these "essential" agencies to grow until they kill/consume their host.


> 1) We need funding to fight X. 2) If X happens anyway, we need more funding.

That does seem to be roughly the stance taken by security agencies, but it also occurs to me that I view this as a completely reasonable argument when I hear it from an IT department or the CDC.

I want the CDC to have lots of resources to protect against serious outbreaks of infectious disease, and if one happens, I see it as a signal of three possibilities:

(a) They need more resources.

(b) They need to fix their methods.

(c) Nothing's 100% safe and we should accept that.

I definitely see the TSA as pretty much 100% (b)+(c) (mostly (c), to be honest), but the reasoning you cited isn't always wrong--sometimes (a) is legitimately the case, so you can't dismiss it out of hand.


This is a shallow, incorrect argument.

As a reminder, a custom version of iOS is absolutely useless without the means to get it onto a device. In fact, it's possible and relatively cheap for a talented team to make that custom version without Apple's help.

That means of getting it onto a device is a key held only by Apple, needed to sign software before it can run on any iOS device. Apple uses this key every single time any iOS device around the world is updated: a nonce is generated on the device, sent to Apple, signed along with the firmware, then verified on-device before the new software is allowed to run.

Creating the software may have been work for Apple, but it would not contribute in any way to making phones less safe. Signing a piece of software inside their own premises for a particular nonce and device ID likewise can't be used to make any other device unsafe. This is a process Apple performs many times a day, whenever any device updates.
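Roughly, that per-device signing flow can be sketched as below. This is a minimal illustration only: the function names and device IDs are made up, and an HMAC over a shared key stands in for Apple's real asymmetric signature scheme (in reality the device holds only a verification key, and Apple's signing key never leaves their HSMs).

```python
import hashlib
import hmac
import os

# Hypothetical stand-in for Apple's signing key; in the real scheme this is
# an asymmetric private key and the device only holds the public half.
APPLE_SIGNING_KEY = os.urandom(32)

def new_nonce() -> bytes:
    """Device side: generate a fresh nonce for this update request."""
    return os.urandom(16)

def apple_sign(firmware: bytes, nonce: bytes, device_id: bytes) -> bytes:
    """Signer side: bind the signature to this exact firmware, nonce, and device."""
    msg = hashlib.sha256(firmware).digest() + nonce + device_id
    return hmac.new(APPLE_SIGNING_KEY, msg, hashlib.sha256).digest()

def device_verify(firmware: bytes, nonce: bytes, device_id: bytes, sig: bytes) -> bool:
    """Device side: refuse to run firmware unless the signature checks out."""
    expected = apple_sign(firmware, nonce, device_id)
    return hmac.compare_digest(expected, sig)

firmware = b"ios-update-blob"
device_id = b"DEVICE-0001"

nonce = new_nonce()
sig = apple_sign(firmware, nonce, device_id)

assert device_verify(firmware, nonce, device_id, sig)       # valid, fresh signature installs
assert not device_verify(firmware, new_nonce(), device_id, sig)  # replay with a new nonce fails
```

Because the nonce and device ID are folded into what gets signed, a signature produced for one unlock request can't be replayed onto any other device or even onto the same device later, which is the point the parent is making.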


If they started doing that, local law enforcement agencies would also ask to have their govOS signed by Apple and installed on particular devices. How do you make sure that a) the request from the local law enforcement branch is authentic and has not been compromised, and b) the person requesting such an update is actually authorized to do so (and not just some criminal who gained access to the law enforcement computers)?

No matter what you do, there is still a very good chance that unauthorized people somehow get an OS signed for any of Apple's devices they want. It might not result in such a huge attack, but nonetheless there is still damage to be done.


The right answer here is they need a valid writ from a court, just like the FBI had.

We don't have much of an issue with criminals getting wiretaps through faking subpoenas to phone companies (although I did find https://arstechnica.com/tech-policy/2017/03/feds-brooklyn-pr... when searching.)

Note that I'm not arguing that Apple should have complied, that the court should have said a particular thing, or anything like that (did enough arguing last year when the case was going on). All I'm saying is the comparison is invalid, and the author doesn't know the technical details and is therefore wrong.


This is assuming that whole thing is free of bugs, which is never going to be the case.

Just like the fastest code is the code that is never executed, the safest code is the code that is never written in the first place.


If the device firmware that checks for signatures has a bug then it can already be exploited by anyone who can find it.

Creating a new iOS version in no way increases the attack surface. If you think it does then you probably don't understand how the technicalities work, or I'm misunderstanding you (in which case please sketch a scenario where Apple doing what was requested increases attack surface).


Indeed the attack surface is the same, but now there is an extra incentive to crack the signing process, because you will be able to run an open version of the OS. Cracking the signing process now is useless because the OS image is still the original.

Now, cracking the signing process so that it authorizes a third-party-modified version of the OS is something else completely, but I'm not sure it would make any difference if the attack is done with physical access. It could be useful for remote attacks.


So your scenario is

>code gets leaked

>incentive to find bugs in signing increased

>bug in signing found

Offhand I would think the value of an exploitable signing bug is an order of magnitude greater than the cost of making the custom software. At the very least it trivially allows jailbreaks, which puts it in the million-dollar range depending on how exploitable it is, and I think creating iOS software that skips the passcode would at worst be in the low six-figure range if it didn't need signing, more likely mid five figures.

Saurik said it could be done in a week by a talented programmer (see https://news.ycombinator.com/item?id=11153022), so even though the supply of such programmers is limited I'm still confident the total cost would be 5 figures.

So the additional incentive would be pretty insignificant compared to the existing incentive.


The reason the attack hit British National Health Service hospitals was that they didn't follow the manufacturer's (in this case Microsoft's) instructions and upgrade their Windows XP software to Windows 10.

The only people to blame are those who don't follow the manufacturer's instructions.

They were warned ahead of time. There were even news reports about this.

Target and Home Depot both had credit card hacks precisely because they did not follow the manufacturer's (again Microsoft's) instructions to upgrade Windows XP Embedded to a supported version of the OS.

I don't understand how hospitals can get accredited if their IT systems are not up-to-date and verified by cybersecurity experts. Since the companies and the NHS can't be counted on to follow manufacturer instructions, this is extremely important.

May 11, 2017: British Medical Journal: The hackers holding hospitals to ransom - Hospitals need to be prepared to avoid shutdowns http://www.bmj.com/content/bmj/357/bmj.j2214.full.pdf

Also: Hospital accreditation http://www.uktreatment.com/why-the-uk/hospital-accreditation...

EDIT: The computer systems are capital equipment and like any other form of capital equipment (eg, vehicles in the motor vehicle pool) they must be maintained. Complex machines undergo changes over the life of the equipment and manufacturers issue updates (eg, field change orders) that should be followed by the purchasers of the equipment.

Regarding proper cybersecurity, that would include hardware upgrades since later Intel CPUs incorporate hardware that assist with proper security that is taken advantage of by later versions of the Microsoft OS.

The problems of hospitals both in the US and in Britain were because of a refusal to follow the manufacturer's (Microsoft's in this case) instructions.

Those familiar with the hacks of Target and Home Depot would know that they were hacked because Target and Home Depot refused to follow Microsoft's instructions to upgrade their point of sale software from unsupported Windows XP embedded to a later, supported version of the OS.


> The reason the attack happened ...

A reason. Real events have lots of root and proximate causes. MS could also have avoided writing the bug, or discovered it themselves with better auditing. NHS could have disabled SMB on systems that don't use it, or otherwise firewalled it.
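Firewalling SMB where it isn't needed starts with knowing which machines expose it. As a minimal, illustrative sketch (the host list below is hypothetical), this checks hosts for an open TCP port 445, the port SMB listens on:

```python
import socket

SMB_PORT = 445  # TCP port used by SMB; WannaCry spread via SMBv1 on this port


def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def audit_smb(hosts):
    """Return the subset of hosts that accept connections on the SMB port."""
    return [h for h in hosts if port_open(h, SMB_PORT)]


if __name__ == "__main__":
    # Hypothetical inventory; substitute your own host list.
    for host in audit_smb(["192.168.1.10", "192.168.1.11"]):
        print(f"{host} exposes SMB - firewall or disable it if unused")
```

Any unexpected hit in such an audit is exactly the kind of machine that should have been firewalled off before a worm arrived.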

And, of course, the NSA could have disclosed the bug when they found it instead of hoarding it. Or better protected their tools from theft. Or the Shadow Brokers could have better audited their disclosure to avoid spilling active hacks into the public.

Almost all of these things were required to get to where we are, and all of them are "simple and expected" from at least someone's perspective.

Lots of blame to go around, basically.


> "A reason. Real events have lots of root and proximate causes. MS could also have avoided writing the bug, or discovered it themselves with better auditing. NHS could have disabled SMB on systems that don't use it, or otherwise firewalled it."

No machines are built perfectly from the start and there are changes made over the capital equipment's life be it airframes, jet engines, or computer systems.

When purchasing capital equipment or buildings, bridges, etc, part of the responsibility of the firm is to follow the manufacturer's or builder's instructions, including maintenance upgrades. In the case of Microsoft, they gave warnings for years that Windows XP would not continue to be maintained.

The issue was not Microsoft, but NHS (and other governments, firms) decisions not to budget for and perform maintenance for the capital equipment that they purchased.


I agree this should be upvoted more than ten times.

It's staggering to me that so many recent revelations have shown that in their efforts to protect us, the 3 letter agencies have allowed vulnerabilities to exist on our machines -- and considered them assets -- without warning us.


I don't think it's that easy though. For example, if your CT scan software was made for Windows XP and you randomly update to Windows 10, then if you have any problems, the company making the software can blame you for not using Windows XP and you're on your own, since that's what the software has been tested/accredited for?

That's what I read somewhere, not sure how accurate it is.

I imagine the ideal solution would be to have a clause in the contract with the software company that they have to test and re-validate their software on new versions of windows and updates in a timely fashion so that the computers running the software can be updated? That would probably increase costs though


It is a terrible business arrangement (eg, in the original contracts to purchase said software or equipment that uses software) not to insist that the purchased software/machine run on manufacturer-approved (in this case Microsoft-approved) OSs. If the manufacturer says upgrade, then upgrade. When you purchase software and equipment, this should be part of any standard contract.


People in the know have been complaining about this for many years, and it's corporate bean counters that collude with a sclerotic corporate IT mentality to be conservative in all things including software.

And the reality is, the overwhelming majority of IT people have no training or background in risk assessment; they are not computer security people. But the bean counters, always optimizing without good information, let alone perfect information, have assessed risk incorrectly. And now there's a wildfire in progress, and, being basically incompetent at the task they were handed, they are all surprised.

Many other people who have considered this eventuality are not surprised.


> "And the reality is, the overwhelming majority of IT people have no training or background in risk assessment,..."

But in this case, as in the case of the credit card hacks of Target and Home Depot, it was simply a matter of following the manufacturer's maintenance instructions and upgrading the OS from an unsupported version to a supported one.

As many have pointed out, in the case of Windows 10 the upgrade had been free of charge from Microsoft.


As many others have pointed out, the software a company needs to run on that OS may not run on Windows 10, may not be supported on Windows 10, or may require a whole new licensing scheme and fees to move to a version that will run on Windows 10. And IT and bean counters have typically taken the position of "if it's not broken, don't fix it" and do not consider constant migration to be in their best interest.

Also, that it was free for consumers doesn't mean it was free for enterprises. That licensing is different. And even if the licensing is free, the know-how for your IT staff is not free: either you're hiring new staff who can support it, or you're sending existing staff to recurrency training. Doctors, lawyers, and pilots have done such recurrency training for decades, but in IT it's not a given; it's very much driven by the CTO. And some of them do not care to have a staff more capable than minions. They're cheap. That's all they care about. And then this happens, and they quickly have to look for someone to blame.


The bean counters ultimately work for the shareholders if a public corporation or the public if it is government. The shareholders empower management and the public elects their leadership.

The CEO of Target got fired for not upgrading their point of sale software from an unsupported version of Microsoft Windows Embedded (XP) to a later version, a failure that allowed their system to be hacked.

Ultimately, this is going to be a problem as long as firms don't treat computers as any other form of capital equipment (such as repair vehicles) that need ongoing maintenance. Boards need to be asking their management about keeping their capital equipment under maintenance.

As for software that doesn't run on Windows 10, perhaps it is best to avoid firms that produce software where they don't upgrade it to the most secure version of the OS. I think this is mostly a hypothetical for most and it is probably important that firms change vendors if the vendor isn't willing to upgrade to the latest version of the OS.


There is a class of bean-counters that specializes in assessing risk quantitatively; they are found in the insurance industry. Though I would caveat that with the guess that the vast majority of that assessment uses models of risk attached to lines of insurance that are long established and change fairly slowly.


It is a terrible business arrangement (eg, in the original contracts to purchase said software or equipment that uses software) not to insist that the purchased software/machine run on manufacturer-approved (in this case Microsoft-approved) OSs.

So your answer to the difficulty in making an OS secure is that anyone writing an application to run on that OS should instead be contractually required to support that application on arbitrary future OSes that don't even exist yet?

Good luck getting anyone to supply anything on that basis. You'll need it.


> So your answer to the difficulty in making an OS secure is that anyone writing an application to run on that OS should instead be contractually required to support that application on arbitrary future OSes that don't even exist yet?

A piece of software that is operating on an unsupported OS is not a functioning piece of software. Usually, when software in the corporate environment is purchased, it is with a maintenance agreement, or it should be. Part of the maintenance agreement should cover future versions of the OS from the OS vendor (eg, Microsoft in this case).

Perhaps in the future with more and more apps going from Desktop to SAAS or mobile this might be less of a problem. Don't know.


A piece of software that is operating on an unsupported OS is not a functioning piece of software.

Why not, exactly? What is mysteriously going to stop working just because someone's legal arrangement expired?

Or to be blunt, how many people do you think we should kill by not using medical technology bought at great cost just because some lawyers would like a bit more money please?

Usually, when software in the corporate environment is purchased, it is with a maintenance agreement or it should be. Part of the maintenance agreement should be for future versions of the OS from the same software vendor (eg, Microsoft in this case).

That's a lovely theory, but in the real world organisations buy very useful, very expensive equipment all the time with the expectation that its useful lifetime will be longer than any currently available OS is officially supported for. Moreover, in many cases it might not be economic to purchase at all without that. This is why standards and compatibility are so important.

Perhaps in the future with more and more apps going from Desktop to SAAS or mobile this might be less of a problem.

Heaven help us if anything important ever moves to SAAS, because no-one else will. SAAS is sometimes useful for convenience or short term flexibility. Self-hosted is for professionals who need guarantees.


> "Why not, exactly? What is mysteriously going to stop working just because someone's legal arrangement expired?"

No, it is about engineering and resiliency as opposed to just getting something to work. Would you fly on a plane with unsupported software?

Firms can choose to use unsupported software at their own risk (at least in most situations -- there may be situations where it is illegal to do so such as mission critical safety systems).

> "That's a lovely theory, but in the real world organisations buy very useful, very expensive equipment all the time with the expectation that its useful lifetime will be longer than any currently available OS is officially supported for."

First, I don't know if it's true that a very expensive piece of equipment is expected to run on unsupported software. If it truly is an expensive piece of equipment, it usually comes with a maintenance agreement (eg, an MRI scanner or CT scanner), and in that case the vendor can't be using unsupported software. It should be part of the FDA approval process that that be the case, but I don't know for certain.

Regarding SAAS, it might be a private firm's server (farm), but it is a server nonetheless, which is easier to upgrade than entire fleets of desktop systems.

I believe that firms that use unsupported, outdated software open themselves up to various liabilities.


>For example, if your CT scan software was made for Windows XP

There are best practices for dealing with out-of-support equipment, which should be an exceptional edge case: segregation and isolation from the network are the top ones. Having such machines sit with the SMB service listening on the local subnet is completely inexcusable and a sign of incompetence.


How do people get their files out of isolated computers? For example, if someone does a CT scan and needs to upload the results to the patient's file? Use a USB key? I think that's what my boyfriend does with his driving simulator at work that still runs Windows XP. But aren't USB keys just another vector?


There are unidirectional security gateways that only allow data to flow in a single direction. The legacy hardware creates a file, and copies it to a directory managed by the gateway. The gateway then copies it to some data dump on the other side, onto an updated and centrally managed part of the network. Barring some vulnerability in the gateway, passing data in the other direction is impossible.
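In software terms, the relay half of such a gateway is just a one-way copier. A minimal sketch (directory layout invented for illustration, not from any specific product): it reads from a drop directory and writes to the dump, and by construction never transfers anything in the reverse direction.

```python
import shutil
from pathlib import Path


def relay_once(drop_dir, dump_dir, seen=None):
    """Copy any new files from drop_dir to dump_dir, one direction only.

    `seen` tracks names already transferred so repeated passes are cheap.
    Nothing is ever read from dump_dir or written back to drop_dir.
    """
    seen = set() if seen is None else seen
    drop, dump = Path(drop_dir), Path(dump_dir)
    dump.mkdir(parents=True, exist_ok=True)
    transferred = []
    for f in sorted(drop.iterdir()):
        if f.is_file() and f.name not in seen:
            shutil.copy2(f, dump / f.name)  # preserve timestamps/metadata
            seen.add(f.name)
            transferred.append(f.name)
    return transferred
```

In a real unidirectional gateway the enforcement is in hardware (the return path physically does not exist); a script like this only mirrors that discipline in software.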


cool, thanks


You just have those files rsync to a file server, and then the client PCs connect to that file server. They should never be connecting to the XP box directly, as it's firewalled off so it can only speak to the file server.
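As a sketch of that arrangement (host names, paths, and the use of a Linux gateway in front of the legacy box are all assumptions for illustration), the rsync push and the firewall lockdown might be generated like this:

```python
def rsync_push_cmd(src_dir, server, dest_dir):
    """rsync invocation the legacy side would run to push files to the file server."""
    return ["rsync", "-a", "--remove-source-files",
            f"{src_dir}/", f"{server}:{dest_dir}/"]


def lockdown_rules(server_ip):
    """iptables rules (for a Linux gateway in front of the legacy machine)
    confining outbound traffic to the file server's SSH port, which rsync uses."""
    return [
        ["iptables", "-P", "OUTPUT", "DROP"],                      # default-deny outbound
        ["iptables", "-A", "OUTPUT", "-d", server_ip, "-p", "tcp",
         "--dport", "22", "-j", "ACCEPT"],                         # allow rsync-over-ssh only
        ["iptables", "-A", "OUTPUT", "-o", "lo", "-j", "ACCEPT"],  # keep loopback working
    ]
```

Generating the commands as data rather than running them directly makes the policy easy to review and test before anything is applied to the network.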


In engineering, we should assume the users will not adhere to instruction. This is why in physical engineering, tolerances for usage are often 3x their advertised value.

In Microsoft's case, they have the means to know who their users are and ensure they upgrade. Forced upgrades should be an opt-out feature, especially with an OS as prominent as Windows.


Don't know what kind of engineer you are, but I have a BS in EE with a minor in CS from a top engineering school.

Airplanes have to undergo preventative maintenance. We don't count on airplanes not being maintained. Sorry.

Highway bridges and other structures undergo constant inspection.

It is one matter for individuals in their homes choosing not to update a computer or for a privately held firm.

But firms with stockholders, or governments, must follow manufacturers' instructions when they purchase equipment, and that includes maintenance, whether for their vehicle pool or other equipment.


> Highway bridges and other structures undergo constant inspection.

They are supposed to, but underfunding/mismanagement can abuse the overengineering (which thankfully existed in the first place!) and skip regular inspections, sometimes with fatal consequences.

http://www.nbcnews.com/id/22300234/ns/us_news-bridge_inspect...


Yes, you are absolutely correct. There was the collapse of an Interstate Highway bridge near Albany NY, the collapse of a bridge in Minneapolis, MN, and other cases. Since people die because of the lack of proper inspections, I wonder why management isn't held accountable for manslaughter or whatever charges.


And in Windows 10, forced updates are opt-out, and yet, prominent tech sites like C|NET publish crap like this without tons of caveats about why it's a terrible idea for most everyone who needs a C|NET article to tell them how: https://www.cnet.com/how-to/stop-windows-10-from-automatical...

They kind of acknowledge how bad this advice was:

"Update, May 15: With the Windows 10 Creators Update, Microsoft has largely addressed the forced updates that often resulted in lost work. And, while the recent WannaCry ransomware does not (thus far) appear to affect Windows 10, you need to make sure your PC is kept up-to-date with security patches to avoid exactly those sort of attacks. To that end, consider the information below to be out of date, with a more thorough update to come."


The NHS had already opted out - because of the difficulties in upgrading.


The NHS had warned hospitals several years in advance that Windows XP would be discontinued and that they would have to upgrade the OS. The NHS took out a 1 year contract on Windows XP updates at great financial expense which gave responsible authorities plenty of time to do the system upgrades.


> The reason the attack hit, among others, British National Health Service hospitals was that they didn't follow the manufacturer's instructions (in this case Microsoft's) and upgrade their Windows XP systems to Windows 10.

Well, on one side, you have the OS manufacturer's instructions to update to Windows 10.

On the other side, you have the software manufacturer's instructions to stay on Windows XP at all costs.

Which instructions do you listen to?

(Arguably, before this week, it would have been empirically correct to listen to the software manufacturer; upgrading from XP would have definitely broken the system, but staying on XP didn't break it so far.)


If the software manufacturer's instructions are to stay in Windows XP, then they should be held responsible for patching the security flaws in the system they delivered - if they choose to use obsolete components, maintaining them should be their problem.


In theory you might be right, but pragmatically (and from an engineering perspective) it isn't going to happen and it is just fraught with possible problems. Best not to deal with an OS from 2001.

In many cases, they can probably upgrade by running Win XP in a VM on Windows 10.


> "On the other side, you have the software manufacturer's instructions to stay on Windows XP at all costs."

It is one thing for an individual to purchase software for their own needs, but any firm or government should not purchase software that will not be upgraded to the current versions of the Operating System. That is simply irresponsible and should have been rectified in the purchase contract.

As it is, are there specific examples of software used by firms and government that are not updated to current versions of Windows?


Even common hardware devices such as plain notebooks don't run properly with the "newer" OS. I have one notebook whose drivers break with anything above Windows 7. I have another which can't be upgraded past Windows 8.1. For both, Microsoft's "Get Windows 10" claimed they "will work" with Windows 10 (there were ads running on both which do the "check" beforehand).

I've tried the upgrade on both. The Windows 7 one has problems with the WiFi card -- reason: Intel never made newer drivers, and the old ones aren't actually compatible with the newer systems. The Windows 8.1 one has problems with the touchpad -- reason: the newer drivers are broken for that model. It also has some display issues on Windows 10. The WiFi card, the touchpad and the display effectively aren't replaceable. But the computers, apart from not running the newer OS, work without a flaw.

If I as a consumer with only a few computers have such problems, I can't see how you don't understand that whole companies can have an immense number of computers that work perfectly with their current OS and would simply break with the newer one.

They would not upgrade, but they would install the security updates. But guess what: Microsoft doesn't want to give the security updates to everybody who has Windows XP. Only after this failure did they "exceptionally" release them.

So Microsoft is absolutely responsible. They can say "we'd prefer to motivate people to buy new" but they are still morally responsible for not releasing updates when there are so many actively used systems.


I think many firms have a two or three year cycle for upgrading their desktop/notebook computers. Honestly, they are far less expensive these days than they used to be. The biggest cost might be retraining people to use the new systems in many cases.

Windows XP is an OS from 2001. It is way out of date and does not include with it all of the security of a newer OS such as Windows 10.

In the case of most firms they upgrade every few years anyway. It might be different for some non-profits or government. But at any rate, I feel it is totally unrealistic to expect MS to support software from 2001.


> That is simply irresponsible and should have been rectified in the purchase contract.

Simple does not exist in even smaller private organizations, let alone large government organizations.

> As it is, are there specific examples of software used by firms and government that are not updated to current versions of Windows?

There is code written 60 or more years ago running on today's mainframes. The cost of rewriting even a fraction of this code can be staggering. This is precisely why COBOL programmers get paid big bucks and why Microsoft gets paid by large clients to provide support past end-of-life. If it works, it works.

You can't blame one person for something going wrong in an increasingly connected world.


Not advocating that software be converted from IBM mainframes (which IBM still maintains, and for which it still releases new hardware) to Windows.

How much generally available software only works on XP and can't run in XP on a VM?


> That is simply irresponsible and should have been rectified in the purchase contract.

Have you heard the expression "captive market"?

It's not like they can buy equivalent software from a company that guarantees upgrades.


> "As it is, are there specific examples of software used by firms and government that are not updated to current versions of Windows?"

Sounds like a hypothetical. What application software is being purchased for which there is no equivalent and only a captive market?


>The only people to blame are those that don't follow manufacturers instructions.

And even then if you absolutely can't upgrade that XP machine running an MRI then there are best practices to segregate and isolate it from the network properly.

I'm getting a little sick of the people with an anti-whatever axe to grind pretending this was some big sophisticated attack that only nation-states can do. The patch was released months ago; there's no excuse not to have it. It was just one of the hundreds of Windows exploits found per year. If admins aren't upgrading their equipment or segregating it properly, then that's on them. Note, there was no NSA leak for Conficker, Code Red, Heartbleed, Slammer, etc. This stuff will not stop unless admins and management take security more seriously.

No admin, myself included, had any issues this weekend. It was completely avoidable.


Similarly, all of the brouhaha about Clinton campaign chairman John Podesta getting his gmail account hacked. He was in a very sensitive position and he should have had 2-factor authentication turned on. Gmail has great support for that.

If you look through Wikileaks, only after the email hack does he get the recommendation to turn on the 2FA which presumably happens.


There's plenty of blame to go around, if you want to play blame games. The "free market" has no mechanism for 3rd parties to affect the decision-making process. Microsoft has its policies, well known to the buyers, and the buyers accepted this as well as their own policy of delaying recommended upgrades. A completely neutral person, a patient, who is put at risk through no fault of their own and had absolutely no say in either transaction (either the original sale of the OS or the policy of not keeping software up to date), is injured.

This person arguably has an injury claim against Microsoft (their bug, their policy of stopping updates to software without disabling it, thereby permitting the broad weaponization of their software). And they have an injury claim against each medical service institution that refused service because its systems were fully or partially non-functional due to the negligent choice to run old software; to not have their own user-space programs on a constant "evergreen" cycle so that, without any gaps, it's possible to run their mission-critical software on a currently supported and secure OS.

Basically, these injured 3rd parties have to sue or legislate. Either way it requires a sovereign to intervene in the market.

And what role and blame falls on the millions of non-business end users running outdated software, whose computers were part of a web enabling such prolific spreading of malware? Do they get sued or billed their incremental share? What if it's $1000 per person running Windows XP? And now that they have liability, maybe they can all sue Microsoft because Microsoft permits them to run outdated software?

This shit really needs to be engineered better, and really where we're at is, we're monkeys playing with computers and reveling in our own self-importance, when in reality we still sometimes suck. And this is an (yet another) example of how primitive we are still.


> There's plenty of blame to go around, if you want to play blame games. The "free market" has no mechanism for 3rd parties to affect the decision making process. Microsoft has its policies, well known to the buyers, and the buyers accepted this as well as their own policy to delay recommended upgrades.

Nobody forced people to purchase Microsoft software. There are other vendors (Apple, Linux-based systems). But if they choose to purchase the software, they need to follow the vendor's maintenance instructions if they want properly operating equipment.

Does United Airlines ignore the maintenance instructions of Boeing or Airbus? If they even thought about it, the FAA would be all over them.

In the case of hospitals in the US and the UK, there are accrediting agencies. In the US it is The Joint Commission. In the UK it is the Care Quality Commission. If the people running hospitals are irresponsible and choose not to follow manufacturers' maintenance instructions for any capital equipment, then the accrediting agencies should refuse accreditation.


One thing that comes to mind is that while dropping XP might be the manufacturer's recommendation for the OS, it's not the only manufacturer's recommendation in play here. Only geeks like us and the ones in IT departments care about the OS. Most of the rest of people don't care about the OS -- they care about the applications and devices they can use with it. The OS only matters to the degree it enables or hinders that.

So when it comes to the computers used by hospitals and doctos and the like, OS manufacturer instructions are lower down on the priority list, below the manufacturer instructions for the applications and medical devices they use. And if $MedicalRecordsApp or $MedicalEquipmentController is only certified by the manufacturer to run properly under Windows XP, then you can bet that the hospitals and doctors offices are going to continue to run Windows XP.


> "One thing that comes to mind is that while dropping XP might be the manufacturer's recommendation for the OS, it's not the only manufacturer's recommendation in play here. Only geeks like us and the ones in IT departments care about the OS."

The OS is simply a part of a piece of capital equipment. Like any other form of capital equipment (think trucks that need their oil, brakes, and tires replaced) the computer is also a piece of capital equipment that must be maintained. The fact that it is not maintained is a function of poor management that should be replaced with management that is more responsible. Nobody would settle for a truck not having brake jobs and tires replaced. That same standard should apply to any form of capital equipment.

In this case, MS said that the OS was no longer supported. End of story. Responsible boards need to start hiring CEOs who understand that it is important to maintain capital equipment.


The piece of capital equipment as far as a hospital or doctor's office is concerned is the medical application software or the MRI machine, not so much the OS, IMO.

If Cessna says the only powerplant that your Cessna 172H is certified to use is the Continental O-300, then that's what you run. Continental doesn't make the O-300 or parts for it anymore? Too bad, you fly with an O-300, or you don't fly. Guess it's time to turn to the secondary market for parts and support for that engine.

Get a newer airplane with a newer, better engine that's still being supported by the manufacturer? Hmm, the presently-owned 172H is paid off, still flies and still makes money. Buying a brand new 172S requires taking out a new mortgage and isn't going to make any more money than the 172H does. Seems like we're going to keep the 172H for the time being.

I don't disagree with you that capital assets should be maintained. However, the companies running these older applications that require older OSes aren't the only ones at fault here. The manufacturers of the medical devices and software themselves are just as much to blame. This situation didn't come about solely because CEOs were being slackers about maintaining capital assets.


> "The manufacturers of the medical devices and software themselves are just as much to blame."

Yes, they are, but when purchasing high-value pieces of capital equipment, people need to make this maintenance part of the purchase agreement. In the case of MRI and CT equipment, these firms are generally very large and have the capacity to do so.


The reason the attack hit, among others, British National Health Service hospitals was that they didn't follow the manufacturer's instructions (in this case Microsoft's) and upgrade their Windows XP systems to Windows 10.

And what happens if some of the software or equipment used with that system is not compatible with Windows 10? I can make a very secure house if I give it solid, metre-thick concrete walls, ceiling and floor, but it won't be very useful as a house without things like doors and windows.

You're far too quick to excuse Microsoft for what was, ultimately, a defect in the original product.


> "And what happens if some of the software or equipment used with that system is not compatible with Windows 10?"

It is irresponsible for a firm or government (as opposed to an individual) to purchase software without a contract that ensures that it will be upgraded to current versions of the OS.

> "You're far too quick to excuse Microsoft for what was, ultimately, a defect in the original product."

What defect are you referring to exactly? That Microsoft years ago couldn't predict every way that cybersecurity attacks would happen against their OS? Or that users of versions of Microsoft software still under support contracts (including firms and governments) did not run software update tools for a patch released in mid-March?

It is simply unrealistic for a firm to anticipate all forms of cybersecurity attack. Later versions of OSs not only include different architectures that fix security flaws but also take into account changes in Intel hardware that help combat attacks.


It is irresponsible for a firm or government (as opposed to an individual) to purchase software without a contract that ensures that it will be upgraded to current versions of the OS.

We're talking about things like medical equipment, often in regulated fields. There may be no such contract available, and even if there is, upgrading to a totally new OS may or may not be acceptable. The idea that you should throw out millions of pounds of high-tech medical scanner every few years because a hundred buck OS that it was supplied with couldn't be kept secure is laughable.

What defect are you referring to exactly?

They supplied an OS that was not secure. The rest is just rationalisation and apologies.

It is simply unrealistic for a firm to anticipate all forms of cybersecurity attacks.

It is also unrealistic to expect organisations concerned with literally saving lives not to buy equipment unless they have a plan in place to deal with updating the software on or used with that equipment in arbitrary ways with arbitrary consequences whenever anyone responsible for any part of its software sneezes. The need for stability, reliability and longevity is precisely why so much in this kind of sector is regulated in the first place.

Perhaps the problem here was that such systems should never have been built on such an insecure platform and then connected to a network, but that too is on the device manufacturer and not the hospital. There are basic standards of fitness for purpose that are reasonably expected with such devices, even without more specific regulation. Particularly in this sort of field, those devices should then be supported for their entire working lives, not some arbitrarily shortened version of their working lives dictated by one software development organisation.


Firms that implement software using software from a vendor (eg, Microsoft) have to follow the manufacturer's instructions. If they feel they cannot do that, then they should use a different OS vendor (not Microsoft). That goes for proper, documented uses of the API but also for not using unsupported versions of the OS.

When people purchase equipment, there are equipment contracts. Medical equipment, depending on the type, needs to follow FDA approval guidelines. Clearly, no piece of equipment should be allowed to run in a hospital on an unsupported version of an operating system. That is simply irresponsible.

Each newer version of Microsoft OS implements security features not in earlier versions. In some cases, there are changes to the Intel hardware for additional security that later versions of the OS take advantage of.

Again, firms and governments should not be using unsupported versions of hardware or software.

> "It is also unrealistic to expect organisations concerned with literally saving lives not to buy equipment unless they have a plan in place to deal with updating the software on or used with that equipment in arbitrary ways with arbitrary consequences whenever anyone responsible for any part of its software sneezes. The need for stability, reliability and longevity is precisely why so much in this kind of sector is regulated in the first place."

The longevity is mostly the supported API across different versions of the OS, and not the OS itself. Each version of Microsoft OS comes out with additional security features and vendors who use Microsoft OS (or any OS) need to plan for upgrading their application software that uses the OS and purchasers of application software should insist on it.


Clearly, no piece of equipment should be allowed to run in a hospital on an unsupported version of an operating system. That is simply irresponsible.

You keep writing things like that, as if the only practical alternative in many cases was not turning the equipment off altogether.

Each newer version of Microsoft OS implements security features not in earlier versions.

And more spyware, too. What should a hospital do if Microsoft decides to instruct them to update to a new version of Windows that will automatically upload all of the data on the local hard drives to a Microsoft-cloud-hosted backup system, complete with all the legally protected sensitive personal information? Again, this issue can't just be as simple as suppliers getting to move the goalposts however they want after the sale.

The longevity is mostly the supported API across different versions of the OS, and not the OS itself.

Then why are so many healthcare providers who were caught last week saying they couldn't upgrade from Windows XP because some essential functionality of other software or equipment no longer worked on Windows 10?


> "Then why are so many healthcare providers who were caught last week saying they couldn't upgrade from Windows XP because some essential functionality of other software or equipment no longer worked on Windows 10?"

In the US, they might possibly be in violation of HIPAA health data privacy laws because by running on an unsupported OS they are leaving themselves open to cyber attacks and stolen health care data.

Best to deal with software vendors who upgrade their software to the newest OS versions, especially if it has anything to do with health data.

> "You keep writing things like that, as if the only practical alternative in many cases was not turning the equipment off altogether."

Microsoft had been warning for years that they were going to discontinue service and they have done it with previous OSs, so no surprise there really. Software vendors and their customers need to prepare ahead of time.

If properly maintained by their IT Depts, these Windows 10 Upgrade messages never come up.


While that's true, there is a flip side that there were likely a lot of custom-made or proprietary systems that only worked on XP. Even worse, there were likely various medical hardware that only had drivers for XP. So upgrading from XP is not necessarily an easy project, and definitely not cheap. It's likely a huge project that would be like doing a complete circulatory system transplant, not a simple isolated heart transplant.


As far as I recall, one should generally be able to run Windows XP in a VM on, say, a Win 10 host.

Fixing by writing drivers may not be cheap for the vendor of the software, but they should do it. You simply can't get around the fact that each new version of the OS is more secure than previous versions.


Yeah, but let's not punish customers for choosing a vendor that sucks more than they already get punished. Bad decisions have their own consequences, that should be punishment enough. Also, who knows how many good vendors are out there for medical equipment. I'm not an expert, but I only hear horror stories, which may or may not just be because only the horror stories are worth telling others.


Healthcare is a mission-critical business, and providers need to be responsible when purchasing capital equipment, including, of course, software. If it doesn't operate properly, in the US there could not only be patient harm or disclosure of private information; the firm could also be liable for malpractice or HIPAA violations (the law regarding health record security). The latter two can have financial consequences as well as direct patient health consequences.

Running on unsupported software, or on supported software that isn't updated, invites failure, and nobody can feel that their health records are secure.

Airplane maintenance for a Boeing or Airbus is very expensive, but you have to do it. The same may be true of devices and software in healthcare. Still need to do it right.

I don't see the alternative. Do you?


"The only people to blame are those that don't follow manufacturers instructions."

I disagree. I think it was the folks that thought it was OK to encrypt files and hold them for ransom in the first place.


Banks have vaults to store money and other valuables.

Theft is going to happen even with great policemen. People who don't follow the manufacturer's instructions (especially when they are told the software is no longer supported and yet they fail to upgrade to a supported OS) are culpable. To run on unsupported software, or not to run security updates on supported software, is like a bank without a vault. Just unrealistic in either case.


Best I recall, the NHS had signed an extended maintenance contract for XP. But it had recently been dropped because some beancounter decided they could not afford it.

In the end it may well be a big validation of the FSF message, as then the NHS (or anyone really) could hire someone to maintain the software beyond the existence of the original company.

This is much like how I can find someone to repair something for me that physically broke, because the tools and such are widely available and known (or I can even attempt it myself).


According to an NHS blog post about the matter, the NHS corporate (or whatever) had been warning hospitals, etc. that Microsoft was discontinuing maintenance of XP and to upgrade. They purchased an additional year of coverage giving hospitals yet another year to upgrade to a supported version of Windows. Many ignored the one year time they had as well as the earlier warnings.


> get accredited

By whom?

It's the other way round: shutting down an NHS hospital is the sort of thing that would normally be a front-page disaster for the government, so it's almost never going to happen.


Accreditation in the US is done by The Joint Commission. Hospitals without Joint Commission accreditation cannot get paid for Medicare or Medicaid bills.

In Britain the accreditation is done by Care Quality Commission: http://www.cqc.org.uk

What's the point of an accreditation agency if they aren't going to ensure that the hospital is run safely which includes running on current software?


What about Windows 7 and 8.1? They were also hacked. The uncertain nature of the Windows 10 update and the bad press about its telemetry issues were also reasons for IT departments to stay clear of it. Why would someone ignore a free Windows upgrade?


It is always best policy in my opinion, to upgrade to the next OS version after the first (point) release. Each new version of the OS (eg, Win 10 over 8.1) has additional security measures not in the previous version. I use Mac primarily and I always do fresh install at the .1 update (eg, 10.12.1 and not 10.12). I run Win 10 on my Parallels VM.


Yes, I made the same point: why would anyone want to miss the free upgrade? The whole update-hijack and privacy concerns kept some people away. Organizations ran the bulk of their machines on Windows XP and 7 for sheer cost-cutting purposes. But still, the whole thing affected everyone; if some people hadn't been hesitant to migrate to the newer version, they would have been saved.


> "Organizations ran the bulk of their machines on Windows XP and 7 for sheer cost-cutting purposes."

I agree with this observation. I believe this was also the reason that Target and Home Depot did not follow the manufacturer's (Microsoft's) recommendations to upgrade point-of-sale terminals from the (now) unsupported Windows XP Embedded to a later, still-supported version of the OS. Because they disregarded the advice, millions of people had their credit cards hacked.

At least the CEO of Target got sacked as a result.


It is tough to discern whether the instruction from Microsoft is to safeguard customers or to increase their marketshare.


Windows XP was released in 2001. Just exactly how long should a company support outdated software?

Since XP there have been the following releases. (1) Windows Vista (2) Windows 7 (3) Windows 8 (4) Windows 8.1 (5) Windows 10.

Actually, it is the best policy to upgrade to the latest version of the OS, for security reasons if for no other reason.


His refusal was already more than validated by acting as a safeguard for privacy.


The article properly quotes that not only privacy but public safety is affected.


Not sure how much Apple cares about my privacy when they force the phone to call home after a factory reset, even if you use it as a dumbphone, and demand a valid credit card before installing any free app...

If you care about privacy you don't collect data on your users.


> force the phone to call home after factory reset

Sounds like a useful anti-theft measure?


Yes. Banning cash and making every transaction public without bank secrecy is a great anti-money-laundering measure. But we are talking about privacy. Privacy also means privacy from the vendor. That is almost impossible in iOS.


People are already "laundering money" using cryptocurrencies, so banning cash won't fix anything. It will just make it easier for the government to take the taxes directly out of your account. And then invent some new taxes, until you will have to resort to "money laundering" yourself to make ends meet.


You are able to download free and paid apps without a valid credit card. You only have to provide your address, which, in my experience, is not checked by Apple (they probably have to ask because the law requires a billing address when you buy something, physically or digitally). Skipping the "please enter a credit card" dialog is a little tricky, but from what I remember: you have to create the account by downloading a free app, instead of creating an account and then downloading the app. There are many articles on how to download free apps without a CC.

Furthermore, you can also buy iTunes gift cards at many stores in cash, so if you wanted to download something and not be associated with that purchase, you can simply pay for the card in cash. But if you are worried about that kind of privacy, purchasing digital goods is not a wise choice, because there are very few places where you can purchase such goods without a CC. Not many sites accept bitcoin or are large enough to have their own gift cards.


With all these companies, they want to own everything about you, but will want to protect that precious IP from any other non-contracted party.

Apple is somewhat unique here in that they make profit off their hardware sales and their online stores, and thus doesn't even have to sell to 3rd party personal info agencies to make a profit.


There's privacy and there's anonymity. They don't care about our anonymity for various reasons (which may or may not satisfy you). But they do (seem to) care about our privacy.


True, but nevertheless, their actions in this case advanced the interests of privacy, both on a technical level, and by the PR of presenting it as something worth preserving.


https://support.apple.com/en-us/HT204034

You've been able to download free apps without a credit card since forever


This is from their own page.

If you're using the iTunes Store or App Store for the first time

If you're using the store for the first time with an existing Apple ID, you must provide a payment method. After you create the account, you can change your payment information to None. If you're creating a new Apple ID, you might be able to create an account without entering your credit card details.

The only way is with gift card which is not sold worldwide.


That's due to them using it as a way to limit fake accounts and account spam.

Quite reasonable as a trade off and as you mentioned it is possible to circumvent the need for a traditional credit card.

Plus, if you're against that then you'd have to be 100% off the grid anyhow - any bank or utility company (inc your phone company) knows more about you than Apple.


If two people know a secret, that is one too many. How many people in the USA work for the FBI, CIA, NSA, or police departments, or hold top secret security clearances? Millions of them. One is all it takes, as shown by Snowden, Manning, etc.

So thanks, but no thanks.


> How many people in USA work for the FBI, CIA, NSA, police departments, have top secret security clearances? Millions of them.

To be fair, the number of people who had direct access to the NSA/CIA exploit archives was probably in the hundreds. TS information is usually compartmentalized so only the people who need to access it can (known as TS-SCI).

Still bad that they have that many who can access it, but not in the millions.


Turns out I'm right. I thought about the military and contractors as well. "A Top Secret clearance, meanwhile, costs the government nearly 20 times more, at an average of $3,959 per background check. At that rate, investigating the 1.5 million people with Top Secret passes may have cost as much as $5.9 billion over several years." https://www.washingtonpost.com/news/the-switch/wp/2014/03/24...

"As of last October, nearly five million people held government security clearances. Of that, 1.4 million held top-secret clearances. More than a third of those with top-secret clearances are contractors, which would appear to include Mr. Snowden." https://www.wsj.com/articles/SB10001424127887323495604578535...

Now we can get into semantics until the cows come home but even tens of thousands of people are way too many. Imagine 1+ million people.


My point was that while tens of thousands have TS clearance, they only get access to specific information if and when they need it.

It's not like they go to the "Top Secret File Share" and have access to everything that is Top Secret across the government.


A TS clearance isn't some magical pass card to every piece of TS information.


I know you can't walk into NSA HQ and demand to see everything they have at TS, but the Snowden and Manning cases showed that stuff isn't that compartmentalized.


Don't you think at least some of those programs will eventually be pushed to the police? It's already happening in the UK where the police's cyber units now have almost the same hacking and surveillance (legal) powers as the GCHQ. Obama signed a new policy just days before he left the office to allow the NSA to share information with 17 other agencies. And that includes the DHS with its "fusion centers," where it shares data with the police.

And let's not forget how the FBI allowed the police to use stingrays illegally and taught them how to hide them from judges in various ways, including claiming the tools were under NDA and they couldn't tell judges about it. Or when they absolutely had to tell the judges about it, they'd prefer to drop the cases (against drug lords, child pornographers, murderers, etc) so as to not reveal the use of the tools.

And on and on it goes like this.


> How many people in USA work for the FBI, CIA, NSA, police departments, have top secret security clearances? Millions of them.

Top Secret clearance && millions of people?

unlikely. maybe some level of clearance && millions.


"As many as 4 million people hold "top secret" security clearance, of which 500,000 are private contractors. One reason for this trend is that the U.S. government has become so reflexive about classifying information, much of which is not nearly as sensitive as an NSA spying program, that clearance are required even for totally banal work."

https://www.washingtonpost.com/news/worldviews/wp/2013/06/12...


Beautiful. Reminds me of the way that RIPA legislation in the UK to regulate covert investigation by police and secret services ended up being used by 800-odd bodies, including the odd local authority that used it to police potential abuse of school catchment areas by parents.


...well, then.


The numbers are a bit older, but https://fas.org/sgp/othergov/intel/clear-2012.pdf claims there were 1.4 million top-level clearances held in 2012.


Couldn't Apple have updated the phone with an iOS that bypasses the verification, then destroy that iOS installer? Then the FBI can access the phone, but they don't have access to the OS, so they can't use it on other phones?


>then destroy that iOS installer

You can't do that in forensic investigations. Everything that is done to the phone needs to be verifiable to prove evidence wasn't planted. And besides, the FBI was more looking to set a precedent for future investigations than it was concerned about that one phone.


Apple has designed both server side and client side keystores to now be resistant against "future evil Apple".

Basically, even Apple itself now can't break into their phones with software updates. The phone requires an erase or user unlock to update. It's scary because now Apple can't even fix bugs in certain parts of the system without erasing user data.


Yes. This discussion ignores the fact that the FBI stated the work could all be done on Apple premises and that the FBI would never receive the code or software in question.

So the two things discussed here don't really have anything to do with each other, but it's not surprising that people are trying to tie them together.


Do you happen to have a source for that?


> "For example, the FBI is open to the unlocking happening in Apple premises, so Apple could install the patch, unlock the iPhone, and erase the patch before the iPhone is back in the FBI’s hands"

https://www.forbes.com/sites/nelsongranados/2016/02/20/apple...

> "As an additional precaution, the government says Apple can design the program to let investigators try different passcodes by submitting them electronically, so that Apple can keep physical control over the iPhone while the special program is deployed."

http://www.chicagotribune.com/business/ct-apple-ceo-tim-cook...


Every single time iOS is updated on any phone, it requires a new signature from Apple specific to a nonce and that device ID. It wasn't that they needed to build an installer then destroy it, any installer is in fact useless without being signed specifically for that install.
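To make the mechanism concrete, here is a toy sketch of per-device firmware personalization. This is illustrative only, not Apple's actual implementation: the real scheme uses asymmetric signatures (an "APTicket" signed by Apple, with only the verification key on the device), whereas this sketch uses an HMAC for brevity, and all names (`personalize`, `device_accepts`, the ECID strings) are hypothetical.

```python
import hashlib
import hmac
import os

# Hypothetical vendor signing key. In the real system this would be an
# asymmetric private key held only by the vendor's signing servers.
VENDOR_KEY = os.urandom(32)

def personalize(firmware: bytes, device_id: str, nonce: bytes) -> bytes:
    """Sign a firmware image for ONE device and ONE install attempt."""
    digest = hashlib.sha256(firmware + device_id.encode() + nonce).digest()
    return hmac.new(VENDOR_KEY, digest, hashlib.sha256).digest()

def device_accepts(firmware: bytes, device_id: str, nonce: bytes,
                   ticket: bytes) -> bool:
    """Device recomputes the digest with its OWN id and CURRENT nonce."""
    expected = personalize(firmware, device_id, nonce)
    return hmac.compare_digest(ticket, expected)

fw = b"ios-update-image"
nonce = os.urandom(16)  # device generates a fresh nonce per install attempt
ticket = personalize(fw, "ECID-AAAA", nonce)

assert device_accepts(fw, "ECID-AAAA", nonce, ticket)       # right device, fresh nonce
assert not device_accepts(fw, "ECID-BBBB", nonce, ticket)   # replay on another device fails
assert not device_accepts(fw, "ECID-AAAA", os.urandom(16), ticket)  # stale nonce fails
```

Because the signature binds both the device ID and a one-time nonce, a captured ticket is useless on any other device and cannot even be replayed on the same device later, which is the point the parent comment is making.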


I do think that someone needs to have a good go at suing these exploit hoarders.


I'm not sure what you mean by "exploit hoarders," but that got me thinking. [edit: ah, thanks]

Institutions weigh costs, and somewhere these hospitals decided that having unmaintained, aging information systems was more cost-effective than either maintaining or upgrading the systems.

Thus, problems like this will not go away until NOT-fixing the systems is more expensive than fixing the systems. So what does that take? Fines?

Irony alert: the hospital that ignores aging systems and hopes that they never get hacked are not at all unlike people who lack health insurance and hope they never get injured or sick.


I meant the NSA in this case, but I agree there is a problem with organisations that have data protection responsibilities not prioritising them highly enough.


Looks like you'd have to file suit against specific employees: http://www.nolo.com/legal-encyclopedia/suing-government-negl...

It also looks like there are some pretty severe restrictions on the timeline of the suit.

Good luck!


Of course, this doesn't end the negotiations, it just means that privacy advocates have gained a bit of leverage. Now, how do they use it?


Gotta love a government that stockpiles exploits to use against citizens (then loses them), rather than sharing them with companies to protect citizens.


My Company lost a lot of money due to WannaCry. Should I sue the USA Gov?


another article bullshitting, yeah, coz no one is doing it already... great!


That whole story stunk anyway, we still don't know if the FBI withdrew the request because they just found some other way into the device that they wanted to access. So forgive me if my first thought in the midst of all this chaos is not to praise Tim Cook's name.


> we still don't know if the FBI withdrew the request because they just found some other way into the device that they wanted to access

We do know that. It's public knowledge. The iPhone 5C didn't have Secure Enclave (later iPhones did), so it was crackable in another way (that more modern iPhones can't be).

https://www.washingtonpost.com/world/national-security/fbi-p...

https://www.washingtonpost.com/news/post-nation/wp/2016/04/0...


BGR logic strikes again. There is a world of difference between a backdoor and an exploit...

Neither is desirable but one can at least be secured by a key or something.


A hoarded vulnerability and a hoarded backdoor key are difficult to distinguish in this context, ie, leaking either results in the same catastrophe.


Apple already has a backdoor key which they use every single time any iOS device needs to update. The FBI wasn't asking for the key, they were asking for the key to be used to sign one update. That signature would inherently have only worked once, as it includes a nonce and device ID.


If the NSA let its hacking toolkit leak, what makes you think they can keep a backdoor key 100% secure for eternity?


And how many such keys have been made? Once a company agrees to develop a backdoor to operate in the US, what guarantee do you have that it has no similar arrangement with other governments?

Patriotism ? That's a dangerous currency to use with global companies.


It's not even got anything to do with patriotism. If the US government can legally mandate companies to put back doors in devices in US markets, then foreign governments can mandate the same in their markets. Of course they could anyway, but the US and other western governments should be setting a principled example here and diligently pressuring other governments to adhere to the same standards.


> one can at least be secured by a key or something

Which can be leaked.


I agree with you on all points besides securing with a key being somehow a more desirable trait for a backdoor. Keys can leak (such as the TSA checked-baggage master keys) or authentication can be bypassed (such as with Intel ME).

BGR is simply virtue signalling against the feds when it comes to comparing the "virus of the day" with a hypothetical data recovery method that would only be available in an offline forensics lab setting with no viral attributes.

WannaCry is much more Microsoft's fault than the feds or Wikileaks.


Secured by a key, you say? Like one of these?

https://prod01-cdn05.cdn.firstlook.org/wp-uploads/sites/1/20...



