And then there is the loss of credibility of the US government and the companies involved.
*Edit: FBI changed to US agencies.
Was GP generalizing, or overspecifying? Because from where I sit the responsibility (for not notifying Microsoft) does appear to lie within the Departments of Defense and Justice, and I wouldn't be surprised if the Venn diagram for the split responsibility is based on a simple division of labor.
You can't pin that on spy agencies. Criminals would have exploited the same holes eventually. Manufacturers and software developers could just start using better languages and better development practices instead of whinging every time their C/C++ software is trivially owned.
Officially, the NSA's role is not to sit on zero-day exploits. If it discovers one (either by accident or through active research), it should follow industry practice for disclosure and patching.
There are discussions of what constitutes "industry best practice", but I think we can all agree that "telling nobody, including the software vendor, in the hope of being able to exploit it later" does not qualify.
You might argue about whether the NSA should be in the 'hacking other people with 0-days' business or the 'making computers more vulnerable' business, but the NSA is definitely not in the business of making computers less vulnerable. Especially for computers located in other countries, it has no business helping to protect them - that's against its role; it's paid to facilitate the exact opposite goal.
Third possibility: their role involves securing domestic infrastructure, in which case they should be looking for vulns and getting them fixed. IIRC this actually is part of their statutory mandate, alongside offensive operations.
That being said, it still wouldn't have helped against WannaCry, because Microsoft blundered when it disclosed the vuln: it patched new versions of Windows while leaving the majority of XP installations unfixed. Even without the NSA leak, somebody might have reverse engineered the bug from the released updates - such things have happened in the past.
This sounds similar to what IAD does. The difference is that IAD focuses on securing systems against entire classes of vulnerabilities and attacks.
SID has the mission of gathering signals intelligence, so that is why it makes sense for them to utilize individual zero day vulnerabilities. They need to get into an adversary system, so vulnerabilities are required (when needed) to get that access.
I don't have any old versions of Windows installed, so I truly don't know how it rolled out there.
Last time I spoke with the NSA, admittedly over a decade ago at a college recruitment event, they were in that business as well. So much so that they created a hardened version of Linux that a lot of us use today.
In addition to its main role of collecting foreign intelligence data (which includes breaking encryption and finding/making vulnerabilities), it does have a secondary mission of protecting US government systems. Hopefully that means they have ensured that US gov't systems have been patched against those vulnerabilities. For example, Microsoft patching the SMB vulnerability (including for WinXP) before it was made public was likely done after a report from the NSA, and would be consistent with such a goal. Having "private" custom patches that get installed on sensitive gov't systems but are not made available to the general public would also match this goal.
However, the relationship between the NSA and the main victims of WannaCry is at best neutral. For example, it seems reasonable that the NSA would want companies like Spain's Telefonica to stay vulnerable to various exploits - its mission explicitly includes obtaining intelligence from (among other sources) foreign telecommunications operators, so if it had the ability to protect Telefonica, doing so would go against the mission given to the NSA by Congress.
In the same vein, the NSA isn't about protection. That's not their role, just like the point of the military isn't protection.
I'd argue that it is, if indirectly.
So it leads to the same logical conclusions: If the point of the military is to apply force to an opponent, then the NSA has to have ways of applying force to a digital opponent. Disclosing vulnerabilities would be precisely the opposite of what they should do.
Examples from the last 50 years?
I've repeatedly said on HN -- if US tech companies can't build the most secure systems possible, the US government itself will not be able to secure its own systems. Secure devices and communications are not a given, nor will they become easier.
They are military. It was a cyberweapon. That's the point of a cyberweapon. The Russians stole it from the NSA. Yes, that's incompetence, but an attacker only has to get lucky once. It's much harder to safeguard a cyberweapon than a Tomahawk missile.
Does that mean we shouldn't build cyberweapons? No. That's just how cyberweapons are. That's the reality we live in now, and the world will just have to deal with it. There's nothing to be done, is there? Not unless you're suggesting that computers should never be used for offensive capabilities, which from a game theory perspective means that country will be powerless in the digital arena.
Don't do the crime if you can't do the time.
Personally, I'd rather our military NOT be full of people who are willing to do this kind of thing.
If this discussion was about missiles, and the gov agency in charge of the missiles were using them on Americans, I'd say that they should go to jail as well.
You are right. These cyber exploits are weapons. And they should stop using those weapons on Americans, as a multitude of leaks have shown that they do, through mass spying and mass data collection programs.
They are not allowed to upgrade automatically because their software will break. Upgrading the software isn't feasible when you are facing a multi-billion-dollar gap.
Edit: Umm, humans have formed armies and started wars over far less theft and far lesser threats to their populations than this. If the attackers are located, some piano wire around their necks is really cheap in comparison. And for this amount of damage, I do not doubt for a moment that it's a consideration by every single government spy organization.
Maybe leaving my front door unlocked was a bad idea, but you still robbed my house.
Note to the NSA/CIA - I want to cheer for my government, to be proud to be an American (even the U.S. variety) and it would be wonderful if the U.S. could return to being viewed as "the good guys". I'm not sure that type of patriotism will ever return.
You mean internally, or that there was ever a time when the external perception of the US was of "good guys"?
I mean, I can't possibly imagine the latter being the case. For instance, it's pretty common knowledge in South America that the US was involved in setting up bloody dictatorships in the '60s and '70s.
Only if US citizens believe in their own soft power propaganda (e.g. Hollywood).
See , it gets pretty ridiculous after 1949.
Edit (Relevant Carlin):
Israeli murderers are called commandos, Arab commandos are called terrorists. Contra killers are called freedom fighters. Well, if crime fighters fight crime and fire fighters fight fire, what do freedom fighters fight? They never mention that part to us, do they?
The UK's NHS computers were running Windows XP (which was EOL). http://www.telegraph.co.uk/news/2017/05/12/nhs-hit-major-cyb...
Otherwise, the patch to fix the security flaw was released back in March before it was a real problem. There will always be holes in software, it's important to have a system in place to patch them in a timely manner.
It wasn't just affecting XP machines, it was affecting other machines if they hadn't been patched.
> We know there have been warnings before about IT security in the NHS - last summer a review said it needed looking at.
> But the problem is that over the last three years the capital budget - which is a ring-fenced fund used to pay for buildings and equipment - has been raided by the government to bail out day-to-day services, such as A&E.
> Last year a fifth of the capital budget was diverted.
> That, of course, makes it more difficult for trusts to keep their systems up to date.
EDIT: Here's a better post https://www.instituteforgovernment.org.uk/blog/nhs-cyber-att...
> One of the problems with digital government is reforming the technology infrastructure which underpins its services (‘legacy’). There has been much speculation about how the continued use of Windows XP operating systems within the NHS contributed to the cyber-attack. Although only 4.7% of NHS devices use Windows XP, these are spread across 90% of trusts. Computers that have not been updated with Microsoft’s latest software were susceptible to the ransomware. Meanwhile, NHS legacies are further complicated by the patchwork of contracts across trusts. This digital fragmentation is in keeping with the scale of fragmentation within the NHS itself.
My point is mission critical software should always be up to date. To go months without installing the update on important systems is unacceptable. It doesn't matter who originally released this exploit, exploits routinely become available. It's about fixing them in a timely manner when they appear.
I was recently thinking that software producers wishing to discontinue support for a software product should have to open source that product version (or at least disclose the source to paying customers). That way others can continue to support it and backport patches if it's truly necessary.
Either outcome seems better for customers. In fact, it's significant incentive for better development practices and good software design so these companies can maintain older versions more easily. Sounds pretty good all around, so where's the downside?
Would it also be "wonderful" if the US Army "shut down for a week or two?" How about the Coast Guard?
If you are an American, the NSA protects you; that is the NSA's job. Its job is not to impress you or make you think they are the good guys.
The military is completely different. The troops on the ground are generally following orders and highly disciplined. There are definitely occasions where I don't agree with the orders they're following - I can't possibly fault the soldiers. Adding the Coast Guard to the mix is borderline ridiculous. And you completely forgot the Red Cross - there are some very militant nurses involved.
operate without enough oversight
This is your opinion. FISA Court exists for a reason, and is there to protect the rights of Americans.
frequently flout the law (and ignore the constitution)
No idea how you're getting to this assumption, unless you're taking clickbait at face value. There have been contested judgements regarding the legal justification for metadata collection. That hardly constitutes "ignoring the constitution." I'm also curious as to your claim of frequency.
eroding the rights of the people they're supposedly serving
The NSA cannot collect information about American citizens; only the FBI can. I believe the NSA can only be involved with a specific warrant issued in response to an imminent or direct threat, but even then, I think it has to be the FBI.
The military is completely different
This is the point I'm trying to make to you: NSA is the military. One more time. NSA is the military. Excuse the gross oversimplification, but NSA provides intelligence to other branches of the military so that they can function in the most informed way possible.
If you "can't possibly fault the soldiers" who are "on the ground" and "generally following orders," I'm going to assume that you are someone who supports keeping US soldiers as safe as possible. Yet you seem to delight in the notion that the NSA would be shut down for a week, which would directly endanger those lives. How do you reconcile this?
Your assertion that the NSA can't collect data about American citizens would be great if it were true. There are pretty convincing arguments that they aren't abiding by at least the spirit of the law, and that their lawyers are actively engaged in weasel-wording away the restrictions. I'm sorry, but collecting the metadata of American citizens is still collecting data (metadata is a subclass of data).
You've got a pretty strong opinion of the agency ... I'm guessing that you work for the PsyOps division. Or perhaps you're just a youngin' who hasn't been around the block quite enough times. And don't get me wrong - I've had a great life as a U.S. citizen. I gained my political awareness during the Reagan presidency and was at that point very proud of everything my country did - I didn't know how much was happening outside the view of the TV screen that I wouldn't be proud of - or how many other parts of our history could be questioned.
I'm certainly not advocating anything beyond having exactly this kind of discussion. It's not my fault that the NSA, CIA and U.S Government are viewed so negatively - they have the reputation that they deserve. I can understand why the rest of the world hates us but I hope we can decide to change our behavior.
NSA is not allowed to intercept communications of American citizens. I should have been more explicit about that. I am simply trying to add context to what I believe is an erroneous perception that the NSA is a building full of gunslinging unchecked hackers that gleefully eschew US law so that they can... do what exactly?
> You've got a pretty strong opinion of the agency ... I'm guessing that you work for the PsyOps division. Or perhaps you're just a youngin' who hasn't been around the block quite enough times.
LOL. What does it say that you'd assume anyone who has a non-negative opinion of NSA is either working for them, or young and misinformed? Sorry to disappoint, but I am neither. I simply respect the truth, and misinformation regarding the US intelligence community is a cancerous detriment to the long-term personal safety of Americans. This is why I asked for references for your claims. But if you take nothing else away from our exchange other than the understanding that NSA is military, I'm happy.
> I'm certainly not advocating anything beyond having exactly this kind of discussion.
Good. I'm glad. See how it only takes a simple challenging of one's opinions to go from "I hope the NSA gets shut down" to "let's have a discussion?"
You can. We all collectively decided at Nuremberg that "I was just following orders" is not a credible defence of one's actions.
Because they have been disciplined into it; this isn't how militaries worked 10,000 years ago.
I agree with your parent that it probably is safer to work on disciplining the NSA (and others such) than hope that they will get pwned or do anything in that direction.
There have been many instances reported publicly about US tech falling into enemy hands. I've included a few from recent memory. This is something that happens, and I share your concern. It is a bad thing that happens, among many bad things that happen in war.
But to optimize away from it, would be to sacrifice the technological edge that keeps Americans safe.
No, it doesn't. It protects the status quo.
The window that this occurred in is much smaller than many people realize. You need to go back in time a little, but not too far back — also, only look at our actions abroad and mostly ignore how we treated certain people domestically. From the outside looking in, the US has mostly kind of been the "eh, okay I guess?" guys. Or the "good by contrast guys."
Maybe the default position for all governments (and people?) should be "they did something good at this instant in time". It's kind of sad to view the world that way but probably safer than swallowing the revisionist explanations that are usually offered up.
For example, the American Revolutionary War was an important event... but we also had slaves (some of whom were forced to fight), it fractured many indigenous tribes, and the first presidential election ten years later only allowed male landowners to vote.
1) We need funding to fight X
2) If X happens anyway we need more funding
As long as there is a veneer of democracy over the outright theft of tax funds, there will be a way for these "essential" agencies to grow until they kill/consume their host.
That does seem to be roughly the stance taken by security agencies, but it also occurs to me that I view this as a completely reasonable argument when I hear it from an IT department or the CDC.
I want the CDC to have lots of resources to protect against serious outbreaks of infectious disease, and if one happens, I see it as a signal of three possibilities:
(a) They need more resources.
(b) They need to fix their methods.
(c) Nothing's 100% safe and we should accept that.
I definitely see the TSA as almost entirely (b)+(c) (mostly (c), to be honest), but the reasoning you cited isn't always wrong -- sometimes (a) is legitimately the case, so you can't dismiss it out of hand.
As a reminder, a custom version of iOS is absolutely useless without the means to get it onto a device. In fact, it's possible and relatively cheap for a talented team to make that custom version without Apple's help.
That means is a key held only by Apple, needed to sign software before it can run on any iOS device. Apple uses this key every single time any iOS device around the world is updated- a nonce is generated on device, sent to Apple, signed along with the firmware, then verified on-device before it allows the new software to run.
Creating the software may have been work for Apple, but would not contribute in any way to making phones less safe. Signing a piece of software inside their own premises for a particular nonce and device ID can likewise not be used to make any other device unsafe. This is a process Apple does many times a day, whenever any device updates.
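To illustrate the point about why signing for one device can't endanger others, here's a minimal sketch of the nonce-binding idea. This is not Apple's actual implementation: Apple uses asymmetric signatures with a private key that never leaves its premises, while this sketch stands in HMAC (and made-up names like `APPLE_KEY`) purely for brevity.

```python
import hashlib
import hmac
import os

# Stand-in for Apple's private signing key; in reality this is an
# asymmetric key that never leaves Apple's signing servers.
APPLE_KEY = b"stand-in for Apple's private signing key"

def sign_firmware(firmware: bytes, nonce: bytes, device_id: bytes) -> bytes:
    # "Apple side": the signature covers the firmware PLUS the device's
    # one-time nonce and identifier, so it is bound to a single update
    # on a single device.
    return hmac.new(APPLE_KEY, firmware + nonce + device_id,
                    hashlib.sha256).digest()

def device_accepts(firmware: bytes, nonce: bytes, device_id: bytes,
                   sig: bytes) -> bool:
    # "Device side": refuse to boot firmware whose signature does not
    # match this exact nonce and device.
    expected = hmac.new(APPLE_KEY, firmware + nonce + device_id,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

firmware = b"custom-ios-build"
device_id = b"ECID-0001"
nonce = os.urandom(16)          # generated on device for this update
sig = sign_firmware(firmware, nonce, device_id)

assert device_accepts(firmware, nonce, device_id, sig)
# The same signature is useless with a fresh nonce (i.e. on the next
# update, or on any other device) -- it cannot be replayed:
assert not device_accepts(firmware, os.urandom(16), device_id, sig)
```

The nonce is what makes a captured signed image worthless elsewhere: even the same device will generate a different nonce next time it asks for an update.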
No matter what you do, there is still a very good chance that unauthorized people somehow get the ability to sign an OS for any of Apple's devices they want. It might not result in such a huge attack, but nonetheless there is still damage to be done.
We don't have much of an issue with criminals getting wiretaps through faking subpoenas to phone companies (although I did find https://arstechnica.com/tech-policy/2017/03/feds-brooklyn-pr... when searching.)
Note that I'm not arguing that Apple should have complied, that the court should have said a particular thing, or anything like that (did enough arguing last year when the case was going on). All I'm saying is the comparison is invalid, and the author doesn't know the technical details and is therefore wrong.
Just like the fastest code is the code that is never executed, the safest code is the code that is never written in the first place.
Creating a new iOS version in no way increases the attack surface. If you think it does then you probably don't understand how the technicalities work, or I'm misunderstanding you (in which case please sketch a scenario where Apple doing what was requested increases attack surface).
Now, cracking the signing process so that it authorizes a third-party-modified version of the OS is something else completely, but I'm not sure it would make any difference if the attack is done with physical access. It could be useful for remote attacks.
>code gets leaked
>incentive to find bugs in signing increased
>bug in signing found
Offhand I would think the value of an exploitable signing bug is an order of magnitude greater than the cost of making the custom software. At the very least it trivially allows jailbreaks, which puts it in the million-dollar range depending on how exploitable it is, and I think creating iOS software that skips the passcode would at worst be in the low six-figure range if it didn't need signing, more likely mid five figures.
Saurik said it could be done in a week by a talented programmer (see https://news.ycombinator.com/item?id=11153022), so even though the supply of such programmers is limited I'm still confident the total cost would be 5 figures.
So the additional incentive would be pretty insignificant compared to the existing incentive.
The only people to blame are those who don't follow manufacturers' instructions.
They were warned ahead of time. There were even news reports about this.
Target and Home Depot both had credit card hacks precisely because they did not follow the manufacturer's (again, Microsoft's) instructions to upgrade Windows XP Embedded to a supported version of the OS.
I don't understand how hospitals can get accredited if their IT systems are not up-to-date and verified by cybersecurity experts. Since the companies and the NHS can't be counted on to follow manufacturer instructions, this is extremely important.
May 11, 2017: British Medical Journal: The hackers holding hospitals to ransom - Hospitals need to be prepared to avoid shutdowns
Also: Hospital accreditation
EDIT: The computer systems are capital equipment and like any other form of capital equipment (eg, vehicles in the motor vehicle pool) they must be maintained. Complex machines undergo changes over the life of the equipment and manufacturers issue updates (eg, field change orders) that should be followed by the purchasers of the equipment.
Regarding proper cybersecurity, that would include hardware upgrades, since later Intel CPUs incorporate hardware that assists with proper security and is taken advantage of by later versions of the Microsoft OS.
The problems of hospitals both in the US and in Britain were because of a refusal to follow the manufacturer's (Microsoft's in this case) instructions.
Those familiar with the hacks of Target and Home Depot would know that they were hacked because Target and Home Depot refused to follow Microsoft's instructions to upgrade their point of sale software from unsupported Windows XP embedded to a later, supported version of the OS.
A reason. Real events have lots of root and proximate causes. MS could also have avoided writing the bug, or discovered it themselves with better auditing. NHS could have disabled SMB on systems that don't use it, or otherwise firewalled it.
And, of course, the NSA could have disclosed the bug when they found it instead of hoarding it. Or better protected their tools from theft. Or the Shadow Brokers could have better audited their disclosure to avoid spilling active hacks into the public.
Almost all of these things were required to get to where we are, and all of them are "simple and expected" from at least someone's perspective.
Lots of blame to go around, basically.
No machines are built perfectly from the start and there are changes made over the capital equipment's life be it airframes, jet engines, or computer systems.
When purchasing capital equipment or buildings, bridges, etc., part of the firm's responsibility is to follow the manufacturer's or builder's instructions, including maintenance upgrades. In the case of Microsoft, they gave warnings for years that the Windows XP software would not continue to be maintained.
The issue was not Microsoft, but NHS (and other governments, firms) decisions not to budget for and perform maintenance for the capital equipment that they purchased.
It's staggering to me that so many recent revelations have shown that in their efforts to protect us, the 3 letter agencies have allowed vulnerabilities to exist on our machines -- and considered them assets -- without warning us.
That's what I read somewhere, not sure how accurate it is.
I imagine the ideal solution would be a clause in the contract with the software company requiring them to test and re-validate their software on new versions of Windows and updates in a timely fashion, so that the computers running the software can be updated. That would probably increase costs, though.
And the reality is, the overwhelming majority of IT people have no training or background in risk assessment; they are not computer security people. But the bean counters, always optimizing, without good information let alone perfect information, have assessed risk incorrectly. And now there's a wildfire in progress, and, being basically incompetent at the task they were handed, they are all surprised.
Many other people who have considered this eventuality are not surprised.
But in this case, as in the case of the credit card hacks of Target and Home Depot, it was simply a matter of following the manufacturer's maintenance instructions and upgrading the OS from an unsupported version to a supported one.
As many have pointed out, in the case of Windows 10, the upgrade had been free of charge from Microsoft.
Also, that it was free for consumers doesn't mean it was free for enterprises; that licensing is different. And even if the licensing is free, the know-how for your IT staff is not free. Either you're hiring new staff who can support it, or you're sending existing staff to recurrency training. Doctors, lawyers, and pilots have done such recurrency training for decades, but in IT it's not a given - it's very much driven by the CTO. And some of them do not care to have a staff more capable than minions. Minions are cheap. That's all they care about. And then this happens, and they quickly have to look for someone to blame.
The CEO of Target got fired for not upgrading their point-of-sale software from an unsupported version of Microsoft Windows Embedded (XP) to a later version, a failure that led to their systems being hacked.
Ultimately, this is going to be a problem as long as firms don't treat computers as any other form of capital equipment (such as repair vehicles) that need ongoing maintenance. Boards need to be asking their management about keeping their capital equipment under maintenance.
As for software that doesn't run on Windows 10, perhaps it is best to avoid firms that don't upgrade their software to the most secure version of the OS. I think this is mostly hypothetical, and it is probably important that firms change vendors if a vendor isn't willing to support the latest version of the OS.
So your answer to the difficulty in making an OS secure is that anyone writing an application to run on that OS should instead be contractually required to support that application on arbitrary future OSes that don't even exist yet?
Good luck getting anyone to supply anything on that basis. You'll need it.
A piece of software operating on an unsupported OS is not a functioning piece of software. Usually, when software in the corporate environment is purchased, it comes with a maintenance agreement - or it should. Part of the maintenance agreement should be support, from the same software vendor, for future versions of the OS (eg, Microsoft's in this case).
Perhaps in the future with more and more apps going from Desktop to SAAS or mobile this might be less of a problem. Don't know.
Why not, exactly? What is mysteriously going to stop working just because someone's legal arrangement expired?
Or to be blunt, how many people do you think we should kill by not using medical technology bought at great cost just because some lawyers would like a bit more money please?
Usually, when software in the corporate environment is purchased, it comes with a maintenance agreement - or it should. Part of the maintenance agreement should be support, from the same software vendor, for future versions of the OS (eg, Microsoft's in this case).
That's a lovely theory, but in the real world organisations buy very useful, very expensive equipment all the time with the expectation that its useful lifetime will be longer than any currently available OS is officially supported for. Moreover, in many cases it might not be economic to purchase at all without that. This is why standards and compatibility are so important.
Perhaps in the future with more and more apps going from Desktop to SAAS or mobile this might be less of a problem.
Heaven help us if anything important ever moves to SAAS, because no-one else will. SAAS is sometimes useful for convenience or short term flexibility. Self-hosted is for professionals who need guarantees.
No, it is about engineering and resiliency as opposed to just getting something to work. Would you fly on a plane with unsupported software?
Firms can choose to use unsupported software at their own risk (at least in most situations -- there may be situations where it is illegal to do so such as mission critical safety systems).
> "That's a lovely theory, but in the real world organisations buy very useful, very expensive equipment all the time with the expectation that its useful lifetime will be longer than any currently available OS is officially supported for."
First, I don't know if it's true that a very expensive piece of equipment is expected to run on unsupported software. If it truly is an expensive piece of equipment, it usually comes with maintenance agreements (eg, an MRI scanner or CT scanner), and in that case the vendor can't be using unsupported software. It should be part of the FDA approval process that that be the case, but I don't know for certain.
Regarding SAAS, it might be a private firm's server (farm), but it is a server nonetheless, which is easier to upgrade than entire fleets of desktop systems.
I believe that firms that use unsupported, outdated software open themselves up to various liabilities.
There are best practices for dealing with out-of-support equipment that is an exceptional edge case, segregation and isolation from the network being the top ones. Having it sit with the SMB service listening on the local subnet is completely inexcusable and a sign of incompetence.
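For a host that has no business serving files, the mitigation is standard Windows configuration. These are illustrative commands (rule names and scope are made up, and the PowerShell cmdlet only exists on Windows 8/Server 2012 and later), not a complete hardening guide:

```shell
# Block inbound SMB (NetBIOS session on 139, direct SMB on 445) at the
# host firewall:
netsh advfirewall firewall add rule name="Block inbound SMB" dir=in action=block protocol=TCP localport=139,445

# On Windows 8 / Server 2012 and later, disable the SMBv1 server
# entirely (WannaCry's EternalBlue exploit targets SMBv1):
powershell -Command "Set-SmbServerConfiguration -EnableSMB1Protocol $false -Force"
```

On truly out-of-support machines like XP, where the second option isn't available, network segregation plus the firewall rule is about all you can do.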
In Microsoft's case, they have the means to know who their users are and ensure they upgrade. Forced upgrades should be an opt-out feature, especially with an OS as prominent as Windows.
Airplanes have to undergo preventative maintenance. We don't count on airplanes not being maintained. Sorry.
Highway bridges and other structures undergo constant inspection.
It is one matter for individuals in their homes choosing not to update a computer or for a privately held firm.
But firms with stockholders, or governments, must follow manufacturers' instructions when they purchase equipment, and that includes maintenance, whether for their vehicle pool or other equipment.
They are supposed to, but underfunding/mismanagement can abuse the overengineering (which thankfully existed in the first place!) and skip regular inspections, sometimes with fatal consequences.
They kind of acknowledge how bad this advice was:
"Update, May 15: With the Windows 10 Creators Update, Microsoft has largely addressed the forced updates that often resulted in lost work. And, while the recent WannaCry ransomware does not (thus far) appear to affect Windows 10, you need to make sure your PC is kept up-to-date with security patches to avoid exactly those sorts of attacks. To that end, consider the information below to be out of date, with a more thorough update to come."
Well, on one side, you have the OS manufacturer's instructions to update to Windows 10.
On the other side, you have the software manufacturer's instructions to stay on Windows XP at all costs.
Which instructions do you listen to?
(Arguably, before this week, it would have been empirically correct to listen to the software manufacturer; upgrading from XP would have definitely broken the system, but staying on XP didn't break it so far.)
In many cases, they can probably upgrade using Win XP running on a VM in Windows 10.
It is one thing for an individual to purchase software for their own needs, but any firm or government should not purchase software that will not be upgraded to the current versions of the Operating System. That is simply irresponsible and should have been rectified in the purchase contract.
As it is, are there specific examples of software used by firms and government that are not updated to current versions of Windows?
I've tried the upgrade on both. The Windows 7 machine has problems with the WiFi card; the reason is that Intel never made newer drivers, and the old ones aren't actually compatible with the newer systems. The Windows 8.1 machine has problems with the touchpad: the newer drivers are broken for that model. It also has some display issues on Windows 10. The WiFi card, the touchpad and the display effectively aren't replaceable. But the computers, apart from not running the newer OS, work without a flaw.
If I, as a consumer with only a few computers, have such problems, I can't see how you don't understand that whole companies can have an immense number of computers that work perfectly with their current OS and would simply break with the newer one.
They would not upgrade, but they would install the security updates. But guess what: Microsoft doesn't want to give the security updates to everybody who has Windows XP. Only after this failure did they "exceptionally" release one.
So Microsoft is absolutely responsible. They can say "we'd prefer to motivate people to buy new" but they are still morally responsible for not releasing updates when there are so many actively used systems.
Windows XP is an OS from 2001. It is way out of date and does not include all of the security features of a newer OS such as Windows 10.
In the case of most firms they upgrade every few years anyway. It might be different for some non-profits or government. But at any rate, I feel it is totally unrealistic to expect MS to support software from 2001.
"Simple" does not exist even in smaller private organizations, let alone large government organizations.
> As it is, are there specific examples of software used by firms and government that are not updated to current versions of Windows?
There is code written 60 or more years ago running on today's mainframes. The cost of rewriting even a fraction of this code can be staggering. This is precisely why COBOL programmers get paid big bucks and why Microsoft gets paid by large clients to provide support past end-of-life. If it works, it works.
You can't blame one person for something going wrong in an increasingly connected world.
How much software that is generally available only works on XP and that can't run in XP running on a VM?
Have you heard the expression "captive market"?
It's not like they can buy equivalent software from a company that guarantees upgrades.
Sounds like a hypothetical. What applications software is being purchased for which there is no equivalent and only a captive market?
And even then, if you absolutely can't upgrade that XP machine running an MRI, there are best practices to segregate and isolate it from the network properly.
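Segregation of that kind is mostly a firewall configuration job at the network edge. As a minimal sketch (the host names, addresses, and VLAN layout here are hypothetical, and real deployments vary), a Linux gateway in front of a legacy XP segment could block the SMB/NetBIOS ports WannaCry spread over and restrict the machine to the one server it actually needs:

```shell
# Hypothetical layout: legacy XP host at 10.0.9.20 on an isolated
# segment 10.0.9.0/24; the single records server it must reach is 10.0.1.5.

# Drop inbound SMB/NetBIOS traffic to the legacy segment
# (TCP 135/139/445 and UDP 137/138 are the ports WannaCry's worm used).
iptables -A FORWARD -d 10.0.9.0/24 -p tcp -m multiport --dports 135,139,445 -j DROP
iptables -A FORWARD -d 10.0.9.0/24 -p udp -m multiport --dports 137,138 -j DROP

# Allow the XP host to talk only to the one server it needs,
# then deny everything else it originates.
iptables -A FORWARD -s 10.0.9.20 -d 10.0.1.5 -j ACCEPT
iptables -A FORWARD -s 10.0.9.20 -j DROP
```

The same idea can be expressed with VLAN ACLs on a managed switch; the point is that an unpatchable box should only ever see the minimum traffic it needs to do its job.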
I'm getting a little sick of the people with an anti-whatever axe to grind pretending this was some big sophisticated attack that only nation states can do. This patch was released months ago; there's no excuse not to have it. It was just one of the hundreds of Windows exploits found per year. If admins aren't upgrading their equipment or segregating it properly, then that's on them. Note, there was no NSA leak for Conficker, Code Red, Heartbleed, Slammer, etc. This stuff will not stop unless admins and management take security more seriously.
No admin, myself included, had any issues this weekend. It was completely avoidable.
If you look through Wikileaks, only after the email hack does he get the recommendation to turn on the 2FA which presumably happens.
This person arguably has an injury claim against Microsoft (their bug, their policy of stopping updates for software without disabling it, thereby permitting the broad weaponization of their software). And they have an injury claim against each medical service institution that refused service because its systems were fully or partially non-functional due to its negligent choice to run old software, and its failure to keep its own user-space programs on a constant "evergreen" cycle so that, without any gaps, it's possible to run mission-critical software on a currently supported and secure OS.
Basically, these injured 3rd parties have to sue or legislate. Either way it requires a sovereign to intervene in the market.
And what role and blame falls on the millions of non-business end users running outdated software, whose computers were part of a web that enabled such prolific spreading of malware? Do they get sued or billed their incremental share? What if it's $1000 per person running Windows XP? And now that they have liability, maybe they can all sue Microsoft, because Microsoft permits them to run outdated software?
This shit really needs to be engineered better, and really where we're at is, we're monkeys playing with computers and reveling in our own self-importance, when in reality we still sometimes suck. And this is an (yet another) example of how primitive we are still.
Nobody forced people to purchase Microsoft software. There are other vendors (Apple, Linux-based systems). But if they choose to purchase the software, they need to follow the vendor's maintenance instructions if they want properly operating equipment.
Does United Airlines ignore the maintenance instructions of Boeing or Airbus? If they even thought about it, the FAA would be all over them.
In the case of hospitals in the US and the UK there are accrediting agencies. In the US it is The Joint Commission. In the UK it is the Care Quality Commission. If the people running hospitals are irresponsible and choose not to follow manufacturers' instructions for maintenance of any capital equipment, then the accrediting agencies should refuse accreditation.
So when it comes to the computers used by hospitals and doctors and the like, OS manufacturer instructions are lower down on the priority list, below the manufacturer instructions for the applications and medical devices they use. And if $MedicalRecordsApp or $MedicalEquipmentController is only certified by the manufacturer to run properly under Windows XP, then you can bet that the hospitals and doctors' offices are going to continue to run Windows XP.
The OS is simply part of a piece of capital equipment. Like any other capital equipment (think trucks that need their oil, brakes, and tires replaced), the computer must be maintained. The fact that it is not maintained is a function of poor management that should be replaced with management that is more responsible. Nobody would settle for a truck not getting brake jobs and tire replacements. That same standard should apply to any form of capital equipment.
In this case, MS said that the OS was no longer supported. End of Story. Responsible boards need to start hiring CEOs that understand that it is important to maintain capital equipment.
If Cessna says the only powerplant that your Cessna 172H is certified to use is the Continental O-300, then that's what you run. Continental doesn't make the O-300 or parts for it anymore? Too bad, you fly with an O-300, or you don't fly. Guess it's time to turn to the secondary market for parts and support for that engine.
Get a newer airplane with a newer, better engine that's still being supported by the manufacturer? Hmm, the presently-owned 172H is paid off, still flies and still makes money. Buying a brand new 172S requires taking out a new mortgage and isn't going to make any more money than the 172H does. Seems like we're going to keep the 172H for the time being.
I don't disagree with you that capital assets should be maintained. However, the companies running these older applications that require older OSes aren't the only ones at fault here. The manufacturers of the medical devices and software themselves are just as much to blame. This situation didn't come about solely because CEOs were being slackers about maintaining capital assets.
Yes, they are, but when purchasing high value pieces of capital equipment, people need to have this maintenance as a part of the purchase agreement. In cases such as MRI, CT, many of these firms are generally very large and have the capacity to do so.
And what happens if some of the software or equipment used with that system is not compatible with Windows 10? I can make a very secure house if I give it solid, metre-thick concrete walls, ceiling and floor, but it won't be very useful as a house without things like doors and windows.
You're far too quick to excuse Microsoft for what was, ultimately, a defect in the original product.
It is irresponsible for a firm or government (as opposed to an individual) to purchase software without a contract that ensures that it will be upgraded to current versions of the OS.
> "You're far too quick to excuse Microsoft for what was, ultimately, a defect in the original product."
What defect are you referring to exactly? That Microsoft years ago couldn't predict every way that cybersecurity attacks happened to their OS? Or that users of versions of Microsoft software still under support contracts (including firms and governments) did not run software update tools for a patch released in mid-March?
It is simply unrealistic for a firm to anticipate all forms of cybersecurity attack. Later versions of OSs not only include different architectures that fix cybersecurity flaws but also take advantage of changes in Intel hardware that help to combat cybersecurity attacks.
We're talking about things like medical equipment, often in regulated fields. There may be no such contract available, and even if there is, upgrading to a totally new OS may or may not be acceptable. The idea that you should throw out millions of pounds' worth of high-tech medical scanner every few years because the hundred-buck OS it was supplied with couldn't be kept secure is laughable.
What defect are you referring to exactly?
They supplied an OS that was not secure. The rest is just rationalisation and apologies.
It is simply unrealistic for a firm to anticipate all forms of cybersecurity attacks.
It is also unrealistic to expect organisations concerned with literally saving lives not to buy equipment unless they have a plan in place to deal with updating the software on or used with that equipment in arbitrary ways with arbitrary consequences whenever anyone responsible for any part of its software sneezes. The need for stability, reliability and longevity is precisely why so much in this kind of sector is regulated in the first place.
Perhaps the problem here was that such systems should never have been built on such an insecure platform and then connected to a network, but that too is on the device manufacturer and not the hospital. There are basic standards of fitness for purpose that are reasonably expected with such devices, even without more specific regulation. Particularly in this sort of field, those devices should then be supported for their entire working lives, not some arbitrarily shortened version of their working lives dictated by one software development organisation.
When people purchase equipment, there are equipment contracts. Medical equipment, depending on the type, needs to follow FDA approval guidelines. Clearly, no piece of equipment should be allowed to run in a hospital on an unsupported version of an operating system. That is simply irresponsible.
Each newer version of Microsoft OS implements security features not in earlier versions. In some cases, there are changes to the Intel hardware for additional security that later versions of the OS take advantage of.
Again, firms and governments should not be using unsupported versions of hardware or software.
> "It is also unrealistic to expect organisations concerned with literally saving lives not to buy equipment unless they have a plan in place to deal with updating the software on or used with that equipment in arbitrary ways with arbitrary consequences whenever anyone responsible for any part of its software sneezes. The need for stability, reliability and longevity is precisely why so much in this kind of sector is regulated in the first place."
The longevity is mostly the supported API across different versions of the OS, and not the OS itself. Each version of Microsoft OS comes out with additional security features and vendors who use Microsoft OS (or any OS) need to plan for upgrading their application software that uses the OS and purchasers of application software should insist on it.
You keep writing things like that, as if the only practical alternative in many cases was not turning the equipment off altogether.
Each newer version of Microsoft OS implements security features not in earlier versions.
And more spyware, too. What should a hospital do if Microsoft decides to instruct them to update to a new version of Windows that will automatically upload all of the data on the local hard drives to a Microsoft-cloud-hosted backup system, complete with all the legally protected sensitive personal information? Again, this issue can't just be as simple as suppliers getting to move the goalposts however they want after the sale.
The longevity is mostly the supported API across different versions of the OS, and not the OS itself.
Then why are so many healthcare providers who were caught last week saying they couldn't upgrade from Windows XP because some essential functionality of other software or equipment no longer worked on Windows 10?
In the US, they might possibly be in violation of HIPAA health data privacy laws because by running on an unsupported OS they are leaving themselves open to cyber attacks and stolen health care data.
Best to deal with vendors of software who upgrade their software to the newest OS versions, especially if it has anything to do with health data.
> "You keep writing things like that, as if the only practical alternative in many cases was not turning the equipment off altogether."
Microsoft had been warning for years that they were going to discontinue service and they have done it with previous OSs, so no surprise there really. Software vendors and their customers need to prepare ahead of time.
If properly maintained by their IT Depts, these Windows 10 Upgrade messages never come up.
Fixing by writing drivers may not be cheap for the vendor of the software, but they should do it. You simply can't get around the fact that each new version of the OS is more secure than previous versions.
Running unsupported software, or even supported software without its updates, invites failure, and nobody can feel that their health records are secure.
Airplane maintenance for a Boeing or Airbus is very expensive, but you have to do it. The same may be true of devices and software in healthcare. Still need to do it right.
I don't see the alternative. Do you?
I disagree. I think the fault lies with the folks who thought it was OK to encrypt files and hold them for ransom in the first place.
Theft is going to happen even with great policing. People who don't follow manufacturers' instructions (especially when they are told the software is no longer supported and yet they fail to upgrade to a supported OS) are culpable. Running unsupported software, or skipping security updates on supported software, is like running a bank without a vault. Just unrealistic in either case.
In the end it may well be a big validation of the FSF message, as then the NHS (or anyone really) could hire someone to maintain the software beyond the existence of the original company.
This is much like how I can find someone to repair something for me that physically broke, because the tools and knowledge are widely available (or I can even attempt to do so myself).
It's the other way round: shutting down an NHS hospital is the sort of thing that would normally be a front-page disaster for the government, so it's almost never going to happen.
In Britain the accreditation is done by Care Quality Commission:
What's the point of an accreditation agency if they aren't going to ensure that the hospital is run safely which includes running on current software?
I agree with this observation. I believe this was also the reason that Target and Home Depot did not follow the manufacturer's (Microsoft's) recommendation to upgrade point-of-sale terminals from the (now) unsupported Windows XP Embedded to a later, still-supported version of the OS. Because they disregarded the advice, millions of people had their credit cards hacked.
At least the CEO of Target got sacked as a result.
Since XP there have been the following releases.
(1) Windows Vista
(2) Windows 7
(3) Windows 8
(4) Windows 8.1
(5) Windows 10
Actually, it is the best policy to upgrade to the latest version of the OS, for security reasons if for no other reason.
If you care about privacy you don't collect data on your users.
Sounds like a useful anti-theft measure?
Furthermore, you can also buy an iTunes gift card at many stores with cash, so if you want to download something and not be associated with that purchase, you can simply pay for the card in cash. But if you are worried about that level of privacy, purchasing digital goods is not a wise choice, because there are very few places where you can purchase such goods without a CC. Not many sites accept Bitcoin or are large enough to have their own gift cards.
Apple is somewhat unique here in that they make their profit off hardware sales and their online stores, and thus don't even have to sell data to third-party personal-info agencies to make a profit.
You've been able to download free apps without a credit card since forever.
If you're using the iTunes Store or App Store for the first time
If you're using the store for the first time with an existing Apple ID, you must provide a payment method. After you create the account, you can change your payment information to None.
If you're creating a new Apple ID, you might be able to create an account without entering your credit card details.
The only way is with a gift card, which is not sold worldwide.
Quite reasonable as a trade off and as you mentioned it is possible to circumvent the need for a traditional credit card.
Plus, if you're against that then you'd have to be 100% off the grid anyhow - any bank or utility company (inc your phone company) knows more about you than Apple.
So thanks, but no thanks.
To be fair, the number of people who had direct access to the NSA/CIA exploit archives was probably in the hundreds. TS information is usually compartmentalized so that only the people who need to access it can (known as TS/SCI).
Still bad that they have that many who can access it, but not in the millions.
"As of last October, nearly five million people held government security clearances. Of that, 1.4 million held top-secret clearances. More than a third of those with top-secret clearances are contractors, which would appear to include Mr. Snowden." https://www.wsj.com/articles/SB10001424127887323495604578535...
Now we can get into semantics until the cows come home but even tens of thousands of people are way too many. Imagine 1+ million people.
It's not like they go to the "Top Secret File Share" and have access to everything that is Top Secret across the government.
And let's not forget how the FBI allowed the police to use stingrays illegally and taught them how to hide them from judges in various ways, including claiming the tools were under NDA and they couldn't tell judges about it. Or when they absolutely had to tell the judges about it, they'd prefer to drop the cases (against drug lords, child pornographers, murderers, etc) so as to not reveal the use of the tools.
And on and on it goes like this.
Top Secret clearance && millions of people?
Unlikely. Maybe some level of clearance && millions.
You can't do that in forensic investigations. Everything that is done to the phone needs to be verifiable to prove evidence wasn't planted. And besides, the FBI was more looking to set a precedent for future investigations than it was concerned about that one phone.
Basically, even Apple, the company, now can't break into their phones with software updates. The phone requires an erase or a user unlock to update. It's scary, because now Apple can't even fix bugs in certain parts of the system without erasing user data.
So the two things discussed here don't really have anything to do with each other, but it's not surprising that people are trying to tie them together.
> "As an additional precaution, the government says Apple can design the program to let investigators try different passcodes by submitting them electronically, so that Apple can keep physical control over the iPhone while the special program is deployed."
Institutions weigh costs, and somewhere these hospitals decided that having unmaintained, aging information systems was more cost-effective than either maintaining or upgrading them.
Thus, problems like this will not go away until NOT-fixing the systems is more expensive than fixing the systems. So what does that take? Fines?
Irony alert: hospitals that ignore aging systems and hope they never get hacked are not at all unlike people who lack health insurance and hope they never get injured or sick.
It also looks like there are some pretty severe restrictions on the timeline of the suit.
We do know that. It's public knowledge. The iPhone 5C didn't have Secure Enclave (later iPhones did), so it was crackable in another way (that more modern iPhones can't be).
Neither is desirable but one can at least be secured by a key or something.
Patriotism ? That's a dangerous currency to use with global companies.
Which can be leaked.
BGR is simply virtue signalling against the feds when it comes to comparing the "virus of the day" with a hypothetical data recovery method that would only be available in an offline forensics lab setting with no viral attributes.
WannaCry is much more Microsoft's fault than the feds or Wikileaks.