If Microsoft did not provide this update for older OSes, I believe more people would update because they simply had to. Smaller hacks might not be enough for the managers of those machines to allocate resources to an update, but this hack might have been just fatal enough for them to update those machines.
And we should all expect to see such a hack again. This is not a one in a century thing. Such costly hacks could easily happen a few times a year.
I understand that newer versions of Windows do have stronger security, so it is better if people can upgrade to a more secure version of Windows. But it would have been better if those security features were somehow back-ported. But seriously, can the current version of Windows please be considered "finished"? Why not spend the next 50 years just maintaining Windows 10, just the way it is? Imagine if they devoted all of their resources to finding and fixing all of the possible security issues until it's virtually bulletproof, and the price of a zero-day gets to a billion dollars. I think it's a shame that Microsoft constantly releases unnecessary upgrades and tries to get people to keep buying new licenses.
It's not Microsoft's fault that people depend on Windows XP. IMHO it's the fault of companies buying hardware and software from manufacturers who are unwilling or unable to upgrade their product to run on newer versions of Windows.
To put it another way: the only reason there is a huge demand for COBOL programmers is because banks are too spendthrift to rewrite their software in more modern languages.
> IMO, Microsoft has acted irresponsibly, and the decision was only driven by money.
Welcome to the world of successful businesses. They don't do things for altruistic reasons, they do it because it makes money.
> Imagine if they devoted all of their resources to finding and fixing all of the possible security issues until it's virtually bulletproof, and the price of a zero-day gets to a billion dollars. I think it's a shame that Microsoft constantly releases unnecessary upgrades and tried to get people to keep buying new licenses.
A lot to unpack here.
1) I personally don't think it's possible for anyone to design a general purpose OS as complex as Windows that is bug free.
Just look up how small the space shuttle software was (IIRC ~600,000 LOC) and how mind bogglingly expensive it was.
2) Businesses could already avoid a lot of this if they didn't view IT as a cost centre and instead as an investment.
3) Yes, Windows 10 is a privacy nightmare. But Microsoft has made real strides in OS security since XP. It's wrong to claim that all they've done is put a minimalist theme on the same old OS.
From a security perspective, they absolutely are not "unnecessary upgrades".
It is Microsoft's fault. A software customer doesn't know that a vendor is going to go out of business or get bought by a competitor that discontinues their product and promotes an incompatible alternative with seven figure transition costs.
Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
> To put it another way: the only reason there is a huge demand for COBOL programmers is because banks are too spendthrift to rewrite their software in more modern languages.
The reason there is huge demand for COBOL programmers is that it's more reasonable to hire a COBOL programmer than to discard a working system with a multi-million dollar replacement cost. "Just throw away everything you own and start over from nothing" is only rarely cost effective.
It is categorically NOT Microsoft's fault that software vendors are bought or go out of business. It's on the software customer to ensure that they don't get stuck with vendor lock-in.
Reverse the situation: A bunch of critical systems which run Linux 2.4 are being compromised by cyber criminals via a kernel exploit. You're going to argue that it's Linus' fault for providing such a great kernel and not supporting it forever?
Or that it's Linus' fault that vendor XY, which hasn't existed since two mergers ago, chose Linux 2.4 for their product and now isn't around to provide updates for legacy software?
> Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
And so does every other operating system on the planet. You cannot take a MacOS binary and run it on Linux. And until very recently you couldn't take a Linux binary and run it on Windows.
You could argue that with Linux, the only thing preventing it from running Win32 or MachO binaries is that those operating systems are closed source, but this is the world we live in. If you want a "universal" binary, write it in something like Java.
> The reason there is huge demand for COBOL programmers is that it's more reasonable to hire a COBOL programmer than to discard a working system with a multi-million dollar replacement cost. "Just throw away everything you own and start over from nothing" is only rarely cost effective.
Yes, and I feel that I addressed this when I said "because banks are too spendthrift to rewrite their software"
It's a business decision. Currently it's cheaper for banks to hire COBOL programmers at obscene rates to fix their software. Eventually, there either won't be COBOL programmers, or they'll be too expensive, and the bean counters will dictate it's time to rewrite.
I think there's a simpler way to think about this. Windows XP still functions for these businesses, and many of them would gladly pay an ongoing subscription fee for support. Sure, Microsoft wants to expand, but I don't see why that has to happen exclusive of supporting their well-loved early 2000s release. I run software which hasn't changed substantially for more than 20 years on a daily basis, and I don't see why Microsoft has to pretend that that's not a use case.
> > Microsoft are the ones who worked so hard to make sure that software for Windows isn't compatible with not-Windows, creating all their own alternatives to POSIX, OpenGL, Open Firmware and everything else so that it's as difficult as possible for software compatible with Windows XP to be compatible with any Unix or Linux, leaving the user out in the cold if it also isn't compatible with Vista or later.
> And so does every other operating system on the planet. You cannot take a MacOS binary and run it on Linux. And until very recently you couldn't take a Linux binary and run it on Windows.
Open Firmware, POSIX, and OpenGL are explicitly not related in any way to binary compatibility. You can run unmodified OpenGL programs (with some abstraction) on Linux and OS X out of the box, and with some fiddling, on Windows as well.
It's clear that this is an argument that Microsoft has refused to support standard APIs, not standard ABIs.
Not saying there aren't any such programs, but nearly everything I've seen fail has been coupled to hardware drivers, where other platforms are just as bad, or to other outdated APIs, e.g. ancient versions/insecure configurations of Java. This is a big problem in IT, but I don't think it's fair to blame Microsoft more than others for it. Maybe for making as much of a mess of Vista as they did, but 7 was still available more than early enough for a slow migration.
(And at least the NHS had an extended support contract for Windows XP, but ended it at some point, despite it (as far as I know) still being available from MS for other big customers.)
It is Microsoft's fault that the programs only run on Windows XP. If they had used standard open APIs it would have been much easier to port the applications to other platforms to begin with, and more of the original developers would have done so before going out of business.
If you want to argue that customers should not buy Windows-specific software you'll get no argument from me, but that is certainly not Microsoft's position.
> Reverse the situation: A bunch of critical systems which run Linux 2.4 are being compromised by cyber criminals via a kernel exploit. You're going to argue that it's Linus' fault for providing such a great kernel and not supporting it forever?
Linus absolved himself by providing source code. If you really want to keep using Linux 2.4 and patch it yourself, you can, and some companies actually do.
> You cannot take a MacOS binary and run it on Linux.
And that project is in a weak state not for difficulty but for lack of demand, because so many programs that run on one OS already run natively on both since it's so much easier to port between them than between Windows and anything else.
> You could argue that with Linux, the only thing preventing it from running Win32 or MachO binaries is that those operating systems are closed source, but this is the world we live in. If you want a "universal" binary, write it in something like Java.
> Yes, and I feel that I addressed this when I said "because banks are too spendthrift to rewrite their software"
> It's a business decision.
That's the point. How is it not also a business decision with Windows XP, which Microsoft has forced everyone to make in a way that many would otherwise not?
I heard that medical devices are approved (by whatever regulatory body) to run an _exact_ set of software. As such, even applying a security update invalidates the approval, because it could potentially introduce bugs that place a patient's life at risk.
The software affected in the NHS was mainly in GP surgeries and admin offices to do with scheduling of procedures. Not directly life-threatening in the sense of a machine going wrong but indirectly damaging in the sense of procedures and consultations having to be cancelled, records not being available &c.
Looking decades into the future, I'm wondering: web applications plus fairly basic, Chromebook-style clients for the scheduling/records stuff.
And in any case the compromises were in the UK, which may have different rules on this subject than the FDA.
The upgrade treadmill is good for Microsoft's revenue stream, but bad for customers who already have a working system. New != better, just different. I guarantee you that Windows 10 has just as many security holes as XP; they just haven't been found yet because it's so new.
Both would continue if Windows was "perfect" feature-wise and only security updates were provided going forward.
Therefore the kernel is secure, assuming spherical cows are used in the bubble memory of the return stack.
It seems to ultimately boil down to software becoming more expensive to sell, especially when it will be used in critical situations.
Legally, they are covered.
But there is also the more abstract problem of how hospitals and the like go about using computers.
Many of the existing XP computers are pieces of big monolithic systems that have been decentralized so Microsoft could sell more licenses.
Think networks of retail stores using hundreds or thousands of PCs for their point-of-sale. Or government agencies like the DMV with locations in every rural county.
They put a PC on every not-quite-a-desk.
When XP was released, Microsoft was unarguably a monopoly. Monopolistic businesses have responsibilities that other, non-monopolistic businesses don't. The law is vague, but the FTC could reasonably force Microsoft to continue to patch security problems in windows XP forever. After all, the defects to which the support period refers existed within that support period, regardless of when they were proven to exist.
Only being a pedant because this came up in /r/soccer today about an hour beforehand. That means the opposite of what you want it to mean.
Thanks! I actually didn't know this. Do you have a suggestion of a better term for the behaviour I'm trying to describe? I was trying to be more elegant than "cheap"
It absolutely is. Not releasing the Windows XP successor in 2004 as promised to customers and shareholders is their fault. Microsoft only released a successor six years after XP's release, three years later than promised. And it doesn't help that Windows Vista was a disaster on its release.
My thoughts exactly. If you look outside of computers/tech, at other large pieces of infrastructure like electrical and water supply systems, you'll find plenty which are much older and still in active operation. 2001 was only 16 years ago. I have many things older than that and probably "unsupported", yet continue working with only occasional maintenance.
Thus, as computers become more integrated into our lives and a part of the infrastructure, it makes sense that they should stop "moving forward" and settle down stably at some point, yet all I've seen is the complete opposite. In fact, I'd say instability in infrastructure is immaturity --- to take an example, in the early days of electrical power, there were plenty of incompatible sockets, voltages, and frequencies. Now we have settled on a small set of standards, which have been nearly the same for the past decades.
> I understand that newer versions of Windows do have stronger security
The flip-side is that some of this security isn't really helping the user, i.e. DRM. Newer version of Windows also come with other unnecessary/unwanted features like telemetry (possibly with their own vulnerabilities), regressions in UI, etc. In other words, if you upgrade you could be more secure from remote attackers and get some new features you actually want, but you're also giving away more privacy to Microsoft and moving toward a more locked-down-against-you ecosystem. It's not all positive.
If only Linux and WINE, or even ReactOS, were in a more usable state...
At a certain point, it's cheaper for businesses to rebuild their solutions with the latest tech Microsoft would prefer they were on, re-license, and start enjoying the benefits that come with free support for a new licensing period.
Windows XP wasn't designed for a world of Internet threats; I think it would have been difficult for Microsoft to continue supporting it in a production environment while preserving compatibility with existing features.
Windows 10 is built for the modern world of threats. Microsoft sells a Windows 10 Enterprise LTSB (Long Term Servicing Branch) with guaranteed support for 10 years to address many companies' needs. Everybody has learned from XP: Microsoft has learned that long-term support is needed, so they innovated their support model, and companies have learned that they can't expect an operating system to last in a connected world for 15+ years. Hopefully, with new support options and more reasonable expectations from business, Microsoft's LTSB support model is going to address everyone's needs.
If a customer wants to use Windows XP, they can, but if a customer wants support for it, they can pay Microsoft a lot of money for it (and I think some actually still do).
Besides, don't forget that businesses do need to make money. Are you suggesting that Microsoft should have supported/should support Windows XP forever, for free?
The trend in security has lately been toward isolation (sandboxing, etc.) instead of fixing security issues as they come up. Newer versions of Windows have more isolation features. Setting up a sandbox on Windows XP is a nightmare, and there's no way to shut off access to FAT drives or win32k.sys, etc.
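To make the isolation idea concrete: the essence of sandboxing is running untrusted work in a separate process whose damage the kernel bounds. This is only a hedged sketch on a POSIX system (Windows uses different primitives, such as job objects and AppContainers), with a hypothetical `run_sandboxed` helper:

```python
import resource
import subprocess
import sys

def run_sandboxed(code, cpu_seconds=1):
    """Run untrusted Python code in a child process under a hard CPU limit.

    If the code spins forever (or is hijacked into doing so), the kernel
    kills the child once the budget is exhausted; the parent survives.
    """
    def set_limits():
        # Applied in the child, after fork() but before exec().
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))

    return subprocess.run(
        [sys.executable, "-c", code],
        preexec_fn=set_limits,       # POSIX-only hook
        capture_output=True,
        timeout=cpu_seconds + 10,    # safety net for the parent
    )

if __name__ == "__main__":
    print(run_sandboxed("print('hello')").returncode)    # normal exit: 0
    print(run_sandboxed("while True: pass").returncode)  # killed: nonzero
```

A real sandbox would also drop privileges and restrict the filesystem and network; the point is that newer OS releases ship these containment primitives built in, while bolting them onto XP is painful.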
I don't think you understand how much developer effort is required to maintain such an old OS. In addition, these same developers need to get paid, and since MS is in the business of making money, supporting software that massively burns cash (I assume the maintenance is not cheap) is blatantly foolish.
> Why not spend the next 50 years just maintaining Windows 10, just the way it is?
If I understand it correctly, this pretty much is the plan with Win10. It's supposed to be the last Windows OS.
The best developers (the ones with the most choices) will leave, because nobody wants to merely maintain a project.
Why isn't XP good enough to run a map kiosk in a mall? Or flight arrivals screen in the airport?
Face it, the needs of many single-use applications were solved decades ago... the need to constantly undergo the endless forced upgrade cycle wastes a tremendous amount of human effort.
It is, and the stripped-down versions of XP Microsoft sells to run kiosks still get updates:
> Windows Embedded Standard 2009. This product is an updated release of the toolkit and componentized version of Windows XP. It was originally released in 2008, and Extended Support will end on January 8, 2019.
> Windows Embedded POSReady 2009. This product for point of sale devices reflects the updates available in Windows Embedded Standard 2009. It was originally released in 2009, and Extended Support will end on April 9, 2019.
However, MS makes it very difficult to acquire and manage those products. Generally speaking, you must buy their embedded products with a motherboard/cpu purchase from an authorized vendor.
MS business strategy basically mandates that a whole class of "single purpose" customers can't / won't buy via the way MS wants to sell it.
If you try to mandate that the mall buy your special (expensive) motherboard/XPe combo you will generally make no sales. Therefore the default becomes that your customers just go buy "whatever computer they can that matches specs" and run that. Hence you wind up with tens of millions of devices that aren't supported anymore.
(Hell, I've been at one plant where PC side of the automation still ran on NT 4.0! The customer is reluctant to replace that machine because the interface to the plant automation is a freaking ISA card, and it has become rather difficult to procure a motherboard with an ISA slot.)
It's easy to tell people to update, but in some cases it's just not that simple. (OTOH, the plants I know about are not connected to the Internet.)
And don't get me wrong, I totally agree that it would be better in so many different ways if these machines were upgraded. But some users basically have no choice.
I fully understand that the money involved could seem like (and would be) a huge investment, but if it breaks... tons of money will be involved too.
Fortunately, these plants I am talking about are food plants (yoghurt, pudding, and such), so the risk of some foreign government wanting to shut down that plant is rather low. ;-)
And the risk of becoming infected by drive-by malware is contained by not letting these machines talk to the Internet.
(There is one connection to the regular corporate network, which does have Internet access, to tell the ERP system how much of each ingredient is left so the Purchasing department will order new ingredients on time. But in my benevolent imagination that connection is one teeeny-tiny hole through a humongous firewall.)
There are ways around this problem, but many companies don't have the budget, don't want to invest, or don't know that these drivers can be rewritten for new platforms with reverse engineering.
Even so, there are formal ways to show you are solving a problem with a certain probability. These kinds of projects require extensive testing.
For example in the labs I use there are plenty of equipment hooked up to PCs running Windows 95 and Windows 98. The equipment works fine. The proprietary software for the controller and the logging software is old and there is no version that works on newer systems. The only option is to buy a whole new kit which is a stupid waste of money. So we just use these stand alone Win 95/98 computers instead.
The problem we're having in the real world is all of the companies and schools that have 15-year-old machines, used for reading email and filling in spreadsheets, that are connected to the internet because their corporate IT was a guy who liked computers in high school and was hired with zero training.
Shame too, since almost all of those cases would be well covered with Linux distributions, or now even Chromebooks.
I mean, an airgapped Windows 95/98 is still reasonably safe I would have thought, if it stays exactly the same and no USB skullduggery etc.
...and running unsecured operating systems is what?
Let's say your front doors stopped locking one day. Is it a waste of money to solve the problem, even if that meant replacing the entire door?
Ultimately project management and ownership is responsible, but they won't be interested if we don't make our expectations clear.
Downvoting these sort of opinions is just saying it's too expensive to secure some things. To me that means we can't afford to computerize some things in the first place, at least not with the chosen tech stacks.
OTOH, the people in the other departments (running up-to-date machines) fell multiple times for malicious email attachments, while our developer department was never infected despite the fact that we download stuff from the Internet on a regular basis (be it libraries or tools) - stuff that doesn't even need sneaky means to get executed. Yet we are in theory subject to the "no software installation without permission from the IT dept." internal rule.
So if those machines in that lab don't run an email client and are not used to browse the Internet, they are actually quite safe. The only threats that remain are worms spawning from local infected machines or infected USB pen drives.
The thing is, security can quickly become an unhealthy topic. It's so damn easy to FUD people.
My "new" Win7 machine has this "security advisory" that pops up every time I copy a file from/to a network drive, saying "this file can damage the computer" - even when it's a freaking text file, and I do that all the time. (BTW, imagine what mental model of security this generates in the minds of non-technical people - it's not protection or education, it's fearmongering.)
So I went to disable this warning but I then paused for a moment, thinking - what if one day I make an actual mistake and get infected? Will I be blamed for disabling it?
It's so damn easy to say, "if you don't follow this PITA security measure, you will be held responsible for the consequences". I admit that like many I would submit to that. There's no point in gambling my job on this after all, and I have better things to do.
I think that the cyber-security topic needs to be sanitized. And the first thing to do would be to tell weekend security consultants, who don't understand a thing about security contexts and threat assessment, to keep quiet a little so that the people actually in charge of cyber-security can listen and learn from actual experts.
I'm just saying it's not cheaper to ignore technical debt. It's actually a bug somewhere in the operational budget or business plan.
If the system should run for thirty years on without an OS upgrade, design and budget for that up front. I promise that no one considered that when they configured a Windows XP box and threw it in a lab somewhere. And, hey, maybe that's OK in the short run. But there was no budget or business plan to replace those systems down the line either.
I'm afraid most of the value is probably in the lab hardware itself, not the FOSS stack fetishism. As such, you'd need to figure out how to ship better and/or cheaper lab hardware to gain market share, when an entrenched provider has already learned how to ship state-of-the-art hardware. If you really believe market forces will value FOSS for FOSS's sake that highly, perhaps you'd like to start that hardware startup yourself? You make it sound so simple... perhaps you already started one? But for most, I think the Hobson's choice you're actually proposing is between:
1) Continuing to do lab work with the proprietary hardware
2) Letting someone else replace you when you switch careers to work on your new hardware startup. Your replacement is unlikely to care as much about FOSS as you (after all, you were willing to quit over it!), and even if they do, their replacement likely won't. The perverse result of this is a decreased demand for FOSS lab equipment, not increased demand!
You might increase supply of FOSS lab equipment. Or (and I think this more likely) you'll go bankrupt before moving the needle. Speaking for myself, I can't convince myself of the untapped market potential of this niche, I don't know hardware, I'd be going up against incumbent experts, I'm not passionate about lab equipment... if I somehow acquired investor funding under these circumstances I would worry I had conned them more than I had convinced them of the merits.
Better to keep doing lab work if that's what you enjoy, and perhaps agitate for less proprietary FOSS options if that's something dear to your heart. Maybe you'll cure cancer and save the lives of future developers of FOSS lab equipment.
No, because if the same guys had been coding for GNU/Linux, it would mean a closed-source binary compiled against a very old C library that most likely wouldn't even start on a modern GNU/Linux.
Or it would try to access kernel features, drivers, or pathnames that no longer exist in modern distributions.
Using *BSD or GNU/Linux for laboratory hardware doesn't mean that the code is made available, or even if it is, it is cost effective to pay someone to port it.
Many of those XP systems actually have code available, at least in the life sciences; the labs just don't want to pay to port the code to new systems.
Also, the Linux kernel is supposed to remain backwards compatible with userspace.
You could also supply any missing files.
Even if that is all too complicated, you could just use an old Linux distribution or kernel and backport patches yourself, although I'm not sure whether that is easier.
What I'm saying is, it's good to have the choice and option to do that.
Of course, like you say, if you have that option but just don't do it because it takes time / money to do it, then that is your choice.
If dynamic linking was used, the entry points might have moved, the syscalls might have changed, and bugs that were being taken advantage of have been fixed.
If static linking was used, the calls into the kernel might have changed, or the program might read data from /etc or /dev, or have other expectations about the target OS.
No, Linux takes binary compatibility seriously. I can run statically linked binaries from 1992 on a modern system just fine.
ELF was being slowly introduced in 1994, with Slackware 2.0 being one of the first distributions to support it.
So I really really doubt you can pick a static executable compiled against kernel 0.x.y, using a.out format and execute it in Ubuntu 16.04 LTS as example.
Actually, I just need to blow the dust off my Walnut Creek CDs to prove my point.
Not to mention how disparate the file system structure, including device drivers, of something like Yggdrasil Linux 1.0 is compared with modern distributions.
As I said, easily verified by dusting off the Walnut Creek CDs and picking a random binary from them.
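The a.out-versus-ELF distinction is easy to check programmatically, since the two formats start with different magic bytes. A minimal sketch (the a.out values are the classic OMAGIC/NMAGIC/ZMAGIC constants; this is illustrative, not a full parser):

```python
import sys

# Classic a.out magic numbers: OMAGIC (0407), NMAGIC (0410), ZMAGIC (0413),
# stored little-endian in the first word of the header on i386.
AOUT_MAGICS = {0o407, 0o410, 0o413}

def binary_format(path):
    """Guess an executable's format from its leading magic bytes."""
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == b"\x7fELF":
        return "ELF"
    if len(magic) == 4 and int.from_bytes(magic[:2], "little") in AOUT_MAGICS:
        return "a.out"
    return "unknown"

if __name__ == "__main__":
    # The running interpreter is an ELF binary on any modern Linux.
    print(binary_format(sys.executable))
```

So even with full binary compatibility for ELF, a pre-1994 a.out executable needs kernel support (`CONFIG_BINFMT_AOUT`) that modern distributions no longer ship enabled.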
XP is 16 years old! Microsoft is in a tough spot, this was a worldwide problem so they had no choice.
They should make support contracts for old versions quadruple in price every year and offer incentives to upgrade. Better for them: more revenue and less bad press for MSFT.
The only good thing added in Windows 7 was the search field in the start menu. And the built-in firewall became able to filter outgoing connections. But it is not worth bothering with upgrading the system.
There were some problems with opening websites using Let's Encrypt certificates on one of these browsers, since they've stopped supporting it; I can't remember which one.
In an ideal world, sure. However, many organisations run older machines because of large-scale investment in now-defunct technology, which they can't afford to replace.
It's unconscionable that they sat on this.
I bet there are still tons of systems that suffer from Shellshock or Heartbleed because they are either not updated at all or are running old Linux versions which are no longer supported (I bet there are still tons of RHEL/CentOS 2, 3, 4, and 5 boxes which no longer get security updates, where the companies do not have extended support contracts).
The real issue is that people are afraid of updates because they tend to break things. They do not want to invest into "slow rollout strategies" and the like.
If updates were applied immediately to 10% (or maybe even less, if the company is big enough) of all machines, and if there were a way to quickly roll back the update, there would be fewer problems and the consequences of failed updates would be less severe. This way you could have your systems up to date within 48h (maybe: 1% of 'key users' who do not freak out if things break, then after maybe 4h, 10% of normal users who can call IT support to roll back the update, and after 24-48h, all PCs; this would be even easier for stateless servers, because you could redirect all requests to other servers if the 10% fail, with zero downtime).
Just because the update is free, doesn't mean there's no risk associated with it.
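The staged scheme described above (a tiny canary cohort, then wider waves, with fast rollback) fits in a few lines. A sketch only; `apply_update`, `health_check`, and `rollback` are hypothetical hooks standing in for real deployment tooling:

```python
def staged_rollout(machines, apply_update, health_check, rollback,
                   stages=(0.01, 0.10, 1.0)):
    """Roll an update out in waves; roll everything back if a wave fails.

    `stages` are cumulative fractions of the fleet: 1% of key users first,
    then 10%, then everyone, mirroring the schedule described above.
    """
    updated = []
    for frac in stages:
        target = max(1, int(len(machines) * frac))
        wave = machines[len(updated):target]
        for m in wave:
            apply_update(m)
            updated.append(m)
        if not all(health_check(m) for m in updated):
            for m in updated:
                rollback(m)
            return False  # rollout aborted, fleet restored
    return True

if __name__ == "__main__":
    fleet = [{"name": f"pc{i}", "version": 1} for i in range(100)]
    ok = staged_rollout(
        fleet,
        apply_update=lambda m: m.update(version=2),
        health_check=lambda m: m["version"] == 2,  # pretend the update is healthy
        rollback=lambda m: m.update(version=1),
    )
    print(ok, sum(m["version"] == 2 for m in fleet))  # -> True 100
```

This is essentially what "deployment rings" in modern update tooling automate; the policy itself is cheap to implement.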
Isn't it entirely plausible that an attack of precisely this sort could occur in a world where Linux (or macOS, or TempleOS, or whateverOS) is the go-to desktop OS? Isn't Windows the preferred target for attackers because of its ubiquity? How in the world would this be mitigated by "open source"?
Open source might be part of the answer to this, or some kind of legal 'right to migrate'.
If all of your patient records are in some ancient software, the new vendor would probably be happy to get them out again if there were documents or a codebase saying how.
If you need XP to run ExpensiveScannerManager95, and you had a legal right to get the code somehow, I'm sure you could find an SME that would port the driver to Windows 10.
Maybe we / our companies and governments need these legal rights now. But what exactly should they be?
It isn't. Everyone who has had to decide which version of a distribution to run should know this. It's fine as long as you run the newest or don't need new things. But once you need something specific, and especially once you start installing things outside the package manager, things go downhill quickly.
I wish people wouldn't use this argument in favor of open source, because if you make institutions choose between open source and proprietary solutions based on "updates" it's appstores, cloud software and subscriptions that will win.
I have used Linux as my primary operating system for more than 15 years, I have been using Arch as my primary distribution for more than 5 years. I do not know this.
> It's fine as long as you run the newest or don't need new things.
So which is it? I am fine if I run the newest, or if I do not need the newest? Your statement is a contradiction.
>But once you need something specific and especially once you start installing things outside the package manager things go down hill quickly.
No, not really... I install things outside the package manager all the time; of course, I know what I am doing, so...
>because if you make institutions choose between open source and proprietary solutions based on "updates" it's appstores, cloud software and subscriptions that will win.
How so? App stores do not solve the lock-in problem the OP is talking about; if anything, they make it worse.
> So which is it, I am fine if I want to run the newest, or if a do not need the newest? Your statement is a contradiction
I don't see the contradiction, maybe I didn't express myself very well. The problem is when you mix old and new software and distributions. As long as you run a single release (old or new) and all software is for that release you're fine. When you have to deal with many different versions of third party software, libraries, interpreters, shells, build systems etc. is when you run into problems. Just like in the case with "ExpensiveScannerManager95".
How does the development of configuration management tools for Linux support any of your statements? I fail to see the connection. Linux has needed enterprise configuration management tools for a while; it is one area where Windows is better, as there are many, many configuration management tools for Windows.
> maybe I didn't express myself very well.
I think this is true, because I still do not understand
1. What you are really saying
2. Why you believe windows is better at any of these things than linux
3. How it is relevant to what we are talking about.
Yes, when you mix old and new things you may have problems, depending on the system. However, having managed both systems in large enterprise environments, I maintain that you have FEWER problems with Linux than you do with Windows; Windows is a finicky, broken system that does not play well with anything.
I spend the majority of my time fixing broken shit on Windows. The idea that Linux is worse is laughable.
Also, because the Linux kernel is monolithic and open source, there tend to be fewer backward-compatibility issues with Linux, making it easier to update systems where companies today refuse to update Windows because it is not compatible with older hardware/software.
In fact, Linux often has the reverse problem: hardware support for new technology often lags behind because hardware vendors focus on Windows first.
The real problem is that organisations had devices that were sufficiently connected to be vulnerable and did not have the patch applied. That leads to questions about software update policies within those organisations, and that in turn leads to some quite difficult questions about regulated medical devices and how they are supplied and maintained.
Not for XP/2003. That patch was not generally available months ago.
Consider the fact that Windows has the best backward compatibility in the business, while even drivers break across relatively minor Linux kernel versions and compatibility is likely to be a bigger problem with Linux.
>Consider the fact that Windows has the best backward compatibility in the business,
That is a complete and utter myth. Windows has terrible backwards compatibility: changes to the Windows driver model, among other changes, have required drivers and software to be completely rewritten between generations of Windows.
>while even drivers break across relatively minor Linux kernel versions and compatibility is likely to be a bigger problem with Linux.
Where do you get this? Drivers are included in the Linux kernel, so it is practically impossible for a driver to "break" across minor versions of Linux; if a driver breaks, the kernel fails and is not released.
Linux comes with a limited set of device drivers in the main source tree, just like Windows' bundled drivers. Most of this thread is about rare medical equipment or proprietary drivers/programs from companies that have gone out of business.
Also, the Linux kernel ABI routinely breaks drivers, unlike on Windows, where that happens much more rarely.
Yes, the Linux kernel being what it is makes for good/great backwards compatibility.
But that's so far from the point it's not even funny. This is about update policies and internet security at the organisations involved.
It was reportedly available to those who were still officially supported, though, possibly as far back as February.
As others have suggested, Microsoft has historically offered support (in the sense of at least security patches) for each generation of Windows for much longer than any of the major FOSS operating systems. Obviously you don't get free, unlimited, eternal support with any version of any OS, but even then Microsoft has apparently made arrangements with those who really didn't want to update to a more recent one than XP to continue offering support in return for additional funding.
As I said before, the real problem here is how to deal with the conflict between wanting to keep connected systems up-to-date with security patches, while at the same time not breaking their essential functionality. Medical systems used in regulated environments where failures may literally be a matter of life or death are pretty much the ultimate example of this difficulty.
What's impressive is how people manage to spin this into an anti-MS rant even when Microsoft is doing the right thing (stopping support for paleolithic software).
Windows 8 was shipping on brand new computers three years ago.
And while upgrading your OS is nice in theory, it often means abandoning perfectly-good hardware because driver support for multiple versions of Windows is terrible. In this age of barely-getting-faster CPUs, how long do you think a piece of hardware should be usable?
Purely out of interest what kind of hardware is incompatible between the two versions?
Talking about 8, not 8.1
> Purely out of interest what kind of hardware is incompatible between the two versions?
Many drivers tend to break as Windows goes through changes. When I tried a preview of the Windows 10 Creators Update, I had a driver stop working right. There were also serious nVidia problems that broke the old driver versions; imagine that happening with your typical company that stops providing drivers after a year.
Though the driver comment wasn't specifically about 8. Good luck upgrading XP to not-XP, even if your hardware can perform better than a modern Surface.
Windows 8.1 Update is basically a roll-up update service pack to Windows 8 and is a free upgrade. That's like complaining that the released patches work on XP SP2/SP3 but not the original XP.
If someone fails to update Windows 8 to 8.1 Update or has magically written software that works on Windows 8 but not 8.1 Update, it's on them.
>Many drivers tend to break as windows goes through changes. When I tried a preview of the windows 10 creators update I had a driver stop working right. There were also serious nVidia problems that broke the old driver versions; imagine that happening with your typical company that stops providing drivers after a year.
Those drivers are typically either using bad coding practices or relying on unsupported features or bugs. Maybe you should contact them as a customer and make your displeasure known. If enough people do that, they may actually do something about it.
2) Windows barely ever changes. The lengths through which the MS devs go to preserve compatibility are insane. It's incompetent driver writers who write buggy code who should take responsibility. If you were even minimally familiar with the Windows API, you'd know. And if you aren't, here's a quote from a Windows developer:
"I could probably write for months solely about bad things apps do and what we had to do to get them to work again (often in spite of themselves). Which is why I get particularly furious when people accuse Microsoft of maliciously breaking applications during OS upgrades. If any application failed to run on Windows 95, I took it as a personal failure. I spent many sleepless nights fixing bugs in third-party programs just so they could keep running on Windows 95."
( source: https://blogs.msdn.microsoft.com/oldnewthing/20031015-00/?p=... )
Just because they had the vaccines ready in warehouses or could manufacture more easily doesn't mean that their customers "deserved" them for free before the epidemic hit.
If the customers actually desired security, they would've paid for XP/2003 patches or upgraded to a different supported OS. Those customers messed up on their own, and Microsoft is giving them an out here.
Does that mean I can get info on current Windows 0days simply by subscribing to XP support program?
You could do what you said but it's pretty expensive and only works for a bulk deal at around $200/PC/year with a large number of PCs at a minimum.
I hope that the fact this patch was signed in February doesn't imply that it was published in February and available to every semi-competent cyberwarfare unit in the world.
I've managed a bunch of computers which weren't configured that way, but by configuring them that way you lose everything Microsoft created for managing computers on the local network -- to be effective you'd have to develop and maintain your own tools, which most companies wouldn't want you to do. If the users are supposed to do "normal work" on the given computers, not enabling file access is much harder to achieve.
I know there are "everything virtualized" approaches, but they are really expensive.
In the end, I blame Microsoft for not recognizing what their users actually want: I know a lot of companies which actually pay "the Microsoft tax" (as far as Microsoft's accounting is concerned, they "use Windows 10") while in fact using Windows XP and anything but 10.
And they are right to do so. The problem with everything after XP wasn't that the companies wouldn't pay for support. The problem was that Microsoft "innovates" in areas that businesses find directly harmful. Businesses would of course like updates, would of course like better and safer protocols implemented, would of course turn on new security settings if they were delivered, but they don't want the annoyance of everything that has nothing to do with the infrastructure, like "Windows or Windows Server that you have to use through the new 'phone' UI."
In short, there are many reasons there's a lot of Windows XP use, and Microsoft simply decided that they don't care.
Linux is of course even worse: even with Red Hat's efforts to provide long-term stable OS versions, most of what users consider "just apps" typically depends on so much random stuff that maintaining stability is unnecessarily hard.
Finally, Apple traditionally doesn't care much about corporate use of their products.
Which leaves most infrastructures without any "straightforward" choice. And "redeveloping everything" every time the OS companies decide to "innovate" is simply not possible.
That's where we are now. There's simply not enough awareness among the OS companies that "every non-programming entity" wants a stable infrastructure. Instead, the attention-deficit goals of the managers of the moment are typically chased.
More specifically on the NHS, it appears there was a decision in 2015 not to update some OSes because of Conservative budget cuts. I'm trying to track down details.
I'm told by a friend in IT in an NHS Trust that the NHS actually came off quite lightly - all the affected systems were front-end PCs that don't store patient data locally, the patient data was safe on back end databases, so he spent Saturday reimaging a few hundred PCs and not one satoshi of ransom was paid to the attackers. Hopefully they won't get complacent about the bullet they dodged. (Ahh, who am I kidding.)
Instead of remote apps, I think we need something more like the edge intelligence Microsoft demoed at Build. The central data centre pushes containers down to the devices and monitors them. You get the centralised control without the latency and disruption that remote communication would add, which would be unacceptable in many medical scenarios.
edit: regarding the "same building latency" do we want every hospital, clinic and doctor's office running its own local datacentre? That will come with its own availability horror stories. For something like the NHS a multi-region centralised AWS style datacentre makes sense.
- Critical devices (which in most cases don't even run Windows): already safe, because they use higher security standards.
- Administration devices (patient reports, etc.): these don't have a local database, and if you don't have a connection, your computer is useless. That's the reason for "the computers are not working" at hospitals when the network is down. So a network failure would be a denial of service both when running local applications that access a remote database and in the case of pure remote applications, with the difference that with pure remote applications the attack surface on the client side would be near zero.
Distribution, redundancy and routing around faults should be our vision for these systems and IMHO edge devices get closer to that. There are many ways a hospital can still shunt data around and use it locally in an emergency without giving up due to failure of remote systems.
Move to the Cloud and never have to worry about security again /s
Look at tradeoffs. There are no magic solutions. Pretending that cloud services don't solve any problems is as bad as pretending they solve all.
There are vast numbers of XP boxes out there. They represent a risk to all of us.
If you are in a highly regulated environment like the UK NHS then there is no excuse: either stay current, pay Microsoft the proper fee to support the OS you choose to continue running, or take other measures to ensure that your systems are protected, such as keeping them on an isolated/secure network with no Internet connectivity. We have solutions for this stuff; Microsoft isn't the bad guy here. The people that consciously made the budgetary decision to disregard their customers' / patients' data / welfare are responsible for this.
I'm no Microsoft fanboy, but blaming Microsoft for this is like blaming Ford for a traffic death that occurs today in a car that was manufactured in the 1950s before seat belts were standard equipment. We now know seat belts save lives, if you chose to take the risk of driving a car without them that's on you, not Ford.
This is more like blaming Ford for a road accident caused by faulty brakes on old cars that they knew about and didn't recall.
That was really cool, but the whole project disappeared.
If it had been open source, I bet it would still be actively maintained to this day.
Edit: Found the paper https://www.microsoft.com/en-us/research/wp-content/uploads/...
Xen is open source.
I found some PV IO drivers at https://wiki.xen.org/wiki/Xen_Windows_GplPv which mention XP (search for 'XP' including (!) single quotes), and a quick Google does immediately give hits on running XP as a HVM guest.
I'm (genuinely) curious what you're describing/referring to here. What project disappeared?
I don't expect that kind of thing to ever leave a research environment though. It would mess with too many people's heads and give people too many ideas of running bare-metal kernels other than NT.
Now that I think about it, I realize the reason HW virtualization really took off is that it let vendors keep their operating systems as actual operating systems in the traditional sense of the word, making for fewer legal issues (among many other reasons).
Also, I thought Xen was essentially just a super-thin layer to kickstart VT-x/AMD-V. I didn't know it could do anything else. In fact, I thought there was only emulation and hardware-assisted virtualization. Is there a middle ground I'm not aware of?
I didn't know Drawbridge was that amazing - that's incredible.
And now I'm starting to understand Microsoft's vision: they have WSL to get Linux infrastructure onto Windows, and Dk to get selected Windows infrastructure onto Linux. Impressive.
But now that I think about it that way, I know Dk will only ever be an internal framework - if it got released we'd basically have "perfect Wine" and it would allow quite a few too many applications to move off of NT.
The Drawbridge NTUM (user-mode NT kernel) is maintained as NT 6.2 (Windows 8), which is new enough for almost all purposes - except modern Windows apps.
I'll try to find and link that comment that managed to make a better point than I did.
We knew these devices were insecure by default. Some even shipped with a network enabled MS SQL Server with a blank sa-password. Quite literally a free root-kit.
Scientists and doctors working on these machines were forced to use portable storage (floppies, ZIP-drives or CD-RWs).
It was cumbersome, but no network was a strict policy, and it was there for a reason.
I wonder how tricky it would have been to set up a MAC- and plug-location-based VLAN to isolate those devices onto, with a very very carefully locked down machine sitting between the devices and the rest of the network. Deep packet inspecting firewall, copious logging, antivirus turned up to 11, the works.
I ask because I'm curious how well a theoretical setup like the above would have worked out for the described scenario - I'm sure there are similar environments where it may be impossible to get having no network approved by management.
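For what it's worth, the assignment logic behind such a setup is simple enough to sketch. Below is a hypothetical Python model of the MAC-based allowlist idea (the MAC addresses and VLAN IDs are made up; in a real deployment this policy would live in the switch or RADIUS configuration, not in application code):

```python
# Hypothetical MAC-to-VLAN assignment: known lab-device MACs land on an
# isolated VLAN behind the locked-down gateway; anything unrecognized is
# quarantined by default.
DEVICE_VLAN = 42        # isolated network for the legacy instruments
QUARANTINE_VLAN = 999   # default for unknown hardware

KNOWN_DEVICES = {       # made-up example MAC addresses
    "00:1a:2b:3c:4d:5e",
    "00:1a:2b:3c:4d:5f",
}

def vlan_for(mac: str) -> int:
    """Return the VLAN a port should be assigned based on the observed MAC."""
    return DEVICE_VLAN if mac.lower() in KNOWN_DEVICES else QUARANTINE_VLAN

print(vlan_for("00:1A:2B:3C:4D:5E"))  # 42  (known scanner, isolated VLAN)
print(vlan_for("aa:bb:cc:dd:ee:ff"))  # 999 (unknown -> quarantined)
```

Note that MAC addresses are trivially spoofable, which is why the plug-location half of the scheme (and the deep-packet-inspecting gateway) would still matter.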
And no, Microsoft can't guarantee that everything developed by a third party will continue working, nor should they.
The bottom line when it comes to places like the NHS is that they decided to cut costs by neither entering into a custom support agreement with Microsoft, so that they could continue to get security patches for XP, nor upgrading their systems to run on newer versions of the OS.
Good on Microsoft for doing this.
Your comment was directly above https://news.ycombinator.com/item?id=14330193 when I saw this page. Screenshot: http://i.imgur.com/8fydOGG.png
I'm starting to seriously dislike HN's lack of moderation transparency. I don't know who changed the title - if it was the post author or a moderator - and when.
Here is the direct link to the executable just in case.
 - http://download.windowsupdate.com/c/msdownload/update/softwa...
I'm not saying that's truly what's happening, but it's easy to imagine. I'd verify I'm connecting to the right domain and double-check with e.g. VirusTotal if I were you.
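As a low-effort version of that double-check, you can hash the file you actually downloaded and compare the digest against one published by a trusted source (or look the digest up on VirusTotal). A minimal Python sketch; the file path is just a placeholder:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large installers don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against a digest obtained over a trusted channel,
# e.g.: sha256_of("downloaded-patch.exe") == "<published digest>"
```

This only tells you that you got the bytes you expected, of course -- it does nothing if the published digest itself comes from the attacker's page.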
Having to check because registrars are dumb has nothing to do with the fact that doing the check is easy.
Incidentally, when I copied the link out of Chrome (57) it pasted the punycode link even though it showed "apple.com" in the omnibox. So then I carefully copy-pasted just the domain and TLD to work around Chrome's link-copying magic, submitted, and... discovered that Arc punycode-ifies Unicode domains.
So that was interesting, but it kind of killed the impact of the point I was making.
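If you'd rather not trust what the address bar renders, converting the hostname to its ASCII (punycode) form makes homographs visible. A small sketch using Python's built-in IDNA codec; the Cyrillic "аpple.com" below is the classic homograph example:

```python
def to_ascii(hostname: str) -> str:
    # Python's built-in "idna" codec implements IDNA 2003 ToASCII;
    # any label containing non-ASCII comes back in xn-- punycode form.
    return hostname.encode("idna").decode("ascii")

# The first hostname below starts with a Cyrillic "a" (U+0430),
# which renders identically to the Latin letter in many fonts.
print(to_ascii("аpple.com"))  # xn--pple-43d.com -- clearly not apple.com
print(to_ascii("apple.com"))  # apple.com        -- all-ASCII, unchanged
```

That a pasted link comes out punycode-ified, as described above, is arguably the safer behaviour for exactly this reason.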
I'd also look into http://download.wsusoffline.net/ . It's an offline installer; it has helped me out before when WU went south.
I will try upgrading again when I get time.
The only free Windows 10 upgrade is the one for users of assistive technologies. Of course, you can just pretend to be a user of assistive technologies, but I'm uncertain about the legal ramifications of doing so, regardless of whether or not you'll be caught.
Additionally, Windows 10 silently accepts Windows 7 and 8 product keys, but the legal situation is equally nebulous in the face of:
 https://support.microsoft.com/en-us/help/12435/windows-10-up... "Is the Windows 10 free upgrade offer still available?" => "The Windows 10 free upgrade through the Get Windows 10 (GWX) app ended on July 29, 2016."
You don't have to pretend anything. Even if you've never used an assistive technology, you can go turn on magnifier.
I wish more people would learn what can and can't be removed from a URL before sharing it. It's not that difficult, and it's easy to test that the "minified" version still works.
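Testing that is easy to automate, too. Here's a small Python sketch that keeps only the query parameters you believe are load-bearing and drops the rest (the example URL and parameter names are made up); you'd still want to load the trimmed URL once before sharing it:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def strip_params(url: str, keep: set[str]) -> str:
    """Drop every query parameter not in `keep` (e.g. utm_* tracking junk)."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in keep]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_params(
    "https://example.com/watch?v=abc123&utm_source=hn&utm_medium=social",
    keep={"v"},
))  # https://example.com/watch?v=abc123
```

The hard part isn't the string surgery, it's knowing which parameters a given site actually needs -- which is exactly why testing the result matters.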