The article does at least do some digging vis-à-vis the history of Zeffy's work and how it came to reach its current state, so some credit is due to Computerworld.
* do nothing on machines running < Kaby Lake
* not install on machines running >= Kaby Lake with no modifications
* only install and function on machines running >= Kaby Lake that have been hacked by the user to download future Microsoft patches
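At bottom, the gate described in the bullets above comes down to a CPUID family/model check. Here's a minimal sketch: 142 (0x8E, mobile) and 158 (0x9E, desktop) are Intel's published Kaby Lake model numbers, but the gating logic itself is an illustrative guess at the shape of such a check, not Microsoft's actual implementation:

```shell
# Hypothetical "generation gate": classify a CPU by its CPUID family/model.
# Family 6, models 142 (0x8E) and 158 (0x9E) are the documented Kaby Lake
# values; everything else here is an assumption for illustration.
is_kaby_lake() {
    family=$1; model=$2
    [ "$family" -eq 6 ] && { [ "$model" -eq 142 ] || [ "$model" -eq 158 ]; }
}

is_kaby_lake 6 158 && echo "update blocked"   # Kaby Lake desktop
is_kaby_lake 6 94  || echo "update allowed"   # 94 = 0x5E, Skylake desktop
```

In reality the check lives inside the update agent rather than a script, but the decision it makes is about this simple, which is why a small binary patch is enough to defeat it.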
Operating systems are hard, drop Windows and sell your software on Linux. We all know it's inevitable in the long run.
Following your argument, they'd have to be tested as well. A few more versions of Windows 7 and 8 don't seem like a huge budget breaker; it just takes more time. I don't think they'd use a dedicated machine for each and every version, you can use the same machines and just change the drives or use NetBoot or whatever.
The real issue here is that they want users to upgrade to Windows 10.
> A few more versions of Windows 7 and 8 don't seem like a huge budget breaker,
There are a lot of versions of Windows 7 and Windows 8 as well. Windows 7 had at least six SKUs.
> I don't think they'd use a dedicated machine for each and every version, you can use the same machines and just change the drives or use NetBoot or whatever.
You're completely underestimating the scale necessary to carry out proper testing and validation.
I think you're completely overestimating the scale. In any case, neither of us really know what's going on.
One thing we know is that Microsoft has the means to test these things if they wanted to.
Windows 10 has been out for a while, other processor generations came out; they didn't do this trick until now.
This only came up because Microsoft is seeing that their product is extremely unpopular with their customers.
From a profit perspective, they're doing the logical and obvious thing: forcing users to upgrade, because they need the license fees.
No version of Linux released in 2009 is still (a) supported, or (b) supports Kaby Lake.
I wouldn't mind Windows 10 if I could switch off every new crappy feature they've pushed into it (e.g. the phone-home stuff and ads), and trust Microsoft to respect my decisions.
The phone-home stuff is indeed a separate, and very serious, problem.
Just to get an idea.
That's only true if you're wedded to a distribution, though. There's always the option to switch to one that doesn't use systemd, and recreate your favored experience there.
Also, was it possible to install GNOME on Ubuntu if you didn't want to use Unity? I never used that distribution myself.
Windows 10 "Basic" level telemetry:
That is all Apple's policy has to say about Telemetry collection. It doesn't say when or how it gains consent.
With Windows you can't install without consent. I'm not going to re-image my Mac to check, but I wouldn't be at all surprised if consent was part of that first-boot EULA you agree to.
Ultimately, you find Microsoft's transparency on the matter unsettling, but Apple's hand-waving "trust us to do the right thing" is OK?
Microsoft does neither. You can't turn it off easily if you're not on Enterprise / Education, and the data being sent is encrypted.
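For reference, the usual knob here is the documented AllowTelemetry Group Policy value; a .reg sketch (the key path and value name are Microsoft's documented policy location, the comments are my gloss):

```
Windows Registry Editor Version 5.00

; AllowTelemetry: 0=Security, 1=Basic, 2=Enhanced, 3=Full.
; On non-Enterprise/Education SKUs, 0 is silently treated as 1,
; so "Basic" is the floor - which is the poster's complaint.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
"AllowTelemetry"=dword:00000000
```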
Windows 7 still has an install base twice as large as Windows 10's. It's literally nothing other than MS dropping support for 7 because 10 is out; imagine the outrage if Tesla stopped supporting an older Model S because a new one was out.
IIRC, the compatibility problems with the new processors are processor drivers and power management.
However, the dumb thing is that Microsoft already made the engineering effort to support those things, since they used to support Windows 7 on the now-unsupported architectures. Now they're taking it away to push Windows 10 some more.
And don't get me wrong, I think Linux is a fine OS and use it daily (lately more often than Windows). But "replace Windows" is not a static target, and the organizations in charge of promoting Linux as a desktop OS have nothing now or in the works that can compete with the corporate juggernaut of Microsoft.
Selling software is hard, especially on Linux. We tell people FOSS means free as in freedom, but the vast majority of even the people who parrot that line think it means free as in beer. As a software vendor, you'd be killing yourself to ignore 90% of the market.
I was gung-ho on linux on the desktop in the late 90's, but there's one fatal flaw in the bazaar volunteer-driven model that prevents it from catching on: it is more fun to write anew than to maintain and improve. The desktop linux ecosystem is stuck in an endless loop of rebuilding what was, instead of advancing.
If you're referring specifically to the various DEs on the Linux desktop, I agree: there is a ridiculous number of options, none of them doing "modern" super well. Personally, I'm on Xfce, and that is because it doesn't do "modern" on purpose. DEs were feature-complete in 1995; everything else has been useless fluff.
Yes, the Linux kernel was developed ~25 years ago, but Linux is not an operating system; it is a kernel. GNU/Linux distributions have several desktops, depending on what part of the world you are from, etc.
Linux hasn't hit the desktop market because you cannot just go to a store and buy one.
The problem is availability: the average consumer does not want to buy something and then alter or modify the product as soon as they get home.
To most people computers are entertainment devices. They just want it to work. They just want to buy one.
The reason that Microsoft, Apple, and even Google (Chromebooks, Android) have won their market share is that they realized the formula: be a hardware company first, software company second.
Yes, Microsoft has a cartel on hardware, as they got all the vendor companies to make hardware for them. It's like McDonald's being a real estate company rather than a burger-flipping company: they take their money off the top of the hardware vendors. They have created an ecosystem that is an endless cycle of upgrades, and the cost of ownership of a Microsoft ecosystem is quite high over the long term. They have developed a system of hooking users with the home/personal lines, which in turn has forced businesses to purchase their products.
I should also point out that Android (Linux kernel) has the largest market share at the time of writing. Personal computers are changing into things that fit into pockets.
The average consumer is not going to want a desktop or a laptop anymore, since their use case is just entertainment.
If GNU/Linux wants to compete for the smaller professional/gamer/power-user desktop and laptop markets, there needs to be a hardware company to champion it (like Dell) that will put it front and center at the Wal-Marts/Best Buys/Fry's of the world, so that we can just go buy one.
Remember, the superior product is a matter of perspective. Which is the superior product: the one that is nearly perfect, barely breaks, and is fixed quickly if it does, or the one that has flaws, multiple ways to break or crash, and only lasts a few years?
It all depends. To the salesman, the one that breaks, of course, so we can sell more of them. If it does not break, I can only sell it once.
So... long rant short: until we can just drive to the store, buy a Linux PC off the shelf, and start using it, there will be poor market share, and we'll be fighting the FUD at the workplace.
Remember the 90s, when you could do that; at least the smaller shops used to have some Linux CDs. What happened? Why is no one seemingly interested in selling it? What I mean is, the smaller hardware shops that are still around could sell NVMe disks preloaded with Linux distros, and they could also sell enclosures so that users could run it over USB 3 as a portable operating system. Let's face it, dual booting is dead; those with an interest in Linux who can't escape the Windows trap could maybe run it from portable storage.
Imagine if Red Hat back in '96-'97 could have gotten HP, Dell, Compaq, etc. to sell a pre-loaded product. How many more people would have been exposed to Linux? Good or bad, exposure is exposure.
* a standard application format working across all the distros, with isolation, permissions and stuff (see Android's .apk files, snap, flatpak, AppImage etc.)
* fault-proof system updates. At the moment it's possible to break something via a regular update and it'll cost you dearly; maybe Windows-style restore points/backups.
* a unified desktop experience, not the current shitshow where every single app can draw its very own shaped buttons
For those unfamiliar with "snaps", here's a quick primer:
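(The original primer didn't survive, so here's a rough, generic one: snaps are Canonical's self-contained, sandboxed application packages; each app bundles its dependencies in a compressed image, is confined by kernel security features, and updates atomically with rollback. The workflow, assuming snapd is installed, looks like:)

```
snap find vlc              # search the store
sudo snap install vlc      # install; dependencies are bundled in the snap
snap list                  # show installed snaps and their revisions
sudo snap refresh vlc      # atomic update; the previous revision is kept
sudo snap revert vlc       # roll back if the update broke something
sudo snap remove vlc       # uninstall
```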
Google "gpl 2 patent clause"
First result, for me:
Patent clauses in software licences - software patents wiki (en.swpat.org)
Jump to GNU GPL v2 - (See: GPLv2 and patents). See section 6 and section 7. Patent attorney Dan Ravicher argues that GPLv2 includes an implicit ...
> The GPLv2, despite being silent with respect to patents, actually confers on its licensees more rights to a licensor’s patents than those licenses that purport to address the issue. This is the case because patent law, under the doctrine of implied license, gives to each distributee of a patented article a license from the distributor to practice any patent claims owned or held by the distributor that cover the distributed article. The implied license also extends to any patent claims owned or held by the distributor that cover “reasonably contemplated uses” of the patented article.
At least with Microsoft you have support for APIs for a long, long time; e.g. MFC is old, but it is still supported.
With Linux you are on your own.
So "get your software on Linux" is easy to say but in the real world it isn't so easy.
So... how do I run Linux on a new Dell XPS or a Surface Book without updating the kernel to the newest 4.x branch? Because last I checked, Linux forced me to update to a new version (including userspace, since new kernels weren't compatible with old userspace) to run on new hardware as well.
Seriously, it's kludge after kludge. Windows 7 uses the WSUS protocol (which is a hairball set of web services) to figure out which updates it needs to apply, which it does by recursively querying what base packages are on the system; this goes some way to explaining the incredibly slow updates people see. It gets its latest package list, and after Microsoft realised how many security patches they were releasing, they discovered their CAB file format needed to be rearchitected because it couldn't hold enough files... hence wsusscn2.cab must now be downloaded, along with something like a third or fourth attempt at getting the Windows update agent written correctly.
But it gets worse, because somehow it must track what it has downloaded in a gigantic (over 1 GB) opaque ESE .edb file, which it synchronises with the software distribution cache in the Windows directory. Here it looks up the Component Based Servicing registry, along with a COMPONENTS hive that only the Windows Update service loads and that you generally can't see when you open Regedit. Then there is a set of keys under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing, consisting of an ApplicabilityCache, a list of packages, a package index, a set of package detection keys, and a set of component detection keys.
Once you begin to decipher the current and applicable state, you must work out how it relates to the package index list in the registry, which in turn somehow relates to the packages keys, which have their own interesting binary values that Microsoft set...
So once you've worked that out, you need to decipher the manifest files in the Windows Side-by-Side (WinSxS) system. These are signed with CAT files, and usually have Microsoft Update files that go along with the actual payload files. Somehow these relate to a set of session XML files, which are meant to help you troubleshoot when things go wrong and package states go awry, except the XML format isn't documented anywhere except in a few tantalising blog posts which aren't in any way complete, and some of which seem to be Microsoft developers reverse engineering the file format themselves...
WinSxS itself holds every file ever installed by Windows Update in the %systemroot%\winsxs folder, which is a bunch of folders with NTFS hardlinks back to the C:\Windows\system32 files. Microsoft originally wanted to see the state of an installation, so they decided that when the links were updated to newer files they would keep the old files around; that way they knew the system state and presumably could allow rolling forwards and backwards to a snapshot in time. They reasoned this was OK because they released the dism.exe tool to remove hotfixes and updates from previous service packs. Unfortunately that switch never got used, because DISM was released in Windows 7 SP1 and somebody at Microsoft decided to go with rolling updates and no more service packs.

Consequently there are often 8 or 9 GB of unneeded and unused files on most Windows 7 systems (depending on the age of the system and how frequently it has been updated)... and evidently someone at Microsoft realised this, because about 4 or 5 months ago they released an update for, of all things, the Windows Disk Cleanup Wizard to remove these old packages. You must run it, then reboot, and depending on the number of packages it removes, it has been known to leave end users stuck at "100% of updates have been installed" for 15 to 45 minutes whilst it cranks through the cleanup process. Of course, most end users think their PC has crashed and Windows is "stuck", so they reboot it halfway through, with varying results.
There are three different ways to check for corruption: the sfc utility, the dism utility, and a variety of downloadable Windows diagnostics that are a bunch of PowerShell or VBScript scripts attempting to slowly fix the plethora of issues that can prevent Microsoft Update from working.
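For anyone stuck in this mess, the documented front doors are a handful of commands (the comments are my gloss; note that both DISM switches below exist only on Windows 8 and later, where on Windows 7 the System Update Readiness Tool and the Disk Cleanup update mentioned above are the rough equivalents):

```
rem Verify protected system files against the component store
sfc /scannow

rem Verify and repair the component store itself (Windows 8+ / Server 2012+)
dism /online /cleanup-image /restorehealth

rem Remove superseded WinSxS payloads, reclaiming the gigabytes discussed above
dism /online /cleanup-image /startcomponentcleanup
```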
No; kernel reverse engineering is fun, but poking around Windows Update is not. I wouldn't recommend getting hired to untangle it...