The Windows Update Marathon in a VM: From Windows 1.01 to XP (winhistory.de)
344 points by pionerkotik 12 days ago | 250 comments





The level of backwards compatibility in Microsoft products is very impressive.

Microsoft Office 2019 can open .doc files created in Microsoft Office 97 and upgrade them to a modern format, with zero loss (except for things like OLE objects, which are no longer supported for security reasons). My parents' computer has documents which haven't been touched in 20 years and they still open just fine.

Or applications written for Windows NT that still work - unmodified! - on Windows 10 thanks to layers of compatibility hacks. A true feat of engineering.

If you want to read more about the horrors of maintaining APIs that are backwards compatible with software written decades ago, I can heartily recommend Raymond Chen's blog: https://devblogs.microsoft.com/oldnewthing/


Back about 20 years ago I built a system for my Dad's company to generate export documentation, manage custom price lists for multiple customers and track inventory and factory production, all based around multiple Excel files and a fair bit of VBA code.

Fast forward through multiple upgrades of Office, and it still works to this very day! You just have to keep ignoring the scary self-signed certificate warnings. ;)

I'm a die-hard Linux guy, but credit where credit is due - Excel is awesome.


Backwards compatibility. It sounds great.

Windows can run the previous version of Office. But running the current version, alongside say, Office 2007, is a major pain in the ass.

Maybe you just need to run the old version of Outlook, because your IP phone system or ERP software has some plug-in that was never written for the next version and isn't compatible.

Maybe you need to run a version of Access that's so stinking old that you have to use Windows XP in a VM.

Now, to make things worse, the database connectors for that version of Access are so outdated that they're holding the entire company back on MySQL 5.5, because if you upgraded, your Access application wouldn't work anymore.

Not to mention, all that code you wrote 20 years ago in VBA is insecure as all hell. Unencrypted connections, ripe for SQL injection, plus that version of Access can't work with large datasets or return large results. Dammit!
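As an aside, the injection risk described above comes from old code building SQL by string concatenation, which legacy VBA commonly did. A minimal Python sketch of the difference, using the stdlib sqlite3 module as a stand-in for whatever driver the VBA actually used (the table and values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")

user_input = "Acme' OR '1'='1"

# Vulnerable: attacker-controlled input is spliced into the SQL text,
# so the injected OR clause matches every row.
vulnerable = conn.execute(
    "SELECT id FROM customers WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the driver sends the value separately from the query text,
# so the quote characters are just data, not SQL.
safe = conn.execute(
    "SELECT id FROM customers WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [(1,)] - the whole table leaks
print(safe)        # []     - nobody is literally named "Acme' OR '1'='1"
```

The same parameterized style works with ADO/ODBC from VBA too; it's the concatenation habit, not the language, that's the problem.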


You're kind of conflating several different points.

The ability for an application (eg, Office) to install side-by-side with other versions of itself is entirely different than its Backwards Compatibility (eg: ability for current versions to open and/or save Office 2007 format). In fact, because of the latter, provided they did a good job at it, the former is arguably irrelevant.

I'm not familiar with Office/Outlook/Access's APIs and their history of backwards compatibility, but given that Microsoft usually does a good job with BC, I'd tend to suspect the other vendors (IP phone/ERP software) are at fault. In my experience, there's a good chance the only real incompatibility is with the vendor's installer or some version check that prevents it from working, as opposed to really not actually working, say, because Microsoft removed some API they used.
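For what it's worth, a classic way such a version check goes wrong is comparing version strings lexicographically, which happens to work for old version numbers and then wrongly rejects newer ones. A hypothetical Python sketch (the function names and cutoff are invented, not from any real installer):

```python
# Hypothetical installer gate: compares version strings as strings.
# This happened to work while versions were single-digit ("6.1" < "6.3"),
# but rejects "10.0" because "1" sorts before "6" as a character.
def naive_supports(os_version: str) -> bool:
    return os_version >= "6.1"

# Correct gate: compare numeric components, not strings.
def correct_supports(os_version: str) -> bool:
    return tuple(map(int, os_version.split("."))) >= (6, 1)

print(naive_supports("6.3"))     # True  - works by luck
print(naive_supports("10.0"))    # False - app declared "incompatible"
print(correct_supports("10.0"))  # True
```

Software gated like this is "incompatible" with a newer OS even though every API it calls still works.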

The rest of what you're talking about really points to some other dysfunction, or a bad or unlucky decision.

Did you buy an IP phone system expecting 20 years of use from it? Why can't you get updates -- is it because you don't want to pay for maintenance, or did they go out of business?

The cascade effect you're talking about here though highlights something else: So many companies do things they think are saving money, but actually cost them more.

For example, they can't go to the latest IP phone software, because they need to buy a hardware upgrade that will cost $x, so it means they have to use old version of Access that uses old MySQL which means other development team spends $y extra working with that version (or paying for custom dev for other products to work with it, etc). Did anyone actually evaluate which is larger? The decision to "save" $20k on a phone system upgrade could easily cost $50k or more for several months of a developer's time.
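To make that trade-off concrete, a back-of-the-envelope comparison (the figures are the hypothetical ones from the example above; the monthly rate is an assumption):

```python
# Hypothetical numbers from the example above.
phone_upgrade_cost = 20_000    # the "saved" one-time upgrade
developer_month = 12_500       # assumed fully-loaded monthly cost
months_of_workarounds = 4      # "several months of a developer's time"

workaround_cost = developer_month * months_of_workarounds
print(workaround_cost)                       # 50000
print(workaround_cost > phone_upgrade_cost)  # True: the "saving" costs more
```

And the workaround cost recurs every year the old stack stays in place, while the upgrade is paid once.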


>The ability for an application (eg, Office) to install side-by-side with other versions of itself is entirely different than its Backwards Compatibility (eg: ability for current versions to open and/or save Office 2007 format). In fact, because of the latter, provided they did a good job at it, the former is arguably irrelevant.

I am still somewhat surprised that Windows doesn't have a Docker-esque sandboxing system to solve this problem (I mean, it has Docker, but it's based on a full VM). I suppose in the third-party realm at the very least you have Sandboxie, but meh...


> I am still somewhat surprised that Windows doesn't have a Docker-esque sandboxing system to solve this problem (I mean, it has Docker, but it's based on a full VM). I suppose in the third-party realm at the very least you have Sandboxie, but meh...

Starting with Office 2016 (maybe 2013), App-V from MS does precisely what you describe, and it doesn't require any sort of classic virtualization component to be accessible to the OS or hardware.

See: https://en.wikipedia.org/wiki/Microsoft_App-V

The tech is there. Usage is a different story.


Interesting! Didn't know about this.

Don't forget that backwards compatibility also means being able to run old versions of Office on newer versions of Windows.

Yeah, on the surface that stuff always sounds nice because "see, I can run this thing from 20 years ago and I didn't have to do any extra work" sounds nice, but that also means it hasn't been maintained for 20 years...

I'd rather have planned deprecation and make choices to (at some point) abandon certain APIs, features and formats. In the case of formats, a public release of some parser code would be nice, so that if someone really wants to get some old data, they can, but other than that, stagnation isn't as good as people make it out to be.


> Yeah, on the surface that stuff always sounds nice because "see, I can run this thing from 20 years ago and I didn't have to do any extra work" sounds nice, but that also means it hasn't been maintained for 20 years...

Some things don't need intensive maintenance, and it's a bad thing to force it for unnecessary reasons. Backwards compatibility can often be understood as 10 vendor engineers performing maintenance effort so 10,000 customer engineers don't have to. It's a massive productivity win for society.

I'm a software engineer. I hate working on ancient codebases as much as the next guy, and I personally benefit from the constant make-work of re-engineering. But, as a customer, I'd rather invest my time and money to solve new problems, rather than spend it re-solving old problems.


This assumes (like everyone else seems to do) that the code was perfect, the APIs that were called were perfect, and that the process never changed. This is almost never the case.

> This assumes (like everyone else seems to do) that the code was perfect, the APIs that were called were perfect, and that the process never changed. This is almost never the case.

The code and APIs will never be "perfect," and many processes change little.

Also, it's a little ambiguous whose code you're talking about: the customer's or the platform's. A platform that maintains backwards compatibility is an achievement worth celebrating, because it saves loads of effort across society and gives people access to software who might not otherwise be able to afford it. If a platform breaks its customers' code because it's chasing after API perfection, it's kinda abusing its customers.


Yet in reality almost no business software will 'work' on newer versions of a platform, whether due to licensing, support, or version-check restrictions. Heck, get a SIMATIC PC and randomly upgrade that 'platform' and watch a whole system stop functioning.

The theory is that a platform's 'backwards compatibility' is only good for lazy developers and lost source code. And yes, not doing maintenance saves time, but so does not upgrading your platform. I mean, if 8-inch floppies work for nuclear installations, OS/2 works for subway passenger systems, and Windows 3.1 for radio servers and other flight control systems, might as well not maintain anything at all. Heck, why not use relay logic in the subway control rooms.


> might as well not maintain anything at all.

I think the issue here is that you're viewing this as too all-or-nothing. If the OS/2 subway software is well-tested and does everything it needs to do, what's the benefit of spending $10 million rewriting it or paying a guy to keep up with platform changes? If IBM (or whoever owns it now) keeps maintaining OS/2 so the software can run with little to no modifications on new hardware (since hardware fails), good on them. There's probably 1,000 such systems, and that's $10 billion in savings for society.

That doesn't mean no software should actively be maintained, a lot of it should. It also doesn't mean that working old systems shouldn't sometimes be replaced or upgraded. But it does mean backwards compatibility is important and shouldn't be dismissed.


This looks like a very idealistic approach. The world does not revolve around a perfectionist's wishes.

As for "...means it hasn't been maintained for 20 years..." - the OS did maintain the old APIs and the way they behave, to make sure old software that uses said APIs still works on the new OS. That does not mean new APIs were not developed.


With complex formats, you can make them as open as you'd like and nobody is going to get it right but you. See the Office Open XML format.

Given the relative lack of sophistication of Access databases, migrating to literally any other RDBMS isn't very hard. MS offers a free converter from Access (and MySQL, Postgres, Oracle) to MSSQL. You may have to manually port stored procedures, but it's not like MS doesn't give a fuck.

https://www.microsoft.com/en-us/download/details.aspx?id=542...

Cheap bullshit, not technical limitation, is at the heart of this level of "compatibility" issues.


This doesn't seem to take into account that Access isn't just a database.

The database is already on a real SQL server; it's just the old, dated connectors that you have to use.

As another Linux fan, you are indeed correct that Excel is awesome. But I think Visio is awesomer. You have LibreOffice Calc that gets within the ballpark of Excel, but nothing is decently close to Visio on Linux.

[flagged]


What percentage of developers who use Macs do you imagine create macOS/iOS software, vs. POSIX software?

I’m fine with using an ephemeral toolkit to create long-lasting infra, and it even makes sense to me that some people want to sometimes write new pieces of the ephemeral toolkit to help me do that (like stylish GUI text editors or SFTP clients or notification-area service managers.)


I even have software (Drawful 2) that was released in 2016 and won't run on Catalina because it is 32-bit. That's just three years.

That’s on the developers to be honest, less so on Apple (but Apple is not entirely guilt free). It was very clear in 2016 that they should’ve been targeting 64bit.

Not really, Apple could have just supported running 32 bit apps.

Google is pulling similar things in the Play Store. They will pull apps just because they are not 64 bit. They delayed the deadline for a year now, because many developers really don't care.

And as a consumer it really does not matter. Why would a 32 bit app not work 10 years from now?


They could have, but they chose instead not to let developers stall on their platform.

That's a stupid argument when comparing to the Windows examples, where that simply doesn't matter: you develop for the current version, and 20 years later it's still supported.

There is no technical reason for Apple to drop 32-bit support (in that they could technically keep it), since the x86 CPUs they rely on support it - I've heard that there were some architectural reasons with Cocoa (I think) that made it harder to do, but they came up with that architecture themselves while knowing that people had invested in 32-bit support. They knowingly decided that making the work easier for a handful of people was more important than the many thousands of applications that would break - they put themselves above their customers.

Go ahead and install Wine x86 on a Gentoo x64 system. Only then will you understand what it means to support 32-bit apps.

I heard that the 32-bit Objective-C runtime was just a massive pain to support because it never got the new features from 2.0.

Suppose the sole developer died five years ago. Had he created an application for Windows, his users would still get to enjoy it. Had he created a FOSS application and given his users the source code, his users would still get to enjoy it. But if he created a closed source MacOS application, then his application soon joins him in death.

Oh hey, Drawful 2 is the reason I’m not upgrading!

Please be less patronising in the future; it's not a good look. In reality, of course, many people currently "sitting around typing" are doing plenty of useful work using a variety of different tools.

There are various tradeoffs to be made when considering the benefits and costs of a strategy for long-term backwards compatibility. Microsoft was traditionally on the "it will continue to work until the heat death of the universe" end of the spectrum, and Apple on the "we will be dropping support soon" end. That is, Microsoft invests a bunch of effort into ensuring old software will continue to run on new platforms, whereas Apple prefers to drop support for various platform features (68k, PPC, 32-bit) with a few years of notice.

In some cases and for some users, backwards compatibility has a value that is worth the cost paid in development time, testing, and complexity. For others, it is not. I've been a Mac user since the G4 days, and I don't think I've ever really been in the situation where I thought "oh I need to run this old software but it's no longer supported" – that is, the feature has no value to me. And I'm pretty confident that the things I produce will continue to have value in the future.

TLDR – other people are different to you sometimes.


What percentage of code written 20+ years ago is still in production - without any kind of updates - today?

Also, many developers use Macs not because they're in love, but because of the practical conveniences of working on a Unix-based OS.


Not every piece of software is a vape-delivery-via-drone-app.

I have about a dozen small, very specific scientific 'calculators' (only the Windows/Linux ones still run; the Mac ones are all dead, which is a darn shame since there were some jewels out there) that were written by researchers 10-20 years ago and never updated since. They haven't been touched by the original developers in years; in fact many of their host websites have been lost to the ages, so they exist only through sharing on forums and whatnot. But, since they do not need any sort of connection to the outside world (times were simpler back then, you see...), they still run fine in compatibility mode. For some of them the functionality can be replicated with modern software (probably in a web app or something), but the original tools still work perfectly for their jobs and are still useful today. Don't even get me started on all of the equipment which is stuck running Windows 98 because the serial port library used to communicate with the equipment uses the DOS API...

examples: Psst! - https://www.st-andrews.ac.uk/~psst/

Laser Line - http://www.skywise711.com/laserline/

I also use a number of paid tools that I paid for back in the '90s or early 2000's and haven't felt compelled to 'upgrade'. They run great and give me the answers I need, so until the vendor has added actual useful features (instead of reskinning them with a flat/metro interface...) I see no reason to upgrade.


In 1991 I made the last update to a piece of software (DOS-based, with proprietary multithreading and graphics layers). The software controls a scientific device; it collects, processes and displays experimental data in real time and stores the data for later analysis.

About 2 years ago I was shocked to discover that they're still using said device, with the software running in unmodified form under DOSBox or something like that.


I wasn't claiming that there isn't any software built 20+ years ago which hasn't seen updates and yet still works and still has utility.

The point is there isn't much of it relative to the wider ecosystem. One particularly important reason for that is security. Most of the dependencies for software written 20+ years ago have long since been EOL'd and are therefore vulnerable.


There is more software out there than just desktop apps.

At my work there is code from the 1980s, written for an IBM mainframe in PL/I, still running. It has seen minor changes over the years, but it's very much 40-year-old code at the heart. This code has basically had continuous uptime since then. I work at an industrial plant, and I've heard it's not uncommon for banks, insurance companies, power plants, etc. to have similar setups with the same vintage of software.

We have C and C++ code from the '90s originally written for RS/6000-era Unix boxes. That stuff is pushing 30 now and is still actively maintained.

There is Fortran with a genesis dating back to the '80s still being run today.

A 30-year lifespan for industrial equipment is not uncommon; it would be nuts to replace control software and the like so often.


Yeah! Screw those filthy Mac lovers! They do nothing important. You can tell by the OS they are using! They should get a real computer designed for serious business! I am still using Windows Vista and plan to for the rest of my life!

Dude, if you are trolling, don’t. If you aren’t, also don’t.


If there is anyone who'd think "screw those filthy Mac lovers" that would be Apple themselves since it is they who are in a position to preserve backwards compatibility and yet they decide to not do so.

As a Mac user, I don’t feel it. I know people get really particular about this shit, but I spend roughly 4 hours every year making macOS do what I want. That’s it. When I ran Ubuntu, it was about 8-12 hours every 6 months. When I ran Windows it was... it was a lot more. Backwards compatibility isn’t as important to me as shit just working. My current setup hasn’t been changed in about 2.5 years. Early 2017 is the last time I opened system settings at all. Sure I periodically install a homebrew package or two, but I consider that to be a part of the normal workflow, not an OS tuning thing. All my other daily software and even stuff I don’t use frequently just works.

I know that sounds like I am some kind of fanboy. I assure you I am not. I wish I could go back to Ubuntu where I feel most at home. But the fact that I think so little about my OS that I forget it is there is what makes this so easy for me to stay.


> Backwards compatibility isn’t as important to me as shit just working.

Backwards compatibility is about shit working! You can't have that when a program you want to use stops working because the OS broke backwards compatibility.

I have used an iMac for a long time; every time there is a new version of the OS, something breaks, and in some cases there isn't a way to fix it. My 2009 iMac went through several OS upgrades, and pretty much half of the installed software doesn't work anymore. In most cases I cannot even upgrade - either because I bought the software outside of the MAS (which didn't exist at the time I made some of those purchases), or because the developer stopped supporting the software (including some of Apple's own software - e.g. I really like iWeb, though Apple doesn't anymore), or simply because the developer doesn't exist anymore. Or I just prefer the version I have over the version the developer would want me to use (because they bloated it up, added ads or whatever).

Yeah, when I first started using macOS I had pretty much the same stance as you. It went away when things started breaking left and right over the years. There is no "shit just working" if the OS breaks said shit.

(Of course in the 2009 iMac's case that isn't very relevant anymore, since Apple decided a 3GHz Core 2 Duo isn't enough to run their OS, but it still happens with some newer Macs I have.)


Backwards compatibility is for when software reliant on the OS isn't actively maintained. If it is, developers simply update it and it keeps working. So it depends on what you use in your stack/workflow.

[flagged]


Yeah. But you are commenting on what people do with the product, without any basis whatsoever for it.

And just curious when was the last time you used a non-MS OS as your daily computer? Am I close with a guess of “not in the last 10 years”?


> I'm just more open-minded than others here.

You made a patronizing remark about millions of people you don't know. (https://news.ycombinator.com/item?id=21452097)

Following up with this comment and putting yourself on a pedestal is definitely trolling, and not very creative either.


> few of those people you see sitting around typing on their MacBooks are doing anything that will be of value 5, 10, or 20 years from now.

I mean, the people "typing on their MacBooks" combined together have resulted in a company with THE highest value ever in human history, and is copied by all the other companies that you champion.


1. This morning MSFT's market cap is higher than AAPLs.

2. But that's not how "most valuable" is usually rated by analysts: https://www.forbes.com/sites/alapshah/2018/08/02/apple-hits-...

3. There are privately held companies that are probably "more valuable", like Saudi Aramco

4. I don't know what I've done (which you asked in your original post), but I know I'm not a fanboy/shill/astroturfer for Apple.


Alright, so ONE of the highest valued companies ever.

> but I know I'm not a fanboy/shill/astroturfer for Apple.

No, you're just a hateful individual suffering from brand envy.

See? Assumptions are easy and fun indeed!


Very few of those people using Macbooks are creating software for Apple devices. It is a fashion statement. They could be using any laptop.

Graphviz was written on Macs and Linux. Its native OSX container was developed by someone now working at Apple. Most of its code is OS-independent. Like everyone, we are not thrilled with MacBooks these days (expensive, not-great keyboards, useful ports removed, and a seeming love-hate relationship with Unix * ), but if you want to run Office and Lightroom and run typical Python data science code, what else is there? A colleague tried the Windows Linux system and it kind of works, but not well enough for what we're doing. It's the same everywhere.

* Here's a fun exercise: try to dump your iCloud passwords in cleartext so you can scan for potentially compromised passwords on forgotten systems (using shell commands, not a proprietary password manager.) OSX doesn't even generate messages to acknowledge that it's designed to not let you do this.


> if you want to run Office and Lightroom, run typical python data science code, what else is there? A colleague tried the Windows Linux system and it kind of works but not well enough for what we're doing

Ubuntu, Debian, Fedora, Arch

The answer to Office/Adobe/gaming is simply don't do it, get someone else to do it, or get a different job. Your time is better spent on stuff like python data science, which Linux supports better than anything else out there.

If you must, run Windows 10 in a VM. If you're doing heavy duty video editing or Photoshop, get a second GPU just for the Windows 10 VM[1].

[1] https://wiki.archlinux.org/index.php/PCI_passthrough_via_OVM...


> The answer to Office/Adobe/gaming is simply don't do it, get someone else to do it, or get a different job.

I am not sure recommending someone to get a different job is a good idea simply because of the choice of their operating system... I'm a huge linux fan, both of my personal laptops have linux and have been using it since 2006. But I still know that realistically it's not the best OS for certain tasks, like gaming. That's why my gaming PC has Windows on it.


> try to dump your iCloud passwords in cleartext so you can scan for potentially compromised passwords on forgotten systems (using shell commands, not a proprietary password manager.)

Allow me to direct you to the Security APIs, which would make your task a couple lines of code that's as permissive as you'd like it to be: https://developer.apple.com/documentation/security/1398306-s....


I suppose that depends on where you live or work. Around here there are perhaps 10% of users of any brand who act like that (except in the BlackBerry era where it was part brand, part BBM). The rest just uses what makes sense for their use case.

I currently do part-time consulting in a company that has a floor full of developers. All Apple. Not a single piece of software they develop actually targets the Mac. They do full stack with cloud and browser targets.

Most of the companies I visited the last few months have a choice of systems plus a BYOD budget, but still most people prefer getting a Mac, and again, almost none of them are writing software for macOS or iOS, and are as in your example often working for web, cloud or managing *nix servers.

The obvious conclusion is that there is quite a bit of prejudice and bigotry in your worldview.

And if you find this observation unwarranted: you not only made a sweeping generalization against everyone who does X, you authoritatively dismiss all FUTURE potential of their work. It must take no small amount of bitterness to see things that way.


Well looking at where the Microsoft ecosystem is now and where the Apple ecosystem is, I think Apple made the right choice. Apple alone sells more devices running macOS variants than all Windows PC makers combined.

As far as backwards compatibility though - MS's new ARM-based Surface won't run 64-bit x86 Windows apps, runs 32-bit x86 apps slowly, and still gets half the advertised battery life of iPads.

Windows on Surface took 12GB of hard drive space compared to about 3GB for iOS on iPads.


> Apple alone sells more devices running macOS variants than all Windows PC makers combined.

Where did you find those numbers?

https://www.macrumors.com/2019/01/10/apple-mac-sales-drop-q4...

In Q4 of 2018, Apple sold 4.9 million Macs. Lenovo alone sold 16.6 million PCs. All brands except Apple sold 63.7 million.


I said MacOS variants - iPhones, iPads, and Macs. Those are three areas where we have recent real numbers from Apple before they stopped reporting volume numbers.

I could include Watches and Apple TVs, but those are far from general-purpose computing devices.


MacOS is a desktop operating system; the other devices you mentioned don't run desktop operating systems. You have to do an apples-to-apples comparison, because otherwise Linux wipes the floor with every OS if all we are talking about is market share on metal.

Why do you have to compare desktop operating systems? An iPad Pro running the latest OS is faster than most x86 PCs, runs Office and now Photoshop, and can easily take the place of a desktop computer for many people.

But regardless, if you are referring to “successful”, which is what I'm discussing: success from a business standpoint is profitability. Apple is definitely making more in profit and revenue than Microsoft. As in, Apple had the more successful strategy.

There are plenty of people whose only computing device is a phone, and others who hardly ever use a computer for personal use. Even the iPhone can keep up performance-wise with some low-end PCs being sold.


If you want to use iOS to claim that macos is more popular than windows, then I may as well use Android to claim Linux as the most popular consumer OS

Well, if I wanted to say that, that’s what I would have said....

But I said variants. And even if you take MacOS out, the statement remains.

And seeing that iOS now runs Microsoft Office and a version of Photoshop, and uses more powerful processors than most x86-based PCs, we can completely take MacOS out and count just iOS.

I said nothing about popularity; I was saying successful. Success in a business isn't market share or popularity, it's profitability. Seeing how little profit OEMs are making selling either Android devices (most of whom are losing money - except for Samsung) or PCs, I wouldn't be surprised if Apple is more successful than all of the PC makers combined selling Macs, and it's well known that Apple captures more than 70% of the profit in mobile.


People don't use their phones to write documents or do professional image/video editing or any professional workload, really. And I think market share is as good a metric as any when it comes to defining "success". Profitability is due to many reasons, and it's not always due to creating the best product.

Can you spend “market share”? Can you sustain a business on “market share”?

A company can't stay viable based on “market share”. Nor am I arguing what's “best”. Do you think Dell would rather be in Apple's position, or Apple in Dell's?

There are plenty of people whose only personal computing device is their phone either by choice or necessity. Heck I am a developer and the only thing I use my computer at home for is as a Plex server.

It’s on my list to get a powerful enough NAS to transcode movies and I won’t even use it for that. I’ll run Plex and bit torrent (to uhh download Linux ISO’s) and B2 backed cloud backup app directly on it.

My wife gave her computer to our son because she uses her iPad for everything including Office.

Now that Apple (finally) supports a mouse, I connect my same Bluetooth keyboard+mouse to my iPad that I use for my work computer when I bring it home.


> People don't use their phones to write documents or do professional image/video editing or any professional workload, really.

The people you know. Not everyone can afford to buy a desktop computer, and for many of these people a phone is their primary computing device.


>> MacOS variants - iPhones, iPads, and Macs

As an outsider, it is very difficult for me to understand Apple's story on whether these devices are, or are not, on the same OS; what its/their "root" was; and what's the future direction (separate OS, or convergence, and if latter which one will form the core and which one will transform). The newest "iPad OS" or whatever it is called does not help matters - again, as an outsider, I cannot tell if this is a marketing differentiation or a true one.

All that being said, I rather thought that iPhones & iPads run "iOS", and Macs run "OSX", and that they were different.


First there were two operating systems: Classic MacOS and NeXTSTEP.

Apple acquired NeXT and combined some parts of classic MacOS and NeXTSTEP to create OS X.

Apple then stripped OS X down, got rid of some Mac-specific parts, and added some new frameworks to create “iPhone OS”, and introduced the iPhone. Initially they claimed that the iPhone was “running OS X”.

They introduced the iPod Touch later the same year, and the iPad 3 years later; at that point, they renamed “iPhone OS” to “iOS”.

They introduced watchOS and tvOS as more variants of iOS. There is also some iOS variant running on HomePods.

Over the years the two operating systems have somewhat diverged, though they still share both old and new code.

This year, they renamed the version of iOS running on the iPad “iPadOS” as they started adding more iPad specific features.

They also introduced the “Catalyst” framework to bring iOS specific frameworks to the Mac to make porting from iPad to the Mac easier.

Finally, they introduced Swift UI as a common cross platform framework for watchOS, iOS, macOS, iPadOS and tvOS.


They all run the same kernel and runtimes, but some diversified distributions have extra runtimes and frameworks for their task-specific implementations.

- Cocoa

- Objective-C

- XNU

It used to be all based on pure Darwin (also from Apple, but OSS). But since the iOS releases it has been diversified too much and Apple no longer wanted to backport to their own OSS (but they still backport to all GPLv2 and lower).


> I said MacOS variants - iPhones, iPads, and Macs.

Oh, come on. That's profoundly misleading and you know it.


What does “variant” mean? It’s not some obscure word that no one understands. Would it have been more clear if I said Darwin variants? Who the heck knows what Darwin is besides a few nerds?

> Would it have been more clear if I said Darwin variants?

Yes.


How so? iOS evolved from MacOS not Darwin. Darwin is at best a grandparent of iOS.

> iPhones, iPads, and Macs

Which is just two variants

Meanwhile Windows 7, whose support ends in 2020, is still being sold

https://support.microsoft.com/en-us/help/4057281/windows-7-s...


If you want to be pedantic, macOS and variants.

I didn’t say that MS isn’t good at maintaining backwards compatibility. My mom in fact uses my old 1.66GHz Core Duo Mac Mini, introduced in 2006, running Windows 7, and I just retired a 2.66GHz Core 2 Duo laptop, introduced in 2009, that ran my Plex server on Windows 10.

I’m saying that it hasn’t been a sound business strategy as the rest of the tech industry has moved on. Windows (not Microsoft) has failed in the cloud, the web browser market, and mobile.


> Windows (not Microsoft) has failed in the cloud, the web browser market, and mobile.

I don't agree on the cloud - it's the strongest competitor to Amazon. And in the other businesses there is no competition: mobile and browsers are Google's.

Amazon failed as well in the mobile business.


That’s what I meant. Microsoft hasn’t failed in the cloud - Windows has.

As far as mobile being “Google”: in terms of revenue, it came out in the Oracle lawsuit that Android has only made Google about $33 billion over its entire existence. Less than the amount that Apple makes in two quarters on iPhones. Google also reportedly still pays Apple $8 billion a year to be the default search engine on Apple devices.


> Microsoft hasn’t failed in the cloud - Windows has.

as a server, maybe.

as a client not really.

> Android has only made Google about $33 billion its entire existence

Yeah, they are good at hiding profits

and they are an ad company which dominated the mobile market because it was strategic; they don't need to profit from selling (and manufacturing) the HW, they just need your screen time.

There are between 2.5 and 3 billion android devices around the world.

And it's almost impossible to own an Android device without Google SW on it.

> Less than the amount that Apple makes in two quarters on iPhones.

That's not really true, and iPhone revenues are declining every year.

In 2018 they made $33.36 billion, down 9.2% from the previous year.

If Apple loses the mobile market, it's finished.

But people would still watch YouTube ads on iPhone replacements.


Yeah, they are good at hiding profits and they are an ad company which dominated the mobile market because it was strategic, they don't need to profit from selling (and manufacturing) the HW, they just need your screen time

You think Google lied under discovery? Oracle wasn't just counting their meager hardware sales; they were also counting Google Play revenue, etc., as sales.

And it's almost impossible to own an Android device without Google SW on it.

There is a country that has over 1 billion people running Android with no Google Services.

That's not really true, and iPhone revenues are declining every year.

And it’s still about the same as MS’s revenue last quarter at 38 billion

If Apple loses the mobile market, it's finished.

Apple is far more diversified than Google. Almost all of Google's profits come from advertising; 48% of Apple's revenue comes from the iPhone.

The Mac business by itself is about the size of McDonald’s the last time I checked.

Estimates for YouTube are that it's barely profitable, if at all.


> You think Google lied under discovery?

I think the known taxable profits are only a fraction of the real profits.

Don't jump to conclusions, just because you wanna prove a point.

> There is a country that has over 1 billion people running Android with no Google Services.

so you're agreeing with me: it's almost impossible; the alternative is to live in China and give up your freedom.

Because you can't use Chinese services outside China.

There's also that.

Anyway, Google China is a thing.

And at this point I would prefer China spying on me than Google.

Maybe Europe should start banning US services as well...

> Apple is far more diversified than Google.

Is it?

Mac HW is less than 10% of their revenues.

What else do they produce?

Could Apple really survive on wearables or iPads or iCloud without iPhones?

I seriously doubt it.

> Estimates for Youtube is that it’s barely profitable if at all.

Nobody knows how big YouTube really is

Estimates are in the range of $16-25 billion a year


"Well looking at where the Microsoft ecosystem is now and where the Apple ecosystem is, I think Apple made the right choice. Apple alone sells more devices running macOS variants than all Windows PC makers combined."

This is a fancy way of saying that Apple sells more watches and phones than Microsoft does computers, which is a whole lot less impressive than the disinformation version you wrote above.

When I look at the microsoft ecosystem: something like 90% of all computers, nearly every corporation and business in the world, etc, I don't see anything to be ashamed of.


This is a fancy way of saying that Apple sells more watches and phones than Microsoft does computers, which is a whole lot less impressive than the disinformation version you wrote above

How is it “disinformation”? Do you think anyone on HN doesn’t understand the statement to mean what I said for it to mean? I said nothing about the computers MS sells (the Surface line). I said Windows PCs in general.

Microsoft sells a lot of Windows licenses to business and consumers but all of the energy from a development and usage standpoint is on the web. I bet most businesses could replace a lot of their computers with Chrome OS boxes and not miss a beat.

Apple sells a >$1000 phone labeled for "Professionals" which comes with 32GB of on-board memory... total. When you upcharge high-end customers $150 for a $15 stick of NAND, yeah, people are more touchy about 8GB of system files.

It’s not about the “system files”. It’s also about RAM usage. Low-end Surface laptops come with the same amount of storage as iPads, but between x86 chips that are a lot slower and less energy efficient and the bloat of Windows, it’s nowhere near the same experience.

No iPhone Pro comes with less than the 64GB of storage. Do you really want to talk about marginal price and marginal cost? A Windows license has no marginal cost.


> How is it “disinformation”? Do you think anyone on HN doesn’t understand the statement to mean what I said for it to mean?

At least one person[0] didn't understand and publicly stated as much. While iOS was originally spun off from macOS, it's very much its own system, with there being no ability to run macOS applications on iOS, and vice versa (although the latter is slowly changing with recent macOS releases). Comparing the two is disingenuous at best.

[0] https://news.ycombinator.com/item?id=21452862

EDIT: And here's at least one more: https://news.ycombinator.com/item?id=21453368


Well, actually...

Most iOS apps can run on macOS - the iPhone Simulator compiles iOS code natively to x86 and links it against an x86 version of the iOS frameworks.

Apple also introduced both catalyst and SwiftUI to make porting back and forth easier.

And yes “words mean things”. I specifically said “variants.” So if again you want to be pedantic, I could just as easily say that Apple sells more iOS devices than all Windows PC makers combined.

Also, if you want to try to exclude iPads from the iOS ecosystem because Apple now calls it “iPadOS”, iPads can still run iPhone apps.


> Most iOS app can run on MacOS - the iPhone simulator compiles iOS code natively to x86 and links to an x86 version of the iOS frameworks.

Yeah, except no app developer ships their simulator build to customers to run in the iOS Simulator…


They are starting to ship Catalyst based apps though....

You can't run After effects or do software development on iOS. You can't run PHPStorm or docker or final cut pro. I see the iPad/iOS as being usable for a very small slice of professionals.

Guess what? Most people aren’t developers, nor do they run After Effects. They do, however, run Office. Photoshop was also just introduced for mobile. Heck, most people don’t do anything with their personal computers but “consumption”, and even that has been moving to mobile.

You clearly don't know about Active Directory, Group Policy and the myriad of other services Windows offers for enterprises.

What part of my post were you addressing? My whole argument is that in the long term, MS’s focus on the Enterprise and backwards compatibility caused them to miss out on all of the next waves of tech industry - including the web and mobile.

Even in their crowning achievement - the cloud, they are a distant second to AWS and Amazon always brags that they run more Windows instances in AWS than Microsoft runs on Azure.

MS’s revenue is half that of Apple’s and they have lower profits. Heck, last quarter Apple made about the same as Microsoft even if you subtract iPhone revenue.


> My whole argument is that in the long term, MS’s focus on the Enterprise and backwards compatibility caused them to miss out on all of the next waves of tech industry - including the web and mobile.

They missed out on mobile for sure, but it's not because of their focus on the enterprise or backwards compatibility, and you'll have a very hard time proving so. Also they haven't missed out on the web either, and neither can you prove that they did. Just because there is a bigger cloud provider doesn't mean that they 'missed out'. And there will be bigger fish to fry, new technologies are coming out every year.

Apple and Microsoft are in vastly different spaces. Apple focuses on consumer products, Microsoft makes its money from the enterprise. An apples-to-oranges comparison is not meaningful.


You don’t call having to use its biggest rival's browser engine and Bing missing out on the web?

As far as mobile, Apple was able to take its much smaller less bloated OS, cut it down to a size that could fit on a device with 128MB of RAM and have for what at the time was a full featured web browser. Did you ever use IE on a Win CE device?

And are you forgetting that Microsoft was the dominant consumer operating platform before mobile? MS didn’t have any choice but to retreat to Enterprise.

So what “bigger technologies” do you see coming after mobile, then? The smartphone already has an 80%+ worldwide penetration rate among adults.


>You don’t call having to use its biggest rivals browser engine and Bing missing out on the web?

They run windows at Google HQ and AWS but you don't see anyone saying Google and Amazon "missed out"


They run Windows because their customers demand it for their VMs. You really think Microsoft spent 24 years fighting the browser wars and liked just giving up and giving control over to Google?

Microsoft for once acted very differently from its usual self, and Windows Phone repeatedly broke backward compatibility. Arguably even more than iOS did - and this was one of the factors in its failure. So this is not the argument against backward compatibility you think it is.

Besides, if you're comparing MS and Apple, you need to include all the VMs running Windows (it's not a physical device - but MS makes the same amount of money on it), and also MS's foothold in Cloud.

Apple practically does not exist in those spaces, and Microsoft would have been in the same position had it broken compat as often as Apple does on the desktop.


> The obvious conclusion: few of those people you see sitting around typing on their MacBooks are doing anything that will be of value 5, 10, or 20 years from now.

Maybe to a troll or someone with an axe to grind, but my point of view is different.

If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing.

I’ve got production code I wrote twenty years ago still up and running, despite multiple upgrades to the language and hosting infrastructure it's running on. Should that be the case? Not in my opinion. Business needs change; that code should’ve been updated. Instead developers just added more around it, and that’s how you get bloated beasts like MS Office, Photoshop, etc.


> If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing.

Banks and insurance companies tend to disagree with you.

Maintaining software is just like maintaining buildings: if you don't, they fall apart. But mainly it's just about checking that everything is still the same as it was when it was built.

You don't change elevators in a building just because the old model is not supported anymore.

There's no concept of "not maintained anymore" for elevators.

So maybe the poster was trolling, but it is true that you cannot rely on Macs if your software has a predicted life span longer than a couple of years.

> Business needs change

Again, many established businesses work because they don't change much over time.

They just keep doing what they do best.

> that code should’ve been updated.

That code worked; why in the hell risk introducing new bugs?

I worked on software packages made of millions of lines; you don't just update them because your supplier won't bother supporting your workflow for at least 10 years.

Even GitHub, a modern fast-changing company, was running on Rails 3.2 (released in 2012) until September 2018, when they switched to 5.2 - and the switch took a tech company with some of the smartest engineers around, including Rails contributors, a year and a half.


"If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing."

Find me a robot that can navigate 4x4 trails and identify rocks with accuracy.

None exist.

And that kind of prospecting hasn't changed in centuries.


> Find me a robot that can navigate 4x4 trails and identify rocks with accuracy.

There's at least one doing so right this very moment on Mars, though it's pretty slow at it.


That thing is following very carefully-planned paths, not just hunting around of its own volition like a prospector would.

Those are some very naive assumptions you've got there padawan. Sure some software needs to fade away, but just rewriting something to rewrite it is stupid.

Funny. Microsoft Works getting discontinued, and Microsoft Office being unable to open Works documents is what pushed me toward OpenOffice (which could open Works Word Processor documents, mostly without issue), since all of my family's documents were in Works Word Processor. Years later, Microsoft finally came out with an importer for Word, but there's still no migration path to anything for Works Database.

While Microsoft Works has been discontinued, you should still be able to run it in modern Windows (which is what backwards compatibility is about - Office is a totally different product, after all).

(1) The post I was responding to was about Microsoft Office 2019 being able to open documents created by older versions, not about Microsoft Office 97 being able to run on modern Windows.

(2) Microsoft Works (at least the version we had a license to) definitely does not run on modern Windows. My dad's business is forced to keep a Windows XP VM alive, in order to keep running Works Database, because as I mentioned, there's still not a migration path to anything for it.


It used to be that the .doc format changed incompatibly with every release and if you had to share documents with someone on an earlier version you had to manually select the format on the "save as" screen. I have distinct memories of opening a Word 97 doc in Word 95 and seeing a bunch of weird boxes between each character - in retrospect, obviously because they changed the internal format from some 8-bit encoding to UTF-16.

But I can see how the conflicting requirements to keep the upgrade treadmill rolling and never break any old apps have made certain versions of Windows into such unholy piles of kludges.


the .doc format changed incompatibly with every release

Pre-XML the .doc format was just the in-memory representation of the document serialised to disk and loading it was the converse. Conceptually you could imagine it like mmap(). It was different between versions because the code was different, not by any deliberate effort.
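A toy sketch of why that breaks, using a hypothetical format (not the real .doc layout): if the on-disk bytes are just the packed in-memory record, then any change to the record - say, moving from 8-bit to 16-bit characters, as Word 97 did - silently changes the file format.

```python
import struct

# Hypothetical illustration, NOT the real .doc layout: the "file format"
# is whatever the packed in-memory record happens to be.

def save_v1(text: bytes) -> bytes:
    # v1 record: little-endian length prefix, then raw 8-bit characters
    return struct.pack("<I", len(text)) + text

def save_v2(text: str) -> bytes:
    # v2 record: same length prefix, but characters are now UTF-16-LE
    data = text.encode("utf-16-le")
    return struct.pack("<I", len(data)) + data

def load_v1(blob: bytes) -> bytes:
    # a v1 reader knows nothing about the encoding change
    (n,) = struct.unpack_from("<I", blob)
    return blob[4:4 + n]

# A v1 reader fed a v2 file sees a NUL byte after every character --
# rendered as the "weird boxes" between characters described above.
print(load_v1(save_v2("Hi")))  # b'H\x00i\x00'
```

Nothing here is deliberate versioning; the format drifts because the code did, which is exactly the mmap-style situation described above.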


I'm not very sure about Word, but from Excel 97 onwards, Excel's story around forward and backward compatibility has been very good -- the Excel equivalent of the scenario you describe wouldn't happen, because of the introduction of Future Record Types[1].

These are still used in the newer XLSB format[2].

[1] https://www.loc.gov/preservation/digital/formats/digformatsp... (Page 14)

[2] https://docs.microsoft.com/en-us/openspecs/office_file_forma...


Word, Excel and Powerpoint were forwards and backwards compatible from 1997 to 2004. The newer versions emitted various backward compatible formatting (eg Word 2000 supported table-in-table, but emitted sufficient formatting for Word 97 to at least see one level of tables and consider the rest of the cells to be paragraphs)

It was far from bug-free though.


I've had some particular cases where OpenXml docs created with Word 2016 looked terrible in Word 2010 because the layout was different.

Another comment here points out that Word emitted tags for forward-compatibility reasons. That's still true - I've had reason to parse OOXML directly and have seen those tags. I guess it sometimes isn't enough.


The way desktop content is consumed hasn't changed much in 20 years though. OK, we now have OOXML instead of DOC, but the layout of the content, and even the editors to some extent, haven't changed much. This is why PDFs still open, and GIFs, BMPs, JPEGs, etc. still work fine. Granted, there will be a problem with older and/or more esoteric proprietary formats, but that's a risk people should weigh up when using less common software and saving files from them.

In my opinion the biggest short-term threat we have at the moment isn't the popular document file formats (though that is a long-term threat) but content posted exclusively online. Web standards are constantly evolving, and any given website would have a multitude of files -- many of which wouldn't even be hosted on the same domain as the HTML endpoint, which itself is most likely dynamically generated. Plus our local copies of web content are just a temporary cache that routinely gets flushed, so once something disappears from the web it's likely gone for good (Web Archive aside).

I'll be more worried about classic file formats when computing undergoes its next paradigm shift away from personal computing (arguably that's already started happening with smartphones and cloud computing).


Yep, that's why I got annoyed when we moved away from Docs/PDFs and an open-source wiki at work to Confluence. Now we're stuck forever with that software, unless we hire an army of interns to move it to whatever the next platform is when we need those old documents.

I was impressed when I upgraded an old Win7 system and Office 2007 still worked on W10. Makes me wonder what the oldest version of Office is that would still work. There are a lot of things to not like about Microsoft, but their dedication to backwards compatibility is great.

Reminds me of that time when they fixed a buffer overflow in the ancient Office Equation Editor by manually patching the binary because they lost the source code:

https://arstechnica.com/gadgets/2017/11/microsoft-patches-eq...


It’s worth noting that a patch around 2 months later removed the Equation Editor entirely, because more security issues were found and binary patching wasn’t going to be sufficient.

There are more explanations for the manual patch than losing the source code, though: why spend days/weeks setting up a 20-year-old dev and build environment when a two-byte patch does the trick?
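The mechanics of such a patch are simple enough to sketch. This is a hypothetical example (the offsets, bytes, and file are made up, nothing from the actual Equation Editor fix): verify that the expected bytes sit at a known offset, then overwrite them in place without changing the file size.

```python
import os
import tempfile

def patch_bytes(path: str, offset: int, expected: bytes, replacement: bytes) -> None:
    """Overwrite `expected` with `replacement` at `offset`, in place."""
    assert len(expected) == len(replacement)  # never change the file size
    with open(path, "r+b") as f:
        f.seek(offset)
        found = f.read(len(expected))
        if found != expected:
            raise ValueError(f"unexpected bytes at {offset:#x}: {found!r}")
        f.seek(offset)
        f.write(replacement)

# Demo on a throwaway file with made-up bytes: 0x74 is x86 'je rel8',
# 0xEB is the unconditional 'jmp rel8' - the classic two-byte swap.
path = os.path.join(tempfile.mkdtemp(), "demo.bin")
with open(path, "wb") as f:
    f.write(b"\x90\x90\x74\x05\x90\x90")
patch_bytes(path, 2, b"\x74\x05", b"\xeb\x05")
with open(path, "rb") as f:
    print(f.read().hex())  # prints "9090eb059090"
```

The guard on the expected bytes is what makes a vendor-shipped binary patch safe: if the file on disk isn't the exact build the patch was written for, it refuses to touch it.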

Oh, that reminds me of learning some assembler by writing cracks for shareware programs to patch away trial-period warnings, license key checks, etc., using a disassembler (W32Dasm, I believe) on the binary executable. That was a fun learning experience, helped a 14-year-old (me) use software he otherwise couldn't buy, and helped me understand quite a few internals of binaries.

Some binaries are obfuscated by overlapping several routines together.

Oh, there's a bunch of ways you can try to make reverse engineering harder, but I don't think these techniques are really that common for most software.

Wow, that's impressive.

Has its downsides, though - one of the reasons Active Directory has been (and still is) so insecure out of the box is backwards compatibility with an ecosystem of shitty third party vendor solutions.

Sometimes, change is good, even if it's painful in the short term.


I'm sure. I still much prefer Microsoft's model over Apple's, even if there are some drawbacks.

What actually impresses me there isn't that Office 2007 is still compatible with Windows 10 (I mean it does impress me, but I'm used to it by now), but rather the fact that a program like office doesn't break after an upgrade -- and I mean that from the perspective of the OS upgrade. Upgrading the whole OS underneath my programs and having them still work fine is just not something I ever expect on any OS, and it was quite unexpected the first time I saw it happen on Windows.

You should try to upgrade a Linux machine. Not only does everything keep working before the upgrade, it allows you to WORK while the whole OS is upgrading!

Shit! I was playing Stellaris while I was upgrading to Kubuntu 18.10


I agree, that's impressive, but... every time I upgrade, and even some substantial portion of non-upgrade updates, I have to go in and fix my Grub config or else my Windows dual boot just infinite loops. That's somewhat less impressive, and it affects me much more substantially.

The more immediate usability issues mean that I have a clear divide between the OSes I prefer to use for my daily driver (OSX/Windows) vs. the OSes I prefer to use for servers (Linux). With WSL2, the virtualization capabilities are good enough that I pretty much never need to explicitly boot Linux on my laptop/desktop anymore.


> I have to go in and fix my Grub config or else my Windows dual boot just infinite loops

In my limited experience with Windows dual boot, it was shitty Windows issues or BIOS problems that caused it.

Do you have a good reason to blame Linux here?


The problem is consistently fixed by an edit to grub.conf to invoke ntldr explicitly. It's a fairly well documented issue: https://askubuntu.com/questions/725290/grub-windows-10-doesn...

Of course it's still possible that on my particular configuration it's actually Windows doing something dumb that makes grub.conf changes necessary. However, it's (1) well documented for years, (2) an issue that the Linux (distro) installer creates, and (3) an issue that still gets recreated on every update that mysteriously reverts my grub.conf changes, so Windows gets the benefit of the doubt and Linux doesn't.

That's been my experience in general with desktop Linux: I have never experienced a Linux distro where I didn't have to get into the weeds to fix up a clean install, even when Linux is the only OS on the system.


Ok. I set up a Dell XPS15 with dual-boot Windows 10 and Ubuntu 18.04 plus some XPS15 "fixes".

I only booted into Windows a couple of times (to keep it up to date) but I never ran into any problems with dual booting.

Ubuntu has its problems - especially GNOME, which works but I hate its dumbing-down; high-density screens also work but have issues.

I still far prefer Linux to Windows.


Yeah, then you reboot and find out what's really broken. About 30% of the time there's some weird problem. Last time, my Nvidia drivers weren't upgraded and my laptop's external display stopped working.

Server upgrades are usually smooth (early versions of Ubuntu being an exception.)


Sure, sometimes, maybe. But pretty much only if you've only ever used software from the main repo and of course not anything they've dropped between versions. I've had plenty of Linux upgrades break stuff. Not just programs, but core functionality. Linux Desktop has pretty much the opposite compatibility story to Windows.

> You should try to upgrade a Linux machine.

I have!

> Not only does everything keep working before the upgrade, it allows you to WORK while the whole OS is upgrading!

Yeah, and then it breaks and trashes my install after the upgrade. Happened to me many times.


I think I still have an old copy of Office 95 at home. Gotta try that with Win10 just for shits and giggles...

Works just fine. It's legitimately the setup everyone at my house uses (except for me). We have a legit license for it, and it works just fine, so why not

Report back, I'd love to know.

I have Office 2002 on my Windows 10 desktop. I don't use them often, but Word and Excel still work. The only issue (I can't remember if this affected me on older windows versions) is that it sometimes crashes when you close it and tries to restart.

You say it nonchalantly but I'd think twice if it wouldn't work anymore if I'd upgrade (at all). Buying new full versions of Office is not very cheap.

You shouldn't be impressed that existing apps continue to work, since that's the norm for Microsoft. It's all the other things they changed around and the subtly added "telemetry" spyware that should get your attention more. ;-)

Office 95 can run on Win10, with only a little difficulty:

https://www.youtube.com/watch?v=mRfn4M5DXTE


> It's all the other things they changed around and the subtly added "telemetry" spyware that should get your attention more. ;-)

As if that isn't already discussed to death around here or elsewhere. I see far more disdain for Microsoft than I do any praise at all. And while I'm not particularly happy with Microsoft either, I'm still able to appreciate the good things that they do.


But AFAIK it's only thanks to the European Union and pressure from various groups that we can open MS Office formats on other platforms and in other programs, no strings attached. Being retro-compatible in a walled garden only doesn't make all that much sense.

The antitrust case helped, but it's only thanks to massive reverse-engineering efforts by open source developers that this kind of compatibility exists. Even when you have the specs, it's a massive effort to re-implement.

> Being retro-compatible in a walled garden only doesn't make all that sense.

It makes perfect sense from the company's point of view.


>> Being retro-compatible in a walled garden only doesn't make all that sense.

> It makes perfect sense from the company's point of view.

Yeah, for sure, but I was criticizing OP that was praising MS for still being compatible with their own 20+ years old proprietary and hegemonic document format.


Well, it was kind of the old way of implementing a subscription-based business model: want to stay compatible with the rest of the world? Pay up!

I think in the era of cloud and web this was bound to happen anyway due to market pressure, similar to how Windows/Azure embraced Linux.

Why doesn't the EU pressure Apple to make sure the Apple Write and Apple Pages, etc, still work?!

I was talking about opening the dominating document formats at the time. Apple didn't dominate that market at all.

And still doesn't

Or applications written for Windows NT that still work - unmodified! - on Windows 10 thanks to layers of compatibility hacks.

I'd call that more "stable API" than anything else --- Win32 is Win32, and while they've added functionality over the years, the basic stuff remains the same.


It's not just the formal API - features like UAC and high DPI scaling require emulation layers. And there's more to the OS than just Win32.

UAC's compatibility workaround is privilege exemption, not emulation. It is not like TrustedInstaller was suddenly introduced in Windows 6.

Recently I bought a large-format printer for a project (an HP Designjet 488ca 36") and I was amazed to find that the drivers, dated 1997, still work on Windows 10. All you need is to deactivate the signature verification, which didn't exist at the time. It feels crazy that my Windows 10 tablet can still print over a parallel port using a simple adapter.

My favorite of these kind of stories is https://www.joelonsoftware.com/2000/05/24/strategy-letter-ii...

They introduced a special mode for the memory (de)allocator that allowed a (disallowed) pattern that SimCity used and that no longer worked with a new change, and used this mode when SimCity was running.
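The trick can be sketched like this (a toy model, not Windows' actual heap): in compatibility mode, free() defers recycling a block, so a program that reads memory just after freeing it still sees the old contents instead of garbage.

```python
class Heap:
    """Toy heap illustrating a use-after-free compatibility shim."""

    def __init__(self, compat_mode: bool = False):
        self.compat_mode = compat_mode
        self.blocks = {}      # handle -> contents
        self.quarantine = []  # freed handles kept readable in compat mode

    def alloc(self, handle, data):
        self.blocks[handle] = data

    def free(self, handle):
        if self.compat_mode:
            # "SimCity mode": defer recycling, leave contents readable
            self.quarantine.append(handle)
        else:
            # strict mode: contents are immediately gone
            self.blocks[handle] = None

    def read(self, handle):
        return self.blocks[handle]

# The buggy program frees a block, then reads it one last time:
strict, compat = Heap(), Heap(compat_mode=True)
for heap in (strict, compat):
    heap.alloc("tiles", "city data")
    heap.free("tiles")
print(strict.read("tiles"))  # None        (the bug bites)
print(compat.read("tiles"))  # city data   (the old app keeps working)
```

The point of the story is the dispatch around this: the OS detects which program is running and only enables the forgiving mode for the known-buggy one, so well-behaved programs still get strict behavior.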


I think I've run into this bug while working on winevdm. https://github.com/otya128/winevdm/issues/322 I ain't trying to fix it.

Now try it with MacOS. Can you open executables/files made with MacOS 10.0 in 10.15?

Good luck, right? No way your ppc32 binaries will run on x86-64.

My LaTeX files from 1994 still compile, too.

If you allow recompiles, POSIX-compliant C programs older than that still work too.

TWM, from the '80s, keeps working fine on any modern GNU/Linux

Since we’re talking about Excel and VBA: does a program built in Perl 5/Java/C++ have that same problem? Assuming the code hasn’t changed for 20 years.

I totally agree that Microsoft maintained compatibility to the point of breaking good engineering practices. But shouldn’t gcc/the JVM count as programs too? Code developed for those programs still runs, assuming the same input. It’s much more impressive with Microsoft because they are GUIs, but functionally I think they’re pretty much the same.


I am with you on the applications that still work thanks to all the love and tears of the MS engineers.

A document file that can still be read decades later should be the norm though, and Office 2019 needing to 'upgrade' them is less than ideal.

At that time I think people already sensed that important docs would need to be exported to something else (e.g. PDF) just in case MS changed its mind about how text or layouts would work.


The old .doc format was basically serialized C structs, and it's still supported; you're just encouraged to upgrade it for security reasons.

But there was the formula fiasco recently, when you weren't able to edit formulas created by early versions.

https://support.office.com/en-us/article/equation-editor-6ea...


See comment above about the Equation Editor binary patch - they apparently lost the source code and it became too much of a security risk to keep (since it's missing all of the mitigations that a modern compiler generates).

https://arstechnica.com/gadgets/2017/11/microsoft-patches-eq...


Well, the result is the same. There are tons of old documents with formulas flying around and there's no clear upgrade path, so the only reasonable option besides re-typing every formula in the new editor is to keep using the old vulnerable editor. IMO they should have reverse-engineered the old format and upgraded it to the new one automatically.

As far as I know, this was a component that was licensed from a third party. The company sold the editor separately as well IIRC. So, even if there are no legal hurdles to reverse engineering the file format, delving into an undocumented format developed by a different team and no previous information is tough. This means a lot of money and effort for something that comparatively few people really need. I am not surprised that they didn't do that.

And yet, when I try to open my 7-year old resume in Office 365, what I see is a pile of document processor vomit on my screen.

Ironically, Google Docs (Which was, for obvious reasons, not my first choice for opening an old Word document) parsed it in exactly the intended format.


With the exception of Internet Explorer, for which backwards compatibility ended up being the bane of every UI developer's existence. It seems mostly fixed now with Edge.

My work place still has to support IE 11. We were working on a skinning project a few months ago, and used CSS vars for practically every browser, but had to fallback to plain old stylesheet overrides for IE 11. It was painful.
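For anyone hitting the same thing, the usual pattern (class and property names here are just illustrative) is to declare the plain value first, since IE 11 silently drops declarations it doesn't understand while var()-capable browsers take the later one:

```css
:root { --accent: #0366d6; }        /* ignored entirely by IE 11 */

.button {
  background: #0366d6;              /* IE 11 uses this...          */
  background: var(--accent);        /* ...and skips this override  */
}
```

It works, but it means maintaining every themed value twice, which is exactly the pain described above.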

I bet engineers who work on this backward compatibility learn a lot, and learn it fast. Wondering if there is anything similar one can do with open-source projects.

I wish I could say the same for iWork. I can’t believe Apple is so naive in not supporting old document formats. Such a clear use case it’s ridiculous.

Hmm? You can open old document formats just fine.

Newer versions will try to upgrade iWork '09 documents, but often they will not do a particularly good job.

And it's hurt Microsoft by making it impossible to ship operating systems that aren't years late with promised features removed, and they haven't been able to move beyond the desktop, which has become increasingly less relevant over the decade.

They are now at a point where they have to base their browser on a rival's engine, they're developing a mobile device based on Android, and their own cloud platform hosts fewer Windows instances than Linux instances.


Their attempt to modernise their phone platform involved several rounds of effectively throwing away the existing library of developed software. If anything, that shows the value of backwards compatibility.

Google supposedly didn't support windows phone, in part, because they were upset that they couldn't use the wince apps as a base.

IE's backwards compat method was to simply embed the old version of the engine, and use it as necessary. That's not really a factor in the many reasons people mostly use IE to download a better browser for personal browsing, but is a major factor in why it was the corporate browser of choice for decades: it is expected to be able to continue to work the same way on a long term basis.

Windows vs Linux usage rates on servers is mostly a statement of Linux is at least good enough, and a good enough solution with simpler (no) paid licensing is a clear win for ease of use. I haven't seen any Linux vs Windows server performance benchmarks in a long time; I'm assuming they're not that far apart, outside of whatever bits and pieces that either platform is truly bad at.


Google supposedly didn't support windows phone, in part, because they were upset that they couldn't use the wince apps as a base.

I was a WinCE developer - both .Net and MFC. Trust me, no one wanted to use WinCE for anything. Besides, MS abandoned WinCE with VS 2010.

I can't speak for performance, but the resource requirements for Windows are huge compared to Linux, and that really makes a difference in cloud environments when it comes to price (even excluding licensing cost) and startup time. That matters when you need elasticity and have to scale up and down fast.

You can do a lot with a 128MB/.25 vCPU Linux instance. You can barely get away with a 4GB RAM Windows instance with 1 vCPU.


> Trust me, no one wanted to use WinCE for anything.

Well, maybe nobody wanted to, but to support a new platform with unknown uptake, would you rather use your existing code, write something new on a less capable api[1], or just walk away?

[1] well, a lot of wince stuff was still there on wp7, if you were willing to do terrible things in order to get available features that weren't exposed.


And they have made hundreds of billions of dollars. Surviving forever is not the only possible goal for a company: a company that makes 100 dollars and then goes bankrupt is likely better than a company making a steady 1 dollar a year.

And Microsoft is only seeing a resurgence by basically marginalizing Windows. I didn’t argue that MS isn’t successful but that Windows has lost importance in the modern tech industry.

Microsoft is about Microsoft, not Windows. While Windows is losing relevance, my Microsoft stock keeps gaining value. Dividends also keep coming and growing.

In the Gates/Ballmer era MS was definitely about Windows. They even withheld Office from iOS even though they had it ready during the Ballmer era to prop up Windows.

Does that have a working RSS feed? The obvious link is broken.

I wonder when the txt file was invented?

It's only in the last couple of years that Notepad's been able to read txt files from Unix properly...

Am I the only one that doesn't like this about Microsoft products?

You don't like backwards compatibility? Then what do you prefer?

Backwards compatibility shouldn't be some 'amazing feat' - it should be the norm.

I should be able to fire up a BASIC script from 1970 and have it still run. It shouldn't require me hours of hunting retro-computing websites for the right simulators - it should just be part of the OS.

I don't care how - whether that be emulation, or maintaining direct compatibility.

A set of emulators for all ancient computing platforms only comes in at a few megabytes, so there really isn't an excuse not to include it.

Why, you say? Two reasons: 1. Maintaining backwards compatibility isn't hard: you just switch to emulation every time you want to 'get rid of cruft', and then what happens inside the emulated container can be frozen in time and requires no maintenance. 2. Every time one breaks a legacy bit of code, all the users of that code have to do some work to re-invent it. It would be like shredding Leonardo da Vinci's work just because we have better painters now.


The industry norm has been to leave things behind. And it has worked really well for Apple: they make money hand over fist, and you see MacBooks being heavily used even by software developers. The market has spoken; it doesn't care too much about backwards compatibility. Microsoft being the exception is a good reflection on them, but they rarely get credit for it, while getting heavily criticized for every other small flaw, real or perceived.

You're kidding right? For one, Windows still enjoys a rather ridiculously huge lead on desktop computers. Who's second in that market? Well it was Mac not that long ago, with Linux in a distant third, though maybe that's changed now.

Mac actually took great pains to be backwards compatible until relatively recently. PPCs could run 68k programs, OS X could run traditional Mac OS programs, and x86 Macs could even run PPC programs.

And the Linux desktop can't run programs not in its repo unless you want to compile them from source.

So what does that tell us? Well, there at least appears to be a pretty strong correlation between compatibility and success with desktop OSs.


> The market has spoken

I legit see more Microsoft stuff all around, Linux stuff close second.


Eh no one wants to maintain and check all that old stuff or support people who bought something 10 years ago. If it works then great, if not then keep an old system around with that setup that supports it and make contingency plans when it inevitably fails.

Who cares what the people using the computer want, right? Fuck them. Developers don't like maintenance so everything needs to be burnt to the ground every 3 years so they can have fun reinventing the wheel again (usually worse than the last time). That's what personal computing is all about, after all, developers!

I agree with all this. It's pretty much exactly the same argument I'd make.


"Is it just me?" was a meme on the late Terry Wogan's show.

And yet it does seem like I'm the only one, in this thread at least.

Cool to see this. I saw a video of someone doing it before, but I didn't want to share it with co-workers because the person making the video used whatever the "paint" equivalent was on that version to draw genitalia before upgrading to the next version.

My last clean install of Windows was with Vista and since then I have done in-place upgrades without much hassle. There is no hardware left anymore from the original setup and it even survived several mainboards, switches between AMD and Intel and the boot drive migrated from an IDE HDD to an AHCI SSD.

I believe that periodic clean installs of OSes are not necessary anymore and every modern OS install should survive hardware changes and upgrades to newer versions.


The thought alone of this makes me jitter. I reinstall my PC every 3-6 months and it's always a breath of fresh air.

Probably because I know I will reinstall not too long from any point in time I just install everything I think I need. I need to convert an IMG to ISO? Let me just try these 5 apps and forget about them. They will be cleaned after the fresh install anyway.


I totally agree. Periodic clean installation of OS for your personal computers is unnecessary and only creates hassle. I've upgraded from Leopard to Mojave without problem, with several different computers.

Servers, though, are a completely different story. Servers are like cattle, but personal computers are like pets.


The screenshots documenting the process are fun but it would also be neat to see graphs of install time per OS and how much storage and memory were in use after completing each upgrade step. I don’t recall: could you upgrade 16-bit DOS/Windows to 32 bit or did that require a clean install?

Depends on what you count Win95 as. It billed itself as a 32-bit OS, but it wasn't really. However, you could upgrade from Windows 3.11 to Win95.

To go from the DOS based Windows (95, 98 and ME) to the NT based Windows versions (Windows NT 4, Windows 2000, Windows XP) required a fresh install.

Update: Actually, I was wrong: XP could update from ME - as seen in the OP.


I'm curious why you think Win95 wasn't a 32bit OS?

It was still possible to use "real mode" (16-bit DOS) device drivers with Win95. [1] I believe there was a 32-bit kernel running on top of this 16-bit foundation, but it was definitely a hybrid arrangement. User space processes definitely ran in 32-bit "protected mode", and 16-bit legacy processes ran in their own protected address space when run from inside Win95 - with the 32-bit OS kernel taking the place of what would normally have been the interplay of BIOS and DOS, handling the interrupts routinely generated by 16-bit DOS software for making what we would now call syscalls.

I believe much of this was already in place with Windows 3.1(1)'s 386 enhanced mode and Win32s, although from memory Windows 95 moved far more device drivers into the 32-bit kernel. Presumably this was enabled by not having to be able to end the windowing session and quit back to DOS without rebooting.

[1] I seem to remember CD-ROM drive controller drivers being particularly common culprits in this regard, with virtual mode device drivers often being unavailable, so you had to use real mode drivers in config.sys even on Win9x. Yes, back in the day of single-, double-, and quad-speed CD-ROM drives, you typically had an ISA card with a custom controller that then connected to the CD-ROM drive via an "I can't believe it's not IDE" ribbon cable. Only later did we get ATAPI and hard drives and optical drives could use the same controller and bus. (Sound cards often had an on-board CD-ROM drive controller and ribbon cable connector around that time.)
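The setup looked roughly like this. The driver filename and switches varied by vendor, so treat this as an illustrative sketch: the real-mode driver loaded from CONFIG.SYS, and MSCDEX.EXE in AUTOEXEC.BAT mapped it to a drive letter:

```
REM CONFIG.SYS -- vendor-specific real-mode CD-ROM driver
DEVICE=C:\CDROM\CDROM.SYS /D:MSCD001

REM AUTOEXEC.BAT -- Microsoft CD extensions assign the driver a drive letter
C:\DOS\MSCDEX.EXE /D:MSCD001
```

The /D: tag just links the two lines together; as long as both real-mode pieces loaded before Windows, Win9x would fall back to them instead of a protected-mode driver.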


> Presumably this was enabled by not having to be able to end the windowing session and quit back to DOS without rebooting.

The original "It's now safe to turn off your computer" screen after shutting down Windows 95 is actually just a DOS prompt.

You cannot see it, because the computer was left in a graphics mode, but if you blindly type a command to switch the video mode, you're back at a normal DOS prompt.
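If memory serves (treat this as an assumption, not gospel), the classic blind incantation was DOS's MODE command, which resets the display to 80-column color text:

```
mode co80
```

Typed blind at the invisible prompt, it drops you back into a readable text-mode DOS session.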


I agree with most of what you said except for the statement that "there was a 32-bit kernel running on top of this 16-bit foundation". The 32-bit kernel didn't run on top of 16-bit anything. If anything it ran on top of the 32-bit VMM (virtual machine manager), as did the DOS compatibility layer.

IIRC, the accurate statement is that Win95 wasn't pure 32-bit.

In Win9x/Me, the core of the OS (the VMM) is 32-bit (which was true even in Windows 3.x in 386 Enhanced mode.) But, some other parts of the OS remained 16-bit code, and 32-bit apps will sometimes end up invoking 16-bit code via thunks when calling OS APIs.

By contrast, on NT-based Windows, a 32-bit app will never invoke any 16-bit code. The only scenario in which 16-bit code would ever run would be when running a legacy 16-bit app.

(Someone please correct me if I'm remembering this wrong.)


Hm, if you were using 32bit applications and 32bit drivers then you wouldn't touch 16bit code (unless of course you did so explicitly).

However there were commonly used built-in dos utilities that were 16bit so if you called out to any of those then obviously you would invoke 16bit code.


One other difference: on Win9x/Me, it was possible for a 32-bit process to load a 16-bit DLL. It was a bit convoluted, in that you needed to create a 32-bit companion DLL which mediated between the 32-bit process and the 16-bit DLL, but it was supported – known as "flat thunking".

By contrast, Windows NT doesn't support 32-bit processes loading 16-bit DLLs, although it does support the reverse (16-bit processes loading 32-bit DLLs – "general thunking")

http://rgmroman.narod.ru/flthunk.htm

I'm not sure how widely this facility, of loading 16-bit DLLs into 32-bit processes, was used. I thought, Microsoft actually used it internally in implementing parts of Win95, but I could be misremembering that.


It wasn't entirely 32-bit. The core certainly was, but large chunks of the UI portions were still 16-bit; hence, for example, there was a limit of 16,384 GDI objects across the whole OS, so you would get resource depletion errors with many programs running, especially if one or more of them "leaked" GDI handles. 16,384 comes from the fact that while they were four-byte (32-bit) values, they were stored in a single 64 KB block of memory (the largest block addressable by a single 16-bit register).

Sure there were still some data structures that were limited by being designed in the old 16bit era. Fun fact: even today kernel "Unicode strings" are limited to a length that can be held in a signed 16bit value. Fortunately 32,767 is enough for anybody so it was never changed.

> Sure there were still some data structures

IIRC it was more than just data structures and that other chunks of GDI were 16-bit, in part due to problems with 32-bit driver support for graphics hardware at the time.


Total tangent, but this reminds me of the old Windows 95 joke: 32-bit extensions and a graphical shell for a 16-bit patch for an 8-bit operating system originally coded for a 4-bit microprocessor by a 2-bit company that can’t stand 1 bit of competition.


OTOH, the history of Windows 10 seems to be one botched update after another.

True story: last week I tried to update Windows 10 v1803 (preinstalled) on my dinky ASUS VivoBook E12 to v1903. The result was a seemingly endless "Reverting changes" boot loop that I haven't gotten around to fixing yet.


Yep, I had the exact same problem with v1903. In the end I had to make an install USB, reformat the disk and do a true reinstall (Windows claims it can do a clean reinstall of itself but it still keeps a ton of data and doesn't fix a broken installation)

I don't think your experience is very common.

Deploying Windows 10 to users from Windows 7 has been a lot less hassle than XP to 7 in my environments, and I haven't heard different stories from other admins.


I can't update to 1903 either. Continuously tells me that my PC can't run Windows 10 because they can't access the system reserved partition. It's going to take a manual reinstall, so I've been putting it off.

Yes, it is true, Microsoft has not taken good care of Windows in recent years.

I had a similar issue on my Microsoft Surface a while ago. After several months of completely failing to install updates, it finally succeeded last month for reasons I can't figure out. The Surface didn't get much use anyway when this started, so I wasn't really interested in investing a lot of time in fixing it, but it's really amazing to me that Microsoft can't even get Windows 10 updates right on their own hardware.

I still remember using Windows ME as a kid and loved it because it felt more modern than Win98. I still don't get what was so bad about it, but then again, even if someone had told me at the time, I would have had no idea.

Stability was the main issue. My personal experience is that the kernel would crash frequently and unpredictably. If I remember correctly, this would depend to a large extent on drivers, and so vary depending on your hardware. The kernel and driver model design is based on the legacy Windows 9x family, rather than Windows NT (like its successors).

Apparently, according to Wikipedia: "Although Windows 9x features memory protection, it does not protect the first megabyte of memory from userland applications. This area of memory contains code critical to the functioning of the operating system, and by writing into this area of memory an application can crash or freeze the operating system. This was a source of instability as faulty applications could accidentally write into this region and with that halt the operating system"

As it was aimed exclusively at home users, various features aimed at enterprise (but often desirable to power users) were stripped out or broken.


A bunch of gadgets and stuff that worked fine with 95 and 98 mysteriously broke with ME. Because it was a dead end with a short life there was never any incentive to support or fix it. And what was the point of breaking everything for a year?

I remember first being able to use USB mass storage devices (like a Lexar Jumpdrive with capacity measured in tens of MBs) with ME on the family computer. Then I learned that 98 SE had such support...but my HP Pavilion 8530 (with an AMD K6 CPU!!!) had the original 98. Copying files between the two was still limited to Zip 100s. Slightly related, apparently MSFT put out an update for 95 that included rudimentary USB support.

Resetting the machine two times a day got old after a while.

Okay, thought experiment time: how far back can you do this with Linux? I mean, obviously with enough effort you can make Yggdrasil to modern era, but like, without dropping into a root shell?

I think you can probably get from Ubuntu's first release to current. I also happen to know they 'supported' upgrading from Debian to Warty Warthog. And I've done a few Debian OS upgrades, so that's also doable, possibly all the way back to 1.1, released in 1996. Can you upgrade from something else to Debian?


I've been upgrading Debian continually since around 2000 (don't remember exactly). It's even the same disk image, migrated to various disks (using mdadm), expanded, converted to ext3 from ext2.

But not without pain, I've almost always needed to manually intervene and fix configuration files. So you need root shell.
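For reference, the basic shape of a Debian release upgrade looks something like this. A sketch only: the codenames are examples, and the exact recommended sequence for each release is in its release notes:

```shell
# Point apt at the new release (codenames here are examples)
sudo sed -i 's/buster/bullseye/g' /etc/apt/sources.list

# Refresh package lists, then upgrade in the two stages the
# release notes traditionally recommend
sudo apt update
sudo apt upgrade
sudo apt full-upgrade
```

The manual intervention mentioned above usually happens during full-upgrade, when dpkg prompts about locally modified configuration files.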


I'd enjoy a timelapse video of this with a timer at the bottom and an occasional pauses or slow-downs to show specific parts.

Fun stuff. My first thought is the time I was installing Windows 9x from floppies and one of the thirty-something disks was bad...

Considering how much I fresh install things, I really had no idea that you could upgrade all the way from 1 to XP.


Is there a 32bit Windows 10? Perhaps possible to go all the way.

Would be cool to produce some files in the first Windows like a picture in Paint 1.0 and then upgrade it again and again.


I did not realize there was a 32-bit version of Windows 10. Crazy. With a standard component, 16-bit Windows applications can still run on Windows 10 32-bit:

https://www.groovypost.com/howto/enable-16-bit-application-s...

Getting 16-bit on 64-bit Windows takes a little more work:

https://medium.com/hackernoon/win3mu-part-1-why-im-writing-a...


For the latter here is a working VM/layer partially based on Wine:

https://github.com/otya128/winevdm

It can be installed systemwide to get almost seamless support for 16bit Windows apps.


As much as Microsoft would love to kill 32-bit Windows, 16-bit line-of-business apps are sadly still a thing in some large companies: companies that may not have the source, or cannot for a variety of reasons port the code to 32-bit or 64-bit. Because 32-bit Windows can still run 16-bit apps, it will continue to live on in the consumer and end-user space. A 32-bit Windows Server does exist, but as far as I know it is not licensed for production use.


>Is there a 32bit Windows 10?

Yes, all Windows 10 editions are available in 32-bit or 64-bit versions.


Yes, and yes. I saw some video of it once, so it can be done.

Damn I'm old. I remember all of them :/

Am I the only one who doesn't like the backwards compatibility that Microsoft Windows provides? I find it frustrating to see the operating system itself not coherent: things like having multiple places for configuration (the Settings app and the Control Panel), non-HiDPI panels that pop up from various parts of the operating system, etc.

I find it frustrating to not have an operating system with the polish and coherency of macOS on non-Macs, and that's one of the biggest reasons why Mac people (myself included) don't even consider moving off of macOS.


We could retain the ability to launch old apps and still have Settings in one place. They don't inherently go together.

Microsoft might need to add something like a legacy control panel which only appears if you install an old app that adds something there. Users who don't use old apps wouldn't see it, so no harm done versus dropping support completely. And, I don't understand why Microsoft couldn't integrate those panels into the modern Settings UI, even if the underlying framework and visuals were different.


>Microsoft might need to add something like a legacy control panel which only appears if you install an old app that adds something there.

That's literally what they've done, the problem is most of their own systems are legacy!


> Microsoft might need to add something like a legacy control panel which only appears if you install an old app that adds something there. Users who don't use old apps wouldn't see it, so no harm done

Yeah, and that means (at least for me) non-coherency of the platform. Why should there be two different settings apps on one OS, depending on whether the user installed an old app or not?

> I don't understand why Microsoft couldn't integrate those panels into the modern Settings UI, even if the underlying framework and visuals were different

FYI: the reason Microsoft can't remove this control panel stuff is that some legacy apps (mostly designed for enterprise) install DLLs that add a control panel item. This is one of the examples where their backwards compatibility holds them back from making a better OS.


> Yeah, and that means (at least for me) non-coherency of the platform.

Then don't install old apps and you'll get coherency. This is much better than outright blocking those old apps for everyone.

> FYI: the reason Microsoft can’t remove this control panel stuff is because some legacy apps (mostly designed for enterprise) installs DLLs that adds a control panel item

If I were Microsoft, I would remove the "Control Panel" as a program the user can open, but keep the underlying framework. Then, if a legacy program adds a control panel item, I'd place a shortcut to that item in modern Settings. It might not be 100% visually perfect, but you'd get most of the way there.

If there's some technical reason Microsoft can't do that (I don't understand how there could be—we're talking about shortcuts), at minimum the legacy Control Panel shouldn't appear until a 3rd party item is pushed to it, and it shouldn't contain any Microsoft settings. The current situation is completely unnecessary.


> If I were Microsoft, I would remove the "Control Panel" as a program the user can open

They did and people got upset. It's a "who moved my cheese" backward compatibility problem for people. People get upset if they can't find that thing they always used and worked just fine.

For a brief while in Windows 10, the legacy Control Panel was entirely removed from Search, and there were so many complaints that they backed off and made it searchable again. That's about the only way to find it; at this point in Windows 10 it's not in the Start Menu, and it's not in File Explorer Quick Access. People have to intentionally seek out the Control Panel.

People still talk about the COM GUID for the legacy Control Panel as if it were some secret cheat code to Windows, making shortcuts to the Control Panel because it looks familiar and powerful, and/or because they don't want to relearn anything new or get used to all the cheese that moved around in the modern Settings app.
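For context, that "cheat code" is a shell-folder GUID. From memory (verify before relying on it), this opens the legacy Control Panel view, and the famous "God Mode" folder is the same mechanism with a different GUID:

```
explorer.exe shell:::{21EC2020-3AEA-1069-A2DD-08002B30309D}
```

The same string works as a shortcut target, which is exactly how people pin the old Control Panel back into reach.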


I have vague memories of complaints, but I recall it being because Settings was lacking a lot of really basic functionality at the time.

I think the attitude to Settings resembles a lot of the attitude to the Office Ribbon switch. Just about no one uses more than roughly 20% of Control Panel/Settings, but just about everyone uses a slightly different 20%, so even just defining what's "basic functionality" is fraught with a wide variety of subjective opinions.

> that’s one of the biggest reasons why these Mac people (myself included) doesn’t even consider moving out of macOS

Fortunately for Microsoft this "issue" has historically resulted in some additional (if uneven) familiarity among Enterprise users, where MacOS is nowhere to be found, though I do concede it's been getting out of hand now with Windows 10


Yeah, and AFAIK they actually maintain an alternative version of Windows (LTSB they call it) for enterprise.

I would like a modernized Windows without all the cruft; one of the reasons Windows apps have such poor quality is that apps don't need to update their internals, so they just never take advantage of the newer features. I'm not sure, but last time I looked at the Windows world, HiDPI support for Win32-based apps was opt-in, not opt-out.

Apple has a great record on deprecating things in a fairly understandable speed, so apps in macOS generally have up-to-date internals with all of the features working consistently.

Most apps use the control-center (compared to Windows where every app reinvents notifications, even with the existence of the notification center added in Windows 10), use macOS tabs, users can define keybindings that work in every Cocoa TextField, etc...


> I would like a modernized Windows without all the cruft

I miss Windows Phone 10 every day still.

Sometimes I still miss Windows 8, when the Win32 desktop booted late and if you stayed entirely in UWP apps things were wildly clean and Windows ran like a dream. (I also know how many people hated that experience.)

Windows has the "flex" to do it, just not the developer nor the user buy-in.

There's hope that with "Windows 10X" they're trying to "flex" a bit and show people what Windows can do when allowed to limit the legacy cruft. The legacy cruft is all still there, though; Microsoft has learned the hard way that it's there for good. But maybe there's a small bit of hope that developers will be interested enough in making first-class "10X" applications, and users interested enough in the magic of post-legacy Windows UI, to try new experiences at least some of the time they aren't reaching for the comfort food of the classic Win32 legacy. Maybe a little hope; we'll see.


LTSB/LTSC isn't an alternate version of Windows per se. It's a "branch" of Windows 10 that gets security updates but not feature updates.

As far as I'm concerned, LTSC is _perfect_. I want security updates, but I don't want the rest of my OS to change out from under me. And, because the security updates are much smaller, they download and install far more quickly.


> the operating system itself not coherent

> settings app and the control panel

After using Mac, I once decided to explore Win7's control panel on a friend's machine. I counted ten different kinds of windows in there, not including third-party additions. Some of the windows had controls that weren't used anywhere else in the system—iirc it was the side panel with navigation links.

Hearing that MS now made another entire app for the settings, in addition, was a knee-slapper.

Control panel is my litmus test for the quality of a desktop environment. E.g. you can easily see how Gnome 2 stole ideas from Mac, in the good sense.


You're not the only one, but I'm of the other thought.

I like the insane level of backwards compatibility.

When you can run a windows game from 1996 with only a few tweaks, it's beautiful. It's a worthless gimmick, but it makes me pop.


The nostalgia... spent many hours of my life looking at some of those install screens and inserting new floppy disks.

This is beautiful

Is MS Virtual PC really this slow? On VMware Workstation I can install Windows 3.11 in about 7 seconds, not the 6 minutes this author is showing.

