Companies like IBM, DEC, etc. built highly-proprietary systems that weren't compatible with each other, and were able to set high prices because of it.
Only later, in the late 20th century, did computers become more compatible, as open standards for hardware interconnects, communication protocols, and file storage formats were adopted. The rise of FOSS in the 1990s also did a great deal to accelerate this.
There has never been a time when vendors didn't try to 'innovate' in ways that locked out competitors, or at least made them less convenient. Right or wrong, it's no great insight that this is still a business model.
The rise of the Wintel duopoly in the '80s and '90s did that; FOSS operating systems didn't have enough market share for OEMs to take notice until the early to mid-2000s.
(Arguably, FOSS operating systems still haven't had an effect on the fundamentals. There is no open standard by an independent standards body for the architecture of an x86 PC/server. It's governed by Microsoft's HLK: https://en.wikipedia.org/wiki/Windows_Hardware_Lab_Kit )
Honest question, what is this statement referring to?
In Apple's case, repair is near impossible for an average user. Even the fucking screws are non-standard. And they charge an arm and a leg for repairs.
The entities that put forward the resources have strong reasons/goals, and they are going to target their OS or distribution toward those reasons/goals.
It never was, and never will be, different.
An OS is a platform, so a vendor will want to appeal to users/customers. But that's one of several concerns they balance. They generally target a market -- a group or multiple groups of users with a similar perspective. If you aren't in one of the groups they are targeting, you will probably be unhappy with a relatively large number of tradeoffs they make. In other words, maybe that OS isn't for you.
In the end, you will be disappointed if you were expecting anyone to do a lot of work or expend a lot of resources to cater to your needs to the detriment of their own.
Put another way: No one's your mother and father but your mother and father (not even them in some cases, but you all don't need to hear details of my childhood).
Also: it never made any sense for Apple to adopt Vulkan into their OSs. A general OS needs a low-level abstraction on top of the GPU. Adopting an external API for this purpose means OS releases would be tied to a dependency they don't control.
(Not to mention: Metal predates Vulkan. Imagine the awkward transition that would have been to drop Metal a year or two in and adopt Vulkan instead.)
So, e.g., the next version of iOS might not be able to ship because a Vulkan committee debate on the semantics of a new API is running long... thus delaying the release of the next iPhone, which depends on the new OS version. (That would be bad for the vendor and for the customers who want that new iPhone.)
That was just never gonna happen.
Apple could have developed and supported the Vulkan API as a translation layer on top of their native graphics abstraction API (Metal), but that's something much better done by a third party via an independent library. A game, e.g., is much better off targeting a specific Vulkan API version and including the library that implements it (with its own platform compatibility support) in its own release.
If Apple did it as part of the OS, a game dev would have to work with whatever version that shipped with the OS the end-user is running. You need an API to be stable to release software written to run on top of it.
Think of it this way: if Vulkan was the native graphics hardware abstraction API of macOS or iOS, game devs would need another abstraction on top of it, to write their games against. (Many do this already, of course, with Unity or other game engines.)
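To make the layering argument concrete, here's a toy sketch (every class and method name below is hypothetical, not a real Metal, Vulkan, or MoltenVK API): the game bundles a translation layer pinned to one API version, so an OS update that changes the native graphics layer doesn't change the API contract the game was written against.

```python
# Toy model of the translation-layer argument. All names are made up
# for illustration; none of this is a real graphics API.

class NativeGraphicsAPI:
    """Stands in for the OS vendor's native abstraction (Metal's role).
    Its behavior changes with each OS release, outside the game's control."""
    def __init__(self, os_version: str):
        self.os_version = os_version

    def submit(self, command: str) -> str:
        return f"native[{self.os_version}]: {command}"


class BundledTranslationLayer:
    """Stands in for a library the game ships itself (the role MoltenVK
    plays in reality): it implements one pinned cross-platform API
    version on top of whatever native API the OS provides."""
    PINNED_API_VERSION = "1.2"  # chosen by the game at build time

    def __init__(self, native: NativeGraphicsAPI):
        self.native = native

    def draw(self, mesh: str) -> str:
        # Translate the pinned, stable API call into a native command.
        return self.native.submit(f"draw {mesh} (api {self.PINNED_API_VERSION})")


# The game targets the version it bundles; the OS version underneath
# can change without breaking the game's side of the contract.
game_on_old_os = BundledTranslationLayer(NativeGraphicsAPI("14.0"))
game_on_new_os = BundledTranslationLayer(NativeGraphicsAPI("15.0"))
print(game_on_old_os.draw("teapot"))
print(game_on_new_os.draw("teapot"))
```

This is the pattern MoltenVK actually uses in practice: an independent library implementing Vulkan on top of Metal, shipped by the app rather than by the OS.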