General-purpose OS, special-purpose OS, and now: vendor-purpose OS (drewdevault.com)
83 points by ddevault on June 26, 2020 | 9 comments



Historically, wasn't this the case?

Companies like IBM, DEC, etc. built highly-proprietary systems that weren't compatible with each other, and were able to set high prices because of it.

Only later, in the late 20th century, did computers become more compatible, as open standards for hardware interconnects, communication protocols, and file storage formats were adopted. And the rise of FOSS in the 1990s also did a great deal for this.


Yup... this reads more like a low-fidelity rant than a substantive observation. Historically, proprietary/walled-garden operating systems were the norm. Trying to get, say, an IBM VM/CMS environment to talk to a DEC VAX/VMS system involved the deepest of dark incantations, and usually some expensive third-party software and hardware (show of hands... anyone else try to implement an APPN to DECnet gateway? Not recommended).

There has never been a time when vendors didn't try to 'innovate' in ways that lock out competitors, or at least make them less convenient. Right or wrong, it's hardly an insight that this is still a business model.


> "Only later, in the late 20th century, did computers become more compatible, as open standards for hardware interconnects, communication protocols, and file storage formats were adopted. And the rise of FOSS in the 1990s also did a great deal for this."

The rise of the Wintel duopoly in the '80s and '90s did that; FOSS operating systems didn't have enough market share for OEMs to take notice until the early to mid-2000s.

(Arguably, FOSS operating systems still haven't had an effect on the fundamentals. There is no open standard by an independent standards body for the architecture of an x86 PC/server. It's governed by Microsoft's HLK: https://en.wikipedia.org/wiki/Windows_Hardware_Lab_Kit )


Good point, this isn't an entirely new trend.


The obvious counterpart is vendor DRM/copy-protection, which has been around since forever (see: the NES lockout chips, every fancy disk-sector protection scheme used on the 8-bit micros). The attempt is always to make the computer do less than it is literally capable of. Extending these limits into the context of an apparently general-purpose operating system is just a way of elaborating on the idea.


> They’ve also long been making similar moves in their hardware design, adding anti-features which are explicitly designed to increase their profit — adding false costs which are ultimately passed onto the consumer.

Honest question, what is this statement referring to?


Removing ports. Zero upgrades possible, not even the hard disk or memory.

In Apple's case, repairing is near impossible for an average user. Even the fucking screws are non-standard. They charge an arm and a leg for repairs.


Dongles in Apple products, probably.


Well, OSes take a lot of effort and resources to update, maintain, and release.

The entities that put forward the resources have strong reasons/goals, and they are going to target their OS or distribution toward those reasons/goals.

It never was, and never will be, different.

An OS is a platform, so a vendor will want to appeal to users/customers. But that's one of several concerns they balance. They generally target a market -- a group or multiple groups of users with a similar perspective. If you aren't in one of the groups they are targeting, you will probably be unhappy with a relatively large number of tradeoffs they make. In other words, maybe that OS isn't for you.

That's OK.

In the end, you will be disappointed if you were expecting anyone to do a lot of work or expend a lot of resources to cater to your needs to the detriment of their own.

Put another way: No one's your mother and father but your mother and father (not even them in some cases, but you all don't need to hear details of my childhood).

Also: it never made any sense for Apple to adopt Vulkan into their OSes. A general-purpose OS needs a low-level abstraction on top of the GPU. Adopting an external API for this purpose means OS releases would be tied to a dependency they don't control.

(Not to mention: Metal predates Vulkan. Imagine the awkward transition that would have been to drop Metal a year or two in and adopt Vulkan instead.)

So, e.g., the next version of iOS might not be able to ship because a Vulkan committee debate on the semantics of a new API is running long... thus delaying the release of the next iPhone, which depends on the new OS version. (That would be bad for the vendor and for the customers who want that new iPhone.)

That was just never gonna happen.

Apple could have developed and supported the Vulkan API as a translation layer on top of their native graphics abstraction API (Metal), but that's something much better done by a third party via an independent library. A game, e.g., is much better off targeting a specific Vulkan API version and including the library that implements it (with its own platform compatibility support) in its own release.
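
That's essentially what MoltenVK does: a third-party library that implements Vulkan on top of Metal, which a game links and ships itself. A minimal sketch of the idea, assuming MoltenVK is linked into the app (the app name is hypothetical and the version pin is just illustrative):

    /* Minimal sketch, assuming MoltenVK (a Vulkan-on-Metal translation
     * library) is linked into the app. The app pins the Vulkan version
     * it was written against and ships the implementation itself,
     * independent of whatever the OS provides. */
    #include <vulkan/vulkan.h>
    #include <stdio.h>

    int main(void) {
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName = "my-game",    /* hypothetical app name */
            .apiVersion = VK_API_VERSION_1_1, /* the version we target */
        };
        VkInstanceCreateInfo create = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance instance;
        if (vkCreateInstance(&create, NULL, &instance) != VK_SUCCESS) {
            fprintf(stderr, "Vulkan init failed\n");
            return 1;
        }
        /* ... issue Vulkan calls; MoltenVK maps them onto Metal ... */
        vkDestroyInstance(instance, NULL);
        return 0;
    }

The point being: the Vulkan version is baked into the game's own release, not tied to whatever the OS happens to ship.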

If Apple did it as part of the OS, a game dev would have to work with whatever version that shipped with the OS the end-user is running. You need an API to be stable to release software written to run on top of it.

Think of it this way: if Vulkan were the native graphics hardware abstraction API of macOS or iOS, game devs would need another abstraction on top of it to write their games against. (Many do this already, of course, with Unity or other game engines.)





