About the security content of macOS Monterey 12.1 (support.apple.com)
163 points by ksec on Dec 14, 2021 | 227 comments


Hi, CEO of Kolide here, and one of the reporters of CVE-2021-30987.

This vulnerability is basically an information disclosure issue that enables you to get BSSIDs from the Airport Utility without the appropriate location-tracking entitlement. Even worse, the OS will not even flash the "location being accessed" indicator in the status bar when geolocation is determined this way.

Essentially, any app can shell out to the CLI tool, parse its output, and then feed the resulting BSSID to one of the many external BSSID-to-geolocation services to determine the device's location.
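A minimal sketch (in Swift) of the kind of shell-out described above. The binary path, the "-I" flag, and the output format are assumptions based on the commonly known undocumented airport tool, not details taken from the report; on systems with the fix, this kind of unentitled read is exactly what should no longer hand back a BSSID.

    import Foundation

    // Hypothetical illustration only: path and output format are assumptions.
    let airportPath = "/System/Library/PrivateFrameworks/Apple80211.framework/Versions/Current/Resources/airport"

    func currentBSSID() -> String? {
        let process = Process()
        process.executableURL = URL(fileURLWithPath: airportPath)
        process.arguments = ["-I"]          // prints current Wi-Fi info, one "key: value" per line

        let pipe = Pipe()
        process.standardOutput = pipe
        do {
            try process.run()
            process.waitUntilExit()
        } catch {
            return nil
        }

        let data = pipe.fileHandleForReading.readDataToEndOfFile()
        guard let output = String(data: data, encoding: .utf8) else { return nil }

        // Look for a line such as "BSSID: aa:bb:cc:dd:ee:ff" and return the value.
        for line in output.split(separator: "\n") {
            let trimmed = line.trimmingCharacters(in: .whitespaces)
            if trimmed.hasPrefix("BSSID:") {
                return trimmed.dropFirst("BSSID:".count).trimmingCharacters(in: .whitespaces)
            }
        }
        return nil
    }

The returned BSSID can then be submitted to any public BSSID-to-location lookup service, which is the crux of the disclosure.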

At Kolide we are huge believers that device geolocation should not be accessible to third-party programs without end-user prompting (even ones managed by MDM), so we aggressively seek out gaps in the TCC authorization model (or other OS issues that undermine the tenets of https://honest.security) and report them whenever we find them.

I reported this vulnerability on January 17th 2020.

Edit: Btw, I am not really sure why the CVE was withdrawn. To me, this is a pretty serious information disclosure vulnerability and everyone should upgrade to this release as soon as possible.


This - again - lowers my confidence in the CVE mechanism...

Thanks, though, for the background information on what is/was going on!


Why is Apple referencing CVE-2021-30938 and CVE-2021-30987 in their listing of security updates when they are not searchable in the National Vulnerability Database? Their status there is listed as "* REJECT * DO NOT USE THIS CANDIDATE NUMBER. ConsultIDs: none. Reason: This candidate was withdrawn by the CVE program. Notes: none.". Can anyone give more clarity on what is going on?



Apple should keep providing security updates for the latest version supported on any machine.

There are lots of Macs stuck on macOS High Sierra (the last one that supports GPUs without Metal), and none of them are receiving any security updates.


They also released an update for Big Sur and Catalina: https://support.apple.com/en-us/HT201222

No luck with High Sierra though. Its last update was in Nov 2020.

Apple is notable for providing ZERO official guidance on how long various versions receive security updates. Officially they just say you must be on the latest version. At the moment they're still releasing updates back to Catalina on macOS and iOS 12 for all the devices still stuck on that version. But they make no claims on how long that will continue.


They seem to support the current version and the previous two before that. So Mojave is not getting support, but Catalina is at this time. But be careful what you wish for: the last Catalina security update broke software using OpenGL-accelerated graphics on Apple computers with integrated Intel HD 4000 graphics. This includes Google Earth, Google Street View, AutoDesk, RStudio and many web apps. So a lot of people have had to revert to older unsupported macOS versions just to get any work done.


Yup, this is a well-known secret among anyone who supports Macs: Apple typically supports the latest release and the two immediately preceding it. iOS is a different story though, as they sometimes go back a few versions to fix CVEs.


I'm also locked on High Sierra...

The only solution is to install Big Sur using Hackintosh patches and run it on my computers.


>They also released an update for Big Sur and Catalina: https://support.apple.com/en-us/HT201222

How does one know that Big Sur, for instance, isn't already compromised by one of those wonderful code-execution capabilities, given that they are only providing the update now?


It's uncommon for previous versions of software and OSes not to be impacted by new vulnerabilities.


Not sure what you're getting at? They released the Monterey, Big Sur, and Catalina security updates all at the exact same time. They're all vulnerable, and they all had patches released at once.


I have a machine still rockin MacOS 9. No security updates either, though also not much of a target…


And any app can run arbitrary code with kernel privileges by design.


Did anything bad happen?


Macs running classic MacOS were famously unstable for professional work. I got used to restarting my Mac when getting a cup of coffee as preventative maintenance.

The few viruses that were around were mostly annoyances.


Some of the professional software was also famously unstable. Generally speaking, saving frequently helped. (Quark, I'm speaking of XPress 3 and its memory leaks, guaranteed to strike when it really hurt…)


That's not just Mac OS and professional software either. Windows did that too. I was very young back then, but I do remember being angry at the computer crashing and losing whatever I was doing, only to be told "it's your fault, you should save your file more often next time". It becomes a sort of subconscious habit really quickly.


The important point being, once you saved, the working memory was also reset to sane conditions and you could keep on going for days, or weeks. Saving frequently not only kept the injuries, which added to the insults of a crash, in bounds, it also prevented those crashes from happening at all.


What counts as professional? I think my friend had a G3 that never had issues as a student.

I just had it in "computer class" to play crappy math games that hurt my eyes but they never crashed, and I wasn't a professional. I used Photoshop in 2003 or so, I think it was OSX by then, and never had an issue if you call photoshop and internet usage "professional" work.


Professional as in getting paid for your work. MacOS 9 crashed if you looked at it wrong.

And I still miss it.


How about using Snow Leopard with Rosetta?


You can't run classic Mac OS programs on Snow Leopard with Rosetta. Rosetta just let you run PPC Mac OS X apps on Intel Mac OS X, so those were just as stable as the rest of Mac OS X. Even with Classic on Mac OS X for PPC, at worst you'd crash the Classic environment, not the whole OS.


Nah. Once you got creative users to stop installing scads of extensions, Macs were way more stable than Win95 boxes.

The problem was that creative users had no concept of "some extensions just aren't compatible with others".


It's not as simple as that. We ran a tight ship, extension-wise.

There were plenty of bugs in QuarkXPress, Photoshop and Illustrator to make our Macs bomb, and the fundamental lack of memory protection, process isolation and preemptive multitasking in Mac OS meant that the whole system was wobbly from the bottom up.

Also, my post wasn't meant as a comparison between Win95 and Mac OS. I just responded to a question about what Mac OS was like to work with on a daily basis in the decade of the 90s. Win95 wasn't around for half of it.


It’s slightly weird and unnerving to realise that modern macOS is older today than classic MacOS was when it was obsoleted.


And it's been through 2 CPU architecture transitions.


More than that when you consider modern Mac OS X initially ran on 68k black NeXT hardware, then 486, PA-RISC and SPARC... plus PPC and Apple Silicon.


Unnerving indeed. How time flies.


> I just responded to a question about what Mac OS was like to work with on a daily basis in the decade of the 90s.

Your experience does not match my own.

I had way more issues supporting users on Windows, at least until Windows 98SE landed.

It was a happy day indeed when Windows 2000 finally landed with memory protection, process isolation and preemptive multitasking.


You keep bringing up Windows. I avoided it like the plague. It also wasn't an option for our use.

(Until, as you point out, Windows 2000 came around. Then I started working on Mac and PC simultaneously.)


There was also the thing with virtual memory. Some apps did run with it (so you had "more memory") and some didn't. Enabling or disabling it meant reboot. So you rebooted based on which app you needed to run.


That had to be a long time ago. I ran MacOS 7.6 through 9 for much of the 90s and don’t recall any apps not working with virtual memory enabled.


Even in 9... Photoshop had its own memory manager and didn't play nice with VM. Pro Tools couldn't even launch.


Woz has a chapter in his book about how a stable system magically became unstable after installing IE. "Without making accusations," he says, but he goes into some detail on how installing IE caused a lot of chaos.


I haven't ever used classic Mac OS when it was current, and I don't know anyone who did, but I do remember Windows 9x crashing and locking up being an almost daily occurrence. PCs came with dedicated reset buttons for a reason!


My memory of Mac OS 7.1 was that it was rock solid. When it did bomb it wasn’t a surprise — you knew it was coming because you’d been messing around in ResEdit or playing with the debugger.


Or installing a hundred conflicting INITs


System 7.1 -- it didn't become Mac OS until sometime in the 7.5 era.

EDIT: Correction, I said 8.5. Sleepy.


7.5/7.6 was the switch-over to the "Mac OS" name.


Yep, sorry. Sleepy :)


I did; it was all intranet. I mentioned this in another post, but they didn't have the same threats as we do today.

>PCs came with dedicated reset buttons for a reason!

They still do! Windows 9x did suck; Windows NT, however, was amazing. It's the most Linux-like of all the Windows versions. How could they release Windows ME...


All modern Windows versions since XP have NT lineage.


They have other issues now, mostly performance, but not as many crashes at least. It's like saying all modern Intel CPUs are descended from Core M; I don't know if it adds any relevant information. It has mutated a lot and it's very different, but at least it doesn't crash like 95, 98, 98 Plus! and ugh, ME.


What's it for?


I think they do. There are machines for which there isn’t any version supported on that machine, though.

That’s not different from the situation with Windows. An Intel 386DX CPU with 4 MB of system RAM and 50–55 MB of hard disk space won’t see Windows security updates, either, anymore.

What is different is that Apple is less clear in support periods and often more aggressively leave older hardware behind.

I don’t think you can require them to support things forever, but requiring everyone to be clear about it at time of sale, or requiring X years of support would be good ideas, IMO.


> That’s not different from the situation with Windows.

Yes. And that is also part of the problem. They also don't patch reported vulnerabilities, nor provide a clear path for the community to fix them themselves.

> What is different is that Apple is less clear in support periods and often more aggressively leave older hardware behind.

Yes, that amplifies the problem a lot. Many of the abandoned machines are 10 years old but still very suited for daily use. They have 64-bit Core i processors with decent amounts of RAM (such as 8 GB).


>I don’t think you can require them to support things forever, but requiring everyone to be clear about it at time of sale, or requiring X years of support would be good ideas, IMO.

Why can't you require them to support things forever? Why not? It was their choice to completely lock down the devices they sell.

I think that as long as a device is locked and you cannot get root access and install your own software, then of course you can require them to support it forever, at least until they provide an unlock for it, no?


Agreed. But that wouldn’t change anything about the macs under discussion. They aren’t locked and never were.


That is why locked devices, with no ability to get root access and full control over them, are a problem. At some point they simply stop providing software while blocking you from providing your own.

Perhaps it's time to question this practice and discuss potentially making such devices illegal? At least when the company no longer provides software?

Shouldn't Apple then be legally obligated to provide root access for the devices it no longer supports?

Otherwise, what is it? You buy a device and then it stops functioning just because they decide so. Doesn't that look like some kind of fraud/scam? What do you call it then?

I mean, if one owns a computer/iPhone and cannot install one's own software on it because the device is locked, and Apple does not provide security updates for its software, what should the owner of the machine do?


>At some point they simply stop providing software while blocking you from providing your own.

You knew this when getting these devices. You didn't buy it and expect to be able to run Android on an iPhone, did you?

> Perhaps it's time to question this practice and discuss potentially making such devices illegal? At least when the company no longer provides software?

Why? I don’t see a huge benefit. I don’t love it, but it’s not like you can’t run old versions or it stops working. Do you expect them to give you software updates forever once you buy it? Apple is one of the best at updating mobile.

If it were up to me, I'd like to have the paid-updates model again. It would stop forced upgrades; I'm not fond of the security-comes-with-features model, and I think it would force them to support other versions better, since right now they expect everyone to always update.

Then again, if you don't like the product, don't buy it. Many Android phones let you do all of that, and once I set up an iMessage server to route it I'll switch completely.

>I mean, if one owns a computer/iPhone and cannot install one's own software on it because the device is locked, and Apple does not provide security updates for its software, what should the owner of the machine do?

What security problem are you worried about? I never update iOS from the version it came with and have never been hacked, ever. I'd love to hear of anyone with that experience; I have never heard of it happening once, same with my Android devices, although I root those. Maybe my ad blockers, blacklist, or the VPN on mobile prevent it anyway; on my home network I rely on my router to protect my devices.

Do you have a realistic fear? Which exploit? Why is it scary? Best practice is to stop assuming you have any safety, updates or not; there are lots of unpublished backdoors and lots of ways to hack you if they really wanted to.


> That is why locked devices, with no ability to get root access and full control over them, are a problem. At some point they simply stop providing software while blocking you from providing your own.

All of the Macs stuck at High Sierra can also run Windows or Linux.


>All of the Macs stuck at High Sierra can also run Windows or Linux.

What about the current M1 models after a few years?

Btw, what about iPhones/iPads? Can they run Linux?


Yeah, you can run Linux on an iPhone, but it takes some tech skills:

https://www.macworld.co.uk/news/linux-iphone-7-3800398/


Sorry, the project is long dead: no updates and no functionality. Since iOS is Unix, you can just use a CLI on iOS anyway.


You seem to have confused iOS devices with Macs.

Macs can run another OS by design.


>You seem to have confused iOS devices with Macs.

I do not think I did. See above: "I mean, if one owns a computer/iPhone ..."

>Macs can run another OS by design.

If you are talking about previous models, then to a certain degree yes, except for GPUs, if I am not mistaken.

But a current M1 Mac, as far as I understand it, doesn't boot anything else unless it first boots from the internal storage and only then hands off to an "external OS loader", which also has to be authorized at least once by Apple.

You can say that "Macs can run another OS by design", but a design in which they must authorize everything, while not providing full specs of the HW interfaces for "another OS", is not something I would call that.


>a design in which they must authorize everything, while not providing full specs of the HW interfaces for "another OS", is not something I would call that

When did Intel release the full specs for the hidden OS that runs on the Management Engine in x86 chips?

>[Intel] processors are running a closed-source variation of the open-source MINIX 3. We don't know exactly what version or how it's been modified since we don't have the source code. We do know that with it there, neither Linux nor any other operating system has final control of the x86 platform.

It can reimage your computer's firmware even if it's powered off. Let me repeat that. If your computer is "off" but still plugged in, MINIX can still potentially change your computer's fundamental settings.

https://www.zdnet.com/article/minix-intels-hidden-in-chip-op...


That's simply not true. Apple allows booting unsigned/custom kernels on Apple Silicon Macs without a jailbreak. This is according to the Asahi Linux docs.


>That's simply not true.

"Strictly speaking, the things can boot off of DFU (USB device mode) too, but to make that useful for regular boot you need to ask Apple, as currently you cannot boot a normal OS like that as far as I know, only their signed restore bundles (which is how you fix an M1 Mac if you wipe the SSD)." (https://news.ycombinator.com/item?id=26116017)


Strange, T1 didn’t.


I think that's understandable, it's primarily a security chip so if you're ever going to lock anything down, that would be it. OK, from a purist perspective that's arguable, but I think it's also arguable allowing unsigned code would make it less secure. Anyway you can still run another OS on a Mac with a T1.


Also not true. Someone just needed to write Linux drivers for that hardware.


No, Apple's security coprocessors were not designed to be unlocked (but now can be, via checkra1n).


The thing that initially kept Linux from installing on those Macs was the lack of a driver for the SSD controller which is located on that T1/T2 chip.

https://t2linux.org/


Developing a SSD driver does not require unlocking the device, though.


We're talking about the ability to install Linux on the hardware.

The drivers to do so exist now.


>You seem to have confused iOS devices with Macs.

"macOS on ARM is very clearly a descendant of the way iOS works. It's just macOS userspace on top of an ARM XNU kernel, which was already a thing. The way the boot process works, etc. is clearly iOS plus their new fancy Boot Policy stuff for supporting multiple OSes and custom kernels." https://news.ycombinator.com/item?id=28181976

I think after the M1 it will become more and more confusing to treat them as entirely separate.


Apple doesn't support software - they support hardware. When they stop supporting a particular piece of hardware, you're SOL.


I would question why anybody is still running High Sierra at this point. Any hardware that far back has to be miserable to daily drive.


I would question Apple's fast and furious update schedule. Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey. The operating system itself has probably seen very little changes since NeXTSTEP's first release in 1989, it only appears different, and the vast majority of changes are in the graphical interface.

It begins to look like the software philosophy pioneered by Adobe: buy the same software every 2 years. But Apple upped that game to every stinking year. Why? What is the benefit to the user? Wouldn't it have been far better not to place so much attention on a constant, never-ending stream of so many new features that so few use, and instead focus on the necessary features everyone uses, as well as security and fixing the bugs?

I also don't understand Apple's philosophy of abandoning compatibility. Was 32-bit compatibility really all that crippling to Apple's 64-bit endeavors? Or is it just an easy way to break production software to force purchases of new software? Imagine if the automobile industry had changed fuels every 10 years starting in the 1940s, such that any car older than 10 years would no longer function for lack of access to fuel. I don't think we would have let that stand. And yet we eat what Apple puts on our plate. I like macOS, but this "new OS every year!" sucks, has no benefit to the user, and it is, fundamentally, a lie.


>Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey. The operating system itself has probably seen very little changes since NeXTSTEP's first release in 1989, it only appears different, and the vast majority of changes are in the graphical interface.

Well, that range covers three different CPU architectures(!) and at least one complete file system rewrite.

But your point is fair - to the layperson there hasn't been much change.


> Well, that range covers three different CPU architectures(!)

The exercise proves that hardware is irrelevant to an operating system

> and at least one complete file system rewrite.

Which filesystem was completely rewritten, and why? Unless you're referring to APFS... which wasn't rewritten, but developed.

> But your point is fair - to the layperson there hasn't been much change.

Pretty sure not a whole lot has changed for the experienced expert, either. What has incrementally changed (but not so much it's unrecognizable) is the graphical user interface. A GUI is not an operating system.


>Unless you're referring to APFS... which wasn't rewritten, but developed.

You know what I meant and the distinction is irrelevant in this case, so don't be that guy. The point is, a lot of impressive engineering resources have gone into the OS over the years. In particular the seamless transitions (due to Rosetta, twice) from PowerPC -> Intel -> Apple Silicon (among other things), so the statement that macOS 10 -> 12 has little fundamental difference is a bit hyperbolic.

Anyways, I'm not sure why you're trying to start an argument but I'm not interested. You sound way more uhhh... passionate about this topic than I care to be.


It's really ok if someone disagrees with you. It doesn't make them argumentative.

NeXTSTEP was compiled to run on 68k, x86, RS/6000, and SPARC; it didn't change the version number. Tiger and Leopard ran on PowerPC and Intel. Snow Leopard ran both 32-bit and 64-bit with distinct kernels. Big Sur and Monterey run on Intel and ARM. In each instance, these are the same OS at the same version running on different platforms. But in each instance, it is still the same operating system, regardless of any differences in the code on any particular platform. The hardware is irrelevant.

There are technologies introduced and some simultaneously deprecated at each increase of a macOS version number. Most of this has to do with the user interface, but there are some changes to the OS having little to do with the user interface. These changes, however, are incremental point increases, at best, not version whole number advances, which Apple has inexplicably ticked forward every year since 2012. What are the differences between Monterey and Mountain Lion that warrants their distinction? Ignoring platform, shown irrelevant above, they merely appear very different, along with a few inconsequential new or removed features, but they are more alike than they are different. So I think Apple has employed version numbering as marketing rather than marking milestones of significant advancement. Microsoft has done this too, in spades. Sure, Vista and Windows 8 are unrecognizable, but they are both still fundamentally NT, and not distinct operating systems.


>It's really ok if someone disagrees with you. It doesn't make them argumentative.

No, but tone does. As does uncharitably interpreting comments.

And I never said hardware itself mattered, I said the level of software engineering in seamlessly transitioning MacOS across different architectures was impressive and resource consuming. Rosetta (executed flawlessly twice) in particular is an incredible feat. As were Time Machine (effortless backups), full disk encryption (seamless and transparent) at the time they were released. Not world changing, sure - but what OS feature in the past 20 years is?

>Apple has inexplicably ticked forward every year since 2012. What are the differences between Monterey and Mountain Lion that warrants their distinction?

You've moved the goal posts. The comment I replied to was:

>Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey.

Anyways, I'm no Mac fanboy so it feels weird trying to hype MacOS up. My only point was that the above statement comes across as more than a little hyperbolic.

I'm not entirely sure how to counter your argument since you've provided no criteria for how much the OS should have changed over the years. You've only said it hasn't changed enough. That's a bit vague. What OS meets your criteria? What obvious feature is MacOS lacking? What's your gold-standard benchmark for what an OS should be?

To be clear, I agree with you that many (most) MacOS changes are marketing-driven fluff. But I also fail to see any significant advancements in other OS's that MacOS sorely needs.


Well, calling a dissenter argumentative, or griping about tone, is actually ad hominem fallacy, because it ignores the argument entirely and personally attacks the man.

In fact, you did list migrating platforms to support macOS versioning, which can be interpreted as saying the hardware matters enough to advance versioning.

Goal posts weren't moved (farther apart, they were moved closer together). This is not the same thing as the meaning of the idiom "moving goal posts," especially because no goal was met.

> since you've provided no criteria for how much the OS should have changed over the years.

This is actually a good example of moving goal posts, because it is beyond the scope of supporting the argument. It is perfectly ordinary that something be proved as less than ideal without providing what a perfect or better ideal is.

> But I also fail to see any significant advancements in other OS's that MacOS sorely needs.

This is a straw man and skew to the argument, which is simply that calling annual upgrades a new operating system due to new GUI features doesn't make one OS distinct from its previous incarnation, but they are instead, at best, incremental point increases of the same operating system (which is actually what Apple does, 10.5, 10.6, etc.). I'm not sure where you got the notion that I was arguing anything beyond that, beyond decrying the Adobe-style marketing strategy of labelling the same software as though it were new and innovative. With Mac OS X, OS X and macOS, the kernel hardly changes and the user land barely changes if at all, and even the window manager barely changes. What mostly changes is that a few GUI features are deprecated while a list of new GUI features is added; what changes between macOS versions is superficial and incremental.

And it is merely my opinion, not a claim of some fundamental truth. So let me tell you what you're going to do, you're going to have better days and not sweat it so much.


>Well, calling a dissenter argumentative, or griping about tone, is actually ad hominem fallacy, because it ignores the argument entirely and personally attacks the man.

Well, I didn't ignore the argument, nor did I attack you personally (your tone is not your personality, is it?). So you're misusing "ad hominem".

Second, on a human level, I pointed out that your argument came across as curt and unnecessarily argumentative. You can choose to do with that feedback what you will. As in "hey, sorry, I didn't intend it like that" or "NAH-Nah you committed a technical foul therefore I win the argument!". It's your choice, but when people make observations about how you're coming across - it might be wise to assume they mean that genuinely.

>Goal posts weren't moved (farther apart, they were moved closer together). This is not the same thing as the meaning of the idiom "moving goal posts,"

You're plainly wrong. Further apart or narrower doesn't matter - what matters is you changed the scope of your argument to make it easier to support. Your original comment covered 2001-2021, you then casually changed it to 2012-2021 without explicitly acknowledging you were changing the argument.

That is textbook goal post moving. Textbook.

The rest of the argument is simple - you're saying MacOS hasn't changed enough. I'm saying - relative to its peers it has changed about the same amount. If you can tell me some great advancements that other OS's have had during the same time frame that MacOS lacks that would go a long way to supporting your argument. I'm actually curious on this (since for example - I don't follow Windows too closely).

Similarly, I can say Samsung (or whatever) TV's haven't advanced that much in the past 20 years. But if I can't point out a single feature or advancement they are lacking compared with their peers then it's not really a strong argument. It just sounds like I'm angry with Samsung for some reason.

>This is a straw man and skew to the argument, which is simply that calling annual upgrades a new operating system due to new GUI features

You're incredible. Actually screaming "straw man" in the same sentence you're saying Apple (or literally anybody) calls its point releases "a new operating system". Amazing.

You're now 0 for 2 in use of the term "fallacy".

>So let me tell you what you're going to do, you're going to have better days and not sweat it so much.

I'm fine. As previously mentioned, you're the one who sounds angry.


I can only speak for myself, but I'm running Mojave because there are 32-bit apps I need to use for my current project, which any newer version of MacOS will refuse to run.


If you have any interest in compiling 32-bit applications, you'll need to downgrade or set up another machine with High Sierra. Mojave will still run 32-bit applications with a warning, but it cannot build for i386; it will build only for x86_64.


12 core Xeon Mac Pro @ 3.33ghz w/ 128gb RAM and PCIe SSD. Yep, it’s “miserable”. ;)

That said, I’ve upgraded the video cards and applied an EFI “fix”, so it runs later releases with full FV2 boot support (hence the EFI “fix”).


Based on what? CPUs haven’t significantly improved performance for 10 years.


Honestly, not really. Even a 2012 MacBook Pro i5 with 16 gigs of RAM and an SSD is a solid machine.

High Sierra runs well on this device.


It doesn't have to be that way. Linux works great on extremely old hardware


But it is that way, and it's not changing.


Until this year I used a 2009 Macbook Pro. I finally switched to Thinkpad because I wanted to use an OS that gets security updates. It's not miserable at all if you have sufficient RAM.


It actually runs pretty well for most things. The hardware is still good unless you're doing high-intensity computing or trying to play AAA games on it (for which it was never good anyway).


The consumer doesn’t purchase a lifetime support contract when they buy the hardware. Should MS continue to support XP for home users?

That said, Apple did provide some support for iOS 12.x recently, which was unusual.


Security is not support. If they distributed defective software (with security vulnerabilities), they should fix reported vulnerabilities or at least provide a clear and canonical path for the community to fix it themselves.

Yes, it also applies to MS and possibly many other companies.


> they should fix reported vulnerabilities

One could claim that they do, always, in the newer versions of the OS, which are almost universally incremental improvements based on the older versions.

Not supporting old hardware is something that all companies [1] eventually must do, since resources are limited.

I don't think Apple or Windows would be a good choice, if openness is a feature you're looking for.

1. https://lwn.net/Articles/769468/


> Not supporting old hardware is something that all companies eventually must do

I get it, but macOS does not need to do anything hardware-specific to fix security vulnerabilities, they'd only need to patch the old version (which already supports that hardware) and release an update, and only if someone reports a vulnerability. If they planned to group hardware deprecations only every X versions (let's say they'll remove hardware every 5 years), they would have very little work to do to keep those old versions safe.

> I don't think Apple or Windows would be a good choice, if openness is a feature you're looking for.

I agree, but that doesn't mean we shouldn't also push for them to be better. Personally I love linux but I also need mac for iOS development.


You can hack it to do it, but I am wondering what virus/security problem worries you?


I'm worried by many of the security issues fixed by the linked official apple release notes, including CVE-2021-30939, which indicates that it may be possible for a computer to be compromised by opening images or maybe even accessing its metadata, and many privilege escalations allowing user apps to acquire kernel privileges.


7 vulnerabilities with impact "may be able to execute arbitrary code with kernel privileges" :-(

Are these things in Objective-C or Swift? Does Swift memory management make buffer overflows and use-after-free mistakes harder to make?


There is no Swift or Objective-C in the xnu kernel, so the answer to the first question has to be "no". To the second question, Swift is generally safe against these bugs except (1) in the face of race conditions and (2) when the developer uses things that have "unsafe" in the name.
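A small, generic sketch of that distinction (not tied to any Apple code): plain Swift collections trap on out-of-bounds access, while the explicitly named Unsafe APIs reintroduce the C-style hazards.

    let numbers = [1, 2, 3]

    // Safe by default: an out-of-bounds index traps deterministically at runtime
    // instead of silently reading adjacent memory.
    // let oops = numbers[3]   // fatal error: Index out of range

    // Opting out of safety is explicit and easy to grep for:
    numbers.withUnsafeBufferPointer { buffer in
        // buffer.baseAddress is a raw C-style pointer; reading past buffer.count
        // here would be undefined behavior, exactly as in C.
        if let base = buffer.baseAddress {
            print(base.pointee)          // prints 1
        }
    }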


ObjC and Swift are just application-level programming languages. The XNU kernel for macOS is written mostly in C with some C++. Anyway, the Swift compiler is written in C++ (last I checked) and ObjC is a superset of C anyway. Everything done with ObjC can be done in C if the developer is feeling adventurous enough.

https://github.com/apple/darwin-xnu

C 82.5%, C++ 8.7%, Python 2.3%, Roff 2.3%, Assembly 1.4%, HTML 1.1%, Other 1.7%


Yes, Swift is memory safe in this regard (as long as the Unsafe... APIs aren't being used); however, most code on macOS is still C, C++ and Objective-C.


It's probably less to do with buffer overflows and use-after-free in these applications, and more to do with the underlying OS APIs, and I assume most of the OS is still C, C++ and Obj-C.


Regarding WebKit, where there are several CVEs: it is written in C++.


I'm still a few OS releases behind so I'm not used to paying attention to updates, but does this seem like an atypically high amount of RCEs?


What do you upgrade for? Using Mac for music production means never updating immediately, and maybe not at all.


I use the Nvidia GPU drivers, so I'll upgrade if Apple ever agrees to sign Nvidia's binaries for new OS releases (fat chance!)


Just posting this if you're interested: http://dosdude1.com/mojave/ . Never thought I'd suggest hackintosh resources to a Mac user, but if you need any of those new features: https://www.tonymacx86.com/nvidia-drivers


> never thought I'd suggest hackintosh resources to a Mac user

The first generation Mac Pros (2006/2007, with 32-bit EFI) are like this too and need to be basically hackintoshed to boot Mac OS X 10.8 and later: https://forums.macrumors.com/threads/2006-2007-mac-pro-1-1-2...


I heard from someone who did that that you lose some functionality. I wonder if it will exist with the M1 as well.


Sorry I'm not quite sure what I'm looking at. Does this patcher let you use the High Sierra Nvidia drivers in Mojave?


I think it's the linked drivers. Try it on an external drive first.


Nope, they all have this many :)


If it works for you, maybe it's better to wait. Monterey is the trashiest version of macOS I've ever seen. Full of bugs.


Such as? I've been using it daily since release and honestly haven't had a single problem.


Lucky you!

Here you can find some examples: https://reddit.com/r/macbookpro/

I'll not go into my personal relationship with Monterey, since only global statistics matter, but I can say that right now I'm trying not to accidentally reboot my system, because one of the issues disappeared on its own and I don't know what caused it.


Not sure how that sub-reddit looked during the Catalina days, but I can't imagine it had fewer bug reports. I don't see anything particularly serious either from scrolling quickly. Your experience certainly sounds like a shitty one though.


This seems to be said every release, and for a while it was definitely true, but I think Catalina was a low point and it’s improved since then.


Worse than Yosemite?


So many RCEs in this release.

Apple used to be so smug about not being vulnerable in those 2000s commercials. Now that they have reached critical mass, their OS is equivalent to Windows.


Thought the exact same thing. Karma's a bitch...


> their OS is equivalent to Windows.

Yeah, they're basically identical, except macOS doesn't even have Defender.


Mac keeps a list of (hashes of) apps that are not allowed to run. It’s updated constantly and synced to your computer, so in effect they can remotely disable malicious software for all users all at once.

This was a big deal a few years ago when (IIRC) a bug caused it not to cache the list long enough and it started causing problems when it couldn’t phone home. Aside from that it’s always been a perfectly seamless process.


> Mac keeps a list of (hashes of) apps that are not allowed to run.

That's the XProtect blocklist, it's been around pretty long and hasn't caused any issues that I'm aware of.

> bug caused it not to cache the list long enough and it started causing problems when it couldn’t phone home

What changed is that since Catalina macOS additionally makes a synchronous check the first time you open an app. For certain things that are not code signed (eg. shell scripts), it checks every time. This can cause multi-second delays when launching apps or executing commands in the shell. (see https://sigpipe.macromates.com/2020/macos-catalina-slow-by-d...)

To my knowledge, this issue has not been fixed and is still a problem in Monterey (first time app launches on my M1 Max sometimes take several seconds).

It's possible to deactivate the check for Terminal (by marking it as a "Developer Tool" using a checkbox that sometimes shows up in System preferences, and sometimes doesn't). I'm not aware of a way to avoid this slowdown in Finder without disconnecting from the internet.


I'm seeing 42 CVEs; of those, 24 are solved at a foundational level by memory-safe languages (I'm talking out-of-bounds, use-after-free, race conditions/locking issues).

I wonder how long the same story will repeat before the balance shifts in favor of just rewriting in more modern languages. It's expensive work with a long ROI, but sounds like a lot of these are in foundational libraries that you want to be robust in the long run.


Alternatively, one can do the same thing that Mozilla does for a few components in Firefox [1]. That is, sandbox C/C++ libraries/components at compilation time so memory-safety bugs cannot escape the sandbox. The big plus is that this avoids a code rewrite, for the price of slower execution due to extra checks in the generated code.

This is especially applicable for various parsers, which are typically self-contained code that is not performance-critical but very prone to bugs with nasty consequences, as the article demonstrated again.

[1] - https://hacks.mozilla.org/2021/12/webassembly-and-back-again...


Apple is indeed doing both, but it turns out that sandboxing is an attack surface just like any other code.


It's interesting to see the number of bugs related to race conditions here, and it would be very interesting to know if Rust's protection against race conditions would have been relevant ... there are plenty of good modern memory-safe languages, but not many with Rust-style freedom from data-races.


> [...] there are plenty of good modern memory-safe languages, but not many with Rust-style freedom from data-races.

Haskell's Software Transactional Memory (and its general focus on immutability-by-default) is another interesting approach towards the same goal.

Software Transactional Memory (STM) is perhaps much easier to get started with than Rust's model. (Though the same cannot be said for the rest of Haskell.) The failure mode with STM is that your stuff runs slow if you don't know what you are doing.


STM is super exciting, but also an old concept. Is the runtime penalty too high, or is there any other reason it has not gained popularity? I remember Intel touting CPU STM extensions back in the day.


The ROI on rewriting existing code may be questionable, but the long-term ROI on writing new projects in safe languages seems unassailable. Choosing C or C++ for new infrastructure projects is madness at this point.


Depends on what it's for. I wouldn't mind any unsafe, non-networked device; if they really want to hack my programmable thermometer with no network access, or my wired keyboard, they can be my guest! If my server is on an intranet, I don't have any fears either; without memory safety hacks can happen, but Spectre and Meltdown haven't been seen in the wild.

For OSes and browsers, though, I agree completely. For other infrastructure, as long as it's not always connected to the open internet, it doesn't seem to be much of a threat. And even though I don't care about memory safety, the performance of Rust tools impresses me! I wish people talked about the performance of Rust more than its memory safety; I don't care as much about that, but everything being faster? Who wouldn't want that? I made this thread a few days ago. https://news.ycombinator.com/item?id=29456115


I remember quite a few posts on the Firefox blog about how Rust allowed them to write faster programs.

See eg https://hacks.mozilla.org/2019/02/rewriting-a-browser-compon...


They could write better-performing programs, as well as do it faster than in C. The thread I linked to has GNU coreutils in Rust, for instance; it has excellent performance and every CLI tool I used is much faster. This is another example if you are interested. https://github.com/BurntSushi/ripgrep


The thing is, code tends to be reused across projects. If you write a library or a utility program and it's full of holes, that's only OK if you're sure it will always be used in a "safe" context with no untrusted input. Who really wants to commit to that?


Any non-networked device would be easy to commit to. Nobody will force you to use such a library if you are worried about holes. I think internet-connected software like OSes and browsers matters, but for these devices I not only don't care, I WANT them to be easy to hack so I can run custom software. I am glad that the PSP had holes, I am happy camera firmware had holes, and I am also glad that Android had holes. I've never been hacked on it once, but I sure did hack it myself!

The threat of most security issues is vastly overblown; Spectre and Meltdown don't exist in the wild, but they crippled all CPUs just in case. Security at what cost? I disabled it. I have no need to make my computer slower for a virus that will never affect me; I "wear" an updated browser. ;)


Also, if you have a large enough corpus of random enough input, you are bound to hit similar bad cases as if you had some malicious input.

More pithily: Hanlon's razor says 'never attribute to malice that which is adequately explained by stupidity', but the reverse is also true: enough stupidity, or just randomness, can look like malice.


I disagree because that still drives demand for C/C++ developers and tool chains.

Nothing new should be written in these unsafe languages.


I'd rather have choice rather than a draconian ban.


My experience writing in Swift (a mostly memory-safe language) is that it's nice for high-level stuff, but as soon as you have to interface with low-level or legacy code it quickly becomes unmanageable. What would be two or three lines of C code quickly turns into 10 or more lines of very hard to understand Swift code full of types named UnsafeRawBufferPointer or similar, and it's doubtful that thing is in any way safer than a char* with a manual range check.
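For example (a self-contained sketch, nothing Apple-specific assumed), reading a single little-endian length field out of a byte buffer the way C code would:

    import Foundation

    // C equivalent is roughly: uint32_t len; memcpy(&len, buf + 4, 4);
    func lengthField(in data: Data) -> UInt32? {
        guard data.count >= 8 else { return nil }        // manual range check, still needed
        return data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) -> UInt32 in
            // loadUnaligned(fromByteOffset:as:) needs a recent toolchain (Swift 5.7+);
            // older code falls back to even noisier memcpy-style dances.
            raw.loadUnaligned(fromByteOffset: 4, as: UInt32.self)
        }
    }

    // e.g. lengthField(in: Data([0, 0, 0, 0, 0x2A, 0, 0, 0])) == 42 on a little-endian host

The pointer gymnastics are only as safe as the hand-written range check, which is the point being made above.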


>Choosing C or C++ for new infrastructure projects is "madness" at this point.

If Spartans were alive they would be using C or C++.


It's still way easier to find experienced C++ coders than Rust coders. :\


An experienced C++ programmer can learn Rust quite easily. The hard part (ownership & lifetimes) is what experienced C++ developers should already know by heart, but the primary difference is that the compiler enforces the same rules.


It is hard to write in Rust style in C++ since many libraries are hostile to it.

Iterators require at least two mutable references or mixing mutable and non-mutable references. Move-only types require a lot of boilerplate, and accidental use of a moved-from object is not detected by the compiler. Using std::variant feels like a trolling from the language designers who refused to provide proper type-safe unions into the language.
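For contrast, a generic sketch of what a built-in type-safe union looks like in a language that has one (a Swift enum with associated values; the example is illustrative, not from the thread):

    // A tagged union with compiler-enforced exhaustiveness, roughly what
    // std::variant approximates in C++.
    enum ParseResult {
        case value(Int)
        case error(String)
    }

    func describe(_ result: ParseResult) -> String {
        switch result {                      // the compiler rejects a non-exhaustive switch
        case .value(let n):   return "parsed \(n)"
        case .error(let msg): return "failed: \(msg)"
        }
    }

    print(describe(.value(42)))              // "parsed 42"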


> Using std::variant feels like a trolling from the language designers who refused to provide proper type-safe unions into the language.

And, alas, deliberate trolling would be preferable to reality.

Deliberate trolling would indicate a conscious design. C++ accumulated cruft over the years in what looks, in retrospect, like Brownian motion, and almost never shed any.


Rust just kinda sucks to develop in, and the library support isn’t anywhere near C/C++


Isn't Swift, which came from Apple, memory-safe in a single thread environment? It can't prevent data races at compile time like Rust, but I thought it was far safer in practice than C/C++/ObjC where I suspect all these vulns originate.


Absolutely, but Apple clearly hasn't decided to switch their foundation libraries to it. I suspect it'll be a decade-long effort, but it can't happen too soon.


Swift is their "user-facing" language because this trendy stuff with "smart" languages is apparently very important for PR. I would be surprised if most of the OS itself is written in something other than C/C++/ObjC.


I take it you're not a big fan of Swift, but there's little need to veer over into borderline conspiracy theories.

I do think Apple is honestly committed to Swift over the long term, but it takes loads of time and care to replace the foundations of a building that's in daily use.


I'm not a fan of these "smart" languages in general. I'm one of those who believes that the syntax of the language itself should be dumb, but all the smarts should be in the standard library. Like in Java for example. It's much easier to reason about what the code actually does when the language is dumb and calls into its standard library are explicit. It's bad when some benign-looking syntactic constructs trigger some kind of complex behavior.

They might be committed to it, but I'm doubtful that it's at all possible to write e.g. parts of the kernel in it. And maybe, just maybe, they should direct all their efforts toward rewriting the OS in a safer language instead of making their UIs uglier with nonsensical paddings and messy-looking icons that pretend that there's no pixel grid.


Metal and Driver Kit came after Swift, they happen to be written in a mix of C++ and Objective-C, with Swift bindings.


Fair point. Is it fair to guess that development started before Swift was ready for prime time, and before Apple's own devs of these were up to skill?


Only a couple of macOS frameworks are pure Swift, most of them are actually bindings via Objective-C runtime interop.


The dock was rewritten in Swift, a few other parts of macOS have been too.


Oh so I guess that's why I'm seeing "ghosts" of finder windows under the dock sometimes, especially right after taking a screenshot? And they disappear if I switch to finder. I kept thinking "how could they've possibly broken that". Updated to 12.1 today, will see whether that improves anything.

But it would be more interesting to see the parts that deal with untrusted data (file format parsers, protocol handlers) rewritten in Swift.


The Morris worm came out in 1988; this will only be fixed when proper liability is in place.


Eh, only recently did we really have, in the consumer space, the computational and memory margin to seriously consider spending on more memory-safe languages. Certainly in the 80s through 00s, switching to a language that gave you nebulous "safety" improvements wasn't worth losing even 10% performance. The competition would eat you up over it.

(There were absolutely spaces where robustness and safety were the priority, but those weren't consumer, and their cost reflected it.)

Even today, I'm not sure we have the engineering margin to spend on such efforts (time-to-market is still the priority), though I think the pressure to do it is slowly increasing (again, in consumer spaces, which macOS definitely is).


This is simply not true. For example, Ada was much more memory safe than C/C++ and has existed since the eighties.


Ada was not made for the consumer space: it was made for situations where being provably solid was more important than any other consideration.


Nothing in Ada made it impossible to use on PCs or similar devices. And the performance of its compiled code was similar to that of C code, if not better. And some of its features, like the ability to return a dynamically sized stack-allocated array from a function, are still not available in C/C++, or Rust for that matter.

I guess what really made it a niche language was the cost of compilers. DoD vendors in the eighties had already learned how to milk their customers.


Sure. But that's not what the original poster was arguing.


Object Pascal then, created by Apple for Mac OS and Lisa.


Hence why we need liability, everyone pays the same for faulty software.

No one would enjoy paying for a fridge that leaks water, why should they do the same for software.

Bad programming shops taught them that way, that is why.


>Bad programming shops taught them that way, that is why.

Most of these programming languages, when they were created, did not have online connectivity or hackers as a threat. It would be ridiculous to expect them to protect against a threat that didn't exist; it's like calling the Japanese idiots for not building nuclear defenses and blaming their bad teachers for it. In embedded hardware I still don't see the benefit for memory safety.


> In embedded hardware I still don't see the benefit for memory safety.

If anything, safety considerations in embedded devices are even more important than elsewhere, if only for the fact that those devices typically don't get patched. It has to work right the first time. F*cking this up can have dire consequences, depending on what the device is doing.


That's possible, but I don't see it in practice. I could die any second, but my fan controller, my heating pad, and my camera lens are not being hacked or broken from memory issues.


Security and safety in programming languages have been a known issue since the 1950s, as anyone who actually cares about this subject is aware.

In fact it was one of the sales pitches for Burroughs (1961), still being sold by Unisys, which not surprisingly keeps using its safety over classical POSIX as a sales pitch.


They made banking computers, where it seems relevant, and also typewriters from what I briefly looked at. Were the typewriters memory safe too? If they weren't, what was the benefit? Do you see my point?

By the forces of the market, POSIX won because it had more useful features. Personal computers where you were expected to load software yourself and were not connected to any threats? I do not see any reason to care about that even today, and it's not because programmers had bad habits; they made reasonable tradeoffs. Making worse-performing software for the nonexistent threats of that time is not bad programming; over-engineering for nonexistent threats is bad programming. In non-networked devices, I still see no benefit.


POSIX won thanks to free beer; had AT&T been allowed to sell it from day one, history would have been quite different.

It is like 1 euro shops, quality is not what customers are looking for.


No, it won because it was better. You ignore every question I ask about how memory safety is useful or relevant for those uses.


Because UNIX won the same way people flock to 1 euro shops, quality has nothing to do with it, free beer is what counts.

Had UNIX been sold like every other OS from the same decade, it would have been a footnote in the history of OSes.

When one is thirsty any liquid goes down regardless of the taste.


Good idea, let's ban Linux and all the free software, and make all the free ones paid.


> No one would enjoy paying for a fridge that leaks water, why should they do the same for software.

There's no law against licensing bad fridges with the explicit warning that they might leak, is there?

Many people choose to go with fridges that are higher quality than the absolute minimum. Some people also choose to pay for extended warranties.

Many other people choose less reliable options, because they have other preferences.

This choice on offer is a good thing.

Why do you want to ban this choice?

Would open-source be effectively outlawed in your favourite world?


Yes there is: when the health of others is at risk, the sanitary inspection will close down the shop.

Whatever one does to themselves on their own place is their own thing, if they happen to land at the hospital due to food poisoning caused by bad refrigeration.

In any case, the fridge was only one example among thousands.


> No one would enjoy paying for a fridge that leaks water, why should they do the same for software.

See, you are suggesting here that customers don't want bad fridge, so they don't buy bad fridges. The problem solves itself.

Why not give customers of software the same responsibility and maturity?


Yeah, not gonna happen when people paid ZERO dollars for the OS.

> No one would enjoy paying for a fridge that leaks water,..

For a fridge, people pay serious money. For software, people dig deep in their pockets and then come back with "Fuck it, I am gonna use open source stuff."


I started using open source stuff precisely because I could expect updates and maintenance not to be constrained by commercial shenanigans.


There's no law against selling software with liability.

People by-and-large _choose_ to license software where the license contract denies liability.

You are free to offer 'proper' liability, and try to charge enough extra for it to make up for your extra costs.

(Or do you want to forbid certain kinds of contracts, so that your preferred kind of contract 'wins' because the competition is banned?)

Do keep in mind that some software does come with liability, and things that are a bit like liability. The latter category is eg when you sell both software and a support contract, and your support people have to work harder when stuff goes wrong.


> There's no law against selling software with liability.

In fact, in many jurisdictions there are laws against (or, more to the point, denying any effect to) the waivers of liability much software comes with.


Is there some Ship-of-Theseus way to do this? Redo each component that needs it as it comes up?


To a certain extent. Though some problems require more thorough surgery.


If you missed the news, the Log4Shell RCE vulnerability in Log4j impacts a memory-safe language (Java).

It's too early to tell (it dropped last Friday), but it will probably be remembered as one of the most egregious vulnerabilities to date due to the sheer omnipresence of Log4j in production Java code and the simplicity of its exploitation. We are talking Heartbleed/EternalBlue/Struts2 vulnerability level here.

The difference in ROI between memory-unsafe and memory-managed languages is not so evident. Usually memory-managed languages have fewer bugs, but those tend to be massively impactful.


Note that I didn't claim all bugs/vulns would be solved by a sweeping use of memory-safe languages. (Just that a distressingly large proportion of the ones in this security bulletin would be)

The log4j bug fits in the "other" category.

Also, it's a fallacy to believe that memory-safe languages aren't that much better because their bugs are worse. All languages can have the worst bugs; it's just that memory-safe languages solved the easier types of bugs, so there aren't as many of them to bring the average down.

It's like thinking that flying is riskier than driving, because plane crashes are so devastating. Driving kills more people overall, but they're spread across more, smaller events, so we're not as aware of them.


I am also wondering how many of the disclosed CVEs are Intel x86 specific, and how many the ARMv8.3+ authenticated pointers would have prevented (even though pointer authentication would not preclude a DoS attack, it would make remote code execution much harder).


Look up how many of these appear in the iOS release notes, perhaps?


It's not clear the ROI ever pays off.

A rewrite will likely introduce plenty of CVEs that weren't present in the original code -- possibly more than are fixed, for many years to come.


I imagine the failure of the C# Windows project looms large in such discussions.


I see a lot of corporations whose employees are getting the credit, so does that mean the bounty goes to the employee or the corporation, or is it shared?


I would assume it depends on your employment contract?


Has Monterey been fine for app/pods/React Native development so far? I swear every time I do a big update it takes ages to get the dev experience working again and I can't afford to lose time this close to Christmas with deadlines looming.


I learned long ago to stay 1 release behind. I update to the latest just before a new one is released. This has served me very well for the last 5 years.


Same for music production plugins.


This frustrated me so much I've given up on the Mac. Bought a Dell this week. Actually quite excited to dive properly into Linux.


Can't find any details on CVE-2021-30939, but the Apple description sounds pretty bad.


Also multiple “arbitrary code execution” vulnerabilities if you play crafted audio files.


It's 2021 and the list is still full of memory bounds checking errors.


Apple, and most other companies in the industry, are still writing code in languages that don't have implicit bounds checking - to say nothing of all the code they wrote years ago that they still keep deploying. Nobody is going to rewrite that mountain in Rust any time soon.
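
For contrast, a tiny sketch of what implicit bounds checking buys you; Java here purely as an illustrative stand-in for any memory-safe language:

    public class BoundsDemo {
        public static void main(String[] args) {
            byte[] packet = new byte[16];
            int offset = 32; // pretend this came from untrusted input

            // In C or C++ this write could silently corrupt adjacent memory and
            // become an exploit primitive; here the runtime checks the index and
            // throws ArrayIndexOutOfBoundsException instead.
            packet[offset] = 0x41;
        }
    }

That check closes off the whole "out-of-bounds read/write" class that fills this bulletin, while logic-level bugs (like Log4Shell) are of course still possible.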


Apple has been providing security updates for iOS 14, but so far there is no 14.8.2, which is strange since all other supported OSes were updated today. Often the same CVEs get fixed in macOS, iOS, tvOS, etc. together. They are pushing iOS 15.2, though. I wonder if that means EOL for iOS 14?


There has been no clarity on the status of being able to stay on iOS 14. Previously, you could stay on iOS 14.8.1 and the system saw it as “fully updated”, with the option to upgrade to iOS 15. Now, any device with Automatic Updates turned on gets updated to iOS 15 automatically. I figure they allowed users to stay on iOS 14 in order to avoid the CSAM scanning (a legitimate concern), but now they are steamrolling everyone onto iOS 15. Does that mean they will be removing CSAM scanning in the future? I will wait a couple of days to see if 14.8.2 or 14.9 appears. If it does not, I think we can consider iOS 14 to be EOL (and that Apple changed their mind about iOS 14).


Has anyone tried this before to patch an older OS X release? PCs are more open, so I assumed (correctly) this would exist.

http://dosdude1.com/mojave/


I wonder how many of those bugs will be fixed in previous versions.


Apple normally continues issuing security updates (but not necessarily fixing other bugs) for around 2 years after a new version of macOS has been released, though I don't believe they've ever formally specified that in writing. Monterey was released on October 25, 2021, so Big Sur should keep getting security updates until October 2023 or so. More than two versions back is generally considered unsupported, I believe.

I see a Big Sur update dated December 13, so it's definitely still being supported. While I haven't done an exhaustive check, it looks like it patches many of the same security holes.

https://support.apple.com/en-us/HT212979


This is what Apple wants you to believe. In reality, Apple routinely skips security patches on older versions: https://www.vice.com/en/article/93bw8y/google-caught-hackers...

The only way to be safe on Apple's computers (and phones) is to update to the latest version the day it comes out.


You can jailbreak and get user patches too.

>The only way to be safe on Apple's computers (and phones) is to update to the latest version the day it comes out.

That has not been the case with iOS 14 and 15; it's a heuristic, but unknown backdoors are a constant threat. Realistically we should say that these devices are never, ever safe.


Err... that article explicitly says that the hole was in fact patched on Catalina, and it even has a link to the notice.

https://support.apple.com/en-us/HT212825


It was patched on Catalina months after being patched on Big Sur, and after being exploited by the CCP against dissidents in Hong Kong (therefore it was a zero-day).


Errr....

CVE-2021-30869 was patched in Big Sur on September 23, 2021

https://support.apple.com/en-us/HT212147

and was patched in Catalina on...September 23, 2021

https://support.apple.com/en-us/HT212825

The article says that the Chinese team said it was on Big Sur, and that the Google team then discovered it was also on Catalina. It doesn't say that Apple didn't patch it on Catalina, or that they patched it "months later".

I would recommend against using vice.com as any kind of authoritative source. On anything.


Looks like you're correct in this instance, thanks! But my general point is still true: Apple routinely skips patches on older versions. I'm aware of a fair bit of proprietary evidence for this.


This page shows the latest updates:

https://support.apple.com/en-us/HT201222

Big Sur and Catalina got corresponding updates today too.


I wonder how many bugs they are aware of and decide not to report.



For a company that demanded Flash be verboten because of the constant security threat, that's a whole lotta arbitrary code execution in WebKit. Hey guys, you can't run this VM in our browser. Tell you what, let's just make our browser do everything it does. We won't run into any problems, I mean, everything's sandboxed, right?


Why doesn't this work? Does their sandbox suck? Qubes OS seems to do it fine.


Qubes uses a VM (without GPU acceleration in most scenarios) as the isolation boundary. That has performance and usability downsides…

The sandbox keeps being tightened each release.


VM performance is excellent, and has been for about a decade. Here's a recent thread comparing native vs. VM with VFIO GPU passthrough (the performance difference is basically negligible): https://old.reddit.com/r/VFIO/comments/n3mjj3/native_vs_vm_b...


> GPU passthrough

That’s just not an option for a regular phone or laptop, which is most of Apple’s market.

Don’t forget about the RAM pressure too.


>That’s just not an option for a regular phone or laptop, which is most of Apple’s market.

Lots of Macs have more than one video card. I disagree: I mentioned it because Apple makes their own silicon and could do virtual GPUs. Memory management for the browser would be an issue, but Apple doesn't seem to care - they just use swap on the SSD - and it would still be cool to have that as an option for single tabs. All web browsing is already a huge pile of JavaScript virtual machines, and they have custom hardware to run it faster, so they might do the isolation via hardware too; right now all their sandboxing is just OS based. Touch ID ran its own OS and took years to crack, and it needs hardware tools to do so.


Flash ran in Safari on MacOS until 2020.


Sure. But it was dead by then. Apple led the charge to kill it, and the main argument given was that Flash was susceptible to past, present and future use-after-free attacks, like the Java VM or anything else that ran as a black box and managed its own memory. And now we're back close to square one on that, which is a natural result of people wanting in-browser functionality that mirrors a desktop experience, including APIs that reach out beyond the sandbox. From Apple's point of view, Safari itself is probably one of their biggest liabilities. If they could eliminate browsing the open web completely, that would solve all these problems (except, obviously, whatever they miss in App Store review).


While security was one of the concerns which Apple brought up in "Thoughts on Flash" [1], it was a relatively minor one. The big highlights were standards compliance and performance.

[1]: https://web.archive.org/web/20100501010616/http://www.apple....


Flash was a nightmare on mobile and made pretty much half of the internet unusable on a smartphone (yes, half - remember those menus in Flash?). It was also indeed full of security flaws that Adobe was suuuper slow to patch, if at all. Eye candy was their priority, not the web's wellbeing.


The way poorly written JS clogs up web pages is worse than Flash ever was. Flash didn't have access to the DOM, except to send messages. Pages now are slower and heavier than ever.

And back in 2010 or so, when the Flash plugin was still available in beta on iOS and Android but was declared "too slow" or too heavy on the battery, I wrote an extremely fast-for-the-time <canvas>-based scene graph to do some side-by-side comparisons of manipulating sprites, animating vectors, masking touch areas, blitting some basic particles, etc. JavaScript performance was not even close to Flash performance on an iPhone. It has really never gotten close, even with much faster processors and with V8, to replicating on canvas what Flash was doing. Only via WebGL has it become possible to get mobile graphics performance in the browser resembling what Flash could do in 2010. So the performance complaint was, as far as I'm aware, a lie.

The security complaints were, indeed, legitimate; and it's true that Adobe sucked at patching them, and Flash was a dangerous point of failure for corporate networks. That being said, it would have been a lot better if Adobe and Apple could have come to an arrangement to base a standard off of it. Instead what we have is a million workarounds to achieve the same effect, and we still have a huge amount of terrible, web-choking code all over the place. It's just in untyped JS, which is worse. Bad code is always bad code. At least with Flash you could kill the process without killing the page.


I didn't say we're making good use of JavaScript either...


>Eye candy was their priority, not the web's wellbeing.

It was very easy to make stuff in it, it was developer and artist friendly (it still has lots of cool features that don't exist elsewhere), and it was a de facto standard, but performance-wise it sucked. I would find it interesting for you to tell me what "the web's wellbeing" is. I would say that analytics, advertising, and (my most controversial opinion) JavaScript are all bad for the web's wellbeing. Every issue you levelled against Flash applies just the same to JavaScript.


I didn't say we're making good use of JavaScript either...


Should we ban it?


I think Apple's end goal in ten years is to ban browsers in general. They're just a way for people to get content without paying Apple for it.


Security was just one reason Apple didn't like Flash.

Also, it was really Flash's own limitations that killed it on mobile (which, in turn, killed it on desktops). Don't forget it was on Android for a while and aggressively marketed as an advantage over iOS. But it was also a poor UX (slow, power-hungry, designed for mouse & keyboard rather than touch) and a proprietary third-party black box.

Apple didn't so much kill Flash as just be the first to see it coming and act on it.


As someone who was developing browser-based games in that space at the time, I can say there was hardly a better option for touch interaction then. True, lots of older hover/highlight code just didn't work, and that was a pretty common thing with Flash. But all UX in Flash was built by coders who are now working in JS. It was entirely up to coders what to do within the box. Now the box is the whole window of the browser.


> there was hardly a better option for touch interaction then

Native. That's what Apple wanted (and still wants) you to be doing.

To address your point more generally, though: that HTML5 -- or some other technology -- was as bad as Flash in some ways is not a sufficient reason to have kept supporting Flash. Flash was worse than the alternatives in some ways (and critically so in some cases, e.g., wrt power) and didn't offer any unique killer capabilities.


It had a couple of killer capabilities (to me). I did a lot of work with the Starling engine and Away3D. And although this conversation is about browsers, a lot of my work was deployed via AIR, which still limps along, but I wouldn't invest in it now. Being able to deploy the same 2D/3D games on any platform and on the web with native GPU access is a killer app - and the best contender now is probably Unity. The ability to drive fast vector animations without recourse to SVG or something, all within one dev environment and without many dependencies, provided an incredible workflow from artists to coders. Certain things that were taken for granted in Flash, like the native engine for determining mouse/touch position over layers and layers of different vectors (not bounding boxes), are still incredibly hard to replicate. Whole new platforms had to be written in JS to support these kinds of occlusions and to improve rendering performance to a tolerable point. Things we have now, like PixiJS, would have absolutely destroyed the battery life on a 2010 iPhone.

You're spot on that Apple always wanted native code, and still does, but what small studio has time to write everything three times? The idea of deploying and maintaining totally separate code for a casual game on Android, iOS, and the web is a deal-breaker for a 5-person team, and putting all your energy into one walled garden is not what's best for the developer.



