Hi, CEO of Kolide here, and one of the reporters of CVE-2021-30987.
This vulnerability is basically an information disclosure issue that enables you to get BSSIDs from the Airport Utility without the appropriate location tracking entitlement. Even worse, the OS will not even flash the "location being accessed" indicator in the status bar when geolocation is determined this way.
Essentially, any app can shell out to the CLI tool, parse its output, and then feed the resulting BSSID to one of the many external BSSID-to-geolocation services to determine the device's location.
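For the curious, here is a rough sketch of that flow (Swift; it assumes the commonly known path of the private airport CLI and that its -I output contains a "BSSID:" line, which is exactly the information this update stops handing out to unentitled callers):

    import Foundation

    // Illustrative sketch of the flow described above; the path and output
    // format of the private airport CLI are assumptions based on its commonly
    // documented behavior, not a stable interface.
    let airportPath = "/System/Library/PrivateFrameworks/Apple80211.framework/Versions/Current/Resources/airport"

    let process = Process()
    process.executableURL = URL(fileURLWithPath: airportPath)
    process.arguments = ["-I"]   // prints info about the current Wi-Fi connection

    let pipe = Pipe()
    process.standardOutput = pipe
    try process.run()
    process.waitUntilExit()

    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(), as: UTF8.self)

    // Pull out the "BSSID: aa:bb:cc:dd:ee:ff" line. On patched systems this is
    // redacted unless the caller holds the location entitlement.
    if let line = output.split(separator: "\n").first(where: { $0.contains("BSSID") }) {
        let bssid = line.split(separator: ":", maxSplits: 1).last.map { $0.trimmingCharacters(in: .whitespaces) }
        print("BSSID:", bssid ?? "n/a")
        // From here, an app could send the BSSID to any BSSID-to-geolocation web service.
    }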
At Kolide we are huge believers that device geolocation should not be accessible to third-party programs without end-user prompting (even ones managed by MDM), so we aggressively seek out gaps in the TCC authorization model (or other OS issues that undermine the tenets of https://honest.security) and report them whenever we find them.
I reported this vulnerability on January 17th 2020.
Edit: By the way, I am not really sure why the CVE was withdrawn. To me this is a pretty serious information disclosure vulnerability, and everyone should upgrade to this release as soon as possible.
Why is Apple referencing CVE-2021-30938 and CVE-2021-30987 in their listing of security updates when they are not searchable in the National Vulnerability Database? Their status there is listed as "* REJECT * DO NOT USE THIS CANDIDATE NUMBER. ConsultIDs: none. Reason: This candidate was withdrawn by the CVE program. Notes: none.".
Can anyone give more clarity on what is going on?
No luck with High Sierra though. Its last update was in Nov 2020.
Apple is notable for providing ZERO official guidance on how long various versions receive security updates. Officially they just say you must be on the latest version. At the moment they're still releasing updates back to Catalina on macOS and iOS 12 for all the devices still stuck on that version. But they make no claims on how long that will continue.
They seem to support the current version and the two before it. So Mojave is not getting support, but Catalina is at this time. But be careful what you wish for -- the last Catalina security update broke software using OpenGL-accelerated graphics on Apple computers with integrated Intel HD 4000 graphics. This includes Google Earth, Google Street View, Autodesk, RStudio, and many web apps. So a lot of people have had to revert to older unsupported macOS versions just to get any work done.
Yup, this is a well-known secret among anybody who supports Macs - Apple typically supports the latest release and the two immediately preceding it. iOS is a different story, though, as they sometimes go back a few versions to fix CVEs.
How does one know that Big Sur, for instance, isn't already compromised by one of those wonderful code-execution capabilities, given that they are only providing the update now?
Not sure what you're getting at? They released the Monterey, Big Sur, and Catalina security updates all at the exact same time. They're all vulnerable, and they all had patches released at once.
Macs running classic Mac OS were famously unstable for professional work. I got used to restarting my Mac when getting a cup of coffee as preventative maintenance.
The few viruses that were around were mostly annoyances.
Some of the professional software was also famously unstable. Generally speaking, saving frequently helped. (Quark, I'm speaking of XPress 3 and its memory leaks, guaranteed to strike when it really hurts…)
That's not just Mac OS and professional software either. Windows did that too. I was very young back then, but I do remember being angry at the computer crashing and losing whatever I was doing, only to be told "it's your fault, you should save your file more often next time". It becomes a sort of subconscious habit really quickly.
The important point being: once you saved, the working memory was also reset to a sane state and you could keep going for days, or weeks. Saving frequently not only kept the injury added to the insult of a crash in bounds, it also prevented those crashes from happening at all.
What counts as professional? I think my friend had a G3 that never had issues as a student.
I just used one in "computer class" to play crappy math games that hurt my eyes, but they never crashed, and I wasn't a professional. I used Photoshop in 2003 or so, I think it was OS X by then, and never had an issue, if you call Photoshop and internet usage "professional" work.
You can't run classic Mac OS programs on Snow Leopard with Rosetta. Rosetta just let you run PPC Mac OS X apps on Intel Mac OS X. So it was just as stable as Mac OS X. Even with Classic on Mac OS X for PPC you'd just crash the Classic environment at worst, and not the whole OS.
It's not as simple as that. We ran a tight ship, extension-wise.
There were plenty of bugs in QuarkXPress, Photoshop and Illustrator to make our Macs bomb, and the fundamental lack of memory protection, process isolation and preemptive multitasking in Mac OS meant that the whole system was wobbly from the bottom up.
Also, my post wasn't meant as a comparison between Win95 and Mac OS. I just responded to a question about what Mac OS was like to work with on a daily basis in the 90s. Win95 wasn't around for half of it.
There was also the thing with virtual memory. Some apps did run with it (so you had "more memory") and some didn't. Enabling or disabling it meant a reboot. So you rebooted based on which app you needed to run.
Woz has a chapter in his book about how a stable system magically became unstable after installing IE. "Without making accusations," he says, but he goes into some detail on how IE being installed caused a lot of chaos.
I haven't ever used classic Mac OS when it was current, and I don't know anyone who did, but I do remember Windows 9x crashing and locking up being an almost daily occurrence. PCs came with dedicated reset buttons for a reason!
My memory of Mac OS 7.1 was that it was rock solid. When it did bomb it wasn’t a surprise — you knew it was coming because you’d been messing around in ResEdit or playing with the debugger.
I did, it was all intranet, I mentioned this in another post, but they didn't have the same threats as we do today.
>PCs came with dedicated reset buttons for a reason!
They still do! Windows 9x did suck; Windows NT, however, was amazing -- it is like the most Linux-like of all the Windows versions. How could they release Windows ME...
They have other issues now, mostly performance, but not as many crashes at least. It's like saying all modern Intel CPUs are descended from Core M, but I don't know if it adds any relevant information; it's mutated a lot and it's very different, but at least it doesn't crash like 95, 98, 98 Plus! and ugh ME.
I think they do. There are machines for which there isn't any supported version, though.
That’s not different from the situation with Windows. An Intel 386DX CPU with 4 MB of system RAM and 50–55 MB of hard disk space won’t see Windows security updates, either, anymore.
What is different is that Apple is less clear about support periods and often more aggressively leaves older hardware behind.
I don’t think you can require them to support things forever, but requiring everyone to be clear about it at time of sale, or requiring X years of support would be good ideas, IMO.
> That’s not different from the situation with Windows.
Yes. And that is also part of the problem. They also don't patch reported vulnerabilities, nor provide a clear path for the community to fix things themselves.
> What is different is that Apple is less clear about support periods and often more aggressively leaves older hardware behind.
Yes, that amplifies the problem a lot. Many of the abandoned machines are 10 years old but still very well suited for daily use. They have 64-bit Core i processors with decent amounts of RAM (such as 8 GB).
>I don’t think you can require them to support things forever, but requiring everyone to be clear about it at time of sale, or requiring X years of support would be good ideas, IMO.
Why can't you require them to support things forever? Why not? It was their choice to completely lock down the devices they sell.
I think as long as a device is locked and you cannot get root access and install your own software, of course you can require them to support those things forever, at least until they provide an unlock for them, no?
That is why locked devices without the ability to get root access and full control over them are a problem. At some point they simply stop providing software while blocking you from providing your own software.
Perhaps it's time to question this practice and discuss potentially making such devices illegal? At least when the company doesn't provide software?
Shouldn't Apple then be made legally obligated to provide root access for the devices it no longer supports?
Otherwise, what is it? You buy a device, and then it stops functioning just because they decide so. Doesn't that look like some kind of fraud/scam? What is it called, then?
I mean, if one owns a computer/iPhone and cannot install one's own software on it, since those devices are locked and Apple does not provide their software with security updates, what should the owner of the machine do?
>At some point they simply stop providing software while blocking you from providing your own software.
You knew this when getting these devices. You didn't buy it and expect to be able to run Android on an iPhone, did you?
> Perhaps it's time to question this practice and discuss potentially making such devices illegal? At least when the company doesn't provide software?
Why? I don’t see a huge benefit. I don’t love it, but it’s not like you can’t run old versions or it stops working. Do you expect them to give you software updates forever once you buy it? Apple is one of the best at updating mobile.
If it were up to me I'd like to have the paid-updates model again; it would stop forcing upgrades. I'm not fond of the security-comes-with-features model, and I think it would force them to have more support for other versions, since they expect everyone to always update.
Then again, if you don't like the product, don't buy it. Many Android phones let you do all of that, and once I set up an iMessage server to route it I'll switch completely.
>I mean, if one owns a computer/iPhone and cannot install one's own software on it, since those devices are locked and Apple does not provide their software with security updates, what should the owner of the machine do?
What security problem are you worried about? I never update iOS from the version it came with and have never been hacked, ever. I'd love to hear of anyone with that experience; I have never heard of it happening once, same with my Android devices, although I root them. Maybe my ad blockers, blacklist, or the VPN on mobile prevent it anyway; on my home network I rely on my router to protect my devices.
Do you have a realistic fear -- which exploit? Why is it scary? Best practice is to stop assuming you have any safety, updates or not; there are lots of unpublished backdoors and lots of ways to hack you if they really wanted to.
> That is why locked devices without the ability to get root access and full control over them are a problem. At some point they simply stop providing software while blocking you from providing your own software.
All of the Macs stuck at High Sierra can also run Windows or Linux.
I do not think I did. See above ... "I mean, if one owns a computer/iPhone ... "
>Macs can run another OS by design.
If you are talking about previous models, then to a certain degree yes, except for GPUs, if I am not mistaken.
But the current M1 Macs, as far as I understand, don't boot anything else unless they boot first from the internal storage and only then load an "external OS loader", which also has to be authorized at least once by Apple.
You can say that "Macs can run another OS by design," but a design in which they have to authorize everything while not providing full specs of the HW interfaces for "another OS" is not something I would call that.
>a design in which they have to authorize everything while not providing full specs of the HW interfaces for "another OS" is not something I would call that
When did Intel release the full specs for the hidden OS that runs on the Management Engine in x86 chips?
>[Intel] processors are running a closed-source variation of the open-source MINIX 3. We don't know exactly what version or how it's been modified since we don't have the source code. We do know that with it there, neither Linux nor any other operating system has final control of the x86 platform.
It can reimage your computer's firmware even if it's powered off. Let me repeat that. If your computer is "off" but still plugged in, MINIX can still potentially change your computer's fundamental settings.
That's simply not true. Apple allows booting unsigned/custom kernels on Apple Silicon Macs without a jailbreak. This is according to the Asahi Linux docs.
"Strictly speaking, the things can boot off of DFU (USB device mode) too, but to make that useful for regular boot you need to ask Apple, as currently you cannot boot a normal OS like that as far as I know, only their signed restore bundles (which is how you fix an M1 Mac if you wipe the SSD)."
(https://news.ycombinator.com/item?id=26116017)
I think that's understandable, it's primarily a security chip so if you're ever going to lock anything down, that would be it. OK, from a purist perspective that's arguable, but I think it's also arguable allowing unsigned code would make it less secure. Anyway you can still run another OS on a Mac with a T1.
"macOS on ARM is very clearly a descendant of the way iOS works. It's just macOS userspace on top of an ARM XNU kernel, which was already a thing. The way the boot process works, etc. is clearly iOS plus their new fancy Boot Policy stuff for supporting multiple OSes and custom kernels." https://news.ycombinator.com/item?id=28181976
I think after the M1 it will be more and more confusing to see them as too separate.
I would question Apple's fast and furious update schedule. Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey. The operating system itself has probably seen very few changes since NeXTSTEP's first release in 1989; it only appears different, and the vast majority of changes are in the graphical interface. It begins to look like the software philosophy pioneered by Adobe: buy the same software every 2 years. But Apple upped that game to every stinking year. Why? What is the benefit to the user? Wouldn't it have been far better not to place so much attention on a constant, never-ending stream of so many new features so few use, and instead focus on the necessary features everyone uses, as well as security and fixing the bugs?

I also don't understand Apple's philosophy of abandoning compatibility. Was 32-bit compatibility really all that crippling to Apple's 64-bit endeavors? Or is it just an easy way to break production software to force the purchase of new software? Imagine if the automobile industry changed fuels every 10 years starting in the 1940s, such that with every decade that passes, any car older than 10 years would no longer function for lack of access to fuel. I don't think we would have let that stand. And yet we eat what Apple puts on our plate.

I like macOS, but this "new OS every year!" sucks, has no benefit to the user, and it is, fundamentally, a lie.
>Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey. The operating system itself has probably seen very few changes since NeXTSTEP's first release in 1989; it only appears different, and the vast majority of changes are in the graphical interface.
Well, that range covers three different CPU architectures(!) and at least one complete file system rewrite.
But your point is fair - to the layperson there hasn't been much change.
> Well, that range covers three different CPU architectures(!)
The exercise proves that hardware is irrelevant to an operating system
> and at least one complete file system rewrite.
Which filesystem was completely rewritten, and why? Unless you're referring to APFS... which wasn't rewritten, but developed.
> But your point is fair - to the layperson there hasn't been much change.
Pretty sure not a whole lot has changed for the experienced expert, either. What has incrementally changed (but not so much it's unrecognizable) is the graphical user interface. A GUI is not an operating system.
>Unless you're referring to APFS... which wasn't rewritten, but developed.
You know what I meant and the distinction is irrelevant in this case, so don't be that guy. The point is, a lot of impressive engineering resources have gone into the OS over the years. In particular the seamless transitions (due to Rosetta x 2) from PowerPC -> Intel -> Apple Silicon (among other things), so the statement that macOS 10 -> 12 has little fundamental difference is a bit hyperbolic.
Anyways, I'm not sure why you're trying to start an argument but I'm not interested. You sound way more uhhh... passionate about this topic than I care to be.
It's really ok if someone disagrees with you. It doesn't make them argumentative.
NeXTSTEP was compiled to run on 68k, x86, RS6000, and SPARC; it didn't change the version number. Tiger and Leopard ran on PowerPC and Intel. Snow Leopard ran both 32-bit and 64-bit with distinct kernels. Big Sur and Monterey run on Intel and ARM. In each instance, these are the same OS at the same version running on different platforms. But in each instance, it is still the same operating system, regardless of any differences in the code on any particular platform. The hardware is irrelevant.
There are technologies introduced and some simultaneously deprecated at each increase of the macOS version number. Most of this has to do with the user interface, but there are some changes to the OS having little to do with the user interface. These changes, however, are incremental point increases, at best, not whole-version-number advances, which Apple has inexplicably ticked forward every year since 2012. What are the differences between Monterey and Mountain Lion that warrant their distinction? Ignoring platform, shown irrelevant above, they merely appear very different, along with a few inconsequential new or removed features, but they are more alike than they are different. So I think Apple has employed version numbering as marketing rather than marking milestones of significant advancement. Microsoft has done this too, in spades. Sure, Vista and Windows 8 are unrecognizable, but they are both still fundamentally NT, and not distinct operating systems.
>It's really ok if someone disagrees with you. It doesn't make them argumentative.
No, but tone does. As does uncharitably interpreting comments.
And I never said hardware itself mattered; I said the level of software engineering in seamlessly transitioning macOS across different architectures was impressive and resource-consuming. Rosetta (executed flawlessly twice) in particular is an incredible feat. As were Time Machine (effortless backups) and full disk encryption (seamless and transparent) at the time they were released. Not world-changing, sure - but what OS feature in the past 20 years is?
>Apple has inexplicably ticked forward every year since 2012. What are the differences between Monterey and Mountain Lion that warrant their distinction?
You've moved the goal posts. The comment I replied to was:
>Fundamentally there is very little difference between Mac OS X 10.0 Cheetah and macOS 12 Monterey.
Anyways, I'm no Mac fanboy so it feels weird trying to hype MacOS up. My only point was that the above statement comes across as more than a little hyperbolic.
I'm not entirely sure how to counter your argument since you've provided no criteria for how much the OS should have changed over the years. You've only said it hasn't changed enough. That's a bit vague. What OS meets your criteria? What obvious feature is MacOS lacking? What's your gold-standard benchmark for what an OS should be?
To be clear, I agree with you that many (most) MacOS changes are marketing-driven fluff. But I also fail to see any significant advancements in other OS's that MacOS sorely needs.
Well, calling a dissenter argumentative, or griping about tone, is actually an ad hominem fallacy, because it ignores the argument entirely and personally attacks the man.
In fact, you did list migrating platforms to support macOS versioning, which can be interpreted as saying the hardware matters enough to advance versioning.
Goal posts weren't moved farther apart; they were moved closer together. This is not the same thing as the meaning of the idiom "moving goal posts," especially because no goal was met.
> since you've provided no criteria for how much the OS should have changed over the years.
This is actually a good example of moving goal posts, because it is beyond the scope of supporting the argument. It is perfectly ordinary for something to be shown to be less than ideal without specifying what a perfect or better ideal would be.
> But I also fail to see any significant advancements in other OS's that MacOS sorely needs.
This is a straw man and skew to the argument, which is simply that calling annual upgrades a new operating system due to new GUI features doesn't make one OS distinct from its previous incarnation, but they are instead, at best, incremental point increases of the same operating system (which is actually what Apple does, 10.5, 10.6, etc.). I'm not sure where you got the notion that I was arguing anything beyond that, arguing anything beyond decrying the Adobe-style marketing strategy of labelling the same software as though it were new and innovative. With Mac OS X, OS X and macOS, the kernel hardly changes and the userland barely changes if at all, and even the window manager barely changes. What mostly changes is a few GUI features deprecate while adding a list of new GUI features; what changes between macOS versions is superficial and incremental.
And it is merely my opinion, not a claim of some fundamental truth. So let me tell you what you're going to do, you're going to have better days and not sweat it so much.
>Well, calling a dissenter argumentative, or griping about tone, is actually an ad hominem fallacy, because it ignores the argument entirely and personally attacks the man.
Well, I didn't ignore the argument, nor did I attack you personally (your tone is not your personality, is it?). So you're misusing "ad hominem".
Second, on a human level, I pointed out that your argument came across as curt and unnecessarily argumentative. You can choose to do with that feedback what you will. As in "hey, sorry, I didn't intend it like that" or "NAH-Nah you committed a technical foul therefore I win the argument!". It's your choice, but when people make observations about how you're coming across - it might be wise to assume they mean that genuinely.
>Goal posts weren't moved farther apart; they were moved closer together. This is not the same thing as the meaning of the idiom "moving goal posts,"
You're plainly wrong. Further apart or narrower doesn't matter - what matters is you changed the scope of your argument to make it easier to support. Your original comment covered 2001-2021, you then casually changed it to 2012-2021 without explicitly acknowledging you were changing the argument.
That is textbook goal post moving. Textbook.
The rest of the argument is simple - you're saying MacOS hasn't changed enough. I'm saying that, relative to its peers, it has changed about the same amount. If you can tell me some great advancements that other OSes have had during the same time frame that MacOS lacks, that would go a long way toward supporting your argument. I'm actually curious about this (since, for example, I don't follow Windows too closely).
Similarly, I can say Samsung (or whatever) TVs haven't advanced that much in the past 20 years. But if I can't point out a single feature or advancement they are lacking compared with their peers, then it's not really a strong argument. It just sounds like I'm angry with Samsung for some reason.
>This is a straw man and skew to the argument, which is simply that calling annual upgrades a new operating system due to new GUI features
You're incredible. Actually screaming "straw man" in the same sentence you're saying Apple (or literally anybody) calls its point releases "a new operating system". Amazing.
You're now 0 for 2 in use of the term "fallacy".
>So let me tell you what you're going to do, you're going to have better days and not sweat it so much.
I'm fine. As previously mentioned, you're the one who sounds angry.
I can only speak for myself, but I'm running Mojave because there are 32-bit apps I need to use for my current project, which any newer version of MacOS will refuse to run.
If you have any interest in compiling 32-bit applications, you'll need to downgrade or set up another machine with High Sierra. Mojave will still run 32-bit applications with a warning, but it cannot build for i386; it will build only for x86_64.
Until this year I used a 2009 MacBook Pro. I finally switched to a ThinkPad because I wanted to use an OS that gets security updates. It's not miserable at all if you have sufficient RAM.
It actually runs pretty well for most things. The hardware is still good unless you're doing high-intensity computing or trying to play AAA games on it (for which it was never good anyway).
Security is not support. If they distributed defective software (with security vulnerabilities), they should fix reported vulnerabilities or at least provide a clear and canonical path for the community to fix it themselves.
Yes, it also applies to MS and possibly many other companies.
One could claim that they do, always, in the newer versions of the OS, which are almost universally incremental improvements based on the older versions.
Not supporting old hardware is something that all companies [1] eventually must do, since resources are limited.
I don't think Apple or Windows would be a good choice, if openness is a feature you're looking for.
> Not supporting old hardware is something that all companies eventually must do
I get it, but Apple does not need to do anything hardware-specific to fix macOS security vulnerabilities; they'd only need to patch the old version (which already supports that hardware) and release an update, and only if someone reports a vulnerability. If they planned to group hardware deprecations only every X versions (let's say they'll remove hardware every 5 years), they would have very little work to do to keep those old versions safe.
> I don't think Apple or Windows would be a good choice, if openness is a feature you're looking for.
I agree, but that doesn't mean we shouldn't also push for them to be better. Personally I love Linux, but I also need a Mac for iOS development.
I'm worried by many of the security issues fixed in the linked official Apple release notes, including CVE-2021-30939, which indicates that it may be possible for a computer to be compromised by opening images or maybe even accessing their metadata, and many privilege escalations allowing user apps to acquire kernel privileges.
There is no Swift or Objective-C in the xnu kernel, so the answer to the first question has to be "no". To the second question, Swift is generally safe against these bugs except (1) in the face of race conditions and (2) when the developer uses things that have "unsafe" in the name.
ObjC and Swift are just application-level programming languages. The XNU system kernel for macOS is written mostly in C with some C++. Anyway, the Swift compiler is written in C++ last I checked, and ObjC is a superset of C. Everything done with ObjC can be done in C if the developer is feeling adventurous enough.
It's probably less to do with buffer overflows and use-after-free in these applications, and more to do with the underlying OS APIs, and I assume most of the OS is still C, C++ and Obj-C.
I’ll not go into my personal relationship with Monterey, since only global statistics matter, but I can say that right now I’m trying not to reboot my system accidentally, because one of my issues accidentally disappeared and I don't know what caused that.
Not sure how that sub-reddit looked during the Catalina days, but I can't imagine it had fewer bug reports. I don't see anything particularly serious either from scrolling quickly. Your experience certainly sounds like a shitty one though.
Apple used to be so smug about not being vulnerable in those 2000s commercials. Now that they have reached critical mass, their OS is equivalent to Windows.
Mac keeps a list of (hashes of) apps that are not allowed to run. It’s updated constantly and synced to your computer, so in effect they can remotely disable malicious software for all users all at once.
This was a big deal a few years ago when (IIRC) a bug caused it not to cache the list long enough and it started causing problems when it couldn’t phone home. Aside from that it’s always been a perfectly seamless process.
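To be clear about what that mechanism amounts to, here is a toy sketch (Swift, with a made-up hash list and path; the real XProtect/notarization machinery is Apple's and far more involved, this just shows the core hash-denylist idea):

    import Foundation
    import CryptoKit

    // Toy hash-based denylist check. The entries and the file path below are
    // made up for illustration; in the real system the list is maintained by
    // Apple and synced down to the machine.
    let revokedHashes: Set<String> = [
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" // SHA-256 of empty input, as a placeholder
    ]

    func isAllowedToRun(executableAt path: String) -> Bool {
        guard let data = FileManager.default.contents(atPath: path) else { return false }
        let digest = SHA256.hash(data: data)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return !revokedHashes.contains(hex)
    }

    print(isAllowedToRun(executableAt: "/usr/bin/true"))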
> Mac keeps a list of (hashes of) apps that are not allowed to run.
That's the XProtect blocklist, it's been around pretty long and hasn't caused any issues that I'm aware of.
> bug caused it not to cache the list long enough and it started causing problems when it couldn’t phone home
What changed is that since Catalina macOS additionally makes a synchronous check the first time you open an app. For certain things that are not code signed (eg. shell scripts), it checks every time. This can cause multi-second delays when launching apps or executing commands in the shell. (see https://sigpipe.macromates.com/2020/macos-catalina-slow-by-d...)
To my knowledge, this issue has not been fixed and is still a problem in Monterey (first time app launches on my M1 Max sometimes take several seconds).
It's possible to deactivate the check for Terminal (by marking it as a "Developer Tool" using a checkbox that sometimes shows up in System preferences, and sometimes doesn't). I'm not aware of a way to avoid this slowdown in Finder without disconnecting from the internet.
I'm seeing 42 CVEs; of those, 24 would be solved at a foundational level by memory-safe languages (I'm talking out-of-bounds, use-after-free, race conditions/locking issues).
I wonder how long the same story will repeat before the balance shifts in favor of just rewriting in more modern languages. It's expensive work with a long ROI, but sounds like a lot of these are in foundational libraries that you want to be robust in the long run.
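To make "solved at a foundational level" concrete, here is a tiny Swift sketch (my own toy example, not one of the actual CVEs): the attacker-controlled length that would silently read past the buffer in C becomes a deterministic trap, so the idiomatic code is an explicit check instead of undefined behavior.

    let packet: [UInt8] = [0x01, 0x02, 0x03]
    let claimedLength = 8   // attacker-controlled length field, larger than the real buffer

    // In C, reading buf[0..claimedLength) walks off the end and leaks whatever
    // sits next in memory. In Swift, packet[3] traps immediately
    // ("Fatal error: Index out of range") instead of reading out of bounds.
    if claimedLength <= packet.count {
        print(Array(packet[0..<claimedLength]))
    } else {
        print("rejecting packet: claimed \(claimedLength) bytes, got \(packet.count)")
    }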
Alternatively, one can do the same thing that Mozilla does for a few components in Firefox [1]. That is, sandbox C/C++ libraries/components at compilation time so memory-safety bugs will not be able to escape the sandbox. The big plus is that this avoids a code rewrite, at the price of slower execution due to extra checks in the generated code.
This is especially applicable for various parsers, which are typically self-contained code that is not performance-critical but is very prone to bugs with nasty consequences, as the article demonstrated again.
It's interesting to see the number of bugs related to race conditions here, and it would be very interesting to know if Rust's protection against race conditions would have been relevant ... there are plenty of good modern memory-safe languages, but not many with Rust-style freedom from data-races.
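As a concrete illustration of that gap: Swift is memory-safe, yet under the Swift 5 language mode (before strict concurrency checking) it will happily compile a data race that Rust's borrow checker rejects outright. A minimal sketch:

    import Dispatch

    // Two concurrent queues bump the same counter with no synchronization.
    // This compiles cleanly in the Swift 5 language mode (Swift 6's strict
    // concurrency checking flags it); the equivalent Rust code is rejected
    // because a mutable reference can't be shared across threads.
    var counter = 0
    let group = DispatchGroup()

    for _ in 0..<2 {
        DispatchQueue.global().async(group: group) {
            for _ in 0..<100_000 {
                counter += 1   // unsynchronized read-modify-write: a data race
            }
        }
    }

    group.wait()
    print(counter)   // almost always less than 200_000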
> [...] there are plenty of good modern memory-safe languages, but not many with Rust-style freedom from data-races.
Haskell's Software Transactional Memory (and its general focus on immutability by default) is another interesting approach towards the same goal.
Software Transactional Memory (STM) is perhaps much easier to get started with than Rust's model. (Though the same cannot be said for the rest of Haskell.) The failure mode with STM is that your stuff runs slowly if you don't know what you are doing.
STM is super exciting, but also an old concept. Is the runtime penalty too high, or is there any other reason it has not gained popularity? I remember Intel touting CPU STM extensions back in the day.
The ROI on rewriting existing code may be questionable, but the long-term ROI on writing new projects in safe languages seems unassailable. Choosing C or C++ for new infrastructure projects is madness at this point.
Depends on what it's for. I wouldn't mind any unsafe non-network-connected device; if they really want to hack my programmable thermometer with no network access, or my wired keyboard, they can be my guest! If my server is on an intranet, I don't have any fears either; without memory safety, hacks can happen, but Spectre and Meltdown haven't been seen in the wild.
For OSes and browsers, though, I agree completely. For infrastructure, as long as it's not always connected to the open internet, it doesn't seem to be much of a threat. And even though I don't care about memory safety, the performance of Rust tools impresses me! I wish people talked about the performance of Rust more than its memory safety; I don't care as much about that, but everything being faster? Who wouldn't want that? I made this thread a few days ago. https://news.ycombinator.com/item?id=29456115
They could write better-performing programs, and do it faster than in C. The thread I linked to has GNU coreutils in Rust, for instance; it has excellent performance, and every CLI tool I used is very fast, much faster. This is another example, if you are interested: https://github.com/BurntSushi/ripgrep
The thing is, code tends to be reused across projects. If you write a library or a utility program and it's full of holes, that's only OK if you're sure it will always be used in a "safe" context with no untrusted input. Who really wants to commit to that?
Any non-networked device would be easy to commit to. Nobody will force you to use such a library if you are worried about holes. I think internet-connected software like OSes and browsers matters, but not only do I not care if it's in these devices, I WANT it to be easy to hack to run custom software. I am glad that the PSP had holes, I am happy camera firmware had holes, and I am also glad that Android had holes; I've never been hacked on it once, but I sure did hack it myself!
The threat of most security issues is vastly overblown; Spectre and Meltdown haven't been seen exploited in the wild, but they crippled all CPUs just in case. Security at what cost? I disabled the mitigations; I have no need to make my computer slower for a virus that will never affect me. I "wear" an updated browser. ;)
Also, if you have a large enough corpus of random enough input, you are bound to hit similar bad cases as if you had some malicious input.
More pithily: Hanlon's razor says 'never attribute to malice that which is adequately explained by stupidity,' but the reverse is also true: enough stupidity or just randomness can look like malice.
My experience writing in Swift (a mostly memory safe language) is that it's nice for high level stuff, but as soon as you have to interface with low level or legacy code it quickly becomes unmanageable. What would be two or three lines of C code quickly turns into 10 or more lines of very hard to understand Swift code full of types named UnsafeRawBufferPointer or similar and it's doubtful that thing is in any way safer than a char* with a manual range check.
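For anyone who hasn't run into this, here is roughly what that looks like -- a made-up example (mine, not from any Apple code) of reading a UInt32 out of a byte buffer, which in C is a cast and a dereference:

    import Foundation

    // What "read a UInt32 at some offset in a byte buffer" looks like once you
    // drop below Swift's safe collection APIs. In C: *(uint32_t *)(buf + offset).
    func readUInt32(from data: Data, at offset: Int) -> UInt32? {
        guard offset >= 0, offset + MemoryLayout<UInt32>.size <= data.count else {
            return nil   // the manual range check, same as you'd write in C
        }
        return data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            // loadUnaligned needs Swift 5.7+; older code needs load(fromByteOffset:as:)
            // plus an alignment guarantee, or a memcpy into a local variable.
            raw.loadUnaligned(fromByteOffset: offset, as: UInt32.self)
        }
    }

    let header = Data([0x78, 0x56, 0x34, 0x12, 0xff])
    print(readUInt32(from: header, at: 0) ?? 0)   // 305419896 (0x12345678), little-endian

Whether that is meaningfully safer than the char* version is debatable, which I think is the parent's point; at least the unsafety is fenced into that one closure.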
An experienced C++ programmer can learn Rust quite easily. The hard part (ownership & lifetimes) is what experienced C++ developers should already know by heart, but the primary difference is that the compiler enforces the same rules.
It is hard to write in Rust style in C++ since many libraries are hostile to it.
Iterators require at least two mutable references or mixing mutable and non-mutable references. Move-only types require a lot of boilerplate, and accidental usage of a moved-from object is not detected by the compiler. Using std::variant feels like trolling from the language designers, who refused to provide proper type-safe unions in the language.
> Using std::variant feels like trolling from the language designers, who refused to provide proper type-safe unions in the language.
And, alas, deliberate trolling would be preferable to reality.
Deliberate trolling would indicate a conscious design. C++ accumulated cruft over the years in what looks like brownian motion in retrospect, and almost never shed any.
Isn't Swift, which came from Apple, memory-safe in a single-threaded environment? It can't prevent data races at compile time like Rust, but I thought it was far safer in practice than C/C++/ObjC, where I suspect all these vulns originate.
Absolutely, but Apple clearly hasn't decided to switch their foundation libraries to it. I suspect it'll be a decade-long effort, but it can't happen too soon.
Swift is their "user-facing" language because this trendy stuff with "smart" languages is apparently very important for PR. I would be surprised if most of the OS itself is written in something other than C/C++/ObjC.
I take it you're not a big fan of Swift, but there's little need to veer over into borderline conspiracy theories.
I do think Apple is honestly committed to Swift over the long term, but it takes loads of time and care to replace the foundations of a building that's in daily use.
I'm not a fan of these "smart" languages in general. I'm one of those who believes that the syntax of the language itself should be dumb, but all the smarts should be in the standard library. Like in Java for example. It's much easier to reason about what the code actually does when the language is dumb and calls into its standard library are explicit. It's bad when some benign-looking syntactic constructs trigger some kind of complex behavior.
They might be committed to it, but I'm doubtful that it's at all possible to write e.g. parts of the kernel in it. And maybe, just maybe, they should direct all their efforts toward rewriting the OS in a safer language instead of making their UIs uglier with nonsensical paddings and messy-looking icons that pretend that there's no pixel grid.
Oh, so I guess that's why I'm seeing "ghosts" of Finder windows under the Dock sometimes, especially right after taking a screenshot? And they disappear if I switch to Finder. I kept thinking "how could they have possibly broken that". Updated to 12.1 today, will see whether that improves anything.
But it would be more interesting to see the parts that deal with untrusted data (file format parsers, protocol handlers) rewritten in Swift.
Eh, only recently did we really have, in the consumer space, the computational and memory margin to seriously consider spending on more memory-safe languages. Certainly in the 80s through 00s, switching to a language that gave you nebulous "safety" improvements wasn't worth losing even 10% performance. The competition would eat you up over it.
(There were absolutely spaces where robustness and safety were the priority, but those weren't consumer, and their cost reflected it.)
Even today, I'm not sure we have the engineering margin to spend on such efforts (time-to-market is still the priority), though I think the pressure is slowly increasing to do it (again, in consumer spaces, which macOS definitely is).
Nothing in Ada made it impossible to use on PCs or similar devices. And the performance of its compiled code was similar to that of C code, if not better. And some of its features, like the ability to return a dynamically-sized stack-allocated array from a function, are still not available in C/C++, or Rust for that matter.
I guess what really made it a niche language was the cost of compilers. DoD vendors in the eighties had already learned how to milk their customers.
>Bad programming shops taught them that way, that is why.
Most of these programming languages, when they were created, did not have online connectivity or hackers as a threat. It would be ridiculous to expect them to protect against a threat that didn't exist -- like calling the Japanese idiots for not building nuclear defenses and saying their bad teachers are the reason. In embedded hardware I still don't see the benefit for memory safety.
> In embedded hardware I still don't see the benefit for memory safety.
If anything, safety considerations in embedded devices are even more important than elsewhere, if only for the fact that those devices typically don't get patched. It has to work right the first time. F*cking this up can have dire consequences, depending on what the device is doing.
That's possible, but I don't see that in practice. I could die any second, but my fan controller, my heating pad, and my camera lens are not being hacked or broken from memory issues.
Security and safety in programming languages have been a known issue since the 1950s, as anyone who actually cares about this subject is aware.
In fact it was one of the sales pitches for Burroughs (1961), still being sold by Unisys, which not surprisingly keeps using its safety over classical POSIX as a sales pitch.
They made banking computers, which seems relevant there, and also typewriters from what I briefly looked at. Were the typewriters memory-safe too? If they were, what was the benefit, and if they weren't, do you see my point?
By the forces of the market, POSIX won because it had more useful features. Personal computers where you were expected to load software yourself and were not connected to any threats? I do not see any reason for me to care about that even today, and it's not because programmers had bad habits; they made reasonable tradeoffs. Making worse-performing software for the nonexistent threats of that time is not teaching them bad programming; over-engineering for nonexistent threats is bad programming. In non-networked devices, I still see no benefit.
Yes there is: when the health of others is at risk, the sanitary inspection will close down the shop.
Whatever one does to themselves in their own place is their own thing, even if they happen to land in the hospital due to food poisoning caused by bad refrigeration.
In any case, the fridge was only one example among thousands.
There's no law against selling software with liability.
People by-and-large _choose_ to license software where the license contract denies liability.
You are free to offer 'proper' liability, and try to charge enough extra for it to make up for your extra costs.
(Or do you want to forbid certain kinds of contracts, so that your preferred kind of contract 'wins' because the competition is banned?)
Do keep in mind that some software does come with liability, and things that are a bit like liability. The latter category is eg when you sell both software and a support contract, and your support people have to work harder when stuff goes wrong.
> There's no law against selling software with liability.
In fact, in many jurisdictions there are laws against (or, more to the point, denying any effect to) the waivers of liability much software comes with.
If you missed the news, the Log4Shell RCE vulnerability in Log4j impacts a memory-safe language (Java).
It's too early to tell -- it dropped last Friday -- but it will probably be marked as one of the most egregious vulnerabilities to date due to the sheer omnipresence of Log4j in production Java code and the simplicity of its exploitation. We are talking Heartbleed/EternalBlue/Struts2 vulnerability level here.
The difference in ROI between memory-unsafe and memory-managed languages is not so evident. Usually memory-managed languages have fewer bugs, but those tend to be massively impactful.
Note that I didn't claim all bugs/vulns would be solved by a sweeping use of memory-safe languages. (Just that a distressingly large proportion of the ones in this security bulletin would be)
The log4j bug fits in the "other" category.
Also, it's a fallacy to believe that memory-safe languages aren't that much better because their bugs are worse. All languages can have the worst bugs; it's just that memory-safe languages solved the easier types of bugs, so there aren't as many of them to bring the average down.
It's like thinking that flying is riskier than driving, because plane crashes are so devastating. Driving kills more people overall, but they're spread across more, smaller events, so we're not as aware of them.
I am also wondering how many of the disclosed CVEs are Intel x86 specific, and how many the ARMv8.3+ authenticated pointers would have prevented (even though authenticated pointers would not preclude a DoS attack, they would certainly render remote code execution impossible).
Has Monterey been fine for app/pods/React Native development so far? I swear every time I do a big update it takes ages to get the dev experience working again and I can't afford to lose time this close to Christmas with deadlines looming.
I learned long ago to stay 1 release behind. I update to the latest just before a new one is released. This has served me very well for the last 5 years.
Apple, and most other companies in the industry, are still writing code in languages that don't have implicit bounds checking. Let alone all the code that they have written years ago that they still keep deploying. Nobody is going to rewrite that mountain in Rust any time soon.
Apple has been providing security updates for iOS 14, but so far no 14.8.2, which is strange since all other supported OSes were updated today. Often the same CVEs get fixed in macOS, iOS, tvOS, etc. together. They are pushing iOS 15.2, though. I wonder if that means EOL for iOS 14?
There has been no clarity on the status of “being able to stay on iOS 14”. Previously, you could stay on iOS 14.8.1 and the system saw it as “fully updated” but with the option to upgrade to iOS 15. Now, any device with Automatic Updates turned on gets updated to iOS 15 automatically. I figure they allowed users to stay on iOS 14 in order to avoid the CSAM scanning (a legitimate concern), but now they are steamrolling everyone to iOS 15. Does that mean they will be removing CSAM scanning in the future?
I will wait for a couple of days to see if 14.8.2 or 14.9 appears. If it does not, I think we can consider iOS 14 to be EOL (and that Apple changed their mind about iOS 14).
Apple normally continues issuing security updates (but not necessarily fixing other bugs) for around 2 years after a new version of macOS has been released, though I don't believe they've ever formally specified that in writing. Monterey was released on October 25, 2021, so Big Sur should keep getting security updates until October 2023 or so. More than one version back is generally considered unsupported, I believe.
I see a Big Sur update dated December 13, so it's definitely still being supported. While I haven't done an exhaustive check, it looks like it patches many of the same security holes.
>The only way to be safe on Apple's computers (and phones) is to update to the latest version the day it comes out.
That has not been the case with 14 and 15; it's a heuristic, but unknown backdoors are a constant threat. Realistically we should state that they are never, ever safe.
It was patched on Catalina months after being patched on Big Sur, and after being exploited by the CCP against dissidents in Hong Kong (therefore it was a zero-day).
The article says that the Chinese team said it was on Big Sur, and that the Google team then discovered it was also on Catalina. It doesn't say that Apple didn't patch it on Catalina, or that they patched it "months later".
I would recommend against using vice.com as any kind of authoritative source. On anything.
Looks like you're correct in this instance, thanks! But my general point is still true: Apple routinely skips patches on older versions. I'm aware of a fair bit of proprietary evidence for this.
For a company that demanded Flash be verboten because of the constant security threat, that's a whole lotta arbitrary code execution in WebKit. Hey guys, you can't run this VM in our browser. Tell you what, let's just make our browser do everything it does. We won't run into any problems, I mean, everything's sandboxed, right?
>That’s just not an option for a regular phone or laptop, which is most of Apple’s market.
Lots of Macs have more than one video card. I disagree; I mentioned it since they make their own silicon and can do virtual GPUs. Memory management for the browser would be an issue, but Apple doesn't care, they just use swap on the SSD, and it would still be cool to have that as an option for single tabs. All web browsing is just a huge amount of JavaScript virtual machines, and they have custom hardware to run it faster, so they might do it via hardware. All their sandboxing is just OS-based; Touch ID ran its own OS and took years to crack, and needs hardware tools to do so.
Sure. But it was dead by then. Apple led the charge to kill it, and the main argument given was that Flash was susceptible to past, present and future use-after-free attacks, like the Java VM or anything else that ran as a black box and managed its own memory. And now we're close to square one on that, which is a natural result of people wanting functionality in-browser that mirrors a desktop experience, including APIs that reach out beyond the sandbox. From Apple's point of view, Safari itself is probably one of their biggest liabilities. If they could eliminate browsing the open web completely, that would solve all these problems (except, obviously, whatever they miss in app store review).
While security was one of the concerns which Apple brought up in "Thoughts on Flash" [1], it was a relatively minor one. The big highlights were standards compliance and performance.
Flash was a nightmare on mobile and made pretty much half of the internet unusable on a smartphone (yes, half. Remember those menus in flash?). It was also indeed full of security flaws that Adobe was suuuper slow to patch, if at all. Eye candy was their priority, not the web's wellbeing.
The way poorly written JS clogs up web pages is worse than Flash ever was. Flash didn't have access to the DOM, except to send messages. Pages now are slower and heavier than ever.
And back in 2010 or so, when the Flash plugin was still available in beta on iOS and Android but was declared "too slow" or too heavy on the battery, I wrote an extremely fast-for-the-time <canvas>-based scene graph to do some side-by-side comparisons of manipulating sprites, animating vectors, masking touch areas, blitting some basic particles, etc. JavaScript performance was not even close to Flash performance on an iPhone. It really has never gotten close, even with much faster processors and with V8, to replicating on canvas what Flash was doing. Only via WebGL has it become possible to get mobile graphics performance in the browser resembling what Flash could do in 2010. So the performance complaint was, as far as I'm aware, a lie.
The security complaints were, indeed, legitimate; and it's true that Adobe sucked at patching them, and Flash was a dangerous point of failure for corporate networks. That being said, it would have been a lot better if Adobe and Apple could have come to an arrangement to base a standard off of it. Instead what we have is a million workarounds to achieve the same effect, and we still have a huge amount of terrible, web-choking code all over the place. It's just in untyped JS, which is worse. Bad code is always bad code. At least with Flash you could kill the process without killing the page.
>Eye candy was their priority, not the web's wellbeing.
It was very easy to make stuff with it; it was developer- and artist-friendly (it still has lots of cool stuff that doesn't exist elsewhere), and it was a standard, but performance-wise it sucked. And I would find it interesting for you to tell me what "web wellbeing" is. I would say that analytics, advertising, and (my most controversial opinion) JavaScript are bad for web wellbeing. Every issue you leveled against Flash is the same with JavaScript.
Security was just one reason Apple didn't like Flash.
Also, it was really Flash's own limitations that killed it on mobile (which, in turn, killed it on desktops). Don't forget it was on Android for a while and aggressively marketed as an advantage over iOS. But it was also a poor UX (slow, power-hungry, designed for mouse & keyboard, not touch) and a proprietary third-party black box.
Apple didn't so much kill Flash as just be the first to see it coming and act on it.
As someone who was developing browser-based games in the space at that time, there was hardly a better option for touch interaction then. True, lots of older hover/highlight code just didn't work, and that was a pretty common thing with Flash. But all UX in Flash was built by coders who are now working in JS. It was entirely up to coders what to do within the box. Now the box is the whole window of the browser.
> there was hardly a better option for touch interaction then
Native. That's what Apple wanted (and still wants) you to be doing.
To address your point more generally, though: that HTML5 -- or some other technology -- was as bad as Flash in some ways is not a sufficient reason to support it. Flash was worse than the alternatives in some ways (and critically so in some cases, e.g., wrt power) and didn't offer any unique killer capabilities.
It had a couple of killer capabilities (to me). I did a lot of work with the Starling engine and Away3D. And although this conversation is about browsers, a lot of my work was deployed via AIR, which still limps along, but I wouldn't invest in it now. Being able to deploy the same 2D/3D games on any platform and on the web with native GPU access is a killer app - and the best contender now is probably Unity. The ability to drive fast vector animations without recourse to SVG or something, all within one dev environment and without many dependencies, provided an incredible workflow from artists to coders. Certain things that were taken for granted in Flash, like the native engine for determining mouse/touch position over layers and layers of different vectors (not bounding boxes), are still incredibly hard to replicate. Whole new platforms had to be written in JS to support these kinds of occlusions and improve rendering performance to a tolerable point. Things we have now like PixiJS would have absolutely destroyed the battery life on a 2010 iPhone.
You're spot on that Apple always wanted native code, and still does, but what small studio has time to write everything three times? The idea of deploying and maintaining totally separate code for a casual game on Android and iOS and the web is a deal-breaker for a 5-person team, and putting all your energy into one walled garden is not what's best for the developer.