This is simply how Apple does things. They provide software support for ~10 years and then they drop it.
Is it wrong? I’m not entirely sure. You can call that greedy, but take a few things into consideration. Firstly, Apple provides highly specialized software that runs extremely well and efficiently on their own hardware. Continuously providing support for old hardware while simultaneously maintaining the same level of performance is simply not feasible. I challenge any software developer to achieve a similar goal.
We are not talking about Linux here, where, if you’re lucky, things work. Anyone that’s ever used macOS even once knows that things really work.
Now, I assume this window of time between a new piece of hardware’s release and the software drop for that hardware will increase over time: new hardware released by Apple nowadays is incredibly performant, which should allow for longer software support.
Frankly, my opinion is that 10 years of support is more than enough time for anyone to consider replacing their hardware.
Recently, in the last XNU kernel release (corresponding to macOS 13/iOS 16) the 32-bit part of the kernel has been entirely removed, meaning that XNU won’t support any 32-bit device anymore. This is really exciting to me, as I see the technology moving forward, without getting stuck on prehistoric hardware support.
They’re probably talking about most desktop Linux distributions, which are less stable than what servers are running both because servers run distros with packages that are older and more polished and because servers don’t usually have hardware with notoriously troublesome drivers (e.g. Broadcom, Nvidia, some Realtek stuff, etc).
Of course desktop users can tailor their machines for Linux and also run something like Debian, but it's more likely they're using whatever computer they happened to have and running Ubuntu (or one of its many derivatives), Fedora, or Arch, which are indeed more likely on average to break. I know because I've seen it happen several times in my own usage.
I went from Mac to Linux. Pop!_OS was what the machine came with. It's very stable, though the UI took a little getting used to; it's not as big a change as I expected.
The upside is that bioinformatics tools run natively, and it saves me from doing all my work on the cluster (I just finished a multi-day run that was “quick” using all 16 of my CPU cores).
> which are indeed more likely on average to break
Maybe; it's not happened to me in 15 or so years, but people have different experiences. That said, "more likely on average to break" is a far, FAR cry from "it works, if you're lucky".
Fair point, but I've run *buntu on various laptops (old and new) and desktop(s) my son and I constructed from stuff chosen from pcpartpicker.com not 3 months ago.
If comments on HN are anything to go by, linux on the desktop is perfect and easily usable by any office worker.
I'm sure if you walked up to Susan in accounting and handed her an Arch install USB, she'd have tmux and neovim up and configured by lunch!
In all seriousness though, I'm not sure a lot of HN commenters (probably myself included) have any actual perspective on how a normal person wants to interact with their computer, or what businesses need from desktop endpoints.
The fact that anyone thinks that editing source code is a good way to configure a window manager (dwm) is mind-boggling.
My ~6 year old MacBook Pro (13" late 2016 model (with touchbar), purchased new in 2017) is no longer "supported" (no Ventura). As far as I can tell, linux does not support WiFi or audio on it, so it's pretty much a piece of trash as far as being a laptop once apps stop supporting macOS Monterey.
How many of these apps will drop support for Monterey within the coming year(s)? How many of your apps will actually stop working when there are no updates?
I think that if a laptop lives for 6-8 years, that's quite good. If you don't think so, maybe Windows is a better fit for you, because you can purchase five (cheap) laptops for the price of one Mac. If you "need" your Mac for work, I think replacing it every 6-8 years shouldn't be an issue.
Why buy 5 Windows laptops for the price of one Mac when one PC laptop can be supported for 2-3 times as long? You can run Windows 10 on a ThinkPad T60 from 2006. Windows 10 will get support until 2025. That's 19 years support from one machine. Use a Linux distro and get even more.
Why spend more for a Mac if you get less useful life and are constantly having to repurchase software from OS upgrades breaking stuff?
I think you're missing my point a bit. Sure, if you want even longer support, don't purchase a laptop. Cheapest lifelong option...
My point was that the argument "apps won't run anymore" may be true, but there's no evidence for it. There are a lot of people running 10-year-old Macs, and the software does not just break.
As a lifelong Intel/MS user in my early 40s, I've been staring down an M1 Mac Studio and thinking hard about this.
I had a 2006 Dell Inspiron laptop that I got for less than $1000 from the Dell Outlet. It came with Windows Vista, and through various promotions I was able to upgrade the OS all the way to Windows 10, legitimately - I think I might have only paid $60 for the Win 7 upgrade. About six years ago, I donated the laptop (the internal wifi card was spotty, and with 4 GB of RAM, it wasn't much for multitasking).
Similarly, I've got a Dell Desktop with an i7 processor that's 11 years old, which through the magic of SSDs and the occasional video card upgrade, continues to do a solid job with the creative work I do in Photoshop, multitrack audio, Sketchup, and Twinmotion.
Ten years ago, I started using a Macbook Pro at work because internally, we switched to Ruby on Rails for the ecommerce platform we built. I was asked if I wanted to keep working in Windows, but basically everyone building in RoR was on a Macbook, so why swim against the current? I'll spare you a this-vs-that comparison and leave it to say that (muscle memory for keyboard shortcuts aside) I eventually was okay with working on a Mac. It also gave me a lot more exposure to Apple culture, for better and for worse. A lot of it was insufferable idolatry and ideological pontification - but I started to understand, if not necessarily agree with, the product lifecycle in Apple. I had a 2013 Macbook Pro that was still working fine, but I traded it in last year towards my first iPad (now that they had USB-C and more of a creative focus than a consumption focus, I was willing to take the plunge).
It's a lot like leasing cars. I'm someone who tries to drive a car into the ground, proverbially - I do regular maintenance though, so it's more like after 10-15 years, the safety improvements on modern vehicles outweigh the cost advantage of driving something I paid off a decade ago.
Back to Apple - once they got serious about recycling, the pattern became: you spend a bunch of money up front, and then you just keep trading your hardware in for the latest version. I would hold onto my iPhone for three product cycles because I spent a lot of money on it and I was going to eke out every last cent. But with Apple, you have a large initial outlay, and you can leverage the trade-in as almost a kind of hardware subscription. They've got a solid backup/recovery process that makes this really, really simple. And I kinda get it, the way I kinda get why some people lease cars instead of buy. Get a Mac with AppleCare and upgrade it every year or two, and that premium you pay essentially means you don't have to think too hard about breaking your computer, and you're always within a year or two of the latest, greatest hardware.
It still costs more, no doubt about that. But like with a leased car, you kinda don't have to worry about maintenance. With cars and computers, I do a lot of work myself, but I'm getting to a point in life where I just don't care about doing that anymore, and I have better financial resources, and I'm thinking real hard about going with the size, power, and performance of a Mac Studio, knowing that it means reshaping my relationship with computers. The fact that they've been reducing or eliminating proprietary connectors in recent years has been a very large factor in swaying me in this direction too, I'd add, while knowing that it's still very much Apple culture to charge you $30 for $0.35 worth of cable.
> A lot of it was insufferable idolatry and ideological pontification - but I started to understand, if not necessarily agree with, the product lifecycle in Apple.
I have an iPhone and iPad as well as a hackintosh.
I will never for the life of me understand the Apple ideology. It is almost like some of these people's whole identity is tied to a company they neither own nor work for?
Sure, Apple does make some great stuff, but they also make crap and have a history of bad choices (the dock, FireWire, and Lightning come to mind).
What I have a serious issue with is the artificial limits on OS updates. If OpenCore Legacy Patcher can get a newer OS running on the laptop, then clearly Apple intentionally blocked it?
Many who have used OpenCore are happy with the experience, so why exactly did macOS refuse to install?
It's funny you mention Firewire. Up until last year, I was using a Firewire audio device with my Win10 machine - it was released in 2007, and the last driver update was in 2012. Even though official support had dropped a decade ago, it still worked. Because of the architecture of Firewire vs USB, if you wanted a lot of high speed I/O, Firewire was the way to go - as long as you weren't running a Mac. While researching the Mac Studio and its M1 processor, I read some rather pathetic stories of using a series of dongles - Firewire -> USB -> Thunderbolt (or some such chain), and some hackery to get old drivers installed.
I decided I'm in a comfortable enough place now that I can upgrade to a modern interface... without getting too into the weeds, I replaced an outboard mixer and two Firewire interfaces with a smaller, higher quality analog hybrid device that runs USB-C. And funnily enough, I've nearly recouped the cost of the purchase by selling my old gear.
Hanging on to old equipment just does not fit the Apple model. Trade in early, trade in often, or get stuck on old architecture, and get phased out of the upgrade path. I've got a 2015 Macbook Pro I should have traded in the moment I heard about the M1 processor. It was the vintage to have, and now I'd be lucky if I got $400 for it.
Anyway... there is a large portion of the human population that seeks out and worships idols. It's like any rational part of their brain decides that with THIS person, they can relax things like curiosity or critique. If I keep typing about the topic, I'm going to start saying unpleasant things that will no doubt upset people, so I'll just leave it at that.
Recently my old MBP (2013 model) stopped working, with a mysterious kernel process taking up max CPU to grind everything to a halt on every boot. Fortunately I had its files backed up to a disk in Linux-compatible format (ExFAT32?), so I finally decided to wipe it clean and install Pop!_OS. https://pop.system76.com/
It gave new life to the laptop - I'd recommend trying it, you might be able to get more mileage out of the machine.
It has that whole "MacOS" dock look and feel right out of the box.
Good that you found an alternative OS and kept your old mac running. So many people seem content to spend big money on a Mac laptop, have it reach EOL and become landfill as they run out and buy a new one.
Ubuntu Budgie looks nice, thanks for mentioning it - I'll try it sometime.
An old Macbook with an Ubuntu variant is a nice combination of sturdy hardware, light(er)-weight operating system, and well-designed user interface. In a way, it feels more like the spirit of "Macintosh". And it would make for a good educational/toy computer for children and young people. If I see an opportunity, like family or friends who have old laptops or computers, I'm going to offer to set it up for them.
You have another couple years until apps stop supporting Monterey. That said, I think it was a poor decision by Apple to drop the 2016 version but keep the 2017.
>"Anyone’s that ever used macOS even once knows that things really work."
I am not a regular Mac user, but I was once asked by a client to port my CI/CD-related script so it could work for developers who use Macs. The script involved dealing with Docker, among other things. After dicking around for a while, I discovered a deeply nested folder created by Docker that was not supposed to exist after successful script completion. I don't remember all the details, but this folder was a thorn. Short of sacrificing a virgin, no matter what I tried I was not able to remove it. Invoking the search power of Google did not help either - plenty of recipes and suggestions, and none worked. Finally, after giving this Mac to the system gurus and them dicking with it for a day, I was told that they had failed, it was no longer worth their time, and they would reinstall the OS.
Macs have a bunch of weird quirks where simple things simply do not work. I got a new mbp for work recently, second to my main windows gaming machine with 34" ultra wide 1440p monitor.
And I've basically installed chrome, vscode, and a whole bunch of little packages to make basic hardware operate normally. There's an app to allow the track pad and mouse wheel scroll directions to be set differently. There's another to allow the mouse side buttons to work. It's a basic mouse. These are settings and compatibility issues that do not need to exist.
But the most annoying part is that the monitor is a bit blurry on the M1. I use vscode for a few minutes before I realise there's a fuzziness to it. And apparently this is a known issue, because Macs only scale properly for 4K resolutions. So there's another app to make it think it's 4K or something. It helps a bit but not that well, and it's a trial.
It's maddening. How the hell is anyone supposed to know that Apple has just decided that they won't be compatible with an entire category of bloody monitors, of all things.
Macs only look good at a 110 or 220 pixel density.
34" ultra wide 1440p has a pixel density of about 110 ppi, and that should work just fine. You might need to use a different cable, though. You should look your monitor up on rtings.com and search for notes on Mac compatibility.
I have a MacBook Pro 14 that I use daily with a 34" ultra-wide 1440p gaming monitor, and I've never had a problem. I had a MacBook Pro 13 before the 14 that also worked without issue.
The problem is that macOS killed off subpixel anti-aliasing, which is important for making text look good on 110 ppi displays. So 220+ ppi displays are the only real option here.
Yeah, using non-Apple monitors with a mac is a pain. Try "Retina Display Manager" - https://github.com/avibrazil/RDM - as it allows you to switch to the sharper HiDPI resolutions supported on your monitor.
Already upgraded the cable for max refresh rate while gaming.
I might not have noticed an issue with the way it looks, if I wasn't running it alongside a Windows machine that is very crisp.
As I said, it takes a few minutes of vscode before something starts to feel a bit off. It's subtle, but it's there.
I've used macOS professionally for years and it absolutely does not Just Work. I greatly prefer Linux, which also doesn't Just Work, but at least exposes enough knobs that you can fix what doesn't work. Unlike on Mac where you have to disable SIP just to get started fixing it.
On the other hand, thinking about the world as it is and not in the abstract, I'd have to say that none of what Apple has done since 2015 gives me any reason to upgrade my hardware.
So I have like 6-8 computers that are all more than 7 years old by now. I'm not looking for new hardware.
And looking at the software they've released in those years, I'm not looking forward to new software.
Methinks I ate myself, thanks Apple. You got me spoiled.
> This is simply how Apple does things. They provide software support for ~10 years and then they drop it.
Monterey was released in 2021 and dropped support for my 2014 MacBook Pro, so that's only 7 years.
It's fair to say that you can continue to run macOS N-1 and N-2, which would add years of life, but the author of the linked article is trying to argue that you're not safe unless you're running the latest major version.
But then you get a few more years of your software not being 100% current but still getting a good chunk of the security updates and generally being compatible with current software, which adds up to that full 10 years.
Well, that’s 7 years of major updates with all the security fixes. Moreover, Apple continues to release minor versions for devices that are no longer supported, in order to fix major flaws that are actively exploited in the wild (see iOS 12.4.2 through 12.5.6 - that’s a lot of minor releases over the years to address 0-days exploited in the wild; the same thing happens with macOS).
With lldb you can do that: you have the option of running commands when a given breakpoint is hit, so you can just make it place another breakpoint, and that second breakpoint will only be placed once the first one is hit. I assume you can do something like this in gdb as well.
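For example, a minimal lldb session sketch (first_func and second_func are placeholder names, not anything from the thread):

    (lldb) breakpoint set --name first_func
    Breakpoint 1: ...
    (lldb) breakpoint command add 1
    Enter your debugger command(s).  Type 'DONE' to end.
    > breakpoint set --name second_func
    > DONE

The second breakpoint only gets created when breakpoint 1 is actually hit. In gdb the rough equivalent is attaching a command list to the first breakpoint with "commands 1 ... end".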
People in this thread seem very confused about a lot of stuff so I’m going to try and make some clarifications:
1) This is an RCE, so what it does is achieve code execution in the browser, i.e. it can run arbitrary code from the attacker; it’s literally like running a compiled program inside the target’s browser. This doesn’t bypass Chrome’s sandbox, so a lot of OS operations are not reachable from within the browser (for example, a lot of syscalls can’t be called).
This is the first step in an exploit chain; the second one would be a sandbox escape to expand the attack surface and do nasty stuff, or maybe a kernel exploit to achieve even higher privileges (such as root).
2) WASM memory being RWX is really not a security flaw. W^X protections in modern times (even on JIT-like memory mappings) make sense only when coupled with a strong CFI (control flow integrity) model, which basically tries to mitigate code-reuse attacks (such as JOP or ROP). Without this kind of protection, which needs to be a system-wide effort as iOS has shown, W^X makes literally zero sense, since any attacker can easily bypass it by executing an incredibly small JOP/ROP chain that changes memory protections in the target area to run the shellcode, or can even mmap RWX memory directly.
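To make that concrete, here is a minimal, purely illustrative sketch (not taken from any real exploit) of why W^X on the JIT region alone doesn't stop an attacker who already controls execution on a platform without CFI - they can simply request a fresh RWX mapping:

    /* Illustrative sketch only: assumes a POSIX-ish platform that still
     * permits RWX mappings (iOS, for example, gates this behind JIT
     * entitlements and further system-wide hardening). */
    #include <string.h>
    #include <sys/mman.h>

    typedef void (*entry_t)(void);

    static void run_payload(const unsigned char *code, size_t len) {
        /* Ask the kernel for memory that is readable, writable and
         * executable at the same time, sidestepping W^X on the JIT area. */
        void *page = mmap(NULL, len, PROT_READ | PROT_WRITE | PROT_EXEC,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (page == MAP_FAILED)
            return;
        memcpy(page, code, len);   /* copy the shellcode in */
        ((entry_t)page)();         /* and jump to it */
    }

With system-wide CFI and code-signing in place, even reaching a call like this with attacker-chosen arguments becomes much harder, which is the point being made above.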
You understand that having W^X protections on any JIT area is fairly useless without a strong CFI model in place, right? Any attacker could easily execute a ROP/JOP chain to switch the JIT protections to RX, or even more simply allocate an RWX area where the shellcode can be copied and executed.
Yes, and this is part of the problem with the general direction of JS ecosystem development.
JS promoters want so badly for JS to supplant other major languages, while not noticing that they themselves are ignoring the decades-long path other major languages took on robustness and security.
Doesn’t really work like that. First of all, when would you reboot your phone? Once per day? Once per hour? Every five minutes? Regardless, these attacks are incredibly advanced, remember they require zero interaction from the user.
Even if you rebooted constantly and the exploit lacked a persistence vector, they would still be able to exploit you whenever they want. There are literally no good defense mechanisms against zero-click attacks. The only effective one being turning off your phone forever.
Something like these exploits takes 1-2 minutes maximum to achieve full data exfiltration. This means you’re not safe even if you reboot every five minutes.
So preventing persistence vectors is not really useful against these types of attacks. Persistence is more of a “comfort feature” for attackers, is not really something essential.
Normally the bugs in these types of attacks target daemons that are always connected, even if you’re not logged into iMessage or even if you disable iMessage. Or at least this was the case with previously known bugs.
This would give you a controlled relative write primitive if you can repeatedly call this function in a loop and go OOB.
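For anyone unfamiliar with the term, here is a toy sketch of what a controlled relative write primitive looks like (purely illustrative; the actual vulnerable function being discussed is from the write-up, not this code):

    #include <stdint.h>
    #include <stddef.h>

    struct victim {
        uint8_t buf[16];
        /* whatever lives after buf in memory is what gets corrupted */
    };

    /* Missing bounds check: the attacker controls both the offset past
     * the end of buf (idx) and the byte that gets written (val). */
    static void vulnerable_write(struct victim *v, size_t idx, uint8_t val) {
        v->buf[idx] = val;
    }

    /* Calling vulnerable_write() in a loop with increasing idx lets an
     * attacker lay down an arbitrary sequence of bytes relative to buf. */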