Are you sure you aren’t confusing the download size and update size? Asking because on PS5 the situation seems to be exactly that, 30GB download, ~100GB free space required to install.
I've been wondering for a while, maybe someone with experience can chime in? Why are game updates so huge? Often even small patches (which I believe don't contain significant new or fixed assets) are enormous. Shouldn't game-logic changes be much smaller?
I don't know how GOG delivers updates, but 100GB is basically the download size of the whole game, so it sounds like it's downloading it again. On Steam, the update was ~20GB. Still pretty ridiculous for what is essentially a patch though.
Not sure if that was something GOG specific, but basically had to download the whole game again.
The official specs are that the patch is 30GB, but requires 130GB free disk space.
GOG has some strange calculations it performs for free space, particularly for its offline installers. One of the Cyberpunk 2077 updates required 400GB+ of free space to install a sub-50GB patch, which was ludicrous and I think under-reported since most people use the Galaxy install process.
Fast internet is not that ubiquitous. There are plenty of people in rural and even urban areas with 16 Mbit or less even in developed countries. Game downloads are extremely frustrating to some friends of mine because it takes literally 20 hours to download sometimes.
You can't mean it in a situation where someone wants to play a game today and has to download 30 or 100 GB before it will let them play.
I sort of get it if this was some unimportant background download being done, but for a game you actually would like to play, you are stuck with "you should have started your computer 2-10-20 hours before actually wanting to play" depending on how fast your "Fast internet" is.
The good news is that BG3 can be played offline, so if it's going to take a long time to update and you want to play now you can just keep playing the old version.
Not macOS, but my iPhone is struggling with space to update, and there's a ton of data I just can't delete, like data X stores somewhere, or WhatsApp duplicating photos, in addition to a massive amount of system files that aren't broken down anywhere.
On the iPhone and iPad, there are options you can select to let Apple unload certain applications or data to iCloud, where you can re-download them again in the future if you need them.
This lets the system clear space for large downloads like this. You just have to know which options to enable.
I would prefer to have access to the filesystem too but this isn't really a solution to the problem. People shouldn't have to go around deleting System32 to make space on their device so they can update…
Yeah, the only reason this is a concern and a problem sometimes is because of Apple greed. Considering how much they charge for storage, you don't go for the large version, and since everything is soldered then it might become a problem in a future update.
For example, the Apple Watch Series 3 couldn't install updates without wiping and restoring the whole system for at least 3 years. In this case they didn't even sell a version with more storage.
In the end, whatever; it's just Apple nonsense, better be done with them and look for hardware elsewhere than care about all this...
There's a few reasons I can't do that and even though I've been using Linux since the mid 2000s, I don't want to spend my time tinkering.
Apart from that, the bigger issue is that this is something that affects many users and the selling point of Apple was always that it should just work, which it clearly doesn't in this case.
Well, Android was the more obvious implication, but I bet you know that and disagree.
Which is weird because many manufacturers don't restrict flashing a different rom, and even with standard Android you have more access than iPhone gives you.
> many manufacturers don't restrict flashing a different rom
Manufacturers don't, but app makers do. I know people running LineageOS who have to keep a second smartphone on stock Android for their banking authenticator app (which fails on Lineage because it's not the manufacturer-supplied ROM and that supposedly means it's been hacked).
If it couldn't be fooled through software (and there are sandboxing solutions to run stuff that needs google services etc) I don't think I'd stay with such a bank.
We need to look at this from several angles. Yes, most have fast internet these days, which means that for a single person the time spent downloading a 1GB update isn't really a problem like it might have been 10 years ago, but what about the literal hundreds of millions of other macOS users? We're talking hundreds of petabytes of network traffic that happens multiple times a year.
Say that update could have been optimised down to say 100MB through shipping only binary diffs instead of complete executables, that's hundreds of petabytes of network traffic that doesn't need to be transferred, which means less link congestion, less packet processing, less data temporarily stored not just on customer machines but in datacenters as well.
Unnecessarily large updates in systems with hundreds of millions of users has a non-negligible environmental impact.
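The kind of savings a binary diff can buy is easy to picture with a toy block-level delta. This is just an illustrative sketch, not any vendor's actual update format (real tools like bsdiff or zstd's `--patch-from` handle insertions that shift offsets far better): only blocks whose hash changed get shipped, and unchanged blocks are referenced by index.

```python
# Toy block-level delta: ship only the blocks that differ from what the
# client already has. Not a real update format -- a naive fixed-block
# scheme breaks down when an insertion shifts every later offset.
import hashlib

BLOCK = 4096

def make_delta(old: bytes, new: bytes):
    old_blocks = [old[i:i+BLOCK] for i in range(0, len(old), BLOCK)]
    old_hashes = {hashlib.sha256(b).digest(): i for i, b in enumerate(old_blocks)}
    delta = []
    for i in range(0, len(new), BLOCK):
        chunk = new[i:i+BLOCK]
        h = hashlib.sha256(chunk).digest()
        if h in old_hashes:
            delta.append(("copy", old_hashes[h]))  # reuse a block the client has
        else:
            delta.append(("data", chunk))          # ship only the new bytes
    return delta

def apply_delta(old: bytes, delta):
    old_blocks = [old[i:i+BLOCK] for i in range(0, len(old), BLOCK)]
    out = bytearray()
    for op, arg in delta:
        out += old_blocks[arg] if op == "copy" else arg
    return bytes(out)
```

For a 12KB file where only the middle 4KB changed, the delta carries just 4KB of literal data; the rest is block references.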
I'm not 100% convinced it's all that negligible, but in general I agree with your sentiment. Inefficient software updates are a pet peeve of mine. If I download 5GB of software patches I'd expect to see major differences after applying the patch, but most of the time these days it's barely noticeable changes, yet the patches are still huge.
Just look at the iOS 17 OTA packages for iPhone 14 Pro.
- 16.6.1 -> 17.0: 3.3GB - fair enough, new major version
- 17.0 -> 17.0.1: 452MB - security updates and bugfixes
- 17.0.1 -> 17.0.2: 384MB - security updates and bugfixes
- 17.0.2 -> 17.0.3: 450MB - security updates and bugfixes
- 17.0.3 -> 17.1: 1.35GB - this one is interesting: upgrading from 17.0.2 to 17.1 is actually about 150MB smaller than upgrading from the latest version, as it uses the same upgrade package as the 17.0 GM
- 17.1 -> 17.1.1: 395MB - security updates and bugfixes
- 17.1.1 -> 17.1.2: 414MB - security updates and bugfixes
In total, I've downloaded about 3.44GB of updates for my iPhone since the iOS 17 upgrade in September, and have received WiFi AirDrop support, some security fixes, and better Home app Matter support. Security fixes are in general just a few lines of changed code, and should result in minimal changes to a binary, so if you just diff the new binary with the old one it should come out pretty minimal, yet the patches are several hundred megabytes.
Someone in a different thread suggested Apple were doing partial binary patching, but I don't see it.
In the US many internet providers have data caps. And very frequently there is no alternative provider in the same area, so no pressure to lift the caps.
Comcast for example has an 1.2TB per month cap and $10 for each additional 50GB chunk.
With 4K shows on Netflix, online gaming, and 60GB device updates it's very easy to hit the cap.
I got a deal on Comcast to get 1.2Gbps + unlimited bandwidth for $80. I was previously using Century Link Small business at 150Mbps + unlimited.
According to their app, I'm consistently using 3-4TB per month. I don't know what the breakdown is but my guess is it's mostly streaming video and video game downloads.
Yeah, I don't give a shit myself. I mean, 850 meg is, what, an hour of decent-quality streaming, and we get one of these every month or two. Meh. Even when I was stuck tethered on 4G for 9 months this wasn't a big issue.
This is the type of article that can be published continuously for all of history. The computers get bigger and bigger. It would be more impressive if the updates were smaller for a while.
I thought one of the features of having signed and verified boot partitions was that you could know exactly what's in it and can thus do cool things like binary diff patching where it makes sense (which should be like >90% of all files in an update), and only ship complete binary images where absolutely necessary.
Obviously this means you have to maintain 2 update tracks, one optimised flow for all the normies, and one for everyone who has disabled SIP. But the benefits are worth it when you consider all the electricity and bandwidth used by all the excess data you're no longer sending over the network.
Combine this with APFS snapshotting and it should make for a resilient update schema which uses minimal amounts of data.
Or am I completely wrong in my assumptions about binary diff patches?
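One way to picture it: since a sealed, verified system volume has exactly known contents, the update server could in principle precompute a manifest (path -> hash) per OS build and ship only the files whose hash differs. This is a hypothetical sketch of that idea, not Apple's actual mechanism; the function names and manifest shape are made up for illustration.

```python
# Hypothetical per-file diffing against known-good manifests.
# A manifest maps a relative path to its content hash for one OS build.
import hashlib
import os

def manifest(root: str) -> dict:
    """Hash every file under root into a {relpath: sha256hex} manifest."""
    result = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                result[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
    return result

def plan_update(old: dict, new: dict):
    """Return (files to download, files to delete) between two manifests."""
    changed = [p for p, h in new.items() if old.get(p) != h]  # new or modified
    removed = [p for p in old if p not in new]                # gone in new build
    return changed, removed
```

With manifests for every shipped build, the server can serve an exact per-version delta instead of a near-full image.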
iirc Windows already does this - MacOS is just behind the curve here. Perhaps there are security/stability edge cases that make this more difficult than it would seem?
Sonoma is also setting records for the number of bugs. Now even the buggiest Windows version ever feels like an etalon of stability compared to Sonoma.
Not sure if it’s used a lot in English, but in French a « valeur étalon » is a value of reference. Like you measure something precisely, define it as standard and it becomes your étalon for future measures
It's neutral. For example, it was used with metric system definitions: at some point the "kilogramme-étalon" was literally an object weighing 1kg (by definition, the kg was this object).
Think of it as a horse. So, it looks like OP is referring to a very stable horse. Although it would probably be a stretch to call it a dependable workhorse..
> you haven't started one up in a week you need to download an important patch - somehow using 6GB in the process.
What game was this? No games are pumping out multi GB patches per week unless they're early access on steam, at which point yeah, it's early access.
> I'm pretty sure there is just next to no effort put into keeping the file sizes down.
This is a silly viewpoint, and I've had this out many times on this site. We do care, the demands are just growing and people have unreasonable expectations.
Game sizes have ballooned because expectations are huge.
4k HDR textures are 16x larger than 1k textures, and HDR adds an extra 25% for the increased bit depth. Textures can be 20x larger than they were 10-12 years ago.
Map sizes are increasing (compare cyberpunk to GTA 5 for example).
Loading screens are heavily frowned upon so compression becomes a lever to adjust.
Audio/VO is highly dynamic and there's absolutely loads of it. Take a look at Overwatch: I play a few hours a week of that game and I still hear new voice lines every day.
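The texture arithmetic above checks out as a back-of-envelope calculation, assuming uncompressed RGBA textures and 10-bit HDR channels (the exact formats vary by engine):

```python
# Rough uncompressed texture size: pixels * channels * bits per channel.
def texture_bytes(width, height, channels=4, bits_per_channel=8):
    return width * height * channels * bits_per_channel // 8

k1 = texture_bytes(1024, 1024)                            # "1k" SDR texture
k4_hdr = texture_bytes(4096, 4096, bits_per_channel=10)   # "4k" HDR texture

print(k4_hdr / k1)  # 16x the pixels * 1.25x the bit depth = 20.0
```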
I worked on shipping smaller patches on AAA games at $PREV_JOB. If you compress everything to hell and back to be stored on disk, you can't do a partial update of it anymore. We tried to be tactical and adjust compression/groupings of what would change frequently and what wouldn't, but sometimes we got it wrong.
Occasionally large things change and force something akin to a full download. We don't want it, you don't want it, but that's where the content comes from.
Then you have the actual logic of patching. If we change a 30MB file, that's part of a 1GB compressed block, then it's a 1GB download. We also need to copy the 1GB we're about to replace somewhere to roll back if you lose connection during the update, plus enough extra space for bookkeeping, extraction, etc.
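The block-granularity problem can be modeled in a few lines. The numbers and filenames here are made up for illustration, not any real engine's packaging, but they show why patch size tracks the blocks touched rather than the files changed:

```python
# Toy model: assets are packed into compressed blocks, and the patch size
# is the sum of every block containing at least one changed file --
# not the sum of the changed files themselves.
def patch_size(blocks, changed_files):
    """blocks: list of {filename: size_in_MB}; returns MB to re-download."""
    total = 0
    for block in blocks:
        if any(f in block for f in changed_files):
            total += sum(block.values())  # whole block ships, even for one file
    return total

blocks = [
    {"ui.tex": 30, "menu.tex": 70},       # 100 MB block
    {"map1.tex": 500, "map2.tex": 500},   # 1000 MB block
]
print(patch_size(blocks, {"map1.tex"}))   # 1000 -- one changed file costs 1GB
```

Shrinking the blocks makes patches smaller but hurts compression ratio and load times, which is exactly the tuning tradeoff described above.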
If you think you can fix this, I'm sure most game studios will hire you at the drop of a hat.
My wife plays a lot of Counter Strike, but she is also a busy person with a stressful day job. She can't play every day, but because of this, every time she wants to play she has to sit there for 30 minutes to an hour to wait for the update to download.
This is the exact reason I pulled 1Gbit ethernet to my wife's PlayStation. It's so rare that she has the time to play, so spending 30 minutes or more pulling in updates was preventing her from enjoying what little time she has. Switching from wifi to a physical connection has reduced those download times to just a few minutes.
The amount of updates to what should be more or less finished products is just insane, and I can't understand why the industry doesn't see it as an issue... Well, money, but still.
I feel her pain. My 4-year-old girl asked really nicely to play "the flying game" (Microsoft Flight Simulator 2020) the other week. I got her out of the bath and we were both looking forward to playing, but it then downloaded a major update which was still running when I started work the next morning.
You mean the season update yesterday that changed the look of the entire map, had a one-off event, came with a new battle pass, added the festival, Lego and rocket league racing modes?
I was flabbergasted when I went to update a bunch of Samsung TVs in the office the other day and the package Samsung told me to download was 3.1GB!
Sure, that package actually contained 2 complete images for different series, but why does a TV OS need a 1.5GB partition in the first place? (And also, why are they bundling two images for different series?)
Rhetorical question, I know the answer, but my point is nobody actually likes the features that cause this problem.
I was running software updates at Apple and I remember the concern and effort when the "full" update first went over a gig. We were able to get it back under a gig for a release or two but it was hopeless without major new engineering work which would take time. Leadership kept yelling that there must be a way, and they didn't abate until we pointed out that Windows full updates had been over a gig for months.
Of course, Apple used binary patching and compression and all that jazz... this was for the "full" version that delivered entire files.
As the owner of a 2012 iMac I fear this because eventually I'll be forced to upgrade due to some application requiring the latest OS. The process for installing a new SSD involves, I shit you not, taking a razor blade to the glue between the glass screen and the chassis. I could be a wasteful consumer and buy a new one every 3 years but it's about the principle. It's a great tidy computer to have for casual use in a shared family room and it's a shame that a software issue might be the death of it.
And when that comes to an end, you can try Fedora Linux; it may serve you very well, especially if a shared computer just needs a browser plus some other popular apps. I use Fedora on my old Hackintosh (which served me well for over a decade) and I haven't noticed any difference from macOS, software- and aesthetics-wise.
As an embedded developer who measured code size in KB, I'm still absolutely baffled why desktop software is GB in size.
Why oh why does a basic Windows/MacOS desktop use tens of GB to install and use gigs of RAM? When it's sitting at an empty desktop, what is it really doing that's multiple orders of magnitude bigger than Windows 2000 on 64MB RAM?
Why would they plot the cumulative size of the updates? That's always going to go up. Just plot the size of each update over time?
The trend is linear, so it doesn't look like the updates are actually getting bigger over time; rather, it's a new paradigm of overall larger updates with Apple Silicon Macs.
Back in the macOS classic days, I would use ResCompare to make patches to my app. The resulting exe was a few kB ("It's one patch Michael, how many megabytes could it be?")
I can't even begin to understand what's going on these days, but I'm sure there's an explanation.
I think the OS is written to some kind of read-only partition, and so they're having to write the whole thing each change? It's also why Safari updates need reboots these days, isn't it?
It's protected by T2, the reboot disables the system security protection (SSP) temporarily in the T2, which then allows the OS partition to be written to, then it re-enables SSP and reboots. Only 'special' signed code can enable the update mode.
Well it seems to have made the Finder worse for me, so there's that. And, yeah, it's not particularly quick to install (M2 Pro, 512GB SSD).
I hope Apple can stop faffing about with new whiz bang features for a few iterations and instead focus on working down the ridiculous list of bugs in Sonoma. Somehow they've managed to utterly fuck up application switching, the mind boggles.
Edit since I'm being throttled:
I wholeheartedly agree (and I'm sure I've commented to this effect on HN before): this is probably the nicest Apple laptop I've laid my hands on and nearly the worst version of MacOS I've seen in three decades. I really like the 14" pro form factor. Thankfully I have until January to decide if I want to keep the laptop.
Keyboard focus does not reliably follow you to the new foreground application. I saw this mentioned in the comments in the Ars review but it seems to be worse for me than that guy.
Finder does not reliably display a progress indicator when copying big files, and quick look is as unreliable as ever.
AFP support is completely broken. Apple should just rip that code out instead of letting it rot like that. It's particularly frustrating because Apple dropped support for the version of SMB where Samba "properly" supported unix-y permissions.
SMB support worked for a while in the Finder, but would corrupt metadata from the command line. With 14.1.2 it seems like creating/modifying files on an SMB share from the Finder is as unreliable as the command line.
"Apple Inc. hit pause on development of next year’s software updates for the iPhone, iPad, Mac and other devices so that it could root out glitches in the code." [1]
The keyboard focus. I thought my keyboard went bad or something was stealing my keyboard inputs when switching apps. Has there been any explanation why this is happening? Could you please link the Ars article?
AFP not working was bothering me after updating to 14.1(.1). I have a Mavericks machine around (I know, it's old) that I was reliably copying files to/from until that update -- along with Apple Remote Desktop -- broke and I just couldn't see the Sonoma machine from any other Mac (Mojave couldn't see it either).
I figured out that by turning off File Sharing / Remote Management / etc on the Sonoma machine and turning them back on, it fixed it. Hope this helps.
So the AFP failure mode I saw was that I could list a directory or two and then the finder would hang and eventually the connection would just drop. Apple's been nudging people to SMB for a while now so I'm kinda tempted to stay with Samba.
Except for the whole SMB data corruption thing that's apparently been a problem since Ventura.
Apparently, and I can't remember where I read this, they're working on cleaning up the bugs now. There are a few nasty ones but I seem to be the guy who manages to walk through the field of landmines at the moment and not step on any.
Fully agree. I started to install the newest iOS & MacOS releases just before the next major version is being released. I'm always one version behind but kinda works well this way.
Apple is historically pretty bad at managing differential updates (including apps), so it's probably mostly a few files changed and then a re-download of a good portion of the OS.
I'm guessing they completely re-shipped Safari and related components in that update. That's where an actively exploited 0-day has been fixed. Wouldn't surprise me for Safari and some system components to be that size.
This article is saying Sonoma has the smallest updates in years, and that installation time is decreasing.