A 16GB PC isn't even the correct comparison; on Apple it's 8GB unified across CPU, GPU, and more.
The 16GB PC equivalent, especially at equivalent pricing to a MacBook Pro, is going to have a video card with dedicated VRAM in addition. A $1,549 Dell XPS has a 6GB Nvidia card, with options for 4GB-8GB cards.
Whether 8GB is equivalent to 22GB is the real comparison we should be making, and the answer is obviously no. Especially with all the other I/O performance cuts Apple has been making.
Now you will say there's far more bloatware on the Windows laptop, especially in a corporate environment, necessitating all that RAM, and that's true.
For light use 8GB would be plenty, but it's not going to cut it for power users.
I'm also curious what the "memory compression" they use is; I would imagine it's similar to Linux's compressed in-memory swap (zram). Unless they're just talking about standard GPU texture compression and such.
macOS has been doing memory compression for years. Basically, apps and background processes that are idle have their memory compressed, and then upon exiting idle (due to user interaction, a timer firing, etc.), the memory is decompressed. The compressed memory doesn’t touch swap.
It’s possible that Apple Silicon has some hardware acceleration for doing this in-place memory compression, but otherwise it’s pretty old news, which makes this marketing shtick even more frustrating. 8GB was borderline not enough even back when memory compression was added to macOS (maybe ~10 years ago?).
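For the curious, the mechanism is easy to sketch. This toy uses zlib as a stand-in for whatever compressor macOS actually uses, and the page sizes and contents are made up for illustration:

```python
import zlib

def compress_idle(pages: list[bytes]) -> list[bytes]:
    """Compress each 'page' of an idle process, keeping it in RAM."""
    return [zlib.compress(p, level=1) for p in pages]  # fast, modest ratio

def decompress_on_wake(compressed: list[bytes]) -> list[bytes]:
    """On wake (user interaction, a timer firing), restore the pages."""
    return [zlib.decompress(c) for c in compressed]

# Typical app memory has lots of redundancy (zeroed pages, repeated
# structures), so even a fast compressor saves a large fraction.
pages = [b"\x00" * 4096, b"AB" * 2048, bytes(range(256)) * 16]
packed = compress_idle(pages)
saved = 1 - sum(map(len, packed)) / sum(map(len, pages))
print(f"RAM saved while idle: {saved:.0%}")
assert decompress_on_wake(packed) == pages  # lossless round trip
```

The point is the trade: idle memory shrinks at the cost of a decompression hit when the app wakes, and none of it touches the disk.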
It has been known for years that megahertz are a terrible metric for measuring CPU performance. The peak of that discussion was the launch of the Pentium 4, where it was shown that its IPC was much worse than the previous Pentium's even though it had a much higher clock speed.
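The arithmetic behind that is simple: useful throughput is roughly IPC times clock. The IPC figures below are hypothetical, purely to show how a higher-clocked chip can end up slower overall:

```python
def relative_perf(clock_ghz: float, ipc: float) -> float:
    """Throughput is proportional to clock * instructions-per-cycle."""
    return clock_ghz * ipc

# Hypothetical IPC figures, for illustration only: the P4's long
# pipeline chased clock speed at the cost of work done per cycle.
older_pentium = relative_perf(clock_ghz=1.0, ipc=1.0)  # baseline
pentium4 = relative_perf(clock_ghz=1.5, ipc=0.6)       # 50% more MHz...
print(pentium4 / older_pentium)  # ...yet ~10% slower overall
```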
Both are available on most modern Windows and Linux machines as well, so these cannot be used as a justification for why 8GB is somehow different on a Mac.
They aren't really usable in any meaningful way. Those low-end models get bought by people who mostly use them as Netflix/Facebook machines. I'm really not kidding about this; I have seen it with my own eyes. I have a friend with a MacBook from around 2015, and I have only ever seen him use it to watch movies. The most tabs I have seen him open is maybe 5, and the most apps he has ever run at once is probably 5 too, and only because he forgot to quit the torrent client or video player I put on there. It's all about prestige and social status. This is precisely why Apple can get away with selling hardware this compromised.
I told him many times that he could have gotten a computer at one third the price that would have done essentially the same things just as well, but he doesn't believe it. He thinks Apple is obviously better and obviously lasts longer, even though I'm a tech geek who was using Apple hardware before he even knew Apple existed. He also has a fetish for Apple cables and chargers, as if they were somehow the only ones who know how to build those things (not that theirs are bad per se, but there are better options at half the price...).
It's really a lot like a religion or some sort of cult at this point. Unsurprisingly this guy has a lot of beliefs about many things; true Apple fanboys are very often like that.
As far as I'm concerned, Apple stuff is pretty good, but the pricing just doesn't make sense, especially nowadays considering how locked down everything is...
Apple is well known for extremely optimized software, thanks to their tight integration of hardware and software. Idealistically, that would make keeping 8GB sensible for Apple-only software. Realistically, modern software isn't optimized for any OS. I had an M1 MBA with 4GB; it wasn't enough. I imagine macOS put memory compression and swap to constant heavy use.
Just to nitpick, on Linux it depends on your distribution (or yourself) having enabled zram; some modern distributions enable it by default on recent releases, but yours might not have.
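If you want to check your own box, something like this works. The helper and the sample /proc/swaps text are my own illustration; on a real Linux machine you'd pass it the contents of the actual file:

```python
def zram_active(proc_swaps: str) -> bool:
    """Return True if any active swap device is zram-backed.

    `proc_swaps` is the text of /proc/swaps; on a real Linux box,
    pass open('/proc/swaps').read().
    """
    for line in proc_swaps.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if fields and fields[0].startswith("/dev/zram"):
            return True
    return False

# Made-up sample resembling a distro with zram enabled by default:
sample = """Filename\t\tType\t\tSize\t\tUsed\t\tPriority
/dev/zram0\t\tpartition\t8388604\t\t0\t\t100
"""
print(zram_active(sample))  # True
```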
You don't want to buy Apple's entry-level tier. Buy up - it'll last a lot longer. My daily driver is a 2012 MBP. I'm looking to replace it (it's 11.5 years old and is still running just fine). To me it looks like the machine to go for to replace my daily driver is the MBP with the M3 Pro and 36 GB of RAM. It costs $400 less than what I paid for my existing machine in 2012 - and that's in 2023 dollars. $3199 in 2012 is worth $4200 today.
$2800 vs. $4200. That's $1400 cheaper, in today's dollars, than what I paid back then - and I'm getting a helluva lot better machine for it, and I should be able to get a machine I can use for at least 8-10 years.
Is it the cheapest route? Probably not. Is macOS the best OS today? For my purposes, yes, but it's not a hands-down yes. Yes, this machine costs more than the $1599 model - $1200 more. But it's a machine I'm planning to use for 10 years, so that works out to $10 per month. I think it's worth it.
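The back-of-envelope math checks out; the ~31% cumulative inflation factor is my rough figure, not an official CPI number:

```python
# Inflation adjustment: $3199 in 2012 at roughly +31% cumulative
# inflation lands near $4200 in 2023 dollars.
paid_2012_in_2023_dollars = 3199 * 1.31
print(round(paid_2012_in_2023_dollars))  # ~4191

# Premium over the $1599 base model, spread over a 10-year lifespan:
premium = 2800 - 1599
per_month = premium / (10 * 12)
print(round(per_month, 2))  # ~10.01 per month
```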
While I have no idea what an equivalent Windows laptop feels like with 8 GiB of RAM, in Apple’s defense, most end users are unlikely to notice the inevitable swapping that occurs. The NVMe drives, save for the lower-spec M2 units, are absurdly fast. I have a base M1 Air, and didn’t think anything was swapping until I bothered to actually check.
The only times I’ve ever known that I needed more than 8 GiB on my Air is working with VMs. This is presumably a common use case for devs via Docker, but for anything else I question the actual need for more.
The NVMe drives in a MacBook Pro are the same speed, or in some cases slower, than any modern PC laptop with NVMe Gen4. [1]
The arguments I've seen defending Apple seem to boil down to one of a few different base arguments:
* The software is better optimized! (dubious, needs evidence)
* Even when the software isn't as optimized, the OS does memory compression so swapping doesn't happen as often! (true, but so does Windows and Linux, so it's not a differentiator for Macs)
* Okay but when swapping does happen, the SSDs are super fast! (true, but not any faster than a comparably-equipped PC, and often slower if you're looking at PCs in the same price range as the Mac)
I could be missing something but I haven't yet seen any credible argument backed by evidence that Macs are anything but mediocre with 8GB of memory.
BUT, even if there were such arguments, it would add literally a few dollars to the BOM to make 16GB the base, so why are we all bending over backwards to defend Apple?
edited to add: don't tell me that it would cost more than a few dollars to make 16GB the base -- another common false argument is that the memory in a mac is somehow "special." It's not, it's completely standard LPDDR4X or LPDDR5, sourced from the same suppliers as everyone else uses.
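For a rough sense of scale - the per-GB price below is my own assumption for illustration, not a quoted contract figure:

```python
# Assumed commodity LPDDR5 price (illustrative only):
price_per_gb = 2.00  # USD

extra_bom = 8 * price_per_gb  # BOM delta going from 8GB to 16GB
base_price = 1599             # entry "Pro" model, USD
print(f"${extra_bom:.0f} added BOM, "
      f"{extra_bom / base_price:.1%} of the retail price")
```

Even if the real per-GB cost were double this assumption, the delta stays a rounding error next to the retail price.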
I think the adjective “mediocre” is a tad heavy-handed here. I’ve had a variety of laptops – PC and Mac – since the mid-2000s. Apple Silicon is far and away the biggest leap forward I’ve seen, bar none. On a recent 5 hour flight, I shut off WiFi and only had a terminal open. At the end of the flight, I still had over 90% battery remaining. That is monstrously impressive, if only for the screen’s power consumption.
Add to that the fact that the terminal is *nix instead of Windows (PowerShell is not the same, and if you’re falling back to WSL then congrats on admitting *nix tooling is superior), and it’s a very compelling machine.
Yeah, I use an M1 machine for work, and you’re right about the efficiency. But as a creative pro or a developer (two of the target markets for Macs), your machine is the main tool you use to get things done. Calling a machine “Pro” and shipping it with 8GB of memory in 2023 is a slap in the face to those professionals when it’d be less than 1% more expensive to double it to 16GB.
That’s why I used the term mediocre, and I stand by it.
I don’t have much to say about the terminal stuff other than that my personal thinkpad that runs Linux does the terminal thing a hell of a lot better than my Mac does.
If you’re running Linux on a laptop, then yes, that’s even better. However, AFAIK nothing has battery life close to Macs. I’d consider one if they did.
My work Mac is perma-docked, but my personal Air spends most of its time on my lap, and I greatly value the battery life.
My Thinkpad T480 with high capacity battery gets about 10 hours of run time under Linux, running an editor, browser with a million tabs open, docker containers, and the occasional video playback. That’s enough for me, and I’m not convinced that a Mac would have meaningfully higher battery life with the same workload (especially running Docker, which is a hog on MacOS).
> They last for years with constant writes; on the order of magnitude with petabytes
For anyone else wondering, that test did use one TLC drive (Samsung 840). It had a major issue at 300TB, then went on till 800TB (failing somewhere around 900TB).
What are you trying to say here, that one can only expect what's warrantied?
I expect nothing more than OEM support in that warranty... and that things generally continue to work outside of it.
Techreport (and others, before we go down that path too) tested those along with many other drives. They were written to for more than a year and a half continuously.
Sure, Samsung and others only agree to financially be on the hook for N bytes. The point is, the drive can handle it - and the situation doesn't make sense.
Let's take a step back or two. We're focusing on numbers one would never even have the patience to reach while swapping.
Used drives can have life in another device or the whole device may be reused for a less taxing purpose ... if it hasn't been exhausted because of cost cutting.
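To put the thread's numbers in perspective, here's the back-of-envelope math; the 100 GB/day swap figure is my own assumption for an aggressively heavy workload:

```python
endurance_tb = 800            # roughly what that Samsung 840 survived
swap_writes_gb_per_day = 100  # assumed very heavy swap workload

days = endurance_tb * 1000 / swap_writes_gb_per_day
print(f"{days / 365:.0f} years of swapping to wear it out")  # ~22 years
```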
Aren't most companies on a 3-5 years replacement schedule for laptops? Usually based on some combo of accounting/tax rules, warranty and service plan availability?
If devices have only one short life at a single customer before being trashed or smelted then folks need to wake up. Reuse comes before recycle for a reason.
I rarely ever buy new and usually got hand-me-down equipment at work. The last brand-new equipment I was issued on the job was probably in 2013.
My employer was on a fixed 3 year cycle when I started here (20+ years ago). Every multiple of 3 years, you got a new laptop. At the time, the old ones were given away to employees for personal use.
At some point, they went to a 4 year cycle. Employees can keep equipment as long as they want - 4 years is the minimum to request new hardware (without a reason and approval). The give away disappeared as well, not sure what happens to the old stuff today.
Back when I was on Windows/Dell, that 3-4 year replacement cycle was crucial. The systems just didn't hold up over time - even more so for people who travelled. Now that I'm on a Mac, the hardware seems to hold up much better (aside from the occasional poor design, like the keyboards of ~5 years ago).
I suddenly needed to replace my maxed-out M1 Max MacBook Pro with a new laptop a couple of months ago. The base M2 Air with 8GB was the only one that would ship to me quickly. I was reluctant because I figured it would barely be functional with that little memory - before I got rid of the Pro, I loaded up my usual workload and checked how much memory was being used in total, and it was easily more than 8GB.
But I've been doing my usual (Firefox with many tabs, VS Code with a bunch of tabs and a couple windows, a Next.js dev server or two, multiple terminals, sometimes even more apps open) and I haven't noticed even a slight struggle to handle the load, no frame drops or anything.
Of course, it only supports 1 external monitor instead of the 2 I was using with the Pro. I'm sure it would struggle with the heavier VM/docker workloads. But it handles medium demands just fine.
Is no one using AI models for inference on Apple hardware? I imagine you can't compress memory you are actively using. These computers must blue screen at the mere thought of going to Hugging Face.
I took a quick look at the site of a major retailer here, and the first smartphone I clicked on (Moto E22) has 4GB of RAM; I then clicked on five other smartphone models, and of them, three also have 4GB of RAM (the other two have 2GB and 3GB). My own main smartphone (which is neither of these models, and is a reasonably high-end model) also has 4GB of RAM. From this quick look, it seems 8GB of RAM isn't that common for smartphones.
They work differently from what you're probably used to from older/different systems.
They don't run into a brick wall when swapping; in fact, they swap all the time (very heavily - a lot of a running application can be swapped out in normal use). You're never really "out of RAM", you're just increasing memory pressure, which gradually impacts the system's responsiveness.
Though I might add that at some point they do fail catastrophically (extreme lag, sometimes leading to crashes) once the pressure goes into the red.
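To illustrate the "gradual, then catastrophic" behavior, here's a toy model. Every ratio and cost factor in it is made up for illustration, not measured from macOS:

```python
def slowdown(demand_gb: float, ram_gb: float = 8.0,
             compress_ratio: float = 2.0) -> float:
    """Toy model: relative slowdown as memory demand grows.

    Overflow beyond physical RAM is absorbed first by compression
    (cheap), then by swap (costly). All coefficients are invented.
    """
    overflow = max(0.0, demand_gb - ram_gb)
    compressible = ram_gb * 0.25 * compress_ratio  # compressed pool cap
    compressed = min(overflow, compressible)
    swapped = overflow - compressed
    return 1.0 + 0.05 * compressed + 0.5 * swapped

for gb in (6, 9, 12, 16):
    print(f"{gb:>2} GB demand -> {slowdown(gb):.2f}x baseline")
```

Under the model, pressure builds gently while compression absorbs the overflow, then responsiveness falls off a cliff once real swapping dominates - which matches the "red zone" experience described above.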
For more demanding tasks like video editing, even 16GB is not that much, especially when you're working on big projects with hundreds of files and 100GB+ of footage. I can't see how any "pro" device ships with only 8GB in 2023.