
Apple puts the RAM in the chip package because they integrate the GPU, and they want multiple memory channels to feed it without needing that many slots. (Their entry-level models also don't have any more memory bandwidth than normal PC laptops, and there is no real reason they couldn't use a pair of SODIMMs.)
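
Rough numbers to back that up (the bus widths and transfer rates here are my assumptions based on commonly reported specs, not anything from this thread):

  # Back-of-envelope peak bandwidth, in Python. Assumed configs: a 128-bit
  # LPDDR5-6400 bus for the base Apple chips vs. two DDR5-5600 SODIMMs.
  def peak_gbs(bus_width_bits, transfers_per_sec):
      return bus_width_bits / 8 * transfers_per_sec / 1e9

  apple_base = peak_gbs(128, 6.4e9)       # ~102 GB/s
  sodimm_pair = peak_gbs(2 * 64, 5.6e9)   # ~90 GB/s
  print(apple_base, sodimm_pair)

Same ballpark either way, which is the point: the base parts gain nothing from soldering that a pair of SODIMMs couldn't deliver.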

But low-end iGPUs don't need a lot of memory bandwidth (again, witness Apple's entry-level CPUs), and integrating high-end GPUs makes you thermally limited. There is a reason Apple's fastest (integrated) GPUs are slower than Nvidia's and AMD's fastest consumer discrete GPUs.

And even if you are going to integrate all the memory, as might be more justifiable if you're using HBM or GDDR, that only makes it easier not to integrate the CPU itself, because now your socket needs fewer pins: you're not running memory channels through it.

Alternatively, there is some value in doing both. Suppose you have a consumer CPU socket with the usual pair of memory channels through it. Now the entry-level CPU uses that for its memory. The midrange CPU has 8GB of HBM on the package and the high-end one has 32GB, which it can use as the system's only RAM or as an L4 cache, while the memory slots let you add more (less expensive, ordinary) RAM on top of that, all while using the same socket as the entry-level CPU.
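
A quick sketch of why the L4-cache arrangement pays off; the hit rate and the HBM/DDR bandwidth figures below are illustrative assumptions, not specs for any real part:

  # Hypothetical sustained bandwidth with on-package HBM acting as an L4
  # cache in front of slotted DDR. All numbers are made up for illustration.
  def sustained_gbs(hit_rate, hbm_gbs, ddr_gbs):
      # Throughput is capped by whichever tier saturates first.
      miss_rate = 1 - hit_rate
      hbm_cap = hbm_gbs / hit_rate if hit_rate else float("inf")
      ddr_cap = ddr_gbs / miss_rate if miss_rate else float("inf")
      return min(hbm_cap, ddr_cap)

  print(sustained_gbs(0.9, 800, 90))   # ~889 GB/s: HBM serves most traffic
  print(sustained_gbs(0.5, 800, 90))   # 180 GB/s: the DDR becomes the limit

With a decent hit rate you keep most of the HBM bandwidth while the slots still give you cheap capacity.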

And let's apply some business logic to this: Who wants soldered RAM? Only the device OEMs, who want to save eleven cents' worth of slots and, more importantly, overcharge for RAM and force you to buy a new device when all you want is a RAM upgrade. The consumer and, even more so, the memory manufacturers prefer slots, because they want you to be able to upgrade (i.e. to give them your money). So the only time you get soldered RAM is when the device manufacturer has you by the short hairs (i.e. Apple, if you want a Mac), or when consumers who aren't paying attention accidentally buy a laptop with soldered RAM even though competitors offer similar machines at similar prices with upgradable slots.

So, as usual, the thing preventing you from getting screwed is competition, and competition is what you need to preserve.



> integrating high end GPUs makes you thermally limited.

Even if you have a surface area equivalent to a high-end CPU and a high-end GPU combined in a single die?


A high-end CPU (e.g. Threadripper) is 350W. A high-end GPU (e.g. RTX 5090) is 575W. That's over 900W combined. At that point die area isn't the limit anymore; you're trying to get enough airflow through a finite amount of space without needing five pounds of copper or 10,000 RPM fans.
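
Rough sketch of what ~925W means for airflow, assuming you only let the air heat up by 20°C on the way through (an idealized lower bound, not a cooler design):

  # Minimum airflow to carry ~925 W away at a 20 K air temperature rise.
  # Idealized lower bound; real coolers need a lot more flow plus fin area.
  power_w = 350 + 575          # Threadripper-class CPU + RTX 5090-class GPU
  cp_air = 1005                # J/(kg*K), specific heat of air
  rho_air = 1.2                # kg/m^3, air density
  delta_t = 20                 # K, allowed air temperature rise
  mass_flow = power_w / (cp_air * delta_t)    # ~0.046 kg/s
  cfm = mass_flow / rho_air * 2118.88         # m^3/s -> CFM, ~81 CFM
  print(round(cfm), "CFM through a single heatsink, before any margin")

And that's the theoretical floor; a real cooler has to push that air through a dense fin stack against real pressure drop, with margin on top.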

Separate packages get you more space, separate fans, separate power connectors, etc.

In theory you could do the split a different way, i.e. do SMP with APUs like the MI300A, and then you have multiple sockets with multiple heatsinks, but they're all APUs. But look at the size of the heatsink on that thing, and it's really a GPU they integrated some CPU cores into rather than the other way around; the power budget is dominated by the GPU. And it's Enterprise Priced, so they get to make the "nobody here cares about copper or decibels" trade-offs that aren't available to mortals.



