M1 Mac Mini Teardown (egpu.io)
181 points by gjsman-1000 on Nov 18, 2020 | 205 comments



One thing to note is that the JHL8040R is a TB retimer, not a controller. The difference is that a controller “speaks” the TB protocol and exposes USB, PCIe, and DP signals to (typically) the PCH and/or Intel CPU. Controllers are typically 10-20W and, from personal experience, absolute shit to work with (buggy firmware).

A retimer is a < 1W part that just strengthens the TB signal coming from the outside world and cleans it up so the controller (integrated in the M1 here presumably) does not have to deal with a weak or noisy signal.

It's important to note that Intel plans to integrate a TB4 controller in their future (mobile?) processors, but it seems Apple beat them to it. Also, it's unclear whether the M1's integrated TB controller is TB3 or TB4 (you can have a TB4 retimer drive only TB3 signals).


According to Apple's tech specs, it's Thunderbolt 3: https://www.apple.com/mac-mini/specs/

Aside: I'm sure I had read before that the M1 Macs come with USB4 and Thunderbolt 3. However, the tech specs say it's USB 3.1 Gen 2 and Thunderbolt 3.


That page says:

"Two Thunderbolt / USB 4 ports with support for:

* DisplayPort

* Thunderbolt 3 (up to 40Gb/s)

* USB 3.1 Gen 2 (up to 10Gb/s)

* Thunderbolt 2, HDMI, DVI, and VGA supported using adapters (sold separately)"

The wording isn't terribly clear, but given this snippet from Wikipedia's USB4 page:

"The USB4 specification is based on the Thunderbolt 3 protocol specification.[2] Support of interoperability with Thunderbolt 3 products is optional for USB4 hosts and USB4 peripheral devices and required for USB4 hubs on its downward facing ports and for USB4-based docks on its downward and upward facing ports."

I'd say it's a USB4 port that is also compatible with TB3 and USB 3.1 gen 2.


The tech specs page you linked says both things:

"USB 4 ports with support for [...]"

"USB 3.1 Gen 2"

It's rather confusing. Is USB 4 just another name for USB 3.1 Gen 2 with Thunderbolt 3 support? I wouldn't put it past the USB IF to come up with something like that, but I don't think that's correct here.


USB4 is essentially a branding strategy around the USB-C connector, tunneling various standards (USB 3.2, Thunderbolt) and speeds. Some are required and some are optional to get branded as USB4.


One could alternatively describe it as a rebranding of the Thunderbolt multiplexing layer.

For those unfamiliar: USB4 does not specify how to interface with specific non-host devices like USB 3 and older did.

Instead it is a multi-protocol tunneling (think multiplexing with routing) system, that allows tunneling USB3.2, Display Port, and optionally PCI-E (i.e. what thunderbolt is known for). It also specifies a requirement of support for different alternate modes, like display-port alternate mode (non-tunneled), and optionally thunderbolt 3 alternate mode. (Thunderbolt 3 runs at a different rate than USB4, among a small list of other differences, not counting the additional features specified in USB4).

Hubs are required to support the otherwise optional Thunderbolt 3 alternate mode and USB4-tunneled PCI-E, thereby making every USB4 hub a valid Thunderbolt hub, both for classic Thunderbolt (TB3) and for USB4-based Thunderbolt (i.e. using PCI-E tunneling in USB4, which I assume is part of the updates made in TB4).

Hubs will also support plain USB 3.x from a host, since USB4 is just a negotiated alternate mode over USB3. Thus USB3 can be used unencapsulated if only the hub supports USB4; fully encapsulated if host, hub, and device all support USB4; or partially encapsulated, as when the host and hub support USB4 but one of the downstream devices only supports USB3. The hub becomes responsible for encapsulating/unencapsulating the tunneled USB3.

In a similar way, a hub can support connecting a classic thunderbolt (TB3) device to a hypothetical host that only supports USB4 PCI-E encapsulation. The upstream port and downstream ports are independent, so one can use the TB3 data rate while the other can use USB4, and the hubs are required to pass the PCI-E data through a PCI-E switch, so everything just works. (Especially since talking TB3 is deliberately nearly the same as talking USB4, other than link speed).

USB2 is of course supported as well, in parallel to all other modes, since it has separate data lines.

But USB4 is not all perfect. For example, every USB4 port on a host must support DisplayPort output. That is not really a huge problem on, say, a laptop, but it is a pain for enthusiast PC building, since it means motherboards will need a DisplayPort passthrough socket in order to meet that requirement. And undoubtedly there are or will be many non-compliant devices out there that break the whole intended it-just-works approach.
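
To make that host/hub requirement matrix concrete, here's a minimal sketch (my own simplification in Swift, not spec text) of which tunnels and alt modes a USB4 port has to carry depending on its role:

    // Rough model of the USB4 requirements described above.
    // Illustrative simplification only, not spec-accurate.
    enum Capability { case usb3Tunnel, displayPortTunnel, pcieTunnel, tb3AltMode }
    enum Role { case host, hub }

    func requiredCapabilities(for role: Role) -> Set<Capability> {
        switch role {
        case .host:
            // Hosts must tunnel USB3 and drive DisplayPort; PCIe tunneling
            // and TB3 compatibility are optional on the host side.
            return [.usb3Tunnel, .displayPortTunnel]
        case .hub:
            // Hubs must support everything, which is what makes every USB4
            // hub a valid Thunderbolt hub on its downstream ports.
            return [.usb3Tunnel, .displayPortTunnel, .pcieTunnel, .tb3AltMode]
        }
    }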


Thanks for the explanation, I've been lost in the USB specification for a while. The last major read I did about it was on USB 3.0 so it's nice to have a summary. Will be easier to go dig deeper.


The marketing site clearly says USB4.

"""With two screaming-fast Thunderbolt / USB 4 ports, two USB-A ports, HDMI 2.0, Wi-Fi 6, and Gigabit Ethernet, Mac mini is up for anything and everything."""


> They’re typically 10-20W

Ah so that's why whenever I plug anything into a TB port the CPU temps go up by 10ºC.

This has happened on all my macs since TB1.


> Also it’s unclear if M1’s integrated TB controller is TB3 or TB4

AFAIK TB4's spec was never opened up and only TB3 was donated to the USB-IF. Not to mention the TB4 spec was only finalised and released sometime this year, so I seriously doubt the M1 would have TB4.

And it is pretty clear Apple implemented USB4, which is basically USB 3.2 with USB-C being mandatory, plus optional TB3 compatibility.


Perhaps you could explain the difference between TB3 and 4 - my understanding was that TB4 was a branding/certification mark now by intel on top of USB4's integration of TB3 tech into the specification. It requires certain architectural features outside of conformance, such as memory protections.

But to hear your post, it sounds like there is a difference between a TB3 and TB4 retimer. Are there other additions to TB4?



> As soon as the M1 Mac mini arrived, I tore it down to confirm a few things Apple didn't make clear. All M1 Macs come with Thunderbolt 4 ports.

I think there's some nuance and technicality to this, because even though the ports themselves could meet Thunderbolt 4 specs under different conditions, they don't in the current configurations. Apple deliberately calls these "Thunderbolt / USB 4" ports, leaving out the version number from Thunderbolt. One of the requirements of a Thunderbolt 4 port is that it can drive two 4K displays, which Apple is very clear it cannot in the current models. So, that alone makes these not Thunderbolt 4. I don't think anyone's tested the rest of the Thunderbolt 4 features to know which ones are and aren't present.

https://newsroom.intel.com/wp-content/uploads/sites/11/2020/...


Looks like the site is down. Here's the archive.org mirror: http://web.archive.org/web/20201118135839/https://egpu.io/fo...



This returns 404 for me.


Now that I see egpu on hackernews I know why the site was slow ;-)


Something I was personally wondering about is how the DRAM was added to the M1 'chip'. This image shows clearly: http://web.archive.org/web/20201118042108/https://egpu.io/wp...

I guess they must be using much higher density for their 16G than a typical 16G SODIMM for example. I'm not fully understanding why the memory chips need to be on the M1 package if they're using the same package as all other memory.


> I'm not fully understanding why the memory chips need to be on the M1 package if they're using the same package as all other memory

Some guesses:

Reduces the number of pins (lands these days) on the CPU package, which in turn reduces cost and potentially improves thermal efficiency, since space previously used by pins can instead conduct heat away from the die.

Improved performance due to shortened traces between CPU and memory die.

Ability to remove ESD protection circuitry from both CPU and memory pads, thus improving performance.

Reduced power consumption due to lower inductance in CPU <-> memory traces because they're shorter.

Arrangement anticipates a future SoC with DRAM on-die, without motherboard redesign.

Somewhat more secure due to difficulty probing CPU <-> memory traces.


Each land/pin conducts heat, so the more the better for thermal conduction. But these CPUs use so little power it's not a big deal regardless.


> Each land/pin conducts heat, so the more the better for thermal conduction. But these CPUs use so little power it's not a big deal regardless.

Is it possible this is part of a plan for a future high-performance Mac Pro SOC where power consumption ends up being much higher?


A big part of the reason is that the CPU, GPU, and neural engine all have direct access to the memory. That's the idea behind the unified memory that Apple talked about during the keynote. Compare this to a normal setup, where you have main memory and dedicated VRAM. You load something into a game by fetching it into RAM, and then loading it into VRAM by dispatching it over PCIe to the graphics card, which then stores it in VRAM. There are a lot of extra steps. My guess is that their memory subsystem is fairly radical to handle the different types of memory access for the different functional subsystems, so they optimized for specific memory modules.

I really would love to see some more documentation on the M1 memory subsystem, there are so many unknowns behind the very impressive performance numbers we've seen over the past few days. And I personally suspect that the future will have traditional ram, with the on-chip ram acting as a kind of last level cache. You can't really scale this to 64GB or higher without some major thermal problems.
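
For a sense of what unified memory buys you at the API level, here's a minimal Metal sketch (illustrative only; the buffer size is made up): with a shared storage mode, the CPU writes directly into memory the GPU will read, with no explicit upload step over PCIe.

    import Metal

    // Minimal sketch: on Apple silicon a shared buffer is visible to both
    // CPU and GPU, with no staging copy into dedicated VRAM.
    guard let device = MTLCreateSystemDefaultDevice(),
          let buffer = device.makeBuffer(length: 4096, options: .storageModeShared) else {
        fatalError("Metal not available")
    }

    // The CPU writes straight into the buffer's contents...
    let values = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
    for i in 0..<1024 { values[i] = Float(i) }

    // ...and a compute or render pass can read the same allocation directly,
    // whereas a discrete-GPU setup would typically blit this into a
    // private (VRAM-backed) buffer first.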


The LPDDR4X memory chips in the M1 work about the same way as any modern iGPU or mobile SoC setup, the shared cpu/gpu memory controller arbitrates access to the DRAM.


> And I personally suspect that the future will have traditional ram, with the on-chip ram acting as a kind of last level cache.

Amiga Chip RAM/Fast RAM is the new hotness.


Closer to the core, better performance.


The RAM latency was still up at 90ns, as measured by Anandtech, so I'm not sure it's increasing performance much.

Perhaps it's beneficial for reduced power consumption? Or maybe it's just a design holdover from mobile, and it was the easiest place to put the RAM.

Though some people see upgradable RAM as a necessity, I'm not sure how often it actually happens in practice, if you look across consumers. I can't remember the last time I changed the RAM configuration after assembling a PC. One has to be a huge tinkerer for that to even enter into the picture.


Power consumption and package pinout/motherboard simplification would be the main two benefits.

Latency in DRAM access is dominated by the controller logic on the CPU side (minority) and the bank access and amplification circuitry on the DRAM side (majority). Time of flight down the wires is fairly negligible.

Power draw for the interface does scale meaningfully with wire length though. Also, routing the 100+ signals (command, address, data, reference voltages) can be a challenge, especially if the package is constrained on pin count.

The tradeoff is that maximum RAM capacity is significantly reduced. Die can be thinned and stacked quite high, but the need to share command, address, data busses means signal integrity problems and increased power for the interface with no performance increase. Stacking die has yield issues as well, so cost rises faster than linear. And finally, power dissipation becomes a real issue, DRAM doesn't like getting too hot and the bottom of that stack is a long way from the heatsink.
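
To put the "time of flight is negligible" point in numbers, here's a back-of-the-envelope sketch (the trace length and propagation speed are assumed ballpark figures, not measurements):

    // Signals in FR-4 PCB travel at roughly 15 cm/ns (about half of c).
    // Assume a 5 cm CPU-to-DRAM trace and the ~90 ns loaded latency
    // mentioned upthread.
    let propagationCmPerNs = 15.0   // assumed, ballpark for FR-4
    let traceLengthCm      = 5.0    // assumed
    let totalLatencyNs     = 90.0   // figure cited above

    let flightTimeNs = traceLengthCm / propagationCmPerNs      // ≈ 0.33 ns
    let share = flightTimeNs / totalLatencyNs * 100            // ≈ 0.4%
    print("time of flight ≈ \(flightTimeNs) ns, ~\(share)% of total latency")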


If most people are like me, they like socketed RAM because they can buy the cheaper low end model and immediately add RAM.

But now that the whole industry is apparently going to non-socketed RAM, I don't think this is going to be much of a differentiator anymore.

If there was an XPS or ThinkPad with a Ryzen and socketed RAM, I would have already bought it... but if the choice is between a $1400 non-socketed MacBook or a $1400 non-socketed ThinkPad... that's a tough choice.


Likely makes it easier/cheaper to run LPDDR4X at 4500 MHz. LPDDR4X is also often dual-channel, and Anandtech claims the M1 has a 128-bit-wide interface in the form of eight 16-bit channels. So ML, GPU, and CPU work can have more loads/stores in flight, which might help explain why the M1 performs so well.
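
As a rough sanity check on what that width buys, a quick sketch (assuming LPDDR4X-4266, the speed Anandtech reported, rather than the 4500 figure above):

    // Peak bandwidth ≈ transfers per second × bus width in bytes.
    let transfersPerSec = 4266e6        // assumed LPDDR4X-4266
    let busWidthBytes   = 128.0 / 8.0   // 8 × 16-bit channels = 16 bytes
    let peakGBps = transfersPerSec * busWidthBytes / 1e9
    print("peak DRAM bandwidth ≈ \(peakGBps) GB/s")   // ≈ 68 GB/s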


I bet it's a design holdover from the A-series SoCs. It also reduces PCB area. I'm excited to see what RAM is used on higher-end M-series SoCs.


Yeah, I'm looking at the rationale for HBM which looks physically similar and it seems to be closeness and the ability to have more wires.

https://en.wikipedia.org/wiki/High_Bandwidth_Memory

But the HBM photos have the 'interposer' layer and look to be using bare dies, while the M1 looks to not have the interposer and to be using the DRAM package. I'm not fully convinced about why the DRAM needs to be on the M1, but there must be a reason. Apple's marketing site says their 'Unified Memory Architecture' uses a 'custom package', so maybe the DRAM package appearance is the same but the interface is not.


I wonder if this will become the norm for all CPU manufacturers at some point.

If they put 64GB directly next to a high end CPU and that made it 10% faster all around, I suspect it would be a pretty popular choice despite the lack of upgradability.


I think for the most part upgrading RAM (capacity-wise) has been pretty stable (for my uses at least - programming and games). I've been on 16gb since...well as long as I can remember.

Every time I upgrade I'm jumping from DDR2 to 3 to 4 and probably soon 5, so it's not like I keep the sticks anyway. Even when I stay on DDR4 I have to upgrade because the speed of the RAM is too slow to feed the processor. It's unfortunately rare for me to not have to buy RAM with a new CPU.


If they put 64GB directly next to a high end CPU and that made it 10% faster all around, I suspect it would be a pretty popular choice despite the lack of upgradability.

What about ECC?


As usual, the uncrippled "workstation" version will have ECC and the crippled "consumer" version won't.


I mean, ECC is going to mean lower yields from the fab and a higher TDP. Will it be worth it?


I don't see why that would be the case.


Because it’s half the density of non-ECC so bigger wafers, and runs at a higher voltage?


DDR4 ECC needs just 1 extra chip for every 8 chips, not half. Of course it consumes a bit more power to calculate the checksums. IMO it's worth it for every computer.
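
The overhead works out like this (simple arithmetic, assuming a standard 72-bit ECC bus over 64 data bits):

    // A standard ECC DIMM widens the 64-bit data bus to 72 bits (8 ECC bits).
    let dataBits = 64.0
    let eccBits  = 8.0
    let overhead = eccBits / dataBits * 100   // = 12.5% extra capacity, not 100%
    print("ECC capacity overhead ≈ \(overhead)%")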


> This image shows clearly: http://web.archive.org/web/20201118042108/https://egpu.io/wp...

It's like a much smaller, more advanced version of how the adjacent cache RAM was soldered onto the same circuit board on a Slot 1, Pentium 2 CPU.

https://cdn11.bigcommerce.com/s-a1x7hg2jgk/images/stencil/12...

The RAM is not actually part of the CPU die.

It looks to me like the separate green PCB that contains both the CPU and the RAM has a ball grid array on its underside, and is then attached to the motherboard.


Interesting that they're still using the old 150W power supply when Anandtech measured the entire system's maximum power draw at 37W IIRC. It makes this first generation of ARM Macs feel kind of lazily cobbled together. It'll be interesting to see future designs where the M1 wasn't just retrofitted but is a first class citizen.


Purely speculating here, but Thunderbolt allows for pretty substantial power to be sent bidirectionally. It's conceivable that the excess power could be used for that or some other peripheral. I know I can charge my MBP off the excess power from my Dell w/ Thunderbolt 3, with a similar power supply discrepancy (only ever measured 50-60W used by the system, but the power supply is rated for 110-120W IIRC).

Or, they're just being lazy/economical and reusing old stock.


For the Mac Mini, I'd rather see it grow the ability to use more power than get smaller. More RAM, more cores, more GPU, more ports, eGPU capability. It's already a fairly convenient size that you can buy server racks and brackets for.

The new system design I want to see is an update on the 11" MacBook Air or the 12" MacBook. I'd like to see if they can get a clamshell with good battery life and decent performance under two pounds and just wide enough for a full-size keyboard. The 11" MacBook Air is still my favorite laptop I've ever owned; I'm sad they discontinued it and didn't refresh the design. Maybe it can come back with an M1 or M2 chip, though.


I’d love them to just ship the MB 12” board inside a keyboard with 2 USB-C ports and have a modern day Commodore. It can be powered by the USB-C monitor.


Yes, what I'd like to see is a new Apple Silicon eGPU. Based on the Blackmagic design, but smaller, fanless, and with graphics power to compete with Nvidia.

Why fuss over driver compatibility if the performance on Apple's graphics cards blows away the competition?


The new Air only weighs 2.8 lbs. I'm curious, what difference does a pound less really make when it's already so light?


Another pound is a full third of that weight. That’s a big percentage.

Anecdotally, there was a step function between 3 lb ultraportable laptops with 12-13” displays and the 2.2 lb 11” MacBook Air. It was the most grabbable, portable computer I’ve ever owned with enough CPU power to handle my Computer Science coursework. It changed my relationship with my laptop. It no longer felt like a lump in my bag that needed to be handled carefully, like all the previous plastic bodied things I’d owned.

I have since used a 13” Air, Pixelbook, Samsung Galaxy Chromebook (until it died), and I’m typing this now on an iPad Pro, and I still miss that laptop.

The HP Dragonfly looks real nice, maybe if I put Linux on it. It’s also $2200 for decent specs though, whereas the 11” MacBook Air was a comparatively reasonable $899.


Yes, please a new 11“ would be truly awesome.


To play devil's advocate, why bother designing a new PSU when you can just slap in the existing one and be perfectly compatible?

The efficiency difference with a 37 W max power draw is negligible.


It's also a proven component with an established supply chain. A new PSU has to go through worldwide certification testing, etc. Why incur additional cost for a low-volume product?


"lazily" is probably the wrong word.

For the first release, the approach here is pretty obviously to make the minimum changes to existing machines needed to accommodate the M1.

That approach makes a lot of sense because it minimizes the time to market. It puts Apple silicon in people's hands as soon as possible. That's important because the sooner that happens, the sooner the software will be updated for AS.

Think about it: the alternative was to just make people wait for the redesigns, which will take a year or two more.

And the redesigns are coming... judging by all that empty space in that mini, they will be dramatic.


> That approach makes a lot of sense because it minimizes the time to market.

Apple most likely has the oomph (financial and human) to do both at once. However only changing one thing makes troubleshooting way easier if things go wrong. Here the case and overall machine design is a known entity.


Could it be that the 150W PSU is kept in order to power thunderbolt peripherals, or does thunderbolt not offer the ability to transmit power to peripherals from the host?


I'll speculate that the follow-on to the M1 Mac Mini will use a more powerful processor that has higher power requirements and supports more ports with higher power requirements. Is it cost effective to drop to a smaller power supply for one generation? Or just leave that supply chain in place?


Honestly, from their point of view, the less they change in the first gen the better. It reduces the risk of exciting first-gen Apple product problems marring the M1 launch.

They did this with Intel, too, more or less.

Also, historically the Mac Mini has been a split line; one SKU with a low power (~20W) chip and one with a 45W chip. But otherwise mostly the same. It's fairly plausible that they repeat that here; when the M1X or whatever shows up I'd expect to see a more expensive Mac Mini SKU that has one.


What happens when I plug an iPad and 3 other TB devices into the Mac Mini? Is the power still 37w?


So funny how the heat spreader looks like someone just snipped a portion of it off during assembly to make it fit with the dimms there.


The A12X looks like this as well IIRC.


Very interesting to see how they bailed on using an IHS over the two DRAM modules ON the processor package! I love how you can see "reach" engineering solutions in Apple products; it's clear that they realized the RAM would be too large to fit under an IHS and just decided to go with what ended up in the M1 Mac Mini. Very much hoping Apple starts allowing even third-party Nvidia drivers again for the latest-gen GPUs...


> it's clear that they realized the ram would be too large to fit under an IHS

This is exactly the same design they used in the A12X and A12Z ( see: https://en.wikipedia.org/wiki/Apple_A12X )

They clearly do have room for a larger IHS that covers the RAM too, but either this makes manufacturing cheaper since they didn't have to change anything from the previous rev or there's actually a real benefit to not dumping excess core heat directly into the RAM (seems more likely). Sinking heat is great, but only if the temperature differential points away from what you're trying to cool.


Interesting! Thanks for the info!


Unfortunately it looks like they're taking the opposite approach, and phasing out even AMD drivers, with eGPUs apparently (?) not supported on M1 devices. It seems they want to phase out all non Metal APIs.

(Which is to say, I bet the return of eGPUs on mac will only happen if apple releases a discrete Metal-based GPU)

It makes sense, insofar as it means they can further integrate hardware solutions they make themselves and not deal with higher-level driver interfaces. Otherwise they'll be limited to incorporating features present in all supported vendors.


> and phasing out even AMD drivers, with eGPUs apparently (?) not supported on M1 devices. It seems they want to phase out all non Metal APIs.

That doesn't make sense, as the Metal APIs are implemented on top of the graphics drivers. There are Metal drivers for AS, Intel iGPU, and AMD. The last two are not built for arm64. On AMD's side it would require work from AMD, as most of the driver is shared code from Windows. I wouldn't be surprised if that changes in the future.


You're right -- I didn't mean to imply that's why eGPUs aren't supported right now, I meant I suspect it's a policy change, and Apple plans to move to more integrated drivers in the future, so in "Courage!" fashion, they've preemptively ditched it.

I can't imagine there's a hard technical reason why they couldn't support eGPUs. I've used both AMD and nvidia Linux drivers on a ppc64le workstation -- I'd be surprised if porting to darwin/arm64 was that much work. Their endianness matches -- though the difference in page sizes might be a problem. On Linux, some drivers just assume 4kb page sizes, and updating them is really difficult[1]. But AMD and Intel GPU drivers are fine with non-4kb at least, so...

[1] https://bugs.freedesktop.org/show_bug.cgi?id=94757


I think part of the issue with the eGPU as mentioned in the forum post that is linked is that the drivers haven't been compiled for arm64. So maybe they will bring it back once the drivers are stable enough?


What does IHS mean?


Integrated heat spreader, I believe.


Yes, and if you look at the IHS it looks like it was intended to cover the RAM, but couldn't, so they just chopped that part of it off.


integrated heat spreader


Is that the SSD soldered directly to the board? If so, that's a big disappointment considering the amount of free space in there.


It has been soldered to the board since 2016 (on portables), so no surprises there. Since the T2, it's only flash chips anyway; the drive controller is inside the T2.

So yes, make sure your backup procedure is good.


They nuked the lifeboat connector as well, which was potentially useful for actually recovering data if your logic board died. If you lose your CPU, you now lose your SSD.


Is losing a CPU a common enough failure-case that it makes sense to implement fail-safes for? I would think that's pretty low on the list of likely causes of failure.


I mean, it's more than the CPU - losing power regulators, or anything else that causes your machine not to boot into a DFU mode. Watch Louis Rossmann's channel for long enough and you'll see how many possible failure modes there are.

My 2014 MBP died a few years back from a power regulator failure, but I was able to pull the SSD and image it in case they were unable to repair the device.


As noted below, it's a lot more than the CPU.

I lost three Macbooks in two years from USB controller blow outs, which is a known issue because the part they used is lousy, and it took out the entire storage with it. I had some backups thankfully, but I know people who have lost everything.


I'm fine with them integrating the SSD into the mainboard, just wish they offered a slot in there to add some additional storage later.


Everything is soldered now; your ownership is basically the same as with an iPhone or iPad.


Ownership (and adjacently right-to-repair) and upgradeability are two separate problems, let's not conflate them.


This rig has two PCI Express ports right on the back. Upgradability seems OK.


AFAIK, the ARM Macs have no eGPU support.


That's covered in the article. It's a lack of AMD ARM drivers.


That's not really conclusively proven at the moment.


Wow, such empty!


is it apparent from the teardown if there is a limitation which prevents 10g ethernet from being included in this model? pcie lanes or something similar?


Not sure, but if you still need 10G you can get comparatively priced ($100 option vs $150 product) TB3<->10G adapters: https://www.sonnetstore.com/products/solo-10g


Any idea why this is so large?


My only slightly informed assumption is that it's using an older 10G PHY and the size is for cooling. These were a bit over 10W, not including the USB circuitry.


Could be worse, see https://www.amazon.com/CalDigit-Connect-10G-Thunderbolt-Ethe.... Looks like the controller generates significant heat, so there's a tradeoff between active cooling and increased thermal mass.


that is reasonable, thanks for sharing.


This is the lower end model. Apple kept the higher end intel mini in the line-up, which means they intend to replace that with something more powerful, which may add the 10G option.


I wish I shared your confidence in this. Apple has been all over the map with the mini so it's really hard to tell.

If they did though, it looks like there is plenty of space to turn this into a real powerhouse with the right CPU.


It used to be complicated because of the Intel SKUs.

Now Apple will simply use the next M chip in the 16" MacBook Pro and put it into the Mac mini. And that should have a spare PCIe lane for 10Gbps Ethernet.


I don't think it was that simple. There were some massive gaps in the mini-upgrade cycle which didn't correspond to gaps in Intel's CPU lineup.

More recently though, it seems like Apple got the memo that people love these things. Hopefully we'll see a consistent flow of ever-improving minis alongside the MacBooks.


Apple obviously has sales numbers for 10g on the existing consumer products that offered it. I imagine it hasn't been a big seller.


my impression was always that the 10g was primarily offered to satisfy whale-customers like MacStadium. I recall seeing somewhere they accounted for estimated > 50% of Mini sales (although I may be completely embellishing that number). anything else just gravy.


SFP cage is huge


Don't need SFP for 10G. an 8P8C connector is still fine for that.


Quite a few 10G users today need fiber. 10G over copper is a niche application.


Holy crap there's a lot of dead air space in that teeny tiny case.


I would love the same comment to be true about the MBP. Add a couple of mm to the thickness. This would allow useful ports to be added to the body, and then the remaining space could be filled with battery.


I think there’s a max size battery you can put in laptops before you’re not allowed to bring your laptop on a plane. It’s either 80Wh or 100Wh.

So manufacturers can only stuff so much battery into that space.


100Wh.


This is also what the 16-inch MacBook Pro has, so even if they find some empty space inside it after Apple Siliconing it, there's really no going up battery-wise anymore.


In practice, external batteries are so cheap these days, and it's easier than ever to charge your laptop from them with USB-C, that it's less of an issue than it used to be. It certainly holds up with phones and smaller devices, though.

I'd think the best argument for a thicker MBP would be greater cooling potential anyway.


> This would allow for useful ports be added to the body, and then fill the remaining space with battery.

More ports would be nice (I'm hoping but doubting that we can get some more ports on the pros once the cooling system can be smaller) but the 16" is already pretty maxed out on the battery front so don't expect any changes there.


I never expect anything other than to be disappointed in Apple hardware. I'm a different animal than their target demo. Small is not cute to me. Smaller means I'm sacrificing something. I can sympathize with the move from USB-A to USB-C. We had to learn the same with the transition from ADB->USB and/or SCSI->USB. The one thing I feel is inexcusable was the removal of the SD card reader.

What does "maxed out" on the battery front mean to you? Every laptop/iDevice comes with a different battery design/shape. Laptop batteries should be like water looking to fill every crack and cranny with more space to store electrons.


The 16" MacBook Pro has a 100 W*hr battery. This is the limit of what the FAA considers a "consumer sized battery" [1], so going larger would restrict the ability to take them on commercial airplanes...

[1] https://www.faa.gov/hazmat/packsafe/more_info/?hazmat=7


The 16" MBP has the biggest lithium battery you can have and still fly on US planes:

https://www.tsa.gov/travel/security-screening/whatcanibring/...


You can have a 100Wh main battery and a 300Wh spare battery, but not a 150Wh main battery? What is the reasoning for that?


I suppose it's less likely to go off like a Note 7 if it's not attached to the computer and in use. Otherwise, no idea.


I'm delighted to no longer have an SD card reader I've never once in my life used jammed into every laptop.

Different strokes for different folks.


> The one thing I feel is inexcusable was the removal of the SD card reader.

A SD card reader seems super-niche. I'm sure it's important to you.... but everyone probably has some random port they'd appreciate and they can't add one for everyone. I think a dongle and USB-C is a much much better solution, for everyone on average.


The one thing I feel is inexcusable was the removal of the SD card reader.

What do you need an SD card reader for? For the average person, whose primary camera is their phone, I can’t think of a reason they’d need a built in SD card reader.


> The one thing I feel is inexcusable was the removal of the SD card reader.

I suspect the number of USB ports and removal of SD has a lot to do with the capability of the M1 CPU which was designed around the lower end of Apple's lineup.

With a limit on how many channels they have for USB devices, having an additional USB port is more flexible than having an SD reader.



Looking forward to a macbook teardown.


HN crashed egpu.io


I expect Apple to roll out its own gaming console by the 2nd or 3rd gen. There's so much potential here. One of the most underrated Apple announcements in a very long time. I haven't been this excited for the Mac since they announced OS X!


> I expect Apple to roll out its own gaming console by the 2nd or 3rd gen. There's so much potential here. One of the most underrated Apple announcements in a very long time. I haven't been this excited for the Mac since they announced OS X!

I would be shocked if this happened, since Apple has no ecosystem of game developers whatsoever.

Both the newest xbox and the PS5 are for the most part backwards compatible with the massive base of xbox one and PS4 games, since they're a shared x86_64 architecture.

Microsoft has the huge advantage that games can be developed for both windows and the xbox at the same time, with the same libraries. This goes all the way back to the origins of the first xbox which was nearly named the DirectX box.


There is a ridiculous number of games on the iOS platform, many of which offer the full game. Examples that come to mind off the top of my head are Final Fantasy and the GTA games.


> There are a ridiculous amount of games on the iOS platform

not comparable at all, 95% of them are simplistic games that can be built by a team of 5-10 people, max. they're things in the same general category as candy crush.

there is nothing analogous to horizon zero dawn, or destiny 2, or death stranding, or cyberpunk 2077 on the ios platform. I define that as AAA size games that require significant GPU horsepower, 70GB+ disk space, and a dedicated handheld game controller.

Things that are not compatible in any user-interface way with fingers on a touchscreen don't really exist for iOS.


> I define that as AAA size games that require significant GPU horsepower, 70GB+ disk space, and a dedicated handheld game controller.

Nintendo seems to do just fine with iPad-like (or lower) specs on the Switch. Breath of the Wild is one of the bulkier games, weighing in at ~14GB, Super Mario Odyssey is ~6GB; these are widely acclaimed as some of the best games on any platform.


The majority of the money made on games in the industry is made with "simplistic games".


Most of those are ports; I know FF has some 'native' mobile games, but those are Gacha games optimized for mobile devices - microtransactions, repetitive gameplay, addictiveness abound.

The closest thing Apple has to a games console is Apple TV, and the offerings on there have been... tentative? They have pushed for developers to pick up Apple TV development, but to date, most of the games on there are ports from the other Apple devices.


> I would be shocked if this happened, since Apple has no ecosystem of game developers whatsoever.

Apple is one of the biggest publishers in the market, with over $8bn revenue from games


Games on iOS sold very well but I don't think Apple's game platform/strategy works well.

iPhone as a gaming platform: device sold well, SoC is excellent, RAM is fine, API is fine, users are relatively rich, IAP is easy.


Apple Arcade?


Most Arcade games run on the Apple TV already. Many even require a controller (PS/Xbox/bluetooth).


Uh, AppleTV?

Apple may very well beef up AppleTV and market the gaming angle of the device a lot more than they currently are, but given their history with Pippin I think your claim they will pitch a new dedicated game console type device is unlikely in the extreme.


> given their history with Pippin

Although I agree with your conclusion, the Apple Newton didn't prevent the creation of the iPad. That would have been a little silly—both Apple and the world are so different now.


I was thinking more in terms of iOS apps and games eventually making their way into a Mac Mini type of system ($600-700 range). They already have the ecosystem with tons of developers. It would not surprise me if Apple has 5nm GPUs in the works that can surpass Nvidia and AMD by a significant margin.

I am also very interested in what they do with Mac Pro.


It's worth noting that $600-700 would make a theoretical Apple game console more expensive than the top-of-the-line models from Sony and Microsoft, which are themselves the most expensive consoles in recent memory.

The Apple TV 4K already costs $180/$200 without a controller, which could easily set you back an additional $60. Combined, that's more than half the price of a disc-less PS5, and very close to the cost of a Nintendo Switch.


>very close to the cost of a Nintendo Switch

The Nintendo Switch is pretty deceptive in its pricing. You buy it and quickly realise that you need to spend a lot more money.

Internal storage is limited, so you need an SD card at least.

Then if you plan on carrying it around, a screen protector is pretty much a requirement. And if you are planning on carrying it around, you'll also want a case for it.

And if you don't plan on carrying it around and are only docking it, you probably want a Pro Controller since the included ones are not really that good for couch play.

And after you've accessorised yourself, you're now stuck with the most expensive game library, with first-party titles pretty much never dropping in price. Want to buy a first-party launch title for the Nintendo Switch? That'll be $50-60 for Breath of the Wild. Meanwhile Horizon Zero Dawn, which is an acclaimed first-party title pretty much the same age as BotW, is available for $20 or less. Even cross-platform games might be more expensive on the Switch due to the cartridge costs.


I don't think I agree with the accessories bit. The internal storage is fine if you're buying physical copies of games, and while it's true that some cartridges require extra downloads, most don't, and in fact I don't think they're present in any first party titles. I expect a lot of Switch owners just pick up copies of Animal Crossing, Zelda, and Mario, and never need an SD card. Personally, I bought my Switch at launch, and it took about a year before I ran out of internal space.

I also think the JoyCon grips are fine for TV play (In fact I prefer them over the pro controller) and while I'd also recommend a screen protector, I don't know anyone else who has one IRL. The types of scratches picked up by the Switch are usually only visible in direct light, and a lot of people seem to not care.

That first-party Nintendo games don't drop in price like most games do has been a truism for a very long time. I'm not sure how Nintendo does it. I generally respect them for it: whatever the proper cost of games should be, I don't think it's healthy for games as a medium that they lose a third of their value in a few months.


>The internal storage is fine if you're buying physical copies of games

Well, sure, if you're only buying physical. However, that's not really something you can expect these days. The Switch has 26 GB of usable storage. That's not a lot considering Breath of the Wild is over 13 GB.

The Xbox One and PS4 come with at least 500 GB. Even if you account for the fact that the download sizes there are larger, there's still much more space proportionally, so you're going to be spending more money to get the full experience on a Switch.

>I also think the JoyCon grips are fine for TV play

It's pretty much the worst controller out there. Not only is it incredibly uncomfortable, it's rather unreliable; the drift issue on the Joy-Cons is well documented. And once again coming to the controller price issue: the Pro Controller is $70 and Joy-Cons are $80, whereas the Xbox and PlayStation ones are $60, and there's much less of a need to get one since there's already one in the box for you.

>while I'd also recommend a screen protector, I don't know anyone else who has one IRL

I don't know if there are any people who haven't actually gotten screen protectors, especially since it was reported that sliding the device in and out of the dock may incur scratches on the screen. When one of the basic features of the devices can result in scratches, you really want one.

>whatever the proper cost of games should be, I don't think it's healthy for games as a medium that they loose a third of their value in a few months

I'm not exactly sure it's healthy for old games to retain their value better than stocks. I don't think it's good for consumers if they're still going to be paying the brand-new game price for BotW a couple of years later when it's a five-year-old game.

Really, speaking from experience, I have a PS4 and a Switch. For the PS4, the only accessory that I've had to buy was a charging dock, and I don't really consider it an essential one. I could charge with a cable just fine, but the dock is just a bit nicer and looks a bit better. For the Switch however, I've had to buy an SD card, case, screen protector and a Pro Controller, all of which I consider pretty necessary to get the full experience.


Apple already has a huge gaming platform with iDevices and the app store. Mobile games are a huge business and Apple takes 30% of that revenue.

So why would an energy efficient ARM chip suddenly make a gaming console of interest to Apple? Both Playstation and Xbox are X86 PC hardware. Apple could have just used x86 years earlier, but they didn't. Why? Because that's not their market.

Sales of consoles are driven by price and the game portfolio. I have yet to meet a consumer who will be excited for an energy-efficient console which costs more than the competitors and has almost no big titles.


Apple has never understood gaming, never will.

They cannot compete against PS5 and Xbox. This is not an easy market to enter, making the hardware itself is just a tiny part of the job.


Completely disagree.

They understand hardware design and have a larger (and more accessible) software distribution platform than the ps5 and Xbox combined.

Many ALREADY use Apple products as a gaming platform.

They have now wiped the floor with CPU performance. If they can provide comparable GPU performance (they are ONLY a generation behind), then I absolutely think they could compete in that arena.


The many people gaming on their iPhones are not the audience that would buy a dedicated gaming console and hundreds of dollars' worth of games.

They would if 1) the Apple exclusives could compete with the Microsoft and Sony ones and 2) the games were not mobile stuff but current-generation AAA.

Yeah, not in a million years an Apple gaming console would succeed.


This entire discussion is about how impressive Apple's performance is with their chips.

Why wouldn't Apple's console be capable of running AAA titles (or arguably even better at it)? The vast majority of AAA titles aren't platform exclusives.

On top of that, it would seemingly draw less power / generate less heat.

Hell, Apple could release a Nintendo Switch style competitor (that also actually competes with the Xbox and PS) that is purpose built for AAA gaming at home and on the go.


The problem with introducing a third console to the world has never been performance. No gamer cares about heat, PS4 was a jet engine yet it sold like hotcakes.

The question is why a gamer, a hardcore one, not the one that wants to play Candy Crush, would spend multiple hundreds of dollars and find space in his living room when he already has a PS5 and the new Xbox. What could possibly entice him?

And which game studio would port their game to a console that uses its own API and its own OS, and is basically incompatible with anything else? Yes, the PlayStation is its own thing as well, but it's the bloody PlayStation, not some big unknown.

I spend a ton of money on what is basically my hobby, video games, and right now, nothing would make me buy an Apple console. Absolutely nothing. Who cares about performance, show me the game lineup first.

EDIT: One quick word about the Switch: the reason it exists among giants like Sony and Microsoft is because it has Nintendo exclusives, and Nintendo is a common household name and has always been associated with "games for the whole family". Well, the form factor too. But again, the emphasis is on exclusives and game lineup.


>PS4 was a jet engine yet it sold like hotcakes.

I don't really understand how the noise is even an issue. The only time I actually hear my PS4 being loud is when I stop playing my games and put the console into rest mode, since I won't have any game audio at that point and the fans are still spinning. I've never heard the PS4 over my game audio.


Well presumably the apple machine would be much cheaper, given the price points people are talking about the M1 coming in at.


Unless it was sold at a loss, no. The GPUs in the consoles are much, much more powerful than in the M1, they have completely different memory configurations, and have much faster storage as well as hardware decompression systems.

To get to this level, cost would increase significantly; add in expensive cooling and so on and you'll get to a price higher than the consoles. Indeed, the slower Mac Mini is more expensive.

Also, even then the margins would be too low for Apple to be interested.


The GPU in a ps5 is equivalent to an RTX 2060. The onboard graphics in the m1 is (almost) equivalent to a GTX1060. That is only one generation behind.


No, it really isn't. It's equivalent to a 1060 running outdated graphics APIs on barely supported drivers.

I assure you it's not comparable to a 1060, and when people start benchmarking them on Vulkan or DirectX instead of decade-old OpenGL this will be apparent.


The benchmarks don't take advantage of Metal either!

Obviously we will have to wait for more sophisticated benchmarks that can take advantage of otherwise unsupported features - but based on raw performance, they ARE comparable. That is not even up for debate.

As should be clear - these aren't consoles. But apple certainly have the knowledge to create one.

Markup on apple products is typically 50%. That gives a LOT of wiggle room for them to meet the pricing of your famed loss leaders (that are ALREADY a generation behind at release!).


The benchmarks might not take advantage of Metal, but they take advantage of modern drivers.

If you want to know just how much of a disaster NVidia GPUs are on Mac OS, you can test them on Windows vs Mac OS and you will notice a huge difference.

It's so bad that people were working on OpenGL to Metal translation layers.

The 1050 Ti is indeed much more powerful than the GPU in the M1: even on OpenGL, where it is generally 30% slower than it is on DirectX, and even though it has to deal with horribly bad drivers and its extensions are not fully supported on macOS, it stays competitive. And Apple does not have the knowledge to be competitive with AMD in GPUs, not even remotely close.

If you want to look at raw performance, we can. The M1 GPU does 2.6 teraflops. The GPU in the PS5 does 10.28 teraflops.

The PS5 is, in raw performance, 4 times faster than the M1 chip.


> The 1050Ti is indeed much more powerful than the GPU in the M1

The 1050ti is 2.1 teraflops, vs the 2.6 for the M1! Also, M1 GPU has "extensions" that aren't being benchmarked...

> The PS5 is, in raw performance, 4 times faster than the M1 chip.

I already said that the ps5 GPU is a generation ahead of M1, so it is no wonder it is more performant.

Anyway, like I said, we will have to wait for some comparable benchmarks. Not sure how that will ever happen though.


The PS5 isn't just a generation ahead, 4 times more compute power is 2-3 generations.

As for the 1050Ti, while it has a bit less raw compute power, it uses many tricks like variable rate shading, streaming memory compression, and vastly faster memory, that the M1 doesn't have.

The difference between those features, exposed via extensions but not used on macOS, is that they actually significantly speed up the rendering of video games. If they were in the M1 chip, I don't know why Apple wouldn't expose them. Nvidia does, on every platform except macOS, because Apple doesn't want them to.

As for benchmarks, the best you're going to get is a Vulkan benchmark running on the Nvidia card via the native driver and on M1 via MoltenVK.


Nope - the PS5 is the same as the last GPU generation in terms of both performance and features (e.g. Nvidia Turing). The 1050 Ti is the generation before that (Pascal).

M1 has "variable rate shading", and "streaming memory compression" (since, at least, A12) - in addition to many, many more features both before and since. I don't understand why you would think otherwise. Go read the docs.

We can't just talk about "faster memory", as the architecture is so completely different. But let's just start by saying that it is LPDDR4X in the M1 vs DDR3-class (GDDR5) in the 1050 Ti, i.e. it's MUCH lower latency and has higher bandwidth per module. It also utilises a unified memory pool, i.e. the CPU and GPU access the same memory, so there is no "transfer" involved. This isn't your "shared memory integrated GPU" of old.

As I said, there is no fair benchmark to compare these two. Based on OpenGL, the M1 wins. You think the 1050 Ti's features would make it more performant; I'm saying the M1 has those same features (and more), which also wouldn't be utilised by a simple OpenGL benchmark.

By all means enjoy your ps5. It is a great console (I want one myself), and it isn't being threatened by the m1.

But don't deceive yourself into thinking it is something magical, or cutting edge - or that a trillion-dollar computer hardware designer and manufacturer doesn't have the know-how to create something as performant.


> Why wouldn't Apple's console be capable (or arguably even better) at running AAA titles? The vast majority of AAA titles aren't platform exclusives.

Stadia is perfectly capable of running AAA titles (more or less, though it's a pain to develop for, as an Apple ARM box would be).

It's still a miserable failure, because gaming is about understanding a huge ecosystem play and, in many ways, culturally going for a market segment that Apple flat out doesn't like and doesn't understand.


Exclusives don't take the market, as is evidenced by PCs having a larger share of the gaming market than both PS5 and Xbox combined.

Apple have products other than mobiles - this very article is about one of them...


> Exclusives don’t take the market, as is evident by PCs having a larger market share of the gaming market than both PS5 and Xbox combined.

The PC platform has huge amounts of exclusives, and lots of other advantages that aren't remotely available to Apple's sensibility (end user modding, porn, heavy discounting etc).

Apple would fail miserably at a games console, which is why people have been predicting for years the Apple TV would turn into a games console and it doesn't, because it's a terrible idea and they have no competency in this area, and because to succeed they would have to start doing things completely alien to Apple.


There’s a difference between exclusive and available. “Big PC” isn’t forming deals with studios.


> Many ALREADY use Apple products as a gaming platform.

For casual mobile games, which is a whole different segment.


That’s simply not true. There are games available across Apple products, and partnerships with AAA studios announced last year - and they haven’t launched a console yet.


Which AAA titles?


Mobile gaming is basically antithetical to conventional console and PC gaming. Low commitment, low complexity, low budget. It's what people play when they precisely don't want to invest into a richer gaming experience.

It's like comparing ping pong and tennis.


The hardware is excellent, but I still don't think Apple (and partially Google) empathizes with the gamer's mindset like Nintendo, Sony, and Microsoft do.


There are already a lot of games people play regularly on their iPhones/iPads that will work out of the box on a game console with an MX chip.

In the same way Nintendo isn't a super direct competitor to Xbox/PS, I would expect the same of Apple's offering.


Nintendo has half a dozen wildly popular, exclusive franchises which date back 30+ years. Apple has...nothing, frankly.


Nothing... except for the (very many) games that have already been made for iOS.


How many games are exclusive to iOS?


People have been trying for years to do what you're describing with gaming-centric Android TV boxes.

People don't want to play mobile games on their TV. They want to watch TV while playing mobile games.


Difference being that those are all severely underpowered devices.

If you combine the existing iOS games (some of which are actually kinda impressive visually) with an actually beefy machine that the AAA titles can easily target (shouldn't be that difficult), you have an offering unlike the others.


I think you underestimate the number of iOS/iPhone/iPad users who would happily pay for a small, quiet console that "just worked" with any game in the App Store, but would drive a 4K screen at a higher frame rate than any iOS device.


Apple runs billions in revenue through games on their platform.

Sure, they may not be "real" games according to some, but their opinion is irrelevant to the actual revenue figures.


They "don't need to understand them", they need to hire people that do..


Didn't work for Google or Amazon.

Both of whom spent a lot.


I’m expecting this chip (or a successor) to become the foundation of the regularly hoped-for Apple VR and AR headsets.


There is zero potential here. Consoles/game stores are built around network effects. The more people have the same console, the more high-end games will be released on it. If you keep adding game stores and consoles then attention will be divided among all participants. This is the perfect recipe for an oligopoly with around 3-5 big players that each cater to a slightly different experience.

Oh and regarding the hardware... The next gen consoles completely smoke whatever tiny little GPU Apple has. It's probably the first time that consoles get hardware that is on par with the PC ecosystem instead of being one generation behind.


If they were going to do this I wouldn't have expected them to make Apple TV+ available on the Xbox and Playstation.

If Apple makes real forays into gaming I would expect a sort of cloud gaming service before I expect a full-fledged console. At best I think we can expect more tight integration of Apple Arcade into Apple TV with controller support.


I'd be shocked if they did it. It's a small market, ultra competitive, and shrinking, if not in absolute terms, definitely in relative terms.


If they do that, it will have only safe design-by-committee AAA games. Or at best games designed in house by someone who's never played a game. They should call it "the sleeping pill".


Amazing that people are downvoting this when Apple Arcade exists to prove it.


Apple Arcade is where Apple is effectively acting as a _publisher_. It has nothing that I'd have thought of as a AAA game; it mostly seems to be stuff from established indie devs.

There are plenty of other games on the App Store, tho.


Error establishing a database connection

resources smashed


Is that website hosted on that Mac Mini? It's slow as hell :-)


If it were hosted on a M1 Mini, it would be much faster :)


[flagged]


You're mostly right, but you can get away with a lot of humor if it's not quite so low effort. FWIW, your joke would have been better if you pointed out the 10GbE deficiency - I didn't realize that was a thing. But I also think it's pretty clear at this point that all these models are meant for lower-end consumers, so top-of-the-line needs are absolutely not in this round, and we're all well aware of that.


>your joke would have been better if you pointed out the 10gbe deficiency

I think it's funnier that when you make a joke about Apple, the Apple fanboys act more like the guy in the famous 1984 TV advertisement, really grumpy truffles ;)


Maybe, but downvoting low effort jokes is common on HN, no special assumptions about apple required.


Never seen a joke that got up-voted...especially not about apple even when it's "high effort".


I agree with this, and appreciate your style.


Probably just getting the HN hug 'o death - just not loading for me atm.


Honestly, I never understand the HN hug of death.

Having had sites of mine get to the top of HN multiple times, it's always been around 4 visits per second. (= 15K/hr, or ~100K total over the course of a story.)

Given that things like grabbing something from the database or serving up an image are on the order of a handful of milliseconds... it should be fine.

If HN were sending 1,000 visits per second it would be a totally different story. But it's not. The amount of traffic just isn't that much.


A badly designed site can use a second or so of CPU time per user request (eg. scanning through the entire database because someone didn't put an index on something).

That won't be noticeable till you have 4 users per second hitting it...
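
A toy illustration of that failure mode (made-up numbers, and a Swift sketch rather than actual SQL): a full scan over a large table is linear in its size, while an indexed lookup is effectively constant, which is the difference between milliseconds and a full second of CPU per request.

    // Pretend "table": one million rows keyed by email.
    let rows = (0..<1_000_000).map { "user\($0)@example.com" }
    let needle = "user999999@example.com"

    // No index: scan every row until a match is found - O(n) per request.
    let scanHit = rows.firstIndex(of: needle)

    // With an "index": build a hash lookup once, then each request is O(1).
    let index = Dictionary(uniqueKeysWithValues: rows.enumerated().map { ($1, $0) })
    let indexedHit = index[needle]

    print(scanHit ?? -1, indexedHit ?? -1)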


I think it's just because of the Apple M1 topic. I'm about as far as you can get from an Apple fanboy (Linux and open protocols/data all the way) but still curious what they've pulled off this time.


This is a teardown. The page includes lots of big, high resolution photos. Too much traffic per client.


From the error I get it seems to be behind CloudFlare though, which ought to be serving the images


Yeah and those numbers seem in line with what others have reported in HN meta threads on the topic, and at least an n of 1 personal experience. I totally get it when it’s a game or some other server intensive thing. Guess caching is an afterthought for a lot of smaller sites.


The C10K website was nearly 20 years ago.


Maybe your content wasn't that interesting, and people only cared about the headline.


If this site is at the top of HN then it may be at the top of other aggregators.

If you have slow internet or a very asymmetrical link, and the site is heavy (lots of pictures or other BS), then does the server just refuse new connections when it saturates the link? I'm not sure what the behavior is.


Given that they claimed the Macbook Air is fanless, I was wondering if this one would be. Since it isn't, that seems disappointing. Not that I was going to buy one.


Lol what do you mean claimed? The Air is indeed fanless, the Mini and Pro both have fans. None of this is a surprise.


Sometimes when people say the word 'claimed' they are not expressing skepticism but are instead being precise. In an E-Prime sense, for instance.


Why is it disappointing?


Some people love fanless solutions. No need to worry about dust.


If it has a hole, dust will find a way.

My RPi's are fanless and I have to clean them. Not often, but I've cleaned them more than my laptop (which, to be honest, I've never cleaned).


It would take a lot of dust to block enough air to lower the cooling performance to a macbook air.


> ... to worry about dust.

and noise.


Noise is more complex than that. A big, slow fan is nearly silent, and some electronic components can emit sound without any moving parts.


And moving parts make a system less durable. And cooling itself requires energy.



