Hacker News
Apple unveils M1, its first system-on-a-chip for portable Mac computers (9to5mac.com)
1365 points by runesoerensen on Nov 10, 2020 | 1346 comments

Completely unconfirmed speculation incoming:

There's a solid chance that the logic board is exactly the same on all of the Macs announced today and the only difference is the cooling solution. If you play around with the Apple Store configurator, the specs are all suspiciously similar between every new Mac.

At Apple's volume and level of system integration, it doesn't make sense to do assembly sharing at that level between different models. Presumably the SoC package is the same between the different products, but binned differently for the Air, Pro, and Mini. The actual logic boards would be custom to the form factor.

Not just that. At 5nm there will also be yield problems. I.e they will put the best yield into high end and the worst yield into low end.

This is undoubtedly why they launched the Mac Mini today. They can ramp up a lot more power in that machine without a battery and with a larger, active cooler.

I'm much more interested in actual benchmarks. AMD has mostly capped their APU performance because DDR4 just can't keep the GPU fed (which is why the last two generations of consoles went with very wide GDDR5/6 buses). Their solution is Infinity Cache, where they add a bunch of cache on-die to reduce the need to go off-chip. At just 16B transistors, Apple obviously didn't do this (at 6 transistors per SRAM cell, there are around 3.2B transistors in just 64MB of cache).
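The SRAM arithmetic checks out with a quick Python back-of-envelope:

```python
# Transistor cost of 64 MB of cache at 6 transistors per SRAM bit.
cache_bits = 64 * 1024 * 1024 * 8
transistors = cache_bits * 6
print(f"{transistors / 1e9:.1f}B transistors")  # 3.2B
```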

Okay, probably a stupid question, but solid state memory can be pretty dense: why don't we have huge caches, like a 1GB cache? As I understand it, cache memory doesn't put off heat like the computational part of the chip does, so heat dissipation probably wouldn't increase much with a larger chip package.

NAND flash is pretty dense, but way too slow. SRAM is fast but not at all dense, needing 6 transistors per bit.

For reference: https://en.m.wikipedia.org/wiki/Transistor_count lists the largest CPU as of 2019, AMD's Epyc Rome, at 39.54 billion MOSFETs; so even if you replaced the entire chip with SRAM you wouldn't quite reach 1GB!
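Running the numbers the other way (same 6-transistors-per-bit assumption):

```python
# Max SRAM capacity if every one of Epyc Rome's 39.54B MOSFETs
# were spent on 6T cache cells.
transistors = 39.54e9
capacity_mb = transistors / 6 / 8 / 1024 / 1024
print(f"~{capacity_mb:.0f} MB")  # ~786 MB, short of 1 GB
```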

DRAM would be enticing, but the details matter.

NAND is nonvolatile, and the tradeoff is limited write cycles. We have an in-between in the form of 3D XPoint (Optane); Intel is still trying to figure out the best way to use it. It currently sits like an L6 cache after system DRAM.

Well not just Intel. Optane is a new point in the memory hierarchy. That has a lot of implications for how software is designed, it's not something Intel can do all by itself.

SRAM is 6 transistors per bit, so you're talking about 48 billion transistors there, and that's ignoring the overhead of all the circuits around the cells themselves.

DRAM is denser, but difficult to build on the same process as logic.

That said, with chiplets and package integration becoming more common, who knows... One die of DRAM as large cache combined with a logic die may start to make more sense. It's certainly something people have tried before, it just didn't really catch on.

> DRAM is denser, but difficult to build on the same process as logic.

What makes it so difficult to have on the same chip?

I don't know the details, but the manufacturing process is pretty different. Trying to have one process that's good at both DRAM and logic at the same time is hard, because they optimize for different things.

Or, well, how about putting all your ram on the package as Apple says they are doing with the M1?

Cost. Area is what you pay for (at a given transistor size).

The bigger the cache, the slower it gets.

So RAM is your 1GB cache.

Are you referring to latency from propagation delay, where the worst case increases as you scale?

Would you mind elaborating a bit? I'm not following how this would significantly close the gap between SRAM and DRAM at 1GB. An SRAM cell itself is generally faster than a DRAM cell, and my understanding is that the circuitry around an SRAM cell is far simpler than DRAM's. Am I missing something?

Think of a circular library with a central atrium and bookshelves arranged in circles radiating out from the atrium. In the middle of the atrium you have your circular desk. You can put books on your desk to save yourself the trouble of having to go get them off the shelves. You can also move books to shelves that are closer to the atrium so they're quicker to get than the ones farther away.

So what's the problem? Well, your desk is the fastest place you can get books from but you clearly can't make your desk the size of the entire library, as that would defeat the purpose. You also can't move all of the books to the innermost ring of shelves, since they won't fit. The closer you are to the central atrium, the smaller the bookshelves. Conversely, the farther away, the larger the bookshelves.

Circuits don't follow this ideal model of concentric rings, but I think it's a nice rough approximation for what's happening here. It's a problem of geometry, not a problem of physics, and so the limitation is even more fundamental than the laws of physics. You could improve things by going to 3 dimensions, but then you would have to think about how to navigate a spherical library, and so the analogy gets stretched a bit.

Area is a big one. Why isn't L1 measured in MB? Because you can't put that much data close enough to the core.

Look at a Zen-based EPYC core: 32KB of L1 with 4-cycle latency, 512KB of L2 with 12-cycle latency, 8MB of L3 with 37-cycle latency.

L1 to L2 is 3x slower for 16x more memory; L2 to L3 is 3x slower for 16x more memory.

You can reach 9x more area in 3x more cycles, so you can see how the cache scaling is basically quadratic (there's a lot more execution machinery competing for area with L1/L2, so it's not exact).
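A quick sanity check of that roughly-quadratic scaling, using the Zen figures above (a toy comparison, not an actual design equation):

```python
import math

# Compare latency growth to sqrt(capacity growth) between cache levels.
# If latency scaled exactly quadratically with capacity, the two ratios
# would match; here they land in the same ballpark.
levels = [("L1", 32 * 1024, 4), ("L2", 512 * 1024, 12), ("L3", 8 * 1024 * 1024, 37)]
for (n1, s1, c1), (n2, s2, c2) in zip(levels, levels[1:]):
    lat = c2 / c1
    cap = math.sqrt(s2 / s1)
    print(f"{n1}->{n2}: latency x{lat:.1f}, sqrt(capacity) x{cap:.1f}")
```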


I am sure there are many factors, but the most basic one is that the more memory you have, the longer it takes to address it. I think it scales with the log of the RAM size, i.e., linearly with the number of address bits.

Log-depth circuits are a useful abstraction, but the constraints of laying out circuits in physical space impose a delay scaling limit of O(n^(1/2)) for planar circuits (with a bounded number of layers) and O(n^(1/3)) for 3D circuits. The problem should be familiar to anyone who's drawn a binary tree on paper.
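To make the point concrete, here's a toy comparison (arbitrary units) of the logarithmic addressing depth against the O(n^(1/2)) wire-delay term for a planar layout:

```python
import math

# The log term models decoder depth; the sqrt term models worst-case
# wire distance across a planar array of n bits.
for n in (2**20, 2**26, 2**30):
    print(f"n = 2^{int(math.log2(n))}: log2(n) = {math.log2(n):.0f}, "
          f"sqrt(n) = {math.sqrt(n):,.0f}")
```

The wire term dwarfs the decoder depth long before you get to DRAM-sized arrays.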

With densities so high, and circuit boards so small (when they want to be), that factor isn't very important here.

We regularly use chips with an L3 latency around 10 nanoseconds, going distances of about 1.5 centimeters. You can only blame a small fraction of a nanosecond on the propagation delays there. And let's say we wanted to expand sideways, with only a 1 or 2 nanosecond budget for propagation delays. With a relatively pessimistic assumption of signals going half the speed of light, that's a diameter of 15cm or 30cm to fit our SRAM into. That's enormous.
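The same back-of-envelope in Python, assuming (as above) the signal travels at half the speed of light:

```python
# How far a signal can travel one-way within a given propagation budget.
c = 3e8  # speed of light, m/s
signal_speed = c / 2
for budget_ns in (1, 2):
    distance_cm = signal_speed * (budget_ns * 1e-9) * 100
    print(f"{budget_ns} ns budget -> {distance_cm:.0f} cm")
```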

Well, the latest AMD Epyc has 256 MB L3 cache, so we're getting there.

but any given core only has access to 16 soon to be 32mb

In Zen 1 and Zen 2, cores have direct or indirect access to the shared L3 cache in the same CCX. In the cross-CCX case, the neighboring CCX's cache can be accessed over the in-package interconnect without going through system DRAM.


When I started with computers, they had a few KB of L2 cache and L3 did not exist. Main memory was a few MB.

With the DRAM this close, it can probably be very fast. Did they say anything about bus or bandwidth?

AnandTech speculates on a 128-bit DRAM bus[1], but AFAIK Apple hasn't revealed many details about the memory architecture. It'll be interesting to see what the overall memory bandwidth story looks like as hardware trickles out.

[1] https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...
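If that speculation pans out, a rough peak-bandwidth estimate (assuming LPDDR4X-4266 on a 128-bit bus, which is AnandTech's guess, not anything Apple has confirmed):

```python
# Hypothetical peak bandwidth for a 128-bit LPDDR4X-4266 interface.
bus_bytes = 128 // 8      # bus width in bytes
transfer_rate = 4266e6    # transfers per second (4266 MT/s)
bandwidth = bus_bytes * transfer_rate / 1e9
print(f"~{bandwidth:.1f} GB/s peak")  # ~68.3 GB/s
```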

Apple being Apple, we won't know much before someone grinds down a couple chips to reveal what the interconnections are, but if you are feeding GPUs along with CPUs, a wider memory bus between DRAM and the SoC cache makes a lot of sense.

I am looking forward to compile time benchmarks for Chromium. I think this chip and SoC architecture may make the Air a truly fantastic dev box for larger projects.

That’s what “binned differently” means btw.

Interesting that the commenter knew the process but not the terminology.

As someone who works with a lot of interdisciplinary teams, I often understand concepts or processes they have names for but don't know the names until after they label them for me.

Until you use some concept so frequently that you need to label it to compress information for discussion, you often don't have a name for it. Chances are, if you solve or attempt to solve a wide variety of problems, you'll see patterns and processes that overlap.


It’s often valuable to use jargon from another discipline in discussions. It sort of kicks discussions out of ruts. Many different disciplines use different terminology for similar basic principles. How those other disciplines extend these principles may lead to entirely different approaches and major (orders of magnitude) improvements. I’ve done it myself a few times.

On another note, the issue of “jargon” as an impediment to communication has led the US Military culture to develop the idea of “terms of art”. The areas of responsibility of a senior officer are so broad that they enter into practically every professional discipline. The officer has to know when they hear an unfamiliar term that they are being thrown off by terminology rather than lack of understanding. Hence the phrase “terms of art”. It flags everyone that this is the way these other professionals describe this, so don’t get thrown or feel dumb.

No one expects the officer to use (although they could) a “term of art”, but rather to understand and address the underlying principle.

It’s also a way to grease the skids of discussion ahead of time. “No, General, we won’t think you’re dumb if you don’t use the jargon, but what do you think of the underlying idea...”

Might be a good phrase to use in other professional cultures. In particular in IT, because of the recursion of the phrase “term of art” itself being a term of art until it’s generally accepted. GNU and all that...

Where can I learn more about the US Military culture's "terms of art" idea?

> How those other disciplines extend these principles may lead to entirely different approaches and major (orders of magnitude) improvements.

Fascinating. Would you happen to have any example off the top of your head?

But then how will developers exert their local domain dominance to imply an even greater breadth of knowledge when patronizing other devs? /s

I have always assumed this term was in widespread and general use in the US business culture. Is that not the case?

This gets even more fun when several communities discover the same thing independently, and each comes up with a different name for it.

My favorite is the idea of "let's expand functions over a set of Gaussians". That is variously known as a Gabor wavelet frame, a coherent state basis [sic], a Gaussian wave packet expansion, and no doubt some others I haven't found. Worse still, the people who use each term don't know about any of the work done by people who use the other terms.

Reminds me of the Feynman story about knowing something vs knowing the name of something :-)

Reminds me of self-taught tech. I’ll often know the name/acronym, but pronounce it differently in my head than the majority of people. Decades ago GUI was “gee you eye” in my head but one day I heard it pronounced “gooey” and I figured it out but had a brief second of “hwat?” (I could also see “guy” or “gwee”.) It’s, of course, more embarrassing when I say it out loud first...

First time I went to a Python conference in SV, more than a decade ago, I kept hearing "Pie-thon" everywhere, and had no idea what the hell people were talking about.

It took me a solid half hour to at last understand this pie-thingy was Python... in my head I had always pronounced it the French way. Somewhat like "pee-ton", I don't know how to transcribe that "on" nasal sound... (googling "python prononciation en francais" should yield a soundtrack for the curious non-French speakers).

I thought numpy was pronounced like it rhymes with bumpy for a year or so.

Picture 18 year old me in 1995, I got a 486SX laptop as a graduation present out of the blue from my estranged father. I wanted to add an external CD-ROM to it so I could play games and load software for college, and it had a SCSI port. I went to the local computer store and asked the guy for a "ess see ess eye" CD-ROM drive, he busted out laughing and said "oh you mean a scuzzy drive?" Very embarrassing for me at the time but that's when I learned that computer acronyms have a preferred pronunciation so I should try to learn them myself to avoid future confusion.

> Very embarrassing for me at the time

it shouldn't be, it should be a badge of honor of some sorts - it points to somebody reading to expand their knowledge that is not available in oral form around them, so kudos to them !

It's even more visible in non-English-speaking countries. In Poland, at first everyone pronounces Java as "Yava," and after a while they switch to the proper English pronunciation. Many times it divides amateurs from professionals, but I wouldn't really know, because I don't work with Java.

Not the one I was thinking of but same point :) https://fs.blog/2015/01/richard-feynman-knowing-something/

Great story, yes. But there's no such thing as a "halzenfugel" in German as far as I can tell as a native speaker. Even www.duden.de, the official German dictionary, doesn't know that word ;-0

That's OK, AFAICT there's no bird called the "brown throated thrush" either.

As a native English speaker and middling foreign-language speaker of German, "halzenfugel" sounds to me like a mock-German word that an English speaker would make up.

Hah, good to know. However, unless you are talking to people from the same domain, it's usually a better approach to spell out things instead of relying on terminology. Concepts and ideas translate much better across domains than terminology.

I think a bunch of people learnt a new thing from your comment, so it is a good one.

I hope my reply didn’t come out as gatekeeping, it was genuinely just to help put a name to a thing.

May just have skimmed GP and missed it.

Well then that was a good explanation because I didn’t know that!

That's not how yield works. Yield is the fraction of functioning chips that you pull out of a wafer.

I think what you are trying to refer to is frequency binning.

That's only partially true.

For example, AMD sells 12 and 16 core CPUs. The 12 core parts have 2 cores lasered out due to defects. If a particular node is low-yield, then it's not super uncommon to double-up on some parts of the chip and use either the non-defective or best performing one. You'll expect to see a combination of lasering and binning to adjust yields higher.

That said, TSMC N5 has a very good defect rate according to their slides on the subject[0]

[0] https://www.anandtech.com/show/16028/better-yield-on-5nm-tha...

Which is likely why there are some "7 core" GPU M1 chips.

Yep, for the MBA. I think for devs who can live with 16GB, the 7-GPU MacBook Air, at $300 less than the MacBook Pro, is very interesting.

Plus, defects tend to be clustered, which is a pretty lucky effect. Multiple defects on a single core don't really matter if you are throwing the whole thing away.

Is that not what the parent comment said? I thought "binning" referred to this exact process.

Isn’t that what “binning” means?

If you compare the M1 Air and Pro, the only difference seems to be the addition of the Touchbar, 10% better battery life, and a "studio" speaker/mic on the Pro.


I assume the addition of a fan on the Pro gives it better performance under load, but there doesn't seem to be a hugely compelling reason to not just get the Air.

I think they got it wrong. I would pay money to NOT have the touchbar.

I got one of the MBPs with the touchbar this year after holding out for many years (changed jobs so had to change laptops). Man, it is so much harder to do volume changes, and there has so far been zero benefit for me.

Not what you're looking for, but I'll mention it anyways just in case:

It's possible to set the default touch bar display to only ever show the expanded control strip (System Preferences > Keyboard > Keyboard > Touch Bar shows: Expanded Control Strip). In that mode you tap volume up and down instead of using a volume slider.

Again, I know you're looking for physical keys (aren't we all) but it's better than nothing.

I've been using the MacBook Pro 16 (with a physical esc key plus a touch bar) and I think it's a pretty good compromise between me who wants physical keys and apple who wants to push the touch bar.

The other thing that kept happening to me: I would accidentally tap the brightness button when reaching for ESC. For that, you can "Customize Control Strip..." and remove individual buttons, so that there's a big gap on the touch bar near the ESC key so that stray taps near ESC don't change the brightness.

I realise I'm an outlier here but I actually have grown to like the touchbar.

It's often unused, yes, but when I fire up Rider for my day job it automatically flicks to the row of function keys and back depending on which app has focus and it fits quite nicely for me between work and entertainment (I like having the video progress bar if I'm watching something on the laptop). Maybe I'm just strange but the non-tactile function keys didn't really bother me much either.

In any case, I could live without it, which is probably not a roaring endorsement in any case, but I'd rather have it than not.

I like it as well, especially in applications like CLion/IntelliJ which have tons of keybindings I keep forgetting because they are different between Linux and macOS. The context-sensitive touch bar is actually very useful in these applications for things like rebuilding, changing targets, stepping through the debugger etc. without having to use the mouse.

There's a lot of things to complain about with Apple products, but if you ask me there's been enough touch bar bashing by now and people should just get over it. It's pretty useful in some situations, and IMO no real downsides, especially now that the esc key is a real physical key again. Why all the hate?

Physical F keys are still useful so why not both?

If you hold Fn you get the traditional row of function keys, which seems like a pretty good tradeoff. And if you really hate that, you can simply configure the touch bar to always display them, in which case literally the only downside is that they are not physical keys anymore. Do people really touch-type the function keys so heavily that this becomes an actual annoyance and not just an ideological one?

Adding an extra row of physical keys that do the same thing as the row of virtual function keys, at the expense of trackpad size and possibly ergonomics (harder to reach the touch bar), doesn't make a lot of sense IMO.

You can’t hit them without looking at the bar, because you have nothing to index your fingers on.

The touchbar is the second worst thing Apple has ever done in the history of the Mac, following closely on that abomination of a “keyboard” they used from 2016-2019.

I liked the keyboard too.

And if you ask me, It has not been enough touch bar bashing...

I’ve opted to buy a 2015 MacBook Pro this year; it might be easier to get over Apple than the touch bar, even...

The only time that there would be enough touch bar bashing is when Apple listens and give users an option to have function keys instead.

Pretty sure they just did. The MacBook Air is virtually identical now save for the absence of a fan and the Touch Bar.

My guess - that you couldn't get a Macbook without the TouchBar. I'd like to be able to choose between a Macbook with or without a TouchBar, but with otherwise entirely identical specs.

I've been holding out on upgrading my 2013 MBP (mostly out of frugality) to a newer version, mostly due to the butterfly keys and the TouchBar.

Yup, me too. Especially when debugging in Xcode, the TouchBar shows a bunch of useful functions.

Those are F keys on a regular keyboard, and a few million of us have developed the muscle memory to use them over, say, the last thirty years that they’ve been assigned to those F keys in most IDEs.

I’m with you: I enjoy the TouchBar, and it’s genuinely useful for some apps I use

This is one option, but it still suffers from my main complaint about the touch bar -- it's way too easy to accidentally activate something. Therefore, I have my touch bar set to activate only while I'm holding down the FN key.

I will not remove that safety key until they make the touch bar pressure sensitive so that "buttons" on it only activate with a similar amount of force that was required to activate the tactile buttons they replaced. Until then, I consider it a failed design improvement.

I need my ESC, so I'm glad it's there. As for the rest of the keys on the top row, I was not in the habit of using them except in vim, where I hooked them up to some macros I had written. For them, I kind of like the touchbar now, because I have the fake keys labelled with sensible names. (No more trying to remember that I have to hit F3 to do such-and-such.)

I've also found the touchbar pretty useful in zoom calls, because my zoom client shows keys for common actions.

All in all, I think a physical escape key plus the touchbar is a slight win. I would not pay more for it, but I have reversed my previous opinion that I'd pay more not to have it.

I suspect these new machines are going to be quite nice, although I won't buy one for a while since I purchased a mbp a few months ago.

I don't understand why they don't put the physical keys AND the touchbar in. There is so much space on the 16" model they could easily fit it in and shrink the obscenely large trackpad just a touch.

I think that the trackpad needs to be bigger. Doing gestures is much easier with a big pad.

Especially on the 16 there is no excuse, they could really have a function key row and a touch bar :( more than enough space for them.

I ordered my 13" touchbar MBP with the UK keyboard layout. Adds an extra key to the right of left shift (mapped to tilde), letting me remap the key that is normally tilde on a US keyboard to ESC.

I've been really happy with the following mod for the last couple years of TouchBar usage:


Fully customizable while being much better for muscle memory by giving you exactly what you want where you want it, gives you icon-shortcuts to script, and still allows you to have as much dynamic functionality / information as you like. So, for example, mine looks roughly like this:

- Fullscreen

- Bck/[play|pause]/Fwd


- AirDrop

- ConfigMenu

- Emoticons

- (Un)Caffeinate

- (Dis)connectBluetoothHeadphones

- (Dis)connectMicrophone

- (Un)muteVol

- VolDown

- VolUp

- ScreenDim

- ScreenBright

- Date

- Time

- Weather

- Battery%

CURRENTLY_PLAYING_SONG shows the album cover, song name, and artist, but only shows up if there IS something playing. Same with AirDrop, which shows up only if there's something that I could AirDrop, and then gives me a set of options of who to AirDrop to. The Emoticon menu opens an emoticon submenu on the TouchBar, most-recently-used first.

That all fits fine into the main touchbar, with other dynamic touchbars available modally (ie, holding CMD shows touchable icons of all the stuff in my Dock (my Dock is entirely turned off)), CTRL shows LOCK AIRPLAY DO_NOT_DISTURB FLUX KEYBOARD_DIM/BRIGHT, etc. ALT shows me various window snap locations.

Edit: BetterTouchTool also replaced a bunch of other tools for me. Gives you the same kind of tools for scripting eg Keyboard macros, Mouse macros, remote-control via iPhone/Watch etc with a lot of reasonable defaults.

I've heard a lot of complaints about the touchbar. The loss of tactile feedback is a fair one, and admittedly removing the escape key was a terrible idea. I recently upgraded to a machine with a touchbar, learned quickly why the default settings are pretty bad, and then quickly found BTT and set it up. The touchbar is not a revolutionary innovation, but it definitely improves functionality in some cases, and it's fun to mess with. Oh, and a button to "toggle mic in zoom" actually solves a real problem.

The people who complain about the touchbar functionality must not be putting any effort at all into it. I customize so many other things on my system, regardless of the OS. Why would a new hardware feature be any different?

I didn't know about this GoldenChaos thing though, thanks for that.

> The people who complain about the touchbar functionality must not be putting any effort at all into it.

I would say that people who complain about the uselessness of F-keys must not have put any effort at all into using them.

Upon getting my MBP 2016, I spent numerous months trying to make the TouchBar useful; from customizing the contents where apps allowed it, to BTT.

What it came down to is that things worthy a keyboard shortcut are things I want to be able to do fast, reliably, and instinctively. I don't want to search for the button on the TouchBar – I'm using the keyboard, it needs to be as natural as typing, without the need to look down at it. I have a screen already, I don't need another one on my keyboard.

I firmly believe TouchBar can't even come close to the same realm of usefulness as F-keys, much less being worth the price hike it imposes. Function keys are twelve, tactile, solid, free, reliable(1) buttons for keyboard shortcuts; TouchBar is a touchscreen that sometimes(2) works.

> a button to "toggle mic in zoom" actually solves a real problem

I haven't used Zoom, but if it's a decent-ish Mac app, it either already has a keyboard shortcut to toggle microphone, or you can set one in Keyboard Shortcuts, in System Preferences.

(1) as far as anything is reliable on the butterfly keyboards.

(2) same story as butterfly keyboard – if it's even slightly sporadic, it is a shitty input device.

That's fair. Personally, I used the tactile function keys for exactly six things (brightness up/down, volume up/down, mute, play/pause). Those six functions are now available on my touchbar. They're no longer tactile, which, yes, is a minor inconvenience. I wouldn't use a touchscreen for touch-typing code, of course. But for a handful of buttons way off the home row, it doesn't affect me much.

In exchange, I get to add other buttons which can also display customizable state. Yes, zoom has a global shortcut to toggle the mic. The problem is, the mic state is not visible on your screen unless the zoom window is visible. This is a frustrating design flaw IMO, which really should be addressed in some consistent way across every phone/video app. But, it's not, and so I need to pay attention to my mic state. My touchbar button toggles the mic, and displays the current mic status. I would imagine that every phone/video chat app is similar.

I don't understand how Apple missed the boat so completely on haptic feedback for touchbar, considering their mastery of it on trackpads and touch screens.

I use this app alongside BTT; it attempts to supplement haptic feedback via the trackpad haptics. It's nowhere near as good as a real haptic solution would be, but it does provide some tactile feedback when pressing buttons: https://www.haptictouchbar.com/

I have a haptic touchbar and I’m pretty sure it was enabled OOTB with GoldenChaos on BTT, but maybe not.

Did you notice you can hold and slide to change volume? You don’t need to hit the volume slider where it appears. Same with brightness. Totally undiscoverable gesture.

Yup - I love this feature but I'd guess based on people I've shown it to that no more than 20% of users are aware of it.

Exactly this. I tried so hard to like it (since I paid for it), but I have found zero good use cases for it.

I would assume that macOS sends at least some basic usage data for the touch bar back to Apple HQ. I wonder how often it is actually used... and I would love the hear the responsible product manager defend it.

Same situation, my trusty 2012 rMBP finally gave up the ghost and I had to get a new one with this icky touch bar. It's useless to me and makes it harder to do everything. My main complaint is that I am constantly bumping it when I type, leading to unexpected changes in volume and brightness.

Oh yeah I forgot that. I keep hitting the chrome back button in the touch bar all the time. In the beginning I was not sure what was happening then I realized it was the touch bar.

My problem with the touchbar is that I tap it accidentally while typing all the time. It needs to be like another centimeter away from the keys.

Or use the same haptic feedback as the touchpad.

Haptics plus requiring a bit of pressure to register a press, just like the trackpad.

That's actually one of the things I like better on the touchbar. Just press down on the volume icon and slide.

I continue to be disappointed about the lack of haptics though. It's such a strange thing to not have when they've proven to know how to make very convincing haptic feedback. It works very well in both the Apple Watch and the MacBook's trackpad.

You CAN configure it to show the old-style mute/down/up with the touch bar, so you are not relegated to the ultra-shitty slider. No replacement for a tactile switch, but at least you are not stuck with the default arrangement.

Easiest way is to press and hold the Touch Bar on the volume control button and slide your finger left or right–that way you barely need to look at the Touch Bar.

You can use the touchbar to skip commercials on Youtube in Safari.

I love it.

Instead of press and hold, it's press, hold, and drag. Definitely annoying when it freezes, but when it's working it doesn't seem that much different.

The main difference is that I need to look down at the keyboard to operate the touchbar. With the keys I can rely on muscle memory.

Also I think every device which makes sound should have a physical mute control. The worst is when I want to mute, and the touchbar freezes, and I have to go turn the volume down with the mouse.

I intentionally took a performance hit by moving from a Pro to an Air almost entirely for this reason (although the low power and light weight are pleasant benefits). I'm glad that the new Air still has F-keys with Touch ID; but I'm flabbergasted that they're still not making the Touchbar optional for the Pro series, given how polarizing it's been, and the underwhelming adoption of new Touchbar functionality by third-party developers.

They brought back the escape key which is what really matters.

Honestly, I think it's only polarizing here and among some developers.

I mean if you think about other professions that use Macbook Pro they don't need it either. Are video professionals using it to scrub through video? Nope. For any audio professional it's useless. No one who is serious about their profession would use it.

I’m serious about my profession and I use it daily. Wonder what that says about me.

Please tell me what it can do that hot keys and just generally knowing a program can't do? I'm honestly interested.

I'd be curious to know what portion of their user base for the Pro series are developers. Anecdotally, quite a lot of devs seems to use Macs; but I have no idea what that fraction is, relative to the rest of their market.

The touchbar is probably why I'm getting the Air for my next upgrade, and not a Pro.

Honestly, the idea of a soft keyboard is a great one, particularly one as well integrated into the OS as the Touch Bar is. However, while I have a Mac with the Touch Bar, I never ever ever ever use it intentionally, as I spend 90% of my time on the computer using an external keyboard.

Just put that control panel at the bottom of the screen. It would still be close enough, and it would be out of the way of my accidental touches.

My daughter loves it. It's her emoji keyboard

As someone using a hackintosh considering a real Macbook, what's so wrong about it?

There are loads of rants out there that are easy to find, but personally it's mostly: you can't use it without looking at it to make sure the buttons are what you think they are (nearly always context-sensitive, often surprising when it decides it's a new context), and where you think they are (can't go by feel, so you need to visually re-calibrate constantly). Button size and positioning varies widely, and nearly all of them have a keyboard shortcut already that doesn't require hand movement or eyes (or at worst previously had an F-key that never moved).

The main exception being things like controlling a progress bar (mouse works fine for me, though it's a neat demo), or changing system brightness/volume with a flick or drag (which is the one thing I find truly better... but I'd happily trade it back for a fn toggle and F keys). But that's so rarely useful.

When I watch non-HN type people use it, they like it. They never used Fn keys in the first place.

I just hated the lack of ESC key (which they brought back, though my Mac is older). I have no muscle memory for any other key in that row.

I think the touchbar was my favorite part of my old MBP, specifically because of the contextual buttons that are always changing.

I'd probably pay a little extra to get one on future non-Mac laptops, but not too much extra.

Yeah, most people I know almost never use F keys (except perhaps F1 for help). They leave it on the media-keys mode... which is the same as the touchbar's default values, but without needing to know what mode it's in.

With the physical media keys, if they want to mute, it's always the same button. Pause music, always the same button. They're great in a way that the touchbar completely destroys.

(and yes, I know you can change this setting, but if we're assuming non-techy-users we also generally have to assume default settings.)

Honestly, I hardly ever used the function keys either. As a result the Touch Bar doesn't really bother me -- but neither does it seem the slightest bit useful for the most part.

A lot of non-HN people also type while looking at the keyboard, and some with a single finger from each hand.

It's just not useful. The context-aware stuff is too unpredictable, and I'm never looking at the keyboard anyway so I have never learned it. So the touchbar ends up being just a replacement for the volume and brightness keys, but a slow and non-tactile version of them

For me at least (and I'd imagine most of the other folks who hate it) - I had the key layout memorized. If I wanted to volume up/down/mute I could do it without taking my eyes off the screen. With the touchbar ANYTHING I wanted to do required me to change my focus to the touchbar - for the benefit of...?

I'm sure someone somewhere finds it amazing, but I have no time for it.

To me it's no different than volume controls in a car. I've been in a cadillac with a touchbar for volume, and a new ram truck with a volume knob - there's absolutely no room for debate in my opinion. One of these allows me to instantly change the volume 100% up or down without taking my eyes off the road. The other requires hoping I get my finger in just the right spot from muscle memory and swipe enough times since there's 0 tactile feedback.

For me it hides the things I use all the time (media and volume controls) to make room for application specific controls that I never use.

If it was more customisable I wouldn't mind it, but the apparant inability to force it to show me the things I actually want is annoying.

I can imagine there are some people for whom the application specific buttons are useful, but for me they are not worth it for what they displace.

Not sure if it's helpful for you but you can customize the behavior by going to your System Prefs > keyboard settings and toggling "Touch bar shows:".

I did this on like day 2 of having my MBP for what sounds like the same reason you want to. The setting I have turned on is "Expanded control strip" and I never see any application-specific controls, only volume, brightness, etc.
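If you prefer the command line, the same preference can (as far as I know) be flipped with `defaults` — the domain and key below are what recent macOS versions appear to use, so treat them as an assumption if your version differs:

```shell
# Always show the expanded control strip (volume, brightness, etc.)
# instead of app-specific controls.
# Assumes the com.apple.touchbar.agent domain used by recent macOS releases.
defaults write com.apple.touchbar.agent PresentationModeGlobal -string fullControlStrip

# Restart the Touch Bar process so the change takes effect
killall ControlStrip
```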

Omg, thank you. Somehow I'd missed that setting.

Check out BetterTouchTool if customization is holding you back.

FYI you can customize it, and force it to always display certain controls.

I had the exact same frustrations as you. Took me 10 mins digging into settings to figure it out. Now I have my touchbar constantly displaying all of the controls that are buttons on the Air (ie a completely over-engineered solution to get the same result)

The latency on the touch bar is terrible. Perhaps half a second to update when you switch apps, for example!

It doesn't provide tactile feedback.

https://www.haptictouchbar.com/ is a great app I use, provides haptic feedback for the touch bar.

They do such a good job on the iPhone with this that it is quite mystifying why not.

In addition to what everybody else said, because the touch bar is flat and flush with the body chassis, I find it’s very easy to press accidentally when you press a key in the row below. E.g., accidentally summoning Siri when pressing backspace, or muting the audio when pressing “=“. And then you’re forced to look down and find the mute “button” to fix it.

No haptic feedback, mainly. A lot better with a real escape key, but still.

hahaha, two-time Air user here; the touchbar is a big NO.

Air doesn't have a fan, so if you want consistent performance, you have to buy the touchbar.

Maybe some kind of clip-on, thin, third party cooling solutions will become a thing?

so true! i hate the touchbar. if i had to replace my laptop today, i'd buy the air just because i hate the touchbar.

I think it’s good actually

My Y-series Pixelbook with passive cooling performs as well as a U-series laptop from the same generation -- until I start a sustained load. At that point, the U series systems pull ahead. Actively cooled Y-series systems get pretty close in lots of applications only falling short due to half the cache.

If you are doing lightweight stuff where the cores don't really need to spin up, then they'll probably be about the same. Otherwise, you'll be stuck at a much lower base clock.

Surely they have different clock rates, but Apple isn't referencing clock rates in their marketing material for Macs with M1 chips.

Yeah I am surprised more people are ignoring this. If the two computer models have identical chips, then why does Apple even bother putting a fan in the pro?

To me the fact that the Air is fanless and the pro has a fan would indicate to me that they have different clock rates on the high end. I am sure the Air is capped lower than the Pro in order to make sure it doesn't overheat. It is probably a firmware lock, and the hardware is identical. But once we do benchmarks I would expect that the pro outperforms the air by a good margin. They added a fan in the pro so that it can reach higher speeds.

Surely the Air is capped so that users don't ruin their computers by running a process that overheats the computer.

But of course Apple doesn't want to reveal clock speeds. The best they can give us is "5x faster than the best selling PC in its class". What does that mean? The $250 computer at Walmart that sells like hotcakes for elementary-age kids who need a Zoom computer, or the Lenovo Thinkpad that businesses buy by the pallet? Who the hell knows.

They said in the presentation it's for sustained workloads. I get the impression they're the same peak clock speed, but the Air throttles faster.

The fan is only there for sustained loads. They all have identical chips (aside from the 7 core gpu option Air). They all hit the same pstates and then throttle accordingly due to temperature.

The MBP and Mini are there for people who want maximum sustained performance.

I recently got an Air after using a MBP13.

Aside from the loud fan and slow performance which should be fixed in this release, my biggest complaint is that they only have the usbc plugs on one side of the laptop.

Really obnoxious when the outlet is in a difficult spot.

Unclear whether the new MBP13 also has this problem...

Edit: the new M1 MBP13 has both usbc ports on the same side. No option for 4 (yet). Ugh.

The two-port and four-port 13" MacBook Pros have been separate product lines since their introduction. This new M1 MBP only replaces the two-port version. Presumably the higher-end one will share an upgraded processor with the 16".

I'm confused of their new pricing scheme / spec tiers for Macbook Pros.

There's no more core i7 for Macbook 13. You have to go to Macbook 16. I'd rather get a Dell XPS or other Core i7/Ryzen 7 ultrabooks.

So now, spec-wise, Macbook Air and Macbook Pro are too close.

I'm guessing the MBP13 is now a legacy model, being refreshed more to satisfy direct-product-line lease upgrades for corporate customers, than to satisfy consumer demand.

Along with the MBP16 refresh (which will use the "M1X", the higher-end chip), we'll probably see the higher-end MBP13s refreshed with said chip as well, but rebranded somehow, e.g. as the "MacBook Pro 14-inch" or something (as the rumors go: same size, but less screen bezel, and so more screen.)

And then, next year, you'll see MBP14 and MBP16 refreshes, while the MBP13 fades out.

These are transitional products so it makes sense. I'm looking forward to see the replacement of the iMac Pro and Mac Pro. Will be interesting to see what those include.

Addendum: just got an email from the Apple Store Business Team about the MBP13. Here's the copy they're using (emphasis mine):

> "Need a powerful tool for your business that’s compatible with your existing systems? Let’s talk. We’ll help you compare models and find the right Mac for you and your team."

That's a corporate-focused legacy-model refresh if I've ever seen one.

Specs don’t tell you the thermal story. You can buy an i9 in a thin laptop and feel good about it until it throttles down to 1GHz after 30s.

The MBP should be built with better thermals to avoid throttling since you might be running simulations or movie encoding all day. The air should throttle after a certain amount of time.

And prioritize function over form? I think you just want to buy a PC.

This has to be the funniest take on the release of a whole new CPU architecture.

I agree, I use a small form factor desktop box which fits in my messenger bag.

    There's no more core i7 for Macbook 13
Sure there is – you just have to select one of the Intel-models and customize the processor.

You are correct. They hid the option.

it's there, you just have to configure it

I think the MacBook Air is very compelling, and that's why it got more screen time. Unless you want to run your system at >60% for extended periods of time, the MacBook Air should be great.

Pro has one extra GPU core as well.

The base model Air has 7 GPU cores instead of 8, but the higher models have all 8 cores. Seems to be +$50 for the extra GPU core.

Note that it's a 7-core GPU only for the 256GB SSD model.

I am curious - my 2017 12" MB is an absolutely awesome machine (fast enough for casual use and light development while absolutely quiet and lightweight), but a 30+ degree summer day is enough to push it so close to its thermal envelope that it soon throttles down to ca. 1 GHz during normal (browser) use.

So, the sustained performance might make quite a difference for pro use.

It shouldn't really throttle to 1 GHz - it's because Apple puts low-quality paste in it, and sometimes because of dust.

My Macbook from 2014 is still excellent, but it started throttling to 1 GHz with video encoding.

After going to a repair shop and telling them about the problem, they put high-quality thermal paste in it for about 100 USD and the problem disappeared. Now I get 100% CPU no matter what I throw at it - pretty incredible for a computer from 2014!

Just fyi..

Thanks for your experience, but the passively cooled 12" Macbook is really a different thing. Basically, it isn't a 2.5 GHz CPU that throttles, but a 1.1 GHz CPU which can boost itself for a very short time, and then it runs into its thermal limit and returns to 1.1 GHz sustained performance.

And on hot day, that boost might last 30 seconds and then that's it.

My guess is different clock speeds; also, the base Air has one GPU core fewer...

And this might be the old production trick where one part of the chip fails QA, so they disable it and make it a cheaper part.

The GPU parts might be the tightest silicon and highest rate of failure so this approach reduces waste.

By "trick" you mean the only approach every chip-maker has been following for decades? Literally every single one. It's called binning.

The gp is using “trick” not with the nefarious connotation, but more along the lines of “hack” or “clever idea”.

yeah I guess I should have said 'hack'

I think it's the proper way rather than a hack.

I think there’s one more notable difference: the M1 MBP also has a brighter screen at 500 vs 400 nits. Both have P3 gamut and are otherwise the same resolution.

The Pro screen is 500 nits vs 400 nits on the air.

Battery is not so important post-covid.

Air is clearly the better value option, if you really want to get one of these.

I guess the cooling lets them tweak the CPU clocks accordingly? Wonder if we can hack the Mac mini with water blocks and squeeze higher clocks. The memory limitation makes it a dud though.

Wouldn't be surprised if the cooling solution were serialised, with the unit detecting whether the cooler was originally programmed for it, like they do now with cameras and other peripherals (check iPhone 12 teardown videos). I bet the logic would check the expected temperature for the given binning and shut down the system if it runs too cool or too hot. Apple knows better than the users what hardware should work with the unit.

For a while, the fan was broken in my 2017 MacBook Pro 13". Didn't spin at all. The MacBook never complained (except when running the Apple hardware diagnostics). It didn't overheat or shut down unexpectedly. It just got a bit slower due to more thermal throttling.

I expect it would work the other way, too. Improve the cooling and performance under load would improve.


This is a video from Linus Tech Tips that demonstrates that no matter how much you cool it, they've physically prevented the chip from taking advantage of it.

And if it could be fixed with software, they would have worked out how, they're into that kinda tweaking.

Intel chips, on the other hand, are designed to work with a varying degree of thermal situations because they don't control the laptop it is put in. In this situation, Apple could potentially get more creative with their approach to thermals because they control the entire hardware stack.

Sure, if by "more creative" you mean handicap the CPUs by not delivering them any power because they know they can't cool them.


Intel processors use Intel designed throttling solutions... which exist to keep their own processors from overheating because they have no control over the final implementation.

These new M1 laptops are the first laptops that have complete thermal solutions designed by a single company.

As an example, there is the potential to design a computer with no throttling at all if you are able to control the entire thermal design.

> As an example, there is the potential to design a computer with no throttling at all if you are able to control the entire thermal design.

This is not true. A laptop needs to work in a cold room, in a hot room, when its radiator is dusty, etc. If your CPU is not willing to throttle itself then a company with Apple's scale will have machines overheating and dying left and right.

For a computer to never _need_ to throttle, either (1)the cooling system has to be good enough to keep up with the max TDP of the CPU, or (2) you "pre-throttle" your CPU by never delivering it more power than the cooling system could handle. Apple refuses to accept solution 1, so they went with solution 2. If you watch the video I posted, it shows that even when there is adequate cooling, the new macbooks will not deliver more power to the CPU. In effect, the CPU is always throttled below its limit.
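The two regimes can be contrasted with a toy model (all numbers are made up for illustration — they are not M1 specs or Apple's actual firmware behavior): a thermally limited chip boosts until the cooler saturates, while a power-capped chip never draws more than its preset limit no matter how good the cooling is:

```python
# Toy model contrasting thermal throttling with a fixed power cap.
# All wattages are illustrative assumptions, not measured M1 values.

def sustained_power(cooling_watts, power_cap_watts, boost_watts):
    """Steady-state package power under the two limiting strategies."""
    # Strategy 1: thermal throttling - run at boost until the cooler
    # can't keep up, then settle at whatever the cooler can dissipate.
    thermal_limited = min(boost_watts, cooling_watts)
    # Strategy 2: "pre-throttle" - never deliver more than the cap,
    # regardless of how much heat the cooler could remove.
    power_capped = min(boost_watts, power_cap_watts)
    return thermal_limited, power_capped

# A fanless chassis (7 W of cooling) vs. a water block (60 W of cooling)
for cooling in (7, 60):
    thermal, capped = sustained_power(cooling, power_cap_watts=15, boost_watts=25)
    print(f"cooling={cooling:>2} W  thermal-limited={thermal} W  power-capped={capped} W")
```

Note how the power-capped design leaves performance on the table once cooling exceeds the cap — which matches what the video shows.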

If Apple actually did that, Louis Rossmann would be out of a job.

No, not in the sense that the cooling lockout would make him unable to fix MacBooks - he clearly has the industry connections to get whatever tools he needs to break that lockout. Yes, in the sense that many Apple laptops have inadequate cooling. Apple has been dangerously redlining Intel chips for a while now - they even install firmware profiles designed to peg the laptop at 90C+ under load. The last Intel MBA had a fan pointed nowhere near the heatsink, probably because they crammed it into the hypothetical fanless Mac they wanted to make.

Apple actually trying to lock the heatsink to the board would indicate that Apple is actually taking cooling seriously for once and probably is engineering less-fragile hardware, at least in one aspect.

So, essentially their new Macbook line is a glorified iPhone/iPad but with a foldable display (on a hinge)?

Not too far-fetched when you see the direction MacOS is headed, UI-wise. And it sounds nice, but if it means that repairability suffers then we'll just end up with a whole wave of disposable laptops.

To be fair to apple, people keep their macbooks for years and years, keeping them out of landfill longer. They are well made and the design doesn't really age. Written on my 2015 Macbook pro.

To be fair to the rest of the world, this comment is written on a 20 year old PC. It has had some component upgrades, but works like a champ after 20 years.

If you keep replacing failed/failing components or give needed upgrades to the system every few years, is it fair to call it 'working like a champ for 20 years'?

I'll take it a step further. Is it fair to even call it the same system after 20 years of changes?

Like the Ship of Theseus thought experiment, at what point does a thing no longer have sufficient continuity to its past to be called the same thing? [1]

[1] https://en.m.wikipedia.org/wiki/Ship_of_Theseus

Some parts are kind of like tyres on a bike, just need to be replaced from time to time, it doesn't mean the bike is bad or not working like a champ.

Yeah, but it does mean it is no longer the same bike. If you replace every part of a bike, even one at a time over years, it is no longer the same bike. So it all depends on what GP means by "replacing some parts". Is it entirely new computer in a 20 year old case? Or is it a 20 year old computer with a couple sticks of RAM thrown in?

Regardless, I have a hard time believing a 20 year old computer is "working like a champ". I've found that most people who say their <insert really old phone or computer> works perfectly have just gotten used to the slowness. Once they upgrade and try to go back for a day, they realize how wrong they were. Like how a 4k monitor looks "pretty good" to someone who uses a 1080p monitor every day, but a 1080p monitor looks like "absolute unusable garbage" to someone who uses a 4k monitor every day.

Definitely not if the metric we care about is keeping components out of landfills.

I don't understand the landfill argument here.

A typical "Upgradable" PC is in a box 10 times the size of the mini. If you upgrade the GPU on a PC, you toss out an older GPU because it has pretty much zero resale value. Typical Apple hardware is used for 10-15 years, often passing between multiple owners.

It's a shame we don't have charities that would take such parts and distribute them to less fortunate countries. Ten years ago, a ten-year-old graphics card would no longer be quite usable, but now a ten-year-old card should work just fine for most tasks, except more advanced gaming.

I don't see the point. There is nothing to put it into. It's far cheaper to just ship modern CPUs with integrated graphics which will be faster and more efficient than that 10 year old GPU. The era where computer components were big enough for it to make sense for them to be discrete parts is coming to a close.

This is particularly true on the lower end where a 10 year old part is even interesting.

I thought you could donate any part of a computer and then people could sort and match, but I think you're right.

if only two parts got replaced, then landfill mass was reduced.

Why do I think of Trigger's Broom when I read this?

Apples and oranges. I've never kept a laptop for five years.

That's only applicable to Macbooks made up to 2015.

I guess I'll throw my 2016 MBP out then.

You probably will before I throw out my 2010 MBP thanks to easily replaced parts.

To me it looks more like they swapped the motherboard out with their own, keeping the rest of the hardware the same.

With RAM and SSD already soldered to the motherboard, repairability can't really get much worse than it already is.

It's not difficult to replace RAM or an SSD with the right tools (which may be within reach of an enthusiast). The problem is that you often cannot buy spare chips, since manufacturers can only sell them to Apple, or that they are serialised - programmed to work only with that particular unit, which then has to be reprogrammed by the manufacturer after the replacement. I think they started doing this after rework tools became affordable to a broader audience. You can get a trinocular microscope, a rework station, and an oven for under $1,000 these days.

You can get a screwdriver (allowing you to replace RAM and SSDs in most laptops, including older macs) for $5. There's really no excuse for them to do this all the while claiming to be environmentally friendly.

Depends on the model. My 2012 mbp15r uses glue and solder, not screws. Maxed out the specs when I got it, which is why it's still usable. Would've been delighted for it to have been thicker and heavier to support DIY upgrades and further improve its longevity while reducing its environmental impact, but that wasn't an option. Needed the retina screen for my work, bit the bullet. Someday maybe there will be a bulletproof user-serviceable laptop form factor w a great screen, battery life and decent keyboard, that can legally run macOs... glad to say my client-issued 2019 mbp16r checks most of those boxes. /ramble

Something like ATX standard but for laptop shells would be awesome - imagine being able to replace a motherboard etc, just like you can with a desktop PC.

Intel tried this more than a decade ago. The designs were as horrible as you might imagine, and a few OEMs did come out with a handful of models and parts.

As I recall, consumers didn’t care or wouldn’t live with the awful designs that they initially brought out. I don’t remember. I remember thinking I wouldn’t touch one after seeing a bunch of engineering samples.

Maybe it was too early for this kind of thing. I could imagine today such shell would be much slicker.

Except the RAM is in the M1 now. Pretty good excuse, I'd say.

Is it? I thought only the memory controller is in the chip, not the memory itself.

The M1's RAM is integrated into the SoC package. But it's still separate RAM chips, not actually on the same die as the M1.

Mmm... it's certainly better than they had before. But really they ought to be designing repairable machines. If that makes them a little slower then so be it.

My 2007 MBP, yes. I don't think that's true of my 2017 MBP, nor my 2012 MBA.

It's been years since Apple did away with this stuff, and nobody expected them to suddenly allow after-market upgrades.

Serialized components should be illegal, frankly.

There are good privacy and security reasons that someone might want serialized components.

Sure, but you add the option to ignore the serialization, or options to reset the IDs as part of the firmware or OS. That way the machine owner can fix it after jumping through some security hoops, rather than requiring an authorized repair store.

Mostly because it's doubtful that state-level actors (or even organized crime) wouldn't just pay off an employee somewhere to "lose" the reprogramming device/etc. Meaning it's only really secure against your average user.

I don't believe those reasons are more important than open access and reducing the environmental impact of planned obsolescence, outside of the kind of government agencies that are exempt from consumer electronics regulations anyway.

Surely there is a better (and I'd bet, more effective) way to handle environmental regulations than mandating specific engineering design patterns within the legal code.

Perhaps instead, it might be a better idea to directly regulate the actions which cause the environmental impact? i.e. the disposal of those items themselves?

Engineers tend to get frustrated with laws that micromanage specific design choices, because engineering practices change over time. Many of the laws that attempt to do so, backfire with unintended consequences.

It is quite possible that your solution might be just that -- many industries with high security needs are already very concerned with hardware tampering. A common current solution for this is "burner" hardware. It is not uncommon for the Fortune 500 to give employees laptops that are used for a single trip to China, and then thrown away. Tech that can give the user assurance that the device hasn't been compromised decreases the chance that these devices will be disposed of.

As a side note, I don't think serialized components is even one of the top 25 factors that does(/would) contribute to unnecessary electronics disposal.

I think resetting instead of bricking doesn't compromise security, but saves a burner laptop from ending up in landfill. I get your point, but I think a company would have to demonstrate that e.g. serialising meets a particular business need distinct from planned obsolescence. This could be part of the certification processes that products have to go through before being marketed.

Based on what legal principle should they be illegal?

In practice, such a law could resemble right-to-repair bills like the one recently passed in Massachusetts, which requires auto manufacturers to give independent repair stores access to all the tools they themselves use. A bill like this for consumer electronics could practically ban serialized components, even without mentioning them explicitly.

Illegal, no. Taxed extra.

Why beat around the bush? If the function of the extra tax is to stop producers from implementing planned obsolescence, then why not just stop them directly and require that components not be serialised etc. as part of the certifications products need to go through? If you add a tax, all you do is restrict access to such products for people with lower income.

the point is to push the market into the correct^Wdesired direction without outright banning anything. non-serialized would be cheaper, hence more accessible. there are use cases where serialized parts are desired (e.g. if i don't want somebody swapping my cpu with a compromised part).

Normally I prefer nudges to bans, but I'm not sure they work on giant monopolies. Unless the tax were high enough to have no chance of passing, Apple would dodge it or write it off as cheaper than being consumer-friendly.

With Apple Silicon, the RAM is not even on the motherboard. It's integrated into the SoC package!

I don’t think that’s true for M1.

Yes, it is true of the M1. RAM is integrated into the M1 SoC package (but not on the same die).

> So, essentially their new Macbook line is a glorified iPhone/iPad but with a foldable display (on a hinge)?

This isn't something new. Since day 1, the iPhone has always been a tiny computer with a forked version of OS X.

> but if it means that repairability suffers then we'll just end up with a whole wave of disposable laptops.

Laptops have been largely "Disposable" for some time. In the case of the Mac, that generally means the laptop lasts for 10-15 years unless there is some catastrophic issue. Generally after that long, when a failure happens even a moderate repair bill is likely to trigger a new purchase.

You won't be able to go beyond the built in P-states which in the end is a power limit not a thermal one.

8GB and 16GB configurations seem more than enough.

I had a quad-core Mini with 16GB in 2011. Almost 10 years later we should be much further, especially as the Intel Mini allows up to 64GB. (Which you probably would use only if you upgraded the memory yourself).

We're not any further in terms of capacity per dollar, but we are advancing in terms of speed.

The M1's memory is LPDDR4X-4266 or LPDDR5-5500 (depending on the model, I guess?) which is about double the frequency of the memory in the Intel Macs.

Apparently, this alone seems to account for a lot of the M1's perf wins — see e.g. the explanation under "Geekbench, Single-Core" here: https://www.cpu-monkey.com/en/cpu-apple_m1-1804

Bleeding-edge-clocked DRAM is a lot more costly per GB to produce than middle-of-the-pack-fast DRAM. (Which is weird, given that process shrinks should make things cheaper; but there's a DRAM cartel, so maybe they've been lazy about process shrinks.)

Not all types of processes shrink equally well.

Apparently DRAM and NAND do not shrink as well because in addition to transistors in both cases you need to store some kind of charge in a way that is measurable later on - and the less material present, the less charge you are able to store, and the harder it is to measure.

> The M1's memory is LPDDR4X-4266 or LPDDR5-5500 (depending on the model, I guess?) which is about double the frequency of the memory in the Intel Macs.

That's a high frequency, but having two LPDDR chips means at most you have 64 bits being transmitted at a time, right? Intel macs (at least the one I checked), along with most x86 laptops and desktops, transfer 128 bits at a time.

> Apparently, this alone seems to account for a lot of the M1's perf wins — see e.g. the explanation under "Geekbench, Single-Core" here

That's a vague and general statement that site always says, so I wouldn't put much stock into it.
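The bus-width point is easy to quantify: peak theoretical bandwidth is just transfer rate times bytes per transfer. The bus widths below follow the assumptions in this thread (a 64-bit LPDDR path vs. a 128-bit DDR4 path), not confirmed M1 specs:

```python
# Peak theoretical memory bandwidth = transfers/sec * bytes per transfer.
# Bus widths here are the commenter's assumption, not confirmed M1 specs.

def peak_bandwidth_gbs(mega_transfers_per_sec, bus_bits):
    bytes_per_transfer = bus_bits // 8
    return mega_transfers_per_sec * bytes_per_transfer / 1000  # GB/s

lpddr4x_64bit = peak_bandwidth_gbs(4266, 64)   # ~34.1 GB/s
ddr4_128bit = peak_bandwidth_gbs(2666, 128)    # ~42.7 GB/s
print(f"LPDDR4X-4266 on a 64-bit bus: {lpddr4x_64bit:.1f} GB/s")
print(f"DDR4-2666 on a 128-bit bus:  {ddr4_128bit:.1f} GB/s")
```

So under these assumptions the higher-clocked LPDDR4X would actually deliver less peak bandwidth than a typical 128-bit DDR4 setup — frequency alone doesn't settle it.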

No virtualisation -> I’m guessing no Docker.

Am I missing something?

Mind you, with 16GB, Docker won’t be that useful.

Why do you think there is no virtualisation? Apple showed Linux running in a VM during WWDC already.

I missed that, I assumed virtualisation was dependent on Intel VT.

Then again I would have expected them to have discussed it as much as the video editing.

I am guessing that they’d need an M2-type chipset to access more RAM for that. Or maybe they’ve got a new way to do virtualisation, since that is such a key thing these days.

Edit: thanks for pointing that out though, that’s why I mentioned it



And they mentioned Virtio here:


How well this fits in with current virtualisation would be interesting to find out; I guess this will be for a later version of Big Sur, with a new beefier M2 chip.

Are they virtualizing x86 though? Having Docker running arm64 on laptops and Docker x86 on servers completely nullifies the main usecase of Docker imo.

But you can run Docker on arm64 servers!

The intel Mac Mini is still available with the same 8GB in its base model, but configurable up to 16/32/64. RAM is definitely the biggest weakness of these new Macs.

On iOS they can get away with less RAM than the rest of the market by killing apps, relaunching them fast, and having severely restricted background processes. On Mac they won't have that luxury. At least they have fast SSDs to help with big pagefiles.

With the unified memory shared with the GPU, your 8GB computer doesn't even have its whole 8GB available as main system memory.

When the touchbar MBP launched in 2016 people were already complaining that it couldn't spec up to 32GB like the competition. Four years later, and it's still capped at 16GB.

Hopefully they can grow this for next year's models.

And the Intel Mac Mini had user-replaceable RAM. Tired of fan noise and slow response, I went from a four-Thunderbolt-port 2018 MacBook Pro with only 8GB of RAM to a 2018 Mac Mini with 32GB of RAM (originally 8GB; I bought the RAM from Amazon and upgraded it).

The difference was incredible

8GB of RAM is just soul-crushing, even for basic office workloads. I need 16GB minimum.

What in a basic office needs 8GB RAM?! I used Word 6.0 under Windows 95 with 64MB of RAM!

Have you looked at Activity Monitor to see what is butchering your memory?!

Well, for starters, the most obvious answer, which is Office 365. Have you glanced at the RAM use of those apps, ever?

Second: web browsers, which can easily grab 5-10GB by themselves or even more if RAM is available.

So in other words: everything.

Probably Chrome.

I'm idling at 18GB right now and doing what I consider to be next to nothing.

It doesn't make sense for the system not to 'grab' a big chunk of your RAM. That is what it is there for. You want stuff to be preloaded into RAM so you can access it quickly if needed. You only want to leave some of it free so that if you launch a new application it has breathing room.

For example Chrome will scale the amount of RAM it reserves based on how much you have available.

> It doesn't make sense for the system not to 'grab' a big chunk of your RAM. That is what it is there for. You want stuff to be preloaded into RAM so you can access it quickly if needed. You only want to leave some of it free so that if you launch a new application it has breathing room.

Cache is excluded from just about any tool that shows RAM use, at least on desktops. If the ram shows as in use, the default assumption should be that it's in active use and/or wasted, not cache/preloading.

> For example Chrome will scale the amount of RAM it reserves based on how much you have available.

Which features are you thinking about that reserve ram, specifically? The only thing I can think of offhand that looks at your system memory is tab killing, and that feature is very bad at letting go of memory until it's already causing problems.

That seems like a hell of a lot of RAM for next to nothing.

I'm not a mac user but that seems ridiculous. I'd be investigating what's hogging it all.

I build my desktops with a lot of ram.

I have chrome, Firefox, photoshop, vs code, docker and a few other things running. As a kid I had to manage RAM. As an adult, I buy enough RAM to not need to think about it.

I was committed to buying an M1 on day one. I won’t buy a machine with only 16GB of RAM.

I'm the same; my current desktop has 32GB, but I'd still be pretty concerned about 18GB in use with nothing running.

The 2018 Intel Mac Mini has user-replaceable RAM. The 2014 mini has fixed RAM.

Another note on the Mini and MacBook Pro (in higher-end SKUs): these both used to have four USB-C ports, and now only have two. The Mini at least keeps its pair of USB-A ports, but on the MBP you're back on the road to dongle-hub land.

I'm betting this is due to Thunderbolt controller and PCIe lane capacity. They couldn't do four Thunderbolt ports with the M1 SoC, so they dropped the ports. Having four USB-C ports but only two supporting Thunderbolt would be a more obvious step back from the previous MacBook Pro. This way people can just blame it on Apple doing Apple things, instead of seeing a technical limitation.

Yes, based on my experience on a mac, I would not buy any mac with less than 32gb ram (I personally use 64gb and it's so much nicer)...

Yes, it seems crazy, yes it's a lot of ram, but I like to be able to run VMs locally and not have to boot up instances on AWS (insert provider of choice), I like to keep tabs open in my browsers, I like not to have to close apps when I'm using them and I like my computer to be snappy. 64 GB allows that 16 doesn't, 32 barely does.

Having read a bit more about the new M1, I really think it is designed and specced for the new Air. The RAM is on the package, which makes it very performant, and 16GB is a reasonable limit for an Air-like computer. The Mini got cheaper and more powerful, so it is not a bad trade-off. I strongly assume there will be variations/successors of the M1 that support more memory and also more IO (more USB4 ports, more screens).

From their schematic, the two DRAM modules were directly on the SoC - possibly to improve bandwidth etc. So it looks like this cannot be upgraded / replaced. That said, it might be worth it to look past the specs and just use your applications on these machines to see how they perform. SSD storage is much faster these days and if the new OS has decently optimized paging, performance will be decent as well.

Also, the lack of 10GbE is a big letdown...

The 16GB limit on the new 13-inch M1 MBP seems like a big downer; I'll wait for the 16-inch MBP refresh now.

You have to factor in possible memory management improvements with the M1 chip, and ability to run iOS apps instead: https://twitter.com/jckarter/status/1326240072522293248

That is fine with the Air. But for a small desktop computer not to support more than 16GB in 2021? Its predecessor allowed up to 64GB (and possibly more with suitable modules).

They've also halved the two base SSD options to 256GB/512GB.

I thought with the last update they'd finally seen the light and moved to 512GB/1TB; now we're back to a silly 256GB.

If you factor in having to upgrade the RAM to 16GB and the SSD to 512GB, it's only £100 shy of the old price. Good, but not as good as it looked to begin with.

You can get an external M.2 USB 3.1 Gen 2 (10Gbps) enclosure plus a 1TB M.2 SSD for $130, or a 2TB for $320. That makes the 16GB/256GB Mac Mini a decent buy at $970 imo.


For the mini sure, but it's a massive pain having an external drive for a laptop. I use one all the time and as well as being waaaaay slower even with a good drive, I lose it all the time.

Yeah, it's not feasible at all to use external storage on a two-port laptop. Dongles that let you plug in power and a monitor are still just not reliable enough for storage; the only reliable connection I can get on my two-port MBP is with a dedicated Apple USB-C to A adapter.

Shocked they're still selling the two port machine, it's been nothing but hassle for me as someone who has to use one.

That's why I'm waiting to upgrade my laptop.

I've still got my 2013 MacBook Pro. Still chugging along.

I'm hoping it can wait for v2 of the MacBook Pro 16"

Apple's second version of everything is always worth the wait.

For pro users, the fact that 32GB isn’t even an option is pretty surprising

My guess is that the next wave will be Pro.

And they will have significantly upgraded CPU/GPUs to match the memory.

But it's right there in the name: 13" MacBook Pro

The 13" 'pro' has never really been a 'real' pro. They were/are always spec'd with less cores than the 15"/16" and never had dedicated graphics.

There are two lines of 13" MacBook Pro, the two-port and four-port versions. The two-port always lagged behind the four-port, with older CPUs, less RAM, etc. The four-port (which has not yet been replaced) is configurable to 32GB of RAM.

Entry level 13" MacBook Pro is for prosumers.

Think web developers, photographers, bloggers etc.

Web developers and photographers are the opposite of 'prosumers', kind of by definition. Plus, think of the size of a full res photo coming out of a high-end phone, never mind a DSLR.

Most of the professional photographers I work with have PC workstations with 64GB to 256GB of RAM. Retouching a 48MP HDR file in Photoshop needs roughly 800MB of RAM per layer, per undo step.

Old undo steps could be dumped to SSD pretty easily.

And while I understand that many people are stuck on Photoshop, I bet it would be easy to beat 800MB by a whole lot. But so I can grasp the situation better: how many non-adjustment layers do those professional photographers use? And of those layers, how many have pixel data covering more than 10% of the image?

From what I've seen, quite a lot of layers are effectively copies of the original image with global processing applied, e.g. different color temperature, blur, bloom, flare, hdr tone mapping, high-pass filter, local contrast equalization. And then those layers are being blended together using opacity masks.

For a model photo shoot retouch, you'd usually have copy layers with fine skin details (to be overlaid on top) and below that you have layers with more rough skin texture which you blur.

Also, quite a lot of them have rim lighting pointed on by using a copy of the image with remapped colors.

Then there's fake bokeh, local glow for warmth, liquify, etc.

So I would assume that the final file has 10 layers, all roughly 8000x6000px, stored in RGB as floats (because you need negative values) and blended together with alpha masks. And I'd estimate that the average layer affects 80%+ of all pixels. So you effectively need to keep all of that in memory, because once you modify one of the lower layers (e.g. blur a wrinkle out of the skin) you need all the higher layers to composite the final visible pixel value.

Huh, so a lot of data that could be stored in a compact way but probably won't be for various reasons.

Still, an 8k by 6k layer with 16 bit floats (which are plenty), stored in full, is less than 400MB. You can fit at least eleven into 4GB of memory.

I'll easily believe that those huge amounts of RAM make things go more smoothly, but it's probably more of a "photoshop doesn't try very hard to optimize memory use" problem than something inherent to photo editing.
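The arithmetic above is easy to sanity-check. A quick sketch, assuming uncompressed RGBA layers at the 8000x6000px size discussed in this thread:

```python
# Rough memory footprint of one fully rasterized image layer,
# using the 8000x6000 px example from the discussion above.
def layer_bytes(width, height, channels=4, bytes_per_sample=2):
    """Uncompressed size of one layer (fp16 RGBA by default)."""
    return width * height * channels * bytes_per_sample

w, h = 8000, 6000
fp16 = layer_bytes(w, h, bytes_per_sample=2)  # 16-bit float samples
fp32 = layer_bytes(w, h, bytes_per_sample=4)  # 32-bit float samples

print(f"fp16 layer: {fp16 / 2**20:.0f} MiB")        # ~366 MiB, under 400MB as claimed
print(f"fp32 layer: {fp32 / 2**20:.0f} MiB")        # ~732 MiB, near the 800MB figure
print(f"10 fp16 layers: {10 * fp16 / 2**30:.1f} GiB")
```

So the "800MB per layer" number is consistent with fp32 storage, and halving the sample size to fp16 is where the "less than 400MB" claim comes from.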

So why are you blaming the end user for needing more hardware specs than you'd prefer because some 3rd party software vendor they are beholden to makes inefficient software?

Also, your "could be stored in a compact way" is meaningless. Unless your name is Richard and you've designed middle out compression, we are where we are as end users. I'd be happy if someone with your genius insights into editing of photo/video data would go to work for Adobe and revolutionize the way computers handle all of that data. Clearly, they have been at this too long and cannot learn a new trick. Better yet, form your own startup and compete directly with the behemoth that Adobe is and unburden all of us that are suffering life with monthly rental software with underspec'd hardware. Please, we're begging.

Where did I blame the end user?

> Also, your "could be stored in a compact way" is meaningless. [...]

That's getting way too personal. What the heck?

I'm not suggesting anything complex, either. If someone copies a layer 5 times and applies a low-cpu-cost filter to each copy, you don't have to store the result, just the original data and the filter parameters. You might be able to get something like this already, but it doesn't happen automatically. There are valid tradeoffs in simplicity vs. speed vs. memory.

"Could be done differently" is not me insulting everyone that doesn't do it that way!

You must not rate photographers very highly if you are mentioning them in the same sentence as "bloggers, etc".

I should wait for a 64GB option. I've already got 16GB on all my older laptops, so when buying a new gadget, the RAM and SSD should have better specs (you feel more RAM more than more cores in many usage scenarios).

It was surprising to see essentially the same form factor and the same operating system, with not much to distinguish the three machines presented (lots of repetition like "faster compiles with Xcode").

BTW, what's the size and weight of the new Air compared to the MacBook (which I liked, but which was killed before I could get one)?

Seeing two machines that are nearly identical reminds me of countries with two mainstream political parties: neither clearly communicates what its USP is...

I don't think today's computers were aimed at those kinds of usecases.

Apple has a "missing middle" problem.

They have a ton of fantastic consumer-level computing devices, and one ridiculously-priced mega-computer.

But there are many of us that want something in the upper-middle: a fast computer that is upgradable, but maybe $2k and not $6k (and up).

(The iMac Pro is a dud. People that want a powerful desktop generally don't want a non-upgradable all-in-one.)

Apple's solution to upgradability for their corporate customers is their leasing program. Rather than swapping parts in the Mac, you swap the Mac itself for a more powerful model when needed, without having to buy or sell anything.

Apple has a missing-middle _strategy_.

Apple doesn't care about your upgradability concerns on the notebook lineup. Once you get past that, it has traditionally done fairly well at covering a wide spectrum of users from the fanless MacBook to the high-powered MacBook Pros.

I have a late-2013 13" MBP with 16GB of memory. Seven years later I would expect a 13" MBP to support at least 32GB. I can get 13" Windows laptops that support 32GB of memory. The Mini is a regression, from 64GB to 16GB of memory. The only computer worth a damn is the new MBA.

Pretty sure my 2014-ish 13-inch MBP with 16GB and 512GB of storage cost me around £1,200; today, speccing an M1 13-inch MBP to the same six-year-old specs would cost almost £2,000.

Seems absurd.

Wait just a bit and I'm sure your concerns in this area will entirely disappear.

They already disappeared, I switched to Windows in 2019.

I use MacStadium for compiling and testing iOS apps. I was wondering if the ARM machines would be worth a look, but they are disappointing. If I was still using Macs as my daily driver, I would buy the new MBA for a personal machine.

But 16GB is what I had in a computer 10 years ago.

I was just window-shopping a new gaming rig, and 32GB is affordable (100 bucks), 64GB (200 bucks). Cheap as shit; what’s the holdup?

The memory is on-package, not way out somewhere on the logic board. This will increase speed quite a bit, but it limits the physical size of the memory modules, and thus the amount. I think they've worked themselves into a corner here until the 16", which has a discrete GPU and a reconfiguration of the package.

A little further up it was shown that the memory in the M1 is 5.5GHz DDR5. https://news.ycombinator.com/item?id=25050625

Can you please provide the link to 64GB DDR5-5500 for $200? I'd love to buy some too!

That's fair, but if they chose a fast but expensive and unexpandable technology, the choice may be a failure from some perspectives. I think most people who buy the mini prefer RAM capacity over a faster iGPU.


I guess DDR5-5500 runs you $350. It looks like Apple charges you $600 for 32GB of DDR4. I don’t know, what am I missing here?

Can you actually link to a product, not a search ? Because none of the items coming up there are DDR5-5500, they're all DDR4-3600 or worse, as far as I can see.

I guess I was wrong; everything is DDR4.

I’m confused. The link is for DDR4, it’s all too slow, and Apple doesn’t offer a 32GB M1 option at this time.

A new processor architecture. Wait a couple months and you'll probably have the computer you wanted released too.

The DRAM seems to be integrated on the same package as the SoC.

I went to Apple's website right after I finished watching the keynote with the intention of buying a new Mac mini ... the lack of memory options above 16GB stopped that idea dead in its tracks though.

Also no 10G networking option. The combination of those feature exclusions makes it a dud for me; I don't want to have a $150 TB3 adapter hanging off the back, not when previous gen had it built in.

I bet “pros” never bought it and it’s only been viable as a basic desktop. Probably nobody ordered the 10 gigabit upgrade.

I bet they’re only upgrading it because it was essentially free. They already developed it for the developer transition kit.

I commend the idea of a small enthusiast mini desktop like a NUC, but I don’t think the market is really there; and if it is, it’s not interested in a Mac.

I think it is notable the new mini’s colour is light silver, rather than the previous dark ‘pro’ silver. Presumably there will be another model a year from now.

Over the years the mini has had a variety of different shades and designs. I wouldn't read too much into it.

I bet “pros” never bought it.

16GB can be limiting for some workflows today, and doesn't give you much future-proofing (this RAM is permanent, right?)

Yes, it's in the SoC (or SiP now).

How do they get the RAM into the SoC? Is it like a massive die?

Don’t touch Xcode then, it welcomes you to paging hell.

With that fast SSD, do you notice paging in Xcode? Would it be worth the extra $300 or however much Apple asks for extra 8GB of ram in the US store?

yes, you notice paging, even with 'fast' SSD.

Or perhaps it's not 'paging', and it's just dumb luck that I regularly hit and see beachballs on multiple new higher-end MacBook Pros.

It's not normally paging but thermal throttling, which involves the machine appearing to 'spin' while it's actually just the kernel keeping the cycles to itself; that typically gives you beachballs as a side effect.

And one tip: use the right-hand USB-C ports for charging, not the left-hand ones, as for some reason they tend to make the machine heat up more...

The right-hand ones are the only ones that can drive external monitors (on mine, anyway). I feel like I'm the only one who has this. I had a 2019 MBP (first batch), and I thought I'd read that one side was different from the other re: power. Power works on both sides, but monitors won't run from the left USB-C ports. It's not documented anywhere. :/

thx for tip.

I just tried plugging my monitor into a right hand socket on my 2019 MBP, and it worked fine for me.

On my machine it is true that charging on the right is better. Charging on the left spins up the fan.

Just a thought, but maybe everyone should be appalled at that extra $300. And the lack of upgradability on a Pro machine, especially.

You're talking to Apple customers. Being gouged is a way of life for them.

The wording in the event supports this. Particularly when speaking about the Mini's fan "unlocking" the potential of the M1 chip.

This makes sense.

Most likely this is why the CPUs are all limited to 16GB. When they unveil the 16-inch MacBook Pro, it will likely open up more configurations (more RAM in particular!) for the 13" MacBook Pro and hopefully the mini.

Going into the event, my thinking was that they'd have two aims:

1. "Wow" the audience given the anticipation without a full overhaul of the range. 2. Deliver some solid products that enable the transition while being effective for non-enthusiasts.

From my viewing they hit both. I expect they'll fill in the range next fall with bigger upgrades to the form factor.

I agree. It almost feels like they are going to have three main M-series CPUs: this one, one for the iMac and higher-end MBPs, and perhaps a third for the high-end iMac/Mac Pro.

RAM limits are pretty easy to explain. 16GB chips cost disproportionately more and use more power.

I wonder if they use two 4GB chips or one 8GB chip in the low-end SKU?

It's even easier to explain than that. The RAM is integrated into the CPU. While there are a few SKUs here, Apple only designed and built one CPU with 16GB of RAM. The CPUs are binned: the ones where all RAM passed testing are sold as 16GB; the 8GB SKUs had a failure in one bank of RAM.

There are no 32 or 64 GB models because Apple isn't making a CPU with 32 or 64GB of RAM yet.

It looked to me like they were placing two DDR4 modules beside the chip.


Well maybe? Huh. I don't know now, certainly looks like it.

The logic board probably isn't the same, but the SoC [probably] is identical, and with it a lot of the surrounding features/implementation. My own speculation as well of course :)

I doubt the logic board is the same. It’s just that the M1 integrates so much.

Actually, that's what I noticed in the video as well. The Mac mini has a huge amount of empty space, and the only difference was the cooling unit fitted on top.

There used to be a DVD drive and a hard drive in there, so there's bound to be spare room.

When comparing the mini to other SFF computers, be sure to note that the mini has its power supply built in, whereas most of the others have it external.

If you look at the power supplies it's 30 vs 60 Watts, definitely interested to see what kind of TDP Apple targets with these machines.

They've stated that they target 10W in the Air; the cooling system in the Pro is identical to the Intel one's, so probably 28W-ish; and the Mac Mini specs say a maximum of 150W continuous power, but that probably includes a few tens of watts for USB and peripherals.

Two-port Pros usually have 15-watt parts and only a single fan. Of course, PL2 goes far above 15W.

M1 chip page shows that 10 watts is the thermal envelope for the Air

So I get the whole performance-per-watt spiel, but if they're targeting the same 10W as the Ice Lake Y-series[1], it's going to be hot during continuous workloads like screen sharing, since they've seemingly decided to get rid of even the "chassis fan" they had on the 2019 Air.

[1] https://ark.intel.com/content/www/us/en/ark/products/196596/...

Worth noting that Intel doesn't actually honor those 10W listings, and often boosts above it for significant portions of usage.

This would make sense given the pricing, too.

For example, the $1,249 Air is very similar to the $1,299 Pro. The MBA has a bigger SSD, but the MBP has a bigger battery, isn't throttled (i.e. has a fan), and also has the Touch Bar (which may be controversial, but for the sake of comparison, remember that it comes with a manufacturing cost).

It seems reasonable that these are priced similarly. Of course, the machine I want is the MBP with the bigger SSD and no touchbar :)

The max RAM in all three is only 16GB :(

I noticed the slides kept saying "Up to" 8 GPU cores.

That left me wondering if there are different variants of the M1 with different core counts.

(Note: It always said 8 CPU cores)

Looks like there's two variations of the Air: one with 7 GPU cores and one with 8 GPU cores.


According to Apple's website, the base MacBook Air appears to have only 7 active GPU cores. I suspect the chips in the Air may be binned separately and may or may not support higher clock speeds, even with the active cooling of the Mac Mini and MacBook Pro.

7 on the base model, 8 on the upgrade. You're probably correct that this is a binning thing.

According to the tech specs on the Apple site, the Air is available with 7 or 8 GPU cores. All other new Macs have 8.

They did the exact same thing with the A12X and the A12Z. 7-core vs 8-core GPU is the only real difference between them.

My guess is this is a yield thing for Apple Silicon? They use the same chips for the Air and Pro, but shut off a faulty GPU core that didn't pass QA. Or a temperature thing.

One of the slides mentioned that the Air is limited to 10 watts, though. I wonder if it has the same SoC but is capped at 10 watts.

The two differences are cooling and that the base Air appears to receive binned processors with 7 GPU cores.
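The binning logic being speculated about in this subthread would amount to something like the following (purely illustrative, not Apple's actual process):

```python
def bin_gpu(core_ok):
    """Classify a die by how many of its 8 GPU cores passed test.

    core_ok: list of 8 booleans, one per GPU core.
    Returns the SKU the die could be sold as, or None to scrap it.
    """
    good = sum(core_ok)
    if good == 8:
        return "8-core GPU"  # MacBook Pro / Mini / upgraded Air
    if good == 7:
        return "7-core GPU"  # base MacBook Air
    return None  # scrap, or hold for some future lower-end SKU

print(bin_gpu([True] * 8))            # 8-core GPU
print(bin_gpu([True] * 7 + [False]))  # 7-core GPU
```

The point of selling the 7-core bin is that a single dead core no longer scraps the whole die, which matters on a brand-new 5nm process where yields are still maturing.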

The MBP now only has 2 USB-C/Thunderbolt ports which would support this theory.

That's the same as previous low-end MBP.

That's right, but the 'regular' one had 4. I've already seen a pro user (in music production) complain about this.

But my point here is that the fact they are both the same supports the theory that the logic board is the same on both models.

They’re continuing to sell the 4 port Intel 13” Pro.

Maybe they'll launch a more expensive model with 4 USB-C/Thunderbolt ports and their more powerful chip (and up to 64/128GB of memory), like they did with the earlier 13" MBPs.
