Apple M1 Pro and Max Surprises (mondaynote.com)
154 points by evo_9 on Oct 26, 2021 | 140 comments



While there’s plenty of “told you so” -> “no you didn’t” being flung around here quite childishly, my read on this is:

Apple surprised people by what these actually turned out to be.

They’re better than folk expected.

We’ll have to wait and see how that plays out in the long run, but I don’t really think it’s something to get worked up over.

Yay, new cpus that don’t suck on a laptop that doesn’t suck.

I know it’s a low bar to clear, but it still makes me happy.


I think people are excited because it's been many years since any large manufacturer has shown they care about things professionals care about in laptops. IBM used to take care of us but after they sold ThinkPad to Lenovo it really wasn't ever the same. Apple used to take care of us but then the "March of Thin" happened. There are small companies like System76 but I think they mostly just use ODMs like Clevo et al. There are also some near-misses from Dell with their XPS line, but they designed those for consumers and tried to adapt them.

This is really the first time in years a purpose-built professional laptop has come out with nice build quality, keyboard, display etc. that clearly prioritizes pro users both in form (it's fatter for cooling again!) and function (beastly ARM SoC).

So while it's a low bar it's a loooong time since it's been cleared.


A similar thing happened with the Mac Pro. The old 2012 era Mac Pro was much loved. Then the "trash can" came out which was form over function. Then Apple eventually admitted their mistake and went back to a tower design.

Apple just had an extremely fallow decade in the 2010s (in terms of design) because they were making so much $$$ with iPhone that they stopped caring about the Mac. Jony Ive also left the company during that period.


I wish they had gone for a design that scaled down better with the Mac Pro. It's an amazing machine and it's great for professional video production, which is its raison d'être, but it would have been nice if it scaled down well enough for software engineers. Sort of like how the old G5 did; it was a "pro" machine, rather than a "professional video editing rig".


I suspect the M1 Pro will eventually find its way into a Mac Mini since they certainly have the thermal headroom for it, and the new Mac Pro will get some kind of M1 Supermax.


There's definitely a long-time rumor that a more powerful Apple Silicon Mac mini is in the works. It was expected to come out with the macbooks, but it didn't: https://www.macrumors.com/2021/08/22/gurman-high-end-mac-min...


If the predictions by Gurman hold true, they have chips in planning which are twice and four times the size of the current M1 Max...

I so hope there will be a Mini with the M1 Max, as this would finally be the desktop Mac so many people have been waiting for (me included).


I really wanted a cheaper Mac tower ten years ago but most of the reasons I wanted one aren’t really relevant anymore:

The future finally happened in terms of fast external peripherals (especially storage), and I don’t have a burning need for a Mac with upgradable internal storage anymore.

In 2010 I got a $100 graphics card that let my PC do things that the PS3 and Xbox could only dream of. Now for the cost of a “mid-range” GPU it’s possible to get a Switch with a game or two that look fine and run at fast FPS.

One thing: Macs need to cut the BS and give us upgradable RAM back


Low power, high performance, upgradable RAM: pick two.

RAM slots aren't free; all that extra interconnect length increases power requirements and becomes impractical with large buses. You'd need 8 SO-DIMM slots to match the bus width of the M1 Max models, at much higher power consumption and slower performance. LPDDR RAM doesn't even exist in removable form for these reasons.


For those wondering about the math: a SODIMM channel (e.g. DDR5) is 64 bits. An LPDDR channel is 16 bits. M1 has 8 of those (128 bits), M1 Pro has 16 (256 bits), and M1 Max has 32 (512 bits). That means the M1 is equivalent to dual channel RAM, the Pro to quad channel RAM, and the Max to octa channel RAM, in typical desktop/laptop RAM terms.

Modern embedded SoC integration is insane. Yes, the "two" memory chips you see on the M1 are really 8 discrete dies/chips (4 stacked together inside each of the two packages), and the "two" big fat things around the Pro are actually 2x8; double them up for the Max and that's how you get to 32 channels. Good luck putting that through a connector...
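
If you want to play with the arithmetic yourself, here's a minimal Swift sketch using only the channel widths quoted above (16-bit LPDDR channels vs 64-bit SODIMM channels); no bandwidth figures, just counts times widths:

    // Bus-width arithmetic from the comment above.
    let lpddrChannelBits = 16    // one LPDDR channel
    let sodimmChannelBits = 64   // one DDR4/DDR5 SODIMM channel

    let chips: [(name: String, lpddrChannels: Int)] = [
        ("M1", 8), ("M1 Pro", 16), ("M1 Max", 32),
    ]

    for chip in chips {
        let busBits = chip.lpddrChannels * lpddrChannelBits
        let sodimmEquivalent = busBits / sodimmChannelBits
        // M1: 128-bit ≈ 2 channels, M1 Pro: 256-bit ≈ 4, M1 Max: 512-bit ≈ 8
        print("\(chip.name): \(busBits)-bit bus ≈ \(sodimmEquivalent)-channel SODIMM setup")
    }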


All macs should have 32GB without customization. It’s not so expensive without sockets, and Apple customers deserve the best. We shouldn’t even have to choose.


The “but muh upgradable RAM” folks need to realize: computers are so fast now that just making RAM human-replaceable requires such long connectors that the resulting speeds are unacceptably low (due to speed of light vs clock cycle time).


Maybe we will see a return of the Amiga's 'fast RAM' expansion style, with high-price, high-performance modules designed for specific functions and improvements.


>Macs need to cut the BS and give us upgradable RAM back

I don't see that happening any time soon, not with their new focus on building the core as an SoC. Maybe with the Mac Pros there will be on-SoC RAM and slower off-SoC RAM, but I'll doubt it till I see it.


> One thing: Macs need to cut the BS and give us upgradable RAM back

Never going to happen, as Apple loathes users' freedom. They prefer selling a disposable brick.


More like a vocal minority of users want upgradable RAM. Most people don't even know what RAM is and think it's the thing that stores their files.


> There are small companies like System76 but I think they mostly just use ODMs like Clevo et al.

Yes. I remember seeing several Clevo models in their offerings. Like most laptops, they have really bad thermal management and atrocious manufacturer software.


I have a 4800H with a 1070 mobile GPU, 32GB RAM, a 512GB NVMe SSD and a 1TB SSHD. I paid about $1000 all in for that ($800 laptop plus RAM and storage).

The M1 stuff has "tensor" cores and allegedly a faster GPU - that is, it's faster at some things my GPU handles for me, such as H.264/H.265 encoding.

Why should I get rid of my laptop and use the apple instead?

Ignore the 140W PSU!


Hey, if what you have works for you then by all means keep using it. Less e-waste is always a good thing. If this new machine can save you time, maybe it's worth an upgrade. My last upgrade brought my compile times down by more than five minutes, which makes it absolutely worth the cost. It looks like one of these new MBPs might do that again, so I'm going to get one.

Another thing to consider is that it may be marginally easier on you to develop on an ARM machine if you're planning on deploying to ARM servers, though this depends heavily on your workload.


From my experience, Apple has a premium on overall quality and longevity. It's hard to quantify or benchmark, but it adds up over time.

I bought a Zephyrus G14 a year ago with high stats, but I often get weird issues where I have to restart my computer or find some fix online. And there are some build/design quirks that are a little annoying. I already want to replace it.

I also have a MacBook Pro from 2015 that I only switched away from because I wanted a better GPU. It still works fine and looks great. I don't think the G14 will be so great after 6 years.


My wife has been using a MacBook Pro 15" 2015 that we bought in 2016 rather heavily. The only significant problem is overheating due to dust accumulation, which requires bringing it to a local non-Apple repair shop for cleaning once a year. Another problem is that after 4 years the speakers started to produce clicking sounds and had to be replaced. It was about 100 Euro in the same shop.

We also have a MacBook 13" from 2018. The famous butterfly keyboard on it broke after a year. It was repaired free of charge, but it took the Apple reseller 2 weeks to do that. And then the overheating was terrible even after a dust cleanup; it was necessary to use a cooling pad under heavy use. When I got a newer laptop from my job I tried to sell it for a reasonable price, but it turned out the resale value was much lower than I expected. So we keep it as a sort-of spare/guest machine.


> From my experience, Apple has a premium on overall quality and longevity. It's hard to quantify or benchmark, but it adds up over time.

I agree. If you know how to hack, of course hardware lasts longer. I think I have a Motorola PPC[0] Mac somewhere that boots BSD.

[0] totally 603e


The M1 Max / Pro is likely faster in every way compared to the $1000 laptop you bought, mostly by significant amounts.


It also costs at least 2.6x as much¹ so the question becomes "do you get a 2.6x productivity gain as well?"

1. https://www.apple.com/shop/buy-mac/macbook-pro/14-inch-space...

configured with 32GiB RAM and 1TB SSD for $2599


> It also costs at least 2.6x as much¹ so the question becomes "do you get a 2.6x productivity gain as well?"

Wrong question. You're comparing different values. Even 5-10% productivity gain can offset the 2.6x difference.


> Even 5-10% productivity gain can offset the 2.6x difference.

I think this heavily depends on what you're gaining. I think for a lot of tasks or scenarios it's not worth it. Especially around things like video encoding.

For example, if you're a solo worker and you need to batch render 10 hours of video, and your computer is essentially unusable for that time, you would do this overnight. If rendering everything finishes in 5 hours instead of 6 hours, that doesn't matter because you're sleeping. If you need to render a quick one-off video that takes 10 minutes, there's a bunch of semi-productive things you can do while that finishes, and it finishing in 9 minutes instead won't make a noticeable difference.

Now you might say if you're a professional video renderer you'll want to render videos during business hours as fast as possible in which case if you're operating at that scale you wouldn't use your daily laptop to do the rendering. You would remotely render it on a dedicated server (this is how most "pro" places work). This way you can comfortably edit video #2 while video #1 is rendering.

Even 6 year old hardware that costs $800 all-in can give you sub-100ms code reload times in most Rails, Django, Phoenix, Laravel and Node web apps even with Webpack or esbuild running -- even running within Docker on Windows (WSL 2) or native Linux. Also all major code editors can run well without input latency. That's a decently large amount of "pro" web development use cases. Unless you're doing hardcore continuous compiling in a spot where you expect to do this sequentially on your main laptop I can't really see a 10% gain being worth a 2.5x+ cost increase for most dev work.


For video editing the productivity gains are mainly in the editing, not in the rendering. Rendering time is more like a performance metric. More responsive editing process, effortless real-time previews - these are the real gains.


For sure but a 10% gain isn't a gain if you already reached your target goal of smoothness.

For example if you can edit at $your_target_resolution and everything is liquid smooth without dropping frames even with a decent amount of effects then any improvement is overkill.

I'm sure for crazy high-end setups (8k video, etc.) an M1 Max would be nice, but for other folks editing 1080p or 1440p videos at 30-60 fps you can get liquid smooth editing even with ~5-6 year old hardware that costs $1,000 all-in (a bit more expensive now due to GPU prices being temporarily crazy).


That depends entirely on the circumstances (whether the performance increase of the local machine can get you that, that is).

It's really a very specific niche that's targeted here: namely mobile(!) video editing and production.

There are very few use cases these days that benefit from a high-performance mobile (cannot stress this enough) solution.

So unless your specific use case benefits at all from local processing power, I'd argue that it'd be hard to even get a 5% productivity increase that cannot be reached by other means (better/bigger display, more ergonomic peripherals, optimised workflow, etc.).

I'm not denying the fact that even a 5% improvement may justify the price difference. I'm just questioning whether the improvement in performance can meaningfully contribute to that in most use cases.


Exactly. If your time is expensive (as it is for most senior developers in first world markets), even 15% more productivity could easily pay the price premium in very little time.


Being X% faster while consuming Y% less energy and therefore being Z% quieter and having W% longer battery life absolutely outweighs 2.6x price. Add build quality, longevity, resell value, this isn’t even fair, the laptop market is basically garbage or Apple these days.


We have one guy with the Dell XPS in the office. This laptop keeps making noise…


As the other person mentioned even a small productivity difference can offset the cost. And we haven't even touched on QoL items like trackpads, screens, and battery life. Everyone is so focused on AS it feels like the screens are not being talked about enough.


As I mentioned, QoL items can be had externally for substantially less money (e.g. displays, mice, keyboards).

The improvement only materialises in a very niche use case (e.g. primarily mobile workplaces with virtually no additional weight/space for things like mice).


The cost of equipment is negligible compared to the personnel cost, so unless your machines work by themselves that's not very relevant. That's why professionals and the people having to pay for and maintain them care so much about quality and ease.


I clicked a couple of comparisons and they say the opposite.

https://www.cpubenchmark.net/compare/AMD-Ryzen-7-4800H-vs-Ap...

The rest is not strictly about M1/Apple, but I found technically superior hardware can be quite sluggish depending on your software stack.


That seems to be about the Macbook Air M1


Macbook trackpads blow all others out of the water. At least they did up until two years ago when I was in the market last.


This is still the main thing that sends me back to mac every time I try to switch to a linux daily driver.


That seems like a lot of laptop for sub-$1000! Where did you buy/spec it?


> Why should I get rid of my laptop and use the apple instead?

You probably shouldn't. Your laptop is more than adequate for your tasks. It may even be better in some respects (you can play some AAA games if you want).


Well, I would guess the screen is way nicer on the MBP, which might be something one cares about, having to stare at it all day (not saying you should switch).


If the machine is used primarily with external monitors that point becomes much less important, though.

Another question is whether the undoubtedly much better screen is worth the >2x price tag. You can get a nice portable monitor and a very good stationary monitor for just the price difference.


> Another question is whether the undoubtedly much better screen is worth the >2x price tag

For me, yes! It's how I interact with the machine for hours and hours each day. This is also why the keyboards over the last few years were such a travesty - they are the primary input method.


That's def a personal choice, and for me it def. is :)


Wrong comparison. You are comparing the old M1 (that laptop is around 1000 bucks). Here's the comparison with the M1 Max: https://nanoreview.net/en/cpu-compare/apple-m1-max-vs-amd-ry...


For me the biggest reason is that I can't (locally) compile for iOS on anything but a Mac.


It is always interesting to look back. It is clear Intel has barely made any improvement since 2016, and the jump from 2014 to 2016 wasn't that spectacular in terms of IPC. PCI-E also happened to be stuck at PCI-E 3.0 for a long time. DDR(5) improvement slowed to the point that all resources went to LPDDR (although one could argue LPDDR was more in demand). DRAM and NAND prices rose because of a supply and demand cycle for 3 years. Crypto caused GPU price hikes. HDD prices only went back to normal in 2016 after the Thailand flood, and HDD roadmaps such as HAMR and MAMR were all 5+ years behind. Wireless 802.11ax suffered a 3-4 year delay compared to the original schedule. It took a long time to agree on NBase-T.

We basically had a 5+ year gap of little improvement in the PC industry (comparatively speaking).


In parallel:

7 year gap, during which Apple also experimented with the trashcan, the emoji touchbar, the shallow keyboard, the non-magsafe non-SD connectors. It was like a pause on personal computing.

7-year gap from 2013 to 2020 where social media became political machines, and all companies became cloud-first.


Oh Yes. It is interesting to look back now.

Strange decade indeed. Unless you were in the US and the financial disaster hit you hard, I think that, looking back from 2010, the 2000s were comparatively uneventful.


If the author doesn't ring a bell, he's a prominent ex-Apple exec. https://en.wikipedia.org/wiki/Jean-Louis_Gassée


Gassee also founded Be. BeOS was always a pretty remarkable piece of engineering.

I seem to recall the future of Apple hung in the balance, and they were looking to acquire either NeXT (and Steve) as the foundation for the modern Mac - or Gassee and Be. Sort of a fun parallel universe of what could have been.


Glad they went with NeXT though--Obj-C is a much better framework language than C++.

I honestly think Objective-C is a big part of Apple's OS success.

Edit:

Building system APIs based on message passing provides a lot of flexibility and capability without having to worry about ABI compatibility or object layout. Message passing is good for hiding implementation details.

Obj-C also provides what Swift now calls protocol-oriented programming. Define a protocol and any object no matter its type can implement that protocol and be, say, a delegate for a long running operation. Etc.

This allowed Apple to innovate in their OS, providing increasingly more functionality more rapidly than what seemed to be happening elsewhere.

I think there's even more to this, but I'm tired.
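
To make the protocol/delegate point concrete, here's a minimal Swift sketch (the DownloadTask/DownloadTaskDelegate names are made up for illustration, not an Apple API):

    import Foundation

    // A hypothetical framework type that only knows about a protocol,
    // not about any concrete delegate class.
    protocol DownloadTaskDelegate: AnyObject {
        func downloadTask(_ task: DownloadTask, didFinishWith data: Data)
    }

    final class DownloadTask {
        weak var delegate: DownloadTaskDelegate?

        func finish(with data: Data) {
            // The task only sends a message to whatever conforms to the protocol.
            delegate?.downloadTask(self, didFinishWith: data)
        }
    }

    // Any object, regardless of its class hierarchy, can adopt the protocol.
    final class Logger: DownloadTaskDelegate {
        func downloadTask(_ task: DownloadTask, didFinishWith data: Data) {
            print("finished with \(data.count) bytes")
        }
    }

    let task = DownloadTask()
    let logger = Logger()
    task.delegate = logger
    task.finish(with: Data([0x01, 0x02]))   // prints "finished with 2 bytes"

Because the framework only talks to the protocol, the internals behind it can change without every client caring about the concrete type, which is roughly the flexibility being described.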


Obj-C, in my experience, is a great language, particularly for building GUIs, and I concur it was a big part of Apple's success with OSX and later the iPhone and iPad. With the core foundation classes (and other frameworks) Apple inherited from NeXT an excellent bedrock to build on.

If you consider that for a long period of time all apps for Mac & iOS were built in Obj-C, and how quickly they managed to build an ecosystem of developers writing applications for Apple systems - I'm pretty sure if they had had a C++ framework that wouldn't have happened so easily.

As an illustration, there's a video of SJ building a quick GUI app to demonstrate RemoteObjects (in response to MS's Distributed OLE) where he quickly knocks up a slider and text box. There's also another video where a NeXT developer goes head to head with a Sun dev in a race to finish a simple gui app (you'll never guess who wins).

Why did SJ and NeXT go all in for RAD tooling and making it easy to build apps on their platform? Because of feedback from his days at Apple, when developers complained it was too hard and slow using the Mac OS Toolbox APIs.

FWIW I also believe that a large part of Microsoft's success was due to Visual Basic, also a hugely successful RAD tool that brought thousands of new developers to its platform.


A language like C++ without a stable ABI makes a terrible OS framework language since you'd have to recompile apps every time the framework classes change.


That's one of the problems the original BeOS had -- they were locked into the gcc 2.95 ABI. In the late days of BeOS, gcc 3 was released and it was not possible to use it.


I also think C++ is uniquely bad for this. There are maybe other languages that would fare better, perhaps even C.

For example, CoreFoundation has a C API... that would be a bit of a test of a C-only framework world.

You can see however that's missing the convenience of Obj-C.

I hope Swift will be successful. It is nice to use (most of the time).


Sure. Win32 and Unix are good examples of OS's with a C interface.


The same problem we have now with Swift. Even after ABI stability, code compiled and linked across different versions often doesn't work.


Not sure why the downvotes--someone care to comment?

FWIW I've written frameworks in both Obj-C and C++ (macOS kernel)


Generally speaking HN is quite anti Obj-C (or even anything C, C++, basically the whole C family). They think of it as a relic of the past. The future has to be Rust or Swift.

I often wonder what would have happened if Apple had continued to improve Obj-C instead of Swift.

I really like Alan Kay's metaphor: instead of reinventing the wheel (which isn't so bad), more often than not in technology they reinvent the flat tire.


I think one of Apple's priorities is memory safety for their systems programming language. Probably too hard to get Obj-C there, and that's before you get to Swift's nice syntax and modern type system. Unfortunately Swift does introduce some type system handcuffs, which I don't like, but overall it seems good.


If I had to guess, I would say that the HN crowd is not fond of unverified claims that are not backed by any source other than the person's personal opinion.

> "Obj-C is a much better framework language than C++. How and why is that true?"

Anyone could throw these sort of claims about absolutely anything and doesn't elevate the debate at all.

> "I honestly think Objective-C is a big part of Apple's OS success."

How and why is that true? Same thing.


From Aaron Hillegass's Cocoa® Programming for Mac® OS X, Third Edition:

Once upon a time, there was a company called Taligent, which was created by IBM and Apple to develop a set of tools and libraries like Cocoa. About the time Taligent reached the peak of its mindshare, I met one of its engineers at a trade show. I asked him to create a simple application for me: A window would appear with a button, and when the button was clicked, the words “Hello, World!” would appear in a text field. The engineer created a project and started subclassing madly: subclassing the window and the button and the event handler. Then he started generating code: dozens of lines to get the button and the text field onto the window. After 45 minutes, I had to leave. The app still did not work. That day, I knew that the company was doomed. A couple of years later, Taligent quietly closed its doors forever.

It is asserted a bit later that C++ was the problem.

I get the feeling some of the Java Cocoa experience applies to C++. Although the C++ of the 90's is most definitely not the C++ of the 2020's.


I updated my comment


As someone who has wanted an ARM laptop for decades (and also thought an ARM tablet would be cool) I mostly see this as a return to what was always possible and that monopoly interests held us back from.

It's like a government during an oil boom. They can get away with a lot of shady stuff as everyone is feeling the wealth. It's only much later that you can soberly analyse how much was stolen from the people and figure out if you are in Iraq, Venezuela or Norway in terms of the benefits of Moore's Law and how that got spread around the economy that made it possible. If there are billionaires anywhere in the picture then something probably went wrong.


What's sad here is that a proprietary, "walled garden" style vendor like Apple is the one who decided to push TSMC/ARM like this. It's not like there's magic Apple smoke in there.

It didn't have to be this way. We could have had a good AMD ARM platform instead. But they, like Intel, had become complacent (and this despite their position as fighting for Intel's table scraps in the x86-64 world). The failure to take risks leaves us all worse off.

Intel needs to exist, if only to prevent a TSMC monopoly at this point, but what good is AMD if they can't even compete with Apple when outsourcing to TSMC?


I doubt AMD lacks the engineering talent to knock out a spectacular ARM chip, but we also have to consider that a few years ago they were running on fumes, near bankruptcy. Didn't they also have to sell their HQ and then lease it back?

With that pressure they couldn't sit around and take a risk on a long-term unknown that didn't guarantee returns. Maybe today they can pursue this path (I bet they are experimenting internally). I recall some quote from Lisa Su that they are open to other architectures long term.


Very good point.

I suppose it's natural to cheer on the technology leader - and Intel is and has been a serious company who has invested a lot to move fab technology forward.

The efforts to build a consumer brand started by Andy Grove have probably also helped quite a bit.

Nevertheless, it's striking how little focus there has been on Intel's market share and the lack of real competition on the desktop and server. Good that that's changing.


I may just be old, but I remember them being busted for literally paying Dell (and others) billions not to use their competitors. Which is a big flashing arrow pointing at monopoly profits.

https://www.extremetech.com/computing/184323-intel-stuck-wit...


To me it's getting a bit surreal with Apple Silicon performance. Sure, it's great.

I know, it's shocking that the company that has produced in-house ARM chips that have been top performers in the mobile space for the last 10+ years has made a bigger chip and, shockingly(!!!), it's good.

So, the biggest company in the world has created a semi-custom closed ARM design, running on their proprietary operating systems - iOS/iPadOS/macOS - that run only on their absolutely impossible-to-repair/upgrade machines, which are essentially bricks of aluminium with a display.

Me personally, I have an ASUS ROG STRIX G15 laptop with a Ryzen 9 5900HX and 2 x 32GB of DDR4 3200 memory - I can run Windows, Linux, macOS, BSD, etc. on it.

99% of the time it's connected to the power outlet and I can overclock it.

I don't get the Apple Silicon excitement, tbh.


>99% of the time it's connected to the power outlet

This.

Now go outside and try this laptop as a mobile device.

What's the point of comparing what is basically a workstation to a relatively thin mobile device?

>I don't get the Apple Silicon excitement, tbh.

You get a very power-efficient mobile device. If the mobile part is of no interest to you - then there is nothing to talk about really.


I already have a very power-efficient mobile device - it's called iPhone. :)


now try to replace your laptop with it because that’s what was being talked about


I can join teams/zoom meetings, read emails, reply to emails with my iPhone :)


Yeah and you can write code as well. That doesn't mean it's a laptop replacement.


One part of the excitement is about “look what’s possible!”, but frankly this is also worrying to me. Now that people know you can get this level of performance with that power efficiency, they will demand it from other vendors. That is going to pose an interesting challenge: remain with an open architecture with separate parts limited by their interconnects, or like apple go to a closed architecture with everything on an SoC. Most buyers will not care that the architecture is closed, they just want a light and fast laptop with good battery life. We may be witnessing the beginning of the end of the open pc platform and its wide array of vendors. If the mass market of laptops uses SoC’s the volume will not be there to support OEM parts vendors, and that market will thin out.


Also, I look forward to even more bloated websites and applications now that developers have faster computers.

It will be fast enough on their machines, but they will forget some people don’t like/can’t afford these machines.


I can see why you might be sad about the loss of a component ecosystem, but I think the integrated architecture is primarily a result of physics rather than commerce. This doesn’t mean it can’t be open, like RPI, and maybe we can hope that this will be Intel’s response in the medium term.

But the trend towards on-chip integration and SoCs has been decades long. CPUs also used to be made from discrete components. So I think, for better or worse, the transition to SoCs is inevitable.


AMD tried to do this a decade ago with their APUs. It flatly flopped because they couldn't get the buy-in from software makers. That buy-in problem still exists today.


Linux boots on the M1 and the Asahi Linux project will soon have a reasonable installer. NetBSD and OpenBSD also have working ports. Sure, all of the bells and whistles don't work now, but none of the bells and whistles worked in the beginning for any of the FOSS OSs until people had time to reverse engineer them. I've got an M1, and once NetBSD has reached a certain level of maturity on the M1, I will be replacing macOS with it.


I still have my healthy skepticism over that.

It would join the very short list of usable reverse engineered GPU drivers, and not for lack of attempts. Just look at nouveau for an idea of what to expect at most...


And for people who need a GPU, that's fine. The way I use a computer, I don't really need GPU rendering. CPU rendering to a frame buffer is good enough for me. When I started using Linux many moons ago, very many things were handicapped because driver support wasn't there, and plenty of us were happy enough to continue using it. I imagine there are people who feel the same way about support for the BSDs and Linux on M1s.


Say goodbye to any semblance of performance and/or good power management, though.


You cannot legally run macOS on that device.


What sort of battery life does your laptop get?


~7 hours if not using Visual Studio 2022. If using it for real work - maybe 2-3 hours max.


Dang that sounds more like a desktop that you get to briefly unplug than a competitor to any M1 laptop.


Not only that, but they are a node ahead of the competition thanks to TSMC and they place memory on package. This accounts for the performance gains.


"Intel fans reacted predictably but a more reasoned, authoritative evaluation came from an unexpected source."

Oh come on.


To be fair, "guy who used to be President of the Windows division at Microsoft" is about as authoritative a source as you're going to get on this sort of stuff.


It’s been fascinating to see how often Sinofsky comes across like an Apple fanboy (and I say that as an unashamed fan of Apple). He seems like he genuinely appreciates the way Apple conducts business.


“Intel fans reacted predictably by turning on at full speed and waking up the rest of the house”.


I recognize the author’s avatar and name! Amazing he hasn’t changed it in all these years.

Several years ago, Nilay Patel of the Verge reviewed an Apple product (think it was a phone) and lamented that it didn’t have style. That comment unleashed a wave of memes and ridicule towards him because he was known for wearing a spike bracelet with every outfit.

Well, I found the hoopla hilarious and jumped into a back and forth Patel was having with the author of this article. I was promptly blocked and still am to this day. I forget about it, but am quickly reminded when I follow a link via Slack.

Totally off-topic ramble, carry on.



> Patel [...] leaving home every day looking like GWAR's webmaster

Amazing.


The guy has a point. He made a comment about a personal style preference, not a comment on the stylishness overall. And he wears something for sentimental reasons. The blog post was a cheap shot.


Wearing it for sentimental reasons doesn’t make it not tacky.


Sure, but he seems to know that it's tacky and not stylish.


"An authoritative, well-worth-the-read AnandTech article provides a graphic that helpfully compares the three devices:"

That image came straight from apple's keynote.


Fair enough, but the AnandTech article _was_ good and well worth reading. The article suggests Apple's graphs weren't lying.


The point that seems to have got less attention than it deserves is the balance of resources spent on CPU vs GPU - ie M1 Max is a big GPU with a CPU attached.

Yes these machines are used by lots of video editors etc but it does seem to be symbolic of a general shift towards more compute being done on the GPU - now even on laptops - and not just for video.

And having more GPU compute available at your fingertips will only reinforce that trend.


Has to be for their mixed reality product rumored to launch next year, which they'll want native apps developed for. There are only so many ProRes video editors out there.


It's just reversion to the mean. M1 had a beefy CPU and lightweight GPU. M1 Pro/Max is a beefy CPU and a beefy GPU.


Apple spent 100% of the extra ~23 billion transistor budget going from the Pro to the Max on GPU resources - that's probably indicative that GPU compute is being prioritised.


Yep, just as they did with the 2013 Mac Pro. We all remember how well that worked out.

There's a substantial subset of high-performance computing tasks that are limited by GPU performance. But that's far from the whole equation.


> The first thing to say about the M1 Pro and M1 Max, Apple’s new homegrown processors, is that we misunderestimated them.

trying to work out if this malapropism was used ironically...


Of course.


I wonder if apple will start using their own silicon for their own cloud servers.


I hope they will, and I hope their cloud servers run Linux.


I do think we have to temper the level of surprise somewhat.

The key thing in consumer silicon chip design is always individual core design (or designs, there's Firestorm and Icestorm in big little). That's what matters for single threaded workloads.

Fully parallelized workloads in a consumer laptop are just exceedingly rare. Plus it's exceedingly pointless to do lengthy multi-threaded development on a laptop, it's going to overheat and under-clock. SSH into a cloud instance and develop in an identical environment to where you deploy.

Apple vastly exceeded expectations by making enormous silicon dice, and tying in vastly more GPU, bandwidth, and parallelism than expected. To the joy of video editors everywhere (which cannot be done over ssh).

But for people who watch closely, the M1 Max delivered exactly what was expected, plus a gargantuan GPU and memory bandwidth.

The M1 Max is still ultimately the A14's Firestorm/Icestorm cores, in a MBP form factor, delivering amazing idle and low power consumption performance, and single threaded performance that matches Intel/AMD's desktop chips at far lower power because of the node shrink.

Given that the A15 still features the Firestorm/Icestorm core design, it's almost guaranteed that we'll have to wait till 2023 to see a more than marginal uplift in single threaded workloads and low power performance.


> Fully parallelized workloads in a consumer laptop are just exceedingly rare.

Eh? Probably THE most common workload for a laptop (opening a webpage) is multi-threaded. And even consumers multi-task. As soon as you have multiple processes, you can use multiple cores even without multi-threading support.

> Plus it's exceedingly pointless to do lengthy multi-threaded development on a laptop, it's going to overheat and under-clock.

The promise of these laptops is that they won't. As you say, their power consumption is a lot lower than competing chips, and it's that power that produces the heat.


"Fully parallelized workloads in a consumer laptop are just exceedingly rare."

Really? I've observed these kind of consumer workloads getting more and more common in recent years. In any case I wouldn't call them exceedingly rare.


I want to emphasize the fully part of fully parallelized with this.

It's very common to multi-thread something across three threads for example.

But something that is perfectly parallelized across all your threads, or at least 8 or more, that's actually something I almost never see in end user devices.

And yeah, ultimately having 4 "big" cores that are as fast as possible matters for those use cases that are threaded across a limited number of cores.


ffmpeg to reduce/transcode off-air digital TV would be my expected regular use case. If this design can do that faster, I and a lot of other people will be delighted.


That's a massively niche use case, and I say that as someone sat in a TV OB truck editing an ffmpeg video filter at this exact moment.


If you use Plex, you transcode all the time. If your smart TV doesn't grok your native encoding, Plex will do this on the fly. Or for handhelds. I don't think Plex is alone in doing this.

Maybe I misunderstand how many people do streaming TV with downconvert.

But that said, very chuffed OB does ffmpeg. You rock!


"getting more and more common" and "exceedingly rare" does NOT actually contradict each other.


If they're exceedingly rare now, despite having become more and more common, what were they before? Super-exceedingly rare?


>Given that the A15 still features the Firestorm/Icestorm core design...

The A15 uses Avalanche performance cores and Blizzard efficiency cores. I'm not sure how different they are, except that Blizzard in particular is a lot faster than Icestorm.


Ah that makes sense.

The high performance cores are very similar iirc, accounting for binning and density. But if Apple has upgraded just the efficiency cores this time around that makes sense.


> SUSTAINED, LONG RUNNING HEAVY Fully parallelized workloads in a consumer laptop are NOT THAT COMMON, YET. Most of time, those workloads come in bursts.

There, I fixed it for you.


How come the new MacBook Pros are available for order all around the world, but in SG the website says ‘Check back later for availability’


Is this the beginning of the end of modular PCs? Is the future SoC?


Maybe modular SOC? Have motherboards with a standardized pinout but the CPU has RAM/GPU soldered on. You could choose a CPU with productivity graphics performance, HTPC performance or high end gaming performance with associated different RAM sizes. Time to short Corsair/G.SKILL/etc.


He mentions "containers". Anyone know what he meant?


The machines that contain the M1Pro and M1Max, aka the new MacBook Pros.


Yes M1Max is nice, or great, but in a mostly useless place.


This is similar to how I feel. The internal chips are irrelevant if nobody else can buy and use them. The product is the laptop.

People who are used to Apple laptops and like using them, will love this new laptop. People who did not look favourably upon Macbooks, still have no reason to, except maybe battery life. But even there, AMD laptops are reaching 9-10 hours easily, which may be more than good enough for many users.


Why?


I thought it was sort of common knowledge that Apple was going to update the MacBook Pro with a Msomething powerhouse following the other updates, but maybe I’m just completely out of the loop on hardware rumours?

Anyway, I don’t think it’s surprising that the company which has one of the best global production and logistics operations in the world is capable of building good processors. It’s just another part of the puzzle after all.

It’s just a shame that their design department still doesn’t seem to have learned it’s lesson from Jobs in that it once again has failed to build a product that is designed in a way that people want. Why they ever deviated from the MagSafe, why they added the weird screen Function bar and now why they’ve added a bezel to a ducking laptop is really beyond me? Like, do they not use computers? But improving upon the supply chain by creating new powerful hardware? Absolutely within what I would expect from modern Apple.


Apple's risky behavior has brought both failures and huge successes.

They were the first to remove serial ports, floppy disks, internal modems, optical drives... And it turned out good even if you could say that's not what the people wanted.

The Touch Bar, MagSafe removal, etc. were failures alright. I bet the notch won't be a failure, however.


> And it turned out good even if you could say that's not what the people wanted.

The difference, and what I probably failed to explain well enough, is that these removals might not have been what people thought they wanted, but were what Apple knew they did.

With the touchbar, new keyboard, removal of the MagSafe, removal of the charger from iPhone boxes Apple has made things people didn’t actually want.

The notch on the phone may have seen wide adoption, but in my anecdotal experience nobody likes it.


Looking at a notch, it is easy to imagine it not being there. So it's easy for someone ignoring why it's there to complain.

But do you really think people would rather forgo the camera and other features to remove the notch?

Alternately, do you really think people would prefer a notch-high bezel at the top of their phones instead of more screen with a notch?


Check this out:

https://arstechnica.com/gadgets/2021/10/samsung-tablet-leak-...

The notch is already being copied. Perhaps the wrong way.


The 'bezel' (I'm assuming you're talking about the notch) does not impede the screen real estate. The section where the menu bar is rendered and where the notch intrudes is actually extra screen area on top of the 16:10 screen.

People are complaining about getting extra screen real estate.


Well yes, but have you ever heard anyone who liked the notch on their phone?

I have one on mine and while I don’t particularly dislike it, I’d certainly rather not have it.

Which is sort of my point in Apple losing touch. They stopped changing things in ways that I ended up liking, and I use myself as a reference because I tend to be rather ordinary in my consumption.

Maybe my anecdotal experience is completely wrong, but I have never heard anyone who was happy about the notch on their phone.


I honestly couldn't care any less about the notch and frankly I find it amazing that people care about it at all


People are complaining because screens have been fucking rectangular since forever! And this change forces klunky software compromises for no good reason.

If they made a triangle of extra screen area jut out on the right, you'd get extra screen space too. I bet people would still defend it not because "it's extra screen" but in reality because it came from Apple.

The fact that people are so willfully ignorant of that is amazing to me. The Apple-think bubble is stronger than I thought.


It makes the screen bigger... that's the reason. The menu bar no longer takes up ANY screen real estate.

They could have just not extended the screen and had a larger bezel. I don't see the issue.



