Apple's follow-up to M1 chip goes into mass production for Mac (nikkei.com)
519 points by lhoff 8 months ago | 645 comments



This would be quite an accelerated timeline if Apple ships its second-generation M-series chip only eight months after the first. Typically, they’ve followed a sort of six-month tick-tock pattern for the A-series, launching a new major revision in the fall with the new iPhone, and launching an “X” revision in the spring with new iPads.

I think most observers have been expecting an “M1X” for Apple's first pro-oriented ARM Macs, so an M2 already would be a surprise.


> "This would be quite an accelerated timeline if Apple ships its second-generation M-series chip only eight months after the first."

The M1 was available on Nov. 17th, 2020. The article states that the chip is entering mass production, and due for release sometime in 2H. This could easily be released a year after the M1, if not 13 months later.


We're in a brave new world. The early prognosticators thought the M1 was more a proof of concept (shove it into existing designs to get it out there).

But now we know that it was always intended to be a real player (it's in the iMac and iPad Pro).

So this news is interesting to me because it seems to cut the other way: maybe the M1 was designed for a short shelf life.

In a world where Apple is controlling so much of its own stack, "normal" design and build timelines will be upended.


> The early prognosticators thought the M1 was more a proof of concept (shove it into existing designs to get it out there).

Which is a weird take when you consider the thermal issues that Intel Macs were plagued with. It's almost like the chassis was designed with 10W of dissipation in mind, which Intel couldn't operate within but the M1 easily can.

I had assumed that Apple designed for the M1 and then fit Intel chips into those designs.


My private conspiracy theory (supported by nothing) is that Intel promised Apple good 10W processors back in 2014-ish, and Apple designed the 2016 MBP based on that promise. And when Intel didn't deliver, they shipped the Macs anyway, and either started working on the M1 or cleared any doubt about whether they should continue working on it.


that’s not just your (conspiracy) theory, it’s exactly what happened (something i’ve also noted before). intel screwed apple years ago and apple decided to move on. it just took many years of chip development to get to this point.


Honestly wouldn't be surprised if it came out that they started working (at least theoretically) on Apple Silicon when they transitioned to Intel in the first place and it just took this many years for all the pieces to be ready and lined up.


Not only plausible, I'd say this is the most likely way it played out.

At the time of the Intel transition, Apple had already gone through the process once before with 68k to PPC. It had to be clear to the long-game thinkers at Apple that this cycle would keep repeating itself until Apple found a way to bring that critical part of its platform under its own control. Intel was riding high in 2006, but so had IBM in 1994.

Within two years of the Intel transition, Apple acquired P.A. Semi. The iPhone had barely been out for a year at that point, and still represented a fraction of the company's Mac revenue – and while it looked to us outsiders like the acquisition was all about the iPhone and iPad, in retrospect, a long-term replacement for Intel was almost certainly the endgame all along.


possible, but as outsiders, it's hard to be sure of that sequence of events with those sets of facts, to draw that conclusion definitively. perhaps that was a backup plan that quickly became the primary plan.

but with the 2016 line of macs, it was obvious that apple was expecting faster, smaller, cooler, more power efficient 10nm chips from intel, and intel fell flat on their face delivering. it's not clear how far before that that apple knew intel was flubbing, but 2014 seems a reasonable assumption given product development timelines. as intel's downward trajectory became clearer over the following months, along with the robustly upward trajectory of apple silicon, the transition became realizable, and eventually inevitable.

as an aside, i'm using a beat up 2015 macbook pro and eagerly awaiting the m2 version as its replacement, seeking to skip this whole intel misstep entirely.


I think Apple would have been perfectly happy buying CPUs from Intel as long as Intel kept their end of the bargain up.

After the PowerPC fiasco and IBM leaving Apple high and dry, I have zero doubt that there was always a contingency plan under way before the ink even dried on the PA Semi acquisition, but it probably wasn't a concrete strategy until about the third time in a row Intel left Apple high and dry on a bed of empty promises.

Apple has so much experience with processor transitions they don't have to stay on ARM either. And they have the capital to move somewhere else if it makes enough sense to them. I find it highly unlikely - but if it made sense it would be highly probable :)


I actually wonder if the end-game isn't going to be an Intel/M1 hybrid on the high end ... a system with multiple chips and architectures.


That sounds terribly complicated.


Fascinating. It's amazing the 3D chess these companies have to play effectively.


> "I think most observers have been expecting an “M1X” for Apple's first pro-oriented ARM Macs"

I'm pretty sure that's what this is, rather than a next-generation ("M2") chip. It will likely have the same cpu core designs as the M1, just more of them. And possibly paired with the new, long rumored "desktop class" Apple GPU.


Given the timing, I doubt it. Apple has established a fairly stable cadence of core improvements every 12 months. Enough time has elapsed between the M1 and this new chip that I'd expect it to have more in common with the "Apple A15" generation SOC than the A14/M1.

As for Apple's choice of marketing name, that's entirely arbitrary. (For what it's worth, my guess is they're ditching "X" suffixes and will designate higher-spec variants with other prefix letters, e.g. the "P1" chip.)


Considering the M1 doesn't support LPDDR5, and the A15 will be in a similar time frame, I would not be surprised if it's an M2 (based on the A15), or more likely an M2X.


I'm not sure about that, because the M1 cores are, in essence, quite old at this point; they're pretty similar to the A12 cores. Apple usually does quite a major microarchitecture refresh every few years; it's probably coming up to time.


> I think most observers have been expecting an “M1X”

An M2 name implies some architectural differences like extra cores or more external bandwidth. I'd be totally happy with an M1x with some tweaks like more external connectivity and more memory.

Which, for me, would be quite perfect. The only reason I'm holding back this purchase is the 16 GB memory limit.


Same here. I want a replacement for my 27in iMac and would have held my nose at the slightly smaller screen, but really want more memory than 16GiB (docker, etc...).

So Apple will just have to wait to get my money until fall (or whenever they announce the successor to the 27in iMac).


I'm excited for the 27" (or maybe it will be 29") variant.


I'm not sure you'll see a true second generation chip, I would be expecting it to be mostly the same thing, but with more cores and some solution to providing more RAM.

Having said that, Apple does have something of a history of pushing out v1 of a product that sets a high bar for everyone else to try and catch up with, then immediately pushing out a v2 that raises the bar well above where everyone else was aiming.

Overall though, it's awesome that the Macs now get to benefit from the vast investment that goes each year into making faster/better CPUs for hundreds of millions of new iPhones.


Apple forecast a 2-year transition a year ago at WWDC. That means they need a processor for their base laptops/consumer desktops, one for their high-end laptops and many of their pro desktops, and arguably one to replace the Xeon in the iMac Pro and Mac Pro.

Unless they are going to use this same CPU for the Mac Pro, this is right on schedule.


I think getting out the M2 before the fabled "M1X" actually makes sense. This could explain the decision to put the M1 into the new iPad Pros, to re-use the M1 chips elsewhere once the new M2 becomes available.

The main reason being that the M1 was more of a proof of concept and rushed out (despite being as good as it turned out to be). The M2 will be a more refined M1 with notable improvements such as LPDDR5 support - akin to AMD's Zen 1 and Zen+ releases.

On the other hand, there could be an M1X being readied for release at the upcoming June WWDC. It may be architecturally older than the M2 but still deliver superior performance through a big-core differential: the M1 only has 4 big cores and 4 small cores, so the M1X just needs more big cores to be notably more performant.

All highly speculative of course, will have to find out in about a month.


When they switched to Intel they released the first Macbook Pros in January 2006 (32-bit Core) and in October 2006 shipped 64-bit Core 2 Duos.


Imagine they put Macbooks on a yearly upgrade cycle like the iPhone - OMG, that would be impressive.


I don't know if you're being serious but, given the lack of improvements in chip design lately, that would indeed be impressive.

I don't mind upgrading every other year. I just want the upgrades to be meaningful.


It’s not that you, as a user, need to upgrade, but Apple could upgrade the SOC in their machines each year. It’s like the phone, small incremental updates each year. If you wait a few years to buy a new one, the change feels dramatic.


Some people (a lot?) are now waiting for the new Pro devices and won't buy new until then


I'd buy a MacBook Air in a heartbeat if I could get 32GB of RAM in it. RAM is the only thing causing me to turn up my nose at the M2.

If they had released the new iMac with a bigger panel, so there were options for the 27" as well as the former 21", then my mother would be rocking a new iMac next month.

I know they said up to two years for the transition but I want to transition now :)


Not really, because M1 was probably meant as a stop-gap, and it's mostly a rehash of A12.

M2 is probably based on the Arm v9 ISA and has been in design for years.


The M1 is no stop gap. When you have people criticizing it because it only bests 90% of the current PC market but not all of it...

Well, if that is indeed a stop gap then I can't wait to see their first "real" chip :)


All very impressive, but here's my question: what are they going to do about graphics cards? Will they find a way to connect existing graphics cards to their CPU? Will they make their own ARM-based graphics cards? Will AMD or Nvidia?


Nvidia? Ha, never in a million years.

Support for one of the recent Radeons was recently added to macOS, so it's a possibility. No reason the M1 can't do PCIe, as far as I know the only thing keeping eGPUs from working on the M1 right now is software support. It could also be that the driver was added because of the extensibility of the Pro, though.

My expectation is that they'll keep the GPU on the same level, which is "good enough for most users", and focus on hardware acceleration for tasks like video and audio encoding and decoding instead. With an ML chip and fast audiovisual processing, most consumers don't need a beefy GPU at all, as long as you stick to Apple's proprietary standards. Seems like a win-win for Apple if they don't add in an external GPU.


Yeah I imagine the Radeon support was for the Pro and the existing Intel Macs (though I don’t know if those Radeon GPUs are really supported via eGPU. Are there enclosures where they fit?)

Still I can’t see Apple only developing one integrated GPU per year unless they somehow figure out how to magically make them somewhat approach Nvidia and AMD's modern chips. What would the ARM Mac Pro use?

It seems that Apple has put a lot of development resources into getting Octane (and maybe Redshift and other GPU-accelerated 3D renderers) to support Metal (to the point where it sounds like there may have been Apple Metal engineers basically working at Otoy to help develop Octane for Metal), and I can't imagine that happening just to support the Apple Silicon GPUs. I wouldn't be surprised if we see eGPU support announced for ARM Macs at WWDC (and maybe even for the iPad Pros that support Thunderbolt. Yeah, the idea of plugging your iPad into an eGPU enclosure is funny, but if it's not too hard to implement, why not?)


>It seems that Apple has put a lot of development resources into getting Octane to support Metal...and I can't imagine that happening just to support the Apple Silicon GPUs.

At the start there will still be a lot more Mac Pros running AMD hardware that must be supported.

It may not be obvious, but Apple has repair work to do in the pro community. Four years ago this month, Apple unusually disclosed that it was "completely rethinking the Mac Pro." [1]

This new Mac Pro design wasn't announced until June of 2019 and didn't hit the market until December 10th of 2019. That's just _six months_ prior to the Apple Silicon announcement.

So, unless Apple simultaneously was trying to honor pro users while also laying plans to abandon them, it is hard to imagine that Apple spent 2017-2019 designing a Mac Pro that they would not carry forward with Apple Silicon hardware. Keep in mind, the company had just gotten through a major failure with the Gen 2 cylindrical Mac Pro design.

The current, Gen 3 2019 Mac Pro design has the Mac Pro Expansion Module (MPX). This is intended to be a plug-and-play system for graphics and storage upgrades. [2]

While the Apple Silicon SoC can handle some GPU tasks, it does not seem to make sense for the type of work that big discrete cards have generally been deployed for.

There is already a living example of a custom Apple-designed accelerator card: Apple designed and released Afterburner, targeted at video editing with the gen 3 Mac Pro in 2019.

Afterburner has attributes of the new Apple Silicon design in that it is proprietary to Apple and fanless. [3]

It seems implausible Apple created the Afterburner product for a single release without plans to continue to upgrade and extend the product concept using Apple Silicon.

So, I think the question isn't if discrete Apple Silicon GPUs will be supported, but how many types and in what configurations.

I think the Mac Mini will keep its shape and size, and that alongside internal discrete GPUs for the Pro, Apple may release something akin to the Blackmagic eGPU products they collaborated on for the RX 580 and Vega 56.

While possibly not big sellers, Apple Silicon eGPUs would serve generations of new AS notebooks and minis. This creates a whole additional use case. The biggest problem I see with this being a cohesive ecosystem is the lack of a mid-market Apple display. [4]

[1] https://daringfireball.net/2017/04/the_mac_pro_lives

[2] https://www.apple.com/newsroom/2019/06/apple-unveils-powerfu...

[3] https://www.youtube.com/watch?v=33ywFqY5o1E

[4] https://forums.macrumors.com/threads/wishful-thinking-wwdc-d...


Nit: Afterburner is built on FPGAs, which are architecturally different from the M-series chips and GPUs.


I'm not sure what you mean by nit.

Apple designed and released custom hardware that used a new slot to accelerate compute. My point is that Afterburner as a product shows a clear direction for Apple: putting Apple Silicon into discrete graphics or other accelerator compute in the Mac Pro.


> So, unless Apple simultaneously was trying to honor pro users while also laying plans to abandon them...

You say that like Apple doesn’t do stuff like that all the time.


Stuff like what? Can you give examples where you’ve known the company’s plans and intent?


> Still I can’t see Apple only developing one integrated GPU per year unless they somehow figure out how to magically make them somewhat approach Nvidia and AMD's modern chips. What would the ARM Mac Pro use?

What do mac users need a beefy gpu for?

AFAICT apple just need a GPU that's good enough for most users not to complain, integrated Intel-GPU style.


What I said before: 3D rendering (and video processing and anything else you might want a powerful GPU for).


Do people do 3D rendering on Macs? (There are no GPUs available with hardware raytracing support...)

Most people I know doing 3D rendering nowadays just connect their UI to a remote rendering farm. So a Macbook Air would be more than enough for that.

For video processing you don't need 3D rendering capabilities, just hardware acceleration for the video formats you are using.


And are you thinking the solution for people who do need a powerful GPU is eGPUs and Mac Pros?


I don't think Apple cares much for those people, they can buy the Mac Pro or a PC if they really need the GPU power.

eGPUs can be a nice addition, but I doubt Apple will release an official eGPU system. You're already limited to AMD GPUs after the clusterfuck of a fight Apple and Nvidia had, and I doubt Intel's Xe lineup will receive much love for Apple right after the Intel CPUs have been cut from Apple's products.

Honestly, for the kind of work that does need an arbitrary amount of GPU horsepower, you're barking up the wrong tree if you buy Apple. Get yourself a MacBook and a console or game streaming service if you want to play video games, and get yourself a workstation if you want to do CAD work.

I don't think the work Apple would need to put into a GPU solution would be worth it, financially speaking.


> I doubt Apple will release an official eGPU system

They already have one[1], and you can even buy eGPUs from the Apple Store[2].

1. https://support.apple.com/en-us/HT208544

2. https://www.apple.com/sg/shop/product/HM8Y2B/A/blackmagic-eg...


That's a Radeon Pro 580, AFAIK this eGPU offering hasn't been updated in several years.


That specific one is, yes, but you can also buy third party Thunderbolt 3 sleds (eg Razer makes one) and use more recent cards.


How would you fit Apple's AR/VR ambitions into this perspective? (I.e., given AR/VR has steeper GPU requirements, both on the consumption and creation side.)


Well, unless Apple can pull an M1 and do with their GPUs what they did with their CPUs, and start to embarrass Nvidia and AMD with lower-power, higher-performance GPUs.


Kinda feels like Apple's choice at the moment is just their own integrated GPUs. eGPU is also a possibility.


Will probably not be great for battery life


To make a Mac Pro-scale system with real gains, they would roughly need the equivalent of 9x the number of performance cores of an M1 (~36 to 48 cores). If they were to scale the GPU in the same way, you are looking at a 72-core GPU with over 23 TFLOPS (FP32), and I imagine they could also find room in clock speeds and 5nm+ to get an additional 10% out of it. In general that would be enough for many, but I wouldn't be too surprised to see them do something more exotic with their own GPU.
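
As a rough sanity check on those numbers, here is a back-of-the-envelope sketch in Python, assuming the commonly cited ~2.6 TFLOPS FP32 figure for the M1's 8-core GPU (an estimate, not an official Apple spec):

    # Back-of-the-envelope scaling of the numbers above.
    # Assumptions (estimates, not Apple specs): M1 has 4 performance cores
    # and an 8-core GPU at roughly 2.6 TFLOPS FP32.
    M1_PERF_CORES = 4
    M1_GPU_CORES = 8
    M1_GPU_TFLOPS_FP32 = 2.6

    SCALE = 9  # the "9x" factor suggested above

    perf_cores = M1_PERF_CORES * SCALE        # 36 performance cores
    gpu_cores = M1_GPU_CORES * SCALE          # 72 GPU cores
    gpu_tflops = M1_GPU_TFLOPS_FP32 * SCALE   # ~23.4 TFLOPS FP32

    print(f"{perf_cores} P-cores, {gpu_cores}-core GPU, ~{gpu_tflops:.1f} TFLOPS FP32")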


They are R&D’ing their own GPUs to vertically integrate according to some rumors from my Apple friends.


Why would Apple build fancy graphics cards? They have no meaningful gaming market and haven't cared about it for years. For machine learning?


They don't need to build them, but they need their machines to be able to use them (for the same reasons their current pro machines use them).


They already have. Their integrated graphics now rival that of discrete gaming laptops.


Rival how?

A Surface Book 3 with an Intel processor and an outdated Nvidia 1650 Ti runs laps around the M1 in games. Almost 2x the performance. I'm not even going to compare it to laptops with modern GPUs.

https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...


Option 1 - yes, they will

Option 2 - no, but does it matter? It's not like the previous gen Macs had great GPUs and no one is gaming on a Mac anyway.

Option 2.5 - bring back eGPU


> no, but does it matter? It's not like the previous gen Macs had great GPUs and no one is gaming on a Mac anyway.

True, but previous Macs were never really competitive with PC alternatives on the hardware side, since they all used the same chips, just with a higher price tag. With the M1, that's starting to change, and Apple has the opportunity to attract a much larger customer base for the Mac than it ever has.

And of course, they're much more interested in gaming nowadays thanks to iOS. Maybe not interested enough to suck up their pride and apologize to Nvidia for giving them the finger, but probably enough to at least stick a beefier GPU into macs.


Even putting aside the performance issue, Apple and gaming have never worked together quite well.

Apple's modus operandi of quickly and frequently deprecating old architectures and requiring app developers to constantly keep up goes down very badly with the traditional video game development model - of spending time and budget finishing up one game, releasing a few patches, then moving on to the next game with little further upkeep or maintenance of the now-done game. (Yes, games as a service is more common nowadays, but a huge number of games still go by this old model.) This model relies on long-term compatibility of old binaries on the platform being pretty stable, which is fairly true for consoles and Windows, but Apple platforms are anything but.

There are massive piles upon piles of only slightly old games that are not just unsupported but simply refuse to run on both the iOS App Store and Steam for Mac (including Valve's own back catalog!), due to the abandonment of 32-bit binary support a few versions back. And even if the developer is willing to do bare minimum upkeep work to recompile an old game and make it run on current hardware, chances are that between the time of release and now, lots of new mandatory hoops (eg. natively support a certain screen size) have been added to the app store checklist so that passing store certification requires tons more work than a simple recompile, further deterring the dev.

Perhaps you could chalk it up to the dev being lazy for not doing regular maintenance of the game, but the rest of the game industry doesn't force you to do that, while only Apple does.


You also need GPUs for rendering video, which people do use Macs for.


I hate to say it, but I am likely going to buy an M2 Mac. I don't like Apple and their anti-competitive tactics, but I admit they've won their spot for now. However, as soon as a good PC competitor comes along, I'll drop Apple like a hot potato.


Apple's second version of everything is always the one to get. The first version is always exciting, but usually comes at some large expense.

- iPhone 2 had 3G and was so much better than v1
- iPad 2 was about 3x slimmer/lighter than v1

Lots of other examples if I thought about it.


Obviously, we won't know if the M1X/M2 has a similar big advantage over the M1 until it ships, but...

You can also look at it this way: The M1 is not the first version. It is (IIRC) an enhanced version of the A14 SoC family, and a clear evolution of the work Apple has been doing on its silicon for the past decade.


What, in your opinion, is the drawback of the M1 product series?

I haven’t read many negative reviews.


The most commonly cited limitation I've heard is a max of 16GB RAM on the current M1 Macbooks. The limitation of a single external monitor is probably the second most common.

A lot of users would like to run multiple external monitors and have > 16GB of RAM. I know I'm in that group.


The RAM is limited to 16GB, and IIRC the max I/O throughput is also somewhat limited, so you have to compromise on the number of USB4/10Gb LAN ports, etc.


* Limited IO
* Max 16GB memory (at unjustified cost)
* Limited multi-monitor support
* No eGPU support (as of now)

* Only about 50 to 55% of all software is M1-ready (looking at some tracking sites). Technically this is not an M1 flaw, but you need something to push the market and iron out the bugs. By the time the M2 gets introduced, you may be at 60 or 70%, so you're less likely to run into issues; the M1 users are the real beta testers. Even Parallels only recently got good M1 support (with massive speed increases).

As of this moment, buying an M1 laptop is less of a beta-tester experience than it was several months ago. If you ended up buying in Nov/Dec, you spent a lot of time under Rosetta 2 or dealing with issues.

> I haven’t read many negative reviews.

This was kind of always glossed over by a lot of the reviews, as the YouTubers mostly looked at their own use for video editing etc., and there, most software was on point very early in the release cycle.

You saw a lot of reviews in Jan/Feb from people going back to Windows laptops after a month, not because the CPU was bad but because they ran into software issues.

In the meantime the software situation has evolved a lot more, but the progress of software being made M1-ready has also slowed down a lot.

As a PC user I want an M1-like laptop that has long battery life, is silent, and is still powerful (unlike a lot of Windows laptops, where it's always the old saying: pick 2, you can never have all 3).

But I'd prefer one with 8 performance cores, double the iGPU cores (preferably with DDR5) for light gaming, and 16GB standard. So some MacBook Pro 16 or whatever, if the price is not insane. We shall see what Apple introduces...

So far the new offerings from AMD and Intel are fast but still power-hungry and heat-generating (aka fan noise!). AMD is only going big.LITTLE at 3nm.

Intel's Alder Lake may be an M1 competitor (for battery life under light loads), but it's again a first-generation product, so expect to be a beta tester for a long time until Windows gets fine-tuned to properly use the little cores! For heavy loads... well, 10nm is 10nm, no matter how many +++ you add.


Memory and ports. That might be more of a product-level issue, but it's a blocker for me.


In the case of CPU architecture switches, Apple's gone with the same strategy every time so far: switch out the guts, keep the outside design. So maybe not negative reviews regarding the CPU, just a bit boring design.

I disagree with OP though; it's not the second but the third generation that has been the one to get for me: iPod 3rd gen, iPhone 4, Apple Watch Series 3. OK, iPad 3 was a bit of a failure, but still, first retina.


The new iMac diverges dramatically from this model. It's a complete new design, one that could only have been done after a switch off Intel.


It would be fine for 98% of the things I do, but I still need to do that 2%. With the x86 CPUs I always had virtualization with USB passthrough as a final workaround, but with the M1 there are things I absolutely can't do.


Don’t get me wrong, it looks great. But so did iPad 1 and iPhone 1, until v2 came out.

The main things I would hope for though are more RAM, and probably a much beefier GPU


It’s got a better GPU than anything which uses less than 35W, probably even higher than that.


Yeah, it's great. I've recommended 2 people get these laptops, and I probably would have gotten one too if there was a 15-inch one.

I'm just hoping there are some more surprises in store with that slightly bigger battery.


Dedicated graphics cards don't work. No Linux support.


> ”No Linux support”

It’s coming:

https://arstechnica.com/gadgets/2021/04/apple-m1-hardware-su...

(Virtualized Linux is already well-supported on M1 Macs)


Multi-monitor support: you can have two monitors max.


The original Intel MacBook comes to mind as well

- released with the (32-bit) Core Duo processor in May 2006
- replaced with the (64-bit) Core 2 Duo model in November 2006

It was only supported through 10.6 Snow Leopard, as 10.7 Lion went 64-bit only.


I got shafted with a similar 6 month duff product - the iPad 3. Now that was a bad decision. The iPad 4 came out later that year, with about 4X better performance and a switch to the lightning connector too.

I was so burned by that experience that I only bought an iPad again a month ago.


Yes! Good example. I was thinking about that too but couldn’t remember my history enough to explain.

AirPods VS AirPods Pro is another I just remembered

I think watch v2 was a big improvement too.


My parents owned (still have) an OG Intel Mac Mini - the box said "Core Solo". Seems like that was one of the few devices sold with that chip.


I still have an iPad 2. I use it in the kitchen to browse recipes.


I still have another one near bed for reading eBooks


I have a first-gen iPad Mini that still sees light use. Approximately the same internals as the iPad 2, AFAIK.


I'm really hoping Alder Lake or a similar AMD product gets PC closer to M1+ performance and battery consumption.

The M1 chip is amazing but I'm a tiling window manager man.


This will obviously not be comparable to a tiling window manager, but I've been pretty happy with Rectangle [1] on my mac. The keyboard mappings are pretty easy to configure and i've found it to work well even in multi monitor setups.

[1] https://github.com/rxhanson/Rectangle


+1 for Rectangle; I've been using it ever since it was Spectacle. There's nothing I really miss about a proper tiling window manager (though I'm sure hardcore users would disagree)


> There's nothing I really miss about a proper tiling window manager (though I'm sure hardcore users would disagree)

Agreed. I mostly just needed the keyboard-driven window snapping I was used to on GNOME, and Rectangle has filled that need 100%.


This is Mac OS:

- https://www.reddit.com/r/unixporn/comments/jupmda/aquayabai_...

- https://www.reddit.com/r/unixporn/comments/mvuplf/yabaimacos...

It's called yabai (+ skhd): https://github.com/koekeishiya/yabai

That is, you can have a tiling WM today with all the advantages of running macOS.


From your third link:

   System Integrity Protection needs to be (partially) disabled
   for yabai to inject a scripting addition into Dock.app for
   controlling windows with functions that require elevated
   privileges. This enables control of the window server, which
   is the sole owner of all window connections, and enables
   additional features of yabai.
The risk Apple kills yabai after you're adjusted to it is real.


> The risk Apple kills yabai after you're adjusted to it is real.

This holds for anything in the Apple ecosystem, up to Fortnite.

Yabai has been around for a long time, and every issue could be worked around relatively painlessly.


> This holds for anything in the Apple ecosystem.

Holds for anything in almost any ecosystem. With that said, Apple has stated over and over the Mac will remain developer friendly. Being able to disable SIP, IMO is part of that.


I use yabai without disabling sip. You get most of the features. It's the first tiling WM I've used, so it's possible I'm missing something critical without disabling sip, but so far I'm quite happy with it despite whatever features are missing. ymmv, of course.


FWIW, I run yabai without having disabled SIP and it works great. There is probably some subset of functionality I am missing out on, but it does what I need it to.


It's not exactly a tiling window manager, but if you can program some simple Lua then Hammerspoon is a godsend. You can program anything any of the other window managers for Mac (like Rectangle, Spectacle, etc.) can do and have complete freedom to set up your own keyboard shortcuts for anything.

I have some predefined layouts[1] for my most common usage. So, one keyboard shortcut arranges the screen how I want, and I have other keyboard shortcuts[2] (along with using Karabiner Elements for a 'hyper' key) to open or switch to common apps.

[1] https://github.com/kbd/setup/blob/1a05e5df545db0133cf7b6f1bc...

[2] https://github.com/kbd/setup/blob/1a05e5df545db0133cf7b6f1bc...


Alder Lake will get closer because of the big.LITTLE structure, but I don't know if we will really see a contender from Intel until Meteor Lake. Lithography size gets too much attention for its nomenclature, but it actually matters for battery consumption and thermal management. Intel must execute flawlessly on 7nm and spend generously on capex to keep the ball rolling.


It seems like Apple is TSMC's priority customer, so I suspect they will always be a node ahead of AMD.

Doesn't matter if Apple does mobile and low power stuff exclusively, but if they can scale this design into a higher TDP/core count it's going to get interesting.



You’ve gotten plenty of recommendations, but I’ll add one for Magnet [1]. I’ve been using it for years, and I love it. One of the best software purchases I’ve ever made - small, lightweight, does exactly one thing and does it very well, and made by a small developer to boot.

[1] https://magnet.crowdcafe.com/index.html



I'm also a tiling window manager man and tried that a few years back (as well as everything else on the market) when I had a Mac from work. Unfortunately, without real support from the OS, these are all just poor man's window managers and can't compare to the real thing. I gave up trying to use any of them and ended up installing a virtual machine where I could actually use one.


I'm a GNOME addict. If you ever asked yourself "who are they building this monstrosity for?", that would be me; it fits my exact preferences in a workflow out of the box. Unity irritates me to no end, and my wife's Mac is almost unusable for me, which she greatly appreciates.


Not sure why you're being down voted, I use Amethyst and love it.


Zen 4 will probably at least match the M1, but it will be a while before those chips come out and Apple will soon improve even more.


I have a feeling that Apple has pulled ahead and will stay ahead for a LONG time. They are extending their moat. And applying Mx to VR they will create a new moat.


They started pulling ahead in the A6-A7 days and never looked back. Amazing progression.


I'm curious to see if Zen 4 manages actually. It'll be an interesting datapoint as to what actually makes the M1 better.


Well, here's hoping that both Linux and Windows work flawlessly on their hardware at some point.


Not even macOS works flawlessly on the hardware, so why would Windows and Linux? But as far as Windows goes, not having every other update completely break my system would be a welcome change.


> Not even macOS works flawlessly on the hardware, so why would Windows and Linux?

This is just pedantry and needless nitpicking. Replace "work flawlessly" with "work well" in my previous comment.


Because with Linux, excitement can change things. What are you gonna do if you miss something in iOS/macOS? The right people can in principle make anything work in Linux, but with macOS you are left praying Apple decides your use case is their business case.

Imagine what would happen if the compute/$ of M1 laptops performed better in some respect with Linux than with macOS. Things may get out of hand when huge chunks of the Linux community get involved.


Of all the operating systems, I'm finding macOS to be the least annoying. So far Apple has not made a single computer suitable for me, but if they were to release something like a Mac Pro Cheap Edition (or whatever, I just want workstation-level specs for $2k), I'd switch to it. I don't really care about M1 vs. Intel; I think any modern CPU is fast enough for any task.


No need to hate. Nowadays people in general do make necessity out of convenience and virtue out of necessity.


I almost went for the M1 but "1.0" kept me sane. I will definitely go for the M2.


"Antitrust is okay if you have a fast processor"


"It's not my personal responsibility to attempt to enforce antitrust against companies by boycotting them at significant personal expense".


Exactly, there are bodies that are supposed to protect consumers from that behaviour; unfortunately they have failed everyone massively. That in itself begs for an inquiry into how those institutions actually work and whether it is worth spending taxpayer money on them if they consistently fail to deliver.


...and if that exact kind of navel-gazing is why they don't work? What then?


Charge people for failing to deliver, dispense huge fines and jail time. Then rebuild it in a way that avoids the mistakes that made the previous solution fail.


Additionally, it is still (for the moment) possible to engage with Apple computer hardware without using any of their bullshit services (like the App Store) into which all their anticompetitive behavior has so far been constrained.

This is of course not so on Apple mobile devices: it's dox yourself to the App Store or GTFO over there.


For CPUs Apple is actually upending the AMD Intel duopoly, isn’t that good for competition? Furthermore, AMD only recently broke back into the market, which Intel had a stranglehold on. This is the most competitive the CPU market has been since the early 00s.


What antitrust are we talking about?


It would seem to me there is fairly healthy competition between apple and windows laptops?


The thing is, every competitor is going to upgrade to these, and if you stay with inferior Intel products, you give yourself a competitive disadvantage. Unfortunately this is the current state of play. If I can't achieve something at the same speed a competitor can, I put myself in a bad spot. I try not to mix emotions and business; however, for a personal machine, I am rocking 5950X and it is awesome.


> I am rocking 5950X

Irrelevant from a mass market perspective. That chip is still out of stock at Best Buy and the Amazon third party version is marked up 54% versus MSRP.

Incidentally the 11900k is also out of stock at many retailers, but it's so much cheaper. You can still buy a pre-binned version that clocks 5.1GHz; even with markup that costs 30% less than the aforementioned third party 5950x.

Availability and price matter. My take on AMD's heavy focus on the enterprise segment right now is that they have no choice. If your enterprise partners get faced with scarcity issues, they will lose faith in your supply chain. You can tell a retail customer to wait patiently for 6 months, and even an OEM that makes retail devices (eg Lenovo) may forgive some shortfalls as long as there's a substitute available, but Microsoft and Google aren't going to wait around in line like that.


Mass market isn't buying individual PC parts and assembling the PC, they're buying prebuilts or using whatever their office mass-purchased. Go on dell.com right now and you can order a PC with a 5950x and RTX 3080. Good luck buying either of those individually without writing a web bot.


I just did. Fastest delivery (express) for ALIENWARE AURORA RYZEN™ EDITION R10 GAMING DESKTOP with base specs aside from 5950x and the cheapest liquid cooler (required by Dell for that CPU) is May 26. Some of the higher end default/featured configurations would deliver in late June. Not sure whats up with that.

Honestly the price/performance ratio there is pretty nice in the eyes of a Mac user like me, but I don't know what office is buying Alienware, and a bulk order would no doubt take longer to deliver. Those are the only machines popping up when you filter for AMD 5000 series on dell.com.

Considering that 5950x is AM4 compatible, folks who had bought a pre-built machine and want to upgrade are also part of the mass market. And I think you can't discredit the homebuilt PC crowd for a high-end desktop chip. The people who care enough to want this chip can probably figure out how to clip it into a motherboard and tighten a few screws and connectors here and there.


While this news is about "a rumor according to sources familiar with the matter", it's obvious that Apple will be doing this at some point. Whether it's the M2 or a new letter designator (X-series silicon for eXtreme performance? Apple X1?), I am very interested to see what the performance numbers will be for an ARM-powered workstation rocking 16 to 32 high-power cores. Aside from the Ampere eMAG, is a 16+ core ARM-powered workstation even a thing yet? (I don't count Amazon's Graviton chips in this discussion because I cannot own a machine on my desktop powered by one.)


M2 seems very unlikely to me because that would easily create product confusion. Imagine the following:

M1 -- 4 A14 cores

M2 -- 8 A14 cores

M3 -- 4 A15 cores

That "third generation" sounds better, but is actually inferior to the second generation. X1 or M1x seem much more likely. It's the same reason why they made an A12x instead of calling it A13.

They probably need 3 designs, but economies of scale begin to be a problem. A14 and M1 are now selling in the millions, so they get a huge benefit. More importantly, they are in devices that get replaced a bit more frequently.

M1x (8-16 cores) will be in bigger iMac and laptops. They don't sell nearly as many of these. In addition, yields on the larger chip will be lower. Finally, people with these devices tend to keep their machines longer than phones, tablets, or cheaper laptops.

The 16-64-core chips are a major potential headache. If they go chiplet, then no big problem (actually, the M1x as two M1 chiplets seems like a desirable direction to head). If it is monolithic, the very limited sales and production will drive prices much higher. A logical way to offset this would be bigger orders with the excess being sold to Amazon (or others) to replace their mini cloud, but that's not been mentioned publicly.


I always assumed the "M" in the M1 designation meant "mobile" and that higher-powered chips (in terms of processing, electricity, and heat dissipation) would be coming later and have a different letter designator. Either that or we'll get Air/Pro suffixes to the chip numbers (e.g. M1 Pro, M2 Air...)


...I always thought it was for "Mac".

Though the fact that they've just labeled the chip in the latest iPad Pro as such does add a bit of confusion to that.


M for mobile could make sense given they've just put an M1 in an iPad Pro. I assumed it was Mac before that and we'd get an M1X or something for higher end models but that seems wrong now.


...but they also just put it in the iMac...


The variant of it that uses laptop parts, but fair point. Mobile does seem a stretch for iMac. Let's just call it Apple's unique naming convention :)


The iMac has long used laptop grade parts.


Another possibility is that they skip the M1x altogether; if the Pro machines are coming in the autumn and not at WWDC, then the timeframe will be closer to the iPhones' and it would make sense for them to use that later tech.

M1 (winter 2020) — 4x A14 cores

M2x (autumn 2021) — 8x A15 cores

M2 (winter/spring 2022) — 4x A15 cores

etc.

There’s really no reason for the naming and timing of Pro machines to lag behind consumer machines just because of arcane numbering. And there’s precedent, Intel Core Duo Macs were released before Intel Core “Solo” Macs.

But if they’re actually ready for WWDC, then no, it’ll just be called M1x.

As for the Mac Pro… well, it's definitely going to be weird. I think the existence of the Afterburner card proves that Apple sees a future in highly specialized modular add-ons providing most of the differentiation, but buyers would still want a bigger chip than in the 16" laptops, so who knows… of course nobody even knows how an M1 would perform with proper cooling and a higher clock!

[edit] also making M2x ahead of M2 will give them the benefits of “binning” some chips


There are a couple of good reasons for an M1x using A14 cores.

With new nodes, it takes time to get things right. A smaller chip is generally preferred as easier to work with and test. It also gives the node more time to stabilize so your big chips get better yields.

Addressable market is another factor. There are over a billion iPhone users replacing devices every 1-3 years on average. There’s around 100 million Mac users as of 2018 replacing their computers every 5-8 years and that being split between 2 and 3 major market segments.

That means making over 250 million A14 chips, but only around 15 million M1 chips (aside from iPads which are there to improve economies of scale by reducing two designs to one). The M1x probably addresses around 4.5 million computers and the M1p or whatever they call it probably addresses under a half million machines.

A lot of people wanting the performance boost or fearing outdated, unsupported x86 machines will probably cause an increase in M1 demand, but that will be followed by a decreased demand for M2 and maybe even as far as M3 or M4 while demand catches up with the altered purchase cycle. Apple is no doubt well aware of this.

In any case, it’s best to try a new node with a smaller part where you know you need a million wafers anyway. Once you’ve optimized and refined a couple steppings, you move to the larger chip with smaller demand where yields are worse in general.

Personally, I think they would greatly benefit from a unified chiplet design starting with something close to the A14 then scaling up.


There is something available for building workstations. Not sure about the performance, though. https://www.solid-run.com/arm-servers-networking-platforms/h...


Here's an ARM workstation: https://store.avantek.co.uk/ampere-altra-64bit-arm-workstati... "Unfortunately" the minimum CPU is 64-core.


I haven't yet used an M1 mac but based on what I've read about it I have fully bought into the hype train.

Hoping my next laptop will be a M2-powered MBP, assuming they can increase the maximum RAM to at least 32GB.


I replaced my fully spec'ed out Core i9 MacBook Pro with 64GB RAM with an M1 MacBook Pro with 16GB of RAM, and I can tell you my computing experience is better! Especially the fact that my computer works on battery!


I wouldn’t get too hung up on the RAM. Went from an XPS with 64GB that used about 16-20GB in my day to day, still able to use the same workflows and memory is handled fine behind the scenes on my M1 Air with 16GB. Maybe get your hands on one from a friend and play around with it. Would imagine ram heavy tasks like rendering etc would choke but I just run many workflows / builds simultaneously which works fine.


16GB might be passable right now, but I am regularly using 20-25GB at a time on my work laptop, a 2019 15" MBP, occasionally bursting to ~30GB when I'm really pushing certain dev environments.

I keep my personal laptops for a long time, and I don't want to be stuck with 16GB when it's getting close to not cutting it now based on what I do. I can't imagine being stuck with 16GB in 8-10 years.

My current personal laptop, a 2015 13" MBP, has 16GB of RAM. I can't imagine NOT upgrading.


All fair points, but the deal is these M1 MacBooks make more out of the RAM they have. And I can't imagine not upgrading a laptop in 8-10 years; I still have a T430S, but it can't keep up with any reasonable workflow today.


There are rumours of a supposed M1X that may hit before the M2, so you may be waiting a little longer than you’d think. :)

Of course, the Apple rumour mill; grain of salt, etc. But I wouldn't be surprised if we saw an M1X that could, for instance, support 32GB of RAM by the end of the year (which is the only blocker from me buying an M1), and then they pop out the M2 next year, maybe with an Apple-CPU-powered Mac Pro?

Food for thought. :)


Yeah I guess 'M1X' vs 'M2' doesn't matter so much. As long as they've had time to work out any kinks from the first gen and increase the RAM, I'm all in.


There are missing features (like the number of USB ports) but no real kinks that I've come across. Although I don't need it, I tried Parallels + ARM Windows + x86 Windows apps on a lark and it worked without any noticeable performance issues.


Whoa. That's excellent to hear.

Since they are non-upgradeable, I will certainly be waiting until a unit with at least 32GB RAM (ideally 64GB) before I'd upgrade at all and consider it future-proofed, but this is great to know!


Yes, I would expect them to have yearly M1, M2, M3, etc consumer-grade SoCs, and then alongside that yearly or maybe less-than yearly X-versions of the current SoC, with features like more RAM, more IO, etc.

So 6 months from now you'll probably have the choice between an M2-based MacBook Air with better single core performance or an M1X-based MacBook Pro with more cores and more max. ram - unless they decide to do the reasonable thing and shift the regular and X versions to the same release cycle.


Why do you want 32GB? With my 8GB M1 Mac mini I was surprised to get away with as little memory as I did. I felt that the M1 needs far less memory than x86 to feel snappy.


I am glad that you found something that works for you, but I am confident 8GB will not work for the type of work I do. 16GB might be passable right now, but I am regularly using 20-25GB at a time on my 15" MBP now with my workloads, occasionally bursting to close to 30GB when I'm really pushing certain dev environments.

I keep my personal laptops for a long time, and I don't want to be stuck with 16GB when it's getting close to not cutting it now based on what I do. I can't imagine being stuck with 16GB in 8-10 years.


FTA: "the latest semiconductor production technology, known as 5-nanometer plus, or N5P. Producing such advanced chipsets takes at least three months”

I know next to nothing about this process, but I can't imagine the latency from silicon wafer to finished product is 3 months. I also can't imagine some inherent start-up delay for producing the first chip (but as I said: I know next to nothing about this process), so where do those 3 months go? Is it a matter of "you have to be extremely lucky to hit so few minor issues that it only takes 3 months to have a first working product"?


Latency for a chip is 4 weeks at the low end of complexity, and 12 weeks (3 months) at the high end of complexity.

My mind was blown when I first found that out


Thanks. Also good to hear that I’m not the only one who finds that surprising.


Fellow mind blown friend here!


Just a side rant here... I'm really frustrated that I can't monitor the Neural Engine's usage on the M1 in my MacBook Air. Apparently Apple did not build an API for extracting data points from these 16 cores, so I can only watch what the CPU and GPU are doing when running and optimizing TensorFlow models while the NE remains a black box.
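
For what it's worth, here's a minimal sketch of what TensorFlow will actually show you on an M1, assuming the tensorflow-macos and tensorflow-metal packages are installed; the Metal plugin exposes the GPU as a device, but nothing ANE-related is listed, which is why its utilization stays a black box:

    # Sketch: list the devices TensorFlow can see on an M1 Mac.
    # Assumes tensorflow-macos + tensorflow-metal are installed.
    import tensorflow as tf

    print(tf.config.list_physical_devices())
    # Typically something like:
    #   [PhysicalDevice(name='/physical_device:CPU:0', device_type='CPU'),
    #    PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
    # There is no ANE/Neural Engine entry to place ops on or to query for usage.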


If Apple is having this kind of success, it seems they should look to compete in the data center with this or the following generation of chips. I wonder if it is a good time to invest in Apple.


What's in it for Apple? I'm not trying to be glib, here, but unless there were some Mac only server functionality, nobody would buy an Apple ARM powered datacentre machine.


With a sufficiently wide lead in energy efficiency, just selling hardware without any follow up lock-in harvest can be attractive even for a company as spoiled as Apple. They'd likely want to make the modules offered sufficiently big to make them unattractive for desktop use or else they'd risk cannibalizing their lock-in market.


They get a gigantic boost in internal datacenter performance if they can jam a bunch of Apple Silicon chips into a rack-mounted server and boot Linux on it. If they can get a similar boost in performance at lower power in the chip that is going into the Mac Pro, taking that chip and putting it on a board with Ethernet and power wouldn't be a ton of engineering cost, and then they could massively reduce the power consumption and cooling costs of their datacenters.

And then they could resell the box they designed as a server, either with Linux support out of the box (unlikely, but since in this mystical scenario they'd have to write kernel patches for Linux to get it to boot...) or with a build of macOS that could be set up headlessly, in order to recoup the development costs. Apple shops that want machines to run Compressor or Xcode's build server would eat that up.


Eh, the Asahi linux people already have linux running on this chip.

What's in it for Apple is money and better economies of scale for chips. But I don't really think it fits Apple's MO so I doubt they'll do it.


MO is my thought, too. Getting back into server hardware would require supporting a vastly different kind of client, and possibly require them to start doing some things that they, as a company, might find distasteful. Supporting equipment they stopped making a long time ago, for example. Including maintaining stockpiles of replacement parts.


> Eh, the Asahi linux people already have linux running on this chip

More specifically, people are running Linux on the CPU cores.

The M1 is a system-on-chip, and according to the floorplan [1], the CPUs are maybe 1/5th of the chip. There are many other features that aren't unlocked, such as GPU (which is a brand new architecture) or power management. The latter is key to exploiting the chip to its full performance envelope.

I don't expect Asahi to get anywhere further than proof-of-concept before it becomes obsolete by the march of the silicon industry.

[1] https://images.anandtech.com/doci/16226/M1.png


I think it depends on how much changes between generations. So far it seems like most of this isn't really new, but almost exactly what's been on the iDevices for a long time. If they don't re-architect substantially between generations I can see the Asahi project keeping up.

The GPU stuff is new, but it seems like good progress is being made: https://rosenzweig.io/blog/asahi-gpu-part-3.html

For data centers, it helps that GPU and so on is just less important. It's wasted Silicon, but the CPU is competitive even before considering the added features so that's not the end of the world. There's a good chance that Apple can use that to their advantage too, by using chips with broken GPUs/NNPUs in the DC... or designing a smaller chip for data centers... or one with more cores... or so on.


If Linux was supported, it would be an interesting competitor to AWS's graviton instances.

As for what's in it for Apple, it would be the profit from selling a hopefully large number of chips, but adding official Linux support and also committing to an entirely new market for a minimum of three years is probably a far higher cost in focus than any potential profits.


What if Rosetta 2 was that Mac only server functionality? I don't know that they'd do it, but from M1 Mac reviews it sounds like M1 + Rosetta 2 will run at least some x86 code faster and more power efficiently than any competitor.

I don't know how feasible it is to scale that up to a datacenter though, and I expect MacOS licensing costs would wipe away any power efficiency gains. But I do wonder if they could hypothetically scale up and beat the best Intel/AMD have to offer just using Rosetta 2 to translate x86 code.
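
As a side note on how Rosetta 2 is visible to software: a process can at least detect whether it is being translated via the sysctl.proc_translated key that Apple documents for this purpose. A minimal, macOS-only sketch using Python's ctypes (the helper name is mine, not an Apple API):

    # Sketch: detect whether the current process runs under Rosetta 2.
    # Reads the sysctl.proc_translated key; 1 means translated, 0 means
    # native, and the key is absent on Intel Macs / older systems.
    import ctypes
    import ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    def is_translated() -> bool:  # hypothetical helper name
        val = ctypes.c_int(0)
        size = ctypes.c_size_t(ctypes.sizeof(val))
        ret = libc.sysctlbyname(b"sysctl.proc_translated",
                                ctypes.byref(val), ctypes.byref(size),
                                None, ctypes.c_size_t(0))
        return ret == 0 and val.value == 1

    print("translated by Rosetta 2" if is_translated() else "native (or key not present)")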


First of all, Apple could save a huge amount of money by replacing Intel-based servers with their own chips, both on the CPU price (especially the Xeons, which are really expensive) and on electricity consumption, probably the largest running cost of data centers.

Then there are the gains of scale: making a CPU just for the Mac Pro would mean too-low production numbers, but data center usage would drive those up - especially if Apple also sold it to other customers, e.g. bringing the Xserve back. For the OS they could run Linux virtualized, or they could give the native Linux-on-Mac developers a hand.


The amount of money that Apple would save to put a bunch of 2U ARM servers into racks is dwarfed by the costs of building such systems. Seriously, nobody (to the first approximation) is going to buy a macOS server, and so there's no reason for Apple to do this.


If they were to (re)enter this market they'd have to support Linux, which I just don't see happening.

What's interesting to me is to see if they'll use M-series chips in their own datacenters. They already run Linux there apparently.


Do you mean that they should start making and selling servers? It's unlikely.

Or do you mean that they'll start selling parts (SoCs)? Not in a million years :-)


I am just trying to forecast how bright Apple's future is. Seems like they have options. So, is there going to be a shift towards ARM/RISC in general or not. If so, where do I put my money?


Well, speaking of Apple: 99% of what Apple sells is consumer products. So look for big consumer product markets which they haven't entered. Cars would be one of those markets, for example.

AR/VR/more wearables would be another.

Home appliances/electronics would be another.


I think they will soon, but only for a specific use case: macOS and iOS development. Devops in the cloud is expected these days for development, and the only offerings available for it are basically Mac minis jammed into a rack, often running grey-area virtual machines. A new Xserve model and an Xserve cloud service would be great!


How would they be successful in the DC? Designing a product for consumers is very different from designing one for servers. On top of that, add macOS's terrible server support.


They only need to support a Linux kernel. They've used Google Cloud, Azure, and now AWS. The contract is worth billions and will end in '23 or '24. It's very likely they'll at least run their own cloud completely, and maybe they'll compete as a public cloud later.


I'm really curious how this will play out. Data centers haven't been Apple's market for a long time, and the requirements of data center customers are kind of anti-Apple these days.

More likely I would see Apple making an exclusive server chip licensing arrangement with ARM or something similar.


High throughput at relatively low power: to me, it seems like a match made in heaven. There's also the practical question of building rock-solid containerization on these CPUs. I don't know where that stands, but it seems like an obvious fit.


It would be great, but do you see Apple making commodity datacenter hardware for running Linux/Windows?


I doubt that they would bother with the 2nd coming of macOS Server for anything other than Apple shops.


I think if they ever released servers, they would want total control over what you run on them, so you couldn't just upload your service; Apple would have to approve it first.


How does that even make sense? You can run anything on Macs, and those are one level less enterprisey than servers would be.


Here's hoping this chipset will support 32GB+ RAM and more than 2 displays!


Is anything known about the chip? The deal breaker of the M1 (for me), as it currently stands, is the maximum amount of RAM it can handle (16 GB).

Edit: Mistyped 16 as 1, sorry about the confusion


Also the relatively low number of cores.


I've only read of very rare workloads actually being constrained by 16GB on the M1. What use case do you have that you know will be hampered on an M1?


My current laptop has 64 GB. I could probably be fine with 32, but I'm seldom under 20 GB in usage. I run certain services locally in Docker, and then have 5+ instances of IntelliJ open for various things, some of them running a Java server or Node building the frontend; others on my team may be running the Android emulator + iOS Simulator at the same time as well.

I could alter my usage patterns and work with less. But people not used to having loads of ram don't know what they're missing out on.


I went from a 64GB XPS 15 as a developer utilizing ~10-20GB during general workloads, and I can get the same work done on my new 16GB M1 MacBook Air without a hitch. Unless you are reading and writing to RAM incredibly quickly, or need a specific massive amount of data in memory as in rendering-type tasks, running general applications beyond the 16GB point is totally fine, and the OS will figure out the rest.

I'm curious to know whether it would work for you. Do you have access to an M1 to try your workflow on? The max-RAM 'issues' seem to be way overblown.
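For anyone wondering what "the OS will figure out the rest" actually means: macOS compresses and swaps cold pages long before anything falls over. If you want to watch it happen, here's a rough Swift sketch against the Mach host_statistics64 API (my formatting; note that swapouts is a cumulative count since boot, not current swap usage):

    import Foundation

    // Rough sketch: ask the kernel for its VM statistics, which include how many
    // pages are currently held compressed and how many have been swapped out since
    // boot. This is roughly what Activity Monitor rolls up into "memory pressure".
    var stats = vm_statistics64()
    var count = mach_msg_type_number_t(
        MemoryLayout<vm_statistics64>.size / MemoryLayout<integer_t>.size)

    let kr = withUnsafeMutablePointer(to: &stats) { ptr in
        ptr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            host_statistics64(mach_host_self(), HOST_VM_INFO64, $0, &count)
        }
    }

    if kr == KERN_SUCCESS {
        let pageMB = Double(vm_page_size) / 1_048_576.0
        print(String(format: "compressed:  %.0f MB", Double(stats.compressor_page_count) * pageMB))
        print(String(format: "swapped out: %.0f MB (since boot)", Double(stats.swapouts) * pageMB))
        print(String(format: "free:        %.0f MB", Double(stats.free_count) * pageMB))
    }

The same numbers are visible in Activity Monitor's Memory tab, which is the easier way to sanity-check whether 16GB actually constrains a given workflow.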


Side question: isn't the iOS Simulator running natively on M1? That would mean it consumes less RAM than on x86. If that's true, it should be possible to fit an Android+iOS workflow.

As a data point: I am running a node + iOS Simulator setup (along with Xcode, VS Code, and Chrome with 50+ tabs) on a 16GB M1 and it works fine. I also keep everything running while I take a break to play a LoL match. Works great for me.


Simulators already ran natively on x86: they simulate the API/ABI, as opposed to emulating the processor.
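On an M1 you can actually check whether a given process ended up running natively or under translation: Apple documents a sysctl key, "sysctl.proc_translated", for exactly this. A minimal Swift sketch (the helper name is mine):

    import Foundation

    // Minimal sketch: ask the kernel whether the *current* process is being
    // translated by Rosetta 2. The key doesn't exist where Rosetta isn't a thing,
    // in which case sysctlbyname fails and we treat the process as native.
    func runningUnderRosetta() -> Bool {
        var translated: Int32 = 0
        var size = MemoryLayout<Int32>.size
        guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
            return false
        }
        return translated == 1
    }

    print(runningUnderRosetta()
        ? "x86_64 binary running under Rosetta 2"
        : "running natively (arm64, or x86_64 on an Intel Mac)")

Activity Monitor shows the same thing per process in its "Kind" column (Apple vs. Intel), which covers the Simulator case without writing any code.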


My Ableton starter template project takes about 22GB of RAM to open. Music production is a pretty common use case that can be very heavy on RAM.


I think you mean 8GB?


I think he meant 16GB.


I own an M1 Macbook Air with 16GB of RAM


Yep. Unfortunately the M1 MacBook Pro currently only goes up to 8GB.


I am typing from an M1 MacBook Pro with 16GB.

edit: you have to select the SSD size then you can choose the RAM.


Ah, you are right! Thanks for the edit. I didn't click through on the presented options to see if they offered additional upgrades.


I think you mean 16GB?



To think a 36-year-old architecture that Apple seeded is finally coming to fruition. I give them props for playing the long game, for sure.


I'm reading somewhat incompatible reactions in the top-level comments, e.g. [1]:

> Somewhere, the collective whos who of the silicon chip world is shitting their pants. Apple just showed to the world how powerful and efficient processors can be. All that with good design. Customers are going to demand more from Intel and the likes.

Another [2]:

> I really want the next MacBook Pro to support driving two 4K monitors over thunderbolt and have an option to buy 32 gigs of ram.

Meanwhile the last Intel MacBook Pro supports driving four (4!) 4K displays [4]. Apple silicon is far ahead in benchmarks, but how do speeds and feeds translate into what customers actually want?

Battery life is impressive, but unfortunately it's not the usual differentiator during a worldwide pandemic. The M1 Macs are quite quiet (the first MacBook Air without a fan, in 2020!), meanwhile the Intel Surface Book was fanless in 2017. We shot the messenger over the recent Intel ads attacking Apple [5], but the message is still worth reading. I bought an M1 MBA and realized the speed didn't make a difference for me as a consumer machine. For the first time in decades I'm not sure Apple provides the most pleasurable experience.

[1] https://news.ycombinator.com/item?id=26956336

[2] https://news.ycombinator.com/item?id=26955682

[4] https://support.apple.com/en-ca/HT210754

[5] https://www.macrumors.com/2021/03/17/justin-long-get-a-mac-i...


How are the reactions incompatible? People like me, who don't need more than 16 GB of RAM and one monitor, are happy with the M1. Other people are waiting on the M1X/M2 chip to bring what they need.

> meanwhile the Intel Surface Book was fanless in 2017

The MacBook was fanless in 2015 and, like many other fanless designs using Intel chips, it was slow.


> Other people are waiting on the M1X/M2 chip to bring what they need.

Well, those people must have been bullish on Apple Silicon and not just the M1. They think it's worth skipping the M1 rather than going all-in on a first-gen product which, at the time, had primitive support for most mainstream software, especially for developers.

Maybe Apple knew that the M1 could not drive more than one external monitor on the MacBook Air, and simply shipped with that limitation and a small disclaimer.

Perhaps they will announce this capability in the M2 Macs.


> How are the reactions incompatible? People like me, who don't need more than 16 GB of RAM and one monitor, are happy with the M1. Other people are waiting on the M1X/M2 chip to bring what they need.

I agree with your nuance.

> The MacBook was fanless in 2015 and, like many other fanless designs using Intel chips, it was slow.

The Surface Book 2/3 and the M1 MacBook Air are not slow (hence the point of my comparison).


Arm-based laptops that are competitive with Apple's M1 and its successors could arrive as early as 2023, powered by a Qualcomm SoC based on their $1.4B acquisition of Nuvia, led by a team that designed Apple's M1, iPad, and iPhone SoCs. This would likely run Windows on ARM and Linux. Hopefully it will also include hardware virtualization support, like the M1.

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...


Is Apple going to be affected by the chip shortages we have been hearing about?


Apple tends to buy fab capacity in very large chunks. IIRC, they bought all of TSMC's 5nm capacity for a year.


Nikkei says they have already postponed some Mac and iPad production. Not sure how reliable that story is, but so far it doesn't look like customer orders are delayed at all. I ordered a non-standard configuration when this article came out, and they delivered it a week ahead of schedule.

https://asia.nikkei.com/Business/Tech/Semiconductors/MacBook...


I would expect that Apple's chips are probably some of the more profitable chips made by TSMC. Apple has historically had relatively large margins, so they can probably afford to pay a bit more.


Probably not since they likely didn’t scale back their orders in 2020.


Taiwan also has a severe water shortage. I'd assume that's still a threat to production for Apple. https://www.taipeitimes.com/News/feat/archives/2021/04/22/20...


I had to wait a month to get my Macbook Air M1 here in Norway.

It was quite a painful wait (my 12" MacBook got water damage, and the SMC limits it to 1 GHz).


I know a lot of people waiting for this one, myself included. Here's hoping Asahi Linux is ready around then!

I'm guessing the 2021 MacBook Pro is going to be the fastest laptop ever made.


Every generation is "the fastest ever made". The question is more: will this one be the Pro version?


Surely not every MBP has been the fastest laptop ever made. Has that in fact been true at any given point in time?

This one could be, though, but there are probably some current-gen laptops out there with desktop-TDP chips.


I didn't mean just Apple. I just meant the fastest laptop money can buy.


The fastest laptop money can buy has always been available. I guess this is the first time you are considering buying one?


Sometimes I fantasize about printing a t-shirt that simply says "shut up, you know what I meant".


I’ll buy two


Maybe their two requirements for a laptop were it being (1) an Apple product and (2) the fastest laptop money can buy.


I expect every mac user at my job to get one.


Besides the radio/modem (Qualcomm) that Apple is quickly working to replace with their own, are there any other tech/chips inside an Apple SoC that they don't design themselves?


Oh I think _they are_ getting into the radio chip business.

https://www.apple.com/newsroom/2021/04/apple-commits-430-bil...


When I was young, every time I got a new computer (which wasn't often, but it happened a few times), the new machine was absurd amounts faster than the previous one. Like 10 times more processing power and memory.

I'm glad we can live that feeling again, even if just for a short while.


How is this possible in this time of shortage of IC production capacity?


Apple wasn't cheap and pre-bought production capacity.


Apple booked the production capacity at TSMC years in advance.


My wallet is ready for the next line of macbook pros.


Think the new chip will have a better RAM solution, ultimately allowing for more RAM?



People can be legitimately impressed by the power and efficiency of Apple's first desktop-class processor while also understanding that certain more niche features were out of scope for a first version. I'm certainly expecting this to be fixed by the second generation, and if it's still missing I won't be quite as understanding.


> People can be legitimately impressed by the power and efficiency of Apple's first desktop-class processor while also understanding that certain more niche features were out of scope for a first version. I'm certainly expecting this to be fixed by the second generation, and if it's still missing I won't be quite as understanding.

I'm responding to an HN commenter who was not just impressed by the power of the M1 but hyperbolically asserted that it is better than everything else, yet the next top-voted HN comment demonstrates otherwise with a demand for a feature the M1 downgraded. The tenors of those reactions are opposed, and my aim is to reflect the nuance.


This is not a contradiction. You can both feel that the M1 chip is superior to previous designs in most respects and admit that it is lacking in others.


I consider the ability to drive more than one external display to be directly related to the power and design of the chip.


I have an M1 MacBook Air, and I’m blown away. I cannot wait for a 16 inch MacBook Pro with whatever madness they have planned.

I love the direction Apple is headed with their hardware.


People are wanting more cores, more memory, eGPU support, and I'm here just wanting them to come in multiple colours...


So, M(n)+ or M(n+1) ?

"tentatively known as the M2"

Blasphemy! Plus then N+1!


I might consider buying one when it is able to run proper Linux. And even then it's probably going to be limited to ARM Linux only.


"Limited to the CPU type that is installed"... how is that a "limit"?


For my workloads that is a limitation that must be considered when purchasing hardware. It's the same reason why I don't buy ARM Chromebooks.

It's going to take years for proper Linux to run on the M1, and even longer for the ARM ecosystem to catch up to x86-64.

For me it's reasonable to keep using an AMD Ryzen 5000, which is faster than the M1 on my multithreaded workloads anyway, despite being on 7nm. Plus it has a better GPU, more memory, more storage, more ports, and supports multiple monitors.

Sure, it's more expensive, but that's just because my segment is geared to pros with higher requirements. Apple currently has no laptop offering in this tier.


Soldered RAM and SSD, coupled with SSD wear issues leading to a less-than-three-year lifespan for a laptop, make all of this a hard pass for me, and it should be for any sensible person too.


Even if that were true (it isn't; the SSD issues appear to be mostly related to as-yet non-native software), a 3-year lifespan for the price of 6 years' worth of a half-speed laptop still makes sense.


Not sure why you are being downvoted, but this is true, is it not? It's even worse with Apple Silicon machines, since if the SSD dies, the whole thing is bricked, unlike the Intel Macs.

It seems the Mac aficionados (especially the M1 fanatics) are in denial about the degree of lock-in with the Mac as it gradually descends into being nearly as locked down as an iPhone.

I'd give it 0.1 out of 10 for repairability. At least with the latest Surface line-up the SSD can be upgraded.


You can wear out any SSD. There's no evidence that Apple SSDs are any worse than others. You need to have backups. You need to understand that Apple products are sealed and disposable and only buy them if your use case can accommodate that.


Cool! So now (or at least soon-ish) I can get my hands on some dirt cheap, practically never used M1 hardware on eBay to play around with?

I wonder if Apple is familiar with the Osborne effect[1].

[1] https://en.wikipedia.org/wiki/Osborne_effect


This is a rumor, Apple didn't make this announcement. It is not an example of the Osborne Effect.


I don't think Apple will put the new processor in the existing M1 products, except for the 13" MacBook Pro.


Why a new SoC? Isn't the M1 basically maxing out what can be done on an SoC, and what's missing is the version with external memory and a GPU?

They can refresh the cores in the M1 of course, and I expect they will do that yearly like the A-series cores, but it would be weird to go even two SoC generations without addressing the pro CPU.


Apple could easily fit 2x-4x the performance on an SoC, so that's what people expect the M1X and M1Z to be. Note that it's still an SoC if it has "external" memory (Apple's "unified memory" isn't exotic: it's LPDDR mounted next to the die on the same package and shared by the CPU and GPU, not memory on the die itself).
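To make "unified" concrete: it just means the CPU and GPU address the same pool of DRAM, and Metal exposes that directly. A small Swift sketch using standard Metal API (hasUnifiedMemory and a shared-storage buffer); treat it as an illustration, not a benchmark:

    import Metal

    // Small sketch: on Apple silicon, a buffer created with .storageModeShared is
    // one physical allocation visible to both CPU and GPU, so no staging copy is
    // needed. On Macs with discrete GPUs, hasUnifiedMemory reports false and you'd
    // typically reach for .managed or .private storage plus explicit blits instead.
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("no Metal device available")
    }
    print("unified memory:", device.hasUnifiedMemory)

    var values: [Float] = [1, 2, 3, 4]
    let buffer = device.makeBuffer(bytes: &values,
                                   length: values.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // The CPU reads back the very allocation a GPU kernel would operate on.
    let ptr = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
    print("first element as seen by the CPU:", ptr[0])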


> it's still an SoC if it has "external" memory

I meant "soldered ram next to SoC with cpu and gpu cores", i.e. not DIMMs and 200W PCIe GPU. For a pro chip in a desktop form factor I think DIMMs and PCIe graphics are inevitable, and that's the interesting storyline in the evolution of the M1. We know they'll produce better M1's, but the integration with 3rd party graphics, the choice of DIMM type and so on is interesting.


They may never have DIMMs. Apple is crazy enough to solder 256GB and call it a day. People would complain... and then they'd still buy it.

