> The Mac mini with M1 chip that was benchmarked earned a single-core score of 1682 and a multi-core score of 7067.
> Update: There's also a benchmark for the 13-inch MacBook Pro with M1 chip and 16GB RAM that has a single-core score of 1714 and a multi-core score of 6802. Like the MacBook Air, it has a 3.2GHz base frequency.
So single-core we have: Air 1687, Mini 1682, Pro 1714
And multi-core we have: Air 7433, Mini 7067, Pro 6802
I’m not sure what to make of these scores, but it seems wrong that the Mini and Pro significantly underperform the Air in multi-core. I find it hard to imagine this benchmark is going to be representative of actual usage given the way the products are positioned, which makes it hard to know how seriously to take the comparisons to other products too.
> When compared to existing devices, the M1 chip in the MacBook Air outperforms all iOS devices. For comparison's sake, the iPhone 12 Pro earned a single-core score of 1584 and a multi-core score of 3898, while the highest ranked iOS device on Geekbench's charts, the A14 iPad Air, earned a single-core score of 1585 and a multi-core score of 4647.
This seems a bit odd too - the A14 iPad Air outperforms all iPad Pro devices?
My guess: in Geekbench the Air and Pro score the same because Geekbench is short-lived and not thermally constrained. In Cinebench you'll see the Pro pulling ahead.
Of course the iPhone chip isn't as beefy as the M1, but the results still speak for themselves.
Apple makes it clear that in the real world, these machines are only going to offer their incredible performance on Metal, iPad/iPhone apps, and any Mac apps that happen to have been ported to M1 by their developers (using Xcode). These machines will only offer performance similar to existing Intel Macs when running existing Intel Mac apps, because much of that incredible performance will be consumed by Apple's Rosetta2 software making those unmodified apps compatible.
But what went unsaid, except during the part where they say they 'learned from their experience in the past processor transitions', is that by introducing the chip at the low-end of the lineup first, they create a market for the (few remaining relevant) Mac developers to invest in porting their code over to ARM and likewise, because these new machines run iPad apps at full speed on up to 6K displays, there is incentive for the iPad/iOS-only devs to expand the functionality beyond what their wares can do on a tablet/phone. (Any Mac dev that drags their feet porting may find that there are 50 iPad apps that now run fullscreen performing 75% of their functionality, costing them sales in the big volume accounts where they buy licenses by the thousands.) Meanwhile, the type of users who can get by with two USB ports, 16GB of RAM and a single external monitor probably don't run many third-party Mac apps and are going to have an awesome experience with the iPad apps and Apple's native apps.
GB deliberately avoids running up the heat because it is focused on testing the chip, not the machine's cooling ability.
Cinebench, as you say, tests "real-world" conditions, meaning the entire machine, not just the chip.
In a majority of cases, burst performance only affects things like responsiveness, and those things should be measured instead for a better reflection of the benefits.
For example, if someone thought M1 was thermally constrained, they might decide to rip mini out of the case and attach a different cooling method.
> they might decide to rip mini out of the case and attach a different cooling method.
99% of customers will never do this.
"Geekbench 5 is a cross-platform benchmark that measures your system's performance with the press of a button. How will your mobile device or desktop computer perform when push comes to crunch? How will it compare to the newest devices on the market? Find out today with Geekbench 5"
In this view, it's entirely possible that the Air simply did not have time to throttle before the benchmark ran out.
It's a useless benchmark. What I want to see are things like time to compile a bunch of different software: tasks that take long enough for the processor/cooling to reach thermal equilibrium, etc.
I.e. stuff that more closely matches the real world.
It's really only intended to be one of many benchmarks that tell the whole story; of course Linus would attack it, because it doesn't make any real sense for his use and isn't the full story for him. If Geekbench-style burst testing were left out, benchmarks would not cover the majority of computing uses and would weigh CPUs that have poor turbo or burst performance unfairly high for most uses.
Geekbench is kinda like 0-60MPH times and other tests (like SPEC2006) are like top speed I guess? The whole story is between them.
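The burst-versus-sustained distinction is easy to see with a toy model. All the clock speeds and the thermal budget below are made-up illustrative numbers, not measurements of any real chip:

```python
# Toy model of burst vs. sustained performance. A chip runs at its
# boost clock until a fixed thermal budget is spent, then throttles
# down to its sustained clock. Numbers are invented for illustration.

def average_clock_ghz(duration_s, boost_ghz=3.2, sustained_ghz=2.4,
                      boost_budget_s=120):
    """Average clock over a run of `duration_s` seconds: full boost
    until the thermal budget runs out, then the sustained clock."""
    boost_time = min(duration_s, boost_budget_s)
    throttled_time = duration_s - boost_time
    total_cycles = boost_time * boost_ghz + throttled_time * sustained_ghz
    return total_cycles / duration_s

short_bench = average_clock_ghz(60)     # Geekbench-style burst run
long_bench = average_clock_ghz(1800)    # Cinebench-style sustained run

print(f"60s benchmark sees {short_bench:.2f} GHz on average")
print(f"30min benchmark sees {long_bench:.2f} GHz on average")
```

Under these assumptions a one-minute benchmark never leaves boost at all, while a half-hour run spends most of its time throttled, which is the 0-60 vs. top-speed distinction in miniature.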
Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?
- locked bootloader - no bootcamp - can't install or boot linux or windows
- virtualization limited to arm64 machines - no windows x86 or linux x86 virtual machines
- only 2 thunderbolt ports
- limited to 16GB RAM
- no external gpu support/drivers - can't use nvidia or amd cards
- no AAA gaming
- can't run x86 containers without finding/building for arm64 or taking huge performance hit with qemu-static
- uncertain future of macos as it continues to be locked down
Genshin Impact is a great game that is on iOS in addition to "real consoles". Oceanhorn 2 is an amazing game that was originally on Apple Arcade and brought to Nintendo's "real console".
There is also quite a number of ports that I think you aren't aware of.
It's like calling yourself a programmer because you can set a timer on your VCR. (dated but still accurate)
You think "real games" are for "hard triers only", but it's just your point of view.
GPU was not the issue here.
Mac users who hope to play anything from their steam library or dual boot Windows are going to be very disappointed.
Buyers don't especially care about performance either to be honest, unless they care about one of those factors in order to need it.
This is, arguably, a disadvantage of any Mac.
But Apple Silicon may actually improve the situation over time, as having the same GPUs and APIs on Macs and iOS devices means there is now a much bigger market for game developers to target with the same codebase.
But on the whole I am optimistic.
The only issue might be multi-touch-based games on the M1.
Not really. The business models for desktop gaming are completely different to mobile devices, and there is no meaningful common market.
I think people will actually be surprised at how few games from iOS will even run on an ARM Mac because developers will block them.
It used to be possible to do some gaming on a Mac - the vast, vast majority of Steam users have graphics hardware of a level that was perfectly achievable on a Mac, especially with an eGPU. The end of x86 is the end of that market, forever.
Exactly. So it was never really the hardware that held back gaming on Mac, but the fact that from a game-development perspective it's an esoteric platform that has limited or no support for the main industry standard APIs (DirectX, Vulkan, etc).
It was never worth the effort for most game developers to bother porting games to the Mac because writing a custom port for Metal was way too expensive to justify for such a niche market.
But now with Apple Silicon, that all changes. If you're going to port your game to iOS (and that's surely tempting - it's a huge platform/market with powerful GPUs and a "spendy" customer base) then you basically get Mac for free with close to zero additional effort.
I think it's more that gaming wasn't held back on the Mac; it's just that Boot Camp was much more common than people think.
> If you're going to port your game to iOS (and why not? It's a huge platform with powerful GPUs and a huge, "spendy" market)
Because mobile gaming and desktop gaming have very little in common. Note that Nintendo didn't port their titles when they released iOS games, they made new games. Users want different experiences, and flat ports of successful console gaming titles to iOS tend to fail. There are, all told, very few ports of successful PC/console games to iOS, and those that exist tend to be brand reuse rather than literal ports.
> then you basically get Mac for free with close to zero additional effort.
Not even remotely. The way you secure your service has to be totally different, the UI paradigm is completely different, you have to cope with totally different aspect ratios etc etc. It's significant effort, and it will be very hard to justify for most game studios. It's certainly more work in most cases than porting a Windows game to MacOS was when using a mainstream engine, and that was not a huge market.
2) You have to rebuild the UI, which costs money which the Mac version may well not recoup.
3) You have a different version for desktops that costs more upfront and relies less on in-app mechanics that you don't want to undermine.
OK, but that's no different to Windows and Android.
> "You have to rebuild the UI"
No. Even with apps this is no longer the case (see: "Mac Catalyst"), but it's certainly not true for games. Maybe you'd need to add some keyboard/mouse bindings, but that's about it. Even iPads support keyboards and mice nowadays!
Apple detractors LOVE to bring this idea up, but there's nothing to it in any real sense. Do Macs ship with a checkbox filled in that limits software vendors? Yes. This is a good thing. Is it trivial to change this setting? Also yes.
Anyone who buys a Mac can run any software on it they like. There is no lockdown.
I don't care that I can't run Linux on my Mac. If I wanted to run Linux, I'd have different hardware.
Of course, Apple as an OEM does not support running non-Mac OSes, so virtualization should still be preferred for most use cases.
They had crappy code-signing policies (only Store apps on Windows RT tablets), which guaranteed poor adoption, but that was a policy decision, not a technical one.
Although instead of lasting 1 year they only last 7 days, but there is no fee for a user to sign and install their own binaries.
To clarify, on iOS: does the app erase itself after 7 days? Or is it something like you can only install an app for 7 days after downloading/using Xcode?
macOS has plenty of warts, but my experience with high quality equipment (Thinkpad, XPS, Alienware) has left me ultimately disappointed with Windows in many day to day situations compared to Mac.
Windows is still clunky, despite many improvements. And aside from a Thinkpad Carbon X1, I haven't used any laptop with the performance and build quality (for the size/money) as a Macbook Air.
For travelling, I don't think anything beats a Macbook due to how light, thin, and resilient they are. But my 2016 MBP is a pretty shit machine for its price. It's also loud (like every other laptop I've had). I avoid using it. Sure, if you take size/design/mechanical quality into account, it is probably unmatched. But for 95% of my computer usage, those are irrelevant, as I just sit at my desk. I had a company provided HP laptop (not sure if stock or upgraded by our IT staff) at my previous job which was far more performant than my Macbook, so I don't really agree that Windows laptops are necessarily bad, but it was even louder than the Macbook, and of course clunky and ugly.
For me personally, the new Macbooks are disqualified as viable work machines if it's really true that you can't use more than 1 external screen. That's just not a viable computer for me (for work). I will always have a Macbook though just because of how much I love them for travel. But a Macbook is more of a toy than a serious computer, especially if the 1 screen limit is true.
Unfortunately they will also blow your wallet.
The iMacs are a mystery to me, but I guess I'm not the target market anyway. (I have a 2018 MBP)
It's not even a contest: spend $3000 on an AMD + Nvidia PC and it's significantly more powerful than the $5000 Mac Pro in both CPU and GPU compute.
When my current Mac dies, that's where I'm headed, but running Linux; Microsoft is less of a danger, so I don't outright boycott anymore, but I still find Windows super annoying to use.
As for BIOS (well, EFI these days), that should be handled very seamlessly via fwupd on all major Linux distros.
(Frankly, it seems much more robust than how it is handled on Windows: not at all, or via half-broken OEM bloatware.)
I understand that this may be because PC touchpad hardware reports jitter, sometimes higher than it really is, and this causes the Precision Touchpad software to increase the hysteresis. Macbook touchpads have low jitter and the driver is tuned to benefit from it.
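For what it's worth, the jitter-vs-hysteresis trade-off can be sketched in a few lines. The threshold values and sample data below are invented for illustration and have nothing to do with Microsoft's actual Precision Touchpad tuning:

```python
# Minimal sketch of a hysteresis (dead-zone) pointer filter: motion
# smaller than the threshold is treated as sensor jitter and dropped.
# A noisier touchpad needs a bigger threshold, which also swallows
# small intentional movements. All values here are illustrative.

def filter_positions(samples, threshold):
    """Emit a new position only when it moves more than `threshold`
    away from the last emitted position."""
    out = [samples[0]]
    for x in samples:
        if abs(x - out[-1]) > threshold:
            out.append(x)
    return out

raw = [0.0, 0.2, 0.1, 1.0, 1.1, 3.0]  # jitter, a small move, a big move

low_jitter = filter_positions(raw, threshold=0.5)   # low-noise tuning
high_jitter = filter_positions(raw, threshold=1.5)  # noisy-hardware tuning

print(low_jitter)   # the small intentional move at 1.0 survives
print(high_jitter)  # only the big move survives; feels laggy/imprecise
```

With the small threshold the move to 1.0 gets through; with the large one it's discarded as noise, which is one plausible mechanism for the "dead" feel being described.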
If anyone at Microsoft with input into the Precision Touchpad reads this: why don't you fix it, or work with your licensees to fix it?
I.e. "We raised the walls on our garden further"
Balls to that, if I buy hardware I want to be able to run what I want on it or it's not a general purpose computer, it's something else.
This has been a claim made about the Macs since the T2 chip came out. It was strictly false then (you just had to boot into Recovery Mode and turn off the requirement that OSes had to be signed by Apple to boot) and we still don't know for sure now. Apple has stated in their WWDC that they're still using SecureBoot, so it's likely that we can again just turn off Apple signature requirements in Recovery Mode and boot into ARM distros.
Whether or not that experience will be good is another thing entirely, and I wouldn't be surprised if Apple made it a bitch and a half for driver devs to make the experience usable at all.
>- virtualization limited to arm64 machines - no windows x86 or linux x86 virtual machines
True, but this isn't a strictly unsolvable limitation of AS and more like one of those teething pains you have to deal with, as it is the first-generation chip in an ISA shift. By this logic, you could say that make doesn't even work yet. Give it some time. In a few months I expect all of these quirks to be ironed out. Although, I suppose if you're concerned about containers it sounds like you want to be in the server market, not the laptop market.
>- only 2 thunderbolt ports, limited to 16GB RAM, no external gpu support/drivers, can't use nvidia or amd cards, can't run x86 containers without finding/building for arm64 or taking huge performance hit with qemu-static
See above about "give it some time".
>- no AAA gaming
I mean, if you're concerned about gaming, you shouldn't buy any Mac at all. Nor should you be in the laptop market, really. Although, this being said, the GPU in the new M1 is strong enough to be noted. In the Verge's benchmarks, Shadow of the Tomb Raider was running on the M1 MacBook Air at 38FPS at 1920x1200. Yes, it was at very low settings, but regardless – this is a playable framerate of a modern triple-A game, in a completely fanless ultrabook ... running through a JIT instruction set translation layer.
>- uncertain future of macos as it continues to be locked down
I disagree. I know we were talking about the M1 specifically, but Apple has shown that the future of ARM on desktop doesn't have to be as dismal as Windows made it out to be. Teething pains aside, the reported battery life and thermal performance on the new AS machines have been absurdly fantastic. I think, going down the road, we'll stop seeing x86 CPUs on all energy-minded machines like laptops entirely.
I thought Google, Microsoft, Nvidia, etc. were all pushing streaming gaming services that will run on any hardware with a decent internet connection. I would imagine the hardware video decoder in the M1 chip would allow 4K streaming video pretty well.
There are enough people who do not want to deal with MacOS and Darwin regardless of the hardware specs.
Also, the path of least friction is usually to use whatever the rest of your team uses. There are even relevant differences between Docker for MacOS and Docker for Linux that make cross-platform work difficult (in particular I'm thinking of host.docker.internal, but there are certainly more). Working with C/C++ is another cross-platform pain point, which already starts with Apple's own Clang and different lib naming conventions.
Going away from x86 certainly doesn't make this situation better.
A walk in the park for anyone that has had to deal with coding in C or C++ across UNIX flavours.
Toy projects don't count.
That said, my wife returned the macbook air she bought 3 weeks ago in favor of this new one, so I'll be able to test on that machine before I dive in.
Mark my words, this is going to be a massive shit show for people using those ecosystems, for 5 years if not 10. It already happened with the PPC transition.
“Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current-gen Intel, and ~6.5 nanoseconds on an M1”
“…and ~14 nanoseconds on an M1 emulating an Intel”
We still can't emulate some 20-year-old machines at full speed on modern hardware due to platform impedance mismatches. Rosetta2 may be good, but until someone runs a DAW on there with a pile of plug-ins and shows a significant performance gain over contemporary Intels (and zero unexpected dropouts), I'm not buying the story of Rosetta2 amazingness.
Edit: And Apple has already discussed how Rosetta2 handles complexities like self modifying code. It probably won’t help with performance but the M1 has a lot of power to keep even that code running fast.
But more importantly video/audio apps aren’t going to be using Rosetta2 for very long. 99% of code written for x86 MacOS is going to be a simple recompile to native, if not more. Not going native when your competitors did and got 2-5x faster is corporate suicide.
If you read my parent comment you'll see how DAWs are going to be using Rosetta2 for years to come, maybe even a decade, for many people. Even if there are ARM versions, you won't be able to use them until all your dozens if not hundreds of plug-ins, some of which won't be actively developed any more or will require a re-purchase for an ARM version, have also migrated.
People invested in such ecosystems aren't just going to up and give up half their collection of software, or spend thousands re-purchasing upgrades to get ARM versions.
You're also going to be in a bind if Apple decides they don't care about the long tail and stops supporting emulation before all of your plugins have been converted (if they ever are).
There is an exception for apps with JIT and those will perform poorly (think Chrome and every Electron app).
Just because binary translation is used doesn't mean it's magically as fast as native code. Converting code that runs on architecture A to run on architecture B always has corner cases where things end up a lot slower by necessity.
Nonetheless, the translated code is going to be slower than ordinary native code because a lot of the information compilers use for optimization isn't available in the resulting binary, so the translator has to be extremely conservative in its assumptions.
And most use electron-builder which does not have Mac Arm support. Expect super slow mode for a while!
as a power user I will not be touching anything apple ARM until all my hundreds of software apps are certified to work exactly the same as on x86_64. i will not rely on rosetta to take care of this. i need actual testing.
besides this, 8GB of RAM is how much a single instance of chrome uses. i run 3 chrome instances, 2 firefox and 2 safari. and this is just for web.
this could be a good time to jump the apple ship. it's pretty clear their focus is not their power users' focus.
as such i was looking into a lenovo thinkstation p340 tiny.
you can configure it with 64gb ram and core i9 with 10 cores and 20 threads for less $$$ than what an underpowered 6 core mac mini is selling for.
Apple is at day 1 of their two year migration to Apple Silicon. Your judgement seems not just a little premature.
I think many professionals who need new hardware will use this as the catalyst to make them move back to PC hardware. The M1 looks amazing, but I need more than just Apple software to do my work. It’ll be a while before all the things I use get migrated.
“two year migration” sounds just about right for a transition to something non apple.
we can then re-visit apple in 3 years time.
Their focus is not on power users? They just completed the first, small step of the migration to ARM. They only updated the very low-end models, the ones that were never targeted at power users anyway, and we're seeing that their cheapest, lowest-end models are whooping the i9 MBPro's ass.
Sure, the features and RAM may not be there yet, but again, these are the low-end models. If we're seeing this level of performance out of an MBAir or Mini, I can't wait to see what the Mac Pro is going to be capable of.
The big screen model might give you more cores and RAM but IPC is going to be exactly the same.
> but IPC is going to be exactly the same.
I am not sure what you mean with this?
And the problem with the M1 isn’t performance; single core is already off the charts. The M2 is going to provide 32GB and 64GB systems with up to four Thunderbolt/USB4 ports and support for dual 6K monitors.
Let alone multicore performance. Apple's cores are also far behind in I/O: 64GB of RAM and 4x Thunderbolt is less than what current-gen laptop chips can do.
The M1 is a system on a chip, with all the benefits and drawbacks of that including RAM and port limits.
The next releases will likely be
A) a tweaked M1 for higher end PowerBooks with more RAM and ports and
B) a desktop version with plenty of ports, significantly higher clock speeds, and off chip RAM.
I think there will always be faster CPUs out there, but not remotely near the M series in performance per watt and performance per dollar.
Most importantly, Zen 4 is a chiplet design, so for the same amount of cores it will be cheaper to make than the M1 chip.
As for performance per watt, Renoir in low power configurations matches the A12. I would really doubt that a laptop Zen 4 on 5nm LPP wouldn't pass the M1/M2 in both performance and performance per watt, because Renoir is on 7nm with an older uArch and gets close.
Depends on the definition of "power user". Music producers, video editors, and iOS developers will be served quite well.
> lenovo thinkstation p340 tiny. you can configure it with 64gb ram and core i9 with 10 cores and 20 threads for less $$$ than what an underpowered 6 core mac mini is selling for.
When making that calculation, one should also take power consumption into account. $ per cycle is very low now with the new CPU.
(Of course, power savings are important in their own right for mobile / battery-operated use cases.)
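As a rough illustration of that running-cost calculation (the wattages and electricity price below are assumptions picked for the sake of the arithmetic, not measured figures):

```python
# Back-of-the-envelope running-cost comparison. The load wattages for
# an M1 Mac mini and an i9 desktop, and the electricity price, are
# all assumed values for illustration, not measurements.

def annual_cost_usd(watts, hours_per_day, usd_per_kwh=0.13):
    """Yearly electricity cost of a machine drawing `watts` for
    `hours_per_day` hours every day."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

mini_cost = annual_cost_usd(30, hours_per_day=8)    # assumed M1 mini draw
tower_cost = annual_cost_usd(150, hours_per_day=8)  # assumed i9 tower draw

print(f"Assumed M1 mini: ${mini_cost:.2f}/yr, i9 tower: ${tower_cost:.2f}/yr")
print(f"Difference: ${tower_cost - mini_cost:.2f}/yr")
```

Even under these generous assumptions the difference is tens of dollars a year, so electricity alone rarely closes a large purchase-price gap; the bigger wins are battery life and thermals.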
Low RAM is still an issue with such fast SSDs, as someone who ran RAID0 Gen3 NVMe SSDs (so equivalent to what's in there).
Let's back up a second: Tim Cook said this transition would take place over two years. This is just the first batch of computers running Apple Silicon.
I certainly hope and think that Apple can come out with a beefy 16 inch MacBook Pro with 32 gigs of ram within the next two years. Also, in that time I imagine everything in Homebrew would be ported over natively.
So for things like software development, where you frequently compile your projects, the new Apple computers are a little slower than similar computers with AMD CPUs.
So even when taking only CPU performance into consideration, there are reasons to choose other computers than those with Apple Silicon, depending on what you want to do.
Of course, nobody will decide to buy or not buy products with "Apple Silicon" based on their performance.
Those who want to use Apple software will buy Apple products, those who do not want to use Apple software will not buy Apple products, like until now, regardless which have better performance.
That's exactly the reason why you would choose Apple Silicon right now, where you can choose between Intel and Apple SoC. There are of course other reasons such as battery life and price.
The x64 options from Apple are also uncompetitive with existing PCs already because they're using Intel processors when AMD's are faster.
There will be a long tail of edge case software that runs in emulation, but that won't affect the majority of users.
You also have the problem with proprietary software that even if a port exists, it's not the version you have an existing license for, and you may not be able to afford a new laptop and all new software at the same time.
I'm not an Apple fanboy, and I'm still very displeased with many of their decisions (touchbar being #1 on MBPs). But if you consider the packaging (small, light, sturdy, now-decent keyboard), and consider their performance, and then consider macOS, I think they are more than competitive.
Even if you match every spec, including size/weight and durability, it comes down to Windows vs macOS. Ironically, macOS is free while Windows is not, but macOS is worth more (to me and many others).
If you're only looking for computers that are comparable according to the usual hardware specs (cpu, ram, etc.), a Mac costs 25-50% more than the cheapest comparable PC.
If you also throw ergonomic factors like weight and battery life into the comparison, there's no price difference.
(This was USA prices.)
what laptop are you buying where you need to purchase a Windows license?
Or if you buy a bare system or build your own, you need to buy Windows yourself.
Apple gives their OS away, but in theory you can only run it on their hardware.
If you don't understand why this isn't free then I have a bridge to sell you.
- crappy webcam,
- no built-in SD card reader (a 1TB SD card is ~$200, and my music does not need to be stored on an expensive SSD)
- MagSafe... if this were the only downgrade, I'd upgrade, but TBH I love MagSafe on my Mac and I would miss it if I upgraded.
Oh wow, that's cool, I didn't know that. Do you have a link to where I can download the free edition of macOS? Google doesn't seem to be helping me.
That's true of many other common goods worldwide. Unless you can buy a locally made item in a lower purchasing power country, you will usually pay a currency exchange equivalent price for the item. Actually you often pay more because the local shop selling the product cannot get bulk pricing and pass along the discount to you.
Finally, when you add the local taxes - 23% in Portugal, for example - the price can be much higher compared to Alaska, US (< 2%). That last bit is really not Apple's fault.
I'm not an Apple fan, but the change in value is stunning. I don't need a new laptop currently...
Plus there's the brouhaha about Electron apps.
I for one really wouldn't mind if Apple would build a native app to replace Electron apps, e.g. a chat app that works as a good client for (what I have open right now in Rambox) Discord, FB Messenger, Whatsapp and multiple Slack channels. Or their own Spotify client. Or a big update to Xcode so it can use language servers the way VS Code does, making it viable for (what I do right now) TypeScript, PHP and Go development.
They have more than enough money to invest in dozens of development teams or startups to push out new native apps.
One day I'll switch away from Chrome in favor of Safari as well. Maybe.
(I am taking recommendations for native alternatives to apps)
Use Apple Music, Messages, Safari, Swift if you want first-class support.
Or one of the better options now might be to use the iOS apps for Slack, Spotify etc.
I guess there will still be issues for people who need to run VMs or media apps like Adobe CC etc, and also it will take a while for some dev environments to be fully supported (https://github.com/Homebrew/brew/issues/7857 for example shows it will take some time to get to feature parity).
Overall though, a lot of the hard work has already been done, and I'm sure in 2 years' time, or whenever the transition is 'complete', Mac owners will be getting much more value for money with few drawbacks (the main one being higher walls around the garden).
They don't have desktop UIs, and will be a big step down for most users. You can't seriously argue the UI doesn't matter on a Mac.
Won't this be handled by just porting V8 to the M1?
Unfortunately, although applications like that exist, they're not the common case.
X86 code translated by Rosetta2 on the M1 retains/releases objects TWICE as fast as native x86 processors.
> Rust just brought their arm support to the tier 1 support level
So plenty of enterprise-class JVMs available.
This is an early access build from today https://github.com/microsoft/openjdk-aarch64/releases/tag/16...
FWIW, I originally thought your mention of Azul was a typo, so I parsed your comment as "Azure and Microsoft" before I realized the tautology, which was why I posted the question. I didn't realize that Azul had pivoted to be a software-based vendor of the JVM.
For me ATM that's a dealbreaker... but I still want one.
$1200 for the Macbook Air with 16GB RAM in USA. No touchbar, no garbage keyboard.
Another thing is that you can buy a "gaming" laptop for $999. Something like an i7-10750H with a GTX 1650. And it's powerful enough to run almost any game on high to medium settings. The Apple GPU is awesome compared to an Intel GPU, but compared to a dedicated Nvidia GPU, not so much. So if you need a GPU for gaming, that's another area where Apple does not compete with their new laptops. At least for now.
Ultrabook with focus on portability and long battery life - Apple is awesome here.
Exactly that, I think that's the ultimate reason to have a laptop, and if not, it might make sense to re-think the setup. Why should I buy a $1500 Intel/AMD mobile workhorse when the battery is empty after 2 hours? It usually makes more sense to have a server at home or a VPS for that. Also, a lot of native apps like Steam have first-class support for that nowadays. For the rest, Parsec might work.
But it's not really an Apples to Apples comparison.
In raw performance per buck you could always get a custom PC setup for cheaper, especially in desktop form.
In some countries, even a $300 laptop comes down to half a year's salary...
> Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?
Apple devices are definitely not priced competitively outside first-world countries.
You might be better served by wiping it and installing Linux though
iPad pro - the current 2020 gen iPad pro has A12Z (essentially the same chip as 2018 A12X with extra GPU cores) - significantly older chip than A14. I think there will be an A14 iPad Pro refresh with mini led display in early 2021.
I see that statement a lot, and yes, at some point that is going to happen.
But the analysis seems to fail to take into account what utterly amazingly low power devices these chips are. So while it will happen, it might take a long time.
it looks like they're all in the same ballpark (i.e. the Air is not leading others, just comparable).
I also imagine not all customers are ready to jump on ARM day 1. Some will want to wait until the software ecosystem has had time to make the transition.
Seems pretty obvious to me that there will be another, higher-end variant of the M1, though maybe the only difference will be the amount of RAM, the number of GPU cores, the number of supported USB4 ports, or something like that, not raw CPU performance.
Either way, it seems obvious to me that the M1 is their low end Mac chip.
That will be interesting to watch.
Server farms are going to switch rapidly, one leading Mini server farm just announced a 600 unit starter order, and the CEO noted that Big Sur also made significant changes to licensing to make its use in server farms easier.
Apple released a killer low end SOC in the M1. It contains the highest performance single core processor in the world along with high end multi core performance. But it’s limited to 16GB and two USB4/Thunderbolt ports, so it’s targeted at the low end.
When the M2 is released mid next year, it will be even faster, support four USB4/Thunderbolt ports, and will also come in 32GB and 64GB versions.
Greatness takes a small wait sometimes.
Where I can be wrong is that Apple could release two chips. First an upgraded M1, let’s call it M1x, that supports a bit more on-chip RAM (24 or 32GB) and four ports. It would be only for high-end MacBook Pros and again optimized for battery life.
And they would release an M1d for desktops that has more cores but moves RAM off-chip. That would improve multicore performance, but I don’t know how much it would hurt single core with slower memory fetches. Probably they could compensate with higher clock speeds, power budgets, and more active cooling.
I wouldn't buy a Pro now because I would wait for the next version, but I wouldn't trade a current Pro for a new Air just for the CPU bump...
Has an interesting comparison of an iPhone 12 mini doing similar work to an i9 iMac
Now, I haven't dug into the details to verify both produced the same results. I believe most of the difference is from software encoding versus hardware encoding; the follow-up tweets suggest similar output.
It does show how workloads can cause people to jump to conclusions from a single test, without all the details to support the conclusion they want to arrive at.
iPad Pro is still on an older generation of SoC (A12Z), while the iPad Air just got the new A14.
Well yeah, every year for the last bunch of years the A series of chips have had sizeable IPC improvements, such that the A12-based iPad Pros are slower than the new Air. Apple's chip division is just industry-leading here.