I guess the 12" MacBook was designed around a promise from Intel to deliver processors that they never could, due to the issues with their 10nm process. This is where an ARM-based design built around the iPad processor would be an instant win performance-wise.
But it is also great news if Apple really uses the costs saved by going with their own CPUs to offer the machine at a lower price. Suddenly it can compete with a lot of branded laptops on price alone. With the additional speed from the processors, it should be quite a good general purpose laptop. And we know how nice the 12" MacBook was from a design perspective.
For the consumer market, the ability to run iOS apps becomes an additional huge benefit. For more and more people, the smartphone is the default computing platform; especially for younger people, the smartphone might be the device through which they get into computing at all. So when buying a laptop to go along with your smartphone, a Mac running Apple Silicon has a huge benefit, as it integrates more seamlessly than ever with your smartphone.
I would expect the Mac market share to get quite a boost as a consequence.
While the MacBook is the obvious machine to equip with Apple Silicon for the reasons listed above, I would also expect Apple to hit the market with a true developer machine, so that developers can not only test their software on Apple Silicon but make it their development platform right from the start.
They are utter garbage; there isn't a single good Catalyst app, they're all putrefactive shit on the Mac. The JIRA app — which Apple demo'd on stage when they unveiled Catalyst — is hands-down the worst app I have ever used on the Mac, and I've been using the Mac since I was like 13 and my mom had an SE30, and I also spent 2000-2008 as a developer of Mac apps (iGet, iGet Mobile, and worked on some others).
SwiftUI is still kind of painful, but way less painful than it was at launch. SwiftUI is an interesting new paradigm, and it really does seem that it will be good someday, possibly even soon. It's already awesome if you happen to not hit the (many) areas where it is still lacking.
Catalyst will never be good, and Apple will kill it as soon as they possibly can.
It's the legacy Intel Macs that need Catalyst.
This is just like the Carbon/Cocoa issue back in the original OS X.
5 years from now it will be obvious to everyone that SwiftUI has been a huge leap, but with some normal transitional bumps along the way.
The guys on the ATP podcast talked about this a couple weeks ago (last week maybe?) and it’s a great conversation - I highly recommend it.
Microsoft certainly attempted to create an Android subsystem for Windows, it just didn't work.
I see it more as a stepping stone. It's difficult to imagine how a UI designed for iOS would ever not feel strange on a laptop. But SwiftUI seems like a great tool for sharing large amounts of code between platforms, with some branching to cater properly to each platform.
However given the current dominance of web interfaces for desktop software it's hard to imagine iOS apps becoming relevant on mac anytime soon aside from a few special cases.
Maybe, but I'm not totally convinced it's enough. For small apps which are currently only on iOS this might be interesting, but for the large pool of applications which currently target web, iOS and Android, doing extra work to support a Mac branch of the application is still just that: extra.
Those users are already supported by your web application, and even if it's relatively less work than creating a separate native mac application, adding a mac branch of your iOS app will still add effort and complexity to implement and maintain. And that would be to serve what's probably a tiny minority of your active users.
What if Apple offers a touch screen? Then it becomes essentially an iPad with a keyboard.
Edit: imagine lifting your hand from the table and poking the screen for several minutes. The screen wobbles each time. Your hand gets tired from being suspended, but putting it back down is almost too much work. Apple might invent something, but there is only so much to be done with certain form factors.
You may want to try one in a store, I personally changed my mind on the 2-in-1 form factor after using one for a few minutes.
The main issue is that Windows 10 is quite bad at dealing with the transition between laptop and tablet mode. Similar to when you connect an external display, the screen blinks a few times and everything resizes hysterically, so you have to wait a few seconds for the UI to transition.
Edit: the touch display is just way too practical when you want to quickly scroll a page or touch a button located on the opposite side of the screen. IMHO it doesn't have to be something you use all the time to be worth it.
I would use the mouse/pad/keyboard 90% or maybe 95% of the time — but the inability to just tap the button or pinch-zoom or whatever the other 5% of the time... it's just stupid.
All 5-year-olds perceive screens that you can't directly interact with to be broken... and they're right.
xinput disable 11
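For context, that command disables an X11 input device by its numeric id, which differs from machine to machine. A slightly more portable sketch (assuming an X11 session with the `xinput` tool installed; the touchscreen device name below is hypothetical):

```shell
# List all input devices; the touchscreen's name and id vary by vendor
xinput list

# Disable by name instead of a hard-coded id like 11
xinput disable "ELAN Touchscreen"

# Re-enable it later
xinput enable "ELAN Touchscreen"
```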
I don't see how running an iPad app natively on a MacBook would "be difficult to imagine".
I saw someone attempt to play Slither while testing an iPad Pro with a mouse recently. It looked fairly awkward and unnatural.
But I doubt whether macOS on Apple Silicon in the upcoming MacBook would even fulfil all the needs of a generic consumer, let alone developers. Newer macOS releases have been riddled with bugs even on current Macs, and it takes a couple of major updates to fix them (the situation with newer iOS releases has been worse). Many developers have stopped updating their macOS to the latest iteration and are sticking with older versions for as long as the security updates last (or until Xcode support is dropped).
So, IMO it's going to be a while until macOS on ARM achieves stability, and the $800 is the subsidised early-adopter fee to put up with its quirks.
SwiftUI is really nice, but it's incomplete. So you really need an engineer that understands macOS to offer a complete solution for anything that goes deeper than a to-do app.
I don’t really think the connection here is so strong. Having the same ISA means very little.
Personally, I always liked the MacBook Air and MacBook form factors but never purchased them. I went for iPads instead and left power use to a MBP or Mac mini. I could see the MacBook being more compelling if it's the only way to develop, but I think they're moving Xcode to the iPad down the road...
Or maybe they think of the MBA and MB as legacy formats. That could make them not directly competitive at all and potentially complementary.
Under Jobs they also introduced the iPod Nano and killed the iPod Mini even though it was the best selling iPod.
Of course, they introduced the iPhone at the height of the iPod's popularity.
The iPad overtook the popularity and revenue of the Mac while he was CEO.
The 256GB iPad Air is $649; the high-memory models are the high-margin products. And that is already cheaper than an iPhone.
As a matter of fact, I am surprised that everyone thinks $799 is a bargain or great pricing. You're swapping the cost of the camera, gyroscope, and touch screen for a trackpad and an A14X.
I am expecting it to be more like $699, with a 16GB RAM / 512GB version for $899, and a future 14" model being $100 more.
Edit: Actually, thinking about it, maybe that is why the 12" starts at $799; they will lower the price to $699 a year later to make way for a 14".
Typing this from a 16-inch MacBook Pro that cost me $5k, I know Apple loves to make money via the raw sale price of premium hardware. Imagine what power they'll be able to pack in at such a price once they apply Apple Silicon to the MBP. I'm thinking 128 or even 256 GB of RAM, extraordinary processing speed options that are orders of magnitude faster than any other laptop (or even desktop). It's major, major news.
The big question mark hanging over it is the ability to run native Linux. I'm increasingly a Linux user. We'll have to watch this space.
If ARM-based chips could be "orders of magnitude" faster than x86, they'd already be dominating datacenters. The performance improvement in sustained compute (within a power envelope) will be incremental rather than the giant leap you're hoping for.
x86 CPUs do not execute their CISC instructions natively; they get decoded to simpler instructions. But the decoder consumes only around 5% of the power budget, so it's not like x86 is spending significant amounts of energy because of the legacy instruction set. Now, due to a decade of high-stakes innovation in the mobile market, ARM solutions will do significantly better in idle power consumption (which was essential for phones).
So I'd expect way better standby and well-managed low-power states/transitions, but only a minor improvement at full throttle.
But here's a study which arrives at a number between 3-10%: https://www.usenix.org/system/files/conference/cooldc16/cool...
Also, µops aren't "RISC"; they're µops. The only part of the x86 parts I worked in that used a decode ISA was the network & DMA, which desugared down to x86i.
Raw compute performance is the wrong metric for measuring data center worthiness. The correct metric is performance per dollar.
ARM in the DC currently suffers a lack of ARM-specific code optimizers and has to play catch-up in this regard. Amazon recently focused a squad of engineers on adding ARM 64 optimization to the Zend Optimizer in PHP 7.4 to an incremental effect, but then as some cruft is removed on the PHP 8 roadmap, that gain is around 15% in raw performance. Combine that with 20% lower pricing for comparable Graviton2 EC2 instances, and you’ve got a significant cost motivation to migrate to the ARM platform.
Expect AWS to dedicate engineering resources to similarly bring projects like Python, V8, LLVM, etc. into the ARM future.
There’s also the fixed engineering costs for the life cycle of new hardware and for customers performing migration (where relevant).
They put it at somewhere between 3-10%.
I am doing my day-to-day work with Linux in a VMware VM on my MB Pro and couldn't be happier. When running fullscreen, I don't notice any difference to having a "native" Linux install, plus I am getting all the benefits of being able to suspend my VM, having several VMs and of course, switching to the macOS desktop is just a two finger swipe on my mouse. This really has become my favorite compute setup.
The big benefit of using VMware for this is that the necessary "drivers" are open source and have been part of the Linux kernel for quite some time, so you don't need to install custom drivers in your VM any more. Especially with very new Linux kernels, this means you don't have to wait for the VM provider to supply new drivers. That is the one thing I didn't like about Parallels: it rarely supports the latest Linux kernels.
Running Windows or Linux as a VM in macOS has terrible performance in VirtualBox right now (people are discussing it at length on the VirtualBox forums). Parallels, which is expensive, is the only way to make it usable. VMware is maybe OK; I don't use it much and only briefly tried it the other day.
It seems if you want to run Windows or Linux as a guest VM, it's better to make Linux your host OS than Mac. I also increasingly prefer open-source software, so VirtualBox at the very least is desirable. On macOS, that solution is not adequate right now.
I stopped doing VMs and just ssh or Remote Desktop to my Linux and Windows boxes. For a little while I was running a small ESXI server.
That's my experience as well. Running macOS VM is even worse. My understanding is that it's because all of these rely heavily on 3D acceleration for the GUI and VirtualBox's acceleration support has not worked properly for a few years now. It's slightly better if you run VirtualBox in the low resolution mode but still not ideal (and of course the graphics are... low resolution). It's quite obvious that macOS hosts are not a priority for Oracle, I mean, a few VirtualBox releases couldn't even be installed (missing notarizations and one other occasion I cannot remember right now) and Oracle noticed only after people started complaining on the forums...
These days I use VirtualBox, but in headless mode, and I connect to it via SSH and VS Code's SSH remote extension.
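For the curious, that workflow can be sketched with VirtualBox's `VBoxManage` CLI (the VM name `dev`, the username, and the port numbers here are just placeholders):

```shell
# One-time setup: forward host port 2222 to the guest's SSH port 22
VBoxManage modifyvm "dev" --natpf1 "guestssh,tcp,,2222,,22"

# Boot the VM without opening a GUI window
VBoxManage startvm "dev" --type headless

# Connect from the host; VS Code's Remote-SSH extension can target the same address
ssh -p 2222 user@localhost

# Ask the guest to shut down cleanly when done
VBoxManage controlvm "dev" acpipowerbutton
```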
But I'm also exclusively using Linux nowadays, so I guess we'll have to watch this situation unfold for a while to see how viable running Linux on those is. I'm pretty sure it will be possible, but I'm not sure how much pain it will be.
Every now and then I boot up a distro that tends to have quite bleeding edge kernel version like a Tails USB to see how complete the hardware compatibility is out of the box.
Apple's hardware like their trackpad remains superior enough that there seems to be enough interest (like you and me) in seeing it work.
People are basing their OS choices on marginal difference in trackpad quality?
How about the fact that with macOS you're tied to one vendor and their hardware choices (keyboard, ports, etc.)? I don't have a new MacBook (my work machine is 5 years old), but I didn't love the giant trackpad on the one I borrowed.
That means that it either requires more force to click the higher up that pad you go or you have to click more precisely at the bottom of that pad.
That's been a non-issue even before they moved to the artificial click mechanism.
Update: I see from your link it works on 2/8 tested. Better than nothing.
Upgrade prices were never reasonable.
Why would Apple ever reduce upgrade prices just because they might save $100 on a CPU?
And why would you assume that the development of your own inhouse ARM CPU would make such a financial difference? After all, you have to develop a CPU.
So what? Apple reports over 25% profit. Why would you even assume that Apple is doing anything for you? They could easily reduce their profit margin for you. They don't.
Plus it seems that they now need developers to write video drivers, compiler optimizations, etc., which are all additional costs.
The payoff can be big, but so are the risks.
2688x1242 isn’t tiny.
Neither is 4K at 60fps.
Once you start to tinker with your images, the picture changes quite a lot for the CPU/GPU.
I'm interested in why you think Apple will be able to build an integrated memory controller for a laptop that is capable of 128 GB or 256 GB of RAM. Most laptops with Intel were stuck at 16 GB of RAM for years, although more recently that has increased. My understanding is that these high-capacity integrated memory controllers are very difficult to design. Is there something specific to ARM that would make iterating on these easier for Apple than it has been for Intel?
I wouldn't be surprised if Apple took every measure possible to make it impossible to run Linux on their new machines.
Apple would have never gotten any of my money if it weren’t for that.
for the moment, most people i know complain about poor performance of linux vms in mac os, particularly when running docker (which gets pretty i/o intensive when dealing with images).
> at least for now, it is announced that Apple Silicon hardware only boots into signed operation systems
this alone already rules out gnu/linux i guess?
Indeed. "Microsoft <3 Linux" they say. Right...
Even before that, there's the question of how they enumerate devices on the platform, how standard or custom the peripherals are…
FWIW they haven't even invested the tiniest amount of effort into stopping people from installing MacOS on "Hackintoshes". And that practice is probably more widespread than Linux-on-MacBook, and has a more obvious theory of revenue impact.
(It wouldn't be possible to make it completely impossible without "trusted computing" hardware, or, coincidentally, switching hardware platforms to something not available on the generic market. But they could certainly think of a new trick to make life miserable for these users with any point update. And yet I found MacOS offered better support for my random hardware than Linux, without even trying.)
But booting these Macs into anything but Big Sur is not happening, as accessing the hardware without using Apple's drivers could cause product returns.
Heavy as fsck but oh so pretty.
Just don’t drop it.
I also find it impressive that Apple managed to get us all saying “Apple Silicon” instead of “processor” or “CPU” in the first place. It does have a much hipper ring to it.
Credit where it is due, Apple knows how to name.
Many technology pundits fail to understand (or perhaps more to the point believe) Apple's naming choices, with "iPad" being a notable recent-ish example. 
Even many who "get" Apple's product naming can misjudge. I personally was unmoved by the name "iMac" when it was announced in 1998. However, that leading lowercase "i" was a stroke of marketing genius so broad that it is only just now diminishing some twenty years later.
I never fail to notice it--it nearly drives me crazy.
But as far as the “Bionic” chips in the iPhones, I think Apple execs should stay away from pot shops and happy hours
A bit like saying something is "curated" instead of restricted, if you can make something ordinary sound extraordinary you're on to a marketing winner.
includes degreaser to remove fingerprints
The last few Apple laptops I've had for work have been annoying at best, so for home use I wasn't going to replace my old 2013 MacBook Pro with a new Mac, but I am tempted to try an ARM-based MacBook.
The AirPort Extreme was the last router I didn't have to restart every few months. It was stable, rock solid. And you only realise how much better it was once you switch to something else.
I wonder if it would still be based on NetBSD? And the Time Capsule?
Give Ubiquiti a try. Mine has been running for several years and I’ve never had to restart it.
Things might be different now with Wifi6 but there really hasn’t been a single reason to go beyond an AirPort Extreme for 6 solid years.
I was an idiot to want and buy a faster 802.11ac router. I gave away my AirPort Extreme to my friend and bought an ASUS.
It was faster, with slightly better coverage/reception. But in the end I discovered I value stability over absolutely everything else.
I can't remember what problem I had with Ubnt when I had one set up in my friend's house.
You are running a 2013 Macbook Pro; there is no clear benefit for you buying a Macbook from 2020 with or without ARM at all.
It's absolutely unreasonable, if you are already running a MacBook from 2013, to expect that ARM will give YOU anything extra that's relevant beyond the general upgrade from a 7-year-old device.
Plus all the other niceties, like USB-3, modern bluetooth, better integrated graphics, a faster and bigger SSD, more RAM, and so on.
Oh, and the ability to run future-proof on the ARM version of Big Sur (and run iOS apps, and other things).
It's not even comparable...
If someone with a device from 2013 really waits for the ARM version of a MacBook, that person will not have any advantage over a normal Intel-CPU-based MacBook, because EVERY device that is 7 years newer is so much faster anyway.
Working on my hobby Apple TV app is doable on the 2013 laptop, it's just annoyingly slow. I would hope that the ARM version performs better.
Please don't downvote, I'm just curious.
Apple has always been big on forcing app developers to rewrite UIs to the device, like they did with the iPhone-to-iPad transition. Without touch on Mac, many iOS apps are going to be janky as hell; an incredibly un-Apple thing indeed.
iPad Pros are aimed at professionals, from CAD and 3D to photo editing with the Apple Pencil; those users will care about the screen. And Apple knew and understood why it is more expensive to make low-latency input. At 9ms it is already one of the best in the industry, and I would not be surprised if they continue to work on lowering it.
MacBooks (the non-Pros) are aimed at casual computing users; in terms of customer group they are more likely to overlap with the iPad and iPad Air. But then both the iPad and iPad Air lean more towards consumption than productivity. If they are students, keyboard and mouse is still a much better experience for writing up your homework. Of course you can get a keyboard for an iPad as well.
I've seen lots of iPad Pros in use in business meetings where the user is almost certainly not aware of the screen. In some cases it's because they just want a bigger screen. In others just because they wanted 'the best' iPad and the cost wasn't an issue. (I've actually heard someone come into an Apple store and just ask for the best iPad irrespective of cost!)
Not a huge fan of this. I'm hoping it's not like the ARM ecosystem on Windows, where things are heavily nerfed (Surface Pro X) to the point where you can't do anything that resembles proper development.
Proper macOS, full system access, and the ability to run Docker, golang builds, the Node runtime, Python, etc. natively and locally on the device are a must. A native terminal app (obviously) is a must, with the option of running something like iTerm2. I had a Chromebook before with Crostini and that was a shitty experience for a fanless dev machine. I hope Apple can do it right.
side nitpick: i don't align with the apple haters out there who wanna pick at things like the use of the word 'silicon' by apple.
You’ll get all of that, don’t worry.
Everyone aspires to own a Macbook but either flat out can't afford one or can't justify the price. (I'm in the second camp)
This changes all of that.
As for most "normal" customers, these first Apple Silicon devices are going to be perfectly usable on day one—whether they're spending 99% of their time in a web browser or running some outdated finance software through Rosetta. And natively compiled software is going to come thick and fast over the next few months.
As for timing, with the pandemic flattening demand for semiconductors, Apple will have no trouble getting their supply chains running. I'll be shocked if we don't see Apple Silicon Macs in stores by December.
And developers already have Apple Silicon development machines, see WWDC announcements.
B) the 12 inch macbook is probably my favorite device, though I really hope they keep the top line of keys. It's so thin, the battery lasts forever, and somehow it performs better than my xps 13 running linux (in interaction latency, not computational throughput).
It’s soooo light though, I still love it.
Finally, I think the risks here are far more clear than a new laptop form factor.
As an aside, I have a very hard time believing that Apple would reenter the networking industry. Consumer wireless access points are a commodity product at this point -- it'd be difficult for Apple to present a compelling premium option, especially when many consumers are satisfied with the one that came built into their cable modem.
I suspect it's more of a branding/positioning thing. ARM has an association with low power devices like smartphones, and Apple wants to position this as something new, exciting, and better than x86.
Layout of the A12:
If they made an airport express that did that, I'd probably get it (once we can travel again).
It was all of 19 bucks IIRC.
Apple already trademarked "Carbon", and some others like: Aqua, Bonjour, Cocoa, Instruments, Logic, Metal, New York
I was super annoyed by it, since I read this marvelous piece of work on the Windows Blog years ago.
> Silicon innovation
> silicon partners
> silicon features
> silicon supported virtualization
> with a focus on Windows and silicon
and wait for it
> deep integration between silicon, platform, and hardware
At least nobody would be able to use it anymore without infringing if Apple has trademarked it.
If Apple combines the word with a unique logo or with some other word, they can get the trademark, but you can't get it for the word "silicon" by itself.
Given that the chip in macbooks will be a lot faster than the A12Z, it could very well offer twice the multicore performance.
Now try some math workload on the A12Z, i.e. real-world stuff, and weep. Most of the die space on x64 CPUs is spent on caches and SSE, so if you cut those out to be more power-friendly, you lose performance in high-end applications.
It's not like anything faster than Pentium J is needed for Office, browsing or even front-end development, so in that segment their first gen should be good enough and maybe even better than i3/i5.
However, it will be interesting to see if that A14 big core implements the ARMv9 instruction set, which is said to be imminent. Apple did implement the ARMv8 instruction set much earlier than its competitors, so there is a bit of history there.
ARM teased new features like an up to 2048-bit SIMD implementation with SVE2 and transactional memory for the ARMv9 instruction set back in early 2019, but has mostly been quiet since.
While I think it would have been very convenient to add at least one USB-A port and perhaps a dedicated HDMI port to their machines, I quite agree that it is nice that all the USB-C ports have the same capabilities.
And I don't like LED backlight:
According to the WSJ $75 to $150 per Mac could be saved.
>The new Mac processors will shave $75 to $150 off the cost of building a computer, estimate analysts, who say Apple can pass those savings on to customers and shareholders.
Apple could easily make that saving and more. You can also look at iPad Pro selling price as reference.
And their profit on Mac hardware is only 15-20%. You may price at double build costs, but there is still design, R&D, testing, sales, marketing, distribution, support, warranty costs and admin overhead to be accounted for.
It also requires more energy, which means larger batteries, which means more dollars.
It would also cannibalise the iPad Pro quite a bit, considering the price of those plus a keyboard stand is much higher.
Unless they deliberately give up some of their huge margins in order to get these into circulation ASAP, I don't see this happening. The first ARM Mac will be a premium product for them; I don't think it will carry a budget price. No MacBook has been this cheap since the 11" MacBook Air, which was a really underpowered 1.4GHz / 2GB RAM model.
The iPad Air was a $500+ tablet until they were able to build a $299 model at same margins.
I doubt they'd keep the same margin if they really offer it at $800. I don't think they'd save that much by cutting intel out, especially if you factor in R&D. Pretty sure they're cutting into their margins at that price, but it would be good for them to get ARM in the hands of consumers so that developers have a reason to work on it.
But their pricing isn't going to be greatly influenced by R&D costs; it's going to be driven by hardware costs. Those R&D costs are ongoing expenses, and Apple is always working on new technologies. A rough rule of thumb is that retail is twice hardware costs. Most Intel CPUs cost between $200 and $300 each, while Ax processors in iPads and iPhones cost less than $100 each. Another big factor is that the SoCs for Apple Silicon will integrate lots of capabilities that require separate cards or coprocessors on current Macs, including high-performance GPUs, the T2 security chip, etc.
So it's reasonable to think Apple will be saving $100 to $200 in hardware costs on the Apple Silicon "equivalents" of their current lineup (equivalents in positioning, but faster, slimmer versions with longer battery life). That directly translates to $200-$400 lower retail prices, which Apple will use to drive higher sales volumes. That will spread R&D, design, marketing, sales, admin, and support over more units, supporting spending more to promote and sell them and more on design and R&D in the future.
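That back-of-envelope reasoning can be made concrete; the figures below are the rough estimates from this thread ($200-$300 for an Intel CPU, under $100 for an Ax-class SoC, retail roughly twice hardware cost), not actual Apple numbers:

```shell
# Rule of thumb: retail price is roughly twice the hardware cost,
# so every dollar saved on parts cuts roughly two dollars at retail.
intel_cpu_cost=250   # midpoint of the $200-$300 estimate above
apple_soc_cost=100   # "Ax processors ... cost less than $100 each"

hw_savings=$((intel_cpu_cost - apple_soc_cost))
retail_delta=$((hw_savings * 2))

echo "hardware savings: \$${hw_savings}"    # hardware savings: $150
echo "retail delta:     \$${retail_delta}"  # retail delta:     $300
```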