Their last keynote was clearly a gymnastics exercise to ignore Intel CPUs and dismiss laptop performance while later praising their own chips which power a tablet that has no software ready to use that much speed.
The fact that one of the most secretive companies executes a PR-stunt by providing an exclusive interview to one of the most respected tech outlets only confirms this strategy. Expect similar movements in the following months.
Now, it is only a matter of "when", not "if", Apple will start selling laptops with their chips.
As an aside, this strategy is extremely similar to the one they used when dropping the headphone jack on the iPhone: "leak" the news to a respected outlet, perform damage control before the keynote, and test the market's reaction. When they introduced the headphone-less iPhone, the topic was so beaten up that it got much less attention than a surprise revelation would have.
I’d love the ability to carry all my software and data around in a phone without lugging around a laptop or having to buy a desktop computer. And I’d love to never concern myself with transferring or syncing data again.
And why not let us connect an iPhone to an eGPU for desktop gaming?
> If you could wirelessly and seamlessly use the same device to do that work on a larger screen, powered by that phone, they would absolutely want to push that.
This vision works just as well if the larger screen is powered by its own computer instead of the phone.
See, for example, the movie “Her”.
You say this as if it is an easy problem to solve.
You can’t look up what he thinks, only what he says he thinks.
Apple would be stupid not to be researching this option, just in case. If they ever manage to make it work well, you can be sure that what Cook says he thinks about convergence will change overnight.
For those who need more convincing: look up what Steve Jobs said he thought about products Apple didn’t ship, and compare it with what he said he thought about them later, when Apple _did_ ship them.
A friend of mine is a master real estate broker. He's a good example of someone who can move almost entirely to a Chromebook or an iPad, because nearly everything is now web-based. So it's not just old folks or people just surfing, but regular people doing their jobs. In particular, this guy is someone who has half his browser eaten up by toolbars and buys a new computer every few years "because the old one is too slow." A Chromebook or iPad is a great place to park those people, since they're curated environments and not the free-for-all virus-delivery and identity-theft machines PCs are for the unwary.
And now that Apple's on board with USB-C, which can do video/power/peripherals, it's not inconceivable that a Monitor+USB hub would be all you need for a plug and play iOS productivity station.
It would be a trivial firmware change for the next release of iMacs to support, and an extension of their existing target-display mode.
A: (Scott Forstall) You don't have to. The user just uses things and doesn't ever have to worry about it.
A: (Steve Jobs) It's like we said on the iPad, if you see a stylus, they blew it. In multitasking, if you see a task manager... they blew it. Users shouldn't ever have to think about it.
The point was that using a stylus as intermediary when doing basic interaction with a touchscreen is indirect/awkward and unnatural (the mouse is too, frankly), not that nobody should ever use a stylus for drawing.
Styluses clearly have a big precision (and visibility) advantage vs. tracking a whole fingertip touching/sliding around the screen (i.e. if we compare inherent human capabilities, not specific hardware), but relying on a stylus is also much more prescriptive about acceptable hand movements, and all of the stylus-first mobile devices pretty much suck compared to finger-based multitouch, in practice.
Besides Microsoft, Google has come the closest, but they're not pushing it as convergence.
I think it is the future, but it's going to be a difficult one that requires an excellent launch - at least 3 years off before the next major inroads.
That's more a reflection of the fact that Samsung and Huawei suck at software--Bixby, anyone?
Everybody said the same thing about WiFi.
I used WiFi when it first came out--PCMCIA cards, external stick on antennas, etc. Worked as advertised but nobody gave a shit about them---until you sat in front of somebody and used it. It was almost a virus and spread like one.
Then everybody gave a shit. And look where we are now.
Everything on your phone--everywhere--is the endgame.
The endgame is everything on every device you own. Sometimes you'll use your watch. Sometimes your phone. Sometimes your tablet or laptop and sometimes your big screen TV. It's just differently sized screens that all connect to your data in the cloud.
That is the endgame here. Nobody wants to go swimming with their phone or watch a feature film on their phones or do 8 hours of office work on their phones.
Given the current performance numbers and the switch to USB-C, which allows high-speed connections to multiple types of devices, we may be reaching a point where this is actually feasible.
Obviously, the laptop form factor is missing, but it would be...interesting if the iPad, iPhone, and MacBook were all pretty much the same compute device with different form factors and different battery sizes and possibly different storage sizes. Honestly, the form factor and user experience are going to be the biggest user-visible differences, and those are also the things Apple seems to care the most about (even if they don't consistently get it right!)
Also, why would you want a single physical device anyway? Because hardware is expensive? Sure, maybe. Because you want to keep all your data in one place? That's the purpose of iCloud. Because you don't trust the cloud and want to keep all of your data physically on the same device and still access it from multiple form factors? That's a small fringe of the market that probably wouldn't buy Apple products anyway.
All my files and apps set up the way I want them, on a single device that's with me 24/7. Complete privacy and security because my data never leaves my device (except for backups to a Time Machine drive or iCloud).
I don't have to own multiple computers. I own one computer: my iPhone. There's only one device to set up, update, and maintain. If I want a larger screen, VR headset, eGPU, mouse, keyboard, speakers, headphones, etc., I just pair my phone to one. I can buy different peripherals for my home, my office, etc., but I don't have to buy redundant computers.
Think of the Nintendo Switch.
So you don’t want your data to leave your device for privacy reasons, but you back it up to iCloud?
A tablet is pretty similar, but the dock needs to be behind the screen, which probably makes for a thicker-than-average tablet. Or you just make a foldable phone that folds out into a tablet.
I think it's really interesting what Apple are trying to do (with Marzipan, pro apps for iPad, desktop-level SoCs), but I think they still have a couple of years of work ahead of them. It's not an easy task, especially with so much legacy.
Getting Adobe and Autodesk to rewrite their flagship software for iOS was quite a good win, though I still wonder how those versions compare to the desktop ones.
Now if they could only get Apple to rewrite their flagship Pro software for iOS.
If Apple invests in enhancing CarPlay, it might be a sign that they are scaling to a wider convergence market. If they don't invest or abandon it, then maybe convergence won't happen. They're famous for saying they're not working on projects that they are actually working on, so we have to read between the lines.
After upgrading from an iPhone 6 to an iPhone XR, I've been thinking about how close the latest-gen devices are to traditional computers anyway. The configuration/settings and features are so far beyond the first-gen devices that I think convergence will happen; nobody knows what it will look like yet.
There have been a few party game apps that allow a person to drive the game on their device while the other participants see what is on the TV. I don't think it ever really caught on; perhaps partly because AppleTV was a niche product way back when and partly because it didn't get much advertising.
That sounds like the most un-Apple thing I can possibly imagine. Their ethos was always to build a device that does something incredibly well - a phone that can be a phone but also a desktop computer is everything but that.
And this was during the Jobs era.
That doesn't mean they will ever manifest in real products.
For it to work on the iPhone, they would have to add support for some kind of mouse/trackpad, because the iPhone itself is sitting in the dock - and I don't see Apple doing that in a million years.
Because that would mean Apple doesn’t get to sell you two devices.
But attach a proper(!) keyboard with touchpad to an iPad Pro, put macOS on it instead of iOS, and there's your next MBP ;)
Sure - one device could do everything, but that doesn't drive sales or profit.
It also makes me not particularly eager to buy any of their current hardware.
In the Jobs days they'd promote the practical value of a product. For example, they might say how the new iPod can hold 50k songs, not how big the storage is, let alone the storage type. They'd mention a smaller form factor or improved battery life, not the move from HDD to SSD.
So when they talk about SoCs, cores, GPUs, Intel, or anything else hidden, are they signaling to customers? Maybe it's signaling to investors that Apple is innovating and that ought to translate to profits; maybe it's for the inner geek in all of us?
The dirty trick with hidden tech's performance figures is that they don't directly translate to customer value. As you mentioned, you're dissuaded from buying because the new stuff will be so much better. Maybe it will, but the old Apple would tell you that the new Macs can process video in Final Cut 10x faster, or that you don't have to buy a separate gaming rig, i.e. you can do more stuff, better.
The good part about focusing on what the products can do is that you can't fake it. You can't fake it to [geeks] like me, [non-geeks like] my mom, my kids, investors, etc.
E.g. this from 2015 (!): https://www.anandtech.com/show/9396/samsung-sm951-nvme-256gb...
This from this year: https://www.anandtech.com/show/13438/the-corsair-force-mp510...
If you actually read technical deep dives like AnandTech posts on SSDs and MacBooks alike, you would see that they really do perform identically to what Apple ships... because Apple is shipping industry-standard SSDs. Apple chooses expensive SSD chips, sure, but it's the same NAND everyone else has. Only the most recent generation of MacBooks (since the T2) has actually integrated their own SSD controller, inside the T2, but it's still the same physical industry-standard NAND chips underneath.
You can buy this SSD today. There are others like it, and some, like Intel's Optane SSDs actually substantially outperform anything Apple has put in their computers in terms of IOPS, even though the Optane line hasn't focused on raw sequential throughput yet. Off the shelf SSDs get the much vaunted performance of "Apple's" SSDs (they're just normal SSDs...) being discussed in this thread.
In fact, the SSDs in those Windows computers are as fast as the one I've linked to. The benchmark you were linking to was comparing NTFS vs. APFS under a small-file I/O workload, which NTFS sucks at. It was not benchmarking the SSDs in any effective manner. I am certain that I pointed this out in my comment above! If those laptops were copying a few large files, the performance would have been identical. If those laptops were running Linux, the performance would have been identical.
Look here! https://www.anandtech.com/show/12167/the-microsoft-surface-b...
Scroll to the bottom and tell me what you see! Yes, that SSD is performing as well as your vaunted MacBook!
Apple's iPad Pros are technological marvels. Their laptops' storage systems are not, and you're just deceiving yourself if you think otherwise.
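The small-file point above is easy to illustrate: a workload of many tiny files mostly measures filesystem metadata overhead (directory updates, journaling), not raw SSD throughput, which is why an NTFS-vs-APFS comparison on that benchmark says little about the drives. A minimal sketch of such a workload (the file count and size here are arbitrary, not from any cited benchmark):

```python
# Small-file I/O workload sketch: time dominated by filesystem metadata
# overhead rather than raw SSD sequential throughput.
import os
import tempfile
import time

def small_file_workload(directory, count=1000, size=4096):
    """Write `count` files of `size` bytes each; return elapsed seconds."""
    payload = b"x" * size
    start = time.perf_counter()
    for i in range(count):
        path = os.path.join(directory, f"f{i:05d}.bin")
        with open(path, "wb") as f:
            f.write(payload)
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    elapsed = small_file_workload(d)
    print(f"wrote {len(os.listdir(d))} x 4 KiB files in {elapsed:.3f} s")
```

Run the same loop on NTFS, APFS, and ext4 and the spread between filesystems will dwarf any difference between comparable NVMe drives.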
Sample size of one and all that - but for me it was a selling point. The PowerPC G5 (PPC 970) was intriguing when it first came out. Having been with Intel, Cyrix and AMD systems since the 286 and used them since XT (Intel 8088) days, it was nice to muck around with something new and paired with OSX Tiger it was such a fun world to explore. The move to Intel felt like a bit of a letdown, but by that time I loved OSX and Snow Leopard cemented it as my OS of choice.
Apple desktops/laptops moving to custom silicon would excite the nerd in me. I want competition.
1. It helped them jump from a failing CPU platform to a non-failing one. PPC was not keeping up with x86 anymore, and Apple's two PPC vendors were going in opposite directions because there wasn't enough of a market for CPUs for Macs. (There were even rumors of future Power Macs migrating to a full POWER CPU rather than PPC.)
2. It meant you could run Windows, and hence Windows apps, on your Mac if you wanted to, without CPU emulation. This is still a fairly important use case.
3. It may have also simplified Mac application development, since you didn't have to switch ISAs in addition to switching operating systems. Making matters worse, PowerPC defaults to big-endian and x86 is little-endian.
How does this apply to a potential ARM switch?
1. You can't really say x86 is "failing" if it's still the industry standard, but Apple might believe (rightly so, given the market size of iOS) that they finally have the ability to sustainably outperform x86 with their A-series chips.
Most of the PowerPC bet was that a newer and more elegant architecture would outperform x86 and give Apple a competitive advantage, and while that may occasionally have been true, it was never a huge deciding factor. Intel and AMD kept up because they were able to keep investing in x86. Ironically, Intel themselves also bet that a newer, more elegant architecture would make x86 obsolete, namely Itanium, only for AMD to invent x86-64. Not even Intel could stop the x86 train.
With the rise of mobile devices, ARM now has the same market power as x86, if not more, simply because there are many more ARM-based devices manufactured and sold than PC's. Apple in particular has been able to invest heavily in their A-series chips and has full control of their CPU roadmap and destiny. Perhaps this time, x86 may finally be rendered obsolete. Don't count on it, though.
2. This is really mostly dependent on Apple's strategic priorities. With more and more application functionality moving to mobile and the web, being able to run Windows is less and less important. At the same time, being able to run Linux is more important; for many developers, running a Linux VM in Vagrant or Docker lets us develop in a similar environment to the servers our code will eventually run on. Sure, you can run Linux itself on ARM, and perhaps there will be more Linux distros that support ARM when and if Apple switches the Mac, but it won't actually be the same as the server unless ARM makes serious inroads in the server market.
Maybe they're betting they can surpass x86 enough that they could emulate x86 at respectable speeds. Since they would be migrating CPUs again, they will probably provide a CPU emulation layer again, like they did when migrating from 68k to PowerPC and then from PowerPC to x86. Keeping this emulation layer around would have more of a benefit this time because, after a while, nobody needed to run 68k or PowerPC code anymore. This has never been true for x86 code, and it won't be for a long time, so look for Macs to continue to run x86 code even if Apple switches.
3. I think the A-series is also little-endian by default, and for x86, see above. Maybe Apple is banking on getting more value from cross-platform iOS/Mac apps than from cross-platform Windows/Mac apps. This will probably impact Mac gaming the most, but that's never been a priority for Apple.
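The endianness point above is concrete: the same 32-bit integer has a different byte layout on big-endian PowerPC versus little-endian x86 and A-series chips, which is one of the porting headaches mentioned. A quick illustration using Python's `struct` module:

```python
# Byte layouts of the same 32-bit integer under both byte orders.
import struct
import sys

value = 0x01020304

little = struct.pack("<I", value)  # little-endian, as on x86 / A-series
big = struct.pack(">I", value)     # big-endian, as on classic PowerPC

print(sys.byteorder)   # byte order of the machine running this
print(little.hex())    # -> 04030201
print(big.hex())       # -> 01020304
```

Code that serializes structs to disk or the network without fixing a byte order is exactly the kind of thing that breaks in an ISA switch; moving between two little-endian ISAs (x86 to ARM) sidesteps that class of bug.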
I went back to remind myself of the details of the switch from PowerPC to Intel. It was announced at WWDC 2005 (June), when they released a developer transition kit. The announcement included a commitment to ship computers running x86 by WWDC 2006, so there was pretty much 12 months of lead time even for outside developers. Apple also committed to moving fully to Intel by the end of 2007, a 30-month total process.
I think Apple is further ahead of the game this time around in how quickly they can go from announcement to shipping product. OS X had been running internally on x86 for years; this time, tons of Apple software has been publicly running on AArch64 for many years. I do think they need more than 30 months to complete a transition this time around, as the user base is much larger, and Apple may never want to invest the serious dollars it will take to build the giant chips they get from Intel. It's one thing to swap out the MacBook processor. It's a whole other world to do 130-watt dies.
I'm expecting an announcement at either WWDC 2019 or 2020, with products shipping after the OS release in the fall of that year.
Apple's laptop naming has been getting steadily worse since the introduction of the Retina MacBook Pro. The MacBook modifier words have lost all meaning when systems labeled Pro have major expansion and repair limitations, the device named Air isn't the smallest or lightest, and the model without a modifier isn't the cheapest. They missed an opportunity to restore some naming sanity this year, but a switch to ARM could present it again.
As great as the A12X is, it's not touching discrete GPUs and CPUs allowed to burn wattage approaching triple digits. If I were Apple, I'd lean into this and restore "Pro" as a designation that means something. Pro devices would stay x86 and be marketed as supporting more software. Non-Pro devices would make the jump to ARM on their regular update cadence. That would give Apple tons of time to get their custom chips to Xeon-level scale for core count and interconnects. It even gives Apple the option to continue using x86 indefinitely, as investing in 100+ watt chips may not have the returns to make it worthwhile. Then the iPad Pro becomes poorly named, but I can't solve all of Apple's self-created problems.
If I were Apple, this is the Mac product matrix I'd have when the dust settles:
MacBook: first to move, as it's perfect for the A12X since it already has only one USB-C port and is due for a refresh. Drop the Intel tax and now it's around $1099. Apple could even use the exact panel from the 12.9-inch iPad, minus the touch gear.
MacBook Air: would need the next-generation A chip to support more USB-C ports and more RAM. New sub-$1000 price for the 128 GB model and outrageously long battery life for web browsing or note taking. No need to make it any thinner or lighter.
MacBook Pro: kill the weird non-Touch Bar model that was clearly supposed to be the new Air but was priced way, way too high. Spec-bump the 13- and 15-inch, especially the discrete GPUs.
iMac: switch to ARM, or kill it and replace it with a giant, beautiful screen that the iPad and iPhone dock with.
iMac Pro: spec bump, but this is as close to a perfect device as Apple has released in a long time.
Mac mini: use the Apple TV case to build an ARM Mac with an A12X or higher that can be sold quite cheap. Use it as great PR to give away Xcode development systems to schools and developing nations and get Swift into the hands of people learning to develop applications.
Mac mini: relabel as Mac mini Pro and pretty much keep as is.
Mac Pro: make it unbelievably expensive but also user-repairable, with multiple GPUs on their own standard cards, tons of RAM, and big Xeon chips.
This eloquently summarizes much of the disconnect customers have been feeling about the Mac product lineup. Jobs never would have allowed this to happen.
I think Tim tries to extract every possible penny from the supply chain. In a vacuum, it's a responsible way to run a business, but it has left Apple selling some products they should be ashamed to feature and customers unsure which product is built for their needs. When Apple sells old iPhones more cheaply, the number in the name gives people a clear indication that they are trading price for newness. The Mac line is a confusing mess of exactly what you get when you put down hard-earned money.
It's a weird combination of some devices having Touch ID, how many USB-C ports you get, and whether the USB-C port is also TB3 or not. For those really out of the loop, there's the question of what the hell a Touch Bar is. The fact that every laptop Apple makes is thin, light, and Retina doesn't help.
For something as core to the experience as Touch ID, it really should have triggered across-the-board refreshes, but Apple seems interested only in substantial updates (though I would certainly consider the addition of Touch ID very substantial).
There is a lot to be said for shipping incremental updates on a regular basis so customers know the expensive product they spend their money on is being given attention. I don't need each revision to blow me away, and not every customer will buy every refresh, but every time a product is refreshed it's a strong indicator that if I buy it, it will continue to see investment. When it is time for me to put down my hard-earned money, I can be confident I'm getting good value and not missing some key improvement that is available but not on this particular line yet.
I'm experiencing this problem with the iPad mini right now. I adore my iPad mini. Adored the 1st one. The 2nd one was a huge upgrade, so I got that. The 3rd and 4th didn't justify the cost, but now that my mini 2 is really showing its age, buying a 4 doesn't offer very good value when it's unchanged after 3 years. There are finally rumours a new one might be coming, but when? Do I move to a different size class (that I don't really like)? Continue to stick it out for a product I have no reason to believe Apple intends to continue developing?
I’m thinking you’re going to have a hybrid architecture for the MBP. The T2 will expand to support all of Apple’s own software plus all upgraded apps sold through a revamped Mac App Store (plus, hopefully, your own compiled stuff when security settings are off). This can power down (all but ~2 cores) and instead power up an x86 coprocessor that supports everything else. If Intel doesn’t deliver that, AMD will. For the MacBook I think you’re on the right track, except that the Air will be replaced with a 13-inch MacBook model with the same architecture.
Not compared to other discrete GPUs they haven't. They are still 5 years behind consoles. The only time mobile ever "catches up" and achieves "console-class" is near the end of a console's ~5-8 year lifetime.
The performance is superb for mobile, and certainly good enough for integrated graphics (it'd be perfect in something like a MacBook Air), but it's still getting destroyed by the relatively crappy Radeon Pro 560X in the 2018 MBP.
Cavium is sort of your proof that scaling is hard. The 32-core ThunderX2 @ 2.2 GHz with 56 PCIe lanes has the same TDP as the 32-core AMD Epyc with 128 PCIe lanes. And it's slower than the competition from AMD and Intel at comparable power budgets.
I'm not personally that interested in a hybrid Mac, as getting rid of the power draw of an Intel CPU and all the weight and technical baggage it comes with is the real draw of an ARM Mac for me.
What I really want is to hand Apple a little over $1000 and walk away with a laptop form factor with nothing but one of their fantastic A-series SoCs inside, running full macOS where I make all the decisions. The GPU being so much better than the anemic Intel integrated stuff in my current Air is a big part of the attraction of an A-series chip running macOS.
I don't think the Air branding will go away. Apple spoke so much during this last event about the MacBook Air being people's favourite Mac that I can't see them giving up on such positive branding. It's far and away the best-selling Mac and has been for close to a decade.
From Apple's perspective, you could easily justify the price of Intel chips until recently: 1. Intel had a 2+ year node advantage which you couldn't get anywhere else, even if you paid. 2. Intel had the best performance-per-watt CPUs on the market, and the highest-performance cores on the market, which you couldn't get anywhere else, even if you paid. 3. x86 compatibility, which is more of an x86 tax, although you could get it from AMD.
Now the first two are gone. TSMC has edged out Intel on process node, Apple themselves have the best perf/watt with the A12X, and AMD has proved to be very competitive at the high end. Yet Intel is still charging Apple the same while offering far less value.
Apple is now being held up by Intel, but I don't think Apple can dump Intel just yet. There are two things holding Apple back.
Thunderbolt - TB is currently still an Intel-only technology. There are no host controllers on the market other than Intel's, and they cost a fortune (relatively speaking). Apple has invested a lot into TB, but Intel is making all the same FireWire mistakes. Maybe Apple is working on a USB 4.0 solution and would dump TB once and for all in 2020. Intel promised to make TB an open standard in 2018 but has yet to do so.
Modem - Apple relies on Intel modems for the iPhone, which is Apple's bread and butter. Before any move on the Mac side, Apple will need to think about the consequences for modems. In the worst case, Apple switches away from x86 and Intel decides to hike the price of its modems. I think Intel's modem revenue from Apple is roughly the same as its x86 revenue from the Mac. Intel's 10nm isn't performing, and Apple is not happy with Intel on either front.
Thunderbolt is a hurdle to ARM Macs, but I don't think an insurmountable one. As you mentioned, Intel already committed to opening the spec, and if Apple can't negotiate getting Thunderbolt host controllers tossed in super cheap, using modem sales as leverage to force Intel to honour its own commitments, they don't deserve to be valued at $1T.
TB is one of the big reasons I think the MacBook would be the obvious first Mac to switch. It currently only has USB-C (no TB), and it's due for a refresh since the lack of Touch ID makes it an outlier.
When Apple switched from PPC it was a multi-year effort, and I don't see any reason a switch to ARM, whether a full move or just particular products, would be any different.
I hope I’m wrong though.
I agree it would be odd for the Mac line to bifurcate like that, but even in the PPC -> x86 days they supported PPC Macs for quite a long time.
Apple has been getting great returns from its chip investments because they have been able to reuse the blocks in so many devices. If they do go ARM for the Mac, there are diminishing returns as they move up the product line in how much of the silicon they can reuse. They are right at the point where power-hungry multiple memory controllers, complex core interconnects, and other things that will likely never make it into an iPhone become necessary. I suppose they can use the highest-end Macs as test beds to see if ideas work out, but is Apple really going to have chips fabbed with 20+ big cores and complex inter-core transport for products they will never sell in huge numbers?
I think for the highest end, Apple is better off continuing to piggyback on Intel's server investments and focusing their chip team where it has been absolutely dominating: sub-10 W with incredible performance per watt.
macOS on two instruction sets would require some extra developer time, but volunteers keep Debian running on 18 arches. After the initial port there is some care and feeding, but it's manageable. The biggest issues are cross-compiling, something Apple is already really good at, and device drivers. Apple has that covered with the T2 chip: move more and more of the peripheral connectivity into something like the T2, and then you only have to write and maintain AArch64 drivers.
The more I think about it the more it makes really good sense.
And it’s not that they can’t technically; it’s that they are all about focus and unambiguous messages to developers and customers.
But after reading it all, I guess they could fork the Mac into two categories. Air, light or whatever, and Pro.
I actually like that idea a lot.
MOS/68k, 68k/PowerPC, PowerPC/Intel, Intel/ARM
Now I'm guessing the DWARF format is: x86-64/arm32/arm64, with arm32 being legacy.
Back then we were also switching from GCC to LLVM, which at the time I thought was ludicrous because we'd be losing all of the architecture flexibility GCC gave us. But I guess my worries were without merit.
While I have seen many laud the iPad Pro for its power, I haven't seen any mention of heat or how long it can sustain a workload. Personally, I do not want to see the Mac line change processors again, for two reasons: first, as I mentioned before, many of us have a lot invested in software that runs on OS X as well as Windows, which these machines can run; second, I don't need that wall to go any higher.
If I were Apple, I wouldn't go all-in on ARM Macs. I'd build ARM thin-and-light laptops to get incredible battery life and leave the Pro line on x86 for at least the next 3+ years. Even if I did decide to stop building new x86 devices, I'd support macOS on x86 for at least 5 additional years.
I won't ever switch my primary computer to something that needs an optional accessory to hold the screen up. The iPad may be magical, but there is plenty of wonder left in a device you can open up and start typing on with a real keyboard immediately. It's been many years since my primary computer was a desktop, but I'm not willing to compromise on the permanently attached keyboard and the freedom to decide what executes.
Seems a far cry from being just a consumption device. That canard is getting old. I guess since you can’t run Linux on it or run a server with it, somehow it isn’t creative?
On iOS it's very hard to build your own custom workflow with a bunch of applications and other tools, the way you can on a desktop OS.
For my particular needs, it takes more effort and time to complete the same set of creation tasks on my iPad compared to my Mac. That includes working with media files.
I love casually browsing the web on my iPad, but the moment I want to write more than a few sentences, I reach for my Mac.
Would moving off of traditional desktop CPUs harm that? Is there a way to do compatibility at the OS level without sacrificing half of the performance gains?
I think this comment underestimates the importance of performance to the iPad and Apple's long-term vision for it. While "real Photoshop" won't be ready until next year, Apple is clearly aiming to make the iPad a real solution for compute-heavy graphics tasks. More will undoubtedly follow. Why not edit video on a film set on the iPad? I can see it happening. Your comment sort of makes it sound like Apple just threw these into the iPad as a PR strategy and the "real plan" is to transition the Mac. But Apple's plan is to make both of these pro product lines as beefy and power-efficient as possible and target real professional creative workflows.
Where's the example of software that truly shines on these chips? Where's the software like After Effects, Houdini, or Octane Render, where you can truly see the power of the machine rev up on the right hardware?
They're wheeling Photoshop out as proof of this device's power, yet as a designer I certainly don't consider Photoshop a heavy piece of software anymore; the only reason it ever chugs is that it doesn't use the machine's power effectively, being mostly single-core and disk-speed constrained.
This power has been available to iPad developers for a few years now, so shouldn't we be seeing truly powerful pro apps that take advantage of it emerging? Are these chips actually powerful for real-world pro tasks, or are they just talented at producing Geekbench scores?
I guess it depends on what you mean by "ready" how long the list is, but I'd suggest that Photoshop, console-quality games, AutoCAD, video editing tools like iMovie, all can take advantage of the speed.
What console-quality games actually ship on the iPad?
The problem with games is that to actually match an Xbox One S's graphics you don't just need to match its 5 year old hardware in performance. You also need to have the capacity to actually fit the game & its textures. All 40-80GB of it.
Who is going to ship an actual console-quality game at console-quality sizes on a device whose base model can't even hold it?
And Apple is notoriously stingy with RAM. This new one bumps it up to 6GB, which is nice, but still less than the 8GB in an Xbox One S. How much does iOS reserve of that, and how much do the games actually get?
There's little technical detail that wasn't in the iPad review's benchmarks and previous speculation on using Apple chips in desktop machines.
They gave no details on anything new, and the only in-depth answers were to hard-hitting questions like "you could have made a slow chip, why did you decide to make a fast one instead?" and "why is Apple so good at teamwork?" (maybe not the questions that were asked, but they were the questions that were answered :)
If I were to speculate, due to his decades of industry experience evaluating hardware platforms, his role in Apple is to provide strategic direction and guidance on how to build the best hardware platforms. I can't think of any other role they would have wanted him for... which could be a failure of imagination on my part.
So... no, it's not "Apple marketing". Anand Shimpi is involved!
That's the feeling I got, as well as perhaps being a human abstraction layer between the engineers and the executives. If you can describe in relatively understandable terms something that is technically difficult to thousands of laypeople (well, that's unfair - Anandtech was for nerds but being a nerd doesn't make you an IC/EE engineer), you'd be an asset to both the corporate and engineering teams.
And that translation works both ways - understanding the direction of the company with regards to future products vs what you want to get out of your silicon teams (e.g. when Anand mentions thermal envelopes, that might include an understanding of the limitations stemming from the potential form/design and material of a future product).
I wonder how that colors the reviews - if the outlet is too critical about a device, they can quickly lose these special privileges.
They’re already shipping their custom T2 chips in their laptops. The compiler toolchain can build great binaries for their A chips. They’ve swapped CPU architectures before, and the modern Mach-O binary format can hold versions of the executable built for different architectures.
They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch. That might be tricky because of the huge surface area of the x86 instruction set. But I think it will only be a matter of time. It might also explain why they have kept the MacBook and MacBook Air product lines - they might want one of them to stay with intel’s cpus and the other to switch to their A* chips going forward. Or maybe they’ll just wait another generation or two and switch CPUs across their whole line in one go.
Well, and all the applications that won't switch. Also even on the Mac virtualization/containerization is not nothing. The Mac is a different market and use profile than iOS, and while Apple based on past history won't support an old arch indefinitely, neither are they likely to completely blow off backwards compatibility. Compared to previous transitions, dropping x86 would have extra complexities as well, so previous experience may not be entirely applicable. In particular Apple would be moving away from the full-fat computer standard rather than toward one (or no change), which may change the payoff for users despite Apple being much bigger. The absolute performance differences (immediate and future) also aren't likely to be as big.
I don't want to underestimate them, and huge disruption is inevitably coming down the pipe anyway and Arm may well emerge a winner there regardless, but it's also just a really big challenge.
>That might be tricky because of the huge surface area of the x86 instruction set.
Transmeta was able to do a decent job, and I think Novafora is still around and licensing their IP? Granted a lot of instructions have been added since then, but Apple certainly has a lot of expertise there as well and a great deal of capital to aim at the issue.
To think that deep in the Apple labs they don't already have A-series laptops running - and haven't for a while - is not thinking like Apple would.
Right. To me it's unthinkable given the amount of low-level code shared between iOS and macOS that macOS hasn't been running on ARM since day 1 of iOS.
Sure, but don't discount how every specific instance can be a bit different either. As I said they've got expertise, they've got capital, and there are even previous paths to follow here. But at the same time every transition has its own unique hurdles. Previously with 68k -> PPC and then PPC -> x86, for example, they were going to something that was not merely an improvement in some important respect right off, but that also had a clear, long, very steep growth ramp ahead thanks to fabrication improvements if nothing else. In the PC world we were still very much in either a very steep or at least steeper part of the S-curve. But those days are just plain done; the issues presented by physics and the geometry sizes being worked with now are simply fundamentally harder. There is certainly more room for improvement year to year for a long while, and more chances to grow horizontally with valuable new features, but it's not like a system made now will be obsolete in 3 years either.
Additionally those transitions came at points in Apple's life with a dramatically smaller installed base, and they were also moving away from something more proprietary (at least in principle; obviously CHRP never actually worked out that well). The move to x86, which brought Macs in line with everything else, meant a huge amount of software opened up to more trivial porting, huge amounts more opened up to trivial virtualization, a vast 3rd party hardware market became more easily accessible, etc. That definitely helped offset some of the old Mac software that ultimately didn't make it, even more so because, due to the above, it is still quite possible to run old Mac software fine: Classic can be emulated, and 10.6 can be run under virtualization still, which in turn grants access to Rosetta even on new Macs, and the absolute performance advantages vs 12+ year old systems are significant enough that even with the overhead it's still fine.
Basically there are a lot of subtle day-to-day advantages that come from everything running the same instruction set underneath, or at least being able to stick some sort of translation layer in there. Again, absolutely not saying it's something Apple can't tackle, just that it's a big challenge, and I think it's bigger now than it was at any time previously. Of course, Apple too is bigger now than at any time previously! They're not infallible though, and I hope they get the balance right here.
Anyways, Apple has never been one for smooth transitions. Their history is dotted with big, bold changes. If they kept x86 they would slow the adoption of their new architecture. Apple will likely take a "take it or leave it" attitude like they did with the CD drive and headphone jack.
Despite their ongoing efforts to make the iPad more capable, I think and hope they'll recognize the value in keeping it simple enough for anyone to use, and thus having a separate macOS experience with more tools/flexibility in a laptop/desktop form factor.
Depends on what you mean by "much". They have made good backwards compatibility an important part of every single architecture transition so far, and on the Mac there were good 3-4 year official transitions at least (the 68k emulator still ran under Blue Box/Classic Environment, so it lasted through 10.4 Tiger; Rosetta lasted through 10.6 Snow Leopard). And that's official; in practice there have continued to be longer-lasting options.
It's certainly not the degree that Microsoft has traditionally cared, but it's not at all been blown off either.
When you strip away all the stuff a laptop doesn't need, you're left with... an x86 chip!
You're left with an ARM chip
FaceID is not solely dependent on ML, it's also managed by the secure enclave co-processor which is also used for Touch ID which is available on Macs now. ML helps to reduce false positives.
Apple's T2 chip is an ARM-based processor that's already in almost all of the newest Macs. It is used as a storage controller (which allows Apple to encrypt the drive very fast and transparently), Secure Enclave processing (Touch ID on the MBA), Siri processing, and more. Every year, more and more of the processing is moved to Apple's T series co-processor.
Apple's custom silicon allows them to integrate software and hardware on a deeper level. Intel develops CPUs for the mass market. Apple develops for their own customers only.
With Apple's focus on on-device ML, I would guess this will be the first part of the A-series trifecta (CPU, GPU, Neural) to be included on a Mac and exposed to developers. I can imagine a bunch of possibilities for such a chip, not just FaceID.
Because of the sandboxed nature of iOS and the current immaturity of the Files app, exchanging files between apps on the iPad is hard and inconsistent. As a result, you're still mostly stuck using one app at a time to do things in iOS. The workflow is still fragmented.
IOW: if you want a PC, get a PC.
But from a software usability standpoint, it's not.
On that MacBook Pro I could run several VMware sessions running Windows and Linux (have run 3 at the same time in the past). I can run Handbrake encoding videos across cores while still browsing the net in Chrome with 4 windows open, each to a different profile, each with 5 to 20 tabs. Have 4 terminal windows open, at least one of them serving a dev webpage. Run VSCode and Unity and Visual Studio and other stuff all at once. I've also done things like compile Chrome from source. Run Xcode, run 2-3 iOS simulators.
I get that an A12X can't do those exact tasks as it's not the same instruction set but could it do the equivalent and get similar perf?
That's amazing if true. An iPad Pro weighs 1/4th of my MBP (2014). My MBP's fans spin like crazy when running a high intensity app and the case gets too hot to touch.
I'd love to believe a machine that has no fans and doesn't get hot and weighs 1/4 as much could actually have the same or more perf for real but when I actually use an iPad it rarely feels as fast and given it doesn't multitask well there's no way for me to check that perf is really comparable in real world use cases.
Anyone have any insight? Is it just because the chip was redesigned to be more efficient that it can match or exceed the i7 in my MBP? Should Amazon be filling their AWS racks with A12X based machines that get the same perf at much less heat and power? (Yeah, I know they can't buy A12X chips, but still.) Don't iPhones and, say, top Samsung phones generally show similar perf?
But your workload doesn't seem to include those things, so hard to say. Critically A12X is unlikely to have hardware virtualization support, so your use case of VMware would be slow even if it wasn't doing any binary translation.
Also your MBP's fans spin because it's trying to achieve higher sustained performance. Typically mobile devices will just instead thermal throttle hard. Like, lose half their performance hard. How well can the iPad Pro sustain its performance? That's a real big question.
Re AWS racks: No, they can't. A12X in the server world would be a joke. It'd be competing against things like AMD's Rome, which is 64 cores / 128 threads and, this part is critical, supports up to 4TB of RAM with 128 PCI-E lanes. Even if the A12X could compete on raw CPU throughput, it can't compete on I/O, virtualization, etc... The A12X is also going to be pulling a lot more power than you might expect. It's not that much more power efficient.
Best of both worlds.
Don't need x86? Don't spin up the Intel chip. Doing something that requires x86? Spin it on up.
This way you get software compatibility with the power sipping of the ARM CPU.
I had a PowerMac 6100/60 with the DOS Compatibility card that had its own sound chip, video controller and optionally RAM.
My 6100/60 had 24MB of RAM and the card had 32MB of RAM.
Before that, I had an LCII with a ‘//e card.
I doubt that modern Apple would ship a hybrid x86/Arm laptop though.
Now here's my prediction: Apple does not really want to build an ARM-powered MBP. Instead, they will eventually allow iPads to dual-boot into iOS and/or macOS.
Call me crazy, but this would be huge. Of course, Apple would still build traditional laptops, maybe even with ARM processors in them, but only as a byproduct of their iPhone / iPad product line.
You can't use OSX with your finger. There are millions of places across the OS and applications where the hit target is too small for a finger. Just compare the keys on the iOS keyboard to the traffic lights on OSX.
The MBP's are also battery-powered & thin?
But you seem to be taking Geekbench 4 here as gospel. I'd take that with a grain of salt. A really, really big grain of salt.
Even with that said I'm not even seeing any MBP 2017 results in the article...?
They give the impression, by saying it's a custom GPU, that it's a from-scratch in-house design, but that's unlikely to be the case.
Moving to a "custom GPU" was basically Apple saying "Ok, thanks, we are taking over from here".
All this power sounds great, but I really don't know what else I'd do with it, beyond surfing an ad-riddled internet on my couch.
If any developer has a life-changing daily use case for their iPad, I'd love to hear it.