Current A12Z chips are highly performant; Apple is roughly one chip cycle ahead of any other manufacturer on performance/watt. I presume their consumer hardware will launch with an A13Z, or maybe an A14-type chip.
Apple has consistently shipped new chip designs on time; Intel's thrashing has cost them at least two significant update cycles on the MacBook line in the last six years. Search this fine site for complaints about how new Mac laptops don't have real performance benefits over old ones; those complaints are 100% down to being saddled with Intel.
Apple has a functional corporate culture that ships; adding complete control of the hardware stack is going to make for better products, full stop.
Apple has to pay Intel and AMD profit margins for their Mac systems. They are going to be able to put this margin back into a combination of profit and tech budget as they choose. In the early days they are likely to plow all of this back into performance, a win for consumers.
So I'm predicting an MBP 13-16 range that is 20-30% faster with an extra three-plus hours of battery life. Alternatively, a MacBook Air type with 16-hour battery life plus strong 4K performance. You're not going to want an Intel Mac even as of January 2021, unless you have a very unusual set of requirements.
I think they may also start making a real push on the ML side in the next year, which will be very interesting; it's exciting to imagine what a fully vertically integrated Apple could do controlling the hardware, OS, and ML stack.
One interesting question I think is outstanding: from parsing the video carefully, it seems to me that devs are going to want ARM Linux virtualized, vs. AMD64. I'm not highly conversant with ARM Linux, but in my mind I imagine it's still largely a second-class citizen; I wonder if systems developers will get on board, deal with slower, higher-battery-draw Intel virtualization, or move on from Apple.
Languages like Go with supremely simple cross-architecture support might get a boost here. Rust seems behind on ARM, for instance; I bet that will change in the next year or two. I don't imagine that developing Intel server binaries on an ARM laptop with Rust will be pleasant.
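For what it's worth, here's roughly what that cross-architecture story looks like in Go (a minimal sketch of my own, assuming a plain Go toolchain and no cgo dependencies):

```go
// main.go - trivial program for demonstrating Go cross-compilation.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// runtime.GOOS and runtime.GOARCH report the platform this binary
	// was compiled for, not the machine that built it.
	fmt.Printf("built for %s/%s\n", runtime.GOOS, runtime.GOARCH)
}
```

From an ARM Mac, `GOOS=linux GOARCH=amd64 go build` spits out an x86-64 Linux server binary, and `GOOS=linux GOARCH=arm64 go build` the ARM one; no cross-toolchain setup at all.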
I'm predicting the opposite: you won't actually see any difference.
Once you look closely at power profiles on modern machines, you'll see that most energy is going into the display and GPU. CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains. Not one thing that they demoed requires any meaningful CPU power.
Similarly, while ARM parts are more efficient than x86 per compute cycle, it's not a dramatic change.
The big changes, I think, are more mundane:
- Apple is going to save $200-$800 in cost per Mac shipped
- Apple can start leaning on their specialized ML cores and accelerators. They will probably put that stuff in the T2 for Intel Macs. If they're already shipping a T2 on every machine, with a bunch of CPU cores, why not just make those CPU cores big enough for the main workload?
Doubling CPU perf is meaningless if you can ship the right accelerators that'll do 100x energy/perf for video compression, crypto and graphics.
For a regular web-browsing type of user, that is; obviously if you're compiling stuff this may not apply, and if that is true you're almost certainly better off just getting a Linux desktop for the heavy lifting.
> Apple can start leaning on their specialized ML cores and accelerators
I think Apple sees this sort of thing as the future, and their true competitive advantage.
Most are focusing on Apple's potential edge over Intel when it comes to general compute performance/watt. Eventually Apple is likely to hit a wall there too, just like Intel.
Where Apple can really pull away is by leaning into custom compute units for specialized tasks. Apple, with its full vertical integration, will stand alone in the world here. Rather than hoping Intel's chips are good at the things it wants to do, it can specialize the silicon hardcore for the tasks it wants macOS to do in the future. It will potentially be a throwback to the Amiga days: a system with performance years ahead of competitors because of tight integration with custom hardware.
The questions are:
1. Will anybody notice? The initial ARM Macs may be underwhelming. I'm not sure the initial Mac ARM silicon will necessarily have a lot of special custom Mac-oriented compute goodies. And even if it does, I don't know that Mac software will be taking full advantage of it from Day 1. It will take a few product cycles (i.e., years) for this to really bear fruit.
2. Will developers bother to exploit these capabilities as Apple surfaces them? Aside from some flagship content-creation apps, native Mac apps are not exactly flourishing.
1. If done correctly, non-Apple laptops may become significantly less attractive. Just like Android phones.
2. Intel may be in for a tough time, especially with AMD winning big on the console and laptop fronts recently.
3. AMD and Intel may have to compete for survival and to save the non-Apple ecosystem in general. If AMD/Intel can consistently and significantly beat Apple here, it may mean that the non-Apple ecosystem survives and even thrives. It may even mean that Apple looks at Intel/AMD as an option for Pro MacBooks in the future. However, this does seem a little less likely.
4. This could also herald the entry of Qualcomm and the like into laptop territory.
Looks like a very interesting and possibly industry changing move. This could potentially severely affect Intel/AMD and Microsoft. And all these players will have to play this new scenario very carefully.
What are you talking about? Android has about 72% of worldwide market share, so clearly Android phones are not significantly less attractive.
And I am not an Android fanboy, my first two phones were iPhone 1 and iPhone 3GS, and I still consider them very good phones.
More precisely, Android has the BOTTOM 72% of the market, mostly cheap smartphones with thin profit margins. Almost all actual profits go to Apple.
> Apple dominates the global handset market by capturing 66% of industry profits and 32% of the overall handset revenue.
Samsung and Huawei are second and third with about 20% and 10%, respectively. The three companies combine for about 95% of the profits.
In the same quarter, Apple had 12% of the global market sales, against 21% by Samsung and 18% by Huawei. They combine for 51% of the sales.
So, that quarter, companies representing half of the worldwide cellphone sales combined for 5% of the profit.
Apple sold 12% of the phones and captured 32% of the revenue but 66% of the profit.
Apple is clearly able to sell its phones at a unique premium; I am not sure of a better way to measure "attractive".
Android will always be a low-budget product as a market, because it's run by Google. Google doesn't care about its customers at all, only about the data they generate and its impact on ad sales.
Every time a user opens the Google app store, they can expect it to be worse than the time they opened it previously. Every time an Android user buys a new device, it's a crapshoot what sort of hardware issues it will have, even if it's Google or Samsung branded.
Well, they are. You're confusing niche market segments with overall preferences. Veblen goods don't have traction in general markets.
 - Mercedes to debut Formula 1 MGU-H technology in AMG road cars - https://www.formula1.com/en/latest/article.mercedes-to-debut...
ICE is dead. It's a welcome piece of fun at weekends, at the track, and when we drive our classic cars. But, I'm afraid, that's it.
And that's not the future I am willing to buy into.
The abstractions are leaky; the VM is not a pristine environment floating on top of some vaguely well-defined architecture. The software inside one has two extra layers (VM software and OS) between it and the actual platform, and all of this is before you start hitting weird corner cases with CPU architecture differences between the layers.
Apple has provided this kit for about five years: https://developer.apple.com/documentation/hypervisor Yeah, you have to use the hardware drivers from Apple unless it also supports PCI passthrough; not sure, but with the current user base I guess nobody would do that anyway.
I expect Apple to eventually run their ring -1 off the T chip, with everything else running on a VM abstraction. It's just the natural evolution of the UEFI approach, and Apple being themselves, they're doing it "their way" without waiting for the crossfire-infested industry committees to play along.
What makes Android phones less attractive, in your opinion?
What matters to everyone I know is screen size, camera quality, and that a really small selection of apps (messaging, maps, email, browser, banking app) works well. Raw CPU performance is only a very abstract concept.
Many of these are only barely possible on "pre-neural" mobile ARM CPUs, and at a significant cost to power consumption. Developing for newer devices is like night and day.
> Speech recognition on my old Pixel 2
I don't think the Pixel 2 can be called "pre-neural". "[...] The PVC is a fully programmable image, vision and AI multi-core domain-specific architecture (DSA) for mobile devices and in future for IoT. It first appeared in the Google Pixel 2 and 2 XL [...]" https://en.wikipedia.org/wiki/Pixel_Visual_Core
CPU selection is likely coming from industrialization concerns: fewer production lines to maintain, lower price per unit at volume, etc. But they're going to beat that drum loud and proud for all it's worth, meanwhile the phone is cheap in areas that, in 2020, _do_ matter.
I don't know the Apple dev ecosystem broadly, but this doesn't bode well for people "bothering to exploit" the hardware.
Apple's margins are consistent: if their costs go down significantly, pricing comes down or features increase. The iPad is a perfect example; for years it was $500 and they just kept increasing the feature set until eventually they could deliver the base product for significantly less.
Shareholders benefit from increased market share just as much as they do from increasing margins, arguably more. The base iPad and the iPhone SE both "cannibalize" their higher-end products, but significantly expand their base. I wouldn't be surprised at all to see an $800 MacBook enter their lineup shipping with the same CPU as the iPad.
While they won't be competing with Chromebooks for general education use cases, I could very well see Apple trying to upsell schools on a $599 alternative that happens to run GarageBand, iMovie, and even Xcode.
While there is no digitizer, there is a keyboard and a touchpad. Also, I expect Apple is going to try to keep a gap between the base Mac and the iPad price-wise so they would add to the base storage and maybe RAM.
Then again, considering the pricing on the base iPad, maybe they will bring it down to $600.
If your phone is good enough to take care of your day to day computing, you can probably get by with an inexpensive all-in-one computer and save the headache of docking.
(And the GPU, and maybe even more RAM etc).
Then what is the point in docking at all? Now you have to keep track of what's on the dock and what's on the phone. Plus, by the time you integrate all this into a dock, you basically have something that costs as much as an inexpensive PC, so why bother?
One problem is that people expect the CPU power of a laptop, which requires much more power and cooling than the typical tablet. As a consequence in tablet mode a Surface Book has about two hours of battery life.
When looking for IDEs or tooling on iOS I still have not found anything remotely professionally usable... (I mean Visual Studio + ReSharper-like, not VS Code...) but perhaps somebody could enlighten me...
Related to your example: $1 burgers are increasingly better than you would expect. The difference between McDonald's midrange line and, say, a burger at a restaurant for $18 is negligible in flavor. I can no longer justify going to a restaurant and paying $18+tip for a burger.
Oh come on. I get that you're trying to make a point but this is ridiculous.
I'll eat an $18 hamburger because it tastes really good - yes, about 18x better than a $1 burger.
I'd argue that the functional difference between a Honda Fit and a Tesla is less than the difference between the best McDonald's hamburger and an $18 hamburger. That's why I drive a Honda Fit. In the face of Tesla's increasing sales it would be pretty strange to assert that my taste was somehow universal.
But an $18 burger is not drastically better than McDonald's $7 burger.
Try doing an actual blind test, with a control... because the simple fact of perception will make you think one is better.
And then we've come full circle to Apple products.
McDonald's has very high-quality preparation standards. Their ingredients and techniques were constructed to facilitate their high-speed, high-consistency process, but they prevent them from incorporating things that the overwhelming majority of burger consumers prefer.
For example: the extremely fine grind on the meat, the thin patty, the sweet bread, the singular cheese selection, the inability to get the patty cooked to specification, the lack of a hard sear or crust and the maillardization that accompanies it, etc. At a minimum, people prefer juicier burgers with a coarser, more loosely-packed texture, usually cooked to lower temperatures (though this depends on what part of the country you're in), and the flavor and texture differential from a hard sear, be it on a flat top or grill, and toasted bread.
For consumers who, at least at that moment, have a use case that requires their food be cheap, fast, and available, well we know who the clear winner is.
In my new career as a software developer and designer, I use Apple products. I am willing to pay for the reliable UNIXy system that can also natively run industry-standard graphics tools without futzing around with VMs and things, and do all that on great hardware. There will always be people who aren't going to compare bits pushed to dollars spent and are going to be willing to spend the extra few hundred bucks on a device they spend many hours a day interacting with.
This isn't about perception at all: Apple products meet my goals in a way that other products don't. If your goals involve saving a few hundred bucks on a laptop, then don't buy one. I really don't understand why people get so mad at Apple for selling the products that they sell.
I don't doubt you know more about food. If you applied that knowledge to my actual point instead of what it appears you assumed my point was, this assertion might have been correct.
That's not entirely your fault; I was making a slightly different point than the existing conversation was arguing, so it's easy to bring the context of that into what I was trying to say and assume they were more related than they were.
The belittling way in which you responded though, that's all on you.
> This isn't about perception at all: Apple products meet my goals in a way that other products don't. If your goals involve saving a few hundred bucks on a laptop, then don't buy one. I really don't understand why people get so mad at Apple for selling the products that they sell.
My point, applied to this, would be to question what other products you've tried? My assertion is that people perceive other products to be maybe 50%-70% as good, when in reality they are probably closer to 85%-95% as good (if not better, in rare instances). That is a gap between perception and reality.
As applied to burgers, I was saying that people that refuse to eat at McDonald's because of quality probably have a very skewed perception of the actual differences in quality in a restaurant burger compared to a McDonald's burger.
I'm fully prepared to be wrong. I'm wrong all the time. I also don't see how anything you said really applies to my point, so I don't think you've really proven I'm wrong yet.
I wasn't annoyed by you misunderstanding. I was annoyed by you misunderstanding, assuming you understood my position completely because it would more conveniently fit with your existing knowledge, and then using that assumed position to proclaim your superiority and my foolishness.
It's not about deep conversational chess on my part, it's about common decency and not assuming uncharitable positions of others by default on your part. A problem, I'll note, that you repeated in the last comment.
Just the mere perception of quality will increase your satisfaction levels. The perception of a lack of quality will reduce your satisfaction levels.
Thus I still maintain that your "perfect" $18 burger is only marginally better than McDonald's midrange burger. The fact that you actually spend time on making that burger more appetising is proof that low-cost foods are getting better and better.
While focusing on my analogy, you literally prove my overall point.
30 years ago you weren't necessary, as low-cost food wasn't nearly as good as it is today. Now you have to exist to justify that premium.
I eat at McDonald's all the time, and I also get pricey burgers ($13-18) from a local place that makes the best I've ever had.
You can't be serious. If you are, I've gotta say if anyone has a perception issue about their respective quality it's you.
I was making a point less about McDonald's being equivalent to a restaurant burger and more about people's perceptions of McDonald's and how bad it is. That is, there's probably a lot less difference in the taste of those burgers than a lot of people want to admit.
The other aspect to consider is consistency. I had a $14 burger at a restaurant on Saturday that I would have happily swapped for any single burger I've ordered from McDonald's in the last 12 months. You may not consider McDonald's high quality, but you have a pretty good idea what you're going to get.
All I'm really doing is making a point that there's a bit of fetishism about luxury items going on these days. Are Apple devices generally higher quality than many competitors? Yes. Is the difference in quality in line with most people's perception of the difference in quality? I don't think so.
This may be the single worst analogy I've ever seen.
There is no amount of money you can pay at McDonald's to get a good-quality burger.
I don't spend $18 on burgers, since there are a million places where you can pay $5-8 and get a damned good piece of beef. But not at McDonald's.
$5-8? At a food truck? The ones that make burgers of an extremely varied quality?
I’ve eaten at McDonald’s around the world, it really depends but they do have good burgers when they’re cooked right.
And the remaining 10% would indirectly benefit their iPhone cash cow in the form of keeping people inside the ecosystem.
The Mac silicon is inheriting the investments Apple made in the iPhone CPUs. This will continue. The bits Apple invests to make their existing hardware scale to desktops and high-end laptops won't benefit the iPhone much at all. On future generations of chips, Apple will spread the development costs over a few more units, but since the iPhone + iPad ship several times more units than the Mac, the bulk of the costs will be borne by them.
`(development cost + (units sold * incremental cost) ) / units sold`
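A toy worked example of that formula (numbers entirely invented for illustration, not Apple's actual costs):

```go
package main

import "fmt"

func main() {
	// Hypothetical figures, purely to show how amortization behaves.
	devCost := 500_000_000.0 // one-time chip design/development cost ($)
	marginal := 60.0         // incremental fab + packaging cost per unit ($)

	for _, units := range []float64{20_000_000, 200_000_000} {
		perUnit := (devCost + units*marginal) / units
		fmt.Printf("%.0fM units -> $%.2f per chip\n", units/1e6, perUnit)
	}
}
```

At a Mac-ish 20M units the hypothetical per-chip cost works out to $85; at an iPhone-ish 200M units it drops to $62.50. That's the point: the huge iPhone/iPad volume is what absorbs the development cost.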
But a lot of Macs have higher-end Intel CPUs, so the per-unit cost of Intel CPUs is pretty damned high.
Which of the inputs going into making that device would have applied any inflationary pressure?
Most if not all of the parts would have probably gotten cheaper over that time, and wage pressure stays low only because of where these devices are made.
Edit: they also went from being a company with around 8,500 employees in 2000 to 137,000 today. Surely every part of their organization chart has contributed pressure to push up their prices to maintain revenue.
Another factor is perceived risk. Since the markets are always worrying about the current US China trade talks, that uncertainty helps gold and silver as they are seen as safe havens.
One of the two scenarios below (or perhaps a mixture of both) is likely, and I lean towards #1:
1. Apple decreases the price of the Mac to stay aligned with gross margin targets. This likely has a significant upwards impact on revenue, because a drop in price like this opens new markets who can now afford a Mac, increasing market share, driving new customers to their services, and adding a layer of lock-in for iPhone-but-not-Mac customers.
2. Apple uses the additional budget per device for more expensive parts/technology. They are struggling to incorporate technologies like OLED/mini-LED because of the significant cost of these displays and this would help open up those opportunities.
Why not go the same road with a MacBook SE? I even think this will be the first product out of the pipeline.
Mac Pro buyers usually don't want to be beta testers and will probably be the last to transition, once the horsepower is clearly there with measurable gains.
Of course you'd still get crappy products in the $350 range; Apple can't and doesn't want to compete at those levels.
This is sort-of-OK for consumers but amazing for Apple and its shareholders.
This isn't a zero-sum game. Being able to ship less expensive computers which perform better is a win for consumers and Apple shareholders at the same time.
The only loser here is Intel.
Don't we basically know that from the Surface Pro X?
Don't have to venture that far, we know it from the iPad Pro.
This isn't "sort-of-ok", it's "bad-for-customers" and "bad-for-developers".
As a consumer you shouldn't be running unsigned software because you're putting not only your data at risk but any data you have access to.
And as a developer on mac you can still run anything reasonably well in a VM.
If you're using node, you should be running it in a virtualized environment in the first place, though I'm too lazy myself to always set all that up.
Actually it's pretty amazing that now we'll be able to run an entire x86 OS environment on an ARM chip and get very usable performance too.
Just curious: why should node be run in a virtualised environment for development? Is it a security concern? Does that apply to languages like Python too? Would you be happy running it in a Docker container from macOS?
I'd say that we've moved away from virtualisation completely, we now use containers, so developers will expect native performance, as we get on other platforms.
If they're already on macOS, that's a thing.
Greater marketshare also provides more value to shareholders meaning that shareholders still win, as do consumers.
More people with Macs (and probably iPads/iPhones) would also increase other profit centers for Apple such as services (their highest-profit center), warranties, and accessories. The profits and loyalty from these could easily far outweigh the $100-$300 of extra margin they might gain from keeping Mac prices the same.
Meaning that price cuts to Macs might actually be more strategically beneficial (to EVERYONE) than hoarding higher margins.
I also don't believe it's reasonable to assume that switching to ARM is as simple as putting an iPad CPU into a laptop shell.
Here is an estimate that their 2018 chip costs $72 just to make; that's manufacturing alone, not design plus manufacturing.
The A14 that will power a MacBook is likely going to be more expensive, not less, especially with 15B transistors on the A14 vs. fewer than 7B on the A12.
The average selling price of an Intel CPU looks to be around $126. This includes a lot of low-end CPUs, which is exactly the kind of CPU Apple fans like to make comparisons with.
Apple may realize greater control and better battery life with the switch, but they won't save a pile of money, and thoughts about increased performance are fanciful speculation that Apple, the people with the actual expertise, are too smart to engage in.
Which means the actual per-CPU fab cost is going to become a smaller part of the complete development and production cost of a run. And that total cost is the only one that matters.
I expect savings can still be made, because Apple will stop contributing to Intel's profits. On the other hand I'm sure Apple was already buying CPUs at a sizeable discount.
Either way it's an open question if Apple's margins are going to look much healthier.
IMO an important motivation is low power/TDP for AR/VR.
Ax will also eventually give Apple the option of a single unified development model, which will allow OS-specific optimisations and improvements inside the CPU/GPU.
Ax has the potential to become MacOS/iOS/A(R)OS on a chip in a way that Intel CPUs never could.
This only makes sense if you know nothing about Apple's business.
You really think they're doing this to save $50 from ~5m Macs? You really think all this upheaval is for a mere $250m a year in savings? It'll cost them 10x that in pain alone to migrate to a new platform.
Come on now... $250m is nothing at Apple's scale. Think bigger. Even if you hate Apple, think bigger about their nefariousness (if your view is that they have bad intentions, one I don't agree with).
Quarterly numbers come in between 4.5-5m units these days but point taken - I recalled numbers for the wrong timeframe.
> I also doubt the chips cost them 50$ per unit. The savings may worth few billions so it's not really like nothing.
The true cost of this move is reflected in more than the R&D. This is a long multi-year effort involving several parties with competing interests. People are talking here as if they just flipped a switch to save costs.
Let me make this clear. In my view, this is an offensive/strategic move to drive differentiation, not a defensive move to save costs (though if this works, that could be a big benefit down the road). Apple has a long history of these kinds of moves (that don't just involve chips). This is the same response I have to people peddling conspiracy theories that Apple loves making money off of selling dongles as a core strategy (dongles aren't the point, wireless is; focusing on dongles is missing the forest for the trees).
The question isn't whether it suits them. The question is: "Why did they choose to take on the level of risk in this portion of their business and what is the core benefit they expect?"
If the main reason was cost savings, this would be a horrible way to go about it.
There's a better answer: Intel can't deliver the parts they need at the performance and efficiency levels Apple needs to build the products the way they want to build them. This is not a secret. There is a ton of reporting and discussion around this spanning a decade about Intel's pitfalls, disappointments, and delays. Apple might also want much closer alignment between iOS and MacOS. Their chip team has demonstrated an ability to bring chipsets in-house, delivering performance orders of magnitude better than smartphone competition on almost every metric, and doing it consistently on Apple's timelines. It only seems natural to drive a similar advantage on the Mac side while having even tighter integration with their overall Apple ecosystem.
If you want to boil this conversation into one dimension, I'm not your guy - you'd be better suited by finding someone else to talk to. Cheers!
2. I think you actually missed the point of the conversation. OP said "that's still an insane amount of additional profit per unit to be extracted" and followed that up with "amazing for Apple and its shareholders."
It is not insane at all. And not amazing. It just comes off as naive to anyone who's worked in these kinds of organizations and been involved in similar decisions.
I think it's hard for some people to comprehend that trying to save $1b a year for its own sake at the scale of an org like Apple can in many cases be a terrible decision.
Half the fun is writing down your own thoughts!
> You came with your strawman that it was for its own sake
That's possible. I saw the emphasis placed differently than you did even though we read the same words. Probably describes the nature of many internet arguments. Happy Monday - I appreciate you pushing me to explain myself. Seems like others were able to get value out of our back and forth.
Yes, they're a big company. But they're also a mature company. A lot of their efforts are going to be boring cost-cutting measures, because that's how mature companies stay profitable.
Here’s hoping anyway!!
You are overestimating how much a CPU costs...
It also removes any need for a dedicated GPU in their high-end laptops, which is probably $200 alone.
I have no idea how they justify the prices for their lower-end laptops as-is, as they have worse screens and performance than recent iPads in pretty much all cases.
1. This is risky for consumers. Whereas the PPC->x86 move was clearly a benefit to consumers given how PPC was lagging Intel at the time, x86 had proven performance and a massive install base. It was low risk to consumers. This? Less so. Sure iOS devices run on ARM but now you lose x86 compatibility. Consumers need to be "compensated" for this risk. This means lower prices and/or better performance, at least in the short-to-medium term; and
2. This move is a risk for Apple. They could lose market share doing this if consumers reject the transition. They wouldn't undertake it if the rewards didn't justify the risk. They will ultimately capture more profit from this, I'm sure, but because of (1) I think they may well subsidize this move in the short term with more performance per $.
But I fully agree with an earlier comment here: Apple has a proven track record with chip shipments and schedules here so more vertically integrated laptop hardware is going to be a win, ultimately.
If you are a photographer, a developer, a graphics designer, a musician, a teacher, or whatever, and you are looking at buying a new Mac, what is going to get you to buy the new Apple Silicon powered Mac which is almost certain to impact your workflow in some way? If you are making purchase decisions for classrooms, what makes you buy 200 Macs with a new, unknown architecture?
The first generation of Macs on Apple silicon absolutely needs to have a significantly better price/performance point versus the current generation or they won't sell to anything more than the most loyal fans. If the new Macs come out and pricing is not good, I could seriously see a sort-of anti-Osborne effect where people gravitate towards Intel-based Macs (or away from Macs entirely) to avoid the risk of moving to a new architecture.
If anything, I expect margins on the first couple generations of Macs to go DOWN as margins on the first couple generations on all Apple products are lower (also public record).
Yes, the "unknown" architecture powering the highest performing phones and tablets.
Apple has plenty of problems selling to schools for classroom use because other platforms have invested more in that use case. But ISA being the reason? No. Simply no.
IT managers are conservative, if they make a bad call, they have to support crap equipment for the next 5+ years or so. Yes, I'm aware Apple's CPUs are in the iPhone and iPad, but it's a huge change for the Mac and it's a big risk for people making those purchase decisions.
If you check the technical specifications of past MBPs for battery capacity and battery life, you notice one thing: the watt-hour capacity keeps decreasing while the quoted battery life remains constant (e.g., 10 hours of web browsing).
Gains in power efficiency are spent on reducing component space, which allows ever slimmer designs.
They got pretty dramatic results from the first few options, but it topped out at the thermal pads and nothing else made any difference at all. Their conclusion was that the way the system was built, there was an upper limit on the power the system could consistently provide to the CPU, and no amount of cooling would make any difference after that point.
The obvious conclusion for me was that Apple made decisions based on battery life and worked backwards from there, choosing a chip that fell within the desired range, designing a cooling system that was good enough for that ballpark, and providing just enough power to the CPU/GPU package to hit the upper end of the range.
It's actually good engineering to have all the components balanced. If you overbuilt the VRMs for a CPU that would never utilize the current, it's just wasted cost.
OTOH, maybe they were downsizing the batteries to keep it at 10 hours so they could be like "look, we extended the battery to 16 hours with our new chips" while also bumping the battery capacity.
We shall see...
> The 16" MacBook Pro, for example, has a 100 Wh battery, which is the largest that Apple has ever shipped in a laptop. This is the largest battery size permitted in cabin baggage on flights.
That's…pretty bad. Do you have anything else open?
This isn't always true. The 16" MacBook Pro, for example, has a 100 Wh battery, which is the largest that Apple has ever shipped in a laptop. This is the largest battery size permitted in cabin baggage on flights.
A wireless solution is long-awaited.
This doesn't really seem to match my experience; at least on a 2015 MBP, the CPU is always consuming at least 0.5-1W, even with nothing running. If I open a webpage (or leave a site with a bunch of ads open), the CPU alone can easily start consuming 6-7 watts for a single core.
Apple claims 10 hours of battery life with a 70-something Wh battery, which would indicate they expect total average power consumption to be around 7 W; even the idle number is a decent percentage of that (rough arithmetic below).
(Also, has anyone been able to measure the actual power consumption of the A-series CPUs?)
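Making the arithmetic explicit (a back-of-the-envelope sketch; I'm taking the "70-something" Wh battery as 72 Wh and the high end of my observed idle draw):

```go
package main

import "fmt"

func main() {
	batteryWh := 72.0  // "70-something" Wh battery, taken as 72
	ratedHours := 10.0 // Apple's claimed battery life
	idleCPUW := 1.0    // high end of the 0.5-1 W idle draw I observed

	avgDrawW := batteryWh / ratedHours
	fmt.Printf("implied average total draw: %.1f W\n", avgDrawW)              // 7.2 W
	fmt.Printf("idle CPU share of budget:   %.0f%%\n", 100*idleCPUW/avgDrawW) // ~14%
}
```

So even a CPU doing nothing can eat around a seventh of the whole power budget, which is why I don't buy that the CPU is irrelevant to battery life.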
If anything, you should install an adblocker. A single website filled with ads (and they're all filled with tons of ads) can spin the CPU to tens of watts forever, significantly draining the battery.
The CPU tends to be quite lean until something needs to be done, then steps up very quickly to consuming 45 W.
It's quite variable; the highest brightness can consume double that of the lowest brightness, for example. One interesting test, if one has a battery app showing instantaneous consumption (I know Lenovo laptops used to), is to adjust the brightness and watch the impact.
If you run the web versions of Electron "apps" in Safari you'll get substantially better battery life. (Of course, still not perfect; irrespective of browser, all of these types of apps are incredibly poorly optimized from a client-side performance perspective.)
If large companies making tools like slack had any respect for their users they would ship a dedicated desktop app, and it would support more OS features while using a small fraction of the computing resources.
(Large-company-sponsored web apps seem to be generally getting worse over time. Gmail for example uses several times more CPU/memory/bandwidth than it used to a few years ago, while simultaneously being much glitchier and laggier.)
Turn off animated GIFs and emoji in Slack.
They demoed Lightroom and Photoshop, which are surely using meaningful CPU resources?
Agreed on the accelerators and the cost savings. All together probably a compelling case for switching.
Oh my sweet summer child.
And those accelerators don't need to be discrete, Apple can add them to their CPUs.
So, it looks like your point is: Sure, Apple is going to jump a couple process nodes from where Intel is, but everything is somehow going to remain the same?
Hard to square this with the simple fact that my 2018 MacBook Pro 13" battery life goes from 8 hours while internet surfing to 1.5 hours of iOS development with frequent recompilations.
It also took a long time for Microsoft to actually tackle the issues that UWP/Metro and WinUI/XAML faced. It took so long, it doesn't even matter anymore, and even Microsoft has moved on. But there's quite a bit of hypocrisy, with Microsoft telling others to use WinUI while not using it everywhere themselves and refusing to update the styles of other design frameworks.
Whom is Apple going to sell binned A14s to?
Where does everyone think margin comes from in the chip business?
In theory they could offer the Mini with eighteen different CPU options, but that's not really their style.
The 3990X costs more than ten times as much as the 3700X. It has eight times more cores. On anything threaded it smashes the 3700X. On anything not threaded it... doesn't. In many cases it loses slightly because the turbo clock is lower.
It basically means that the processor with the best single thread performance is somewhere in the lower half of your lineup and everything above it is just more chiplets with more cores. That's perfectly reasonable for servers and high end workstations that scale with threads. I'm not sure how interesting it is for laptops. Notice that AMD's laptop processors don't use chiplets.
On the whole my guess would be that we'll have the iPad Pro and MacBook Air using the same SoC, the MacBook Pro doing… something (it'll still need integrated graphics, but do they really sell enough to justify a new die? OTOH they do make a die specifically for the iPad Pro, and I'd guess that's the lowest-selling iOS device vs. the highest-selling macOS device, and idk how the numbers compare!), and the iMac (Pro)/Mac Pro using chiplets.
The silicon team is going to be very busy: they've got the A-series, S-series, T-series, H-series, W-series, and U-series chips to pump out on a regular roadmap.
The A-series (CPU / GPU / Neural accelerator) is the major work. It gets an annual revision, which probably means at least two teams in parallel?
The A-series X and Z variants seem to be kicked out roughly every second A-series generation, and power the iPads. The S-series seems to get a roughly annual revision, but it's a much smaller change than the main A-series.
I could see the Mac chips on a 2-year cycle, perhaps alternating with the iPad, or perhaps even trailing the iPads by 6 months?
Of course, yield is still a physical constraint, but Apple sells a wide range of products and shouldn't have any trouble finding homes for defect-ridden chips.
I don't agree. Simply disabling Turbo Boost on the MBP 16 nets me around 10-15% more battery life. Underclocking a CPU can even result in two to three times the battery life on a gaming laptop under the same workload.
Here are more details.
All of this is complete speculation of course, but I don't believe this one will be a financial decision; it'll be about creating better products.
I was wondering how this would unfold and looks like things are moving in that direction https://blog.tensorflow.org/2020/04/tensorflow-lite-core-ml-...
If TF models can interoperate with CoreML, boy, that'll literally be a home run for Apple, because eventually all ML frameworks will follow suit.
I think that hits the nail on the head. Since I only cursorily listened to both the keynote and the State of the Union I may have missed it, but I heard them mention neither "CPU" nor "ARM". The term they use is "Apple Silicon", for the whole package.
I think they are, at the core, but from what they said, these things need not even be ARM CPUs.
- the CPU will be a lot more powerful and faster, but it isn't really faster because it's like an accelerator or something.
- if you actually use your computer, get some vague "Linux desktop" or something (which is farcical and borders on parody, completely detached from actual reality). Because in the real world, people actually doing stuff know that their CPU, and its work, is a significant contributor to power consumption; but if we just dismiss all of those people we can easily argue its irrelevance.
My standards for comments on HN regarding Apple events are very low, but today's posts really punch below that standard. It's armies of ignorant malcontents pissing in the wind. All meaningless, and they're spraying themselves with piss, but it always happens.
In the end this noise doesn't matter whatsoever.
You know Apple Silicon is going to handle this too, right?
Does Apple actually have its own silicon fab now or are they outsourcing manufacture? If the former, those are /expensive/ and they'll still be paying it off.
In other words, there are definitely gains to be had. My iPad Pro offers a generally smoother and more satisfying experience, with a silent and much cooler-running CPU, than my MBP, and they offer similar battery life. Scale up to MBP battery size and I suspect we will be seeing a few hours of battery life gain.