Apple’s new M1 Pro and M1 Max processors (apple.com)
1052 points by emdashcomma 3 months ago | hide | past | favorite | 983 comments

"Apple’s Commitment to the Environment"

> Today, Apple is carbon neutral for global corporate operations, and by 2030, plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This also means that every chip Apple creates, from design to manufacturing, will be 100 percent carbon neutral.

But what they won't do is put the chip in an expandable and repairable system so that you don't have to discard and replace it every few years. This renders the carbon-neutrality of the chips meaningless. It's not the chip, it's the packaging that is massively unfriendly to the environment, stupid.

Apple, the company that requires the entire panel to be replaced by design when a 6 dollar display cable malfunctions, is proud to announce its latest marketing slogan for a better environment.

Just because you're not getting that panel back doesn't mean it's destroyed and wasted. I figure these policies simplify front-line technicians' jobs, getting faster turnaround times and higher success rates. Then a different department sorts through all the removed/broken parts, repairing them and harvesting parts from them. No idea if this is what they actually do, but it would be the smart way to handle it.

It seems like they would rather shred perfectly good parts than let third-party repair shops use them to help people at sane prices:



So a company stealing devices to resell them makes Apple the bad guy?

It's possible for both companies to be in the wrong.

The recycling center shouldn't have resold the devices (which is, as you point out, effectively theft). However, Apple should not be shredding hundreds of thousands of otherwise usable devices.

Apple does nothing to improve front line technician procedures. They aren't even an engineering factor. If you happen to be able to replace something on an Apple product, it's only because the cost-benefit ratio wasn't in favor of making that part hostile to work with.

Apple puts 56 screws in the Unibody MBP keyboards. They were practically the pioneer of gluing components in permanently. They don't care about technicians. Not even their own. They have been one of the leaders of the anti-right-to-repair movement from day one.

Oh hey, they went back to screws on the keyboards? That's nice; they used to be single-use plastic rivets, so at least you can redo that.

Also Apple's glue isn't usually that bad to work with. Doesn't leave much residue, so as long as you know where to apply the heat you can do a clean repair and glue the new component back in.

I think apple might have learned a painful lesson about keyboard repairability… with all those warranty repairs they are *still* stuck with.

> Apple does nothing to improve front line technician procedures.

I'm not a fan of planned obsolescence and waste, but this is clearly wrong. They've spent loads of engineering effort designing a machine for their stores that can replace, reseal, and test iPhone screens out back.

Sounds more like using a sword where a knife is needed.

So what's your proposal? How big would a "phone" be with all the features an iPhone Pro has? I am by no means an Apple fanboy, but just as a modern car engine can't be tweaked the way it was 50 years ago because of all the miniaturization driven by efficiency gains, the same is true of phones.

But at the same time, a single chip with everything included also makes these phones pretty sturdy: they either fail completely or keep working for years.

An interesting comparison is Formula 1 cars. Peak performance, and parts that can be changed in seconds while still running. Even average modern cars have hundreds of parts that a layperson can reach with simple tools. Apple are obviously making a trade-off (close it down for reduced size and better weather/water sealing), but then they don't get to pretend to be an environmentally conscious company, as that is antithetical to their design goals.

That's kind of a bad argument, considering that an F1 engine will be absolutely _fucked_ and needs to be thrown away after 2000 km.

That said, the previous poster's argument is terrible and gluing a phone is not what allows """peak performance"""

Gluing is the least of the problems - as others mentioned, it can be easily resealed, and I very much prefer my phone surviving a bit of water.

It was more a comment about their ability to have parts replaced, but you're right that the analogy has many flaws.

The reasoning was to make the device as thin as possible according to Verge iirc. The cable degrades because it's too short and can't be replaced.

Says it all pretty much.

Yes, I want a device as thin as what my wallet is going to be after its repairs

Even in school we were taught the lesson of better, more sustainable environmental habits: Reduce, Reuse, Recycle

In the very same order.

So, no sir. Apple isn't the environment friendly company that they claim to be. So much money, and so little accountability.

Obsession with emissions has really made people start to miss the forest for the trees.

Emissions affect everyone on the planet, no matter where they happen. But polluting the ground or water only happens in China, so a lot of Americans that care about emissions don't care about the other types of pollution, because it doesn't affect them.

You would be surprised just how much food you eat has been grown in China using polluted land and water.

It's not so much fresh vegetables, but ingredients in other types of food -- especially the frozen fruit, vegetables and farmed seafood that finds its way into grocery store and restaurant supply chains.

> But polluting the ground or water only happens in China,

Do you have any idea how many superfund sites are in Silicon Valley alone?

Yes, one of them is under my house. But that’s not what my comment was about.

I was pointing out the mindset of people who don’t care about ground pollution of their products because their products are made elsewhere.

> But polluting the ground or water only happens in China

I see you've never been to Houston.

Could you elaborate on what you mean by this?

Not OP. Personally, I've had Dell, HP and Sony laptops. But the macs have been the longest lasting of them all. My personal pro is from 2015.

It has also come to a point where none of the upgrade options make sense for me. 512GB is plenty. RAM might be an issue, but I honestly don't have enough data on that. The last time I had more than 16GB of RAM was in 2008 on my hand-built desktop.

As long as the battery can be replaced/fixed, even if it's not user-serviceable, I'm okay with that. I'd guess I'm not in the minority here. Most people buy a computer and then take it to the store even if there's a minor issue. And Apple actually shines here. I have gotten my other laptops serviced, but only at unauthorized locations with questionable spare parts. With Apple, every non-tech-savvy person I know has been able to take theirs to an Apple Store at some point and thereby extend its life.

That's why I believe having easily accessible service locations does more for device longevity than being user-serviceable.

(In comparison, HTC wanted 4 weeks to fix my phone, plus a week either way in shipping time, with me paying shipping costs on top of the repair. Of course, I abandoned the phone entirely rather than pay to fix it.)

We could actually test this hypothesis: if we could ask an electronics recycler about the average age of the devices they receive, by brand, we would get a clear idea of which brands actually last longer.
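The comparison being proposed is simple to sketch. The intake records below are entirely invented, just to show the shape of the computation a recycler's data would feed:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical recycler intake records: (brand, device age in years at disposal).
# All numbers are made up for illustration -- no real data here.
records = [
    ("apple", 9.1), ("apple", 7.4), ("apple", 10.2),
    ("dell", 5.0), ("dell", 6.3),
    ("hp", 4.8), ("hp", 5.5),
]

ages_by_brand = defaultdict(list)
for brand, age in records:
    ages_by_brand[brand].append(age)

# Average age at disposal per brand -- a rough proxy for service lifetime.
avg_age = {brand: mean(ages) for brand, ages in ages_by_brand.items()}
```

The obvious caveat: age at disposal measures survival, not quality alone, so resale and hand-me-down patterns would skew the comparison between brands.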

I'd much rather have the ability to fix a device myself than be locked into a vendor controlled repair solution. I've been able to extend the life of many devices I've had (the earliest from 2010) through repairs like dust removal, RAM upgrades and thermal paste reapplication.

Also worth noting that some people might be taking laptops to repair shops precisely because they are not user-serviceable. Companies like Framework are trying to change this with well-labelled internals and easily available parts.

Apple doesn't offer cheap laptops. No one doubts they last longer on average.

I'm guessing they mean that greenwashing statements about lower CO2 emissions gloss over more "traditional" pollution such as heavy metals, organic solvents, SO2, and NOx. Taming overconsumption is greener than finding ways to marginally reduce per-unit emissions on ever-growing industrial production.

Not to mention all the eWaste that comes with the AirPods.

Who's doing better to mitigate e-waste?

> "AirPods are designed with numerous materials and features to reduce their environmental impact, including the 100 percent recycled rare earth elements used in all magnets. The case also uses 100 percent recycled tin in the solder of the main logic board, and 100 percent recycled aluminum in the hinge. AirPods are also free of potentially harmful substances such as mercury, BFRs, PVC, and beryllium. For energy efficiency, AirPods meet US Department of Energy requirements for battery charger systems. Apple’s Zero Waste program helps suppliers eliminate waste sent to landfills, and all final assembly supplier sites are transitioning to 100 percent renewable energy for Apple production. In the packaging, 100 percent of the virgin wood fiber comes from responsibly managed forests."

Weird that they leave out the parts about battery and casing waste, and that AirPods are designed to last only about 18 months on average, forcing you to buy new ones.


absolutely, the worst part of the airpods is the degrading non-replaceable battery... it really left a bad impression.

Does anyone have a comparable product that doesn't have this issue?

No, because it's a limitation of the battery. And there is a reason they're manufactured as a non-repairable product: they house a battery, speakers, a microphone, Bluetooth, and a logic board. Space is so scarce that they need to be machined very accurately. But hey, bashing Apple is easier than asking why. The market has already spoken: it wants tiny things hanging on your ears. There is a limit to what we can expect from such things.

And what are the chances of that part failing enough to impact the environment?

As long as they plant a tree every time they replace a panel, it should be fine?

We'll be able to undo most of our damage to environment this way, as it's always replacements

>so that you don't have to discard and replace it every few years

Except you and I surely must know that's not true, that their machines have industry leading service lifetimes, and correspondingly high resale values as a result. Yes some pro users replace their machines regularly but those machines generally go on to have long productive lifetimes. Many of these models are also designed to be highly recyclable when the end comes. It's just not as simple as you're making out.

Right. I'm not speaking to iPhones or iPads here, but the non-serviceability creates a durability pretty much unmatched by Windows laptops.

I was resting my 2010 MBP on the railing of a second-story balcony during a film shoot and it dropped onto the marble floor below. It got pretty dented, but the only thing that didn't work was the Ethernet port. Got the 2015 one and it was my favorite machine ever, until it got stolen.

The 2017 one (which I'm typing on now) is the worst thing I've ever owned, and I'm looking forward to getting one of the new ones. The 2017 one:

- Fries any low-voltage USB device I plug in (according to some internal Facebook forums, they returned 2-5k of this batch for that reason)
- When it fried an external drive plugged in on the right, it also blew out the right speaker.
- Every time I try to charge it, I get to guess which USB-C port is going to work. If I pick wrong, I have to power cycle the power brick (super fun when the laptop's dead and there's no power indicator, as there is on the revived MagSafe).
- A half-dime-shaped bit of glass popped out of the bottom of the screen while it was under load. This has happened to others in the same spot, but sure, user error..

Pissed Apple wouldn't replace it given how many other users have had the same issues, but this thing has taken a beating as have my past laptops. I'll still give them money if the new one proves to be as good as it seems.

> their machines have industry leading service lifetimes

Please stop copying marketing content, it really doesn't help your argument.

Additionally, MacBooks have high failure rates, especially with the keyboards in previous generations, but also overheating because of their dreadful airflow. Time will tell what happens to the M1, but Apple's hardware is just as (un)reliable as, say, Dell's.

No, personal experience isn't data.

> No, personal experience isn't data.

Do you have data to support your statement?

> Apple's hardware is just as (un)reliable as say, Dell's.

When I had access to reports from IT on a previous job (5k+ employees, most on MacBooks) Apple was definitely much more reliable than the Dell Windows machines in use. More reliable than the ThinkPads as well but this is data from one company, unsure how it compares to other large orgs.

Not only more reliable but customer service was much faster and better with Apple computers than Dell's.

This only makes sense if you presume people throw away their laptops when they replace them after "a few years". Given the incredibly high second hand value of macbooks, I think most people sell them or hand them down.

You're talking about selling working devices but parent was also talking about repairing them.

Seems like a huge waste to throw away a $2000+ machine out of warranty because some $5 part on it dies. Apple not only doesn't provide a spare but actively fights anyone trying to repair them, while the options they'll realistically give you out of warranty are having your motherboard replaced for some insane sum like $1299 or buying a new laptop.

Or what if you're a klutz and spill your glass of grape juice over your keyboard? Congrats, now you're $2000 lighter, since there's no way to take it apart and clean the sticky mess inside.

> Or what if you're a klutz and spill your glass of grape juice over your keyboard? Congrats, now you're $2000 lighter, since there's no way to take it apart and clean the sticky mess inside.

Thanks to the Right To Repair, you can take the laptop to pretty much any repair shop and they can replace anything you damaged with OEM or third-party parts. They even have schematics, so they can just desolder and resolder failed chips. In the past, this sort of thing would be a logic board swap for $1000 at the very least, but now it's just $30 + labor.

Oh, there is no right to repair. So I guess give Apple $2000 again and don't drink liquids at work.

Removable RAM wouldn't change anything in your story, presuming the entire board is fried. Anyway, "$30 + labor" is a deceitful way to put it. The labor in your story would be hundreds per hour and would probably fail to actually fix the issue most of the time.

Not gonna lie, you had me in the first half.

> and don't drink liquids at work.

Which is ironic given that Apple laptops are often depicted next to freshly brewed cafe lattes.

Perhaps this is the real reason behind the "crack design team" jokes? A wholesale internal switchover from liquid-based stimulants after one too many accidents?


> actively fights anyone trying to repair them

What makes you say that? What did you expect would happen if you spill juice into your laptop?

What they are perhaps fighting is unauthorized repairs, in the sense that they want to be able to void the warranty if some random third party messes with the insides. That's not quite the same thing.

Apple has been very helpful when I brought in a 5 year old macbook pro with keyboard issues, replaced some keys for free on the spot. Also when the batteries of 8 and 9 year old MBAs started to go bad, they said they could replace them but advised me to order batteries from iFixit and do it myself, which I did.

> Seems like a huge waste to throw away a $2000+ machine

There are other options besides throwing it away.

You can (a) trade it in for a new Mac (I just received $430 for my 2014 MBP) or (b) sell it for parts on eBay.

> Or what if you're a klutz and spill your grape juice glass over your keyboard? Congrats, now you're -$2000 lighter since there's no way to take it apart and clean the sticky mess inside.

You can unscrew a Mac and clean it out. You can also take it into Apple for repair.

My F500 company has a 5 year tech refresh policy, and old laptops are trashed, not donated or resold.

Doesn't that mean that the problem is the policy of the F500 company and not whatever the supplier had in mind?

Puts on ewaste management company uniform

"Yeah <normal guy>'s out sick today, I'm his replacement."


In all seriousness I would absolutely love to do this sort of thing IRL, in situations where I'll just make incompetent management etc unimpressed (because I'm showing their inefficiency) and there wouldn't be any real/significant ramifications (eg machines that processed material a couple notches more interesting than what PCI-DSS covers).

But obviously I don't mean I'd literally use the above example to achieve this ;P

I've just learned a bit about (eh, you could say "been bitten by") poorly coordinated e-waste management/refurbishment/etc programs - these can be a horrendously inefficient money-grab if the top-level coordination isn't driven by empathy in the right places. So I would definitely get a kick out of doing something like that properly.

I cannot imagine that represents the majority of the market.

Exactly. Corporate never risks them getting into the wrong hands, no matter how theoretical that risk might be. Same for phones.

We used to remove the hard drives which takes about 20 seconds on a desktop that has a "tool-less" case. Then donate the computer to anybody including the employees if they want it.

It takes a few minutes to do that on a laptop but it's not that long.

One of our interns got that job. On the left are 200 old laptops: take out the SSD and smash it with a hammer. On the right are 200 new laptops: don't touch.

Turns out someone got confused between left and right, gave him the wrong instructions, and he smashed 200 brand new SSDs. Ouch.

Seems like FDE would solve that and make it a hardware non-issue.

Works if you trust FDE. Some FDE implementations are broken, like this: https://www.zdnet.com/article/flaws-in-self-encrypting-ssds-...
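The reason FDE (when it actually works) makes smashing drives unnecessary is "crypto-erase": destroy the key and the bits left on the drive are unrecoverable. A toy sketch of the idea below; the SHA-256 counter-mode keystream is for illustration only, not a real disk cipher (real drives use something like AES-XTS):

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # SHA-256 in counter mode -- a toy keystream for illustration,
    # NOT a real disk-encryption scheme.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    # Stream-cipher XOR: the same operation encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)             # lives in the drive controller / TPM
plaintext = b"quarterly payroll data"
ciphertext = xor_bytes(plaintext, keystream(key, len(plaintext)))

# With the key, decryption recovers the data:
assert xor_bytes(ciphertext, keystream(key, len(ciphertext))) == plaintext

# "Crypto-erase" at decommissioning: wipe the (tiny) key, keep the drive.
key = None  # without the key, the ciphertext is just noise
```

Which is also why the caveat above matters: if the FDE implementation is broken, the "just wipe the key" guarantee evaporates.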

I suspect the disposal company your company contracts with parts them out and resells them. Although if you're literally throwing them in the dumpster, that's not even legal in many jurisdictions.

Damn, they could at least trash just the hard drives and give the machines to local schools or something...

All of my Apple laptops (maybe even all their products) see about 5 to 8 years of service. Sometimes with me, sometimes as hand-me-downs. So they’ve been pretty excellent at not winding up in the trash.

Even software updates often stretch as far back as 5 year old models, so they’re pretty good with this.

Big Sur is officially supported by 2013 Mac models (8 years).

iOS 15 is supported by the 6s, which was 2015. So 6 years.

And I still know people using devices from these eras. Apple may not be repair friendly, but at the end of the day, their devices are the least likely to end up in the trash.

And here I am sitting at my 2011 Dell Latitude wondering what is so special about that. My sis had my 2013 Sony Duo, but that's now become unusable with its broken built-in battery. Yes, 5 to 8 years of service is nice, but not great or out of the norm for a $1000+ laptop.

I guess 2011 Latitudes were the pro models? They were built like a tank... not anymore, I guess.

Because they run windows.

If you look at android phones, you're looking at a few years only.

Because of software

Parent is talking about laptops, I am talking about laptops, why are you talking about smartphones? Though I also used my Samsung S2 Plus from 2013 to 2019, and that was fairly cheap. I don't know any iPhone users who have had theirs for longer.

My iPhone 2G from 2007 still worked well into 2013-2014.

> it's the packaging that is massively unfriendly to the environment, stupid.

Of all the garbage my family produces over the course of time, my Apple products probably take less than 0.1% of my family's share of the landfill. Do you find this to be different for you? Or am I speaking past the point you're trying to make here?

Is there an estimate of what the externality cost is for the packaging per unit? Would be useful to compare to other things that harm the environment like eating meat, taking car rides, to know how much I should think about this. E.g. if my iphone packaging is equivalent to one car ride I probably won't concern myself that much, but if it's equivalent to 1000 then yeah maybe I should. Right now I really couldn't tell you which of those two the true number is closer to. I don't expect we would be able to know a precise value but just knowing which order of magnitude it is estimated to be would help.

It absolutely doesn't render the carbon-neutrality of the chip useless. Concern about waste and concern about climate change are bound by a political movement and not a whole lot else. It's not wrong to care about waste more, but honestly it's emissions that I care about more.

> It's not wrong to care about waste more, but honestly its emissions that I care about more.

Waste creates more emissions. Instead of producing something once, you produce it twice. That's why waste is bad, it's not just about disposing of the wasted product.

Not if the company that produces them is carbon neutral, which is theoretically the argument here. In general you're obviously correct, but I'd expect most emissions aren't incurred from waste.

My family has 3 MBP's. 2 of them are 10 years old, 1 of them is 8. When your laptops last that long, they're good for the environment.

Won't that make the chip bigger and/or slower? I think it's the compactness, with the main components so close together and finely tuned, that makes the difference. Making it composable probably means making it bigger (hence it won't fit in as small a space) and probably slower than it is now. Just my two cents though; I am not a chip designer.

Just making the SSD (the only part that wears) replaceable would greatly increase the lifespan of these systems, and while supporting M.2 would take up more space in the chassis, it would not meaningfully change performance or power.

Aren't most of the components in MacBooks recyclable? If I remember correctly, Apple has a recycling program for old Macs, so it's not like these machines go to landfill when they're past their time or broken.

I believe Apple tries to use mostly recyclable components. And they do have a fairly comprehensive set of recycling / trade-in programs around the globe: https://www.apple.com/recycling/nationalservices/

That being said, I haven’t read any third-party audits to know if this is more than Apple marketing. Would be curious if they live up to their own marketing.

> Would be curious if they live up to their own marketing.

Do people really think that companies like Apple et al. (who have a huge number of people following them, eager to rip into them at every opportunity) could get away with a "marketing story" like that? Like, really, Apple just making all that up and _not one single person_ whistleblowing if it were a lie?

AFAIK their recycling is shredding everything, whether or not it still works, and separating out the gold and some other metals.

When you trade-in your Mac you are asked if the enclosure is free of dents, turns-on, battery holds charge etc.

Strange that they would ask these questions if they were simply going to shred the device.

>> Strange that they would ask these questions if they were simply going to shred the device.

Why not? It does sound nicer that way, and some customers may actually think they resell them just because of this...

Yeah it’s all a big conspiracy.

As others have said, the option to easily configure the computer post-purchase would make a massive difference in terms of its footprint

>This renders the carbon-neutrality of the chips meaningless.

You have a point but if they were actually truly neutral it wouldn't matter if you make 100,000 of them and throw them away.

> what they won't do is put the chip in an expandable and repairable system

Because that degrades the performance overall. SoC has proven itself to simply be more performant than a fully hotswappable architecture. Look at the GPU improvements they're mentioning - PCIe 5.0 (yet unreleased) maxes out at 128GB/s, whereas the SoC Apple has announced today is transferring between the CPU/GPU at 400GB/s.

In the end, performance will always trump interchangeability for mobile devices.

Comparing memory interface vs PCIe isn't valid. Comparing LPDDR5 vs DDR5 latency, throughput, and power consumption would be good.

In this case I'd argue it is, because to communicate between the CPU and GPU on a non-SoC computer, you need to send that data through the PCIe interface. On the M1 SoC, you don't. They operate differently, and that's the main point here. You have to add those extra comparison points.
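For what it's worth, the "~128 GB/s" PCIe 5.0 figure falls out of the spec numbers (32 GT/s per lane, 128b/130b encoding) only if you count both directions. A quick back-of-the-envelope check:

```python
def pcie_bw_gbs(gt_per_s: float, lanes: int) -> float:
    """Per-direction PCIe bandwidth in GB/s.

    Raw transfer rate * 128b/130b encoding efficiency * lane count,
    divided by 8 bits per byte.
    """
    return gt_per_s * (128 / 130) * lanes / 8

per_direction = pcie_bw_gbs(32, 16)   # PCIe 5.0 x16: ~63 GB/s each way
bidirectional = 2 * per_direction     # ~126 GB/s, the quoted "128 GB/s"
```

So even the most generous bidirectional PCIe 5.0 reading is well short of the 400 GB/s unified-memory figure, though, as noted above, a memory bus and an expansion bus aren't measuring quite the same thing.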

Yep, that's definitely true.

But it would be nice if the SoC itself were a module you could upgrade while keeping the case/display; that would probably cut down on environmental impact as well...

I just used Apple's trade-in service for my 2014 MacBook Pro and received $430.

So there is another, quite lucrative, option besides discarding it.

Good to know and thanks! Looking forward to doing the same as I have a 2014 MBP that works and is in good condition.

They would make less money if they made the chip repairable. This doesn't have to make them evil. Apple being more profitable also means they can lower the cost and push the technological envelope forward faster. Every year we will get that much faster chips. This is good for everyone.

This doesn't mean Apple's carbon footprint has to suffer. If Apple does a better job recycling old Macbooks than your average repair guy who takes an old CPU and puts in a new one in a repairable laptop then Apple's carbon footprint could be reduced. I remember the days when I would replace every component in my desktop once a year, I barely thought about recycling the old chips or even selling them to someone else. They were simply too low value to an average person to bother with recycling them properly or reselling them.

> They would make less money if they made the chip repairable

How would a 5 nanometer chip be "repairable"? Who would be able to repair such a chip and what would the tool cost be?

Chips aren't made out of vacuum tubes any more, you can't "fix" a transistor

>But what they won't do is put the chip in an expandable and repairable system so that you don't have to discard and replace it every few years. This renders the carbon-neutrality of the chips meaningless. It's not the chip, it's the packaging that is massively unfriendly to the environment, stupid.

Mac computers last way longer than their PC counterparts.

Is Apple's halo effect affecting your perception of the Mac vs PC market? iPhones last longer because they have much longer software updates and are more powerful to start with. Neither of these factors applies to Macs vs PCs.

Yeah but some Android stuff and windows stuff is so low-end that it only lasts for like 2 years and then it's functionally obsolete because of software. All the mac stuff from 10 years ago seems to still be able to work and has security updates.

> It's not the chip, it's the packaging that is massively unfriendly to the environment, stupid.

Who are you calling stupid? If you're going to call someone or something stupid, don't do it in a stupid way.

It's an allusion to Bill Clinton's 1992 presidential campaign slogan: "It's the economy, stupid." See: https://en.wikipedia.org/wiki/It%27s_the_economy,_stupid

You can always put your money where your mouth is and support someone that is doing all of the above:


Yes, I have one on order! :-)

This isn't relevant to the chips. Take this to the other thread.

agree with your point, but one could also look at the performance/power savings and use that in an argument for environmentally friendliness

What the hell does this have to do with the chips?

Yeah, but one of them doesn't cost a ton to implement (what they're doing) and the other one would cost them a ton through lost sales (what you're asking for).

Always follow the money :-)

Erhh... I think OP gets it, he's just calling out the greenwashing.

I always thought it was strange that "integrated graphics" was, for years, synonymous with "cheap, underperforming" compared to the power of a discrete GPU.

I never could see any fundamental reason why "integrated" should mean "underpowered." Apple is turning things around, and is touting the benefits of high-performance integrated graphics.

Very simple: thermal budget. Chip performance is limited by thermal budget. You just can't spend more than roughly 100W in a single package, without going into very expensive and esoteric cooling mechanisms.

This is mostly wrong. The real issue has always been memory bandwidth: the highest-end consumer x86 CPU has about the same memory bandwidth as a dGPU from 10 years ago. The M1 is extremely competitive with modern dGPUs, only a bit behind a 6900 XT.

If Intel/AMD were serious about iGPUs, they would implement a solution for memory bandwidth (and they have: Intel Iris Pro eDRAM, AMD's gaming consoles using GDDR, AMD's upcoming stacked SRAM). So I believe the core problem is that the market didn't seriously want a great iGPU; it was fine with a poorer iGPU or a pricey dGPU from Nvidia.

>The M1 is extremely competitive with modern dGPUs, only a bit behind a 6900 XT.

Do you have a source for this?

Apple describes the M1 Max as having performance similar to Nvidia's 3080 Laptop GPU, which scores around 16,500 on PassMark. For comparison, the AMD 6900 XT desktop GPU scores 27,000, while the Nvidia 3080 desktop GPU scores 24,500.

So the M1 Max is not as fast as a high-end desktop GPU. Still, it is incredible that you are getting a GPU that performs only slightly below a last-generation 2080 desktop GPU at just 50-60 watts.

Yes, Apple's marketing materials claim 400 GB/s, while the 6900 XT is 512 GB/s. This is very easily Googled. While memory bandwidth isn't everything, it is the major bottleneck in most graphics pipelines. An x86 CPU with 3200 MHz memory has about 40 GB/s of bandwidth, which more or less makes high-end integrated graphics impossible.
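The "~40 GB/s" figure comes straight from the DDR math (theoretical peak is 51.2 GB/s for dual-channel DDR4-3200; ~40 GB/s is typical achieved throughput). The same formula, applied to the reported M1 Max configuration (a 512-bit LPDDR5-6400 interface, an assumption based on published specs rather than the announcement itself), lands right on Apple's 400 GB/s claim:

```python
def peak_bw_gbs(mt_per_s: float, bus_bytes: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s:
    transfers/s * bytes per transfer * channel count."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# Typical desktop: two 64-bit (8-byte) channels of DDR4-3200.
ddr4_dual = peak_bw_gbs(3200, 8, 2)   # 51.2 GB/s theoretical peak

# M1 Max as reported: one 512-bit (64-byte) LPDDR5-6400 interface (assumed).
m1_max = peak_bw_gbs(6400, 64, 1)     # ~409.6 GB/s, matching the ~400 GB/s claim
```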

Ah, I misunderstood your comment. When you said it was competitive with the 6900 XT I thought you were talking about GPU performance in general, not just in terms of memory bandwidth.

According to the numbers Apple is touting, the M1 Max is competitive with modern GPUs in general, being roughly on par with a 3070 (laptop version) or a 2080 (desktop version). They've still got a ways to go, but this is shockingly close, particularly given their power envelope.

> this is mostly wrong. The real issue has always been memory bandwidth.

Not really wrong. Memory bandwidth is only a limitation for a very narrow subset of problems.

I've gone back and forth between server-grade AMD hardware with 4-channel and 8-channel DDR4 and consumer-grade hardware with 2-channel DDR4. For most of my work (compiling, mostly) the extra memory bandwidth didn't make any difference. The consumer parts are actually faster for compilation because they have a higher turbo speed, despite having only a fraction of the memory bandwidth.

Memory bandwidth does limit certain classes of problems, but we mostly run those on GPUs anyway. Remember, the M1 Max memory bandwidth isn't just for the CPU. It's combined bandwidth for the GPU and CPU.

It will be interesting to see how much of that memory can be allocated to a M1 Max. It might be the most accessible way to get a lot of high-bandwidth RAM attached to a GPU for a while.

GP is talking specifically about GPUs. iGPUs are 100% bottlenecked by memory bandwidth; specifically, it is the biggest bottleneck for every single purchasable iGPU on the market (excluding the M1 Pro/Max).

Your compute anecdotes have no bearing on (i)GPU bottlenecks.

They're talking specifically about GPUs.

As the charts Apple shared in the event showed, you hit diminishing returns in performance/watt pretty quickly.

Sure. It'd be tough to be the top performing chip in the market, but you can get pretty close.

I dunno. Setting power limits on a 3090 above 50% has a nearly linear effect on performance.

> I never could see any fundamental reason why "integrated" should mean "underpowered."

There was always one reason: limited memory bandwidth. You simply couldn't cram in enough pins and traces for all the processor I/O plus a memory bus wide enough to feed a powerful GPU (at least not at a reasonable price).

We solved that almost a decade ago now with HBM. Sure, the latencies aren't amazing, but the power consumption numbers are and large caches can hide the higher access latencies pretty well in almost all cases.

PS4 / PS5 / XBox One / XBox Series X are all iGPU but with good memory bandwidths.

The only time I can remember HBM being used with some kind of integrated graphics was that strange Intel NUC with a Vega GPU, and IIRC they were on the same die.

That product had an Intel CPU and AMD GPU connected via PCIe on the same package, not the same die. It was a neat experiment, but it was really just a packaging trick.

Still confused how 32 core M1 Max competes with Nvidia's thousands-of-cores GPUs. Certainly there are some things that are nearly linear with core count, or otherwise they wouldn't keep adding cores, right?

Edit: Found answer here. GPU core is not the same thing as a CUDA core. https://www.reddit.com/r/hardware/comments/73i3ne/why_do_app...

The desktop RTX 3070 has 46 SMs, which are the most comparable thing to Apple's cores.

NVIDIA defines any SIMD lane to be a core. They have recently gotten more creative with the definition: they were able to double FP32 executions per unit (versus the previous gen) and hence, in marketing materials, doubled the number of "CUDA cores".

Funny, all these years I've been wondering how they possibly packed so many "cores" into those things.

The apples to apples comparison would be CUDA cores to execution units. Basically how many units which can perform a math operation. Apple's architecture has 128 units per core, so a 32 core M1 Max has the same theoretical compute power as 4096 CUDA cores. This of course doesn't take into consideration clock speed or architectural differences.
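A rough back-of-envelope comparison illustrates why the unit counts alone aren't enough. This sketch assumes an M1 Max GPU clock of ~1.3 GHz (Apple doesn't publish GPU clocks, so treat this as a ballpark guess) and counts an FMA as two FLOPs:

```python
# Back-of-envelope peak FP32 throughput, counting an FMA as 2 FLOPs/cycle.
# The M1 Max clock (~1.3 GHz) is an assumption; Apple does not publish it.

def tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = ALUs * 2 FLOPs/cycle (FMA) * clock."""
    return alus * 2 * clock_ghz / 1000

m1_max = tflops(alus=32 * 128, clock_ghz=1.3)   # 32 cores * 128 units = 4096
rtx_3070 = tflops(alus=5888, clock_ghz=1.73)    # desktop 3070: 5888 CUDA cores

print(f"M1 Max:   ~{m1_max:.1f} TFLOPS")    # ~10.6
print(f"RTX 3070: ~{rtx_3070:.1f} TFLOPS")  # ~20.4
```

The 4096-unit figure matches Apple's claimed ~10.4 TFLOPS only under a clock assumption like the one above, which is the point: ALU count times clock is what matters, not "core" counts in either vendor's marketing sense.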

GPU core == CUDA SM

Perhaps with Vista? "Integrated" graphics meant something like the Intel 915, which couldn't run "Aero". Even if you had the Intel 945, if you had low-bandwidth RAM, graphics performance still stuttered. Good article: https://arstechnica.com/gadgets/2008/03/the-vista-capable-de...

Video game consoles have been using integrated graphics for at least 15 years now, since Playstation 3 and Xbox 360.

You are mistaken. On both the PS3 and Xbox 360, the CPU and GPU are on separate chips made by different vendors (the CPU by IBM and GPU by Nvidia in the case of the PS3; the CPU by IBM and GPU by ATI for the Xbox 360). In the PS4/Xbox One generation, however, they both use a single die with unified memory for everything, and their GPUs could be called integrated.

For the 360, from 2010 production (when they introduced the 45nm shrink), the CPU and GPU were merged into a single chip.

When they did that they had to deliberately hamstring the SOC in order to ensure it didn’t outperform the earlier models. From a consistency of experience perspective I understand why, but it makes me somewhat sad that the system never truly got the performance uplift that would have come from such a move. That said there were significant efficiency gains from that if I recall.

Yup. Prior to that they were absolutely separate dies, just on the same package.

If you mean including PS3 and X360, these two consoles had discrete GPUs. The move to AMD APUs was on the Xbox One and PS4 generation

Longer, since integrated graphics used to mean integrated onto the north bridge and its main memory controller. nForce integrated chipsets with GPUs in fact started from the machinations of the original Xbox switching to Intel from AMD at the last second.

In that case, it's more like discrete graphics with integrated CPU :)

Yeah, and vendors like Bungie are forced to cap their framerates at 30fps (Destiny 2).

They capped PC as well.

If they did, it definitely wasn't at 30. I was getting 90+ on my budget rig.

But no, I don't think they did.

Destiny 2 is capped on PC? The cutscenes are but the actual game is not

It used to have a bug that randomly capped the fps at 30. Only toggling vsync off and on again could fix it. I have no idea whether that has been fixed.

The software side hasn't been there on x86 GP platforms, even though AMD tried. It's worked out better on consoles.

What software is missing? I figured the AMD G-series CPUs used the same graphics drivers and same codepaths in those drivers for the same (Vega) architecture.

My impression was that it was still the hardware holding things back: Everything but the latest desktop CPUs still using the older Vega architecture. And even those latest desktop CPUs are essentially PS5 chips that got binned out.

Deep OS support for unified memory architectures, for one. Things they tried to do with HSA etc. Also, NVidia winning so much GPU programming mindshare with CUDA, and OpenCL failing to take off on mobile (dooming follow-on OpenCL development plans), didn't help.

In the wider picture, gpu compute in general on PC also failed to become mainstream enough to sway consumer choices. Development experience for GPUs is still crap vs the cpu, the languages are mostly bad, there's massive sw platform fragmentation among os vendors and gpu vendors, driver bugs causing OS crashes left and right, etc.

Re your impression, yes, AMD shifted focus more toward cpu from gpu in their SoCs after a while when their initiatives failed to take off outside consoles. But it's been an ok place to be, just keeping the gpu somewhat ahead of Intel competition and getting some good successes in the cpu side.

iGPU Vega is actually really, really good esp when it comes to perf/watt. It is bottlenecked by the slow memory bandwidth. DDR5 will more or less double iGPU performance.

How do they get 200/400GB per second RAM bandwidth? Isn't that like 4/8-channel DDR5, i.e. 4/8 times as fast as current Intel/AMD CPUs/APUs? (E.g. https://www.intel.com/content/www/us/en/products/sku/201837/... with 45.8GB/s)

Laptop/desktop have 2 channels. High-end desktop can have 4 channels. Servers have 8 channels.

How does Apple do that? I was always assuming that having that many channels is prohibitive in terms of either power consumption and/or chip size. But I guess I was wrong.

It can't be GDDR because chips with the required density don't exist, right?

It's LPDDR5, which maxes out at 6.4Gbit/s/pin, on a 256bit/512bit interface.

It's much easier to make a wider bus with LPDDR5 and chips soldered on the board than with DIMMs.
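The headline numbers fall straight out of bus width times per-pin rate. A quick sketch (peak theoretical figures only; sustained bandwidth in practice is lower):

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 * per-pin rate in Gbit/s.
# Back-of-envelope only; real sustained bandwidth is lower than peak.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

m1_pro = bandwidth_gbs(256, 6.4)     # 256-bit LPDDR5 @ 6.4 Gbit/s/pin -> 204.8
m1_max = bandwidth_gbs(512, 6.4)     # 512-bit LPDDR5                  -> 409.6
ddr4_dual = bandwidth_gbs(128, 3.2)  # 2x 64-bit DDR4-3200             -> 51.2
```

So Apple's "200 GB/s" and "400 GB/s" are just the 256-bit and 512-bit LPDDR5-6400 peaks, roughly 4x and 8x what a dual-channel DDR4-3200 laptop gets.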

I hope we will see this on more devices. This is a huge boon to performance.

Might even forebode soldering RAM onto packages from here on out and forever.

The Steam Deck will probably have a crazy 100 GB/s of RAM bandwidth, twice that of current laptops and desktops.

Steamdeck is 88 GB/s using quad channel.

You aren't wrong: Apple is able to do this because implementing LPDDR is much more efficient from both a transistor and power-consumption point of view, and it is actually faster too. The tradeoff is that you can't put 8 or 16 DRAM packages on the same channel like you can with regular DDR, which means the M1 Max genuinely has a 64 GB limit, while a DDR system with the same bandwidth could hold 1 TB. Fortunately for Apple, there isn't really a market for a laptop with a TB of RAM.

They are using LPDDR5.

Not the usual DDR5 used in Desktop / Laptop.

DDR5 isn't common yet.

DDR4 is the common desktop/laptop chip. LPDDR5 is a cell-phone chip, so it's kinda funny to see a low-power RAM being used in such a wide configuration like this.

Don't cell phones outsell desktops, laptops, and servers? Smartphones aren't a toy: they are the highest-volume computing device.

They are also innovating with things like on-chip ECC for LPDDR4+, while desktop DDR4 still doesn't have ECC thanks to Intel intentionally gimping it for market segmentation.

Well sure. But that doesn't change the fact that DDR5 doesn't exist in any real numbers.

LPDDR5 is a completely different protocol from DDR5 by the way, just like GDDR5 is completely different from DDR3 it was based on. LPDDR3 was maybe the last time the low-power series was something like DDR3 (the mainline).

Today, LPDDR5 is based on LPDDR4, which diverged significantly from DDR4.

> They are also innovating with things like on-chip ECC for LPDDR4+

DDR5 will have on-chip ECC standard, even unbuffered / unregistered.

That sounds like HBM2, maybe HBM3, but that would be the first consumer product to include it AFAIK.

Basically the bus is really wide, and the memory dies must be really close to the main processing die. That memory was notably used on the RX Vega from AMD, and before that on the R9 Fury.


If that were the case you could probably see an interposer. And I think the B/W would be even higher.

And the price would be even higher.

If it was HBM it would have considerably higher bandwidth. A single HBM2E stack is 16GB at 460GBps, at 64GB that's 1.8TBps of bandwidth.

Disingenuous for Apple to compare these against 2017 Intel chips and call them 2x and 3.7x faster.

I would love to see how they fare against 2021 Intel and AMD chips.

They did that to compare against the last comparable Intel chips in a Mac, which seems rather useful for people looking to upgrade from that line of Mac.

Reminds me of AMD comparing their insane IPC increase when Ryzen first came out.

How is it disingenuous - defined in my dictionary as not candid - when we know precisely which chips they are comparing against?

They are giving Mac laptop users information to try to persuade them to upgrade from their 2017 MacBook Pros and this is probably the most relevant comparison.

I'm pretty sure they are comparing them with 2019/2020 MacBook Pros, which apparently have chips originally launched 2017.

Looks to me like they are comparing against 2020 MBPs (at least for 13 inch) which use 10nm Ice Lake so nothing to do with 2017 at all!!

Intel's 2021 laptop chips (Alder Lake) are rumoured to be released later this month (usually actual availability is a few months after "release"). I expect them to be pretty compelling compared to the previous generation Intel parts, and maybe even vs AMD's latest. But the new "Intel 7" node (formerly 10++ or something) is almost certainly going to be behind TSMC N5 in power and performance, so Apple will most likely still have the upper hand.

I'd still bet on Apple for the all round package for a laptop but Intel should be coming out of the gates flying when Alder Lake launches.

This is their first microarchitecture that actually reacts to Zen, from what I've heard.

Various leaked benchmarks show it outperforming the comparable Ryzens (and it's pretty obvious these rumours are sanctioned by Intel, given their copious omission of wattage numbers).

Are those the ones with big.LITTLE designs already?

The slide where they say it's faster than an 8-core PC laptop CPU compares it against the 11th-gen i7-11800H [1]. So it's not as fast as the fastest laptop chip, and it's certainly not as fast as the monster laptops that people put desktop CPUs in. But it uses 40% of the power of a not-awful 11th-gen 8-core i7 laptop. The M1 is nowhere near as fast as a full-blown 16-core desktop CPU.

I am sure we will see reviews against high-end Intel and AMD laptops very soon, and I won't be surprised if real-world performance blows people away, as it did with the M1 Air.

[1] https://live.arstechnica.com/apple-october-18-unleashed-even...

... and neither is the M1 (in any configuration) a "full blown 16 core desktop CPU".

Those will be called M2 and come later next year, according to the rumor mill anyway.

Sorry, that is what I meant. I'll edit.

When the M1 first released, they pulled some marketing voodoo and you always saw the actively cooled performance numbers listed alongside the passively cooled TDP :D Nearly every tech article/review reported those two numbers together.

I suspect that’s because they:

1. want to convince people still on Intel Macs to upgrade

2. lengthen the news cycle when the first units are shipped to the tech press and _they_ run these benchmarks

I thought they compared it with an i9-9980HK which is the top-end 2019 chip in the outgoing 16" MBP?

For me, it's the memory bandwidth. No other CPU comes even close. A Ryzen 5950X can only transfer about 43GB/s. This thing promises 400GB/s on the highest-end model.

As always, though, the integrated graphics thing is a mixed blessing. 0-copy and shared memory and all of that, but now the GPU cores are fighting for the same memory. If you are really using the many displays that they featured, just servicing and reading the framebuffers must be...notable.

A high end graphics card from nvidia these days has 1000GB/s all to itself, not in competition with the CPUs. If these GPUs are really as high of performance as claimed, there may be situations where one subsystem or the other is starved.

No consumer CPU comes close. I just saw an article about next-gen Xeons with HBM, though, which blows even this away (1.8TB/s theoretically), but what else would one expect from enterprise systems. Getting pretty damn excited about all the CPU manufacturers finally getting their asses into gear innovation-wise after what feels like a ridiculously long period of piss-warm "innovation".

Thanks to Apple in this case for taking a holistic approach to making a better computer.

The AMD chips in the PS5 and new Xbox reach 448GB/s and 326GB/s of bandwidth respectively with their unified memory.

Not entirely true. Xbox Series X has two different memory bandwidths. The first 10GB has 560GB/s and the last 6GB has only 336GB/s.

Yeah it's an interesting setup. I believe the 560 is prioritized for graphics processing and 336 for more general tasks.

And only 10 cores, so a 5950X completely wrecks an M1.

A 5950x uses ~4 times as much power.

What application makes full use of 5950x all cores?

Video encoding, ray tracing. Prime95 if you're trying to stress test your CPU and memory.

To add to the other comments: Spark, Stockfish, some games

make -j32

Compiling stuff.

The benchmark to power consumption comparisons were very interesting. It seemed very un-Apple to be making such direct comparisons to competitors, especially when the Razer Blade Advanced had slightly better performance with far higher power consumption. I feel like typically Apple just says "Fastest we've ever made, it's so thin, so many nits, you'll love it" and leaves it at that.

I'll be very curious to see those comparisons picked apart when people get their hands on these, and I think it's time for me to give Macbooks another chance after switching exclusively to linux for the past couple years.

I think that for the first time, Apple has a real performance differentiator in its laptops. They want to highlight that.

If Apple is buying Intel CPUs, there's no reason making direct performance comparisons to competitors. They're all building out of the same parts bin. They would want to talk about the form factor and the display - areas where they could often out-do competitors. Now there's actually something to talk about with the CPU/GPU/hardware-performance.

I think Apple is also making the comparison to push something else: performance + lifestyle. For me, the implication is that I can buy an Intel laptop that's nicely portable, but a lot slower; I could also buy an Intel laptop that's just as fast, but requires two power adapters to satisfy its massive power drain and really doesn't work as a laptop at all. Or I can buy a MacBook Pro which has the power of the heavy, non-portable Intel laptops while sipping less power than the nicely portable ones. I don't have to make a trade-off between performance and portability.

I think people picked apart the comparisons on the M1 and were pretty satisfied. 6-8 M1 performance cores will offer a nice performance boost over 4 M1 performance cores and we basically know how those cores benchmark already.

I'd also note that there are efforts to get Linux on Apple Silicon.

I was casually aware of Asahi before this announcement. Now I'm paying close attention to its development.

They are selling these to people who know what the competition is, and care.

Apple used to do these performance comparisons a lot when they were on the PowerPC architecture. Essentially they tried to show that PowerPC-based Macs were faster (or as fast as) Intel-based PCs for the stuff that users wanted to do, like web browsing, Photoshop, movie editing, etc.

This kind of fell by the wayside after switching to Intel, for obvious reasons: the chips weren’t differentiators anymore.

I think that Apple took a subtle, not so subtle stand: power consumption has to do with environmental impact.

Apple almost single-handedly made computing devices non-repairable or upgradable; across their own product line and the industry in general due to their outsized influence.

Just today I got one 6s and one iPhone 7 screen repaired (the 6s got the glass replaced, the 7 got the full assembly replaced) and the battery of the 6s replaced, at a shop that is not authorized by Apple. It cost me $110 in total.

Previously I got a 2017 MacBook Air's SSD upgraded using an SSD and an adapter that I ordered from Amazon.

What’s that narrative that Apple devices are not upgradable or repairable?

It's simply not true. If anything, Apple devices are the easiest to get serviced, since there are not many models and pretty much all repair shops can deal with all devices that are still usable. Because of this, even broken Apple devices are bought and sold all the time.

>Just today I got one 6s and one iPhone 7 screen repaired

Nice, except doing a screen replacement on a modern iPhone like the 13 series will disable your FaceID making your iPhone pretty much worthless.

>Previously I got 2017 Macbook Air SSD upgraded using an SSD and an adapter that I ordered from Amazon

Nice, but on the modern Macbooks, the SSD is soldered and not replaceable. There is no way to upgrade them or replace them if they break, so you just have to throw away the whole laptop.

So yeah, the parent was right: Apple devices are the worst for repairability, period. The ones you're talking about are not manufactured anymore and therefore don't represent the current state of affairs, and the ones that are manufactured today are built not to be repaired.

Hardware people are crafty; they find ways to transfer and combine working parts. The glass replacement (keeping the original LCD) I got for the 6s is not a procedure offered by Apple. Guess who doesn't care? The repair shop that bought a machine from China for separating and re-assembling the glass and LCD.

Screen replacement is $50, glass replacement is $30.

The iPhone 13 is very new; give it a few years and the hardware people will leverage the desire to not spend $1000 on a new phone when the current one works fine except for that one broken part.

Only if Apple lets them, as far as I have seen. The software won't even let you swap screens between two iPhone 13s. Maybe people will find a workaround, but it seems like Apple is trying its hardest to prevent it.

And yet they authorize shops to perform these repairs. They’re not trying to prevent repairs, they’re trying to ensure repairs use Apple-supplied parts. Which, sure, you may object to that… but it’s very different from saying they’re preventing repairs full stop. And there’s very little chance such an effort would do anything other than destroy good will.

And how will the crafty HW people replace the SSD storage on my 2020 MacBook if it bites the dust?

By changing chips. There are already procedures for fun stuff like upgrading the RAM on the non-Retina MacBook Airs to 16GB. Apple never offered a 16GB version of that laptop, but you can have it [0].

If there's demand, there will be a response.

[0] https://www.youtube.com/watch?v=RgEfMzMxX5E

You clearly don't have a clue how modern Apple HW is built, and why the stuff you're talking about on old Apple HW just won't work anymore on the machines built today.

I'm talking about 2020 devices where you can't just "change the chips" and hope it works like in the 2015 model from the video you posted.

Modern Apple devices aren't repairable anymore.

I would love to be enlightened about the new physics that Apple is using which is out of reach to the other engineers.


Anyway, people are crafty, and engineering is not an Apple-exclusive trade. Believe it or not, Apple can't do anything about the laws of physics.

> I would love to be enlightened about the new physics that Apple is using which is out of reach to the other engineers.

That’s known as private-public key crypto with keys burnt into efuses on-die on the SoC.

You can’t get around that (except for that one dude in Shenzhen who just drills into the SoC and solders wires by hand which happen to hit the right spots). But generally, no regular third party repair shop will find a way around this.
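To illustrate the general shape of such a pairing check, here is a toy sketch only: it uses a symmetric HMAC challenge-response as a stand-in, whereas Apple's actual scheme reportedly uses asymmetric keys fused on-die, so none of the names or details below reflect the real implementation.

```python
# Toy sketch of component pairing via challenge-response.
# NOT Apple's scheme: real designs use asymmetric crypto with keys in efuses;
# HMAC with a shared key is used here only to show the idea.
import hashlib
import hmac
import os

FACTORY_KEY = os.urandom(32)  # imagine this burnt into efuses at the factory


class Part:
    """A replaceable component whose controller IC holds a pairing key."""

    def __init__(self, key: bytes):
        self._key = key  # genuine paired parts carry the factory key

    def respond(self, challenge: bytes) -> bytes:
        return hmac.new(self._key, challenge, hashlib.sha256).digest()


def host_accepts(part: Part) -> bool:
    """SoC-side check: challenge the part and verify its response."""
    challenge = os.urandom(16)
    expected = hmac.new(FACTORY_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(part.respond(challenge), expected)


genuine = Part(FACTORY_KEY)
swapped = Part(os.urandom(32))  # transplanted third-party part: wrong key

print(host_accepts(genuine))  # True
print(host_accepts(swapped))  # False
```

The point of the sketch: the check happens between ICs, so a swapped part fails even if it is physically identical, which is why screen swaps between two genuine phones can still trip the lockout.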

I know about it; it simply means that someone will build a device that automates the thing the dude in Shenzhen does, or they will mix and match devices that have different kinds of damage. E.g. a phone with a destroyed (irreparable) screen can donate its parts to phones that have a broken Face ID lens.

You know, these encryption authentications work between ICs, not between lenses and motors. Keep the coded IC, change the coil. Things also have different failure modes: for example, a screen might fail due to glass breakage (and glass cannot be coded), so the repair shop can replace the broken part of the assembly while keeping the IC that handles the communication with the mainboard. Too complicated for a street shop? Someone will build a B2B service that does it: shops will ship parts to them, they will ship them back, leaving only the installation to the street shop.

The possibilities are endless, some easier, some harder, but we are talking about talent that makes all kinds of replicas of all kinds of devices. With billions of iPhones out there, it's actually a very lucrative market to be able to salvage a $1000 device; their margins could be even better than Apple's when it charges $100 to change the glass of the LCD assembly.

Compared to a thinkpad where I can replace the parts with a screwdriver myself, this is still an incredibly wasteful effort.

>I would love to be enlightened about the new physics that Apple is using which is out of reach to the other engineers.

Watch Louis Rossmann on YouTube.

I know Louis; he made a career of complaining that it's impossible to repair Apple devices while repairing Apple devices.

Instead of watching videos and getting angry about Apple devices being impossible to repair, I get my Apple devices repaired when something breaks. Significantly more productive approach, you should try it.

>I get my Apple devices repaired when something breaks

Your old Apple devices, which are known to be very easy to repair. You wouldn't be so confident with the latest gear.

But why spoil it for you? Let's talk in a few years, when you find it out the hard way on your own skin.

Louis has been making "Apple impossible to repair" videos since forever. It's not an iPhone 13 thing; give it a few years and you can claim that the iPhone 17 is impossible to repair, unlike the prehistoric iPhone 13.

Here is a video from 2013, him complaining that Apple doesn't let people repair their products: https://www.youtube.com/watch?v=UdlZ1HgFvxI

He recently moved to a new, larger shop in an attempt to grow his Apple repair operations. Then he had to move back to a smaller shop because, as it turns out, it wasn't Apple who was ruining his repair business.

Apple is now using SoCs where the CPU and RAM are one chip package. How are you going to upgrade the RAM here, even with the mother of all reflow stations?

You don't. It's technological progress, similar to when we lost the ability to repair individual transistors with the introduction of integrated circuits. If this doesn't work for you, you should stick with the old tech; I think the Russians did something like that in their Soviet-era avionics. There are also audiophiles who never switched to transistors and still use vacuum tubes, and the Amish, who stick to horses and candles, choosing to preserve their way of doing things and avoid the problems of electricity and powered machinery.

You need to make a choice sometimes. Often you can't have small, efficient, and repairable all at once.

> Nice, except doing a screen replacement on a modern iPhone like the 13 series will disable your FaceID making your iPhone pretty much worthless.

Only if you go to someone who isn't an authorised Apple repairer.

> Nice, but on the modern Macbooks, the SSD is soldered and not replaceable. There is no way to upgrade them or replace them if they break, so you just have to throw away the whole laptop.

I mean, you can replace the logic board. Wasteful, sure, but there's no need to throw out the whole thing.

People also replace ICs all the time. Heat it, remove the broken SSD chip, put in the new one, re-heat.

I know, but I can understand people preferring socketed parts.

In modern Apple laptops (2018 and later), the storage is soldered as the memory has been since 2015. Contrast this with a Dell XPS 15 you can buy today within which you can upgrade/replace both the memory and the storage. This is the case with most Windows laptops. The exception is usually the super thin ones that solder in RAM Apple-style, but there are some others that do as well.

There's also the fact that Apple does things like integrate the display connector into the panel part. So, if it fails - like when Apple made it too short with the 2016 and 2017 Macbook Pros causing the flexgate controversy - it requires replacing a $600 part instead of a $6 one.

True, but you are talking about devices that are 4-6 years old. Storage is now soldered, RAM has been soldered for a while now, and with Apple Silicon it's part of the SoC package.

For context, Apple started soldering RAM in 2015 and soldering storage in 2018.

Perhaps leading to fewer failures and longer device lifespans.

As far as I understand, the less components and heat, the longer the electronics keep working.

That isn't "less components"; that's the same components, just soldered so customers can't replace them.

It removes connectors and may remove buffers. It is hard for a memory SIMM to pop out over time from laptop use if everything is soldered in.

Not that I've heard of anyone's DIMM popping out over time, but I'd rather pop it back in than have to ship the machine to a repair shop with a BGA workstation if a DRAM chip develops a fault over time.

Newer MacBooks have both the SSD and RAM soldered on the board; they're no longer user-upgradable unless you have a BGA rework station and know how to operate it.

Single handedly?

>According to iFixit, the Surface Laptop isn’t repairable at all. In fact, it got a 0 out of 10 for repairability and was labeled a “glue-filled monstrosity.”

The lowest scores previously were a 1 out of 10 for all previous iterations of the Surface Pro


One might argue that Surface laptops were Microsoft's answer to MacBooks.

If repairability was important to consumers, it would be a selling point for competitors. But it's not.

If Apple actually cared about sustainability, they would make their devices repairable.

They are repairable, but not by consumers in most cases.

They are mostly not repairable even by authorized repair providers.

Basically, they can only change a few components (keyboard, display (with assembly), motherboard, and probably the aluminium case), but that's it.

You literally cannot replace the battery in that Surface Laptop without destroying the whole thing.

It's made to be thrown away, instead of repaired.

Conversation is about Apple.

The conversation is about repairability, and Apple has yet to make a line of products that consistently earns a repairability score of 1 or less.

And they get away with it because Apple normalized it.

Weirdly these machines have a "6.1 repairability rating" when you go in their store. I wonder what ifixit will think of them.

I'm still daily-driving a 2015 MBP. Got the battery replaced, free under warranty, a few years ago. Running the latest macOS without any issues.

The phones in my family are an iPhone 6S, iPhone 8 and an iPhone XS. All running the latest iOS. The 6S got a battery swap for 50€, others still going strong.

Similar with tablets, we have three and the latest one is a 2017 iPad Pro. All running the latest iPadOS.

Stuff doesn't need to be repairable and upgradable if it can outlast the competition by a factor of two while still staying on the latest official OS update.

Can't do that with any Android device. A 6 year old PC laptop might still be relevant though.

Apparently, you didn't compare Apple devices with what the bulk of the market consists of.

Also, implying that repairability is required for environmental sustainability is questionable at best. People in their vast majority tend to get rid of 5 years old phones and laptops.

It’s almost like it’s just about marketing and not much else…

FWIW, they are in general quite accurate with their ballpark performance figures. I expect the actual power/performance curves to be similar to what they showed. Which is interesting, because IIRC the plots from Nuvia before they were bought showed their cores had a similar profile. It would be exciting if Qualcomm could have something good for a change.

If we can get an actual Windows on ARM ecosystem started things will get really exciting really quickly.

There will still be the question of porting the bulk of Windows software to Arm.

If Apple can implement Rosetta 2, then surely Microsoft can do it as well (they've actually done it, just with terrible performance).

> I'll be very curious to see those comparisons picked apart when people get their hands on these, and I think it's time for me to give Macbooks another chance after switching exclusively to linux for the past couple years.

I really enjoy linux as a development environment, but this is going to be VERY difficult to compete with..

Asahi Linux is making great strides in supporting M1 Macs, and they're upstreaming everything so your preferred distro could even support them.


You can always run Linux in a VM too

That's just not the same.

Yeah, this is the first time they've actually compared against something with possibly higher performance.

For the first time ever, they have something to brag about in their laptop specs. They are no longer just pulling parts off the shelf.

They're just marketing to the audience (actual Pros; not the whole 'uni student with macbook pro for taking notees'.

I'm not going to wait for the comparisons this time. Maxing this baby out right now.

Honest question, what do you do where a $6,099 laptop is justifiable?

I skip getting a Starbuck's latte, and avoid adding extra guac at Chipotle.

I'm kidding, that stuff has no effect on anything.

Justifiable, as in "does this make practical sense", is not the word, because it doesn't. Justifiable, as in, "does it fit within my budget?" yes that's accurate. I don't have a short answer to why my personal budget is that flexible, but I do remember there was a point in my life where I would ask the same thing as you about other people. The reality is that you either have it or you don't. That being said, nothing I had been doing for money is really going to max this kind of machine out or improve my craft. But things that used to be computationally expensive won't be anymore. Large catalogues of 24 megapixel RAWs used to be computationally expensive. Now I won't even notice, even with larger files and larger videos, and can expand what I do there along with video processing, which is all just entertainment. But I can also do that while running a bunch of docker containers and VMs... within VMs, and not think about it.

This machine, for me, is the catalyst for greater consumptive spending though. I've held off on new cameras, new NASs, new local area networking, because my current laptop and devices would chug under larger files.

Hope there was something to glean from that context. But all I can really offer is "make, or simply have, more money", not really profound.

Thank you for a very honest and thorough answer.

There's also future-proofing to some degree. I'll probably get a somewhat more loaded laptop than I "need" (though nowhere near $6K) because I'll end up kicking myself if 4 years from now I'm running up against some limit I underspeced.

Yeah, I forgot to mention that, it's a given for me.

Like there’s the potential tax deductibility, along with being a store of value (it will probably be worth $2300 in a few years, but that’s okay), making it easier to rationalize future laptops by trading this one in. But I’m not betting on any of that.

I’ve just been waiting for this specific feature set, I’m upgrading from a maxed out dual GPU 2015 MBP that I purchased in 2017.

I skipped the whole divergence and folly.

No butterfly keyboards, no tolerating usbc while the rest of the world caught up, no usbc charging, no touch bar, I held out. And now I get Apple Silicon which already had rave reviews and blew everything else out of the water in the laptop space, and now I get the version with the RAM I want.

Surprisingly little fanfare, on my end. Which is kind of funny because I remember fondly configuring expensive maxed out Apple computers on their website that I could never afford. Its definitely more monumental if you save money for one specific thing and achieve that. But now I just knew I was already going to do it if Apple released a specific kind of M1 upgrade in a specific chassis, which they did and more. So it fit within my available credit, and which I’ll pay off likely by the end of the week, and I’m also satisfied that I get the points and a spending promotion my credit card had told me about.

But I was going to buy this regardless.

A few thousand dollars per year (presumably it will last more than one year) is really not much for the most important piece of equipment a knowledge worker will be using.

It's still a waste if you don't need it though. This money could be spent on much more useful things.

If it improves compilation speeds by 1% then it's not a waste.

My time is worth so much more to me than money.

Then why are you using a laptop?

Why even bother with such an inane answer?

It's because I need to use my computer whilst not physically attached to the same spot i.e. between work/home, travel.

You know the same reason as almost everyone else.

If faster compilation speeds matters as much as you said earlier then I'm sure it would be worth investing in machines for both work and home.

I'm not sure I understand how this would help a user who wants to stay mobile, or what it has to do with 'better investments'.

What does a separate machine for work have to do with "compilation speeds" in the first place?

which Mac desktop has similar performance?

Like what?

Guac at Chipotle

I mean, the Audi R8 has an MSRP > $140k and I've never been able to figure out how that is justifiable. So I guess dropping $6k on a laptop could be "justified" by not spending an extra $100k on a traveling machine?

To be clear, I'm not getting one of these, but there's clearly people that will drop extra thousands into a "performance machine" just because they like performance machines and they can do it. It doesn't really need to be justified.

Truthfully, I'm struggling to imagine the scenario where a "performance laptop" is justifiable to produce, in the sense you mean it. Surely, in most cases, a clunky desktop is sufficient, can reasonably be shipped when traveling, and can provide the required performance in 99% of actual high-performance scenarios.

If I had money to burn, though, I'd definitely be buying a luxury performance laptop before I'd be buying an update to my jalopy. I use my car as little as I possibly can. I use my computer almost all the time.

Dude this is Hacker News. I'm surprised when I meet an engineer who doesn't have a maxed out laptop.

and yet, when I commented on Apple submissions about 16GB of maximum RAM not being enough in 2021, especially at that price point, people told me I was bragging and that their M1 Air with 8GB of RAM was more than enough to do everything, including running a production Kubernetes cluster serving thousands of customers.

When commenting on Mac hardware it is always difficult for me to separate wishful thinking, cultism and actual facts.

I assume "vmception" requires a lot of power...

If you don't max out the HDD space, but max out all the portions which affect performance, it's only about half that.

IIRC bigger SSDs in previous generations had higher performance.

That's fundamental to how NAND flash memory works. For high-end PCIe Gen4 SSD product lines, the 1TB models are usually not quite as fast as the 2TB models, and 512GB models can barely use the extra bandwidth over PCIe Gen3. But 2TB is usually enough to saturate the SSD controller or host interface when using PCIe Gen4 and TLC NAND.
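The capacity/bandwidth relationship can be sketched with a toy model: sequential throughput is capped by whichever is smaller, the aggregate NAND die throughput or the host interface. All numbers below are illustrative assumptions, not specs of any real drive.

```python
# Toy model of SSD sequential read bandwidth vs. capacity.
# Assumed (not measured) figures:
PCIE_GEN3_X4_MBPS = 3500  # rough usable bandwidth of PCIe Gen3 x4
PCIE_GEN4_X4_MBPS = 7000  # rough usable bandwidth of PCIe Gen4 x4
DIE_MBPS = 450            # assumed per-die TLC read throughput
DIE_GB = 128              # assumed capacity per NAND die

def seq_read_mbps(capacity_gb: int, host_mbps: int) -> int:
    """Bandwidth is the lesser of total die throughput and the host link."""
    dies = capacity_gb // DIE_GB  # bigger drives have more dies in parallel
    return min(dies * DIE_MBPS, host_mbps)

for cap in (512, 1024, 2048):
    print(f"{cap} GB -> {seq_read_mbps(cap, PCIE_GEN4_X4_MBPS)} MB/s")
```

With these made-up numbers, the 512 GB model (4 dies) can't even fill a Gen3 link, while the 2 TB model (16 dies) saturates Gen4, which is the shape of the scaling the comment describes.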

All depends on your priorities and such.

My personal desktop was about $4k for what's inside the case. Add in my $2k monitor, and I'm right up there.

Some people call it excessive, I do too. But man, my desktop is blazing fast and my gaming experience is top notch.

The $1000 5950x was the easiest decision. Cut my compile times by 80%.

If I was serious about a portable development and such machine that many people with MacBooks are, I could see dropping $6k.

I'm not, hence I have a $2k M1 MBA and remote into my gaming desktop for anything where speed matters.

Not OP but ordered a maxed out 16" with 1TB SSD (can't justify $2k more for disk space, I'll just buy an external and curb my torrenting).

My work flow is intensive yet critical:

I have at all times the following open:

ELECTRON APPS: Slack, Telegram, Teams, Discord, Git Kraken, VSCode (multiple workspaces hosting different repos all running webpack webservers with hot module reloading), Trading View.

NATIVE APPS: Firefox (10 - 32 tabs, many with live web socket connections such as stock trading sites, various web email providers, and at least one background YouTube video or twitch stream), Chrome (~6 tabs with alternate accounts using similar web socketed connections), iTerm, Torrent client (with multiple active transfers).

All of this is being displayed on two external 4k screens + the laptop.

So ya, I can justify maxed out specs, as my demands are far higher than those of an average user, and that's with me actively closing things I don't need. Also, my work will happily pay for it, so why not?

Not your father's currency. If you think of them as pesos, the price is easier to comprehend.

Not to mention if it makes a $200k-salary worker 5% more productive, it's a win. (Give or take for taxes.)
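The back-of-the-envelope math, treating the salary, productivity gain, and laptop price as assumed inputs:

```python
# Rough breakeven for an expensive laptop (illustrative numbers only).
salary = 200_000          # assumed annual fully-loaded cost of the worker
gain = 0.05               # assumed productivity improvement
laptop_cost = 6_099       # the maxed-out configuration from upthread

annual_value = salary * gain            # value created per year by the gain
years_to_break_even = laptop_cost / annual_value

print(f"${annual_value:,.0f}/year -> breaks even in {years_to_break_even:.2f} years")
```

Under these assumptions the machine pays for itself in well under a year; the whole argument hinges on whether the 5% gain is real.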

It's a win for a worker who's compensated based on their work output, which is pretty much the opposite of what a salaried worker is.

Productivity is productivity, doesn't matter how one is paid.

…then why mention a salary at all?

Perspective. It was a noise word, really. Imagine instead a contractor working for $100 an hour and pulling enough hours to make $200k a year. Does that change the discussion any? I don't believe so.
