
This blog post is wrong, and it also just doesn't matter in practice. Nobody who is making this sort of decision is going to give this kind of obviously false argument a second thought. When you're the one who actually has to make decisions that matter, you get better at ignoring bullshit. The audience for this is just people wanting something to upvote that sounds good to them, driven by motivated reasoning.

> “You could blame the workforce,” he says, “or the managers and leaders who are leading that workforce.”

They often do fire large swaths of middle managers - when Musk bought Twitter, cutting out the middle of the hierarchy and the orgs that didn't need to exist was a big part of it. It's the same with DOGE. After the Twitter layoffs they've shipped more features, faster, with better margins (and he fired the CEO). Meta and Coinbase overhired during the COVID 'zero interest rate phenomenon' and had to fix it. Reducing hours instead is a joke - I find it hard to believe anyone being honest takes that seriously.


There's nothing close. Apple has better talent, and the vertical integration gives them an edge (especially on performance per watt in their chip designs).

Since the M series chips, there's been no other option if you care about quality. There are crappy alternatives with serious tradeoffs if you're forced to avoid Apple for some reason, or are choosing on something other than quality.


The leap from Intel to the M series chips really left everyone else behind. I can't even use my 2019 MacBook anymore; it feels so sluggish.

I have an M3 Pro and it blows all my old computers out of the water. It can handle pretty insane dev workflows (massive Docker Compose environments) without issue, and the battery life feels unfair. I can put in an 8-hour workday without my charging cable. I don't think I have turned it fully off in a few months; it just chugs along. It really embodies the "it just works" mindset.


I have the M3 Max, and a custom-built PC using some Ryzen chip that has roughly equivalent benchmark scores to the M3 Max.

The amount of cooling and power required for such a PC versus the aluminum slab with small fans that almost never turn on is a testament to the insanely good engineering of the M series chips.

I compile large C++ codebases on a daily basis, but the M3 Max never makes me feel like I can “grab a cup of coffee”.


M series Macs are my dream C++ development machines. Just this week I was investigating some potential bugs in Qt's JavaScript engine, and I was recompiling it from source across multiple tags to bisect. On my i9 Mac I would compile Qt overnight; on my M3 Pro it takes about 10 minutes, on battery, silently. Truly remarkable.

I have an M3 Max as well, the 16" version. It's the best laptop I've had, and it's clearly a desktop replacement for my usage, at least until I want to generate meme vids with an LLM...

Nowadays I am looking forward to the Nvidia Digits + MacBook Pro duo.


I can easily take my M3 MBA on trips, using it on the plane both ways and for a couple of hours a day while there for a few days, and not charge it once.

I honestly looked for alternatives when I bought it last summer but there weren’t any competitive options.


“When we saw that first system and then you sat there and played with it for a few hours and the battery didn’t move, we thought ‘Oh man, that’s a bug, the battery indicator is broken’”

https://9to5mac.com/2021/07/09/m1-macbook-battery-life/


I mean, my AMD T14 G4 gets like 12 hours of battery running Windows, 150 browser tabs, and a virtual environment. Not sure how the newer ones are, and no, they aren't as sleek or probably as durable as a metal Apple or Dell XPS, but I haven't got any complaints for the price.

Yeah everyone that went from "old shitty Intel" to M1 or above somehow gaslit themselves into believing nobody could catch up.

AMD did catch up quickly, it's too bad they had to solder RAM to match but it is what it is...


I don't think they "gaslit themselves," but I do think the M1 was good enough that a lot of people stopped thinking about hardware, and their frame of reference is horribly out of date.

Also, it can't be overstated how bad Windows has gotten and what that trajectory looks like.

T14 Gen 5 AMD has replaceable RAM, SSD, battery, and WWAN. Just got one for Linux (besides my MacBook) and loaded it up with 64GB RAM (which was only 160 Euro) and a 2TB SSD.

I had never really used a Mac (or anything from Apple) before the M1, and since I got an M1 Air I never looked back. Not all people who are hyped about Apple silicon were previously Apple users.

It is true, though, that in terms of laptops my only experience to compare it with was with Intel chips, but that's because it used to be hard to find an AMD laptop back then.


The problem is that those AMDs still have to run something as awful as Windows 11

In the context of software development, I run Linux on those AMDs and it's a great experience. It's not for everyone and I respect that, but it's not too hard these days.

Also a Windows machine with WSL isn't the worst thing, just treat it well.


The problem is that you pay with battery life. I did some Windows vs Linux laptop battery life testing when I bought my Thinkpad T14s AMD gen4.

The test itself is simple: a Puppeteer script that loads the Reddit front page and scrolls it at 50 px/s for 10 minutes, in a loop until the battery dies. This actually produces a fairly decent load, including video decoding (because Reddit autoplays by default, and there are plenty of videos in the feed). I also had Thunderbird, Discord, and Telegram running in the background.
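
A minimal TypeScript sketch of that kind of test (not my exact script; the URL and the timing constants are the ones described above, everything else is an illustrative assumption):

    // Battery-drain test sketch: load Reddit, scroll at ~50 px/s for 10
    // minutes, repeat until the machine dies. Assumes Node with the
    // puppeteer package installed.
    import puppeteer from "puppeteer";

    const SCROLL_PX_PER_SEC = 50;
    const STEP_MS = 100; // scroll in small steps to approximate 50 px/s
    const RUN_MINUTES = 10;

    async function main() {
      const browser = await puppeteer.launch({ headless: false });
      const page = await browser.newPage();
      for (;;) { // loop until the battery gives out
        await page.goto("https://www.reddit.com/", { waitUntil: "domcontentloaded" });
        const steps = (RUN_MINUTES * 60 * 1000) / STEP_MS;
        for (let i = 0; i < steps; i++) {
          const px = SCROLL_PX_PER_SEC * (STEP_MS / 1000);
          await page.evaluate((dy) => window.scrollBy(0, dy), px);
          await new Promise((r) => setTimeout(r, STEP_MS));
        }
      }
    }

    main();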

On Windows 11, the battery dies in 500 minutes.

On Linux Mint 21.3, it dies in 200 minutes.

Now, this is because Chrome (and Firefox!) disable GPU-accelerated rendering by default on Linux due to "unstable drivers". To be fair, it really is unstable - when I enabled it and watched the test as it was going, I saw the Firefox tab in a crashed state more than once. But even then, with Firefox + GPU acceleration, I got 470 minutes of battery life on Windows vs 340 minutes on Linux.
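
For reference, the Firefox side of that experiment boils down to flipping a couple of about:config prefs (the pref names are real; whether they help or crash depends on your driver and Firefox version, so treat this as a starting point, not gospel):

    // user.js sketch - Firefox applies these prefs at startup.
    user_pref("gfx.webrender.all", true);          // force GPU-accelerated compositing
    user_pref("media.ffmpeg.vaapi.enabled", true); // enable VA-API hardware video decode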


I have Fedora on an all-AMD laptop and it's wonderful.

The problem with Apple is the awful macOS, along with the walled-garden company called Apple. The problem with Windows is... none, I can install my own OS.

I have a T14s Gen 3 AMD running Linux and I don't even come close to 12 hours, maybe half of that. That's my biggest complaint.

I never understand why people are so fixated on battery runtime. If you actually use the device indoors, can't you charge anytime? For me, I alternate between my home office and the living room. Sometimes I work on the train, and even less often on domestic flights and in airports. Except when flying or on very outdated trains, charging is never an issue.

I like to use my devices without needing to be tethered to an outlet. I don't like having to deal with wires creating trip/pull hazards because my laptop needs charging. Sitting on the porch without needing to run extension cords is also nice.

I always connect to multiple external screens. I cannot imagine having only my laptop screen/keyboard.

I have a problem when the laptop doesn't survive 2 days on suspend... My previous T480 never had a problem, even on a 50% battery... but the newer T14 sometimes does.

This is my biggest issue.

If I close my lid with 100% battery at the end of the workday, I should be able to open it up the next morning and get at least a few hours of work in before the battery dies.

And this used to work.

But with the same laptop, a certain version of Windows has basically eliminated any benefits of closing the laptop lid.

Heck, the worst part is that the same thing happens even if you shut down Windows. The only reason it's now become usable for me is that I learned Shift + Shut down does a real shutdown, unlike a regular shutdown (which, with Fast Startup enabled, is really a hybrid hibernate).


You can thank Windows Modern Standby for that one.

Airplanes

100% this. My daily driver is a 2015 MacBook Pro that I only have one complaint about: the battery life doesn't come anywhere close to letting me work on an airplane if there are no 120V plugs available. I mean… most of the time I don't mind just sleeping, but it would be great to take better advantage of the quiet time with no Slack messages.

Browser tab #89 is running at 120% CPU. Why are you running the Daily Mail anyway?

Looks like they’re decent laptops. Although surprisingly the newest models are hundreds of dollars more than similarly spec’d MBAs. Not sure on how the CPU/GPU performance compares.

The upgrade to the M chips was shocking to me. My intel suddenly felt like it was defective.

Well, it was. Intel MacBooks had really crappy thermals, getting awfully hot and throttling at any hint of system load.

It always was. You people just gaslit yourselves into thinking it wasn't. Intel MacBooks were always pretty bad.

You obviously have never used a PowerPC-based Mac... or else you would know that the Intel Macs were a massive leap forward.

I'm still using my M1 MacBook Pro. The thing hasn't slowed down at all. It's a great device.

Same for me. My workflow handles everything from web dev to Photoshop to Ableton - all running together - without hesitation.

My pet peeve is the trackpad

How on earth are there literally ZERO non Apple laptops with a trackpad as smooth as Apple’s?

This is an old technology. Surely someone must have reverse engineered this by now?


How on earth are there still no non Apple laptops that instantly and reliably go to sleep when you close the lid?

Yeah, because Apple is totally reliable for sleeping when it is supposed to. I love my MacBook, but I have dealt with over a decade of MacBooks waking when they should not. Love it when my MacBook cooks itself in my bag overnight.

I've never experienced this, and I've been using MacBooks since 2007.

I have never met someone with this problem, and have not experienced it myself in over 12 years. Closing the lid and walking away is what MacBooks are (were?) known for, I guess in my circles.

It’s a pretty common issue. Search around and you will find pages and pages of people experiencing the same issues on macrumors, the apple discussion forum, etc. Glad it hasn’t been a problem for you, I envy you!

I assume you’ve turned off “allow Bluetooth devices to wake this computer” and “wake on network access” in System Preferences? Those are the only things I can think of that can randomly wake up Macs.

Oh yeah, I’ve turned every single possible thing off. It still randomly dark-wakes. And then, when it does, it will sometimes get into a doom loop of scanning the network and refusing to release its power assertions, meaning it won’t go back to sleep. Eventually it will hit a thermal emergency and then shut down.

I suspect the answers to these questions are the same. When different companies are developing the OS, the drivers, and the hardware, it is much harder to get everything to play together nicely.

Harder perhaps, but not impossible, and the market has had decades to figure this out.

My guess is typical PC manufacturers have not felt it worth the time to invest in getting this aspect right.

Ironically, the best non-Apple trackpad I ever used was when Vizio tried to make computers. They actually got the trackpad right. In fact, those computers were really a cut above everyone else's, but my guess is they didn't sell well, because they were taken off the market almost as fast as they were introduced.


I’ve always wondered if the Apple trackpad was just the capacitive part of an iPhone screen. It feels like glass. It responds similarly. And they have a huge user base sample size for improvements.

It is glass

I think it’s another example of vertical integration making it better. Apple making the hardware plus OS gives them an advantage, making the trackpad experience great is hard if you don’t control both.

Apple has also learned a ton about how to do this well from the iPhone.


It’s less vertical integration than MS not actually taking the steps to control this market.

For example, the Precision touchpad, which was the first actual touchpad tech MS created (Windows previously treated touchpads as mice), has been around for a little more than a decade.

Touchpads have been around for over four decades and have been standard on laptops since the mid-to-late '90s.

And the worst part is that MS still allows hardware vendors to ship non Precision touchpads today.

MS won’t let you keep running Windows 10, but vendors can ship touchpads that only support decades-old software.

It’s very hard to get Windows manufacturers to pay attention to the touchpad when MS itself isn’t interested.

Unfortunately, the Windows domination in the non Apple part of the industry means that serious change is only driven by MS.

A lot of the advantages we think Apple has due to vertical integration are more because MS is pretty terrible.


This is seriously the thing I like the most about my 2017 and 2023 MacBooks. The trackpad feels so good. Every other manufacturer that I have tried - and no, it is not all of them, but a lot - they all make my fingers feel bad after using them. I don't know if they are rougher or textured somehow? The only one that doesn't make my finger pads feel sore is the MacBook.

It's also the accuracy. I'm able to do light photo editing work right from the trackpad, even basic sketches and airbrushing. I've never been able to do anything remotely close with other laptops.

100% this. I use a MacBook at work, and I bought myself a Framework laptop for personal stuff. Overall, the Framework is great, but the touchpad is a letdown.

It's not like they care

In my experience the Snapdragon X Elite is about the same as an M2. It's got slightly worse battery life, but still a battery that blows the competition out of the water.

Plus you get the benefit of loading out your laptop with 64GB of RAM etc. without paying Apple's ridiculous prices.

Snapdragon is just getting started. The Snapdragon X2 is coming out later this year with 18 cores.

Apple does have some serious competition now


> you get the benefits of loading out your laptop with 64GB RAM etc

If RAM is all you need on an M-series Air type of machine, sure, but the selling point of Apple silicon's unified memory is mainly GPU/memory bandwidth at low energy consumption, which is yet to be rivaled (though AMD recently took steps in that direction). If your workload is CPU-only with very high RAM requirements, Apple silicon was and probably will remain the worst choice cost-wise.

Also, for me, the dealbreaker with the Snapdragon X Elite until now is having to use Windows, plus, as it turned out, the early unreliability of the actual products sold by laptop manufacturers.

But the market has opened up, so we'll probably see more competition there, which is great. Apple's good, but not doing anything magic that others cannot eventually do to some extent. Though I am far more optimistic about AMD than Qualcomm, tbh.


Is the performance gap so huge? Power efficiency, yes, absolutely, but for peak performance, the last AMD vs M3 benchmarks I saw showed slightly slower single-core and slightly faster multi-core. Doesn't seem as world-changing as described.

My $2000 Linux desktop is still faster and snappier than the $4000 MacBook, but the MacBook is the only thing laptop-sized that feels even close.

I feel like my big Linux desktop with a Ryzen 7950X and 64 GB of RAM somehow feels less "snappy" than my M2 MacBook Air running Asahi when doing lightweight tasks, despite the big Ryzen being obviously much better at compilation and such. I'm not sure why; my guess was RAM latency. But maybe I misconfigured something in my Arch Linux...

> My $2000 Linux desktop is still faster and snappier than the $4000 MacBook, but the MacBook is the only thing laptop-sized that feels even close.

What brand?


Probably DIY.

$2k buys you a decent Threadripper or 59xx-series and as much RAM as you can throw at it.


Speak for yourself. In Canada, the cheapest Threadripper I can find is $2k just for the CPU. No way can I build an entire desktop for that price.

I'd be interested in hearing about the specs. Planning on building a new Linux desktop soon.

It has a Ryzen 9900X, 64GB of DDR5, an AMD Radeon RX 6600 XT, 2x2TB Hanye NVMe, a ROG B650 ATX motherboard, and an 850W power supply.

I bought the system mostly to increase the single-core performance over the Ryzen 5 3600 I had before, as well as to get rid of all the little 256GB SSDs I had in the previous one.


850W is overkill and will affect efficiency. I'd go with less power.

Then you suddenly want an Nvidia GPU on the side and you need more power.

Is 850W really "overkill"? I don't know. I suppose it depends on how likely it is you will expand and add devices.

No one would expect it to be slower.

Yes. No other laptop can sustain peak performance as long as the M-series Macs. The only thing that competes is a dedicated desktop with a big cooler and fan.

Mac laptops feel faster, even if the synthetic benchmarks say otherwise.


I don’t agree. Compile times are definitely and very noticeably quicker on my Intel gaming laptop (that’s actually a few years old now) vs my M3 MBP.

That said, I’ve never once felt that the M3 MBPs are sluggish. They are definitely super quick machines. And the fact that I can go a full day without charging even when using moderately heavy workloads is truly jaw droppingly impressive.

I’d definitely take the power efficiency over the little bit of extra time saved in compiles any day of the week. So Apple have made some really smart judgements there.


In guhidalg's defense, they did say that the "Mac laptops feel faster" (emphasis mine) not that they are faster. There's a trick here with Macs, which is that their user interfaces for the OS and many programs are tightly integrated with the hardware which makes the UI faster--that's the "feel faster"--it's a software, not hardware thing. In cases where the software is equivalent (i.e. cross-platform compilers like GCC/Clang/Cargo) you're going to see little difference, but your OS experience is definitely snappier on Macs.

The ARM architecture is also optimized for UI-like tasks - quick to start and stop processes on one of many cores with differing power constraints - whereas x86 is more for workstation-type sustained workloads.

M3 vs other high-end Intel chips on code compilation generally has the higher clock speed winning. Only with the M4 is Apple starting to hit clock speeds nearer to high-end Intel chips. We are probably two generations out from 5GHz sustained on Apple chips.

I think this has a whole lot to do with the memory throughput, as well as great efficiency.

My M1 still holds right up! It is the smallest RAM model, and even that is not the end of things.


I bought an M1 Max with 64gb of ram at release. I'm still not sure what will get me to replace it other than it simply breaking. Maybe an M5 will finally make me want to buy something new. I'm debating getting a cheaper Air and maybe a base Ultra now that I do most of my heavy work at a desk.

You mean plugged in or battery?

You’ll get regular performance on battery.

I’ve gone entire work days with my Pro on battery because I didn't notice I hadn't plugged it in - all my Docker containers, IDE, etc., plugged into my external monitor. It was a good 9 hrs before I noticed.

Macs are easy to beat depending on what trade offs you want to make though.


Most laptops are thermally constrained when it comes to speed - power efficiency means you can run at full speed longer without overheating.

Also, most laptops will run at significantly worse performance when not plugged in. Macs are much more consistent both thermally and when not plugged in.

The power efficiency gap equates to a fan noise gap, and the fan noise/heat of powerful Windows laptops is much more annoying than merely having poor battery life.

I ran some bioinformatics pipelines on an AMD Pangolin notebook. It was faster than the Apple M2 (I think it was an M2 or M3) notebook my work neighbor had. My machine had more RAM, but still, for workloads that use the extra cores it made a difference.

The performance alone says nothing. What about the battery life, size, weight, temperature, fan noise, and quality of the touchpad? These are important trade offs in any laptop.

When we’re both plugged in and my process finishes 10 minutes faster, it says something. Also, the GP post was specifically about performance, and specifically not efficiency.

I could get 7+ hours from my work Linux laptop's battery, and I don't really care for macOS. The OS quality matters more than the touchpad to me. I've come to appreciate a matte screen. But I'm glad there is choice.


Everyone values different things and has different requirements, so I’m glad your laptop works out for you. I’m just cautioning against an emphasis on performance: even if the laptop is plugged in, other design considerations will dictate and limit the raw performance.

Yes. You need to go to server-class chips (e.g. Threadripper) before beating the raw multi-core performance of a top-spec M4 Max in a MacBook Pro, and the battery life is still crazy good!

What gave me pause was when my base-spec M1 Air handily beat my admittedly old server (Xeon E5-2650v2) on a single-core compute-bound task [0] (generating a ton of UUIDs). I know Ivy Bridge is 12 years old at this point, but I didn’t expect it to be 2x slower.

EDIT: Also, I know the code in the link is awful; the point is doing a 1:1 comparison between the architectures.

[0]: https://gist.github.com/stephanGarland/f6b7a13585c0caf9eb64b...
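
The general shape of that kind of test is easy to reproduce (a minimal sketch, not the gist's code, assuming Node's built-in crypto; the batch size is an arbitrary choice):

    // Single-core, compute-bound micro-benchmark: time generating a large
    // batch of UUIDs on one thread.
    import { randomUUID } from "node:crypto";

    const N = 10_000_000; // arbitrary batch size for illustration
    const start = process.hrtime.bigint();
    for (let i = 0; i < N; i++) randomUUID();
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`${N} UUIDs in ${ms.toFixed(0)} ms`);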


My M1 Max MBP feels a bit slower than a new Lenovo ThinkPad P16s (Ryzen 7 Pro 7840U) I have, when using VS Code in Linux. But an M3 Max MBP I have blows it away. If you run Linux, and don't care about macOS apps or need non-dev stuff like Outlook, then a Linux AMD laptop can be a really cheap, fast option. Unfortunately, the manufacturers don't want to load them out... like my AMD chip supports 128GB, but no laptop manufacturer will offer more than 64GB.

It's a laptop; the same performance with higher power efficiency means the same performance with much longer mobile uptime, which puts the MacBooks tiers above their competition.

And for data centers, same performance at better power efficiency means hundreds of thousands of dollars saved in power.


Yes it is. My M2 Max MBP runs multithreaded workloads in the same ballpark as my water cooled 12900k.

M series really is amazing.

But if you don’t want Apple, or you want to be able to upgrade, check out Framework. [1]

Really satisfying combination of quality and value for high performance laptops.

[1] https://frame.work/


Their 16" laptop is extremely bulky. I think this is a category where MacBooks clearly win. ThinkPad and Framework have great options at 14", but at larger screen sizes something is always missing for me.

To be fair, the 16" MacBook Pro (I have the M1 Max) is also rather bulky. It's too big and too heavy for travel. If you do a lot of traveling, or just don't work at your desk, I'd recommend against getting a 16" laptop in general.

It’s 0.15in thicker.

But it seems the parent's point is there's no reason Dell couldn't have kept making improved XPS models. Maybe they don't compare on a $/watt basis with Apple silicon, but you could presumably have still paid less and gotten something pretty decent.

No, vertical integration is what did Intel in, because they do both fab and design. TSMC won because they aren't vertically integrated into anything.

Apple is better because of actual superior technology. The chips are custom-made and no one can match the technology yet.


They have superior technology because they control the full stack and have taken more and more ownership of it over the years (most recently building their own modem for the iPhone 16e). They could design chips for an exact set of constraints (originally iPhones) and then expand that to the Mac. Intel with x86 had to support legacy and tons of different devices (and bad leadership caused them to ignore efficiency and later ignore GPUs). Other laptop manufacturers have to run other people's software, and few really make their own underlying hardware to the extent Apple does.

> (originally iPhones) and then expand that to the mac

Yes, and (as you know) aggregate volume also benefits Apple's whole product suite.

It's hard for a laptop mfg to justify pushing to next node process for just ~25m units. So competitors have to wait for Qualcomm, Samsung, whomever to transition.

It's easier for Apple with ~250m phones, ~50m tablets, ~25m laptops, etc. per year. (Apple's war chest also enables monosopoly of upcoming node processes.)

Imagine trying to pull off AirPods or Vision without that deep vertical integration. The ridiculously ambitious Vision is just barely feasible, riding on mobile's coattails. A Vision using 3rd party CPUs would be delayed.

--

This all is in addition to the Apple specific optimizations, which you mention.


No they don't. All their technology is equivalent to what's in the industry, save their chips. Which, btw, are manufactured by TSMC, so the chip itself is not vertically integrated.

My argument is they were able to develop the chip because of their control. The constraints allowed them that freedom and the constraints come from the top down integration and control.

I'll bow out here because I can just tell this won't be a worthwhile thread.


But what other advantage did this give them? Like name specific examples. Feel free to leave, but I honestly don't see where you're coming from.

> But what other advantage did this give them? Like name specific examples. Feel free to leave, but I honestly don't see where you're coming from.

Back when Apple used Intel processors, they were at the mercy of Intel's roadmap; if Intel missed a deadline for a new chip, Apple had to change plans. Obviously, that's no longer the case.

Back in the Motorola/IBM days, their G5 processor ran so hot that Apple had to create a special case with 7 fans to cool it. It was an amazing engineering feat, but something Apple would never do unless they had no choice. I've used a Power Mac G5—it sounded like a jet taking off, and the fans stayed on. [2]

They get to integrate new technologies quicker than being constrained by the industry.

Apple launched the first 64-bit smartphone, the iPhone 5s, in 2013—at least a year before any Android manufacturer could. And when Qualcomm finally shipped a 64-bit processor, no version of Android supported it. [1]

There are dozens of examples where Apple's vertical integration has allowed them to stay a step ahead of competitors.

The latest is the C1 modem that shipped in the iPhone 16e. Because the C1 is much more efficient than Qualcomm's modem, the 16e gets better battery life than the more expensive iPhone 16 with Qualcomm's modem. [3]

And because Qualcomm's licensing fees are a percentage of the cost of the device it's in, shipping the C1 enables them to put modems in laptops. The Qualcomm fee is significant: an iPad Air starts at $599; the same iPad Air model with one of Qualcomm's modems costs $749.

Customers have wanted MacBooks with cellular modems forever; now they'll be able to do that, since the modem will become part of Apple's SoC in the near future.

That's what you can do when you're not constrained by off-the-shelf components.

[1]: "First 64-bit Android phone has no 64-bit software"—https://arstechnica.com/gadgets/2014/08/first-64-bit-android...

[2]: https://thehouseofmoth.com/a-little-known-fact-about-the-pow...

[3]: https://appleinsider.com/articles/25/02/27/apples-c1-modem-b...


It is unclear how much Apple has to pay Qualcomm in patent fees. It could still be substantial.

Excellent answer, thank you.

They've been able to reap some real technological efficiencies because of their vertical integration. Notable ones I know about:

- The integrated on-chip RAM dramatically speeds up memory access. Your full 16 GB of RAM on an M1 functions at cache speeds; meanwhile, the L3 cache on an Intel processor is 1-8MB, more than 3 orders of magnitude smaller.

- Apple takes full advantage of this with their software stack. Objective C and Swift use reference counting. The advantage of refcounting is that it doesn't have slow GC pauses, but the disadvantage is that it has terrible locality properties (requiring that you update refcounts on all sorts of different cache lines when you assign a variable) which often make it significantly slower on real-world Intel hardware. But if your entire RAM operates at cache speeds, this disadvantage goes away.

- Refcounting is usually significantly more memory-efficient than GC, because with the latter you need to set aside empty space to copy objects into, and as that space fills up your GC becomes significantly less efficient. This lets Apple apps get more out of smaller overall RAM sizes. The 16GB on an M1 would feel very constraining on most modern Wintel computers, but it's plenty for Apple software.

- The OS is aware of the overall system load, and can use it to determine whether to use the performance or efficiency cores, and to allocate workloads across cores. The efficiency cores are very battery-efficient; that's why Macbooks often have multiple times the battery life of Windows laptops.

- The stock apps are all designed to take advantage of efficiencies in the OS and not do work that they don't need to, which again makes them faster and more battery efficient.


Apple M1 (or any M-series) RAM absolutely does NOT function at cache speeds. Do you know how expensive that memory would be? The RAM is not literally "in the CPU", but colocated in the same SoC "system on chip" package as the CPU.

It feels like a core part of your claim--at least half of it--relies on most software for "Wintel computers" being written in garbage collected languages, which would be shocking if it were true.

You’re treating vertical integration as if it’s this absolute thing. Apple is clearly more vertically integrated than any other laptop brand by virtue of designing everything from the CPU to the OS. That remains true even though Apple doesn’t run their own chip fabs or mine their own bauxite.

Apple is very dependent on using the latest process node from TSMC though. For that reason and the fact that the US cannot match what TSMC does, it all points to Taiwan dictating what kind of aid the US must provide.

I don't see the current US leadership wanting to put that in jeopardy.


More like the US is dictating the aid. TSMC is opening a fundamentally unprofitable fab in Arizona.

Apple is better because they’re not competing on price which is why they can afford to bring so many things in house. That’s how they can afford the talent and other R&D resources.

We get what we pay for.


$899 (edu) or $999 is extremely competitive for what you get.

Most people buying an entry-level computer these days should at least consider stretching to get an MBA rather than $300-400 shovelware; they'll get so much use out of it.

My wife is still using her 2020 M1 Air and it’s still as snappy as the day we got it, still works for all her use cases.

Incredible value.


That’s a great point, but I think that’s the result of decades of work enabled by premium pricing culminating in their custom silicon (which is itself a product of their ability to command exclusivity with TSMC nodes). The shareholders demand ever constant growth and Apple is moving down market just like everyone else (looking at you, BMW 3 series).

> My wife is still using her 2020 M1 Air and it's still as snappy as the day we got it, still works for all her use cases.

Ah! My early 2015 13" MacBook Pro died only a few weeks back. I don't think any other laptop would last nearly 10 years (TBF, I did replace the battery and speakers for $280 in 2020, though).


I am still using my HP Omen from 2016 as my main laptop. I got it for $600, I think? I also upgraded the RAM and SSD. The hinges on the lid broke the plastic case, and I am not replacing the dead battery, but it definitely works.

Still using the T450s I got in 2015, so technically the ThinkPad won :D JK, that's a very respectable life span!

My M1 Max with 8TB is going strong 4 years in

and a pleasure to work with

I am sad that the resale value didn't hold as much as people claim Apple products do, but that’s because of the overpriced storage mostly

Looks like I can get $2500 vs the $7600 I paid for it

So rolling over into a newer maxed-out model isn't so easily rationalized.


Walmart is selling the 8GB M1 for $629. It presumably won’t ever get Apple Intelligence but it’s a great starter if money is tight.

I’d be tempted to buy a used machine with more RAM because I’ve had really good experiences with used electronics like laptops or phones.

BTW I have Apple Intelligence turned off but I’m not prepared to say it won’t ever be useful for anybody.


While they are still in stock, 13" M2 Macbook Air w/16GB RAM and 256GB SSD are only $799 at Best Buy & Amazon right now.

> Most people buying an entry level computer these days should at least consider stretching to get a MBA than the $300-400 shovelware, they’ll get so much use out of it.

I don't think that expecting everyone to waste 3x the money to scratch the same itch is an informed take, especially when the $300 shovelware has far better specs in terms of RAM and, by far, storage.

Nowadays you can even get better-performing mini PCs for half the price of your MacBook Air M3, such as any of the systems packing an AMD Ryzen 7 8845HS.

I think some people look at the shiny computers and don't look past that.


> Nowadays you can even get better performing miniPCs for half the price than your MacBook Air M3

This is a stupid comparison: even a Mac mini pretty much fulfills this, since the M4 is a step up from the M3 and the actively-cooled mini can sustain higher performance than the passively-cooled MacBook Air, and at about half the price of an M3 MacBook Air.


And comparing a miniPC to a laptop is an informed take?

Like others pointed out, if we're comparing like for like, the Mac Mini is in that "mini PC" price range, and again is very competitive.

Specs don’t actually mean anything. Jobs was right. People just want their shit to work.

Pity that isn't how macOS has been lately.

vs what?

I thought this too, but I think the AMD mobile-series chips have mostly caught up.

What about Lunar Lake?

Solid, but I still can't find it in a laptop with passive cooling like the MacBook Air line here, which is a huge plus in a laptop imo.

Yes, it's Apple and oranges. Apple is making a Veblen good. Dell/Lenovo are making lowest-cost/lowest-bid commodities.

It's not a Veblen good.

> It's not a Veblen good.

What do you think it is then?


This is an incredibly lazy comment. I think you and most people would agree on the immense utility of Apple products. There’s also no evidence that demand increases with price in MacBooks.

> This is an incredibly lazy comment.

It isn't, and it's amusing how people get riled by a simple request to justify why they dismissed someone else's opinion without presenting a single argument.

> I think you and most people would agree on the immense utility of Apple products.

They are consumer electronics, and laptop manufacturers are a dime a dozen. Why do you believe they are special in that regard? I mean, until recently they even shipped with a below-standard amount of RAM.

> There’s also no evidence that demand increases with price in MacBooks.

That's the definition of a Veblen good, something that is not known for being useful beyond serving as a status symbol.


> That's the definition of a Veblen good, something that is not known for being useful beyond serving as a status symbol.

I don't think you understood the Veblen good definition. And MacBooks do not fit the definition. The parent comment explained it well.


I don’t think it’s possible to understand what a Veblen good is and also think that Apple makes them. Apple get a brand premium for sure, but a “Veblen good” is something specific, and Apple don’t make those.

It's mostly overpriced shit wrapped in a nice cellophane.

I love it though and I believe there is no better alternative. Everything else is just the shit without the nice package.


> It's mostly overpriced shit wrapped in a nice cellophane.

That's orthogonal to the concept of a Veblen good. A Veblen good can very much be shit wrapped in cellophane.

The core trait of a Veblen good is that customers buy it as a status symbol. Also, being overpriced helps reduce the number of people who can afford to buy one.


Apple products have always been status symbols and this Macbook is no different.

One could easily put together a significantly more powerful Linux desktop for a lower price. This has always been the case, but Apple's marketing tries to convince you otherwise. Honestly, I've always been surprised by how effectively their marketing has misled the tech-savvy crowd on HN.


You can't simply swap macOS for Linux. I heavily use accessibility tools that have no comparable equivalents on Linux. These tools would probably cost more than the laptop, even though Apple includes them in the OS.

Troubleshooting is one of my main comparative advantages - I'm better at it than I am at programming and I enjoy it more. It's also a relatively independent skill; not everyone is good at it or likes it. It reminds me of the lateral thinking puzzles I did as a kid, where you had to ask questions to uncover whatever the weird situation was. You have to question your assumptions - think about how you might be wrong - something I like to do in general anyway.

There's a certain way of reasoning about the problem and thinking about what it might be, with limited information, in a systematic way. It's also a bit broader than debugging - you can do a lot of troubleshooting (sometimes faster and more effectively) by doing things other than reading the code.

It's also been somewhat of a career advantage: being really good at it seems to be both more uncommon than standard dev skills and something most people dislike (while it's my favorite thing to do). It also overlaps a lot with other, more general types of problem solving.

Anyway - a lot of the article resonates with how I think about it too.


I think troubleshooting has a lot of overlap with thinking along the lines of the scientific method.

1. You have to start with hypotheses that you test, but you should be ready to throw them away as quickly as you thought of them when the results from testing say so. Let the data decide.

2. You should preferably think hard about effective ways to quickly rule out influencing variables, and so quickly zero in on the area the erroneous effect is coming from.

3. You have to really rule out confounders. Make sure to turn off any caches or similar that might play games with you.

The area where I see most colleagues fail in this process is not being stringent enough with things like ruling out confounders, and not being systematic about organizing the outputs from hypothesis testing to make sure you are 100% certain which outputs belong to which inputs, etc.

It is the discipline and strictness in the process that will do the trick. Anything less and you will just trick yourself.
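
To make the "rule things out systematically" point concrete, the discipline often reduces to a binary search over an ordered set of candidate changes - what git bisect automates. A generic sketch (checkFails is a hypothetical predicate you supply; it assumes the failure is deterministic and, once introduced, stays, i.e. the confounders have already been ruled out):

    // Find the first change where a deterministic failure appears,
    // halving the search space on every test run.
    function findFirstBad<T>(changes: T[], checkFails: (c: T) => boolean): T | undefined {
      let lo = 0;
      let hi = changes.length - 1;
      let firstBad: T | undefined;
      while (lo <= hi) {
        const mid = (lo + hi) >> 1;
        if (checkFails(changes[mid])) {
          firstBad = changes[mid]; // reproduces here: look earlier
          hi = mid - 1;
        } else {
          lo = mid + 1;            // still good here: look later
        }
      }
      return firstBad; // undefined if every change passes
    }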


Yes! This generally rings true to me.

The other thing I see trip people up is being unwilling to make a fast hypothesis that can be easily tested to narrow scope.

Instead they’ll often try to look at the code to understand but that’s usually slower for anything remotely complex.


Have you found effective ways to market this skill?

Usually it’s best in an operational-type role - support, SRE, TPM, etc., depending on your strengths. It's best when paired with good comms and reasonably good social skills.

You build credibility by jumping in and doing a lot of support type stuff early on (which then also makes you better at whatever the product is, more familiar with what sucks for users).


The parent comment mirrored my sentiments exactly.

In my company, I'm often the person who joins a production bug troubleshooting call, after sometimes hours of investigation, and rapidly identifies the root cause.

My typical workflow is:

* Clarify the issue and our assumptions. Often, simply restating the observed behavior aligns everyone.

* Pose questions to validate or challenge those assumptions.

* Suggest alternative methods to test the primary hypothesis.

Often, testing the initial hypothesis reveals its inaccuracy, leading to a swift discovery of the actual root cause.

Ultimately, I think it comes down to critical thinking and questioning assumptions.


Being food motivated is also observable in people, imo. I'm pretty food motivated - I'm a lot more interested in doing something if there's a meal associated with it than if there isn't. I've definitely observed this variance in both people and dogs. Some people seem to get more joy out of it - probably partly just some genetic disposition.

I also broke mine in a crash and had surgery, but have not bothered to get the plate removed, and it's fine?

The only time I really notice it is if someone pushes on it or if I'm doing front squats with a bar.


Get it removed: the next hit will be much worse than a broken collarbone if the bone can't do its job of giving in before less restorable parts of the shoulder give in.


A lot of psych research is bullshit that doesn't replicate - it's the dark ages over there, medicine before germ theory.


The other posts on her Substack are also worth reading (and the earlier ones from his perspective) - I read through a bunch of them last August when one was posted. A tragic story, well told - something that waits for all of us.


The only measure for love is loss


I don’t understand that at all. Would you explain what you mean?


I can take a crack. This is Jake's brother, Sam. The Japanese poet Yakamochi wrote:

We were together
Only a little while
And we believed our love
Would last a thousand years.


The Asimov story it reminded me of was "Profession" - though that one is not really about AI, it is about original ideas and the kinds of people who have them.

I find the LLM dismissals somewhat tedious - for most of the people making them, half of humanity wouldn't meet their standards.


I feel like it's more the reason why they're missing, than the details around the actual miss, that makes people (rightly) dismiss the tech.

If I had a coworker who was just winging it all the time, sooner or later the trust/patience would run out.


> I find the LLM dismissals somewhat tedious - for most of the people making them, half of humanity wouldn't meet their standards.

Aren't people funny like that? One person values an encyclopedic chatbot for company, the next prefers a human. Thank god we can all get along.


That isn't what I said, but it likely won't matter. People will be denying it up until the end. I don't prefer LLMs to humans, but I don't pretend biological minds contain some magical essence that separates us from silicon. The denials of what might be happening are pretty weak - at best they're way overconfident and smug.


> half of humanity wouldn't meet their standards.

All anti-AI sentiment as it pertains to personhood that I've ever interacted with (and it was a lot, in academia) boils down to arguments for the soul. It is really tedious, and before I spoke to people about it, it probably wouldn't have passed my Turing test. Sadly, even very smart people may be very stupid, and even in a place of learning a teacher will respect that (no matter how dumb or puerile); more than likely they think the exact same thing.


The Machine Stops also touches on a lot of these ideas and was written in 1909!

--

"The story describes a world in which most of the human population has lost the ability to live on the surface of the Earth. Each individual now lives in isolation below ground in a standard room, with all bodily and spiritual needs met by the omnipotent, global Machine. Travel is permitted but is unpopular and rarely necessary. Communication is made via a kind of instant messaging/video conferencing machine with which people conduct their only activity: the sharing of ideas and what passes for knowledge.

The two main characters, Vashti and her son Kuno, live on opposite sides of the world. Vashti is content with her life, which, like most inhabitants of the world, she spends producing and endlessly discussing second-hand 'ideas'. Her son Kuno, however, is a sensualist and a rebel. He persuades a reluctant Vashti to endure the journey (and the resultant unwelcome personal interaction) to his room. There, he tells her of his disenchantment with the sanitised, mechanical world. He confides to her that he has visited the surface of the Earth without permission and that he saw other humans living outside the world of the Machine. However, the Machine recaptures him, and he is threatened with 'Homelessness': expulsion from the underground environment and presumed death. Vashti, however, dismisses her son's concerns as dangerous madness and returns to her part of the world.

As time passes, and Vashti continues the routine of her daily life, there are two important developments. First, individuals are no longer permitted use of the respirators which are needed to visit the Earth's surface. Most welcome this development, as they are sceptical and fearful of first-hand experience and of those who desire it. Secondly, "Mechanism", a kind of religion, is established in which the Machine is the object of worship. People forget that humans created the Machine and treat it as a mystical entity whose needs supersede their own.

Those who do not accept the deity of the Machine are viewed as 'unmechanical' and threatened with Homelessness. The Mending Apparatus—the system charged with repairing defects that appear in the Machine proper—has also failed by this time, but concerns about this are dismissed in the context of the supposed omnipotence of the Machine itself.

During this time, Kuno is transferred to a room near Vashti's. He comes to believe that the Machine is breaking down and tells her cryptically "The Machine stops." Vashti continues with her life, but eventually defects begin to appear in the Machine. At first, humans accept the deteriorations as the whim of the Machine, to which they are now wholly subservient, but the situation continues to deteriorate as the knowledge of how to repair the Machine has been lost.

Finally, the Machine collapses, bringing 'civilization' down with it. Kuno comes to Vashti's ruined room. Before they both perish, they realise that humanity and its connection to the natural world are what truly matters, and that it will fall to the surface-dwellers who still exist to rebuild the human race and to prevent the mistake of the Machine from being repeated."

https://en.wikipedia.org/wiki/The_Machine_Stops


I read this story a few years ago and really liked it, but seem to have forgotten the entire plot. Reading it now, it kind of reminds me of the plot of Silo.


Thanks a lot for posting this, I read the whole thing after. These predictions would have been impressive enough in the 60s; to hear that this is coming from 1909 is astounding.


Yeah it’s pretty wild, feels modern.

For a more recent recommendation - I also loved Permutation City, which was written in 1994 and was pretty prescient about cloud computing.


Thanks for posting that story, I hadn't heard of it before!

Another take on the AI Halting Problem.

Stanislaw Lem's Golem XIV describes a series of super-intelligent computers that just suddenly decide to stop communicating.

https://en.wikipedia.org/wiki/Golem_XIV

https://cannonballread.com/2021/05/golem-xiv-blauracke/

https://readsomethinginteresting.com/acx/34

https://news.ycombinator.com/item?id=25741124

tialaramex on Jan 12, 2021, on: Superintelligence cannot be contained: Lessons fro...

Check out the Stanisław Lem story "GOLEM XIV".

GOLEM is one of a series of machines constructed to plan World War III, as is its sister HONEST ANNIE. But to the frustration of their human creators these more sophisticated machines refuse to plan World War III and instead seem to become philosophers (Golem) or just refuse to communicate with humans at all (Annie).

Lots of supposedly smart humans try to debate with Golem and eventually they (humans supervising the interaction) have to impose a "rule" to stop people opening their mouths the very first time they see Golem and getting humiliated almost before they've understood what is happening, because it's frustrating for everybody else.

Golem is asked if humans could acquire such intelligence and it explains that this is categorically impossible, Golem is doing something that is not just a better way to do the same thing as humans, it's doing something altogether different and superior that humans can't do. It also seems to hint that Annie is, in turn, superior in capability to Golem and that for them such transcendence to further feats is not necessarily impossible.

This is one of the stories that Lem wrote by an oblique method, what we have is extracts from an introduction to an imaginary dry scientific record that details the period between GOLEM being constructed and... the eventual conclusion of the incident.

Anyway, I was reminded because while Lem has to be careful (he's not superintelligent after all) he's clearly hinting that humans aren't smart enough to recognise the superintelligence of GOLEM and ANNIE. One proposed reason for why ANNIE rather than GOLEM is responsible for the events described near the end of the story is that she doesn't even think about humans, for the same reason humans largely don't think about flies. What's to think about? They're just an annoyance, to be swatted aside.


> Those who do not accept the deity of the Machine are viewed as 'unmechanical'

From the moment I understood the weakness of my flesh, it disgusted me. I craved the strength and certainty of steel. I aspired to the purity of the blessed machine.


Maybe the USG will now stand behind American companies and push back on this sort of thing? Enough of the EU and UK fining US companies over bullshit. In this case it's also better for UK consumers.

