
> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Their primary business goal is to sell hardware. Yes, they’ve diversified into services and being a shopping mall for all, but it is about selling luxury hardware.

The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.




> but it is about selling luxury hardware.

Somewhat true, but things are changing. While there are plenty of “luxury” Apple devices like the Vision Pro or fully decked out MacBooks for web browsing, we no longer live in a world where tech is just lifestyle gadgets. People spend hours a day on their phones and often run their lives and businesses through them. Even with the $1000+/2-3y price tag, it's simply not that much given how central a role it serves in your life. This is especially true for younger generations, who often don't have laptops or desktops at home, and also increasingly in poorer-but-not-poor countries (e.g. Eastern Europe). So the iPhone (their best-selling product) is far, far, far more a commodity utility than typical luxury consumption like watches, purses, sports cars etc.

Even with higher-end products like the MacBooks, you see a lot of professionals (engineers included) who choose them for the price-performance value and who don't give a shit about luxury. Especially since the M1 launched, when performance and battery life took a giant leap.


Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.


> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.

I am the one paying for my MacBook Pro, because my company is a self-funded business. I run my entire business on this machine and I love it. I always buy the fastest CPU possible, although I don't max out the RAM and SSD.

Amusingly enough, I talked to someone recently about compilation speeds, and that person asked me why I don't compile my software (Clojure and ClojureScript) on "powerful cloud servers". Well, according to Geekbench, which always correlates very well with my compilation speeds, there are very few CPUs out there that can beat my M3 Max, and those aren't easily rentable as bare-metal cloud servers. Any virtual server will be slower.

So please, don't repeat the "MacBooks are for spoiled people who don't have to pay for them" trope. There are people for whom this is simply the best machine for the job at hand.

Incidentally, I checked my financials: a 16" MBP with M3 and 64GB RAM, amortized over 18 months (very short!) comes out to around $150/month. That is not expensive at all for your main development machine that you run your business on!
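For transparency, the arithmetic with hypothetical placeholder figures (the actual purchase price and any resale value will differ; these are just numbers that land on the same $150/month):

    # Hypothetical numbers, for illustration only
    purchase_price = 3_500   # assumed net outlay in USD (hypothetical)
    resale_value = 800       # assumed resale value after the period (hypothetical)
    months = 18
    print(f"${(purchase_price - resale_value) / months:.0f}/month")  # -> $150/month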


> comes out to around $150/month.

Which, incidentally, is probably about 10x less than you would spend compiling your software on "powerful cloud servers". :-)


For a fair comparison, what about comparing against the cheapest "powerful cloud server"?

I mean, Hetzner has a reputation for renting bare metal servers at the cheapest prices in the market. Try the AX102, which has very close performance to an M3 Max (CPU only): https://www.hetzner.com/dedicated-rootserver/matrix-ax/

The OP's solution has a lot of advantages, like owning the device outright and including a GPU, but at least we do have cloud servers available at comparable cost.


Indeed! That server is very close to my M3 Max. I stand slightly corrected :)

Worth noting: the monthly cost is close to my 18-month amortized cost.


I tried a lot to use remote servers for development when I had an Intel MacBook and I found the experience to always be so frustrating that I upgraded to the M series. Have the tools gotten any better or is vscode remote containers still the standard?


I did use them several years ago, for Clojure and ClojureScript development. Docker and docker-compose were my main tools, with syncthing helping synchronize source code in real time, Emacs as the editor. Everything worked quite well, but was never as easy and smooth as just running everything locally.


vscode remote containers are still the standard, but I find them very usable nowadays. My setup is an MBP M2 that I use to remote into a Windows WSL setup at home, a Linux desktop at work, and various servers. Nesting remote SSH + remote Docker works seamlessly; that was previously a major headache.


In your case it makes sense to get the most performant machine you can get even if it means you're paying a ton more for marginal gains. This is not usually true for the general public.


The general public can buy an M1 MacBook Air for $799 if they need a laptop at all. An Air will serve them well for a long time.


Love my Air...


As a Clojure/ClojureScript developer myself, I just wonder what you do that makes compilation such an important part of your workflow while at the same time not needing as much RAM as possible? Yes, the MacBook Pro isn't bad at all for Clojure(Script) development. I was pretty angry that the Lenovo ThinkPad T14 Gen3 has a full channel of soldered RAM and just a single slot for expansion, since I really use a lot of RAM and would prefer to go with 64 GB full dual-channel and not a hybrid 48 GB with 32 GB being dual-channel and 16 GB being single-channel. (Yes, it does actually work.) Most builds that I do are done asynchronously using GitHub Actions or similar. Yes, it does take some time, but the build+deploy isn't that time sensitive.

In addition to the hardware, the OSX software is so much better, with flawless speed, productivity, and multitasking with gestures. Try doing the desktop switching on Windows. On a side note, I would gladly use the cloud if internet speeds and latency came down to a negligible level - we developers are an impatient lot.


"Engineers" - ironically the term used in the software industry for people who never standardize anything, solve the same problem solved by other "engineers" over and over again (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, also vehemently argue about software being "art" might like OSX, but even that is debatable. Meanwhile actual Engineers (the ones with the license) the people who need CAD and design tools for building bridges and running manufacturing plants stay far away from OSX.


I did EE in college but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use, and stuff like that, only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).

I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.


Is it truly bad UI?

They may be locked in, which just forces things. Not an endorsement.

However, they may also be really productive with whatever it is. This could be an endorsement.

In CAD, as an example, there are often very productive interaction models that seem obtuse, or just bad, to people learning the tools for the first time.

Improving the first-time ramp-up to competence nearly always impacts the pro user too.

Where it plays out this way, I have always thought the UI was good in that the pros can work at peak efficiency. It is hard to beat them.

Fact is, the task complexity footprint is just large enough to make "good" - as in simple, intuitive - interfaces impossible.


You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.


What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.


It's not an "observation", it's someone making it up. Why are you so upset if I disagree?


Making up what? Go drop by your nearby shop. My hair stylist constantly complains about the management software they use and the quality of the payment integration. At work I constantly hear complaints about shitty, slow IDEs. At the optician's, the guy has been complaining about the inventory system.

People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually sunk cost fallacy involved.


There are only two types of software: those that people hate and those that nobody uses (a paraphrase)


<painfully earnest nerd>

Well actually, I use FreeBSD as my daily driver (on a used ThinkPad I bought for 300 euros), and I love it. :D

</painfully earnest nerd>

okay, now you're going to tell me that FreeBSD is in the "software nobody uses" category, aren't you?


This is not a reasonable way to infer the sentiment of hundreds of millions of people in different countries, different business, different situations, etc, etc.

Disguising it as an "observation" is even more ridiculous.


Indeed I’m not ready to defend it, it is just an anecdote. I expected the experience of using crappy professional software to be so universal that I wouldn’t have to.


Sure, and this is where I will ask you to post a list of "good" professional software so I can google all the bugs in that software :)

Nah, I'm good. Believe what you want to believe my friend.


>They seem to have an incredible tolerance for bad UI.

Irrelevant.

Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks. It needs to be utilitarian and that's it, like a power drill or a pickup truck; it doesn't need to look pretty, since they're not targeting consumers but solving a niche set of engineering problems.

Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant; their company pays for the tools and for their tolerance of those tools, which is part of the job description and the pay.

Unless you run your own business, you're not gonna turn down lucrative employment because on site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, it's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.


Do you disagree with the sentence before the one you quoted? I think we basically agree, you came up with a bunch of reasons that

> I think engineers using software should not be seen as an endorsement.


> my job is to clock in at 9 and clock out at 5

Where can I find one of those jobs?


I was speaking figuratively. Obviously everyone has different working hours/patterns depending on the job market, skill set and personal situation.

But since you asked, Google is famous for low workloads. Or Microsoft. Or any other old and large slow moving company with lots of money, like IBM, Intel, SAP, ASML, Airbus, DHL, Siemens, manufacturing, aerospace, big pharma, transportation, etc. No bootstrapped "agile" start-ups and scale-ups, or failing companies that need to compete in a race to the bottom.

Depends mostly on where you live though.


France. You'll get a two hour lunch break too.


And you will usually leave at 18:30.


If you look at creative pros such as photographers and Hollywood ‘film’ editors, VFX artists, etc. you will see a lot of Windows and Linux, as people are more concerned about getting absolute power at a fair price and don't care if it is big, ugly, etc.


Oh, I'm sure there are lots of creatives who use OSX - I don't mean to suggest nobody uses it, and I'll admit it was a bit in jest, poking fun at the stereotype. I'm definitely oldschool, but to me it's a bit cringe to hear "Oh, I'm an engineer.." or "As an engineer.." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think silicon valley people would benefit from talking to technical people who are building bridges and manufacturing plants and cars and hardware and chips and all this stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye opening, and very humbling.


I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?


Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?


I do. Because for all issues it has, it is still much better than whatever Windows has to offer.

> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?

At least my WiFi doesn't turn off indefinitely during sleep until I power cycle the whole laptop because of a shitty driver.


So what, Windows does the same. Printers [1], WiFi [2], VPN [3], Bluetooth devices [4], audio [5] - and that's just stuff I found via auto-completing "windows update breaks" on Google in under 5 minutes.

The only problem is that Apple is even worse at communicating issues than Microsoft is.

[1] https://www.bleepingcomputer.com/news/microsoft/microsoft-wa...

[2] https://www.bleepingcomputer.com/news/microsoft/microsoft-fi...

[3] https://www.bleepingcomputer.com/news/microsoft/microsoft-sa...

[4] https://www.forbes.com/sites/gordonkelly/2019/06/12/microsof...

[5] https://www.theregister.com/2022/08/22/windows_10_update_kil...


The big difference is that Microsoft - at least usually - confirms and owns the issues.

With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.


That's anti-competitive and frustrating, but not an argument against the value of a pure Apple hardware ecosystem.


Which was not the point. The question was who would be choosing Windows over macOS. I would and this is one of the reasons why.


People overwhelmingly choose windows world-wide to get shit done. That answers the who.


So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, interoperability considerations, no budget considerations, and the users have a choice in what they use?

I.e. the same functionality exists with no drawbacks and money is no object.

And they chose Windows? Seriously why?


More choice in hardware. More flexibility in hardware. UI preferences. You can't get a Mac 2 in 1 or a Mac foldable or a Mac gaming notebook or a Mac that weighs less than a kilogram. You can't get a Mac with an OLED screen or a numpad. Some people just prefer the Windows UI too. I usually use Linux but between MacOS and Windows, I prefer the latter.


We use the sales metrics and signals available to us.

I don't know what to say except resign yourself to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is!


You seem to have some romanticized notion of engineers and to be deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? You think it somehow dilutes the prestige of the word "engineer"? Makes it less elite or what?


"deeply offended" - My default response to imposters is laughter. Call yourself Lord, King, President, Doctor, Lawyer whatever - doesn't matter to me. I'd suggest you to lighten up.


They hate you because you speak the truth. Code monkeys calling themselves engineers really is funny.


"silicon valley people would benefit from talking to people who build chips", that's a good one!


It would be funny, if it wasn't also sad to see the decline.


Do you have an engineering degree ?


Yes, a bachelors and a masters.

Not that the degree means much, I learnt 90% of what I know on the job. It certainly helped get my foot in through the university brand, and alumni network.

You can call yourself anything you want Doctor, Lawyer, Engineer. I have the freedom to think my own thoughts too.


I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".

There was a brief sidestep from the late 1980s to the early 2010s (~2012) where the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!

But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.

I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".

Reference: [1]: I took the terms "engineer" and "computer" literally out of the movie "Hidden Figures" https://en.wikipedia.org/wiki/Hidden_Figures#Plot

It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.


Why point out Calculus as opposed to just Math?

Might be just my Eastern Europe background where it was all just "Math" and both equations (that's Algebra I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.

Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.


In my central European university we learned "Real Analysis", which was way more concerned with theorems and proofs than with "calculating" something - if anything, actually calculating derivatives or integrals was a warmup problem to the meat of the subject.


Calculus, because all of engineering depends critically on the modeling of real world phenomena using ordinary or partial differential equations.

I don’t mean to disregard other branches of math — of course they’re useful — but calculus stands out in specific _applicability_ to engineering.

Literally every single branch of engineering. All of them. Petrochemical engineering to Biotech. They all use calculus as a fundamental block of study.

Discovering new drugs using Pk/Pd modeling is driven by modeling the drug<->pathogen interaction as cycles using Lotka models.

I'm not saying engineers don't need to learn stats or arithmetic. IMO those are more fundamental to _all_ fields, janitors or physicians or any field really. But calculus is fundamental to engineering alone.

Perhaps, a begrudging exception I can make is its applications in Finance.

But every other field where people build rockets, cars, airplanes, drugs, or ai robots, you’d need proficiency in calculus just as much as you’d need proficiency in writing or proficiency in arithmetic.


True, we learnt calculus before college in my home country - but it was just basic stuff. But I learnt a lot more of it including partial derivatives in first year of engineering college.

>I think I used Calculus more during electrical engineering than for computer/software engineering.

I think that was OP's point - most engineering disciplines teach it.


Yeah computer science went through this weird offshoot for 30-40 years where calculus was simply taught because of tradition.

It was not really necessary through all of the app-developer eras. In fact, it's so much the case that many software engineers graduating from 2000-2015 or so work as software engineers without a BS degree. Rather, they could drop the physics & calculus grind and opt for a BA in computer science. They then went on to become proficient software engineers in the industry.

It’s only after the recent advances of AI around 2012/2015 did a proficiency in calculus become crucial to software engineering again.

I mean, there's a whole rabbit hole of knowledge on the reason why ML frameworks deal with calculating vector-Jacobian or Jacobian-vector products. Appreciating those and their relation to the gradient is necessary to design & debug frameworks like PyTorch or MLX.
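To make that concrete, a minimal sketch (assuming PyTorch; the toy function, variable names and shapes are made up for illustration, not taken from any framework's docs):

    import torch
    from torch.autograd.functional import vjp, jvp

    def f(x):
        # toy function from R^3 to R^2
        return torch.stack([x[0] * x[1], x[1] * x[2].sin()])

    x = torch.randn(3)
    u = torch.randn(2)  # cotangent, same shape as f(x)
    v = torch.randn(3)  # tangent, same shape as x

    _, vjp_val = vjp(f, x, u)  # u^T @ J_f(x), shape (3,) -- what backprop chains together
    _, jvp_val = jvp(f, x, v)  # J_f(x) @ v,   shape (2,) -- the forward-mode counterpart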

Sure, I will concede that a sans-calculus training (BA in Computer Science) can still be sufficiently useful to working as an ML engineer in data analytics, api/services/framework design, infrastructure, systems engineering, and perhaps even inference engineering. But I bet all those people will need to be proficient in calculus the more they have to deal with debugging models.


That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.


In mine it was mandatory; there were 9 + 9 + 4.5 credits of calculus alone. There was way more: discrete math, algebra...


You’re right it’s a total guess. It’s based on my experience in the field.

My strong “opinion” here comes from an observation that while calculus may have been a required subject of study in awarding engineering degrees, the reality is, people didn’t really study it. They just brushed through a couple of pages and wrote a few tests/exams.

In America there's a plethora of expert software engineers who opt for a bachelor's degree in computer science that is a BA, not a BS.

I think that's a completely reasonable thing to do if you don't want to grind out the physics and calculus courses. They are super hard after all. And let's face it, all of the _useful to humanity_ work in software doesn't require expertise in physics or calculus, at least until now.

With AI going forward it’s hard to say. If more of the jobs shift over to model building then yes perhaps a back to basics approach of calculus proficiency could be required.


Most software engineering just doesn’t require calculus, though it does benefit from having the understanding of functions and limit behaviors that higher math does. But if you look at a lot of meme dev jobs they’ve transitioned heavily away from the crypto craze of the past 5 years towards “prompt engineering” or the like to exploit LLMs in the same way that the “Uber for X” meme of 2012-2017 exploited surface level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low skill employees, LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.


Hmm, that is an interesting take. Calculus does seem like the uniting factor.

I've come to appreciate the fact that domain knowledge has a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices in terms of approaching design in a standardized way, so we can just churn out code w/o an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and domain experts to utilize a visual interface to implement their ideas.

It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But the domain knowledge overpowered technical knowledge and eventually people started asking funny questions like

https://stackoverflow.com/questions/41365723/why-is-my-pytho...
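For anyone who hasn't seen the effect, a rough illustration (timings are machine-dependent, and this is not the code from that question): NumPy's dot product dispatches to an optimized BLAS, so the Python-level code is just orchestration and easily beats an element-by-element loop:

    import time
    import numpy as np

    n = 1_000_000
    a, b = np.random.rand(n), np.random.rand(n)

    t0 = time.perf_counter()
    slow = sum(x * y for x, y in zip(a, b))  # interpreted, element by element
    t1 = time.perf_counter()
    fast = a @ b                             # vectorized, BLAS-backed
    t2 = time.perf_counter()

    print(f"loop: {t1 - t0:.3f}s  numpy: {t2 - t1:.4f}s")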


I agree a hundred percent that domain knowledge is the single most dominant influence to problem solving expertise.


there was some old commercial that had the tagline "performance is nothing without control". If you can't put the technology to work on your problems then the technology, no matter how incredible, is worthless to you.


This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).

I guess I just enjoy things more when I can count them.


For this kind of engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear Algebra, combinatorics, probability and number theory are more relevant.

Calculus was important during the world wars because it meant we could throw shells at the enemy army better, and that was an important issue during that period.

Nowadays, calculus is just a stepping stone to more relevant mathematics.


Calculus has never gone out of style ;)

Today's ML frameworks grapple with the problem of “jacobian-vector products” & “vector-jacobian products” as a consequence of understanding the interplay between gradients & derivatives, and the application of the “chain rule”. All three of those concepts are fundamentally understood by being proficient in calculus.
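Spelled out (a sketch in standard notation, not quoting any particular framework's documentation): for $f:\mathbb{R}^n \to \mathbb{R}^m$ with Jacobian $J_f(x)$,

    JVP: $v \mapsto J_f(x)\,v$ (forward mode),   VJP: $u \mapsto u^{\top} J_f(x)$ (reverse mode),

and for a scalar loss $L(x) = g(f(x))$ the chain rule gives

    $\nabla L(x) = J_f(x)^{\top}\,\nabla g(f(x))$,

i.e. the gradient is exactly a VJP chained through each layer, which is what reverse-mode autodiff exploits.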

While I’m being the hype-man for calculus I don’t mean to say proficiency in linear algebra or statistics is in any “less necessary” or “less useful” or “less challenging” or “less..” in any way.

I’m merely stating that, historically, calculus has been the unique branch of study for engineering. Statistics has always found value in many fields — business, finance, government policy etc.

Sure Linear algebra is one of those unique fields too — I kinda like to think of it as “algebra” in general and perhaps its utility has flowed in tandem with calculus. Idk. I haven’t thought super hard about it.


Calculus is continuous, analog math. Digital Computers use discrete math.

Both are math, and both are still incredibly important. Rockets haven't gone out of style.


Aren’t you supposed to learn calculus to be able to understand what O(n) even is? Is it not a standard part of a CS major?


You have precisely captured why I got interested in AI.


They also drive trains


From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.

There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.

And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.


Besides Windows having more drivers for USB adapters than Linux*, which is a reflection of the market, I find Linux has far fewer glitches with multiple screens.

Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.

* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.


>I find Linux having much fewer glitches using multiple screens.

Maybe as long as you don't need working fractional scaling with different DPI monitors, which is nothing fancy now.


Nitpick: it hasn’t been called “OS X” for almost eight years now, starting with macOS Sierra.


I’ve been doing art on a pro level for twenty five years and I dislike multiple monitors.


I am just commenting about what I've seen at concept artist desks / animation studios / etc.


Why is that?

I am not an artist and also dislike multiple monitors, though I will employ two of them on occasion.

My reasons are:

If the window and application management doesn't suck, one display is all one needs.

With cheap multiple displays and touch devices came an ongoing enshitification of app and window management. (And usually dumber focus rules)

Having to turn my head x times a day sucks.


Who do you think writes those CAD and design tools that help “actual engineers” solve the same problems over and over?


Would you like me to explain how it works to you? I'm not sure why you added a question mark.


Yes, they were asking you a question. Do you not understand question marks?


I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!


> Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!

My MBP has been running without any restart for over a month.


Yes. I'm pretty sure my wife's 2014 Macbook Air has gone 6 months without a restart. My Windows 11 workstation, however, has never managed a week. I power down daily now to avoid disappointment.


> who never standardize anything

IETF RFCs will soon number over 10K; Java, win32, and the Linux kernel syscall API are famous for backward compatibility

not to mention the absurd success of standard libraries of Python, Rust, PHP and certain "standard" projects like Django, React, and ExpressJS

> (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?)

considering the design space is enormous and the tradeoffs are not trivial ... it's good to have libraries that fundamentally solve the same thing but in different, context-dependent ways

arguably we are using too many libraries and not enough problem-specific in-situ DSLs (see the result of Alan Kay's research the STEPS project at VPRI - https://news.ycombinator.com/item?id=32966987 )


I'd argue almost all NEW library development is about politics and platform ownership. Every large company wants to be the dependency that other projects tie into. And if you don't want to hitch your wagon to google or facebook or whoever, you roll your own.

Many if not most computational problems are fundamentally about data and data transformation under constraints - Throughput, Memory, Latency, etc, etc. And for the situations where the tradeoffs are non-trivial, solving this problem is purely about domain knowledge regarding the nature of the data (video codec data, real-time sensor data, financial data, etc) not about programming expertise.

The various ways to architect the overall high-level design in terms of client/server, P2P, distributed vs local, threading model, are, IME, not what I would call crazy complicated. There are standard ways of implementing various variations of the overall design, but sadly, because of an overall roll-your-own mindset, most devs are reluctant to adopt someone else's design. Part of that is that we don't have a framework of knowledge that allows us to build a library of these designs in our head where we can just pick the one that's right for our use case.

I don't agree with your characterization of the design space as 'enormous'. I'd say most programmers just need to know a handful of design types because they're not working on high-performance, low-latency, multi-million-endpoint scalable projects where, as you say, things can get non-trivial.

I'll give a shot at an analogy (I'm hoping the nitpickers are out to lunch). The design space for door knobs is enormous because of the various hand shapes, disability constraints, door sizes, applications, security implications, etc. And yet we've standardized on a few door knob types for most homes, which you can go out and buy and install yourself. The special cases - bank vaults and prisons and other domains - solve it their own way.


I challenge you to take those people who build bridges and have them build full software.

I am not saying whether software is engineering or not.

It is a fact, in terms of cost, that software and bridge building are, most of the time, very different activities with very different goals and cost-benefit ratios.

All those things count when taking decisions about the level of standardization.

About standards... there are lots of them too, and widely used, from networking protocols to data transfer formats... with well-known strengths and limitations.


In my 30+ year career I can confidently say that Software Engineers look towards standardisation by default as it makes their lives easier.

It feels to me that you're bitter or had more than one bad experience. Perhaps you keep working with, or come across, bad Engineers as your generalising is inaccurate.


Maybe we need a new moniker "webgineer". The average HN/FAANG web programmer does appear to vastly overestimate the value of their contributions to the world.


Have we done full circle?

When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:
- DevOps
- Server/Linux sysadmin
- DB admin
- Full stack (backend and frontend) engineer

And I might have forgot some things.


1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".


People who cobble together new printers or kettles overestimate the value of their contributions to the world too. The delineation isn't between JS devs and JPL or ASML engineers.


You can shit all you want on so-called "engineers", but they are the ones who make the CAD you're talking about that "real engineers" use. So get off your high horse.


You're kidding yourself if you think that mechanical, structural or any other engineers don't do the same thing. They do.

I worked for one of the UK's leading architecture / construction firms writing software and am also an amateur mechanic.

You'd be amazed at how many gasket types, nuts, bolts, fasteners, unfasteners, glues, concretes, bonding agents and so on there are ... all invented for edge preferences, and most of which could be used interchangeably.

Also standards? Hah. They're an absolute shitshow in any engineering effort.

I mean ... even just units of measure. C'mon.


And they can typically set up their dev environment without a VM, while also getting commercial app support if they need it.

Windows requires a VM, like WSL, for a lot of people, and Linux lacks commercial support. macOS strikes a good balance in the middle that makes it a pretty compelling choice.


There are a plethora of companies offering commercial support for various Linux distributions.


I was thinking more about software like the Adobe suite, Microsoft Office, or other closed source software that hasn't been released on Linux. Electron has made things a bit better, but there are still a lot of big gaps for the enterprise, unless the company is specifically choosing software to maintain Linux support for end users.

Sure, Wine exists, but it’s not something I’d want to rely on for a business when there are alternatives like macOS which will offer native support.


Most people don't need the Adobe Suite, and the web version of M$-Office is more than OK for occasional use. Most other enterprise software is web apps too nowadays, so it's much less relevant what OS your machine is running than it was ten years ago...


Excel is essential, and in most businesses that I worked with, most of the accounting and business side is run on it. I switched to Windows from Linux just because of Excel when WSL came out. If Linux had Excel and Photoshop it would be a no-brainer to choose it, but that will never happen


Yep, that's pretty much it.

Apple fanboys like to talk about how cool and long-lasting a MacBook Air is, but a $500 Chromebook will do just as well while covering pretty much 90% of the use cases. Sure, the top-end power is much lower, but at the same time, considering the base RAM/storage combo Apple gives, it is not that relevant. If you start loading it up, that puts the pricing in an entirely different category, and in my opinion the MacBook Air becomes seriously irrelevant when compared to serious computing devices in the same price range...


There's still a huge market for people who want higher end hardware and to run workloads locally, or put a higher price on privacy. For people who want to keep their data close to their chest, and particularly now with the AI bloom, being able to perform all tasks on device is more valuable than ever.

A Chromebook "does the job" but it's closer to a thin client than a workstation. A lot of the job is done remotely and you may not want that.


Yes, but for those people if you consider the price of a fully loaded MacBook Pro it is a rather small win considering all the limitations.

If the only things you care about are battery life (only if you plan to use it lightly on the go, because even high-end Apple Silicon sucks a decent amount of power at full tilt) and privacy, I guess they are decent enough.

This is my argument: the base models are at the same time overkill and too limited considering the price and the high-end models are way too expensive for what they bring to the table.

Apple has a big relevancy problem because of how they put a stupid pricing ladder on everything, but that is just my opinion, I guess. As long as they keep making a shit ton of cash it doesn't matter, I suppose. But if the relevant people stop buying Macs because they make no sense, it will become apparent why it matters sooner or later...


Not at all, a Chromebook lets you run Linux apps. I can run full-blown IDEs locally without problems. And yes, that is with 8GB RAM; ChromeOS has superb memory management.


Since the full blown IDE is running in a Linux VM, don't you mean, "Linux has superb memory management"?


Well, Google developed and deployed MGLRU to Chromebooks long before upstreaming it. Plus they use some magic to check the MGLRU working set size inside the VMs and balance everything.


Now I see. Interesting. (I'm planning to switch to ChromeOS, BTW.)


what chromebooks come with a mini LED HDR screen and insane battery life? i’d love to know


Are you seriously arguing about mini-LED displays, only found in the expensive MacBook Pro, when I mention a cheap $500 Chromebook? There is at least a 4x difference in price between those machines; it is ridiculous to even pretend they are somewhat comparable.

And if we are talking about expensive high-end hardware, mini-LED is worse than the OLED found in those machines anyway, so it's not like that would be a strong argument.


yes, because i find the argument that a chromebook comes close to a MacBook to be extremely disingenuous.

no, i'm a big fan of OLED, but for a laptop, especially a non-gaming one, the extra brightness and lack of burn-in concern make mini-LED better.


My argument isn't about Chromebooks vs any MacBook. My argument is against a base MacBook Air that is too expensive for relatively limited added utility against something like a cheaper Chromebook.

Sure, the MacBook Air is better built and will do some things better but those things are not extremely relevant for someone who would be satisfied by an entry level MacBook Air, because while an MBA has some very nice attributes, in the end everything is limited by its RAM/storage (and to a lesser degree, ports).

For a concrete example, in my country the cheaper MacBook Air you can get is the old M1 design at 1k€, then there is the M2 refresh at 1.2k€ and M3 variant at 1.3k€.

But you can get an Asus Chromebook Plus for 600€ that has either the same amount of RAM and storage, or more RAM (16Gb), or more storage (512Gb), depending on the variant you end up with. The display is worse (100 nits less bright and worse resolution) but slightly bigger (14"), and that may matter more to many people. It has an older Intel i5 (you can find some AMD options for better efficiency) but it hardly matters for the vast majority of people who just want a laptop to do the basics (basically the target of a MacBook Air). Its battery life would be a bit worse than an MBA's, but not in a way that is relevant for the vast majority of customers. One major advantage it has over an MBA is better port selection, with an HDMI port, a USB-A port and an SD card reader on top of the 2 Thunderbolt/USB-C ports the MBA has, allowing a dongle-free life without having to buy anything else and providing much better utility. That can be way more relevant for many people than better build quality (which I would argue does not even bring better longevity, since with Apple you are hostage to the software support anyway).

You see I am not against MacBooks; in fact, I would advise purchasing a MacBook Pro for some specific use case.

But the reality is that the entry level Apple hardware is rather compromised for its price, and if someone would be satisfied by that choice, I'm arguing that there is another choice (worse on paper, better in some ways) but at half the price (40% off minimum).

If you start loading up a MacBook Air, you end up in MacBook Pro price territory and it doesn't make a lot of sense to not add the 100-200 more to get the much better machine.

I know from experience that entry-level Apple hardware is a terrible deal, both because I made the mistake myself and because I had to help fix the issues for other people who made those choices. I have a cousin who reminds me every time how much he fucking hates his entry-level iMac (an old one with a compromised Fusion Drive and minimum RAM) even though it was rather expensive compared to most computers. My answer is always the same: you spent a lot, but not enough, because with Apple you do not deserve a good experience if you don't go above and beyond in your spending.

In my opinion it is way more disingenuous to advocate for entry-level Apple hardware to people who would be very much satisfied with products costing half as much. The value is just not there, Apple went way too far in the luxury territory in locking down everything and creating a pricing ladder that is engineered to confuse and upsell customers to extract as much money as possible.

For someone who really needs a computer to do more intense work, provided they can work around the tradeoffs of Apple Silicon/macOS and they are willing to spend a large amount of cash, Apple has some great hardware for sure. For everyone else the value is extremely questionable, especially since they are going full steam ahead into services subscriptions and the software can be lacking in ways that require purchasing even more stuff; the total cost of ownership doesn't make sense anymore. For example, their iPhone SE is absolutely terrible: at 500€ you pay for 6-year-old technology with a small screen compared to the footprint, terrible battery life, etc. A 500€ mid-range Android is so much better in so many ways that it is just stupid at this point...

As for OLED, I don't think burn-in is a significant concern anymore, and if it is, I would argue that you are using it too much like a desktop. In my country you could buy 2 decent OLED laptops for the price of an entry-level MacBook Pro anyway, so it doesn't matter as much (and replacing displays from hardware manufacturers other than Apple is much easier and cheaper, so there is that). I think the MacBook Pros are very good for some niche applications, but at a viable minimum price of 2.23k€ (16GB RAM/512GB storage) there are a lot of good options, so it really requires a careful analysis of the actual use case. If you do things related to 3D or engineering it is probably not worth it...


No mini LED, but you can configure the HP Elite Dragonfly Chromebook with a 1000 nits IPS display.

And AFAIK, Google dictates 10+h of battery life with mixed web browsing for all Chromebooks.


1000 nits is useless without HDR.


I'm pretty sure it does HDR too.


without mini LED / precise backlight control though, it’s useless


and backlight control


You usually don't need either for software development though, and if you do the free or online alternatives are often good enough for the rare occasions you need them. If you are a software developer and you have to spend significant time using Office it means you either are developing extensions for Office or your company management is somewhat lacking and you are forced to handle things you should not (like bureaucracy for instance).


Where I’m at my email is in Outlook. Having to use the web version sounds annoying. I also end up getting a lot of information in spreadsheets. Having to move all that to the online version to open also sounds annoying. The online version is also more limited, which could lead to issues.

I could see a front end dev needing Photoshop for some things, if they don’t have a design team to give them assets.

There are also security software the company says laptops must have which isn’t available for Linux. They only buy and deploy this stuff with Windows and macOS in mind.

A couple weeks ago on HN I saw someone looking for a program to make a demo of their app (I think). The comments were filled with people recommending an app on macOS that was apparently far and away the best option, and many were disappointed by the lack of availability elsewhere. I find there are a lot of situations like this, where I might be able to get the job done on another OS, but the software I actually want to use is on macOS. Obviously this one is a matter of taste to some degree.

It's not as big an issue as it was 20 years ago, but it's still an issue in many environments.


I use Linux for work with the MS apps used in a browser. I use one specific app using a remote desktop .. also using a browser.

So this can be done. I don't expect the IT support to help me with any Linux issues.

My excuse for using Linux? It makes me more effective at developing software.


If you mean WSL for containers, macOS needs a VM too. If you're doing C++, macOS dev tools are .. bleak. Great for webdev though


↑ This!

I would love to buy Apple hardware, but not from Apple. I mean: an M2 13 inch notebook with access to swap/extend memory and storage, a regular US keyboard layout and a proper desktop Linux (Debian, Alpine, Mint, PopOS!, Fedora Cinnamon) or Windows. MacOS and the Apple ecosystem just get in your way when you're just trying to maintain a multi-platform C++/Java/Rust code base.


WSL for normal stuff. My co-worker is on Windows and had to set up WSL to get a linter working with VS Code. It took him a week to get it working the first time, and it breaks periodically, so he needs to do it all over again every few months.


I'm developing on Windows for Windows, Linux, Android, and web, including C, Go, Java, TSQL and MSSQL management. I do not necessarily need WSL except for C. SSH is built directly into the Windows terminal and is fully scriptable in PS.

WSL is also nice for Bash scripting, but it's not necessary.

It is a check box in the "Add Features" panel. There is nothing to install or set up. Certainly not for linting, unless, again, you're using a Linux tool chain.

But if you are, just check the box. No setup beyond VS Code, bashrc, vimrc, and your tool chain. Same as you would do on Mac.

If anything, all the Mac specific quirks make setting up the Linux tool chains much harder. At least on WSL the entire directory structure matches Linux out of the box. The tool chains just work.

While some of the documentation is in its infancy, the workflow and versatility of cross platform development on Windows, I think, is unmatched.


This. I have to onboard a lot of students to our analysis toolchain (Nuclear Physics, ROOT based, C++). 10 years ago I prayed that the student had a Mac, because it was so easy. Now I pray they have Windows, because of WSL. The tool chain is all compiled from source. Pretty much every major version, and often also minor versions, of macOS breaks the compilation of ROOT. I had several release upgrades of Ubuntu that only required a recompile, if that, and it always worked.


Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or say JS in VSCode on Windows.


That sounds wild; you can run bash and unix utils on Windows with minimal fuss without WSL. Unless that linter truly needed Linux (and i mean, vscode extensions are typescript..) that sounds like overkill


Don't you need Cygwin or Git Bash if you don't use WSL? That's kind of fussy.


As Windows/UNIX developer, I only use WSL for Linux containers.


What do you mean without a VM? I guess you don't count docker/podman as VMs then?


Likely that most devs want to use Unix tools — terminal, etc.


Those aren't VMs -- they're containers.


Only on Linux - on MacOS and Windows, you have to do virtualization for containers.


Unless you do use WSL1 as your container runner. Nobody does this but I do not get why.


I can hardly imagine how that works for you, because WSL1 basically lacks any option for containers, be it namespaces or cgroups? Netfilter? Bridges?

Feel free to correct me and share successful cases with WSL1 and containers


WSL is not a VM. Edit: TIL WSL2 is a VM. I develop on mac and linux computers so should have kept my mouth shut anyways


Hey, you learned and corrected yourself, don't be so hard on yourself mate.


Seriously! I agree. They just modeled the best discussion with some of the highest value there is.

Being wrong is no big deal. Being unable to be right is often a very big deal.


Username highly inaccurate ;)


Downvoted for complimenting GP?

Stay classy, Hacker News


Just to make sure your TIL is complete, do note that Linux containers are VMs also on MacOS :)


> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.

I know engineers from a FANG who picked MacBook Pros in spite of the specs and only because of the bling/price tag. Then they spent their whole time using them as remote terminals for Linux servers, and they still complained about the things being extremely short on RAM and HD.

One of them even tried to convince their managers to give the Vision Pro a try, even though there were zero use cases for it.

Granted, they drive multiple monitors well with a single USB-C plug, at least with specific combinations of monitors and hubs.

It's high time that the "Apple sells high end gear" shtick was put to rest. Even their macOS treadmill is becoming tiring.


The build quality of Apple laptops is still pretty unmatched in every price category.

Yes, there are 2k+ laptops from Dell/Lenovo that match and exceed a similarly priced MacBook in pure power, but usually lack battery life and/or build quality.


Apple devices also work quite seamlessly together. iPads, for example, work great as a second screen wirelessly with the MBPs. I'd immediately buy a 14 inch iPad just for that, since that is so useful when not at your standard desk. Also copy-paste between devices or headphones just work...

If Apple came up with the idea to use an iPad as an external compute unit, that would be amazing... just double your RAM, compute and screen with it in such a lightweight form factor... should be possible if they want to


You can use the iPad as a second monitor on Windows too and it works nicely. I also use my AirPods Pro with my Dell XPS running Windows and it's perfect.


Is there now a low-latency solution for a second monitor on Windows? I was only aware of some software where the latency is quite bad, or one company that provided a wireless HDMI / DisplayPort dongle...

Also, the nice thing about headphones within the Apple ecosystem is that the AirPods automatically switch to wherever the attention is... meaning, e.g., if I'm watching something on the laptop and pick up an iPhone call (no matter whether via the phone or any app) the AirPods automatically switch


My 15 inch MacBook, which fried its display twice (it didn't go to sleep properly, was then put in a backpack, and overheated; there is no way to see that sleep didn't kick in), and which then had the broken display cable problem (widespread, and Apple wanted $900 for a new display..), would disagree. For comparison: the 4k touch display on my XPS 15 that didn't survive a Diet Coke bath was <$300, including labor for a guy to show up in my office and repair it while I was watching....


> The build quality of Apple laptops is still pretty unmatched in every price category.

I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.

How many USB ports does the new MacBook Air have? The old ones had two. And shipped with 8GB of RAM? These are shit-tier specs.

The 2020 MacBook Pros had a nice thing: USB-C charging, and you could charge them from either side. Current models went back to MagSafe, only on one side. The number of USB ports is still very low.

But they are shiny. I guess that counts as quality.


I guess we can agree to disagree, but I find the 2020 rev Macbook pros have a good number of USB-C ports (2 on the left, 1 on the right -- all can do PD), a magsafe charger, headphone jack, HDMI port and SD card slot. How many USB-C ports do you need? Sometimes I wish there was ethernet but I get why it's not there.

I agree, the butterfly keyboard was shitty but I absolutely love the keyboard on the 2020 rev. It's still not as great as my mechanical desktop keyboard, but for a laptop keyboard it's serious chef's kiss. Also, I have yet to find a trackpad that is anywhere as good as the Macbook. Precision trackpads are still way way worse.

Finally, the thing that always brings me back to MBPs (vs Surfacebooks or Razers) is battery life. I typically get a good 10+ hours on my MBP. Battery life on my old Razer Blade and Surfacebooks was absolutely comically horrible.


I'm absolutely not an Apple person. Privately own zero Apple hardware.

However there are two awesome things about my work MBP I would really want from my ThinkPad:

Magsafe charger - too many close calls!

And the track pad.

I can't work properly without an external mouse on my ThinkPad. But on the MBP everything just has the right size, location, proportions and handling on the track pad. I had a mouse for the MBP too but I stopped using it!


USB-C charging still works with the Pros (driving an M3 Max), and 3 ports seem reasonable to me.


> I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.

Yea, the butterfly was utter shit. And they fucked up the touchbar by not just putting it on TOP of the existing F-keys.

But the rest of the laptop was still well machined :D


the more they deviate from the BSD core the worse it gets.


But I can still fire up a terminal and use all of my *nix skills to operate.

I can't do that on Windows without either wrestling with PowerShell or WSL2


I don’t think it’s at all unreasonable for an engineer using a device for 8+ hours every day to pay an additional, say, 0.5% of their income (assuming very conservatively $100,000 income after tax, $1,000 extra for a MacBook, 2 year product lifespan) for the best built laptop, best screen, and best OS.


$100,000 after tax does not seem conservative to me (at least outside the US).


$50,000 income, 4 year product lifespan?

Obviously doesn’t apply to all engineers.


> and best OS

I do networking stuff and macOS is on par with Windows - I can't live on it without running into bugs or very questionable behavior for longer than a week. Same as Windows.


What stuff is weird? I have so far had very good experiences with Apple (although not iOS yet). Almost everything I do on my Linux workstation works on Mac too. Windows though is beyond horrible and different in every way.

> I do networking stuff

Me too, but probably very different stuff. I’m doing p2p stuff over tcp and am affected mostly by sock options, buffer sizes, tcp options etc.


> Best OS

I like Apple hardware, but their OS is fucking atrocious. In the year 2024 it still doesn't have a native volume mixer, or any kind of sensible window management shortcuts. Half the things on it have to be fixed with paid software. Complete joke of an OS; if it were up to me I'd stick a Linux distro on top of the hardware and be happy


The OS is not a joke since it can do some stuff better than either Windows or Linux can but I completely agree that there are some serious limitations or omissions that should have been fixed.

I think they don't because they have an incentive not to: they get a cut on all the software you have to purchase on the App Store to make up for it. It might not look like a lot, but if a large portion of Mac users need to buy a 5-10 bucks app to fix the window management problems, it becomes serious money at a 15-30% cut on millions of purchases...

And this is precisely the problem with Apple today. They are not honest enough to fix or improve the stuff they sell at a very high price, both because they sell it anyway and because they put in place many incentives for themselves to not do so.

There is the running meme of the iPad calculator, but macOS could also use some care; the calculator/grapher hasn't received serious attention in decades. At the price they sell their stuff, that would seem like a given, but considering they'll make money on the apps you buy to improve that situation, they'll never do so.

After using Apple App Stores for so many years, I wish I didn't, the convenience really isn't worth the cost down the road...


And the M1 chip on mine really hurts productivity. Every time we want to update a library, we need some kind of workaround.

It's great having a chip that is so much different than what our production infrastructure uses.


This should be a temporary problem solved with time. The battery and performance gains are completely worth most workarounds required.


Not worth it at all. I rarely use battery power, so I'd rather have an intel or AMD chip with more cores and a higher clock speed at the expense of the battery. Oh, and an OS that can actually manage its windows, and customize keyboard settings, and not require an account to use the app store


Then why are you using a Macbook in the first place? There are plenty of Ryzen 7000 and Intel Ultra laptops with similar performance out there. The key benefit of a Macbook is the battery life and sane sleeping when portable.


Tell that to my employer


Apple's hardware these days is exceptional but the software is left wanting in comparison. MacOS feels like it's been taking two steps back for every step forward for a decade now. I run MacOS, Linux w/ i3, and Windows every day, and outside of aesthetics & Apple integration, MacOS feels increasingly the least coherent of the 3.

The same is true of the ipad which is just a miraculous piece of hardware constrained by an impotent operating system.


This statement is completely wrong. There are millions of engineers in the world and most of them live in countries like China, India and Russia. Very few of them use MacBooks.

The vast majority of the software engineers in big companies (that employ a lot more people than big tech and startups combined) who use Java and C# also have predominately Windows laptops (as their employers can manage Windows laptops a lot easier, have agreements with vendors like Dell to buy them with a discount, have software like AV that doesn't support MacOS, etc.).

On top of that MacBooks don't have the best screens and are not the best built. Many Windows laptops have OLED screens or 4K IPS screens. There are premium Windows laptops made out of magnesium and carbon fiber.


I'm an American, so maybe the situation is different elsewhere.

Every company I've worked for during the last 12 years gives out MacBook Pros. And I've been developing using Scala / Java for the last 20 years.

Employers manage Macs just fine, this isn't 1999. There have been studies showing that Macs have lower IT maintenance costs compared to Windows.

I admit that I haven't dealt with Windows devices in a long time, and maybe there are some good ones available now, but I find your statements to be beyond belief. Apple Silicon Macs have blown the doors off the competition, outperforming all but top-end Intel laptops, while using a fraction of the power (and I never even hear the fans come on).


I think relatively few corporations are offering Macs to people. It's all bog-standard POS Dells, with locked-down Windows images that often do not even allow you to change the screensaver settings or the background image, in the name of "security." I'd love to be wrong about that.


Both jobs I've worked, both as a backend dev using Go in data-storage companies, have offered Macs. The first one, a small, badly run startup, only offered Macs. This gig, a larger company, offers Mac, Linux and Windows. I started with Linux and then switched to Mac because I was tired of stuff breaking.

I use Debian + FreeBSD on my personal PCs though.


US engineers, and those in countries of similar income; the rest of the world has pretty much settled on a mix of Windows and GNU/Linux desktops/laptops.


If it weren't for the OS I would've bought a MacBook instead of a Lenovo laptop.

I've set up my OS exactly as I want it. (I use arch btw ;-))


Arch works fairly well on Apple silicon now, though Fedora is easier/recommended. Limited emulation due to the 16KB pages, and no Thunderbolt display out.


Same, but on gentoo :-p


I’m freelance so I’ve absolutely paid for my last 3 Macbooks. They’re best in class tools and assets for my business.


Arguably the best OS? For what? For browsing the web, video editing, etc.? Maybe. For development? Jesus, macOS doesn't even have native container support. All the devs I know with macOS then either get a second Linux laptop, or spend a lot of their time SSHd into a Linux server.

For dev (at least backend and devops), macOS is not that great.


I don't know what you are talking about, I'm a back end engineer, and every company I've worked for during the last 12 years gives out MacBook pros to all devs. Even the game company that used C# and Mono gave out MacBooks (and dual booted them, which of course you can't do any more; I never bothered with Windows since our servers were written in Scala).

Not all teams run tons of containers on personal computers. All our servers are running on AWS. I rarely ssh into anything.

I like the fact that OS X is based on UNIX, and not some half-assed bullshit bolted onto Windows. I still have bad memories of trying to use Cygwin 15 years ago. Apparently WSL is an improvement, but I don't care.

Mac runs all the software I need, and it has real UNIX shells.


Yeah it's funny for all the hoopla I've heard over the glory of MacOS having a REAL UNIX TERMINAL, WSL works better in practice simply because it's running an actual Linux VM and thus the support is better.

Still, I just don't think it's that burdensome to get containers running on MacOS, it's just annoying that it happens to work worse than on Windows or Linux. Ignoring the hardware, the only real advantage to MacOS development is when you're targeting Apple products with what you're developing.


"best OS" is so subjective here. I'll concede that the MacBook hardware is objectively better than any laptop I've owned. But it's a huge leap to say Mac OS is objectively better than Linux IMO.


They are perhaps only the best by a very small margin.

I am happy to not support Apple's ecosystem and use a minimally worse laptop from a different brand.


I have one and hate it with a passion. A MacBook Air bought new in the past 3 years should be able to use Teams (alone) without keeling over. Takes over a minute to launch Outlook.

My 15 year old Sony laptop can do better.

Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.

I avoid using it whenever possible. If people email me, it’d better not be urgent.


I avoid using Outlook on any device, but I wouldn't complain about my Surface tablet's performance based on how poorly iTunes performs...


Meanwhile here I am, running linux distros and XFCE on everything. My hardware could be a decade old and I probably wouldn't notice.

(In fact I DO have a spare 13 year old laptop hanging around that still gets used for web browsing, mail and stuff. It is not slow.)


Indeed, I have a 15-year-old desktop computer that is still running great on Linux. I upgraded the RAM to the maximum supported by the motherboard, which is 8 GB, and it has gone through three hard drives in its life, but otherwise it is pretty much the same. As a basic web browsing computer, and for light games, it is fantastic.


It also performs pretty well for the particular brand of web development I do, which basically boils down to running VS Code, a browser, and a lot of ssh.

It's fascinating to me how people are still attached to the hardware upgrade cycle as an idea that matters, and yet for a huge chunk of people and scenarios, basically an SSD, 8gb of RAM and an Intel i5 from a decade ago could have been the end of computing history with no real loss to productivity.

I honestly look at people who use Apple or Windows with a bit of pity, because those ecosystems would just give me more stuff to worry about.


Is it an Apple silicon or Intel machine? Intel macs are crazy slow - especially since the most recent few versions of macOS. And especially since developers everywhere have upgraded to an M1 or better.


No MacBook Air from the last 3 years is Intel-based


You could certainly still buy new intel macbooks 3 years ago from Apple. Plenty of people did - particularly given a lot of software was still running through rosetta at the time.

The M1 Air was only released in November 2020. With a bit of slop in the numbers, it's very possible the parent poster bought an Intel Mac just before the M1 launched.


Yeah it's such a shame how much the performance has been affected by recent macOS. I kept my 2019 MacBook Pro on Catalina for years because everyone else was complaining... finally upgraded directly to Sonoma and the difference in speed was night and day!


Sounds a bit like my Intel MBP, in particular after they (the company I work for) installed all the lovely bloatware/tracking crap IT thinks we need to be subjected to. Most of the day the machine runs with the fans blasting away.

Still doesn't take a minute to launch Outlook, but I understand your pain.

I keep hoping it will die, because it would be replaced with an M-series MBP and they are way, way, WAY faster than even the best Intel MBP.


That’s not an issue with the Macbook but with MS. MS has an incentive to deliver such a terrible experience on Macs.


MS has literally thousands of managers running Outlook and Teams on their company-provided ARM MacBooks daily.


> Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.

It is Microsoft. I could rant all day about the dumpster fire that is the "NEW Microsoft Teams (Work or School)"

It's like the perfect shining example of how MS doesn't give a flaming fuck about their end users.


I will pile on on MS Teams. I am on a Mac and periodically have to fight it because it went offline on me for some reason and I am no longer getting messages. Slightly less annoying is when my iPhone goes to sleep and Teams on my iPhone then sets my status to "Away", even though I am actively typing on Teams on my computer.

And while my particular problems might be partially because I am on MacOS, I observe Windows-using colleagues have just as many problems joining meetings (either total refusal, no audio, or sharing issues). So I think using Teams as a measure of any computer is probably not warranted.


I suppose you like bloatware and ads in your taskbar and 49 years of patch Tuesday. Have fun with that. I’ll take Mac over any windows.


Teams is shit, and hangs and crashes on my Mac. I blame Microsoft for that.


Outlook (old) is okay on Mac. Teams is a dumpster fire on every platform.


This hasn't been true for a long time.


not machine learning devs


no, no, NO and yes.

I actually rejected a job offer when I heard I would be given a MacBook Pro.

Apple, being the most closed company these days, should be avoided as much as you can, not to mention its macOS is useless for Linux developers like me; anything else is better.

Its keyboard is dumb to me (that stupid Command/Ctrl key difference), and not being able to mouse-select and paste is by itself enough for me to avoid macOS at all costs.


> I actually rejected a job offer when I heard I would be given a MacBook Pro.

Probably best for you both.


> I actually rejected a job offer when I heard I would be given a MacBook Pro.

For what it's worth, I've had a good success rate at politely asking to be given an equivalent laptop I can put linux on, or provide my own device. I've never had to outright reject an offer due to being required to use a Mac. At worst I get "you'll be responsible for making our dev environment work on your setup".


I've had 50/50. These days I'm fairly okay with just taking the Macbook Pro. I did have one instance where I got one my first week and used my Dell XPS with Linux the entire 10 months I was at the place. I returned the Macbook basically unused.

Only one time did I interview with a place where I asked if I'd be given a choice what hardware/OS I could use. The response was "We use Windows". My response was, "no we do not. Either I will not be using Windows with you, or I will not be using Windows NOT with you". I didn't get an offer. I was cool with it.


I think I had similar feelings, but I kept an open mind and love my M2 Pro. Sometimes an open mind reaps rewards, friend.


I selected Mac + iOS devices when a job offered a choice, specifically to try out the option, while personally sticking with Windows and Android.

Now the performance of Mx Macs convinced me to switch, and I'll die on the hill of Android for life


> its keyboard is dumb to me(that stupid command/ctrl key difference)

Literally best keyboard shortcuts out of all major OSes. I don't know what weird crab hands you need to have to comfortably use shortcuts on Windows/Linux. CMD maps PERFECTLY on my thumb.


what amazing laptop must an employer give you to not be summarily rejected?


Anything that runs Linux; even WSL2 is fine. No macOS is the key. And yes, it costs the employer about half as much as the expensive Apple devices, which can't even be upgraded; their hardware is as closed as their software.


Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.


Provisioning is a place where Windows laptops win hands down, though.

Pretty much everything that goes wrong with provisioning involves going extra weird on hardware (usually for a cheap supplier) and/or pushing weird third-party "security" crapware.


macOS is clearly better for linux devs than Windows, given it is unix under-the-hood.

I don't even know what you mean by mouse-select and paste.


> "I don't even know what you mean by mouse-select and paste."

Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command).

macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it.

For example: https://github.com/lodestone/macpaste


On Windows these days, you get WSL, which is actual Linux, kernel and all. There are still some differences from a standalone Linux system, but they are far smaller than on macOS, where not only is the kernel completely different, but the userspace also has many rather prominent differences that you will very quickly run afoul of (like different command line switches for the same commands).

Then there's Docker. Running amd64 containers on Apple silicon is slow for obvious reasons. Running arm64 containers is fast, but the actual environment you will be deploying to is almost certainly amd64, so if you're using that locally for dev & test purposes, you can get some surprises in prod. Windows, of course, will happily run amd64 natively.
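
For anyone who hasn't hit this yet, a rough illustration of the mismatch (assuming Docker Desktop or a similar runtime on an M-series Mac; the image is just a stock alpine one):

  # Force an x86-64 image: runs under emulation on Apple silicon, so noticeably slower
  docker run --rm --platform linux/amd64 alpine uname -m   # prints x86_64

  # Use the native arm64 variant: fast locally, but not what an amd64 prod fleet runs
  docker run --rm --platform linux/arm64 alpine uname -m   # prints aarch64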


> "userspace also has many rather prominent differences ... (like different command line switches for the same commands)."

Very quickly remedied by installing the GNU versions of those commands, ie: "brew install coreutils findutils" (etc)

Then you'll have exactly the same command line switches as on Linux.
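
One caveat, just so nobody is surprised: Homebrew installs these with a "g" prefix by default (gls, gfind, gdate, ...). To get the unprefixed GNU names ahead of the BSD ones, you also want the gnubin directories on your PATH, roughly like this (paths as printed by the formulas' own caveats; double-check with `brew info coreutils` on your machine):

  brew install coreutils findutils
  # put the GNU versions first on PATH so plain ls/find/xargs are the GNU ones
  export PATH="$(brew --prefix)/opt/coreutils/libexec/gnubin:$PATH"
  export PATH="$(brew --prefix)/opt/findutils/libexec/gnubin:$PATH"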


> the actual environment you will be deploying to is almost certainly amd64

that’s up to your team of course, but graviton is generally cheaper than x86 instances nowadays and afaik the same is true on google and the other clouds.


Arm is an ISA, not a family of processors. You may expect Apple chips and Graviton to be wildly different, and to perform completely differently in the same scenario. In fact, most Arm CPUs also have specific extensions that are not found in other manufacturers' chips. So yes, while both recognize a base set of instructions, that's about it - expect that everything else is different. I know, amd64 is also technically an ISA, but there you have 2 major manufacturers with very similar and predictable performance characteristics. And even then, sometimes something on AMD behaves quite differently from Intel.

For most devs, doing crud stuff or writing high-level scripting languages, this isn't really a problem. For some devs, working on time-sensitive problems or with strict baseline performance requirements, this is important. For devs developing device drivers, emulation can only get you so far.


What are you responding to here?

No, I said you won’t always be deploying on amd64. Because arm64 is now the cheapest option and generally faster than the sandy bridge vcpu unit that amd64 instances are indexed against (and really, constrained to, intentionally, by AWS).

I never said anything about graviton not being arm64.


It's not about price, it's about compatibility. Just because software compiles on a different ISA doesn't mean it behaves the same way. But if that isn't obvious to you, good for you.


On most Linux environments: text you highlight with the mouse (or highlight by double/triple clicking) can be "pasted" by middle-clicking.


And it's a separate clipboard from Ctrl+C/right-click-and-copy. The number of times I miss that on non-Linux...


Personally, I use tmux on both Linux and macOS to get multiple clipboards and the mouse behaviour I’m used to.
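
For anyone curious, a minimal sketch of the tmux bits that give you that (commands and options are from the tmux man page; the paste binding assumes the default prefix):

  # enable mouse mode: dragging with the mouse copies into tmux's own buffer
  tmux set -g mouse on

  # tmux keeps a stack of paste buffers, which effectively gives you several clipboards
  tmux list-buffers      # see what has been copied so far
  tmux paste-buffer      # paste the most recent buffer (prefix + ] does the same)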


Engineers loving tools is peak HN :).


No, no, no, yes.


If we are honest vanity signaling is a large part of it. Basically the Gucci bag equivalent for techies.


Honestly not. My tests run WAY faster on Apple Silicon, that's all I care about.


Not being contrarian, but what are you comparing?


M* has caused nothing but trouble for most mac user engineers I know (read: most engineers I know) who upgraded. Now not only are they building software for a different OS, they're building for a different architecture! They do all of their important compute in Docker, wasting CPU cycles and memory on the VM. All for what: a nice case? nice UI (that pesters you to try Safari)?

It looks like Apple's silicon and software is really good for those doing audio/video. Why people like it for dev is mostly a mystery to me. Though I know a few people who don't really like it but are just intimidated by Linux or just can't handle the small UX differences.


I'm an engineer that has both an apple silicon laptop (mbp, m2) and a linux laptop (arch, thinkpad x1 yoga.) I choose the mac every day of the week and it's not even close. I'm sure it's not great for specific engineering disciplines, but for me (web, rails, sre) it really can't be beat.

The UX differences are absolutely massive. Even after daily-driving that thinkpad for months, Gnome always felt kinda not quite finished. Maybe KDE is better, but it didn't have Wayland support when I was setting that machine up, which made it a non-starter.

The real killer though is battery life. I can work literally all day unplugged on the mbp and finish up with 40-50% remaining. When i'm traveling these days, i don't even bring a power cable with me during the day. The thinkpad, despite my best efforts with powertop, the most aggressive frequency scaling i could get, and a bunch of other little tricks, lasts 2 hours.

There are niceties about Linux too. Package management is better and the docker experience is _way_ better. Overall though, i'd take the apple silicon macbook 10 times out of 10.


Battery life followed by heat and fan noise have been my sticking points with non-mac laptops.

My first gen ThinkPad X1 Nano would be an excellent laptop, if it weren’t for the terrible battery life even in power save mode (which, as an aside, slows it down a lot) and its need to spin up a fan to do something as trivial as driving a rather pedestrian 2560x1440 60hz display.

It feels almost like priorities are totally upside down for x86 laptop manufacturers. I totally understand and appreciate that there are performance oriented laptops that aren’t supposed to be good with battery life, but there’s no good reason for there being so few ultraportable and midrange x86 laptops that have good battery life and won’t fry your lap or sound like a jet taking off when pushed a little. It’s an endless sea of mediocrity.


> The thinkpad, […], lasts 2 hours.

This echoes my experiences for anything that needs power management. Not just that the battery life is worse, but that it degrades quickly. In two years it’s barely usable. I’ve seen this with non-Apple phones and laptops. iPhone otoh is so good these days you don’t need to upgrade until EOL of ~6 years (and even if you need it battery is not more expensive than any other proprietary battery). My last MacBook from 2011 failed a couple of years ago only because of a Radeon GPU inside with a known hw error.

> There are niceties about Linux too.

Yes! If you haven’t tried in years, the Linux desktop experience is awesome (at least close enough) for me – a dev who CAN configure stuff if I need to but find it excruciatingly menial if it isn't related to my core work. It’s really an improvement from a decade ago.


I'd like to offer a counterpoint: I have an old-ish T480s which runs Linux Mint and several LXD containers for traefik, golang, python, postgres and sqlserver (so not even dockerized, but full system containers running these services), and I can go the whole morning (~4-5 hours).

I think the culprit is more likely the power hungry intel CPU in your yoga?

Going on a slight tangent: I've tried but do not like the Mac keyboards; they feel very shallow to me, hence why I'm still using my old T480s. The newer ThinkPad laptop keyboards all seem to be going that way too (getting thinner), much to my dismay. Perhaps a P14s is my next purchase, despite its bulk.

Anybody with a framework 13 want to comment on their keyboard?


I really like the keyboards on my frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.


Interesting. I do similar (lots of Rails) but have pretty much the opposite experience (other than battery life - Mac definitely wins there). Though I use i3/Sway more than Gnome. The performance of running our huge monolith locally is much better for Linux users than Mac users where I work.

I used a Mac for awhile back in 2015 but it never really stood out to me UX-wise, even compared to Gnome. All I really need to do is open a few windows and then switch between them. In i3 or Sway, opening and switching between windows is very fast and I never have to drag stuff around.


This is going to change once Arm on Linux becomes a thing with Qualcomm's new jazz. I am mostly tethered to a dock with multiple screens. I have been driving Ubuntu now for over 4 years full time for work.


>The UX differences are absolutely massive.

Examples?


In my experience as a backend services Go developer (and a bit of Scala) the switch to arm has been mostly seamless. There was a little config at the beginning to pull dual-image docker images (x64 and arm) but that was a one time configuration. Otherwise I'm still targeting Linux/x64 with Go builds and Scala runs on the JVM so it's supported everywhere anyway; they both worked out of the box.
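
For reference, the "seamless" part is mostly Go's built-in cross-compilation; a minimal sketch (the package path and output names are just examples):

  # from an arm64 Mac, build for the Linux targets actually being deployed to
  GOOS=linux GOARCH=amd64 go build -o bin/server-linux-amd64 ./cmd/server
  GOOS=linux GOARCH=arm64 go build -o bin/server-linux-arm64 ./cmd/server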

My builds are faster, laptop stays cooler, and battery lasts longer. I love it.

If I was building desktop apps I assume it would be a less pleasant experience like you mention.


The pain for me has been in the VM scene, as VirtualBox disappeared from the ecosystem with the switch to ARM.


Interestingly enough, the trend I am seeing is all the MacBook engineers moving back to native development environments. Basically, no longer using docker. And just as expected, developers are getting bad with docker and are finding it harder to use. They are getting more and more reliant on devops help, or they lean on the team member who is on Linux to handle all of that stuff.

We were on a really great path for a while there in development, where we were getting closer to the ideal of having development more closely resemble production, and to have developers understand the operations tools. Now we're cruising firmly in the opposite direction because of this Apple switch to arm.

Mainly it wouldn't bother me so much if people would recognize that they are rationalizing because they like the computers, but they don't. They just try to defend logically a decision they made emotionally. I do it too, every human does, but a little recognition would be nice.


It's not even a problem with MacBooks as such. They are still excellent consumer devices (non-casual gaming aside). It's this weird positioning of them as the ultimate dev laptop that causes so many problems, IMO.


Why would excellent machine be blamed for shitty software?


Because machines are tools meant to perform tasks, and part of that is being interoperable with other tools and de facto standards in the relevant field. For dev work, today, MacBook is not good at it.


Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud.

Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).

Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale.

And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.


Virtualization and Docker are orthogonal technologies. The reason you use Docker, especially in dev, is to have the exact same system libraries, dependencies, and settings on each build. The reason you use virtualization is to access hardware and kernel features that are not present on your hardware or native OS.

If you deploy on docker (or Kubernetes) on Linux in production, then ideally you should be using docker on your local system as well. Which, for Windows or MacOS users, requires a Linux VM as well.


It seems that you're trying to "educate" me on how containers and virtualization work, when in fact I've been doing this for a while, on macOS, Linux and Windows (itself having its own Hyper-V pitfalls).

I know you mean well, though.

There is no Docker on macOS without a hypervisor layer - period - and a VM, though there are multiple possible container runtimes not named Docker that are suitable for devops-y local development deployments (which will always, of course, be constrained in comparison to the scale of lab / staging / production environments). Some of these can better leverage the Rosetta 2 translation layer that Apple provides, than others.


I'm sorry that I came across as patronizing; I was trying to explain my confusion and thought process rather than to teach you about virtualization and containers.

Specifically what confused me in your comment was that you were saying Docker on Mac was superseded by their new native virtualization, which just doesn't make sense to me, for the reasons I was bringing up. I still don't understand what you were trying to say; replacing docker with podman or containerd or something else still doesn't have anything to do with virtualization or Rosetta, or at least I don't see the connection.

I should also say that I don't think anyone really means specifically docker when they talk about it, they probably mean containerization + image repos in general.


I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?


My niche is Ruby on Rails web dev, which is definitely not universal, but not all that narrow either!


You must have an unusual setup because, between Rosetta and rosetta in Virtualization.framework VMs (configurable in Docker Desktop or Rancher Desktop), I’ve never had issues running intel binaries on my Mac


I’m doing Ruby on Rails dev too. I don’t notice a huge difference between macOS and Linux for how I work.

There’s quirks to either OS.

E.g. when on Gnome it drives me mad that it won’t focus a recently launched app.

On macOS it annoys me that I have to install a 3rd party util to move windows around.

Meh, you just adapt after a while.


what's wrong w/ Rails on M chips? I don't recall having had much trouble with it (except w/ nokogiri bindings right when the M1 was first available, but that's a given for any new release of OSX)


We have to cross-compile anyway because now we're deploying to arm64 Linux (AWS Graviton) in addition to x86 Linux.

So even if all developers of your team are using Linux, unless you want to waste money by ignoring arm64 instances on cloud computing, you'll have to setup cross compilation.
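
A rough sketch of what that setup often looks like with Docker buildx (this assumes a buildx builder already exists, and the registry/image name is made up):

  # one image manifest covering both x86 instances and Graviton
  docker buildx build \
    --platform linux/amd64,linux/arm64 \
    -t registry.example.com/myapp:latest \
    --push .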


1) Macs are by far the best hardware, and performance running Intel code is faster than it was running Intel code on the previous Intel Macs: https://discourse.slicer.org/t/hardware-is-apple-m1-much-fas... 2) they should use Safari to keep power usage low and browser diversity high


It's basically required for iOS development. Working around it is extremely convoluted and annoying.


I forgot to mention that as an obvious exception. Of course developing for Apple is best on Apple hardware.


I strongly suggest putting in the time to learn how to install and maintain a Linux laptop ... Ubuntu 24.04 is a great engineering platform.


It is, provided that the hardware vendor has reasonably decent support for power management, and you're willing to haul around an AC adapter if not. In general, I really like AMD hardware with built-in graphics for this, or alternately, Intel Tiger Lake-U based hardware.

Asahi Linux is shockingly great on Apple Silicon hardware, though.


I disagree.

Apple is selling hardware and scaling AI by utilizing it is simply a smart move.

Instead of building huge GPU clusters and having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.

In other terms, this is utilizing CPU power that customers already own.

On the other hand, this helps their marketing keep high price points, as Apple is now going to differentiate its CPU power, and therefore hardware prices, over AI functionality correlating with CPU power. This is also consistent with Apple stopping the MHz comparisons years ago.


Did you reply to the right comment? Feels like we’re talking about different things altogether.


What AI is Apple scaling?


I've seen MLX folks post on X about nice results running local LLMs. https://github.com/ml-explore/mlx

Also, Siri. And consider: you’re scaling AI on Apple’s hardware too; you can develop your own local custom AI on it, and there’s more memory available for linear algebra in a maxed out MBP than in the biggest GPUs you can buy.

They scale the VRAM capacity with unified memory and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
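
As a concrete (if hand-wavy) example of what that looks like in practice, here is roughly how the companion mlx-lm package runs a local model; the flags follow its README at the time of writing, and the model name is just an illustration:

  pip install mlx-lm
  # quantized 7B model from the mlx-community hub, runs fully on-device
  python -m mlx_lm.generate \
    --model mlx-community/Mistral-7B-Instruct-v0.2-4bit \
    --prompt "Explain unified memory in one paragraph."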


> there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.

But this hardly applies to 95% (if not more) of all people running Apple's hardware; the fastest CPU/GPU isn't worth much if you can't fit any even marginally useful LLM in the 8GB (or less on iPhones/iPads) of memory that your device has.


>Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.

Most CS professionals who write code have no idea what it takes to build a desktop, so the hardware that they chose is pretty much irrelevant because they aren't specifically choosing for hardware. The reason Apple gets bought by mostly anyone, including tech people, is the ecosystem. The truth is, nobody really cares that much about actual specs as long as it's good enough to do basic stuff, and when you are indifferent to the actual difference but all your friends are in the ecosystem, the choice is obvious.

You can easily see this yourself: ask these "professionals" about the details of the Apple Neural Engine, and there's a very high chance that they will repeat some marketing material, while failing to mention that Apple does not publish any real docs for the ANE, you have to sign your code to run on the ANE, and you basically have to use Core ML to utilize the ANE. I.e. if they really cared about inference, all of them would be buying laptops with discrete 4090s for almost the same price.

Meanwhile, if you look at people who came from EE/ECE (who, btw, on average are far better coders than people with a CS background, based on my 500+ interviews in the industry across several sectors), you see a way larger skew towards Android/custom-built desktops/Windows laptops running Linux. If you lived and breathed Linux and low-level OS work, you tend to appreciate all the power and customization that it gives you, because you don't have to go learn how to do things.


Coming from both environments, I'd be wary of making some of these assertions, especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus. This applies regardless of (RT)OS / hardware choice, i.e., it's simply common sense.

The signing of binaries is a part of adult developer life, and is certainly required for the platforms you mention as well.

Unquestionably, battery life on 4090-based laptops sucks on a good day, and if you're working long hours, the last thing you want to have to do is park yourself next to your 350W adapter just to get basic work done.


> especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus.

Very much not true. Not to make this personal, but this is exactly what I'm talking about: Apple fans not understanding hardware.

Linux has been through the wringer fighting its way to general use, thanks to its open-source nature and constant development. So in terms of working well, it has been optimized for hardware WAY further than Apple, which is why you find it on servers, personal desktops, phones, portable gaming devices, and even STM32 Cortex BLDC control boards, all of which run different hardware.

Apple doesn't optimize for general use, it optimizes for a specific business case. In the case of Apple silicon, it was purely battery life, which brings more people into the ecosystem. Single core performance is on par with all the other chips, because the instruction set doesn't actually matter (https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-...), multi core is behind, macOS software is still a pile of junk (Rosetta still isn't good across the board), the computers are not repairable, you have no privacy since Apple collects a shitload of telemetry for themselves, etc.

And, Apple has no incentive to make any of this better - prior to Apple Silicon, people were still buying Intel Macs with worse specs and performance for the same price, all for the ecosystem and vanity. And not only was the Macos still terrible (and much slower), you also had hardware failures like plugging in a wrong USBC hub would blow the chip and brick your Mac, butterfly keyboards failing, and questionable decisions like virtual esc keys.

>The signing of binaries is a part of adult developer life,

...for professional use, and the private key holder should be the person who wrote that software. I hope you understand how ridiculous it is to ask a developer to sign code using the manufacturer's key to allow them to run that code on a machine that they own.

>Unquestionably, battery life on 4090-based laptops sucks on a good day,

Well yea, but you are not buying that laptop for battery life. Also, with Ryzen cpus and 4090s, most get like 6-8 hours depending on use due to Nvidia Prime, which is pretty good for travel, especially if you have a backpack with a charging brick.

If you want portability, there are plenty of lighter weight option like Lenovo Yoga which can get 11-12 hours of battery life for things like web browsing.


Macbooks are not bang-for-buck. Most engineers I know buy it because it's like Windows but with Unix tools built-in.


I would be interested if there exists a single better-value machine in $ per hour than my partner's 2012 MacBook Air, which still goes.


Any decent laptop from the same era. My parents are currently using both HP ProBooks and Lenovo ThinkPads from that era; they are working perfectly, and maintenance costs are lower than for same-era MacBooks...

I own a MacBook Air, I won't be buying another purely because the moment I need to upgrade anything or repair anything it's effectively ewaste.


I haven't found any good proxy which works well with Cisco VPN software. Charles and Proxyman work intermittently at best and require disconnecting from the VPN and various such dances.

Fiddler on windows works flawlessly.


> Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing we no longer live in a world where tech are just lifestyle gadgets.

I notice your use of the weasel word "just".

We undoubtedly live in a world where Apple products are sold as lifestyle gadgets. Arguably it's more true today than it ever was. It's also a world where Apple's range of Veblen goods managed to gain footing in social circles to an extent that we have kids being bullied for owning Android phones.

Apple's lifestyle angle is becoming specially relevant because they can no longer claim they sell high-end hardware, as the difference in specs between Apple's hardware and product ranges from other OEMs is no longer noticeable. Apple's laughable insistence on shipping laptops with 8GB of RAM is a good example.

> Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.

I don't think so, and that contrasts with my personal experience. All my previous roles offered a mix of MacBooks and Windows laptops, and new arrivals opted for the MacBooks because they were seen as perks, while the particular Windows models on offer were not seen as impressive, even though they out-specced Apple's offering (mid-range HP and Dell). In fact, in a recent employee review the main feedback was that the MacBook Pro line was under-specced because at best it shipped with only 16GB of RAM, while the less impressive HP ones already came with 32GB. In previous years, they called for the replacement of the MacBook line due to the rate of keyboard malfunctions. Meaning, engineers were purposely picking the underperforming option for non-technical reasons.


I bought my first Apple product roughly 11 years ago explicitly because it had the best accessibility support at the time (and that is still true). While I realize you only see your slice of the world, I really cringe when I see the weasel-word "lifestyle". This "Apple is for the rich kids"-fairytale is getting really really old.


Apparently you’ve never used Apple Silicon. There’s no PC equivalent in terms of specs.

Also, I think you’re misunderstanding what a Veblen good is and the difference between “premium” and “luxury.” Apple does not create luxury or “Veblen” goods like for example, LVMH.

An easy way to discern the difference between premium and luxury — does the company advertise the product’s features or price?

For example, a Chanel handbag is almost entirely divorced from its utility as a handbag. Chanel doesn’t advertise features or pricing, because it’s not about the product’s value or utility, it’s what it says about your personal wealth that you bought it. That’s a Veblen good.

Apple heavily advertises features and pricing. Because they sell premium products that are not divorced from their utility or value.


price-performance is not a thing for a vast majority of users. Sure I'd like a $40k car but I can only afford a $10k car. It's not nice but it gets me from a to b on my min-wage salary. Similarly, I know plenty of friends and family. They can either get 4 macs for $1000 each (mom, dad, sister, brother) so $4k. Or they can get 4 windows PCs for $250 so $1k total.

The cheap Windows PCs suck just like a cheap car sucks (ok, they suck more), but they still get the job done. You can still browse the web, read your email, watch a youtube video, post a youtube video, write a blog, etc.. My dad got some HP celeron. It took 4 minutes to boot. It still ran though and he paid probably $300 for it vs $999 for a mac. He didn't have $999.


I’m not saying one or the other is better for your family members. But MacBooks last very long. We'll see about the M series but for myself for instance I got the M1 air without fans, which has the benefit of no moving pieces or air inlets, so even better. My last one, a MBP from 2011 lasted pretty much 10 years. OS updates are 8-10y.

> The cheap Windows PCs suck […], but they still get the job done

For desktop, totally. Although I would still wipe it with Ubuntu or so because Windows is so horrible these days even my mom is having a shit time with only browsing and video calls.

A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.


> A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.

To the person with no budget, all that doesn't matter. They'll still get the $250 laptop and put up with the garbage battery life (find a power outlet), infuriating trackpad (buy an external mouse for $10), bloatware (most users don't know about it and just put up with it), etc...

I agree Apple is better. But if your budget is $250 and not $1k then you get what you can get for $250 and continue to feed your kids and pay your rent.


But also you don't have to buy new. If I had $250, an ancient MacBook might be better than a newer low-end windows laptop. Though for my purposes I'd probably get an oldish Chromebook and root it.


Most of Apple’s money comes from iPhones.


You can get a laptop with a much bigger screen and a keyboard for as little as $100 to $300, and it will be much, much easier to get work done on than an Apple phone. So I think Apple is still very much a luxury product.


Spending your life on a phone is still a lifestyle "choice".


[flagged]


Countering a lazy reference with some weird racist stereotype was the best you could do?


It's not a stereotype, no one programs on phones here.


True, should have just said nonsense


Clumsily phrased. What I meant is that iPhones or similar priced smartphones are affordable and common for say middle class in countries with similar purchase power to Eastern European countries. You’d have to go to poorer countries like Vietnam or Indonesia for iPhones to be “out of reach”, given the immense value it provides.

Heck, now I see that even in Vietnam the iPhone is the #1 vendor, with 28% market penetration according to StatCounter. That’s more than I thought, even though I was just there…

Speaking of India, they’re at 4% there. That’s closer to being luxury.


I think US is their main market, though. The rest of the world prefers cheaper better phones and doesn't mind using WhatsApp for messaging, instead of iMessage.


As a single market, US is probably biggest. I’m seeing numbers that say that the “Americas” is a bit less than half of global revenue, and that would include Canada and all of South and Latin America. So the rest of the world is of course very important to Apple, at least financially.

> doesn't mind using WhatsApp for messaging

Well WhatsApp was super early and way ahead of any competition, and the countries where it penetrated had no reason to leave, so it’s not exactly like they settle for less. It has been a consistently great service (in the category of proprietary messaging apps), even after Zuck took over.


It's not about price-performance value at all. Mac is still the most expensive way to buy performance. And Apple is only particularly popular in the US. Android phones dominate most other markets, particularly poorer markets.

Apple is popular in the US because a) luxury brands hold sway b) they goad customers into bullying non-customers (blue/green chats) and c) they limit features and customizability in favor of simpler interfaces.

It's popular with developers because a) performance is valuable even at Apples steep cost b) it's Unix-based unlike Windows so shares more with the Linux systems most engineers are targeting.


I have never been an Apple fanboy. Until 2022, I was on Android phones. Work issued either ThinkPad or XPS variants. However, I have owned Apple laptops since 2004, starting from the Panther era. I sincerely believe that Apple provides the best combination of features and performance for the price in laptops.

Here I feel that the I-hate-Apple crowd is just stuck with this notion of a luxury overpriced brand when it is clearly not the case. Apple has superior hardware at better price points. Last time I was shopping for a laptop, I could get similar features only at a 30% - 40% price premium in other brands.

I am typing this on an Apple M2 Air; try finding similar performance under 2000 USD in other brands. The responsiveness, the (mostly) sane defaults and the superior rendering and fonts make it worth it. The OS does not matter so much as it used to in 2004, and the fact that I have a unix terminal in 2024 is just incidental. I have turned off auto updates and I do not use much of the phone integration apart from taking backups and copying photos.

I switched to an iPhone in 2022 from a 200 US$ Samsung handset. Here, I would say that not everyone needs an iPhone. My old phone used to do all the tricks I need on this one. However, the camera is really good and the photos are really great. If I buy an iPhone next time, it would be just for the photos it takes.

