New MacBook Air (apple.com)
574 points by xenonite 12 days ago | 576 comments

Keep in mind that probably the most important spec when considering a new laptop is one that is often not directly stated: the processor series.

I'm not talking about i3/i5/i7, but rather U/Y/H. This letter determines the TDP (thermal design power) the machine is designed to run at. The TDP governs the base clock speed and, just as importantly, the throttling behavior under load.

Processor series TDPs are Y: 4.5W, U: 15W, H: 45W.

The new MacBook Air appears to have a Y series processor, like the MacBook, which means it will be aggressively throttled to keep power consumption and heat generation low.

Practically, that means that the new Air will not be capable of running sustained workloads much above its base clock speed, which makes it unsuitable for many programming-related tasks.

The Pro is still a much better choice for programmers. The 13-inch is suitable for many things, but the 16-inch, with its H-series processor, is really preferred for computationally intensive work.

You can get away with this machine if your workflow primarily involves a text editor and remote servers, but otherwise I would still opt for the Pro.

The CPUs in the new MBA have 9W TDPs.

OEMs can still control load performance, it's not entirely determined by TDP.

Anandtech has a good article: https://www.anandtech.com/show/13544/why-intel-processors-dr...

Depending on OEM settings and cooling capacity, it's possible for Intel CPUs to indefinitely run at greater than their base clocks and TDP.

Totally. To back this up with a real world example, Apple screwed up the throttling of the i9 MacBooks originally, causing them to be over throttled. A software update that pretty much amounted to an MSR write on boot fixed it.
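To make the "MSR write" concrete: the register involved in a fix like that would be something like Intel's package power-limit MSR. Here's a hypothetical decoding sketch (bit layout per Intel's SDM for MSR_PKG_POWER_LIMIT, 0x610; the 1/8 W power unit is an assumed value of the MSR_RAPL_POWER_UNIT field, which varies by part):

```python
# Hypothetical sketch: decoding the PL1 (sustained power limit) field of
# Intel's MSR_PKG_POWER_LIMIT (0x610), the register a firmware/boot-time
# MSR write like the one described above would touch.

POWER_UNIT_W = 1 / 8  # assumed: MSR_RAPL_POWER_UNIT power-unit field = 3 -> 1/2^3 W

def decode_pl1(msr_value: int) -> dict:
    """Extract the sustained (PL1) power limit from a raw 0x610 value."""
    raw_limit = msr_value & 0x7FFF          # bits 14:0 - limit, in power units
    enabled = bool(msr_value & (1 << 15))   # bit 15 - PL1 enable
    return {"watts": raw_limit * POWER_UNIT_W, "enabled": enabled}

# A raw limit field of 120 units at 1/8 W per unit encodes a 15 W PL1 -
# the stock TDP of a U-series part.
print(decode_pl1((1 << 15) | 120))  # -> {'watts': 15.0, 'enabled': True}
```

This is exactly the knob OEM firmware can raise or lower, which is why two laptops with the same CPU can behave so differently under load.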


Why does Intel let OEMs play with that? It gives them a knob to balance performance and price for their segment. It lets OEMs cheap out on cooling (or go with new form factors like the tiny GPD models) and use the low minimums, or build a full system capable of higher TDP, like Apple's tend to be.

Macbook Air has extremely bad cooling (on purpose, because it's thin and quiet). Don't expect it to boost well.

"Boost well" probably means different things to different people.

On my fanless 12" MB with a 7W TDP m7-6Y75 (1.3 GHz base, 3.1 GHz max turbo), a Prime95 torture test stabilizes at the following for the CPU via Intel Power Gadget: 9.5W, 1.95 GHz, 90°C. (The laptop feels warm to the touch - it only ever feels "hot" under GPU load.) Normal all-thread load stabilizes around 2.5 GHz.

This is underestimating modern processors and overestimating programming compute resource usage.

Your post may apply if we're working with 4K+ resolution video files, rendering and other activities, but the modern programmer, even compiling binaries, will be fine on a MacBook Air.

How do I know?

I use a 2 generation old Macbook as my primary personal development machine. I write Go, Rust, Java and TypeScript using the common tool chains for all those languages.

Generalizing "programming" is a fool's errand at this point. For some, their IDE alone would warrant a pro-spec machine. Some people need to virtualize their development environments. Some people need to virtualize several interconnected environments. And for some people, programming is just a simple terminal session with tmux running - Vim on one screen, a shell in another, tests in another.

I totally agree that you can't fit all programmers into one bucket. I think we should be able to describe some general patterns in programmer workflows such that we can give concrete advice to programmers about what kind of hardware they would need.

A perfect example of this is the recent Level1Techs video about compiling an Unreal Engine game on the AMD Threadripper 3990X[1]. Some concrete advice:

1. Often the heaviest aspect of a programmer's workflow isn't compiling the code, but running automated tests using VMs and stuff

2. Compilers tend to benefit from large cache sizes and good single-threaded performance; multi-threaded performance only makes a difference in limited scenarios.
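Point 2 is basically Amdahl's law. A quick sketch, with an assumed parallelizable fraction (80% here; real builds vary, since link steps and the hottest translation units are often serial):

```python
# Amdahl's-law estimate: if a build is ~80% parallelizable (assumed figure),
# extra cores stop paying off quickly.

def speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup for a given core count under Amdahl's law."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for n in (2, 4, 8, 64):
    print(n, round(speedup(0.8, n), 2))
# 2 cores -> 1.67x, 4 -> 2.5x, 8 -> 3.33x, but 64 cores -> only 4.71x
```

Past a handful of cores, single-threaded speed and cache dominate, which is why a fast quad-core can be close to a slow 16-core for everyday builds.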

I've also heard that storage performance, and in particular storage latency, can make a big difference on compiler performance, but the difference is starker on Windows than on Linux because NTFS is a lot slower than many Linux file systems (EXT4, XFS, etc.).

The thing is, we as programmers don't have a good mental model for how the software tools we use (IDEs, virtualization/docker, compilers) scale with modern hardware, and very few technology reviewers are producing content that would help us understand.


You haven't been forced to deploy something that builds a local K8s cluster and builds and runs 10+ containers just to get a microservices app up and ready for development, I see... (only partially /s).

That's a good point, and outside of my personal development workflow. Condolences!

I don't get this argument. Run local for basic testing, else use it as an ssh machine. I work with super computers. The power of my development machine usually doesn't make much difference, I spend a lot of my time ssh'd anyways. So if something has a nice screen, good battery life, and a comfortable keyboard then I'm good. As long as it works for presentations and can compile my latex files, I'm good.

I don't see Airs as development platforms. They are portable platforms.

It doesn't even take that much. Docker Desktop on MacOS is widely known to be terrible and have extremely poor performance. Running webpack-dev-server with filewatching through Docker nearly grinds my 2017 Macbook Pro to a halt. Obviously most node/webpack/JS stuff can run natively on Mac but it's not uncommon to see dev stacks that dockerize everything because that's how the production stack works.

You can improve this substantially on MacOS using docker-sync (the issues with Docker performance on Mac are primarily related to the file system)

I wrote a little about this; it's for Rails but the same ideas apply:


docker-sync helps a little bit, and so does limiting Docker to 1 CPU core, but I still routinely get spurts of 100% CPU from Docker that, despite supposedly being limited to 1 core, still grind my machine to a halt.

I often have up to 10 VSCode editors, an IntelliJ window and two browser instances open.

Add some server processes, UI watchers/builders, and my Thinkpad starts to freeze up. In an ideal world, I would have a low-latency remote desktop going, but even in 2020 it seems like a pipe dream.

I’ve been using paperspace cloud machines with parsec for low latency Remote Desktop. Originally used it for gaming. It’s remarkable how good it is for everything else too.

5ms encoding lag, 10ms network lag, totally workable.

That's my work pretty much except not k8s but it does spin up between 4 and 8 containers (depending on which system) some of which require 4GB of RAM or more (don't ask..).

On the flip side, standard-issue laptops are now 16" MacBook Pros with 32GB of RAM, which is pretty solid hardware.

Just because there are needs for high-end CPUs for some developers it doesn't mean that all programmers need them, as OP's comment implied.

Well my "programming laptop" spends all of its CPU these days on corporate garbage such as splunkd, which regularly melts down, some kind of corporate managed software updater which is terrible, Zoom, and trying to render dashboards from Grafana which is among the most flagrantly wasteful javascript hacks ever devised. It doesn't help me much that vim and screen are efficient.

In fairness, as a Rust dev my computer sounds like it's a jet engine anytime I compile. I wouldn't even think about using a slimmer CPU. Not that you're wrong, it's just where my head is at haha.

I thought the same thing, but then I got a MacBook Air and couldn't believe how slow it was in practice. I didn't know if it was thermals or memory bandwidth or something, but Java compiles which used to take 5 minutes would suddenly take 30 minutes and it just wasn't even really tractable to use.

Generally speaking, if you need a heavy computational workload for more than a few minutes, the MacBook Air doesn't fit. The 3.4GHz boost only lasts seconds; in the long run it can only sustain about 2GHz.

Not to mention previous MacBook Airs were dual-core only.

Even Chrome brings my machine to a halt. I've been running Docker recently and 8GB of RAM is instantly gone. Sometimes I think most folks are simply more patient with their computers and will wait a few seconds for operations to finish, while I get a feeling something is taking too long and start messing around with ps aux, killing processes and such...

You can lower Docker's resource limits in its settings. I use it on my Air.

I agree with you :) but let's not tell the employers who gladly buy the top-of-the-line models for their programmers ;) I am quite happy with my new 2020 MBP.

programming != running a text editor.

A lot of use cases involve running a replica of the server environment locally in resource-intensive containers.

I'm lucky to have development access to a small slice of a very large vmware cluster.

I'm surprised no one else operates this way.

The performance gains are immense.

Or how about just having a workstation at work? E.g. the Dell XPS 15 is a very popular laptop, but an equivalent cost workstation is around 8x faster for sustained well-parallelizable loads. Use a KVM switch with your work monitor, or use it as a headless server, do whatever. Want to work on machine learning stuff? Throw in an RTX 2080 Ti, off you go, never worry about cloud costs. Are you I/O bound? RAID0 off a couple NVMe drives.

It's immensely flexible and I'm surprised it's not more common.

It's more cost upfront, which is often a show-stopper. "Why can't you spin a beefy instance on EC2 when you need it for these 2-3 hours of intense work? It's so much cheaper!"

Also, a cloud VM, if it breaks, is replaced "for free" and "instantly"; fixing a physical machine has a cost and is a delay. Unless you're big enough to keep a spare (probably makes sense for many dozens of machines), it also looks a bit uneconomical from the business's POV.

Never mind that your cloud costs in a year end up being comparable to said workstation.
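Back-of-envelope for that claim, with assumed prices (an on-demand 16-vCPU instance at roughly $0.68/hr, c5.4xlarge-class, versus a ~$2,500 workstation amortized over 3 years - both figures are illustrative, not quotes):

```python
# Rough cloud-vs-workstation cost comparison under assumed prices.

INSTANCE_PER_HOUR = 0.68       # assumed on-demand rate for a 16-vCPU instance
HOURS_PER_YEAR = 8 * 250       # an 8-hour workday, ~250 working days
WORKSTATION_PRICE = 2500       # assumed, amortized over 3 years

cloud_per_year = INSTANCE_PER_HOUR * HOURS_PER_YEAR
workstation_per_year = WORKSTATION_PRICE / 3

print(round(cloud_per_year), round(workstation_per_year))  # 1360 833
```

Even keeping the instance up only during working hours already costs more per year than the amortized box, before storage and egress.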

That was my experience at previous employers, but not my current one. We have some shared test environments in the cloud, but individualized test environments are discouraged. They're in the process of rolling out more powerful workstation laptops to provide us with enough horsepower to do testing locally.

That's why I'm tempted to build a custom workstation at home: so that I always have access to the compute resources I need to do my job.

Sometimes latency issues. I work from home office.

Don't we all :)

There are other people who operate that way. But lots of them use AWS or some other provider for that instead of local resources.

I'm using a 4-year-old Thinkpad with a dual-core U-series i7 and 32GB of RAM. The results are...adequate.

A typical compile for me takes about twice as long compared to colleagues with newer H-series processors, but the individual projects are pretty small so that often isn't a huge deal.

However, once you try and run nearly a dozen docker containers for local testing, things slow down quite a bit. Multitasking while those tests are running is very difficult.

Yeah, I'm writing C++ on a 2013 Macbook Air. This would be a _huge_ upgrade for me.

That is peanuts. Have you tried opening a browser and surfing the world wide web?

Depends on what IDE you use. WebStorm pushes even my octa-core pretty hard. Sure, some people prefer to work like it's still the '60s, though.

Compiling routinely pegs all 12 cores on my top-spec MBP for 5-10 minutes at a time. YMMV.

What about large excel files with a bunch of pivot tables and v/xlookups?

I picked up the new 2018 Air right after the redesign, the 16GB and 512GB SSD model. I have had no problems with Xcode doing iOS dev (even using the simulator on occasion); VS Code and Node.js/Redis/MySQL work fine; also no problems running Windows in Parallels Desktop. It's super fast and quiet!

Great machine, only wish I had waited for the true tone and keyboard redesign but those are minor, have grown to love the keyboard.

Same here. No problems doing development. The Achilles' heel? Work conferencing with Google Meet. Kills the poor machine.

I did the same but got 32GB. It is a fantastic machine, but it is even better with the blackmagic eGPU. Heavy recommend on that if considering a home dock setup.

Can someone explain chip speed and relevance in a way I'd understand pretty please? I dialled out of this stuff years ago - at a point when bigger numbers just meant better, but now they just seem the same?

I'm still using a 2013 MBA as my primary machine day to day. It has an i7 and 8GB of RAM. I've been waiting for the keyboard change before upgrading, but today I see my options are i5 and i7 (same as in 2013). I'd get 16GB of RAM, but I literally have no clue about the chip stuff. I'm assuming we've moved on somewhat from 2013?!

Any help much appreciated! Thanks in advance!

For Intel chips i3/i5/i7 have generally represented feature ranges, so to really identify the chip you have to find out their product code or at least processor generation. Once you have that you can use ARK to distinguish between them. Here's an example of an older 2017 era desktop processor of i5 and i7: https://ark.intel.com/content/www/us/en/ark/compare.html?pro...

Complicating things recently is the introduction of the Xeon W and i9 lines, but for the most part the Xeon parts just add a few new features and the i9s have higher core counts to compete with AMD.

In broad strokes, the majority of laptop enhancements since 2013 have been: higher RAM capacity (which Apple, as a manufacturer, has chosen not to pursue compared to Lenovo, etc.), still lower overall power consumption to improve battery life, DDR3 to DDR4 RAM, bigger L1/L2/L3 caches, the usual improvements to integrated graphics to support bigger onboard and external displays, and PCIe enhancements for drives, as NVMe-style drives are what I would consider the new standard.

Thank you!

Purely in terms of the options on the 2020 MBA:

i3 - dual core
i5 - quad core
i7 - quad core with a slightly higher clock speed

That is it. Apple has kept it fairly simple most of the time.

Worth noting the i7 is only very, very slightly faster, as you are limited by thermals. I don't think it is worth the price. If you ever need the processor power, then the current (and future) MBA isn't for you.

And in all honesty, the CPU performance between your 2013 MBA and this MBA isn't all that different if you are getting the dual core. After all, your 2013 CPU had 15W of thermal headroom to play with; this newer CPU only has 9W peak (and 7.5W average).

But the overall package of the MacBook Air is still many times better: screen quality, ports, SSD speed, speakers. Not sure if they have updated the webcam; if not, it will be worse than your 2013 MBA's.

Same question here. My 2011 Air is basically unusable now, so I've been waiting for this machine. i3, or $300 more for the i5 (which was my first inclination). Then you get into the i5 machine and they offer an i7 for another $150.

My work won't be processor intensive, just a personal machine. But given how long I kept my old one, I intend to keep this one for a long time, and I'm not sure where the best price:performance tradeoff is. I could argue going low-end and replacing it sooner, or high end, and keeping it forever.

Great to hear others getting the same mileage as me - my late 2012 Air, bought in March 2013, is still going strong with only a battery replacement so far. 4GB RAM, 128GB SSD. Used daily for browsing, watching videos online, building IoT software, playing with unikernels. The only time it hasn't performed was when it was used last year as a work machine while waiting for a MacBook to arrive - and that was down to the collection of corporate stuff required. Once I got the new work machine, I removed all the work stuff, and the Air went back to being the solid daily hack it has always been. In the same time my friends with other laptops have gone through several cycles.

It's largely turned into a doorstop after the battery started dying a year or so ago. I bought a $70 battery from a random seller on the advice of a friend who had good luck with the fit and instructions. Replacement was a breeze, and it seemed great at first, but it pretty quickly started just dying on me. I use the laptop so rarely that, no matter what the state of charge was last time I used it, the next time I pick it up (usually a week or more later) it's dead and needs power.

that said, about a year or two ago was also when it just started feeling too slow to be usable (beachballs all the time). I blame web bloat, mostly :-/

I bought it new, so I definitely got my money's worth out of it.

Well I'm on a 2016 macbook air, 8gb RAM and doing web programming everyday. 3 phpstorm windows, sometimes Photoshop open, more than 20 tabs in Chrome, iterm, figma, paw... Honestly it runs pretty well and I was waiting for this update.

Lucky you, because yours still has a U processor, not a Y one.

I'll wait for some benchmarks then. Maybe there's not much improvement. I see Apple is comparing the last model's 2 cores with the new 4-core chip and claiming a 2x improvement.

Assuming PHP you just described "a text editor and remote servers"

Apparently PHPStorm and Photoshop aren't "text editors" one would imagine.

PhpStorm = Big, Java-based IDE. It's not some simple text editor.

PHP Dev often means some sort of local server running too.

big complex IDEs are almost never compute bound.

compilers are often compute bound (certainly on SSDs).

web servers are almost never compute bound.

You severely underestimate JetBrains IDE.

any IDE that is compute bound should be thrown out, especially for a language that is never compiled.

One has to wonder about the carbon footprint of millions of developers running these monstrous IDEs.

Yeah, but that's a lot of us. Obviously there are some programmers out there doing computation-bound work, but an awful lot of us aren't. If you're doing web stuff or making apps, then basically any computer manufactured in the last 5-8 years is going to be fine. I do iOS side work on a 2012 quad-core Mac Mini. It's totally fine.

It's OK to geek out about hardware, if that's your thing. But it's not really necessary for most people -- even programmers.

I think you’ll be happy with the update, especially if you were happy with the previous machine. Remember you can try it out for 14 days risk free, so if you have any doubts you have time to test it thoroughly before committing. https://www.apple.com/shop/help/returns_refund

How are you going to return it if all Apple stores are closed due to COVID-19?

If you are running something computationally intensive isn't a desktop more suitable?

45W will get your laptop pretty warm and noisy and will drain your battery.

I'm on a 7200u (i5, 15W) and I could really use the extra oomph (rounding things, it'd be 40% extra performance assuming a cubed increase in power consumption vs performance improvement -- which is pretty close to what I've seen in practice) of it being 45W.
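For anyone checking the parent's arithmetic, the cube-root rule of thumb (performance scaling with roughly the cube root of power - an assumed heuristic, not a datasheet figure) works out like this:

```python
# Rule-of-thumb estimate: perf ~ power^(1/3).
# Going from a 15 W U-series part to a 45 W H-series part:

extra_perf = (45 / 15) ** (1 / 3) - 1
print(f"{extra_perf:.0%}")  # ~44%, in the same ballpark as the "40% extra" above
```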

Been looking at the newer gaming notebooks, and I'm waiting for the "real world confrontation" of 10th-generation i7s (especially those that have AVX-512) vs Renoir (lots of cores, AVX2 in a single instruction, and - though not as much as desktop - lots of cache), since I run some numeric-heavy code and it's not quite clear how they'll stand up to each other.

Why do I run this on a notebook and not a desktop? Because I'm always on the move, and I can pack the notebook and go anywhere I have to and be able to work with or without internet connection (which is sometimes the case).

You might be interested in the ThinkPad P1 (Gen 2). It has the option of not including a dedicated GPU, which will save a considerable amount on battery power, while still packing in a 6-core i7 H-series processor.

But I am also interested in the upcoming processor generations.

Thanks for the suggestion. I was willing to wait a bit more to see that standoff: What will get me more bang out of the processor, the "super" architecture of Zen 2, or the coveted AVX512?

The P1 comes with a 9th gen i7, I assume they'd eventually get a gen 3 with 10th gen i7 and then I'll be able to compare.

> What will get me more bang out of the processor, the "super" architecture of Zen 2, or the coveted AVX512?

By chance, have you seen the article from Puget Systems on how to use MKL with Ryzen[1]? It explores some of the trade-offs between AMD and Intel for numeric computing.

[1] https://www.pugetsystems.com/labs/hpc/How-To-Use-MKL-with-AM...

Yeah, I'd seen that, great link! Waiting to see something like that with the mobile processors.

> 45W will get your laptop pretty warm and noisy and will drain your battery.

So put it on a stand and plug it in? Problem solved.

> isn't a desktop more suitable

Some people need to carry their computer around, so the laptop has to be a compromise between stationary computing power and portability. The MacBook Pro strikes this balance quite well all around. Of course there are bigger and heavier laptops, if you require even more performance and are not limited by portability.

> isn't a desktop more suitable

No it isn't any more suitable, because you can put your laptop on a stand and plug it in to achieve the same thing.

It's not the same thing by far: Laptops have much worse heat dissipation than desktops, they overheat, and your high-powered CPU gets throttled down.

I find it a bit opaque that they do not advertise exactly which series/gen they are putting on the machine. Can't find it even on the Tech Specs page.

They don't advertise their memory models or disk manufacturer either. Probably typical customer does not care. Wait for teardowns if you're interested in those details.

I'd say anyone buying a Pro cares at least about the generation.

I'm a FE developer (mainly React/Node/React Native development) and I'm still using a 2012 13" MacBook Air (8GB) as my daily driver at home. There's a noticeable difference between it and my work (granted, still relatively old) 2017 MBP. Running builds and an entire test suite maybe takes twice as long, but it's overall very usable and I see no reason yet to upgrade.

> but it's overall very usable and I see no reason yet to upgrade.

HiDPI. Once I had used a HiDPI screen for a while, I can't really tolerate LoDPI displays anymore.

I agree. I'm on a 2012 15" MBPr, and I can see myself using this laptop for a long time still.

>You can get away with this machine if your workflow primary involves a text editor and remote servers, but otherwise I would still opt for the pro.

I bought a 2018 15 inch MBP with 6 core i9, 32 GB of ram and vega GPU with the idea that it would be my one stop development machine. The system is terrible, especially considering the outrageous price - it does have sufficient compute power but the thermals are insanely bad - it's constantly turning the fans to 100% and I can't even tune the power usage - when I run a VM/emulator along the IDE people in the office start turning their head in my direction because of the fan noise.

And the performance still isn't even close to a mid range desktop machine which would have cost 1/4 the price.

I'm seriously considering selling my MBP, buying this MacBook Air top config (I need an OSX machine to develop iOS/OSX apps, otherwise I would go for the Lenovo X1 Carbon), and building a desktop that's always on VPN for heavy lifting + can replace my gaming console.


I have a MacBook Pro 2018 and Intel NUC running Linux with exactly the same CPU. In typical C/C++/Rust builds, the Linux NUC is perceptibly much faster than the MacBook.

Didn't dive much deeper, but I'd guess the difference is thermals (despite the NUC having a small enclosure) and much higher system call overhead in macOS.

That said, I still like having a Mac as well, since there are so many great applications that I use frequently. But a sweet spot (as you suggest) is a reasonably powerful MacBook for the things that macOS is good at and a powerful Linux workstation or server for compute.

The 2019 Pro completely fixed the thermal issues, it is actually considered to have pretty impressive thermals now.


Unlike most other Y-series devices (including the MacBook), the MacBook Air has a fan. This allows it to dissipate more heat and sustain higher clock rates.

Yep, so many people don't think about that when they look at laptops and it's so annoying.

FWIW, the Y CPU in the Air is set at 9W instead of the usual 4.5W

This is why there is a huge difference between 8th-gen-or-later and earlier Intel mobile CPUs. As of the 7th gen, the only quad-core mobile CPUs were the 45W TDP HQ-series, which required a totally different cooling design. Then, with the 8th gen, all the regular low-TDP CPUs got quad cores and the 45W parts started offering six cores.

For example, the only ThinkPad T-series of that generation with a quad-core was the T470p, which had a significantly different design in terms of batteries, Thunderbolt features, etc., to accommodate the significantly larger cooling system. With the very next year's models, all of them got quad cores with the usual low-voltage/low-power (and low battery usage) design.

The MacBook Air always seemed like a tablet with a keyboard to me. I have seen ordinary people complaining that they can't multitask apps on it, that they have to run one app at a time. I think the Air was never supposed to be used as a development or video-processing machine.

Where do you even find this info? Even on the Tech Specs page for the macbook air on apple.com it says "3.0GHz 6-core Intel Core i5 Turbo Boost up to 4.1GHz 9MB shared L3 cache" and doesn't give the model number.

> Practically, that means that the new Air will not be capable of running sustained workloads much above its base clock speed, which makes it unsuitable for many programming-related tasks.

Does that still hold when using the power adapter?

OP's comment was about thermal dissipation, which, if anything, would be a more difficult problem with the power adapter (since the processor doesn't need to be throttled to conserve battery, for example). So, yes.

Thanks. I guess the question then is how long the mid and max clock rates can be sustained.

So for peak load, it’s an upgrade compared to the previous generation MacBook Air. But for sustained load, it seems to actually be a downgrade, as the previous gen still had processors of the U series.

> as the previous gen still had processors of the U series.

No? The base model had an i5-8210Y.


I am sorry; by previous generation I meant the non-Retina version from 2016, which indeed had U-series processors.

I've been issued an MBP at work and am now thinking of picking up a 5-year-old MBP to work at home, since switching between two OSes is overwhelming at this point. Is this a good idea?

Yes. What kind of work do you do that prevents you from taking the MBP home?

The nice thing about this one is the lack of the touch bar.

> The Pro is still a much better choice for programmers.

I'm not totally sure I agree. I typically do a lot of programming on my MacBook Air. It's simply a box I use to ssh to something else more powerful to do the actual runs.

Yes, if you're running computationally intensive stuff on your MacBook Air you're going to be disappointed, but if what you're mostly doing is typing into a terminal screen then it's probably perfect. And probably even over-powered? :)

That's exactly what he said at the end of his comment.

Learned something new, thanks!

The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience. The inverted-T arrow keys help you fly through lines of code, spreadsheets, or game environments.

This isn't progress. This is the baseline. Apple have gone from bad to OK, and they're celebrating as though they've achieved something amazing.

We all (most of us anyway) wanted them to go back to the scissor design. Are we going to now complain that they did what the community has been begging them to do? Was butterfly a mistake? Yes. Were they slow to correct the issue? Yes. Now that they fixed it we should be happy about it.

As far as talking about it being amazing, it's called marketing spin. This is how it works. However, those two sentences do not say anything about it being amazing. It simply focuses on the positive features of the keyboard. The two sentences above clearly communicate to Mac users that the company has fixed the problems that people wanted fixed. Did you really expect a bunch of public self-flagellation? They are telling us clearly that they did what we asked for. Perfect.

I think it's the marketing copy most people are taking issue with.

They tried a new design, which was horrible to use and had a high failure rate. They continued to claim the new keyboard was amazing, and stubbornly continued to use this crappy keyboard long after the problems were apparent.

And now they are touting a "normal" keyboard mechanism as if they've invented something new and wonderful... only Apple could get away with such transparent BS.

"The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience."

What in that says "invented" "new" or even "wonderful?" It seems like you're reading into the text what isn't even there.

Well, they are calling it the "magic" keyboard. How does that strike you?

I assumed that was because it’s the same key-mech that’s in their (external, Bluetooth) Magic Keyboard. Where it itself was branded “magic” just in reference to its two sibling peripherals, the Magic Mouse and Magic Trackpad.

IIRC, the peripheral line itself was started with the original Magic Mouse, which was branded as such because it didn’t have separate external actuatable buttons, but rather was just a smooth surface with a multitouch digitizer on the top half + a single actuatable microswitch underneath the shell†. Apple wanted the image of it “magically” figuring out when you left/right/middle-clicked (despite no L/M/R buttons) or scrolled (despite no scroll-wheel.) Also, the “plug in to pair” experience might have contributed to the claimed “magic”—it was a fairly unique approach to pairing at the time.

† Which is a design with some real benefits, like being easily disinfectable, with no crevasses close to the hand for filth and germs to accumulate in. (There is a crevice on the Magic Mouse, but it’s on the bottom, where your hands will never touch it.)

There is also a bit of “magic” in the Magic line of peripherals that’s not in the hardware itself, but rather in the OS: when the Magic line of peripherals—Apple’s first Bluetooth line of peripherals—was introduced, Apple added a feature to macOS where macOS will “train” the Apple EFI firmware to recognize devices paired in macOS itself, such that the firmware will later attempt to connect to such paired devices on boot. This means that e.g. holding Option on your Bluetooth keyboard to select an alternate boot device on an iMac would actually work. Which was kind of necessary, as those are the peripherals iMacs shipped with.

It strikes me as completely in line with their other peripheral offerings such as the:

- Magic Mouse

- Magic Mouse 2

- Magic Trackpad

- Magic Trackpad 2

- Magic Keyboard

- Magic Keyboard with Numeric Keypad

The previous sentence: "Now features the new Magic Keyboard."

"Refined" is also an adjective that has a definition close to "new and wonderful". It's not just a word to bring out an emotional response to make people feel like buying the thing.

They refined the scissor mechanism. It's an improved scissor mechanism with better key stability than previous scissor keyboards on their laptops (so sort of the "best of both worlds" between the butterfly keyboard and their old scissor keyboard). The new mechanism is also used in the Magic Keyboard and 16" MacBook Pro.

"Refined" means made better. This is better than the previous keyboard, and also features 1mm of travel, making it better in that sense than any of their previous keyboards.

Imagining a word to mean what you want it to, and then reacting negatively to that, doesn't say as much about Apple as it does about the observer.

> What in that says "invented" "new" or even "wonderful?"

Uh... "refined"?

Refined doesn't mean any of those words. Here are two definitions from Google:

> with impurities or unwanted elements having been removed by processing.

> developed or improved so as to be precise or subtle.

It basically just means improved, which most people would agree with.

In fairness, that keyboard switch style hasn’t had 1mm of travel before, so don’t you think “refined” is somewhat accurate?

I read that as "elegant" not "made better". It's possible that people could read it either way.

Yes, that's by design; it's a purposefully ambiguous choice of words that can be read either way depending on what the reader's subconscious wants to hear. Either way, they don't have to admit that they were wrong, customers that hated the old one now feel relieved and vindicated, and people are probably more likely to buy the new one. That particular choice of words is probably the result of millions of dollars of marketing psychology, focus groups and A/B testing.

Where does it say "we stuffed up, and in response to your feedback we've gone back to basics"? Sometimes, people want to hear acknowledgement of error.

Are you seriously expecting them to say "our previous product was bad, this one is good"?

When has any tech company ever done that?

A good value trade in program would have been nice. I managed to sell my 2017 15” for a bit under $2000 CAD to upgrade to the current 16”. I would have rather dealt with Apple than deal with hagglers and low ballers.

Last time I remember that happening was Porsche's recent launch of the 992 generation 911, in which they poked fun at the fried-egg headlights on the 996. If they can do it, so can Apple. They are both Jedi-level marketing orgs.

That takes courage.

I mean Domino's managed to pull it off.

They did basically say that in the live announcement of the 16” MBP, but remember they still sell some models with the old keyboard, plus millions of people still will be using that one for years to come, so there’s no way they will disparage it for the next few years at least.

They said that when they announced the extended warranty program on the old keyboard. People who want to hear an acknowledgement of error can go back and read that, if they missed it.

People want to hear wailing and gnashing of teeth, which is silly.

It’s marketing, man. Next you will be complaining that the burgers look a lot better on the billboards compared to when you unwrap them, or that the shirt looked a lot better on the mannequin than when you put it on, or that car ads always show their car speeding on an open road instead of stuck in a traffic jam during the morning commute.

Ignore the marketingese and don’t let it bother you so much. The important thing is that they listened to their customers and created a better product as a result.

I dunno, we expect people to be honest in every other aspect of their life. Why shouldn't marketers be honest too?

We do not expect people to be honest in other aspects of their life. Just look at the cosmetics industry. Politicians with their campaign promises. Management with their corporate right-sizing and synergies. Kids and Santa Claus. (haha!)

Anyway, I consider fake reviews to be dishonest. I consider this to be more like "putting a positive spin on it".

> They tried a new design, which was horrible to use

I actually liked typing on the butterfly switches, the only thing I did not like was the left/right arrow keys being bigger.

Well, it isn't normal, and it isn't the old keyboard either.

Despite having double the key travel of the butterfly (0.5mm to 1mm), it still felt the same as the butterfly. The old scissor had 1.3mm; while that's only a 0.3mm difference, it felt night and day.

The new scissor also claims higher stability, though I suspect that has less to do with the design than with the lower "height" of the key.

It is indeed new, but I don't know if it is wonderful yet. I haven't had a long enough period of time to try and use it.
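For reference, here is the relative difference between the travel figures quoted in this thread (a toy calculation using the commenters' numbers, not official Apple specs):

```python
# Travel figures quoted in this thread (mm): butterfly 0.5,
# new scissor 1.0, old scissor 1.3. These are the commenters'
# numbers, not official specs.
butterfly, new_scissor, old_scissor = 0.5, 1.0, 1.3

def pct_increase(a, b):
    """Percent increase in travel going from a to b."""
    return round((b - a) / a * 100)

print(pct_increase(butterfly, new_scissor))   # butterfly -> new scissor: 100
print(pct_increase(new_scissor, old_scissor)) # new scissor -> old scissor: 30
```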

Marketing aside, thank god we're back to normal. Some things are just better left alone and don't need innovation (vim comes to mind). Refinement sure. I love my 2014 MBP keyboard.

They deserve flak and they are getting it. It doesn’t have to be logical. It is emotional. They also created an emotional brand.

Somehow I think Apple can handle it. Anyone wasting energy complaining is only doing so because they are waiting to buy again.

It makes as much sense defending them as it does yelling at them about a keyboard.

I dunno, I used to buy Apples because they had great hardware for a fair price (premium, sure, but if you're in the market for a premium machine...), but my last two machines have been a Dell and a Surface. It's different trade-offs, but ¯\_(ツ)_/¯ I was never wedded to their operating system anyway.

I held off buying a new MBP replacement for two years now thanks to the massive criticism over the butterfly fiasco. I'd say the complaining was useful and had an impact. Not wasteful at all.

To be honest, yes, I'd like them to say "we messed up, and we finally recognised that, so we're fixing it".

I think it would be amazing if the most valuable, most design-focused company in the world admitted to everyone that they made a mistake. It would do a lot towards allowing everyone else to make mistakes without beating themselves up over it. After all, if the thousands of specialist engineers, paid billions in salaries, given the best equipment in the world, in a company that really (and I mean really) values design, can make a mistake, then it's kinda OK that your home page looks a little crappy on mobile.

They did that when they actually started fixing it[0].

[0] https://www.tomsguide.com/us/apple-macbook-keyboard-service-...

Wasn't that the dust problem, though? Admitting that there's a problem when people are taking out class-action lawsuits against you seems a little late to me. And again, rather than a "we're so sorry, we made a huge mistake" speech, it's more of a "we're such an amazing company, we're going to fix your faulty laptops for free!" speech.

>To be honest, yes, I'd like them to say "we messed up, and we finally recognised that, so we're fixing it".

Exactly. In Steve Jobs's day he would have either jokingly admitted there was a mistake, or at least said something acknowledging that people didn't like it (hence admitting there was a problem).

The new Apple put up a big middle finger and didn't act until there was a class action.

Even with the fix, I'm ddisappointed in their response. II had a 2013 MBP that worked great for years and years, and was excited to finally get a maxed out MBP about eiighteen months ago. The ffirst year was great and then this damn keyboard started doing its thing. I'm deliberately leaving the keyboard errors in place for this comment, they aren't typos. Yes, they havee the keeyboard replaceement program for the enext 3-4 years. But then you have to be wiithout your workhorse for a week while they replace it, and then what, you'll probably have the same problems a year later. (Incidentally, I have a job switch coming up wiith some tiime off - my plan was to use that time to send in the laptop for repairs theen when my clieint isn't relying on my availability, but now that plan is shot wiith the Apple Stores and malls being closed.) And yes, I can fix this by just buying the 16", but this computer was expensive and was supposed to last me at least 3-5 years. I'm supposed to iincrease my spending to Apple? A sane program would be to buy back this lemon at a heffty price so I can buy the new one and be made whole.

Marketing spin is not clear communication. It does not deserve praise. It's not mandatory, either.

None of:

> MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro. The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience. The inverted-T arrow keys help you fly through lines of code, spreadsheets, or game environments.

says anything about "fixing problems that people wanted fixed" or "doing what people asked for." That would read:

> We used a lower-travel keyboard with a butterfly mechanism and alternative arrow key layout on recent models, and you said you didn't like it. We listened. The MacBook Air now features a proven scissor mechanism with a return to 1mm travel and classic inverted-T arrow key layout.

They're implying that they've come up with something new, which is a lie. That's not perfect.

As far as I can tell, this is the first keyboard with 1mm of travel that Apple has made. Before the recent butterfly debacle, their keyboards had as low as 1.2mm travel (on the iMac keyboard) or more. The much-maligned butterfly design got down to 0.5mm travel.

So this scissor design with 1mm of travel seems to be something new. The inverted-T arrow keys are not new, but the scissor mechanism itself is: a refinement of previous designs.

To be completely honest, they should describe the keyboard as "a lot less bad than the previous keyboards, but not quite as good as the ones before them".

The two sentences above clearly oversell the product: "refined", "delivers", "help you fly", plus all the details for something that is already known... Just because everybody is doing glorified marketing, which often ends up being deceptive, doesn't mean we have to be okay with it.

> Did you really expect a bunch of public self-flagellation? They are telling us clearly that they did what we asked for. Perfect.

What? No. They are telling us they refined it, not that they fixed their mistake. It's not perfect, it's lame and cheesy.

Does refined not mean that they took out some of the bad parts? That's about as close to an admission of a fuckup as you can get in sales copy.

>Now that they fixed it we should be happy about it.

No, we moved on and do not care. There are many companies that make laptops and keyboards. 5 years too late.

Honestly, Apple's marketing has always been worse than other companies' when it comes to selling basic things as revolutionary, life-changing progress.

IMO, when a car company's 2020 model gets 3 more horsepower or 1 more MPG, it’s better to consider how difficult that was rather than simply expecting it. Even a slightly better toothbrush is a positive life change. We have normalized progress, but even seemingly incremental progress is making the world a better place.

Further, it can take revolutionary change to maintain incremental improvements. Filling HDDs with, say, helium has a lot of knock-on effects hidden by the spec sheet.

Ok. How should they communicate this?

"We've gone back to where we were three years ago after a mistake, sorry."

I don't understand what the issue is with Apple trying to sell their products. It sounds like people here are upset that marketing and sales exists, and that they use language to try to make their products seem impressive, and that consumers aren't rational when it comes to buying things.

To be fair the butterfly keyboard was pretty nice after you got used to it but unnforrtunattelyyyyy itt enndddeed iin fffaiiilurree foorrr moosst off usss. I batted mine back to the apple store after 3 weeks. Thank goodness it was the Christmas no questions asked period.

> Ok. How should they communicate this? "We've gone back to where we were three years ago after a mistake, sorry."

Yes. We expect adults to own up to their mistakes, so why don't we hold corporations to the same standard and instead just accept corporate bullshit from them?

4 years but who's counting.

Sent from my MBP 13" 2018 with sticky spacebar.

In the old days Steve would own up to their mistakes, or at least put forward a statement, or talk about it in a conference.

New Apple doesn't act. It refuses to listen. Even with the repair programme they still act as if it was not their fault.

> "We've gone back to where we were three years ago after a mistake, sorry."

That would be refreshing, I'd respect that.

> "We've gone back to where we were three years ago after a mistake, sorry."

Yep this is the right move, and as someone else says, this would be worthy of respect.

I think we've all just about had it with corporate bullshit -- and to be sure that says more about this moment in time than anything else.

Apple is consistently guilty of blowing smoke up our collective asses. It would be nice if they could give it a rest and simply be honest. But here we are.

Coz it's a lie; half-truths are essentially lies. I agree it's a sales tactic, but nonetheless it's a lie.

Which part of those two sentences is a lie?

"The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience."

Is any of that false? How do we know?

"Half truths are lies". The pp is saying they're lying because they didn't admit fault. What's wrong with a company trying to sell a product by outright honestly saying "yeah, we tried to balance usability and thinness, and we went too far, we heard your complaints, and we've gone back and refined our old designs". I mean, they can tell the truth the whole truth and nothing but the truth in an even more markety way if direct honesty is too much.

This seems like nonsense. Must they admit fault in every communication for the next three years? Must they use specific words while admitting fault?

There's nothing wrong with a company saying "we screwed up." Which Apple has done already. There's also nothing wrong with a company saying, "This keyboard is great," if it is in fact great.

>The pp is saying they're lying because they didn't admit fault.

They pretty clearly admitted fault when they offered free out of warranty repairs for keyboard issues.

No, they're not saying it's innovative or amazing, they are simply calling it a "responsive, comfortable, and quiet typing experience", which I guess is true.

Thereby implying that the only problem with the old butterfly keyboard was the loud typing sound and not reliability.

No, because they're saying it's a refined scissor mechanism, i.e., a refined version of the old, pre-butterfly keyboard. This is 100% true.

Their (very) old keyboards are still amazing to type on. They went from high quality mechanical switches to bad rubber domes to OK-ish rubber domes. The current Magic Keyboard is actually not bad, but I would still prefer the old Alps switches.

I completely agree. My 2014 MacBook Pro purchased refurbished still has the best keyboard of all the devices I use. The Lenovo X1 I use for work is a close second.

What I noticed in the image that made me excited about this device is the function keys. If the 16in MBP had function keys, I probably would have purchased one already. I do wonder if the top spec of the new MBA is going to hold up to my usage though.

The mechanical ones are ancient by that standard. The Alps-switch ones are from before the translucent iMac :)

not sure if the 2015 was any different from 2014, but it's still my favorite.

The whole world seems to be celebrating that JavaScript desktop turds run 10 times slower and consume 10 times more resources than 20-year-old native applications. So I guess everything is worth celebrating.

In the olden days, we complained about apps that used all our RAM. Nowadays, we complain about Firefox or proprietary web browsers using all our RAM, when in reality they're doing the best they can. The task manager can't show users who's actually to blame, so lazy devs get the glory while hardworking browser developers cop the flak.

Except that when I use the old apps that used to use all my RAM, now they don't because I have more RAM.

I don't blame the browser vendors (except maybe that V8 made JS juuuuuust good enough to make something like Node viable). They took a thing that ran slow for everyone, and they made it faster.

I do blame application developers for writing everything to the web because it's there. There are instances of folks doing better than most in Electron/JS land, but it's still nothing close to the native or even managed Java/C# apps of yesteryear.

Really I should just link to my last Electron rant, seems like they're getting closer together nowadays... https://news.ycombinator.com/item?id=22598148

Yeah but it's so easy to code my crappy app now! I don't have to understand anything, just copy-paste from Stackoverflow!

Nope, that's exactly what I wanted to hear before buying. Rather than just saying they improved it, they explicitly pointed out it has the keyboard you want from the 16" MBP.

>MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro.

That saved me from having to google "hey, is the 'new magic keyboard' the thing in the 16" MBP that I've been waiting for in the Air and 13" MBP, or is it something else entirely".

Damned if they do, damned if they don't.

Some people just seem to choose a target for life (like the Favored Enemy of a Ranger from Dungeons & Dragons) and never give it a break, no matter what.

I suspect that even if Apple comes out with the best keyboard that mankind is ever going to make, some of you are still going to be angry about how they removed optical drives 4000 years ago.

I'm looking at this in a bit of a different way:

- Apple knows that at the very least, some set of vocal people don't like the previous keyboard. They also know that many of their customers had to get repairs, even if they liked the keyboard. Those customers might understand that "butterfly = bad"

- They need to tell people that they've fixed the problem but don't want to do so in a way that says "the last product was bad" (so they can't just say nothing about it)

I think we should also place some fault on other manufacturers for just blindly attempting to do what Apple does without thinking: check out the latest XPS 13. They've implemented the same arrow setup as Apple's butterfly keyboards. And yet, I haven't seen a single review online that criticizes the XPS for this choice.

So much this. I struggled with my butterfly MBP for about 18 months, all under the guise of "it's a stout machine, the keyboard isn't that bad" or "I can use an external keyboard".

Then I grew tired of the MacBook fan noise when running Windows 10 and debugging with the Touch Bar. I ordered a Surface 3 laptop and immediately realized how important a nice keyboard is to me. It's tactile, it's got enough travel, the keys feel nice. I type with fewer errors and I work faster. Anyone want to buy a 2018 MacBook Pro with 6 cores and 32GB RAM?

I think you're underselling how truly revolutionary the inverted-T arrow keys are. I hear they help you fly through lines of code, spreadsheets, or game environments.

So is this a not-broken keyboard design? I have the (now) previous-generation Air, and need a keyboard repair.

Would be seriously tempted to just buy a new one if I was confident the keyboard wasn't absolute garbage. Typing on it was fine when it worked, but it double-spaces, and a couple of the other keys are now wonky.

Never had a keyboard die on me before this -- Mac or otherwise.

This is essentially the same type of keyboard from before the crappy design you ended up with. I've had a MBP 2014 with that keyboard since release and I love it. I can't express enough how good it feels to type on (though that's partly personal preference).

I also tried out the new version of that keyboard on the recently released MBP. It feels almost exactly the same as the old one, just with a slightly shallower depth.

Hopefully that's helpful.

Conversely, I went from a MBP 2015 to the 16" model, and I think the "fixed" keyboard is still terrible. I've used all sorts of keyboards, and it's the first one where I regularly have doubled or missing letters when typing. Maybe I'm not hammering the keys hard enough?

Yes, this is what you should do:

1. Get your keyboard repaired (free, I assume)

2. Sell that thing online

3. Get this one.

I mean, this has a quad-core option, too. It's a no-brainer better machine.

I have the 16-inch, which has the same keyboard mechanism as this one. It's 100% an evolution of the previous design. While it is early, it's a proven design and there's no chance of it having any of the issues that the butterfly keyboard did.

I feel like my 16-inch is the computer I intended to buy in 2016.

If you can afford the upgrade cost after you sell your current one, you won't regret it.

I use my laptops for my business for 3 years under AppleCare and Joint Venture protection, buy the top of the line, and migrate the old one to the rest of the family. For children, the laptops are fine for another 5-7 years. The current laptop has the old scissor keyboard, and I'm waiting for Apple Stores to open again to pick it up from depot-sent-parts Genius-repair under an AppleCare that just ran out end of February.

Unfortunately, Apple will only replace the butterfly keyboard for 4 years after initial retail purchase of the laptop.

I dread the eventual breakdown of the new butterfly keyboard after the end of February next year, when I'm on my own. I hope iFixIt will start selling DIY repair kits, or the cost to repair at Apple makes such a frequent failure a non-starter (in which case I'll turn it into a fixed-place computer in the house, with an external keyboard).

Don't hold your breath on repair kits. It's a very difficult repair and you can't separate the butterfly keyboard from the top case. You have to replace the whole top case. The best you could maybe do is not replace the battery - but it's all glued. Good luck. I wouldn't attempt that sort of thing, and Apple doesn't.

I wouldn't be surprised if Apple bent that 4-year rule, though. But also realize that Apple will eventually stop repairing all computers just due to the passage of time. And eventually, from a financial perspective, it's more cost-effective to find a working used computer (after heavy depreciation) than to actually do a repair, even if that repair is doable.

I do think that their replacement keyboards have better reliability than the early models. I didn't have any problems after getting mine replaced - I believe I used the replacement keyboard for around two years. But I still wasn't willing to keep the computer longer. I went straight from the 2016 to the 2019 16-inch MacBook Pro.

In any event, it was a nice opportunity for an upgrade. My 2016 was still worth $1000, and I ended up with 4 more cores, double the storage, and an absolutely massive increase in graphics performance. Using the education store and picking up in a state with no sales tax did a lot to make that price more palatable.

Supposedly. I don’t think anyone has any of these yet.

Woohoo for the new arrow keys!

If you liked the stability of the butterfly switches, but the travel and reliability of the scissor switches, the new thing is really pretty nice, and is arguably an advance over both.

It's amazing how well received the "I'm cynical and world-weary" angle plays on Hacker News.

"This is the baseline. Apple have gone from bad to OK"

Apple's scissor keyboard is pretty broadly considered the best in the industry by a country mile. Their butterfly mechanism was a bad misstep (I mean...almost indistinguishable from my Yoga 720, but compared to prior Apple keyboards), but saying that they went from "bad" to "OK" is just nonsense.

"and they're celebrating"

Advertising doing what advertising does. So brave on HN to point out that marketing is marketing-ee. Are you also telling me that the new car isn't going to make me an adventure seeking extrovert?

> Advertising doing what advertising does. So brave on HN to point out that marketing is marketing-ee. Are you also telling me that the new car isn't going to make me an adventure seeking extrovert?

Why is that acceptable? If you lie or misrepresent the truth in almost any other field, you get criticised. But when marketers do it, they're immune. That's just weird.

False advertising is, in fact, illegal. If in your opinion they've crossed the line into actual factual inaccuracy, it's your right to take legal action against them, or request that your state's attorney general do so on your behalf.

They've turned around. That is amazing for any company, especially apple.

Only in the marketing world does "Magic" == "functional".

I, along with many people I know, love the scissor keyboard.

It is progress. Apple runs a monopoly, people seem unable to escape (I'm on Windows), and Apple struggled to fix an utterly broken keyboard for 3 or more years. So, finally, they did this on an entry model.

Even I who gave up on Apple thought, hey what a nice machine.

It's not so much progress, as reversal of a regress.

Imagine a country where a terrible leader comes to power, and the nation regresses for years. Then a new leader arises and reverses course. Does the country celebrate and boast?

This is below baseline. I still don't understand why it's important for them to make compromises on the keyboard, which is an essential part of the notebook experience. A keyboard has keys, keys have a certain height. Get over it, Apple. I've tried the latest 3 generations of keyboards and the 2015 still comes out on top. It looks as if Apple's engineers are trying to fight the keyboard. MacBook keyboards were almost unquestionably the best among notebooks. Now we're happy if they're not crap. Of course this is all personal opinion, and I'm sure Apple tests these things extensively and only releases them if they make for a significant improvement. They wouldn't release a broken keyboard and deny they're at fault for years, right?

After using the latest Macbook Pro 13" for over a year I have recently had my 2015 Macbook Pro 13" repaired. Both are max specs. It was a $600 bill, but it has been so worth it. The keyboard just works, it doesn't run hot and the fan doesn't blast under the slightest load, the performance is much better, the battery lasts longer and the external screen + keyboard and mouse are detected every single time without having to re-plug the USB-C or open and close the computer lid. Also, no dongles required to connect USB-A or SD cards. Yes, it does look a bit clunky and not as elegant as the newer one, but seriously it actually just works.

I almost can't believe how much shit I put up with on a daily basis for over a year. If you replaced a dying 2015 Macbook Pro with a new one, I very much urge you to reconsider getting it fixed at pretty much any price. It is so very worth it.

Ha-ha, reading this from the 2015 MBP 13", exact same experience. I tried to move to a Dell XPS 13 on Linux 2 years ago; it did not work for me, but with the new line of Dells I am starting to think about repeating this experiment.

Apple's 2016-2019 laptops were pretty unusable; I hope they revert the Touch Bar too.

A question: has anybody tried System76 compared to Dell/Apple?

Wow, this is a common theme. I have a 2015 13" MacBook which wasn't maxed out, and I feel like I'm hurting these days RAM-wise and hard-drive-wise. I bought a super-specced-out 2020 XPS 13 2-in-1 because of reservations about the latest MacBook Pro 13, and I kind of hate it.

I think this is personal taste, but I just don't love the build as much, or the trackpad, and Ubuntu is acting pretty flaky. The camera and wireless chipsets don't work right out of the box.

There just doesn't seem to be a perfect solution.

I have both. The 2020 2-in-1 XPS 13 (32GB, core I-10) I love because it's a beautiful 2-in-1, but yes, the camera and fingerprint scanner don't work at all -- though only in the 2-in-1 version; they work great in the laptop version, which seems to have a different camera. Wireless works great, though, as does the trackpad. (Kubuntu 19.10, not LTS -- LTS didn't work very well at all for me.)

I also have a lower-spec'ed 2020 XPS 13 laptop (8GB, core I-10) that works perfectly across the board (except for the fingerprint scanner, which I wouldn't use anyway.)

I love the hardware on both. Absolutely love it -- more than my Macbook Air; no sharp edge where I rest my wrists, touch screen, phenomenal screen (4k, which is higher than Retina, or 1080p, which gives better battery life) etc. The 2-in-1 has a taller screen shape, so you can fit more lines in your coding windows.

If you care about the camera, get the laptop. It's also a little cheaper -- I got it for about $950 from Costco. The only thing I don't like about the laptop is that the screen doesn't tilt all the way back. The keyboard is also better on the laptop than on the 2-in-1 -- more depth of travel, and the Delete key is in the right place instead of offset by the fingerprint/power sensor as on the 2-in-1.

I love the new Dell XPS 13 and think both the 2-in-1 and the laptop are better pieces of hardware than the Macbook Air, and I prefer the new Kubuntu 19.04 over Mac OSX as well. (I don't have a MBP so can't compare to the Pro.)

I felt the same way as you about my 2015 13” so I reformatted the drive with apfs and did a fresh install of Mojave. Got a ton of storage back and it runs like a new machine. One of the best things I’ve ever done.

I think System76, Lenovo, or Tuxedo is a better choice than Dell XPS for Linux from a reliability standpoint. Additionally, if you plan to do full disk encryption, the XPS line has all sorts of issues. https://www.dell.com/community/Linux-Developer-Systems/XPS-1...

Adding Purism to this list.

Slightly more pricey, but worth it. Typing this on one.

Been wondering about those. Did you use their distro or install another one on it?

Still using PureOS, just so I know any problems are caused by me and not some strange hardware/software issue. It's a perfectly serviceable Debian-based distro, haven't had an issue with it so far.

Good to know, thank you.

Once they offer a 4-core 13" option it will be a lot more compelling.

Weird -- I've got a newer Dell XPS and am making this post with WDE enabled, and have had zero problems. Anecdotal, I know, but I've run pretty much every major Linux distro (Ubuntu/Fedora/Manjaro) and am currently running Regolith Linux.

I've not noticed any hiccups or problems at all around encryption.

I've run Linux on my XPS 13 since 2015. There were issues at first, when the hardware was brand new and not supported in the latest Ubuntu kernel, but it's been excellent ever since. I fully intend to buy another if it ever dies; no worries about reliability here.

I gave up on Lenovo when the company was sold; maybe that was a mistake. Dell seems to be one of the best companies for quality and support.

On the other hand, I've never tried System76, and had never heard of Tuxedo -- thank you for mentioning it!

Tuxedos aren't cheap, but the coolest thing about them is they have a way to preinstall Linux and set up full disk encryption with LUKS, and you just set the encryption key when you first turn on the machine. I've never seen anyone else offer that.

I use a MacBook and a fully loaded (with GPU) System76 laptop. I really like both, for different purposes. The System76 is much faster for large builds and is obviously much better for deep learning training. The MacBook is light and portable. I like having two laptops, at extremes of the weight/portability vs. computational power spectrum.

I had a 2014. It's a great laptop, no doubt, but I recall that before several patches I had issues with USB 2.0 port connectivity. I had the screen replaced once. So it wasn't without its own set of bugs. Working beautifully now.

My only beef is no NVMe and it stutters a tad. If I could find a 2015 w/ 16GB/1TB spec w/ NVMe...

I'm hoping the glitches in the 16" can be worked out, eager to see how the 14.1 will look...

and I know Apple kremlinology is never the most accurate way to look at things, but boy were the Mansfield/Forstall years great...

I have the very first-gen Retina MacBook Pro 15-inch from back in 2012. It still works almost perfectly to this date.

The only problem is battery degradation (70%) and in reality it lasts <2 hours. But since I use it at home as a desktop, not a problem at all!

Is yours the retina 2014? Those can take standard NVMe M.2s with this adapter: https://www.amazon.com/Sintech-Adapter-Upgrade-2013-2016-201...

I've done this on a '14 13" MBP so I can help if you have any questions.

I am thinking of going to Dell from a 2014 MBP; what did not work with Dell in your experience?

The latest XPSs look very nice.

I tried dual-booting an XPS15, which did not go well [0]. I understand the 13's are more Linux-compatible, but I got so badly annoyed by Dell Support I'm never buying Dell again.

[0] the drive management software didn't like GRUB, and would complain that there was no drive if GRUB was installed. Repeated reboots later, it would suddenly say "oh look, a hard drive!" and boot into Windows, removing GRUB in the process. Dell Support were worse than useless at diagnosing the problem, let alone fixing it (still unfixed, the machine is now a rather shitty games machine)

Typing this from Fedora 31 running on a dual-boot, 4-month old XPS 15 (LCD screen, not 4K OLED). The only configuration change that has to be made on a new XPS is switching the SATA mode to AHCI. Fedora signs the kernel correctly--you don't even have to disable secure boot.

Signing 3rd party kernel modules under secure boot isn't difficult, but documentation on it is sparse. So I've kept my notes for next time: https://lawler.io/scrivings/linux-cookbooks/

The Arch guys have a great compendium on the XPS and Linux: https://wiki.archlinux.org/index.php/Dell_XPS_15_7590

I don't know if that changed recently. I gave up on it a year ago. I went through all this (though with Ubuntu not Fedora), switched modes, disabled secure boot, read every web page I could find. Nothing worked. I'm kinda tempted to see if I can get it working now, but honestly, also kinda traumatised by the whole experience and never want to touch it again.

I'm not familiar enough with Ubuntu's installer capabilities, but Fedora's has made huge advances in the past 5 years. I went into the setup fearing the worst (that is, the struggle that was doing a secure boot, UEFI Windows 8 + Fedora 20 setup in 2014), but was shocked at how good Fedora 31's install routine was.

I shrank the 1TB SSD partition in Windows, disabled the proprietary fake-RAID, then booted Fedora's XFCE spin from a USB key. The installer handled everything else--no having to manually tinker with the UEFI partition. Post-install, the XPS's UEFI correctly booted to GRUB, which had successfully detected the Windows bootloader (again, no tinkering like with FC20) and can boot either OS.

Disabling hibernate in Windows and figuring out how to mount the encrypted Windows partition in Fedora was all that remained.

The Linux Kernel 5.5 has a bunch of improvements for the i7-9750H (and other Coffee Lake processors), which is another reason to consider giving Fedora 31 a spin. :)

I had this pleasant experience at the same time I was trying to figure out how to dual-boot FC31 on a Touch Bar MacBook Pro. The short answer is: 80% of the MBP's hardware doesn't work in Linux. Broadcom won't release drivers that are compatible with Linux and cites FCC rules as the reason. Insane. https://github.com/Dunedan/mbp-2016-linux

dammit, now I'm tempted to try again. Thanks, I think ;)

I suspect this would be due to mixing legacy and UEFI booting, where grub-legacy would interfere with whatever the disk management was doing and fixing it would involve removing grub.

For grub-uefi, it is just another file on a EFI system partition.

possibly. I couldn't get a clear answer from Dell on what their disk management system was doing, so it got really hard to work out what was going on and where the problem was. I tried everything I could think of and the internet could suggest.

oddly enough I have a System76 Ibex Pro and a 2015 MacBook Pro 13". I'm almost a year into switching over (everything except Lightroom). The System76 is much bigger, clunkier and not as well built. But honestly it's fine, the software is updated frequently, and besides a couple of times where the keyboard didn't come back after sleep, the System76 has been a trooper (a sleep/wake cycle brought it back).

I enjoy this machine and would buy another.

I'm actually thinking of doing the same thing. Why didn't it work for you the first time around?

Meh, the camera was on the bottom left side of the screen. On every online meeting, I looked like a parent talking to kids from the corner :D

Also, I had a Windows edition, which works pretty badly with Linux. A friend of mine got a new Linux edition with the fixed camera, and I think the perfect XPS for Linux is the 15" with 32GB. That's my alternative #1.

Aren't all XPS 15 models Nvidia-only? That would be an extra roadblock on the way to a just-works Linux experience.

I'm still rocking my 2015 MBP as well, and having the luxury of working on newer models through various jobs, I'm still not convinced to upgrade 5 years later. That said, the new 16" MBP has been the best Apple laptop experience I've had in a while.

The 16" looks great! But I really don't want to lug it around every day. I'm hoping some of it will carry over to the smaller form factors.

I have had every single MacBook Pro update since the 2015, and each has either been returned or sold off. It was simply crap. I can't believe they have been making that crap for 5 years. Although judging from the 7 years of the Mac Pro, I guess the MacBook Pro already got better treatment.

In the end I stuck with my MacBook Pro 2015. Perfect-size trackpad with zero false positives. Most people put up with a few false positives on the larger trackpad, but I don't see any reason why it has to be this way. The keyboard felt way better; even the new Magic Keyboard on the MBP 16 felt exactly the same as the butterfly to me. The Touch Bar is junk; it too sometimes freezes, and some people put up with it thinking it's a non-issue.

The old Apple and its users used to be perfectionists. Nowadays a lot of people settle for mediocre.

I just wish they'd make a MacBook Pro Classic. Just throwing in a new CPU would do.

I wish I had gotten this advice! I just replaced my 2015 MBP 15" max spec with a mid-2019 version, and that touch bar is annoying the heck out of me. At least it has a physical escape key, I guess (on the Bootcamp side of things I keep hitting F1/brightness controls for everything).

Fortunately so

Hah! I'm not sure what the situation is like for the current 15" MBP. The specs are quite different compared to the 13" one, so it may not be as severe as my experience. I think I'd try to hold on to that 2015 model just in case.

Good luck!

You get used to it. But it's worse. But you get used to it. I would go back.

I'm in a similar position, bought a 2019 13" Pro. I've got to say, I'm really happy with my 2020 Thinkpad Carbon X1 and won't be looking to upgrade again for a long time.

Or maybe a thinkpad, depending on your budget. I normally use a maxed out macbook pro but I didn't want to bring a $5k machine on vacation. So, I bought a $600 dell laptop last summer. The wifi range/reception was dreadful. Sitting beside my wife on the couch in the hotel and she was on her $1000 macbook air getting perfect reception, me wishing I had spent the extra $400.

I'm not sure if it was the wifi chipset or the antenna design, and maybe some of the higher end Dells with better radio chipsets would perform better, but I returned the laptop to Dell and I was pleasantly surprised by how easy that was. At least they're doing something right. After that, I picked up a $700 ThinkPad T-series for travel and the reception is great on it.

I don't know specifically about your dell laptop, but Dell XPS's nowadays use the Killer wifi chip instead of Intel's like Apple and Lenovo.

Last year I upgraded my main development machine - a 2013 11" Air (yes, seriously) - to a new-to-me maxed-out 13" 2015 MBP. I do "heavyweight" local development - IntelliJ, Java, TypeScript, React, Postgres, etc.

It's faster than my Air, mostly from going 8 -> 16GB of RAM, and the high-resolution screen is great. But it still feels pretty slow and IntelliJ can get sluggish at times. The MBP has a 3.1GHz i7 so at least on paper it doesn't seem materially slower from the current gen of processors. Maybe memory speed is the issue? 1867 MHz DDR3 vs 3733MHz LPDDR4X in the new Air?

I miss the 11" form factor, but I guess that's a lost cause.

The 2015 MBP bought me (cheap) a couple years while Apple sorts out their keyboard issues, but I'm already looking forward to a replacement. It's just ok.

I very much feel your pain about the form factor. Apple seems to be moving away from the smaller form factors, which is a shame. I like the look of the new 16" MBP but there is no way I'm lugging that around everywhere. Hmpf.

You would benefit from a 4-core machine. Even the dual core current models are a good 20% faster than what you have now, but you'd be amazed at how much better the 4 core models feel.

So happy to hear this! I just dropped ~CAD 800 to have the logic board on my late-2014 MBP replaced. I figured "it still works, why spend 3k on a new one?" Plus the new ones have those stupid bar things (I have one for work that I never open because I always use a workstation).

I recently got a 2019 MBP for work (with the ESC key!!!), and I have to say that a lot of the things you are encountering with your 2013 I encounter quite a bit with this new one.

Overall, I found the speed of this model far exceeded my 2017 MBP from my last job, but it's kinda surprising how frequently the monitors / keyboard / mouse encounter connection issues.

That being said, I feel like I'm being a little picky, especially when I think of how nightmarish it was to get any of this shit to work with my Windows machines. It's the exception rather than the rule that anything would work the first time as expected (monitors / keyboards / mice / printers / programs).

> That being said, I feel like I'm being a little picky, especially when I think of how nightmarish it was to get any of this shit to work with my Windows machines.

It would be different if it had always been like this with Apple. The older models show that it used to work pretty much flawlessly. It is such a big step backwards and just because other manufacturers are crap at it doesn't make it okay. :)

I feel like I could live with the performance issues. But I'm doing a lot of switching between offices and having external screen/mouse/keyboard not working straight away is such a major pain. Even sometimes during the day after walking off with the laptop and coming back to the desk it doesn't work. It could be my specific setup but then Apple doesn't sell any official docking setups either AFAIK.

I'm still using my 2012 macbook pro and doubt I will buy Apple's new Impossible to repair™ "Gree[nd]™" machines when it finally dies.

If anyone's looking for a 13 inch 2015 MBP with one battery cycle count, let me know. :) It's a dual-core 8GB though.

Will these new MacBooks support the older macOS that runs 32-bit software? (Or does Wine have a workaround?)

For that price I can buy a descent laptop.

Anything past 1995 can play Descent.

This. This right here. This is what matters. The only questions are: Joystick, joystick with keyboard, or keyboard with n-key rollover

Descent is my favorite game of all time. Are there any modern alternatives?

  - Quad core processor
  - Scissor keys
  - No touch bar
MacBook Air, the new MacBook for Pros.

I honestly never understood the hate for the touchbar. It allows me to be much more granular with volume and brightness, and I never really used F-keys anyway.

Volume adjustment is actually a great example of why I hate the touchbar.

- I can't adjust the volume without looking at it. Because the touchbar is flat with no haptic feedback when I land on a button, it's hard to remember the exact position of the volume 'button' without looking. Sounds trivial - but combined with point 2....

- The way the volume control expands - it actually moves the 'volume down' button AWAY from your finger, which again requires me to keep looking at the control.

This means that when a loud song comes on, it can take 2-3 seconds in total to turn the volume down. I could do that with one single keypress in half a second or less on a keyboard, without needing to look at it.

That can also be the difference between missing a key detail from a quiet speaker on a Hangout.

Flashy, but it's a terrible user experience by every metric other than looks, I guess.

You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately and it works; you don't have to tap, then move your finger to the volume slider and move back and forth.

(This is clever, but basically undiscoverable unless someone tells you in, for example, a comment on Hacker News, which is how I found out.)

This is assuming the touch bar isn't asleep and you can even see where the volume button is in the first place. Often I have to touch the bar once just to wake it up, then find the button and touch and hold and slide.... ech I hate it personally.

> You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately

No you can't! There is a pretty long delay. If you move your finger during the delay, nothing happens. Then when it finally decides to switch modes, you have to move your finger again for it to change the volume. Hope you didn't hit the edge of the touchbar yet. Combined with the phantom button presses when using the top row of the keyboard, especially the Siri button, plus other small issues, the whole thing is bafflingly terrible.

It may be because I'm using a 2019 MBP, but I can absolutely touch and slide to change the volume immediately. Just press and slide on the icon for volume or brightness.

Also pro-tip: you can change the buttons that show up in the touch bar. Settings > Keyboard > Customize Control Strip. I swapped out Siri for a "Sleep" button, which is super convenient when I walk away from my desk.

On my 2017 with Catalina there is an animation that occurs to show the volume slider. Any sliding of your finger that occurs before the animation completes is definitively ignored. Additionally, there is a significant delay before the animation even starts.

I just timed it at ~580 milliseconds, more than half a second from finger hitting the bar to the time when it stops ignoring touch input. It's easy to slide your finger more than the entire length of the volume bar in that time. It's absurdly bad. It would be weird and pretty lame if they fixed this only on newer models.

An app called Pock replaces the touch bar with a custom one that I find a lot more useful - might work for you too

Just go all in with BetterTouchTool + GoldenChaos-BTT (https://community.folivora.ai/t/goldenchaos-btt-the-complete...) -- I don't know why Apple hasn't bought BTT and made this the default, it's truly the best way to use the Touch Bar and the reason why I miss it when I'm using any other keyboard.

I only hate that it replaced the top row of keys. If it were an addition instead of a replacement, I'd be okay with it. It has its moments, but so do the keys it replaced.

I fully agree with ESC, as that's a universal key and used very often, but the other ones are more specific and having them be adaptable always made sense. Now that they returned the ESC key, that part is solved.

The volume control keys are also pretty universal.

As a developer, how would I step into, step over, step out in Xcode without function keys?? (Continue being ctrl-cmd-Y is the worst shortcut ever). It truly hampers my development because I have to look at the touchbar to see where on earth those keys are (F6, F7) or step in/continue in Chrome (F10, F11).

Also, where on earth is the escape key??

The escape key is back on the new 16 inch, but even on older Touch Bar Macs you can tap anywhere on the left side of the Touch Bar (doesn't have to just be the escape button area) and it will still work.

Different strokes for different folks, but I've never liked using the function keys for debugging. I just click the buttons on the screen. I'm a little surprised they don't have a way to set the Touch Bar buttons up to do that in Xcode though.

I will try the "left of the escape key" trick - thanks!

Moving the mouse cursor up to the toolbar always seems a lot of travel and swishing around if you're hovering over variables to see their contents in the source code.

I have found the auto/local/all view in Xcode to be a bit dumb and unable to properly expand some template objects in C++ so it's all just an exercise in frustration anyway!

On the 16”, esc is right there to the left of the Touch Bar. I’m waiting for the next 13”, which I expect will have the same change.

In Intellij IDEA when debugging the touchbar has a debugging-specific menu with all those controls. I don't find myself needing to look at the touchbar all that often. My muscle memory has adjusted over the past couple years I guess.

Want granularity? Just hold alt+shift while pressing the volume buttons to adjust the volume in quarter-box increments. You can do it without looking and it’s way easier than moving that slide on that gimmick touch bar. Works for brightness, too.

Much of the hate is that the touchbar wasn't optional, at least not unless you wanted to opt out of an Apple laptop. If the touchbar had been something users could choose, Apple users wouldn't have minded so much.

Supporting more options is expensive, so it's understandable that Apple didn't want to give their customers a choice. Still, it seems like a gimmick. And it appeared at the same time as the butterfly keyboard, cementing the notion that Apple had lost its way.

I appreciate the touchbar every day (esp. with bettertouchtool) but the soft escape is horrendous as it's used in so many of my workflows and isn't 100% responsive and doesn't give any tactile feedback.

It's a mandatory and expensive feature.

It occasionally freezes. I occasionally touch the buttons by accident. Mine is missing the physical escape key.

It doesn't add any benefit to my experience. I'd prefer real keys that I don't need to look at. I could hit volume up/down easily on the previous models.

Using Terminal, I use the Esc key a lot for navigating and having a touch bar Esc key is not a great experience since you also don't feel feedback that you're touching the right key.

I've also accidentally hit the touch bar a few times while hovering one of my fingers above it as I press down on one of the number keys.

It's just not worth it in terms of money/usefulness. To me it's going too far with tech.

It's not awful. It's just that I wish I had actual keys every time I use it.

I’ve had it locked up a couple of times, and couldn’t mute a loud sound.

You can be just as granular by using shift + alt + volume/brightness. That way your changes will be in 0.25 step increments rather than the default full steps.

keyboards are meant to be used without looking at them. with the introduction of the touchbar, you have to look at what you're pressing. it's like a giant touch screen in a car: it works, but you have to look at it, whereas if you have buttons, you can find what you want to do by feel/memory.

on a personal note, i've randomly refreshed webpages because i've overreached on the number row with the touch bar.

>It allows me to be much more granular with volume and brightness

I press the physical button a few more times? I find that 10 times easier.

I hope I didn't get permanent damage, but I hurt myself badly with it. I was trying to turn the volume up a bit while wearing earbuds (it was very low), so I pressed the volume-up key. I accidentally pressed a few pixels to the left of where I should have, and it went to FULL VOLUME without warning, blasting audio into my ears.

This could not have happened without the touchbar. This is horrible UX and I will never trust that (work) computer again.

"I don't need F-keys, so I don't understand why anyone else would ever need them"

Does that statement strike you as reasonable at all?


  - 2 cores
  - 1.1 GHz
In 2020...

Intel’s 10nm chips have very low base clocks. They’re almost always in some sort of boost mode.

You can get a quad core i7

Which is still a 1.2 GHz base clock.

Because wasting energy on a 4 GHz base clock is completely pointless (and so is your criticism) if you can adjust frequency as needed. 99% of the time 1GHz is sufficient. It’s only when you launch the browser that a faster CPU is useful, not when you’re reading through a website

> Because wasting energy on a 4 GHz base clock is completely pointless

That's not how base clock works. Base clock is not min clock. Base clock is what it's "guaranteed" to hit under sustained load if TDP is respected. It's the TDP clock. A 4ghz base clock CPU will still be far, far below that when idle.
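To make that concrete, here's a toy model in Python (all constants are made up for illustration, not real silicon behavior): dynamic power grows superlinearly with clock speed because voltage has to rise with frequency, so a small TDP budget caps the sustainable clock far below the burst clock.

```python
# Toy model: dynamic power ~ C * f * V^2, with voltage rising linearly
# with frequency. The "base" clock is then the highest frequency whose
# power draw fits inside the TDP budget under sustained load.
def power_watts(freq_ghz, v_per_ghz=0.25, v_floor=0.6, c=8.0):
    """Crude power estimate; the constants are arbitrary illustrations."""
    v = v_floor + v_per_ghz * freq_ghz
    return c * freq_ghz * v ** 2

def sustained_clock(tdp_watts):
    """Highest frequency (0.1 GHz steps) whose power fits the TDP."""
    f = 0.1
    while power_watts(f + 0.1) <= tdp_watts:
        f += 0.1
    return round(f, 1)

# A Y/Air-class 9W budget sustains a much lower clock than an H-class 45W one,
# even though both chips can burst far higher for a few seconds.
print("9W sustained clock (GHz): ", sustained_clock(9))
print("45W sustained clock (GHz):", sustained_clock(45))
```

The exact numbers are meaningless; the point is the shape of the curve: because power grows faster than linearly in frequency, tripling the TDP budget far less than triples the sustainable clock, and idle power at any base clock is tiny.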

The problem is that it ramps up in 250 MHz increments, over a period of several seconds, and that can be extremely noticeable in some workflows.

I went from a 6700HQ (2.6 GHz base) to a 10710U (1.1 GHz base) and the difference is definitely there, and it's jarring enough that I kind of regret it. It feels like a huge step backwards, despite the latter CPU being four generations newer.

Did you make sure that your turbo button was pressed? Honestly, this is why I just stick to naturally aspirated CPUs.

The first thing I notice in that comparison is that one chip is rated for a 45W TDP and other is 15-25W. While I think these cross-segment comparisons are exciting and show great progress, it's just not fair to the electrons.

Surprising. What device are you using?

This does not seem to be the case for my i7-8565U nor my older m3-something. The 8565U for example feels just as snappy as my desktop i7-6700, except it’ll throttle after about 30s.

> The problem is that it ramps up by 250 Mhz increments

Isn't this completely down to the cpu frequency governor in the OS?

My impression was that this "stepping" is customisable, at least on Linux; I even step my CPU manually on a regular basis.

I'm not sure what Apple has here, but maybe it's not what you expect.
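On Linux, at least, the governor and current clock are visible in sysfs. A minimal Python sketch (using the standard cpufreq paths, which may simply be absent in VMs or containers, in which case it prints nothing):

```python
# Read each core's cpufreq governor and current clock from sysfs (Linux).
# Missing paths are skipped gracefully, so this is safe to run anywhere.
from pathlib import Path

def cpufreq_report():
    rows = []
    for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
        gov = cpu / "cpufreq" / "scaling_governor"
        cur = cpu / "cpufreq" / "scaling_cur_freq"
        if gov.exists() and cur.exists():
            rows.append((cpu.name,
                         gov.read_text().strip(),
                         int(cur.read_text()) // 1000))  # kHz -> MHz
    return rows

for name, governor, mhz in cpufreq_report():
    print(f"{name}: governor={governor} cur={mhz} MHz")

# Changing the governor needs root, e.g. from a shell:
#   echo performance | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
```

macOS doesn't expose an equivalent knob to users; frequency management there is handled by the OS and firmware.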

I'm on Windows 10 and I'm not aware of a built-in method to change the CPU multiplier.

How windows does things and how other operating systems do things, in the wise words of Jayne Cobb "ain't exactly but similar".

With modern notebook CPUs the base clock is only a loose indicator of how a CPU will perform. The CPU will still be downclocked and undervolted depending on the load.

Don't you know? Higher is always better! 5 GHz in my laptop, plz!

And turbos up to 3.8 GHz. We’ll see, but I suspect it’ll spend a lot of time above 3 GHz

At a 9W TDP the chip won't spend more than a few seconds at boost clock.

Sure, for the same price I can get a 8-core, 3 GHz base clock in a non-Apple laptop.

With 2 hours of battery life

Which is fine.

Exactly what I was thinking when I read the spec sheet: 'I wonder how many developers will go for the maxed out version.'

Definitely going from 2015 MacBook Air to this new one for my personal at-home coding laptop, as long as I like the keyboard when trying it out.

I had really been wanting to upgrade for Retina & better processor but I knew they would upgrade the processor and fix the keyboard if I waited for 2020... no reason to wait now.

I don't run any crazy fat Docker stacks for my own stuff at home, so this is perfect.

why not the pro for a few more hundred dollars? or wait for the upgraded version in the summer? you get a noticeable performance boost, dedicated graphics card, touch bar, and so on?

I really don't need that much extra memory or performance, I thoroughly dislike the touch bar, and I want a slimmer/lighter form factor.

My at-home hobby work is in Golang and Python, and not particularly compute-intensive stuff. Neither of those have huge heavy toolchains.

Really the main thing driving an upgrade from my minimum-spec 2015 Air is the Retina display.

My personal laptop is an 11" Air mid-2013 and I still use and love it. I especially love the keyboard on it because the keys have height and feel closer to a mechanical keyboard, and they don't capture as much dust and dirt as the flat keys on my newer touchbar 2016 Pro work laptop.

This is good news from Apple as I was not into any of their more recent laptops but I'll probably upgrade to this one

I only wish they had a 11" version but not a deal breaker

Same for me, although it’s a 2012 11". The 13" model here is actually somewhere in between the old 11" and 13" for size.

I probably would except for the lack of ports and weak CPU.

Still just waiting on a new 14" MBP.

you actually make use of 4 usbc connections? I'm genuinely curious on the use case. These days you can get 12 in 1 dongles from china that cover the last 25 years of input device standards into one usbc connection, and it will charge the thing.

Not a big fan of dongles.

Why though? I find buying one $30 dongle that has 10 inputs for cables I already own is a better deal than buying 10 $20 USB-C cables. As a Mac user I've been used to dongles for a while; Mac laptops never seemed to have the standard AV out aside from that fluke generation with HDMI. Always some weird connector for the sake of being weird, it seemed.

I don't love coding on a tiny screen, and when I'm using an external monitor I'm stuck awkwardly typing on the laptop keyboard.

Seriously considering trading my maxed out 2018 13" for this, but in the end probably not going to do it.

I am.


  - 16GB RAM
Can you even run Eclipse with just 16GB?!

I can't tell if you're joking, but I have a 2018 Mac Mini with just 8GB of RAM, and I often run Eclipse, IntelliJ, and PyCharm at the same time (along with multiple browsers and other stuff), and performance is fine.

I was actually surprised by this--when I first started using this computer, I thought for sure I would need to add more RAM, which for the 2018 model is too complicated to do yourself (at least to me it seemed too risky).

Semi-joking, but the problem is real for me. I've got a 2013 13" MacBook Pro with 8GB RAM, and my system can't cope with my workflow ... tens of tabs in Safari, webapps in Chrome (YouTube, Google Docs, ...), Eclipse with Scala / Java, ... it's a huge struggle.

I was handed a 2017 MacBook Pro with 8GB of RAM at my current job while waiting for my actual laptop to be delivered, and it was a nightmare.

I keep a lot of tabs open to look things up, but nothing excessive on that machine. I also run VSCode or Pycharm and would also bring up 5-10 containers at times.

It seriously hurt not only my productivity but also my mood afterwards just by having to put up with it for weeks.

Unless you're a very basic user I don't get why you would settle for 8GB in 2020. 8 gigs of RAM cost basically nothing, it's not worth changing your workflow in the slightest to work around that artificial limitation.

It is odd because these memory issues are very real, but if you ever say "wow, devs are getting lazy and these 'desktop' apps that rebundle Chrome are really killing my machine (eg. Slack, Skype) with inordinate quantities of logic in javascript" you get shouted down.

It's bizarre. If everyone used the native toolkits we'd have far less memory usage and everyone (even the memory-constrained) would have a good experience.

Also, with these memory hogs they will do a lot of allocation and deallocation. This is also a problem with interpreted languages. And allocation is the enemy of speed, and energy usage. It'll destroy your daily battery expectancy as everything gets interpreted.
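As a crude illustration of that allocation cost (a toy Python example; the effect in a real Electron app is spread across far more machinery, but the principle is the same):

```python
# Toy demonstration of allocation overhead: rebuilding a list on every
# iteration vs. building it once. Both produce the same result, but the
# allocation-heavy version creates and copies a new list each step.
import timeit

def alloc_heavy(n=5_000):
    out = []
    for i in range(n):
        out = out + [i]   # allocates a brand-new list every iteration
    return out

def alloc_light(n=5_000):
    return list(range(n))  # one allocation up front

t_heavy = timeit.timeit(alloc_heavy, number=10)
t_light = timeit.timeit(alloc_light, number=10)
print(f"alloc-heavy: {t_heavy:.3f}s  alloc-light: {t_light:.3f}s")
```

The absolute timings vary by machine, but the allocation-heavy loop is consistently slower by orders of magnitude, and the churn also creates garbage-collection work that costs energy on top of CPU time.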


I remember feeling the same when I was forced to upgrade from 32 MB of RAM to 128 MB to run the combination of browser, chat and IDE on Windows NT4, back when most software moved from hand-optimized assembly to mass-produced C++.

With every layer of abstraction added to ease development the hardware requirements go up. You can build things fast, or you can build fast things, doing both is tricky.

That's actually very surprising to me.

I'm running linux with an anemic window manager, and with nothing but chrome and slack open (20tabs in chrome) I am consuming 6GiB.

If you add teams, pycharm and outlook (electron) it consumes 9GiB... Actually, that's also less than I expected.

Well done pycharm.
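If you want to see where those gigabytes actually go, you can total resident memory per process name straight out of /proc, no external tools needed (Linux-only sketch; it simply returns an empty list on other OSes):

```python
# Sum resident set size (VmRSS) per process name by scanning /proc.
# The top entries are usually browsers and Electron apps.
from collections import Counter
from pathlib import Path

def rss_by_name(top=5):
    totals = Counter()
    for status in Path("/proc").glob("[0-9]*/status"):
        try:
            fields = dict(line.split(":", 1)
                          for line in status.read_text().splitlines()
                          if ":" in line)
            if "VmRSS" in fields:
                kb = int(fields["VmRSS"].strip().split()[0])  # value is in kB
                totals[fields["Name"].strip()] += kb
        except (OSError, ValueError):
            continue  # process exited while we were reading it
    return totals.most_common(top)

for name, kb in rss_by_name():
    print(f"{name:20s} {kb / 1024:8.1f} MiB")
```

Note that RSS double-counts shared pages (e.g. shared libraries mapped into many processes), so the totals overstate true memory pressure somewhat; it's still a good first look at the hogs.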

I think OSs in general just eat a portion of whatever memory you give them. Right now I'm puttering around with a dozen tabs in firefox, in fact my biggest memory hogs right now are firefox with 3.5gb and apple mail with ~500mb, not really doing anything else, and somehow 12GB/16GB are in use. Better for the ux to keep things open in memory if you have it to spare, I suppose.

When you are memory constrained, you can definitely tell. Everything comes to a halt and you just twiddle your thumbs between commands. This 16GB machine I have shipped with 4GB which was painful even 8 years ago when it was released, and I upgraded myself to 8GB 6 months into ownership. A few years later when javascript became more pervasive on the web, I hit memory constraint on 8GB a lot just from having tabs open in chrome, back when it was perhaps more of a memory hog, so I opted for 16gb and haven't had issues since.

I think at 16gb you should be set for at least 5 years. Most people, even a lot of devs on company issued equipment, are working with 8gb complaining about it right here in this very thread.

If you have larger requirements, a lightweight, thin laptop with a teensy fan isn't for you. Even if it had the hardware specs, the physics of heat dissipation don't work for you and you are better off spending the same money for more hardware sitting in a box under your desk. Me and my sore back are eyeing this up, all my computing is done on a cluster anyway.

Same. I have a MacBook Pro (Retina, 13-inch, Early 2015) with 8GB of RAM and I've been doing fine. Sure, there are some hiccups every now and then but it works. I have Spotify, VS Code, Slack, Kitty tmux sessions and more open 24/7.

What? It's probably swapping like a bastard, which with SSDs is probably not that horrible. Even 16GB for me is low (I do run in a Linux VM guest). I got myself a Mac Mini with 64 gigs, for great justice.

If I open up PyCharm and IntelliJ and Spotify and SourceTree and Docker and three different browsers and iTerm and Remote Desktop and a few other apps all at once, I will get an occasional hiccup, but it's really not as bad as I would have expected. I think 16GB would be nice though.

For comparison, I also have a 2012 Mac Mini at home with an SSD and 16GB RAM, and it's still chugging along pretty well too, although it's noticeably slower than the 2018 model with 8GB RAM.

I'm curious with regard to swapping if that might mean my SSD is going to wear out sooner. Maybe investing in more RAM would be worth it even if I don't feel like I need it.

Exactly. I'm deciding between 32GB or even 64GB, just to be on the safe side. Because nowadays you're running Slack, Spotify, several messengers, Firefox, Chrome, IntelliJ, Docker and Kubernetes on your local machine.

Which is kind of horrifying, if you stop and think about it. You're wondering whether you need another 32G of RAM to run a basic working environment, a glorified text editor, and some communications software. I used a BBC that could do that in 32K of RAM in the 1980s! Obviously I'm not really suggesting the functionality today is equivalent, but the idea that ultimately you're meeting the same basic needs yet it now takes a million times as much space is... unsettling.
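The "million times" figure isn't rhetorical exaggeration, as a quick back-of-envelope check shows (assuming 32 KiB on the BBC Micro versus a 32 GiB workstation today):

```python
# Rough scale check: 32 KiB (BBC Micro, 1980s) vs 32 GiB (modern dev machine).
bbc_ram = 32 * 1024          # 32 KiB in bytes
modern_ram = 32 * 1024 ** 3  # 32 GiB in bytes

ratio = modern_ram // bbc_ram
print(ratio)  # 1048576 -- literally about a million times as much memory
```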

It's certainly amazing how much memory consumption has grown. I like to think of it in terms of economics: we could never write today's software using 80s methods. Slack in assembler? Impossible. Kubernetes in C++? Maybe, but there would be security holes, and Go is just more productive. Developers are expensive, very expensive.

Developers are expensive, very expensive.

Such is the accepted wisdom in much of the industry, but I'm a bit of a sceptic on this score. Of course developer time is expensive, particularly if you're in somewhere like the Bay Area where salaries are an extra 0 compared to most of the world. But we live in an era of virtualisation and outsourcing (sorry, "cloud computing") when businesses will knowingly pay many times the cost of just buying a set of servers and sticking them in a rack in order to have someone else buy a much bigger server, subdivide it into virtual servers, and lease them at a huge mark-up. All kinds of justifications have been given for this, many of which I suspect don't stand up to scrutiny anywhere other than boardrooms and maybe golf courses.

There's a nice write-up somewhere, though regrettably I can't immediately find it, of the economics of cloud-hosting an application built using modern trends. IIRC, it pitched typical auto-scaling architectures consisting of many ephemeral VMs running microservices and some sort of orchestration to manage everything against just buying a small number of highly specified machines and getting on with the job using a more traditional set of skills and tools. Put another way, it was the modern trend for making everything extremely horizontally scalable using more hardware and virtualisation against a more traditional vertical scaling approach using more efficient software to keep within the capacity of a small number of big machines. The conclusion was astonishingly bad for the modern/trendy case, to the point where doing it was looking borderline insane unless your application drops into a goldilocks zone in terms of capacity and resources required that relatively few applications will ever get near, and those that do may then move beyond it on the other side. And yet that horizontal scaling strategy is viewed almost as the default today for many new software businesses, because hiring people who are good enough to write the more efficient software is assumed to be too expensive.

We live in a world where any one-man startup thinks (and their investors hope) they will have 10k employees by year end. Therefore, if you are going to be burning money anyway, what's another line item on the monthly outflow if it means you don't have to spend 3 months hiring someone to toil in the server room and a couple of months ordering and assembling your farm, which might crash the day your startup gets linked on Hacker News.

There are technical reasons for this, like being able to handle sudden load, but mostly it's for ideological reasons. We aren't building companies, we are building stock pumps disguised as the utopian future. If you are wondering what a blue-chip company looks like in tech, they are the ones that own their own infrastructure.

Maybe there is a middle road for cash poor companies, where you keep latent demand in house for the sake of cost and sense, but have some sort of insurance policy with a cloud service to step in if demand surges.

I’m not well-versed at all in Go and Kubernetes. Isn’t the large RAM usage from the use of containers? Is Go a memory hog?

It's a GC'd language, so yes, it needs more than C++, but less than Java. I believe that is because structs can often live on the stack.

We don't have the same basic needs. People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with. Back in the 80s, your QA and production environments were very likely the same!

I'll admit that modern text editors and communication software have grown resource hungry, but a lot of that comes from being able to deliver a strong, cross platform experience. I remember desktop Java doing much of the same with just as bad resource usage. Same with applets.

People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with.

Sure, but that immediately raises the next question of why those VMs are so big...

Yeah, the fixed cost of a VM context is on the order of kilobytes in the host kernel, megabytes in the guest kernel. And with VM balloon paging a guest VM acts much like a regular process in terms of memory usage. It's not VM usage that hogs memory, it's the applications, regardless of VMs.

Why does it matter when you can afford that RAM? Just buy it and forget about it; it's cheap enough. We used to land on the moon with a CPU less performant than Apple's HDMI adapter cable. It's a fun comparison but not very useful; that's just the way things are, and it's not going to change anytime soon.

I realise it's how things are today and not going to change any time soon, but it still feels like we as an industry have moved all too easily in a very wasteful direction. Sure, with RAM you can just buy more, but it's symptomatic of a wider malaise. Other capacities, particularly CPU core speeds, have long since stopped increasing on a nice exponential-looking curve to compensate for writing ever more layers of ever more bloated software in the name of (presumed) greater programmer efficiency. It just feels like we've lost the kind of clever, efficient culture that we used to have, and I'm not sure we weren't sold a bill of goods in return.

I'm not sure whether the curve is still exponential or not, but it's there. Single-thread performance is increasing a little bit every year, and core count is increasing like never before. A 16-core consumer CPU is not a dream anymore.

RAM size slowly increases as well. 4 GB was enough 10 years ago. 8 GB was enough a few years ago. Today I would suggest 16 GB as a bare future-proof minimum, and one can buy 64 GB for a reasonable price.

We still have room for more layers. And it's not only about efficiency, it's also about security. Desktops are still not properly sandboxed, my calc.exe still can access my private ssh key.

Once performance growth really stops, we will start to optimize. Transistor density will double every few years until at least 2030, and AFAIK there are plans beyond that, so probably not soon.

> Why does it matter when you can afford that RAM?

Why should everyone have to afford that RAM?

I have 8GB on my work laptop with almost all of this (except Kubernetes, but I fail to understand why you would need a local Kubernetes) and it's fine, I usually have 2GB free memory.

Don't exaggerate your memory requirements; you would be more than fine with 16GB.

That's not even close to an exaggeration. I'm running only half those things (or their competitive equivalents) right now on a Windows box. I just checked and I've got 14.8 GBs in use.

Fortunately, I have a Dell XPS 15 with 32 GBs of RAM, but the second I start up a single VM, one more messaging app, a small handful of Docker containers, or any IDE (of which I'm running none right now), I'm going over 16 GBs.

Realistically, most of us on HN probably need around 20-24 GBs, but laptops don't come in those increments.

> of us on HN probably need around 20-24 GBs

I develop for a living. I use 6 GB including a browser, a VM and an IDE.

Some of you greatly exaggerate the needs. Some workflows require 16+ GB of RAM, but most people complaining about RAM mismanage it or do not understand that caches are not mandatory.

I'm running Firefox and Mail on macOS, and 12GB are in use. The OS keeps things loaded in memory if you have it to spare.

Right now on macOS I'm running Firefox, Outlook, 2 VSCode instances, Postman, 1 Electron chat app and another chat app and I'm under 5GB. Uptime 4 days.

Cache does not count when discussing memory requirements.
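On Linux, at least, the kernel makes this distinction explicit: `MemAvailable` in /proc/meminfo estimates how much memory could be used without swapping, already treating reclaimable page cache as free. A minimal sketch (Linux-only; macOS exposes the same idea through vm_stat instead):

```python
# Separate "memory actually needed" from "memory the OS is using as cache" on Linux.
def meminfo_kib():
    """Parse /proc/meminfo into a dict mapping field name -> value in KiB."""
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            fields[key.strip()] = int(value.split()[0])  # values are reported in KiB
    return fields

m = meminfo_kib()
total = m["MemTotal"]
# MemAvailable counts reclaimable cache as usable, so total - MemAvailable
# approximates what running programs genuinely require right now.
truly_needed = total - m["MemAvailable"]
print(f"total:  {total / 1024:.0f} MiB")
print(f"needed: {truly_needed / 1024:.0f} MiB (excluding reclaimable cache)")
```

A naive `total - MemFree` number, by contrast, lumps the cache in and makes a healthy machine look starved, which is exactly the confusion in the comments above.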

I love it when people tell me my requirements.

Java people problems

"quad core" @ 1.4GHz
