If a MacBook Pro runs hot or shows high kernel CPU, try charging it on the right (stackexchange.com)
895 points by gkop on April 23, 2020 | 478 comments

This post on HN yesterday fixed my problems of the last 5-7 days. I use a Thunderbolt 2 hub hooked up to a 2018 MBP, driving two 4K monitors, a USB mic, etc.

And I've noticed the hub got really unstable whenever the CPU fans would go wild. Looks like it was the controller overheating due to the shitty thermals that Jony Ive's Apple seems to keep pushing out. (Still the Apple of today).

Now that I've switched my ports to a different config, I've had no crashes in the last 2 days.

I swear, if I didn't love macOS so much (or wasn't so heavily invested in it), I'd happily ditch it for a really powerful, properly cooled desktop and use that as my main machine. WSL makes this more palatable, but the unparalleled Retina support on macOS, my 15 years of using it, and the habits I've built up keep me from leaving. (I felt the same way when I first moved from Windows to macOS, but that was in my early 20s and I had lots of time to play with the OS.)

YMMV, but switching from laptop OS X to desktop Linux has been a game-changer for me. Far cheaper, and the power/UX of a mid-range 2020 desktop blows my 2019 MacBook Pro completely out of the water for my use case. Code is a joy again. When I do use my MacBook on the go, I generally use VS Code remotely connected to my desktop because it's so much faster.

I also use Regolith Linux, which is a noob-friendly tiling window manager version of Ubuntu, and it feels so slick with multiple monitors.

I switched from a macOS laptop to a Linux laptop and it’s been completely the opposite for me. I’ve been spending so much time fighting the OS, I’m actually considering buying a Mac again even though the laptop is just a few months old.

You can call me too incompetent or whatever but I’ve been running into stupidly obscure bugs that even stumped some of my Linux guru friends.

Just as one example of many, when the device is connected to a Thunderbolt display the Intel wifi driver crashes and restarts periodically which freezes USB input for about 15 seconds every time. This issue persisted across different Linux distributions, kernels and firmware versions.

Don’t ask how long it took to figure this out. I have now connected a USB wifi dongle to the display’s USB hub.
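For anyone hitting something similar, here's a rough diagnosis sketch, assuming the Intel iwlwifi driver (the module name may differ on your hardware):

```shell
# Watch the kernel log for firmware crashes while plugging the
# Thunderbolt display in and out
sudo dmesg --follow | grep -iE 'iwlwifi|firmware'

# If the driver has wedged, reloading the module sometimes recovers it
# without a reboot (wifi will drop for a few seconds)
sudo modprobe -r iwlwifi && sudo modprobe iwlwifi
```

This won't fix the underlying bug, but correlating the crash timestamps with the USB freezes is how you confirm it's the wifi driver and not the hub itself.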

I really miss the plug-and-play nature of macOS. I think Linux has its advantages, and it might be better on desktops, but it’s just been horrible for me on a laptop. I might have to try a MacBook plus a fast Linux desktop next.

> You can call me too incompetent or whatever but I’ve been running into stupidly obscure bugs that even stumped some of my Linux guru friends.

No, you're not incompetent. As usual, manufacturers are putting weird features into their laptops that the Linux drivers and userspace can't keep up with - e.g. the debacle that is Nvidia's Prime GPU switching tech for low-power vs. high-performance modes. It simply doesn't work most of the time, leaving you scratching your head. UEFI-related woes are also common as we finally deal with having to give up on a decent legacy-boot experience.

At this point the wise person buys a laptop with official Linux support out of the box. It helps you and it helps the community (vote with your dollars!).

Yes, weird features such as "802.11n wifi", "suspending the system when the lid is closed", etc. I had a well-supported older ThinkPad (it even supports open-source BIOSes), and issues cropped up in exactly those areas.

For sure, but usually this comes down to "manufacturer picked a slightly cheaper wifi chip with no linux support", "manufacturer is doing weird shit with ACPI they shouldn't", etc.

No. For me, everything was perfect in 16.04; 18.04 started breaking things.

Unfortunately I’ve had similar issues with “official” Linux supported hardware. Linux on desktop still requires forum diving.

Who is declaring that hardware "official" and if they're calling it "supported" shouldn't they be supporting it?

It was a Dell XPS 13 Developer Edition with Ubuntu, far and away the most recommended distro/hardware combination. Everything else seems to be less supported.

If you don't know how to dig into terminal commands and google stuff on your phone to troubleshoot, you shouldn't even attempt Linux on a desktop.

Oh yeah, Dell XPS Dev edition is usually a decent choice, although you're right that not everything works perfectly (at least it didn't a few years ago when I bought one). It is a little annoying to have bugs in a factory-installed OS, but I'm glad to at least have the option.

I actually prefer Thinkpads these days because everything "just works" when I install Fedora on it.

If you're an Ubuntu person and want it from the factory, I have heard that System76 build quality has gotten pretty good.


"Ubuntu Certified hardware has passed our extensive testing and review process to make sure Ubuntu runs well out of the box and it is ready for your business. We work closely with OEMs to jointly make Ubuntu available on a wide range of devices."

So in this case it is a collaboration between Ubuntu and OEMs

I have been using Pop!_OS and have had no real problems. Docking and connecting to monitors/webcams has been really smooth.

I am not a power user of other peripherals though.

Only thing you need to look out for when buying components is what the Linux support looks like. I've built 3 desktops in the last couple of years and all of them work out of the box. It really doesn't "require" forum diving.

A Dell XPS 13 Developer Edition with Ubuntu preinstalled hasn't worked "out of the box" for simple multi-monitor use cases for me. Invariably something requires an update for support, that update invalidates some assumption, and then whoops, the only guide that describes your problem has a solution that involves pulling and compiling X.org in a terminal and dealing with tarballs.

And someone can go gosh, you must be doing it wrong, and they're almost certainly correct! However I'm a pretty big power user and can actually get things to work and dig into forums, so I realize that the average user has absolutely no chance.

How do you handle video card selection?

I got the ASUS 1215B netbook with Ubuntu and got my share of issues with wlan and GPU support.

Linux laptops are a mixed bag, to say the least. One-off "precision tuned" (vs. commodity) hardware is often to blame. In general, if you want a stable Linux laptop experience, you should be buying old. It's not very attractive, but it is sustainable -- which is its own kind of attractive! :)

In general, my experience using Debian-based distros as a laptop driver has been more or less frictionless. I use an older Lenovo ThinkPad (X1 Carbon, 1st generation, purchased for like $200 secondhand a year and a half ago).

In general, ThinkPads have a reputation for a plug-and-play Linux experience once you boot for the first time. I'd recommend giving them a try before throwing in the towel, if you have any patience in reserve.

You can get pre-owned (older = more stable Linux support) T-series ThinkPads with very nice specs, especially if you're willing to trade off display resolution. Plus, the parts that die (batteries, RAM) are all commodity and replaceable; you could presumably run the same laptop for a decade, if you're into that sort of thing.

Hi, I have a ThinkPad L450, and when I was looking at switching to Ubuntu on my laptop, I saw that it had worse battery life than Windows. I don't remember by how much, maybe 10-15%? Do you have any experience with that? Battery life is pretty important for when I take my laptop to courses (not anymore, eh).

TLP can be very effective. Have you tried it?

I did try it; however, I do not remember if it made any difference. Perhaps I didn't configure it properly?
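For reference, a minimal TLP setup looks something like this (package and service names assumed for Ubuntu/Debian; TLP's defaults do most of the work without configuration):

```shell
# Install and enable TLP so it starts now and on every boot
sudo apt install tlp tlp-rdw
sudo systemctl enable --now tlp

# Verify it is actually applying power-saving settings
sudo tlp-stat -b   # battery status and charge thresholds
sudo tlp-stat -p   # CPU/processor power settings
```

If `tlp-stat` shows everything active and battery life still lags Windows, the gap is more likely in the GPU or panel power management than in TLP's config.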

I’d say give it a try. My experience has always diverged from the stated norms on battery life in all devices and contexts, but maybe my usage patterns are abnormal.

If you’re using simpler programs (Firefox, terminal), your battery life should be great. I usually get four hours of heavy vim use with WiFi etc all on, on a battery that’s 75% capacity. YMMV.

It’s not at MBP level, but I imagine it’s fairly similar to what Windows would draw. Maybe better.

This actually is a ThinkPad :(

It’s a new X390 Yoga though. Maybe they’ve dropped the ball a little, it definitely matches my friends’ experience that the more traditional models are pretty stable.

Sorry to hear this. Yeah, it’s probably a matter of getting an older one. The newer ones are trying to compete with surfaces and mbp, and it’s been painful to watch :/

Anecdotal evidence, but I've had a great experience with the first-gen X1 Extreme 4K running Ubuntu 18.04.

I recently ordered the X1 Extreme v2 and I am ridiculously excited. It's my first non-Apple laptop in 20 years.

Laptop vs. Desktop.

The Linux laptop experience is notorious for weird quirks and hardware instabilities. Desktop is a lot smoother.

What laptop are you using? I think that would be helpful context. I've been running Ubuntu 16.04 on my Dell XPS 15 9560 with almost no issues (and none that aren't easily resolved with a shell script or a keyboard shortcut) for almost three years now. I get the sense that the more popular the hardware, the fewer issues you'll have running Linux on it. I certainly wouldn't call you "too incompetent" -- but I might call you just plain unlucky.

If you want to go back to macOS because you don't have to be "lucky" to get a laptop that plays nice, I don't blame you. But for me, the tradeoffs to stay on Linux have been minimal and absolutely worth it.

I’ve had the same experience. Using Windows and Linux now; it’s shocking, but Windows has been a pretty good experience for getting work done.

I recently got a new laptop and bought it from one of those linux specialist places after having a pretty terrible compatibility experience with my previous high end HP laptop.

As an intermediate step, install Windows on the Insiders channel, though the 20H1/2004 update should be out soon for GA.

I had to jump into Windows for a few things last month, and WSL2 has become pretty good and bearable; the Docker beta support for WSL2 is also really good, though it seems to use a bit more memory than I recall. Linux CLI with a Windows GUI has been surprisingly bearable. The Remote (WSL/SSH) extensions for VS Code are invaluable as well.
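A minimal sketch of moving to WSL2, assuming Windows 10 build 2004 or later (the distro name is whatever `wsl --list` reports for your install):

```shell
# From an elevated PowerShell:
wsl --set-default-version 2    # newly installed distros use the WSL2 VM
wsl --list --verbose           # check which version each distro runs on
wsl --set-version Ubuntu 2     # convert an existing distro in place
```

The conversion can take a while since it repacks the distro's filesystem into a virtual disk; Docker's WSL2 backend then talks to that VM directly instead of running its own.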

You are saying the alternative to probably the most-used developer setup is a combination of multiple beta release channels?

I am suggesting trying a software-only alternative to buying another laptop.

Far less obscure than the issue you cite but I have found keyboard shortcuts across the OS and third party apps on Linux are so incredibly inconsistent when compared to macOS (or Windows for that matter.) I find I'm constantly bouncing between control, alt, and super to achieve what I could do in macOS with just super.

Truth is, Linux doesn't work well on new hardware unless it's one of the specific models that state Linux support, like some Dell or ThinkPad machines. Your bugs will probably be fixed in a couple of months/years.

I don't completely love any setup, but I'm starting to think that Windows 10 + WSL is the best open-source development setup. Huge variety of hardware, plus all hardware actually works right, plus pretty much any popular desktop app works reliably and has a good GUI, plus all Linux CLI tools are there and work right.

What's the fascination with laptops anyway? Are that many people really working from coffee shops? I'm pretty much always coding on the same desk at home and a mid-high end desktop is significantly cheaper than a decent laptop and much more powerful than even the really high end laptops. The ergonomics are also much better, although that can be fixed on laptops with docks and separate monitors etc too.

I had a laptop at my previous job, now a desktop.

If you have to do a demo in a different room, it's a pain. Your best bet is usually to use another laptop. Same thing if you want to have a call in a quiet place and want to check emails/reference during the call.

Work from home is also easier, you don't have to use 2 computers or carry your tower (though currently that's viable).

It's more about those that work from the office mostly, maybe go into meetings etc... and then home occasionally (or right now, mostly).

At work, I'm on a dock; at home, I'm on a 4K KVM switch, so I'm not really using the laptop except as a lighter computer. Just got bumped to 32 GB RAM, which was most of the laptop's bottleneck. Though my personal system (R9 3950X) is much faster than the laptop (i7-8550U), both on 1 TB NVMe.

I'm envious of your 3950x, but on the other hand I suspect I'd never actually max it out. I should get into video editing or something to justify more PC upgrades...

Even then... literally the only time I max it out is when I'm doing a HandBrake x265 encode on a faster preset; the slower presets only use about 75% of the CPU. If I did it again, I would probably go with a 3900X and spend the extra $250 on an RTX 2070 Super instead of the RX 5700 XT that I got.
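The kind of encode described above can be sketched with HandBrake's CLI (file names are placeholders):

```shell
# x265 software encode; slower presets trade CPU time for smaller files
# at the same visual quality (constant-quality mode, lower RF = better)
HandBrakeCLI --input in.mkv --output out.mkv \
  --encoder x265 --encoder-preset slow --quality 22
```

Swapping `slow` for `fast` or `veryfast` is what pushes all cores to 100%; the slower presets serialize more of the analysis, which is why they leave CPU headroom even on a 16-core part.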

My old computer was over 5 years old at upgrade, with a mid-cycle upgrade of a couple of components; likely this one will be the same, though I don't think I will ever go RGB again.

What 4k kvm switch do you use? Do you like it?

It's the TESmart KVM switch: HDMI 4K 60 Hz 4:4:4. I like it okay. It seemed about the best option on Amazon, but it has mixed reviews. My display is 60 Hz, and I don't play many games, so it works well enough for my use.

I did get HDMI adapters for my Pi 4s, but haven't actually tried them yet. You may need different adapters/cables if you're going to/from Mini DisplayPort or another interface. The back of the switch looks like USB-B female (standard USB cable, not 3) and HDMI female.

I've only been using it for about 4 weeks, but so far working well.

KVM (comes with 2 cable sets): https://www.amazon.com/gp/product/B07F6XFVZ3/

Extra 2 cable sets: https://www.amazon.com/gp/product/B075FRFXNX/

Micro HDMI to HDMI (for pi4): https://www.amazon.com/gp/product/B00JDRHQ58/

I'm sheltering in place and in the past 3 hours, I've coded in like 3 areas of my house. I also need to move when my partner is working and is on a call.

Not having a dedicated work area like a home office or just a desk and office chair in a corner of the living room (this is my solution) sounds like a nightmare to me.

Having a single spot to work at, never to change location, posture, or surrounding sight sounds like a nightmare to me.

I have two sofas, a small desk and a bigger one in two different rooms, a balcony, a table in the kitchen and one in the garden. I frequently switch and move between all of those, which helps me a lot in getting out of coding slumps and refocus. I was so happy to give up my static one desk multimonitor setup at the office for a nimble 13" laptop work from home situation.

To each his own.

Sitting at the same place every day and having no freedom to move because you're tethered to a desktop sounds like a nightmare to me.

Not very interesting discussion to be had in this direction.

I used to think the way you do as recently as just a few years ago. I couldn't imagine not being able to pick up and move to a cafe or co-working space. I was working a lot of hours and working from home when I probably shouldn't have.

What I've found is that since having a desktop and a dedicated desk and office in my house, when I leave the room I don't bring work with me. I also don't have push notifications enabled on my phone, including email. When I go out, I enjoy other things and then when I come back to my desk, I am much more focused and ready to concentrate on work.

Yep, different people have different subjective preferences about working situations, who woulda thunk it? I am interested to hear people's perspectives though.

I have a desktop PC at home in a home office. My wife is right near me.

I think this is probably a problem because I also play games on it so the room sends very mixed messages to my brain.

I help with a small business so have to use my PC for that. With my day job I have a laptop provided and do move around the house to get a mental disconnect and help me focus on some tasks. The problem is it's not a great device for stuff like in-depth research, I really want a big screen for that, so usually have to use the PC and get distracted.

I have one of those. I've been remote for a couple years so none of this is new to me.

I just unplugged my laptop from my standing desk and went to lay down for my postprandial chill session. I'm going to get some work done as soon as I'm done faffing about on HN (whomst among us...) and then I'll probably plug back in again.

It's just easier to do it this way rather than synchronize state between a desktop and laptop. One less thing to own, also.

I think there's more to it than just working from a coffee shop.

Without going into people who are often on the road, many folks I know like working on the same computer at the office and at home.

It's usually easier to carry a laptop than a desktop. Even though there are many very small desktops nowadays (see HP's EliteDesk Mini, though it looks like a laptop without a screen, so I'm not sure it's that much more powerful), the laptop usually has fewer cables to unplug, so it's generally less of a pain.

Another angle is that for many people a laptop has enough power for the activities they do and being portable is a real plus. I'm typing this on a 2013 MBP in my bed. This laptop might be slow compared to a modern mid-range desktop, but it's not tethered to a fixed spot. When I need to do serious work, I can plug a 4K screen and external keyboard and have the desktop experience.

On the rare occasion when I need a lot of power for some task, I'll usually fire up some outrageous EC2 instance for an hour or two. It will also have better network connectivity, which allows me to work comfortably over my parents' DSL line too.

I guess it all comes down to usage patterns. If you always use your computer on the same desk and never have the need to move it, I guess a desktop is a more effective use of funds. But many people seem to enjoy being able to carry the computer on a sofa, in the kitchen, etc.

When I started at my current company we had desktop PCs, went home, and used a VPN to dial in.

They got rid of the desktop PCs and gave us all crappy laptops (they took a few years to catch up in power to what we had) and the argument was they could not ensure a secure environment on random home PCs. They were probably angling towards a hot-desk setup too but most people have a dedicated desk still.

I definitely preferred the old setup. I just struggle to do any meaningful work at all on a laptop; I need to plug it into a screen, but I can't dedicate that space, so I normally just put up with it rather than unplug my home setup.

They do provide docks in the office but unfortunately a few different generations of HP laptops are around so you might need to search for the right one.

I don't think fascination has anything to do with it in a corporate setting. I work for a global bank; we have around a quarter million employees, and many of our office spaces employ hot-desking. Each of us gets a laptop, and we can work from wherever. Especially since you're required to work from home at least one day per week.

It’s not a fascination. The issue is usually that if you have a desktop, you also need a laptop. So unless you actually need a desktop’s power, you just use a laptop with desktop ergonomics (screen, keyboard and mouse) and occasionally go portable.

Is there a distro/DE/WM that handles UI scaling gracefully? Every time I try to go back to Ubuntu with my 5K monitor I'm met with the worst UI scaling options imaginable. For example, scaling up the titlebars, but leaving everything else tiny... or scaling options being limited to 1.0 or 2.0. I still want to output to the monitor at 5K, but with the whole UI scaled up. Mac OS and Windows 10 (to a lesser extent) handle this without issue. I would love to move to Linux full time, but this has been a hurdle for me.

In all the articles about Ubuntu 20.04 that I saw today, one of the main improvements mentioned was HiDPI display support; maybe this solves your problem.

Meanwhile, the MS Surface I'm typing this on has a blurry Task Manager and unreadable notifications because 75% of them fall off my Full HD screen.

I wonder why this has been such an issue for Microsoft. If I remember correctly, the entire Computer Management program and all of its subcomponents have no functional UI scaling. As pretty as a Retina monitor is, I'm glad that I went with a cheaper, lower-resolution monitor with good color accuracy and reliability. With subpixel hinting, it's very sharp from my usual sitting position. With laptops, I can see the advantage of going HiDPI.

YMMV, but I had a really good experience with 2x scaling on a 4K laptop screen in standard Ubuntu and GNOME. There are still some apps which ignore DPI settings (Zoom) or need to be forced to scale (often Electron stuff, although you can usually just zoom with Ctrl-+ there). Otherwise: awesome sharp text, and fluid.

Fractional scaling seems to be quite a mess, at least in KDE it breaks a lot of layouts.

2x seems to work well most places; it's definitely fractional scaling that sucks.
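For anyone who wants to try it anyway, GNOME hides fractional scaling behind an experimental flag (this sketch assumes a GNOME Wayland session with Mutter; X11 sessions use a different flag):

```shell
# Enable GNOME's experimental fractional scaling support
gsettings set org.gnome.mutter experimental-features \
  "['scale-monitor-framebuffer']"

# Afterwards, 125%/150%/175% options appear in Settings > Displays
```

It's experimental for a reason: XWayland apps in particular tend to come out blurry, which matches the layout breakage mentioned above.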

For price-performance, absolute performance and acoustics laptops never made any sense at all. A 3700X annihilates the highest perf part Apple uses and can be cheaply cooled under full load without causing much noise; or without causing any noise for slightly more expense. And the 3700X is a midrange offering, not high end (unlike the Apple part).

Absolutely, but, sometimes you’re on the move.

The 3700X is a mid-to-high-range offering, btw. The 3600 and 3600X (which isn't worth it) are midrange.

We're not going to be on the move any time soon, so if there's a good time to build a tower it's now.

Is there a config you would recommend? Primary use case would be for programming & browsing. Would prefer not to build as I haven't done it but am not averse to the idea.


> Far cheaper, and the power / UX of a mid-range 2020 desktop blows my 2019 Macbook Pro completely out of the water for my usecase.

Which laptop do you use? I tried to switch from my just-post-Intel MacBook Pro to the Lenovo X220 several years ago, figuring Linux support + IBM quality (they had been fairly newly bought out) would give me a solid machine. Turns out several stuck pixels on the monitor and a keyboard on which some keys didn't work were officially regarded as within acceptable quality range. (Plus, I hated the trackpad, but that's my preference rather than a hardware issue.)

Same story. I was a macOS addict for a long time, starting with the 2010 MacBook Air and 2012 MacBook Pro. They were amazing machines with a stable *nix-based OS.

But since then, Apple has been more focused on phones, and the desktop OS hasn't gotten much better. The only thing they're doing is more and more cloud integration to lock you into the Apple ecosystem. I got tired of that.

Now I prefer Regolith Linux, because it's much better for development to have a Linux system with proper package management, without messing with brew.

The distro is insanely fast, and with i3 you can almost forget about the mouse. It's also really minimalistic, and the default settings are really good. I didn't have any urge to change anything.

And it works really great on ThinkPad laptops, which have an amazing keyboard. For home, I am using a NUC Hades Canyon with a last-gen desktop i7 processor, which I bought for $300, and you know what, it's at least 2 times as fast as a 2019 MacBook Pro 15". RIP Mac mini. All drivers installed out of the box; even wifi and an external sound card work without a hitch.

Just want to plug i3. i3 + Ubuntu has been my go-to for 7 years.


Regolith is what pulled me away from macOS. Very minimal, elegant, and super productive.

Why have I never heard of Regolith Linux? It looks like exactly what I've wanted out of a Linux distro for years, without me spending forever customizing it to my liking. Thanks for this gem.

It’s probably more desktop vs. laptop than Linux vs. macOS.

"I swear, I wish I didn't love macOS so much (or wasn't so heavily invested in it), or I'd happily ditch it for a really powerful thermally cooled desktop and use that as my machine."

I was you at this time about two years ago. Exact same thing. Fed up with Apple hardware bullshit, and with their pricing. With Apple in general. I had a sick iMac (as sick as an iMac can be, I mean). Loved and was invested in OSX. But something tipped the scales. Can't remember what specifically, but I said "fuck it" and put the machine on Craigslist. Got a buyer immediately, and lost very little money on a three year old machine. Took the cash, plus a little more, and built a PC. A liquid cooled PC. An egregious, so-ugly-it's-kinda-neat monster with tubes and a radiator and fans and the whole deal. I run Windows 10 Pro and Ubuntu. Neither is perfect for me. Windows especially can be maddening. My personal pet peeve is the lack of powerful device search -- in OSX, I could use Spotlight or better yet Alfred to look inside PDFs, for example. SOL on that in W10.

But it was worth it. My AMD-powered PC smokes anything in my Zip code, I'm pretty sure, and it's more than enough for my work needs (and my work does actually put the thing through its paces.)

I miss OSX, but not enough to go back.

A Spotlight-like feature is coming in the next update to Windows 10, I believe. Totally agree on PDFs though; Windows doesn’t even have a solution as decent as Preview on macOS. The default for PDFs is Internet Explorer, for God’s sake.

And curious, do you dual boot Ubuntu, or do you use WSL? My biggest gripe with Windows is the insufferable terminal. But WSL fixes that, and WSL 2.0 is going to have full platform Docker support as well.

Now that Apple insists on proprietary chips, has horrible thermal throttling, got rid of the MagSafe charger, and regressed on keyboard experience, I can’t say I’ll buy another MacBook if my current one dies.

I didn't know about that upcoming feature. I'm really excited to hear that, so thanks. The PDF thing is a wild oversight, too, but I found SumatraPDF and am really impressed by its light weight and speed and bare-bones ethos. There's barely a UI but man can it handle 53 open PDFs at once.

I have both WSL and dual boot, but I dual boot way more. This is likely because I am a command line idiot, and never got fully fluent with navigating my computing life using one. It's high on my list of skills to master in life, because I know what a force multiplier it can be, but I haven't gotten around to it. I mostly use Linux to write, weirdly enough: from org mode to LaTeX to statistics coding, it's a smooth experience - again because it's pretty "just the basics" and does them well. I have no actual need for Linux - I'd be fine with just windows. I guess it's more aspirational on my part.

Apple's moves: yeah man, they're taking that walled garden shit to serious extremes. I know there's a logic to it that works for them, so I don't begrudge them their choices necessarily. But I did really like the company for a long time, so it's a bummer to see them the way they are now.

Neither, looking into it now - here's a link i'm also about to read:


That Spotlight feature isn't shipping with Windows 10. It is included with the optional PowerToys software.


> My personal pet peeve is the lack of powerful device search -- in OSX, I could use Spotlight or better yet Alfred to look inside PDFs

X1 Search [0] is lightning-fast, results-as-you-type, and searches inside every file and in all of your email and attachments. Not just what's open in your mail client but also email archive files. I have email archives back to 2000, and the lookup is still instant.

X1 is actually one of the two things that had me switch back to Windows the two times I've gone all in on a switch to Mac. (Ironically back then the motivation was Apple's hardware was much better.)

[0] https://www.x1.com/products/x1-search/

> My personal pet peeve is the lack of powerful device search

Not content search, but for general file search, Everything[0] is indispensable. I have it bound to Win-Shift-F.

[0] https://www.voidtools.com/

Everything is insanely fast. Even though I have X1 Search and use that for content search (I commented on GP about that), I still use Everything for any search on filenames or directories because it opens instantly. I also love that typing slashes at the beginning or end of a phrase filters a search to a directory. It even supports Regex.


Oh yeah for sure, I found everything pretty quickly and it's definitely indispensable. I'd love if it did content search, but file search is great and fast and straightforward.

Same here, AMD Ryzen PC. Not liquid cooled, but it is a mini-itx cube which is neat.

I have a mid-2012 MB Air that I still love. The screen isn't nearly as nice, but I use it over the MBP because the keyboard isn't like typing on cement and I can actually use it on my lap without feeling like it's actively trying to burn my balls off. In all fairness though, I have an XPS 9560 and a 9360, and both overheat and throttle like crazy and require ThrottleStop. Lone cowboy admin for a small company, so I've got a pile of laptops.

> My personal pet peeve is the lack of powerful device search

I'm legitimately curious. I've used OS X a bit, but am primarily a windows/linux user.

In 20+ years of computer use, I've never wanted this facility. I actually took the option of removing the search indexing system from Windows 7 (where you still could), and just used Everything's filename search. On W10, I deliberately disable/break Cortana/search so it doesn't run all the time.

Out of curiosity, what do you use a file-aware search facility for?

Great question. The answer may boil down to the way my memory works. I work in research, so I'm constantly reading and citing papers and studies for lit reviews, general understanding, making sure no one else has done the project I just thought up in the shower (usually they have), and mainly just staying at the crest of the wave in my field and subfields.

With the exception of the big famous names (famous for the 13 of us in our niche, anyway), I rarely remember those studies' authors, nor the titles of the papers most of the time - i.e. the data encoded in the filename. But I do remember certain phrases, numbers, and the like that they use in the body of the paper - in other words, the material that actually interests me. File content search for PDFs and other text allows me to enter one of these snippets and find the paper in question without having to spend minutes upon minutes scratching my head about "who was that lady at NYU ... or was it a guy at Berkeley"?
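On Linux (and macOS via Homebrew), one stopgap for this kind of workflow is pdfgrep, which searches the extracted text of PDFs directly (directory and phrase here are made up for the example):

```shell
# Search every PDF under ~/papers for a half-remembered phrase,
# case-insensitively, printing matching files and context
pdfgrep --recursive --ignore-case "stochastic dominance" ~/papers/
```

It's slower than an indexed search like Spotlight since it scans on demand, but for a few hundred papers it's usually fast enough to beat head-scratching.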

That's odd. I was just searching a directory of PDFs for a title and I was getting irritated that it was returning results from the file content. Are you sure this doesn't work for you? I am using the insiders build so that may be the difference. Also, it's Edge that windows uses for its default PDF reader and it's actually quite good. There are a few UX quirks, but it's very performant.

If you want to search inside PDF files, install Adobe's PDF reader. Windows Index Search doesn't include a PDF filter by default; you need to install one.

What about Hackintosh? Seems like you have the specs.

Yeah that's a great point. I thought about it initially, but I got a little nervous (perhaps unjustifiably) about the Hackintosh universe being a little slapdash and then about Apple maybe issuing some under the radar OS update that bricks my machine. I'm catastrophizing, maybe, but I never pulled the trigger. That said, I never actually put in the 3-4hrs of reading I'd need to do, so I definitely wouldn't rule it out.

Do you have experience with doing the Hackintosh thing at all?

I did the Hackintosh thing for a friend on one of those 10” HP mini-laptops 5 years ago. It was delightful when it worked, but every macOS update (or OS X, back then) was a toothache, usually culminating in 2-7 hrs of googling and re-configuring. It wouldn't have been so bad if the machine hadn't been this person's main machine, or if I had waited longer before updating (so known procedures to make things work were already available, rather than still being worked out by the community).

Might be a very different experience on a desktop, but definitely read up on the update experience and time-cost if you go this route.

I am currently in the middle of my own Hackintosh build on a fairly compatible laptop. If you are not prepared to blindly execute commands and run applications listed in a few different guides and then hope for the best, and you wish to grok what you're doing to your computer (so that you could, say, debug the inevitable problems), I'm sorry to report that you are looking at many, many more than 3-4 hours of reading. Let me just say here that I have many years of experience in helpful fields (software, hardware, firmware, -nix), and I don't hesitate to say the process of building a Hackintosh is difficult and involved. That is, unless you intend to buy specific compatible desktop hardware and then use specific software tools to do the install.

For example, I am installing on a laptop using the new bootloader OpenCore (versus the longtime default, Clover), and I do not already have a Mac or Windows system handy, so I'm doing the install from Linux. This makes everything more complicated, but it is probably closer to the "average" use case for most users than building a desktop Hackintosh with Clover.

That being said, the good thing is that the situation is improving: the documentation is being constantly updated and consolidated (which can be its own evil, since there is frankly too much documentation out there, most of it outdated), the tools are getting easier to use and performing their functions in less hacky ways, and the community of Hackintosh builders is growing.

But just be advised that the vast majority of Hackintosh users really have no idea what they've done to their systems beyond being able to regurgitate the instructions they followed in whatever guide they used. And so most of the posts and replies on the forums and subreddit will not be helpful for solving any of the inevitable issues you'll run into. Probably 95% of thread replies are other users flailing around with their own similar-sounding problems, suggesting essentially random switches to flip in the configuration files (further complicated by completely new issues introduced between version updates, as the sibling comment mentions). This is problematic because in actuality everyone is using completely different hardware, so none of the ubiquitous suggestions of "You need to enable this setting since it worked for me" are applicable. Successfully building a Hackintosh essentially comes down to loading the proper firmware settings and hardware drivers which just so happen to work for your particular set of devices.

So just go into it with eyes wide open to the fact that this is a large community standing firmly on the shoulders of a very few giants, and be mindful that you can physically damage your machine if you take the wrong suggestion from a random forum user for a problem you're having. The most helpful external (non-Hackintosh) documentation I've often referred to during this process has been the current UEFI and ACPI specifications, just to give you a heads up on something useful to have handy. Good luck!

This is really helpful, thank you. What you say puts some form and empirical evidence to my concerns about Hackintosh. I guess my gut had it right this time, which is unusual. So you can brick your fancy new homemade, warranty-less PC!

Anyway maybe one day I'll do it for fun on a crappy laptop I get off Craigslist. Sounds like this pays off most when it's a low-risk effort.

That's probably for the best. I bought a laptop with a known-compatible processor, and I have enough confidence from past hardware experience that I'm not too worried about frying my machine. For someone technical who knows in advance about possible hardware damage, I'd just say that while damage is possible, the danger lies mostly in "patching" the ACPI configuration files that define to OSX how it should interface with your processor. If you aren't careful, you could be telling OSX to send voltage down a line that shouldn't have voltage on it. You may not smell your mistake, but you won't be using that CPU ever again. And of course, since fitting Apple's device drivers to your particular hardware is essentially pounding a square peg into a round hole, the possibility for damage is there, too. That all being said, if you enjoy a technical challenge and learning a lot about how OSX works, it's a great opportunity to work up a sweat, with relatively little risk to your hardware if you approach the problem the right way, prepared to grok what the guides and documentation are really saying.

I prefer Apple just like you.

With that said, I have been noticing more and more people building these insanely powerful desktop machines (a lot of times for less than a MacBook Pro) to keep at home.

Here is where it gets interesting: the same group of people then start walking around with a burner Chromebook bought on the cheap, carrying nothing more than a shell with SSH, or simply remote into their desktop via Apache Guacamole.

Technology is pretty neat.

As others have mentioned here, if you are willing to meet the OS half way, then Linux is the way to go. With enough customization, you won't want to go back to any other OS. I use XMonad + tmux + vim, so everything is fast, keyboard driven, and minimal, and working with a mouse pointer to arrange windows is now arcane and clunky to me.

Equivalent setups with Windows and Mac are sub-optimal, since there is only so much you can hack the window manager to do what you want. But you do need to go through some legwork and a learning curve to get Linux working for you.

I jumped to Linux in October when I built my new desktop... so many issues... After so much time battling with MacOS and Linux Desktop on different issues... was actually surprised how much I like WSL2 in Windows of all things. Only been using it since early March, had to jump back to windows for a project.

Definitely a better experience than I remember a year and a half or so ago. The new MS terminal works pretty well, and the Docker WSL2 support is very seamless. VS Code with WSL extensions works great. I spend most of my time in that space and have had so few issues.

Note: editing \\wsl$ files from Windows is a little slow, and the same goes for WSL editing mounted Windows drives... but working inside the sandbox has been really great.

Yeah I hear you. It took me over a decade to stop bouncing off of Linux and going back to Windows or Mac. What helped was learning more about Linux's subsystems, and building a stable config that I checked into git. As mentioned, you really have to meet it half way, but once you get over that hump it is definitely more productive and simpler than Mac and Windows.

> or I'd happily ditch it for a really powerful thermally cooled desktop and use that as my machine

I returned my 2018 Mac Mini because I was frustrated with all the constraints:

- eGPU required a PC-like external chassis, totally defeating the point of the Mini.

- Want to upgrade the storage? You need an external drive chassis and a free TB3 port.

- Only 2 type-A USB ports

I got fed up with it and returned the Mini.

So, I ended up building a really nice mini-ITX Hackintosh for the same price as what I paid for the Mini. It's got a couple of NVMe sticks in it and a 10 TB HDD. The whole thing is about the same size as an eGPU chassis alone. It's quiet, it stays cool, and it's relatively rock-solid.

That was almost 18 months ago and I still don't regret it a bit.

It's the "Mac" for me.

What's the state of hackintoshes currently? Maybe that would be a good compromise?

They’re fantastic on desktop and have been for more than a decade, once you get past a setup process which can be quite painful (although isn’t always).

But laptops are a lot trickier. You need drivers (or patches) for a lot of extra hardware (your trackpad, your battery-level-reader, the screen brightness controller, etc); you need sleep and cpu power management to work; you usually can’t just swap out the wifi card with a Mac-compatible one, etc.

It can absolutely be done, but you need to do a lot of research on compatibility beforehand. And as a result, you may discover your options aren’t really all that much better than they were in Real Mac land.

I would go for it (ie start doing research) only if there’s something specific you really want in a laptop that Apple simply doesn’t offer. A touch screen, for instance.

It's not necessarily finding drivers that's the problem so much as correctly patching the ACPI tables so drivers can find stuff, especially on laptops, excluding some stuff like wireless cards which often have sketchy to no driver support. Options are certainly better than in real mac land, but be prepared to spend at least a week working through every device one-by-one. Once it's working, though, it's stable.

It's not just finding compatible wireless that's a problem! You can also mostly eliminate:

• Any laptop with an nVidia GPU

• Any laptop that uses switchable graphics (unless you're okay with terrible battery life from the GPU being always on)

• Any AMD laptop (because even with custom cpu patches, the integrated graphics won't work).

That's a lot of laptops, particularly in the type of segments people would likely be most interested in, since Apple doesn't make them. Combined with the aforementioned wifi compatibility problems, you really need to do your research first!

Apart from being very useful, it's also very easy to set up.


Genuinely really tempted to try this. Should fix any of the issues with AMD compatibilities as well...

I'm currently installing it and it really was as easy as running the two scripts. I haven't gotten to GPU passthrough yet, though.

Hmm, what's performance like on that? It feels like running under QEMU would be pretty slow.

It’s a type-1 hypervisor, so performance isn't bad. But I wouldn’t do it on a laptop; you need to pass through a GPU for it to work acceptably.

I'm going to try this on my Linux desktop right now. It might prove to be a good way to run Photoshop on Linux, though I'm not too hopeful...

Thanks for the tip!

If you just want to run Photoshop etc, you’ll have a much easier time with a Windows VM!

If you need to run, say, Sketch or Omnigraffle, then it would make more sense!

I tried a Windows VM but it was unworkably slow :/ I've been dual-booting.

Are you using KVM, and are you using GPU passthrough?

Those two things together are what's able to provide close-to-native performance.

I'm using VMware, I'm not sure if that uses KVM. It should have GPU passthrough though, yes.

Nope, VMWare is a type-2 hypervisor, not a type-1 like KVM. (Unless you're using ESXI, but you would know if you were.)

Mind, VMWare + GPU passthrough really should have more than acceptable performance, so I'm surprised.

It's not that it's abysmal, it's just annoying enough to prevent me from using it. I stopped using it a while ago, I possibly had issues with my Wacom tablet as well, but I don't really recall.

Do you know a type-1 hypervisor I could use on Linux? Will QEMU do? Thanks for the info!

On Ubuntu ish systems (not sure about base debians), you can get full KVM support in qemu by installing the "qemu-kvm" package right out the repos and then starting your VMs with the "-enable-kvm" switch. You may also need to get EFI working in qemu to support OSX EFI bootloader, which you can get by installing the "ovmf" package right out the repos, and then adding the "-bios OVMF.fd" switch to your qemu command (OVMF.fd being the firmware file used by qemu to do EFI, which the "ovmf" package seems to install near /usr/share/qemu/OVMF.fd). I'm not sure about GPU passthrough, but with full KVM support enabled, you will immediately notice a pretty huge difference in the performance of the VM.
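For example, the pieces described above might come together into an invocation like the following sketch. The disk image name, RAM size, and exact OVMF.fd path are placeholders/assumptions, so adjust them to your system:

```python
# Assemble the qemu command line described above (a sketch only).
cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",                         # full KVM acceleration (qemu-kvm package)
    "-m", "4096",                          # guest RAM in MiB
    "-bios", "/usr/share/qemu/OVMF.fd",    # EFI firmware from the ovmf package
    "-drive", "file=macos.img,format=raw", # placeholder disk image name
]
print(" ".join(cmd))
```

Running the printed command in a shell (with qemu-kvm and ovmf installed) should boot the image under KVM with EFI firmware; GPU passthrough is a separate, more involved step.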

Thanks, I think I did enable KVM in QEMU and OS X is pretty snappy. There's a bug where the mouse can't tell where the screen bounds are, and stops in the middle of the screen, but that's unrelated. Thanks for the tip!

Qemu can do it, but you explicitly need to configure it to use kvm. And then you need to set it up to use GPU passthrough.

It's a somewhat involved process, especially for GPU passthrough.

I'm not sure about on Linux, but the GPU passthrough on Macos (Fusion) is not like KVM passthrough. When I looked into it I found out it's for one specific use case. Won't increase performance for general apps like Photoshop.

Yeah, macOS doesn't really have GPU passthrough.

But on Linux, you can do GPU passthrough on Linux for macOS (and Windows) guests.

In particular, since no VM has graphics acceleration for macOS guests, and because macOS relies very heavily on graphics acceleration, GPU passthrough is basically the only way to comfortably use macOS inside of a VM.

They keep getting better, but you always have the fear of one software update bricking your device because some engineer at Apple woke up on the wrong side of the bed one day and put something in the bootloader that only works on Apple hardware.

It would work if I kept this laptop as my backup (can't stop working for a day or two while I fix all that stuff).

I said this repeatedly in a Hackintosh thread on HN last week: I have never had a point release break Hackintosh. Especially these days with bootloader kext injection, it’s really quite rare. Whole version upgrades are another story, but you shouldn’t just install those on a whim anyway.

Also, if you’re on Hackintosh, Apple isn’t touching your bootloader!

I did something similar, but just got a Mac Mini which is rather old but quite powerful, has a lot of RAM, and is more enjoyable to work on with the setup I have (keyboard, mouse, monitor).

I still use MBP for travelling but not when at home.

How much is an Apple motherboard design issue vs. a USB C issue?

W/ the new ARM news coming out, I kinda wonder if the Intel line is on life support and Mac engineering attention is all-in on the new ARM macs?

After using Macs for years I switched to Ubuntu about 3 years ago and am completely used to it now. I don't miss Mac OS at all, except for Adobe software support :(

Can you link to the post?

We've merged those comments into this thread.

Sorry this is the post I'm talking about. I first saw it yesterday and then tried to change up my configuration of USB C cables hooked up and it's been better since.

I have 3 cables hooked up. First is for the apple TB3 to TB2 adapter so I can reuse my TB2 hub. Second is a USB C to DisplayPort cable for my second 4k monitor (cuz I couldn't run both 4Ks off the hub), and the third is power.

> unparalleled retina support on macOS

What are you referring to? Windows has hires, wide gamut support.

If you are talking about the icon mess that Windows is nowadays, then I agree. But I hardly care about icons in the operating system.

If you want a powerful system, an 8-core 16-thread Zen2 Ryzen with an RTX 2080 Super is a beast for the same kind of prices you get a Macbook Pro.

I have both a MBP and a Windows workstation hooked up to a 4k monitor, and the difference in high-DPI support is night and day.

macOS and mac apps support high-DPI essentially flawlessly. On Windows even system dialogs have blurry text, as do many third-party apps (such as Mathematica until the very latest release).

Apps need to be updated to enable proper hi-res, yes, but that is not Windows' fault if Wolfram does not do so.

The problem is that Windows has way better backwards compatibility, while Apple routinely kills old tech. So app developers have to keep up, which is good for the user.

On the other hand, of course, you can still run very old apps in Windows, while Apple does not even support 32-bit, modern OpenGL or Vulkan.

That doesn't explain the system dialogs. Microsoft's development style is to just accrete more stuff; that's how you get two Settings dialogs for example.

We were talking about third-party apps, not Windows' own apps (which are indeed a mess).

A handful of downvotes yet no comments except @millstone's (thanks!).


Text in Windows also looks like garbage. Code is easier to read on MacOS. Even a non-retina Mac looks better than a high DPI Windows. Not sure if it's the system font choice, anti-aliasing, font weight...

I also think that Apple's decision to only support 1x or 2x scaling was the right choice. It's the wild west on Windows when it comes to high DPI: half the UI is scaled up to "retina" and the other half is tiny boxes or text. On macOS, by contrast, objects are always the "correct" size relative to the other objects around them.
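The integer-vs-fractional scaling point can be seen with a little arithmetic (the sizes here are arbitrary examples):

```python
def scale_px(logical_px: int, factor: float) -> float:
    """Map a logical pixel size to physical pixels at a given scale factor."""
    return logical_px * factor

# At 2x (Apple's approach), every logical pixel maps to a whole number of
# physical pixels, so lines and glyph edges stay crisp:
print(scale_px(13, 2.0))   # 26.0 physical px -- exact

# At a fractional factor like 1.5x (common on Windows), many sizes land on
# half pixels, which forces resampling/blurring or inconsistent rounding:
print(scale_px(13, 1.5))   # 19.5 physical px -- must be rounded or blended
```

That rounding ambiguity is one reason a UI where some elements are scaled and others aren't ends up looking inconsistent.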

I use both macOS and Windows, and text looks fine to me in both. I know I am sensitive to this because I went crazy trying to get it working a few years ago on Linux, and once while fighting with ClearType. But in Windows 10 out of the box, I've never had an issue.

The problem you describe is probably old applications which haven’t been properly updated to support hi-res. If they do custom drawing or controls or frameworks, Windows cannot do anything to fix it.

ClearType is strange. I've gotten used to the way that my Linux system renders text, which is part of the reason why I like it so much. Using small fonts on macOS is always super blurry, while small fonts on Windows look like bitmap fonts and have clear subpixel hinting artifacts. On a HiDPI screen, they all look pretty similar.

Windows may support hires, but doesn't have high quality icons, and most of the apps are a bit rough. Doubly so if you drag a window from one monitor to another.

I forget when I last looked at icons on Windows. I just press the Win key and start typing the name of the application, and it shows up in a list. In most cases it takes only one or two letters, as I guess Windows remembers your usage stats.

I do have very few icons in a normally hidden taskbar but as I set their size to tiny they have no particular look at all. I just distinguish them by color pattern.

Not sure what problems you have with dragging. I have 2 32" 4K monitors hooked up (one is vertical position) and do not experience any particular problems.

> 2 32" 4K monitors hooked up

Two identical monitors is the happy path. Nonidentical ones get interesting.

I have a small high-ppi laptop hooked up to a large low-ppi monitor, and that confuses a number of apps when moved to the monitor they didn't start on. Most notably Visual Studio; some widgets are the wrong size, and the text is slightly blurred due to some up/downscaling issues.

I'm talking about how the magnification for third party apps is often weird. SQL Server Management studio for me is almost always in the wrong zoom factor. RDP gets confused. Etc.

Most of these things have probably been fixed, but I'm not sure. Windows devs?

SSMS has the worst text rendering of any tool I regularly use on windows. It's not a tie. I'm not sure how this is possible, since SSMS is built on a Visual Studio shell, but I see what I see.

Its out-of-the-box configs are horrible: they use Courier New.

Although I haven't regularly used SSMS in a while (thanks to Azure Data Studio), the first thing I used to do was change the fonts to the "new" (15 year old) ClearType fonts. Like Consolas everywhere.

Then it looks way better.

It's not fixed. Many, many apps go haywire for me in HiDPI.

Right click on the icon and look at the compatibility tab.

he's talking about apps that don't behave according to windows scaling (happens) or monitor scaling I think.

I have two monitors and windows scales apps equally on both even though one is 5120x1440 and the other is 1080p. The result is that I either pick small icons on the big monitor or big icons on the small monitor.

Totally agree. I would love to spend Windows prices to get that kind of power. And it would run cool. But I just don't have the time to fiddle with Windows.

Then again, maybe it won't be so bad. Worth a try at some point just setting up a couple of my elixir and ember projects to run off WSL2.

Definitely try WSL2, it's a huge improvement over WSL1 and it's finally at the point where I can honestly say it's usable for what I need it for.

My 15" Lenovo C940 with the 4K screen has a _much_ higher pixels-per-inch resolution than the so-called "retina" screen.

And all the software "just works" at this resolution nicely.

Are you running Windows or Linux on that? I'm a MacOS user but I'd love to know if it's possible to have a good high-res-screen experience on a Linux laptop.

Windows, of course. It would probably be an awful experience on Linux. You want the "happy path" of NVIDIA Windows drivers, etc. Of course, Windows has first-class support for Linux so you can have the best of both worlds.

> https://news.ycombinator.com/item?id=22265549

> Apple says "we are listening now, and here is a new cooling design," then it comes out to be even less adequate that the old one. I can't think of anybody else capable of trolling up their customers like that.

Apple's thermal engineering is simply bad, and doing it badly is company policy.

There is no other believable explanation to me. Apple has been promising to fix their thermal design for years on end, with each year's model supposed to have better thermals than the previous one, but in reality all their designs have been consistently crappy.

The only explanation to that for me is that they took thermals as a subtle marketing feature, just like makers of laptops with crappy batteries always find ways to draw battery life from thin air.

Proper cooling requires a thick laptop. Take a look at mobile workstations or high-end gamer laptops. They are not thin, but their cooling is more adequate (although still not at tower level). There's no way to fool physics and combine good cooling, a thin design, and quiet fans; you have to compromise on something.

> Proper cooling requires a thick laptop.

That statement is only notionally correct.

All of their current models could have much better thermals without any increase in bulk if Apple actually tried to make it so. Quite a number of other makers achieve superior thermals in even thinner packages, and, more importantly, cheaper ones.

So far, none of their recent models shows a single sign of real thermal engineering being done. Their 16-inch model has a thermal solution I would only expect to see in a $300 white-label laptop. Yes, they added a few extra millimetres to the fans, but they still use the same single skinny heatpipe and tiny radiators.

And all of that is while they have access to the best parts and fabrication services on the market. If you look closely at their BOM, there are many surprisingly low-spec parts and very minimalistic, spartan design decisions.

A lot of accusations and allegations here, with zero specifics and nothing to back any of it up. If you're as knowledgeable on this topic as you allege and seem, could you give us more information?

Which "$300 white label laptop" has equivalent thermal design to the 16" MBP?

Which "other makers" actually have "superior thermals in even thinner packages" that are "cheaper"? That sounds like bullshit to me, given what I've seen and experienced in the PC market, where loud fans that run all the time are very common.

Razer Blade, Asus GX501, Asus GM501, Lenovo Y740S, Samsung Odyssey, 2020 Alienware M15, MSI GS65, 2020 Gigabyte Aero

Can confirm, work issued MBP 13" is vastly slower and hotter than my personal Aero 15x doing the same workload.

And even when the Aero is running games or training a neural net, it's only ~70°C

None of those laptops retail for less than $1500. The first one you listed is $3999.

A top-spec 16" MBP is 6,099 USD. Does it have better thermal design? Doesn't seem so. It is an entirely valid comparison.

baybal2 made two separate claims:

1: The thermal solutions in macbooks are comparable to some $300 laptops

2: Other companies do laptops that are just as thin and powerful as macbooks but have much better thermals.

The list they posted is 2, not 1.

Um, the "top spec" MBP price is completely irrelevant, since that's inflated with vast amounts of really fast SSD.

The relevant comparison is to the $2400 MBP base price.

Y740S starts at $1099.

I mean, I agree with you, but you can't look at a thick laptop and just assume it's going to be better.

Thermal zones in servers is a good example of low physical footprint and high density power consumption that cools quite well. The engineering effort being spent on a good thermal solution can cause a device that is thin to outperform a thicker device.

It's just that Apple does not seem to be spending the resources there.

I don't think the parent was saying that a thick laptop is going to be better - but rather that when you are making a device as thin as possible, and that's the metric you index on, cooling performance WILL suffer.

It's not just that they haven't put enough engineering effort in - they consciously made a design tradeoff.

I still think it can be done, whether it's sacrificing performance or rethinking the thermal design to include a smaller battery and a larger heat pipes, or simply using a lower TDP CPU..

But yeah it's all trade-offs.

Server blades are typically cooled by noisy fans spinning incredibly fast and moving large quantities of air. Not really a good comparison IMHO. A better comparison is simply with non-Apple laptops.

1U servers are LOUD though.

Only because there's no need for them not to be. It's common to retrofit slower-spinning, quieter fans into servers for home/office use.

But the same design constraints were used in a mini-ITX gaming PC on the Linus Tech Tips channel a few months ago; I can't seem to find the video now, though. The thermal performance was amazing.

My company uses Dell PowerEdge R340s for our onsite server needs, and while they're certainly louder than a modern desktop, they're pretty bearable.

For contrast, I've got a Sun Fire T2000 at home that's louder than the airliners flying overhead (my current and previous apartment both happen to be under airport flight paths, for SFO and RNO, respectively). For obvious reasons (at least until I can figure out how to get a reliable network connection to my garage) that one gets run pretty sparingly, lol

Been quite happy with my Razer Blade 15. It's not a ridiculously thick laptop, but thanks to extra-tall rubber feet, beefy fans, and a couple of other tricks it's able to run a GTX 1060 and a hexa-core CPU just fine under load.

I used to have one of those. Sure it can run all that hardware, but it sounds like a jet engine when it's running. Completely ridiculous design that would never be released by Apple.

The Microsoft Surface Books also have a 1060 inside, but they are whisper quiet.

I can hear my coworker's i7 macbooks across the room, they are hardly quiet when under moderate load.

"The only explanation to that for me is that they took thermals as a subtle marketing feature"

Yes. And it's not even a subtle marketing feature. Reviewers ogle over internal designs.

But it's also partly in their dna. Landing on the wrong side of compromise is something they've been doing for a long time even in desktops.

Hasn't the latest MacBook Pro 16 inches evolved in the cooling design? I thought it was much better.

I switched from a MacBook Pro 13" (2015) to the 16" this year. It's much noisier in day-to-day usage, which I've tracked down to high heat generated by the discrete GPU whenever it is in use (regardless of load).

Unfortunately, some conditions force the discrete GPU to activate - one of which is "being plugged into an external monitor." Even if you've only got terminals open, the GPU runs real hot with 1% load, and the fans ramp up to match. (This may only be true for some monitors - my work-provided monitor is the Apple Thunderbolt Display).

Sometimes Slack forces the discrete GPU to turn on, for example when clicking on an embedded youtube video. The discrete GPU will remain in use until Slack is restarted. Other applications sometimes behave similarly - I use https://gfx.io/ to see what applications are forcing it on.

Perhaps the cooling engineering is better, but the practical effects of it make me miss my 13" laptop.

> Unfortunately, some conditions force the discrete GPU to activate - one of which is "being plugged into an external monitor."

Had something similar with AMD (desktop) GPU some years ago. It wouldn't go to the lower power states if my desktop refresh rate was set above 119 Hz. So it would be hot and fairly loud. So I ended up using 119 Hz on the desktop and configure games to use 144 Hz.

I also noticed that the video card wouldn't decrease the RAM clock rate significantly, but when on the desktop reducing the RAM clock a lot had a very noticeable impact on heat and no measurable performance degradation. The answer I got for that was that it was tricky to dynamically scale the RAM to such a degree (IIRC I set it to half normal speed). I ended up using an overclocking tool with profiles, worked fine.

Someone posted somewhere that switching to a USB-C monitor fixed their issue (instead of a USB-C to HDMI adapter). Anyone here have experience with this?

This doesn't answer your question, but I'm using an Apple-provided USB-C to Thunderbolt 2 adapter, as it's the only input my display will accept.

It's frustrating that this problem is present with all-Apple hardware.

Same issue here moving from a 2016 13" to the new 16". It's silly that we can't just use the integrated intel GPU with an external monitor if we desire.

Ugh, thanks. I was eyeing the larger MBP, as my 2017 13" almost always runs with both cores in use and is loud; I was hoping a six-core would be quiet under the same load :/

Now that I think about it, lately it's been fine. Fans don't come on when powering an external 4k monitor under normal use anymore.

However, I'm not sure whether that's due to a recent software update or the fact I recently switched the cable I use from HDMI to DisplayPort.

This was also true for me on a 2012 MBP (on HDMI and DisplayPort iirc).

My XPS 15 had a similar issue where turning on the GPU made it heat up and become loud - actually the discrete GPU offered no performance increase because the heat made everything throttle....... sad! At least I could use an external monitor with that one though.

It's just a limitation of the laptop form factor. My desktop has a 45-watt processor, same as the i7s, and it has a huge block of metal and a 120mm fan to keep it cool quietly (though still audible under load). There's no way to fit a 45W CPU and a 45W GPU into a laptop and make that work.

I'm okay with the fans spinning up loud when I'm doing something intense! I expect that transcoding video or `make -j12` will put out a lot of heat.

I'm frustrated because the fans often spin up loud when GPU usage and CPU usage are both below 10% - when I'm literally just reading my email.

This happens with MS Edge browser too. Just browsing to Arstechnica, for some reason, turns on the discrete GPU and it will stay on until Edge is shutdown or restarted.

I went back to 13" mbpro for same reasons

In the 15-inch they undervolted the processor and got better thermals. In the 16-inch they have different fan curves and larger thermal intakes and fans.

I manually disable Intel Turbo Boost now, though. For some reason, when I start up my laptop it can reach around 90 degrees C if I have a bunch of things open (I guess it turbos while restoring the previously open apps), but that makes sense.

How did you disable Turbo? I regularly hit 90 when I start running unit tests or compiling in Xcode. I've gotten into the habit of preemptively setting the fans to max before doing anything like that now.

I use https://www.rugarciap.com/turbo-boost-switcher-for-os-x/ to disable Turbo Boost - it requires some weird permissions, but seems to work alright.
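For the curious, that tool gets its "weird permissions" because it loads a kernel extension that sets the turbo-disable bit (bit 38) of the IA32_MISC_ENABLE MSR (0x1A0). On Linux the same switch is exposed through sysfs, no kext needed. A read-only sketch that just reports the current state (the path assumes the intel_pstate driver is in use):

```shell
# Check whether Intel Turbo Boost is currently disabled on Linux.
# Read-only: inspects the intel_pstate sysfs knob without touching it.
turbo_status() {
  f=/sys/devices/system/cpu/intel_pstate/no_turbo
  if [ -r "$f" ] && [ "$(cat "$f")" = "1" ]; then
    echo "turbo disabled"
  elif [ -r "$f" ]; then
    echo "turbo enabled"
  else
    echo "no intel_pstate interface here"
  fi
}
turbo_status
```

Writing `1` to that same file as root is the actual off switch (`echo 1 > .../no_turbo`); the macOS tool flips the equivalent MSR bit instead, which is why it needs a kext.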

"it requires some weird permissions" -- https://krebsonsecurity.com/2020/04/when-in-doubt-hang-up-lo...

Corollary: when in doubt, don't.

If you want to interact with the CPU directly it needs to be a kext, which is part of the kernel.

This ability will be removed in 10.16

That's what the TURBO button on the case is for, duh


I felt so cheated when I discovered that the Turbo button was actually a way to slow things down when you turn off turbo. I spent so long as a child "running the computer responsibly so it doesn't overheat"!

I also recall the patches people had for things like Jazz Jackrabbit that ran a busy-loop some number of times in hot code because timing was execution-dependent in games like that. Faster CPUs made the game unplayable!

I absolutely loved playing Jazz Jackrabbit, wasn't expecting to see a reference to it, thanks for the nostalgia :)

Intel Extreme Tuning Utility lets you tweak almost any parameter, but it only works on Windows; if you dual boot you can use it though (I think), as it persists settings to the firmware/BIOS. For macOS there's e.g. "Volta", I think, which can help.

Apple is operating on the bleeding edge of design, that doesn’t mean their thermal engineering is bad, it’s best in class - they’re simply pushing the margins.

I wish they’d make a thicker laptop with more room inside, but that’s just not what they do.

I thought the first part of this comment was satire. I'm sorry.

Is it really surprising? Apple is all about marketing - they just want you to think you've got really fast hardware, it doesn't actually have to work properly. 99% of users aren't doing anything that actually taxes the CPU all that much so nobody is going to complain.

I would imagine the iPhone is built by a totally different team and it serves totally different markets. Doesn't really change the fact that the macbook has had consistent thermal issues release after release.

Apple aren't stuffing a third-party CPU into an inadequate cooling solution with the iPhone because there isn't one. If there was an equivalent to the i7/i9 in terms of mindshare in the mobile market, I wouldn't be surprised if Apple released a phone with a low-clocked and badly cooled one of those too.

Some releases have been worse than others, this would be my favourite example: https://www.notebookcheck.net/Apple-MacBook-Pro-15-Core-i9-s...

The response (rather effectively) refuted your claim that "Apple is all about marketing."

> Apple aren't stuffing a third party CPU into an inadequate cooling solution with the iPhone because there isn't one.

It might have escaped your notice that (1) third party ARM CPUs for mobile devices are not exactly difficult to find and (2) Apple was in fact using one before they decided to take the design in-house.

I was specifically referring to something with the same "whoa, that means it's really fast" mindshare as the i7/i9 has. Do laymen really look at phone CPU models the same way that people use them to guide purchasing decisions on laptops?

See also: Intel labeling middling 2c/4t CPUs as 'i7' a few generations ago, despite them not being at all comparable to the desktop equivalents, because they know people will buy them based off the model number and not the actual performance. (https://ark.intel.com/content/www/us/en/ark/products/95451/i...)

And NVidia giving all their mobile GPUs the same names as the desktop cards (pre 10XX gen, they're actually the same cards with slightly lower clocks now) despite them being entirely different hardware. (https://en.wikipedia.org/wiki/GeForce_900_series#Products)

Actual performance doesn't sell nearly as well as perceived performance.

(Also I guess it's all about marketing for everyone)

> Do laymen really look at phone CPU models the same way that people use them to guide purchasing decisions on laptops

I think I saw a bit of that phenomenon with Snapdragon variations and # of cores.

Does it look sleek? Shut up and take my money.

> Note that high temperature on the right side appears to be ignored by the OS. Plugging everything into the two right ports instead of the left raised the Right temperatures to over 100 degrees, without the fans coming on. No kernel_task either, but the machine becomes unusable from something throttling.

I feel like the top answer is missing the forest for the trees. If the temperature of the chassis rises past 100 degrees because a peripheral was plugged in, and degrades performance if all of the peripheral ports are in use... that's not a usable computer.

Edit: Ah, 100F is 37C, so that's not so bad.
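For what it's worth, the per-sensor readings the answer quotes can be dumped on an Intel Mac with `powermetrics` (it needs root, hence the `sudo`; the sampler name and output format are as on Intel MacBooks). A guarded sketch that degrades gracefully elsewhere:

```shell
# Dump SMC temperature and fan readings on an Intel Mac.
# Prints a note instead on non-macOS systems or without admin rights.
smc_temps() {
  if [ "$(uname)" = "Darwin" ]; then
    sudo -n powermetrics --samplers smc -n 1 2>/dev/null \
      | grep -iE 'temperature|fan' \
      || echo "needs admin rights"
  else
    echo "powermetrics is macOS-only"
  fi
}
smc_temps
```

Third-party tools like iStat Menus read the same SMC sensors, including the left/right Thunderbolt proximity ones discussed in the linked answer.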

A long time ago (roughly 2012?) I owned a Macbook Pro for a brief time.

I was running Windows 7 in Bootcamp and I wanted to set up Gentoo Linux in a virtual machine for some Linux work I needed to do.

I left the machine on my desk to compile a kernel. Basic wooden desk, nothing underneath or around it. No problems there.

Roughly 5 minutes later, the system had reached what Speccy reported to be a scorching 117 degrees Celsius! (242.6F)

I immediately shut it down and left it to cool off, then asked around on an IRC full of various flavours of IT people (programmers etc).

The horrifying answers I got were that this was INTENTIONAL and that "the system acts as a giant heat sink", which is why it didn't power off after crossing a threshold.

As far as I understand it, running it under Bootcamp also disabled any kind of thermal throttling and forced the more power-hungry "Radeon" graphics chip to be used, further adding to the problem.

I remember those macbooks got so hot that it was physically painful to touch the metal under the screen near the magsafe charging port. That was fun.

I have forever been leery of hitting the F4-F8 buttons because of the 2007 Core 2 Duo model. I used it for a couple years and remember at least 3-4 times when I decided to rest a finger up there and ended up getting scorched.

Yup I remember this. The area above the F keys would downright burn.

This is why Apple stopped describing their portables as “laptops” some time in the early 2000s. They didn’t want the legal risk of somebody burning themselves after it was implied to be safe to use on their lap.

Well, it was 117 C. If you spilled water on it, it would boil.

Hell, boiling water would cool the machine.

If you spilled water directly on the CPU, maybe.

I have some golden memories from university playing Unreal Tournament 2004 on my laptop in bootcamp, the added difficulty was that if you touch the metal between the keys it'll burn your finger tips.

It's gotta be 100°C.

The top cover part between the touch bar and the screen regularly runs so hot that touching it really hurts. Can't leave my hand on there for more than a second or two. Ran all sorts of diagnostics and according to those everything is fine. (MBP 2017)

I mean, this is in a sense a known issue: https://www.penny-arcade.com/comic/2009/01/30/the-new-hotnes...

Note that was back in 2009. Macbook thermals: not good.

Goes further back than that. I used to refer to the Powerbook G4 as a liquid cooled laptop -- where the liquid was the user's circulation.

At a certain point, Apple had to stop calling them "laptops" and start calling them "notebooks" because people kept putting them in their laps and frying their genitals.

No, it's got to be C. 100F is, at most, lukewarm.

100°C is boiling though. You would probably smell solder rosin!

Edit: Well TIL there is an awful lot of heat to get rid of.

Some low-temp solders melt as low as 140C, but typical SAC lead-free solders you'll find in laptop motherboards melt at 220C. Many laptops are fine if a processor core hits 100C.

I've got an old Thinkpad with an i7-3920mx that regularly runs at 95C when running a few VMs and is certified by Intel up to 105C. And yes, I've tried replacing the thermal paste and increasing fan speeds to try to keep temperatures down, because not all processors are successful there, but it has been running like that with no issues for years.

That's not too high, considering CPUs start to throttle at approx. 95C, and MOSFETs can go up to 115C.

MOSFETs can go way, way higher than that. There are CPUs (well, microcontrollers) that go way higher than that.


> Specified Over -55 to 225°C

> Typically, parts will operate up to 300°C for a year, with derated performance.

Whether 100°C is fine for a CPU is a question for the individual CPU.

Thank you. I used to design microprocessors (a LONG time ago) and TIL that there are CPUs that can operate at 225°C. Wow. And even limping along at 300°C? Wow wow. BTW, I can't stand the 8051 architecture, but that's really off topic. I guess if you need 225, you need that chip. Reminds me of the Henry Ford / Model T quote...

8051 can certainly service interrupts quickly and predictably, which is probably all you want if you are sticking a microcontroller inside a turbine engine or oil drill.

> and MOSFETs can go up to 115C

Mosfets have a maximum junction temperature of about 175C

Depends on the MOSFET, of course. Some MOSFETS will run up to 300°C, according to the manufacturers.


Used in e.g. turbine engines, instrumentation in oil wells or mining operations, and industrial process control. You can get microcontrollers that work in similar temperature ranges, “in any architecture as long as it’s an 8051”.

100%, I should have said "depends on MOSFET" instead of just moving the bar.

Lots of people run video cards up to 95C. Maybe not something I want sitting on my lap though.

My R9 290X stock would run up to 96C.

Boiling for water. Don’t think your motherboard is made out of that :)

I've run CPUs at 100 C. It's not like they melt at that temperature or something.

100C is where Intel typically throttles down the clock as a life-saving measure. You can run at 100C, or higher, but modern CPUs will cut down performance in an attempt to lower the temp. High temp = shorter life. Macbooks adjusted that limit upwards so the CPU has a higher threshold before the auto-throttle kicks in. More about that here: https://www.youtube.com/watch?v=947op8yKJRY
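You can actually watch that throttle engage on macOS: `pmset -g therm` prints `CPU_Speed_Limit`, and anything below 100 means the clock is being cut under thermal pressure. A guarded sketch:

```shell
# Show macOS's thermal-throttling snapshot (CPU_Speed_Limit < 100 means
# the CPU clock is currently capped). `pmset -g thermlog` streams the
# same info continuously. Prints a note on non-macOS systems.
therm_state() {
  if [ "$(uname)" = "Darwin" ]; then
    pmset -g therm
  else
    echo "pmset is macOS-only"
  fi
}
therm_state
```

No root needed, which makes it an easy first check before reaching for kexts or fan-control tools.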

What part would thermal throttle at 37C? It can't be Fahrenheit.

It's definitely C, look at the other temperatures. It starts at 25ish C, room temperature.

Let's remember that even though the CPU in a current model MacBook Pro might reach 100 degrees C, the laptop itself is only going to hit about 50 degrees. Which is still pretty hot.

Relevant recent experience: I have had a 2019 MBP with a Vega graphics card for a while. I noticed that recently it was running VERY hot all the time. I popped the bottom off (you need that damned pentalobe screwdriver), and found tons of dust jamming the fans entirely. Upon removing all the dust and cleaning out the fan ports the machine runs great again. If you're having cooling problems, start with this fix.

I have a 13" 2017 MBP.

What is annoying:

- runs extremely hot, easily

- fans are often on full force albeit nothing special is happening

- before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)

- connecting external devices usually requires adapters. With them, too, when an adapter stopped working it usually helped to connect it on the right side (reboots weren't helping)

- for some reason the adapter would work on the left side again after some time

- as soon as I connect the external screen (not even a retina one) the fans go louder

- Time Machine is a PITA. Every time it runs, the fans blare up and the machine gets hot - the system even gets significantly slower. Even if it just backs up a diff of 150MB. And it runs multiple times a day, with no option of configuration other than no auto-backup at all.

- Windows' decade-old window management via shortcuts still doesn't exist in macOS (you have to install a third-party tool for that)
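On the Time Machine point: the automatic hourly backups can at least be switched to manual with Apple's `tmutil`. A minimal cheat sheet (printed rather than executed here, since the commands need admin rights on a real Mac):

```shell
# Print the tmutil commands for taking Time Machine off its automatic
# schedule and running backups by hand instead.
tm_cheatsheet() {
  cat <<'EOF'
sudo tmutil disable          # stop the automatic hourly backups
tmutil startbackup --auto    # run one backup now, on demand
sudo tmutil enable           # restore the default automatic behaviour
EOF
}
tm_cheatsheet
```

It's all-or-nothing, as the comment says - there's no supported knob for "back up twice a day" - but manual `tmutil startbackup` at a convenient time avoids the mid-meeting fan blast.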

> before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)

To add a contrasting datapoint, my (gen 2) AirPods have been virtually flawless for a year or so with my 15" 2017 MBP.

Well, maybe it's due to mine being gen 1 AirPods?

> before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)

Did you try resetting your Bluetooth stack? https://www.macrumors.com/how-to/reset-mac-bluetooth-module/
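Short of the full module reset in that link, restarting the Bluetooth daemon is often enough - launchd respawns it immediately. A guarded sketch:

```shell
# Kick macOS's Bluetooth daemon; launchd relaunches it on its own,
# which often clears stuck pairings without a reboot.
# Degrades to a note on non-macOS systems or without admin rights.
bt_restart() {
  if [ "$(uname)" = "Darwin" ]; then
    sudo -n pkill bluetoothd 2>/dev/null || echo "needs admin rights"
  else
    echo "bluetoothd is macOS-only"
  fi
}
bt_restart
```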

I was talking to a hardware engineer a while back who was heavily criticizing USB-C because of all the ways vendors can abuse it. Turns out that the USB-C port on the Nintendo Switch doesn't comply with the USB-PD standard, which caused lots of issues for users who had third-party charging docks.[0] There were accusations that Nintendo did this intentionally to restrict third-party accessories.

After reading that, nothing I read about USB-C surprises me. Sounds like another spec for vendors to abuse and ignore.

0: https://www.nintendolife.com/news/2018/03/could_switchs_non-...

The switch has issues, but the bricking was because one of those dumb docks was putting 9 volts on a pin that's supposed to be using 2 volt signals, if I read the spec right. And the switch tolerated 5 volts there, too. Putting more than 5 volts onto the non-power pins is such an obvious violation that I can't blame the spec for that fault.

Don’t forget the mess with the cables! This has been discussed before: https://news.ycombinator.com/item?id=20443765.

USB-C is a complete mess. I can't believe Apple dropped MagSafe for this. They could've made all the I/O ports Type-C but kept MagSafe for charging, with an optional chargeable Type-C port on the right side.

Are there any good macbook laptop stands that act as a heatsink as well? My current stand has rubber grips so while there's plenty of airflow under it the heat isn't being drawn away very well. I don't want to add a fan since that would add noise.

I really like the mStand: https://www.raindesigninc.com/mstand.html

It is solid metal with lots of airflow underneath.

Amazon Basics has a similar thing that is essentially the same but looks a bit less stylish and costs half of what the mStand costs. I've been using it for almost 4 years now and it works great.

I have this one as well, it does a decent job of wicking heat too, the portion of the stand in direct contact with the laptop is noticeably warmer than the part that contacts the table

I also like this model. There's a little air gap between the computer and the stand, but being solid aluminum I figure it's going to move a lot more heat away than, say, a wooden desk surface or cloth-covered human legs.

I also rate these stands - but it has rubber grips. You're not going to get any heat transfer from laptop to stand, so it's not going to act as a heatsink.

I have the same stand and it does absorb a good amount of heat when the Mac is running at full power (the angle helps there too), albeit not a true "heat sink". The rubber has negligible displacement.

I use the 360 model, which adds a swivel option, and it's been a really useful and stable stand.

That's an interesting thought, but you'd have to have a really good seal between the two for it to work (think like processors and thermal paste).

Been a huge fan of the Roost Stand ever since I got it. There's no heat sink, but there's only four small plastic contact points between the laptop and the stand. My fans are often on when running several Docker containers, but the surface of the top bar never gets that hot.

I have a 2018 13" Macbook Pro.

(I'm in no way affiliated with Roost, I'm just a happy customer.)

Roost: https://www.amazon.com/Roost-Laptop-Stand-Adjustable-Portabl...

It'd be interesting to just try resting it on a 1/4" thick steel plate. The thermal capacity would be so high that it'd never really warm up.

That was my thought too; those thin stands may look sleek but they don't really do anything in terms of heat transfer. If anything it'd be worse, since now the machine has nothing to transfer the heat to.

Aluminium is a much better heat conductor than steel.

I feel like I've seen a lot of thick steel plates around the right size sitting around in old workshops. You could probably find something cheaply from a scrap dealer.

Thick aluminum or copper seems more rare, I don't know if it's commonly produced.

Solid aluminum is not hard to come by. You can probably just walk into your kitchen and pick up a frying pan.
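Rough numbers behind the aluminium-vs-steel point, using approximate room-temperature handbook values for bulk thermal conductivity (ballpark figures, good enough for a relative comparison):

```shell
# Back-of-envelope bulk thermal conductivities in W/(m·K).
# Values are approximate handbook figures, not measurements.
conductivity_table() {
  awk 'BEGIN {
    copper = 400; aluminium = 205; steel = 50;
    printf "copper    %d\naluminium %d\nsteel     %d\n",
           copper, aluminium, steel;
    printf "aluminium beats steel by ~%.0fx\n", aluminium / steel;
  }'
}
conductivity_table
```

So an aluminium plate (or frying pan) moves heat roughly 4x better than the same steel plate, and copper roughly 2x better again - though for a laptop stand, the contact quality matters as much as the metal.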

A large slow fan (~10in at ~10rpm) could move a lot of air across the bottom of the laptop very nearly silently.

Any Noctua brand fan will stun you with its silence. I installed four 20mm fans in a network switch to replace its screaming loud factory fans. They're all running ~1200RPM and it now sounds like a small desk fan. A larger format fan running at a few hundred RPM will likely add very little to the ambient noise around you.

Also, big fans (bigger than 140mm) usually run on noisy ball bearings (or roller bearings), while "standard" 120mm and 140mm fans often run on silent magnetic suspension.

Do you mean 100 rpm? 10 rpm isn't really going to do anything. When I last looked all of those "laptop cooler pads" use tiny fans that would either be ineffective or loud or both. I settled on the mStand mentioned above, but as noted it has rubber standoffs that prevent it from acting like a true heatsink.

I mean 10rpm, really! A 10" fan at 10rpm moves about as much air as a 2.5" fan at 160rpm.

You're not trying to move huge amounts of air, since heat transfer from the laptop to the air is pretty slow. You just need enough movement to clear out any hot air that is building up under the laptop.
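Those numbers can be sanity-checked with the usual fan-law approximation - volumetric flow scales roughly as rpm × d³ for geometrically similar fans - under which the big slow fan actually comes out a few times ahead:

```shell
# Idealized fan-law estimate: flow Q ∝ rpm × d^3 for similar fans.
# Arbitrary units; only the ratio between the two fans matters.
fan_flow_compare() {
  awk 'BEGIN {
    big   = 10  * 10^3;   # 10in fan at 10 rpm
    small = 160 * 2.5^3;  # 2.5in fan at 160 rpm
    printf "10in@10rpm: %d units, 2.5in@160rpm: %d units, ratio %.1f\n",
           big, small, big / small;
  }'
}
fan_flow_compare
```

Either way the point stands: even a whisper of air movement is a big step up from stagnant air trapped under the chassis.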

> You just need enough movement to clear out any hot air that is building up under the laptop.

On a nearly totally different tangent, our own bodies build up hot pockets of air indoors as well, where there's no natural breeze to get rid of it. Getting rid of it creates a surprisingly strong cooling effect where you probably won't need AC for a while longer than you expect - and "air circulator" fans are pretty good at doing it over a whole room, so you don't need to keep a fan directed at yourself.

What fan can even run at 10 rpm without stalling? I don’t think most will even go down to 100.

Nope, even 10 rpm is a big change. Going from standing air to any movement is a remarkable change in thermal conductivity.

I find that the TwelveSouth Curve does a much better job at taking heat away than the Rain mStand - I'd guess this is because of airflow.

I use the Svalt stand that has a fan in it. Works really well.

jesus h christ, it's EXPENSIVE for a simple stand.

Even as someone who has owned the Rain Design stands, I thought exactly the same. $160 for a laptop stand?!?

Then I realized, that was "without fans".

$260 for a laptop stand.

basically the ipad pro magic keyboard :P

How noisy is the fan on the svalt?
