This post on HN yesterday solved the problems I've been having for the last 5-7 days. I use a TB 2 hub hooked up to a 2018 MBP, driving two 4K monitors, a USB mic, etc.
And I'd noticed the hub got really unstable whenever the CPU fans went wild. Looks like the controller was overheating due to the shitty thermals that Jony Ive's Apple seems to keep pushing out. (Still the Apple of today.)
Now that I've switched my ports to a different config, I've had no crashes in the last two days.
I swear, I wish I didn't love macOS so much (or wasn't so heavily invested in it), or I'd happily ditch it for a really powerful, properly cooled desktop and use that as my main machine. WSL makes this more palatable, but the unparalleled Retina support on macOS, my 15 years of using it, and just habits built up keep me from leaving. (I felt the same way when I first moved from Windows to macOS, but that was in my early 20s and I had lots of time to play with the OS.)
YMMV, but switching from laptop OS X to desktop Linux has been a game changer for me. Far cheaper, and the power/UX of a mid-range 2020 desktop blows my 2019 MacBook Pro completely out of the water for my use case. Code is a joy again. When I do use my MacBook on the go, I generally use VS Code remotely connected to my desktop because it's so much faster.
I also use Regolith Linux, which is a noob-friendly tiling window manager version of Ubuntu, and it feels so slick with multiple monitors.
I switched from a macOS laptop to a Linux laptop and it’s been completely the opposite for me. I’ve been spending so much time fighting the OS, I’m actually considering buying a Mac again even though the laptop is just a few months old.
You can call me too incompetent or whatever but I’ve been running into stupidly obscure bugs that even stumped some of my Linux guru friends.
Just as one example of many, when the device is connected to a Thunderbolt display the Intel wifi driver crashes and restarts periodically which freezes USB input for about 15 seconds every time. This issue persisted across different Linux distributions, kernels and firmware versions.
Don't ask how long it took to figure this out. I have now connected a USB wifi dongle to the display's USB hub.
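For anyone chasing something similar: the crash/restart cycle usually shows up plainly in the kernel log. A sketch -- the iwlwifi driver name and the error string are assumptions based on the common Intel wifi firmware-crash message, so adjust the grep to whatever your log actually says:

```shell
# Count firmware crashes in a captured kernel log. In real life you'd
# pipe `dmesg` or `journalctl -k`; sample lines stand in here so the
# snippet is self-contained.
log='iwlwifi 0000:00:14.3: Microcode SW error detected. Restarting 0x2000000.
usb 1-2: new high-speed USB device number 4
iwlwifi 0000:00:14.3: Microcode SW error detected. Restarting 0x2000000.'
crashes=$(printf '%s\n' "$log" | grep -c 'Microcode SW error')
echo "firmware crashes: $crashes"
```

Correlating the timestamps of those resets with plugging/unplugging the display is what eventually pins the blame on the display, not the wifi hardware itself.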
I really miss the plug-and-play nature of macOS. I think Linux has its advantages and it might be better on desktops, but it's just been horrible for me on a laptop. I might have to try a MacBook plus a fast Linux desktop next.
> You can call me too incompetent or whatever but I’ve been running into stupidly obscure bugs that even stumped some of my Linux guru friends.
No, you're not incompetent. As usual, manufacturers are putting weird features into their laptops that the Linux drivers and userspace can't keep up with -- e.g. the debacle that is NVIDIA's PRIME GPU switching tech for low-power vs. high-performance modes. It simply doesn't work most of the time, leaving you scratching your head. UEFI-related woes are also common as we finally deal with having to give up on a decent experience with legacy boot.
At this point the wise person buys a laptop with official Linux support out of the box. It helps you and it helps the community (vote with your dollars!).
Yes, weird features such as "802.11n wifi" and "suspending the system when the lid is closed". I had a well-supported older ThinkPad (it even supports open-source BIOSes), and issues still cropped up in those areas.
For sure, but usually this comes down to "manufacturer picked a slightly cheaper wifi chip with no linux support", "manufacturer is doing weird shit with ACPI they shouldn't", etc.
It was a Dell XPS 13 Developer Edition with Ubuntu, far and away the most recommended distro/hardware combination. Everything else seems to be less supported.
If you don't know how to dig into terminal commands and google stuff on your phone to troubleshoot, you shouldn't even attempt Linux on a desktop.
Oh yeah, Dell XPS Dev edition is usually a decent choice, although you're right that not everything works perfectly (at least it didn't a few years ago when I bought one). It is a little annoying to have bugs in a factory-installed OS, but I'm glad to at least have the option.
I actually prefer Thinkpads these days because everything "just works" when I install Fedora on it.
If you're an Ubuntu person and want it from the factory, I have heard that System76 build quality has gotten pretty good.
"Ubuntu Certified hardware has passed our extensive testing and review process to make sure Ubuntu runs well out of the box and it is ready for your business. We work closely with OEMs to jointly make Ubuntu available on a wide range of devices."
So in this case it is a collaboration between Ubuntu and OEMs
Only thing you need to look out for when buying components is what the Linux support looks like. I've built 3 desktops in the last couple of years and all of them work out of the box. It really doesn't "require" forum diving.
A Dell XPS 13 Developer Edition with Ubuntu preinstalled hasn't worked "out of the box" for simple multi-monitor use cases for me. Invariably something requires an update for support, the update invalidates some assumption, and then whoops, the only guide that describes your problem has a solution that involves pulling and compiling X.org in a terminal and dealing with tarballs.
And someone can go gosh, you must be doing it wrong, and they're almost certainly correct! However I'm a pretty big power user and can actually get things to work and dig into forums, so I realize that the average user has absolutely no chance.
Linux laptops are a mixed bag, to say the least. One-off "precision tuned" (vs. commodity) hardware is often to blame. In general, if you want a stable Linux laptop experience, you should be buying old. It's not very attractive, but it is sustainable -- which is its own kind of attractive! :)
In general, my experience using Debian-based distros as a laptop daily driver has been more or less frictionless. I use an older Lenovo ThinkPad (X1 Carbon, 1st generation, purchased for like $200 secondhand a year and a half ago).
In general, ThinkPads have a reputation for a plug-and-play Linux experience once you boot for the first time. I'd recommend giving them a try before throwing in the towel, if you have any patience in reserve.
You can get pre-owned (older = more stable Linux support) T-series ThinkPads with very nice specs, especially if you're willing to trade off display resolution. Plus the parts that die (batteries, RAM) are all commodity and replaceable; you could presumably run with the same laptop for a decade, if you're into that sort of thing.
Hi, I have a ThinkPad L450, and when I was looking at switching my laptop to Ubuntu, I saw that it had worse battery life than Windows. I don't remember by how much, maybe 10-15%? Do you have any experience with that? Battery life is pretty important for when I take my laptop to courses (not anymore, eh).
I’d say give it a try. My experience has always diverged from the stated norms on battery life in all devices and contexts, but maybe my usage patterns are abnormal.
If you're using simpler programs (Firefox, terminal), your battery life should be great. I usually get four hours of heavy vim use with WiFi etc. all on, on a battery that's at 75% capacity. YMMV.
It's not at MBP level, but I imagine it's fairly similar to what Windows would draw. Maybe better.
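If the remaining gap matters, most of it can usually be clawed back with the standard power-management tooling -- TLP and powertop are my suggestion, not something the parent mentioned. A sketch assuming a Debian/Ubuntu system:

```shell
# TLP applies sensible power-saving defaults as a background daemon;
# powertop --auto-tune flips its suggested tunables once (not persistent
# across reboots, so TLP is the set-and-forget option of the two)
sudo apt install tlp powertop
sudo systemctl enable --now tlp
sudo powertop --auto-tune
```

Running `sudo powertop` interactively first is worth it, since it shows which devices are actually drawing power before you start tuning.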
It’s a new X390 Yoga though. Maybe they’ve dropped the ball a little, it definitely matches my friends’ experience that the more traditional models are pretty stable.
Sorry to hear this. Yeah, it’s probably a matter of getting an older one. The newer ones are trying to compete with surfaces and mbp, and it’s been painful to watch :/
What laptop are you using? I think that would be helpful context. I've been running Ubuntu 16.04 on my Dell XPS 15 9560 with almost no issues (and none that aren't easily resolved with a shell script or a keyboard shortcut) for almost three years now. I get the sense that the more popular the hardware, the fewer issues you'll have running Linux on it. I certainly wouldn't call you "too incompetent" -- but I might call you just plain unlucky.
If you want to go back to macOS because you don't have to be "lucky" to get a laptop that plays nice, I don't blame you. But for me, the tradeoffs to stay on Linux have been minimal and absolutely worth it.
I recently got a new laptop and bought it from one of those linux specialist places after having a pretty terrible compatibility experience with my previous high end HP laptop.
As an intermediate option, install Windows on the Insiders channel, though the 20H1/2004 update should be out soon for GA.
I had to jump into Windows for a few things last month, and WSL2 has become pretty good and bearable; the Docker beta support for WSL2 is also really good, though it seems to use a bit more memory than I recall. Linux CLI with a Windows GUI has been surprisingly bearable, and the Remote (WSL/SSH) extensions for VS Code are invaluable as well.
Far less obscure than the issue you cite but I have found keyboard shortcuts across the OS and third party apps on Linux are so incredibly inconsistent when compared to macOS (or Windows for that matter.) I find I'm constantly bouncing between control, alt, and super to achieve what I could do in macOS with just super.
Truth is, Linux doesn't work well on new hardware unless it's one of the specific models that advertise Linux support, like some Dell or ThinkPad machines.
Your bugs will probably be fixed in a couple of months/years.
I don't completely love any setup, but I'm starting to think that Windows 10 + WSL is the best open-source development setup. Huge variety of hardware, plus all hardware actually works right, plus pretty much any popular desktop app works reliably and has a good GUI, plus all Linux CLI tools are there and work right.
What's the fascination with laptops anyway? Are that many people really working from coffee shops? I'm pretty much always coding on the same desk at home and a mid-high end desktop is significantly cheaper than a decent laptop and much more powerful than even the really high end laptops. The ergonomics are also much better, although that can be fixed on laptops with docks and separate monitors etc too.
If you have to do a demo in a different room, it's a pain. Your best bet is usually to use another laptop.
Same thing if you want to have a call in a quiet place and want to check emails/reference during the call.
Work from home is also easier, you don't have to use 2 computers or carry your tower (though currently that's viable).
It's more about those that work from the office mostly, maybe go into meetings etc... and then home occasionally (or right now, mostly).
At work, I'm on a dock; at home, I'm on a 4K KVM switch, so I'm not really using the laptop except as the lighter computer. Just got bumped to 32GB RAM, which was the laptop's main bottleneck. My personal system (R9 3950X) is much faster than the laptop (i7-8550U), though; both are on 1TB NVMe.
I'm envious of your 3950x, but on the other hand I suspect I'd never actually max it out. I should get into video editing or something to justify more PC upgrades...
Even then... literally the only time I max it out is when I'm doing a HandBrake x265 encode on a faster preset; the slower presets only use about 75% of the CPU. If I did it again, I would probably go with a 3900X and spend the extra $250 on an RTX 2070 Super instead of the RX 5700 XT that I got.
My old computer was over 5 years old at upgrade, with a mid-cycle refresh of a couple of components; this one will likely be the same, though I don't think I will ever go RGB again.
It's the TESmart KVM switch: HDMI 4K 60Hz 4:4:4. I like it okay. It seemed about the best option on Amazon, but it has mixed reviews. My display is 60Hz and I don't play many games, so it works well enough for my use.
I did get HDMI adapters for my Pi 4s, but haven't actually tried them yet. You may need different adapters/cables if you're going to/from mini-DP or another interface; the back of the switch has USB-B female (standard USB cable, not USB 3) and HDMI female connectors.
I've only been using it for about 4 weeks, but so far working well.
I'm sheltering in place and in the past 3 hours, I've coded in like 3 areas of my house. I also need to move when my partner is working and is on a call.
Not having a dedicated work area like a home office or just a desk and office chair in a corner of the living room (this is my solution) sounds like a nightmare to me.
Having a single spot to work at, never to change location, posture, or surrounding sight sounds like a nightmare to me.
I have two sofas, a small desk and a bigger one in two different rooms, a balcony, a table in the kitchen and one in the garden. I frequently switch and move between all of those, which helps me a lot in getting out of coding slumps and refocus. I was so happy to give up my static one desk multimonitor setup at the office for a nimble 13" laptop work from home situation.
I used to think the way you do as recently as just a few years ago. I couldn't imagine not being able to pick up and move to a cafe or co-working space. I was working a lot of hours and working from home when I probably shouldn't have.
What I've found is that since having a desktop and a dedicated desk and office in my house, when I leave the room I don't bring work with me. I also don't have push notifications enabled on my phone, including email. When I go out, I enjoy other things and then when I come back to my desk, I am much more focused and ready to concentrate on work.
Yep, different people have different subjective preferences about working situations -- who woulda thunk it? I am interested to hear people's perspectives, though.
I have a desktop PC at home in a home office. My wife is right near me.
I think this is probably a problem because I also play games on it so the room sends very mixed messages to my brain.
I help with a small business so have to use my PC for that. With my day job I have a laptop provided and do move around the house to get a mental disconnect and help me focus on some tasks. The problem is it's not a great device for stuff like in-depth research, I really want a big screen for that, so usually have to use the PC and get distracted.
I have one of those. I've been remote for a couple years so none of this is new to me.
I just unplugged my laptop from my standing desk and went to lay down for my postprandial chill session. I'm going to get some work done as soon as I'm done faffing about on HN (whomst among us...) and then I'll probably plug back in again.
It's just easier to do it this way rather than synchronize state between a desktop and laptop. One less thing to own, also.
I think there's more to it than just working from a coffee shop.
Without going into people who are often on the road, many folks I know like working on the same computer at the office and at home.
It's usually easier to carry a laptop than a desktop. Even though there are many very small desktops nowadays (see HP's EliteDesk Mini -- though it looks like a laptop without a screen, so I'm not sure it's that much more powerful), the laptop usually has fewer cables to unplug, so it's generally less of a pain.
Another angle is that for many people a laptop has enough power for the activities they do and being portable is a real plus. I'm typing this on a 2013 MBP in my bed. This laptop might be slow compared to a modern mid-range desktop, but it's not tethered to a fixed spot. When I need to do serious work, I can plug a 4K screen and external keyboard and have the desktop experience.
On the rare occasion when I need a lot of power for some task, I'll usually fire up some outrageous EC2 instance for an hour or two. It will also have better network connectivity, which lets me work comfortably over my parents' DSL line too.
I guess it all comes down to usage patterns. If you always use your computer on the same desk and never have the need to move it, I guess a desktop is a more effective use of funds. But many people seem to enjoy being able to carry the computer on a sofa, in the kitchen, etc.
When I started at my current company we had desktop PCs, went home, and used a VPN to dial in.
They got rid of the desktop PCs and gave us all crappy laptops (they took a few years to catch up in power to what we had) and the argument was they could not ensure a secure environment on random home PCs. They were probably angling towards a hot-desk setup too but most people have a dedicated desk still.
I definitely preferred the old setup. I just struggle do any meaningful work at all on a laptop, I need to plug it in to a screen but I can't dedicate that space so normally just put up with it rather than unplug my home setup.
They do provide docks in the office but unfortunately a few different generations of HP laptops are around so you might need to search for the right one.
I don't think in a corporate setting fascination has anything to do with this.
I work for a global bank; we have around a quarter million employees and many of our office spaces employ hot-desking. Each one of us gets a laptop and we can work from wherever.
Especially that you're required to work from home at least one day per week.
It’s not a fascination. The issue is usually that if you have a desktop, you also need a laptop. So unless you actually need a desktop’s power, you just use a laptop with desktop ergonomics (screen, keyboard and mouse) and occasionally go portable.
Is there a distro/DE/WM that handles UI scaling gracefully? Every time I try to go back to Ubuntu with my 5K monitor I'm met with the worst UI scaling options imaginable. For example, scaling up the titlebars, but leaving everything else tiny... or scaling options being limited to 1.0 or 2.0. I still want to output to the monitor at 5K, but with the whole UI scaled up. Mac OS and Windows 10 (to a lesser extent) handle this without issue. I would love to move to Linux full time, but this has been a hurdle for me.
I wonder why this has been such an issue for Microsoft. If I remember correctly, the entire Computer Management program and all of its subcomponents have no functional UI scaling. As pretty as a Retina monitor is, I'm glad that I went with a cheaper, lower-resolution monitor with good color accuracy and reliability. With subpixel hinting, it's very sharp from my usual sitting position. With laptops, I can see the advantage of going HiDPI.
YMMV, but I had a really good experience with 2x scaling on a 4K laptop screen in standard Ubuntu and GNOME. There are still some apps that ignore DPI settings (Zoom) or need to be forced to scale (often Electron stuff, although you can usually just zoom with Ctrl + there). Otherwise, awesome sharp text and fluid.
Fractional scaling seems to be quite a mess, at least in KDE it breaks a lot of layouts.
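For what it's worth, on recent GNOME the fractional options (125%/150%/175%) are gated behind an experimental mutter flag; a sketch, with the caveat that the key name may differ by GNOME version and that the feature can leave XWayland apps blurry:

```shell
# Expose fractional scaling choices in GNOME's Display settings
# (Wayland sessions; experimental, so expect rough edges)
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```

After logging out and back in, the extra scale factors show up under Settings > Displays. This doesn't fix apps that ignore DPI entirely, but it does address the "1.0 or 2.0 only" complaint.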
For price-performance, absolute performance and acoustics laptops never made any sense at all. A 3700X annihilates the highest perf part Apple uses and can be cheaply cooled under full load without causing much noise; or without causing any noise for slightly more expense. And the 3700X is a midrange offering, not high end (unlike the Apple part).
Is there a config you would recommend? Primary use case would be for programming & browsing. Would prefer not to build as I haven't done it but am not averse to the idea.
> Far cheaper, and the power / UX of a mid-range 2020 desktop blows my 2019 Macbook Pro completely out of the water for my usecase.
Which laptop do you use? I tried to switch from my just-post-Intel MacBook Pro to the Lenovo X220 several years ago, figuring Linux support + IBM quality (they had been fairly newly bought out) would give me a solid machine. Turns out several stuck pixels on the monitor and a keyboard on which some keys didn't work were officially regarded as within acceptable quality range. (Plus, I hated the trackpad, but that's my preference rather than a hardware issue.)
Same story. I was a macOS addict for a long time, starting with the 2010 MacBook Air and 2012 MacBook Pro. They were amazing machines with a stable *nix-based OS.
But since then, Apple has focused more on phones, and the desktop OS hasn't gotten much better. The only thing they're doing is more and more cloud integration to lock you into the Apple ecosystem. I got tired of that.
Now I prefer to use Regolith Linux, because for development it's much better to have a Linux system with proper package management, without messing with brew.
The distro is insanely fast, and with i3 you can almost forget about the mouse. It's also really minimalistic, and the default settings are really good. I didn't have any urge to change anything.
And it works really great on ThinkPad laptops, which have an amazing keyboard. For home, I'm using a NUC Hades Canyon with a last-gen desktop i7 processor, which I bought for $300, and you know what, it's at least twice as fast as a 2019 MacBook Pro 15". RIP Mac mini. All drivers installed out of the box; even the wifi and an external sound card work without a hitch.
Why have I never heard of Regolith Linux, it looks like exactly what I've wanted out of a Linux distro for years without me spending forever customizing it to my liking. Thanks for this gem.
"I swear, I wish I didn't love macOS so much (or wasn't so heavily invested in it), or I'd happily ditch it for a really powerful thermally cooled desktop and use that as my machine."
I was you at this time about two years ago. Exact same thing. Fed up with Apple hardware bullshit, and with their pricing. With Apple in general. I had a sick iMac (as sick as an iMac can be, I mean). Loved and was invested in OSX. But something tipped the scales. Can't remember what specifically, but I said "fuck it" and put the machine on Craigslist. Got a buyer immediately, and lost very little money on a three year old machine. Took the cash, plus a little more, and built a PC. A liquid cooled PC. An egregious, so-ugly-it's-kinda-neat monster with tubes and a radiator and fans and the whole deal. I run Windows 10 Pro and Ubuntu. Neither is perfect for me. Windows especially can be maddening. My personal pet peeve is the lack of powerful device search -- in OSX, I could use Spotlight or better yet Alfred to look inside PDFs, for example. SOL on that in W10.
But it was worth it. My AMD-powered PC smokes anything in my Zip code, I'm pretty sure, and it's more than enough for my work needs (and my work does actually put the thing through its paces.)
A Spotlight-like feature is coming in the next update to Windows 10, I believe. Totally agree on PDFs though; Windows doesn't even have a solution as decent as Preview on macOS. The default for PDFs is Internet Explorer, for god's sake.
And I'm curious, do you dual-boot Ubuntu, or do you use WSL? My biggest gripe with Windows is the insufferable terminal. But WSL fixes that, and WSL 2 is going to have full-platform Docker support as well.
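For reference, on builds that ship WSL 2 (Insider builds now, the 2004 update at GA), converting an installed distro over is only a couple of commands -- the distro name "Ubuntu" here is just an example; use whatever `wsl -l` reports on your machine:

```shell
wsl --set-default-version 2   # make new distros default to WSL 2
wsl --set-version Ubuntu 2    # convert an existing distro in place
wsl -l -v                     # confirm which version each distro runs
```

The conversion can take a few minutes since it repacks the distro's filesystem into a VHD.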
Now that Apple insists on proprietary chips, has horrible thermal throttling, got rid of the MagSafe charger, and regressed on keyboard experience, I can’t say I’ll buy another MacBook if my current one dies.
I didn't know about that upcoming feature. I'm really excited to hear that, so thanks. The PDF thing is a wild oversight, too, but I found SumatraPDF and am really impressed by its light weight and speed and bare-bones ethos. There's barely a UI but man can it handle 53 open PDFs at once.
I have both WSL and dual boot, but I dual boot way more. This is likely because I am a command line idiot, and never got fully fluent with navigating my computing life using one. It's high on my list of skills to master in life, because I know what a force multiplier it can be, but I haven't gotten around to it. I mostly use Linux to write, weirdly enough: from org mode to LaTeX to statistics coding, it's a smooth experience - again because it's pretty "just the basics" and does them well. I have no actual need for Linux - I'd be fine with just windows. I guess it's more aspirational on my part.
Apple's moves: yeah man, they're taking that walled garden shit to serious extremes. I know there's a logic to it that works for them, so I don't begrudge them their choices necessarily. But I did really like the company for a long time, so it's a bummer to see them the way they are now.
> My personal pet peeve is the lack of powerful device search -- in OSX, I could use Spotlight or better yet Alfred to look inside PDFs
X1 Search [0] is lightning-fast, results-as-you-type, and searches inside every file and in all of your email and attachments. Not just what's open in your mail client but also email archive files. I have email archives back to 2000, and the lookup is still instant.
X1 is actually one of the two things that had me switch back to Windows the two times I've gone all in on a switch to Mac. (Ironically back then the motivation was Apple's hardware was much better.)
Everything is insanely fast. Even though I have X1 Search and use that for content search (I commented on GP about that), I still use Everything for any search on filenames or directories because it opens instantly. I also love that typing slashes at the beginning or end of a phrase filters a search to a directory. It even supports Regex.
Oh yeah for sure, I found everything pretty quickly and it's definitely indispensable. I'd love if it did content search, but file search is great and fast and straightforward.
Same here, AMD Ryzen PC. Not liquid cooled, but it is a mini-itx cube which is neat.
I have a mid-2012 MacBook Air that I still love. The screen isn't nearly as nice, but I use it over the MBP because the keyboard isn't like typing on cement and I can actually use it on my lap without feeling like it's actively trying to burn my balls off. In all fairness though, I have an XPS 9560 and a 9360, and both overheat and throttle like crazy and require ThrottleStop. I'm the lone cowboy admin for a small company, so I've got a pile of laptops.
> My personal pet peeve is the lack of powerful device search
I'm legitimately curious. I've used OS X a bit, but am primarily a windows/linux user.
In 20+ years of computer use, I've never wanted this facility. I actually took the option of removing the search indexing system from Windows 7 (where you still could), and just used Everything's filename search. On W10, I deliberately disable/break Cortana/search so it doesn't run all the time.
Out of curiosity, what do you use a file-aware search facility for?
Great question. The answer may boil down to the way my memory works. I work in research, so I'm constantly reading and citing papers and studies for lit reviews, general understanding, making sure no one else has done the project I just thought up in the shower (usually they have), and mainly just staying at the crest of the wave in my field and subfields.
With the exception of the big famous names (famous for the 13 of us in our niche, anyway), I rarely remember those studies' authors, nor the titles of the papers most of the time - i.e. the data encoded in the filename. But I do remember certain phrases, numbers, and the like that they use in the body of the paper - in other words, the material that actually interests me. File content search for PDFs and other text allows me to enter one of these snippets and find the paper in question without having to spend minutes upon minutes scratching my head about "who was that lady at NYU ... or was it a guy at Berkeley"?
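On Linux this workflow is also covered well from the command line. A sketch of content-based lookup -- in practice you'd point `pdfgrep -ril` (or `pdftotext` plus grep) at the papers directory; plain text files stand in here so the snippet is self-contained, and the filenames/phrases are made up:

```shell
# Find a paper by a remembered phrase rather than its filename.
dir=$(mktemp -d)
printf 'effect size was 0.42 in the NYU cohort\n' > "$dir/smith2018.txt"
printf 'unrelated survey methodology notes\n'     > "$dir/jones2019.txt"
match=$(grep -ril 'nyu cohort' "$dir")   # -r recurse, -i ignore case, -l names only
basename "$match"
rm -rf "$dir"
```

The point is the shape of the query: a distinctive phrase from the body text, not author or title metadata.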
That's odd. I was just searching a directory of PDFs for a title and I was getting irritated that it was returning results from the file content. Are you sure this doesn't work for you? I am using the insiders build so that may be the difference. Also, it's Edge that windows uses for its default PDF reader and it's actually quite good. There are a few UX quirks, but it's very performant.
If you want to search inside PDF files, install Adobe's PDF reader. Windows Index Search doesn't ship a PDF filter by default; you need to install one.
Yeah that's a great point. I thought about it initially, but I got a little nervous (perhaps unjustifiably) about the Hackintosh universe being a little slapdash and then about Apple maybe issuing some under the radar OS update that bricks my machine. I'm catastrophizing, maybe, but I never pulled the trigger. That said, I never actually put in the 3-4hrs of reading I'd need to do, so I definitely wouldn't rule it out.
Do you have experience with doing the Hackintosh thing at all?
I did the Hackintosh thing for a friend on one of those 10" HP mini-laptops 5 years ago. It was delightful when it worked; however, every macOS (or OS X back then) update was a toothache, usually culminating in 2-7 hours of googling/re-configuring, etc. It wouldn't have been so bad if the machine hadn't been this person's main machine, or if I had waited longer before updating (so that known procedures to make things work were available, rather than still being figured out by the community).
Might be a very different experience on a desktop, but definitely read up on the update experience and time-cost if you go this route.
I am currently in the middle of my own Hackintosh build on a fairly compatible laptop. If you are not prepared to blindly execute commands and run applications listed in a few different guides and then hope for the best, and you wish to grok what you're doing to your computer (so that you could, say, debug inevitable problems), I'm sorry to report that you are looking at many, many more than 3-4 hours of reading. Let me just say here that I have many, many years of experience in helpful fields (software, hardware, firmware, -nix), and I don't hesitate to say the process of building a Hackintosh is difficult and involved. That is, if you don't intend to buy specific compatible desktop hardware and then use specific software tools to do the install. For example I am installing on a laptop using the new bootloader OpenCore (versus the long default Clover) and I do not already have a Mac or Windows system handy, so I'm doing the install from Linux. This makes everything more complicated, but this is probably more similar to the "average" use case for most users than building a desktop Hackintosh using Clover.
That being said, the good thing is that the situation is improving: The documentation is being constantly updated and consolidated (which can be its own evil as you know, since there is frankly too much documentation out there, most of it outdated), the tools are getting easier to use and performing their functions in less hacky ways, and the community of Hackintosh builders is growing. But just be advised that the vast majority of the community of Hackintosh users really have no idea what they've done to their systems beyond being able to regurgitate the instructions they followed in whatever guide they used. And so most of the posts and replies on the forums and subreddit will not be helpful for solving any of the inevitable issues you'll run into. Probably 95% of thread replies are other users flailing around with their own similar-sounding problems, suggesting essentially random switches to flip in the configuration files (further complicated by completely new issues introduced between version updates, as the sibling comment mentions). This is problematic because in actuality, everyone's using completely different hardware and so none of the ubiquitous suggestions of "You need to enable this setting since it worked for me" are applicable. Successfully building a Hackintosh essentially comes down to loading the proper firmware settings and hardware drivers which just so happen to work for your particular set of devices. So just go into it with eyes wide open to the fact that this is a large community standing firmly on the shoulders of a very few giants, and be mindful that you can physically damage your machine if you take the wrong suggestion from a random forum user for a problem you're having. The most helpful external (non-Hackintosh) documentation I've often referred to during this process are the current UEFI and ACPI specifications, just to give you a heads up on something useful to have handy. Good luck!
This is really helpful, thank you. What you say puts some form and empirical evidence to my concerns about Hackintosh. I guess my gut had it right this time, which is unusual. So you can brick your fancy new homemade, warranty-less PC!
Anyway maybe one day I'll do it for fun on a crappy laptop I get off Craigslist. Sounds like this pays off most when it's a low-risk effort.
That's probably for the best. I bought a laptop with a known-compatible processor, and I have enough confidence from my past hardware experience that I'm not too worried about frying my machine. For someone technical who knows in advance about possible hardware damage, I'd say that while damage is possible, the main danger is when you're "patching" the ACPI configuration files that tell OSX how to interface with your processor. If you aren't careful, you could be telling OSX to send voltage down a line that shouldn't have voltage on it. You're not gonna smell your mistake, but you won't be using that CPU ever again. And of course, since fitting Apple's device drivers to your particular hardware is essentially pounding a square peg into a round hole, the possibility for damage is there too. All that being said, if you enjoy a technical challenge and learning a lot about how OSX works, it's a great opportunity to work up a sweat, with relatively little risk to your hardware if you approach the problem the right way, prepared to grok what the guides and documentation are really saying.
With that said, I have been noticing more and more people building these insanely powerful desktop machines (a lot of times for less than a MacBook Pro) to keep at home.
Here's where it gets interesting: the same group of people then walk around with a cheap burner Chromebook running nothing more than a shell with SSH, or simply remote into their desktop via Apache Guacamole.
As others have mentioned here, if you are willing to meet the OS half way, then Linux is the way to go. With enough customization, you won't want to go back to any other OS. I use XMonad + tmux + vim, so everything is fast, keyboard driven, and minimal, and working with a mouse pointer to arrange windows is now arcane and clunky to me.
Equivalent setups on Windows and Mac are sub-optimal, since there is only so much you can hack the window manager to do what you want. But you do need to go through some legwork and a learning curve to get Linux working for you.
I jumped to Linux in October when I built my new desktop... so many issues. After so much time battling with macOS and desktop Linux on different issues, I was actually surprised how much I like WSL2 in Windows, of all things. I've only been using it since early March, when I had to jump back to Windows for a project.
Definitely a better experience than I remember a year and a half or so ago. The new MS terminal works pretty well, and the Docker WSL2 support is very seamless. VS Code with WSL extensions works great. I spend most of my time in that space and have had so few issues.
Note: editing \\wsl$ files in windows is a little slow, same for wsl editing mounted windows drives... but in the sandbox has been really great.
Yeah I hear you. It took me over a decade before I stopped bouncing off of Linux and going back to windows or Mac. It took more knowledge about working with Linux subsystems, as well as building a stable config that I checked into git. As mentioned, you really have to meet it half way, but once you get over that hump it is definitely more productive and simple than Mac and windows.
> or I'd happily ditch it for a really powerful thermally cooled desktop and use that as my machine
I returned my 2018 Mac Mini because I was frustrated with all the constraints:
- eGPU required a PC-like external chassis, totally defeating the point of the Mini.
- Want to upgrade the storage? You need an external drive chassis and a free TB3 port.
- Only 2 type-A USB ports
I got fed up with it and returned the Mini.
So, I ended up building a really nice mini-ITX Hackintosh for the same price as what I paid for the Mini. It's got a couple of NVMe sticks in it and a 10TB HDD. The whole thing is about the same size as an eGPU chassis alone. It's quiet, it stays cool, and it's relatively rock-solid.
That was almost 18 months ago and I still don't regret it a bit.
They’re fantastic on desktop and have been for more than a decade, once you get past a setup process which can be quite painful (although isn’t always).
But laptops are a lot trickier. You need drivers (or patches) for a lot of extra hardware (your trackpad, your battery-level-reader, the screen brightness controller, etc); you need sleep and cpu power management to work; you usually can’t just swap out the wifi card with a Mac-compatible one, etc.
It can absolutely be done, but you need to do a lot of research on compatibility beforehand. And as a result, you may discover your options aren’t really all that much better than they were in Real Mac land.
I would go for it (ie start doing research) only if there’s something specific you really want in a laptop that Apple simply doesn’t offer. A touch screen, for instance.
It's not necessarily finding drivers that's the problem so much as correctly patching the ACPI tables so drivers can find stuff, especially on laptops, excluding some stuff like wireless cards which often have sketchy to no driver support. Options are certainly better than in real mac land, but be prepared to spend at least a week working through every device one-by-one. Once it's working, though, it's stable.
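To give a concrete sense of what "patching the ACPI tables" looks like in practice, here is the shape of a rename patch in an OpenCore config.plist. This particular rename (`_OSI` to `XOSI`, so a custom SSDT can intercept the method) is a common laptop patch, but treat it as an illustration only; don't copy bytes blindly into your own config:

```xml
<key>ACPI</key>
<dict>
    <key>Patch</key>
    <array>
        <dict>
            <key>Comment</key>
            <string>Rename _OSI to XOSI (illustrative)</string>
            <key>Enabled</key>
            <true/>
            <key>Find</key>
            <data>X09TSQ==</data>    <!-- base64 of "_OSI" -->
            <key>Replace</key>
            <data>WE9TSQ==</data>    <!-- base64 of "XOSI" -->
        </dict>
    </array>
</dict>
```

The Find/Replace values are base64-encoded AML byte sequences, which is why a careless patch can silently corrupt unrelated parts of a table that happens to contain the same bytes.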
It's not just finding compatible wireless that's a problem! You can also mostly eliminate:
• Any laptop with an nVidia GPU
• Any laptop that uses switchable graphics (unless you're okay with terrible battery life from the GPU being always on)
• Any AMD laptop (because even with custom CPU patches, the integrated graphics won't work).
That's a lot of laptops, particularly in the type of segments people would likely be most interested in, since Apple doesn't make them. Combined with the aforementioned wifi compatibility problems, you really need to do your research first!
It's not that it's abysmal, it's just annoying enough to prevent me from using it. I stopped using it a while ago, I possibly had issues with my Wacom tablet as well, but I don't really recall.
Do you know a type-1 hypervisor I could use on Linux? Will QEMU do? Thanks for the info!
On Ubuntu-ish systems (not sure about base Debian), you can get full KVM support in qemu by installing the "qemu-kvm" package right out of the repos and starting your VMs with the "-enable-kvm" switch. You may also need EFI support in qemu for the OSX EFI bootloader, which you get by installing the "ovmf" package and adding the "-bios OVMF.fd" switch to your qemu command (OVMF.fd being the firmware file qemu uses for EFI; the "ovmf" package seems to install it near /usr/share/qemu/OVMF.fd). I'm not sure about GPU passthrough, but with full KVM support enabled, you will immediately notice a pretty huge difference in VM performance.
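Putting those pieces together, a minimal invocation might look like this (a sketch assuming Ubuntu package names and the OVMF path mentioned above; osx.img is a placeholder for your disk image):

```shell
# Prerequisites (Ubuntu): sudo apt install qemu-kvm ovmf
# Assemble the qemu command with KVM acceleration and the OVMF EFI firmware.
QEMU_CMD="qemu-system-x86_64 -enable-kvm -m 4096 -bios /usr/share/qemu/OVMF.fd -drive file=osx.img,format=raw"
echo "$QEMU_CMD"
```

You'd typically add more flags (CPU model, machine type, networking) for a real macOS guest, but -enable-kvm and the OVMF firmware are the two pieces this comment is describing.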
Thanks, I think I did enable KVM in QEMU and OS X is pretty snappy. There's a bug where the mouse can't tell where the screen bounds are, and stops in the middle of the screen, but that's unrelated. Thanks for the tip!
I'm not sure about Linux, but GPU passthrough on macOS (VMware Fusion) is not like KVM passthrough. When I looked into it, I found it's for one specific use case; it won't increase performance for general apps like Photoshop.
On Linux, though, you can do GPU passthrough for macOS (and Windows) guests.
In particular, since no VM has graphics acceleration for macOS guests, and because macOS relies very heavily on graphics acceleration, GPU passthrough is basically the only way to comfortably use macOS inside of a VM.
They keep getting better, but you always have the fear of one software update bricking your device because some engineer at Apple woke up on the wrong side of the bed one day and put something in the bootloader that only works on Apple hardware.
It would work if I kept this laptop as my backup (can't stop working for a day or two while I fix all that stuff).
I said this repeatedly in a Hackintosh thread on HN last week: I have never had a point release break Hackintosh. Especially these days with bootloader kext injection, it’s really quite rare. Whole version upgrades are another story, but you shouldn’t just install those on a whim anyway.
Also, if you’re on Hackintosh, Apple isn’t touching your bootloader!
I did something similar, but just got a Mac Mini which is rather old but quite powerful, has a lot of RAM, and is more enjoyable to work on with the setup I have (keyboard, mouse, monitor).
I still use MBP for travelling but not when at home.
After using Macs for years I switched to Ubuntu about 3 years ago and am completely used to it now. I don't miss Mac OS at all, except for Adobe software support :(
Sorry, this is the post I'm talking about. I first saw it yesterday and then tried changing the configuration of the USB-C cables I have hooked up, and it's been better since.
I have 3 cables hooked up. First is the Apple TB3-to-TB2 adapter so I can reuse my TB2 hub. Second is a USB-C to DisplayPort cable for my second 4K monitor (because I couldn't run both 4Ks off the hub), and the third is power.
I have both a MBP and a Windows workstation hooked up to a 4k monitor, and the difference in high-DPI support is night and day.
macOS and mac apps support high-DPI essentially flawlessly. On Windows even system dialogs have blurry text, as do many third-party apps (such as Mathematica until the very latest release).
Apps need to be updated to enable proper hi-res, yes, but that is not Windows' fault if Wolfram does not do so.
The problem is that Windows has way better backwards compatibility, while Apple routinely kills old tech. So app developers have to keep up, which is good for the user.
On the other hand, of course, you can still run very old apps in Windows, while Apple does not even support 32-bit, modern OpenGL or Vulkan.
That doesn't explain the system dialogs. Microsoft's development style is to just accrete more stuff; that's how you get two Settings dialogs for example.
Text in Windows also looks like garbage. Code is easier to read on MacOS. Even a non-retina Mac looks better than a high DPI Windows. Not sure if it's the system font choice, anti-aliasing, font weight...
I also think that Apple's decision to only support 1x or 2x scaling was the right choice: objects are always the "correct" size relative to the objects around them. On Windows it's the wild west when it comes to high DPI; half the UI is scaled up to "retina" and the other half is tiny boxes or text.
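The contrast between the two scaling models can be sketched numerically. This is purely illustrative; the base-DPI values and thresholds below are made-up approximations, not what either OS actually uses internally:

```python
def apple_backing_scale(dpi, base_dpi=110):
    # Apple-style: only integer backing scales (1x or 2x), so everything in a
    # window is rendered at a single consistent size. Threshold is illustrative.
    return 2 if dpi >= 1.5 * base_dpi else 1

def windows_scale(dpi, base_dpi=96):
    # Windows-style: fractional per-monitor scale, rounded to 25% steps.
    # Apps that ignore this get bitmap-stretched (the blurry-text effect).
    return round(dpi / base_dpi * 4) / 4
```

A 144-DPI panel comes out at exactly 1.5x under the Windows model, and any app that doesn't handle a non-integer factor has to be upscaled, which is where the mixed tiny/blurry UI comes from.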
I use both macOS and Windows, and text looks fine to me in both. I know I am sensitive to this because I went crazy trying to get it right a few years ago on Linux, and once fighting with ClearType. But with Windows 10 out of the box, I've never had an issue.
The problem you describe is probably old applications which haven't been properly updated to support hi-res. If they do custom drawing with their own controls or frameworks, there's nothing Windows can do to fix it.
ClearType is strange. I've gotten used to the way that my Linux system renders text, which is part of the reason why I like it so much. Using small fonts on macOS is always super blurry, while small fonts on Windows look like bitmap fonts and have clear subpixel hinting artifacts. On a HiDPI screen, they all look pretty similar.
Windows may support hi-res, but it doesn't have high-quality icons, and most of the apps are a bit rough. Doubly so if you drag a window from one monitor to another.
I can't remember the last time I looked at icons on Windows. I just press the Win key and start typing the name of the application and it shows up in a list. In most cases it takes one or two letters of typing, as I guess Windows remembers your stats.
I do have very few icons in a normally hidden taskbar but as I set their size to tiny they have no particular look at all. I just distinguish them by color pattern.
Not sure what problems you have with dragging. I have two 32" 4K monitors hooked up (one in vertical orientation) and do not experience any particular problems.
Two identical monitors is the happy path. Nonidentical ones get interesting.
I have a small high-ppi laptop hooked up to a large low-ppi monitor, and that confuses a number of apps when moved to the monitor they didn't start on. Most notably Visual Studio; some widgets are the wrong size, and the text is slightly blurred due to some up/downscaling issues.
I'm talking about how the magnification for third party apps is often weird. SQL Server Management studio for me is almost always in the wrong zoom factor. RDP gets confused. Etc.
Most of these things have probably been fixed, but I'm not sure. Windows devs?
SSMS has the worst text rendering of any tool I regularly use on windows. It's not a tie. I'm not sure how this is possible, since SSMS is built on a Visual Studio shell, but I see what I see.
Its out of the box configs are horrible. They use Courier New.
Although I haven't regularly used SSMS in a while (thanks to Azure Data Studio), the first thing I used to do was change the fonts to the "new" (15 year old) ClearType fonts. Like Consolas everywhere.
He's talking about apps that don't behave according to Windows' scaling (it happens) or per-monitor scaling, I think.
I have two monitors and windows scales apps equally on both even though one is 5120x1440 and the other is 1080p. The result is that I either pick small icons on the big monitor or big icons on the small monitor.
Totally agree. I would love to spend Windows prices to get that kind of power. And it would run cool. But I just don't have the time to fiddle with Windows.
Then again, maybe it won't be so bad. Worth a try at some point just setting up a couple of my elixir and ember projects to run off WSL2.
Are you running Windows or Linux on that? I'm a MacOS user but I'd love to know if it's possible to have a good high-res-screen experience on a Linux laptop.
Windows, of course. It would probably be an awful experience on Linux. You want the "happy path" of NVIDIA Windows drivers, etc. Of course, Windows has first-class support for Linux so you can have the best of both worlds.
> Apple says "we are listening now, and here is a new cooling design," then it comes out to be even less adequate than the old one. I can't think of anybody else capable of trolling their customers like that.
Apple's thermal engineering is simply bad, and doing it bad is a company policy.
There is no other believable explanation to me. Apple has been promising to fix their thermal design for years on end, with each year's model supposed to have better thermals than the previous one, but in reality all their designs have been consistently crappy.
The only explanation for me is that they treat thermals as a subtle marketing feature, just like makers of laptops with crappy batteries always find ways to conjure battery life out of thin air.
Proper cooling requires a thick laptop. Take a look at mobile workstations or high-end gamer laptops. They are not thin, but their cooling is more adequate (although still not at tower level). There's no way to fool physics and combine good cooling, a thin design, and quiet fans; you have to compromise on something.
All of their current models could have much better thermals without increasing their bulk if Apple actually tried. Quite a number of other makers have superior thermals in even thinner and, more importantly, cheaper packages.
So far, none of their recent models show a single sign of thermal engineering being done as such. Their 16-inch model has a thermal solution I would only expect in a $300 white-label laptop. Yes, they added a few extra millimetres to the fans, but they still use the same single skinny heatpipe and tiny radiators.
And all of that when they have access to the best parts and fabrication services on the market. If you look closely at their BOM, there are many surprisingly low-spec parts and very minimalistic, spartan design decisions.
A lot of accusations and allegations here, with zero specifics and zero to back any of it up. If you're as knowledgeable on this topic as you allege and seem, could you give us more information?
Which "$300 white label laptop" has equivalent thermal design to the 16" MBP?
Which "other makers" actually have "superior thermals in even thinner packages" that are "cheaper"? That sounds like bullshit to me, given what I've seen and experienced in the PC market, where loud fans that run all the time are very common.
I mean, I agree with you, but you can't look at a thick laptop and just assume it's going to be better.
Thermal zones in servers is a good example of low physical footprint and high density power consumption that cools quite well. The engineering effort being spent on a good thermal solution can cause a device that is thin to outperform a thicker device.
It's just that Apple does not seem to be spending the resources there.
I don't think the parent was saying that a thick laptop is going to be better - but rather that when you are making a device as thin as possible, and that's the metric you index on, cooling performance WILL suffer.
It's not just that they haven't put enough engineering effort in - they consciously made a design tradeoff.
I still think it can be done, whether by sacrificing performance, rethinking the thermal design to include a smaller battery and larger heat pipes, or simply using a lower-TDP CPU.
Server blades are typically cooled by noisy fans spinning incredibly fast and moving large quantities of air. Not really a good comparison IMHO; a better comparison is simply with non-Apple laptops.
Only because there's no need for them not to be. It's common to retrofit slower-spinning, quieter fans into servers for home/office use.
But the same design constraints were used in a mini-ITX gaming PC on the Linus Tech Tips channel a few months ago. I can't seem to find the video now, though. The thermal performance was amazing.
My company uses Dell PowerEdge R340s for our onsite server needs, and while they're certainly louder than a modern desktop, they're pretty bearable.
For contrast, I've got a Sun Fire T2000 at home that's louder than the airliners flying overhead (my current and previous apartment both happen to be under airport flight paths, for SFO and RNO, respectively). For obvious reasons (at least until I can figure out how to get a reliable network connection to my garage) that one gets run pretty sparingly, lol
Been quite happy with my Razer Blade 15. It's not a ridiculously thick laptop, but thanks to extra-tall rubber feet, beefy fans, and a couple of other tricks it's able to run a GTX 1060 and a hexacore CPU just fine under load.
I used to have one of those. Sure it can run all that hardware, but it sounds like a jet engine when it's running. Completely ridiculous design that would never be released by Apple.
The Microsoft Surface Books also have a 1060 inside, but they are whisper quiet.
I switched from a MacBook Pro 13" (2015) to the 16" this year. It's much noisier in day-to-day usage, which I've tracked down to the high heat generated by the discrete GPU whenever it is in use (regardless of load).
Unfortunately, some conditions force the discrete GPU to activate - one of which is "being plugged into an external monitor." Even if you've only got terminals open, the GPU runs real hot with 1% load, and the fans ramp up to match. (This may only be true for some monitors - my work-provided monitor is the Apple Thunderbolt Display).
Sometimes Slack forces the discrete GPU to turn on, for example when clicking on an embedded youtube video. The discrete GPU will remain in use until Slack is restarted. Other applications sometimes behave similarly - I use https://gfx.io/ to see what applications are forcing it on.
Perhaps the cooling engineering is better, but the practical effects of it make me miss my 13" laptop.
> Unfortunately, some conditions force the discrete GPU to activate - one of which is "being plugged into an external monitor."
Had something similar with AMD (desktop) GPU some years ago. It wouldn't go to the lower power states if my desktop refresh rate was set above 119 Hz. So it would be hot and fairly loud. So I ended up using 119 Hz on the desktop and configure games to use 144 Hz.
I also noticed that the video card wouldn't decrease the RAM clock rate significantly, but when on the desktop reducing the RAM clock a lot had a very noticeable impact on heat and no measurable performance degradation. The answer I got for that was that it was tricky to dynamically scale the RAM to such a degree (IIRC I set it to half normal speed). I ended up using an overclocking tool with profiles, worked fine.
Someone posted somewhere that switching to a USB-C monitor (instead of a USB-C to HDMI adapter) fixed their issue. Anyone here have experience with this?
Same issue here moving from a 2016 13" to the new 16". It's silly that we can't just use the integrated intel GPU with an external monitor if we desire.
Ugh, thanks. I was eyeing a larger MBP, as my 2017 13" runs with both cores in use almost all the time, and is loud. I was hoping a six-core would be quiet under the same load :/
This was also true for me on a 2012 MBP (on HDMI and DisplayPort iirc).
My XPS 15 had a similar issue where turning on the GPU made it heat up and become loud; in fact the discrete GPU offered no performance increase because the heat made everything throttle... sad! At least I could use an external monitor with that one, though.
It's just a limitation of the laptop form factor. My desktop has a 45 watt processor, same as the i7s, and it has a huge block of metal and a 120mm fan to keep it cool quietly (but still audible under load). There's no way to fit a 45w CPU and 45w GPU into a laptop and make it work.
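A back-of-envelope way to see the limitation: steady-state die temperature is roughly ambient plus power times the cooler's thermal resistance. The resistance figures below are illustrative, not measured, but they show why ~90 W in a thin chassis is a losing battle:

```python
def steady_state_die_temp(power_w, r_theta_c_per_w, ambient_c=25.0):
    # Steady state: T_die = T_ambient + P * R_theta (junction-to-air)
    return ambient_c + power_w * r_theta_c_per_w

# A big tower cooler might manage ~0.5 C/W; a thin laptop's shared
# heatpipe assembly can be ~1.5 C/W or worse (illustrative numbers).
desktop = steady_state_die_temp(90, 0.5)   # ~70 C: comfortable
laptop = steady_state_die_temp(90, 1.5)    # ~160 C: unreachable, so it throttles
```

Since silicon can't actually run at 160 C, the laptop resolves the equation the only way it can: it sheds power by throttling until the numbers balance.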
This happens with MS Edge browser too. Just browsing to Arstechnica, for some reason, turns on the discrete GPU and it will stay on until Edge is shutdown or restarted.
In the 15-inch they undervolted the processor and got better thermals. In the 16-inch they have different fan curves and larger thermal intakes and fans.
I manually disable Intel Turbo Boost now, though. And for some reason when I start up my laptop it can reach around 90 degrees C if I have a bunch of things open (I guess it turbos while restoring the previously open apps, which makes sense).
How did you disable Turbo? I regularly hit 90 when I start running unit tests or compile xcode. I've gotten into a habit of preemptively setting the fans to max before doing anything like that now.
I felt so cheated when I discovered that the Turbo button was actually a way to slow things down when you turn off turbo. I spent so long as a child "running the computer responsibly so it doesn't overheat"!
I also recall the patches people had for things like Jazz Jackrabbit that ran a busy-loop some number of times in hot code because timing was execution-dependent in games like that. Faster CPUs made the game unplayable!
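The bug those patches worked around is easy to sketch. Function names here are hypothetical, not from any actual game:

```python
import time

def delay_by_loops(n):
    # The original approach: spin N times. Wall-clock duration shrinks as
    # CPUs get faster, which is exactly what broke those old games.
    for _ in range(n):
        pass

def delay_by_clock(seconds):
    # The patched approach: wait on the wall clock, independent of CPU speed.
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass
```

A loop count tuned on a 33 MHz machine finishes almost instantly on anything modern, while the clock-based delay takes the same real time everywhere.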
Intel Extreme Tuning Utility lets you tweak almost any parameter, but it only works on Windows; if you dual boot you can still use it, though, as I think it persists settings to the firmware/BIOS. For macOS there's e.g. "Volta", I think, which can help.
Apple is operating on the bleeding edge of design, that doesn’t mean their thermal engineering is bad, it’s best in class - they’re simply pushing the margins.
I wish they’d make a thicker laptop with more room inside, but that’s just not what they do.
Is it really surprising? Apple is all about marketing - they just want you to think you've got really fast hardware, it doesn't actually have to work properly. 99% of users aren't doing anything that actually taxes the CPU all that much so nobody is going to complain.
I would imagine the iPhone is built by a totally different team and it serves totally different markets. Doesn't really change the fact that the macbook has had consistent thermal issues release after release.
Apple aren't stuffing a third-party CPU into an inadequate cooling solution with the iPhone because there isn't one. If there were an equivalent to the i7/i9 in terms of mindshare in the mobile market, I wouldn't be surprised if Apple released a phone with a low-clocked and badly cooled one of those too.
> Apple aren't stuffing a third party CPU into an inadequate cooling solution with the iPhone because there isn't one.
It might have escaped your notice that (1) third party ARM CPUs for mobile devices are not exactly difficult to find and (2) Apple was in fact using one before they decided to take the design in-house.
I was specifically referring to something with the same "whoa, that means it's really fast" mindshare as the i7/i9 has. Do laymen really look at phone CPU models the same way that people use them to guide purchasing decisions on laptops?
See also: Intel labeling middling 2c/4t cpus as 'i7' a few generations ago despite them not being at all comparable to the desktop equivalents because they know people will buy them based off the model number and not the actual performance. (https://ark.intel.com/content/www/us/en/ark/products/95451/i...)
And NVidia giving all their mobile GPUs the same names as the desktop cards (pre 10XX gen, they're actually the same cards with slightly lower clocks now) despite them being entirely different hardware. (https://en.wikipedia.org/wiki/GeForce_900_series#Products)
Actual performance doesn't sell nearly as well as perceived performance.
(Also I guess it's all about marketing for everyone)
> Note that high temperature on the right side appears to be ignored by the OS. Plugging everything into the two right ports instead of the left raised the Right temperatures to over 100 degrees, without the fans coming on. No kernel_task either, but the machine becomes unusable from something throttling.
I feel like the top answer is missing the forest for the trees. If sensor temperatures rise past 100 degrees because a peripheral was plugged in, and performance degrades when all of the peripheral ports are in use... that's not a usable computer.
A long time ago (roughly 2012?) I owned a Macbook Pro for a brief time
I was running Windows 7 in Bootcamp and I wanted to set up Gentoo Linux in a virtual machine for some Linux work I needed to do
I left the machine on my desk to compile a kernel. Basic wooden desk, nothing underneath or around it. No problems there
Roughly 5 minutes later, the system had reached what Speccy reported to be a scorching 117 degrees Celsius! (242.6F)
I immediately shut it down and left it to cool off, then asked around on an IRC full of various flavours of IT people (programmers etc)
The horrifying answers I got were that this was INTENTIONAL and that "the system acts as a giant heat sink" which is why it didn't power off after crossing a threshold
As far as I understand it, running it under Bootcamp also disabled any kind of thermal throttling and forced the more power hungry "Radeon" graphics chip to be used, further adding to the problem
I have forever been leery of hitting the F4-F8 buttons because of the 2007 Core 2 Duo model. I used it for a couple years and remember at least 3-4 times when I decided to rest a finger up there and ended up getting scorched.
This is why Apple stopped describing their portables as “laptops” some time in the early 2000s. They didn’t want the legal risk of somebody burning themselves after it was implied to be safe to use on their lap.
I have some golden memories from university playing Unreal Tournament 2004 on my laptop in bootcamp, the added difficulty was that if you touch the metal between the keys it'll burn your finger tips.
The top cover part between the touch bar and the screen regularly runs so hot that touching it really hurts. Can't leave my hand on there for more than a second or two. Ran all sorts of diagnostics and according to those everything is fine.(MBP 2017)
At a certain point, Apple had to stop calling them "laptops" and start calling them "notebooks" because people kept putting them in their laps and frying their genitals.
Some low-temp solders melt as low as 140C, but typical SAC lead-free solders you'll find in laptop motherboards melt at 220C. Many laptops are fine if a processor core hits 100C.
I've got an old Thinkpad with an i7-3920XM that regularly runs at 95C when running a few VMs and is certified by Intel up to 105C. And yes, I've tried replacing the thermal paste and increasing fan speeds to keep temperatures down, without much success, but it has been running like that with no issues for years.
Thank you. I used to design microprocessors (a LONG time ago) and TIL that there are CPUs that can operate at 225°C. Wow. And even limping along at 300°C? Wow wow.
BTW, I can't stand the 8051 architecture, but that's really off topic. I guess if you need 225, you need that chip. Reminds me of the Henry Ford / Model T quote...
8051 can certainly service interrupts quickly and predictably, which is probably all you want if you are sticking a microcontroller inside a turbine engine or oil drill.
Used in e.g. turbine engines, instrumentation in oil wells or mining operations, and industrial process control. You can get microcontrollers that work in similar temperature ranges, “in any architecture as long as it’s an 8051”.
100C is where intel typically throttles down clock as a life saving measure. You can run at 100C, or higher, but modern CPUs will cut down performance in an attempt to lower temp. High temp = shorter life. Macbooks adjusted that limit upwards so the CPU has a higher threshold before the auto-throttle kicks in.
More about that here:
https://www.youtube.com/watch?v=947op8yKJRY
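That auto-throttle behavior can be sketched as a simple control loop. Everything here is illustrative; real thermal governors live in firmware/microcode and are far more sophisticated:

```python
def throttle_step(temp_c, freq_mhz, limit_c=100, min_mhz=800,
                  max_mhz=3000, step_mhz=100):
    # Crude sketch of a thermal governor: shed clock speed at/above the
    # limit, claw it back once comfortably below. All numbers illustrative.
    if temp_c >= limit_c:
        return max(min_mhz, freq_mhz - step_mhz)
    if temp_c < limit_c - 10:
        return min(max_mhz, freq_mhz + step_mhz)
    return freq_mhz  # hysteresis band: hold the current clock
```

The hysteresis band (here, 90-100 C) is what keeps the clock from oscillating wildly; raising the limit, as the comment above describes, just shifts where the whole loop engages.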
Let's remember that even though the CPU in a current model MacBook Pro might reach 100 degrees C, the laptop itself is only going to hit about 50 degrees. Which is still pretty hot.
Relevant recent experience: I have had a 2019 MBP with a Vega graphics card for a while. I noticed that recently it was running VERY hot all the time. I popped the bottom off (you need that damned pentalobe screwdriver) and found tons of dust jamming the fans entirely. After removing all the dust and cleaning out the fan ports, the machine runs great again. If you're having cooling problems, start with this fix.
- fans are often on full force even though nothing special is happening
- before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)
- connecting external devices usually requires adapters. And even with them, when an adapter stopped working it usually helped to connect it on the right side (reboots weren't helping)
- for some reason the adapter would work on the left side again after some time
- as soon as I connect the external screen (not even a retina one) the fans go louder
- Time Machine is a PITA. Every time it runs, the fans blare up and the machine gets hot; the system even gets significantly slower. Even if it just backs up a diff of 150MB. And it runs multiple times a day, with no configuration option other than disabling auto-backup entirely.
- Window’s decade old window management via shortcuts still doesn’t exist in macOs (you have to install a third party tool for that)
> before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)
To add a contrasting datapoint, my (gen 2) AirPods have been virtually flawless for a year or so with my 15" 2017 MBP.
> before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)
I was talking to a hardware engineer a while back who was heavily criticizing USB-C because of all the ways vendors can abuse it. Turns out that the USB-C port on the Nintendo Switch doesn’t comply with the USB-PD standard, which caused lots of issues for users who had third-party charging docks.[0] There were accusations that Nintendo did this intentionally to restrict third-party accessories.
After reading that, nothing I read about USB-C surprises me. Sounds like another spec for vendors to abuse and ignore.
The switch has issues, but the bricking was because one of those dumb docks was putting 9 volts on a pin that's supposed to be using 2 volt signals, if I read the spec right. And the switch tolerated 5 volts there, too. Putting more than 5 volts onto the non-power pins is such an obvious violation that I can't blame the spec for that fault.
USB-C is a complete mess. I can’t believe Apple dropped MagSafe for this. They could’ve made all the I/O ports Type-C but kept MagSafe for charging, with an optional charge-capable Type-C port on the right side.
Are there any good macbook laptop stands that act as a heatsink as well? My current stand has rubber grips so while there's plenty of airflow under it the heat isn't being drawn away very well. I don't want to add a fan since that would add noise.
Amazon Basics has a similar thing that is essentially the same but looks a bit less stylish and costs half of what the mStand costs. I've been using it for almost 4 years now and it works great.
I have this one as well, and it does a decent job of wicking heat too: the portion of the stand in direct contact with the laptop is noticeably warmer than the part that contacts the table.
I also like this model. There's a little air gap between the computer and the stand, but being solid aluminum I figure it's going to move a lot more heat away than, say, a wooden desk surface or cloth-covered human legs.
I also rate these stands - but it has rubber grips. You're not going to get any heat transfer from laptop to stand, so it's not going to act as a heatsink.
I have the same stand and it does absorb a good amount of heat when the Mac is running at full power (the angle helps there too), albeit not a true "heat sink". The rubber has negligible displacement.
Been a huge fan of the Roost Stand ever since I got it. There's no heat sink, but there's only four small plastic contact points between the laptop and the stand. My fans are often on when running several Docker containers, but the surface of the top bar never gets that hot.
I have a 2018 13" Macbook Pro.
(I'm in no way affiliated with Roost, I'm just a happy customer.)
That was my thought too; those thin stands may look sleek but they don't really do anything in terms of heat transfer. If anything it'd be worse, since now the machine has nothing to transfer the heat to.
I feel like I've seen a lot of thick steel plates around the right size sitting around in old workshops. You could probably find something cheaply from a scrap dealer.
Thick aluminum or copper seems more rare, I don't know if it's commonly produced.
Any Noctua brand fan will stun you with its silence. I installed four 20mm fans in a network switch to replace its screaming loud factory fans. They're all running ~1200RPM and it now sounds like a small desk fan. A larger format fan running at a few hundred RPM will likely add very little to the ambient noise around you.
Also big fans (bigger than 140mm) are usually running on noisy ball bearings (or roll bearings) while "standard" 120 and 140 fans are often running on silent magnetic suspension.
Do you mean 100 rpm? 10 rpm isn't really going to do anything. When I last looked all of those "laptop cooler pads" use tiny fans that would either be ineffective or loud or both. I settled on the mStand mentioned above, but as noted it has rubber standoffs that prevent it from acting like a true heatsink.
I mean 10rpm, really! A 10" fan at 10rpm moves about as much air as a 2.5" fan at 160rpm.
You're not trying to move huge amounts of air, since heat transfer from the laptop to the air is pretty slow. You just need enough movement to clear out any hot air that is building up under the laptop.
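As a rough sanity check on the rpm claim, there's a common fan-law rule of thumb that volumetric airflow scales with rpm × diameter³. This is a crude approximation (not a datasheet figure), but under it the big slow fan holds its own:

```python
# Crude airflow comparison under the fan-law approximation that
# volumetric flow scales with rpm * diameter^3. These are unitless
# relative numbers, not real CFM figures.

def relative_airflow(diameter_in, rpm):
    """Relative airflow under the rpm * d^3 approximation."""
    return rpm * diameter_in ** 3

print(relative_airflow(10, 10))    # 10" fan at 10 rpm
print(relative_airflow(2.5, 160))  # 2.5" fan at 160 rpm
```

Under this model the 10" fan at 10 rpm actually moves several times the air of the 2.5" fan at 160 rpm, so very low speeds on a large fan are plausible for this purpose.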
> You just need enough movement to clear out any hot air that is building up under the laptop.
On a nearly totally different tangent, our own bodies build up hot pockets of air indoors as well, where there's no natural breeze to get rid of it. Getting rid of it creates a surprisingly strong cooling effect where you probably won't need AC for a while longer than you expect - and "air circulator" fans are pretty good at doing it over a whole room, so you don't need to keep a fan directed at yourself.
This is why a "universal" port was a bad idea. Just because a cable looks like it'll fit doesn't mean it will work. Charging ports have fundamentally different requirements from data ports and display ports. I miss MagSafe.
Also, if the overheating issues are true then macOS should issue a warning when charging from an unsuitable port.
> Also, if the overheating issues are true then macOS should issue a warning when charging from an unsuitable port.
It doesn't seem to be an issue with the port being unsuitable - the SE post mentions[1] that the right side similarly increases in temperature significantly when in use. It just doesn't trigger the same throttling behavior.
Looking at an iFixit teardown[2], the ssd chips are located next to the left thunderbolt controller. I wonder if that may be why temperature spikes on the left side trigger more aggressive temperature management than spikes on the right.
[1] Quote: Note that high temperature on the right side appears to be ignored by the OS. Plugging everything into the two right ports instead of the left raised the Right temperatures to over 100 degrees, without the fans coming on.
I had the same issue with a MagSafe 2012 MBPr (throttling when charging). It's just a lack of adequate heat dissipation in Apple's laptops, and with the USB-C ones you suddenly get around it by plugging in on the right side where there aren't enough temperature sensors for it to properly throttle.
The concept of a universal port was perfectly fine, it's just the implementation that is flawed.
And before USB-C and Thunderbolt, "Just because a cable looks like it'll fit" DID mean it will work, at least for what an end-user would have been likely to face.
I think this part of the comment should be highlighted more due to people simply thinking switching to the right side 'fixes' whatever the issue is:
> Note that high temperature on the right side appears to be ignored by the OS. Plugging everything into the two right ports instead of the left raised the Right temperatures to over 100 degrees, without the fans coming on. No kernel_task either, but the machine becomes unusable from something throttling.
I have the 16" Macbook Pro (work issued) and it does get very warm when you are charging and running a couple of monitors (1x1920x1080 and 1x4096x3084) even when not under a massive amount of load.
It's fine because in that situation it's on a stand on my work desk, and it's noticeably cooler when it's on my lap and not running the externals.
Lovely hardware, and switching to a Mac was the right move because the team I run had already standardised on Macs when I joined, but it's one thing that makes me wish I'd stuck with an equivalent Thinkpad/Fedora, if I'm honest.
Also own a 16" MBP. I use a kernel extension called Turbo Boost Switcher to control heat. The battery lasts 25% longer with turbo boost disabled and the machine never feels warm. I enable turbo only when rendering audio from my DAW or for long compiles. The machine otherwise never feels slow. Big win.
I'm dreading the day MacOS formally blocks kernel extensions.
Apple isn't under legal obligation to expose equivalent functionality to userspace. Time will tell if they're moving kexts to userspace or blocking them. My money's on blocking them.
The developers of Little Snitch have confirmed that, at the very least, network filtering would work in userspace via the new Network Extension framework: https://news.ycombinator.com/item?id=22677849
And in userspace, you can only do what the APIs allow. Unless there is an API for directly manipulating the CPU Turbo Boost system, you're S.O.L. when kextload is dead
Has it come to this? We used to have physical turbo buttons, now we have soft un-turbo buttons. I think the grim reality in WFH is that laptops just really still suck compared to desktops for running dual monitors, an A/V feed with OBS, and 80 tabs across two browser windows along with a few other apps open. That load would have my i7 8th gen at 70-80%, but has my Ryzen 7 at maybe 10%. Not throwing away my laptop anytime soon, but I'm learning to love the desktop platform again.
Totally agree - Turbo Boost Switcher has dramatically changed how I use and like my 16" MBP - no more fan noise, no more burning fingers or blanket, and practically no noticeable difference in performance (e.g. Destiny 2 runs fine with it off).
Hard to beat a Thinkpad with Linux for dev work. A year ago I switched from an MBP to a Thinkpad with Ubuntu, and I cringe every time I have to go back to the Mac in order to do something.
The high-dpi support really is a big deal. No other OS (Windows or Linux) comes close to handling it as seamlessly. I run Gnome in Ubuntu and have got it to a "serviceable" state by being able to set pixel density on a per-app basis by editing launcher files (depending on which display it's running on), but that's not something an average user would be able to figure out (or should have to).
Apple has done a great job with their window manager and support for varying DPI between displays. It's just a shame their OS is otherwise such a walled garden, and their hardware is a bad joke ($6k for an 8 core desktop with a years old GPU and no storage, come on.) It really boiled down to a "pick your poison" scenario for me and I sided with the OS (and hardware) that lets me tinker to my heart's content.
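For anyone curious what the launcher-file trick looks like: GTK apps honor the `GDK_SCALE` environment variable (integer factors only, with `GDK_DPI_SCALE` to compensate text size), so a copied `.desktop` file can force a per-app factor. The path and app name below are placeholders, not a specific app:

```ini
# Example: ~/.local/share/applications/someapp-hidpi.desktop
# ("someapp" is a placeholder; GDK_SCALE / GDK_DPI_SCALE are the
# real GTK 3 environment variables for integer scaling)
[Desktop Entry]
Type=Application
Name=Some App (HiDPI)
Exec=env GDK_SCALE=2 GDK_DPI_SCALE=0.5 someapp
```

Qt apps have an analogous `QT_SCALE_FACTOR` variable, which does accept fractional values.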
Mac text rendering is blurrier than Windows text rendering (and I have learned of no way of changing that despite my having used a Mac for 10 years).
Worse, if for some reason you want to change the size of the elements on the display, the only way I know of on a Mac (namely, using System Preferences :: Displays to change the "resolution" to some value other than "default for display") makes the text much blurrier. I don't have a Retina display on my Mac, but someone who does claims that even on Retina, he prefers Windows because of the blurriness of the Mac.
In contrast, if you can be somewhat picky about which apps you use, text on Windows is just as sharp no matter how big or small you configure the elements on the screen relative to the default size.
(In most Mac apps, it is easy to adjust the size of the text in the main pane, but all the other text and all the other non-textual elements, e.g., icons, stay at the default size.)
>Apple has done a great job with their window manager and support for varying DPI between displays.
That might be true, but the Mac does a poor job accommodating sub-par or non-standard human visual systems (and I would guess that people who cannot easily control how far their eyes are from the screen -- e.g., people living in a small van -- would find a Mac frustrating as well relative to Windows).
Mac font rendering is also more accurate to the font than Windows. Since MacOS doesn’t hammer glyphs to the pixel grid, font scaling is far more consistent. Windows’ font rendering is part of the reason why their HiDPI support is so janky.
I always thought that Microsoft's commitment to keeping old binaries running on new versions of Windows is the reason their HiDPI support is suboptimal and that if you use only apps that use the latest text-rendering API, the experience is great.
How would making the pixel grid finer exacerbate the problems with a strategy of hammering glyphs to the pixel grid?
Could you tell me a bit more about your experience of Linux with a high res display? Are you using the built-in screen of a laptop? Would you mind sharing the model, or do you have any recommendations/advice regarding laptops with high resolution displays that can work well with Linux? Basically, is there any hope of emulating the experience of a macbook Retina screen under modern Linux? Do you know if there are any groups / momentum in the linux development community working on this?
This was the game-changer for me. When I first saw the 2012 Retina MBP I was sold. Build quality is overall so much better than the rest of its class, although Thinkpads are solid stuff as well (good enough for the ISS, anyway). I also align with the rest of your comment.
> The high-dpi support really is a big deal.
I honestly don't understand this. I owned a Retina MacBook Pro (2015) for a while and my current (Lenovo) laptop has a 4k screen, but I don't think I've ever actually cared about the increased pixel density. The only thing it's ever done for me is increase heat production, decrease battery life, and decrease compatibility (there's Mac software that isn't Retina compatible too, and it looks at least as bad as on Windows).
I use 24 inch monitors at 1080p all day and I can see the pixels if I look, but images still look plenty good and text is super readable. I switch between this pixel density and my Pixel 3 XL and while I can definitely notice the difference in density if I look, my productivity isn't affected whatsoever by having a less dense screen. Is everyone else putting their face 2 inches from the screen every 5 minutes just for the sense of satisfaction they get from not seeing the pixels?
> Apple has done a great job with their window manager
I think that's a pretty massive stretch. I haven't used a Mac as my primary machine for a few years now, but I've been watching the window management get worse and worse over the years. It used to be you could have a grid of Spaces and proper intuitive window management, but now Maximize is hidden behind the Fullscreen button. Virtually everyone I've watched use a recent version of macOS either has everything fullscreen, uses one window at a time with 4" of spacing around it because it's a pain to properly size the windows, or has 7 different apps installed to fill in missing features that have been around in Linux for as long as I can remember and in Windows since Windows 7.
I find text visually less fatiguing at high-DPI even though I have no problem reading it on my 24" 1080p secondary monitor. I don't think it's a matter of productivity as much as comfort (though maybe comfort indirectly affects productivity).
I agree with everything you said about the more recent releases of MacOS. Maybe I was being too generous in my previous comment - it really does seem optimized for very small laptop displays, so that's probably why they have recently placed so much emphasis on fullscreening everything (and also why they got rid of the Expose grid - horizontal swipe works better on a laptop trackpad).
You don't understand because you are the kind of person who buys a laptop with a 4k screen and thinks that's an asset, who thinks 1080p monitors and that aspect ratio are acceptable, etc.
In other words, you've never had a large high-dpi monitor setup with multiple screens large enough to appreciate it. And you haven't ever been an advanced enough Mac user to learn keyboard commands and the various ways to manage windows.
I recently bought a 49” ultrawide monitor. Both my MacBooks can drive the native resolution in Windows but not MacOS. Apple refuses to fix their broken driver or even acknowledge it is a problem.
Workaround is to provide two inputs to the monitor and run it in picture by picture mode. This is ok but when waking from sleep MacOS gets confused about what monitors windows were on and the arrangement of the panels. This has always been a problem and why I got away from dual/multi panels in the first place. MacOS support for that has always been abysmal.
It’s one of the first examples of Apple gear not “Just working (tm)”. I suppose it’s my fault for not buying a monitor with an Apple logo on it.
I find myself wondering why I deal with the headaches in MacOS. I switched because it was easy. Now Apple is making it hard. At this point fighting with Linux actually appears to be easier than using MacOS.
That's how I felt when I used a Mac for work. Linux doesn't work just the way I want right out of the box, but neither did the Mac, and at least Linux gives me the tools to change and fix the things I didn't like.
Granted, my monitors are 1080p and 720p (ugh), so I haven't had to wrangle with high DPI issues, which does not sound fun.
That was suggested in some of the forum threads I read. It doesn’t solve the problem though. It just scales the image up and provides the same quality I have now. There were also some suggested profile hacks which I think is what SwitchResX does.
You can also option-click on the scaled options in display manager and get more options but none of those are reasonable either.
The whole reason I use Macs is because I don’t have to constantly hack and fiddle with them. At that point I may as well install a Linux or BSD.
The software quality from Apple was never great but it is really falling off a cliff recently. It’s getting to a point I wonder why I pay the premium.
True. I recently picked up Albert [1] on my Ubuntu machine because I was missing spotlight search. So far it works well - it would have been nice if something like it was included by default.
I don't understand the people for whom Spotlight works. I tend to open Spotlight, and start typing in the name of a file I'm looking for. Consistently, reproducibly, for many different files, it'll fill in the name of the file at a certain point, but then, if I keep typing that very name, it'll switch to a different match.
That is, if the name is `abcdefgh` and I type `abc`, then it'll complete it; but if momentum carries me and I type `abcd`, then it'll drop the desired file entirely off the list of suggestions, and give me a different, non-matching one.
Huh, interesting, because that exact problem is one of my key complaints with windows search, while I've never noticed it with spotlight -- and I've been using spotlight daily since Tiger.
Well, that's not quite true, for a few years after spotlight was released it would get slow after a while, so I mainly stuck with Quicksilver. Eventually spotlight got faster, but I never had that particular problem with it, even though I very regularly have that particular problem on Windows.
I am forced to use Windows most of the time. The fact that its desktop search is soooo bad is a daily bummer and makes me look forward to opening my MBP.
It seems like part of Microsoft's goal here was to direct traffic to Edge and Bing at the expense of actual usability. See also: the "help system". Luckily there's a lot of third-party stuff to replace the search feature.
FWIW, Windows laptops (that you can then install Linux on) have had hidpi screens for more than 15 years (!!), and Ubuntu's fractional scaling works really well right out of the box now.
"have had" and "works reliably" are very different things.
Every Windows and Linux machine I've used has had issues with individual apps, and most of the time I've plugged them into things that cross multiple scaling factors simultaneously, they've had rather significant issues (when it isn't just "issues with everything except device-native"). I haven't had an issue with that at all in a few years now on my Mac, and I plug it into literally 10x more screens and configurations.
Just to add to the anecdata, my MBP would kernel panic ~25% of the time it was plugged into a 4k displayport 1.1 monitor. I'm glad your system is stable, though!
Exactly. I would love to use Linux, but switching back to the traditional-resolution laptop screens after working on the high-res MacbookPro screens is not something I ever want to do. It would be fantastic if Linux came to support it well.
Indeed, my personal machine is a T470p (i7-7700HQ, 2560x1440, and I upgraded the RAM to 32GB); it's a little beast, and Fedora Cinnamon is hands down my favourite host OS for development. But practicality made me pick a Macbook for work, since I'd be getting questions about stuff on a platform I'd never used. I like OSX generally, but there are days where I still miss Linux (though iterm2 is phenomenally good; I'm not aware of anything comparable on Linux, which is a little ironic).
There is a terminal emulator called Terminator. It has a lot of nice features, including split screens and keyboard broadcasting, which is very useful if you have many machines with the same configuration. https://terminator-gtk3.readthedocs.io/en/latest/
If you want something incredible, try terminology. It has a few nice commands baked in starting with 'ty' - like tyls and tycat. I'm not going to spoil them for you by telling what they do. Just try it, you are going to love it.
I was just contemplating the idea of someday switching from macOS to Ubuntu. Curious what it is you dislike so much about macOS?
I think the biggest thing I would miss is being able to send texts from my computer. I use that all the time. I had a Windows computer at work for a bit and it absolutely drove me nuts.
>I think the biggest thing I would miss is being able to send texts from my computer. I use that all the time. I had a Windows computer at work for a bit and it absolutely drove me nuts.
That's primarily an issue with iOS APIs (or lack thereof), though. Gotta switch phones as well :)
Dell released an app[1] that allows for this, which supports both Android and iPhone. While in theory it's intended for use on Dells, there are workarounds[2] to get past that check and install it on any Windows computer.
I believe the 16" got a significant bump up in its discrete graphics card, and I'm pretty sure it kicks on whenever you plug in a second monitor, no matter what else you're doing. That card has been by far the greatest driver of heat/fan speed in my usage; more than the CPU.
The fan on my 13" MBP (2017) rarely kicks on unless I have an external monitor plugged in. I've been told that it has to engage the discrete graphics card to power my external monitor (only 24" / 1080p), and that's what causes the extra heat. Unfortunately, with the external monitor the fan is on all the time. I almost wish my work environment weren't so quiet, because then I wouldn't hear the fan kick on.
Anecdotally, and I haven't done any serious testing of this, my work colleagues and I have noticed that on our 2017, 15 inch MacbookPros, wired Ethernet (either from a dongle, or a hub) is only reliable on the right hand side. One co-worker has realized he has to plug in his hub after boot, but before he logs in in order for the device to be recognized, if the hub is wired into his wall drop. There are obvious work-arounds to all these issues, but it's frustrating to have these sorts of problems on such an expensive machine.
I have a similar issue with setting up my eGPU with Bootcamp. Apparently the macOS bootloader messes with the setup of a ton of devices so what I have to do is plug in USB C devices before boot, then plug in my thunderbolt devices in the bottom left port at a specific point during boot, then plug in USB-A devices after login. Anything else crashes the machine.
Admittedly my use-case is niche, but I wouldn’t be surprised about similar things happening with other devices.
I just bought an eGPU to solve the Macbook overheating problem. While that works, it made me mad that I had to spend 350 euros to fix a problem I did not have with my 2014 model, while having an extra unsightly box on my desk with a giant power brick.
Heh, you effectively no longer have a laptop either (I mean, you do, if you want it to overheat I guess), so you spent money + changed it to a desktop computer, without really getting any of the benefits of a desktop computer.
While your comment is genuinely amusing it should be stated that the eGPU wouldn't be needed when out and about since you'd only be running the one screen. So in some ways having an eGPU is the best of both worlds: something equivalent to a desktop PC when "docked" at his desk but still a usable portable device when out and about.
Yes, but I absolutely do not need the 3D capabilities of that card. I just want a cool laptop with a large screen for remote work and a 4k or 5k monitor at home.
Here's hoping some enterprising TB3 hub maker adds a decent, low-powered GPU into one of their hubs.
This might help balance the load of driving a 4k/5k screen, or at the very least, keep the MacBook's motherboard cool, and also keep the whole setup fairly portable.
I’d snap that up in a heartbeat if it meant I could run more than 2 external screens on my shitty 2015 non-touchbar MBP (the model with only 2 USB-C ports)
I always use the left, because it keeps cords away from the mouse zone, and I can let the cord run straighter to avoid stress (and prevent fraying). What the hell, Apple?
It's a shame right-angle Thunderbolt cables aren't really a thing. I'm using a right-angle USB-C cable for my docked 12" MB which really cleans up the desk space.
I've only ever plugged my LG Ultrafine 5k into the right hand side of my 2016 15 inch, but I find that if the room is warm (perhaps > 22 degrees C) I have 'kernel_task' apparently maxing out my CPU. Anecdotally, aiming a fan at the underside seems to make this problem go away. It doesn't seem to happen (that I remember...) when the big screen isn't plugged in.
I discovered I could charge two MacBooks on one charger by daisy-chaining them; for some reason I didn't expect that to work. The second laptop didn't charge very fast, if at all, but you can keep working when you've left your charger at home.
The Nintendo Switch charger only puts out 39 watts, and I think the Macbook Pro charger puts out 96 watts, so it makes sense that it would charge slowly.
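The wattage difference comes straight out of USB-PD negotiation: power is just the voltage × current of the best profile the source offers and the sink accepts. A minimal sketch, with illustrative profile lists (not any specific charger's actual advertisement):

```python
# Sketch of USB Power Delivery negotiation: the sink picks the
# highest-wattage (volts, amps) profile it can accept from what
# the source offers. Profile lists are illustrative examples.

def best_power_watts(source_profiles, sink_max_voltage):
    """Highest wattage among offered profiles the sink can accept."""
    usable = [v * a for v, a in source_profiles if v <= sink_max_voltage]
    return max(usable, default=0)

# Roughly a 39 W-class charger vs a 96 W-class charger:
small = [(5, 3), (15, 2.6)]
big = [(5, 3), (9, 3), (15, 3), (20, 4.8)]

print(best_power_watts(small, 21))  # ~39 W
print(best_power_watts(big, 21))    # ~96 W
```

A laptop plugged into the smaller charger still negotiates a valid contract; it just charges slowly (or even drains under load), which matches the daisy-chaining observation above.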
One ridiculous thing I ran into was that USB-C hubs can block Wi-Fi on the 2.4GHz band. Solution? Wrapping it in foil somewhat works, but I had to buy a new Wi-Fi access point. Sometimes rotating the USB-C adapters works too.
I have a similar issue when I use my external SSD with my MacBook Air. If it is connected to the left side, the WiFi gets blocked. If I connect it to the right side, WiFi works fine.
I tried foil, but it didn't help except when it was connected to ground.
I have an LG Ultrafine 5k connected via thunderbolt. In order to use my machine for anything even mildly intensive while the monitor is plugged in, I've resorted to a 5-fan laptop cooling stand and a medium-sized personal cooling fan sitting next to it.
I do live in a warm climate, but this is ridiculous.
The only thing I can think of, other than it being shitty thermal design, is that the cable is dodgy, easily losing signal if moved the wrong way. Is this something that could affect it? I'm planning to order a new cable, but I've seen so many reports of Thunderbolt ports getting too hot and triggering the dreaded kernel_task that I'm not optimistic.
Does anyone know if this also affects MacBook Pro 13” with only two Thunderbolt ports on the left (the ones without Touch ID)? Are these ports the same as the laptops with three ports? If yes, does this mean we (MBP13” users) have no solution?
Haven't noticed this on my 2018 15" i7. My monitors at work (2160p) and at home (1440p ultrawide) are placed to the left of my laptop, and I always use both left TB3 ports for charging and display.
I've had issues with my 2017 MBP running really hot, but never connected that with kernel_task running at 150% CPU until now.
I found a tool called Mac Fan Control, which lets you change the thresholds at which the fans kick in. I've been using it for a good while now. Laptop runs louder, but cooler.
Today my machine slogged to a crawl, became basically unusable to the point of restart -- I don't remember which port the power was plugged into, but this definitely shed some light on the issue. Fan Control wasn't running, incidentally.
I still have the problem that my 16" MacBook runs really hot with an external display plugged in. There is a pretty good thread on the issue on the MacRumors forum [1]. It seems to particularly be a problem at 2560 x 1440 @ 60Hz.
This is wild. IIRC, the TB3 ports on the right side have lower bandwidth, so I've always plugged everything in on the left.
I wonder if this is the reason my MBP would get sluggish while kernel_task uses a surprising amount of CPU? I always blamed the corporate spyware/antivirus. Maybe I was just charging it wrong
Yes - there is no cooling on the TB3 controllers, yet they put out a significant amount of heat, which causes a kernel_task internal thread to intentionally throttle CPU (which shows as high CPU usage).
Been driving me mad till I hacked my dock to fit on the right side and connected power on the left, which resulted in better airflow and lower heat generation by the controllers....
> I wonder if this is the reason my MBP would get sluggish while kernel_task uses a surprising amount of CPU?
This happens to me as well while charging if the battery is below ~50%. Start recharging above 50 and it doesn't happen. At some point I took it to an Apple store for an unrelated issue and they told me that their diagnostics flagged a faulty sensor in the internal charging circuitry. I haven't sent it back yet for repair because of the turnaround time involved, but consider that it could be related.
Just been going thru "Mac Kernel Task Hell" myself recently, until I got a box of cold gel packs [0] to act as my new "Macbook Pro Cooling Stand".
Seriously, I know my Mac is a little old (2015 pre-Touchbar) but that doesn't mean I need to suffer like this w/ kernel_task eating 1,324% of my available CPU just because it's getting a little hot.
Other routes taken:
- Get a Fan App to keep the fans running at max 24/7, meh
- Take the risk of bricking my one and only dev machine by disabling the kernel_task process altogether, no thanks
No, just one nice ice pack with a cute tea towel on top, makes for a nice stand that keeps the aluminum unibody thingy nice and cool, sigh
If you recompile the XNU kernel you could disable the part that ‘reserves’ the CPU to cool the system. I am not sure whether it is still possible to replace the kernel, but IIRC some Hackintosh people replace it with custom builds (assuming that Apple has already released the relevant version).
Of course you probably do not want to do that, this is done for a reason.
Have you ever cleaned out the dust from the Macbook internals? If you've never done it, and your laptop is ~5 years old then dust is almost certainly part of your problem.
That would be the first thing I'd try honestly. You can Google lots of info/videos on how to do it. It's a pretty straightforward job if you're careful.
Interesting. I've noticed the fans going crazy way more when I use my laptop from bed (charger on my left side) vs my desk, (charger on my right side).
I figured it was the blanket restricting airflow (which I'm sure is still a big contributor). Interesting stuff.
I bet if you take a look at the motherboard for the year and model this happens on, the CPU is closer to the left side than the right. That would explain the kernel task being configured to force idleness when the left side is hot, but not the right.
This is a real problem on the 15" 2018 MacBook Pro.
Quick summary of my experience: it turned out the TB3 chip is on the left side by the power plug, and with both power and accessories flowing thru it, it warmed up more than the internal fans could handle, so the OS was spinning up phantom kernel_task cycles to attempt to slow and cool down the computer.
In my case case I was waiting for an 87W TB3 docking station to come along, and was in the meantime running 2 external monitors off the left side using Apple's dongles.
This looks like the post that saved my sanity. Apple has a pretty serious design flaw in its MacBooks.
Whenever I read this stuff now I just feel a little sad.
For years OS X (macOS?) has been my go-to, but lately I find myself wondering why I’m paying such a premium for what amounts to some nice-but-problematic hardware and an OS with a bolted-on package manager. To develop iPhone apps I guess?
I see some good points in this thread (e.g. iMessage). Maybe I’m just getting old. I’d rather just have a ThinkPad or an EliteBook running Ubuntu or, sigh, Windows. (I’m not a Windows fan but at least it doesn’t feel like it’s regressing like the Mac experience has been.)
> I’m not a Windows fan but at least it doesn’t feel like it’s regressing like the Mac experience has been
Maybe I am just crazy but I had to go back and use XP for a few days for a project I’m working on (long story). After being on it and going back to 10 I honestly miss it in a lot of ways.
The UX on XP was so clear; I still know where all the settings are and how to use them. I felt so in control. Now on Windows there are like 10 different settings menus with completely different design languages. I can’t find anything without search (and search is a disaster in and of itself). I can’t control updates well, and in the last year there have been multiple updates that borked machines.
Yeah, even the messages have become more patronizing.
"We're getting things set up for you."
"You'll need the Internet for this."
"We need to update some apps."
And my personal favorite, "Windows is a service and updates are a normal part of keeping it running smoothly. Ready? Restart now. Not ready? Pick a time that works for you."
For a bonus, the title to that wonderful dialog was "Let's cross this one off your list."
Agreed on the hardware. Part of it is perception - people expect crappy windows laptops to be crappy and break down. Apple is "supposed" to be perfect, and any problems are instantly literal front-page news.
MacOS has had a bad run, though. It is substantially buggier than it used to be - I see far more weird silent failures than I have in a long time (pre-Snow Leopard), bad performance in some cases, etc.
And then there is all the Catalina security-related breakage. Maybe that will work out for the bulk of users, but it broke the platform for me. That's the reason I don't see myself paying for another.
I used to feel that way until it bit me my own self. I had to trash an iMac with a bad graphics card that was past warranty. That was a lot of cash down the drain.
Then add the silliness (thankfully behind us) with the butterfly keyboards on the MacBooks.
I can’t speak for the rest of us, but between the glitchy (for me) experience with Catalina and some of the unwelcome (to me) changes around SIP, I definitely feel less and less happy every time I sit down to do work on my Mac.
> I also don’t understand the idea that MacOS has been regressing, it seems like a weird hivemind take around here lately.
Something always seems to break for me on macOS updates. Often it's related to third-party software, but it still feels like a regression for me.
For Catalina:
* I use Karabiner-Elements for keyboard customization. Since Catalina, hitting capslock will randomly stop the keyboard from working until the OS is rebooted. My solution has been to remove my capslock key to avoid inadvertent presses.
* Since iTunes was removed, Apple Music no longer works with Firefly Media Server. This means that any NAS device with iTunes library support can no longer serve that library to macOS. My solution has been to give up playing music on macOS, since I can't find any software for the relatively simple use case of "play my pre-organized folder from this artist in chronological order by album".
Couple of dumb-sounding questions after reading the link:
* Left or right side relative to what? Is it my right when looking at the laptop screen?
* Does this have any implications for reliability of Bluetooth transmitters? I use a mouse and keyboard that both plug into a USB-C hub, and sometimes have issues with mouse tracking, or keystrokes not being registered. My WFH setup has these things plugged in on the opposite side they normally would be, and I think I'm seeing this issue a little more frequently than I was in the office.
If you're using a Logitech Receiver (the small USB type A RF dongle [0]) plugged into a USB-C hub, interference might be causing your problems [1]. I used to have a setup like that, and switching to Apple's USB-C adapters (USB-C to USB type A and/or USB-C to HDMI+USB type A) solved the problems. Those adapters feature a short cable between the plug and the ports; I wonder if that's because of the interference?
* I hope so! I use a Bluetooth mouse and keyboard, and have had similar issues on my MBP 2019 (the latest one). The issues only seem to crop up when I'm at my desk, with the power plugged in on the left side and the USB-C hub (for external drives and monitor) on the right side. I now have a tiny bit of hope that swapping power and hub might fix these issues.
+1 Anecdata: I have a hub (USB, network, charging, HDMI) all going into a single USB-C port, really convenient. Watched the CPU usage of kernel_task for a minute while it was plugged into the left (which I have done by default), it hovered around 20-25%.
Switched to the right side port, CPU usage for kernel_task dropped to around 4-6%.
Switching back, letting it chooch for a while, still hovers around 5% but with more jumps to 6 or 9%. May be down to that side having cooled down a bit.
I'll keep an eye on it, that's pretty interesting.
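To make the left-vs-right comparison less eyeball-y, you can average kernel_task's CPU over a window for each port configuration (a sketch; `ps` and `awk` are standard, but the %cpu column is only meaningful here on macOS):

```shell
# Sample kernel_task's CPU usage once a second for 30 seconds and print the
# average, so left-port vs right-port runs can be compared with one number.
for i in $(seq 1 30); do
  ps -Ao comm,%cpu | awk '$1 == "kernel_task" { print $2 }'
  sleep 1
done | awk '{ sum += $1; n += 1 } END { if (n) printf "avg kernel_task CPU: %.1f%%\n", sum / n }'
```

Run it once with the hub on the left and once on the right, and let the machine cool down between runs so you're not just measuring residual heat.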
I've had this problem too for a while now with my MBP 2017. The left ports can't be used for charging at all. The moment I plug in the charger, the fans start running at full speed.
I'm having issues even with one of the right-side ports. I can't use it for anything. If I plug in my monitor, it shows me a "USB Accessories Disabled" error. If I plug in my HDD, it just doesn't work. I've tried resetting the SMC and PRAM. Has anyone had a similar experience?
My work machine, which is a 15" MBP, has such poor thermal performance that I'd never consider buying one with my own money. My personal machine is an Air, which only gets hot when I'm actually doing something taxing. I can live with that.
Generally I am a big fan of Apple products, but I don't know how they have the gall to ship these things with such hideously disruptive heat issues. Hopefully the new 16" model is the start of a swing back toward sanity.
I still wonder if there are durable brands for laptops, since I'm not sure the new thinkpads are so good. I guess dell makes good "pro" laptops, Toshiba maybe?
It doesn't seem there really are brands that can be considered reliable anymore. With Asia holding such a big market share in computer parts, it's not surprising that no laptop works well after 5 years.
MBP quality has declined consistently as the years go on. It would take me a page just to write out the critical issues my brand-new MBP is facing. Apple has not done much QC on its updates, to the point of an update temporarily bricking the computer.
I really can't stand the trend to keep making things thinner - it's gotten to the point where it's hard to find a work laptop I can get through my company that comes with an ethernet port. If we can miniaturize stuff so well then can't we miniaturize it all to spread components further apart in a predominantly empty but tall case so airflow stops being an issue?
I used to think apple was obsessed with thinness, but I think that's actually the byproduct of what they're actually obsessed with: weight. While they might seem like the same thing, and they are obviously related... I don't care so much about the change in thickness from an aesthetic standpoint, but I do appreciate that the latest generation of macbook pros are noticeably lighter in my hand and on my back. So, I can see one practical reason for doing it.
That’s all fine until they use this as an excuse to remove a drive, or a headphone port, or prevent you from changing the battery, and then refuse to admit they aren’t meeting the needs of their customers.
All Apple provides us is correlation, so it's entirely possible (likely even) that the removal was incidental and driven by competing internal needs (like removal of wires, progression towards thinness). Materially it doesn't matter at all what caused this because they don't offer phones with functionality people clearly desire that are otherwise commonplace.
It'd be much nicer if they simply offered things like removable battery, headphone, etc, and articulated how this would affect the form of the phone, but that would affect other priorities that Apple has. It's not difficult to accept, even on this forum, that materially their massive profit margin hurts their product offering.
Progression towards lower weight doesn't sound to me like an excuse for removing things; your parent comment was saying the motivation was progression towards thinness. I, by the way, am an iPhone user who wishes they had kept the jack.
I’m also not sure what you mean by their profit margin hurting their product offering. Do you mean their desire to keep iPhones an aloof luxury brand hurts their product offering? I suppose dropping surprises like removing jack X or button Y could be part of that, but I think it’s more likely they have found customers with different preferences than you or I to be a preponderance of their user base and decided that making us happier is not economically worth a totally separate mechanical design that they have to both manufacture and separately stock. Maybe they incited people to move to airpods, but their forced converts seem to largely like them.
The two aren’t really related though. If you took the current MacBook and made it twice as thick but kept all the same internal components, it might weigh an ounce more from the extra case metal. The rest would be air.
I agree that Apple's behavior isn't quite consistent with a simple obsession with thinness, but I don't think it's weight, either.
My 15" PowerBook in 2001 was 2.4 kg. The 15" MacBook Pro in the linked answer is 1.83 kg. (At their heaviest, they got to about 2.5 kg.) This seems like a lot of trouble for half a kilo (-25%). It also got much thinner over that time, from 2.5 cm to 1.5 cm (-40%).
It's probably a combination of factors. Jony Ive in 2001: "People will have a visceral reaction to its weight and volume."
Hey, I'm happy to have optimizations around weight and I understand that internal jacks and cards to handle specialized lines adds weight, but an integrated jack is going to add a lot less overall weight than a dongle and, while not everyone uses ethernet, I think it's at least common enough to justify keeping the jack around.
At work we all use laptops to allow working from home and we're also all hardwired in because there's no way we'd have reliable throughput over wifi with the multitude of employee work & personal devices our network supports.
I got a Lenovo X1 Carbon that is pretty thin and have an adapter for Ethernet; good enough for me, and it doesn't seem to have any performance issues. The form factor of the X1 makes me really love it.
But when it comes to smartphones, I agree. They are getting thinner and larger, possibly to make it easier to drop and break or something, because I sure don't get it. It's hard to find a good, reliable smartphone that is around 4 inches today so I can hold it without worry with one hand.
I have the carbon too, but I also have a Dell which is a similar thickness but with a drop down sprung Ethernet port. Annoying Lenovo couldn’t fit the same.
Thinner fits the general use case better probably, e.g. there's way more students who care about a laptop that can fit between textbooks than IT professionals who must have an Ethernet port on hand at all times.
Also, sweet money from selling all sorts of adapters. (I think by now I've spent at least a third of what I paid for my 2017 MBP on dongles and docks... this one correctly turns on the display on resume, this one doesn't but passes 4K without hiccups, etc.)
Can only speak for Denmark, where I'm from, but I'd venture that at least 50% of students have MacBook Pros. This is simply based on personal experience though. I might be way off. This is for the last three years of High School equivalent. University varies a lot depending on your field.
Honest question: why do you need ethernet onboard a laptop? USB-C ethernet interface adapters are cheap; could you not simply get a few and leave them on the ends of the ethernet cables to which you regularly attach?
These days, for most users (you and I included), very rarely if ever is wifi inadequate. If, for speed or security reasons physical ethernet is needed, the adapters are small and cheap.
We need(ed) them extensively for lab hardware access in the office. It's accessible over wi-fi and VPN but we often need a wired connection to directly test hardware functionality and performance.
So last year, after people kept losing their dongles, we started keeping USB-C dongles permanently plugged into the cable ends. It adds $20-$40 to every person-facing ethernet outlet, but nobody ever lacks one, and we noticed more people using Ethernet as a result because of how much faster and more stable it is than our wi-fi.
If anyone not new or transient to the office took the dongle off the cable, they got shamed very, very deeply. YMMV (we are a very shame-effective group) but buying 15 dongles once ended up being cheaper than buying 10 dongles, then 5 a few months later, then another 5, then oops we found the 5 that we thought were lost...
With all this remote work happening due to Covid-19 I think video conferencing is making clear the issues with trying to push a high amount of data through a congested airspace. Even at home in a concrete building I have plenty of channel interference and regularly get dsyncs from spotty wifi when gaming over my LAN with my wife.
Well, I know what you mean, but... I have a macbook for work and when we all converted to work from home, I needed to buy a USB-C ethernet adapter, thunderbolt adapter, and a USB-C to "old" USB adapter, and was shocked to find out that each one was $30-$40. I guess that's cheap in the grand scheme of things, but it would be nice if this stuff was still bundled into the computer.
My company provides them, but we buy off-brands and I assume in bulk. I asked IT about it once - could it really be that much cheaper? - and turns out yes, it's substantially cheaper, especially at scale.
I guess my point is, don't be afraid of off-brand dongles!
I'm on a Dell XPS now, and formerly a MBP 2015, so I haven't had a physical ethernet port in a really long time. I still miss it— I work at a robotics company, and a lot of my job revolves around the installer workflow (pxe, kexec, that kind of stuff), driver development (lasers, PLCs, and other devices that all speak Profinet and similar ethernet-based protocols), not to mention general fault investigation for when a robot "falls off the network" but can still be accessed over a wired last-ditch rescue interface.
So yes, I have dongles at my desk and there are stashes of them around the office, but it's still a source of frustration when they go missing, don't work, etc. I know I'm a bit of a special case, but I would definitely still appreciate a computer that was 50% thicker with a proper ethernet port.
My wife’s HP laptop has a spring-loaded full-sized jack that pops out. It adds no thickness and the laptop cost $300; there’s no ethernet jack on the Mac not because they can’t, but because they don’t want one.
A friend of mine works in an office where you need a hardline for the network and there's one coworker always leaving his USB-C adapter on the hotseat cat5s, and always complaining when others in the office take them off and repeatedly lose them.
I imagine it's probably as much of an inconvenience for everyone who has the port to take off the dongle and worry about losing it as it is an inconvenience for people who need the dongle in the first place. The solution to both seems to be "only distribute laptops that have the ports your business needs for daily work", but unfortunately companies don't seem to agree.
I'm also personally unimpressed with USB-C port stability, unlike USB-A connectors USB-C seems to have a lot of natural wiggle in it and the weight of dongles can wrench the cable out.
The thing is most people don’t use Macs or even personal computers for that matter anymore. It’s never been about a massive appeal product, it’s about meeting people’s needs. I think Apple remembers the good old days when they got rid of the CD ROM and everyone went from doubting it to patting them on the back and saying what a good job they did calling the trend.
Even in 2020, a professional computer is not really professional IMO if it does not have an ethernet jack. Even my wife’s $300 cheapo HP laptop has a drop-down ethernet jack, it adds no thickness. There is no technical or form limiting reason not to have an ethernet jack.
It's probably obvious, but it was obvious to me only in hindsight: do not block the ventilation vents of your MacBook Pro. E.g. this might happen if you use it on the bed (hopefully not for too long).
I got a MBP 2019 at the end of May 2019 - it was just released a few days before. My USB-C ports on the left of the computer do not power an external monitor, only the ports on the right. This doesn't seem to be called out in any specs, and another MBP 2019, purchased in October, doesn't exhibit that. I feel like I got some weird 'in-between' model. I can't find it now, but I thought I'd seen something in the 'system report' showing that some ports had more power than others...
I could swear at the time I remember reading that one set of ports had higher power than the other, and... IIRC, that's true for the 13" models (or was for some years?) but wasn't supposed to be for the 15" models.
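System Profiler exposes some of this from the command line, if anyone wants to compare what the two machines actually report (a sketch for macOS; the data types available vary by model and OS release):

```shell
# Dump what macOS knows about the Thunderbolt buses and ports:
system_profiler SPThunderboltDataType

# Power/charger details, including the wattage of the connected adapter:
system_profiler SPPowerDataType | grep -i -A 3 'AC Charger'
```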
I plugged my dongle and my charger in on the right. I also have Turbo Boost disabled, but now I think I've improved my thermals by ~6-10 degrees C. I did unplug my backup SSD, though.
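If you want actual numbers behind a before/after claim like that ~6-10 °C, powermetrics can read the SMC sensors on Intel Macs (a sketch; it requires sudo, and the available samplers and label wording vary by model):

```shell
# One sample of SMC data: CPU die temperature and fan speed on an Intel Mac.
sudo powermetrics --samplers smc -n 1 | grep -iE 'die temperature|fan'
```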
It's incredible to me that Apple has maybe 10 major products and they somehow can't focus on any of them not named "iPhone". Between the keyboards, thermal issues in the previous Mac Pro, ignoring the iPad's software, and other issues I've probably forgotten, what exactly is their dysfunction?
It's incredible to me that MacBooks are considered professional-grade hardware in the Silicon Valley hivemind, between functionality crippled by form-over-function design and the outright defects described in this article.
I feel like they did make investments in MacBooks, just not the kind I want. I care less about a thinner laptop; keep the same thickness with a nicer screen and better overall build quality. Apple's focused a bit too much on aesthetics over function, but with the recent 16” MacBook they're making course corrections.
Personally I can’t figure out my development workflow on Windows. I tried the Linux subsystem stuff and IO was just too damn slow. 15 minutes to install a few package dependencies. For a while, I ended up running Ubuntu on another machine that I would SSH into for all my side projects just so I have my trusty zshell and all the nice package managers and other build tools.
It’s not OS dependent. It’s just that doing it on a Mac lets me focus on the problem I want to solve vs. non-value-add stuff like getting things configured.
I’m relatively fast using zshell, vim, iTerm, tab space to quickly search and open a file or program, adding things to my .zsh rc file, quickly installing packages, etc. In Windows, everything is slower, and I generally feel disoriented.
Something as simple as running Docker wouldn’t work on Windows. Apparently I need Windows 10 Pro edition... and I had the license too, but upgrading would fail and roll back. It’s things like this that add up.
iPhones are >5x the revenue of Mac sales, and probably much more than that for profit:
"For the fiscal year 2019, the company's iPhone business accounted for approximately 54.7% of total sales; the company's Services segment made up approximately 17.7% of revenue; Mac sales generated 9.8% of total revenue; Wearables, Home and Accessories segment comprised 9.4% of the company's sales; the iPad accounted for 8.1% of the company's sales."
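The ">5x" figure checks out against the quoted shares; it's just arithmetic on the percentages above:

```shell
# iPhone share divided by Mac share, from the FY2019 breakdown quoted above.
awk 'BEGIN { printf "iPhone / Mac revenue ratio: %.2fx\n", 54.7 / 9.8 }'
# prints: iPhone / Mac revenue ratio: 5.58x
```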
So what? iPhone making a ton of money isn't a good excuse for why my laptop made by one of the richest companies in the world is a worse experience than my old MacBook Pro.