And I've noticed the hub got really unstable whenever the CPU fans would go wild. Looks like it was the controller overheating due to the shitty thermals that Jony Ive's Apple seems to keep pushing out. (Still the Apple of today).
Now that I've switched my ports to a different config, I've had no crashes in the last two days.
I swear, I wish I didn't love macOS so much (or wasn't so heavily invested in it), or I'd happily ditch it for a really powerful, well-cooled desktop and use that as my main machine. WSL makes this more palatable, but the unparalleled Retina support on macOS, my 15 years of using it, and plain built-up habit keep me from leaving. (I felt the same way when I first moved from Windows to macOS, but that was in my early 20s and I had lots of time to play with the OS.)
I also use Regolith Linux, which is a noob-friendly tiling window manager version of Ubuntu, and it feels so slick with multiple monitors.
You can call me too incompetent or whatever but I’ve been running into stupidly obscure bugs that even stumped some of my Linux guru friends.
Just as one example of many: when the device is connected to a Thunderbolt display, the Intel wifi driver crashes and restarts periodically, which freezes USB input for about 15 seconds every time. The issue persisted across different Linux distributions, kernels, and firmware versions.
Don't ask how long it took to figure this out. I have now connected a USB wifi dongle to the display's USB hub.
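For anyone chasing similar symptoms, a quick way to confirm a driver-reset theory is to count firmware restarts in the kernel log. A minimal sketch; the exact iwlwifi message strings below are an assumption and vary across kernel and firmware versions:

```python
import re
import shutil
import subprocess

# Message fragments that commonly appear in the kernel log when the Intel
# iwlwifi firmware crashes and reloads.  NOTE: these exact strings are an
# assumption -- the wording differs between kernel/firmware versions.
RESET_PATTERNS = [
    r"iwlwifi.*Microcode SW error",
    r"iwlwifi.*firmware.*(crash|restart)",
]

def count_driver_resets(log_text: str) -> int:
    """Count kernel-log lines that look like wifi firmware resets."""
    pattern = re.compile("|".join(RESET_PATTERNS), re.IGNORECASE)
    return sum(1 for line in log_text.splitlines() if pattern.search(line))

if __name__ == "__main__":
    # 'dmesg' may need root; 'journalctl -k' is an alternative on systemd.
    if shutil.which("dmesg"):
        log = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
        print(count_driver_resets(log), "suspected wifi firmware resets")
```

If the count climbs every time USB input freezes, you've found your culprit.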
I really miss the plug-and-play nature of macOS. I think Linux has its advantages and might be better on desktops, but it's just been horrible for me on a laptop. I might have to try a MacBook plus a fast Linux desktop next.
No, you're not incompetent. As usual, manufacturers keep putting weird features into their laptops that the Linux drivers and userspace can't keep up with - e.g. the debacle that is Nvidia's Prime GPU-switching tech for low-power vs. high-performance modes. It simply doesn't work most of the time, leaving you scratching your head. UEFI-related woes are also common as we finally have to give up on the decent experience legacy boot offered.
At this point the wise person buys a laptop with official Linux support out of the box. It helps you and it helps the community (vote with your dollars!).
If someone doesn't know how to dig into terminal commands and google stuff on their phone to troubleshoot, they shouldn't even attempt Linux on a desktop.
I actually prefer Thinkpads these days because everything "just works" when I install Fedora on it.
If you're an Ubuntu person and want it from the factory, I have heard that System76 build quality has gotten pretty good.
"Ubuntu Certified hardware has passed our extensive testing and review process to make sure Ubuntu runs well out of the box and it is ready for your business. We work closely with OEMs to jointly make Ubuntu available on a wide range of devices."
So in this case it is a collaboration between Ubuntu and OEMs.
I am not a power user of other peripherals though.
And someone can go, "gosh, you must be doing it wrong," and they're almost certainly correct! However, I'm a pretty big power user who can actually get things to work and dig into forums, so I realize that the average user has absolutely no chance.
In general, my experience using Debian-based distros as a laptop daily driver has been more or less frictionless. I use an older Lenovo ThinkPad (X1 Carbon, 1st generation, purchased secondhand for about $200 a year and a half ago).
In general, ThinkPads have a reputation for a plug-and-play Linux experience once you boot for the first time. I'd recommend giving them a try before throwing in the towel, if you have any patience in reserve.
You can get pre-owned (older = more stable Linux support) T-series ThinkPads with very nice specs, especially if you're willing to trade off display resolution. Plus, the parts that die (batteries, RAM) are all commodity and replaceable; you could presumably run the same laptop for a decade, if you're into that sort of thing.
If you’re using simpler programs (Firefox, terminal), your battery life should be great. I usually get four hours of heavy vim use with WiFi etc all on, on a battery that’s 75% capacity. YMMV.
It's not at MBP level, but I imagine it's fairly similar to what Windows would draw. Maybe better.
It’s a new X390 Yoga though. Maybe they’ve dropped the ball a little, it definitely matches my friends’ experience that the more traditional models are pretty stable.
Linux laptop experience is notorious for weird quirks and instabilities with the hardware. Desktop is a lot smoother.
If you want to go back to macOS because you don't have to be "lucky" to get a laptop that plays nice, I don't blame you. But for me, the tradeoffs to stay on Linux have been minimal and absolutely worth it.
I had to jump into Windows for a few things last month, and WSL2 has become pretty good and bearable; the Docker beta support for WSL2 is also really good, though it seems to use a bit more memory than I recall. A Linux CLI with a Windows GUI has been surprisingly bearable, and the Remote (WSL/SSH) extensions for VS Code are invaluable as well.
If you have to do a demo in a different room, it's a pain. Your best bet is usually to use another laptop.
Same thing if you want to have a call in a quiet place and want to check emails/reference during the call.
Work from home is also easier, you don't have to use 2 computers or carry your tower (though currently that's viable).
At work I'm on a dock; at home I'm on a 4K KVM switch, so I'm not really using the laptop except as the lighter computer. Just got bumped to 32GB of RAM, which was most of the laptop's bottleneck. Though my personal system (R9 3950X) is much faster than the laptop (i7-8550U), both on 1TB NVMe.
My old computer was over 5yo at upgrade, with a mid-cycle upgrade of a couple components, likely this will be the same, though I don't think I will ever go RGB again.
I did get HDMI adapters for my Pi 4s, but haven't actually tried them yet. You may need different adapters/cables if you're going to/from mini-DP or another interface. The back of the switch looks like USB-B female (standard USB cable, not USB 3) and HDMI female.
I've only been using it for about 4 weeks, but so far working well.
KVM (comes with 2 cable sets):
Extra 2 cable sets:
Micro HDMI to HDMI (for pi4):
I have two sofas, a small desk and a bigger one in two different rooms, a balcony, a table in the kitchen and one in the garden. I frequently switch and move between all of those, which helps me a lot in getting out of coding slumps and refocus. I was so happy to give up my static one desk multimonitor setup at the office for a nimble 13" laptop work from home situation.
To each his own.
Not very interesting discussion to be had in this direction.
What I've found is that since having a desktop and a dedicated desk and office in my house, when I leave the room I don't bring work with me. I also don't have push notifications enabled on my phone, including email. When I go out, I enjoy other things and then when I come back to my desk, I am much more focused and ready to concentrate on work.
I think this is probably a problem because I also play games on it so the room sends very mixed messages to my brain.
I help with a small business so have to use my PC for that. With my day job I have a laptop provided and do move around the house to get a mental disconnect and help me focus on some tasks. The problem is it's not a great device for stuff like in-depth research, I really want a big screen for that, so usually have to use the PC and get distracted.
I just unplugged my laptop from my standing desk and went to lay down for my postprandial chill session. I'm going to get some work done as soon as I'm done faffing about on HN (whomst among us...) and then I'll probably plug back in again.
It's just easier to do it this way rather than synchronize state between a desktop and laptop. One less thing to own, also.
Without going into people who are often on the road, many folks I know like working on the same computer at the office and at home.
It's usually easier to carry a laptop than a desktop. Even though there are many very small desktops nowadays (see HP's EliteDesk Mini - though it looks like a laptop without a screen, so I'm not sure it's that much more powerful), the laptop usually has fewer cables to unplug, so it's generally less of a pain.
Another angle is that for many people a laptop has enough power for the activities they do and being portable is a real plus. I'm typing this on a 2013 MBP in my bed. This laptop might be slow compared to a modern mid-range desktop, but it's not tethered to a fixed spot. When I need to do serious work, I can plug a 4K screen and external keyboard and have the desktop experience.
On the rare occasion when I need a lot of power for some task, I'll usually fire up some outrageous EC2 instance for an hour or two. It also has better network connectivity, which allows me to work comfortably over my parents' DSL line too.
I guess it all comes down to usage patterns. If you always use your computer on the same desk and never have the need to move it, I guess a desktop is a more effective use of funds. But many people seem to enjoy being able to carry the computer on a sofa, in the kitchen, etc.
They got rid of the desktop PCs and gave us all crappy laptops (they took a few years to catch up in power to what we had) and the argument was they could not ensure a secure environment on random home PCs. They were probably angling towards a hot-desk setup too but most people have a dedicated desk still.
I definitely preferred the old setup. I just struggle to do any meaningful work at all on a laptop; I need to plug it into a screen, but I can't dedicate that space, so normally I just put up with it rather than unplug my home setup.
They do provide docks in the office but unfortunately a few different generations of HP laptops are around so you might need to search for the right one.
Meanwhile, the MS Surface I'm typing this on has a blurry Task Manager and unreadable notifications because 75% of them fall off my Full HD screen.
Fractional scaling seems to be quite a mess, at least in KDE it breaks a lot of layouts.
The 3700X is a mid-to-high-range offering, btw. The 3600 and 3600X (which isn't worth it) are midrange.
Which laptop do you use? I tried to switch from my just-post-Intel MacBook Pro to the Lenovo X220 several years ago, figuring Linux support + IBM quality (they had been fairly newly bought out) would give me a solid machine. Turns out several stuck pixels on the monitor and a keyboard on which some keys didn't work were officially regarded as within acceptable quality range. (Plus, I hated the trackpad, but that's my preference rather than a hardware issue.)
But since that time, with Apple more focused on phones, the desktop OS hasn't gotten much better. The only thing they're doing is more and more cloud integration to lock you into the Apple ecosystem. I got tired of that.
Now I prefer to use Regolith Linux, because for development it's much better to have a Linux system with proper package management, without messing around with brew.
The distro is insanely fast, and with i3 you can almost forget about the mouse. It's also really minimalistic, and the default settings are really good; I didn't have any urge to change anything.
And it works really great on ThinkPad laptops, which have an amazing keyboard. For home, I'm using a NUC Hades Canyon with a last-gen desktop i7, which I bought for $300, and you know what, it's at least twice as fast as a 2019 15" MacBook Pro. RIP Mac mini. All drivers installed out of the box; even wifi and the external sound card work without a hitch.
I was you at this time about two years ago. Exact same thing. Fed up with Apple hardware bullshit, and with their pricing. With Apple in general. I had a sick iMac (as sick as an iMac can be, I mean). Loved and was invested in OSX. But something tipped the scales. Can't remember what specifically, but I said "fuck it" and put the machine on Craigslist. Got a buyer immediately, and lost very little money on a three year old machine. Took the cash, plus a little more, and built a PC. A liquid cooled PC. An egregious, so-ugly-it's-kinda-neat monster with tubes and a radiator and fans and the whole deal. I run Windows 10 Pro and Ubuntu. Neither is perfect for me. Windows especially can be maddening. My personal pet peeve is the lack of powerful device search -- in OSX, I could use Spotlight or better yet Alfred to look inside PDFs, for example. SOL on that in W10.
But it was worth it. My AMD-powered PC smokes anything in my Zip code, I'm pretty sure, and it's more than enough for my work needs (and my work does actually put the thing through its paces.)
I miss OSX, but not enough to go back.
And I'm curious: do you dual-boot Ubuntu, or do you use WSL? My biggest gripe with Windows is the insufferable terminal. But WSL fixes that, and WSL 2.0 is going to have full-platform Docker support as well.
Now that Apple insists on proprietary chips, has horrible thermal throttling, got rid of the MagSafe charger, and regressed on keyboard experience, I can’t say I’ll buy another MacBook if my current one dies.
I have both WSL and dual boot, but I dual boot way more. This is likely because I am a command line idiot, and never got fully fluent with navigating my computing life using one. It's high on my list of skills to master in life, because I know what a force multiplier it can be, but I haven't gotten around to it. I mostly use Linux to write, weirdly enough: from org mode to LaTeX to statistics coding, it's a smooth experience - again because it's pretty "just the basics" and does them well. I have no actual need for Linux - I'd be fine with just windows. I guess it's more aspirational on my part.
Apple's moves: yeah man, they're taking that walled garden shit to serious extremes. I know there's a logic to it that works for them, so I don't begrudge them their choices necessarily. But I did really like the company for a long time, so it's a bummer to see them the way they are now.
X1 Search is lightning fast, shows results as you type, and searches inside every file and all of your email and attachments - not just what's open in your mail client but also email archive files. I have email archives back to 2000, and the lookup is still instant.
X1 is actually one of the two things that had me switch back to Windows the two times I've gone all in on a switch to Mac. (Ironically back then the motivation was Apple's hardware was much better.)
Not content search, but for general file search, Everything is indispensable. I have it bound to Win-Shift-F.
I have a mid-2012 MacBook Air that I still love. The screen isn't nearly as nice, but I use it over the MBP because the keyboard isn't like typing on cement and I can actually use it on my lap without feeling like it's actively trying to burn my balls off. In fairness, though, I have an XPS 9560 and a 9360, and both overheat and throttle like crazy and require ThrottleStop. Lone cowboy admin for a small company, so I've got a pile of laptops.
I'm legitimately curious. I've used OS X a bit, but am primarily a windows/linux user.
In 20+ years of computer use, I've never wanted this facility. I actually removed the search indexing system from Windows 7 (where you still could) and just used Everything's filename search. On W10, I deliberately disable/break Cortana/search so it doesn't run all the time.
Out of curiosity, what do you use a file-aware search facility for?
With the exception of the big famous names (famous for the 13 of us in our niche, anyway), I rarely remember those studies' authors, nor the titles of the papers most of the time - i.e. the data encoded in the filename. But I do remember certain phrases, numbers, and the like that they use in the body of the paper - in other words, the material that actually interests me. File content search for PDFs and other text allows me to enter one of these snippets and find the paper in question without having to spend minutes upon minutes scratching my head about "who was that lady at NYU ... or was it a guy at Berkeley"?
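For plain-text notes, the core of that workflow can be sketched in a few lines of stdlib Python (searching inside PDFs additionally needs a text-extraction library, which I'm leaving out). This is a linear scan, not an indexed search like Spotlight or X1:

```python
from pathlib import Path

def find_files_containing(root, phrase, exts=(".txt", ".md", ".tex")):
    """Return paths under `root` whose contents mention `phrase`,
    case-insensitively.  Linear scan, no index: fine for a folder of
    notes, far too slow for a 20-year email archive."""
    needle = phrase.lower()
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in exts:
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue  # unreadable file; skip it
            if needle in text.lower():
                hits.append(path)
    return sorted(hits)
```

The point is the query style: you search for the phrase you remember ("0.82", "ventral stream"), not the filename you've forgotten.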
Do you have experience with doing the Hackintosh thing at all?
Might be a very different experience on a desktop, but definitely read up on the update experience and time-cost if you go this route.
That being said, the good thing is that the situation is improving: the documentation is constantly being updated and consolidated (which can be its own evil, since there is frankly too much documentation out there, most of it outdated), the tools are getting easier to use and performing their functions in less hacky ways, and the community of Hackintosh builders is growing.

But be advised that the vast majority of Hackintosh users really have no idea what they've done to their systems beyond being able to regurgitate the instructions in whatever guide they followed. So most of the posts and replies on the forums and subreddit will not help you solve the inevitable issues you'll run into. Probably 95% of thread replies are other users flailing around with their own similar-sounding problems, suggesting essentially random switches to flip in the configuration files (further complicated by completely new issues introduced between version updates, as the sibling comment mentions). This is problematic because everyone's using completely different hardware, so none of the ubiquitous suggestions of "you need to enable this setting since it worked for me" are applicable.

Successfully building a Hackintosh essentially comes down to loading the particular firmware settings and hardware drivers that happen to work for your set of devices. So go into it with eyes wide open to the fact that this is a large community standing firmly on the shoulders of a very few giants, and be mindful that you can physically damage your machine if you take the wrong suggestion from a random forum user. The most helpful external (non-Hackintosh) documentation I've referred to during this process has been the current UEFI and ACPI specifications; those are useful to have handy. Good luck!
Anyway maybe one day I'll do it for fun on a crappy laptop I get off Craigslist. Sounds like this pays off most when it's a low-risk effort.
With that said, I have been noticing more and more people building these insanely powerful desktop machines (a lot of times for less than a MacBook Pro) to keep at home.
Here is where it gets interesting: then the same group of people start walking around with a burner Chromebook bought on the cheap, running nothing more than a shell with SSH, or simply remote into their desktop via Apache Guacamole.
Technology is pretty neat.
Equivalent setups with Windows and Mac are sub-optimal, since there is only so much you can hack the window manager to do what you want. But you do need to go through some legwork and a learning curve to get Linux working for you.
Definitely a better experience than I remember a year and a half or so ago. The new MS terminal works pretty well, and the Docker WSL2 support is very seamless. VS Code with WSL extensions works great. I spend most of my time in that space and have had so few issues.
Note: editing \\wsl$ files from Windows is a little slow, and the same goes for WSL editing mounted Windows drives... but inside the sandbox it's been really great.
I returned my 2018 Mac Mini because I was frustrated with all the constraints:
- eGPU required a PC-like external chassis, totally defeating the point of the Mini.
- Want to upgrade the storage? You need an external drive chassis and a free TB3 port.
- Only 2 type-A USB ports
I got fed up with it and returned the Mini.
So, I ended up building a really nice mini-ITX Hackintosh for the same price as what I paid for the Mini. It's got a couple of NVMe sticks in it and a 10TB HDD. The whole thing is about the same size as an eGPU chassis alone. It's quiet, it stays cool, and it's relatively rock-solid.
That was almost 18 months ago and I still don't regret it a bit.
It's the "Mac" for me.
But laptops are a lot trickier. You need drivers (or patches) for a lot of extra hardware (your trackpad, your battery-level-reader, the screen brightness controller, etc); you need sleep and cpu power management to work; you usually can’t just swap out the wifi card with a Mac-compatible one, etc.
It can absolutely be done, but you need to do a lot of research on compatibility beforehand. And as a result, you may discover your options aren’t really all that much better than they were in Real Mac land.
I would go for it (ie start doing research) only if there’s something specific you really want in a laptop that Apple simply doesn’t offer. A touch screen, for instance.
• Any laptop with an nVidia GPU
• Any laptop that uses switchable graphics (unless you're okay with terrible battery life from the GPU being always on)
• Any AMD laptop (because even with custom cpu patches, the integrated graphics won't work).
That's a lot of laptops, particularly in the type of segments people would likely be most interested in, since Apple doesn't make them. Combined with the aforementioned wifi compatibility problems, you really need to do your research first!
Thanks for the tip!
If you need to run, say, Sketch or Omnigraffle, then it would make more sense!
Those two things together are what's able to provide close-to-native performance.
Mind, VMWare + GPU passthrough really should have more than acceptable performance, so I'm surprised.
Do you know a type-1 hypervisor I could use on Linux? Will QEMU do? Thanks for the info!
It's a somewhat involved process, especially for GPU passthrough.
But on Linux, you can do GPU passthrough for macOS (and Windows) guests.
In particular, since no VM has graphics acceleration for macOS guests, and because macOS relies very heavily on graphics acceleration, GPU passthrough is basically the only way to comfortably use macOS inside of a VM.
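Before attempting passthrough, the usual first check is whether the GPU sits in its own IOMMU group (devices in a group can only be passed through together). Here's a sketch of walking the sysfs layout Linux exposes for this; the path is the standard one, but it's only populated when the IOMMU is enabled both in firmware and on the kernel command line:

```python
from pathlib import Path

def iommu_groups(base="/sys/kernel/iommu_groups"):
    """Map IOMMU group name -> sorted PCI addresses of devices in it.

    An empty result usually means the IOMMU is off (no intel_iommu=on /
    amd_iommu=on kernel parameter, or disabled in firmware setup).
    """
    groups = {}
    root = Path(base)
    if not root.is_dir():
        return groups
    for group_dir in root.iterdir():
        devices = group_dir / "devices"
        if devices.is_dir():
            groups[group_dir.name] = sorted(d.name for d in devices.iterdir())
    return groups

if __name__ == "__main__":
    for name, devs in sorted(iommu_groups().items(), key=lambda kv: kv[0]):
        print(name, *devs)
```

If the GPU and its HDMI audio function share a group with nothing else, you're in good shape; if unrelated devices are in the same group, passthrough gets much hairier.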
It would work if I kept this laptop as my backup (can't stop working for a day or two while I fix all that stuff).
Also, if you’re on Hackintosh, Apple isn’t touching your bootloader!
I still use MBP for travelling but not when at home.
W/ the new ARM news coming out, I kinda wonder if the Intel line is on life support and Mac engineering attention is all-in on the new ARM macs?
I have 3 cables hooked up. First is for the apple TB3 to TB2 adapter so I can reuse my TB2 hub. Second is a USB C to DisplayPort cable for my second 4k monitor (cuz I couldn't run both 4Ks off the hub), and the third is power.
What are you referring to? Windows has hires, wide gamut support.
If you are talking about the icon mess that Windows is nowadays, then I agree. But I hardly care about icons in the operating system.
If you want a powerful system, an 8-core 16-thread Zen2 Ryzen with an RTX 2080 Super is a beast for the same kind of prices you get a Macbook Pro.
macOS and mac apps support high-DPI essentially flawlessly. On Windows even system dialogs have blurry text, as do many third-party apps (such as Mathematica until the very latest release).
The problem is that Windows has way better backwards compatibility, while Apple routinely kills old tech, so app developers on macOS have to keep up, which is good for the user.
On the other hand, of course, you can still run very old apps on Windows, while Apple no longer supports 32-bit apps, modern OpenGL, or Vulkan.
I also think Apple's decision to support only 1x or 2x scaling was the right choice. It's the wild west on Windows when it comes to high DPI: half the UI is scaled up to "retina" and the other half is tiny boxes or text. On macOS, by contrast, objects are always the "correct" size relative to the objects around them.
The problem you describe is probably old applications which haven't been properly updated to support high DPI. If they do custom drawing with their own controls or frameworks, Windows can't do anything to fix it.
I do have very few icons in a normally hidden taskbar but as I set their size to tiny they have no particular look at all. I just distinguish them by color pattern.
Not sure what problems you have with dragging. I have two 32" 4K monitors hooked up (one in portrait orientation) and don't experience any particular problems.
Two identical monitors is the happy path. Nonidentical ones get interesting.
I have a small high-ppi laptop hooked up to a large low-ppi monitor, and that confuses a number of apps when moved to the monitor they didn't start on. Most notably Visual Studio; some widgets are the wrong size, and the text is slightly blurred due to some up/downscaling issues.
Most of these things have probably been fixed, but I'm not sure. Windows devs?
Although I haven't regularly used SSMS in a while (thanks to Azure Data Studio), the first thing I used to do was change the fonts to the "new" (15 year old) ClearType fonts. Like Consolas everywhere.
Then it looks way better.
I have two monitors, and Windows scales apps equally on both even though one is 5120x1440 and the other is 1080p. The result is that I either pick small icons on the big monitor or big icons on the small one.
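The mismatch is easy to quantify: the two panels have different pixel densities, so one global scale factor can't make UI elements the same physical size on both. A back-of-the-envelope calculation, where the diagonals are my assumptions (5120x1440 is typically a 49" super-ultrawide; I take the 1080p panel as 24"):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed panel sizes -- adjust for the actual monitors.
big = ppi(5120, 1440, 49.0)     # ~109 PPI
small = ppi(1920, 1080, 24.0)   # ~92 PPI

# With one global scale factor, UI drawn "right" on one panel is off by
# roughly this ratio on the other.
print(f"{big:.0f} PPI vs {small:.0f} PPI (ratio {big / small:.2f})")
```

Per-monitor DPI scaling is exactly the feature meant to close that gap, but only for apps that opt in.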
Then again, maybe it won't be so bad. Worth a try at some point just setting up a couple of my elixir and ember projects to run off WSL2.
And all the software "just works" at this resolution nicely.
> Apple says "we are listening now, and here is a new cooling design," then it comes out to be even less adequate that the old one. I can't think of anybody else capable of trolling up their customers like that.
Apple's thermal engineering is simply bad, and doing it badly is company policy.
There is no other believable explanation to me. Apple has been promising to fix their thermal design for years on end, with each year's model supposedly having better thermals than the previous one, but in reality all their designs have been consistently crappy.
The only explanation I can come up with is that they treat thermals as a subtle marketing feature, just like makers of laptops with crappy batteries always find ways to conjure battery life out of thin air.
The correctness of that statement is only notional.
All of their current models could have much better thermals without increasing their bulk if Apple actually tried to make it so. Quite a number of other makers achieve superior thermals in even thinner packages, and, more importantly, cheaper ones.
So far, none of their recent models show a single sign of real thermal engineering being done. Their 16-inch model has a thermal solution I would only expect to see in a $300 white-label laptop. Yes, they added a few extra millimetres to the fans, but they still use the same single skinny heatpipe and tiny radiators.
And all of that is while they have access to the best parts and fabrication services on the market. If you look closely at their BOM, there are many surprisingly low-spec parts and very minimalistic, spartan design decisions.
Which "$300 white label laptop" has equivalent thermal design to the 16" MBP?
Which "other makers" actually have "superior thermals in even thinner packages" that are "cheaper"? That sounds like bullshit to me, given what I've seen and experienced in the PC market, where loud fans that run all the time are very common.
And even when the Aero is running games or training a neural net, it's only ~70°C
1: The thermal solutions in MacBooks are comparable to some $300 laptops.
2: Other companies make laptops that are just as thin and powerful as MacBooks but have much better thermals.
The list they posted is 2, not 1.
The relevant comparison is to the $2400 MBP base price.
Thermal zones in servers are a good example of cooling a small physical footprint with high power density quite well. Engineering effort spent on a good thermal solution can let a thin device outperform a thicker one.
It's just that Apple does not seem to be spending the resources there.
It's not just that they haven't put enough engineering effort in - they consciously made a design tradeoff.
But yeah it's all trade-offs.
But the same design constraints were used in a mini-ITX gaming PC on the Linus Tech Tips channel a few months ago... I can't seem to find the video now, though. The thermal performance was amazing.
For contrast, I've got a Sun Fire T2000 at home that's louder than the airliners flying overhead (my current and previous apartment both happen to be under airport flight paths, for SFO and RNO, respectively). For obvious reasons (at least until I can figure out how to get a reliable network connection to my garage) that one gets run pretty sparingly, lol
The Microsoft Surface Books also have a 1060 inside, but they are whisper quiet.
Yes. And it's not even a subtle marketing feature - reviewers openly ogle internal designs.
But it's also partly in their DNA. Landing on the wrong side of a compromise is something they've been doing for a long time, even in desktops.
Unfortunately, some conditions force the discrete GPU to activate - one of which is "being plugged into an external monitor." Even if you've only got terminals open, the GPU runs real hot with 1% load, and the fans ramp up to match. (This may only be true for some monitors - my work-provided monitor is the Apple Thunderbolt Display).
Sometimes Slack forces the discrete GPU to turn on, for example when clicking on an embedded youtube video. The discrete GPU will remain in use until Slack is restarted. Other applications sometimes behave similarly - I use https://gfx.io/ to see what applications are forcing it on.
Perhaps the cooling engineering is better, but the practical effects of it make me miss my 13" laptop.
Had something similar with AMD (desktop) GPU some years ago. It wouldn't go to the lower power states if my desktop refresh rate was set above 119 Hz. So it would be hot and fairly loud. So I ended up using 119 Hz on the desktop and configure games to use 144 Hz.
I also noticed that the video card wouldn't decrease the RAM clock rate significantly, but when on the desktop reducing the RAM clock a lot had a very noticeable impact on heat and no measurable performance degradation. The answer I got for that was that it was tricky to dynamically scale the RAM to such a degree (IIRC I set it to half normal speed). I ended up using an overclocking tool with profiles, worked fine.
It's frustrating that this problem is present with all-Apple hardware.
However, I'm not sure whether that's due to a recent software update or the fact I recently switched the cable I use from HDMI to DisplayPort.
My XPS 15 had a similar issue where turning on the GPU made it heat up and get loud - in fact the discrete GPU offered no performance increase, because the heat made everything throttle... sad! At least I could use an external monitor with that one, though.
It's just a limitation of the laptop form factor. My desktop has a 45 watt processor, same as the i7s, and it has a huge block of metal and a 120mm fan to keep it cool quietly (but still audible under load). There's no way to fit a 45w CPU and 45w GPU into a laptop and make it work.
I'm frustrated because the fans often spin up loud when GPU usage and CPU usage are both below 10% - when I'm literally just reading my email.
I manually disable Intel Turbo Boost now, though. And for some reason, when I start up my laptop it can reach around 90°C if I have a bunch of things open (I guess it turbos while restoring previously open apps), but that makes sense.
Corollary: when in doubt, don't.
This ability will be removed in 10.16
I also recall the patches people had for things like Jazz Jackrabbit that ran a busy-loop some number of times in hot code because timing was execution-dependent in games like that. Faster CPUs made the game unplayable!
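The underlying bug is worth spelling out: a delay measured in loop iterations scales with CPU speed, while a delay anchored to a clock does not. A toy illustration of the two styles (not the actual patch, just the principle):

```python
import time

def delay_by_iterations(n):
    """The fragile style: 'wait' by spinning n times.  On a CPU twice
    as fast this takes half as long, so the game runs at double speed."""
    count = 0
    while count < n:
        count += 1

def delay_by_clock(seconds):
    """The robust style: busy-wait against a monotonic clock.  Elapsed
    wall time is the same no matter how fast the CPU spins."""
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        pass

# The community patches essentially converted game timing from the first
# style to the second.
```

The same mistake shows up whenever execution speed is used as a proxy for time; clocks are the only portable reference.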
I wish they’d make a thicker laptop with more room inside, but that’s just not what they do.
Apple aren't stuffing a third-party CPU into an inadequate cooling solution with the iPhone because there isn't one. If there were an equivalent to the i7/i9 in terms of mindshare in the mobile market, I wouldn't be surprised if Apple released a phone with a low-clocked and badly cooled one of those too.
Some releases have been worse than others, this would be my favourite example: https://www.notebookcheck.net/Apple-MacBook-Pro-15-Core-i9-s...
It might have escaped your notice that (1) third party ARM CPUs for mobile devices are not exactly difficult to find and (2) Apple was in fact using one before they decided to take the design in-house.
See also: Intel labeling middling 2c/4t CPUs as 'i7' a few generations ago despite them not being at all comparable to the desktop equivalents, because they know people will buy them based on the model number and not the actual performance. (https://ark.intel.com/content/www/us/en/ark/products/95451/i...)
And Nvidia giving all their mobile GPUs the same names as the desktop cards (pre-10XX gen; they're actually the same cards with slightly lower clocks now) despite them being entirely different hardware. (https://en.wikipedia.org/wiki/GeForce_900_series#Products)
Actual performance doesn't sell nearly as well as perceived performance.
(Also I guess it's all about marketing for everyone)
I think I saw a bit of that phenomenon with Snapdragon variations and # of cores.
I feel like the top answer is missing the forest for the trees. If the temperature of the chassis rises past 100 degrees because a peripheral was plugged in, and degrades performance if all of the peripheral ports are in use... that's not a usable computer.
Edit: Ah, 100F is about 38C, so that's not so bad.
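The conversion that tripped me up, as a quick shell one-liner for anyone else double-checking temperatures in this thread:

```shell
# Fahrenheit -> Celsius: C = (F - 32) * 5/9
f2c() { awk -v f="$1" 'BEGIN { printf "%.1f\n", (f - 32) * 5 / 9 }'; }

f2c 100     # → 37.8   (warm chassis, not dangerous)
f2c 242.6   # → 117.0  (the Bootcamp horror story below)
```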
I was running Windows 7 in Bootcamp and I wanted to set up Gentoo Linux in a virtual machine for some Linux work I needed to do
I left the machine on my desk to compile a kernel. Basic wooden desk, nothing underneath or around it. No problems there
Roughly 5 minutes later, the system had reached what Speccy reported to be a scorching 117 degrees Celsius! (242.6F)
I immediately shut it down and left it to cool off, then asked around on an IRC full of various flavours of IT people (programmers etc)
The horrifying answers I got were that this was INTENTIONAL and that "the system acts as a giant heat sink" which is why it didn't power off after crossing a threshold
As far as I understand it, running it under Bootcamp also disabled any kind of thermal throttling and forced the more power hungry "Radeon" graphics chip to be used, further adding to the problem
Hell, boiling water would cool the machine.
The top cover part between the touch bar and the screen regularly runs so hot that touching it really hurts. Can't leave my hand on there for more than a second or two. Ran all sorts of diagnostics and according to those everything is fine. (MBP 2017)
Note that was back in 2009. Macbook thermals: not good.
Edit: Well TIL there is an awful lot of heat to get rid of.
I've got an old Thinkpad with an i7-3920xm that regularly runs at 95C when running a few VMs and is certified by Intel up to 105C. And yes, I've tried replacing the thermal paste and increasing fan speeds to keep temperatures down (not that every attempt made a difference), but it has been running like that with no issues for years.
> Specified Over -55 to 225°C
> Typically, parts will operate up to 300°C for a year, with derated performance.
Whether 100°C is fine for a CPU is a question for the individual CPU.
Mosfets have a maximum junction temperature of about 175C
Used in e.g. turbine engines, instrumentation in oil wells or mining operations, and industrial process control. You can get microcontrollers that work in similar temperature ranges, “in any architecture as long as it’s an 8051”.
What is annoying:
- runs extremely hot, easily
- fans are often on full force even though nothing special is happening
- before buying the ridiculously expensive AirPods Pro, re-connecting them (normal AirPods) after a disconnect often required a system reboot (to the point I’ve made the reboot part of my pre-meeting schedule)
- connecting external devices usually requires adapters. Even with them, when an adapter stopped working it usually helped to connect it on the right side (reboots didn't help)
- for some reason the adapter would work on the left side again after some time
- as soon as I connect the external screen (not even a retina one) the fans go louder
- Time Machine is a PITA. Every time it runs the fans blare up and the machine gets hot; the system even gets significantly slower. Even if it's just backing up a 150MB diff. And it runs multiple times a day, with no configuration option other than disabling auto-backup entirely.
- Windows' decade-old window management via shortcuts still doesn't exist in macOS (you have to install a third-party tool for that)
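On the Time Machine point: as far as I know `tmutil` really is the only built-in lever, and it's all-or-nothing, as complained about above. A hedged sketch of the workflow (macOS only; guarded so it's a no-op elsewhere):

```shell
# macOS only; the guard makes this a no-op on other systems.
# sudo -n fails fast instead of prompting for a password.
if command -v tmutil >/dev/null 2>&1; then
    sudo -n tmutil disable        # stop automatic backups entirely
    tmutil startbackup --auto     # kick off a one-shot backup on demand
    sudo -n tmutil enable         # restore the default hourly schedule
fi
```

So the best you can do is turn the schedule off and fire backups manually when the fan noise won't matter.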
To add a contrasting datapoint, my (gen 2) AirPods have been virtually flawless for a year or so with my 15" 2017 MBP.
Did you try resetting your Bluetooth stack? https://www.macrumors.com/how-to/reset-mac-bluetooth-module/
After reading that, nothing I read about USB-C surprises me. Sounds like another spec for vendors to abuse and ignore.
USB-C is a complete mess. I can't believe Apple dropped MagSafe for it. They could have made all the I/O ports Type-C but kept MagSafe for charging, with an optional Type-C charging port on the right side.
It is solid metal with lots of airflow underneath.
I have a 2018 13" Macbook Pro.
(I'm in no way affiliated with Roost, I'm just a happy customer.)
Thick aluminum or copper seems rarer; I don't know if it's commonly produced.
You're not trying to move huge amounts of air, since heat transfer from the laptop to the air is pretty slow. You just need enough movement to clear out any hot air that is building up under the laptop.
On a nearly totally different tangent, our own bodies build up pockets of hot air indoors as well, wherever there's no natural breeze to clear them. Getting rid of that air creates a surprisingly strong cooling effect, so you probably won't need AC for a while longer than you'd expect. "Air circulator" fans are pretty good at doing this over a whole room, so you don't need to keep a fan directed at yourself.
Then I realized, that was "without fans".
$260 for a laptop stand.