This appears to be a software bug in macOS or AMD's drivers. Plugging in an external monitor at certain resolutions (1440p@60Hz in my case) will cause the AMD GPU to draw 18-20W constantly, no matter what it's doing. This causes the total system draw to increase from ~10W to ~30W. All that heat has to go somewhere. If I change the resolution of my external monitor to 1080p the AMD GPU drops to ~4W and the fans are quiet again. A lot more information available in this long MacRumors thread: https://forums.macrumors.com/threads/16-is-hot-noisy-with-an...
This is a bug on MacBooks in general, as I understand it. I have a 2019 MBP with a 4k monitor, and if I change the resolution to anything except the default, there is massive system-wide lag. God forbid I use PyCharm at a scaled 4k resolution – a full second between typing and seeing the text.
That seems unrelated; I think macOS is the only place where non-native resolutions use GPU-accelerated scaling. The other operating systems just set whatever resolution you asked for and let the monitor handle it.
This seems to be excessive GPU use when simply connecting an external monitor at its native res.
The 2015 MBP is a workhorse, but it too had issues with external monitors. I used to connect it to two external 1080p monitors, one via HDMI and the other with some ancient adapter. When I plugged in the latter, my internet would suddenly stop working. This was apparently due to a multi-year bug where the port's adapter sat so close to the wireless card that it interfered with the card's normal operation. Upgrading the adapter fixed the problem (it was one of the old display adapters with physical pins you had to screw in) - it took a very long time to debug that issue!
The MBP 2015 with discrete GPU is hardwired to output video signal through AMD M370X. It can get quite noisy when just connected to an external display.
Also, at least macOS renders windows, text, etc. at a consistent resolution. On Windows (after Win 8), old applications often run with *blurred* text and icons while the shiny new Win apps have no such problem. It's actually one of the reasons I don't use Windows anymore, although it's been making good progress.
It's technically consistent; Windows just chooses to rasterize non-DPI-aware applications at the lowest-common-denominator DPI and scale them up on high-DPI monitors. AFAIK macOS does the opposite, and rasterizes everything at the highest DPI and then scales down as necessary. The latter looks much better, so I'm not totally sure why Windows doesn't do it - perhaps a compatibility hack for apps that expect the classic 96 DPI.
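Whether an old app gets the blurry bitmap treatment basically comes down to whether it declared itself DPI-aware. Here's a rough sketch of how a program opts in (Python via ctypes on Windows 8.1+; the constant's value is my recollection of shellscalingapi.h, so treat the details as assumptions):

```python
# Sketch: opt the current process into per-monitor DPI awareness so Windows
# stops bitmap-scaling its windows on high-DPI monitors. Windows-only.
import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2  # assumed value from shellscalingapi.h

try:
    # Windows 8.1+ path via shcore.dll
    ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
except (AttributeError, OSError):
    # Older Windows: only system-wide DPI awareness is available
    ctypes.windll.user32.SetProcessDPIAware()
```

Apps that never make a call like this (and don't carry the equivalent manifest entry) are the ones Windows rasterizes at 96 DPI and stretches.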
Newer versions of Win10 actually have an odd variant of this that works more like macOS, where Windows will sometimes rasterize at a higher DPI and scale down. It looks pretty nice, but it's not compatible with all old applications; I only have it enabled for a couple.
But those apps were written for a specific resolution, and you (probably) cannot make them render themselves at a higher resolution. And macOS, unlike Windows, simply doesn't support running old apps.
NVIDIA drivers on Windows have an option to scale on the GPU instead of on the monitor, though I'm not certain whether they ever default that on. Presumably it's there as a workaround for people with bad/broken monitors. It'd be interesting to try and measure whether that option raises power draw...
I've never had an NVIDIA driver with it on by default. I've run two different computers with NVIDIA GPUs through a TV and had to manually scale the picture on both of them when I couldn't get the TV's auto-scaling to work properly.
I can speak from personal experience that NVIDIA GPUs will use bilinear scaling on Windows to display non-native resolutions on displays that don't have built-in scalers. This isn't a rare situation either. As far as I know, every display with a G-SYNC module inherently does not support display-side scaling. Also worth noting, NVIDIA lets you choose whether to use their scaler or the display's scaler.
I've seen tons of other bugs related to MBP+external monitor over the years, with multiple MBP / OS versions.
My current issue is that I use an ultrawide curved monitor at work, and when I lock the screen and then unlock it, some of my previous window positions are changed on the external monitor. My large-but-not-fullscreened IDE that takes up ~85% of the external monitor is resized and shifted off screen, every time. I have to use the 4-finger swipe up gesture to see all windows, drag the now-visible IDE window into a new desktop panel at the very top, then minimize it from fullscreen, to get back to where I was before I pressed ctrl+cmd+q...
I think the lag is common to all JetBrains IDEs: IDEA, WS, PyCharm, etc. IIRC, it's the JVM not using any acceleration for rendering, and there might be another bug whereby using the integrated display and an external 4k panel at the same time causes massive lag. Try closing the lid with your 4k monitor attached and see if things speed up.
In any case, I feel you on the system-wide lag. I basically just avoid using JetBrains products at this point.
Wow, a few years ago I was working in the Java world and using IntelliJ on a daily basis. Any time I used a 4k monitor, I simply could not type anything or scroll through code without insane delay. I googled it and found a ton of people having the same issue. I could only use IntelliJ sanely if I set my resolution to its highest. Is this all interconnected?
Same here, but recent versions are much better. If you take a screenshot when running one of the scaled modes on 4k it's actually rendering at a far higher resolution, which then gets scaled down. Almost 6K IIRC, which on a laptop is pretty much insane.
I have medium-to-high fan noise/RPM consistently when connected to an Apple TB display (2560x1440@60Hz), but it's much quieter when connected to a 2560x1600@60Hz monitor. So if it is very specific resolutions, but not necessarily higher resolutions in general, that matches my experience.
So I first thought that using gfxCardStatus to force the use of integrated graphics could help.
However:
> gfxCardStatus v2.3 and above actively prevents you from switching to Integrated Only mode when any apps are in the Dependencies list (or if you have an external display plugged in). This is because if you were to do this, your discrete GPU would actually stay powered on, even though you've switched to the integrated GPU.
Because of the way the Thunderbolt ports are wired, you cannot drive an external display without activating the discrete GPU. It's been like this in MacBook Pros for a long time.
FWIW, NVIDIA GPUs have behaved similarly on Windows under normal conditions forever. After following up with tech support about it, the explanation provided can basically be summarized as: 'the higher the resolution and refresh rate of your attached monitors, the more demand is placed on the memory controller and scan-out, which means a higher minimum GPU and/or memory clock to drive it'.
My current RTX 20xx card can drive two 4k monitors at low clocks, but the 9xx series really struggled and the 10 series still needed to clock up some. I suspect something similar is happening here with the AMD GPU: if you're running the laptop's internal Retina display along with a 1440p60 external display, it's probably forced to clock up to drive that many pixels. I'm not really sure how AMD could get around basic hardware limitations here unless they have a really clever way of driving displays at a low memory clock.
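To put rough numbers on the scan-out argument, here's a back-of-the-envelope sketch (Python). The 4 bytes per pixel and the specific modes are my assumptions, and it ignores blanking intervals and any framebuffer compression:

```python
# Approximate sustained memory read rate needed just to refresh each display.
def scanout_gbs(width, height, hz, bytes_per_pixel=4):
    return width * height * hz * bytes_per_pixel / 1e9

internal = (3072, 1920, 60)   # 16" MBP panel at native resolution
external = (2560, 1440, 60)   # 1440p60 external display

for name, mode in [("internal panel", internal), ("external 1440p60", external)]:
    print(f"{name}: {scanout_gbs(*mode):.2f} GB/s")

print(f"both together: {scanout_gbs(*internal) + scanout_gbs(*external):.2f} GB/s")
```

Not huge in absolute terms, but it's a floor the memory clock can never idle below while both displays are lit, which is the tech support explanation above in numbers.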
I have a 2160p external monitor attached to my MBP 16" via Thunderbolt/DisplayPort 1.2 at 60 Hz.
I tried out the five different "Looks like" scalings the Display Preferences allow for and then eyeballed the "Radeon High Side" wattage readouts via iStat Menus:
1280x720: 18W
1920x1080: 6W
2560x1440: 6W
3200x1800: 6W
3840x2160: 6W
So it looks like it might (also?) be related to the display resolution scaling.
It would make sense if a mismatch between the panel's native resolution and the framebuffer resolution forced the driver stack to maintain two framebuffers (native and non-native), do the upscale, and then feed the result to the monitor. GeForce drivers on Windows specifically have a setting to control whether this happens (let the monitor scale it, or have the GPU scale it). The upscale would not only use more VRAM in that case (and as a result more memory bandwidth), but it'd also need a bit more GPU compute to perform the scaling.
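To give a feel for the buffer sizes involved, a quick sketch (Python), under the assumption that macOS renders each "Looks like" mode into a 2x backing store and then resamples it to a 3840x2160 panel, at 4 bytes per pixel (real drivers may tile or compress these buffers):

```python
# For each "Looks like" mode, show the assumed 2x backing store, its size,
# and whether an extra resample pass to the panel's native grid is needed.
PANEL = (3840, 2160)

def mib(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / 2**20

for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3200, 1800), (3840, 2160)]:
    bw, bh = 2 * w, 2 * h                      # assumed 2x HiDPI backing store
    resample = "yes" if (bw, bh) != PANEL else "no"
    print(f"looks like {w}x{h}: backing {bw}x{bh} "
          f"({mib(bw, bh):.0f} MiB), resample to native: {resample}")
```

Only the 1920x1080 mode lands exactly on the panel's native grid with no resample pass, which lines up loosely with the wattage list above, though it clearly isn't the whole story, since the 2560x1440 and 3200x1800 modes also resample and still sat at 6W.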
On my low-spec laptop from a while back, running Rising Thunder set to 720p was faster than native, but it was even faster to first set my desktop resolution to 720p because the overhead of the game engine scaling its 720p framebuffer up to native 4k was measurable. Setting the desktop resolution down also improved battery life.
That's interesting. I was watching a movie on my 1440p@60Hz monitor last night on my 16" MBP in clamshell mode and the fans were on the whole time. That sounds like it could be connected.
Can you share how you're measuring GPU power draw? I have just migrated to a 16" MBP with the AMD chip, and also use an external display some of the time. Have noticed the fan noise and heat but it hasn't been a problem yet.
I’d guess they measure total draw and compare against their baseline, e.g. when they say the GPU draws 20W, it’s 20W more than whatever the draw is when only using the internal display.
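If you'd rather not rely on a third-party app, macOS also ships a powermetrics command-line tool. A minimal sketch of sampling it from Python (assuming the gpu_power sampler is available on your macOS build; it needs sudo, and I'm not certain how much detail it reports for the AMD dGPU specifically):

```python
# Sample GPU power/frequency info a few times via Apple's powermetrics tool
# and print only the lines that mention power. Requires sudo on macOS.
import subprocess

result = subprocess.run(
    ["sudo", "powermetrics", "--samplers", "gpu_power", "-i", "1000", "-n", "5"],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.splitlines():
    if "power" in line.lower():
        print(line)
```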
I’m curious if this is affected by the resolution on the laptop display. I believe the 16” and more recent 15” all default to a slightly scaled resolution, versus the pre-Touch Bar 15” which defaults to the panel’s native 2x resolution.
I'm using the MBP in clamshell driving a 2560x1440@60 Apple Cinema Display at native res, and seeing the GPU drawing 20W constantly (via iStat Menus). The laptop's internal display is set to native 2x.
The fans sit just fine at the minimum speed under this setup, nice and quiet... until I start to do anything even remotely demanding, like, oh, open Ableton Live. Used to be I could run dozens of tracks and plugins with no fan noise. Now I can't even have a blank project open without the fans spinning up to audible levels. So much for recording vocals!
This would explain why my battery drains when an external monitor is plugged in, despite being on a charger. When it reaches 5% battery, the CPU load goes to 800%.
I’ve noticed this too using an HP Thunderbolt Dock G2 with a 16” and 2x 4K60 monitors. I just assumed it was because it’s summer here and the laptop tends to sit in the sun a bit, but it’d be nice if it got fixed.