My indicator of whether Apple builds for the customer or for Apple is how macOS 'negotiates' YPbPr instead of RGB for non-Apple-branded monitors (some LG monitors also get a pass), which results in worse color quality. I believe this to be carefully engineered to be a plausible bug rather than a real one.
BTW I have found a workaround using BetterDisplay and an EDID override (to more closely match what the monitor is actually telling macOS).
Seconding this. Feels actively anti-user even if this is just a bunch of heuristics that end up choosing the wrong thing. Honestly, why is this not a dropdown?
Related bug: macOS defaults to variable refresh rate when available instead of remembering my choice of 144 Hz. This is compounded by my hub (CalDigit TS3 Plus), which has trouble with variable refresh rates, resulting in a black screen.
The cherry on top: either I use an HDMI cable and rely on BetterDisplay forcing RGB to fix the YCbCr issue, or I get a black screen when using DP through my hub because of the above bug.
Sometimes I wish Apple would get broken up just so macOS could have a chance at getting more love.
It’s very on brand for Apple to remove an option to trigger / customise something that should “just work”.
Example: iCloud Photos syncing is complete crap on macOS. If it has synced recently, it's not going to do it again. So you sit there like an idiot waiting 10 minutes for a photo you just took on your phone to show up, when a pull-to-refresh or a refresh button would have fixed it.
iCloud Photos syncing is crap on iOS too. iOS will randomly stop uploading photos to the cloud "to optimize performance". Every time something's not uploading, I check my iPhone settings only to find that it's "optimizing performance" again. Even if it's on a full battery and I'm barely even using it. There is no way to turn this off - you have to manually catch it every single time it decides to "optimize performance" and tell it to sync for an hour. If you took enough photos for it to take longer than that hour to upload them all, you'll have to do this multiple times as it goes right back to "optimizing performance" after that hour. Again, there is no way to turn this off.
On mine, Apple TV+ (the official app as well as Safari) refuses to play 4K through a similar Cable Matters adapter (VMM7100-based, with the latest 120 Hz-capable firmware) to a 42" OLED C2. I assume it is because HDCP is broken. It works fine with the Mac mini's built-in HDMI. Frustratingly, there is no great way to debug this, but if you open Safari and look in the network tab, you can see the resolution of the video being streamed.
Does your adapter work at 120Hz without updating the firmware? If it does, does it support HDCP?
I JUST BOUGHT A NEW MONITOR AND WENT DOWN THIS RABBIT HOLE AHHHH. Almost returned this perfectly fine monitor thanks to Apple. Thank god for BetterDisplay though, actual gem of an app.
Apple's monitor settings are just awful, all of them. I have a small portable projector which accepts 4K input but just downscales it to 1080p.
I cannot get macOS to actually output at 1080p; all it does is output at 4K and scale the result.
The downscaling in the projector adds input lag and just drives me crazy. I really wish they'd just let you control these things rather than poorly guessing.
I didn't know about BetterDisplay; I guess I should try it and see if it can fix this problem.
Does the 1080p resolution show up if you go to Advanced > Show Resolutions as List and then tick "Show all resolutions" under the list? The resolution you are looking for is probably 1920 x 1080 (low resolution). If you choose a non "low" resolution the OS will output at 4K but scale the UI to the virtual resolution.
I still won't forgive Apple for dropping 1080i resolution from macOS. I've got a home theater Mac mini hooked up to my TV, which only supports 720p or 1080i. It always displayed 1080i fine, and then one upgrade later, it's suddenly "Fuck you, user. Use 720p LOL."
I ran into this issue with the Sonoma update. My display (a 4K LG) was negotiating RGB just fine before, but not anymore. The BetterDisplay workaround hasn't worked for me. The poor colors and fuzzy edges around all the text are causing eye strain too. I'm beyond furious.
I used to use an EDID patcher written in Ruby, but it stopped working on some version of macOS. That script contains the logic for patching the EDID data, which is what I got working with BetterDisplay.
FWIW, here's the hacked-down script[0], which only keeps the EDID data patching part. Be warned, it's very hacky: the base64 EDID to be patched is hard-coded on line 8 of the script. It prints out the patched EDID as base64, which should be entered back into BetterDisplay (which is also where you can get the unpatched base64 EDID).
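For anyone curious what that kind of patch boils down to, here's a minimal sketch in Python, assuming the usual "force RGB" tweak (the same idea the old patch-edid.rb scripts used): clear the YCbCr support bits in the feature-support byte of the base EDID block and fix up the checksum. I can't promise it's byte-for-byte what the script above does, so treat it as illustration only.

```python
# Sketch of a "force RGB" EDID patch, assuming the common approach: clear the
# YCbCr 4:4:4 / 4:2:2 support bits in the feature-support byte (offset 24) of
# the 128-byte base EDID block and recompute the base block checksum.
# Feed it the base64 EDID shown by BetterDisplay; it prints the patched base64.
import base64
import sys

def patch_edid(edid_b64: str) -> str:
    edid = bytearray(base64.b64decode(edid_b64))
    if len(edid) < 128:
        raise ValueError("expected at least the 128-byte base EDID block")

    # Byte 24 is the feature-support byte; for digital displays, bits 3-4
    # advertise YCbCr support. Clearing them leaves "RGB 4:4:4 only".
    edid[24] &= 0b1110_0111

    # Byte 127 is the checksum: all 128 bytes of the base block must sum to
    # 0 mod 256. (Extension blocks carry their own checksums and are left
    # untouched here.)
    edid[127] = (256 - sum(edid[0:127]) % 256) % 256

    return base64.b64encode(bytes(edid)).decode("ascii")

if __name__ == "__main__":
    print(patch_edid(sys.argv[1]))
```

Note that some displays also advertise YCbCr again in a CTA-861 extension block, so this alone may not be enough for every monitor; BetterDisplay at least makes it quick to check whether the override actually takes.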
The same thing happens under Linux with some monitors and AMD graphics drivers. A lot of monitors have poor standards compliance (and the standards aren't great either).
My monitor has a strange EDID that requests timings with such a short vblank that my GPU doesn't have time to reclock memory between frames, preventing it from switching to low-power modes. But because I use Linux I can supply an EDID with standard timings in software, using the drm.edid_firmware kernel boot option, which works perfectly. Linux gives you vastly more options for fixing broken things than macOS.
I once spent hours trying to find out why Apple's font rendering is so atrocious for a 1440p monitor on an M3 MacBook Air (Reddit just keeps telling everyone to get higher resolution screens). Turns out it's related to the color format: the colors were fine, but the pixels somehow end up in the wrong places, making everything look super pixelated.
BetterDisplay provided a workaround, but it needs to be selected every time the monitor is hooked up.
(I guess that's normal for Apple stuff nowadays - when I hook up my iPad to my projector, I need to tell it every single time not to use the audio output of the projector, but to keep using the Bluetooth speaker.)
They also removed subpixel antialiasing several years ago. Since then, “1x” screens (i.e. ~110ppi or lower) have looked like shit on macOS compared to the same display driven by Windows or Linux.
It depends on the exact meaning of "YCbCr" and the meaning of "RGB". Is it BT.601? BT.709? BT.2020? Adobe RGB? Display P3?
Also, extra fun is guaranteed if one end of the video cable encodes with e.g. the BT.601 matrix while the other end decodes as e.g. BT.709, or vice versa.
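To make that concrete, here's a toy sketch (plain floating-point math, full range, no quantization, so any error you see comes purely from the mismatched coefficients, not from any particular device's pipeline):

```python
# Toy demonstration of a matrix mismatch: encode R'G'B' -> Y'CbCr with the
# BT.601 coefficients, then decode with the BT.709 coefficients.
def rgb_to_ycbcr(r, g, b, kr, kb):
    y = kr * r + (1 - kr - kb) * g + kb * b
    return y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

BT601 = (0.299, 0.114)    # (Kr, Kb)
BT709 = (0.2126, 0.0722)

# Pure green, encoded per BT.601 but wrongly decoded per BT.709:
print(ycbcr_to_rgb(*rgb_to_ycbcr(0.0, 1.0, 0.0, *BT601), *BT709))
# ~(-0.07, 0.85, -0.03): green dims and red/blue go out of range (clip),
# i.e. a visible hue/saturation shift rather than a subtle rounding error.
```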
Lossless roundtrip conversion is only true for schemes like YCoCg-R [1]. It is not true for Rec.601, Rec.709, etc., because those standards require quantizing to 8 bits (actually even fewer than 8 bits' worth of code values due to the 16-235 limited range).
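A quick illustrative sketch of that difference, using the integer "lifting" form of YCoCg-R versus a plain 8-bit limited-range Rec.709 round trip (this is just the arithmetic, not any particular device's pipeline):

```python
# YCoCg-R round-trips integers exactly (Co/Cg need one extra bit of range),
# while quantizing Y'CbCr to 8-bit limited range (Y: 16-235, C: 16-240)
# cannot, since 256 RGB code values get squeezed into fewer Y codes.
import random

def ycocg_r_roundtrip(r, g, b):
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    # Exact inverse of the lifting steps above:
    t = y - (cg >> 1)
    g2 = cg + t
    b2 = t - (co >> 1)
    r2 = b2 + co
    return r2, g2, b2

def bt709_8bit_roundtrip(r, g, b):
    kr, kb = 0.2126, 0.0722
    kg = 1 - kr - kb
    rf, gf, bf = r / 255, g / 255, b / 255
    y = kr * rf + kg * gf + kb * bf
    cb = (bf - y) / (2 * (1 - kb))
    cr = (rf - y) / (2 * (1 - kr))
    # Quantize to 8-bit limited range, then undo it and convert back.
    yq, cbq, crq = round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)
    y, cb, cr = (yq - 16) / 219, (cbq - 128) / 224, (crq - 128) / 224
    rf = y + 2 * (1 - kr) * cr
    bf = y + 2 * (1 - kb) * cb
    gf = (y - kr * rf - kb * bf) / kg
    return tuple(min(255, max(0, round(v * 255))) for v in (rf, gf, bf))

random.seed(0)
pixels = [tuple(random.randrange(256) for _ in range(3)) for _ in range(10_000)]
print(all(ycocg_r_roundtrip(*p) == p for p in pixels))     # True: always exact
print(sum(bt709_8bit_roundtrip(*p) != p for p in pixels))  # > 0: many pixels change
```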
I think that Apple, perhaps naively, expects display manufacturers to adhere to spec when in reality they often don’t.
Either way macOS has no trouble with my 27” 2560x1440 Asus and Alienware monitors. Both connect with 10bit RGB no problem, at least over USB-C and DisplayPort (haven’t tried HDMI).
macOS really wants to do different things for TVs vs monitors, so if it decides your monitor is a TV for whatever reason, it’ll probably prefer YCbCr and also not offer any HiDPI modes except exactly 2x
2560x1440 is a strong indicator of a monitor, but 4k over HDMI tends to get detected as a TV
That makes some amount of sense, I avoid using HDMI if at all possible. Even on other platforms I’ve generally had less trouble out of DisplayPort and USB-C DP alt mode. I wish it became standard for TVs to have at least a single DisplayPort, it’d make hooking computers up more pleasant.