
My indicator of whether Apple is for the customer or for Apple is how macOS 'negotiates' YPbPr instead of RGB for non-Apple-branded monitors (some LG monitors also get a pass), which results in worse color quality. I believe this was carefully engineered to pass as a plausible bug rather than being a real one.

BTW I have found a workaround using BetterDisplay and an EDID override (to more closely match what the monitor is actually telling macOS).




Seconding this. Feels actively anti-user even if this is just a bunch of heuristics that end up choosing the wrong thing. Honestly, why is this not a dropdown?

Related bug: macOS defaults to variable refresh rate when available instead of remembering my choice of 144 Hz. This is compounded by my hub (CalDigit TS3 Plus), which has trouble with variable refresh rates, resulting in a black screen.

The cherry on top: either I use an HDMI cable and deal with BetterDisplay forcing RGB to fix YCbCr, or I get a black screen when using DP through my hub because of the above bug.

Sometimes I wish Apple would get broken up just so macOS could have a chance at getting more love.


It’s very on brand for Apple to remove an option to trigger / customise something that should “just work”.

Example: iCloud Photos syncing is complete crap on macOS. If it has synced recently, it's not going to do it again. So you sit there like an idiot waiting 10 minutes for a photo you just took on your phone to show up, when a pull-to-refresh or a refresh button would have fixed it.


iCloud Photos syncing is crap on iOS too. iOS will randomly stop uploading photos to the cloud "to optimize performance". Every time something's not uploading, I check my iPhone settings only to find that it's "optimizing performance" again. Even if it's on a full battery and I'm barely even using it. There is no way to turn this off - you have to manually catch it every single time it decides to "optimize performance" and tell it to sync for an hour. If you took enough photos for it to take longer than that hour to upload them all, you'll have to do this multiple times as it goes right back to "optimizing performance" after that hour. Again, there is no way to turn this off.


I bought a custom HDMI dongle that forces 4K RGB HDR from my M1 Mac to my LG OLED.


Interesting. Could you post a link to the dongle?


Not sure what the other guy uses, but I use this Anker with my LG OLED to get RGB 4k@120 HDR with my M1 MacBook Pro:

https://www.anker.com/products/a8317?variant=42329259475094&...


On mine, Apple TV+ (the official app as well as Safari) refuses to play 4K through a similar Cable Matters adapter (VMM7100-based, with the latest 120Hz-supporting firmware) on an LG OLED C2 42. I assume it's because HDCP is broken. It works fine with the Mac Mini's built-in HDMI. Frustratingly, there is no great way to debug this, but if you open up Safari and look in the network tab, you can see the resolution of the video being streamed.

Does your adapter work at 120Hz without updating the firmware? If it does, does it support HDCP?


Ah yeah you just reminded me of the rabbit hole I went down a couple years ago.

Yeah I had to flash the firmware on the Anker dongle, following this guide: https://forums.macrumors.com/threads/dp-usb-c-thunderbolt-to...

I just tried streaming 4K through the Apple TV app on my M1 MacBook Pro with no issues, so I'm assuming HDCP works.


I'd go with a CalDigit TS4 + their HDMI/DP active adapter.


> Sometimes I wish Apple would get broken up just so macOS could have a chance at getting more love.

As much as I like the integration between the phone and macOS, I like the idea of the desktop Mac getting more love.


I JUST BOUGHT A NEW MONITOR AND WENT DOWN THIS RABBIT HOLE AHHHH. I almost returned this perfectly fine monitor thanks to Apple. Thank god for BetterDisplay, though; it's an actual gem of an app.


I'm just glad we've moved beyond SwitchResX...


The entire Apple monitor settings experience is just awful. I have a small portable projector which accepts 4K input but just downscales it to 1080p.

I cannot get macOS to actually output at 1080p; all it does is output at 4K and scale the result.

The downscaling in the projector adds input lag and just drives me crazy. I really wish they'd just let you control these things rather than poorly guessing.

I didn't know about BetterDisplay; I guess I should try it and see if it can fix this problem.


Does the 1080p resolution show up if you go to Advanced > Show Resolutions as List and then tick "Show all resolutions" under the list? The resolution you are looking for is probably "1920 x 1080 (low resolution)". If you choose a non-"low" resolution, the OS will output at 4K but scale the UI to the virtual resolution.


I still won't forgive Apple for dropping 1080i resolution from macOS. I've got a home theater Mac mini hooked up to my TV, which only supports 720p or 1080i. It had always displayed 1080i fine, and then one upgrade later it's suddenly "Fuck you, user. Use 720p LOL."


I ran into this issue with the Sonoma update. My display (4K LG) was negotiating RGB just fine before, but not anymore. The BetterDisplay workaround hasn't worked for me. The poor colors and fuzzy edges around all the text are causing eye strain too. I'm beyond furious.


I used to use an EDID patcher written in Ruby, but it stopped working on some version of macOS. That script contains the EDID-patching logic, which is what I got working with BetterDisplay.

FWIW, here's the hacked script[0], which keeps only the EDID-patching part. Be warned it's very hacky, with the base64 EDID to be patched hard-coded on line 8 of the script. It prints out the patched EDID base64, which should be entered back into BetterDisplay (which is also where you can get the unpatched base64 EDID).

[0] https://gist.github.com/karmakaze/f795171a6a795491e754c3d092...
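For anyone who doesn't want to dig through the Ruby, here is a minimal sketch of the usual RGB-forcing EDID patch in Python (not the linked script itself): it clears the YCbCr support bits in the feature-support byte of the 128-byte base block and recomputes the checksum. The function name is just illustrative, and some displays may need additional tweaks (e.g. in the CTA extension block).

    import base64

    def patch_edid_rgb_only(edid_b64: str) -> str:
        """Clear the YCbCr support bits in an EDID base block and fix the
        checksum so the display only advertises RGB 4:4:4."""
        edid = bytearray(base64.b64decode(edid_b64))
        assert len(edid) >= 128, "expected at least one 128-byte EDID block"

        # Byte 24 is the feature-support byte; for digital displays, bits 3
        # and 4 flag YCrCb 4:4:4 / 4:2:2 support. Clear both.
        edid[24] &= ~0b0001_1000 & 0xFF

        # The last byte of the block makes the sum of all 128 bytes 0 mod 256.
        edid[127] = (-sum(edid[0:127])) & 0xFF

        return base64.b64encode(bytes(edid)).decode("ascii")

    # Paste the unpatched base64 EDID from BetterDisplay here:
    # print(patch_edid_rgb_only("<base64 EDID from BetterDisplay>"))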


The same thing happens under Linux with some monitors and AMD graphics drivers. A lot of monitors have poor standards compliance (and the standards aren't great either).


My monitor has a strange EDID that requests timings with such a short vblank that my GPU doesn't have time to reclock memory between frames, preventing it from switching to low-power modes. But because I use Linux, I can supply an EDID with standard timings in software using the drm.edid_firmware kernel boot option, which works perfectly. Linux gives you vastly more options for fixing broken things than macOS.
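For anyone wanting to try the same trick, a rough sketch of the setup; the file and connector names below are placeholders (connector names are listed under /sys/class/drm), and depending on the distro the EDID file may also need to be included in the initramfs:

    # Example only: put the corrected EDID where the kernel firmware loader looks
    sudo cp my-monitor-fixed.bin /lib/firmware/edid/my-monitor-fixed.bin

    # Then add the override to the kernel command line:
    #   drm.edid_firmware=DP-1:edid/my-monitor-fixed.bin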


I once spent hours trying to find out why Apple's font rendering is so atrocious on a 1440p monitor with an M3 MacBook Air (Reddit just keeps telling everyone to get higher-resolution screens). Turns out it's related to the color scheme: the colors were fine, but the pixels are somehow located wrong, making everything look super pixelated.

BetterDisplay provided a workaround, but it needs to be selected every time the monitor is hooked up.

(I guess that's normal for Apple stuff nowadays: when I hook up my iPad to my projector, I need to tell it every single time not to use the projector's audio output and to keep using the Bluetooth speaker.)


They also removed subpixel antialiasing several years ago. Since then, “1x” screens (i.e. ~110ppi or lower) have looked like shit on macOS compared to the same display driven by Windows or Linux.


There is no reason why YCbCr should be visually worse than RGB if the conversion is accurate.


It depends on the exact meaning of "YCbCr" and the meaning of "RGB". Is it BT.601? BT.709? BT.2020? Adobe RGB? Display P3?

Also extra fun is guaranteed if one end of the video cable is encoding with e.g. BT.601 primaries, while the other end is decoding as e.g. BT.709, or vice versa.


RGB to YCbCr conversion and back is well defined, and RGB -> YCbCr -> RGB should give you back the original RGB values, no matter what the colorspace was.


Lossless roundtrip conversion is only true for schemes like YCoCg-R [1]. It is not true for Rec.601, Rec.709, etc., because these standards require quantizing to 8 bits (actually even less than 8 bits due to the 16-235 thing).

[1] https://www.microsoft.com/en-us/research/wp-content/uploads/...
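A quick way to put numbers on that: push 8-bit RGB through a limited-range BT.709 Y'CbCr conversion and back, and count how many values change. A rough sketch in Python (BT.709 coefficients, the usual 16-235 / 16-240 limited-range quantization; not claiming any particular GPU or driver does exactly this):

    import random

    KR, KB = 0.2126, 0.0722      # BT.709 luma coefficients
    KG = 1.0 - KR - KB

    def rgb_to_ycbcr8(r, g, b):
        """8-bit full-range R'G'B' -> 8-bit limited-range Y'CbCr."""
        rf, gf, bf = r / 255.0, g / 255.0, b / 255.0
        y = KR * rf + KG * gf + KB * bf
        cb = (bf - y) / (2.0 * (1.0 - KB))
        cr = (rf - y) / (2.0 * (1.0 - KR))
        return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

    def ycbcr8_to_rgb(yq, cbq, crq):
        """8-bit limited-range Y'CbCr -> 8-bit full-range R'G'B'."""
        y = (yq - 16) / 219.0
        cb = (cbq - 128) / 224.0
        cr = (crq - 128) / 224.0
        rf = y + 2.0 * (1.0 - KR) * cr
        bf = y + 2.0 * (1.0 - KB) * cb
        gf = (y - KR * rf - KB * bf) / KG
        clamp = lambda v: min(255, max(0, round(v * 255)))
        return clamp(rf), clamp(gf), clamp(bf)

    # Count how many random RGB triples fail to survive the round trip.
    random.seed(0)
    samples = [tuple(random.randrange(256) for _ in range(3)) for _ in range(100_000)]
    changed = sum(1 for c in samples if ycbcr8_to_rgb(*rgb_to_ycbcr8(*c)) != c)
    print(f"{changed} of {len(samples)} colors changed after the round trip")

The per-channel error is only a code value or two, which is part of why it's usually invisible at 4:4:4.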


I'm not talking about mathematically exact reconstruction, I'm talking about not being able to see the difference.


I first noticed it around 2014 on a 1080p Dell display (maybe ~24") with an i7 MacBook Pro of the day.

It was the poor, non-crisp pixels that led me to even knowing there were RGB/YPbPr modes.


Was it maybe with 4:2:2 or 4:2:0 subsampling? At 4:4:4 there should be no difference.


I think that Apple, perhaps naively, expects display manufacturers to adhere to spec when in reality they often don’t.

Either way macOS has no trouble with my 27” 2560x1440 Asus and Alienware monitors. Both connect with 10bit RGB no problem, at least over USB-C and DisplayPort (haven’t tried HDMI).


macOS really wants to do different things for TVs vs monitors, so if it decides your monitor is a TV for whatever reason, it’ll probably prefer YCbCr and also not offer any HiDPI modes except exactly 2x

2560x1440 is a strong indicator of a monitor, but 4k over HDMI tends to get detected as a TV


That makes some amount of sense, I avoid using HDMI if at all possible. Even on other platforms I’ve generally had less trouble out of DisplayPort and USB-C DP alt mode. I wish it became standard for TVs to have at least a single DisplayPort, it’d make hooking computers up more pleasant.



