Apple seems to be making a lot of display-related things difficult lately. This and Display Stream Compression are two examples.
Just as high refresh rates and HDR were becoming really mainstream, Big Sur completely broke what was working flawlessly in Catalina (and it has not been fixed as of either 11.5 or the Monterey betas so far - and by 'completely broke', I mean it does not work at all, for anybody, not just in some edge case).
With Catalina, my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
With Big Sur? I can run those same screens at 60Hz 10 bit, or 95Hz 8 bit.
If your livelihood isn't dependent on that kind of display, I would personally try to avoid falling into their trap. I'd call it anti-competitive to have previously supported an industry standard, only to then remove that support and push their own product as the only way to achieve an equivalent of that standard.
I'm not saying the Pro Display XDR isn't a good screen, or even that it's overpriced for the class of displays it lives in, but there is a major leap between "I write code/documents/etc. all day and like vibrant monitors" and "I do professional multimedia work and need hyper-accurate displays to do my job correctly."
Telling all the people who want HDR to go buy a $5,000 display they don't need is a bit of a mean-spirited move.
I wonder whether we'll ever get to witness an audience booing another outrageous Apple product again, since pre-recorded Apple events work great for their goals.
I think the Pro Display XDR & stand were less “outrageous” and more “aimed at a niche market most of us don’t know much about.” People were expecting something consumer / prosumer and Apple made something aimed at the professional market.
> “aimed at a niche market most of us don’t know much about.”
Then Apple should have demonstrated the $1,000 stand to that niche market separately instead of trying to sell the aspiration of 'Pro' devices to consumers at a usual consumer event.
Apple didn’t announce it at an event for general consumers, they announced it at WWDC. WWDC is an Apple event, run by Apple, to announce Apple products. Not all Apple products are aimed at general consumers.
Yeah, that's what got me about the announcement, like how could they totally "misread the room" so to speak? At their developer conference they announce that "overpriced display" (overpriced for a developer audience). And the $1000 stand was possibly the most tone-deaf product announcement ever from them, just totally the wrong time and place for that.
What I don't get is why they have never simply taken the 5K display from the iMac and made it a standalone display. DisplayPort has been able to handle the bandwidth for years now, and HDMI can handle it now as well.
> Yeah, that's what got me about the announcement, like how could they totally "misread the room" so to speak?
I don’t get what is so wrong here.
> What I don't get is why they have never simply taken the 5K display from the iMac and made it a standalone display. DisplayPort has been able to handle the bandwidth for years now, and HDMI can handle it now as well.
I don’t understand why people want this. You can get a 5K display from other vendors, why do you want one from Apple, specifically?
Are we still talking about a monitor stand? You can VESA mount the monitor if you don't want to buy the stand. I think it's assumed you'd use a third-party VESA mount, and the stand is there as an alternative for the few people who just don't want that. As far as I can tell, you can use a standard VESA mount; you don't need some kind of Apple-specific one.
Precisely. I'd be very happy to have a Pro Display XDR. I may even buy it for the aesthetics (well, above and beyond everything else - I'm also a professional photographer). But as you say, it's more the attitude. I'm guessing whatever proprietary stuff they've done to drive the XDR at 6K in HDR isn't compatible with DSC, and there's little interest in fixing it.
It's a raw bus. It's like an operating system's idea of a network stack being to hand all software raw Ethernet access. It's an obvious concurrency disaster once you have more than one application trying to access it; the data transmitted and the control handed over are entirely opaque to the operating system (god knows what monitor vendors sneak over DDC); and most importantly, all of these are absurdly low-level implementation details that are subject to change.
For all you know, they don't even need DDC beyond the initial EDID setup, multiplex the I2C hardware to do something else, and can't physically provide this interface anymore.
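To make it concrete how raw this is, here's roughly what a single DDC/CI "set brightness" write looks like at the I2C level - a sketch in Python using smbus2, assuming a Linux-style /dev/i2c device and the standard VCP framing (the exact bytes vary by display, so treat this as illustrative, not definitive):

    from smbus2 import SMBus, i2c_msg

    DDC_I2C_ADDR = 0x37      # 7-bit I2C address used by DDC/CI
    VCP_LUMINANCE = 0x10     # VCP feature code for brightness/luminance

    def set_brightness(bus_number, value):
        # DDC/CI payload: source address, length (0x80 | n), opcode 0x03 (Set VCP),
        # feature code, value high byte, value low byte
        payload = [0x51, 0x84, 0x03, VCP_LUMINANCE, (value >> 8) & 0xFF, value & 0xFF]
        # Checksum: XOR of the 8-bit destination write address (0x6E) and all payload bytes
        checksum = 0x6E
        for b in payload:
            checksum ^= b
        payload.append(checksum)
        with SMBus(bus_number) as bus:
            bus.i2c_rdwr(i2c_msg.write(DDC_I2C_ADDR, payload))

    # e.g. set_brightness(1, 50) writes to /dev/i2c-1; whatever else the monitor
    # does over this bus is completely invisible to the OS, which is the point above.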
If you have root access, you should be allowed to have raw device access. The problem seems to be that Apple no longer believes in users owning their computers, if it ever did.
Daily reminder that Jobs wanted one of their early computers (Lisa? Mac? Can't remember) to be bolted shut, until he was persuaded otherwise. Similar story with Apple II and expansion slots.
It's in their nature. Apple was bound to invent the iPhone.
That it works at all is a major improvement, and the Raspberry Pi method is worth having as well; I've been working on something very similar since I got my M1 Mini and found ddcctl and Lunar unable to work there, but got little further than building the light sensor before other projects took priority - I expect I'll probably finish this one around the equinox.
My work laptop, an Intel Mac, displays to and controls brightness on the same monitor, but I've been using ddcctl with a trivial wrapper script much more than Lunar of late. (Kinda feel a little bad about that, what with the author of Lunar having taken the time to give me tech support on a weird corner case here on HN a while back.) Still going to buy a license and, if I can find a way, set up a recurring donation. This kind of work deserves support.
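For what it's worth, the wrapper really is trivial - something along these lines (a Python sketch; the display index and the clamping range are assumptions from my single-external-monitor setup):

    import subprocess
    import sys

    def set_brightness(percent, display=1):
        percent = max(0, min(100, percent))  # clamp to 0-100
        # ddcctl -d <display number> -b <brightness>
        subprocess.run(["ddcctl", "-d", str(display), "-b", str(percent)], check=True)

    if __name__ == "__main__":
        set_brightness(int(sys.argv[1]))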
I completely understand when users go with something simpler like a CLI or MonitorControl. I do the same myself when I don't really need the features of a complex app.
By the way, I’m not sure if I understood correctly but if you need an external ambient light sensor, Lunar supports that out of the box now: https://lunar.fyi/sensor
Currently I'm working with a sensor homebrewed from a reverse-biased LED with a Darlington-pair amplifier - not what anyone would call accurate, but precise enough to calibrate reliably over the range of light values to be found in my office as long as I keep direct sunlight off the transistors.
Between that, a Pi, ddcutil, and the currently unoccupied DisplayPort input on my monitor, I'm hoping to brew up something that'll serve well enough - not terribly optimistic on that given that the monitor seems to maintain per-input brightness settings, but worth a try at least. (Also, if it does work, I can add an encoder with a built-in button as a manual override.)
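The control loop I have in mind is only a few lines - something like this sketch, where read_ambient() is a stand-in for however the LED sensor ends up being sampled, and the mapping numbers are guesses for my office:

    import subprocess
    import time

    def read_ambient():
        # Stand-in: however the reverse-biased-LED sensor ends up being sampled
        # (probably via an ADC). Should return a value normalized to 0.0-1.0.
        return 0.5  # placeholder

    def ambient_to_brightness(ambient):
        # Crude linear map from normalized ambient light to 10-100% brightness.
        return int(10 + 90 * max(0.0, min(1.0, ambient)))

    last = None
    while True:
        level = ambient_to_brightness(read_ambient())
        if level != last:
            # VCP feature 0x10 is luminance; ddcutil speaks DDC over the Pi's I2C.
            subprocess.run(["ddcutil", "setvcp", "10", str(level)], check=True)
            last = level
        time.sleep(5)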
On the other hand, it seems likely that, by the time I find that approach to fail, Lunar on Mac mini will be able to do DDC via HDMI - I thought it'd take a year at least after the M1 arch came out for anyone even to get as far as you already have, but clearly I failed to reckon with your dedication to the effort!
> reverse-biased LED with a Darlington-pair amplifier
That's clever! I knew LEDs can be used as sensors, but I never had the time to try it.
Yeah, I don't know about DDC via the Mac Mini HDMI. Weird things are happening with that port.
Some users report it doesn't work at all; some say their monitor crashes when DDC is sent through the port. One user even had the weird issue where sending DDC through the USB-C port to his Thunderbolt monitor caused the message to also be sent to the monitor connected via HDMI.
I'm trying to find a solution, but these seem more like bugs in Apple's implementation of the video driver, and we'll have to wait for those to get fixed in Monterey.
Why didn't they abstract it, or create a stripped/safe userland option to replace it? If I'm relying on a Mac for work, I can't have them removing essential features from my computer in a simple upgrade. Maybe MacOS needs semantic versioning, or at least some level of communication with the end user about compatibility.
This was never an intended feature. If I understand the article correctly, they were using an undocumented ("private") API which happened to stop working.
Every API is undocumented on MacOS - what do you want them to do? How are you supposed to discern between zombie XNU code and Good LTS Apple Compliant code?
> my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
It's a bit tangential, but I keep seeing people mention 4K monitors with high refresh rates on HN, yet I've never seen any. Would you mind mentioning what make/model those are?
I have 2 LG 27GN950-B's. They're great - very thin bezel, high color accuracy (not reference level, but better than most), and after a firmware update, I can drive my screens at 160Hz on overclock (via menu on the display, no custom weirdness required) or 144Hz in HDR/10 bit.
I remember when I first started my previous job in 2017, I opted for a MacBook Pro with a Dell D3100 docking station. When 10.13.4 dropped, I lost the ability to use some of my monitors. In the future, I won't buy from Apple, simply because I'm not willing to replace all of my computer components with ones that are compatible with a walled garden.
I don't understand: you want a high-bitrate, accurate display to the point of actually considering the XDR, but simultaneously you really want LOSSY compression? The not-so-secret secret of Display Stream Compression is that it degrades picture quality.
I want high refresh rates, which help even for operations work, web browsing, not just photography and eye fatigue. And I want HDR.
Frankly, the only reason I'd consider the XDR is as mentioned, aesthetics - I've got the 2019 Mac Pro, and I recently bought a house and set my desk at home up in the middle of my office, not against a wall, so entirely superficially, the aesthetics of the back of the XDR display could look nice.
The only things the XDR has going for it are color accuracy (which is orthogonal, though certainly impacted by lossy compression, I'm sure) and resolution (though I still prefer my two ultra-thin-bezel 4K screens to one 6K screen). The refresh rate on the XDR is 60Hz.
Oh, and the XDR isn't bugged/broken/crippled by Apple to the point of not being able to run at full capability.
Lossy compression methods are usually smarter at choosing where to degrade picture quality. E.g. reducing JPEG quality usually results in much better pictures at the same file size than reducing image resolution.
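A quick way to see this for yourself (a Pillow sketch; "photo.jpg" is whatever test image you have lying around, and the quality/scale numbers are arbitrary):

    import os
    from PIL import Image

    img = Image.open("photo.jpg")  # any test image

    # Path 1: keep the resolution, drop the encoder quality
    img.save("low_quality.jpg", quality=30)

    # Path 2: keep the quality high, halve the resolution
    img.resize((img.width // 2, img.height // 2)).save("low_res.jpg", quality=95)

    for name in ("low_quality.jpg", "low_res.jpg"):
        print(name, os.path.getsize(name), "bytes")

    # Viewed at the original size, the low-quality file usually looks noticeably
    # better than a low-resolution one of comparable size.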
> Just as high refresh rates and HDR were becoming really mainstream, Big Sur completely broke what was working flawlessly in Catalina (and it has not been fixed as of either 11.5 or the Monterey betas so far - and by 'completely broke', I mean it does not work at all, for anybody, not just in some edge case).
> With Catalina, my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
> With Big Sur? I can run those same screens at 60Hz 10 bit, or 95Hz 8 bit.
I guess I just need to get a Pro Display XDR...