Apple seems to be making a lot of display-related things difficult lately. This and Display Stream Compression are two examples.
Just as high refresh and HDR were becoming really mainstream, Big Sur completely broke what was working flawlessly in Catalina (and it has not been fixed as of either 11.5 or the Monterey betas so far - and by completely broke, I mean it does not work at all, for anybody, not just in 'some edge case').
With Catalina, my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
With Big Sur? I can run those same screens at 60Hz 10 bit, or 95Hz 8 bit.
If your livelihood isn't dependent on that kind of display, I would personally try to avoid falling into their trap. I'd call it anti-competitive to have previously supported an industry standard, only to then remove that support and push their product as the only way to achieve an equivalent to that standard.
I'm not saying the Pro Display XDR isn't a good screen, or even that it's overpriced for the class of displays it lives in, but there is a major leap between "I write code/documents/etc. all day and like vibrant monitors" and "I do professional multimedia work and need hyper-accurate displays to do my job correctly."
Telling all the people who want HDR to go buy a $5,000 display they don't need is a bit of a mean-spirited move.
I wonder whether we'll ever get to witness an audience booing another outrageous Apple product again, since pre-recorded Apple events work great for Apple's goals.
I think the Pro Display XDR & stand were less “outrageous” and more “aimed at a niche market most of us don’t know much about.” People were expecting something consumer / prosumer and Apple made something aimed at the professional market.
> “aimed at a niche market most of us don’t know much about.”
Then Apple should have demonstrated the $1000 stand to that niche market separately instead of trying to sell the aspiration of 'Pro' devices to consumers at a typical consumer event.
Apple didn’t announce it at an event for general consumers, they announced it at WWDC. WWDC is an Apple event, run by Apple, to announce Apple products. Not all Apple products are aimed at general consumers.
Yeah, that's what got me about the announcement, like how could they totally "misread the room" so to speak? At their developer conference they announce that "overpriced display" (overpriced for a developer audience). And the $1000 stand was possibly the most tone-deaf product announcement ever from them, just totally the wrong time and place for that.
What I don't get is why they have never just simply taken the 5K display from the iMac and made it a standalone display; DisplayPort has been able to handle the bandwidth for years now, and HDMI can handle it now as well.
> Yeah, that's what got me about the announcement, like how could they totally "misread the room" so to speak?
I don’t get what is so wrong here.
> What I don't get is why they have never just simply taken the 5K display from the iMac and made it a standalone display; DisplayPort has been able to handle the bandwidth for years now, and HDMI can handle it now as well.
I don’t understand why people want this. You can get a 5K display from other vendors, why do you want one from Apple, specifically?
Are we still talking about a monitor stand? You can VESA mount the monitor if you don’t want to buy the stand. I think it’s assumed you’d use a third-party VESA mount, and the stand is there as an alternative for the few people who just don’t want that. As far as I can tell, you can use a standard VESA mount; you don’t need some kind of Apple-specific one.
Precisely. I'd be very happy to have a Pro Display XDR. I may even buy it for the aesthetics (well, above and beyond everything else - I'm also a professional photographer). But as you say, it's more the attitude. I'm guessing whatever proprietary stuff they've done to drive the XDR at 6K in HDR isn't compatible with DSC, and there's little interest in fixing it.
It's a raw bus. It's like an operating system whose idea of a network stack is to hand all software raw Ethernet access. It's an obvious concurrency disaster once you have more than one application trying to access it, the data transmitted and the control handed over are entirely opaque to the operating system (God knows what monitor vendors sneak over DDC), and most importantly, all of these are absurdly low-level implementation details that are subject to change.
For all you know, they don't even need DDC beyond the initial EDID setup, multiplex the I2C hardware to do something else, and can't physically provide this interface anymore.
If you have root access, you should be allowed to have raw device access. The problem seems to be that Apple no longer believes in users owning their computers, if it ever did.
Daily reminder that Jobs wanted one of their early computers (Lisa? Mac? Can't remember) to be bolted shut, until he was persuaded otherwise. Similar story with Apple II and expansion slots.
It's in their nature. Apple was bound to invent the iPhone.
That it works at all is a major improvement, and the Raspberry Pi method is worth having as well; I've been working on something very similar since I got my M1 Mini and found ddcctl and Lunar unable to work there, but got little further than building the light sensor before other projects took priority - I expect I'll probably finish this one around the equinox.
My work laptop, an Intel Mac, displays to and controls brightness on the same monitor, but I've been using ddcctl with a trivial wrapper script much more than Lunar of late. (Kinda feel a little bad about that, what with the author of Lunar having taken the time to give me tech support on a weird corner case here on HN a while back.) Still going to buy a license and, if I can find a way, set up a recurring donation. This kind of work deserves support.
I completely understand when users go with something more simple like a CLI or MonitorControl. I do the same myself when I don’t really need the features of a complex app.
By the way, I’m not sure if I understood correctly but if you need an external ambient light sensor, Lunar supports that out of the box now: https://lunar.fyi/sensor
Currently I'm working with a sensor homebrewed from a reverse-biased LED with a Darlington-pair amplifier - not what anyone would call accurate, but precise enough to calibrate reliably over the range of light values to be found in my office as long as I keep direct sunlight off the transistors.
Between that, a Pi, ddcutil, and the currently unoccupied DisplayPort input on my monitor, I'm hoping to brew up something that'll serve well enough - not terribly optimistic on that given that the monitor seems to maintain per-input brightness settings, but worth a try at least. (Also, if it does work, I can add an encoder with a built-in button as a manual override.)
On the other hand, it seems likely that, by the time I find that approach to fail, Lunar on Mac mini will be able to do DDC via HDMI - I thought it'd take a year at least after the M1 arch came out for anyone even to get as far as you already have, but clearly I failed to reckon with your dedication to the effort!
> reverse-biased LED with a Darlington-pair amplifier
that's clever! I knew LEDs can be used as sensors but I never had the time to try it.
Yeah, I don't know about DDC via the Mac mini HDMI. Weird things are happening with that port.
Some users report it doesn't work at all, some say their monitor crashes when DDC is sent through the port. One user even had the weird issue where sending DDC through the USB-C port to his Thunderbolt monitor causes the message to also be sent to the monitor connected via HDMI.
I'm trying to find a solution, but these seem more like bugs in Apple's implementation of the video driver, and we'll have to wait for those to get fixed in Monterey.
Why didn't they abstract it, or create a stripped/safe userland option to replace it? If I'm relying on a Mac for work, I can't have them removing essential features from my computer in a simple upgrade. Maybe macOS needs semantic versioning, or at least some level of communication with the end user about compatibility.
This was never an intended feature. If I understand the article correctly, they were using an undocumented ("private") API which happened to stop working.
Every API is undocumented on macOS; what do you want them to do? How are you supposed to discern between zombie XNU code and Good LTS Apple Compliant code?
> my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
It's a bit tangential, but I keep seeing people mention 4K monitors with high refresh rates on HN, and I've never seen any. Would you care to mention what make / model those are?
I have 2 LG 27GN950-B's. They're great - very thin bezel, high color accuracy (not reference level, but better than most), and after a firmware update, I can drive my screens at 160Hz on overclock (via menu on the display, no custom weirdness required) or 144Hz in HDR/10 bit.
I remember when I first started my previous job in 2017, I opted for a MacBook Pro with a Dell D3100 docking station. When 10.13.4 dropped, I lost the ability to use some of my monitors. In the future, I won't buy from Apple simply because I'm not willing to replace all of my computer components for ones that are compatible with a walled garden.
I don't understand: you want a high-bitrate, accurate display to the point of actually considering the XDR, but simultaneously you really want LOSSY compression? The not-so-secret secret of Display Stream Compression is that it degrades picture quality.
I want high refresh rates, which help even for operations work, web browsing, not just photography and eye fatigue. And I want HDR.
Frankly, the only reason I'd consider the XDR is as mentioned, aesthetics - I've got the 2019 Mac Pro, and I recently bought a house and set my desk at home up in the middle of my office, not against a wall, so entirely superficially, the aesthetics of the back of the XDR display could look nice.
The only thing the XDR has going for it is color accuracy (which is orthogonal to lossy compression, though certainly impacted by it, I'm sure) and resolution (though I still like my two ultra-thin-bezel 4K screens versus one 6K screen). The refresh rate on the XDR is 60Hz.
Oh, and the XDR is not bugged/broken/crippled by Apple so as not to be able to run at full capability.
Lossy compression methods are usually smarter at choosing where to degrade picture quality. E.g. reducing JPEG quality usually results in much better pictures at the same file size than reducing image resolution.
> the delivery guy called me about a laptop: the custom configured M1 MacBook Pro that costed as much as 7 junior developer monthly salaries has arrived!
I tried the MacBook Pro configurator and could only get up to $2300 USD (excluding optional software like Logic Pro). I’m sure the price and taxation are higher in the author’s country, but I still don’t understand how that could cost the equivalent of 7 months of junior developer salary anywhere.
Dev says right at the top of the entry that they are writing from Romania. A quick Google shows an average software dev salary of 4800 USD per year. If that's the average, 7x a junior's monthly salary seems to check out.
The minimum net salary in Romania is about €280, which is about $360, which is about $500 net. So $6000 per year net. And no (unabused) junior dev is going to be making minimum salary; that's janitor salary.
> The minimum net salary in Romania is about €280, which is about $360, which is about $500 net.
This looks like you double-converted, and also at an exceptionally high exchange rate.
And besides, we should probably be comparing euros directly to Apple's prices in euros, which, best I can tell, are in fact bigger numbers than their USD prices—the baseline offerings are listed at €1,449 and €1,679 (vs $1,299 and $1,499), and I can get the configurator up to €2,599, which puts us at a €370 monthly junior dev salary.
When I got my first job as a Junior Malware Researcher, I was earning $422 per month. That times 7 is exactly the cost of the MacBook Pro I got (1TB SSD, 16GB RAM)
Yes, I live in Romania, and salaries have been increasing a bit but they're still a joke for junior devs.
I know, we moved all our European call center operations there in my last job. The low wages are a big attraction for companies. I felt bad that we were paying so little, though the people working there didn't seem to mind. For most it was their first job. But it surprised me that wages are that low.
Also, Romanians tend to have excellent language skills in most Latin languages and English so that helps a lot too. They're also really friendly in my experience. I would notice that when I walked around the office people talking among themselves would often switch to English when I was near them so I wouldn't feel left out. And they'd always ask me to join them if I went for lunch alone. And I'm not even a manager :) I was just there to do the technical side.
It always blows my mind when I compare the value of my company's MacBook Pros to the salaries of our offshore developers.
Having said that, when I look at a fully spec'd out Mac Pro, it comes out to $54,000. Add in a fully spec'd out Pro Display XDR, which is another $7k, and that's $61k.
According to levels.fyi, L3/E3/ICT2 SWEs at Google/Facebook/Apple get a salary of ~$130k, so just under 6 months of an American junior developer FAANG salary.
With $60K you can also get a Lambda box with 8 GPUs, 16 CPU cores, and 256 GB of system memory (plus 8x48 GB of VRAM on the GPUs). You can't train GPT-3 with it, but just about anything else.
Obviously not. I have close friends who worked as waiters (and even worse, in the kitchen), and they were getting $300/month for 12-hour workdays, with the promise of tips.
That practice really needs to stop.
Recently I went to a restaurant in Cluj that had an app from which you could do everything: see the menu, order, send notes to the chef, pay.
The experience needs to be polished a bit (low 4G signal and no wifi made this quite hard to use for me), but I feel this could empower waiters more. Maybe by getting paid the same for a lot less work, or by creating a completely different type of job.
Tips just need to die (as something counted as part of a salary). It's only a thing in countries with poor worker protection, so here's hoping those countries get more of that. The only country I have seen it used in was the US. My wife worked as a waiter in Copenhagen and they got good pay and few tips. The tips they did get were all put in a glass and shared between every single employee equally, which is the only way tips should be used, IMO.
I haven't used Lunar (https://lunar.fyi/), the app the author built, but it looks fantastic! Clearly great effort has been put into it!
I really appreciate this class of application that exposes more hardware functionality to the end user. Flux (which may or may not be a direct competitor) is another great example, as is Halide, the pro camera app for iPhones. They're certainly not flashy, but they can be great quality-of-life improvements.
And they're difficult to write! They require using APIs that are often woefully under-documented, and terribly difficult to debug. I wanted to write an app that would slowly lower your device's volume over time (so you could raise the volume for a song that you like, but then not accidentally spend the next 30 minutes with the volume super loud), and even doing simple things like listing audio devices and getting their names was endlessly frustrating.
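The fade-down logic itself is the trivial part; here's a minimal sketch, assuming you already have working getVolume/setVolume wrappers (hypothetical names) hiding whatever audio API you end up wrestling with:

    import Foundation

    // Sketch of the fade-down idea only; getVolume/setVolume are hypothetical
    // wrappers around whatever audio API you end up fighting with.
    func startFadeDown(over minutes: Double, floor: Float,
                       getVolume: @escaping () -> Float,
                       setVolume: @escaping (Float) -> Void) -> Timer {
        let interval: TimeInterval = 10                 // adjust every 10 seconds
        let steps = max(1.0, minutes * 60 / interval)
        let step = max(0, getVolume() - floor) / Float(steps)
        return Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { timer in
            let next = getVolume() - step
            if next <= floor {
                setVolume(floor)
                timer.invalidate()                      // reached the floor, stop fading
            } else {
                setVolume(next)
            }
        }
    }

The hard 90% is everything those two closures would have to hide.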
You could probably script that in Hammerspoon. I have a small script that resets the balance to center every few seconds (it sometimes wanders off-center for some weird reason); setting the volume should be possible too.
Amazing write-up! Your efforts are much appreciated. I have an Intel Mac Air with an external monitor and strongly feel that usability items like these are sorely underrated. Kudos for the great work in a seemingly closed ecosystem.
Thanks! Yes, it can be hard to let people know that an app like this actually exists.
Some people don't even feel the need for such a thing and just stare for hours at a time into a blinding 100%-brightness panel when working at night.
I think that can cause macular degeneration much faster than just "not filtering blue light", but this is not as talked about so people don't think about its effects that much.
I agree, and for me the problem is also that newer screens won't even go down far enough because they optimize for max brightness as that looks better on the spec sheet.
Most of my monitors are way too bright at 0% at night. On my 4K I even need to work at 0% during the day and reduce contrast at night (which messes up colour depth).
And yeah the OSD controls are horrible.
I still work on a real VT520 CRT terminal sometimes and it amazes me how it can go from super bright to hardly readable in pitch dark. And has amazingly handy analog brightness/contrast knobs. Not all innovation is progress.
I love knobs (rotary encoders mostly). I wish they were in everything where you need to adjust some range like brightness, contrast, volume, color warmth.
For some time I really wanted a battery powered wireless rotary encoder that I can listen to in Home Assistant and hook it to whatever I want but I couldn’t find anything like that.
Ikea made a rotary encoder (dimmer) like that with Zigbee. It works well with Home Assistant. It actually had an accelerometer to determine its position, so it could be rotated within a passive magnetic mount. Unfortunately they stopped making them. They replaced it with a square one that doubled as an on/off button. But perhaps you can still find some second-hand. It was in their Tradfri range.
I had the same problem and used one of the dark overlay tools that the author of the article criticizes to bring it down further than 0% brightness. But that trick also messes up color depth. I've since switched to an iMac, and the nice thing is that while the display can go very bright, when you dial down the brightness to the lowest value it is very dim to the point of being hard to read in daylight.
I found putting one of those Philips Hue light strips behind the monitor helped tremendously. If I set its brightness and temperature to a dim warm setting, the dimly-lit wall right behind the monitor makes mid-range screen brightness work just fine (eyes accommodate to the higher overall brightness, so the monitor needs to be brighter to "fit in", but I still perceive the whole scene as dimly-lit), and the color temperature seems to work well for me; I still get tired and sleepy. I can even control the light using the Hue bridge's REST API, so it's easy to automate as well.
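For reference, the call behind that automation is a single PUT against the bridge's v1 REST API; here's a rough sketch, with the bridge IP, API username, and light ID as placeholders:

    import Foundation

    // Rough sketch of dimming a Hue light through the bridge's v1 REST API.
    // The bridge IP, the "username" token, and the light ID are placeholders.
    func setHueLight(brightness: Int, mireds: Int) {
        let url = URL(string: "http://192.168.1.2/api/YOUR-API-USERNAME/lights/1/state")!
        var request = URLRequest(url: url)
        request.httpMethod = "PUT"
        // bri is 1...254; ct is colour temperature in mireds (higher = warmer)
        let body: [String: Any] = ["on": true, "bri": brightness, "ct": mireds]
        request.httpBody = try? JSONSerialization.data(withJSONObject: body)
        URLSession.shared.dataTask(with: request).resume()
    }

    // e.g. a dim, warm glow behind the monitor:
    // setHueLight(brightness: 40, mireds: 450)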
That would help indeed, and so does keeping the lights on at a high level. But my 4K is so bright at 0% that it feels too bright for me even with the brightest artificial light I have :) Only in daylight it's OK (and I still keep it at 0% then!).
But there is another reason I don't always have lights on. I live in Barcelona where it gets hot in summer and I don't have AC. So at night when the temperature outside is lower than inside I leave all the windows open to cool the house. If I have lights on it attracts bugs. So I stay mainly in the dark.
Yes, why can’t a Mac control the volume of an external monitor over USB-C / DisplayPort / Thunderbolt?
It’s completely ridiculous, when you think about it, that such a fundamental and important requirement for so many users is completely missing in 2021!
Although I'm not interested at all in brightness control and the different technologies you described, I really enjoyed reading your post. Your way of combining talk about technical issues with your personal life and challenges (your company job, having a side project alongside your day job, the winter cold near your house, etc.) made it possible for me to enjoy reading a post about something completely unrelated to me (DDC, etc.). Nice job.
You're welcome :) And since you deliberately chose this style, I'd be interested in learning from you (maybe in a new post) how you go about the process of writing.
This seemed neat so I installed Lunar to try out on my M1 MacBook Air with a LG HDR display connected over HDMI.
After launching the app, my system started freezing so badly I had a hard time closing it: inputs unresponsive for 30 sec, then responsive for 1 sec, then unresponsive again.
I figured I'd try launching it with my monitor unplugged, and that worked fine, but after plugging the monitor back in, it doesn't connect anymore.
I've tried closing the app, uninstalling it and rebooting my system. Nothing works and now I'm left with no external display. What the hell?!
EDIT: after unplugging my dongle from usb-c (i.e. un-powering it instead of just unplugging it from the laptop) everything works again! phew. I guess it put the dongle in some weird state?
I can’t say for sure what the problem is in your case, but it’s definitely not a problem in Lunar.
All the app does is call a macOS function called IOAVServiceWriteI2C to send a standard DDC message to the monitor (something like “set brightness to 30”).
In your case, either the dongle or the monitor reacts unexpectedly to that message. What should happen is that the dongle should simply forward that message as it is to the monitor and the monitor should have firmware logic to interpret it correctly and either set the brightness to the sent value, or not do anything.
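For the curious, this is roughly what that write looks like. IOAVServiceWriteI2C is a private, undocumented symbol, so the signature below (and the assumption that you've already obtained an IOAVServiceRef for the display and resolved the function, e.g. via dlsym) comes from reverse-engineering notes, not from Apple documentation:

    import Foundation

    // Rough sketch only: IOAVServiceWriteI2C is private and undocumented, so this
    // shape is an assumption taken from reverse-engineering write-ups.
    typealias IOAVServiceRef = CFTypeRef
    // Assumed shape: (service, chipAddress, dataAddress, buffer, length) -> IOReturn
    typealias DDCWrite = (IOAVServiceRef, UInt32, UInt32, UnsafeMutableRawPointer, UInt32) -> Int32

    func setBrightness(_ value: UInt8, avService: IOAVServiceRef, writeI2C: DDCWrite) -> Int32 {
        // Standard DDC/CI "Set VCP Feature" packet: length 0x84, opcode 0x03,
        // VCP code 0x10 (brightness), 16-bit value, then an XOR checksum.
        var packet: [UInt8] = [0x84, 0x03, 0x10, 0x00, value, 0x00]
        packet[5] = UInt8(0x6E ^ 0x51) ^ packet[0] ^ packet[1] ^ packet[2] ^ packet[3] ^ packet[4]
        // 0x37 is the DDC/CI I2C address; 0x51 is the host "source address" byte.
        return packet.withUnsafeMutableBytes { buffer in
            writeI2C(avService, 0x37, 0x51, buffer.baseAddress!, UInt32(buffer.count))
        }
    }

The packet itself is plain DDC/CI; the only Apple-specific part is how it gets onto the wire.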
That’s why Apple will never implement native brightness changing using DDC.
They can’t risk having users come with problems like this one in the press, because the monitor or hub/dock does something funny with the DDC message and crashes.
Lunar can also function in software mode using gamma by following the instructions here: https://lunar.fyi/faq#bad-ddc
That mode will never crash a monitor because it doesn’t send any data to it.
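For reference, the software approach needs nothing private; here's a minimal sketch of the general technique using the public Quartz Display Services gamma-formula call (not necessarily Lunar's exact implementation):

    import CoreGraphics

    /// Software "brightness": scale the maximum output of each channel through the
    /// public gamma-formula API. No DDC involved; resets when the process exits or
    /// when CGDisplayRestoreColorSyncSettings() is called.
    func setSoftwareBrightness(_ brightness: Float,
                               display: CGDirectDisplayID = CGMainDisplayID()) -> CGError {
        let b = min(max(brightness, 0.01), 1)   // keep max above min so the formula stays valid
        return CGSetDisplayTransferByFormula(display,
                                             0, b, 1,   // red:   min, max, gamma
                                             0, b, 1,   // green: min, max, gamma
                                             0, b, 1)   // blue:  min, max, gamma
    }

The trade-off is the one mentioned in the article: you lose some colour depth, but nothing ever gets sent to the monitor.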
I've tinkered around a bit and found that with another dongle that uses DisplayPort, everything works fine. Changing volume and brightness from the keyboard without having to fiddle with monitor menus is a godsend :)
The dongle that's having problems is a "Selore&S-Global USB-C HDMI + DP" if that's something you want to keep on file.
Thanks for the detailed response and all your work making the app possible!
Display Data Channel generally doesn't work through dongles. I'm not really sure how USB-C dongles work (block diagram, etc.), so I don't know why this is the case.
Thanks! If you ever notice Location mode is not enough for you, Lunar also supports external light sensors that you can very easily make yourself with the instructions here: https://lunar.fyi/sensor
Is it a big secret that M1 Macs don’t work with external monitors?? The problem has been getting worse with each new forced update.
Using my monitor via HDMI and Apple's $100 dongle, it constantly turns off randomly and will not come back on without unplugging it and plugging it back in.
Everything else is great, but after like 5 updates that claim to fix the issue while still having issues, maybe Apple should put out some notification about the problem so people don’t buy this if they have serious work they need to get done.
My M1 works flawlessly with my USB-C LG monitors, except for HDR leading to washed-out colours.
On the other hand, both my fully specced 2016 and 2019 MBPs have days where the displays and USB and Bluetooth hubs freeze every few minutes and I’m watching the screens go blank and the windows being retiled while I can’t type. The only thing that continues is audio over the 3.5mm port.
Sure: LG 27UL850-W. I can’t say whether it is exactly as sharp, but to me it looks similarly sharp, both scaled and at native resolution. The colours of the M1 screen are better, however.
I’m using the exact same screen and it is what led me to write Lunar actually.
The screen is great, text is perfectly sharp in 4k on both M1 and Intel Macs. I don’t have the HDR function because I bought it 4-5 years ago but the colours look very good too, no washing out problem.
But the damn joystick man, I just couldn’t get past it and had to spend 4 years writing an app to work around it.
Are you sure that you have the UL and not the UK? The UL is HDR400, so actually not really HDR. The only thing is that macOS doesn’t seem to realise this, and with all the toggling there are four states that occur:
macOS no HDR and monitor no HDR: “correct” colours and brightness
macOS HDR and monitor HDR: washed out colours / very low contrast
macOS HDR and monitor no HDR: very strange picture
macOS no HDR and monitor HDR: also washed out image
Unfortunately I don’t have some pre-made theme that I can share, I mostly improvised with colours and styles that I noticed look good on other websites, then settled on some colour scheme and style using Sass variables that I use all over the place.
If it's easy, do you mind sharing the hex for some of the colors, such as the background? You've picked a really nice combo of colors that I'd like to bookmark.
Hmm, I clicked on both links on both my phone (4G) and laptop (wifi) and got a 404 error each time. Perhaps it's not publicly accessible, and only accessible when logged in to your Github account. Strange.
These are easy to look up yourself with your browser's "Inspect" functionality. E.g. in Chrome, press Ctrl-Shift-I, click the <body> tag in the Elements panel, and there's the background colour in the Styles panel: #fef7ef. If you don't use Chrome, it will be something similar; right click on part of the page and choose "Inspect" or similar.
DDC is great - I remember saving a Samsung 193P monitor (no buttons and controlled by a proprietary driver that only worked on WinXP) that had its brightness set unusably low by its prior owner. I also remember someone using DDC scripts as a KVM solution by switching the monitor inputs programmatically.
Yeah, I use DDC as part of a KVM; it works pretty well.
I have a USB switcher with a button, and I've coded up my desktop to switch the input on my monitor when the keyboard appears/disappears (udev rules trigger that).
Funny, I just started playing with DDC a few days ago. I've got a work laptop (Mac) and a personal laptop (Dell XPS with Linux) that I like switching between. I had a Thunderbolt hub, but was getting tired of moving the cable back and forth, and it was also unreliable on the Mac, causing it to heat up and slow down on occasion.
I ended up just buying a pure USB hub with a switch, and running a persistent polling bash script on the Mac and the Linux machines that calls the appropriate DDC commands to switch the monitor input based on which machine the USB hub is switched to. It's fast and works great.
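The monitor side of that is just another VCP write: code 0x60 is the standard input-select control, though the value for each input is monitor-specific. Here's a rough sketch of the polling idea in Swift (keyboardPresent() and setVCP() are hypothetical helpers standing in for the USB check and the DDC command):

    import Foundation

    // Polling sketch of the "switch input when the keyboard moves" idea.
    // keyboardPresent() and setVCP(code:value:) are hypothetical helpers; 0x60 is
    // the standard input-select VCP code, but per-input values vary by monitor.
    func watchForKVMSwitch(pollEvery interval: TimeInterval = 2,
                           thisInput: UInt16, otherInput: UInt16,
                           keyboardPresent: @escaping () -> Bool,
                           setVCP: @escaping (_ code: UInt8, _ value: UInt16) -> Void) {
        var hadKeyboard = keyboardPresent()
        _ = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            let hasKeyboard = keyboardPresent()
            guard hasKeyboard != hadKeyboard else { return }
            hadKeyboard = hasKeyboard
            setVCP(0x60, hasKeyboard ? thisInput : otherInput)
        }
        RunLoop.main.run()   // keep the script alive (call from the main thread)
    }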
Nice, thank you. I was originally using udev and launchd to drive actions based on events, but they took up to 10 seconds to switch. A purpose-built tool like the ones you linked would, I assume, not have that problem.
I wrote a little script to do this automatically when your Bluetooth keyboard switches between computers. I just updated it to work with M1 Macs based on the recent work.
I just want my 2 external monitors to work, 1 in portrait and the other in landscape, with my M1 MacBook Pro.
I bought a Pluggable dock for $300; of course, it wasn’t until after I couldn’t get it to work that the Pluggable team told me it wouldn’t work, “a known problem”, which they hadn’t disclosed. This was 7-8 months ago.
I needed the extra ports so I kept the dock, and used both monitors in landscape mode, it killed my productivity.
About 2 months ago I needed to reboot my machine, Zoom couldn’t find audio.
When the machine restarted, one of the monitors was in portrait mode. It hadn’t been configured in portrait mode, but there it was. I hadn’t done any updates, and automatic updates are turned off. The machine just started with 1 monitor in portrait mode.
Everyone tells me it can’t work, Pluggable, Apple, Apple support forums.
This experience tells me the M1 release was half-baked, at least on the OS side.
The product was rushed to market, maybe to capture a market segment, maybe because component parts were already allocated and shipping the M1 devices would help Apple offset lost sales due to parts availability for the Intel Macs.
Hard to say, but I would have also expected something as trivial as rotating the display to be available without the need to buy a 3rd-party $300 device that isn’t even supposed to work.
I was able to use one landscape and one portrait 4K monitor just fine on my M1 mini. One is connected via HDMI and the other via USB-C. Have you tried it without the dock?
Without automatic brightness control for my external display, I had the exact same problems the author of Lunar describes in his article, constantly and manually adjusting the brightness of my display throughout the day, with its awful on screen menu.
Before I got my Mac with an M1, I used Brightness Menulet, which also worked quite OK, but it was not nearly as polished and functional.
> reading brightness, contrast or volume from the monitor fails about 30% of the time
There is a good chance this is because the read is done too fast. If it gets less reliable with a super long HDMI cable then it's almost certainly the clock speed.
I2C is a clocked protocol where the host (the laptop) decides the clock speed. It isn't designed as a high speed data transfer mechanism, and requires that all the signals get to the other end of the wire and back all within every clock cycle. Since there is very little data to send, a clock speed of 10kHz should do the job just fine. But I bet the hardware is set to something silly like 10 MHz by default, since it probably uses the same I2C drivers and hardware blocks as used for communicating with other things on-chip and within the system.
The Paddle mention is very interesting, I'd never heard of it before. I really hope they make the Sketch-like model an integrated part of the platform, I wish more apps used that revenue model, and I'd like to use it myself for some productivity utilities in the future.
The article got a bit confusing around halfway through the post. If I hadn't googled the software, it would read like I needed a configured Raspberry Pi in addition to this software...
Controlling external displays via DDC was never supported, and only works even on Intel Macs via a "private," undocumented API. It's neither appalling nor even surprising that this wasn't prioritized for the M1 kernel, and it is likewise reasonable that some more reverse engineering would be required.
I know it’s not how it’s supposed to work, but I think once an API is used in the real world and sees useful applications that some users rely on day to day, it should be seen as “public”.
I had never heard of Lunar before but completely understand the use case, all the more so since it doesn’t seem Apple is interested in filling that gap outside of its Pro Display.
It works through at least Big Sur on Intel. Maybe it was previously documented and they pulled the docs around Catalina's release, but this would be the first I've heard of that in at least a year of active interest in the topic, and that makes it difficult to credit on pure assertion. There are almost a hundred comments in this thread, and it's close to midnight here; do you mind linking those to which you refer?
Does the gamma solution lower the power use like a proper brightness reduction? And if so, by the same amount?
Did you ever attempt to reach out to Apple about how this should be implemented on the new kernel? I would hope there's some forum for this kind of question still.
One bit of speculation I've read before is that the brightness setting on some displays is stored in a cheap EEPROM chip without wear leveling, and constantly writing new brightness values throughout the day will wear it out in a short time span (years).
I’ve read that in a lot of places over the years and even got afraid of it when my LG 4K flickered a bit one day.
But I’m sure I’ve gone way past the speculated 100k-write limit with Lunar’s Smooth Transition and by testing Lunar 6 hours a day, almost every day, for the past 4 years.
It’s possible that this EEPROM issue might not be an issue anymore. Lunar has thousands of active users and there are zero claims of a monitor failing because of it.
It’s also possible that it’s just a problem waiting to happen and I’ll get sued to death in a few years, who knows.
I have a monitor that has an "eco" mode, reducing the brightness. Is that a different DDC command, or are they implementing a similar gamma reduction? If not, could an eco mode be invoked, if indeed it is an industry-wide standard for reducing energy usage?
Brightness is literally just a backlight power level (pretty much), so gamma and eco mode are a completely separate concept. Frankly I have no idea what eco mode means, but I wouldn't be surprised if it's just brightness * 0.75 + contrast * 1.25.
Yep, that sounds a lot like Apple. Thanks for the thorough and interesting writeup on this. I didn't even know I wanted this so badly until now... luckily I still stand a chance I guess.
Since then, Lunar has acquired a lot more App Store violations, like linking to frameworks located in /System/Library/PrivateFrameworks, not being sandboxed, and not caring about a lot of their design guidelines.
So I didn’t really want to go through their process when the chances of getting an approval are close to zero.
Why bother? They may change their mind later on and then what? Apple will come out with their own display that will just work out of the box with their own API.
It's just ridiculous that you need to go through all this palaver and reverse engineering to use standard methods to control something as basic as a monitor.
> reading brightness, contrast or volume from the monitor fails about 30% of the time
I2C over long lengths of wire isn't the sturdiest physical layer... I would expect the low level I2C handlers to retry a few times on no ACK received, but a retry_count of 5 might resolve most of the issues around a bit flipping on its way through the wire :)
Yes, retrying always fixes the problem. And for the purpose of getting the latest monitor values from time to time to keep the UI in sync, this suffices.
I'm more surprised that I2C works at all over the long, thin Thunderbolt cable of my LG. Huge bandwidth going through it for video and the USB hub, lots of current passing through for charging the MacBook at the same time, I'm amazed it works so well.
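For anyone curious, the retry itself can be as dumb as a bounded loop with a short pause; here's a tiny sketch, where readValue() is a stand-in (hypothetical name) for a single DDC read attempt:

    import Foundation

    // Tiny retry sketch: readValue() is a stand-in for one DDC/I2C read attempt
    // that returns nil when the transfer fails or the checksum doesn't match.
    func readWithRetry(attempts: Int = 5, readValue: () -> UInt16?) -> UInt16? {
        for attempt in 1...attempts {
            if let value = readValue() { return value }
            if attempt < attempts { usleep(20_000) }   // 20 ms breather before retrying
        }
        return nil
    }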
There is https://www.ddcutil.com/. It has a command-line utility and a Qt GUI. Unfortunately the GUI is a bit convoluted, because it exposes every option that you can change on the monitor. I once almost bricked my LG monitor, because I accidentally locked the hardware buttons on my monitor, via an undocumented manufacturer specific option.
> Unfortunately the GUI is a bit convoluted, because it exposes every option that you can change
Wow, color me surprised. That doesn't sound like a desktop Linux app at all :)
>I once almost bricked my LG monitor, because I accidentally locked the hardware buttons on my monitor, via an undocumented manufacturer specific option.
Reminds me of the good old days of late-90s / early-2000s desktop Linux, when the wrong video timing settings in your XFree86 config could make your CRT monitor (almost literally) explode.
But I'm sure 2022 will be "the year of Linux on the desktop", finally :)
I wouldn't be surprised if Apple's ultimate goal was to control the monitor playback ecosystem and implement DRM such that they could get a cut of any media you consume through your Mac. They'd 'curate' the content to prevent, you know, piracy, or whatever.
That $99 price is for the Sketch program. The creator (Alin) just wanted a subscription model similar to it. The pro version of Lunar is $23 for a year of updates and support.
The XDR works quite well on the MacBook Air, no journey required. Only annoyance is that it has specific ideas about which side is up or down when used vertically, and I always turn it the wrong way first.
Yes, I should have probably written something about that as well.
Apple-vendored displays have a proprietary protocol implemented over USB so that macOS can natively change the brightness and volume.
Lunar can take advantage of that as well as you can see here at `Apple DisplayServices`: https://lunar.fyi/faq#ddc
So if you want to sync the MacBook brightness to the XDR, or sync from one XDR to another, or just get more hotkeys to control presets and contrast, Lunar can still bring something on top of the XDR’s features.
Bit of a non-sequitur, but I always laugh whenever I read a blog where someone finally discloses all of the gripes they had with their last Mac while transitioning to the new one. Of course, this new Mac is better than the old one, and won't be susceptible to the same issues they had last time.
Frankly, it all just makes me happier to be out of the ecosystem. I love the fact that there are people like you willing to write beautiful, functional and native system apps, but it also hits me with a pang of sadness when I hear that you spent 7 months' salary on a device that will merely give you the ability to package and distribute apps to a certain platform. It's frankly dystopian, and I get the feeling that Apple will continue to edge out "lower-level" software like this in their bid to "increase userland security" (see: gimping macOS APIs).
It's too bad people keep trying to use the M1 for power user tasks. The hardware shortcuts and proprietary implementations make using the machine in anything but factory use cases tedious if not infeasible. Being able to change the backlight intensity shouldn't have to be a major accomplishment with unfixable sporadic hardware bugs.
It’s actually been a great experience for the most part. I certainly wouldn’t call it “tedious if not infeasible”.
It’s a new platform so I don’t expect every single feature of every piece of software to work on day 1, but I haven’t encountered anything show-stopping.
Not being able to adjust monitor brightness using a 3rd party tool for a few months until the developer adds the new hardware isn’t really the end of the world.
> Just as high refresh and HDR were becoming really mainstream, Big Sur completely broke what was working flawlessly in Catalina (and it has not been fixed as of either 11.5 or the Monterey betas so far - and by completely broke, I mean it does not work at all, for anybody, not just in 'some edge case').
> With Catalina, my Mac Pro happily drove 2 27" 4K monitors in 10 bit color at 144Hz.
> With Big Sur? I can run those same screens at 60Hz 10 bit, or 95Hz 8 bit.
I guess I just need to get a Pro Display XDR...