Basically: if I set my display to ~75% brightness and open the video, the whites in the video are 100% brightness, way brighter than #FFF interface white on the rest of my screen.
But if I increase my display brightness to 100%, the whites in the video are the same as the interface white, because it obviously can't go any brighter.
If I decrease my display brightness to 50%, the whites in the video are no longer at maximum 100% brightness, maybe more like 75%.
But it's also kind of buggy -- after messing around with system brightness a bit, the video stops being brighter and I've got to quit QuickTime and reopen it to get the effect back. Also, opening and closing the video file makes my cursor disappear for a couple of seconds!
I'm wondering if it switches from hardware dimming to software dimming when the video is opened and closed, and if that switch has to do with the cursor disappearing. If so, though, it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
Interestingly, confirming it: taking a screenshot of the video while the screen is at 75% brightness shows massive brightness clipping in the video, since it's "overexposed" relative to the interface's color range. But taking a screenshot while screen brightness is 100% shows no clipping, because the video is no longer "overexposed" relative to the interface.
I'm just so surprised I had utterly no idea macOS worked like this. I'd never heard of this feature until now.
In order to pull this off, you need to know exactly how bright the display is in nits, and you need complete software control of the actual hardware brightness. On Windows, you have neither. Enabling HDR mode completely throws off your current colors and desktop brightness, and you have to reset your physical monitor settings and dial in the new desktop white point with a software slider Microsoft buried in the advanced HDR settings (which almost nobody knows how to use) to hopefully land somewhere in the vicinity of what you had before.
When it comes to display technology, having vertical integration is a huge benefit. Look at high DPI: the state of the art on Windows in 2020 is nowhere near as good, from a software implementation or actual user experience point of view, as it was on day 1 when Apple introduced Retina MacBooks back in 2012.
On Windows the systemwide "color management" basically consists of assigning a default color profile that applications can choose to use - which is generally only done by professional design/photo/video software, and not by the desktop or most "normal" apps.
The underlying problem is that APIs describe colors in the display colorspace. #ffffff means "send full power to the red, green, and blue subpixels", without describing what color the red, green, and blue subpixels are. That was not a major problem until relatively recently; every display used the same primary colors, so there was no need to specify what colorspace you were sending values to the OS in. But, then it became cheap and easy to use better primaries ("wide gamut"), and we had a problem. Every color written down in a file suddenly became meaningless; an extra piece of information would be required to turn that (r, g, b) tuple into a display color. So, everyone kind of did their own thing! Image formats long had a way to tag the pixel data with a colorspace, so images with tags basically work everywhere. Applications can read that and tell the OS that colors are in a certain color space, and it can map that to your display. Most applications do that; if you have a wide-gamut display and take an AdobeRGB-space image off your digital camera, the colors will be better than if you looked at it on an sRGB display. Even web browsers handle this fine; if they are presented with an image with a colorspace tag, they'll make sure your monitor displays the right colors.
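To make that "extra piece of information" concrete, here's a rough sketch of what a color-managed OS or browser does with an sRGB-tagged value before handing it to a wide-gamut (Display P3) panel. The matrices are the commonly published sRGB/P3 ones, rounded; a real implementation would go through ICC profiles, so treat this as an illustration only:

    import numpy as np

    # Approximate RGB->XYZ matrices (D65), rounded from commonly published values.
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                          [0.2290, 0.6917, 0.0793],
                          [0.0000, 0.0451, 1.0439]])

    def gamma_decode(code8):
        # sRGB transfer curve: 8-bit code -> linear light in [0, 1]
        c = np.asarray(code8, dtype=float) / 255.0
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def gamma_encode(linear):
        # Display P3 reuses the sRGB transfer curve, just with wider primaries
        return np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * linear ** (1 / 2.4) - 0.055) * 255.0

    def srgb_to_display_p3(rgb8):
        xyz = SRGB_TO_XYZ @ gamma_decode(rgb8)        # what color was meant
        linear_p3 = np.linalg.inv(P3_TO_XYZ) @ xyz    # what the panel needs
        return np.round(gamma_encode(np.clip(linear_p3, 0.0, 1.0)))

    # A web designer's #ff0000 shown *correctly* on a P3 panel is not (255, 0, 0);
    # sending it raw is what produces the over-vibrant red described above.
    print(srgb_to_display_p3([255, 0, 0]))   # roughly [234, 51, 35]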
The problem is sources of color data that don't have a tag. CSS is a big offender. CSS doesn't specify the colorspace of colors, so typically browsers will just send whatever is in there directly to the display. That means if you're a web designer and you pick #ff0000 on your sRGB display, people using a wide-gamut display will see a much more vibrant shade of red, and everything will look off. In fact, pretty much everyone using a wide-gamut display will see wrong colors everywhere because of this; I have one, and I just forced it into sRGB mode because it's so broken. (On the other hand, a lot of people like more vibrant colors, so they think it's a good thing that they get artificial vibrance enhancement on everything they view. And are then disappointed when an application handles colors correctly, and what they see on their monitor are the same boring colors their digital camera saw out in the field.)
But, the problem is not Windows, the problem is applications and specs those applications use. Authors of specs don't want to say "sorry, there is no way you can ever use colors outside of sRGB without some new syntax", so they just break colors completely for everyone. That's why things look terrible on monitors that aren't sRGB; the code was built with the assumption that monitors will always be sRGB. Get rid of that assumption and everything will look correct!
There are also plenty of images out there that don't include color space tags, so it's undefined as to what colors they're actually trying to display. Some software assumes sRGB. Some software assumes the display colorspace. It's inconsistent. (I used to produce drawings in Adobe RGB and upload them to Pixiv, and their algorithm totally gets colorspaces wrong. It will serve your verbatim file to some users, but serve a version of the file to other users with the color tag removed, so that there is no possible way the viewer can see the colors you intended. I gave up on wide gamut and restricted myself to sRGB, because the Internet sucks.)
It gets more complicated for shades of grey, involving gamma correction. #7f7f7f doesn't mean "send half as much electrical power to each subpixel", but rather maps to an arbitrary power level. The idea is to use the bits of the color most efficiently for human viewers; the eye easily tells "0 power" from "0.01% power", so more code values are spent on the dark end. (You'll see this in practice when you write some code to control an LED from a microcontroller; if you just use the color value as a PWM duty cycle, your images won't be the right colors on the display you just made. Of course, many addressable RGB LEDs do the gamma correction internally, so the naive approach of copying the image pixel values to the addressable LED will actually work. I learned this the hard way when I got addressable LED panels from two separate batches, and the old batch did gamma correction and the new batch didn't. I didn't realize it was gamma at play, so I built an apparatus to measure the full colorspace of the LEDs with a spectrophotometer (https://github.com/jrockway/apacal). When I plotted the results, I immediately realized one was linear and the other was gamma-corrected... which meant all the code I wrote to build a 3D LUT for the LEDs was pointless; some simple multiplication was all I needed to make the LEDs look the same. But I digress...)
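For the microcontroller case, the fix is just to undo the sRGB curve before using the value as a duty cycle. A minimal sketch, not tied to any particular LED library:

    def srgb_to_pwm_duty(code8):
        """Convert an 8-bit sRGB channel value to a linear PWM duty cycle (0.0-1.0).

        #7f7f7f is ~50% of the *perceptual* scale, but only about 21% of the
        electrical power; feeding the raw code value to the PWM washes images out.
        """
        c = code8 / 255.0
        # standard sRGB decoding curve
        if c <= 0.04045:
            return c / 12.92
        return ((c + 0.055) / 1.055) ** 2.4

    print(srgb_to_pwm_duty(0x7f))   # ~0.212, not 0.5
    print(srgb_to_pwm_duty(0xff))   # 1.0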
CSS defines that all hex and rgb() colors are always in the sRGB colorspace. There is also support for other colorspaces, such as P3 and Adobe RGB. 
The Safari web browser does color management for wide gamut displays correctly, so #ff0000 looks correct and not too vibrant. The biggest offenders are Chrome and Firefox, because they are not color managed. Those web browsers (and Windows) give you the wrong colors.
> There are also plenty of images out there that don't include color space tags, so it's undefined as to what colors they're actually trying to display. Some software assumes sRGB.
Images without colorspace tags have been defined to be sRGB by the web specs, and all web browsers should already treat them that way. Other software may do something else, as you said. I hope all software will copy how the web does it.
FWIW I think the Windows approach to color management has proven wrong and off-base for today's world. It stems from the 90s where color management was seen as something only "pro" applications would ever need to do, so it was okay to require a lot of effort from those few application developers to implement color management in their apps. The MacOS approach where applications tag their surfaces with one of a few standard color spaces and the system does the rest is less powerful in theory but, on the other hand, means that things will actually work. Plus, I think MacOS has escape hatches so that apps can do their own color management based on output device ICC profiles if they really want to.
(see the edited preface to https://cameratico.com/guides/web-browser-color-management-g... )
I wonder what happens if you try this on the LG 5K hooked up to a Mac. It is physically the same panel that's in the iMac, so in theory it can present the same range. But if the OS needs to know the exact physical abilities of the display, it might not be able to detect that on the LG display. Or maybe Apple does detect it, because they partnered with LG.
This might explain why I've experienced a slight difference when working with photos and video on the iMac monitor vs the LG UF5k. Interestingly, I have a small program synchronizing the brightness of my LG with the iMac, and the LG would light up every time I opened the video on the iMac and then come back down on closing. This might explain some weird brightness behaviors I've been noticing on the 5K every now and then; I used to decouple the syncing whenever that happened, and now I might know why.
Linux seems better somehow but I haven’t had as much time with it there, so I’m not exactly sure why.
> The operating system is complicit in this trickery, so the Digital Color Meter eyedropper shows “white” as 255, as do screenshots.
I wonder if other measurement tools (e.g. Adobe color picker) are also fooled by the OS in this case.
Digital Color Meter shows both UI white and video white as exactly #fff despite the video white being much brighter!
Even at full screen brightness, the video white was noticeably brighter.
It's not technically HDR, but if I view HDR videos with it in SDR mode with the brightness turned up to maximum, there is definitely a bit of an "HDR effect", similar to what Apple has achieved.
However, under Windows, without support from the operating system, this doesn't really work. The colours are shifted and the brightness of the content is way too low.
Microsoft could add something similar, but this is a company that thinks that colour is a feature in the sense of:
That plus P3 gamut means a video playing on a display is closer to what a filmmaker intended.
What's cool is: 1) 4-year-old Macs have become HDR laptops and 2) the implementation is subtle: you get full brightness in the HDR video without having the rest of the UI blast to full brightness.
That video can have a very bright sky, for instance. In other OSes you either get both a bright sky and a blinding text box, or neither.
It’s also a very bright display in its own right, with 1600 nits vs the 300-400 of a regular one. And 1,000,000:1 contrast as well.
If my screen is already at 100% brightness then there's no HDR effect. 100% brightness is true 100% brightness, the max capability of my backlight.
The only difference is that if my screen is at less than 100% brightness, the HDR content can be brighter than the rest of the screen because it has the headroom.
Does that make sense?
Also, per later comments, this feature only kicks in when you're actually viewing HDR content, so there's no downside (not even in contrast) in everyday use.
But it seems like Apple might be gradually moving from 8-bit to 10-bit displays. They don't really advertise it clearly, but if the technical specifications say "millions of colors" it means 8-bit, if they say "billions of colors" it means 10-bit. (256^3 vs 1024^3.)
So if you've got the iMac model with the 4K screen, it's 10-bit and therefore capable of 1,024 levels per channel. So remapping the UI's white down to e.g. level 512 will still be fine, assuming (safely, I think?) the compositing is all done in 10-bit color.
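Back-of-the-envelope version of that, assuming a plain gamma-2.2 panel model and a 2x backlight boost (the real macOS pipeline is surely more sophisticated):

    # How many code values are left for SDR if the backlight is boosted for HDR
    # and the UI is remapped to keep its apparent brightness constant?
    # Simple gamma-2.2 model; the real pipeline is more involved.
    GAMMA = 2.2

    def sdr_white_code(bit_depth, backlight_boost):
        max_code = 2 ** bit_depth - 1
        # SDR white must now emit 1/boost of the panel's relative luminance
        return round(max_code * (1.0 / backlight_boost) ** (1.0 / GAMMA))

    print(sdr_white_code(8, 2))    # ~186 -> visible banding risk in SDR content
    print(sdr_white_code(10, 2))   # ~747 -> still more steps than a full 8-bit ramp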
My impression is that PMing this is really hard. Everyone else is going to have an opinion that it shouldn't be done because the use case is so rare, etc.
Something must be going right organizationally for engineering this capable to have succeeded on such a barely noticeable feature.
I love it when products casually have cool things like this. Not quite the same scope, but IntelliJ's subpixel hinting option has each element of the dropdown displayed with the hinting it describes. You don't have to pick an option to see it. You can just preview it directly.
Most frameworks don't let applications touch pixel data without jumping through some hoops, because by restricting it you can implement things like lazy loading, GPU JPEG decoding, GPU resizing, etc.
It's easier when you control all parts of the stack. No way they could have pulled that one off with NVidia, who were "famous" for breaking with Apple years ago when Apple demanded to code the drivers themselves... for valid reasons, when one looks at the quality of their Windows and Linux drivers. The Windows ones are a hell of a lot buggier, and the Linux ones barely integrate with Linux because NVidia refuses to follow standards.
It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and focus on correcting details.
I'm a lifelong Windows and Android user. But honestly, seeing articles like this, and how smooth the UI is on macOS and how uniformly they apply new updates and UI changes, makes me extremely jealous and resentful that Microsoft is so bad at something so basic.
Features are great, but users interact with the UI first. They need to fix that before anything else.
Now they want to give you the option to run Android apps on Windows through emulation. This is just going to create a bigger jumbled mess.
The latest Windows 10 iteration is by far the snappiest OS I've used in a long time, since it uses GPU acceleration for the desktop window manager. You can check this in Task Manager. The icing on the cake, if you have a laptop with 2 GPUs (Optimus), is that you can run a demanding 3D app like a game in windowed mode in parallel with other stuff like watching videos on YouTube, and you can see in Task Manager how Windows uses the external GPU to render the game and the integrated GPU to accelerate your web browser, all running buttery smooth.
> In fact, their OS is in such shambles and is a disoriented mess with respect to UI consistency.
True, but that's what you get with 30 years' worth of built-in backwards compatibility. I can run a copy of Unreal Tournament 1999 that was just copied off an old PC with no sweat, right after ripping and tearing in Doom Eternal. Can you run 20-year-old software on a current Mac without emulation? Apple can afford to innovate in revolutionary ways when it dumps older baggage whenever it feels like it and starts from a fresh drawing board without looking back; see the Intel to Apple Silicon transition. In 2 years x86 apps will be considered legacy/obsolete on Mac hardware. Microsoft can't really do this with Windows, so yeah, it's a mess of new GUI elements for the simple stuff and Windows 2000-era GUI elements for the deep pro settings. The advantage is that if you're an old-time Windows user you can easily find your way using the "old" settings, and if you're new to Windows you can do most configs through the "new" GUI without touching the scary-looking "old" settings.
Uh, Windows started doing that in Vista.
Today's WDDM, however, is snappy as hell, even on my MBP from 2012, while OS X is, and always will be, a sluggish nightmare. Intel GPU alone, no Nvidia or AMD dGPU, old enough that it has no hardware scheduling features and isn't DX12 compliant, but with Win10 it's still just as fast as my brand-new workstation build when it comes to just being a plain ordinary desktop.
Apple needs to fix their development culture internally, and it strongly shows in their software product quality. Sad, because the M1 seems like a cool chunk of hardware, could be a real winner if it wasn't held down by OSX.
Quartz today is Vista's WDDM of yesterday.
A key feature of Control Panel was that it was "pluggable": vendors could add their own items, and often did. These are ordinary Win32 apps written to match the style of the era. Worse, some system control panel items have plugins in turn. E.g.: drivers can define extra "tabs", network cards have protocol-specific popup windows, etc...
There is just no way to update the look & feel of these to match the new Settings app style, most of the code is third party and ships as binary blobs.
The Microsoft Management Console (MMC), used mostly for Administrative Tools and server consoles has a similar problem.
Combined, these two make up the majority of the OS GUI!
I worked at Facebook for years and it now has a similar problem. Developers are evaluated every six months on their 'impact', which results in many dropping boring work and joining teams that are doing new things, even if they aren't needed.
Windows struggles to correctly display Adobe RGB (wide gamut but not HDR) JPG images I downloaded in 1999, now over 21 years ago and counting.
They'll get there... eventually. Maybe next decade, by which I mean 2030, not 2021.
If you've not spent time in a huge company, this might seem to be the case. But really, Apple's uniform standard is the exception. I'm sure there are other organizational costs for this, such as it being harder to take risks with products or execute quickly... but gosh, they are good at producing a cohesive, mostly consistent set of products. I deeply appreciate their attention to detail and long-term commitment.
This is a bit misleading. The backlight isn’t at a higher level than necessary for sRGB content all the time, just whenever any HDR encoded videos or EDR apps are open. When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
Yup, I think this has to be the case. But the crazy thing is, I can't perceive any shift in UX brightness whatsoever, even a flicker, when I open/close an EDR video.
I would have thought that there would be some slight mismatch at the moment the backlight is brightened and pixels are darkened -- whether it would be a close but not perfect brightness match, or a flicker while they're not synced. But nothing.
As I mentioned in another comment, the only giveaway is that my cursor (mouse pointer) disappears for a second or two. I have to guess that adjusting its brightness happens at a different layer in the stack that can't be so precisely synced.
So that's why Lunar reads a much higher brightness and makes all external monitors as bright as ten thousand suns when HDR content is played.
I wonder if there's any way to detect this and read or compute the SDR brightness instead.
But what about blacks? If you have a dark scene with bright highlights (e.g. a campfire at night), do the black parts of the scene get blown out because of backlight bleed?
What I like about the conclusions of this piece is that it points to how strategically Apple is thinking, leveraging the breadth of its distribution of advanced hardware and software.
Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
While competitors fixate on some single feature like night photo quality, Apple is also subtly chipping away at something like this.
The cheaper version of this display has a price tag of $5k, the more expensive one $6k.
I never spent even remotely as much money on a display, so I cannot speak from first-hand experience. But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Apple does many things, but certainly not bleeding-edge innovation. Of course it likes to sell its products as such, but I guess that's just how marketing works.
Translation: I don't understand this display.
> But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Translation: Despite my lack of knowledge, I feel qualified to say Apple sucks.
What? Having been in the inner sanctum of engineering within Apple for years, that's exactly the engineering priority for groups I saw and worked within. I'm genuinely curious why you assert otherwise, find it surprising.
- M1 
- A14 in the iPhone 12 Pro (first 5-nanometer chip with 11.8 billion transistors) 
Previously, they introduced the first 64-bit CPU in a mobile device, which stunned competitors at the time.
Not to mention their excellence in hi-DPI displays. In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent). However, the MacBook Pro 16" remains superior when it comes to sound quality, form factor (i.e. weight), touchpad precision and latency, thermal performance, etc., despite significant investments from Dell in those areas to catch up.
With everything on that list, they made a better version of something that already existed, especially with displays - it's not like they were the ones developing and manufacturing the panels.
Even the ARM architecture and instruction set is not created by them.
I think it's important not to get carried away.
Ironically this is now somewhere they could stand to improve.
The MacBook displays are excellent, particularly when it comes to colour reproduction, but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is a higher resolution, and scaled down for the display, trading sharpness for screen space.
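Roughly what a scaled mode does, using the 13" Retina panel figures as a hypothetical example (exact numbers vary per model):

    # A "looks like 1440x900" scaled mode on a 2560x1600 panel renders a
    # 2x framebuffer (2880x1800) and then downsamples it to the panel,
    # so no pixel in the buffer maps 1:1 onto a physical pixel.
    panel = (2560, 1600)
    looks_like = (1440, 900)

    framebuffer = (looks_like[0] * 2, looks_like[1] * 2)   # (2880, 1800)
    downscale = framebuffer[0] / panel[0]                  # 1.125

    print(framebuffer, f"downscaled by {downscale:.3f}x to fit the panel")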
Evidently the drop in sharpness is imperceptible to most people, but I can certainly tell, to the point where I forego the extra space and drop it back to the native resolution.
For a company that generally prides itself on its displays, I think the right option would be to just ship higher res panels matching the default resolution.
They have also done this with certain iPhone displays over the years, but at 400+ppi it’s well within the imperceptible territory for most people. For the 200-something ppi display on the MacBooks, not so.
My understanding of how scaled resolutions in macOS work is that graphics are always rendered at the display's native resolution. The scaling factor only decides the sizing of the rendered elements. Can you point to some documentation that supports your view? I'd like to learn if I'm wrong and understand all the details.
It used to be this non-native scaling was only an option and by default the MacBooks ran at the exact native panel resolution. But at some point that changed so the default is one “notch” on the “more space” slider. I presume most people preferred it that way as you don’t get a lot of text on the screen at the native “Retina” resolution. But the sharpness is worse than when running unscaled.
Uhh, both Sony and Dell had 1080p, 1200 vertical and then QHD laptops in form factors down to 13" before Apple. I owned both before I moved to Apple myself.
You can read people here talking about how their laptops have had high res displays when Apple announced "Retina" back in 2012: https://news.ycombinator.com/item?id=4099789
But it’s not just the resolution, it’s that Apple made such a high resolution usable via 2x rendering, and did so immediately for the entire system and all applications.
You can also get a 4K UHD Dell at 13".
> In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+
> But it’s not just the resolution
It was, above. Now it's the resolution and the ecosystem. "Apple did it first". "No they didn't." "Well, they were the first to do it right" ( for varying definitions of "right").
I have no particular horse in the game. In fact, my entire home ecosystem from Mac Pro to MBP to iPad, iPhone, Watch would more lean me in one particular direction, but ...
It's amazing how far we've come
There are some technological improvements that are so transformative (wifi, flash storage, high-resolution/"retina" display, LTE data, all-day battery life) that once you try them you never want to go back.
Then there are the changes that make you go "hmm..." (butterfly keyboard, touchbar without a hardware escape key, giant trackpad with broken palm rejection...)
- Phones, (Original iPhone way ahead of competitors)
- MP3 players (Original iPods)
- Tablets, (Pretty much the only serious tablet as far as I can see)
- Smart Watches, (Apple Watch still defines the category)
- Ultra Books, (First MacBook Air)
- All in One Desktop (iMac)
- Mobile CPUs (Apple silicon has been way ahead for years)
- Laptop CPUs (M1)
This doesn't just all happen by making existing things more "User Friendly". This takes real innovation to pull off.
I'm ex-Apple and an Apple fan as much as anyone, but I also have the benefit of being old. Not to take anything away from Apple's collective accomplishments, in many of these categories I'd say they "redefined" more than "defined".
There were many smartphones before the iPhone (the Palm Treos were great), many MP3 players before the iPod, many tablets before the iPad (the Microsoft Tablet PC came out about a decade before the first iPad), all-in-one PCs go back 40 years now, etc.
Pretty much, and they did a pretty good job too.
It's kind of the difference between invention and innovation.
Apple certainly does invent things, but they are a superlative innovator.
Xerox never made the Macintosh because they missed key innovations such as regions, the Finder, consistent UI guidelines, the ability to put a usable GUI OS in an affordable package.
Their monitor is great for a consumer or prosumer monitor but just because it has PRO in the name doesn't mean it can dance in the ring with the actual PRO displays that are used to master million dollar motion pictures.
The point of the 6k Pro HDR wasn't to compete with the one or two $40k monitors used to master million dollar motion pictures.
It was to replace the five to ten other $40k monitors used in other parts of the production pipeline, by being accurate/wide/bright enough for that purpose, and to provide good-enough accuracy to a whole swath of jobs where it was dearly needed but far too expensive.
Overkill for developers though.
And then there's the Dell UP2718Q, almost entirely the exact same specs, but only 4K, at 1600€. Released 2017, years before the XDR.
The ideal is 5K, which is double 2560x1440 in each dimension, but essentially the only 5K displays available are the LG one made in partnership with Apple, which is over a grand and has numerous quality issues, or the one built into the iMac. It’s really annoying.
Are these documented anywhere? I have one (it's glorious) and I've not had a problem with it, but it'd be interesting to see the list.
Here's a spec overview for those unfamiliar. https://www.displayspecifications.com/en/comparison/4f37c07a...
But you're right, the additional $3,000 is definitely noticeable - but whether it's noticeable enough to justify that price tag is another question.
Did you just compare that mediocre POS to an XDR?
What's so "mediocre POS" about them in your opinion?
And there is a huge difference between 4K and 6K, and a much higher DPI.
And the halo effect actually is an artifact from the backlighting which can be resolved with the backlight recalibration cycle, which the Pro Display XDR does automatically and invisibly during times with purely SDR content, while it has to be manually run and is very obtrusive on the Dell one, that much is true.
The monitors that are comparable to the Pro Display XDR are the ASUS PA32UCX ($4500) or the EIZO CG319X ($6000), which usually require full recalibration after a certain amount of usage.
> When we used to manufacture TVs, we'd produce them for customers in the southern hemisphere too. When building these, we had to run them through our production lines upside down.
> When the old cathode ray tube TVs were built, the earth's magnetic field was taken into account during the production process. This ensured the current flowed the right way through the TV and so the TV was able to function normally.
As for bleeding edge, Apple has pioneered plenty of technology. It's true that it builds upon foundations of technical designs and scientific discoveries by others but that applies to every other company as well. Very few organizations are capable of going straight from invention in a science lab to large scale commercial product all by themselves. If you judge by how much "new" technology has actually reached consumers though, Apple is clearly leading the field.
The closest competitor I’ve found is the PA32UCX at $4500, and supposedly its fan is noisy enough that I’m hesitant to buy it.
> it’s strange and new, and possibly unique to Apple.
But that is exactly how Windows 10 does it with HDR displays, too. So not really unique to Apple. To the article's benefit, it did say "possibly" :)
I'm using Win 10 myself with an HDR display, and HDR white appears brighter than "desktop" white just like in the article photo.
There is also a slider in Win10 HDR settings that allows you to bring SDR/"desktop" white up if you wish to oversaturate SDR.
When you toggle HDR on Windows, the desktop becomes dull gray and desaturated exactly because they pull down the previous desktop brightness to something less than 255. So you have to then adjust your monitor's brightness up to compensate. The monitor's brightness effectively sets the upper cap of the HDR brightness, so let's say your brightness was set at 50% before, now you've got to fiddle with the monitor to boost the screen brightness to 100% to allow HDR to function, and to achieve your previous desktop white brightness (you'll probably also have to adjust the software "desktop white" point slider you mentioned, since MS has no clue what the correct monitor brightness and SDR pull-down amount should be, so good luck matching your previous desktop colors and brightness). In my experience very few people successfully manage to set up their Windows HDR correctly, and even if you do there's no way to "seamlessly" switch between the two modes (which you have to do since tons of stuff on Windows doesn't work properly when HDR mode is enabled). I haven't checked Surface or other MS hardware, perhaps they're able to do something more clever there?
What Apple does: when your display brightness is at 50% and you display HDR content, the HDR content will seamlessly appear at a brightness somewhere between 75% and 100% of the maximum screen brightness. That is a seamless HDR effect, giving you the whiter-than-white experience next to your other windows, and it just works.
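A toy model of the "seamless" part, with made-up nit figures (Apple's actual panel numbers and curves aren't public in this form):

    # Toy model: what you see = backlight level x per-pixel transmittance.
    # When HDR content appears, the backlight goes up and every SDR pixel is
    # dimmed by the same factor, so SDR white doesn't visibly change.
    PANEL_PEAK_NITS = 500.0          # hypothetical panel maximum

    def apparent_nits(backlight_fraction, pixel_linear):
        return PANEL_PEAK_NITS * backlight_fraction * pixel_linear

    # Before: user brightness at 50%, UI white drawn at full pixel value.
    print(apparent_nits(0.5, 1.0))   # 250 nits of "UI white"

    # After: backlight at 100%, UI white compensated down to 0.5 linear...
    print(apparent_nits(1.0, 0.5))   # still 250 nits -> no visible jump
    # ...while HDR highlights get the freed-up headroom.
    print(apparent_nits(1.0, 1.0))   # 500 nits, brighter than UI white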
Though I remember having read that the Windows HDR stuff works slightly differently for internal monitors (e.g. in laptops), is your experience with those?
In other words, assume SDR "Game" mode is set to Backlight 50, SDR "Cinema" mode is set to backlight 25, all other settings are equal, and, therefore, 100% white is considerably brighter in "Game" mode.
Then both values for "white" cannot possibly match a single, fixed level set by any other mode, HDR or otherwise.
It's therefore impossible for Windows or any other input source to "just work" when switching from an arbitrary SDR mode to a preset HDR mode.
Again, assuming your TV works more or less like mine, and mode switches use the most-recently-used preset in the target "mode family" (meaning not only SDR and HDR, but also Dolby Vision, which maintains its own collection of presets), and that the various presets are independent of one another.
And if this is not the case, and presets cannot be set independently, then I'm glad I don't have a newer TV, because some of my presets have color settings that are wildly different from standard calibration (e.g., a preset resetting the display to its native, uncalibrated white point and gamut, used with video players capable of internal HDR tone mapping and color correction given a custom 3D LUT generated from measurements).
You have the HDR/SDR brightness balance to set and the monitor's own contrast adjustments.
Macs and its apps have been properly color managed for decades. That's why the transition from SDR to HDR monitors has been painless. Apps have been ready for it for a long time.
No, apps on Windows HDR look normal unless they are HDR-aware and use the "extra brightness".
I have a wide-gamut display and I can notice the difference between applications that incorrectly "stretch" sRGB to the display gamut versus apps that actually colour manage and map the colours correctly.
No app on Windows colour manages the UI widgets such as the icons, toolbars, etc... This is because the WDDM shell doesn't do any kind of colour management, it leaves that up to the application developers.
The sad thing is that Vista introduced an extremely wide scRGB gamut and WDDM had a number of internal features built around it. Unfortunately, it was only ever enabled for full-screen games, video overlays, and for internal use by apps that do colour management. They should have converted the entire desktop manager to use it.
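For anyone who hasn't run into it: as I understand scRGB, it's a linear encoding pinned to the 80-nit sRGB reference white, so brighter-than-white and wider-than-sRGB colors are simply values outside [0, 1]. A tiny sketch of that idea:

    # scRGB is linear, with (1.0, 1.0, 1.0) pinned to the 80-nit sRGB reference
    # white; brighter-than-white and out-of-gamut colors are just values outside
    # [0, 1]. (My understanding of how Windows' FP16 composition space works.)
    SCRGB_WHITE_NITS = 80.0

    def nits_to_scrgb(nits):
        return nits / SCRGB_WHITE_NITS

    print(nits_to_scrgb(80))     # 1.0  -> classic sRGB white
    print(nits_to_scrgb(1000))   # 12.5 -> an HDR highlight, no clipping needed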
In particular https://gitlab.freedesktop.org/swick/wayland-protocols/-/blo... was linked which discussed how a wayland compositor would have to display mixed "HDR" and SDR content on the same display. This document even has references to EDR. Ultimately this would end up achieving a similar result as what's described in the blog post here.
If you're interested in the technical details on what may be necessary to achieve something like this, the wayland design document might be a good read.
The first thing you think is, how was I OK with this terrible standard video to begin with? The HDR version just looks SO MUCH better and the standard looks so flat next to it. Like comparing an old non HDR photo with an HDR one.
What is the source for this? I don't see any justification for this claim in the article. There are plenty of ways to implement this feature that don't involve permanently throwing out dynamic range on all your SDR panels. I'm not even convinced from reading this that they aren't HDR panels to begin with - the idea of an iPhone having a 9-bit or 10-bit panel in it isn't that strange to me, and while that wouldn't be enough for like Professional-Grade HDR it's enough that you could pair it with dynamic backlight control and convince the average user that it's full HDR.
Considering Apple controls the whole stack and uses a compositor there's nothing stopping them from compositing to a 1010102 or 111110 framebuffer and then feeding that higher-precision color data to the panel and controlling the backlight. Since they control the hardware they can know how bright it will be (in nits) at various levels.
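A sketch of that idea (not Apple's actual pipeline): composite into an extended-range buffer where 1.0 is SDR white, then quantize to whatever precision the panel link carries:

    import numpy as np

    # Composite SDR and HDR layers into one extended-range linear buffer where
    # 1.0 == SDR white and values above 1.0 are HDR headroom (here up to 2.0,
    # i.e. a 2x backlight boost). Purely illustrative, not Apple's pipeline.
    HEADROOM = 2.0

    sdr_layer = np.array([0.0, 0.5, 1.0])    # linear SDR pixels (UI content)
    hdr_layer = np.array([0.0, 1.0, 2.0])    # linear HDR pixels (video content)
    extended = np.concatenate([sdr_layer, hdr_layer])

    # Quantize to a 10-bit signal whose top code corresponds to the boosted
    # backlight; the boost itself keeps SDR white at its original apparent nits.
    codes_10bit = np.round((extended / HEADROOM) ** (1 / 2.2) * 1023).astype(int)
    print(codes_10bit)    # SDR white lands mid-range (~747), HDR white at 1023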
I've also gotten into the habit of using uBlock Origin to kill all of the "Accept Cookies" type popups without ever clicking accept. Those are just as intolerable as ads to me.
On another note, I gave my junior the advice that she should install an ad blocker in her browser; her reply was: "How can I watch the commercials then?" Umm... OKAY.
1) people just didn't know better, or
2) they actually loved those flashing ads all along
> But at key parts of the story, certain colors eek outside of that self-imposed SDR container, to great effect. In a very emotional scene, brilliant pinks and purples explode off the screen — colors that not only had been absent from the film before that moment, but seemed altogether outside the spectrum of the story’s palette. Such a moment would not be possible without HDR.
I think the author knows that this is a special case of the effect where you limit the color palette to some range for the duration of the film and then exceed that range in places; no HDR is fundamentally needed to make this possible.
True, HDR can give a greater effect in absolute colorimetric terms when the full palette is revealed, but the perceived magnitude depends on how restricted the original palette was prior to the reveal, and how masterfully the effect is used in general.
It seems that the new MacBook Air with M1 can't play HDR content on external displays. :(
The MacBook Air with M1 has two USB 4 ports. USB 4 is Thunderbolt 3 + some extra bits.
See https://support.apple.com/en-us/HT201736 for information.
1) the software needs to render with the correct gamut profile (typically "P3.display")
2) the OS needs to have a reasonable color profile for the display that knows about the higher gamut
3) the display needs to be in the correct mode to interpret and render in the correct gamut.
My LG HDR400 display only has one obvious "quick" setting for HDR, but it behaves like what you said - it just drops gamma.
Monitors that are IPS (i.e. most monitors) with HDR are mostly fake.
If you want real HDR600, try the Samsung Odyssey G7; but even that does not have full-array local dimming (which you'd expect to spend $2k for).
Anything else, in my opinion, is like selling LTE as “5Ge”.
Also, no, the 400/600 etc. is just brightness: HDR10 and HDR 400 are the same, except HDR 400 specifies the level of brightness on the display.
You may be thinking of HDR True Black, which is a further enhancement, but something different. Deeper blacks can also be achieved by non-OLED displays that support full-array local dimming, and it looks like most displays that support FALD are also HDR displays.
Bit depth is the resolution of the color space.
> Also, no, the 400/600 etc. is just brightness: HDR10 and HDR 400 are the same, except HDR 400 specifies the level of brightness on the display.
VESA disagrees with you. Note the maximum black level luminance restrictions: a minimum brightness combined with a maximum black level implies a minimum contrast ratio. That's why legacy display technologies with mediocre static contrast (TN, VA, IPS) need backlight hacks to support those specifications.
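The arithmetic, with placeholder numbers rather than the actual spec values:

    # A cert that sets a minimum peak luminance AND a maximum black level
    # implicitly sets a minimum contrast ratio. Placeholder numbers below;
    # check the VESA DisplayHDR spec for the real limits.
    peak_nits = 600.0        # hypothetical required minimum peak
    max_black_nits = 0.1     # hypothetical allowed maximum black level

    print(f"{peak_nits / max_black_nits:.0f}:1")   # 6000:1, beyond typical IPS/TN static contrast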
For the 5-6k that this display from Apple costs, I would certainly expect it to be an OLED display at LEAST. But it's not.
How is this the best we can get and it's NOT OLED? Dimming zones for 6k? I don't understand what's going on, I just want a nice OLED monitor that will fit on my monitor arm. I'll even pay the same price that you can get an AMAZING 55" OLED TV from LG for; 1500-2k.
Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
And I'm not sure how legitimate this claim is, but I've read before that OLED suffers from dead pixels at a higher rate than other screen types; don't take that too seriously without proof.
I see this repeated a lot. Do you have any numbers/images on actual burn-in in OLED screens? It would be interesting to know how long an OLED screen remains usable when used as a PC monitor.
Or to put it another way: burn-in does not seem to be enough of a concern to prevent Samsung et al. from putting OLED screens into phones.
> Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
The entire point of Apple's solution here is that UI's max brightness is not the display's true max brightness.
The problem with contrast is more pronounced on my OLED tv than on my HDR LED monitor. I've also noticed on my TV if I watch netflix with standard size subtitles, the brightness overwhelms the lower part of the image on dark TV shows. I suspect this is less of an issue on LED monitors only because the contrast is not so extreme.
Again, all anecdotes. I do like OLED though, enough to make it my priority TV feature.
I'm not willing to give my email addy out for those sample clips either, so I guess I won't know too soon.
White is white. White on a display is the brightest point the display can display.
What Apple is doing, as the article explained, is showing regular white as gray. That's not cool, that's just stupid. It's exactly what TVs at Best Buy do when showing SD vs HD content: they ruin the regular image just so you can see the difference.
The issue is that my monitor is not a demo display, it’s what I use sometimes in daylight, and I’d very much appreciate that extra brightness that Apple takes away from me.
You know what this means for you? Everything you see and watch on your computer is not as bright as it could be. On an LCD screen that's a big deal, because suddenly your blacks are brighter (because the backlight is at 100%) and your whites are dimmer (because Apple saves brightness on the off chance that you have HDR content).
#ffffff is L=100%. What is L=800%? It exists in HDR content, and we can’t just make the web color #ffffff a dim gray to the eye.
We must start thinking in terms of HSL or LAB or even RGBL, and consider that L > 100% is where HDR peak brightness lives.
HDR’s color space exceeds the luminosity that sRGB hex triplets can represent, and remapping HDR color spaces into sRGB hex gives you horrendous banding and requires complex gamma functions. The CSS colors spec is finalizing on this, but essentially we’re at the last days of hex codes being a great way to express color on the web. They’ll remain good as a last resort, but it’s time to move a step forward.
Apple is pinning sRGB hex #ffffff to "paper white" brightness because the hex color specification can't encompass the full spectrum of monitors anymore. The difference between #ffffff and #fefefe can be enormous on a display with 1800 nits of peak brightness, and if you map #ffffff to peak brightness, you burn out people's eyes with every single web page on today's legacy sRGB-color Internet (including Hacker News!). That's why HDR calibration routines put "paper white" at around 400 nits.
So, then, sRGB hex colors have no way to express “significantly brighter than paper white #ffffff”, and UI elements have little reason to use this extended opportunity space - but HDR content does, and it’s nice to see Apple allowing it through to the display controller.
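To see why hex codes run out of room, here's a sketch using the PQ (SMPTE ST 2084) curve that HDR video signals use; the paper-white value below is just an example:

    # PQ (SMPTE ST 2084) maps absolute luminance (0-10,000 nits) to a signal
    # value, so "paper white" and a bright highlight get distinct codes instead
    # of both clipping to #ffffff the way 8-bit sRGB does.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits):
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    paper_white = 203    # example reference white, in nits
    highlight = 1600     # e.g. a specular highlight on a very bright panel

    for nits in (paper_white, highlight):
        print(nits, "nits ->", round(pq_encode(nits) * 1023), "/ 1023 (10-bit PQ)")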
But there's no way to make use of HDR in web content - other than embedded images and videos - if we continue thinking of color in terms of hex codes. This insistence that we remap hex codes into thousands of nits of spectrum is why web colors in Firefox on an HDR display make your eyes hurt (such as the HN topbar): it's rescaling the web to peak brightness rather than to paper white, and the result is physically traumatic to our vision system. Human eyes are designed for splashes of peak brightness, but when every web page is pouring light out of your monitor at full intensity, it causes eye strain and fatigue. Don't be like Firefox in this regard.
“But how do we conceive of color, if not in hex codes?” is a great question, and it’s a complicated question. In essence you select color and brightness independent of each other, and then make sure that it looks good when peak brightness is low, and doesn’t sear your eyes when peak brightness is high.
If this interests you, and you’d like to start preparing for a future where colors can be dimmer or brighter than sRGB hex #FFFFFF, here are a couple useful links to get you started:
As a final note, there are thermal reasons why peak brightness can be so much higher than paperwhite: your display can only use so much power for its thermal envelope. Yes, HDR displays have thermal envelopes. So overusing peak white, such as scaling #ffffff to the wrong brightness, can actually cause the total brightness of the display to drop when it hits thermal protections, while simultaneously wasting battery and hurting your users’ eyes.
It's gonna be five minutes before everybody's extra-important call-to-action buttons, and probably ads, are max brightness too.
I've always wondered what it'll take for the major OS manufacturers to implement an anti-seizure filter for the content they transmit to their screens, and I'd bet that a flickering ad at HDR max brightness causing seizures worldwide one day will finally compel them to do so.