
> If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR?

If you set #ffffff to be a comfortable max, then that would be the brightness cap for HDR flares that fill the entire screen.

But filling the entire screen like that rarely happens. Smaller flares would have a higher cap.

For example, let's say an HDR scene has an average brightness that's 55% of #ffffff, but a tenth of the screen is up at 200% of #ffffff. That should give you a visually impressive boosted range without blinding you.
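To make those illustrative numbers concrete, here's a quick arithmetic check in Python (the 55%, 200%, and 10% figures are just the example above, not a standard of any kind):

    # Illustrative check of the example above. Brightness is expressed
    # relative to the user's SDR white (#ffffff == 1.0).

    flare_area = 0.10      # a tenth of the screen...
    flare_level = 2.00     # ...at 200% of SDR white
    scene_average = 0.55   # target average brightness for the whole frame

    # What the remaining 90% of the screen must average to hit that target:
    rest_average = (scene_average - flare_area * flare_level) / (1 - flare_area)
    print(f"rest of the screen averages {rest_average:.0%} of SDR white")  # ~39%

    # Total emitted light, relative to a full screen of SDR white:
    print(f"whole frame emits {scene_average:.0%} of a full-white SDR page")

So even with the flare at double SDR white, the frame as a whole emits barely half the light of a plain full-white SDR page.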


Oh.

I don't want the ability for 10% of the screen to be so bright it hurts my eyes. That's the exact thing I want to avoid. I don't understand why you think your suggestion would help. I want SDR FFFFFF to be the brightest any part of my screen goes to, because that's what I've configured to be at a comfortable value using my OS brightness controls.


I strongly doubt that the brightness that hurts your eyes is the same for 10% of the screen as for 100% of the screen.

I am not suggesting anything that hurts your eyes. The opposite, really: I'm suggesting a curve that stays similarly comfortable at all sizes.


I don't want any one part of my screen to be a stupidly bright point light. It's not just the total amount of photons that matters.

It is not just the total amount.

But it's not the brightest spot either.

It's in between.


I just don't want your "in between" "only hurt my eyes a little" solution. I don't see how that's so hard to understand. I set my brightness so that SDR FFFFFF is a comfortable max brightness. I don't understand why web content should be allowed to go brighter than that.

I'm suggesting something that WON'T hurt your eyes. I don't see how that's so hard to understand.

You set a comfortable max brightness for the entire screen.

Comfortable max brightness for small parts of the screen is a different brightness. Comfortable. NO eye hurting.


It's still uncomfortable to have 10% of the screen get ridiculously bright.

Yes, it's uncomfortable to have it get "ridiculously" bright.

But there's a level that is comfortable that is higher than what you set for FFFFFF.

And the comfortable level for 1% of the screen is even higher.

HDR could take advantage of that to make more realistic scenes without making you uncomfortable. If it was coded right to respect your limits. Which it probably isn't right now. But it could be.
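One way to picture such a curve (a minimal sketch with a made-up power-law shape and constants like `peak_boost = 4.0`; these are assumptions, not any shipping algorithm): the allowed boost over the user's configured SDR white grows as the bright region shrinks, and is pinned to 1.0 for a full-screen highlight, so nothing ever exceeds the brightness the user already set.

    def allowed_boost(area_fraction: float,
                      peak_boost: float = 4.0,
                      gamma: float = 0.5) -> float:
        """Hypothetical comfort cap: the max multiplier over the user's SDR
        white that a bright region covering `area_fraction` of the screen
        may use.

        - area_fraction = 1.0 (full screen) -> 1.0, never above SDR white
        - smaller regions -> higher cap, up to `peak_boost` for tiny blips
        The power-law shape and the constants are illustrative guesses.
        """
        area_fraction = min(max(area_fraction, 1e-4), 1.0)  # avoid div by zero
        return min(peak_boost, area_fraction ** -gamma)

    for area in (1.0, 0.10, 0.01):
        print(f"{area:>5.0%} of screen -> up to {allowed_boost(area):.2f}x SDR white")

With these particular constants, a full-screen highlight stays at the user's SDR white, 10% of the screen could go to roughly 3x, and a 1% highlight would hit the 4x ceiling.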


I severely doubt that I could ever be comfortable with 10% of my screen getting much brighter than the value I set as max brightness.

But say you're right. Now you've achieved images looking completely out of place. You've achieved making the surrounding GUI look grey instead of white. And the screen looks broken when it suddenly dims after switching tabs away from one with an HDR video. What's the point? Even ignoring the painful aspects (which is a big thing to ignore, since my laptop currently physically hurts me at night with no setting to make it not hurt me, which I don't appreciate), you're just making the experience of browsing the web worse. Why?


In general, people report that HDR content looks more realistic and pretty. That's the point, if it can be done without hurting you.

Do they? Do people report that an HDR image on a web page that takes up roughly 10% of the screen looks more realistic? Do they report that an HDR YouTube video, which mostly consists of a screen recording with the recorded SDR FFF being mapped to the brightness of the sun, looks pretty? Do people like it when their light-mode GUI suddenly turns grey as part of it becomes 10x the brightness of what used to be white? (See e.g. https://floss.social/@mort/115147174361502259)

Because that's what HDR web content is.

HDR movies playing on a living-room TV? Sure, nothing against that. I mean, it's stupid that it tries to achieve some kind of absolute brightness, but in principle, some form of "brighter than SDR FFF" could make sense there. But for web content, surrounded by an SDR GUI?


> when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white

I don't know why you're asking me about examples that violate the rules I proposed. No I don't want that.

And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up. I am aware that HDR can be done wrong...

But for HDR videos where the HDR actually makes sense, yeah it's fine for highlights in the video to be a little brighter than the GUI around them, or for tiny little blips to be significantly brighter. Not enough to make it look gray like the misbehavior you linked.


> I don't know why you're asking me about examples that violate the rules I proposed. No I don't want that.

Other than the exaggerated 10x, I don't understand how it violates the rules you proposed. You proposed a scheme where part of the screen should be allowed to be significantly brighter than the surrounding SDR GUI's FFF. That makes the surrounding GUI look grey.

> And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up.

I'm bringing it up because that's how HDR looks on the web. Most web content isn't made by professional movie studios.

The example video I linked conforms to your suggested rules, FWIW: most of the image is near black, and only a relatively small part of it is white. The average brightness probably isn't over SDR FFF. Yet it still hurts.


The whole chip in the middle is brighter than white. Half that video is super bright, which puts this example way beyond what I was suggesting in both area and average brightness.

> most of the image is near black, only a relatively smart part of it is white. The average brightness probably isn't over SDR FFF.

It's a lot more than I suggested, and I said an average brightness of about half of FFF in my example.

Also if I knew you were going to hammer on the loose example numbers I would have said 2% or 1%.

> I'm bringing it up because that's how HDR looks on the web.

But I'm not defending how it looks. I'm defending how it could look, since you don't see why anyone would even want HDR on the web.


This is going on for too long. Maybe you can somehow find a way to process all HDR content so that it's reasonable (i.e. it never makes the surrounding SDR GUI look grey and never makes bright spots that are bright enough to hurt) across all screens imaginable and all contexts. Maybe. I have my doubts, but go ahead.

Convince the web standards bodies and browser implementers and transform the world into one where HDR on the web is perfect and never causes issues.

But until that's done, there's a simple solution: Just don't support HDR. Until your hypothetical perfect solution is universally implemented, it does more harm than good on the web and should not be supported.

I don't see why anyone would want HDR on the web in its current form.


Well, the reason I was talking about limits that way is that it's something screens already do when displaying HDR content. They can't run at full power over much of their area, and the larger the bright region, the dimmer the limit gets. So this would repurpose those existing limiter algorithms, with some tweaking (sketched below).
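For what repurposing such a limiter might look like in software, here's a minimal sketch (assumed, simplified logic: one global compression per frame and the same made-up comfort curve as above; real panel limiters act on electrical load in hardware, and no browser is known to ship anything like this today):

    import numpy as np

    def clamp_hdr_frame(frame: np.ndarray) -> np.ndarray:
        """Compress an HDR frame so bright regions respect an area-based cap.

        `frame` holds linear luminance with 1.0 == the user's configured SDR
        white. This is a simplified, assumed scheme, not any real browser's
        or panel's algorithm.
        """
        over = frame > 1.0
        bright_area = float(over.mean())   # fraction of pixels above SDR white
        if bright_area == 0.0:
            return frame                   # nothing exceeds the user's white point
        # Same hypothetical comfort curve as the earlier sketch:
        cap = min(4.0, bright_area ** -0.5)
        peak = float(frame.max())
        if peak <= cap:
            return frame                   # already within the comfort cap
        # Remap (1.0, peak] linearly onto (1.0, cap]; pixels at or below SDR
        # white are untouched, so the surrounding GUI never looks grey.
        compressed = 1.0 + (frame - 1.0) * (cap - 1.0) / (peak - 1.0)
        return np.where(over, compressed, frame)

A real implementation would presumably also smooth the gain over time so brightness doesn't flicker from frame to frame.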

It's not very hard on a technical level.

And no, it doesn't have to be universal and perfect to reach the point where HDR is a benefit. There are some blatant flaws that need fixing, and just a few fixes would get us a lot closer.


It has to be universally not harmful.

Again, go convince standards bodies and browser implementers to implement those algorithms after doing studies to demonstrate that it fixes the issue. Until then, I just don't want it in my browser.



