
In a practical sense, if you are doing lots of complicated blending or gradients in Photoshop, you can set your mode to 16-bit. It will prevent a lot of banding issues, and you can convert back to 8-bit with dithering when saving out to a common format. A rough sketch of that last step is below.
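This is only my own minimal sketch of dithered bit-depth reduction (not Photoshop's actual algorithm), assuming the 16-bit image is available as a NumPy uint16 array: a little sub-LSB noise is added before rounding, so smooth ramps turn into fine grain instead of visible steps.

    import numpy as np

    def to_8bit_dithered(img16, rng=None):
        """Convert a uint16 image to uint8, adding ~1 LSB of noise
        before quantization so banding becomes fine grain instead."""
        if rng is None:
            rng = np.random.default_rng()
        scaled = img16.astype(np.float64) / 257.0         # 0..65535 -> 0..255
        noise = rng.uniform(-0.5, 0.5, size=img16.shape)  # uniform dither
        return np.clip(np.round(scaled + noise), 0, 255).astype(np.uint8)

    # Example: a smooth 16-bit ramp that bands badly if truncated directly.
    ramp16 = np.tile(np.linspace(0, 65535, 4096, dtype=np.uint16), (256, 1))
    ramp8 = to_8bit_dithered(ramp16)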



I do film color work as part of my job. It's nuts how many hoops one has to jump through to deliver a proper, or almost proper, image experience on a variety of viewing devices. It's akin to sound mastering.

When there are visible gradients, more or less the only thing you can do is introduce artificial monochromatic noise into the image to hide the perception of a stepped gradient (see the sketch below).
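Here is a rough illustration (my own sketch, not any grading tool's actual workflow) of what "monochromatic" noise means in practice: one shared noise field added to all three channels, so the grain has no colour of its own, applied to a shallow gradient before it is quantized for an 8-bit display.

    import numpy as np

    def add_mono_grain(rgb_float, strength=1.0 / 255.0, rng=None):
        """rgb_float: HxWx3 array in 0..1. Adds the same noise field to
        every channel before the image is rounded to 8-bit."""
        if rng is None:
            rng = np.random.default_rng()
        grain = rng.normal(0.0, strength, size=rgb_float.shape[:2])
        dithered = rgb_float + grain[..., None]   # broadcast to R, G, B
        return np.clip(dithered, 0.0, 1.0)

    # A shallow gradient that bands visibly when rounded straight to 8-bit:
    h, w = 540, 1920
    grad = np.linspace(0.20, 0.30, w)[None, :, None] * np.ones((h, 1, 3))
    out8 = np.round(add_mono_grain(grad) * 255).astype(np.uint8)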

I would like to see an industry-wide push for a consumer-grade (at least) 10-bit signal chain, from graphics cards to monitors, with high dynamic range. That would have more impact on image quality than the crap being pushed now, like 4K and VR.

HDR B4 4K, chaps!


This is kinda happening - the "UHD Alliance Premium Certified" spec for TVs mandates 10-bit from the input to the panel. It's a shame the UHD Blu-ray standard doesn't mandate 10-bit, though hopefully most discs will use it :) Dolby Vision mandates 12-bit mastering and delivery, though it sounds like 10-bit connections to the panel can be considered acceptable...


Now if we could only convince Nvidia to give us 10-bit output on all cards, not just Quadros. That would be great.



