Really, Krita would benefit massively from brushes being implemented on the GPU, but the developers are (rightly) worried about the massive increase in complexity.
I believe the author meant 265, not 256. Typical programmer typo.
I've long been using Painter for hobbyist digital painting, but I'm not very happy about the price, the occasionally shoddy stability, and the in-app purchase upsell Corel is nowadays pushing with every upgrade. I don't use the crazy artistic brushes very much because they're so slow; I'm just looking for something with a decent set of customizable gouache/acrylic-type brushes.
And the brush system is amazing (it also has nice auto-smooth options). You can customize everything. There are things Krita can do with brushes that Photoshop can't, which lets me use my pen's tilt support to the fullest. You can tweak the properties of any brush individually, even invert them. The only thing Photoshop has over it is real dynamic brushes (similar to Corel's simulated brushes but not as realistic, although faster), but because of limits on which properties you can customize I could never get them to work how I wanted, and the useful ones (with more than a couple of bristles) were too slow to be of much use.
Regarding brush presets, I don't remember what comes built in (I removed 90% of them), but people have created some nice brush packs with all sorts of brushes to help get you started.
One feature I love is its coloring tool: you only need to make a stroke or two on an area to specify how you want it colored.
I've colored this in about 5 minutes...
Thank you for the effort of getting HDR to work on ANGLE.
I suspect that some of it might be waiting for Windows' and macOS's HDR support to become a bit more stable before writing too much code.
Quite the opposite of displaying HDR imagery on an HDR monitor.
It provides finer gradations to reduce banding, but doesn't change the tone response curve to add more highlight brightness.
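To put rough numbers on that, here's a tiny sketch (plain Python, purely illustrative) of what the extra bits buy you: the same normalized gradient gets four times as many steps between black and the same white point, which is what reduces the banding, while the top code value still means plain display white.

    # Quantize the same normalized (0..1) signal at 8 and 10 bits.
    # More bits = smaller steps between adjacent levels, so smoother gradients;
    # the maximum code value still maps to the same display white, not a brighter one.

    def quantize(value, bits):
        levels = (1 << bits) - 1      # 255 levels for 8-bit, 1023 for 10-bit
        return round(value * levels) / levels

    step_8bit = 1 / 255               # ~0.4% of full range per step
    step_10bit = 1 / 1023             # ~0.1% per step, much harder to see as bands

    print(f"8-bit step:  {step_8bit:.5f}")
    print(f"10-bit step: {step_10bit:.5f}")
    print(quantize(1.0, 8), quantize(1.0, 10))   # both print 1.0: same white point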
Can't it use H.264 instead for better performance? H.265 is so slow...
But at the very least HLG works on H.264.
That leaves software decode, which is pretty hard on phones and other constrained devices.
> Of course, at this moment, only Windows 10 supports HDR monitors, and only with some very specific hardware. Your CPU and GPU need to be new enough, and you need to have a monitor that supports HDR. We know that the brave folks at Intel are working on HDR support for Linux, though!
Don't they consider P3 to be HDR, or is there something else that disqualifies the Macs that have Display P3?
I don’t know if Apple makes HDR displays.
So they're super nice; I'm very happy with my 2017 model and it's the best display I've ever used, but it's not HDR.
There was a lawsuit about dithered displays back in the 90s that made them write "millions of colors" instead of "16 million colors", so I'd say it's unlikely they'd claim 10 bits without actually providing them.
So their fanciest displays are both Display P3 and 10 bits per channel.
In the past, it was used to get finer gradations, which is in fact fairly important if you are working with black and white, to avoid banding.
HDR involves mapping the 10 bits over a different tone response curve that covers a wider output dynamic range.
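As a rough illustration (my own sketch, assuming an HDR10-style pipeline that uses the PQ / SMPTE ST 2084 curve, and a nominal 100-nit SDR white for the gamma case), the same normalized signal lands on very different luminances depending on which curve it's interpreted through:

    # Map a normalized signal (0..1) to display luminance in nits under two curves:
    # a plain 2.2 gamma scaled to a 100-nit SDR white, and the PQ EOTF used by HDR10,
    # which covers 0..10000 nits and spends most of its code values on the dark end.

    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def sdr_gamma_nits(signal, white_nits=100.0):
        return white_nits * signal ** 2.2

    def pq_eotf_nits(signal):
        p = signal ** (1 / M2)
        return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    for code in (128, 512, 1023):     # 10-bit code values
        s = code / 1023
        print(code, round(sdr_gamma_nits(s), 1), round(pq_eotf_nits(s), 1))

The mid-range code 512 comes out around 22 nits under the gamma curve but roughly 92 nits under PQ, and the top code goes from 100 to 10,000 nits; that's the "wider output dynamic range" part, even though the bit depth itself hasn't changed.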
Mind, even on Windows it's pretty messy, and you have to do a lot of figuring things out and hacking on platform layers to make it work.