It's intriguing, because I wonder how this would affect police work. I'm speculating here, but I assume that when a profile sketch is developed, every officer using that image knows it's "just a sketch" because it looks like a drawing, since it is one.
So what happens if you now generate a photorealistic "sketch" from a description? Are officers going to be sufficiently aware that it's not an actual photo of the guy they're looking for, and act accordingly? Or is it going to heavily bias a manhunt? Moreover, what happens when the generated photo randomly ends up closely resembling someone who's actually in the training dataset?
The police already know those sketches are super fake lol. The point of those isn't to arrest the right person, it's to have an excuse to hassle, arrest, or maybe kill a minority.
I know that would solve the problem. What I'm curious about is what happens if it isn't done, because I can see a "photorealistic AI sketch artist" being a very attractive pitch for a certain type of manager. It's just a thought experiment.