
Google's AI can now identify faces from heavily pixellated images - rfreytag
https://www.wired.co.uk/article/google-brain-ai-pictures-blur
======
ageitgey
This is very poor reporting. The paper has nothing to do with identifying who
is in a low-res image. It's about scaling up low-res images by totally making
up the missing parts based on a statistical model.

I guarantee the authors of this paper would be horrified by this article and
how their work was misrepresented. The first paragraph of the research paper
literally says "We demonstrate that this model produces _a diversity of
plausible high resolution images_ "

The part at the end of the story about this being applied to CCTV images is
especially misguided.

But the bigger lesson here to all researchers is that they need to be aware
that this is likely to happen with their work! They need to make sure they are
communicating their work as clearly as possible to reduce the risk of this
happening (maybe by posting a layperson-readable blog post to go with the
paper that clearly says what it does and doesn't do).

~~~
deckarep
Just glancing at the examples in the paper shows that this article is
completely sensationalized and misses the point. Good call-out.

------
chmod775
It absolutely can't identify faces. That's not even what it's intended to do.
All it can do is create a believable high resolution face for a given
pixelated one.

~~~
GuiA
_> All it can do is create a believable high resolution face for a given
pixelated one._

Yes, and it should be emphasized that the generated face carries no guarantee
whatsoever of matching the original one, beyond sharing the same heavily
downscaled version.
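A minimal sketch of why (mine, not from the paper, using simple block-average
downscaling as a stand-in for pixelation): downscaling is a many-to-one map,
so arbitrarily many distinct high-res images collapse to the exact same
pixelated one, and no model can tell which was the original.

```python
import numpy as np

rng = np.random.default_rng(0)

def downscale(img, f=4):
    """Block-average downscaling: each f x f block becomes one pixel."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

# A hypothetical 8x8 "pixelated" image.
low = rng.random((8, 8))

# Image A: each low-res pixel expanded into a flat 4x4 block.
a = np.kron(low, np.ones((4, 4)))

# Image B: image A plus noise adjusted to have zero mean within every
# 4x4 block, so the added detail vanishes under downscaling.
noise = rng.normal(size=a.shape)
noise -= np.kron(downscale(noise), np.ones((4, 4)))
b = a + noise

# Two visibly different 32x32 images...
print(np.allclose(a, b))                        # False
# ...that are indistinguishable once pixelated.
print(np.allclose(downscale(a), downscale(b)))  # True
```

Any super-resolution system picking between A and B (or the infinitely many
other preimages) is guessing from a prior, not recovering information.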

------
GuiA
_In the future, with further development, similar systems could be developed
to add detail to pictures and video that are low resolution. One current
category that lends itself to this is blurry CCTV images._

This is statistical extrapolation pushed to the point of absurdity, and a
series of civil rights disasters waiting to happen.

Look at the outputs compared to the ground truth - the faces look absolutely
nothing alike.

It should be a real concern that some will see these systems as magical -
able to recover information that just isn't there - and advocate for their
use in real-life situations, such as the security cameras mentioned in the
article. How long until an innocent person is sentenced to life in prison, or
even to death, because a neural net generated a face out of blurry footage
that closely resembles theirs?

This is basically a modern polygraph in the making - "garbage in, garbage out"
framed as science.

~~~
rando444
While I agree with your statement, I was actually very impressed to see the
level of detail they were able to achieve with this technology.

------
ibdf
Ah-ha! TV shows will no longer have to fake it. Too bad the news is old and
the title is all wrong.

------
sharemywin
As with a lot of these articles, I wonder what the examples they don't show
look like.

