
We Built a Facial Recognition Machine for $60 - pseudolus
https://www.nytimes.com/interactive/2019/04/16/opinion/facial-recognition-new-york-city.html
======
npip99
As powerful as this technology is, you can't make it "illegal" to take
photos of a public park, bring them home, and do whatever you want with
them at home. That would just be absurd. Banning only the software online
isn't really an option, because the knowledge of how the software works is
already public. Anyone could rewrite the code on their own (especially
because 300-line neural nets, with no libraries used, are still pretty damn
good at facial recognition). The real privacy protection should absolutely
be on the data side. There should not be public databases of profiles. If
you have a big Insta profile or you are a big YouTuber, then yeah, you'll
be public even though you're not a celebrity. That was a choice. Random
people should be private. No mugshots should be freely available, and
photos on social media profiles should be obscured by default. Or,
alternatively, we accept that this is the way things are.
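To see why banning the software is hopeless: once face embeddings exist, the actual "recognition" step is a few lines of arithmetic. A minimal sketch (the embeddings here are toy 4-d vectors standing in for real 128-d face vectors from any public model; the names and threshold are made up for illustration):

```python
import numpy as np

def match(query, database, threshold=0.6):
    """Return names whose stored embedding is within `threshold`
    cosine distance of the query embedding."""
    hits = []
    for name, emb in database.items():
        cos = np.dot(query, emb) / (np.linalg.norm(query) * np.linalg.norm(emb))
        if 1.0 - cos < threshold:
            hits.append(name)
    return hits

# Toy 4-d "embeddings" standing in for real face vectors.
db = {"alice": np.array([1.0, 0.0, 0.0, 0.0]),
      "bob":   np.array([0.0, 1.0, 0.0, 0.0])}
print(match(np.array([0.9, 0.1, 0.0, 0.0]), db))  # close to "alice"
```

The hard-to-replicate part is the database of labeled profiles, not the matching code, which is exactly the point about regulating the data side.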

~~~
jermaustin1
Policing the data should be the way forward. Just as I cannot take your PII
and publish it without your consent, the same should go for your dataset of
faces. The only trouble I see is: what constitutes a face for facial
recognition vs. a face that happens to be in the background of my Instagram
photo?

Take this photo I took of the Manhattan Bridge in Dumbo [1]; there are a few
faces in that photograph that are quite prominent. I'm protected in the US
because it was taken in a public place, but in the future would I need to
manually blur out the faces of those who did not sign a model release?

1: [https://megapickles.photo/2018/06/03/dumbo-washington-
street...](https://megapickles.photo/2018/06/03/dumbo-washington-street/)

~~~
OkGoDoIt
Can you really not take PII and publish it without consent? Let’s say you
found it somewhere publicly or in some other fashion where a user has not
agreed to a privacy policy. I’m not saying you should do that or that it’s
morally right, but are there actually legal protections against that outside
of Europe?

~~~
jjeaff
Not that I know of. As long as what you publish isn't defamatory, there is no
law against publishing PII.

------
tarasmatsyk
The whole article seems to be a marketing post.

I completely agree that you could build a face recognition system for $60 or
less. However, the image match described in the post is unlikely to be a
real system result. I've been studying image recognition for a year now,
and it is almost impossible to train a model without thousands of images
that have been classified somehow (usually by hand). One of the biggest
problems in AI is that a NN cannot match an object across two images taken
from different perspectives, unless R. Madonna was hanging out there for 9
hours to be filmed from different angles so the camera could collect enough
photos (not taking into account different image resolutions and camera
quality).

Thanks for the article; however, the myth is BUSTED :D

~~~
mabbo
> One of the biggest problems in AI is that a NN cannot match an object
> across two images taken from different perspectives

I think you've missed the last five-ish years of NN development. Some of the
latest Face2Vec work can handle different perspectives pretty well, either
by building a system that predicts a 3D representation of the face from one
shot [0], or by training on two different perspectives of the same face and
having the system output the same vector (I've read some work on this, but
Google is failing me at finding a reference).

If _you_ can recognize people from different angles, computers will
eventually be able to do it better than you. They probably already can.

[0][https://arxiv.org/pdf/1703.10714.pdf](https://arxiv.org/pdf/1703.10714.pdf)
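The "same vector for two perspectives" training idea is usually expressed as a triplet objective (FaceNet-style): pull two views of the same face together, push a different face away. A toy sketch with made-up 2-d vectors in place of learned embeddings:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """The anchor view should be closer to another view of the same face
    (positive) than to a different face (negative), by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(d_pos - d_neg + margin, 0.0)

a = np.array([1.0, 0.0])   # one view of a face
p = np.array([0.9, 0.1])   # same face, different angle
n = np.array([0.0, 1.0])   # a different face
print(triplet_loss(a, p, n))  # 0.0: the constraint is already satisfied
```

Training drives this loss toward zero across millions of such triplets, which is what makes the resulting embeddings pose-robust.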

~~~
bsenftner
Bingo. 1) Train an algorithm that, given any angle view of a human head,
produces a 3D reconstruction of that head. (The more of the face that is
visible, the more accurate the reconstruction.) 2) Train an algorithm that,
given a human head geometry, identifies any deformation due to facial
expression and neutralizes that expression. 3) Trivial algorithm: given a
human head geometry not facing the camera, rotate the head so that it faces
the camera. 4) Render the head geometry, expression neutralized and facing
the camera, using the source video image as the texture map for the head.
5) Perform FR.
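A structural sketch of that five-step pipeline (every function here is a stub standing in for a trained model or a renderer, not a real implementation):

```python
import numpy as np

def reconstruct_3d(frame):
    # 1) any-angle view -> 3D head geometry (a learned model in practice)
    return {"geometry": frame, "pose": "unknown", "expression": "smile"}

def neutralize_expression(head):
    # 2) detect expression-driven deformation and remove it
    return {**head, "expression": "neutral"}

def rotate_to_frontal(head):
    # 3) rotate the geometry so the head faces the camera
    return {**head, "pose": "frontal"}

def render_textured(head, frame):
    # 4) render the neutral, frontal head using the video frame as texture
    return frame  # stub: a real renderer would output a frontal face image

def recognize(image):
    # 5) run ordinary frontal face recognition on the normalized render
    return "match-or-no-match"

frame = np.zeros((64, 64, 3))
head = rotate_to_frontal(neutralize_expression(reconstruct_3d(frame)))
print(recognize(render_textured(head, frame)))
```

The value of structuring it this way is that step 5 becomes an off-the-shelf frontal-face matcher; all the pose and expression variation is normalized away before recognition runs.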

------
rgoulter
The use of the word "legal" in the title seems charged to me. (EDIT: The
edited HN title is better now, thanks).

Especially given, from the article:

> In the United States, there are no federal laws that restrict the use of
> facial recognition.

~~~
nickelcitymario
Unless I'm terribly mistaken, laws generally only describe what's illegal. We
don't write laws to tell you what you can do; we write them to tell you what
you can't.

So if there are no laws saying you can't do it, it's legal.

~~~
jessriedel
rgoulter isn't saying that the title is wrong; he's saying that it's
unnecessarily charged. By explicitly specifying that this technique for face
recognition is legal, it suggests that _other_ techniques are illegal, and so
something bad (but legal) might have been achieved. But in fact, none of this
is illegal; the headline is just clickbait.

~~~
nickelcitymario
Fair enough, I see your point.

But I'd argue the use of "legal" makes a real point: we assume this sort of
thing can't be legal today, and the article shows that it is. So I'm not
saying you're wrong, simply that I disagree.

------
veryworried
What exactly does an "illegal" facial recognition machine look like?

~~~
Nasrudith
I think it would need to intrinsically commit another crime in the course of
its operation.

Like some horrific machine that obtains shape information by crushing heads
and then states whether the now-crushed head was recognized. Needless to
say, there is no reason to do that.

------
galadran
This would be illegal in the EU under the GDPR, which would require positive
consent from the individuals monitored (simply walking into a space would not
be enough). However, individual countries are allowed to alter this aspect of
the Regulation themselves, so countries may effectively "opt in" to the use
of facial recognition in more settings. Quite different from the opt-out
relationship of the US states.

Source:
[https://www.lexology.com/library/detail.aspx?g=c0714997-42b1...](https://www.lexology.com/library/detail.aspx?g=c0714997-42b1-4436-a1ba-5436ecdaaf56)

~~~
jermaustin1
So how does that work when I'm in a park taking pictures to share on my public
Instagram? Do I need to get a model release from each person that walks into
my frame? What defines monitoring?

~~~
tzs
Article 2: "This Regulation does not apply to the processing of personal data
[...] by a natural person in the course of a purely personal or household
activity".

The recital associated with that explains it: "This Regulation does not apply
to the processing of personal data by a natural person in the course of a
purely personal or household activity and thus with no connection to a
professional or commercial activity. Personal or household activities could
include correspondence and the holding of addresses, or social networking and
online activity undertaken within the context of such activities. However,
this Regulation applies to controllers or processors which provide the means
for processing personal data for such personal or household activities."

It sounds like GDPR wouldn't require you to get a release...but it might
require Instagram to require you to get a release.

~~~
jermaustin1
So that means if I plan to sell prints of my photos, I would have to get
releases even though I didn't want anyone in my shot; but given that it's a
touristy place, people are inevitably in the frame. Take this photo, for
instance [1].

1: [https://megapickles.photo/2018/06/03/dumbo-washington-
street...](https://megapickles.photo/2018/06/03/dumbo-washington-street/)

------
DoubleCribble
How long until we see this as a monetized SaaS? (ed note: that's Stalking as a
Service)

