
> Communities such as San Diego, California are using mobile biometric readers to take pictures of people on the street or in their homes and immediately identify them and enroll them in face recognition databases.

In their homes? I'm pretty sure this is illegal. Or is it not if they are visible from a public area?

So imagine a scenario where a delivery company outfits its delivery people with body cameras "for safety" (a legitimate concern), but then sells access to the feeds to a biometrics company as an additional revenue stream and lets them extract data from that video.

They know the address being delivered to, and now they can map the presence of specific biometric data to that location and time.
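To make the scenario concrete, here is a minimal sketch of what that kind of record linkage might look like. Everything here is hypothetical: the placeholder embedding vector, the `log_sighting` helper, and the flat list of records are illustrative, not any real vendor's schema.

```python
from datetime import datetime, timezone

# Hypothetical store: each entry ties a biometric template (here just a
# placeholder vector) to the delivery address and a timestamp.
sightings: list[dict] = []

def log_sighting(embedding: list[float], address: str) -> None:
    """Record that a given face template was seen at a known address."""
    sightings.append({
        "embedding": embedding,
        "address": address,
        "seen_at": datetime.now(timezone.utc).isoformat(),
    })

# One delivery stop yields a location-and-time record for each face seen,
# because the delivery company already knows exactly where the camera is.
log_sighting([0.12, 0.98, 0.33], "123 Example St")
```

The point of the sketch is that no photo needs to be retained: a small vector plus an address and a timestamp is enough to build a movement history.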

Doesn't matter. In their homes, or in their gardens, or when they are going out and have one foot on the street.

Illegal collection can't reliably be prevented. Get a photograph, extract the biometric data, enter it into your database, delete the photograph. Job done.
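The steps above can be sketched in a few lines. This is a toy illustration, not a real face-recognition system: the "embedding" here is just a hash-derived stand-in for what would actually be a deep-learning model's output, and all names are invented.

```python
import hashlib

def extract_embedding(photo_bytes: bytes) -> list[float]:
    # Stand-in for a real face-embedding model; a real system would run
    # a neural encoder here and produce a similar fixed-length vector.
    digest = hashlib.sha256(photo_bytes).digest()
    return [b / 255.0 for b in digest[:8]]

def enroll(database: dict, subject_id: str, photo_bytes: bytes) -> None:
    # Extract the template, store it, and drop the reference to the raw
    # photo -- only the derived biometric data survives, which is why
    # illegal collection is so hard to detect after the fact.
    embedding = extract_embedding(photo_bytes)
    database.setdefault(subject_id, []).append(embedding)
    del photo_bytes

db: dict[str, list[list[float]]] = {}
enroll(db, "subject-42", b"...raw image bytes...")
```

Once the photo is gone, nothing in the database proves where or how the template was captured.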

Maybe the only way to regain control over this is to require that these databases be open source (or publicly available) and make it illegal to hide them. Citizens must also have the right to be removed from those databases.

At what point do we stop trying to prevent unpreventable illegal activity, and try to put systems in place for dealing with its undeniable existence?

I'm not certain what such systems would look like in this case (everybody wears Guy Fawkes masks in public?), but this debate reminds me of prohibition vs. harm reduction in the War on [People Using] Drugs.

I don't think the comparison holds. It's not as if face recognition has thousands of generations of practice behind it straining against a wrong-headed prohibition; this sort of panopticon garbage is a clear violation of societal norms. Nor do I think we have to accept misbehavior from artificial persons as a given; even as we look for a technical solution, we should also be pressuring legislatures to amend the laws to prevent this.

Does the "Right to be Forgotten" cover the datacenter neurologic?

Then you would still need a reliable auditing system to verify that what they're deploying matches what's in the open source repository. The government would solve that by creating an "Open Source Auditing Department" or some such. And at that point, who trusts them anyway?

I trust democratic governments and the justice system a bit more than companies. The Bill Of Rights (or equivalents in other countries) is a much nicer read than any EULA.

Correct - they have an expectation of privacy in their homes. Same deal applies if you use a creeper telephoto lens.

That's good to know.

Just out of curiosity, do you know how that interacts with indecency laws? Like for example if someone is having sex in their home but someone is able to see it from the sidewalk through a crack in the curtains? This is purely hypothetical of course.


My understanding is that it comes down to a "reasonable person" doctrine - i.e. would a reasonable person consider the location to be private?

Another way of thinking about it is if there is a reasonable chance that you will be seen by people engaging in normal everyday activities.

Personally, I would say that peeking through a small gap in the curtains would not qualify as normal behavior. Someone doing that is clearly snooping.

On the other end of the spectrum, having sex in front of a window that fronts a public sidewalk with no curtains would pretty clearly be public indecency.

It depends on a LOT of things, jurisdiction being the biggest part, so there's no universal answer, but cases like this should be indicative:


Interesting case, thanks for sharing.
