ah, yeah, outstanding argument - looking forward to seeing it applied to all other electronic data ever produced, no matter how old they are
Sometimes simply having the equipment encourages its use, just like military gear handed to police, or large databases gained from surveillance that make detective work way too easy.
I feel like law enforcement has relied on surveillance data for so long that everyone has forgotten real detective work. Now there's just a lot of data noise and toys they use to justify bigger budgets. In the end it might be the end of us.
It was so long ago I don't know if they still do it or what's done with the fingerprints now.
I guess it’d be good if I had been kidnapped and they could find my fingerprints at the kidnapper's house, or prove that I am “me” if I was kidnapped and escaped years later.
My parents bought it as did many others.
I think the worst problems here are around asymmetries. If I secretly have all your data but you don't know that and have none of mine, there's a power imbalance. But if we have equal amounts of data and know when each is looking at the other, that seems far more reasonable to me.
It's either privacy for everyone or transparency for all; this in-between is the worst, where some entities get leverage over others because they have access to information that others don't have about them.
Maybe "privacy for everyone" is a better point to aim for.
The alternative seems to be no privacy as a baseline, which still results in an everybody-loses arms race on top of it.
We should at least know when our face is being accessed on demand or fed into some AI process. Same with our data.
We'll get there eventually but there will be lots of issues along the way.
This was what the FBI wanted in the 50s, 60s and 70s, and now, 70 years later, they finally have it.
But I do agree with you.
We can't stop eyes (biological and electronic) from seeing faces. We can't stop memory banks (biological and electronic) from storing the likenesses of faces.
Better than trying to craft legislation that we know the state will secretly violate is to make all of this data public and make the world's human facial lexicon part of our shared understanding.
This is the most ideal solution for an optimistic society.
Also, 1st amendment challenges only protect you if the supreme court is sympathetic.
Where I travel, (digital and otherwise) privacy is quite alive and well. (This is definitely outside of highly populated first-world cities.)
This comment could be straight out of 1870 when things were industrializing rapidly. We'll figure it out eventually. Hopefully without any serious oppression and/or bloodshed in the process.
A friend's child is in a camp that unfortunately uses them, and "invited me" so that I could send a letter to the child (which is a useful feature). The URL of the child's photo is:
"When parents upload a photo of their camper, our facial recognition software scans each photo posted by camp and notifies parents when photos of their camper are available. Family members can easily see their camper in action without hunting through the camp’s entire gallery."
And this is the problem: the ruthlessness of engineers who will then wash their hands in innocence.
Physicists creating nuclear weapons. Chemists creating chemical weapons. Software Engineers creating an Orwellian surveillance wet dream. These people need to get the Perillos of Athens treatment, so they and those who come after them can learn that ethics is a thing in their profession.
What if I was gay and was seen leaving a gay club? Now what if I ended up in the Kingdom of Saud and that data was shared or leaked?
Instead of Saudi Arabia, how about the United States 100 years ago? It is not as though there are no longer harmless activities that people participate in and could be made to suffer for.
A comparatively trivial but important example: how about tracking your employees' movements to punish them for talking to other companies? Is that a 'you should know your risks' situation too?
1. That someone going to a gay club in NYC later decides to go to Saudi Arabia, knowing full well that the government there is actively hostile to them.
2. That the government of Saudi Arabia somehow has access to facial recognition data from the NYPD.
3. This data for some reason includes not only mugshots directly collected by the NYPD (which was what TFA was actually about) but also private surveillance footage from gay clubs, and this footage is clear enough to make a match (remember, people tend to go to clubs at night).
4. The government of Saudi Arabia decides to expend a tremendous amount of computing resources running facial recognition of all of their surveillance footage of every face in their country against this fictional database that they somehow obtained.
5. Further, they conclude that a single facial recognition match, which is known to be flawed, of a person in the vicinity of a gay club is sufficient to convict someone.
There are real privacy concerns yes, but this is absurdist fearmongering and deserves to be ridiculed.
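For what it's worth, that last point about a single match being "known to be flawed" is essentially the base-rate problem. A back-of-the-envelope sketch, where the false-match rate and database size are assumed round numbers for illustration, not measurements of any real system:

```python
# Base-rate sketch: why one facial-recognition "hit" against a large
# database is weak evidence on its own. Both numbers below are assumptions.
false_match_rate = 0.001      # assumed per-comparison false positive rate
database_size = 1_000_000     # assumed number of enrolled faces

# Expected number of spurious hits when one probe face is
# searched against every record in the database:
expected_false_matches = false_match_rate * database_size
print(expected_false_matches)  # on average, 1000.0 false "matches" per probe
```

Even a seemingly low per-comparison error rate produces, on average, a thousand wrong hits per search at that scale, which is why a lone match shouldn't convict anyone.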
Data getting deleted is a good thing.
The whole "if you don't want to go to jail, don't commit crimes" line of thinking really falls apart when you realize that the legal system is well beyond a layman's ability to navigate on a day-to-day basis, and it gets even worse when you have a perfect surveillance state capable of seeing and storing massive amounts of data.
When the government can trivially make the case that a citizen is a criminal just by looking deep enough in the archive it's time to start considering that maybe your government isn't protecting/serving you anymore.
Because computers were used for something evil once so it must be the inanimate objects that are themselves inherently evil and not, you know, the people using them. What a weird, reactionary, alarmist stance to be so prevalent on a supposedly technology based forum.
I also recommend a reading of the history behind the "bill of rights."
If your position is that this is the equivalent of SS soldiers breaking down your door and hauling you away to a concentration camp, then I honestly don't know why you aren't in the street rioting right now.
We could have a useful discussion about realistic pros and cons of things that are actually happening if people would leave off of the nonsensical hyperbole for like, a fucking second.
The most effective way to fix abuses is to prevent them in the first place. We restrict govts and police by design, a need recognized for centuries. If a capability can be abused, it will be abused. Not if, when. That's the last reply such a shortsighted argument will get.
(Also, I am not restricted to others' problematic examples upthread.)
No one voted for this or gave their permission. This is what happens when you don't keep law enforcement on a short leash.
what is your solution to this outside of panopticon?
We aren't talking about small towns here, we're talking about an urban metropolis. Relative anonymity is a feature, not a bug.
They already have the photos from arrest records. They already have footage from security cameras, etc. I don't see a problem in using computer algorithms to try to find possible matches between these two data sets. We do this for fingerprints and DNA all the time, matching data that was collected during an arrest to that from a crime scene.
Frankly, I do not have a problem with this, and actually think the police would be derelict in their duty if they were not doing this.
The greater question has to do with what our rights are. The reality is, our system has been setup such that everyone commits crimes. By applying highly effective methods, you’re essentially enslaving the population. That’s why we had laws protecting freedom of speech and unreasonable search and seizure. The U.S. was setup to be particularly sensitive to those issues. The reason they are in the constitution is it was believed government would eventually try to chip away at the freedom and it did.
That’s why people are concerned. Their duty is to keep the peace. That does not necessarily mean or need to mean spying 24/7.
The problem I see with many of these systems: they are trained on one population and used for inference on a very different distribution (minorities), or their accuracy is only high on the overall population, not universally accurate across all subsets.
When those in charge, those with political power, etc (usually non-minorities) "see" the performance of the system, they experience it in their own context where it works. They don't feel the pain from the experience of the sub-populations where false positives/negatives are larger.
OPT IN is always the choice of liberty.
Young people can be criminals too. And if the police catch them while still juvenile, they might have more of a chance to turn their life around.
Young people can also go missing, and facial recognition of security cams could help find them.
I honestly can't see any reason why children shouldn't be included like everyone else.
It goes without saying that a conviction should never depend on computer matching, and that safeguards need to be in place to prevent abuse and limit to legitimate law-enforcement purposes.
But trying to locate suspects or victims in the first place? Children are no less important here than adults.
If you are concerned about privacy, then let's regulate what data you can collect and from whom you can collect it. However, once you have the data, police should be able to use computers to better understand, filter, and generally make the data more useful to solving crimes.
Notice how eroding privacy even further is not even in the top 10 things that could have improved this situation.
For this you want to expose society to even more dystopian ruin by hyper-asymmetrical information gathering via GPS on a per-body basis.
You can't make this up.
The US justice system isn't designed to turn anyone's life around -- it's designed to punish people. There's no benefit whatsoever to a kid spending time in jail or prison (or to putting their parents in debt over fines or bail payments).
Lol. Has the past half century or so of law enforcement hammered in a belief in the moderation of power by law enforcement? That such power is never used asymmetrically against the lower strata of society as opposed to the elites? Will it prevent people from being the victims of selective enforcement of laws, even non-violent ones involving the mere possession of drugs?
Lol. Lol. Lol.
Or (as seems to be the case) are we just shoveling data into a system and hoping it can cope?