I don't think it's unreasonable to assume that in the near future we will have cameras that can do much more than identification on a massive scale. It may be easy to track things like how stressed, healthy, or sober people are without their consent, or without them even noticing. And this would be precise, not speculative like current online tracking.
Would you live comfortably in this world? Would you build this?
It will be built through economic coercion, appealing to workers' livelihoods, and the impacts will be perceived as externalities. Incidentally, this is the same way most of the modern world got built over the last several centuries: those who have to work to earn their keep take on various risks to enrich themselves, ostensibly to provide for their families, whether through mining, military service, or data-mining others' personal information to sell ads. Their participation in these has far-reaching impacts on the world, but those impacts get distributed over a large population and avoid hitting close to home.
Some people can't rationalize this away, so (in a low-violence society) they quit and do something else. Given enough time, the workers who are uneasy will be replaced by people who have no such qualms. They, and their bosses, will make good money until an outraged public prompts government regulations to catch up. By that point, they will likely have a seat at the table. In an authoritarian society, or when the state itself is the actor, there's even less chance of this balance emerging.
In fact, part of the attraction of working in, say, child protective services is the damage you can selectively inflict on others. Same for the police, but much more direct. Fifteen minutes of research will tell you that they're not doing their duty for the money.
This is WHY you would want the power of government artificially limited by an independent and powerful judiciary.
When you hypothesize a world where AI uses cameras everywhere to know everything they can about everybody,
you haven't characterized that world at all.
First, IS IT REALLY EVERYBODY? (But let's assume for now, it's really everybody).
The question that will characterize the kind of world is what is done with that information.
You can choose to use it to oppress and repress people, or you could use it to help them. If you detect a person who is overly drunk, you can send somebody to fetch her, and either take her to the police station or to her home. Here you can ask what kind of world you would live more comfortably in. But it has nothing to do with cameras, AI, and databases.
If the AI detects that some person is becoming stressed and her health is failing, you can use that information to fire her (bad health leads to bad productivity, right?) or to deny her insurance, etc., or you can use that information to give her some time off and schedule a consultation with health services. Here you can ask what kind of world you would live more comfortably in. But it has nothing to do with cameras, AI, and databases.
And if the AI detects that some crime is being committed, or about to be, you can send the police before or after the damage is done to the victim, and you can put the criminal in jail, send him back to his country of origin, or leave him on the streets to attempt another crime. Here you can ask what kind of world you would live more comfortably in. But it has nothing to do with cameras, AI, and databases.
Almost every tracking method or tool that has been created is used by businesses to exploit me as much as possible, so why should I believe that further developments would be used any differently?
I think 'who holds the power?' is a good lens to try to judge this through. Unless there is a major shift in how humans relate to the World, I think unleashing this power is dangerous.
Eventually someone with bad intentions will end up in power, even if it started with a pretty good guy (example: the US presidency, start versus end). It's better to assume the worst and put the checks in place at the earliest (like the constitution).
We don't really know what the feasible computational and data limits are here, the potential (and likelihood, I think) is that they are much, much lower than where we would like them to be. In the near future something very close to ubiquitous surveillance is very likely to be the norm. And that has all sorts of chilling implications. If systems can keep track of where you are and what you're doing most hours of most days then that opens up a lot of questions in terms of free expression, political organizing, civil disobedience, etc.
Yep, but also think of how many times you've approached a friend from behind and patted him/her on the back, only to realize that you just patted a perfect stranger.
I have no particular problem believing that one's gait is "distinctive", but quite a bit of trouble believing that that particular stroll is "unique".
I think they just don't want to be quite so blatant and obvious about it.
Don’t worry, systems like this are incredibly fragile. You could spray paint the cameras and force the government into an attrition war (I can spray paint them faster than you can replace them, and paint is cheap).
This is already a daily reality for certain minorities in some countries.
Once you go full family execution, that's pretty much the limit of escalation against an individual. If you're willing and able to destroy everything and everyone I care about, there is no longer any bound you can control on what I might be willing to do to prevent that, or avenge it, and you are now relying entirely on my own ethical reticence and my resource limitations to avoid me engaging in acts of terrorism against your nation-state and its loyalists.
And that's the practical argument for proportional punishments, if your government is so dysfunctional that there is no moral argument for them. If you lock up all the Uighurs just for existing, for example, you may find that the ones who escape your pogrom are more willing to fight, and to fight dirty.
Realistically, we should probably be talking about vandalism of surveillance equipment in terms of how many cameras or camera-equivalents you could be compelled by force to replace whenever you are caught destroying one. And that is balanced by how likely you are to be caught when destroying one. Which depends somewhat on how many cameras/snitches in the area are fully operational.
There's probably an optimal camera-destroying algorithm to be discovered in there somewhere.
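A back-of-envelope version of that balance can be sketched in a few lines of Python. All the numbers here are made up for illustration; the only point is the shape of the trade-off:

```python
def expected_cost_to_vandal(p_caught, cameras_billed_if_caught,
                            camera_cost, paint_cost):
    """Expected cost of destroying one camera, from the vandal's side.

    Hypothetical model: the vandal always pays for paint, and with
    probability p_caught is forced to pay for replacement cameras.
    """
    return paint_cost + p_caught * cameras_billed_if_caught * camera_cost

# Cheap paint vs. a 5% catch rate and a 10-camera restitution penalty:
cost = expected_cost_to_vandal(p_caught=0.05, cameras_billed_if_caught=10,
                               camera_cost=200.0, paint_cost=5.0)
# 5.0 + 0.05 * 10 * 200.0 = 105.0 per camera destroyed
```

In this toy model the attrition war favors the vandal as long as that expected cost stays below what the operator pays to replace each camera, which is the intuition behind "paint is cheap".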
> you may find that the ones who escape your pogrom are more willing to fight, and to fight dirty.
This stuff is frequently by design.
Sure, if the powers-that-be get nervous, any sweep would miss a lot of subversives while picking up plenty of random people - just like purges in the many repressive regimes around the world. The main win for repression here is that they no longer need a system like, say, East Germany's, where 25% of the population spied on their neighbors.
I suppose you might count it a win that in this coming future, your car isn't going to start when you act staggering drunk, rather than when you objectively are staggering drunk. But I can't see the win here.
But talk about cameras and surveillance? The tech is everywhere and can do everything with perfect accuracy and will come for you and your babies if you even sneeze wrong! Come on.
Having worked with security camera systems (I used to install them) let me tell you something: the cameras are shit, the quality they produce is shit, clients are cheap as fuck and don't want to repair them, they break constantly, the software is terrible to use... I could go on.
We are a very long way away from the dystopia you are describing. People will gladly unintentionally fuck this up long before it becomes what you're fearing it will be.
With cars, the idea of a 0.1% chance of dying each time you step into a vehicle is terrifying - that's much higher than the risk you face now; humans are actually amazingly good at driving. I.e., fortunately, the system probably won't stomach the costs of bad self-driving cars.
But with the justice system spotting potential criminals, yeah, pulling in some innocents isn't going to bother a system with N degrees of repression. Most people in the justice system have no choice but to plea bargain now. Most people in the justice system are also guilty, but the innocent-but-not-wealthy minority is generally SOL. And that is in our ostensible democracy; it's known by most sophisticated observers of the system, and only a handful make an effort to change it. So a half-assed evidence gatherer of whatever sort will make lots of people happy, put a lot of guilty and innocent people in jail, and hardly needs great reliability. Indeed, with AI parole-granting systems, we're already there.
You don't have to attack the camera directly to sabotage it. Cutting power or communications cables and jamming transmissions works too.
Believe me, I wish it was that easy.
Jammers are trackable, but if you use them judiciously, I think they could be used to cover other attacks: activate jammer, attack camera infrastructure, deactivate jammer, and flee.
If it's not, then I find a way to adapt. I accept that the only constant is change.
> Would you build this?
Many reasons, or it could be simply because it's exciting.
What constantly amazed me was how you could see someone walking a quarter mile away and instantly know exactly who it was.
The South Pole has a far smaller population than China, so at China's scale I'd be worried about false positives (probably mostly within families, as kids learn to stand and walk like their parents and have similar dimensions when fully grown), but I suspect it's actually a pretty robust identification technique.
Even if the false positive rate is really low, it's still a problem. All of these dragnet-style things mess with your priors. In a traditional investigation, the police have a handful of suspects and then gather evidence. Circumstantial evidence showing that there is only a 1-in-1000 chance that a given one of the five suspects is not the culprit is pretty convincing, and probably enough to convict.
Cast a dragnet over a city of 1 million, and suddenly that same 1-in-1000 error rate will be giving you 1000 innocent people who, under the old system, are 'guilty beyond reasonable doubt'. They've been pre-selected to look pretty frikkin' bad regardless of whether they're guilty, which means any one of them will have a hell of a time proving their innocence.
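The arithmetic behind that shift in priors is worth spelling out. A minimal sketch, assuming a hypothetical matcher with a 1-in-1000 false positive rate:

```python
population = 1_000_000          # dragnet over the whole city
false_positive_rate = 1 / 1000  # hypothetical matcher error rate

# Expected number of innocent people the dragnet flags:
flagged_innocents = population * false_positive_rate   # 1000.0

# If the matcher also flags the one real culprit, the chance that any
# particular flagged person is actually guilty is roughly:
p_guilty = 1 / (1 + flagged_innocents)                 # about 0.001
```

The same 1-in-1000 error rate that was damning against one of five suspects now leaves each match with about a 0.1% chance of being the culprit.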
I expect that if you knew that the cameras were watching, you could trick their AI's pretty easily (e.g., by faking a limp), but I'd prefer to confirm that before relying on it :-).
Kind of reminds me of Dune's "Walk without rhythm so you don't attract the worm".
Totally anecdotal, but I can often identify people I know just by listening to their footsteps.
edit : a source : https://www.youtube.com/watch?v=JASUsVY5YJ8
Former CIA Chief: "Changing the way that you walk requires some amendment to what you're doing. You're going to need a physical apparatus to make you walk differently. You can't possibly say, okay, now I'm going to limp. To change my walk, I put a little piece of gravel in my shoe, just a little thing that pinches my toe, and it changes my walk whether I want to or not. You can put a bandage around a knee, something that restricts you, will change your walk."
2. Countermeasure: use a stone in the shoe, tape, and other things to fool the gait detector.
3. Counter-countermeasure: detect artificial gaits. Everyone unrecognized with an artificial gait is a person of interest.
I'm so glad people like you are building my future.
Alternatively, I've heard (don't have the source handy) that they're looking at other stuff that the stone in shoe might not change such as overall height and knee to leg to hip to foot stature and angular momentum ratios. Who knows?
A false positive is an excuse to target someone police don't like, or suspect is guilty, so the dismantlement of a system that provides those convenient false positives will be fought.
"Big Brother is watching. Learn to move indescribably."
It's like that anti-face-ID makeup that covers your face with dazzle patterns. It may work as a visible act of protest, but it won't actually protect your privacy.
I wonder, then- do the authorities of the various countries (by far not just China) who make use of this kind of technology know how little they add to their ability to spy on their citizens? If they know- do they use it anyway just so that lucrative contracts can be signed and money exchange hands? Do they (also?) figure that the fear of being identified by all-seeing, all-knowing cameras will keep the population in the straight and narrow, even if the cameras are blind and dumb as bricks, in practice?
These modalities simply don't carry enough information to reliably identify anyone.
Facial recognition works best when you need the subject to be unaware that they are being identified.
The context is a crowd of about 100 (10 - 1000), in which you are tracking a few percent, and you need to keep "eyes" on an individual, discriminating amongst a few others that have crossed his path.
That said, I'd agree that it looks difficult to help distinguish individuals in a trained team.
Comparing what the NSA does to what China does is like comparing the treatment of women in the US and Saudi Arabia
Why do you have so much certainty? What evidence can you provide that they use or currently plan to use these techniques domestically for widespread tracking?
No matter its many faults, the US is still a democracy with checks and balances.
B) this is not an excuse for the consistent course towards a militaristic, authoritarian state fueled by fear.
EDIT: Sourced claim.
China doesn't have near the surveillance infrastructure that the US government does outside of China.
The simple fact that China has more citizens to surveil does not mean China is conducting surveillance on a larger scale.
While I agree with what you're saying, it makes sense that we'd be getting this information only just now - after there has been enough time for 1) data scientists and the like to become prevalent/cheap enough for government contracting and 2) enough time to build working applications on top of all that data.
The real punchline from that article "no peer reviewed scientific studies have ever been done to prove the basic assumption that every person’s fingerprint is unique."
False-positives don't matter, since China has no civil rights.
However, I do agree that inferring the proportions from video imagery at various angles and with various items of clothing is going to be challenging due to systemic distortion, e.g. inferring incorrect angle due to clothing.
I think the main advantage of gait analysis (or limb proportion measurement) is that it will encourage a more efficient, decentralized computation that first distills the observed data into a fingerprint that can be more easily transmitted over a wire to a massive database for matching, rather than the alternatives of either replicating that database or transmitting raw, high-bandwidth data.
Besides limb measurements, they could also use last known location, residence, and vehicle ownership information to increase the certainty of loose matches.
There are already amazing airborne camera systems that track the motion of every license plate in a city, and even that isn't the same as tracking every car's movement because you still have to know which license plate to follow -- you just can't care about all movement of all cars. I suppose someone will say that you can store all movement in a database, but I'll wager that's just not worth anywhere near the cost. Your computing power would be much better spent on computing large primes.
Anyway, it's a cool sci-fi movie premise. You can see the murderer shaving off his mustache, burning off his own fingerprints and then knee-capping himself to avoid detection.
Of course we can. For each license plate encountered by CCTV or cop cars, store a tuple of (timestamp, coordinates). This won't require much disk space, so that's not a concern. Later, it is easy to answer questions like "what places was this car seen and when?" and "which cars were in this vicinity at this time?"
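A minimal in-memory sketch of that scheme (the names and schema here are illustrative, not any real system's):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# plate -> list of (timestamp, (lat, lon)) sightings
sightings = defaultdict(list)

def record(plate, when, coords):
    sightings[plate].append((when, coords))

def where_was(plate):
    """Every place and time a given car was seen."""
    return sorted(sightings[plate])

def who_was_near(coords, when, radius_deg=0.01, window=timedelta(minutes=10)):
    """Which cars were seen near these coordinates around this time."""
    lat0, lon0 = coords
    return [plate for plate, events in sightings.items()
            if any(abs(lat - lat0) <= radius_deg
                   and abs(lon - lon0) <= radius_deg
                   and abs(t - when) <= window
                   for t, (lat, lon) in events)]
```

At a few dozen bytes per tuple, even tens of millions of sightings a day fit comfortably on commodity storage, which is why disk space isn't the bottleneck here.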
I'm not arguing that it will never be possible; I'm arguing that the original claim, and the mass surveillance that makes people nervous, is not happening today.
I was really into the idea of card counting when I was in high school (over 10 years ago) and remember coming across multiple sources that claimed casinos did this as part of their arsenal to detect people who've been blackballed.
It can be done with an accelerometer or a camera, and probably other kinds of sensors.
It could also be used for early diagnosis in humans, but is cost prohibitive for human preventative care use.
Well, now it's you. And also, once again, brought to "the world" by China. (Maybe they'll sell it along side of or as an add-on to their Great Firewall turnkey systems.)
I remember viewing a few of those old video segments (drone- and plane-based platforms) and feeling pretty nervous.
Yeah right. The sensitivity/specificity requirements for this to be useful would be insanely difficult or impossible to reach.
> the kinds of questions that began to present themselves were just.. the questions themselves were ugly and I didn't want to know the answers. It's like banging on a door, you knock on a door, and the door opens slightly and behind that door it's dark and there's loud noises coming like there's wild animals in there or something. And you peer into the darkness and you can't see what's there but you can hear all this ugly stuff.. do you want to step into that room? No way, you just sorta back out quietly, pull the door shut behind you, and walk away from it.
-- Bill Ehrhart ( https://www.youtube.com/watch?v=tixOyiR8B-8 )
Subject: Woman's man
Time to talk: None.
What was it?
Decades of careful propagandised hatred for the Commies, or the Reds, or whatever slur, is hard to shake off too.
Or maybe you're right. But it's impossible to trust what our own governments say to us now.
Sure, but throwing up our hands and assuming that all states are basically the same probably takes us further from the truth.
We've got to make our own judgments somehow, and it's pretty clear that despite the many flaws of our Western countries, they are meaningfully more open and tolerant of dissent, investigative journalism and the like than some other countries, including China.
Every piece of information should be taken with a grain of salt, and it makes sense to add a few extra grains when it comes to countries that we have been trained to be suspicious of. But that doesn't mean we should bury it all in a salt mine and ignore it.
I'd agree with all that. I just don't have any evidence to suggest they're watching us less or have greater ethics when it comes to invading citizens/subjects privacy in secret.
In absolute terms whistleblowing is a dangerous game in the west, so it is likely that there's plenty being successfully hidden from us -- and the evidence we do have suggests that surveillance is very intense and intrusive. But I still lean toward the belief that it is probably relatively more restrained, and less likely to be used in completely dystopian ways, than in more secretive and repressive states less constrained by the rule of law.
non-facetious: Wouldn't it be more appropriate to say 'Chinese government can now...'
But I mean, of course, as an American I would care more about what the US does because it affects me more. The war on drugs especially is an authoritarian nightmare (and has in practice ended up racist), and the war on terror has eroded due process, but China has a type of authoritarianism that just isn't the same as the type we have in the US.