Surveillance is the key concept in a major design of social control, Bentham's "Panopticon." When anyone may be watched, people must live expecting that they will be watched, and the will of those with the power to impose sanctions will be enacted. If facial recognition systems become widespread, this will be a historical novelty: historically, only limited public venues were subject to effective widespread surveillance (except in extreme authoritarian cases like East Germany). Online spaces are increasingly subject to surveillance, constrained only by browser instrumentation; with facial recognition, it becomes possible to track people offline as well, given access to the cameras.
Surveillance is the twin of transparency, which can be thought of as a possible tool for democratic control. I do believe that transparency, e.g. through freedom-of-information acts, has some positive characteristics for democracy, but it has a weak empirical record compared to panoptic control. Facial recognition might have affordances for collective action, such as making it easy to produce evidence of excesses or corruption.
That said, maybe we should not give the authorities the power to implement technological changes without the consent of the governed? Technology moves faster than policy can adapt to it, and old laws and institutions like the police evolved in times when today's tools could not have been imagined.
Regulating technologies like facial recognition puts the control of social systems in popular hands, which counteracts authoritarianism and high concentrations of power and resources.
Very good point.
If facial recognition were widely deployed and the data matched up, it would be more like a police officer following you around wherever you go, 24/7.
The fear is that you could use the tech to build ultra-detailed activity profiles of any person.
The problem isn't that there could be mistakes or bias in the technology, but that no one can be held accountable for them.
The right? Maybe not. But I surely don't want Big Brother always watching. I gotta ask: have you read 1984?
Public surveillance everywhere is a precursor to that.
Surveillance within your home is very much already a reality.
Now we have people welcoming this creeping totalitarianism in the name of safety, trading justice for individual crimes against the ever-growing risk of totalitarianism.
It is important to apply ethics when developing technology, as rationality alone is not sufficient.
It's not the tool we need to debug; it's us.
Where, by "these tools", I mean machine learning writ large.
EDIT: I highly doubt your comment's parent was suggesting we had to be irrational in order to be ethical. Rather, I believe they meant we should be rational and ethical. Wouldn't you agree?