It probably is an unethical technology. Its primary feature is to strongly tie someone's identity to their physical body. That alone is not inherently unethical, but it opens the door to so many abuses that it is hard to consider it ethical. These abuses come from the ability to use a centrally controlled technology to systematically isolate and excommunicate people from society, no matter where they go.
It becomes even more unethical when you consider the secondary, less publicly acknowledged feature: to provide plausible deniability for pursuing false positives. When facial recognition has the legal weight to drive searches and arrests, the "fuzziness" of false positives becomes an asset to those performing searches and arrests.
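Mechanically, most face recognition systems reduce a face to an embedding vector and declare a "match" whenever the distance to a stored vector falls below an operator-chosen threshold. A minimal sketch of that idea (the toy 4-dimensional embeddings and threshold values below are purely illustrative; real systems use 128+ dimensions) shows why the false-positive rate is a dial the operator can turn, not a fixed property of the technology:

```python
import numpy as np

def is_match(probe, enrolled, threshold):
    """Declare a match when the Euclidean distance between two
    face embeddings falls below the operator-chosen threshold."""
    return float(np.linalg.norm(probe - enrolled)) < threshold

# Hypothetical embeddings for two *different* people.
suspect   = np.array([0.10, 0.80, 0.30, 0.55])
bystander = np.array([0.20, 0.70, 0.40, 0.60])

# Their distance is about 0.18. A strict threshold rejects the near-miss...
print(is_match(bystander, suspect, threshold=0.15))  # False
# ...but an operator who wants hits can simply loosen the threshold.
print(is_match(bystander, suspect, threshold=0.40))  # True
```

Nothing in the pipeline distinguishes a "loosened threshold" from a "reasonable search"; that slack is exactly the plausible deniability described above.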
What makes facial recognition so dangerous is that it's so trivial and so cheap to deploy. You literally walk into a supermarket and the tills have facial recognition built in, automatically binding your shopping to "you" even if you don't have an account or you pay with cash. It's abused to hell, basically, and you will be facially recognized pretty much everywhere you go in public. It's definitely an "unethical" technology because of how it's generally used.
DNA testing isn't, because the technology is far too expensive and time consuming to be used this way. The second supermarkets start scanning my DNA to give me special offers, I'll complain about it too.
My point is, this sort of "well, it could be used this way!" argument isn't helpful at all. Retina/fingerprint scanning could be used this way too, but it isn't - and that doesn't make the argument against facial recognition invalid, or facial recognition any less unethical.
This analogy is somewhat valid, since facial and DNA data are mostly "public," given how often we show our faces or incidentally shed some DNA (IIRC the police collected the Golden State Killer's DNA from a cup at a restaurant).
But it also relies on a shallow comparison – unlike a facial scan, DNA is not a passively collectible identifier. Collecting and processing DNA requires active, physical surveillance that downloading and faceprinting a photograph does not.
This physical limitation eliminates most practical risk of a global adversary exploiting DNA matching to build a dragnet surveillance regime. Meanwhile, facial recognition has already wrought such a state of affairs.
Incidentally, passive DNA surveillance does happen, but only in aggregate, like when sampling wastewater for toxins or other biomarkers. I'm not aware of any technology that separates the distinct DNA of individuals from mixed wastewater. If that were possible, then maybe DNA dragnet surveillance would be, too.
You can't scan someone's DNA from 20 feet away in tens of milliseconds with <$100 of hardware.
We're able to do facial recognition on microcontrollers costing less than $5. Commercial facial classification on that class of hardware is half a decade out.
In a sane world, there would be sane laws with stiff penalties (as in, jail time). For instance: "Any identifying data about individual users may only be used for authentication and shall be completely under the control of the user. 'Under the control of the user' means the user may delete it. Examples of activity incompatible with 'under the control of the user' include training AI models. Illegal use of authentication data shall constitute a felony punishable by XYZ."
And even then, if there is jail time, it's not beyond a corporation to appoint a lesser VP to serve the sentence and just "take care of" him when he gets out. The American agricultural products company you would think it is, plus their American bank and some British banks, did so years ago when they got caught trying to offset the cost of R&D farmland in Africa by leasing it out to the warlord / death squad types (and their arms dealers) who hand machetes, cocaine, and AK-47s to 10-year-olds.
To me, philosophically, a tool by itself isn't inherently good or bad; the way in which it is applied is. Unfortunately, people couldn't resist the temptation to do the evil that large-scale, (relatively) rapid, programmatic identification (faulty or otherwise) of humans enables.
No, it's great to track and record every human around the clock, fully automated. That way, we make sure everyone stays in line and doesn't try to organize to get rid of the dictatorship.
No no, it's for their safety. We can now watch as you get mugged at gunpoint while not doing anything about it. See, you're safer now that we've facially recognized and identified you getting mugged.
(under their breath) ...because the IP is better held in a different part of our corporate structure.