
Taylor Swift showed us the scary future of facial recognition - pseudolus
https://www.theguardian.com/technology/2019/feb/15/how-taylor-swift-showed-us-the-scary-future-of-facial-recognition
======
diafygi
So like with most ML/AI technologies, it's usually not the specific use case
that is most troubling. What's most troubling is that _everyone_ scanned is
put into the database for later use or as training data.

So even if I'm not a stalker of Taylor Swift, I'm now part of ISM Connect's
database, so when they release an "EnergyGuard" product, it will certainly
recognize my face at an environmental protest at a coal power plant. And now
they know that I'm a climate change advocate _and_ that I like Taylor Swift.

This kind of mapping is what scares me about ML/AI the most: you pretty much
have to collect, store, share, and map data in ways that violate your Fourth
Amendment rights (e.g. the "mosaic" theory of privacy from US v. Jones[1]).

[1]: https://www.washingtonpost.com/news/volokh-conspiracy/wp/2014/04/28/courts-grapple-with-the-mosaic-theory-of-the-fourth-amendment/

