
The file Clearview AI has been keeping on me - _Microft
https://www.vice.com/en_us/article/5dmkyq/heres-the-file-clearview-ai-has-been-keeping-on-me-and-probably-on-you-too
======
kurthr
It's worth noting (as mentioned, they are under security review) that
ClearviewAI apparently lost their client database (hopefully that's all) to
hackers last week. Along with being sued by the big boys, that's gotta hurt.

[https://nakedsecurity.sophos.com/2020/02/28/clearview-ai-
los...](https://nakedsecurity.sophos.com/2020/02/28/clearview-ai-loses-entire-
database-of-faceprint-buying-clients-to-hackers/)

------
chris_f
I don't see it get mentioned much, but Yandex's reverse image search [0] will
also match similar facial images. It works surprisingly well.

Other image search engines can obviously perform this function, but I think
most stay away from it because of the creepy factor.

[0] [https://yandex.com/images/](https://yandex.com/images/) \- click on the
camera for the reverse image search

~~~
simion314
Just tested it. It returns people with some similar features, but who are
obviously different persons, with different face shapes and sizes. Maybe
facial hair and glasses confuse it, or maybe it could not find any better
matches and returned something just so the result is not empty.

~~~
ghaff
I tried it with a picture of me and the top half dozen photos were, in fact,
pictures of me. I originally thought there were also, bizarrely, a couple of
photos of signs, and then I realized I was standing next to a sign in the
photo and that's what caused the algorithm to stick a couple of signs in
there.

------
buckminster
> "Clearview does not maintain any sort of information other than photos,” the
> company wrote. “To find your information, we cannot search by name or any
> method other than image."

So a police force buys this product, searches for an image, and just gets more
images. Really? What use would that be?

~~~
throwGuardian
I think they also provide the origin of the images (for example, MySpace),
and more often than not, the origin ends up revealing a definitive identity.

The response report on the author, for instance, clearly identifies her.

------
extesy
Shortcut to the page with data access and opt out request forms:
[https://clearview.ai/privacy/requests](https://clearview.ai/privacy/requests)

~~~
yoaviram
Or send them a CCPA / GDPR erasure request:
[https://yourdigitalrights.org/?company=clearview.ai](https://yourdigitalrights.org/?company=clearview.ai)

------
luckydata
Until our way of governing ourselves changes, everything scientists and
engineers do can and will be converted into a weapon or tool of oppression.
We should be more vigilant and we should demand better from our elected
officials.

~~~
pstuart
We should always demand this, yet not count on it happening.

It would be nice to find ways to take these weapons and use them for self-
defense as well. I wish I had a clue of what that would look like.

~~~
Frost1x
Well, the one edge of this sword is essentially being used by the wealthy,
businesses, and those in power to gain additional information on average
citizens and leverage it to their own advantage, at the expense of those
folks' happiness.

Traditionally, an average person had a tremendous amount of privacy simply by
virtue of being average. It wasn't worth the time/effort/resources to collect
data on them. With ubiquitous sensors, it's now far more manageable or even
trivial to collect data on people and use it, no matter how unimportant or
insignificant you are with respect to money/power.

People of great wealth and power also somewhat enjoy this privacy. Unless
you're a public-facing official or at the top of the wealthy list, you
essentially can also disappear in crowds and enjoy a degree of anonymity.
Some may recognize Gates or Bezos in a crowd, but drop down to, say, Forbes
400 rank #144 (picked at random) and you'll find... (searching) Leonard Stern.

Prior to that sentence I'd never heard that name and certainly wouldn't
recognize him in public, even though he is estimated to be worth a modest
$4.8B. I might (probably not) interact with him in the future, and as an
average person I'm at a disadvantage because I can't recognize his net worth
and public information, which I could potentially leverage to my advantage in
the same fashion someone wealthy/powerful may use their resources to leverage
me.

Take this a step further and focus on business investors and participants
promoting these technologies, pouring money into them. For example, I can't
even name the people involved in Cambridge Analytica even though they were in
the news and probably wouldn't recognize them either. Sure, I can dig around
and find them but that wouldn't be useful in most situations.

For the most part, your average person doesn't have the knowledge or
resources to access and leverage the data that businesses are rapidly
starting to use against the average consumer and worker, in ways that
significantly impact their quality of life.

As such, I would propose leveraging these same identification and recognition
approaches against those pulling the strings, and against those in public
government offices who should be fighting for citizens' rights: develop a
targeted dataset for those folks and expose it in a way your average citizen
could use. Perhaps as a phone app that can pass a few frames of sampled video
back to a service that identifies these individuals and gives detailed public
information about them. Make _their_ anonymity suffer. Link it to public-
facing records of their history, etc. Develop network information for _their_
families, friends, and business associates.

Obviously, this is worthless against high-profile individuals (celebrities,
high government officials, etc.), but a lot of wealth and power hides in the
shadows of the general public's eyes while pulling the strings on these
cultural shifts us plebes are dealing with. Turn the tables and make the
technology problematic for their daily lives as well--drag them out into the
light for the public--and maybe then you'll see more regulation and penalties
for using these technologies surface. Just a thought.
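
For what it's worth, the matching core of such a service would not be exotic.
Here's a minimal sketch, assuming face embeddings have already been computed
by some trained model (the names, vectors, and distance threshold below are
all made up for illustration):

```python
import numpy as np

# Hypothetical database mapping public identities to face embeddings.
# Real embeddings would come from a trained face-recognition model
# (typically 128+ dimensions); 3 dimensions here just for readability.
KNOWN = {
    "Investor A": np.array([0.2, 0.9, 0.1]),
    "Lobbyist B": np.array([0.8, 0.1, 0.5]),
}

def identify(query, max_distance=0.5):
    """Return the closest known identity, or None if nothing is near enough."""
    best_name, best_dist = None, float("inf")
    for name, emb in KNOWN.items():
        dist = float(np.linalg.norm(query - emb))  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None

print(identify(np.array([0.25, 0.85, 0.12])))  # close to "Investor A"
print(identify(np.array([0.0, 0.0, 0.0])))     # nothing nearby -> None
```

The hard parts are elsewhere: building the labeled embedding database in the
first place, and doing the nearest-neighbor search at scale (real systems use
approximate-nearest-neighbor indexes rather than a linear scan like this).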

~~~
pstuart
I think you've got something there. This would likely need to leverage
decentralized tech (perhaps blockchain could actually be useful for something
other than crypto-currency?).

Then there are the challenges of getting people to pay attention and to care
about the results.

------
bsanr2
Can anyone chime in on why California's law is necessary to force the deletion
of your data? Wouldn't these photos be subject to copyright law?

~~~
simion314
From what someone said in a previous thread, the copyright part is not that
simple: for example, the person who takes the picture can claim the
copyright, not the subject. There is probably a different law that would
prevent others from making money using your image, but it is not copyright,
AFAIK.

~~~
ghaff
>There is probably a different law that would prevent others from making
money using your image, but it is not copyright, AFAIK

There is a right of publicity [1] that tends to come into play if I want to
use a photo of you for commercial purposes like advertising. However, if I
sell a candid photo of you that I took in public to a newspaper or if it's in
a book I sell that's OK. (Assume it's straightforward use and nothing that
could be misconstrued or deliberately misrepresents you in some way.)

No idea about this case and IANAL. But it seems in general that I would be
able to freely use images on the Internet for training models and for other
purposes that don't involve republishing those images.

[1] [https://www.nolo.com/legal-encyclopedia/the-right-
publicity....](https://www.nolo.com/legal-encyclopedia/the-right-
publicity.html)

~~~
bsanr2
I'm trying to understand where subject release documents fit into this. So I
don't have to get those signed to take pictures of people, even in public?
What if it's a still from video?

~~~
ghaff
A model release form gives the right to use a photo for most purposes*
including things like marketing literature, e.g. the classic happy smiling
diverse workers found on company websites and in promotional brochures.
However, if I just take a photo of you in a public place I can upload it to
Flickr, post it on my website/blog, sell it to The New York Times for
editorial use, etc. You have fairly limited rights to photos of yourself taken
in public.

A video is just a sequence of still frames so I don't know why that would be
any different.

IANAL. Added: I'm primarily familiar with the US. Your mileage may vary
elsewhere.

*possibly subject to what I wrote earlier about misleading or defamatory purposes, e.g. if I use a photo of you to illustrate an article about neo-Nazis.

------
YeGoblynQueenne
>> “Clearview helps to identify child molesters, murderers, suspected
terrorists, and other dangerous people quickly, accurately, and reliably to
keep our families and communities safe.”

It helps to identify child molesters. "Think of the children".

------
ThePhysicist
Supposedly any company in the EU that has data on you needs to proactively
make you aware of this and allow you to have this data erased or rectified.

I don’t think all or even most companies respect this rule yet (I haven’t
heard from a single company about my data so far), but I think it’s a
powerful idea that would keep companies like Clearview from being able to
build up such large databases without people knowing about it.

I really hope the EU will keep strengthening subject rights as I think it’s a
very good solution to most problems involving the use of personal data.

------
fortran77
I like the way they claim it stops "child molesters" but then go on to say
that Macy's uses it, presumably to identify shoplifters before they strike.

------
spectramax
How do we stop this? Can we sue Clearview through the EFF on the grounds of
violating copyright or some weird law?

California/GDPR laws are great but they don't _stop_ the data collection,
merely allow the user to access it.

We need to make companies like Clearview completely and unquestionably
illegal. I am infuriated and don't know what we can do.

~~~
ajaygeorge91
Stop uploading your pics to the internet, then.

~~~
carapace
Exactly! People think the Internet is like Disneyland but it's actually more
like the bad parts of Bangkok.

Somewhere between the inter-networked machines and peoples' minds some kind of
_magical thinking_ takes over.

Why don't normal people understand that they are being actively fought by
organized relentless enemies?

In the old days, cities had walls around them. I think we're barely getting
to that point on the internet now, despite repeated and continuous raids by
all kinds of marauders.

~~~
pergadad
Or: the digital world is an incredibly huge and complicated place that most
people don't even begin to grasp. What is private and what is public is not
obvious, and everything can quickly leak, be accidentally shared, ... and you
lose control.

Stop blaming the users for what are systemic and legal faults - and in many
cases not even an issue of laws or of enforcement. Think revenge porn -
nothing you can do to stop it, and only so much you can do to contain it once
it's out there.

~~~
carapace
Look I'm not saying it's _fair_ , I'm saying that, if you want e.g. control of
your own nudie pics, you got to (excuse me) come correct. Don't take them on
your phone, and don't ever let them onto a computer that's connected to the
internet. Better yet, don't use digital at all; use film and keep it in a
safe, like _everyone did_ before pocket computers with radio networking and
hi-res cameras became ubiquitous.

There are things you can control and things you can't, eh?

I do blame the users. They buy a book of matches and proceed to burn their own
house down. Either they're children or just incredibly stupid. That everybody
else is doing it too changes nothing IMO.

------
beholder1
This system does not seem to work differently from what's known of Chinese
face detection and surveillance systems.

------
llarsson
Google has all the data and algorithmic chops required for this, too.

Just use their reverse image search feature, as helpfully explained here:

[https://support.google.com/websearch/answer/1325808?co=GENIE...](https://support.google.com/websearch/answer/1325808?co=GENIE.Platform%3DDesktop&hl=en)

They have tons of images from social media sites, too.

~~~
danso
But image similarity is not the same as face recognition. I just uploaded the
photo the author gave to Clearview, and it returned none of the photos
Clearview found – just other front-facing flatly-lit mugshots of other people.

Of course Google can do this internally (and Facebook could do it even
better) – but the public-facing image search does not do what Clearview
purports to do.
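
The distinction can be made concrete: face recognition pipelines detect a
face, map it to a fixed-length embedding with a trained network, and compare
embeddings by distance, which is what lets matches survive changes in
lighting, pose, and hair. A toy sketch of just the comparison step, with
made-up 4-dimensional embeddings (real models emit 128 or more dimensions):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: two photos of the same person should land
# close together even under different lighting; a different person's
# face should not.
person_x_photo1 = np.array([0.9, 0.1, 0.3, 0.2])
person_x_photo2 = np.array([0.85, 0.15, 0.28, 0.22])  # same face, new lighting
person_y_photo  = np.array([0.1, 0.8, 0.2, 0.9])

THRESHOLD = 0.95  # tuned per model in practice

print(cosine_similarity(person_x_photo1, person_x_photo2) > THRESHOLD)  # True
print(cosine_similarity(person_x_photo1, person_y_photo) > THRESHOLD)   # False
```

Generic reverse image search compares whole-image features instead, which is
why a differently lit or cropped photo of the same face usually won't match.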

------
finnh
Is it just me, or are those results clearly based on using her name (from the
ID she provided as part of her CCPA request)? To have zero false positives,
and multiple different hair colors, etc. ... it strains credulity that this
is all based on face matching _only_.

