
Clearview AI is struggling to address complaints as its legal issues mount - pulisse
https://www.buzzfeednews.com/article/ryanmac/clearview-ai-cops-run-wild-facial-recognition-lawsuits
======
sschueller
How many clearview AIs exist that we don't know about? This is something I
could create as a hobby project in my free time so what would someone with
real resources be able to do?
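
For a sense of how little code the core matching step needs: assuming face
embeddings are already computed by an off-the-shelf model (e.g. 128-dimensional
vectors, as dlib-style models produce), nearest-neighbor matching is a few
lines of NumPy. This is an illustrative sketch, not anyone's actual pipeline:

```python
import numpy as np

def match_face(probe, gallery, names, threshold=0.6):
    """Return the name whose gallery embedding is closest to `probe`,
    or None if nothing is within `threshold` (Euclidean distance).
    0.6 is the conventional cutoff for dlib-style 128-d embeddings."""
    dists = np.linalg.norm(gallery - probe, axis=1)
    best = int(np.argmin(dists))
    return names[best] if dists[best] < threshold else None
```

Swap in an approximate-nearest-neighbor index and a scraper, and the "hobby
project" claim starts to look plausible.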

~~~
Timothycquinn
There are probably several dark versions of this out there that already have
access to this type of technology. It's just too tempting for malevolent
organizations around the world to ignore.

Just think about the investigators around the world (FBI, DEA, etc.) who are
combating evil organizations but are not careful with the information they
post online; those people are screwed. Time for a new job.

I'm glad that this one was done in the USA so we have a way to bring more
light to these issues.

I suspect (and hope) that Clearview AI will not last long here in its present
form.

~~~
dylan604
Okay, say they take the code and set it up on servers outside the US, and
maybe incorporate under a new name in a non-US country as well. What
implications would that have for US law enforcement agencies in particular,
and any US-based company in general, using this new foreign company?

------
jrs95
Even if Clearview somehow shuts down...this sort of thing is basically
inevitable without laws to protect our privacy. And honestly, all of these
companies that might at least complain when Clearview does it would gladly
share this data with the federal government. So I'd be surprised if at least
some federal law enforcement agencies haven't had access to equivalent (and
probably better) technology for quite a while.

------
_bxg1
> One of Clearview AI's investors defended the company, “We do not have to be
> hidden to be free.”

Yikes.

~~~
donclark
Would it be a good idea to capture all the investors' names, addresses,
photos, etc. and put them online for all to see?

~~~
catalogia
Legally? Maybe not. But ethically and morally? Hell yes!

We need a Robin Hood of privacy. Extralegal justice for the common people,
whom the official channels treat with contempt.

~~~
bathtub365
What if you were mistakenly or maliciously targeted?

~~~
catalogia
If you wish to avoid vigilante justice, you need to provide something better.
The government is meant to handle matters like this to protect anybody who
might be wrongly accused, but it is abdicating that duty. When the government
abdicates its duty to the common people, the common people have a moral and
ethical right to take matters into their own hands.

------
choppaface
Some ads now have links to sites where you can do mass opt-outs, like this
one:
[https://optout.networkadvertising.org/?c=1](https://optout.networkadvertising.org/?c=1)
I have no clue if they're legit, but they certainly look useful.

Some companies like Sift Science got caught not having proper opt-out
facilities that are now required by law
[https://www.nytimes.com/2019/11/04/business/secret-consumer-score-access.html](https://www.nytimes.com/2019/11/04/business/secret-consumer-score-access.html)

Now that the CCPA is in full effect, couldn't somebody just build a mass
opt-out service to get out of Sift, Clearview, etc.? Sift and the rest
already have to hire a third party to handle ID verification (yes, it's weird
that you have to send them your ID to get your own data, which is supposed to
define your identity...). No idea if it would be profitable, but if you've
got the ID verification bit solved, launching an opt-out service might be the
way to make your product go viral?

------
chii
> To process a request, however, Clearview is requesting more personal
> information: “Please submit name, a headshot and a photo of a
> government-issued ID to facilitate the processing of your request.”

ROFL. They will use your request itself to add to, and extract value from,
your information. It also clearly and uniquely identifies individuals who can
be considered "troublemakers".

------
kerkeslager
Good. I hope they struggle their way into bankruptcy. Companies like this
shouldn't exist.

~~~
chii
But unfortunately that data is already out there. If Clearview could collect
and scrape it, so can anyone else. If Clearview could hire people to build
AIs that identify and match people, so could anyone else.

And this kind of database, I imagine, already exists at the NSA or whatever
the equivalent intelligence agencies are in China and Russia.

------
sdan
Kind of surprised they managed to do this. Facebook, Instagram, YouTube,
Twitter, etc. all have hard rate limiting and aggressive scraping prevention.
I wonder how hard they worked on their scrapers.
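
For context on what "hard rate limiting" usually means on the platform side:
it is often just a token bucket keyed per IP or per account. A minimal sketch
of the idea (illustrative; not any platform's actual implementation):

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilling at `rate` per second."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def acquire(self):
        now = time.monotonic()
        # Refill in proportion to the time elapsed since the last call.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True   # request allowed
        return False      # request rejected (rate limited)
```

A scraper that gets past this is typically rotating through a large pool of
proxy IPs so that no single bucket ever runs dry.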

------
mortenjorck
There’s going to be legislation. That’s more or less a given at this point.
The question is: can we actually get good legislation? My fear is that we’ll
get something more along the lines of the EU cookie directive than a full-on
GDPR.

So instead of worrying that your biometric data could be captured anywhere and
used for anything without your knowledge, you now have to click “I accept”
before engaging in any activity with a business or government entity, which
then establishes your _consent_ to your biometric data being captured anywhere
and used for anything without your knowledge.

------
joering2
> Clearview AI, the facial recognition company that claims to have amassed a
> database of more than 3 billion photos scraped from Facebook, YouTube, and
> millions of other websites [...]

Let me guess: neither FB nor YouTube will go after them legally, because it's
probably a waste of money anyway, since their users did not experience
immediate or direct harm?

~~~
comex
If you keep reading, Twitter, another of Clearview’s scrape targets, did send
them a cease-and-desist letter, asking them to “cease scraping and delete all
data collected from Twitter”. Not that that guarantees Twitter will actually
follow through and file a lawsuit if Clearview doesn’t comply.

~~~
chii
A cease-and-desist is worth about as much as the paper it's printed on.

------
sillysaurusx
Hot take: fighting this is a losing battle, because you’re fighting against
technology. Can you think of a single technology that society has successfully
rejected? It’s pretty hard.

In general, once the tech to do X exists, then X will find a way to exist.

Instead of crucifying Clearview, it might be better to open a dialogue about
regulation in this space. Even that has problems; Bitcoin, despite early
resistance, was eventually accepted. Yet ICO regulation shows that it's worth
trying.

The point is, this kind of tech can be helpful. Feynman once cited a quote
along the lines of “Science is a key that unlocks the door to heaven, but the
same key unlocks the door to hell.” It seems relevant here.

~~~
reaperducer
_Can you think of a single technology that society has successfully rejected?_

Eugenics?

[https://en.wikipedia.org/wiki/Eugenics](https://en.wikipedia.org/wiki/Eugenics)

~~~
pfdietz
Genetic screening, and abortion of defective fetuses, is a thriving industry.

------
lykr0n
Companies like this are evil, plain and simple. If your business model
involves distributing and processing personal information you had no consent
to collect, and sharing it with third parties, then that's evil.

~~~
civilian
Cynical take: I'm pretty sure we've all agreed to ToSes that have allowed our
faces to end up online. Clearview AI is just doing what a detective with an
eye for faces and unlimited time would do.

I agree that I'm really concerned about the way the surveillance state is
creeping in. But it kind of feels like the cat is out of the bag, and
moralizing like this is useless.

~~~
ska

      Clearview AI is just doing what a detective with an eye for faces and unlimited time is doing.
    

This has always been a bullshit argument for this type of automation, whether
it's facial recognition, license plate readers, recording phone
conversations, or whatever.

It pretends that scale doesn't matter, when the reality is that scale and
network effects are the most important feature of many of these 'advances'.

~~~
lucasmullens
Could you critique it without calling it a "bullshit argument"?

~~~
ska
By "bullshit argument" I mean specifically that this argument is not presented
in good faith. Rather it is used to obfuscate or avoid engaging with the
issues at hand. At least, that is how it comes from corporate entities etc.
From individuals I expect it is more a category error.

Would you have preferred “disingenuous”?

------
chadmeister
A few days ago there was a front page post about Clearview AI where the
original post title was, as rules on this forum require, the same as the
article's title:

"The Secretive Company That Might End Privacy as We Know It"

A few hours in the post title was changed to:

"Clearview AI helps law enforcement match photos of people to their online
images"

The difference to me was night and day, and super sketchy. Why was this
change made? Is Y Combinator invested in Clearview?

~~~
AnimalMuppet
Well, if the headline is clickbait, the policy here is to change it to
something that reflects the actual _content_ of the article, rather than just
parroting the clickbait. Whether that is what was done in the case of this
particular article, I can't say.

~~~
muppet_frog
Not sure I would call a New York Times article 'clickbait'...

