
I got my file from Clearview AI - us0r
https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4
======
sxp
The article is missing a link to the Clearview forms to request a copy of the
data or request deletion:
[https://clearview.ai/privacy/requests](https://clearview.ai/privacy/requests)

~~~
_fullpint
It's absolutely wild that, for anyone who's not a resident of California or
the EU/UK, there isn't a way to request anything other than specific
images/links.

~~~
ardy42
> there isn't a way for you to request anything other than specific
> images/links.

And you can't even do that unless you've already managed to remove the image
from the internet:

[https://clearview.ai/privacy/deindex](https://clearview.ai/privacy/deindex):

> This tool will not remove URLs from Clearview which are currently active and
> public. If there is a public image or web page that you want excluded, then
> take it down yourself (or ask the webmaster or publisher to take it down).
> After it is down, submit the link here.

~~~
sdoering
Under the GDPR (as far as I understand it), you can at least forbid the use
of your data. They are then not allowed to use your data in any way, even data
that is already public or that you put on the internet in the future.

If they do, they are in "deep shit" (pardon my French) legally. I actually hope
they do this, and that somebody catches them in the act. I believe they would
be gone soon after.

I would also advise anyone covered by GDPR legislation to request exactly
whom the data was shared with, and then to request deletion and usage
information from those parties as well. It is a pain that one has to jump
through all these hoops. I would love for the GDPR to have a way of forcing
such a company to make the information and deletion requests on your behalf
and prove to you that it did.

Sadly, I believe this was not included.

------
40four
I used to think some of my peers were being overly cautious by purposely
trying to obfuscate their online profiles. Back in the day, I couldn't have
cared less about putting in the effort to protect my online privacy. I have
slowly but surely come to see the light. Now it's at the forefront of my mind
at all times.

Learning about this company (and I imagine other unknown entities are doing
the same) has encouraged me to get more aggressive.

I think I will start trying some shenanigans I learned from a friend. I plan
on replacing my online profile pics with a random grab from
[https://www.thispersondoesnotexist.com/](https://www.thispersondoesnotexist.com/).
It might not make much of a difference for the old stuff, but maybe I can
successfully request a data deletion as the article suggests. At least it will
introduce a little bit of noise for the AI overlords :)

~~~
JDulin
Just for the record, the journalist who broke the story on Clearview noted
that Clearview AI has specifically demonstrated it isn't fooled by
thispersondoesnotexist.com:

[https://twitter.com/kashhill/status/1218542846694871040?s=20](https://twitter.com/kashhill/status/1218542846694871040?s=20)

You won't be giving them any info on you, but you won't be confounding them
either.

~~~
gertrunde
I'm not sure what is not working-as-intended here?

Run facial recognition against computer generated face, got no matches. Surely
that is the expected and intended result from both parties?

Or is it expected to match against a different face?

~~~
moftz
They might have already scraped all of the faces on that site. They aren't
generated on demand, so you could conceivably scrape the entire database of
fake people and tell the algorithm to ignore anything that matches them. Then
the algorithm would treat any photos of a nonexistent person that it finds in
the wild just as it would if you had no profile pic. It might fool a person,
or an AI not trained on those pictures. Another option is to generate your own
people that do not exist and use those images. This could work as long as
Clearview isn't doing some sort of image analysis to look for telltale signs
of AI-generated faces. You could start photoshopping fake faces onto your real
pictures in an effort to blur the line between AI-generated pictures and real
ones.
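Roughly, that exclusion step could look like the sketch below (hypothetical helper names; a real system would compare face embeddings rather than exact bytes, so that near-duplicate crops also match):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint for illustration; a production system
    # would use a face-embedding distance so resized/recompressed
    # copies of the same fake face still match.
    return hashlib.sha256(image_bytes).hexdigest()

def build_blocklist(fake_images):
    """Fingerprint every known AI-generated face once, up front."""
    return {fingerprint(img) for img in fake_images}

def filter_scraped(scraped_images, blocklist):
    """Drop any scraped photo that matches a known fake face."""
    return [img for img in scraped_images
            if fingerprint(img) not in blocklist]
```

Anything surviving the filter gets indexed as usual; anything matching the blocklist is treated as if the account had no profile pic at all.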

------
air7
The thorny issue here is that this is all public information, put forth
willingly by people. These are not leaked medical records. In a way, these
abilities are like a person saying "hmm, isn't that the guy from that thing a
while back?" What are the limits of what one is allowed to do with public
data? I don't have a clear opinion.

~~~
njkleiner
> These are not leaked medical records. In a way these abilities are like a
> person saying "hmm isn't that the guy from that thing a while back?"

That's why I don't think there is much of a point in trying to prevent people
(e.g. by law) from crawling and using data in this fashion.

I feel like the only reasonable solution here would be to force these
companies to rebuild their databases by legally limiting the lifetime of such
data.

That way people have a chance to remove themselves from the database by
changing/deleting their online profiles without having to use legal measures
like GDPR requests. People wouldn't even have to be aware of any individual
database they might be part of; they would be removed from it automatically at
some point.

Another benefit of this would be that the pure cost of constantly re-crawling
a giant dataset could act as a limiting factor and therefore prevent abuse.

~~~
gowld
If you are going to play the "is it legal" game, then it's illegal bulk
copyright violation. I gave Facebook a license to make copies of my data for
use on its website. I didn't give Clearview a license to make copies to give
out to its customers.

------
wolco
Is it just me, or did anyone else find the number of available photos to be
less than expected?

It is worrisome, but a Facebook could produce a lot more privacy-related
connections from private photos no one knows existed. I guess that's what I
was expecting... perhaps in the future Facebook will offer this service.

~~~
RandallBrown
Almost all of those photos were from the guy's personal blog, and one wasn't
even him.

I'd be way more worried if it was finding stuff like me in the background of
someone else's photo in a crowded city or something like that.

~~~
wideasleep1
To my knowledge, only 1 photo of me exists online (I've never even taken a
selfie), in a concert crowd that showed up in our online newspaper. Feeling
pretty secure about that.

------
thinkloop
Bit of a rub that to request your messy, potentially erroneous, public
profile, you have to give private, authenticated identification and contact
information - basically the most valuable information they could want,
dramatically increasing the value of your profile that you were so concerned
about in the first place.

~~~
ardy42
> Bit of a rub that to request your messy, potentially erroneous, public
> profile, you have to give private, authenticated identification and contact
> information - basically the most valuable information they could want,
> dramatically increasing the value of your profile that you were so concerned
> about in the first place.

The concept of being opted-in by default and being forced to authenticate
yourself to opt out is getting more and more ridiculous by the day.

These systems need to be opt-in, and that requirement needs to be enforced by
some kind of powerful government agency with the power to arrest and jail non-
compliant operators. Anything less feels like it would end up being a complete
surrender to companies like Clearview AI.

~~~
tomerv
The governments are the customers of such products. We can't expect them to
protect us.

~~~
ardy42
> The governments are the customers of such products. We can't expect them to
> protect us.

So you're saying we should just surrender to companies like Clearview AI? I
don't agree with your fatalism.

I think you're making a mistake in assuming that governments are unaccountable
and cannot be prevented from pursuing interests that run against those of the
people. _That's clearly not the case._ If you were right, we'd already have
unchecked police surveillance (isn't the government the consumer of
surveillance products?), but we don't. That's only the case in non-democracies
like China. In democracies, society (and the government itself) is capable of
putting significant constraints on government action.

~~~
tomerv
I suppose it depends on the definition of government. I was thinking of the
executive branch (maybe I confused the terms). The legislature would be the
branch that has the responsibility to protect us from such things.

------
hcarvalhoalves
Is it really that surprising, since all the photos are available on public
sites? It seems this tool reveals the same things a Google search for the name
would reveal.

~~~
MikeAmelung
This guy's name is Tom Smith. Go ahead and pop that into Google and let me
know how that turns out.

His name is about as generic as you could imagine for a white person. But they
returned a bunch of images of HIM, and one Alexey Something-or-other, which
could be his troll account.

Edit: the Alexey part is a joke, I'm sorry but I thought it was funny.

~~~
dmix
TIL generic names (in your geographic region) are good ways to promote
privacy.

I always wondered how the Chinese authorities figured that out considering the
massive name reuse in China. I can’t watch a Chinese movie or TV show without
at least one character sharing the surname (the first part in China) of a
Chinese person I know.

Smith would be our classic version in the west.

~~~
ThrustVectoring
The Romani people have used it as a general tool to make it difficult to
govern them. The local equivalent of "John Smith" gets used by everybody in
every official form. Meanwhile, they just call each other what they otherwise
would've.

------
LockAndLol
I'm not entirely sure what's so shocking about public data, shared willingly
by the person to the public, being used to identify the person. If the data
weren't/hadn't been shared willingly, I'd definitely see that as a problem.
Same thing as if the data had been gleaned from non-public sources,
e.g. the person's private belongings or secured digital realms like private
forums, password-encrypted backups, private profiles, etc.

~~~
Cthulhu_
It's about trust and permission. If I give e.g. a local news website
permission to use my portrait, I do NOT implicitly give permission to
Clearview, Google, Facebook, etc to use it for their own purposes.

I mean, it's implicitly assumed that anything you post on the internet is
public property, but legally that is not the case. Portrait law applies (at
least in my country): you can't just take someone's portrait and use it for
your own gain.

~~~
LunaSea
Actually you do, because scraping is considered legal in the US, and your
information is on a public site, which makes scraping it legal.

~~~
stordoff
Legal _in the US_ is an interesting point. We often see the US trying to
overreach and apply US standards when US-owned material is hosted on sites
outside of the US (Richard O'Dwyer), or arguing that accessing a server in the
US makes you subject to US law (Gary McKinnon). Why would similar arguments
not apply here? It may be legal in the US, but that doesn't necessarily place
the actions outside the jurisdiction of the other country. Naturally,
_enforcement_ may be an issue, but that doesn't make it legal. At the very
least, I don't see how it can be seen as OP giving permission - it's
effectively another country taking that decision away from him.

------
samsquire
Ideally we live in a society where not being anonymous is not such a big risk.

But governments change, and the data is still around to be abused.

This is what disturbs me: how data can be abused in the future.

~~~
chii
> not being anonymous is not such a big risk.

the world has always been anonymous because of the lack of capability to track
large amounts of data; that changed only recently.

Anonymity gives you safety from anyone who seeks to prey on you. I think that
safety needs to be maintained. People stupidly put photos of themselves
online, then face-tag their friends. This allows third parties to identify
your friends and circles, and that's dangerous. All relationships should be
reciprocal.

~~~
kbenson
No, long ago the world was a bunch of small groups of people that knew each
other intimately. At the village and tribe level there was little or no
anonymity. The offset was that there was also _intimacy_. The problem now is
that we again have no anonymity, but there is none of the intimacy needed to
offset the negative aspects of this (and we probably can't, I don't think
intimacy scales the same way, but who knows).

~~~
vageli
The people who left their tribes (or were forced off) were effectively
anonymous to any new group of people they might come across. So if you were
kicked from your group for a misdeed, sure you could continue your bad deeds
in the new group or you could turn over a new leaf without the weight of your
past mistakes holding you back.

~~~
kbenson
> The people who left their tribes (or were forced off) were effectively
> anonymous to any new group of people they might come across.

The people who left or were forced out often died. The world is a scary and
dangerous place without a support network, which civilization basically is.

> So if you were kicked from your group for a misdeed, sure you could continue
> your bad deeds in the new group or you could turn over a new leaf without
> the weight of your past mistakes holding you back.

Outsiders were often viewed with distrust. Why wouldn't they be, when most
people could only associate someone leaving the safety of a community with, at
best, a foreign way of thinking, and more likely with being forced out for
past misdeeds?

It's sort of like interviewing a 35-year-old job candidate who has no work
experience to show for the last decade and not a very convincing story as to
why (or even whether the story is true; you don't really _know_). Why take the
risk?

------
martin-adams
So will we see a new normal where, if you delete your data, you're considered
suspicious because it looks like you might be hiding something?

Personal data may soon become the same as a credit score. No data, high risk.

~~~
fredsanford
This is already here with Google if you use uBlock or uMatrix. Endless
captchas...

"We detected suspicious blah blah blah"

~~~
dredmorbius
My reCAPTCHA policy:
[https://mastodon.cloud/@dredmorbius/102054627041751386](https://mastodon.cloud/@dredmorbius/102054627041751386)

~~~
ThreeFx
Ironically that triggered the CloudFlare check for me.

~~~
dredmorbius
A recent change on that instance, and one causing other issues:

[https://mastodon.cloud/@TheAdmin/103880990266428449](https://mastodon.cloud/@TheAdmin/103880990266428449)

[https://mastodon.cloud/@dredmorbius/103882090290877356](https://mastodon.cloud/@dredmorbius/103882090290877356)

------
Rafuino
I submitted an info request about two months ago now, and I haven't heard a
word from these jackasses.

~~~
fishmaster
Keep going. They risk being fined by not answering.

------
jachee
I wonder what would happen if I were to issue them a $5000/mo/piece
retroactive license invoice for any/all use of my name/likeness/etc for their
profit.

I wonder what would happen if _everyone_ did.

~~~
Cthulhu_
A class-action suit that would probably last a couple of years and end in a
settlement where the claimants, assuming they fill(ed) in forms x, y, and z,
would be entitled to at best a few hundred dollars.

See also the Equifax settlement. I'd say Equifax is the best / most recent
example about something like this.

~~~
JumpCrisscross
> _Equifax is the best / most recent example about something like this_

Equifax’s credit-data provenance is enshrined in law. Their mistake was in
improperly distributing legally-owned data.

Clearview does not have legal claim to individuals' or Facebook's copyrights.
Its mistake is more fundamental than Equifax’s.

------
fg6hr
It's interesting how collecting or distributing files with music is a grave
crime and corporations can take down anything they don't like with a half-
assed DMCA request, with no repercussions for "mistakes", but a person's face
image doesn't belong to that person and the same corporations can safely
collect and distribute these images for profit.

~~~
ArnoVW
Actually, in Europe it _is_ illegal to collect personally identifiable
information of people without their consent. GDPR and all that.

I am surprised they haven't been fined out of existence yet.

~~~
sdoering
Probably because nobody in the EU currently has this company on their radar.
Or because those who are critical of this company and its practices have not
yet reported it to the relevant authorities.

In Germany, not only the GDPR but also the so-called "right to one's own
picture" applies here. This means that no one may use/sell pictures of a
person without explicit consent. Therefore, photographers must also have an
explicit release from the person for the respective context of use.

~~~
cl3misch
There has been recent activity (article in German).

The article claims you can ask for deletion of your data without uploading
your ID.

[https://www.golem.de/news/gesichtserkennung-datenschuetzer-rechnet-mit-millionen-clearview-betroffenen-2003-147466.html](https://www.golem.de/news/gesichtserkennung-datenschuetzer-rechnet-mit-millionen-clearview-betroffenen-2003-147466.html)

~~~
101404
Clearview did not obtain consent, so the data shouldn't be there in the first
place. Each and every data protection authority here in Germany should be
investigating this.

------
shmageggy
> _For a few million dollars, nearly anyone with a startup background could
> likely build their own version of Clearview in less than a year._

Am I being naive, or is this being overly generous? What about this cannot be
recreated with an off-the-shelf web scraper and a pretrained facial
recognizer?

~~~
PeterisP
It's likely to cost you a few million dollars in hardware, and multiple
months, to run an off-the-shelf web scraper and a pretrained facial recognizer
over a very, very large number of images. There are a lot of images on the
internet; bandwidth and compute are not free.
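As a rough illustration, here is a back-of-envelope sketch; every constant below is an assumed placeholder chosen only to show which cost terms scale with the image count, not a real estimate:

```python
# Back-of-envelope only: every constant here is an assumption.
num_images = 3_000_000_000      # assumed index size (order of magnitude)
avg_image_bytes = 200 * 1024    # assume ~200 KB per downloaded photo

total_bytes = num_images * avg_image_bytes

cost_per_gib_transfer = 0.05    # assumed $/GiB of bandwidth
bandwidth_cost = total_bytes / 1024**3 * cost_per_gib_transfer

faces_per_sec_per_gpu = 50      # assumed detection+embedding throughput
gpu_hours = num_images / faces_per_sec_per_gpu / 3600
compute_cost = gpu_hours * 1.0  # assume $1 per GPU-hour

print(f"raw images: ~{total_bytes / 1024**4:.0f} TiB to fetch and store")
print(f"bandwidth:  ~${bandwidth_cost:,.0f}")
print(f"compute:    ~{gpu_hours:,.0f} GPU-hours (~${compute_cost:,.0f})")
```

Whatever the exact constants, each term grows linearly with the number of images, which is the point: at billions of photos, none of these line items is free.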

------
joshwa
Yandex Image Search already uses facial recognition and is available to the
public. See:

[https://yandex.com/images](https://yandex.com/images)

[https://nelsonslog.wordpress.com/2020/01/07/facial-recognition-for-the-public-yandex/](https://nelsonslog.wordpress.com/2020/01/07/facial-recognition-for-the-public-yandex/)

[https://www.bellingcat.com/resources/how-tos/2019/12/26/guide-to-using-reverse-image-search-for-investigations/](https://www.bellingcat.com/resources/how-tos/2019/12/26/guide-to-using-reverse-image-search-for-investigations/)

------
koolba
This website gets into some crazy redirect loop between medium.com and the
subdomain when opened without JS. How is that even possible?

~~~
shakna
It returns a 302 header with this location:

[https://medium.com/m/global-identity?redirectUrl=https%3A%2F%2Fonezero.medium.com%2Fi-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4](https://medium.com/m/global-identity?redirectUrl=https%3A%2F%2Fonezero.medium.com%2Fi-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4)

Which then sends another location of:

[https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4](https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4)

Which just ends up bouncing you back and forth, unless JS is allowed to
percolate through. However, there is some useragent sniffing happening, so the
exact set of headers changes.
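A non-JS client's experience can be modeled as a tiny Location-header follower (the mapping below is a simplified stand-in for the real, URL-encoded 302 responses):

```python
ARTICLE = ("https://onezero.medium.com/"
           "i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4")
IDENTITY = "https://medium.com/m/global-identity?redirectUrl=" + ARTICLE

# Simplified stand-in for the Location headers a non-JS client sees:
# each URL's 302 points back at the other.
REDIRECTS = {ARTICLE: IDENTITY, IDENTITY: ARTICLE}

def follow(url, redirects, max_hops=10):
    """Follow Location headers; return (chain of URLs, looped?)."""
    seen, chain = set(), [url]
    for _ in range(max_hops):
        if url not in redirects:
            return chain, False   # landed on a real page
        if url in seen:
            return chain, True    # revisited a URL: redirect loop
        seen.add(url)
        url = redirects[url]
        chain.append(url)
    return chain, True            # gave up: treat as a loop
```

With JS enabled, the identity page sets a cookie and breaks the cycle; without it, `follow(ARTICLE, REDIRECTS)` detects the ping-pong immediately.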

------
drivebycomment
A related thought experiment: what if Google or Bing built an image search
service based on face recognition? If you don't like that idea, how about
doing it only for celebrities and public figures? If you are OK with one but
not the other, what would be a good line distinguishing them, and what would
be the guiding principle? If you are not OK with either, or OK with both, why?

The more I think about it, the more I lean toward allowing both, but I can see
why people would not like it.

~~~
brandonmenc
> If you are ok with one but not the other, what would be a good line
> distinguishing them and guiding principle?

Because celebrity by definition requires trading privacy for fame, and is
almost always a decision.

We need a new legal classification for "public, but not accessible by everyone
in the world for the rest of time" information, which is what most regular
people assume or desire for themselves.

------
mbrameld
Aren't they profiting from the copyrighted works of others without express
permission?

------
holler
Do the founders of this company have their profiles in the database? What
about the other executives and investors? Do those profiles contain all of the
same personally identifiable information as any other randomly sampled user?
Does it show photos, social media posts, personal contacts, school teachers,
addresses, family member data, etc.? If not, this sort of blunt assault on
privacy would make Orwell roll over in his grave.

------
davnn
So the difference between this and Google's reverse image search is that
Google's matching algorithm is worse (probably because it doesn't include
facial recognition)?

Well, public images are public, and I don't think banning such a service would
prevent governments from implementing something like this... the technological
barriers are getting lower and lower.

~~~
sixstringtheory
You can presumably hold government officials accountable, but you can’t vote
out the CEO of Clearview.

------
Funes-
Nobody should be posting their private data (pictures or videos, for instance)
on a publicly accessible site if they want privacy. Look, if you want your
relatives, friends, and even acquaintances to see whatever you want them to,
just do it through a medium that allows for private communication. It's just
common sense.

~~~
vharuck
>It's just common sense.

Based on what and how people share online, the "common" sense is an
expectation of decency in not vacuuming data just because you can. You and I
know that's silly, but that's not common sense.

------
megaman821
When it comes to information gathering, I have always assumed that if it is
technically possible to do, then some agency in the government is doing it. So
even if people were able to shame all of Clearview's customers into not using
them, that wouldn't stop this type of information gathering from going on.

~~~
kick
Just because the government has nukes, doesn't mean citizens should have them.
The government shouldn't have them, either, but there's not much citizens can
do about that outside of revolt.

Meanwhile this company has been nothing _but_ privacy abuses and lies to the
public. If it isn't broken up by law, it will be interesting to see what the
people do.

------
hellofunk
> Perhaps most worrying is the fact that some of Clearview’s data is wrong.
> The last hit on my profile is a link to a Facebook page for an entirely
> different person. If an investigator searched my face and followed that lead
> (perhaps suspecting that the person was actually my alias), it’s possible I
> could be accused of a crime that the unknown, unrelated person whose profile
> turned up in my report actually did commit.

Yikes, this is like Minority Report - level weirdness.


------
eurekin
Honest question: why is the author so shocked, given that all the information
was public or published by himself?

~~~
Lio
I think because the idea that someone you don't know and have no relationship
with is systematically collating everything they can about you, is a bit like
having a stalker.

You don't know why they're stalking you.

On the face of it it's for law enforcement in case you decide to commit a
crime sometime in the future.

...or it could be so that they can raise the prices in shops when you walk in
and the facial recognition picks you up.

...or it could be for a future employer to decide that the kind of bars you
visit means that you're not the right "social fit" for a job.

...or it could be for... anything.

You have no control and that's the scary thing.

------
DyslexicAtheist
I hope somebody out there in a country that doesn't have an extradition
agreement with the US breaks into their systems and publishes the list of
their clients. We urgently need a big-style leak of everything these scumbags
do.

Also countries should start issuing arrest warrants and sanctions against
them.

------
knolax
Wow, that last photo doesn't even remotely look like him. I'm more surprised
by how bad their facial recognition is than by the data they have.

------
101404
Why is there no arrest warrant against the managers and owners of Clearview
here in the EU?

They are using PII of hundreds of millions of Europeans without written
consent.

That alone should mean billions of euros in fines.

~~~
saturday14
Because the general public doesn't care enough to raise a stink? Back when the
Snowden story broke, I tried explaining its significance to a handful of my
friends; all I got was a yawn. These are not dumb people, they are smart
professionals (non-IT). Even then, I failed to get the point across to them.

It would require some serious education before the public wakes up to the
dangers of private companies running amok with their data. Sad thing is, it is
already too late. It is going to be very difficult to put a lid on this. This
is a company that we (now) know about - how many are there silently working in
the shadows that we don't know about?

~~~
dnissley
Is potential surveillance by government entities the primary reason you are
against companies generally doing whatever they want with user data? Are there
other rationales?

Just like your friends, I personally don't particularly care either, but from
time to time I have tried to understand the privacy crowd's obsession with
this issue and the rationale behind laws like the GDPR and CCPA, as well as
the desire for even more restrictive laws and I truly don't get it.

Is there a manifesto somewhere I'm missing? Some essay or thinkpiece that lays
out in detail the case against collection of user data?

~~~
BiteCode_dev
The best manifesto is a history book.

Take any century in the last 2000 years of human history, and there are files
about people. For a long time they were carved or handwritten. Then they were
printed. Now they're digital. But it's the same thing; only the scale and
speed change.

And at any point during those centuries, somewhere in the world, some entity
(it doesn't have to be the gov) does bad things with those files. It's
different every time. Excluding, killing, tracking, stealing, controlling...
The form changes every time, but it's the same thing: abuse of those files.

It would, of course, be less of a problem if the information access was
perfectly symmetrical. If anybody could access anybody's data, society would
probably have a hard time for a few years, then adjust. And maybe become more
fair.

But that's not what's happening. Here it just reinforces power asymmetry. And
it creates incentives with huge bias that affect everybody's life.

There are three reasons why people give your answer.

1 - We had a nice run for a few decades in North America and West Europe. It's
been a sweet life. And the human mind sets it as a new baseline. Now people
see this as normal, and something else as the exception.

However, those decades ARE the exception. An exception that we need to
maintain as best we can.

2 - We are already pretty bad at making the connection between our misery
today and the consequences of our past lifestyle, but today's information
system is making it extra hard.

There are several factors for this: those in power getting really good at PR,
information overload, more levels of indirection between causes and
consequences, and the whole system complexity that never ceases to increase.

3 - The convenience is huge, and the price delayed

We don't get tracked for free, we get huge convenience in exchange. Plus we
don't pay the price immediately, nor individually. We pay it as a society, and
since it's cumulative, it's not obvious how much it costs us. It will only be
painful in ... Well nobody knows when.

In fact, not only would doing things right strip us of convenience, we would
each pay a steep price on top of that, right now, while seeing everybody
around us not doing it.

It's the exact same problem as global warming.

Not accepting tracking is a deep and important political decision that shapes
the future of our entire society. It is as important as avoiding mixing the
church and the state or defending freedom of speech.

And it's also why it's not a popular view: it requires thinking about what
society we want to build, and not just what life we hope to have individually.

------
bencollier49
Compiling profiles like this without consent is subject to massive fines under
GDPR.

I find it extremely surprising that they would be responding to GDPR subject
access requests, given that they appear to be ignoring the rest of it.

~~~
sdoering
Not replying would get them in trouble even faster. So they're probably hoping
to appease whoever requests the data.

------
minusSeven
Could people in countries with no clear privacy laws also request that
Clearview show their data?

Has anyone tried it?

------
himinlomax
The author says: "If they have — and you’re a resident of California or a
citizen of the EU — the company is legally obligated to give you your profile,
too."

Sure, they're obligated to give you your profile per GDPR. But if you're
within the GDPR's jurisdiction, they're obligated to get your consent BEFORE
they collect personal information about you. If they haven't, they're liable
for at least millions of euros of fines.

------
DeathArrow
They are using copyrighted images. What if 1,000 copyright owners decide to
sue them for $100,000 each?

~~~
Kovah
One thing you should be very careful about is licensing. If you post photos on
Facebook or Instagram, you automatically grant them a license to redistribute
the photo and share it with others. And these "others" can include Clearview.
So, Clearview could have a contract with Facebook which legally allows them to
get and save those photos. So, you are still the copyright owner, but due to
licensing Clearview can legally store and use the photos.

As for suing them if they do break copyright: I'm not sure about the US, but
in Germany it wouldn't be that easy to actually sue them for such a high sum.
You could argue that the company makes money by offering the search service,
but there would have to be evidence that Clearview made that specific sum just
with your photo(s), which is very unlikely.

~~~
Cthulhu_
They would have to be able to prove that in a lawsuit first. Do they have
similar agreements with the alumni magazine? The Python coders' meetup group?
The personal blog?

Just one of those pictures would be enough for a lawsuit.

Of course, if someone were to sue, they would be heavily pushed to settle out
of court. I suspect there's a lot more legal activity going on about things
like this, but everyone that starts to make some noise is quickly silenced
with a lump sum and a binding contract to shut up about it. If they don't,
they're threatened with spending the next 5-10 years in court. Because if
there's anything corporate lawyers are good at, it's stalling and making sure
the suing party, especially if it's a random consumer, spends years and
hundreds of thousands in court fees.

What we need is more rich people suing businesses. Or a massive public defense
fund supporting the average joe's case.

But right now it's in the rich people's interests to support shady businesses,
and if they don't they'll be offered massive financial incentives on the golf
course.

------
cmarschner
Given the "right to be forgotten", shouldn't he be able to request that all
his data be deleted?

~~~
erk__
From the article:

> And remember that once you receive your data, you have the option to demand
> that Clearview delete it or amend it if you’d like them to do so.

------
peter_d_sherman
Excerpt:

"What does a Clearview profile contain? Up until recently, it would have been
almost impossible to find out. Companies like Clearview were not required to
share their data, and could easily build massive databases of personal
information in secret.

 _Thanks to two landmark pieces of legislation, though, that is changing. In
2018, the European Union began enforcing the General Data Protection
Regulation (GDPR). And on January 1, 2020, an equivalent piece of legislation,
the California Consumer Privacy Act (CCPA), went into effect in my home
state._

Both GDPR and CCPA give consumers unprecedented access to the personal data
that companies like Clearview gather about them. If a consumer submits a valid
request, companies are required to provide their data to them. The penalties
for noncompliance stretch into the tens of millions of dollars. Several other
U.S. states are considering similar legislation, and a federal privacy law is
expected in the next five years."

------
yoaviram
Send them a GDPR/CCPA deletion request now:
[https://yourdigitalrights.org/?company=clearview.ai](https://yourdigitalrights.org/?company=clearview.ai)

------
battery_cowboy
Private companies can do whatever immoral garbage they want, and in exchange
for access to the data or the product they developed in an unethical manner,
the government slaps them on the wrist, or does nothing, allowing it to get
worse and worse and allowing them to assert more control over us.

Imagine the story when people find out this or another company has started
scraping porn sites with facial recognition, gaining access to surveillance
footage from Nest or Ring, or maybe even getting access to state and federal
DOT cameras and real-time feeds from body cameras!

Facial recognition is going to need some regulation, ASAP.

~~~
nexuist
> starts gaining access to surveillance footage from Nest or Ring, or maybe
> even gets access to state and federal DOT cameras and real time feeds from
> body cameras!

[https://www.engadget.com/2020-03-04-banjo-ai-utah-law-
enforc...](https://www.engadget.com/2020-03-04-banjo-ai-utah-law-enforcement-
surveillance.html)

> The agreement gives the company real-time access to state traffic cameras,
> CCTV and public safety cameras, 911 emergency systems, location data for
> state-owned vehicles and more. In exchange, Banjo promises to alert law
> enforcement to "anomalies," aka crimes, but the arrangement raises all kinds
> of red flags.

> Banjo relies on info scraped from social media, satellite imaging data and
> the real-time info from law enforcement. Banjo claims its "Live Time
> Intelligence" AI can identify crimes -- everything from kidnappings to
> shootings and "opioid events" -- as they happen.

------
sergioisidoro
Why are paywalled articles still allowed on HN front page? And why are people
still upvoting them?

~~~
tomhoward
[https://news.ycombinator.com/item?id=22693084](https://news.ycombinator.com/item?id=22693084)

------
shmerl
They should be in prison, instead of making money on violating people's
privacy.

------
chippy
Would you help and use Clearview if it were part of your government's
strategy against the coronavirus?

Would you volunteer your time to tag images of your friends and
acquaintances to help slow down the virus? To do otherwise would be immoral
and lead to the deaths of thousands, right?

To those worried about that: it's just temporary. It would only last a few
months, and then you wouldn't need to worry about it any more. This is a
global war, and we have to make sacrifices and take important actions.

~~~
dredmorbius
Yuval Noah Harari, "The world after coronavirus"

[https://www.ft.com/content/19d90308-6858-11ea-a3c9-1fe6fedcc...](https://www.ft.com/content/19d90308-6858-11ea-a3c9-1fe6fedcca75)

Bruce Schneier, "Emergency Surveillance During COVID-19 Crisis"

[https://www.schneier.com/blog/archives/2020/03/emergency_sur...](https://www.schneier.com/blog/archives/2020/03/emergency_surve.html)

I'd have very grave misgivings.

~~~
toxicFork
Thank you for the links. They remind me of my own encounters with similar
thinking.

I was in a startup accelerator near the end of last year, and many of the
business ideas my teams came up with were data-oriented: how data could be
better gathered or used for good purposes, or for profit and control. For
example, measuring people's location, driving speed, and braking force to
offer discounts on car insurance. We found out that some companies already do
this, and others are asking for it. And that was not even in an emergency
context.

From another viewpoint, the more data we collect, the more sensitive we
become to "crises", or the easier it becomes to label simpler things as
"crises" anyway, which in turn demands more data gathering. If we allow data
to be used to stop the coronavirus, for example, we can declare the flu the
next thing to tackle, if only we had more information about people. And why
not the common cold, too, after the flu?

I feel that it is inevitable.

