
Before Clearview Became a Police Tool, It Was a Secret Plaything of the Rich - pseudolus
https://www.nytimes.com/2020/03/05/technology/clearview-investors.html
======
DennisP
Clearview now says the app is "available only for law enforcement agencies and
select security professionals to use as an investigative tool."

I'm fairly convinced by David Brin's argument that this is exactly the wrong
approach. In _The Transparent Society_ he argues that privacy is no longer an
option; our choice is between a society where the police surveil us all, and
one where we all surveil each other. He says only the latter is compatible
with freedom. We have to be able to monitor the cops and the powerful, just
like they monitor us.

Maybe we need our own Clearview, with open source face recognition and data on
the darknet.

~~~
pletsch
I'm convinced that Gen-Z is going to blow all of this out of the water. That
the world will eventually hit a point where there isn't anyone without
something online that can come back to haunt them. And I think companies are
starting to see it too.

A little anecdotal but when I was applying for jobs last summer there was a
video that was brought up twice during interviews with different
organizations. Both times I was told after the fact that it was just to see
how I would react. All I did was tell them what happened; I didn't say I was
sorry, because I wasn't. I told them I stood behind my actions and there was
nothing I could do about it now (it hit 100k views on Instagram, not a
situation where I can contain it). Both companies offered me a job.

I rewrote this a few times and I'm still not sure I got my point across, but I
agree with the argument in _The Transparent Society_, and I already see things
shifting towards that.

~~~
saagarjha
> A little anecdotal but when I was applying for jobs last summer there was a
> video that was brought up twice during interviews with different
> organizations.

I find it interesting that a company would do this instead of silently
rejecting you…

~~~
pletsch
The one I took, the interviewer ended up being my boss. He told me that the
reason they overlooked it was because:

1. If there were videos of everything stupid he did, there would be no way he
would ever get hired.

2. It wasn't anything illegal or violent, so he didn't see it as a character
issue.

3\. I was honest about it.

~~~
Balgair
My Machiavellian side says that they have dirt on you now and know they can
use it at any time. If anything, more dirt is then better, as it creates more
leverage for them in the long run.

Still, Hanlon's Razor applies, and your boss is more than likely just telling
the truth.

~~~
djrogers
Your Machiavellian side is completely missing the point then - this is (or
was) a _public_ event (as evidenced by GP's reference to 100k views on IG),
thus there is no leverage.

You'd have leverage if there were a reasonable likelihood that you were the
only hiring manager to have this information, but if everyone is on a level
playing field, the leverage disappears.

~~~
Bartweiss
Not only is an easily discovered public event poor leverage, it becomes much
_worse_ leverage if it comes up in an interview.

When companies (or governments) try to manipulate employees, they frequently
rely on some kind of willful ignorance. Wells Fargo is a great example: they
set impossible performance targets and turned a blind eye to fraud, then fired
and blacklisted whistleblowers - ostensibly for knowing about that same fraud!

If a shady employer wants leverage, even public events can suffice as long as
they can claim ignorance. For example, most stock option grants are
immediately lost if you're fired, but even at-will employment can't be
terminated specifically to deprive someone of their options. So an employer
might give a generous options package, then "discover" the IG video and use it
for dismissal at just the right time to prevent a profitable exercise. But if
that video comes up during hiring, it's no longer a plausible reason for later
dismissal, at least without committing perjury regarding the interview.

I can't even work out a scenario where "lots of people know about this
including us" is an effective way to manipulate someone.

~~~
sfifs
In any real going concern, the cost of hiring and training someone senior
enough to be compensated with stock options, plus the morale hit if such
manipulation were discovered, would far outweigh the benefits. So this is
largely a tin-foil-hat scenario.

~~~
Bartweiss
For any business with decent size, absolutely. There are a thousand ways to
claw back options, and the reason they don't get used is that doing it even
once would make hiring practically impossible.

For a small enough company? It falls in the same category as "diluting out of
one guy's shares" - bad morals _and_ bad business, but it still happens.

------
ausbah
>“People were stealing our Häagen-Dazs. It was a big problem,” he said. He
described Clearview as a “good system” that helped security personnel identify
problem shoppers.

>BuzzFeed News has reported that two other entities, a labor union and a real
estate firm, also ran trials with a surveillance system developed by Clearview
to flag individuals they deemed risky. The publication also reported that
Clearview’s software has been used by Best Buy, Macy’s, Kohl’s, the National
Basketball Association and numerous other organizations.

This seems like just another tool that will be used to put the non-elite at a
disadvantage. Until proven otherwise, I can only think of negative outcomes
for ex-felons, low wage workers, people of color, and the likes coming from
the usage of this app by the rich and large corporations.

~~~
keanzu
> ex-felons, low wage workers, people of color, and the likes

"people of color" are not "the likes" of ex-felons and low wage workers.

~~~
hirsin
People who are generally oppressed by the larger system/those in authority?
There's a charitable understanding of the statement here.

------
notRobot
We're now constantly under surveillance and facial recognition is being used
on that footage. Apple stores. Malls. Bus stands. Grocery stores. Train
stations. Traffic stops. Schools.

Privacy is dead. Anyone with money or any government can use one picture of
you and get basically every piece of information about you.

Maybe this isn't true to the same extent for the HN crowd who might be more
privacy conscious, but it is true for 99% of the rest of the population.

Schools and governments failed to educate about the privacy concerns. Maybe
that's understandable. But they still don't. Teens post all sorts of stuff
that will come back to bite them, that will never be forgotten.

~~~
snarf21
This is true. Only legislation will stop this and that seems unlikely.

The worst part is that most people upload high-quality, high-fidelity tagged
training data on a daily basis. Add in the fact that social networking
automatically builds network circles, and it is no wonder we are riding a
landslide towards 1984 and Fahrenheit 451.

~~~
arpa
We are already there, but with a healthy dose of Huxley's soma, so most of us
don't really see it. I, too, partake of the soma in terms of current
discourse - otherwise it's very hard to survive out there, and being a member
of society almost impossible.

------
zcw100
And they rage when the little guy does it to them. It's obvious that employers
are either paying people to post good reviews or strong-arming current
employees to write positive reviews on Glassdoor. The good thing is they're
easy to spot since they're usually some vapid garbage like "Best place in the
world to work!" with 5 stars interspersed with the real 1 star reviews. You
can also tell because there are usually just enough fake reviews to push the
real ones off the front page.

------
DailyHN
Seems like they're building the hype train so that consumers want to pay for a
version they can use to "take control" of their digital identity.

~~~
Nextgrid
Just like the credit reference agencies. The scum gets your data from
everywhere (sometimes wrong data), shares it with whoever asks, but makes it
super difficult for _you_ to get it or rectify any wrong data.

~~~
gruez
>shares it with whoever asks

Legally speaking they're supposed to get your authorization, but yeah it's an
honor system.

>but makes it super difficult for you to get it or rectify any wrong data.

Is it? I thought all you had to do was write them a letter?

~~~
JohnFen
> Is it? I thought all you had to do was write them a letter?

Yes, it is. I've been trying to get this data on myself for years now, and
have yet to succeed. Letter-writing is not sufficient (at least in the US).

------
bsenftner
Anyone who wants their own Clearview-like app can take almost any FR
application and create a database by scraping the web. All Clearview did was
pre-scrape the web for you; doing that yourself, or as an open source project
of sorts, is not difficult.

~~~
lwh
I wonder how legislating open algorithms as illegal would work out. If you
can run this reasonably on a phone, building your own face/object databases,
how could they stop anyone from doing it?

~~~
JohnFen
You wouldn't legislate against the algorithm. You would legislate to prevent
the abuse of the data used to feed the algorithm.

------
xz0r
If they were able to do this just by scraping public social media images to
create the dataset, and having a facial recognition algorithm in place, what's
stopping anyone else from doing that?

~~~
12xo
I'd imagine that there are many, more clandestine versions of this.

------
vinbreau
This skit from Amazon Women on the Moon turned out to be prescient. It's not
so funny anymore.

[https://www.nytimes.com/2020/03/05/technology/clearview-inve...](https://www.nytimes.com/2020/03/05/technology/clearview-investors.html)

~~~
kyuudou
Which skit?

------
12xo
The problem I see with this tech is with insurance. Your life, your health,
your well-being, your family's well-being, will depend on your ability to
hold good insurance. If everything is measured, everything is tracked,
everything is quantified, your life is going to be very different.

------
hammock
Has anyone here tested the app and can speak to their direct experience?
(throwaway or not)

------
StavrosK
Doesn't the title imply it's not a secret plaything now?

I guess it's not secret...

------
fultonfaknem42
I would _love_ to produce my own ClearView.

------
sub7
I know Hoan. He's a great guy. Very talented. Super capable. Totally trust him
to deploy this responsibly.

~~~
iron0013
Despite the many examples in this article and elsewhere about him already
deploying it irresponsibly? Or maybe I missed the implied “/s”

------
peter_d_sherman
Facial recognition technology as a tool for protecting retail
establishments... Interesting!

------
dropoutcoder
I have lived in {} since the mid 2000’s. Stalking by strangers and
acquaintances has gotten out of hand in (at least) the past five years. (Any
such behavior against me has since calmed down in the past year, after
reworking my digital devices, but the effects have had significant impact on
me. I also dropped out and gave up on life this past year, which may make me a
much less interesting target to harass.)

Such technologies are part of an ongoing increase in information and power
asymmetries that can be abused to harass innocent competitors, as has happened
to me. I’ve had strangers come up to me in public and discuss specifics of my
private life, including non public details about my since failed startup, and
personal/private comms. Concurrently, I was falsely accused of a serious crime
and was put under the microscope and harassed on a regular basis by strangers
regarding this. It became apparent that my life was completely owned at that
point, digitally and publicly. It amounted to ongoing bullying which really
pushed me beyond thresholds of learned helplessness already long since
established.

There seems to be no recourse against this behavior. If you have a digital
“kick me” sign attached to your back, there’s little you can do to remove it,
short of avoiding being in public. Or, as in my case, one can drop out of
life, go homeless, give up all of your assets, and prepare for suicide.
Strangers can verbally harass/own/gaslight others, maintain perfect plausible
deniability, have perfect encryption to cover their tracks, and devastate
people who aren’t equipped to deal with this behavior.

Evolution of survival going forward is trending towards resilience to
increasingly sophisticated psychological violence and harassment, as well as
the ability to accept being an unwitting voyeur in all public places.

One of the most difficult aspects to this was reporting these incidents
(admittedly, under duress in the heat of the moment), and being told that I
must be delusional and mentally ill. To me, the delusion is genuinely
believing that technology is not used to stalk or harass people in public. As
a counterpoint, I will say that being stalked repeatedly does increase your
paranoia, so you’ll start to look over your shoulder at every turn. If you
believe that all of your devices and accounts are hacked and being used to
harass you, the complete lack of digital privacy can have a profound impact on
sanity.

To this day, I’m utterly freaked out by the presence of personal cameras, to
the point where I’ve nudged people in the community to be aware of the
cultural impact of holding phones vertically in coffee shops or other public
places. As most people are of course good natured, I’ve noticed a trend in the
places that I frequent towards people being more prudent in this regard. I
personally cover the public facing back camera on my phone with my index
finger as a matter of habit by now, to avoid pointing it at strangers in
public. Personally I believe responsibility amongst the tech elite would
include immediate installation of physical shutters that open only when a
camera is in use. Shutters can be colored blue or yellow, perhaps as a
culturally standardized signal that the camera is “closed”.

There’s clear benefit to tech such as Clearview but the potential for abuse by
irresponsible or immoral actors is tremendous. As someone pointed out, such
tech can be rolled yourself. It seems that the problem is therefore out of
control. Welcome to the age of unwitting voyeurism.

Edit: I did make a comment on the linked NYT article, including my real
identity. In this comment, I called out at least one person involved in
shenanigans against me. This person name dropped {} as someone who would
recognize him, before he trashed my startup without seeing it, encouraged me
to drop out of my continuing Computer Science studies at the local University
(due to the bad rep I would receive for doing so as a middle age adult, so he
said), and then threatened my career/reputation if I told the truth about
specific stalking incidents, all in one conversation. Not long thereafter, I
experienced a stalking incident in public by two men with walkie talkies who
harassed me about said startup, mentioning non-public specifics about an
engagement we were seeking. In retrospect, these men could have been using
tech such as Clearview to more easily enable their stalking and harassment of
me. The location of this incident was the playground of wealthy folks in my
city’s most affluent public area. My comment on the NYT article was not
approved by the moderators, understandably.

------
RickJWagner
Meh. You can be recognized by humans, you can be recognized by machines. I
don't get the outrage.

~~~
danso
The top of the story features an example where human recognition was not
sufficient:

> _One Tuesday night in October 2018, John Catsimatidis, the billionaire owner
> of the Gristedes grocery store chain, was having dinner at Cipriani, an
> upscale Italian restaurant in Manhattan’s SoHo neighborhood, when his
> daughter, Andrea, walked in. She was on a date with a man Mr. Catsimatidis
> didn’t recognize. After the couple sat down at another table, Mr.
> Catsimatidis asked a waiter to go over and take a photo._

> _Mr. Catsimatidis then uploaded the picture to a facial recognition app,
> Clearview AI, on his phone. The start-up behind the app has a database of
> billions of photos, scraped from sites such as Facebook, Twitter and
> LinkedIn. Within seconds, Mr. Catsimatidis was viewing a collection of
> photos of the mystery man, along with the web addresses where they appeared:
> His daughter’s date was a venture capitalist from San Francisco._

> _“I wanted to make sure he wasn’t a charlatan,” said Mr. Catsimatidis, who
> then texted the man’s bio to his daughter._

~~~
jfk13
So it's OK these days to randomly ask a waiter to go and snap a photo of some
other customer in a restaurant? Maybe call me old-fashioned, but that strikes
me as outrageous behavior.

~~~
pseudolus
“Let me tell you about the very rich. They are different from you and me. They
possess and enjoy early, and it does something to them, makes them soft where
we are hard, and cynical where we are trustful, in a way that, unless you were
born rich, it is very difficult to understand. They think, deep in their
hearts, that they are better than we are because we had to discover the
compensations and refuges of life for ourselves. Even when they enter deep
into our world or sink below us, they still think that they are better than we
are. They are different.”

F. Scott Fitzgerald

