
Clearview AI helps law enforcement match photos of people to their online images - johanam
https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html
======
oefrha
> His tiny company, Clearview AI, devised a groundbreaking facial recognition
> app. You take a picture of a person, upload it and get to see public photos
> of that person, along with links to where those photos appeared. The system
> — whose backbone is a database of more than three billion images that
> Clearview claims to have scraped from Facebook, YouTube, Venmo and millions
> of other websites — goes far beyond anything ever constructed by the United
> States government or Silicon Valley giants.

This doesn’t sound groundbreaking at all, and I’d be very surprised if the
FBI/NSA/DHS/Palantir didn’t already have a system like that. Maybe they were
just reserved for higher-value targets. Of course, the NSA isn’t going to tell
you what it has constructed until decades later, so claiming that this goes
far beyond NSA capabilities is reckless at best and clueless at worst.

~~~
jslabovitz
What’s groundbreaking here is not really the technology but rather the
marketing: Clearview is selling a relatively cheap product to law enforcement
agencies who are desperate to solve cases at the lowest cost.

I think we (both here at HN and in the larger society) have this perception
that police departments, etc., are part of a well-organized hierarchy of well-
considered processes and technology that starts at the FBI/NSA/etc. and works
its way down, all carefully vetted for technical and ethical standards, and
disclosed to the public, which is ostensibly their employer.

The reality that I’ve seen is quite different. As the article points out,
individual police departments (and their officers!) often independently
research and procure services, based on hearsay and questionable morals. Does
the service help ‘get the bad guys’? Then it’s good. Does the service
obviously violate copyright and ToS agreements? Not our problem.

I learned this when I built a scrappy little website for a one-man company
that builds training kits for first responders. As low-tech as this site was (e.g.,
ordering the product entailed sending a purchase order by postal mail) I was
struck by the apparent success of my client’s product: he had dozens of
endorsements by apparently top-level influencers in the field, and didn’t have
much competition. There were some certifications that he matched, but it
didn’t seem all that hard to sell a few thousand dollars of technology to
small/mid-level LEOs. (This isn’t a criticism of his product, which seemed
fine and certainly didn’t involve privacy or surveillance technology.) The
numbers in the article about Clearview’s services are in the same range — easy
to justify if it’s only a few thousand a year and apparently produces good
results.

~~~
enraged_camel
>>What’s groundbreaking here is not really the technology but rather the
marketing: Clearview is selling a relatively cheap product to law enforcement
agencies who are desperate to solve cases at the lowest cost.

Exactly. This is a prime example of commoditization. The product doesn't have
to be sophisticated. Just good enough to handle the most common use cases
reasonably well, at a cheap price.

------
kbos87
I’ve been waiting to see this story and this company show up. It has felt
inevitable and it’s sad that it’s being allowed to happen. How long will it be
before the Ring cameras on every block feed into a real-time database where
anyone can figure out where anyone else is, track their movements, and monitor
their every moment?

Facial recognition is a bigger threat to our current way of life than just
about anything else I can fathom other than climate change or nuclear war. The
scariest part is that few people seem to recognize or care about the risk.

~~~
bonoboTP
People freak out when it's based on cameras and faces because it triggers some
ancient "being watched" reactions in us.

But when it's based on phone GPS, Wi-Fi networks, etc., then people are fine
with it. And that type of tracking has been very possible for many years now
through smartphones. But it feels less viscerally spooky.

~~~
JohnStrangeII
It's because you can leave your phone at home but not your face.

~~~
analbumcover
Maybe this will lead to a Face/Off type scenario.

------
ausbah
"One of the odder pitches, in late 2017, was to Paul Nehlen — an anti-Semite
and self-described “pro-white” Republican running for Congress in Wisconsin —
to use “unconventional databases” for “extreme opposition research,” according
to a document provided to Mr. Nehlen and later posted online."

Giving hate groups a greater ability to stalk and harass their victims is also
pretty scary.

------
tinman45
This illustrates the hypocrisy of Silicon Valley... "don't be evil" indeed.

Peter Thiel is funding this, despite having demolished Gawker Media through
litigation over an invasion of his own privacy.

Forget Silicon Valley; we urgently need federal regulation to limit this
assault on our privacy (at the very least it can slow down our country's
inevitable decline into a Black Mirror episode).

~~~
Hamuko
I don't think many people in the valley actually care about being evil.

~~~
CarelessExpert
I don't think many people in the valley think about it one way or the other.

Welcome to our brave new world: techno-utopian utilitarianism, exemplified by
the sophomoric philosophies of Zuck and Thiel.

~~~
samstave
More insidious than they are simply sophomoric

------
rahuldottech
I'm not okay with this, and you shouldn't be either. It's only a matter of
time before it gets misused and abused: to identify protestors, for law
enforcement officers' personal use, to identify those doing legal things
deemed "immoral" (see: China), etc.

We really need regulation here. Urgently.

The US appears to have been the leader in such regulation in the past. The
problem is, they don't do that anymore. They haven't passed any laws related
to user rights or privacy in a long time, and are actively trying to make
encryption illegal.

The same is true for the Australian government, and those of several
developing nations. We can hope that the EU does something, but... the impact
will be limited.

It's especially bad for people living in non-first-world countries like India
where the citizens aren't educated on the consequences of law enforcement
agencies using tech like this. Laws taking away the right to privacy are being
pushed through regularly. Recently they've started using facial recognition to
identify protestors: [https://www.fastcompany.com/90448241/indian-police-are-
using...](https://www.fastcompany.com/90448241/indian-police-are-using-facial-
recognition-to-identify-protestors-in-delhi)

I really wish that some leading tech companies would try and push regulation
through, but that will never happen since apparently privacy erosion and
constant user tracking is critical for revenue for seemingly all of them
(except Apple, I suppose).

Also, even if somehow regulations were put in place that made it necessary for
websites to try and protect user data and made it illegal to scrape PII,
there's nothing stopping government agencies from developing tools like these
for themselves. Aaaand we go back to the first paragraph of this comment. This
is a sad state of affairs.

~~~
quotemstr
I'm okay with this. It's a long-standing principle that you have no
expectation of privacy in public spaces. On what basis do you get to claim
suddenly that it's "not okay" and on that basis slow technological
development? Every new tool has benefits and drawbacks; when it comes to
integrating computing into our lives, the benefits have always massively
outweighed the drawbacks.

We don't need regulation here, urgently or not. This whole push towards
banning things --- this company, the EU facial recognition thing, and so on
--- strikes me as just another moral panic used as an excuse for a few to
impose their opinions and power on the many.

I've yet to see privacy advocates identify actual undeserved harms that have
come to people as a result of the technology that they want to regulate. Loss
of "privacy" in public is only a harm if you already accept the premise of the
argument, which I don't.

~~~
saagarjha
> It's a long-standing principle that you have no expectation of privacy in
> public spaces.

There are limits to what we consider acceptable even in public spaces; for
example, upskirt photos aren't OK even if you're in a public place. I think it's
still reasonable to consider that one day (maybe today, for many people?) it
might mean that every single moment of their life outside is being recorded,
which was literally not possible until recently. It's a valid thing to
discuss.

~~~
quotemstr
What I'm against is this idea that X is okay as long as X is expensive, but
the moment X becomes cheap, or democratized, or accessible, all of a sudden
it's a problem, and we need a ban.

Example: it's already legal to keep tabs on people in public. There are
businesses built on this idea: private investigators. A little sleazy?
Expensive? Sure. But legal. If you want to be consistent, you should ban them
too.

If X is what causes harm, X should be disallowed no matter the price.

~~~
rapind
In general I disagree (but would agree with you in certain cases, like
something that was doable but gatekept due to cost).

A lot of Xs aren't a problem until they can scale. It's not pragmatic to
outlaw everything that might be a problem at scale but might never be able to
achieve that scale.

We're in an era where we are discovering a lot of abuses that could only be
classified as an issue due to scale and efficiency.

------
gazoakley
I can imagine the EU having a field day with this. Given the scraping of data
from other sites, this is clearly processing of PII without explicit consent -
it's effectively being used as a biometric identifier too so the rules around
sensitive data also apply...

~~~
pinkfoot
If they don't do business in the EU, the GDPR doesn't apply to them, any more
than Saudi Arabia's blasphemy laws affect me.

~~~
99ps
It’s my understanding that GDPR protects the personal data of EU citizens.
Regardless of whether you do business in the EU, if you process this data GDPR
applies to you.

My guess is that this company has no way to verify that they don’t process EU
citizen data. They almost certainly do if they’re scraping so pervasively. And
I don’t think they can credibly claim users gave consent let alone all the
other rules they need to follow.

Looking forward to someone challenging them on this and hopefully the EU
taking action. This feels like exactly what GDPR should protect against.

~~~
pinkfoot
The EU might want that, but just today my company collected and passed the
personal details of a number of EU citizens to colleagues without their
consent.

Since the company doesn't do business in the EU, the GDPR can go get knotted.

PS. My gay mates have also not decided to go straight just because Uganda
outlaws it.

~~~
alwayseasy
>> Since the company doesn't do business in the EU, the GDPR can go get
knotted.

That's not how international law works, though, especially when wielded by a
large economic bloc. If the EU wants to put pressure on a company, the pain is
harsh. For instance, they can blacklist the company and its C-suite from
international banking and ask any in-treaty country to extradite or arrest
employees.

Also, are you admitting to breaking EU law and moral/ethical codes on HN?

~~~
pinkfoot
The first yes, the latter no.

I also freely admit to breaking a lot of blasphemy laws.

None of them are laws where I live, so I won't ever get extradited.

------
caconym_
Is this unique? I feel like there are many companies providing this sort of
surveillance/data-scraping aggregation and indexing functionality to law
enforcement and TLAs. We just don’t hear about them too often because we are
not their customers.

Our government has abandoned us, and total surveillance is the future unless
something radical changes.

Fun tip: get an old analog radio, like in an old non-connected car or a
boombox or walkman or clock radio or something, go somewhere quiet and
private, and listen to whatever you want. And realize that nobody knows what
you’re listening to—it’s your secret. I find this to be a strangely powerful
experience.

~~~
tim333
The Russians have had a service to find people on their Facebook equivalent
since 2016
[https://en.wikipedia.org/wiki/FindFace](https://en.wikipedia.org/wiki/FindFace)

------
dbroockman
Holy shit, this quote is crazy:

> While the company was dodging me, it was also monitoring me. At my request,
> a number of police officers had run my photo through the Clearview app. They
> soon received phone calls from company representatives asking if they were
> talking to the media — a sign that Clearview has the ability and, in this
> case, the appetite to monitor whom law enforcement is searching for.

~~~
wallace_f
Sadly, it reminds me of all the stories of authorities using databases for
stalking women they're interested in, or people they have a personal grudge
against, or whatever.

[https://nypost.com/2019/03/11/sergeant-used-police-
databases...](https://nypost.com/2019/03/11/sergeant-used-police-databases-as-
personal-dating-service-to-target-150-women-chief/)

[https://theweek.com/speedreads/651668/hundreds-police-
office...](https://theweek.com/speedreads/651668/hundreds-police-officers-
abused-databases-stalk-lovers-journalists-enemies-ap-finds)

[https://apnews.com/699236946e3140659fff8a2362e16f43/ap-
acros...](https://apnews.com/699236946e3140659fff8a2362e16f43/ap-across-us-
police-officers-abuse-confidential-databases)

~~~
tartoran
This is a valid concern; it already happens, and nothing stops it from
happening in the future.

------
brobdingnagians
> That’s because Facebook and other social media sites prohibit people from
> scraping users’ images — Clearview is violating the sites’ terms of service.
> “A lot of people are doing it,” Mr. Ton-That shrugged. “Facebook knows.”

The last kind of person on Earth I want making an app like this is someone
who doesn't care about terms of service, morality, contracts, or upholding
the law. It seems like he just got into it for the money, and has no
compunction about unethical behaviour. "Everybody's doing it" is a clichéd,
and idiotic, response. Don't take any wooden nickels when you sell your soul...

~~~
nyolfen
i’m hoping this story prompts his company getting sued into oblivion. if i
were running twitter or fb and suddenly there’s a strong incentive not to post
personal photos i’d be rightly alarmed.

~~~
strathmeyer
Maybe they're thinking: if the things Facebook or Twitter do with your photos
don't bother you, nothing will.

------
notyourday
Wasn't there a Russian app 4-5 years ago that did something similar? Its
premise, if I recall correctly, was to trawl VK/dating app profiles based on
photos of people taken on the street.

If we are to believe the tests journalists did, it was pretty good,
considering the app authors just pulled some VK/Russian dating app photos.

Edit: Thanks for downvotes. Here's the article:
[https://www.theguardian.com/world/2016/apr/14/russian-
photog...](https://www.theguardian.com/world/2016/apr/14/russian-photographer-
yegor-tsvetkov-identifies-strangers-facial-recognition-app)

The app is called FindFace

------
morisy
More background, including access to the public records requests that led to
this investigation:

[https://www.muckrock.com/news/archives/2020/jan/18/clearview...](https://www.muckrock.com/news/archives/2020/jan/18/clearview-
ai-facial-recogniton-records/)

------
raldi
How is this not a massive copyright violation? When I upload a photo to, say,
Instagram, I know I’m granting _them_ a perpetual license to do basically
anything with my photo, but scrapers don’t inherit those rights.

~~~
bobbylarrybobby
The article does say what Ton-That did violates most sites' TOS. It's now up
to the sites to take legal action against him.

~~~
microdrum
I mean, it's a search engine right? Google scrapes Facebook at global scale.

~~~
raldi
No it doesn’t. Google honors Facebook’s robots.txt, which excludes basically
all the valuable data. It’s why they blew billions trying to get us all to
switch to Google+.
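
As a side note, whether a compliant crawler may fetch a given URL is decided by the site's robots.txt, and Python's standard library can evaluate those rules. A minimal sketch, using a made-up policy of the shape large social sites tend to serve (the rules and URLs below are illustrative, not Facebook's actual file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: named crawlers get limited access,
# every other user agent is disallowed entirely.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot may crawl anything outside /private/ under this policy.
print(rp.can_fetch("Googlebot", "https://example.com/profile/alice"))  # True
# An unnamed scraper falls under "User-agent: *" and is blocked from "/".
print(rp.can_fetch("MyScraper", "https://example.com/profile/alice"))  # False
```

Of course, robots.txt is honored only voluntarily; nothing technically stops a scraper that simply ignores it, which is the crux of the Clearview dispute.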

~~~
microdrum
That's so weird because I see text and images from fb domains in google search
results all the time. Likewise LinkedIn.

~~~
raldi
[https://webmasters.stackexchange.com/questions/28337/can-
goo...](https://webmasters.stackexchange.com/questions/28337/can-google-crawl-
facebook-pages)

~~~
microdrum
Your link supports my statement.

------
boomboomsubban
For an article seemingly about the dangers of such a product, it sounds an
awful lot like an advertisement. They mention several influential people
involved, have law enforcement agencies rave about how well it worked, include
the price point, and free trials are available!

~~~
johanam
Agreed. Anyone who stands to benefit from this company's services is a few
clicks away from a trial, and it does not seem they're particularly sensitive
to how their system is used.

------
air7
This is/was inevitable.

We all know that once technology allows a beneficial behavior you can easily
get away with, nothing can stop it. See torrents, ad blocking, reverse-
engineering, cracking, etc.

But even from a legal/moral perspective, it's not clear where the line is. The
data is publicly available, uploaded voluntarily by the people themselves. The
algorithms are freely available. People are allowed to take photos... Sure,
the end product is creepy, but where along the way did we go too far?

~~~
CarelessExpert
Gathering up personal data without user permission, and then putting it to use
in ways not originally intended by those users who provided it.

That's when we went too far.

This isn't hard.

Edit: And as a random aside, I'd be surprised if Clearview wasn't violating
copyright law, here. When a person uploads a photo to Facebook, the user
grants a license to Facebook.

So unless I'm missing something, Clearview is illegally copying and using
these works without permission...

~~~
air7
What precisely is "personal data"? Can a nosey neighbor look out their window
and note passersby that they recognize?

Can one recognize people whose face they saw somewhere (on TV, on a dating app
etc) without "permission". (e.g. "I'm pretty sure I just saw my tinder match
going into a bar with someone")?

What if someone has a very good memory for faces and a curiosity to match?
What if someone employs scouts to report when they spot certain people? What
if someone automates these processes?

Where is the line between private information and publicly available raw data
such as photons bouncing off people's faces?

~~~
CarelessExpert
At risk of violating HN's policy: it's this kind of sophistry that allows tech
employees and entrepreneurs to justify their predatory behaviour... bend over
backwards hard enough and you can find the logic to justify nearly anything.

Let's try it!

How do you define a biological weapon?

If I know I'm sick and I deliberately sneeze on people, am I a weapon?

If I pay people with an illness to sneeze on people, is that a weapon?

What if I cultivate smallpox in a lab and spread it with an aerosol sprayer
instead of using human carriers?

Where is the line on what is a biological weapon? Where along the way did we
go too far?

------
acqq
> While the company was dodging me, it was also monitoring me. At my request,
> a number of police officers had run my photo through the Clearview app. They
> soon received phone calls from company representatives asking if they were
> talking to the media — a sign that Clearview has the ability and, in this
> case, the appetite to monitor whom law enforcement is searching for.

Like in the movies.

------
Gatsky
If Clearview profits from having your photo in their database, don’t you
deserve compensation?

~~~
o-__-o
What is the value of your photo? That is the compensation you deserve. So
what’s that worth, $2? It’s the aggregate that provides value to the business

~~~
nocturnial
If two parties can't decide on the value, I would think it's only natural
there would be no transaction.

If I set the value of my photo at $2 million and company X sets it at $0.20,
am I forced to sell it at $0.20? If two people can't decide on a valuation
it's only fair there's no transaction.

~~~
o-__-o
Depends. If it is your photo, then you own the copyright and you entirely have
the right to set a price or not share it at all (don't click 'I agree' on
those TOS)

If it is not your photo, then unless you are a celebrity or otherwise a famous
figure the law is quite clear that you do not have any recourse for your photo
being used. You must use the courts to determine the valuation of your
likeness.

------
stareatgoats
We as a society are still experimenting with newfound toys like this in
lawless territory, similar to how car traffic was largely unregulated until we
realized that traffic rules are really necessary.

It is not clear how this will play out though. Is it even possible to hope for
a state that doesn't spy on its citizens? I'm not so sure anymore (thanks, all
you f-g terrorists). Maybe our struggle has to be to regulate and enforce
_how_ the spying is done, and used, and live with the fact that it can be
abused before it is corrected.

If anyone has a clear view of how such pessimism might be wrong I'll be happy
to hear it.

~~~
bamboozled
Why are governments so interested in spying on citizens ? What’s the real
motive ?

~~~
pixl97
Lots of reasons. Catching tax evasion. Stopping crime. Preventing themselves
from getting overthrown by their own citizens.

The problem is there are good and bad reasons, but even the good reasons will
be corrupted by bad people.

------
oxymoran
We have already reached the point of no return unfortunately. The only option
at this point is to open source and normalize this tech somehow so that
everybody has access to it.

~~~
microdrum
Right. I mean, cameras are everywhere. Probably, computers will look at and
parse their feeds, not just slow humans with eyeballs. Better this than
Sensetime.

------
newscracker
This company has too much power to fake and manipulate things in different
ways. It’s just a matter of time for false positives that crush the already
vulnerable people. The level of abuse people can (and will) be subjected to is
going to be horrendous.

Without regulations, laws and audits, everyone is screwed. Oh yeah, even law
enforcement is screwed if/when it blindly believes in these systems and
considers them fool proof and beyond suspicion.

 _Creepview — now, that’s my name for this company. It also makes sense that
Peter Thiel put money into it._

------
throwaway13337
As this has been and will be inevitable given the tech, it's not a question of
regulation, as so many call for. Regulation will just put this tech in a small
group of hands, tilting the balance of power, likely toward government over
citizens.

The only way to deal with this is to recognize that privacy in public spaces
was a temporary concept in society available for a limited period. Allow
everyone this information and let the cards fall in a more balanced way.
Anything else would be oppression.

~~~
bem94
> privacy was a temporary concept in society available for a limited period

Maybe, maybe not. But it's an easy and incongruous thing to say from a
throwaway account.

Even easier if your life and/or livelihood doesn't depend on a degree of
personal privacy.

------
usaar333
Lots of discussion on ethics (important!), but little in actual metrics.

What actually are the precision and recall of systems that search over faces
of the entire national population? I would have thought enough people look
similar that precision would be inherently low (even as a human it is
occasionally hard to tell people apart in photos), but the claims here (and in
similar articles the NYT has run on Chinese companies doing similar things)
imply near-perfect numbers.
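
A back-of-the-envelope base-rate sketch (the error rates below are illustrative assumptions, not Clearview's actual numbers) of why precision tends to collapse at population scale even for a matcher that looks excellent per comparison:

```python
# Base-rate sketch: even a tiny per-comparison false-match rate produces
# hundreds of false hits when a probe face is searched against a database
# of hundreds of millions of identities.
def match_precision(fmr, fnmr, db_size, target_in_db=True):
    """Expected precision of a 1:N face search.

    fmr:  false match rate per non-matching comparison
    fnmr: false non-match rate for the true identity
    db_size: number of enrolled identities
    """
    false_matches = fmr * (db_size - 1)          # expected spurious hits
    true_matches = (1 - fnmr) if target_in_db else 0.0
    total = true_matches + false_matches
    return true_matches / total if total else 0.0

# A seemingly superb matcher: one false match per million comparisons.
print(match_precision(fmr=1e-6, fnmr=0.01, db_size=1_000))        # ~0.999
print(match_precision(fmr=1e-6, fnmr=0.01, db_size=300_000_000))  # ~0.003
```

Under these assumed rates, the same matcher that is nearly always right against a thousand-person watchlist returns mostly false hits against a national-scale database, which is exactly the precision concern raised above.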

~~~
perl4ever
I remember reading that some Chinese company was using face recognition on
livestock (chickens, maybe?), which (though this might be human-centric of me)
seems far less diverse in appearance than humans.

------
analbumcover
I would hazard a guess that this will lead to more anti-mask laws since
politicians, who love appearing tough on crime, will view anyone wearing a
mask as a likely criminal. Which will probably decrease crime but also
effectively eliminate any checks on state power.

How has this guy not been targeted by organized crime groups already?

------
xupybd
Can we protect ourselves by spamming fake accounts with our image? Could that
confuse these systems?

------
lucas_membrane
This loss of privacy is just another consequence of our perpetual focus on and
devotion to competition against each other -- everything gets weaponized.
Privacy hinders the real game-changer, the weaponization of you.

------
raldi
Even if we ban it here in the US, there's nothing stopping adversarial foreign
governments from building the same databases, and probably with a lot more
computing power and technical expertise.

~~~
ardy42
> Even if we ban it here in the US, there's nothing stopping adversarial
> foreign governments from building the same databases, and probably with a
> lot more computing power and technical expertise.

You make a good point: a simple ban of this technology isn't good enough. The
ban needs to go further, to ban the kinds of things that enable the
technology, like massive databases that aggregate people's personal photos or
surveillance video.

~~~
aeternum
The bans would have to be quite extensive. You could not allow networked
security camera systems like nest/ring. No social networks could exist, no
personal video sharing. Even governmental databases of photo IDs and company
photo IDs pose a risk given this technology. Always a chance of a data leak or
bad actor.

There does not seem to be a feasible way to stop this. We may simply need to
accept that a face will be enough to positively ID anyone.

------
geggam
I have often wondered: if you form an LLC and contractually sign over the
rights to your DNA/looks/et al. to said LLC, are you more protected in today's
society?

------
ENOTTY
The social media companies might be able to buy some good press by suing this
company, its founders, its investors, and its employees into oblivion for
terms of service violations.

On the other hand, it would push copycats into developing alternative business
models to capture the demand while still hiding from civil legal action.

And there are wider societal costs from potentially chilling innovation like
this through the social media companies acting like a trust.

------
blorgons
Ton-That was a scammer, reported on by Thiel's nemesis Gawker, and an HN
poster (hi!)

[https://news.ycombinator.com/threads?id=hoan](https://news.ycombinator.com/threads?id=hoan)
[https://gawker.com/tag/hoan-ton-that](https://gawker.com/tag/hoan-ton-that)

------
Ididntdothis
I am waiting for a government that puts together all the tech we are
developing right now and creates the perfect surveillance state. Even in the
worst governments like Russia under Stalin, the Nazis or North Korea citizens
still have/had the possibility to move around and do things without anyone
knowing. That soon may be over.

~~~
IggleSniggle
Isn't this what the Chinese social credit system already is?

~~~
microdrum
China + Face++ + Sensetime are doing this. The tech is out there.

------
neonate
[http://archive.md/jQIrw](http://archive.md/jQIrw)

------
rolph
The look that Clearview has cultivated is at best clandestine and might even
be criminal.

It is telling about the attitudes of law enforcement that they skirt
prohibitions on first-party use of facial recognition by consorting with an
entity such as Clearview.

------
haskellandchill
Oh I know Hoan. At least he isn't alt-right any more. I think he has matured
and is a decent guy. It's just commodity technology, but it is going to make
him rich. _shrug_ Cool.

------
thomzi12
How did Clearview scrape so much of Facebook? Is it fundamentally hard for
Facebook, Instagram, LinkedIn, etc. to stop someone determined to suck in
photos for a tool like this?

------
macinjosh
The only thing scarier to me than a government or a corporation performing
surveillance is when they do it together.

~~~
geggam
Internet marketing companies already sell data to the govt. That horse left
the barn years ago

------
paulcnichols
The counterpart to this dystopian future is other people taking photos/video
without your consent. Just as a social norm I wish that was illegal first, and
if it already is, enforced more.

Apple and android should blur faces by default until other people explicitly
give consent to be photographed. That's the future I want to see.

------
qrbLPHiKpiux
In their TOS, section 3.2, they specifically prohibit on their own site the
very thing they're doing to others.

------
aSplash0fDerp
The article summary pretty much highlights that anybody can aggregate public
data for personal or commercial use, so Clearview AI is not the only player.

Data aggregation and transparency were supposed to be the foundation of open
government, but it looks like the citizens (by way of
consumerism/capitalism/communism) are the victims of legislated privacy
violations by law makers that want nothing to do with transparency. It seems
like a conflict of interest if business is pulling the strings of politics.

If the citizens pushed harder for a transparent government, what else besides
encryption could they legislate to turn the tables on that debate (i.e., "no
privacy for government" is a non-starter)?

------
tus88
I guarantee you this has already been invented many times over by intelligence
agencies around the world.

------
imgabe
My personal rule: Ignore any news story that uses the word "might".

Sure, something might happen. Anything _might_ happen. The news is supposed to
tell us things that _did_ happen.

~~~
rwbhn
So, we must always be reactive rather than proactive in our responses? I'm
personally quite interested in what might be coming, particularly if the
article does a good job of describing driving forces and potential obstacles.

~~~
imgabe
It's not news, it's speculation and should be noted as such. The opinion
section would be an appropriate place for it.

If some journalist wants the world to know what their crystal ball says the
world is going to be like, they should publish a book, write a blog, whatever.
Don't pretend it's journalism.

Humans are _notoriously_ bad at predicting the future.

