
Microsoft Bing not only shows child pornography, it suggests it - a-wu
https://techcrunch.com/2019/01/10/unsafe-search/
======
userbinator
I'm not surprised. I remember reading an article a long time ago about how
porn was easier to find with Bing than Google, because the latter was doing a
lot more censoring.

...and in general, I've found Bing to be more useful for searching obscure
things. There's more spam in the results, but at the same time you're more
likely to find what you're looking for amongst them.

But unfortunately with articles like this, it seems that might change... as
much as I'm against CP and abuse in general, I'm also against censorship and
the degradation of search engine results to only the most mainstream/popular
topics.

~~~
corebit
Microsoft invests significantly into fighting this filth. Don't let someone's
news agenda lead you into false conclusions.

~~~
djsumdog
Yea, they filter their e-mail platforms for sure, but part of it might be the
justice department binding their hands.

People in companies that track this stuff down get special hash lists from
certain law enforcement agencies, and the evidence itself is pretty much a
controlled substance, so they need special authorization to safely handle and
report it with a chain of custody. Two Microsoft employees got PTSD because of
the large amount of data they had to check:

[https://www.telegraph.co.uk/news/2017/01/12/two-former-micro...](https://www.telegraph.co.uk/news/2017/01/12/two-former-microsoft-employees-developed-ptsd-watching-child/)

I knew a lawyer who had to defend someone in this type of case. He was allowed
to view the evidence (he chose not to), but only he was. No paralegals, no one
else in his office who wasn't directly sitting at the defense table.

If you're not granted specific authorization, even looking at this content is
illegal, so Microsoft, Facebook, Apple, Google, etc have a very limited staff
of people who are even authorized to handle this stuff.

It's a difficult problem.

~~~
AaronFriel
They don't just filter their e-mail platforms and consume hash lists from law
enforcement agencies. On the contrary, Microsoft funds and freely distributes
the most common tool for checking photos (not file hash based, I think
proprietary). That's likely the project the employees you mentioned were on, a
job I wouldn't wish on anyone I know. I hope they were well supported both
while working and afterward.

[https://www.microsoft.com/en-us/photodna](https://www.microsoft.com/en-us/photodna)

~~~
vermilingua
It states immediately below the fold that it uses hash lists.

~~~
AaronFriel
I want to avoid using the term - even though they use it - because it's not a
hash function like the ones most people are familiar with from computer
science. Their "hash" function is resistant to alterations and is more like a
"fingerprint".

Edit: to be completely clear, PhotoDNA isn't a cryptographic hash. It's a hash
function that maximizes similarity of hashes based on inputs, and is probably
closer to a bloom filter in some respects.
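
A toy sketch may make the distinction concrete. This is a minimal average-hash ("aHash") style fingerprint, purely illustrative and not PhotoDNA itself, whose algorithm is proprietary:

```python
# Toy perceptual hash: similar images yield nearby hashes, unlike a
# cryptographic hash, where changing one pixel flips ~half the bits.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. an 8x8 thumbnail."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # One bit per pixel: is it brighter than the image's mean?
    return [1 if p > avg else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

img = [[10 * (r + c) for c in range(8)] for r in range(8)]          # a gradient
tweaked = [[min(255, p + 5) for p in row] for row in img]           # brightened copy
other = [[255 - 10 * (r + c) for c in range(8)] for r in range(8)]  # inverted image

print(hamming(average_hash(img), average_hash(tweaked)))  # 0: survives the edit
print(hamming(average_hash(img), average_hash(other)))    # 56: far apart
```

Because the fingerprint survives small alterations (re-encoding, brightness shifts, minor crops in real systems), a database of known-bad fingerprints can still match edited re-uploads - exactly what a plain cryptographic hash list cannot do.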

~~~
throwaway218649
These algorithms are called "fuzzy hashing." Examples include "ssdeep,"
"smapsum," and "nilsimsa."

~~~
solarkraft
Is this what I know as 'image hash'?

------
beager
Search engines, messaging services, and social networks that are all
threatened by this should forego their competitive instincts and share
information about how to combat this issue. If Google doesn’t have this issue
but Bing and DuckDuckGo do, I’d feel morally bound as Google to share best
practices to help them prevent indexing, suggesting, and retrieving this
content.

~~~
ickler9
There's a reason this was shared with TechCrunch before it was shared with
Microsoft.

Do we know Google didn't also have this issue? Or did they have it, patch it,
and then make the press aware? Look back at Facebook's PR groups spamming
TechCrunch with "tips" in the hope of seeding negative articles about their
adversaries. This whole thing, while a very valid problem that needs to be
addressed immediately, reads exactly like some PR group dropping tips on a
client's competitors to TechCrunch.

Forgive me if I'm skeptical of the motives of someone who cares more about the
press finding out _first_ that Bing is leaking potential child porn, over
actually removing access to child porn.

~~~
codezero
I don’t think this needs to be a conspiracy. Bing is far behind. I worked on a
suicide awareness project when I was at Quora, and at least at the time Google
had help links featured for suicide subjects and Bing had instructions on how
to kill yourself. Arguably Bing had the better search results but not the high
ground. This could be another one of those cases.

~~~
onetimemanytime
>> _I worked on a suicide awareness project when I was at Quora, and at least
at the time Google had help links featured for suicide subjects and Bing had
instructions on how to kill yourself. Arguably Bing had the better search
results but not the high ground._

This is not being behind, it's showing what the user wants. Bing should have
banners or ads on suicide prevention, that's all. If you want to kill
yourself, a suicide prevention page is not the most relevant one.

This story is meant to give a black eye to MSFT; they could have just told
them instead of doing studies, but that doesn't bring clicks to their site. A
lot of times competitors are behind such stories. Not necessarily Google, it
could be a vendor hoping that MS hires them.

~~~
codezero
It’s definitely being behind. Helping your users die will affect your ability
to profit from them in the future; it also costs almost nothing to show a
suicide prevention banner with a help line, and such banners are known to help
prevent suicide.

That said, I’m not at all arguing that a competitor or vendor wasn’t the
reason for this article.

------
tzs
The search given at the start of the article, "porn kids", now returns
nothing. Unlike other two-word searches, where if it doesn't find something
matching both words it looks for similar things, "porn kids" gets a flat-out
statement that there are no results, suggesting that they explicitly special-
cased it.

"porn teens" returns plenty of results, and I doubt that they are all teens
who are legally old enough.

One of the two word phrases that should not have had any matches that I tested
was "porn isomorphisms". As expected, it gave me a lot of porn that had
nothing to do with isomorphisms, and a small amount related to isomorphisms
that had nothing to do with porn. All the porn involved women.

I then tried "porn homomorphisms", expecting similar results to "porn
isomorphisms", except maybe the "homo" in there would lead to more male porn.
To my surprise, there isn't any porn in the results! It's almost all math
stuff, or word stuff.

Bing is weird.

~~~
nitwit005
> "porn teens" returns plenty of results, and I doubt that they are all teens
> who are legally old enough.

Because they hire people who are 18 or 19 who look younger.

While I can't dig up an article about it, there was a guy in Florida who was
brought to trial on child pornography charges. Some "expert" insisted the girl
couldn't possibly be 18. The woman had to fly over and testify that she was 18
at the time.

~~~
stordoff
Lupe Fuentes:

> In 2009, federal agents arrested a man in Puerto Rico on suspicion of
> possessing child pornography that included Fuentes. At trial, a pediatrician
> testified that Lupe was underage based on her appearance. Lawyers for the
> defense subpoenaed her to show her passport, proving that she was 19 years
> old at the time of filming.

[https://en.wikipedia.org/wiki/Lupe_Fuentes](https://en.wikipedia.org/wiki/Lupe_Fuentes)

~~~
nitwit005
I see I got the location wrong. Thanks.

------
dleslie
> There’s no excuse for a company like Microsoft that earned $8.8 billion in
> profit last quarter to be underfunding safety measures.

Perhaps the problem is that there is an excuse: it is likely that they believe
that they are neither responsible for the images they index, nor are they
beholden to proactively engage with law enforcement.

This is where treading the line between conduit and publisher can bite the
corporation that attempts to do so. While Microsoft does not host the
images, it does index them and serve links to them. What's more, they also
host thumbnails and metadata that allow the images to be browsed and
discovered.

Because they host the thumbnails and metadata, are they not a publisher?
They're hardly acting as a disinterested router of opaque data when the data
served is stored on their servers and indexed by their algorithms.

~~~
onetimemanytime
Do you really think Microsoft is showing them on purpose? What's the gain? I
can see the loss, and not just in PR.

IMO, they failed, but others have failed to properly screen CP too, until they
fixed it.

~~~
dleslie
I do not believe there is any purposeful or malicious intent from Microsoft.
But that doesn't mean they cannot or should not be held responsible for the
data that they host and serve.

~~~
onetimemanytime
Fine, but responsible in what way? It's not like they did it on purpose. Their
bots found the images, Bing just forgot to exclude them. Now they're working a
lot harder to exclude them.

------
deveynull
This article doesn't do justice to the work Microsoft has done in combating
child pornography; it tries to paint Bing as some intentionally neglected
backwater and Microsoft as some criminally negligent monolith. Microsoft's
PhotoDNA is far and away the most widely shared detection tool and is in place
with a ridiculous range of organizations. More importantly, Microsoft works
closely with LE and makes it easy for their partners to integrate with LE.
Google avoids the problem in the article entirely by doing a nudity blacklist
for image results, which nicely solves the problem of some reporter doing a
hit job, but avoids actually doing anything that makes finding material more
difficult.

There are plenty of problems with black-listing based approaches, and I
strongly disagree with some of the technical decisions made in PhotoDNA [1],
but Microsoft should be credited for their commitment to reporting and
prosecuting, and bringing their processes to other companies who otherwise
would be unable to dedicate the resources required to be effective.

[1]
[https://github.com/deveyNull/phistOfFury](https://github.com/deveyNull/phistOfFury)

------
zokier
What I find bizarre is that there would be so much child pornography still
left in the indexable _public_ internet. I would have imagined that pedophiles
would have been pushed underground long ago. Why hasn't law enforcement
attacked the hosters and sources of such content? Surely finding them can't be
that difficult if even Bing finds the material.

~~~
gambiting
Apparently one of the reasons for removal of all NSFW content from tumblr was
the sheer amount of CP on it - someone on reddit wrote a whole comment about
it. Basically people created lots of blogs linking to other CP content, which
was re-shared, but even if the original blog was deleted you could still see
the re-shares and find the original content.

~~~
jjhawk
[https://www.reddit.com/r/OutOfTheLoop/comments/a2r4h4/whats_...](https://www.reddit.com/r/OutOfTheLoop/comments/a2r4h4/whats_the_deal_with_tumblr_banning_all_nsfw/eb0z65x/)

Appears to be the comment you are referencing in case anyone else is looking
for it.

------
hajile
The real story is much different. Interpol, the FBI, etc have been handed a
golden tool to find child porn users. They could be prosecuting all those
sites and investigating their visitors, but instead, they turn a blind eye.

Rather than stop real predators seeking out this garbage, they run sting
operations to entrap people who are too uninformed and poor to fight back,
then pat themselves on the back about fighting crime.

~~~
withinrafael
Chasing users feels like a complete waste of time. Taking down the sources
(e.g. on the darkweb) and physically recovering the kids involved is surely a
better use of their time, right? Ideally they'd do both, but finite
resources...

~~~
hajile
Chasing users is useful because those people are at high risk of abusing
children in their lives.

The government drones on and on about the dangers of "The dark web" and how we
need to give up all our security (because if we don't, they can't possibly
catch anyone).

Instead, we find hundreds of sites so easily accessible that a standard web
crawler can locate them. It wouldn't be too difficult to track down the site
and monitor it to find the owner. At that point, shut down the site and
squeeze the owner. Sooner or later, one of them is going to have links to
producers that can be followed.

------
vivekd
>“Omegle for 12 years old” prompted Bing to suggest searching for “Kids On
Omegle Showing”,

Results called "kids on omegle showing" suggest that the kids were being
prompted by predators to produce child pornography on social networks. There
has got to be some rule about letting kids access these social networking
platforms. Who could possibly think it's a good idea to let a child post their
photos, videos, and profile information online and leave that open to the
public for any predator who wants to reach out to them? And what's worse,
these kids are probably using these things unsupervised.

I wonder how a search company could hope to really effectively combat this
content considering it's probably constantly being produced and circulated on
a daily basis. Although one should expect them to keep track of and closely
monitor keyword phrases routinely associated with child porn.

~~~
Flashtoo
Omegle is more of a chat room than a social network. Users can video chat with
random strangers. What I found odd is that Omegle video chat is moderated,
probably to curb unwanted sexual content, but there are also two other video
chat categories: adult, which allows sexual content, and unmoderated, which
sounds like it has no systems in place to remove unwanted content like child
pornography. It seems absurd to me to have an unmoderated category on top of
an adult category - what legal things would you need an unmoderated video chat
with strangers for?

The results for "kids on Omegle showing" are most likely other users who have
captured screenshots or recordings of the illegal video streams on Omegle.

------
dep_b
Wow. That must've been a pretty disgusting story to investigate. I've heard
that people at Facebook who have to moderate this kind of stuff develop PTSD-
like problems.

~~~
jayalpha
"That must've been a pretty disgusting story to investigate."

Please, this is limited to law enforcement. DO NOT TRY TO INVESTIGATE THIS
yourself. Looking for and finding CP could already be considered a crime in
your jurisdiction.

~~~
RIMR
At the same time, citizen investigations can be powerful things.

Investigating corporations for flippantly allowing child abuse materials to be
indexed and distributed on their platforms should never be illegal. The fact
that Israeli investigators did this research, rather than Americans (Google is
an American company), shows that our own law enforcement clearly isn't doing
its job here.

I have a toolchain for finding child abuse content on Youtube, and I use it to
report videos to Youtube for takedown. It's absolutely insane that I can be
held criminally responsible for finding this content - but Youtube is immune
from consequences for hosting it.

Some of the videos I have found had millions of views.

Many that I have reported have not been taken down.

------
paxy
This is really surprising considering Microsoft owns PhotoDNA which is used
all over the internet to detect child pornography.

~~~
monocasa
I've heard rumors that PhotoDNA incentivized the creation of new child
pornography that ostensibly isn't in their database, rather than the
traditional sharing of older images.

~~~
ggggtez
That's an interesting idea. I don't see how they would establish cause and
effect, so I'm inclined to not believe it. But it is interesting to consider
unexpected incentives.

~~~
monocasa
I mean, the effects (your life being ruined forever if caught) aren't exactly
amenable to the scientific process and fully establishing cause and effect.
It's totally rational for them to be extra cautious beyond what they can
establish.

~~~
ggggtez
I meant, how would you prove that more was produced as a result? How do you
know that more isn't produced simply because there are now more people in the
world in general? Or some other cause?

------
dwighttk
Duck Duck Go's suggestions for "omegle k" look like they are going to return
awful results:
[https://twitter.com/dwightknoll/status/1083440852993802243](https://twitter.com/dwightknoll/status/1083440852993802243)

~~~
Kiro
DDG is using Bing though so not surprising.

~~~
dwighttk
Bing is a source, but it isn't just a passthrough
[https://duck.co/forum/thread/4350/did-you-know-that-duckduck...](https://duck.co/forum/thread/4350/did-you-know-that-duckduckgo-bing)

~~~
mda
Bing is the biggest source; they try hard to divert attention from this fact.

------
cmarschner
The way this usually works is: a problem surfaces, a team is formed that
creates countermeasures, they measure the results, they solve it, then the
metrics become part of a dashboard. They run against a wall improving the
metrics, they move on, team funding is cut in half (or moved to another
location, like India/China, with a junior team). Meanwhile, adversaries find
new ways to counter the measures. The article mentions several components of
the stack, developed in different locations. I presume right now they have put
up a tiger team to fix it. The cycle begins again...

~~~
asdfasgasdgasdg
Some of the search terms seemed like pretty low-hanging fruit. "Porn CP"? I
thought that there was some kind of initiative to just show no results for CP-
seeking queries, and that seems like one of the ones that would be pretty
obvious to block. I wonder if we're not just seeing the result of a bug.

~~~
mrsteveman1
Google does that, some are obvious but even those that _seem_ obvious end up
causing problems.

You can't use the word "minor" _at all_ in combination with some terms, even
though it's a common last name.

I don't know what the answer is, but it seems like we should be able to do a
much better job selectively returning results to "wall off" the things we want
to separate and avoid accidentally letting through, rather than simplistically
blocking words, which is clearly what's happening in some cases.

You don't necessarily have to identify every single image to know that certain
terms should not be returning results from sites and pages that have a high
probability of being porn, and certain NSFW terms should not be returning
results from sites aimed at, or content known to have been made by, children.

Anecdotally it seems like all we've done is to edge closer to ruining search
for legitimate situations without accomplishing much.

~~~
asdfasgasdgasdg
> I don't know what the answer is, but it seems like we should be able to do a
> much better job selectively returning results to "wall off" the things we
> want to separate and avoid accidentally letting through, rather than
> simplistically blocking words, which is clearly what's happening in some
> cases.

This is just defense in depth. If you know a query returns 90% CP before
filtering, even if your filters are 99% sensitive, you're going to get some CP
in the first few pages for that set of terms. So if you identify a query as
CP-seeking, then you would probably rather just show nothing at all. Of
course, the definition of CP-seeking would have to be tuned, but the ratio of
legitimate to CP results would have to be a component of that.
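
A quick sketch of that arithmetic (the 90% and 99% figures are the hypothetical numbers above, not measurements):

```python
# Why filtering alone fails on a CP-seeking query: even a filter that
# catches 99% of bad images leaves a visible residue when the query's
# raw results are overwhelmingly bad.

def post_filter_cp_fraction(cp_rate, filter_recall):
    surviving_cp = cp_rate * (1 - filter_recall)  # bad results slipping through
    legit = 1 - cp_rate                           # legitimate results, untouched
    return surviving_cp / (surviving_cp + legit)

frac = post_filter_cp_fraction(cp_rate=0.90, filter_recall=0.99)
print(f"{frac:.1%} of post-filter results are still CP")  # 8.3%
```

At roughly 1 in 12 post-filter results, the first page is still contaminated, which is why returning nothing at all for such queries is the safer policy.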

> You don't necessarily have to identify every single image to know that
> certain terms should not be returning results from sites and pages that have
> a high probability of being porn

We definitely already do that, and have been doing at least since the late
2000s.

> Anecdotally it seems like all we've done is to edge closer to ruining search
> for legitimate situations without accomplishing much.

Objectively, this is not true. We have accomplished a great deal in terms of
CP suppression, and search is better than it ever has been for the vast
majority of legitimate queries. Regardless, I'm sorry that you feel that way.

------
lordnacho
You would think that people who have illegal content would try to hide it from
a mainstream search engine. Aren't they just sitting ducks waiting for the
police to come arrest them? I thought this kind of thing was on the dark net,
and there was no way to accidentally see it.

------
throwaway2826
The problem with this article is that it is untrue, and because the material
may not be viewed, almost nobody has any means to check that it is untrue.

Apart from incidents, which are probably removed very fast, there has not been
and is no child pornography on Bing in the past year. It is true that the
keyword recommendations are very disturbing. They probably reflect what other
people have been searching for.

Source: I'm sexually attracted to (some) children, and I sometimes use Bing to
search for legal photos of (naked) children, because I know that Bing uses
PhotoDNA to filter out everything illegal. I have never seen any child
pornography. There's only naturism without any sexual posing, and models
(women) that might look 15-17 but are actually from legit porn sites.

You may think that's disgusting or immoral, and I can understand the
disgusting part. I have known that I'm sexually attracted to young boys since
I was 15/16 years old, and I have decided to never act on that attraction. I
have a stable relationship with another adult. However, viewing naturism
photos of children is not illegal, and I don't think I harm anyone by viewing
such photos.

If you want some information about pedophilia, as there are lots of myths
about it, I think this is a good resource with linked sources:
[https://pedofieltweets.wordpress.com/2018/12/30/pedophilia-e...](https://pedofieltweets.wordpress.com/2018/12/30/pedophilia-essential-facts).
The main point to take away is that most abuse isn't committed by pedophiles,
and that it is very likely that most pedophiles don't abuse children.

------
hodgesrm
Holy cow. Online search and big social media sound like a complete dystopia at
this point. What's next?

~~~
airstrike
News corporations. Oh wait, that's already a dystopia in the present.

'Tomorrow Never Dies' has really aged well...

[https://en.wikipedia.org/wiki/Tomorrow_Never_Dies#Reflective...](https://en.wikipedia.org/wiki/Tomorrow_Never_Dies#Reflective_reviews)

------
walrus01
For the report embedded in the techcrunch article, I'm getting "This document
has been removed from scribd" when attempting to view or download it. Why
couldn't they just publish it as a PDF file on the techcrunch website?

------
jbob2000
Without even reading the article, I know exactly what they are talking about.
I completely stopped using Bing a few weeks ago after stumbling upon the
search phrase "nude beach" with safe search turned off. Do not do this.

~~~
whoisjuan
For some reason Bing is highly optimized for porn. There are a lot of memes
that satirize this.

I think this was a strategy to lure users from Google, which is very
restrictive when it comes to indexing and showing sensitive content.

~~~
fooker
Can confirm, I use Bing only for porn, Google for everything else.

------
Bhilai
Wow, Bing seems to be in a bad shape. Just a few months ago Bing was serving
malware/adware disguised as a Google Chrome ad and now this.

~~~
quantum_magpie
To be fair, a link to the real Google Chrome download is also serving
malware/adware.

------
dwighttk
Tangent: (I need to start making a note of the search terms I'm using when I
notice this so I can get confirmation I'm not just ignorant of the relation,
but) I'm often surprised by the number of suggestive images that show up in
image results even with safe search selected. I usually use duckduckgo, but
sometimes also google image search.

------
bborud
Search is one of those businesses where just delivering a service that to most
consumers seems to not change much from one year to the next actually requires
a lot of work.

Actually improving noticeably requires an astonishing amount of work.
Improving to the degree that users overcome their brand biases is harder
still.

I don't envy Microsoft.

------
ithrowthings2
Two things:

one) The article should redact search terms if it is illegal to try them - we
can't even verify the findings are real. Also, I predict a number of people
(likely a small fraction, but still a lot) searched the terms to verify.

two) A lot of pornographic websites will advertise "young", "girls", "teen",
and maybe even "kid", but all persons will be of legal age; some, though, do
dress/look much younger. A) Maybe the people doing the investigation can't
tell / have poor judgement. B) How would Microsoft Bing know whether the young
"stars" just look young, or are actually young?

My vote would be to ban the whole damn set of search keywords altogether. Let
investigators find and prosecute those in the darker corners of the web, but
just ban searches for keywords like those in the article.

------
BingHasProblems
In my mind, Bing has two separate issues here:

1\. Not filtering out all child pornography images.

2\. Suggesting search terms relating to child pornography.

The second point is the most damning, in my opinion. Here's an example:

1\. Navigate to Bing Images.

2\. Turn off safe search.

3\. Search Bing images "sex"

4\. Click on the second image.

At the top of the image detail page, there will be suggested search terms. One
of them, for me, is "baby sex fetish"

They seem to process each search result image, and suggest other search terms
based on what they think the image contains. For example, if I search for
"breast" and click on an NSFW picture containing a smaller-chested model, I
end up with suggestions like "Skinny Small Tit Girls Naked".

For what it's worth, Google doesn't provide suggestions for NSFW searches at
all. I think this approach goes a long way toward solving this specific issue.

------
nailer
Duckduckgo does this too. Rather than typing 'ne' to autocomplete this very
site, I typed 'nn' once and there was child porn on duckduckgo's first page.

~~~
lysp
I've switched from Google to Duckduckgo for the majority of searches and found
that using Duckduckgo's image search is too NSFW to be used at work.

One time I searched for a building called "Banana Alley" and was trying to
find a photo of it to send to a friend. Google correctly returned a photo of
the building, however Duckduckgo returned some "interesting things adults do
with bananas".

I believe Google has a pre-processor to work out if you're searching for adult
content before deciding whether to show adult content in the image results.

So if I search for random terms - Google will never show porn, unless I
include sex-related terms in the search query. And only then will the NSFW
filter be removed.

------
ourmandave
They also show bestiality if your search is about horses. D=

------
Wowfunhappy
Microsoft is, as the article points out, a multi-billion dollar company.
Though they clearly have not invested enough effort into preventing this,
they've almost certainly invested a great deal of effort. And it was still
ineffective.

What does this mean for smaller companies that make products which index the
open web? Put another way, does this stuff show up on Duck Duck Go?

------
snr
Is there a reason they don't run a child exploitation detection engine
(PhotoDNA) on the media they index? I'd guess they already run an ML engine to
"describe" an image while they index it, and this additional step shouldn't be
too expensive?

~~~
FakeComments
What makes you think they don't, and that what was found here isn't just the
small portion that makes it past the filters?

~~~
dwighttk
Based on the redactions in the images in the article, those searches are
returning ~75% CP. I guess that could still be a small portion of what's out
there, but that's horrifying.

~~~
FakeComments
Hm. Here’s my napkin math:

Porn accounts for 1% of images. Child porn accounts for 1 in a million
fraction of that. So it’s 10 per 1 billion images.

If you figure the filters exclude 90% of child porn images, that’s 1 per
billion which will show up in search results.

I can’t find a good estimate of total number of images, but YouTube shows 5
billion videos a day and gets 1800 minutes of video per minute.

So if we estimate a trillion photos, then we’d expect around a thousand child
porn images to make it past the filters, and Bing to be able to return a few
pages of 75% child porn when we accidentally stumble on a term in that
category.
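
Spelled out, with every input being the guess above rather than a measured figure:

```python
# Napkin math: how many CP images might slip past filters at web scale.
total_images  = 1e12   # guessed number of indexed images
porn_fraction = 0.01   # porn as 1% of images
cp_fraction   = 1e-6   # CP as one-in-a-million of porn images
filter_recall = 0.90   # filters assumed to catch 90% of CP

cp_images = total_images * porn_fraction * cp_fraction
surviving = cp_images * (1 - filter_recall)
print(f"{cp_images:,.0f} CP images indexed, {surviving:,.0f} past the filters")
```

Even under these assumptions, on the order of a thousand images survive, and a well-targeted query concentrates them onto the first pages of results.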

~~~
dwighttk
Oh yeah, didn’t think about the concentrating factor of the search term.

~~~
Bartweiss
There's a slightly bizarre situation here where search algorithms are working
against themselves, yeah. Presumably anything spiked by PhotoDNA isn't
returned at all - which frees up those spots for the next-most-relevant
result. And the more effective Bing's indexing is, well, the worse that result
will be...

Since PhotoDNA is basically a known-bad tool, it presumably can't win the
battle unless turnover is fairly low. Penalizing or hiding sites with many
PhotoDNA hits (or perhaps a high percentage of hits) might do better by
targeting concentrators, but that would depend on what sort of sites are
serving this stuff. I assume they have to be fairly small/scattered to stay
operational, which in turn makes it harder to predict what sort of content
they have.

(And despite the article, it doesn't seem clear that Google _has_ solved this
problem, so much as bypassed it with a whitelist approach to nudity in
general.)

------
vajrabum
Is there a reason not to block the engine from returning image results or
suggesting related results based on the search terms? I realize that's
probably naive, but it seems less difficult than starting by trying to
classify the images.

------
mac01021
Naive thought/question: if CP becomes easily accessible for free through
search engines, won't that cause less CP to be produced, since the producers
will be less able to make money from it?

So maybe that's not all bad?

~~~
iloveyouall
Jevons paradox observes that making something more efficient to supply usually
increases total consumption, and there might be similar effects here: the
availability might make it more socially acceptable, which in turn might
increase the demand, and therefore also the demand for original productions.

------
0b01
Seems Microsoft already has the technology: [https://www.how-old.net](https://www.how-old.net)

------
blackflame7000
Someday, hopefully soon, computer vision will be able to identify these
perverts the moment their material hits any open peer network.

------
commonsense1234
Thanks for the warning at the top. It is so tempting to reproduce the search
results.

------
theandrewbailey
Also consider how many things Bing ties into, like Windows. Oh shit.

------
throw7
techcrunch.com not only reports on child pornography, it suggests how to get
it.

------
trumped
This is probably automated keyword suggestion... what the police need to do is
go after the source of the problem (take down and prosecute the ones hosting
these files).

~~~
hjanssen
While this makes total sense, it is just one part of solving this problem.
Often, those files are hosted on foreign servers with next to no chance for
law enforcement to get access and take those servers down. A filter will
definitely help with the spread of these atrocities, and it also protects the
dignity of the victims.

Google does not show these search results so it must be entirely possible to
filter them out. Why can't MS do the same thing for Bing?

~~~
trumped
We have seen the FBI blocking content that was hosted in other countries...
why put each and every company in charge of policing? Is law enforcement too
lazy?

~~~
hjanssen
Because _Microsoft_ is sourcing and indexing these images, not the FBI or the
general public. Going from there, I think it is entirely reasonable to place
the responsibility with Microsoft.

~~~
trumped
I think it is a good thing to have a search engine that indexes everything...
it makes it easier for cops to do their job...

------
RIMR
I stopped using Bing to search for porn specifically because the
recommendations were creeping me the fuck out.

I remember searching for something like "Blowjob Cumshot", and that
recommendation bar started giving me options for "Preteen Blowjob Cumshot",
"Middle School Blowjob Cumshot", and "Little Girl Blowjob Cumshot".

All of the thumbnails for these buttons were clearly lolicon/hentai, but it
blew my mind that Bing was actually suggesting these search terms as
alternatives to mine...

This was years ago, so it blows my mind that this is still a fucking
problem...

------
Animats
Microsoft could outsource the censorship to Beyondsoft. They do much of the
Internet censorship for the government of China.[1]

[1] [https://www.nytimes.com/2019/01/02/business/china-internet-censor.html](https://www.nytimes.com/2019/01/02/business/china-internet-censor.html)

------
mooseburger
It still rubs me the wrong way that simple possession of child pornography is
illegal. It's obviously a good idea to criminalize production of child
pornography, and possibly the sale of as well, assuming that sales of child
porn incentivize production of it, but images and video do not actually harm
any children, meaning criminalizing possession does nothing, and is in fact an
example of censorious and moralistic government overreach.

Consider that it is perfectly legal to watch cartel or ISIS execution videos,
and that the acts depicted in such videos are far worse than child rape. You
disagree? Are you saying you would rather be flayed alive than be
retroactively raped as a child and live with the trauma of it? Under what
principle does it make sense that possession of gore videos is legal, and
possession of child pornography is not?

~~~
Loughla
Ethics aside.

>but images and video do not actually harm any children

Yes, they do. In the creation of that video, a child was harmed.
Criminalizing the consumption of a good is aimed at stemming the production of
that good.

~~~
openasocket
There are also some useful secondary effects of criminalizing possession of
child pornography. What little evidence we have about pedophiles suggests that
the majority of consumers of child porn have also sexually abused children
personally. And while child sexual abuse tends to go unreported and can be
difficult to prove in court (usually the child is the only witness, limited
physical evidence, etc.), possession of child porn is much easier to get a
conviction on. So not only is criminalization stemming the production of child
porn, it is helping to find and convict sexual abusers.

~~~
DanBC
> What little evidence we have about pedophiles suggests that the majority of
> consumers of child porn have also sexually abused children personally.

That's not true at all.

~~~
openasocket
I overstated the evidence. I was thinking of a study claiming 85% of felons
convicted of child porn possession also admitted, in anonymous reporting, to
committing child sexual abuse. But it seems there's a lot of controversy
around that study:
[https://www.nytimes.com/2007/07/19/us/19sex.html](https://www.nytimes.com/2007/07/19/us/19sex.html)
.

That said, my understanding is that the evidence does seem to indicate a
correlation between the two, though in the 30-40% range. See
[https://web.archive.org/web/20080111204617/http://www.ndaa.o...](https://web.archive.org/web/20080111204617/http://www.ndaa.org/publications/newsletters/child_sexual_exploitation_update_volume_1_number_3_2004.html)
for example. Not a majority, but still significant. And, of course, this
sample is just people who were convicted of possession, not all people who
ever viewed such material, so I'd have to qualify my statement to "convicted
consumers of child porn".

~~~
DanBC
I agree with this post. Lots of people viewing images of child sexual abuse
will go on to commit a contact offence (or already have committed a contact
offence).

But there are plenty of people who won't go on to commit a contact offence.

This is a problem for law enforcement because they need to monitor all people
who've viewed images of child sexual abuse in order to stop them committing
contact offences, and this second group inflates the numbers. Maybe it's only
20% more, but maybe it more than doubles the numbers.

------
nv-vn
Who was the guy who found this and why the fuck was he searching "porn kids"
and "omegle kids showing"? Like I get that it's an issue and all but this dude
was looking up really sketchy shit that you wouldn't just stumble upon by
accident.

~~~
Jach
"It's ok officer, I'm a journalist!"

~~~
MagicPropmaker
It worked for Pete Townshend.

