...and in general, I've found Bing to be more useful for searching obscure things. There's more spam in the results, but at the same time you're more likely to find what you're looking for amongst them.
But unfortunately with articles like this, it seems that might change... as much as I'm against CP and abuse in general, I'm also against censorship and the degradation of search engine results to only the most mainstream/popular topics.
People in companies that track this stuff down get special hash lists from certain law enforcement agencies, and the evidence itself is pretty much a controlled substance, so they need special authorization to safely handle and report it with a chain of custody. Two Microsoft employees got PTSD because of the sheer amount of material they had to check:
I knew a lawyer who had to defend someone in this type of case. He was allowed to view the evidence (he chose not to), but only he was. No paralegals, no one else in his office who wasn't sitting directly at the defense table.
If you're not granted specific authorization, even looking at this content is illegal, so Microsoft, Facebook, Apple, Google, etc have a very limited staff of people who are even authorized to handle this stuff.
It's a difficult problem.
Edit: to be completely clear, PhotoDNA isn't a cryptographic hash. It's a perceptual hash: similar inputs produce similar hashes, so matching is done by distance rather than exact equality, and in some respects it's probably closer to a bloom filter than to a digest.
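To illustrate the idea (a minimal sketch, not PhotoDNA's actual algorithm, which isn't public), here's a toy difference hash ("dHash") in Python with the same flavor: visually similar images produce hashes with a small Hamming distance, and a "match" is a distance threshold rather than an equality check:

    # Toy perceptual hash (dHash). NOT PhotoDNA's proprietary algorithm,
    # just an illustration of the idea: similar inputs -> similar hashes.
    from PIL import Image

    def dhash(path, size=8):
        # Shrink aggressively and drop color, so the hash reflects
        # coarse structure rather than pixel-exact content.
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Unlike a cryptographic hash, a match is "distance below threshold",
    # not equality: hamming(dhash("a.jpg"), dhash("b.jpg")) < 10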
> If you're not granted specific authorization, even looking at this content is illegal, so Microsoft, Facebook, Apple, Google, etc have a very limited staff of people who are even authorized to handle this stuff.
At the risk of offending folks, this is such an idiotic law/process. Holy fuck. This is dystopian-level letter of the law vs spirit of the law.
We don't make footage of carjackings illegal to own; why should this be any different?
At the end of the day, the harm is already done, no point causing more.
1. "Google does it better than Bing"
Google unfortunately does it better than Bing by censoring all searches. It has been impossible to turn off safesearch on Google for many years (the "filter explicit results" setting only switches between soft and hard filtering). This has a positive outcome in this situation, but also greatly degrades the quality of available results in other cases.
2. The article mentions Omegle's role very explicitly, yet it reserves all of its vitriol for Microsoft specifically. Why? Omegle and the other platforms that are actually producing, hosting, and facilitating the origination of this disgusting content should be the ones that have to get their shit together, or in some cases be prosecuted. But Microsoft is a "juicier" target.
3 "even people not seeking this kind of disgusting imagery could be led to it"
They turned off SafeSearch, as clearly seen in the article's illustrative screenshot. This is a sleazy statement meant to whip people into a frenzy.
4. "Microsoft must" "human moderators" "underfunding" "another example of tech companies refusing (...)"
The article explicitly orders people, over and over, to be outraged, despite Microsoft's prompt and appropriate response, and makes assumptions that are unconfirmed or untrue.
TechCrunch's conduct here was not great: they handled the situation in a way that suggests their only aim was attracting clicks, rather than getting the problem resolved promptly. It was tabloid behavior.
So it's bad not because they're ignoring it, but because they're just not particularly competent?
edit: to clarify again, some things are obviously worth blocking (like CP)! The issue is that once that technology has actually been deployed and demonstrated, you can be sure that different groups will start pushing for other things to be blocked as well. It changes the dialogue from "it'd be nice if you could figure out how to block X" to "hey, you're already blocking Y so why not also block X?" and that's a fairly potent change in my opinion.
...and more deviously, no one would want to be seen arguing for CP, so it makes for an effective "memory hole" to censor anything.
I have on a few occasions been searching for some highly obscure and technical subjects with Google, using search terms that have basically no connection to porn nor children whatsoever, and saw the ominous "some results have been removed from this page because they contain suspected images of child abuse" message. It really makes one wonder, especially since the word "suspected" is somewhat unsettling: was it actual CP that was removed, or something else entirely for a different reason? The fact that mere possession of CP is illegal makes for a chilling effect ("how dare you even question it --- are you a pedophile? You would not have gotten that warning if you weren't searching for 'bad things', right?"), and an extremely powerful censorship tactic.
The exact line gets blurry with various media. Kim, the female protagonist of the Broadway musical "Miss Saigon", is a 17-year-old prostitute. There's implied sex, as well as multiple stripper dances throughout the musical to hammer in the sexual issues that Kim faces.
Something like Miss Saigon is allowed and not considered CP (despite the character being below the age of 18) because of cultural significance and whatnot.
There are clear cut cases of CP of course. But when musicals / movies / various media toy with the idea of 16-year-old or 17-year-old girls who are exploring their sexuality... where do you draw the line? What should get censored?
Lolita, the novel and movie, also gets brought up a lot, as it is explicitly a story about a middle-aged man exploring sexuality and erotic elements with an underage girl. It toes the line and never becomes sexually explicit, but it's clear from the context what the novel is pushing.
No, because Miss Saigon is a fictional piece of work, as is Lolita. Committing a crime in a piece of fiction is obviously not equivalent to actual child pornography, just like showing an action movie isn't equivalent to footage of actual violent crime.
There is no blurry line here. Child abuse and child pornography are clearly defined terms. Justifying child abuse in a novel might be morally offensive, but it is not a crime; abusing an actual child, and publishing footage of it, is.
Okay, let me give an alternative situation then.
A 14-year-old girl sends a sexual text message to her (also underage) boyfriend. In Minnesota, the 14-year-old girl was charged with child pornography.
If you think morality is black and white, then you are going to cause problems for many people. There's nothing "black and white" about a case like that, even though it's nominally "distributing child porn". There are, unfortunately, a ton of gray areas.
Yes, there are clear-cut cases of CP that need to be banned. But you must ALWAYS be wary of the edge cases, lest you harm otherwise innocent people. A 14-year-old who sends naked pictures of herself to her friends is doing something... well... objectionable, but it shouldn't put a sex crime on her permanent record.
It's a fucking stupid thing for them to do, but the criminal justice system isn't the solution for that.
It might be hard for Y to prove this though.
Good news! If you are rich enough to pay and keep paying your lawyers, you can move the case to a federal venue, either immediately or on appeal. Now we have a standardized venue to actually discuss any gray areas or lack thereof.
That’ll be $2000 for this “hour” of work to afford your freedom, thanks
In many places around the world it's clearly defined to include purely fictional depictions. It seems to be rare, but there have been convictions for possessing child pornography cartoons in the US.
A relevant note is the distinction between production, distribution, and possession (whether direct or indirect) when talking about corner cases.
A 14-year-old should not be charged with producing or possessing photos of his or her own naked body; he or she could be charged with distribution in extreme cases (though the cited case should not be one of them).
How well defined are they really? People have gone to jail for possessing cartoon imagery that depicts children in a way that is deemed pornography under child pornography laws. That does not match your definition of abusing an actual child and publishing footage of it. Such imagery is entirely fictional and seems closer to a story in a novel than a physical act that took place in the real world.
If that's not CP, it should still be banned as a gross violation of the child's privacy.
Babies aren't sexual, and as far as I can tell they don't have any legal right not to have their parents take or share pictures of them. We would do well not to invent imaginary legal rights.
Posting naked photos of a baby on the internet is an awful violation of a baby's privacy, so it would not be allowed on my platform.
That depends on the jurisdiction. In the USA it isn't; in Australia, fictions in certain media (e.g. anime) can be considered CP. Even porn with adults who just look young can be considered CP.
That's terrifying. I can't think of a more subjective measure to codify into law than, "if she looks young." In the USA we already have a huge problem with subjectivity in interpretation of laws which allows our government/police to go after people they don't like (with real or imagined infractions).
Actually, now that I think about it, I believe that's what the state of Texas is doing to Cody Wilson. They're charging him with child molestation and child prostitution by saying that even though he thought she was 18, she "didn't look like it". (Not defending Cody, just noting that the USA may be no better at all in the subjective-interpretation-of-law department.)
...but not only that. Other genres like snuff come to mind. How about an AI-generated take on "A Serbian Film"?
Fully clothed photos of kids can be CP depending on the site where you find them, for example. It is "easy" to block everything that could be CP, but that would include photos of children in diapers posted on Facebook by their parents.
It is easy to classify the more horrible stuff, but as soon as you draw a line you can be sure some shithead will search for a way to blur it.
In more general terms, this is why many things should be at least a bit subjective. Clear-cut definitions are too easy to abuse.
A recent demonstration was in Tumblr's attempt to block all pornography. They provided some canonical examples of things that are explicitly allowed. It was then discovered that their automatic porn filters blocked the images they used as the specific examples of things that shouldn't be blocked.
In any case, a search engine (like Google or Bing) may decide to ban sexual content like Lolita, for being sexual + about underage minors.
As for an actual case of actually pornographic material, I have another example: https://news.ycombinator.com/item?id=18877901
I talk about an actual pornographic case in this post. A 14-year old girl sends a sexually explicit image to her boyfriend, and is therefore charged with distribution of child pornography.
In Australia, young-looking porn actresses/actors and loli hentai are illegal because they promote CP as a fetish.
There is an anime I really liked, "Miss Kobayashi's Dragon Maid", except that it has small children (7-9) who, while never depicted in sexual situations, are often given excessive sexual maturity. I found that extremely unsettling (I never finished the otherwise truly beautiful series for this reason), because it's exactly how one would try to normalize the idea that "maybe a kid would like it".
In a sense, I think we should treat CP as the counterpart of hate speech. Hate crimes : hate speech ~ child abuse : CP.
This highlights the free speech vs. censorship conflict.
If you are still reading, allow me one last opinion :-) another problem with thinking that a problem is simple is that you might miss an important connection between seemingly unconnected problems.
Look at Epstein, Weinstein, Spacey, Schneider, [other Disney/Nickelodeon people], and the ridiculous number of abusers in politics in the UK and US, as well as in the music industry.
It's a huge problem.
I am personally surprised that not a single person at Youtube has been held accountable for all the blatant shit on that cesspool of a site.
If you need links and information on Youtube, Bing 'Elsagate' as just one example.
And in case it's not clear, I think those should all be on the same side of the line. The "this is OK" side. (I'm pretty absolutist here and think the "obscenity exception" is complete bullshit)
But that's not what you think, is it? So perhaps that line isn't quite as precisely bounded as you're thinking.
We could design strongly addictive porn that would significantly increase the market for production of actual child abuse.
More precisely, even if you abolish the "obscenity exception", a lot of the same material remains illegal under "crime promulgation" (not a nice term; I couldn't think of a better one).
Ahhh, so Snoop Dogg's music should be illegal, then? Since it promotes the consumption of marijuana, a drug banned at the federal level in the US.
Or on a more serious note, should encouraging civil disobedience during the civil rights era have been illegal?
That is a stupidly broad statement, and it has consequences that I don't think you'd like.
> We could design strongly addictive porn that would significantly increase the market for production of actual child abuse.
We could also do it with some future form of ML-assisted CGI, which is also currently illegal in the States. I'd say that's a pretty close substitute good for the kind of CP that requires hurting kids.
But really, I don't think the number of people willing to commit a major felony on camera and distribute the video would go up significantly if the end product wasn't illegal to possess. I may have too high an estimation of how intelligent people are...
“Should” indicates a statement of subjective preference, not a fact.
> More precisely, even if you abolish the "obscenity exception", a lot of the same material remains illegal under "crime promulgation" (not a nice term; I couldn't think of a better one)
The Supreme Court has found that the ability of the Government to restrict “crime promulgation” is narrow, due to the First Amendment:
“the constitutional guarantees of free speech and free press do not permit a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action” Brandenburg v. Ohio (1969).
It's actually very easy - legal and illegal.
Legality isn't always the best way to draw a line.
I am not claiming that there is a simple solution for how to handle censorship that is optimal for the benefit of mankind. I am saying the way companies should approach implementing it is relatively straightforward.
Otherwise you'll have endless debates about what is and isn't moral, where nobody will ever agree; product managers, engineers, and everybody else will constantly be in tension, and nobody will be happy.
Censor as little as necessary based on laws and let people make their own decisions for what they want to view.
I don't really "get" it either.
But that's not what the issue is here. Displaying CP is literally illegal - was discussing homosexuality illegal? Was viewing nude males? It's the difference between blocking the literal illegal item or blocking items adjacent to it.
These are weird comparisons to try to make.
Discussing CP is legal. Viewing it is not. Viewing homosexual pornography has been illegal in the past. Viewing nude males wasn't illegal, just like viewing nude children isn't necessarily.
Although legal vs illegal is binary, the decision tree leading up to that is ridiculously complex.
> It's actually very easy - legal and illegal.
In what jurisdiction?
It's nice to think laws are just, and they should be, but that's not necessarily the way of things; some places are worse than others.
As for this subject specifically, it should be obvious to people which side of the line it's on.
Obviously CP should be censored... but then search engines could start censoring political content, or anything that does or doesn't support fascism or socialism, depending on their mood and political leanings. Not that censoring fascism is bad, but people should still be able to make up their own minds about things through research.
Hopefully AI will become good enough to accurately 'age' and block pictures of minors altogether, regardless of whether they're nude or not. Honestly, I think minors should be protected a bit more, although I'm not sure how that would work with minors in the public eye, like child actors.
We can draw the line at CP, for god's sake. Why is this always the go-to in threads like this?
Microsoft also shares a lot of stuff with the industry, including PhotoDNA hashes of known CP images. Just because it is not publicly discussed doesn't mean there isn't a lot of work going on, even between organizations, to work on this problem.
Do we know Google didn't also have this issue? Or did they have it, patch it, and then make the press aware? Look back at Facebook's PR groups spamming TechCrunch with "tips", hoping to seed negative articles about their adversaries. This whole thing, while a very valid problem that needs to be addressed immediately, reads exactly like some PR group dropping tips about a client's competitors to TechCrunch.
Forgive me if I'm skeptical of the motives of someone who cares more about the press finding out first that Bing is leaking potential child porn, over actually removing access to child porn.
I'm biased because I generally like Microsoft better than Google, but this whole thing raises the question: why was this directed to the media before Microsoft? Both could have been made aware. Plenty of disclosure-like articles are written with the claim "at press time, the <problem> is no longer showing" and they're no less impactful. With child pornography, of all things, why the hell is TechCrunch pushing this story so quickly that they had to issue a warning not to look up the links because you could be liable? Microsoft is going to be rightfully shamed either way; do you really need to maximize the shock value with that extra bit? At the cost of leaving active child pornography in the open. Come on.
This is not being behind, it's showing what the user wants. Bing should have banners or ads for suicide prevention, that's all. If you want to kill yourself, a suicide prevention page is not the most relevant one.
This story exists to give MSFT a black eye; they could have just told Microsoft instead of running studies, but that doesn't bring clicks to their site. A lot of the time, competitors are behind such stories. Not necessarily Google; it could be a vendor hoping that MS hires them.
That said, I’m not at all arguing that a competitor or vendor wasn’t the reason for this article.
Morally, we should all strive to minimize this kind of thing - but from a business perspective, this story will absolutely earn Google business and capital.
"porn teens" returns plenty of results, and I doubt that they are all teens who are legally old enough.
One of the two word phrases that should not have had any matches that I tested was "porn isomorphisms". As expected, it gave me a lot of porn that had nothing to do with isomorphisms, and a small amount related to isomorphisms that had nothing to do with porn. All the porn involved women.
I then tried "porn homomorphisms", expecting similar results to "porn isomorphisms", except maybe the "homo" in there would lead to more male porn. To my surprise, there isn't any porn in the results! It's almost all math stuff, or word stuff.
Bing is weird.
Because they hire people who are 18 or 19 who look younger.
While I can't dig up an article about it, there was a guy in Florida who was brought to trial on child pornography charges. Some "expert" insisted the girl couldn't possibly be 18. The woman had to fly over and testify that she was 18 at the time.
> In 2009, federal agents arrested a man in Puerto Rico on suspicion of possessing child pornography that included Fuentes. At trial, a pediatrician testified that Lupe was underage based on her appearance. Lawyers for the defense subpoenaed her to show her passport, proving that she was 19 years old at the time of filming.
I haven't actually done the search myself, for obvious reasons.
Doesn't really surprise me. I can just imagine a manager demanding that the child porn searches stop immediately; this makes the most sense as a rapid solution. The question now, though, is how well they guarded it. I'm not going to experiment; those kinds of searches could probably land me in jail here.
As for the latter point, ditto. However, I would also avoid searching that kind of stuff simply because I don't want to have such a gruesome image in my head.
Perhaps the problem is that there is an excuse: it is likely that they believe that they are neither responsible for the images they index, nor are they beholden to proactively engage with law enforcement.
This is where treading the line between conduit and publisher can bite the corporation who is attempting to do so. While Microsoft does not host the images, it does index them and serve links to them. What's more, they also host thumbnails and metadata that allow the images to be browsed and discovered.
Because they host the thumbnails and metadata, are they not a publisher? They're hardly acting as a disinterested router of opaque data when the data served is stored on their servers and indexed by their algorithms.
I think Microsoft would argue that there is no real way to prevent these images from being indexed, considering that most indexing is automated, it's hard to train their systems to recognize offending images, and their human employees can't possibly look through all the images to keep child porn out of the index.
Very different from categorizing random web content.
It looks like Microsoft wasn't doing anything at all to stop new child porn. The fact that their recommendation engine pushes you further in the direction of pedophilia is proof enough that they weren't actively trying to avoid this kind of thing.
That really depends on the legal domain one is operating in, and even between those where it is a crime, the definitions of what constitutes "child pornography" and what doesn't aren't as easily drawn as some people like to think.
And now this. Vulgar images, available directly from your start menu.
In fact, they've got so many "tricks", I'm not sure which one you're even referring to.
This isn't about vulgar images, but blatant abuse and exploitation.
I don't know what Microsoft might be guilty of here, if anything. But corporations can be indicted.
Almost every other crime takes into account the mind of the accused. Someone who forgets the dog food on the bottom of the cart and leaves isn't committing theft, as long as they go back and make it right. At the other end, we execute people who commit premeditated murder. And we treat accidental homicide due to negligence (manslaughter) as deserving limited punishment.
And worse yet, how do you tell apart the image of someone who is 17 years and 364 days old on Pornhub from an 18-year-old's picture? You get arrested for CP on the former if you're caught, and they know it's bad. Did it matter that the minor perjured themselves? Nope. Did they use a fake ID? Who cares. You're at fault.
Strict liability is absolutely in place in many states.
> If MSFT has indexed without showing any good faith effort to avoid indexing then the DOJ can and should prosecute
That's not how it works. At all. That part right there.
"We don't care the reason or mitigating circumstances you have, or even if you did it. You have the $thing , so you're at fault."
(To the downvoters: this is how strict-liability statutes work. I'm stating a fact, not making ethical determinations about it. I personally think that mens rea absolutely needs to be incorporated, at a minimum.)
For instance: https://kellerlawoffices.com/child-pornography-accidental-po...
IMO, they failed to properly screen CP, as others have failed, until they fixed it.
Wave after wave of recent subreddit bannings, driven by Reddit now believing it will be directly responsible for data it helps to serve, should make one stop and reflect on how exactly such a law would be enforced.
What's the gain? Relevance among porn users.
Perhaps we can compare this to fishing. Nobody catches dolphins on purpose, but a lot of fishermen will use wide nets knowing full well that some dolphins will be trapped in them.
There are plenty of problems with blacklist-based approaches, and I strongly disagree with some of the technical decisions made in PhotoDNA, but Microsoft should be credited for their commitment to reporting and prosecuting, and for bringing their processes to other companies who otherwise would be unable to dedicate the resources required to be effective.
This appears to be the comment you are referencing, in case anyone else is looking for it.
I’ve unfortunately seen far too much of what people don’t believe is out there. That’s what I get for helping anons “dig”. As they say “nothing is beyond our reach”.
Rather than stop real predators seeking out this garbage, they run sting operations to entrap people who are too uninformed and poor to fight back, then pat themselves on the back about fighting crime.
The government drones on and on about the dangers of "The dark web" and how we need to give up all our security (because if we don't, they can't possibly catch anyone).
Instead, we find hundreds of sites so easily accessible that a standard web crawler can locate them. It wouldn't be too difficult to track down the site and monitor it to find the owner. At that point, shut down the site and squeeze the owner. Sooner or later, one of them is going to have links to producers that can be followed.
This allows law enforcement—given the requisite time and money—to do the things you describe, without also endangering regular people.
Unless you're advocating the FBI should purposefully leave these results in place as a form of entrapment, which has a plethora of other issues.
It's a cat and mouse game and probably always will be.
Authorities have taken down several large child porn TOR sites for example.
This isn't someone hiding on TOR. This is someone on an easily traceable, publicly available site who could be tracked down and squeezed to find the source (and hopefully find and rescue the children).
Results like "kids on Omegle showing" suggest that kids are being prompted by predators to produce child pornography on social networks. There have got to be some rules about letting kids access these social networking platforms. Who could possibly think it's a good idea to let a child post their photos, videos, and profile information online, open to the public for any predator who wants to reach out to them? And what's worse, these kids are probably using these things unsupervised.
I wonder how a search company could hope to effectively combat this content, considering it's probably being produced and circulated daily. One would at least expect them to track and closely monitor keyword phrases routinely associated with child porn.
The results for "kids on Omegle showing" are most likely other users who have captured screenshots or recordings of the illegal video streams on Omegle.
Please, leave this to law enforcement. DO NOT TRY TO INVESTIGATE THIS yourself. Merely searching for and finding CP may already be a crime in your jurisdiction.
Investigating corporations for flippantly allowing child abuse material to be indexed and distributed on their platforms should never be illegal. The fact that Israeli investigators did this research, rather than Americans (Google is an American company), shows that our own law enforcement clearly isn't doing its job here.
I have a toolchain for finding child abuse content on Youtube, and I use it to report videos to Youtube for takedown. It's absolutely insane that I can be held criminally responsible for finding this content - but Youtube is immune from the consequences of hosting it.
Some of the videos I have found had millions of views.
Many that I have reported have not been taken down.
Me ... 1.
If I had to see that kind of thing as a job, I’d die.
There are a lot of revolting, but vital jobs out there. Hats off to those with the psychological disposition to handle them, even if not sustainable in the long term.
Which, consequently, is why mental health support for law enforcement is of the utmost importance. Frankly, it's a public safety issue.
Holy mother of all false dichotomies...
I never said to roll back existing restrictions. I feel that I am being projected on. Perhaps this topic is too emotionally charged for HN.
No, you absolutely did not. The real solution, which has already been arrived at, is to legally forbid pedophile felons from working with or around kids. Rolling that back is not 'pragmatic', it's fucking moronic. Feeding their fantasies by paying them to look at pornography is the worst idea I've heard in a LONG time. You are seriously creeping me out.
That doesn't stop them, especially if they have never been convicted of a crime. Giving them the option to get paid to look at said images would instead draw a lot of potential offenders into a job away from any children; plus, they won't be getting PTSD from looking at the images, unlike non-pedophile operators.
> Feeding their fantasies by paying them to look at pornography is the worst idea I've heard in a LONG time
Mind if I ask why? It's not like most of them are not already looking at CP.
You could make any effort to stop something bad seem worse than the crime itself if you follow this line of thought too far.
If Bing really wanted to do this, it’d probably be the same process as a separate company, except that the email domain of the parties working on the integration would be the same.
Calling out Bing in particular isn’t useful, but drawing attention to the problem might be — or it could turn into another repressive crusade. Hard to say.
You can't use the word "minor" at all in combination with some terms, even though it's a common last name.
I don't know what the answer is, but it seems like we should be able to do a much better job selectively returning results to "wall off" the things we want to separate and avoid accidentally letting through, rather than simplistically blocking words, which is clearly what's happening in some cases.
You don't necessarily have to identify every single image to know that certain terms should not be returning results from sites and pages that have a high probability of being porn, and certain NSFW terms should not be returning results from sites aimed at, or content known to have been made by, children.
Anecdotally it seems like all we've done is to edge closer to ruining search for legitimate situations without accomplishing much.
This is just defense in depth. If you know a query returns 90% CP before filtering, even if your filters are 99% sensitive, you're going to get some CP in the first few pages for that set of terms. So if you identify a query as CP-seeking, then you'd probably rather just show nothing at all. Of course, the definition of CP-seeking would have to be tuned, but the ratio of legitimate to CP results would have to be a component of it.
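To make that arithmetic concrete (all numbers invented for illustration):

    # Back-of-the-envelope math for the point above (made-up numbers).
    raw_results = 1000
    cp_fraction = 0.90          # 90% of raw results are CP for this query
    filter_sensitivity = 0.99   # the filter catches 99% of CP items

    cp_leaked = raw_results * cp_fraction * (1 - filter_sensitivity)  # 9
    clean = raw_results * (1 - cp_fraction)                           # 100
    shown = cp_leaked + clean                                         # 109

    print(f"{cp_leaked:.0f} of {shown:.0f} shown results are CP "
          f"({cp_leaked / shown:.0%})")  # ~8%: several per results page

So even a very good filter leaves several illegal images in the first pages of results for that query, which is why suppressing the query outright can be the right call.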
> You don't necessarily have to identify every single image to know that certain terms should not be returning results from sites and pages that have a high probability of being porn
We definitely already do that, and have been doing at least since the late 2000s.
> Anecdotally it seems like all we've done is to edge closer to ruining search for legitimate situations without accomplishing much.
Objectively, this is not true. We have accomplished a great deal in terms of CP suppression, and search is better than it ever has been for the vast majority of legitimate queries. Regardless, I'm sorry that you feel that way.
Apart from incidents, which are probably removed very quickly, there has been no child pornography on Bing in the past year. It is true that the keyword recommendations are very disturbing. They probably reflect what other people have been searching for.
Source: I'm sexually attracted to (some) children, and I sometimes use Bing to search for legal photos of (naked) children, because I know that Bing uses PhotoDNA to filter out everything illegal. I have never seen any child pornography. There's only naturism without any sexual posing, and models (women) who might look 15-17 but are actually from legit porn sites.
You may think that's disgusting or immoral, and I can understand the disgust. I have known that I'm sexually attracted to young boys since I was 15 or 16 years old, and I have decided never to act on that attraction. I have a stable relationship with another adult. However, viewing naturism photos of children is not illegal, and I don't think I harm anyone by viewing such photos.
If you want some information about pedophilia, as there are lots of myths about it, I think this is a good resource with linked sources: https://pedofieltweets.wordpress.com/2018/12/30/pedophilia-e... . The main point to take away is that most abuse isn't committed by pedophiles, and that it is very likely that most pedophiles don't abuse children.
'Tomorrow Never Dies' has really aged well...
I think this was a strategy to lure users from Google, which is very restrictive when it comes to indexing and showing sensitive content.
It's a very common phrase that's been around for a very long time (decades) used by those in the CP community, often related to selling or buying images or videos.
Actually improving noticeably requires an astonishing amount of work. Improving to the degree that users overcome their brand biases is harder still.
I don't envy Microsoft.
one) The article should redact the search terms if it's illegal to try them - we can't even verify the findings are real. Also, I predict a number of people (a small fraction of readers, but still a lot) searched the terms to verify.
two) A lot of pornographic websites will advertise "young", "girls", "teen", and maybe even "kid", but all the persons will be of legal age; some, though, do dress or look much younger.
A) Maybe the people doing the investigation can't tell / have poor judgement.
B) How would Microsoft Bing know whether the young "stars" look young, or are actually young?
My vote would be to ban the damn search keywords altogether. Let investigators find and prosecute those in the darker corners of the web, but just ban searches for keywords like those in the article.
1. Not filtering out all child pornography images.
2. Suggesting search terms relating to child pornography.
The second point is the most damning, in my opinion. Here's an example:
1. Navigate to Bing Images.
2. Turn off safe search.
3. Search Bing Images for "sex".
4. Click on the second image.
At the top of the image detail page, there will be suggested search terms. One of them, for me, is "baby sex fetish"
They seem to process each search result image and suggest other search terms based on what they think the image contains. For example, if I search for "breast" and click on a NSFW picture containing a smaller-chested model, I end up with suggestions like "Skinny Small Tit Girls Naked".
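A guess at those mechanics as toy code; the tagger, query log, and overlap scoring are all hypothetical, since Bing's actual pipeline isn't public:

    # Hypothetical sketch: auto-tag each image, then suggest past queries
    # whose terms overlap the tags. No human reviews the output.
    from collections import Counter

    image_tags = {"img1": {"breast", "nude", "woman"},
                  "img2": {"breast", "nude", "skinny"}}   # from a classifier

    query_log = ["skinny small tit girls naked", "nude woman", "math homework"]

    def suggest(clicked_image, k=2):
        tags = image_tags[clicked_image]
        # Rank past queries by how many terms they share with the tags.
        scored = Counter({q: len(tags & set(q.split())) for q in query_log})
        return [q for q, s in scored.most_common(k) if s > 0]

    print(suggest("img2"))  # e.g. ['skinny small tit girls naked', 'nude woman']

If something like this runs with no blocklist on the output side, disturbing suggestions fall straight out of other users' search behavior.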
For what it's worth, Google doesn't provide suggestions for NSFW searches at all. I think this approach goes a long way toward solving this specific issue.
One time I searched for a building called "Banana Alley", trying to find a photo of it to send to a friend. Google correctly returned a photo of the building; DuckDuckGo, however, returned some "interesting things adults do with bananas".
I believe Google has a pre-processor that works out whether you're searching for adult content before deciding whether to show adult content in the image results.
So if I search for random terms, Google will never show porn unless I include sex-related terms in the query; only then is the NSFW filter removed.
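A toy sketch of that kind of gate; the classifier, index, and terms here are invented for illustration and are not Google's actual pipeline:

    # Hypothetical two-stage gate: classify the query's intent first,
    # then decide whether NSFW results may appear at all.
    from dataclasses import dataclass

    EXPLICIT_TERMS = {"sex", "porn", "nsfw"}   # stand-in for a real classifier

    @dataclass
    class ImageResult:
        url: str
        nsfw: bool

    FAKE_INDEX = [
        ImageResult("http://example.com/banana-alley.jpg", nsfw=False),
        ImageResult("http://example.com/explicit.jpg", nsfw=True),
    ]

    def query_wants_adult(query):
        return any(t in query.lower().split() for t in EXPLICIT_TERMS)

    def image_search(query, safe_search):
        results = FAKE_INDEX
        # Ambiguous queries never surface NSFW, even with SafeSearch off.
        if safe_search or not query_wants_adult(query):
            results = [r for r in results if not r.nsfw]
        return results

    print(image_search("banana alley", safe_search=False))  # SFW result only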
What does this mean for smaller companies that make products which index the open web? Put another way, does this stuff show up on DuckDuckGo?
Porn accounts for 1% of images. Child porn accounts for 1 in a million fraction of that. So it’s 10 per 1 billion images.
If you figure the filters exclude 90% of child porn images, that’s 1 per billion which will show up in search results.
I can’t find a good estimate of total number of images, but YouTube shows 5 billion videos a day and gets 1800 minutes of video per minute.
So if we estimate a trillion photos, then we’d expect around a thousand child porn images to make it past the filters, and Bing to be able to return a few pages of 75% child porn when we accidentally stumble on a term in that category.
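For anyone checking the arithmetic, here it is spelled out (every input is the parent's guess, not a measured number):

    # The Fermi estimate above, spelled out. All inputs are guesses.
    total_images  = 1e12   # assumed size of the indexed image corpus
    porn_fraction = 0.01   # 1% of images are porn
    cp_fraction   = 1e-6   # CP as a fraction of porn
    filter_recall = 0.90   # filters catch 90% of CP

    cp_images = total_images * porn_fraction * cp_fraction  # 10,000
    leaked = cp_images * (1 - filter_recall)                # ~1,000 get past

    print(f"{cp_images:,.0f} CP images indexed, {leaked:,.0f} past the filters")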
Since PhotoDNA is basically a known-bad-list tool, it presumably can't win the battle unless turnover of new material is fairly low. Penalizing or hiding sites with many PhotoDNA hits (or perhaps a high percentage of hits) might do better by targeting concentrators, but that would depend on what sort of sites are serving this stuff. I assume they have to be fairly small and scattered to stay operational, which in turn makes it harder to predict what sort of content they have.
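A toy version of that "penalize concentrators" heuristic; the weights and caps are invented for illustration:

    # Score a site by both absolute PhotoDNA hit volume and hit rate.
    # Weights/caps are made up; a real system would tune these.
    def site_penalty(photodna_hits, total_images):
        if total_images == 0:
            return 0.0
        hit_rate = photodna_hits / total_images
        volume = min(1.0, photodna_hits / 100)
        concentration = min(1.0, hit_rate * 50)
        return 0.5 * volume + 0.5 * concentration

    # A small site with 10 hits in 40 images scores ~0.55; a huge host
    # with 10 hits in 10 million scores ~0.05, so concentrators surface.
    print(site_penalty(10, 40), site_penalty(10, 10_000_000))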
(And despite the article, it doesn't seem clear that Google has solved this problem, so much as bypassed it with a whitelist approach to nudity in general.)
Thank you to whoever actually deals with this, so I don’t have to.
So maybe that's not all bad?