Hacker News
iPhone's Adult Content Filter Blocks Anything 'Asian' (gizmodo.com)
107 points by thereare5lights on Feb 5, 2021 | 103 comments



It literally blocks "Asian restaurants near me." The fact that someone at Apple made a simple word-based filter and didn't think of the possibility that words have different meanings in different contexts is absolutely astounding to me.


You’re assuming a lot about the implementation. In fact I strongly doubt that somebody at Apple literally put “asian” as a blanket filter. It’s more likely an opaque machine learning feature gone awry.


Then it's on Apple for not validating and doing checks on the output of their ML models. Apple isn't some scrappy 5 person startup.


How would you comprehensively test for stuff like this?


Take a list of the top 1000 search terms and run them by human reviewers. Take a list of the top 1000 search terms that are flagged and run them by human reviewers. Take a list of 1000 sensitive terms (pay a few interns to come up with them) and run them by human reviewers. Then review the top false positives and a random sample of false positives.

Apple probably spent more money on the UI components for this feature than all of that would have cost.
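
Something like this would do, as a rough sketch in Python. (is_blocked() here is a hypothetical stand-in for whatever interface the real filter exposes; the CSV output is just one convenient hand-off to reviewers.)

    import csv

    def build_review_queue(is_blocked, top_terms, sensitive_terms, out_path="review.csv"):
        # is_blocked is a hypothetical callable wrapping the content filter:
        # it takes a search term and returns True when the filter blocks it.
        rows = []
        for source, terms in (("top", top_terms), ("sensitive", sensitive_terms)):
            for term in terms:
                rows.append({"source": source, "term": term, "blocked": is_blocked(term)})
        with open(out_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["source", "term", "blocked"])
            writer.writeheader()
            writer.writerows(rows)
        # Reviewers then scan the blocked "top" terms (likely false positives)
        # and the unblocked "sensitive" terms (likely false negatives).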


A company doing a quarter of a trillion in revenue might be able to figure it out.


Aren’t we supposed to be an industry that prides itself on tackling hard problems? We can’t build high-tech pattern recognition systems and then just throw our hands up when there’s a false positive.


Does "asian" sound like an obscure term that needs "comprehensive" testing to discover?

Plus, how about partial rollout?


Well, having a list of a few million valid & invalid pages as a test case would probably help.


I did my own test, and I could find one search term including the word Asian that was allowed (the search was "The History of Asian Nations," which is a book title). I think you're right; it must be more complex than a blocklist.

I tried over a hundred other terms (such as "Asian cuisine" and "college with best Asian American studies program") and none of them were allowed.


You can get out of anything if you put machine learning in between the developers and customers


Machine learning is almost magical in that way: complicated, contextual evaluations of things can be delegated to a black box, and since there's no longer a human people can get angry at for a contested decision, emotions aren't as high and the company can just shrug its shoulders at that silly little computer. It's a cheap way to abdicate moral and social responsibility.

It's one reason that AI ethics is a broader field than people worrying about the existential risk of AGI breakout.


And what's extra depressing is that they're usually not lying. For nontrivial AI, the developers can neither sufficiently answer why it does x nor why it doesn't do y.



It's worth pointing out that, post-firing, the present tense doesn't really apply to Timnit Gebru...


If the "machine learning feature" is acting as if it's a simple word-based filter, it's a simple word-based filter.


I wouldn't be surprised if they trained a model on text scraped from somewhere like pornhub and this was the result. It's only surprising they didn't accidentally catch more ethnic terms.


It seems to block "ebony" as well, which is often used to refer to black people in adult content.


Does it also block "teen" and "lesbian"?


It does in fact block "teen", but not "lesbian".


I wonder why?


Because it's 2021. In 1990 or even 2000 it would have also blocked lesbian.


I'm implying that Apple likely looked for that term specifically and unblocked it, while they didn't care to check for the term Asian. But I wonder why?


I implied the same, and also kind of gave away the reason: it's trendy for companies to pretend to care about LGBTQ issues now, and censoring the term in 2021 would be a faux pas. (Yes, Cook is himself gay, but that's not it, because he never was one for causes. Despite being ultra-powerful and rich, he only "went for the cause" for himself and Apple after it became Disney-safe to do so and every company did it, unlike the gay pioneers of the 60s to 90s who did it in really hostile environments.)


OTOH, if you search the App Store for “sax”, it will show you results for “sex”, so I’m sure that balances it out.


It's possible the wordfilter is provided by some outside service and it wasn't inspected thoroughly.


OT, but I have never heard anyone say "Asian restaurant."


Ever heard 'asian fusion restaurant'?

It's more than Chinese or Japanese cuisines. People might often want something Asian to eat, but haven't decided on the particular cuisine yet...

Heck, my delivery app has "Asian" as a food category for searching...


If you're telling your friend about your dinner last night, you probably say "Chinese restaurant" but if you're trying to discover new restaurants through search you probably use "Asian restaurant".


It's literally a category in Doordash.


> any search tangentially related to “Asian” (i.e., Asian-American, Asian food, Southeast Asian, Asian restaurants near me, etc) will return a message reading “You cannot browse this page at ‘google.com’ because it is restricted,” or, “The URL was blocked by a content filter.”

> Other racial or ethnic search terms such as “Black,” “white,” “Korean,” “Arab” or “French” don’t seem to be impacted by the filter.

How is this not racism against Asians? It's pathetic that a trillion dollar company's racist content filter would even pass through quality checks in the first place.


The author of the article bends over backwards to make an incredibly weak and muddled argument for racism, before using a term that many people, including people who are the demographic focus of the term, find racist in any context. It's an absolutely terrible article.

The filter is racist because the term being blocked is the result of an underlying bias in the data that has racist origins: the fetishization of Asian women. (That block may have a real impact on people's lives. A young child searching for 'Asian history' or a teen searching for 'Asian takeout' won't be able to find what they're looking for, which could impact Asians directly. I don't know specifically, and a real journalist would have found some real-life examples, tried to get data to measure impact, etc.)

I don't agree with the author that this is 'insulting' just because it happened, or that Apple (or anyone) necessarily needs to hand-curate the block list. (Maybe they can automate a filter list for their filter list...) I think the problem is that Apple was informed of this problem over a year ago and has neither discussed nor resolved it. That problem is, at best, only hinted at in the article.


Seems silly to suggest "racism" rather than a badly designed algorithm that nobody at Apple is paying attention to.


And how is that not racism? "I didn't care enough about Asians to make my software work for them" is "systemic racism".


Do you have proof that no Asian employees at Apple were involved in the development of the filter?


I mean, why do you think my statement has anything to do with that? It is widely accepted among the people who talk about "systemic racism" that, say, the existence of black cops doesn't mean the department isn't racist against black people.


It's called systematic bias


>How is this not racism against Asians?

That same filter also blocks the word "redhead", among others.


What conclusion could we draw? Apple doesn't consider Asians and redheads important enough to fix this.


I think a more charitable interpretation of the situation would be "Apple didn't expect the number of Adult Content Filter users to be high enough to dedicate more efforts to testing it before shipping. So they had no idea until now that there were issues like this in the first place."

This situation has nothing to do with Asians or redheads not being "important enough". It has everything to do with the adult content filter feature not being "important enough" (until this news about the bug broke, I assume).


Because there is no quality check. It's a bad implementation not being reviewed by a human - it's not racist, it's just banning stuff related to porn, and inconsistently at that.


> it's not racist, it's just

Hanlon’s razor is no excuse for discrimination. In fact it’s the most common excuse, but it is never acceptable.


No one is saying it's an excuse or acceptable, it's just an explanation for why the state of things is the way it is. People are too lazy or incompetent to fix software. Calling it racism is to cheapen the label into meaninglessness.


as an asian american, uh... I guess my feeling is that this isn't meaningfully discrimination. It's both an understandable mistake, and unacceptable only when not fixed with celerity.


I don't consider it an understandable or excusable mistake.

Not because they failed to specially protect Asians when they made their content filter, but because they made a content filter in the first place.

Because this kind of error is not only predictable and inevitable, it's not even fixable.

Even now, even if someone at Apple directs this to be fixed, the fix can only possibly be stupid, and the same error will still happen to countless other benign searches and content. Even actual humans get this wrong.

Since 2018 when they introduced the filter, it was known for blocking searches for "safe sex" or anything with sex in the string, while allowing searches to join terrorist organizations.

The fundamental problem is that words are not deeds nor intents. Words, any words, are just words and have no good or bad value. Even good people doing good things still have to use words that refer to bad things. In order to say "War is Hell", you have to say both war and hell. It's patently insane to think there is some way around such a basic thing.

If you wanted to respond to my argument and say, yeah, but what about "Japanese school girl"? You had to say "Japanese school girl", and the discussion would be impossible or a farce if both our comments kept getting blocked, or even if just the "bad" parts were.


You just explained the core problem, so definitionally at least you must agree that it is understandable. I guess we'll have to agree to disagree that it's acceptable. It seems you're coming from a core premise that Apple shouldn't block anything, ever, so yeah, you will fundamentally never find any blocking to be acceptable.

I can sympathize with that premise, as I myself don't personally like blocking. However, I think it's Apple's right to do so, and given that, I think the "racial" aspect of the blocking is excusable (if they fix it quickly).


My premise is not that I want it or don't want it. It's that it's fundamentally impossible and so it's irrelevant whether anyone likes it or not, and that this makes it non-sensical to even try.

I don't mean to say it's pointless to try to solve problems. I mean to say if there is a problem, no good solutions to it lie in that direction.

Someone non-ignorant somewhere should have said, and should continue to say every time an ignorant parent or school board asks for the non-sensical and ultimately counter-productive, that it is non-sensical and counter-productive.

Of course this wish is itself a fantasy, impossible and thus non-sensical to expect, which is why I don't actually expect it.


I'm not saying it is. If Apple spun a wheel of all English words with the commitment to ban the one it landed on, and it was 'asian', it wouldn't be racist, but as you say it wouldn't be acceptable - both in terms of how they did it and the outcome.


> Hanlon’s razor is no excuse for discrimination. In fact it’s the most common excuse

No it is not.


Anyone who thinks it's not racism, look at this:

> Shen also told the Independent that he had filed a report to Apple in December 2019 pointing out the issue, but that he never received a response.

So Apple ignored the issue for more than a year. This is the real problem. If it were "black", do you expect people would react the same and say it's just an algorithm problem or a systematic bias?


Easy there, bud, that's just a shit algo, and it's hilarious.


I could have sworn I've seen a headline like this repeatedly over the years. Maybe its sporadic reoccurrence reveals that people don't really use the Adult Content Filter because this is such an obviously nonsensical thing to have, especially in the Bay Area, where these guys have their HQ.

Like there are so many reasons you'd use the word Asian here.


It's called the Scunthorpe problem. The name comes from an early incident where AOL filters prevented users from Scunthorpe, England from creating accounts (because of "cunt" in "Scunthorpe", if that wasn't obvious).

https://en.wikipedia.org/wiki/Scunthorpe_problem

A few years before the AOL incident, I recall watching a news story about internet filters (in schools... I believe) blocking information on Superbowl 30, which is stylized with Roman numerals as Superbowl XXX. I remember this 25 years later because it was so amusing to me. Still is.

(XXX is sometimes used to denote sexual content.)

Oh, hey! Wikipedia actually provides a citation for the Superbowl XXX incident!!
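
For anyone who hasn't seen it, the failure mode is trivial to reproduce. A toy Python sketch (the blocklist entries here are purely illustrative, not anyone's actual list):

    BLOCKLIST = {"cunt", "sex", "xxx"}  # illustrative entries only

    def naive_substring_filter(text):
        # Blocks if any blocklisted string appears anywhere in the text --
        # the classic Scunthorpe failure mode.
        lowered = text.lower()
        return any(bad in lowered for bad in BLOCKLIST)

    naive_substring_filter("Scunthorpe town council")    # True: "cunt" is a substring
    naive_substring_filter("Sussex coastal walks")       # True: "sex" is a substring
    naive_substring_filter("Superbowl XXX highlights")   # True: "xxx" matches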


sort of related - I recently had an incident with a school app where children can upload and showcase work, leave messages for parents, etc.

a message was signed by a seven-year-old as "love from {child's name}.xxx" - three kisses with no whitespace after the full stop

interestingly, certain operating systems decided to treat this as a hyperlink with typical styling. remarkably, since .xxx is a valid TLD and {child's name} is a popular name, the link resolved to an active domain which forwards the user to a homonymous adult performer's Brazzers profile

on a school app!


happily, this story doesn't end with "and he's still a registered sex offender today!"


100 years ago, when I went to create my first AOL account, a typical way to construct the username was first initial plus last name, "bwhite" in my case.

It was not allowed, and not because it was taken: I then tried a lot of names and triggered denial messages for different types of denials, like already occupied, invalid characters or length, swear words, etc. They each had their own verbiage, and they were all different from the message for "bwhite". Also, I think there was some pretty easy way to test for the existence of a username in the app, like just trying to message them or something, so I knew the name was not in use.

That must have been '95 or so, and nothing has much changed.


It's close, but not quite. That problem describes terms where the issue is a substring of the term itself.

Here, the issue is due to the gross sexual fetishization of Asians in western culture. That is, it's context based, not term based.


> gross sexual fetishization of Asians in western culture

No need to pass so much judgement.


What?


No, it is the Scunthorpe problem. There's no rule that the "banned" term has to be a substring; the filter just blocks it because in a different context it would violate the content policy.

>The Scunthorpe problem is the unintentional blocking of websites, e-mails, forum posts or search results by a spam filter or search engine because their text contains a string (or substring) of letters that appear to have an obscene or otherwise unacceptable meaning.

They go on to give examples such as a profanity filter blocking the word "bone" at a paleontology conference and a user banned for saying "great tits" when referring to the bird.
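
Put differently, even a filter that matches only whole words, with no substring matching at all, fails the same way once a context-free term lands on the list. A toy sketch (the blocked tokens below are illustrative guesses, not Apple's actual list):

    BLOCKED_TOKENS = {"asian", "teen", "ebony"}  # illustrative, not Apple's actual list

    def token_filter(query):
        # Whole-word matching only, no substrings, yet still context-free:
        # "Asian restaurants near me" is treated the same as an adult search.
        return any(tok in BLOCKED_TOKENS for tok in query.lower().split())

    token_filter("Asian restaurants near me")   # True: false positive
    token_filter("history of Southeast Asia")   # False: different token slips through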


No, it isn't.

From the article

> The Scunthorpe problem is the unintentional blocking of websites, e-mails, forum posts or search results by a spam filter or search engine because their text contains a string (or substring) of letters that appear to have an obscene or otherwise unacceptable meaning. Names, abbreviations, and technical terms are most often cited as being affected by the issue.

Specifically

> because their text contains a string (or substring) of letters that appear to have an obscene or otherwise unacceptable meaning.


It's quite baffling that you can't define your own content filter on your own device.

Perhaps this will finally make some Apple users tired of being parented by Apple.



Personally I use content filters to block me from going on yahoo.com (don’t ask). As a byproduct it blocks a lot of other stuff - including this article. Too ironic.


I used to have it enabled to block certain social media websites. It is unfortunate you can't just block websites without also enabling the adult content filter, especially since the adult content filter has many false positives in my experience.


Whoever owns the adult content feature should dogfood it, as should some folks on adjacent teams.


Apple is well known as the whitest tech company in leadership roles; it’s entirely possible they did but never noticed.


The idea that Apple engineers, no matter how "white", never searched for Asian food in Cupertino, strains credibility to the breaking point.


Why would you search for 'asian' food? You'd search for japanese, chinese, korean, vietnamese...


Because there are many similarities and if you like 4 of the listed ones, searching for an Asian restaurant gives you more options to browse without searching for a specific thing.


> Because there are many similarities

This just isn't true.


What would you call all 4 featuring noodle dishes, curries over rice, ramen-like dishes, etc.? Sounds like a similarity to me.


I'd call that "you went to a restaurant offering Asian Fusion cuisine and confused it with a Korean restaurant". I've been in Korean restaurants where I would have leapt at the chance to order ramen; it's not something you can do. (I loathe and despise Korean food, but I was working for a Korean guy...)

We can also note that you seem to be including a lot of European food, like risotto, in the category of "Asian food".


You can acknowledge the similarity of risotto dishes to curry over rice and at the same time say they're more different from each other than Indian and Japanese curries are. Let's not kill all the nuance.

And sure, you won't find ramen in Korean cuisine, which is why I wrote ramen-like. You'll find noodle soups though, like https://en.m.wikipedia.org/wiki/Janchi-guksu


I was annoyed by the expression myself, wondering why one would group Japanese and Thai cuisines in the same group.

Then I noted that here in Singapore you often find "western" restaurants, which serve the stereotypical western-style cuisine.

So I guess this happens a lot more than you might think initially.


There are similarities. Use of rice noodles, peanuts, soy sauce, fermented fish sauces, etc.


This white boy from the Midwest has never searched for "Asian food" in Cupertino or anywhere else. I have, though, frequently searched for Thai, Indian, Japanese, Chinese...

(And I see that I've been...not exactly ninja'ed, I guess, if I switch back to answer email before finishing my HN reply.)


What ripdog said. There's another comment upthread suggesting that searching for "Asian restaurants near me" would be a logical thing for someone to do, and I can't for the life of me figure out why. "Asian restaurants" are a category no more useful than "restaurants".


Ok, everyone saying the same thing: have you ever... been to Cupertino?

https://www.google.com/search?q=asian+fusion+cupertino


Statistically, they're more likely to be searching for "asian" in an adult context then.


You assume that if there were more Asians working at Apple this wouldn't have happened. Do you think Asians spend all day searching for "asian" on Google? How many times did you search for your race on Google last year?


I have no idea, but way more than 0 or even 10.


It more likely has to do with Apple's secrecy.


I'd love to read the memo asking employees to report back their porn viewing habits.


[flagged]


Counter-argument:

It's important, of note, news, that the word 'Asian' is blocked (why else are you bothering to comment here?). Whether or not it's blocked for a semi-automated reason, this is indeed what discrimination looks like. It's the sexual fetishisation of a whole race of people bleeding into everyday life.

Since the useful information is in the headline, I don't need to click - that's the opposite of click-bait. So I disagree.


[flagged]


Wait, is redhead fetishization NOT a racist thing?


[flagged]


> Men need to stop being attracted to women based on their physical characteristics.

I can't tell if this is meant to be a joke or not.


Not quite a joke, but my understanding of the increasingly prevalent expectation


> Does that mean the fetishization of redheads is also bleeding into every-day life?

Yes it does. Why is this so hard for you to understand?


Please omit personal swipes. They're against the site guidelines and degrade discussion significantly.

https://news.ycombinator.com/newsguidelines.html


[flagged]


You concur that there may be a case for "Asians" being fetishized. So Gizmodo has a totally valid case for mentioning Asians in the title, by your own admission? Plus the title is factually true. What was your complaint about in the first place, regarding "race-baiting" and "clickbait"?


[flagged]


How can something so subjective as the quality and intent of writing be judged on whether it's "logically consistent" (which, even in a more appropriate context, is a uselessly low bar)? And "intelligent"? What does such a broad complement even refer to in this case?


im not sure where to begin here.. but towards achieving an appreciation for logic, you might enjoy something that intersects with "mathematics", such as godel's "Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme". i also recommend reading bertrand russel's "on denoting". im quite curious to know if, once you've digested them, you would change your comment. at the least, can you clarify if you truly meant "complement", or did you mean "compliment"?


If you understand logic, at the level which you claim, then you should know that "logically consistent" has no correspondence with "correct".

An "intelligent, logically consistent argument", as you say, can be plainly wrong or irrelevant.

The underlying claim that "Asian" and "redhead" are exactly analogous seems dubious to me and I disagree with that, although I can agree that it is also unfortunate that the word "redhead" is banned by this filter.


If blondes and brunettes have their lives affected in a similar manner where they are prevented from searching about themselves, I would make the same argument.

The point is that these terms that describe our identities should not be filtered out.


Just because you don't mean to be racist doesn't mean the outcome of your behavior cannot bear a pungent racial smell. When people train models for facial recognition, we don't assume researchers hate black people.

However, when you're making decisions for a lot of people, it would be deeply incompetent to not understand how you damaged an experience for the majority of customers. In other words, if the majority of your customers were black, then you'd notice that "black" was a banned keyword.

When you can't have everything you want, then how you prioritize says something about you.


The story seems valid. Are they supposed to include every blocked word in the headline? There's such a thing as an example.


It could still be valid and more precise with "iPhone adult content filter blocks many common words describing porn categories".


Here let me update your human model a little.

An example is better.


Why are you assuming the only reason to downvote would be that they think the headline has no clickbait qualities?


> This is classic clickbait, I'm not sure how anyone could possibly disagree.

Perhaps it is because this does not affect you or your family.

I'm Asian American. If I turn on the adult filter on my children's device, they will be unable to search for anything Asian.

Do you realize how much damage that would do if my children cannot even research things related to themselves?

This is not clickbait. Just because you are not affected by it doesn't mean it's "race-baiting garbage".


>Do you realize how much damage that would do if my children cannot even research things related to themselves?

I think a more likely reason they won't be able to research things related to themselves is because the term "teen" is also banned by that filter (assuming they are around that age, of course, which could be a completely wrong guess on my end).


> they will be unable to search for anything Asian

They will be unable to search for things with "Asian" as a term. They will be able to find lots of Asian things and things related to more specific locations / names / ideas.



