
We Give a F*** How the Site Loads - namenotrequired
http://dt.deviantart.com/journal/We-Give-a-F-How-the-Site-Loads-392679726
======
cowpewter
Here at Grooveshark, we ran into a bug once where users in some countries
couldn't perform searches. It took a while to figure out, but eventually we
discovered that it was a problem with the name of the API call the site makes
to get search results.

Originally the call was named 'getSearchResults'. Over time, we made some non-
backwards-compatible changes to the search call, and following our usual
naming standards, named the new call 'getSearchResultsEx'.

Additionally, for ease of debugging, when we hit the API endpoint, we add the
method name as a GET param. Makes it much easier when scrolling through
Firebug's network tab to find the API request you need to examine, since they
all hit the same php script.

It turns out that in some countries, you can't visit a url that contains the
word 'sex'...and the filter is case insensitive.
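The failure mode is easy to reproduce: a case-insensitive substring filter matches "sex" inside "getSearchResultsEx" once the method name lands in the URL. A minimal sketch in Python (the domain, script name, and query parameter here are made up for illustration, not Grooveshark's actual API):

```python
# Hypothetical sketch of a case-insensitive profanity filter applied to URLs.
# The endpoint and parameter names are assumptions for illustration only.
def is_blocked(url: str, banned: tuple = ("sex",)) -> bool:
    """Return True if any banned word appears anywhere in the URL, ignoring case."""
    lowered = url.lower()
    return any(word in lowered for word in banned)

old_call = "http://example.com/api.php?method=getSearchResults"
new_call = "http://example.com/api.php?method=getSearchResultsEx"

print(is_blocked(old_call))  # False
print(is_blocked(new_call))  # True: 'getsearchresultsex' contains 'sex'
```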

~~~
quantumpotato_
Which fucking countries?

~~~
cowpewter
It was a while ago. I want to say the main one was Egypt, but one of our
community reps would probably remember better than I.

~~~
quantumpotato_
Can you ask? I'm curious.

------
IvyMike
> But we can promise if you're browsing deviantART at the public library, our
> swearing won't stop you from using the site.

If your public library censors profane _words_ , you should fucking riot.

------
DanBC
> That's right. The almighty F-word was breaking how some stylesheets were
> loading for deviants who were accessing the site from computers with overly
> sensitive system-wide profanity filters installed. These users' browsers
> likely stopped parsing the stylesheet entirely upon reaching the word in the
> stylesheet, leading to a fairly ugly and/or broken page.

Wait what? What filters are checking CSS files, and why?

~~~
dpeck
Probably an overzealous filter that looks at everything going over HTTP and
stops passing the connection whenever it encounters a naughty word.

Unsurprisingly, these types of filters don't have much understanding of what
will be displayed to the user versus what won't, but then again they probably
don't need to in order to satisfy their target audience.

~~~
gkop
There are several ways the user might be exposed to the contents of the CSS
file, but the most straightforward is probably the content property -
[http://www.w3schools.com/cssref/pr_gen_content.asp](http://www.w3schools.com/cssref/pr_gen_content.asp).

------
jongraehl
Another reason to use https (though I guess IT can install a cert for
transparent MITM SSL proxying). Imagine a user whose TCP connection shuts down
as soon as the 4 bytes 'fuck' pass through it (say, only on port 80). This
should also happen randomly for high-entropy binary data (e.g. compressed text
or images) - about once in every 4 billion bytes. I guess it's not that likely
except for extremely large files.
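The "1 in 4 billion" figure checks out: with uniformly random bytes, a specific 4-byte sequence matches at any given offset with probability (1/256)^4 = 1/2^32. A quick sanity check in Python:

```python
# Probability that 4 uniformly random bytes spell a specific 4-byte word
# (e.g. b'fuck') at a given offset: one chance in 256**4 = 2**32.
p_match = (1 / 256) ** 4
expected_bytes_per_hit = 256 ** 4  # roughly one expected match per ~4.3 GB

print(p_match)                 # 2.3283064365386963e-10
print(expected_bytes_per_hit)  # 4294967296
```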

~~~
ars
Not can. Will.

And home based filters do the same, or work at the browser level.

So https doesn't help at all.

~~~
ufmace
I always wondered how many IT departments and filters actually bother to do
this...

~~~
voltagex_
Search for SSL on this page: [http://www.dsd.gov.au/infosec/top-mitigations/top35mitigatio...](http://www.dsd.gov.au/infosec/top-mitigations/top35mitigation-details.htm)

I imagine DOD requirements are the same.

------
tkiley
In May of this year, my development team went on a 1-month work-cation in
Panama.

The internet was fast enough, but we quickly discovered that we couldn't
access google analytics (google.com/ANALytics) without a VPN or socks proxy
back in the USA.

I suspect text-based web content filters tend to have fairly high false-
positive rates in general.

------
Joyfield
I learned a LONG time ago to never write profanities in any part of my work.
My boss told me a story where some "strong" language showed up on a projected
demo of one of his projects, and it stuck with me.

~~~
skeletonjelly
I coded in an alert with "WTF?!" as the text in an unlikely logic path once. I
like having something unique so I know where it came from. Came up in a demo
in front of the (govt) client. Luckily they had a chuckle at my expense, but
it could have been so much worse. I've gone back to either contextual errors
or just "foo" for the lazy ones.

~~~
lamontcg
I built out some webservers with Apache on them for a dev team once. I just
set them up with an index.html that had "FOO" in it, just to prove that you
could point a browser at them and get "FOO" back. Well, a week later they
configured them behind a load balancer VIP, so they wound up on a public IP
address. That was fine, but now you could reach them from the real world. The
mistake happened because that IP address had been used as part of a pool for
the public website and it was still configured on the networking gear doing
dynamic DNS for the front end website. As soon as the IP came back up again,
the DNS appliance decided that cluster was 'back in service' and started
handing out its IP address in response to www.<a large internet retailer>.com.
This resulted in a sev1 ticket coming back titled "customers reporting 'foo'
on the website". Imagine how much worse it could have been if I hadn't kept it
professional and used some more controversial text there...

------
joebeetee
This is funny. I've had to 'strongly encourage' our devs and designers to keep
it clean/professional when designing mockups, test apps, etc. It's so easy to
send a file around internally, give it a quick look over and shoot it off to
the client without realising there is some dodgy paragraph copy or an awkward
photo staring back at you. The humour isn't worth it!

Although I still find myself typing console.log("WHY THE FFFUUU") on annoying
JS bugs...

~~~
cookiecaper
I implemented a policy to only use the most bland test data while testing my
apps, and it's saved me a few times when, for whatever reason, there was an
accidental leakage of test data through to the client (whether it sneaks into
a slideshow, or a test email is accidentally sent to a production email
address, or...). I've been around development long enough to know that at some
point, this type of leakage WILL occur, and it will be embarrassing when a
customer sees a dummy entry for "Dickhead Buttface" or similar.

While it's excruciating and boring to use plain test data, you'll be glad you
did it when you have an accidental leak between environments.

------
crazygringo
On every serious project I've ever worked on, all comments get removed from
all exposed front-end code (CSS, JavaScript, HTML) via
LESS/minification/etc/etc.

I mean, there's no reason to expose ugly stuff like that... developers WILL
swear in comments, so just make sure it doesn't get out the front door...

------
ryansan
The video in this post is worth the price of admission alone. I love how ASP
is the least "badass" of all...

------
shuzchen
First thought: HTTPS sitewide would probably solve this problem. I'm pretty
sure most filtering happens on the networking level, not within the browser.
And if this browser on a public computer is augmented to read your https
traffic, don't use it.

------
ahoge
One of our clients once complained that one of _their_ web fonts wasn't
loading.

The file in question was blocked by their stupid proxy because its name was
"_brand_ SansExtended", which happens to contain the substring "sex".

------
leeoniya
all the more reason to compress your css

~~~
Groxx
probably won't save you if you set a class-name of 'ass'

~~~
saraid216
Clearly the correct answer is encryption. /s

