Source of the recent outagelet
68 points by pg on Feb 19, 2014 | 47 comments
We have a list of phrases that get replaced in titles and their replacements (which can be the empty string if we simply want to remove something). A moderator accidentally added an identical pair to it.
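For illustration only (the actual HN code is in Arc, and this is a guess at the mechanism), a minimal Python sketch of a rewrite loop that keeps applying rules until none matches. An identical pair matches on every pass, so without a cap it would never terminate:

```python
def filter_title(title, rules, max_passes=100):
    """Apply replacement rules until no rule matches (hypothetical sketch).

    An identical pair such as ("foo", "foo") matches on every pass,
    so the pass cap is what keeps this from looping forever.
    """
    for _ in range(max_passes):
        hit = False
        for old, new in rules:
            if old in title:
                title = title.replace(old, new)
                hit = True
        if not hit:
            return title
    raise RuntimeError("replacement rules did not converge")
```

With rules like [("Top 10 ", ""), ("!", "")] the loop settles in two passes; add ("foo", "foo") and any title containing "foo" trips the cap instead of hanging the server.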



> We have a list of phrases that get replaced in titles and their replacements (which can be the empty string if we simply want to remove something). A moderator accidentally added an identical pair to it.

This should prove once and for all that tail recursion is dangerous and does not help with real-world problems. Clearly, blowing up the stack would have been the appropriate and safer response here.


Yeah, that would replace an infinite loop with a stack overflow in that specific case, but I'm not sure that all infinite loop bugs can be converted to stack overflow bugs in the general case. Another commenter's suggestion to "have tests and a staging environment" makes more sense to me.


I think you mean that TCO is dangerous. Besides, this is probably more of an implementation problem. Do you really want to be applying a filter like this multiple times?

If I had to implement this, I would just take the mapping and, for each (key, value) pair, do a title.replace(key, value). It would be a "good enough" solution, and in the worst case you still have human editors on HN.
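The single-pass version described above (names are made up for the sketch) is trivially immune to an identical pair, since each rule fires at most once:

```python
def filter_title_once(title, rules):
    # One pass, each rule applied exactly once; an identical pair
    # ("foo" -> "foo") just rewrites text to itself and moves on.
    for old, new in rules:
        title = title.replace(old, new)
    return title
```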


If you have one word filter mapping "!" -> "" and another replacing "cunt" -> "asdf", and you get the input "cu!nt", what should the output be?
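A quick check of how rule order interacts with a single pass (placeholder words substituted for the parent's example):

```python
def apply_once(title, rules):
    # Single pass in rule order; the result depends on that order.
    for old, new in rules:
        title = title.replace(old, new)
    return title

# Stripping "!" first exposes the filtered word to the second rule:
assert apply_once("b!ad", [("!", ""), ("bad", "***")]) == "***"
# Reversed order lets the obfuscated word slip through one pass:
assert apply_once("b!ad", [("bad", "***"), ("!", "")]) == "bad"
```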


Interestingly, distinguishing between "non-loopy" and "potentially loopy" sets of rewrite rules seems to be equivalent to the halting problem :-( The key phrase would be "tag systems".


I'm not sure I particularly care so long as it produces some output before the heat death of the universe.


Good point. I understood this to be more of a safeguard against non-malicious mistakes than anything else.


Sorry, this is uneducated bullshit. There was an infinite loop, and that was an error in programming logic. Limiting loops to a certain number of iterations is absolute bogus. Ever thought how stupid it sounds to demand that C forbid while(1)?


I spent entirely too much time trying to parse 'outagelet'.


Maybe this term "outagelet" should be added to the list of terms to automatically replace in article titles.


I propose we replace it with "outagelet".


But carefully.


I read "outragelet" which is something else that occasionally happens around here.


Yup. It took me literally several minutes before I saw the word "outage" in there. For some reason the non-word "gelet" overwhelmed my brain, so I kept seeing out-a-gelet, kind of like port-a-potty. Then I migrated to outa-gelet -> ou-tage-let -> o-utage-let -> (I skipped ou-tagel-et because that looked like a German word sandwiched between two French words) -> outage-let --- eureka, where's my bathtub!


For some reason my brain insists on adding an extra 'l' so it reads as "outlagelet" every time.


For a hardcoded list that is manually edited frequently, maybe a duplicate-safe data structure would be preferable. It might require an extra line to parse out duplicates, but at least the page won't fail if something like this happens again.
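One way to make the list safe at load time, as suggested above (a sketch; the function name and shape are made up):

```python
def load_rules(pairs):
    # Build the rule table defensively: skip identity pairs (which can
    # never converge under repeated application) and keep only the
    # first binding for a duplicated key.
    rules = {}
    for old, new in pairs:
        if old == new:
            continue
        rules.setdefault(old, new)
    return rules
```

Rejecting the bad rule at edit time keeps it from ever reaching the rewrite loop.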


I'd assume that they've added a check for that now, but it's hard to predict beforehand what things will break.


So a tautology broke Hacker News?


Infinite loop?


Presumably a hit on any of the "before" keys triggers not just a replacement with the corresponding "after", but another pass through either a subset of the replacement list starting from the key that matched, or the entire replacement list from the top down; otherwise an identical pair wouldn't have a chance to be pathological.
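Under those restart-from-the-top semantics (guessed at here, not confirmed), an identical pair is pathological because the same rule keeps firing; a step cap makes this sketch terminate instead:

```python
def rewrite(title, rules, max_steps=1000):
    # After any hit, rescan the whole rule list from the top.
    # An identical pair ("x" -> "x") fires on every rescan, so the
    # step cap is the only thing that ends the loop.
    steps = 0
    restarted = True
    while restarted:
        restarted = False
        for old, new in rules:
            if old in title:
                title = title.replace(old, new)
                steps += 1
                if steps > max_steps:
                    raise RuntimeError("rewrite did not converge")
                restarted = True
                break
    return title
```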


I wonder whether other cycles would cause the same problem.


Probably used "while" instead of "for each" when matching.


I guess "outagelet" is a replacement for "downtime".


It wasn't straight downtime. For a while we were up but you couldn't submit stories or comments.


I've seen this called badtime. Outagelet seems too cute, when I think of downtime as serious business.

(TBF, that is when I am a paying consumer or a paid developer. I'm upset about neither the HN badtime nor the phrase outagelet.)


What about 'browntime'?


That is hilarious. That one will be hard to resist.


I'll get my team to incorporate this into their lexicon and include it in post-mortems.

Minimizes perceived downtime.


"Outagelet"? That's too cute by half.


Out of curiosity, what are some of those phrases?


Don't know about phrases, but exclamation points are automatically removed, so "!" -> "" in this case. The problem arose because someone added a rule replacing some unidentified X with X itself, resulting in an infinite loop.


Things like "Top 10 Reasons X is Y" become "Reasons X is Y", I believe.


The issue: code 403 (Forbidden).

I think you should do your best to always return an appropriate HTTP status code, and in this case that is 503 (Service Temporarily Unavailable). You really want to reserve 403s for those pesky w00t w00ts :), ' aNd 1=1, and the like.

Getting 403s with my personal account, but gaining (slow) access through another browser, I was convinced the issue had to do with my account being blocked.


In this case using nginx allow and deny statements was much easier than doing bit fiddling in Lua and returning 503s. Any time spent researching other options would have been time not spent determining the problem.

Often getting to the root of the problem as quickly as possible while simultaneously keeping the site up (practical) wins over always using correct HTTP response codes (pedantic).


Serving 403s to w00tw00ts is for casuals. Face those leet haxors!


That's why you have tests and a staging environment.


This is HackerNews. It's not called PerfectSoftwareDevelopmentPracticesInActionNews.

Just saying.


Based on the nitpicking, I thought it was!


I was able to keep a website doing 10k requests/min up and running when I was 15. I feel sorry for whoever's taking care of this website.


Right ... websites with tests and a staging environment never experience downtime.


Hmm, that sounds like nonsense to me. Maybe it's because "less downtime" is better than "more downtime"?


You just need enough cores so the infinite loop completes in a finite amount of time.


A cute wordlet.


A cousin of the Googlewhack.


These things happen. :)


Why none of us should quit our day jobs...


But that's exactly what pg/YC wants you to do!



