Pessimal Algorithms and Simplexity Analysis [pdf] (mipmip.org)
48 points by belter 32 days ago | 11 comments



One of my favourite papers of all time.

Most of the paper is a good joke, but Slowsort takes it a step further. It is just so inefficient I can’t help but laugh out loud each time I revisit the paper and re-understand the algorithm.

It ticks all the nice boxes: it is easily proven correct, easy to implement, easy to parallelize, and stable; it wastes no time on unnecessary steps and can produce output while it sorts the rest.

At the same time, it’s Mergesort with a ridiculous merging step. But that means one can simply add a parameter to tune between Mergesort and Slowsort – if(rand(1,10) > 8) then slow else merge – so the code becomes tunable in terms of speed. Someone needs faster software? OK, I’ve got just the parameter for you :-D

If you haven’t implemented it yourself, give it a shot. It’s a transformative experience to see your computer struggle sorting an already sorted 100-entry array ;-)
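For anyone who wants that transformative experience, here is a minimal Python sketch of Slowsort as the paper describes it: recursively sort each half, move the larger of the two maxima to the end, then Slowsort everything except the last element.

```python
def slowsort(a, i, j):
    """Sort a[i..j] in place, as slowly as the multiply-and-surrender
    principle allows (roughly n^(log n) comparisons)."""
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(a, i, m)          # sort the first half
    slowsort(a, m + 1, j)      # sort the second half
    if a[m] > a[j]:            # the larger of the two maxima goes last
        a[m], a[j] = a[j], a[m]
    slowsort(a, i, j - 1)      # surrender: re-sort all but the last element
```

Even a few dozen elements are enough to watch it struggle; 100 already sorted entries will keep it busy for a very long time.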


Slowsort seems to be another name for what I know as Bogosort:

https://en.wikipedia.org/wiki/Bogosort


Slowsort is guaranteed to take a long time but Bogosort could finish in one step.


Bogosort is at best linear, since we have to check if the list is sorted.

Quantum Bogosort guarantees the best performance, since it destroys all universes where the list remains unsorted after the first shuffle.
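The best-case linear bound is easy to see in a sketch of classical Bogosort: even when the first shuffle (or the input itself) is sorted, the sortedness check still touches every element.

```python
import random

def is_sorted(a):
    # This check alone is O(n), hence Bogosort's best case is linear.
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogosort(a):
    # Shuffle until sorted; expected number of shuffles is O(n!)
    # for distinct keys, but a lucky run can finish immediately.
    while not is_sorted(a):
        random.shuffle(a)
    return a
```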


I'd like to contribute an algorithm to this field of study. I call it, "Spend several months researching a novel algorithm when brute force would have been effective for all real-world problem sizes."


This ranks next to COMEFROM (which is observably superior to the GOTO of the same era) among my favorite computer science jokes.


Didn't realize this was a joke. I was scratching my head, "There are engineering problems that intentionally aim to slow performance? Because if so all they have to do is just hire me..."


Some hash functions do this, e.g. bcrypt and scrypt. Theoretically their output is no more secure than that of faster algorithms like SHA, but the slowness makes brute-forcing lots of guesses less practical.

Hashcash (as found in Bitcoin) is another example, albeit more egregious.
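As an illustration (using Python's standard-library hashlib.scrypt; the parameter values here are arbitrary, not recommendations): scrypt exposes cost parameters whose whole purpose is to make each guess deliberately expensive, so the same password can be hashed cheaply or slowly at equal output strength.

```python
import hashlib
import os

salt = os.urandom(16)

# Same password, same salt, same 64-byte output strength -- only the
# CPU/memory cost factor n differs. The higher-n call is intentionally slower.
fast_key = hashlib.scrypt(b"hunter2", salt=salt, n=2**4, r=8, p=1)
slow_key = hashlib.scrypt(b"hunter2", salt=salt, n=2**14, r=8, p=1)
```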


The keyboard layout (inherited from the typewriter) comes to mind.


"QWERTY was designed to slow typists down" appears to be somewhat disputed claim, at least going by Google results. There are a few articles [0, 1] based on a paper [2] saying that the design was more due to feedback from telegraph operators rather than anything to do with typewriter mechanics.

On the other hand, if you subscribe to the "reduced jams" view, then the design arguably sped typing up, since typists had to deal with jams less frequently.

[0]: https://www.smithsonianmag.com/arts-culture/fact-of-fiction-...

[1]: https://www.theatlantic.com/technology/archive/2013/05/the-l...

[2]: http://kanji.zinbun.kyoto-u.ac.jp/~yasuoka/publications/PreQ...


I love multiply-and-surrender paradigms!



