
Bogo-bogosort - luu
http://www.dangermouse.net/esoteric/bogobogosort.html
======
eridius
This seems completely pointless. The elegance of bogosort is that it's an
extremely simple algorithm, with a simple description of "randomize until it's
sorted". Bogobogosort is complicated for no apparent reason. It's trying to be
cute and clever, but there's no rationale for why additional complexity is
being added.

Bogobogosort seems, in the end, no more worthwhile than sleepybogosort,
where you must sleep() between every operation.
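A minimal sketch of that variant, assuming the naive reading (plain bogosort with a sleep() inserted between operations; the function name and the delay parameter are my own invention):

```python
import random
import time

def sleepy_bogosort(a, delay=0.001):
    # classic bogosort, except we sleep() between every operation
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        time.sleep(delay)
        random.shuffle(a)
    return a
```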

~~~
deletes
No: bogosort with sleeps is still O(n!), while the OP's algorithm has a
higher bound.

~~~
eridius
Depends on how you do the sleeps.

------
mikeash
This got me thinking about a more generalized approach to this sort of thing.

Given a task where you can check completion _somehow_, you can then solve it
using the following procedure:

1. Generate a random program. (This could be a Turing machine, bytecode, C
source, whatever.)

2. Execute the program on the input for n steps, where n is an incrementing
counter.

3. Execute the given check on the output. If the check passes, return the
output.

4. Increment n and go back to 1.

The restriction on n steps avoids running afoul of the Halting Problem.

The check has to be a little more rigorous than the one used in Bogosort. It
can't just check to see if the output is in order, because the output may
contain completely different numbers, since the program could be doing
anything. You'd have to check not only for order but also that the output is
in fact a permutation of the input.
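A toy version of the procedure. This is only a sketch: arbitrary random programs would need sandboxing, so the "random program" here is a stand-in (a random sequence of swap instructions), but steps 1-4 and the stricter permutation check are as described above:

```python
import random
from collections import Counter

def check(inp, out):
    # the stricter check: the output must be sorted AND an actual
    # permutation of the input, since a random program can emit anything
    return (Counter(out) == Counter(inp)
            and all(out[i] <= out[i + 1] for i in range(len(out) - 1)))

def universal_sort(a):
    if check(a, list(a)):          # already sorted (also covers len < 2)
        return list(a)
    n = 1
    while True:
        # 1. "generate a random program" -- stand-in: a random
        #    sequence of n swap instructions
        prog = [(random.randrange(len(a)), random.randrange(len(a)))
                for _ in range(n)]
        # 2. execute it on the input for n steps
        out = list(a)
        for i, j in prog:
            out[i], out[j] = out[j], out[i]
        # 3. run the check; if it passes, return the output
        if check(a, out):
            return out
        # 4. increment n and go back to 1
        n += 1
```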

I wonder what the running time of this algorithm is. I wonder if it can even
be calculated.

~~~
fragsworth
We both came up with nearly the same idea at nearly the same time (within one
minute) - what are the odds of that?

My solution is slightly different - to avoid the Halting Problem, you can use
a distributed approach. You have a very large (but finite) number of
processors that you pass your programs to.

~~~
mikeash
"what are the odds of that?"

I'm not sure, maybe we should build an unbelievably inefficient program to
calculate them.

~~~
elwell
But is it really inefficient if it is efficient at being inefficient?

------
Zombieball
My favourite is still Intelligent Design Sort:

http://www.dangermouse.net/esoteric/intelligentdesignsort.html

O(1) run time!

~~~
avaku
Genius :)

------
cardamomo
My favorite line: "as anyone who knows anything at all about computer science
knows, recursion is always good and cool."

------
jerf
I snort in fake outrage that nobody has seen fit to reference the definitive
work on this topic:
http://ivanych.net/doc/PessimalAlgorithmsAndSimplexityAnalysis.pdf
(Check out section 5 in particular for today's topic.)

------
fragsworth
Here is a "distributed" sorting algorithm that requires even less thinking -
you don't even have to write a shuffling algorithm!

1) Start with an empty string.

2) Increment the string in such a way that you will _eventually_ generate all
strings.

3) Give the string (and your unsorted array) to a processor that attempts to
run the string as a program and go back to step 2.

When a processor is given a program, if the output contains all the elements
of the array and the output is sorted, stop everything! You've sorted the
array.

It requires O(n^m) processors, where n is the number of characters possible in
a program, and m is the string length of the code that finds the solution.
Lots of the processors will be stuck in infinite loops.
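Step 2 ("increment the string so that every string is eventually generated") is just counting in base n over the program alphabet, shortest strings first. A sketch of that part only; dispatching each string to its own processor, and the sorted-output check, are left out here:

```python
from itertools import count, product

def all_strings(alphabet):
    # yield every finite string over `alphabet`, shortest first,
    # so any particular program is reached after finitely many steps
    for length in count(0):
        for chars in product(alphabet, repeat=length):
            yield ''.join(chars)
```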

------
ChristianMarks
I have a soft spot in my head for _turdsort_, which attempts to optimize over
_bogosort_, but ends up being no better.

    
    
      #!/usr/bin/env python3
      from random import shuffle
    
      def turdsort(a):
          def turds(a):
              # count adjacent out-of-order pairs ("turds")
              n = 0
              for i in range(len(a) - 1):
                  if a[i] > a[i + 1]:
                      n += 1
              return n
    
          count = 1
          n = len(a)
          t = turds(a)
          while n > 0:
              # shuffle until the turd count strictly decreases
              while t >= n:
                  shuffle(a)
                  t = turds(a)
                  count += 1
              n = t
          return count
    
      if __name__ == '__main__':
          a = list(range(10))
          print(a)
          print(turdsort(a))
          shuffle(a)
          print(a)
          print(turdsort(a))

~~~
ChristianMarks
There is also the slightly less pointless _ordersort_, which works only on
permutations. The complexity of _ordersort_ is given by the Landau function.
It is more efficient than bogosort, however.

    
    
      #!/usr/bin/env python3
    
      def gcd(a, b):
          while b:
              a, b = b, a % b
          return a
    
      def order(a):
          """compute the order of the permutation a"""
          lcm = 1
          for i in range(len(a)):
              j = a[i]
              while j > i:
                  j = a[j]
              if j == i:          # i is a cycle leader
                  j = a[j]        # get next element of the cycle
                  cyc = 1         # the cycle has length at least 1
                  while j != i:   # the cycle hasn't closed
                      cyc += 1
                      j = a[j]
                  lcm = (lcm // gcd(lcm, cyc)) * cyc
          return lcm
    
      def ordersort(a):
          k = order(a)
          print(k)
          b = list(range(len(a)))
          while k > 0:
              # apply the permutation once; after order(a) applications
              # the identity (i.e. the sorted array) comes back around
              b = [a[i] for i in b]
              k -= 1
          return b
    
      if __name__ == '__main__':
          from random import shuffle
          a = list(range(50))
          shuffle(a)
          print(a, ordersort(a))

------
deletes
My addition to the algorithm: every time you check whether the array is sorted
and it isn't, start over from the very beginning, throwing away all progress
made so far.

I doubt that even n == 6 would finish in any normal time period this way.

~~~
zardeh
Bogosort does that already.

I ran it on the array [0,1,2,3,4,5,6] and it took between 300 million and 1
billion operations (by my marginally accurate counter) and between 6 and 20
minutes, but mine was in Python, not C.
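For reference, a minimal bogosort with a shuffle counter, a sketch of the kind of thing being measured here (counting shuffles rather than individual operations, so the numbers will differ from the parent's):

```python
import random

def bogosort(a):
    # shuffle until sorted, counting shuffles; each shuffle throws
    # away all previous "progress", as noted above
    shuffles = 0
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        random.shuffle(a)
        shuffles += 1
    return a, shuffles
```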

~~~
deletes
Welcome to HN.

I was referring to the OP's algorithm, which was bogobogosort.

At first I thought my change would not increase the complexity, but I think it
does depend on the size of the input: for every recursion there is a chance to
fail and start completely over, and the number of recursions depends on the
input size. Therefore it adds to the complexity.

------
duncanwest
You say it will take on average n! attempts to find a sorted list randomly.
This is false. It will take on average n!/2 attempts.

~~~
deletes
O(n!/2) == O(n!); the whole paragraph is obviously referring to big-O
notation.

edit:( I have edited the post to respond to your comment, trying to clarify
what OP meant; I in no way tried to make your comment "look silly" )

~~~
duncanwest
Yes, but that's not what he says. He says "The loop will repeat on average n!
times". That's not true. He's not referring to O notation there, he brings
that in later. Here he's talking about the imperative number of times a loop
will run on average.

Edit: you've edited your paragraph to make my reply look silly. The whole
paragraph is not in O notation. In fact - the exact opposite. In the last
sentence of that paragraph he shows what the expression with constant factors
looks like before converting it into O notation, and it's wrong! "The product
(n-1)n! is O(n × n!)." Bzzzzzzt! Wrong! He should say "The product (n-1)(n!/2)
is O(n × n!)."
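One way to pin down what "on average" means is empirically. A sketch, under one explicit assumption: each attempt is an independent uniform shuffle, i.e. permutations are sampled with replacement:

```python
import random
from statistics import mean

def attempts_until_sorted(a):
    # count shuffles until the list happens to come out sorted
    b = list(a)
    tries = 0
    while True:
        random.shuffle(b)
        tries += 1
        if all(b[i] <= b[i + 1] for i in range(len(b) - 1)):
            return tries

random.seed(42)
avg = mean(attempts_until_sorted([3, 1, 4, 2]) for _ in range(10000))
print(avg)  # under this model the expectation is 1/p = n! (24 for n = 4)
```

Under the with-replacement model the count is geometric with success probability 1/n!, so the expectation is n!; the n!/2 figure (more precisely (n!+1)/2) is what you get if you instead step through all n! permutations in random order without repeats.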

------
elwell
Isn't this how evolution works?

~~~
sesqu
No.

Evolution doesn't involve resets - the hint is in the name.

