

Is the search space of an optimization problem really that big? - brudgers
http://www.optaplanner.org/blog/2014/03/27/IsTheSearchSpaceOfAnOptimizationProblemReallyThatBig.html

======
raverbashing
Yes, yes it is

Heuristics may help, but in the end you either go through "all" of it or use
approximate methods that give you a good solution but not necessarily the
best one (genetic algorithms, simulated annealing, swarm optimization,
etc.)
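
As a concrete sketch of one of those approximate methods, here is a minimal simulated annealing loop; the objective function, step size, and linear cooling schedule below are arbitrary choices for illustration, not from the thread:

```python
import math
import random

def simulated_annealing(f, x0, steps=10000, temp0=1.0, seed=0):
    """Minimize f starting from x0. Worse moves are accepted with
    probability exp(-delta / T), where T cools toward zero -- so the
    result is good, but not guaranteed optimal, exactly the trade-off
    described above."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for step in range(1, steps + 1):
        T = temp0 * (1 - step / (steps + 1))  # linear cooling (one simple choice)
        cand = x + rng.uniform(-0.5, 0.5)     # small random move
        fc = f(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# A bumpy 1-D objective: x^2 pulls toward 0, sin(5x) adds local minima.
best, val = simulated_annealing(lambda x: x * x + math.sin(5 * x), x0=10.0)
```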

~~~
taeric
This is not the case, though. It is possible to eliminate vast portions of the
full search space for some problems.

I will not claim a full understanding, but the latest volume of The Art of
Computer Programming treats this in real depth. The main takeaway is that
many problems can be solved exactly, without resorting to approximation.
(Though, yes, for large N you use both approximations and clever algorithms
that let you deal with as large an N as possible.)

~~~
raverbashing
Yes, but a portion of a vast space is still a lot.

Of course, you can combine both techniques: prune as much of the space as you
can heuristically, then apply other algorithms to what remains.

~~~
taeric
Agreed on it still being vast.

I think I am just taking (unreasonable) exception to the term heuristic
here. Yes, naive brute force would have you look at about 10^7 positions for
the classic 8-queens problem (8^8 placements with one queen per column).
However, there is no need to do this: dancing links drops that number to just
over 10^3, without sacrificing correctness/exactness.
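
As a concrete illustration of that point (plain backtracking here, not dancing links), pruning every branch where two queens already attack each other is exact, not heuristic, and already cuts the 8-queens search by roughly three orders of magnitude:

```python
def count_queens(n):
    """Count N-queens solutions by exact backtracking: one queen per
    column, pruning any branch with an attack. No solutions are lost."""
    solutions = 0
    nodes = 0  # candidate placements examined

    def safe(cols, col):
        # cols[r] is the column of the queen in row r; test row len(cols).
        row = len(cols)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(cols):
        nonlocal solutions, nodes
        if len(cols) == n:
            solutions += 1
            return
        for col in range(n):
            nodes += 1
            if safe(cols, col):
                place(cols + [col])

    place([])
    return solutions, nodes

sols, nodes = count_queens(8)
# 92 solutions; nodes is in the tens of thousands, versus 8^8 ~ 1.6e7
# placements for naive one-queen-per-column brute force.
```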

~~~
TheLoneWolfling
And for 100-queens? Or 1,000,000-queens?

~~~
taeric
Not sure where you think I am disagreeing that large problems are large.
Consider: how long would it take you to find the smallest of 10^99999
numbers?

And again, I was probably just overreacting to the term heuristic. That is
usually used to imply that you can't guarantee anything about the final
outcome. At least, that is how I typically read it. So yeah, odds are I'm
just wrong. :)

------
taeric
For the lulz a while back, I did a rough implementation of the DLX algorithm
for the N-Queens problem. For the 4-Queens problem, it only looks at 12
possible configurations. (Visualization here:
[http://taeric.github.io/DancingLinks.html](http://taeric.github.io/DancingLinks.html))

The same algorithm works for the likes of Sudoku. It is quite impressive to
see how quickly it can solve some problems.
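
DLX is the dancing-links implementation of Knuth's Algorithm X for exact cover. Here is a sketch of the same search in a common dictionary-based form (sets instead of linked lists, so not DLX proper), with N-queens encoded as exact cover: ranks and files are primary columns that must be covered exactly once, diagonals are secondary columns covered at most once.

```python
from collections import defaultdict

def exact_cover(X, Y, primary, solution=None):
    """Algorithm X, dict form. X maps columns to the set of rows covering
    them; Y maps rows to their columns. Yields every selection of rows
    covering each primary column exactly once (secondary: at most once)."""
    if solution is None:
        solution = []
    live = [c for c in primary if c in X]
    if not live:
        yield list(solution)
        return
    c = min(live, key=lambda col: len(X[col]))  # most-constrained column first
    for r in list(X[c]):
        solution.append(r)
        removed = select(X, Y, r)
        yield from exact_cover(X, Y, primary, solution)
        deselect(X, Y, r, removed)
        solution.pop()

def select(X, Y, r):
    # Remove every row conflicting with r, then retire r's columns.
    removed = []
    for c in Y[r]:
        for rr in X[c]:
            for cc in Y[rr]:
                if cc != c:
                    X[cc].remove(rr)
        removed.append(X.pop(c))
    return removed

def deselect(X, Y, r, removed):
    # Undo select() in reverse order.
    for c in reversed(Y[r]):
        X[c] = removed.pop()
        for rr in X[c]:
            for cc in Y[rr]:
                if cc != c:
                    X[cc].add(rr)

def n_queens_solutions(n):
    # One matrix row per (rank, file) placement; diagonals are secondary.
    Y = {(r, f): [('R', r), ('F', f), ('A', r + f), ('B', r - f)]
         for r in range(n) for f in range(n)}
    X = defaultdict(set)
    for row, cols in Y.items():
        for c in cols:
            X[c].add(row)
    primary = [('R', r) for r in range(n)] + [('F', f) for f in range(n)]
    return list(exact_cover(dict(X), Y, primary))
```

For example, `n_queens_solutions(4)` returns both solutions of the 4-queens puzzle.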

~~~
TheLoneWolfling
For the N-queens problem, especially as N grows large, there is an interesting
probabilistic approach that is generally (absurdly) fast.

Put the N queens on the board at random. Then repeatedly pick the queen that is
attacked by the most other queens (if/when there are multiple, pick randomly),
and place it on the square where it is attacked by the fewest other queens
(if/when there are multiple, pick randomly). Repeat until no queen is attacked.
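
The repair procedure described above is the min-conflicts heuristic. A rough sketch, with two assumptions of mine: each queen is confined to its own rank (the usual formulation), and there is a step budget, since the random walk is not guaranteed to terminate.

```python
import random

def conflicts(queens, row, col):
    """Number of queens, other than the one in `row`, attacking (row, col)."""
    return sum(1 for r, c in enumerate(queens)
               if r != row and (c == col or abs(c - col) == abs(r - row)))

def min_conflicts_queens(n, max_steps=100_000, seed=None):
    rng = random.Random(seed)
    queens = [rng.randrange(n) for _ in range(n)]  # queens[r] = file of rank r's queen
    for _ in range(max_steps):
        counts = [conflicts(queens, r, queens[r]) for r in range(n)]
        worst = max(counts)
        if worst == 0:
            return queens  # no queen is attacked: done
        # Most-attacked queen, ties broken randomly...
        row = rng.choice([r for r in range(n) if counts[r] == worst])
        # ...moves within its rank to a least-attacked square, ties broken randomly.
        scores = [conflicts(queens, row, c) for c in range(n)]
        best = min(scores)
        queens[row] = rng.choice([c for c in range(n) if scores[c] == best])
    return None  # step budget exhausted without finding a solution

sol = min_conflicts_queens(20, seed=1)
```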

~~~
taeric
If you are just looking for _a_ solution, DLX is already absurdly fast.
Granted, as N grows exceptionally large, it may also become absurdly
memory hungry. :)

Also note that it isn't even the best algorithm around. Just pretty good.

