

Solving big questions requires big computation - avgn
http://news.stanford.edu/features/2014/computing/

======
tromp
One big question that interests me is: how many possible positions does the
game of Go have?

Answering that requires computing the 361st power of a very sparse 363 billion
by 363 billion matrix, a big computation indeed (see
[http://www.cwi.nl/~tromp/go/legal.html](http://www.cwi.nl/~tromp/go/legal.html)
for details).
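
For a sense of what is being counted: on tiny boards the legal positions can be enumerated by brute force. This is a sketch for intuition only (the real computation uses the transfer-matrix method described in the link, not enumeration); a position is legal iff every chain of like-colored stones touches at least one empty point (has a liberty):

```python
from itertools import product

def count_legal(w, h):
    """Brute-force count of legal Go positions on a w x h board."""
    points = [(x, y) for y in range(h) for x in range(w)]

    def neighbors(x, y):
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                yield nx, ny

    legal = 0
    for colors in product('.BW', repeat=w * h):  # '.' = empty point
        board = dict(zip(points, colors))
        ok = True
        seen = set()
        for p in points:
            if board[p] == '.' or p in seen:
                continue
            # Flood-fill the chain containing p, looking for a liberty.
            stack, has_lib = [p], False
            seen.add(p)
            while stack:
                q = stack.pop()
                for n in neighbors(*q):
                    if board[n] == '.':
                        has_lib = True
                    elif board[n] == board[q] and n not in seen:
                        seen.add(n)
                        stack.append(n)
            if not has_lib:
                ok = False
                break
        legal += ok
    return legal

print(count_legal(2, 2))  # 57
```

This already takes 3^(w*h) steps, which is why anything beyond toy sizes needs the cleverer counting method from the paper.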

Right now, I'm computing the number of 18x18 positions (standard size is
19x19) using a couple of servers with 512GB memory and a few TB of disk-space,
and that's taking almost a year.

I'd love to have access to a super-computer, but they tend to frown upon such
frivolous computations...

~~~
Someone
But does that _require_ big computation, or aren't we smart enough to do this
the easy way?

I know simple induction will not work for this problem, but to me, this feels
like the kind of problem that might be simple enough that some mathematical
breakthrough could make it a lot simpler.

~~~
jeffreyrogers
Sometimes with this sort of thing you can exploit symmetry in the game board
to reduce the amount of work you need to do. I don't know enough about Go (or
anything at all really) to say if that's possible or not in this situation,
but it is probably worth looking into.

I agree though, this does seem like the type of problem that should have a
relatively simple solution, or at least a reasonable approximation.
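
For the symmetry idea: a square board has the 8 symmetries of the dihedral group (4 rotations, each with an optional mirror), so positions fall into classes of at most 8 and symmetry can save at most roughly a factor of 8. A sketch (my own illustration, not anything from the paper) that canonicalizes a board by taking the minimum over all 8 transforms:

```python
from itertools import product

def symmetries(board):
    """All 8 dihedral transforms of a square board (tuple of row-tuples)."""
    out = []
    b = board
    for _ in range(4):
        out.append(b)
        out.append(tuple(row[::-1] for row in b))  # horizontal mirror
        b = tuple(zip(*b[::-1]))                   # rotate 90 degrees
    return out

def canonical(board):
    """Lexicographically smallest transform; equal for symmetric positions."""
    return min(symmetries(board))

# All 3^4 = 81 colorings of a 2x2 board collapse to 21 symmetry classes.
boards = [((c[0], c[1]), (c[2], c[3])) for c in product('.BW', repeat=4)]
print(len({canonical(b) for b in boards}))  # 21
```

So symmetry only shaves a small constant factor off a count that grows exponentially with board area.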

~~~
tromp
We have excellent approximations; for 19x19 the number is known to be
approximately 2.081681994 * 10^170. Getting the exact number is the big
challenge. You can see in the paper that we spent a lot of effort on reducing
the complexity of the counting problem. We welcome suggestions for further
improvement, but suspect our current algorithm (counting paths in the border
state graph) is already close to optimal.
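
The "counting paths" formulation is the standard transfer-matrix trick: the number of length-n walks between two nodes of a graph is one entry of the n-th power of its adjacency matrix, and for a sparse graph you compute it with repeated matrix-vector products rather than materializing the power. A toy sketch of that idea (the actual border state graph has on the order of 363 billion nodes, far beyond this):

```python
def count_walks(edges, start, end, length):
    """Number of walks of the given length from start to end.

    This is the (start, end) entry of A^length for adjacency matrix A,
    computed as `length` sparse matrix-vector products: the dict
    `counts` plays the role of one row of the growing matrix power.
    """
    counts = {start: 1}  # one empty walk of length 0
    for _ in range(length):
        nxt = {}
        for u, c in counts.items():
            for v in edges.get(u, ()):
                nxt[v] = nxt.get(v, 0) + c
        counts = nxt
    return counts.get(end, 0)

# Toy graph: triangle 0 -> 1 -> 2 -> 0 plus a self-loop at 0.
g = {0: [0, 1], 1: [2], 2: [0]}
print(count_walks(g, 0, 0, 3))  # 2: 0->0->0->0 and 0->1->2->0
```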

------
laurencerowe
Many scientific disciplines require supercomputers with fast interconnects for
their complex simulations. Genomics is not presently one of them; we just need
lots of compute time on fairly normal servers.

HPC clusters do get used, of course, but not for purely technical reasons.
Genomics workloads should move to the cloud and run at off-peak times to take
advantage of cheaper pricing, but there are significant barriers: researchers
can often get started on the HPC for free, whereas cloud services require a
purchasing card. Their cluster will also come with some support and someone
who can install software dependencies into the stack.

(As a result, most genomics software is really, really hard to get running on
anything other than the author's institution's HPC and its unique set of
installed packages.)

Another barrier is simply knowledge (it's hard to keep those people around in
the current market), and university projects often have strange incentives
because central services (paid for out of general overheads) have no marginal
cost.

------
TTPrograms
Great, cool, computation is useful. Is Stanford building a new supercomputer?
TFLOPS? CPU/GPU? This is some fluffy PR bull.

[https://doresearch.stanford.edu/research-scholarship/computi...](https://doresearch.stanford.edu/research-scholarship/computing-support-research)

~~~
rwallace
It's a well-written inspirational/PR piece. Such things do have value. It
doesn't claim to be presenting any novel facts, though your link looks like a
good place to start looking for those.

