
Richard Feynman and the Connection Machine (1989) - pmoriarty
http://blog.longnow.org/02017/02/08/richard-feynman-and-the-connection-machine/
======
mark_l_watson
Both of a friend's parents worked on the Manhattan Project. Feynman was a
character.

I was fortunate to be able to write code for version 1 of the Connection
Machine (that was the SIMD device) and I would prototype code on my Mac in
Star Lisp. Good times that I am grateful for.

~~~
howfun
Write more about this. A lot of people will find it very interesting.

~~~
mpfundstein
+1

------
andyjohnson0
A nice essay. This paragraph stood out for me:

 _" By the end of that summer of 1983, Richard had completed his analysis of
the behavior of the router, and much to our surprise and amusement, he
presented his answer in the form of a set of partial differential equations.
To a physicist this may seem natural, but to a computer designer, treating a
set of boolean circuits as a continuous, differentiable system is a bit
strange. Feynman’s router equations were in terms of variables representing
continuous quantities such as “the average number of 1 bits in a message
address.” I was much more accustomed to seeing analysis in terms of inductive
proof and case analysis than taking the derivative of “the number of 1’s” with
respect to time. Our discrete analysis said we needed seven buffers per chip;
Feynman’s equations suggested that we only needed five. We decided to play it
safe and ignore Feynman."_

~~~
bane
Feynman is great, very amusing and obviously brilliant. But I've also worked
with folks like him before who end up as fish out of water in fields they
don't know the conventions for and can't be bothered to learn.

Lots of time gets wasted figuring out how to get the rest of the company to
communicate with the lone genius, even though such people are obviously smart
enough to go away for a week, learn the basics of the field they're working
in, and use its language and vocabulary well enough to be minimally effective.

I've been on the receiving end of these kinds of analyses, and the result is
that they seem to exist purely to showcase how smart the individual is while
providing no other meaningful input to the effort. In this case the engineers
ignored Feynman, did their own analysis anyway, and followed their own
conclusion... the subtext here is that multiple people were not getting along
with Feynman's way of doing things.

He turned out right in the end of course, because Feynman, but there's lots of
people who think they're Feynman and aren't and it's hard to tell the
difference sometimes.

~~~
varjag
> He turned out right in the end of course, because Feynman, but there's lots
> of people who think they're Feynman and aren't and it's hard to tell the
> difference sometimes.

Well, he had a Nobel Prize in physics for starters; that's one way to tell
the difference. Granted, no one's infallible, but it's a good
predictor. Not that it would have spared him from fizzbuzz on the interview
these days..

~~~
gbromios
>Not that it would have spared him from fizzbuzz on the interview these days..

I'd hope an interviewer would view an applicant's fizzbuzz as a pleasant
surprise if it came with a bit of casual mathematical proof.

------
wimagguc
Feynman has many interesting stories about playing about with all sorts of
stuff from picking locks to the Manhattan project. His book, "Surely you're
joking, Mr. Feynman!" is just a fantastic read:
[https://www.amazon.com/Surely-Feynman-Adventures-Curious-Cha...](https://www.amazon.com/Surely-Feynman-Adventures-Curious-Character/dp/0393316041)

~~~
kej
The sequel, _" What Do You Care What Other People Think?": Further Adventures
of a Curious Character_ [1] is also excellent, and includes his account of
serving on the commission that investigated the Challenger disaster.

[1] [https://www.amazon.com/What-Care-Other-People-Think/dp/03933...](https://www.amazon.com/What-Care-Other-People-Think/dp/0393320928/) or part of the "buy it together" bundle in wimagguc's link

~~~
stcredzero
Funny, but I get the impression from that book's narrative of the Challenger
disaster that Feynman was in the role of someone else's "useful genius."

------
gbromios
_This was a typical Richard Feynman explanation. On the one hand, it
infuriated the experts who had worked on the problem because it neglected to
even mention all of the clever problems that they had solved. On the other
hand, it delighted the listeners since they could walk away from it with a
real understanding of the phenomenon and how it was connected to physical
reality._

I've never seen this article before, and I love how the author managed to
leave me delighted in the same way, feeling much smarter than I actually am,
especially about theoretical physics.

~~~
anaerobia
A reprint of it is in the book Feynman and Computation, edited by Anthony J.G.
Hey. You can find it on Amazon or at your local library.

------
ocfnash
I came across this before and found the section "An Algorithm For Logarithms"
interesting.

As pointed out (in the second para. of that section), Feynman's observation
about representing a number as a product of terms of the form $1 + 2^{-k}$
reduces the problem of estimating a log to computing the values k which
appear in the product. (Btw, the article incorrectly claims the
representation is unique.)

There's an obvious linear time algorithm for finding a sequence of values k
but I don't think that would really perform so well compared to, say, Newton-
Raphson.
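
For concreteness, that greedy linear scan might look like this (a Python
sketch of my own, not anything from the article):

```python
import math

def factor_log_terms(x, nbits=40):
    """Greedily write x (with 1 <= x < 2) as a product of (1 + 2**-k) terms.

    Returns the exponents k; the product of the chosen terms matches x
    to roughly 2**-nbits.
    """
    ks = []
    y = x
    for k in range(1, nbits + 1):
        f = 1.0 + 2.0 ** -k
        if y >= f:       # the factor fits, so divide it out
            y /= f
            ks.append(k)
    return ks

ks = factor_log_terms(1.7)
approx = 1.0
for k in ks:
    approx *= 1.0 + 2.0 ** -k
# log(1.7) is then just the sum of precomputed log(1 + 2**-k) values
log_estimate = sum(math.log1p(2.0 ** -k) for k in ks)
```

Each pass either accepts or rejects one factor, so the scan is linear in the
number of bits of precision wanted.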

So I wonder why this was a good approach. Perhaps there's a smart way to
estimate the k or perhaps it just happened to fit the constraints under which
they were working.

Anyone have some insights?

~~~
stephencanon
Newton-Raphson doesn't buy you much for logarithms, because the iteration
itself involves either exp(x) or log(x)[1].

If you don't have a real multiplier[2], then the algorithm described is
somewhat attractive because multiplying by 1/(1+2^-k) has a nice series
expansion. You can do it using only shifts and adds (and the number of terms
you need is O(1/k), so it falls off reasonably quickly), and you get the part
you need to compute the next `k` early, so there's no serial dependency.
Actually finding k is just a count-leading-zeros operation, which is cheap
enough.
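
Concretely, the shift-and-add scheme can be sketched in fixed point
(illustrative Python standing in for hardware shifts and adds; `fixed_log`,
the table size, and the 32-bit format are all my assumptions, not details
from the CM):

```python
import math

FRAC = 32                # fixed point with 32 fractional bits
ONE = 1 << FRAC
# precomputed table of log(1 + 2**-k) in the same fixed-point format
LOG_TAB = [round(math.log1p(2.0 ** -k) * ONE) for k in range(33)]

def fixed_log(x, steps=24):
    """Approximate log(x) for 1 <= x < 2 using only shifts and adds."""
    y = round(x * ONE)
    acc = 0
    for k in range(1, steps + 1):
        # t = y / (1 + 2**-k), via the series 1 - 2**-k + 2**-2k - ...
        t, shift, sign = y, k, -1
        while shift <= FRAC:
            t += sign * (y >> shift)
            shift += k
            sign = -sign
        if t >= ONE:     # the factor (1 + 2**-k) divides out; keep it
            y = t
            acc += LOG_TAB[k]
    # residual y = 1 + eps for tiny eps, and log(1 + eps) ~ eps
    return (acc + (y - ONE)) / ONE
```

In hardware, the scan for the next usable k would be the count-leading-zeros
operation on y - 1 mentioned above, rather than a linear loop.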

Computers today pretty much all have real multipliers, so no one does this.
Instead we reduce to a range like [1/sqrt(2),sqrt(2)], and then either use a
minimax approximation on that interval (if the accuracy requirement is
reasonably low) or further reduce by looking up a value of r close to x for
which we have a pre-computed 1/r and log(r) and take advantage of:

    
    
        log(x) = log(r * 1/r * x) = log(r) + log(1/r * x)
    

This allows one to achieve high-accuracy results if r is chosen so that one of
1/r and log(r) is exact and the other is unusually close to the exact
value[3], or stored in extended precision.
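
A toy version of that table-driven reduction (Python; the table granularity
and the cubic tail are my choices, and a production libm would store log(r)
with extra precision, as described in [3]):

```python
import math

# reference points r = 1 + i/64 with precomputed 1/r and log(r)
RECIP = [1.0 / (1.0 + i / 64.0) for i in range(65)]
LOG_R = [math.log(1.0 + i / 64.0) for i in range(65)]

def table_log(x):
    """Approximate log(x) for 1 <= x < 2 via log(x) = log(r) + log(x/r)."""
    i = round((x - 1.0) * 64.0)   # nearest table entry
    u = x * RECIP[i] - 1.0        # |u| <= ~1/128, so a short series suffices
    return LOG_R[i] + u - u * u / 2.0 + u ** 3 / 3.0
```

Because r is within 1/128 of x, the truncated log(1 + u) series already
gives roughly single-precision accuracy with a 65-entry table.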

[1] If you have a fast approximate exp and log you can make use of it in an
approximate N-R iteration, but it still doesn't buy you much. In practice no
one does this; we compute exp and log directly, and they're among the fastest
functions in the math library.

[2] Meaning either you don't have one at all or that it's much, much slower
than addition, as was common in the 1980s and earlier.

[3] This trick is called "Gal's Accurate Tables"
([https://en.wikipedia.org/wiki/Gal's_accurate_tables](https://en.wikipedia.org/wiki/Gal's_accurate_tables)),
but as with most things named for someone, it was independently invented
multiple times, and likely dates back decades before the credited inventor.

~~~
ocfnash
Thanks for these remarks Stephen.

IIUC you're speculating that the relevant constraint they faced was that
multiplication was very burdensome. I agree; it seems plausible this was the
reason they found this an attractive approach.

Also agreed about Newton-Raphson; that was a thinko on my part. It would be an
unusual situation if NR were the right tool for estimating log.

------
sillysaurus3
Someone please explain how to do this. It's been a lifelong mystery that has
always fascinated me:

> By the end of that summer of 1983, Richard had completed his analysis of the
> behavior of the router, and much to our surprise and amusement, he presented
> his answer in the form of a set of partial differential equations. To a
> physicist this may seem natural, but to a computer designer, treating a set
> of boolean circuits as a continuous, differentiable system is a bit strange.
> Feynman’s router equations were in terms of variables representing
> continuous quantities such as “the average number of 1 bits in a message
> address.” I was much more accustomed to seeing analysis in terms of
> inductive proof and case analysis than taking the derivative of “the number
> of 1’s” with respect to time. Our discrete analysis said we needed seven
> buffers per chip; Feynman’s equations suggested that we only needed five. We
> decided to play it safe and ignore Feynman.

How is it possible to use PDEs to model boolean circuits?

~~~
dfox
Just today I was reminded of this paragraph when we were thinking about a
completely different discrete optimization problem, with the expectation that
there must be some way to solve it analytically instead of numerically.

But in the case of the CM router, “the average number of 1 bits in a message
address” is a surprisingly useful quantity, because in the most
straightforward implementation of hypercube routing the number of one bits in
the address equals the number of links the message must traverse. The
algorithm is essentially this: find an address bit set to one, clear it, and
send the message down the link whose number matches that bit's position; if
the address is all zeros, the message has reached its destination. (The same
thing can be implemented without changing the address seen on the wire: each
router XORs its node ID with the address in the message and gets the same
bitvector. But I believe most implementations actually modify the message
address in each router.)
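
The bit-clearing walk described here, as a Python sketch (a textbook
rendering of hypercube routing, not the actual CM router logic):

```python
def route(src, dst):
    """Path from src to dst in a hypercube, fixing one address bit per hop."""
    path = [src]
    relative = src ^ dst              # set bits = links still to traverse
    node = src
    while relative:
        bit = relative & -relative    # lowest set bit ("find first one")
        node ^= bit                   # cross the link for that dimension
        relative ^= bit               # clear that bit of the relative address
        path.append(node)
    return path

# hop count equals the number of 1 bits in src ^ dst, as described
assert route(0b0000, 0b1011) == [0b0000, 0b0001, 0b0011, 0b1011]
```

Averaging the number of set bits over all messages is what turns this
per-message walk into the continuous quantity Feynman's equations used.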

------
jabl
I came across this a long time ago, but read it again, and laughed out loud
several times. It's a good read.

It's also amusing to think of how "small" science was only a few decades ago.
I guess that's what exponential growth is... (plus, I guess, survivorship
bias). Here we have one of the greatest physicists of the 20th century
working at a startup for a bunch of wet-behind-the-ears kids from MIT. But
then, looking at the Wikipedia page of Danny Hillis (co-founder of Thinking
Machines, author of the article), his thesis advisors were (drumroll) Marvin
Minsky, Gerry Sussman, and Claude Shannon. Not exactly nobodies either...

~~~
cr0sh
> I came across this a long time ago, but read it again, and laughed out loud
> several times. It's a good read.

I've read it before as well, but read it again recently due to this HN post
(before it moved up in the stack, sometime last week).

While I always find it an amazing and amusing recollection of the period, at
the end I always find a bit of "smoke" in my eyes, knowing that we have lost
one of the greats of our age, and that I will never be able to say to him
personally how much I have enjoyed reading and listening to his words.

Fortunately, though, we do still have those - and they seem to continue to
inspire people - which is really something.

------
kordless
I love this part:

" _Feynman had a proposed solution to the anisotropy problem which he
attempted (without success) to work out in detail. His notion was that the
underlying automata, rather than being connected in a regular lattice like a
grid or a pattern of hexagons, might be randomly connected. Waves propagating
through this medium would, on the average, propagate at the same rate in every
direction._ "

------
fermigier
A counterpoint to this narrative can be found here:
[http://www.inc.com/magazine/19950915/2622.html](http://www.inc.com/magazine/19950915/2622.html)
or here: [http://thedailywtf.com/articles/Thinking-
Machines](http://thedailywtf.com/articles/Thinking-Machines)

In the light of these two stories, one can speculate that Feynman was mostly
hired by TM to help with PR, not to produce actual work.

------
amelius
I wonder what Feynman would be working on if he were a programmer today.

~~~
acqq
He wouldn't; he'd been careful to avoid that since the 1940s:

[http://www.goodreads.com/quotes/325051-well-mr-frankel-who-s...](http://www.goodreads.com/quotes/325051-well-mr-frankel-who-started-this-program-began-to-suffer)

I guess you haven't read the book; take a look.

------
teddyh
It’s too bad that the conversion from LaTeX markup to HTML was… nonexistent,
leaving us with such things as “ _100\%_ ” and math formulas in raw LaTeX
form.

------
grandalf
I just visited the long now foundation's SF office yesterday. Very cool stuff.

------
aj7
Cellular automata-- a dead end.

~~~
CuriouslyC
Cellular automata are just a subclass of graph-evolution algorithms
constrained to a regular grid composed of homogeneous nodes. Neural networks
are also graph-evolution algorithms, and logical programs can be modeled this
way as well.

Cellular automata are just a toy branch of a tree that is thriving like crazy.

~~~
jandrese
Cellular automata are also the most natural fit for a huge bank of single-bit
processors connected in a hypercube topology. There's very little software
you need to write to turn that particular hardware into a cellular-automaton
simulator. It's a good first program for that machine.
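
To see just how little software that is, a one-dimensional elementary
automaton fits in a few lines of Python; on the CM, each cell would sit on
its own single-bit processor, with the neighbor reads becoming inter-processor
communication (an illustration, not actual CM code):

```python
def ca_step(cells, rule=110):
    """One update of an elementary cellular automaton on a ring.

    Each cell needs only its own bit and its two neighbors' bits; the
    rule number's binary expansion serves as the update lookup table.
    """
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]
```

Every cell applies the same tiny rule in lockstep, which is exactly the SIMD
pattern the machine was built for.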

------
fermigier
Good but old: 1989. Should be stated in the title.

~~~
alphydan
Feynman died almost 30 years ago ... I think it's pretty safe to assume it's
not recent news, even without the (1989)

~~~
jamesrcole
I don't think anyone would assume it's a news story. The title presents it as
an essay, and new essays get written about historical details.

