

Facebook sees need for Terabit Ethernet  - skennedy
http://www.networkworld.com/news/2010/020310-facebook-sees-need-for-terabit.html

======
Daniel_Newby
Doesn't this say more about the Facebook software layer than the physical
layer? Locality of reference == good.

~~~
andrewmccall
I was thinking the same thing, but I'd be willing to bet they thought of that.

Likely it has more to do with the types of tasks they're doing. If you think
about Facebook and how it's used, it's mostly a lot of little bits of data. An
awful lot of little bits of data. Sure there are lots of images - but ignoring
the image data itself (that's only really useful to end users) each image is
still just a bit of data: owner, dates, and the users tagged.

What they have to do is crawl a lot of social graphs to update users with a
small bit of data, the things their friends are doing. Locality of reference
is a good thing, but probably very hard with the type of operations they're
performing.

~~~
Daniel_Newby
Indeed. I thought on it more after posting and it _is_ a challenging problem.
They would need to identify the subgraphs with the highest activity-
connectivity products, then migrate them towards the same hardware.
Dependencies would need bidirectional links, so that updates can be
distributed in batch mode rather than being fetched on demand. Especially hot
data might need to be continuously distributed in streaming mode.
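To make the idea concrete, here's a toy sketch of that first step, assuming a made-up graph and activity table (connected components stand in for real community detection, which Facebook would obviously do far more cleverly at scale):

```python
# Hypothetical sketch: score user clusters by an activity-connectivity
# product, then rank them for co-location on the same hardware.
# The graph, activity numbers, and helper names are all invented.
from itertools import combinations

# toy social graph: user -> set of friends
graph = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"e"},
    "e": {"d"},
}

# updates per user over some recent window (activity)
activity = {"a": 50, "b": 40, "c": 30, "d": 2, "e": 1}

def clusters(g):
    """Connected components -- a crude stand-in for community detection."""
    seen, comps = set(), []
    for start in g:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(g[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def score(comp, g, act):
    """Activity-connectivity product: total updates times internal edges."""
    internal = sum(1 for u, v in combinations(comp, 2) if v in g[u])
    return sum(act[u] for u in comp) * internal

# migrate the highest-scoring clusters to the same shard first
ranked = sorted(clusters(graph), key=lambda c: score(c, graph, activity),
                reverse=True)
```

The hot triangle a-b-c scores far above the quiet d-e pair, so it gets packed onto one box first.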

And HTTP needs a mode for reassembling out-of-order snippets. Which use
snippet-independent data compression. Oh, and everybody gets a pony.

