
Say those 10k are elite supercomputers, the Newtons and Einsteins, and the rest of us average out as five-year-old smartphones. If those smartphones are still effective computers, then there is negative utility in deleting them. We want both.

But we want the average to be high enough that the supercomputers can still reach most of their potential, and the average isn't dragged down by the desperate. I think that means six orders of magnitude more lesser computers can still contribute massively to the calculation, compared to the 10k elite. That still implies a very large population, just not a barely functioning one.
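
A rough back-of-envelope sketch of that claim, with numbers assumed purely for illustration (the thread never fixes the capability gap; here an elite mind is taken to be ~10^6 times as capable as an average one):

    # Back-of-envelope sketch of the aggregate-contribution claim above.
    # Assumed numbers, purely for illustration (not from the thread):
    # an elite mind is ~10^6 times as capable as an average one.

    elite_count = 10_000                     # the 10k "supercomputers"
    capability_gap = 10**6                   # assumed elite-to-average capability ratio
    average_count = elite_count * 10**6      # six orders of magnitude more average minds

    elite_total = elite_count * capability_gap   # aggregate elite contribution
    average_total = average_count * 1            # aggregate average contribution

    # Both come out at 10^10: under these assumed numbers, the vastly larger
    # group of lesser contributors matters on the same order as the elite.
    print(f"elite total:   {elite_total:.1e}")
    print(f"average total: {average_total:.1e}")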




I think where your analogy breaks down is that, unlike computers, human brains scale badly for distributed computation[0], which is why my example limits the community to 10,000.

[0] https://en.wikipedia.org/wiki/Dunbar%27s_number


I don't think that applies, because the size of a network is not limited by the number of connections per node. If our intelligence were limited by the number of dendrites per neuron, there would be little use in having so many more neurons than that.
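
A minimal sketch of that point, under an assumed toy topology (a ring lattice, not anything from the thread): every node keeps a fixed, Dunbar-scale ~150 connections, yet the network can be made arbitrarily large and stays fully connected.

    from collections import deque

    # Toy ring lattice: each node links to its k nearest neighbours on each side,
    # so per-node degree is fixed at 2k regardless of how large n grows.

    def ring_lattice(n, k):
        """Adjacency sets for n nodes, each linked to k neighbours per side."""
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for step in range(1, k + 1):
                j = (i + step) % n
                adj[i].add(j)
                adj[j].add(i)
        return adj

    def reachable_from(adj, start=0):
        """Count nodes reachable from `start` via breadth-first search."""
        seen = {start}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return len(seen)

    for n in (1_000, 10_000, 50_000):
        adj = ring_lattice(n, k=75)                  # ~150 connections per node
        degrees = {len(neigh) for neigh in adj.values()}
        # Degree stays at 150 while n grows 50x, and the graph stays connected.
        print(n, degrees, reachable_from(adj) == n)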



