This fails for the same reason the JavaScript entries failed to make a dent in the Engine Yard contest: When JavaScript is roughly 1000x slower than GPUs, even borrowing other people's browsers isn't enough to make up the difference. Also, real ray tracing generally involves huge source data that would have to be downloaded to each client.
But that's the whole point. Sure, JavaScript is slow, and with the small number of browsers visiting the demo, this isn't even comparable to the performance of a GPU.
It's the potential. You have literally millions of computers on the internet right now, capable of running this code.
Also, ray tracing is only an example. And what large data set do you need for ray tracing? I guess if you had a massive mesh, that might add up.
This seems like the beginning of another argument in favor of an idea I've been kicking around for a little while now: combine HTTP with BitTorrent in some seamless fashion to make more of the web better distributed.
You've been thinking about it longer and probably more in-depth than I have. There was a project a while back called localhost that made me think about it. It wasn't as transparent as what I would like to see, though. 4chan also made me think about it: the content there has a very high turnover, and many of the users painstakingly mirror the parts of the site that they care about. It would be nice if they had a tool they could use from their browser to share out their mirrors so that those pages never 404. The next step would be to support versioning and intentional poisoning (or as I call it, "controversy"), the way localhost does, and a trust network would round out the implementation by delivering the most-probably-preferred version on the first visit, with the option to choose among the rest. I don't know whether this would be a browser extension or a new RFC covering HTTP and BitTorrent, but it's what I think WebDAV should be.
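To make that concrete, here is a toy sketch in Python of the lookup layer being described; it is purely illustrative and not localhost's actual design. A URL maps to several candidate mirrors, each named by its torrent infohash, and scores from a hypothetical trust network decide which version gets served first.

    from collections import defaultdict

    # url -> list of (infohash, publisher); multiple entries per URL are the
    # competing versions ("controversy") described above.
    mirrors = defaultdict(list)

    # Hypothetical output of a trust network: how much we trust each publisher.
    trust = {"anonymous": 0.1, "archive-crew": 0.9}

    def register_mirror(url, infohash, publisher):
        mirrors[url].append((infohash, publisher))

    def resolve(url):
        # Candidates come back best-first; a browser would fetch the top
        # infohash over BitTorrent but could still offer the alternatives.
        return sorted(mirrors[url], key=lambda m: trust.get(m[1], 0.0), reverse=True)

    register_mirror("http://example.org/thread/123", "infohash-of-version-1", "anonymous")
    register_mirror("http://example.org/thread/123", "infohash-of-version-2", "archive-crew")
    print(resolve("http://example.org/thread/123")[0])  # the most-probably-preferred version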
That's awesome. Yeah, I've been thinking along pretty similar lines.
4chan is basically a "paging channel" or "advertisement channel": anybody can post, which means it can't provide reliable delivery (any message can be swamped by spam) or archival, but that's the only way to establish communication between people who don't have a pre-existing relationship (either directly or through their social network).
Poisoning/controversy is inherent in human-readable, non-centrally allocated, shared namespaces. (See Zooko's triangle.) It's not always a problem — Google shows that your suggested approach of using a trust network to choose among the candidate versions can work pretty well — but in the cases where it is a problem, you can reduce its magnitude by using self-certifying names (like SHA1 sums), unshared namespaces (like petnames), or centrally-allocated names (like DNS). Nick Szabo's "Secure Property Titles" proposal is the only plausible alternative I've seen. (I don't know enough about social dynamics to say whether it would work as he expects.)
One of the advantages of hash-addressed data (data named with a self-certifying name that consists of a secure hash of its contents) is that it can be replicated to many peers without any particular concern for integrity or timeliness. (And in Freenet and BitTorrent, it is.) So it might be good to use hash-addressed data for as much of the system as possible. Localhost only stores filenames in directory nodes, then uses Kademlia to look up the .torrent infohashes associated with the various versions of a filename; it would often be good to provide durable links consisting of the infohash itself.
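As a minimal illustration of why that works (Python here just for readability; the names and data are made up): the name is derived from the content, so any peer can check what it received without trusting whoever sent it.

    import hashlib

    def content_address(data: bytes) -> str:
        # Self-certifying name: the SHA-1 hex digest of the content itself.
        # (BitTorrent infohashes are likewise SHA-1, though computed over the
        # torrent's bencoded info dictionary rather than the raw file.)
        return hashlib.sha1(data).hexdigest()

    def verify(name: str, data: bytes) -> bool:
        # Integrity needs no trusted party: recompute the hash and compare.
        return content_address(data) == name

    page = b"<html>a mirrored page</html>"
    name = content_address(page)
    assert verify(name, page)              # the data matches its name
    assert not verify(name, page + b"!")   # any modification is detectable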
The other thing I think is needed is a way to publish a mutable resource that only you can update. Such a resource can have a self-certifying (but non-human-readable) name consisting of the hash of your public key, possibly an ID number, and a timestamp.
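A rough sketch of such a mutable resource, under my own assumptions (an Ed25519 keypair and the pyca/cryptography library; the comment above doesn't name a signature scheme):

    import hashlib, time
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    signing_key = Ed25519PrivateKey.generate()
    public_raw = signing_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)

    # The resource's name: hash of the public key plus an ID number, so one
    # keypair can own many resources. Not human-readable, but self-certifying.
    resource_id = 7
    name = hashlib.sha1(public_raw + resource_id.to_bytes(4, "big")).hexdigest()

    def publish(content: bytes):
        # Sign (timestamp || content); peers keep the newest validly signed version.
        stamped = int(time.time()).to_bytes(8, "big") + content
        return stamped, signing_key.sign(stamped)

    def accept(stamped: bytes, signature: bytes) -> bool:
        # A peer that has the public key (and has checked that it hashes to the
        # resource's name) can verify an update really came from the owner.
        try:
            signing_key.public_key().verify(signature, stamped)
            return True
        except InvalidSignature:
            return False

    update, sig = publish(b"version 2 of my page")
    assert accept(update, sig)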
I have the impression that some of the modern JavaScript engines lose less than a factor of 10× over the raw power of the CPU. Do you think the CPU-GPU gap is going to widen or narrow in upcoming years?
Pressing the backspace key on a computer terminal would generate the ASCII code 08, BS or Backspace, which would delete the preceding character. That control code could also be accessed by pressing Control-H, as H is the eighth letter of the Latin alphabet. Terminals which did not have the backspace code mapped to the function of moving the cursor backwards and deleting the preceding character would display the symbols ^H (caret, H — see caret notation) when the backspace key was pressed. This sequence is still used humorously by the computer-literate for epanorthosis, denoting the deletion of a pretended blunder, much like overstriking.
Example: My slave-dri^H^H^H^H^H^H^H^H^Hboss decided to stall the project.
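For what it's worth, the behaviour is trivial to emulate; here is a tiny Python sketch (mine, just for fun) that applies literal backspace characters the way a terminal with the code mapped would:

    def apply_backspaces(text: str) -> str:
        # Interpret ASCII BS (0x08, Ctrl-H): each one removes the preceding character.
        out = []
        for ch in text:
            if ch == "\b":
                if out:
                    out.pop()
            else:
                out.append(ch)
        return "".join(out)

    # The joke above, with real control codes instead of the printed ^H notation:
    print(apply_backspaces("My slave-dri" + "\b" * 9 + "boss decided to stall the project."))
    # -> My boss decided to stall the project.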
I'm not sure that would work -- since each person gets the next line in the batch (it seems that way, anyway), the best you can do is poison a few random lines. Making a cohesive image out of that would mean you'd have to poison the source data.
At the moment, no - a trust system is on the wishlist. And as has been mentioned, you'd have to poison every line. But yep, that's a current vulnerability.
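For reference, the usual stopgap while a real trust system is missing (this is the standard redundancy trick from volunteer-computing projects like BOINC, not something the demo currently does) is to hand each scanline to several independent clients and keep the majority answer, so poisoning a line means controlling most of the clients that rendered it. A minimal sketch:

    import random
    from collections import Counter

    REDUNDANCY = 3  # assumed: each scanline is rendered by 3 independent clients

    def assign(line_number, clients):
        # Pick REDUNDANCY distinct clients to render the same line.
        return random.sample(clients, REDUNDANCY)

    def accept_line(results):
        # results: the pixel rows returned for one line, one per client.
        # Keep the value the majority agrees on; a lone poisoner is outvoted.
        value, votes = Counter(results).most_common(1)[0]
        return value if votes > len(results) // 2 else None  # None = re-issue the line

    honest = b"\x10\x20\x30" * 100
    poisoned = b"\x00" * 300
    assert accept_line([honest, honest, poisoned]) == honest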