> requires some agreed-upon rendezvous point — i.e., a server.
Or a P2P network with a DHT. This can be done right now with a dedicated client (BitTorrent, Bitcoin, Freenet, IPFS, etc.). There are existing browser plugins which will opportunistically use a P2P protocol instead of HTTP; e.g. if you get the IPFS Firefox extension, enable the "DNS lookup" option, and visit chriswarbo.net, it should fetch the page via P2P. Projects like IPFS are currently experiments, but they're aiming for browsers to eventually support (something like) them natively, alongside HTTP/HTTPS/FTP/etc.
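As I understand it, that "DNS lookup" works via a convention called DNSLink: a TXT record on the domain points at a content hash, and the extension fetches that hash from the P2P network instead of hitting the origin server. Roughly like this (placeholder domain and hash, not any real site's record):

```
; Hypothetical DNSLink record: a P2P-aware client resolves this TXT entry,
; then fetches the content by its hash from whichever peers have a copy.
_dnslink.example.com.  IN  TXT  "dnslink=/ipfs/<content hash>"
```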
Projects like WebTorrent are trying to implement this kind of thing in JS.
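As a rough sketch of what that looks like from a page's point of view (untested, based on WebTorrent's documented browser API; the magnet link is a placeholder):

```typescript
import WebTorrent from 'webtorrent'

// Fetch a page from whichever peers are seeding it, rather than from a server.
const client = new WebTorrent()
const magnetURI = 'magnet:?xt=urn:btih:...' // placeholder info-hash

client.add(magnetURI, (torrent) => {
  // Pick out the HTML file and display it once enough pieces have arrived.
  const page = torrent.files.find((f) => f.name.endsWith('.html'))
  page?.getBlobURL((err, url) => {
    if (err) throw err
    const frame = document.createElement('iframe')
    frame.src = url // render the P2P-fetched page
    document.body.appendChild(frame)
  })
})
```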
> Without servers, how will I see your edits when your browser is offline, or simply no longer visiting the page?
Again, distributed storage (DHT, etc.).
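To make the offline case concrete: content gets keyed by its own hash and replicated to several peers, so anyone can fetch it later from whichever replicas are still around, even after the author disconnects. A toy illustration of the idea (not any real DHT's API; a real one does Kademlia-style routing to decide which peers hold which keys):

```typescript
import { createHash } from 'crypto'

// Toy content-addressed store: the key is derived from the content itself,
// so any peer holding a copy can serve it; the author needn't stay online.
const peers: Array<Map<string, string>> = [new Map(), new Map(), new Map()]

function put(content: string): string {
  const key = createHash('sha256').update(content).digest('hex')
  for (const peer of peers) peer.set(key, content) // replicate (a real DHT picks the k "closest" peers)
  return key // share this hash instead of a server URL
}

function get(key: string): string | undefined {
  for (const peer of peers) {          // ask around until some replica answers
    const value = peer.get(key)
    if (value !== undefined) return value
  }
  return undefined
}

const key = put('my latest wiki edit')
peers[0].clear()                       // the author's own node goes offline
console.log(get(key))                  // the edit is still retrievable
```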
> I thought the whole point of Ward's Wiki was to be a neat place to discuss computer science & programming — in your analogy, to be the supermarket.
I've been told many times that my analogies are terrible ;) In this case, the supermarket represented some commercial Web site, with a clear separation between business and customer, where the business wants as much ownership and control as possible, and will go out of its way to keep customers happy (as long as it's profitable).
From what I can tell, Ward is doing this for the love of it. There is no profit to chase, so any project costs (like running a server) are a drain, and make it more likely to collapse. Removing those costs helps the project, even if it inconveniences visitors. Like the botanist, who wants to get on with their research rather than spending time growing produce for others.
Likewise, the visitors are contributors, not customers. They're not just after some product with as little transaction friction as possible (at least, the most valuable ones aren't; I assume most visitors just read something then leave). They're already investing their time into the project, so making things a little less convenient might be acceptable, if it means the project can stay afloat.
> It's strictly worse if
That's not how "strictly worse" works. If you claimed it's worse, I would emphatically agree (I hate single-page JS "apps"!).
"Strictly worse" means that it is not better in any way; that the old version is a Pareto improvement over the new one. It's not. There are reasons one might choose to do this. Those reasons are not ones that a commercial Web site should choose (exactly the opposite, in fact; they're like the supermarket); they're not ones that a static informational site should choose (e.g. they might choose to host on IPFS, but shouldn't go down the JS route); and they shouldn't be chosen if identity/ownership are the goal (like IndieWeb, where self-hosted/managed servers make sense).
They do make sense if you want to throw a collaboration platform out into the world, with the only goal being to see what happens. That's what wiki was, so it makes sense.
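To pin down the terminology (this is just the standard formulation, with u_i standing for how good an option is on dimension i; nothing here is specific to wiki):

```latex
% Pareto improvement: at least as good on every dimension, strictly better on some.
A \succ_{\text{Pareto}} B
  \iff
  \big(\forall i.\; u_i(A) \ge u_i(B)\big)
  \wedge
  \big(\exists j.\; u_j(A) > u_j(B)\big)
```

On that reading, "strictly worse" is the mirror image: the new version would have to be no better than the old on any dimension whatsoever.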
Everything you write about a DHT is true, but … that's not what Ward's Wiki is doing.
It sounds like what it's doing is a very, very good fit for what the IWC guys are up to — and it could all be done without JavaScript!
> "Strictly worse" means that it is not better in any way; that the old version is a pareto improvement over the new one.
Viewed in links, lynx or eww, the old version is a Pareto improvement over the new, because the new version is nothing but a blank page, while the old version was full of information.
> Viewed in links, lynx or eww, the old version is a Pareto improvement over the new
Again, if you're qualifying the statement, it's not Pareto. On transparency, water is a Pareto improvement over Coca-Cola. On growability, wood is a Pareto improvement over steel.
The old version was an improvement over the new one; the new version is worse. They're not "Pareto" or "strict" though, and I'm interested to see what the next steps are, building on this new foundation.