
Web We Want - signaler
https://webwewant.org/
======
rayiner
I want the Web as in its heyday: circa 1999, give or take, when sites loaded
blazing fast over my 256k DSL connection--faster than they do on my 50 Mbps
connection today. No JS, no Flash ads. Just text, images, and hyperlinks, like
TBL originally intended.

~~~
andreasley
I seem to remember the web of 1999 very differently: Java applets for
navigation trees, stuff that required ActiveX and only worked in IE on
Windows, HTML tables with background images in cells for layout; almost
nobody used SSL, and almost no server could max out a reasonably fast
connection (if you were lucky enough to live or work somewhere broadband was
available). Barely any audio/video.

Yes, some things were better and easier then, but it wasn't all great.

In my opinion, most technological problems have been solved: Broadband is
widely available, we have fairly good standards for HTML/CSS/JavaScript,
reliance on plugins and stuff like ActiveX and Java is at an all-time low. The
question is now what we do with our newly-gained freedom – and as usual, this
one is trickier than the technical stuff.

~~~
tdkl
> almost nobody used SSL

The sad part is that this wasn't considered an issue back then.

~~~
pixl97
Oh, unencrypted data was an even bigger issue back then. I remember still
having Ethernet _hubs_ at the time and watching the data go to and from my
buddy's computer on our network.

------
beatpanda
"Balance commercial profit with public benefit" isn't a thing that happens
voluntarily and asking nicely for it is a joke. If we want public benefit,
we're going to have to make it ourselves, or take it. I'm going to ride a
unicorn off into a sunset before people making money on the Internet (or
elsewhere) are going to suddenly grow a conscience.

~~~
kuschku
Well, the old web was profitless and purely public-benefit oriented, and a lot
of stuff still is – lots of people in the open source community provide
services that are used by hundreds of millions of people and thousands of
companies completely for free.

~~~
TeMPOraL
I wouldn't say I want to go back to the old web completely - there are a lot of
for-profit things that I use daily. But I feel that the profit mindset indeed
poisoned the Internet. You could cut out 90% of people's attempts to make
money on-line and the net (and the world) would be much better off.

------
Aoyagi
A 4 MB front page is not the web I want.

~~~
keithpeter
12 scripts and 2 tracking cookies (NoScript/Privacy Badger). Not so bad
actually. Does the 4 MB include the download for that video?

------
anoplus
To improve fundraising, I believe the expenditures should be transparent, and
the funded elements of the organization should have a progress bar, like the
ones on Kickstarter.

~~~
chillydawg
Given TBL is involved (who helped start the Open Data Institute in the UK) I
would have thought they'd be pretty much 100% open by default. I guess not.

------
anon5446372
A responsive website would be a good start...

~~~
bshimmin
This was my first thought too! Even on desktop, to be honest, this thing looks
hideous. The grey text on almost-black background also completely fails the
WCAG contrast ratio test.
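
For anyone curious, that contrast check is easy to reproduce; here's a minimal
JavaScript sketch of the WCAG 2.x relative-luminance formula (the grey and
near-black values below are placeholders, not the site's actual palette):

```javascript
// sRGB channel (0-255) to linear light, per the WCAG 2.x definition
const channel = (c) =>
  (c /= 255) <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;

const luminance = ([r, g, b]) =>
  0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);

// Contrast ratio = (lighter + 0.05) / (darker + 0.05); WCAG AA wants >= 4.5
const contrastRatio = (fg, bg) => {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
};

// White on black is the maximum possible ratio, 21:1
console.log(contrastRatio([255, 255, 255], [0, 0, 0]));
// Mid-grey on near-black (placeholder values) lands well below 4.5
console.log(contrastRatio([100, 100, 100], [17, 17, 17]));
```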

And, honestly, what on earth is this?
[https://webwewant.org/news/We_are_announcing_the_Small_Grant...](https://webwewant.org/news/We_are_announcing_the_Small_Grants_Next_August_3_)

~~~
ytdht
Probably beside your point, but: instead of cropping that cat picture, they
set a custom size in the HTML that doesn't match the picture's aspect ratio...
if you want to see the cat's real proportions, right-click on the picture and
choose "View Image".

------
bobajeff
What's this site for exactly? Their about page is sort of vague.

------
aorth
The web I want has websites using SHA-2-based TLS certificates and TLS cipher
suites that aren't vulnerable to padding oracle attacks (AES-CBC).
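
For what it's worth, that's mostly a server-configuration choice today. A
hedged sketch for nginx (the directives are real; the cipher string is just
one plausible policy, not an audited recommendation) that keeps only AEAD
suites and so avoids CBC padding oracles:

```nginx
# TLS 1.2+ only, with AEAD cipher suites (AES-GCM / ChaCha20-Poly1305);
# no CBC suites means no padding-oracle surface.
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers ECDHE+AESGCM:ECDHE+CHACHA20;
ssl_prefer_server_ciphers on;
```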

------
tomphoolery
I want the Web that was originally intended: one source of data, the document,
which allows you to be as expressive and semantic as you want. Clients read
this data and present it in their own way depending on how you're reading it,
what kind of client you're using (screen reader, etc.), whether you're on a
phone or a tablet, and so on. No need for this responsive design bullshit that
just wastes a ton of money and time for everyone involved.

You may be asking yourself, how are we going to "design" this web? The answer
is still CSS and JavaScript, with the added use of web components. Web
components attach behavior and styling to your elements, and can even allow
the browser to interpret new elements for you that inherit from other elements
or components. This means that `<div class="the-actual-element">` goes away,
and we are left with a tag name that actually tells you what kind of content
there is inside, class names that specify "mixins" to load in styles or event
bindings for multiple elements, and the content of the element can be styled
in any way.
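
As a concrete sketch of the above, the Custom Elements API already lets you
register a tag whose name describes its content instead of hanging everything
off `<div class="...">`. The element name and styling here are invented for
illustration; only the API calls (`customElements.define`, `HTMLElement`,
`connectedCallback`) are real:

```html
<!-- Hypothetical element name; registered names must contain a hyphen -->
<article-preview>This is a preview of the article content.</article-preview>

<script>
  // Attach behaviour and styling to the tag itself
  class ArticlePreview extends HTMLElement {
    connectedCallback() {
      this.style.fontStyle = "italic"; // placeholder styling
    }
  }
  customElements.define("article-preview", ArticlePreview);
</script>
```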

All that aside, I still wonder if HTML is the "right" language for the job.
Sure, it's in use everywhere, but why? Could we make something better and
lighter for browsers to interpret, so that the same document could be parsed
quickly by an API client of some kind? I really like the significant
whitespace of Haml, and especially Slim's DSL which lets you just define new
elements without prefixing them with '%'. Although Haml is pretty sweet, a
language looking like Slim would clearly be less annoying to type out given
most of your elements are totally custom. What I'd really like is a language
which combines Slim for document layout and Markdown for copywriting:

    
    
      article#the-name-of-my-article
        title
          The name of my article
        preview
          This is a preview of the article content. It's parsed with _Markdown_.
        content
          This is the *actual* article content, parsed with **Markdown** as well.
          
          - [A link](http://news.ycombinator.com)
        time.localized
          2000-01-01T00:00:00Z
        category
          general bullshit

------
jimktrains2
I've written a series of articles on some of the changes I'd like to see.

[http://jimkeener.com/posts/alt-login](http://jimkeener.com/posts/alt-login)

[http://jimkeener.com/posts/barcode-html-
tag](http://jimkeener.com/posts/barcode-html-tag)

[http://jimkeener.com/posts/http](http://jimkeener.com/posts/http)

I also created a git repo to help merge and coordinate ideas. Contributions
are always welcome.

[https://github.com/jimktrains/http_ng](https://github.com/jimktrains/http_ng)

------
popee
The web I want is a merge of SIP and rendering engines for HTML5/CSS. SIP is
federated and decentralized but really similar to HTTP. The problem is the
difference between URLs and URIs, because URIs are hosted/registered directly
on user computers/agents, so for example it's not possible to take advantage
of JavaScript/CSS. If someone could hack this, the transition would be really
smooth.

But we already have WebRTC, so why SIP? It's standardized, mature, and a more
basic protocol. Of course WebRTC is based on SIP, but with who knows how many
additional layers.

~~~
pjc50
SIP? SIP is _terrible_, it's an attempt at replicating the telco model of a
separate control plane vs. talk channel from a traditional telephone company.

 _URIs are hosted/registered directly on user's computer, so for example it's
not possible to take advantage of javascript/css_

This doesn't follow either. If I host a webpage on my personal computer, I can
host the js/css in exactly the same way.

~~~
popee
> SIP? SIP is terrible, it's an attempt at replicating the telco model of
> separate control plane vs talk channel of a traditional telephone company.

Good point and I agree, but it solves the problem of direct communication
between users (NAT traversal). It would be great if SIP had a plain and simple
GET method (without the need for a session/dialog).

> This doesn't follow either. If I host a webpage on my personal computer, I
> can host the js/css in exactly the same way.

How can I get your js/css from outside, if you are behind NAT? STUN? NAT
traversal is already part of standard SIP infrastructure.

Sorry for the stubbornness, just trying to brainstorm a little.

~~~
pjc50
STUN is indeed used by SIP to traverse NAT, as part of a whole pile of
techniques called ICE. However, this ultimately requires that you have a
central server on which you can "advertise" your SIP presence and exchange SDP
data. Yes, the session payload transport (RTP) is peer to peer, but the
signalling still requires a central point.

> How can I get your js/css from outside, if you are behind NAT?

The same way you get the HTML, whatever approach is used.

I don't think it's impossible to build a system involving distributing
"advertisements" saying how to reach systems on a distributed hash table, then
using either ports opened through UPNP on home routers or a "supernode"
collection of STUN servers to communicate to home systems. There are three big
downsides to this which are probably why we've not seen one yet:

- no way to monetise it if it's truly distributed and open source => reduced
incentive to build and maintain

- no central party to fight spam and abuse => system will drown in spam and
abuse

- necessarily broadcasts an association between home IP address and person =>
vulnerable to privacy problems and DoS attacks

------
revelation
Let's start with disallowing loading 3rd party resources of any kind. That was
always a terrible idea.
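
Browsers can already approximate this per-site via Content-Security-Policy; a
minimal sketch of a response header that blocks all cross-origin loads (it
also blocks inline scripts, as a side effect of `'self'`):

```http
Content-Security-Policy: default-src 'self'
```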

------
ExpiredLink
The Web is a huge disappointment. It not only provides the technical
infrastructure but also the data for _unprecedented_ social control and mass
surveillance. It's a delusion to think that anyone can influence the Web's
advancement. It's already clear cut.

~~~
pastycrinkles
No kidding :/ If anything, there should really be a movement to get people to
migrate off of it more and more.

