I made twenty of them, in fact. Each of the 20 webpages belonged to a Computer Science professor in the CS department I was in. I got paid a grand total of $800 to do that. Of that, I paid $200 in rent to my landlord for my room and utilities. My monthly groceries were always under $100. The remaining $500 I diligently sent home to my poor mom & dad back in India.
Of the 20 professors, one (two?) has died. A few have retired. Some have quit academia for industry. The rest have replaced my web0 page with the latest & greatest web2, web2++ pages.
But one stubborn professor continues to hang on to the web0 page I made for him!!! When I made that page, he was an assistant professor. He went on to become associate professor, then full professor, then Dean! Here is the web0 page I made for him 25 years ago - https://www.unf.edu/~wkloster/
The domain was registered on 1997-05-28. In that same year the young entrepreneurs won a Business Challenge by the Rabobank for their innovative business model.
Mine was a Nintendo 64 fan site where I reviewed N64 and eventually Playstation 1 games. Those were the days.
Hm. What happened? Did someone realize that there was traffic on that page and take it down?
One thing that would at least help: we got the web fractured from a p2p two-way street into a broadcast medium because ISPs were able to use the exhaustion of the IPv4 address space to limit stable IPs to business-class connections. I can't claim we can bring that back, but wide deployment of IPv6 would at least demolish the argument against consumer static IPs and open up the network as a fair flat graph again.
I'm dreaming of an alternate universe where you read your friends' microblogs through RSS, with the client and timeline algorithm of your choice. Everyone has their own mail server, with proper open-source spam filters. Open source is a public service, receiving donations from governments. We were so close to this.
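The "client and timeline algorithm of your choice" part really is that simple, by the way. Here's a toy sketch using nothing but the Python standard library; the feed XML is a made-up example, not any real service:

```python
# Minimal RSS "timeline" sketch: merge feed items, newest first.
# The feed below is invented for illustration only.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>A friend's microblog</title>
    <item><title>Post one</title><pubDate>Mon, 03 Jan 2022 10:00:00 GMT</pubDate></item>
    <item><title>Post two</title><pubDate>Tue, 04 Jan 2022 09:00:00 GMT</pubDate></item>
  </channel>
</rss>"""

def timeline(feeds):
    """Merge items from several RSS feeds, newest first; the 'timeline
    algorithm of your choice' is just this sort key."""
    items = []
    for xml in feeds:
        for item in ET.fromstring(xml).iter("item"):
            items.append((parsedate_to_datetime(item.findtext("pubDate")),
                          item.findtext("title")))
    return [title for _, title in sorted(items, reverse=True)]

print(timeline([FEED]))  # -> ['Post two', 'Post one']
```

A real client would fetch the feeds over HTTP and handle Atom too, but the point stands: the format is open, so the algorithm is yours.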
I don't know why you think the world was ever close to this.
The problem is psychological, not technical.
While the US has a plethora of laundromats and buildings get dedicated laundry rooms, in Europe (at least), even the tiniest apartments in apartment buildings have their own washing machine.
While there is some overhead (if it breaks, you have to find someone to fix it; you need to clean that filter...), people have happily accepted the decentralization since the benefits outweigh the overhead.
The same can be done with any other tech if we focus on streamlining it.
People in the US don't use communal laundry because they want to; they use it because they most likely have to, either because they are in a large urban center or because they can't afford the appliance.
This is basically the reverse of the internet centralization/decentralization example, where centralized services on the internet are much easier to use than their decentralized counterparts.
The tech and its benefits don't mean anything if there is too much friction before the user realizes the benefits, and it doesn't take much friction to dissuade people.
See these tweets from a founder who built a social app startup and sold it to Facebook for 8 figures:
Consumers won't adopt a decentralized solution because there is way too much friction. The only reason people use crypto is because there is a monetary incentive, and even then it's pretty damn fringe.
This comment explains it beautifully: https://news.ycombinator.com/item?id=29709446.
I don't see why any of that should be impossible; it's just that most decentralized stuff is free software that was never focused on the starting-up flow and UX.
From a user point of view, RSS readers are easy to come by.
From a server point of view, I think a Raspberry Pi build that scripts most of the setup is an achievable goal. Not an easy one, but doable.
Sadly, even if you made it super easy, most folks would stick with the big names. Still, it could be nice to start a little community of self hosted sites...
Yes! I'm thinking something like standardizing Docker deployments with nginx-proxy (which takes care of automatic Let's Encrypt certificates, it Just Works™). If everyone shipped a docker-compose.yml tailored to nginx-proxy this would make it very easy for anyone to deploy stuff on their home server.
Then you'd need a standardized interactive install script that asks for things like hostname, email details, whatever else needed in .env (or whatever config). Perhaps a good ol' Makefile?
  make setup      # interactive install
  make up         # run docker-compose up -d
  make down       # run docker-compose down
  make uninstall  # uninstall, plus clean up volumes and images

This is how I deploy all my Docker stuff; it usually takes 1-5 minutes to modify docker-compose.yml to fit.
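To make that concrete, a minimal docker-compose.yml tailored to nginx-proxy might look roughly like this (a sketch: the service name, image, and domain are placeholders; VIRTUAL_HOST and LETSENCRYPT_HOST are the environment variables nginx-proxy and its acme-companion actually key off, assuming the proxy itself is already running on a shared network):

```
# Hypothetical app served through nginx-proxy + acme-companion.
# "myapp", the image name, and example.com are placeholders.
version: "3"
services:
  myapp:
    image: myorg/myapp:latest
    restart: unless-stopped
    environment:
      VIRTUAL_HOST: app.example.com        # nginx-proxy routes by this
      LETSENCRYPT_HOST: app.example.com    # acme-companion issues the cert
      LETSENCRYPT_EMAIL: you@example.com
    networks:
      - proxy                              # network shared with nginx-proxy

networks:
  proxy:
    external: true
```

With a convention like this, `make setup` would mostly just fill in the two hostnames and the email address.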
Happy maybe, but large no.
But solutions do exist: open-source focusing on UI, standardisation, free static IPs, more open operating systems, etc
Why would people ever want to manage all of those things? A large part of the value proposition of Gmail and Facebook is that they are easy to use.
Also, this: https://twitter.com/nikitabier/status/1369098642171162625
Consumers don't want to do it. They're tired. They want to sleep. They want to watch TV or Netflix or TikTok or YouTube or listen to Spotify. They want to turn off their brains in most cases. Maybe spend time with their kids. Or go to the grocery store, or football practice, or holiday shopping, or browse Amazon or eBay, or fret about bills, or prepare their taxes, or fix the sink, or mow the lawn, or play with the dog.
They want someone else to be responsible for the thing. They want a central entity to blame, or complain to, and to pay to take it all off their plate. They just want to send Netflix $14 / month and click a button and never have to think about running their own systems. That's what they want. That's what they will always want, so long as the price is even slightly reasonable.
No amount of wishing by techies is going to change that reality.
They want to be able to walk away from Facebook or Instagram or YouTube for three weeks or three months and not worry about whether the server ate itself, or needs updates, or a new power supply, or has a critical security vulnerability - those things all become the consumer's responsibility. If they use Facebook they can shake their fist at Zuckerberg on the rare occasion the service is down; he's to blame for X Y Z; and it costs them nothing whether they use it once per day or once per year (yeah but privacy, they're the product; right, well, step into the average consumer's mindset and out of your mindset).
The majority of consumers could change their own oil too. They don't want to because it's a big hassle to them. They don't even want to sew their own clothes, imagine. Bring back decentralized clothing.
I'm endlessly baffled by the inability of most techies to put themselves into the shoes of the average consumer and just get it, get their point of view, grasp their mindset. It's extraordinarily easy to do. Just try it sometime, with practice it gets easy, I promise. No matter how many decades go by, none of this changes, and yet the techies persist with the decentralized fantasies. One day human nature will change, just you wait!
Look, I heard that in the US some people even have several jobs. I suppose by the end of your second or third shift of the day, the last thing you want to do is a log4j-patch-type upgrade... doubly so when it's Christmas and the kids are noisy.
I so far trust my Ubuntu LTS package archive to give me sufficient security updates for stuff I install from main, and this is the reason I avoid non-main software, but even Ubuntu is moving away from this model to snaps and such.
You overestimate the time available to people, and that's why Google can shove its ads down our throats in exchange for a huge time gain for everyone. Fix that first, then tell people to update their mail daemon.
I am not saying that even the LTS approach is sufficient, but that we could work on improving it so it's trivial for anyone to do (people using laundromats probably have no idea what a "filter" is in a washer either; that's not a barrier to getting one, though).
Yes, there are a few centralized components still, such as DNS, but even that is heavily cached and distributed in nature.
Blockchain could aid in replacing BGP & DNS, allowing for a better web. But web3 is just as centralized as our current web, if not more so. For example, Abracadabra Money gets current crypto prices from a "price oracle", which is a consortium of 16 parties who state what they feel the price should be. I wouldn't call that "decentralized".
Like a number of other people here, I primarily associate blockchain/crypto stuff with tech-bro huckster types, but the underlying tech does seem to have some interesting properties. Anybody remember Publius? https://en.m.wikipedia.org/wiki/Publius_(publishing_system)
These are two technologies that are being conflated because in theory crypto can be used to pay for IPFS.
Having a kind of literally global GUID that can point to arbitrary decentralized content is fantastic. While the credit for it absolutely goes to IPFS rather than to cryptocurrencies, it does happen to be perfectly suited for any kind of application you may try to build on a cryptocurrency blockchain.
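The "global GUID" property is just content addressing: the address is derived from the content, so the same bytes get the same address no matter who stores them. A toy sketch (real IPFS CIDs use multihash/CIDv1 encoding, not this; the `toy:` prefix is invented here):

```python
# Toy content addressing: address = hash of the content itself.
# NOT a real IPFS CID -- just the underlying idea.
import hashlib

def toy_address(content: bytes) -> str:
    return "toy:" + hashlib.sha256(content).hexdigest()

a = toy_address(b"hello decentralized world")
b = toy_address(b"hello decentralized world")
c = toy_address(b"hello centralized world")
assert a == b  # identical content -> identical address, on any node
assert a != c  # different content -> different address
```

That is also why such addresses slot neatly into blockchains: a 32-byte digest is cheap to store on-chain while the content lives elsewhere.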
Unless you're using an IP address for your website, your domain name is registered through a centralized registrar. The DNS architecture/protocol itself is centralized, as it is controlled by ICANN.
There are tons of registrars and DNS is even more decentralized — it’s not like .ru has the same operators as .us or .aws.
The key question is how many parties can limit you. The DNS roots are stable and widely cached, so you don’t have bottlenecks based on those. Once you own a record in your hierarchy of choice, you can do whatever you want with it without needing to involve the roots in any way.
What happens, however, is that people voluntarily share infrastructure because it’s cheaper and easier for them to let Google operate their email, AWS their servers, etc. That’s different in a very key way: if you don’t like how it’s going, you can easily leave at any time. The centralization is weak — lose that competitive edge and the exodus begins.
The same thing has already happened in the blockchain world: theoretically you can be your own bank, IdP, etc. but in practice most people prefer to let other people manage those functions for them. Control of the blockchains, oracles, etc. similarly has a relatively small number of parties involved and it would be harder to change them than updating DNS records.
But those registrars only exist at the discretion of a single centralized authority (ICANN)
It doesn’t matter which registrar you use; ICANN can always take away your domain name and give it to someone else. No laws apply to protect your domain; it is all subject to ICANN and their decision. They have taken away tens of thousands of registered domain names.
Alternatively there is no centralized authority that can take away a decentralized domain name such as ENS.
ENS is based on Ethereum, which is controlled by a smaller number of people and has already had one case where they rewrote history when it was financially advantageous. There is no reason to think that in a hypothetical future where ENS came to matter to anyone the same pressures wouldn’t apply because the critical flaw for ENS is the same one preventing popular use of blockchains in general: there’s no way to recover from mistakes or abuse. This is not an idle concern: domain theft happens regularly and there’s a process for recovering your domain:
If ENS became popular there’s no chance that the same problem wouldn’t occur, and the value of a domain and risk to the public means that people would not accept the answer that the attacker won for all time. Something would happen - whether that’s the Ethereum developers rewriting history again, a global block or override list, or something else.
As a simple example, imagine that the Chinese government got the keys for the Taiwanese government in a hack, or someone stole the keys for Amazon. Do you think that the ENS community’s response would be to say there’s nothing to be done and Amazon should rebrand, or that anyone would continue to use ENS if they did?
All true, but doesn’t make ICANN not centralized.
>ENS is based on Ethereum…and has already had one case where they rewrote history when it was financially advantageous.
There is a difference between rewritten and forked. Nothing was rewritten, both chains exist. Similarly there is no centralized authority that prevents an individual or group from forking the blockchain, nor any central authority forcing users to the new fork.
Alternatively, you won’t have much success attempting to “fork” your own .com TLD because of conflict with DNS. Realistically even ENS potentially faces this risk with .eth conflicting with DNS at some point.
But as it relates to ENS on the EVM: if you are suggesting Amazon could in any way force the takeover of Amazon.eth from another owner on the blockchain, that is not possible; not even the ENS DAO has that power. This may be a result of your misunderstanding that Ethereum was previously “rewritten”.
Local laws do not apply; there is no independent or impartial court. It is entirely centralized decision-making by ICANN whether they will take away your domain name and give it to someone else. It is exactly the same as when Twitter recently took away the @metaverse handle from an active user and gave it to Facebook. At least in the case of Twitter, when faced with user backlash and public support, they reversed their decision and took @metaverse back from Facebook and returned it to the user. In practice ICANN is even more centralized than Twitter, because ICANN isn’t accountable to its users.
It’s clear you don’t know how ICANN works, and are entirely out of your depth wrt the definition of de/centralized. You likely also have a conflict of interest.
ICANN can do that for .com domains, because that's their TLD. They cannot do it for the TLDs they've already sold/aren't responsible for.
You can even purchase your own TLD from ICANN if you've got the money and create whatever domain names you want.
Maybe stop talking about stuff you've no clue about; otherwise you'll forever be a Dunning-Kruger person.
It is not at all unlike Twitter taking away the handle @metaverse from a user and giving it to Facebook; it was only as a result of user backlash and public support that Twitter reversed its decision.
What’s your point? My point is that the internet is decentralized but many websites use centralized services as there’s some value in doing so.
On the contrary, addresses and phone numbers should be like domain names. It sure would be great if I could just update my mailing/physical address once when I move.
The point I was making is that it's assumed you can get a static phone number for your house, but most of the public don't know how to get a static IP address for their house (and it's often not even possible to do, depending on ISP).
As long as you only ever give out your PObox address (domain) instead of your real address (IP), you are good.
Anyone that actively wants to can fairly easily obtain a stable IPv4 IP, or even IPv6, it’s not that it isn’t available, it’s that nobody uses it.
I’m sure my home connection has a stable IPv6 address, but I honestly wouldn’t be able to notice from the name whether it had changed or not, too many friggin characters.
1) backend running as code on a blockchain spread across the nodes instead of a single provider
2) user accounts are nothing more than a wallet address and completely controlled by the end user
3) transactions and ownership of tokens between addresses are controlled by users without any middleman control, approval, or censorship.
Web3 still runs on all the same internet architecture as the rest of the web. It's still TCP/IP underneath, and browsers with JS providing frontends.
People either forget, or weren’t around, when domain names were thought of and treated much like NFTs are today. There was a small group of early adopters, but for the most part big business considered websites/domain names something akin to digital Monopoly money: just a plaything with no relation to the real world.
In the early days even the law and courts didn’t know how to treat claims over domain names. Now ICANN dictates DNS rules worldwide and streamlines the process for businesses to take domains according to ICANN rules.
There are many examples of new decentralized architecture and protocols being built for web3 and the decentralized web, like:
IPFS: a p2p alternative to https
ENS: a decentralized domain name protocol as an alternative to DNS
Also, an NFT is a token standard. Sure, it can be used as a record of ownership of a JPEG, but your framing, as if a “claim on a JPEG” is all NFTs are, shows either a lack of understanding or the common Luddite attitude toward the technology. My NFTs work perfectly fine as domain names, far from a JPEG.
We have nice things. We have nice abstractions already. We should be working on paring down and optimizing the good stuff, the good bits that get stuff done, not ramping up new shiny buttons for each and every edge case out there just because we need to stay in the loop (whatever this loop is).
If you look at what we actually do with the web, it's the same thing we were doing 20 years ago. Click on links, look at pictures of cats, chat with your friends on message boards and instant messenger, send e-mail, buy the occasional thing online, read the news, play games. If you changed none of the software, the result would be the same, but we would have saved a couple trillion dollars (or, alternately, not caused people to fork out a couple trillion dollars).
The only functional change today is much more streaming video, and instead of writing blogs, everyone's recording podcasts (which we had back then as "streaming audio" using IceCast).
JS is the most accessible and distributable code someone can write. It is accompanied for free by a portable, OS-independent UI library with application-tier capabilities that requires no user friction to download.
OF COURSE this construct comes with a ton of bullshit and noob reinventions.
But you, as a software engineer, should have the skillset to pick some strong set of libraries and stand behind it. For me, I use React, React-Router-Dom v5, MobX, TypeScript, and roll my own for everything else. It works well, my code from 2017 that runs without updates is easy to port if I want to because it uses the same underlying libraries, and the browser platform is fairly stable overall.
How does it upset you that you can also do some things in a lower friction way capitalizing on new language features? The old ones didn't go anywhere. The old libraries didn't stop working. Nothing fundamentally changed, some idiots just bolted on some new stuff and some geniuses carefully architected some other new stuff. Why is velocity something to rail against in and of itself? It impacts you exactly how much you let it.
Not that most other languages are inherently better, but some have a way of mitigating the problem slightly, like e.g. Python's extensive standard library.
I had this moment where I thought, "Well, I need TypeScript, because I want nicely typed interfaces and contracts, and I need React." Turns out it's not easy like that. To cobble everything together you need a proper template for webpack. And there's Babel, which also needs to be caressed in the right React-y way. You can't just "hop in". You have to use cookiecutter-style tools to bootstrap, because the whole thing has gotten so complex that even a Hello World project needs a mind-boggling number of tools, all orchestrated and plugged into each other in the right way, or it all blows up in your face. Also, there is more than one cookiecutter-style tool, and you have to choose the right one. "But don't fret, you can always run eject to see the whole machinery inside."
It's saddening and tiring.
I would tell the exact same story if I were to decide to use Python as a backend, just with different technologies. Maybe you just tried to do too much too quickly. I can build a Python script, just like you can probably build a JS script; it doesn't mean that I can just jump into Django head first like it was nothing. You can't just "hop in"; you need to learn each step, and that's fine. The number of times I thought it was as simple as a "pip install"... I lost quite a bit of time in various CTFs while trying to use some Python scripts. You can also stop at any step, and that's fine too. I know that HN is good at "THIS IS THE ONLY RIGHT WAY" (even this thread is filled with people arguing that "web0" is the "right way"), but you know what, the right way is the one that works for you, even if it means that some page will take more time to load, or takes more data transfer, or is missing interactivity, or has a less optimal UX, or can be done faster in another way, or any other downside.
Manually setting up React (with or without TS) isn't all that hard, either.
Manually setting up React isn't all that hard, but it's extra friction you don't need when learning to code React (and probably 999 out of 1000 times using it in anger, either.)
So, yeah, using CRA or another template is generally a good idea.
> Also, there are more than one cookiecutter-styled tools and you have to choose the right one
No, you really don't.
Your first mistake is trying to add typing on a dynamic language that works just fine without types. You won’t really learn the underlying language that well that way, and you therefore won’t have a good understanding of it.
TypeScript makes sense in large projects, with many developers where you need to catch a class of problems that compile time type checking can catch. It doesn’t offer a lot or add much in the way of actually learning the fundamentals of frontend development and is a distraction when viewed through that lens.
This is honestly a very easy set of questions to answer today.
React-Query: You don't need to manage state, let your server manage state just like you do in typical non-js heavy apps.
I recently built a dashboard/reporting site for Neo4j that's completely config driven (delivered from the backend), which can refetch the config (with a gmail style "Settings have changed - Refresh" notification) and have only changed parts of the site update. It has tables with resizeable columns, filtering, search, ordering, virtualization, server side fetch, etc. It has progress charts, bar charts, counts, markdown rendering, json schema rendering. It has inputs that autocomplete, with single or multi-select. It's fully responsive, and works perfectly on desktops, tablets and phones.
If I had tried to build this three years ago, I wouldn't have been able to do a fraction of these things. So, rant away, but the frontend community is amazing and they deserve way more credit than they get.
Or do you mean that you have to suffer like when you visit websites, like cnn.com: https://i.imgur.com/VAZMUx8.png
Web 1.0: HTML
Be very sceptical, but don't ignore the whole thing, or you might end up missing the next real wave of innovation in tech (which is what I thought HN was about).
Look it up yourself, or don't.
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="My first web0 website!">
<meta name="twitter:description" content="">
<meta name="twitter:image" content="https://elliott.computer/pages/web0/social.jpg">
Web3 is not here lol, it’s literally nowhere. It’s a few Ponzi schemes grafted on top of some MySQL stores like OpenSea and Coinbase. This does not a revolution make.
> that incorporates decentralization based on blockchains
In my mind, Web 3.0 was synonymous with the Semantic Web.
After all, it very much feels like a revolution with the amount of people saying it'll never work.
Cryptocurrency culture fosters dishonesty: billions of dollars have been poured in by people who think it’s an investment they’ll get a large return from. The only way they profit is from other people putting real money in, which means everyone involved has to be a marketer.
That’s why crappy generative artwork will get talked up ad nauseam, people will wax rhapsodic about logging in using a third-party service, etc.: the people saying that know they will lose money if the rest of us don’t agree that their random hashes are worth more than they put in.
Web 1 and 2 were almost immediately popular because they had real things you could do better than the alternatives. The blockchain financial systems being marketed as “web3” aren’t here in any meaningful sense: if you aren’t interested in cryptocurrency speculation, there’s very little reason to care about it (this is staggeringly unlike the earlier web), and a micro-transaction system for rich people with more centralized infrastructure is a big reversal — especially since exactly none of it works in any way without using the real web.
A year from now there will be a new buzzword to try to take money away from the get rich quick crowd.
> People keep asking what Web 3.0 is. I think maybe when you've got an overlay of scalable vector graphics – everything rippling and folding and looking misty – on Web 2.0 and access to a semantic Web integrated across a huge space of data, you'll have access to an unbelievable data resource… — Tim Berners-Lee, 2006
That's funny, he didn't mention blockchains once.
Semantic Web 3.0 and blockchain web3 stuff both suffer from at least one shared problem* -- the internet is a living, evolving system. Web 2.0 was labeled in retrospect to describe a general but clearly visible change in the way websites were built. It's much easier to predict something that has already happened and which everyone can see.
The only prediction I think that we can make about the internet is that it will defy any predictions that we make about it.
*Not to say this is the only problem either had -- for example, web3 also suffers from a wide range of credibility issues, IMO.
Can you post a link to a place where I can interact with some web3 content directly then?
Not an article about what web3 will be, but something I can play with? Back in the early web 2.0 this is pretty easy, just point to twitter, myspace, stumble upon etc.
I'm genuinely curious what web3 really even looks like since all the pages that talk about it mostly sound like marketing pitches without a single real thing I can interact with.
Starting to figure out Clarity.so, which is a project management tool kind of like Notion but with some web3 features. Haven't done too much with it yet, but an organization I'm working with is using it.
Metamask is a browser wallet that also serves as your login and identity for these sites.
If you want to learn some code and concepts in smart contracts in a really friendly way (while providing a lot of good information), you can go through the tutorials at https://cryptozombies.io/, which works like FreeCodeCamp.
Smart contract code has some similarities, but there are also a few things you need to do differently because of the nature of the blockchain. With random number generation you have to be careful, since your code is public and people could potentially game it unless you do it a certain way. Function visibility is very important, because it might allow other apps to call your contract, and/or certain types might cost gas (i.e. tokens) when you don't need them to. A smart contract is immutable once it's published, so you need to be very careful to get it right, or you will have no choice but to point people to a different contract (and even then the old contract could still do some damage). Also, string comparison is done by taking a hash of each string and comparing those to see if they're equal. Just a few examples of many.
So it actually feels like I'm learning something different, and not just "okay, what do I need to call/integrate to add authentication on this new platform" for the tenth time, or "Oh, the coordinates for drawing to the screen start from the top left instead of the bottom left for this tool, good to know."
I mean it's not as different as learning something like Prolog, but still fairly different.
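For anyone curious, the string-comparison gotcha mentioned above can be sketched outside of Solidity. In Solidity it is typically written as `keccak256(abi.encodePacked(a)) == keccak256(abi.encodePacked(b))`; this Python sketch mimics the pattern with the standard library's SHA3-256 (note that Ethereum's keccak256 is the pre-standardization Keccak, so the digests differ from NIST SHA3, but the compare-the-hashes idea is the same):

```python
# Compare strings the way Solidity contracts usually do: hash each
# string and compare the fixed-size digests, rather than comparing
# variable-length bytes directly.
import hashlib

def hash_equal(a: str, b: str) -> bool:
    # Stand-in for keccak256 comparison; Python's sha3_256 is NIST SHA3,
    # not Ethereum's Keccak, but the pattern is identical.
    return (hashlib.sha3_256(a.encode()).digest()
            == hashlib.sha3_256(b.encode()).digest())

assert hash_equal("web3", "web3")
assert not hash_equal("web3", "web0")
```

The reason for the pattern is that Solidity has no built-in equality operator for dynamically sized strings, while comparing two 32-byte digests is cheap and simple.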
Is this website hosted or somehow powered by web3? Does it store its data in a blockchain or a traditional MySQL/postgres/etc db? Are its user accounts nothing but wallet addresses and entirely managed and controlled by the wallet owners? Is the computation, access control etc done using smart contracts and resilient to dropping their servers?
These are honest questions I really don't know the answer to, but based on other posts in this thread this would all be required to qualify as a proper web3 example.
If OpenSea doesn't qualify as web3, I don't know what would.
They have every right to come to the table at the W3C. The federated social web has existed since 2008. It is the real web3.
I don't remember if self-closing tags like <br /> were possible in the HTML 2 days.
But with that said, I do find it interesting how many Web 3 things are just more complicated ways to solve problems already solved in the 80s and 90s.
I remember when I first heard "Web 2.0", in a conference talk by Tim O'Reilly. I wasn't a fan, but it became a convenient way to describe a new web driven by user-generated and dynamic content. With the help of this new thing called Ajax, and mirror effects: lots and lots of mirror effects on graphics, and 12-pointed stars.
Web 3 is not an evolution of the web but rather a side show. A side show that will definitely have some useful things come of it but it lacks the spirit of Web 2.
Let's not muddy the waters even more. "Web" refers to the Worldwide Web created by Tim Berners-Lee. Gopher and Usenet don't count.
But also, to my sort-of point. Web 1 was (among other things like hypertext and some dynamic content via PHP/ASP/Perl/C via CGI-bin/etc) static plain HTML pages. This linked page is most certainly not "Web 0".
Yes. "Web 3" despite the name, is not the web, it's an application on the internet. The web is also an application on the internet, separate from Web3 (and Gopher and FTP and everything else.)
>But also, to my sort-of point. Web 1 was static plain HTML pages. This linked page is most certainly not "Web 0".
I guess we'll have to agree to disagree on this, since it's just a semantic argument. To me, Web 0 was static plain HTML. Web 1 was possibly the introduction of CSS and JS. Web 2 was AJAX and dynamic websites.
And my hot take on Web 3, not that it matters, is that it's properly multilingual computation on the web. Specifically being able to run applications compiled to Webassembly, and containers for legacy native applications.
Web 2.0 was not coined by historians to describe something after the fact. It was a term that was being used by the people who were inventing it. Much like Web 3 in that regard.
CSS and JS both weren't widely supported cross-browser until well into the initial web rollout. Before Ajax we had "Dynamic HTML" and also some pretty innovative things with "long polling."
And contrary to popular belief, "Web 1" was not as dark ages as people think. Forms were part of the HTML spec literally a decade before Web 2. We had plenty of dynamic content. It just required a post to a server instead of being able to be refreshed dynamically.
PHP was part of the Web 1 legacy; it launched in 1995 and provided dynamic content. The first eCommerce sites predate "Web 2.0" by over a decade. I should know; I launched one of them. And before that you could write CGI programs in C. And before SQL databases became widely adopted for web applications, we had these things called "flat files" we could write to.
Even the very early web specs had POST as a method/verb. Literally, it is in the HTTP 1.0 RFC from 1996.
What differentiated Web 2 specifically was user generated content and dynamic refreshing of the content. And I was actually there, when the phrase was coined. It's not some distant memory for me or something I read about, I was one of the Web 2 early pioneers.
Unless you were on ARPANET you did not use "Web 0"... "Web 0" is not a thing I have ever seen used in my entire 28 year career until this Hacker News post. You can't just rewrite history... there are people alive today who actually lived it.
You can disagree that Usenet is a predecessor to the WWW but you can't redefine Web 1 as Web 0. To anyone who lived it, that is absurd (and slightly disrespectful to our legacy).
I'm not "rewriting history," and I never claimed "Web 0" was a term anyone actually used at any point in time. I was simply disagreeing with your definition of Web 0 upthread, which was itself disagreeing with OP's definition. I've never seen "web 0" used outside of this specific post either. But, like yourself, I was simply stating what I thought a more accurate taxonomy using that term would be. It was speculation. A thought-exercise.
Besides, as one of the great old-timers of the web you should prefer zero-indexing anyway.
It doesn't matter, in either case. I apologize. My frustration is not directed at you but rather the poster of this HN post who is claiming Plain HTML is "Web 0" -- it sounded at first (and second) read you were defending their definition of Web 0. I misread.
When you said "We agree to disagree" I thought you were taking the side of the poster of the "article" -- I agree Usenet is not "Web 0" -- Web 0 isn't a real thing. I was just offering a counterpoint that what came before "Web 1" was not "plain HTML."
Anyway, my apologies for escalating. It was not my intention. I was just venting my frustration over the past year at people redefining terms with no regard for history.
Anyway, have a pleasant day.
Of course, Big Tech would want you to use its browser (not browser_s_) and indulge in the insane complexity of "the web stack" which guarantees its monopoly, but it doesn't have to be that way for everyone. Maybe once more people realise the latter, the web can become mostly-browser-neutral again.
I guess that's a dig but I'm not sure why. It's resolvable via the ICANN root and is no more "nonstandard" than any of its siblings.
I like the attempt, though. I could go off on an old-man tangent about "kids these days don't even know what a fieldset is!" or whatever, but I very much appreciate the call to do things more simply.
these are fully standard.
Plenty of frontends still rely on jQuery which can easily be sprinkled into a frontend, and the API hasn't really changed in nearly 20 years.
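A sketch of that "sprinkled" approach; the CDN version and element IDs here are illustrative, not from the thread:

```html
<button id="toggle">Details</button>
<div id="panel">Extra info shown on demand.</div>

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
  // progressive enhancement: the page still renders fine without this script
  $(function () {
    $('#panel').hide();
    $('#toggle').on('click', function () {
      $('#panel').toggle();
    });
  });
</script>
```

No build step, no bundler: one script tag and a few lines, which is the appeal being described.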
I think they are advocating we stop trying to constantly make things that are complex, or can one day handle potential complexity, and try to make things more simple from the get-go.
Sure, not everything can be simple HTML, but anecdotally, I can see the value in advocating more simplicity (though not at a raw-HTML level). So much of what I need to work on and maintain is over-architected "just in case".
The number of support tickets that I've seen in the past from users who:
- Weren't able to see a button dead center on the page without a bit of animation
- Didn't realize there was a form error above without an auto-scroll
- Needed help text to be contextual so as to not overwhelm them
- Found dropdowns too unwieldy to navigate without a typeahead
Why though? Just make it simple and functional. No need to get complex.
> I would expect that if Banking portals, EHR's, CRM's (name any other service even moderately complex) were stitched together with a series of html documents and forms, customers wouldn't be as excited about that.
Why should customer excitement force us to create over-complicated solutions?
If you're going through a 40-page mortgage application, and an accidental page refresh loses everything you've filled in, that's not ideal.
If you're trying to catch an Uber, and the status isn't updated in real time, that's not ideal.
If you're a customer support agent, and incoming messages require a page refresh to see, that's not ideal.
So we need to make things complex enough to solve a user's problem, and no more. I think the difficulty we run into is that last bit.
It was fun, but I don't blame developers for trying to make things more composable and reusable. I'm big on tools that let you build plain static HTML pages, but rendered from templated components.
I did mine APoAC in a blizzard. Lots of dropped packets.
Like a lot of Gen X/Gen Y oldsters, I wrote my first web page in Notepad in 1995 or so and uploaded it to my ISP's hosting, after undergoing a "bootcamp" that consisted of reading W3C specs, Netscape.com tutorials, going on Usenet, and viewing source on a bunch of other people's pages, sitting up all night on my grandfather's Pentium desktop (because we dialed up to get online) and listening to Nine Inch Nails on headphones.
Someone in here points out that complexity produces friction that will drive consumers away. Good. Not all technology has to be a product you can pimp to a VC in an elevator. I love the idea of Web0, of building a new web that requires a bare minimum of engagement on the part of users rather than simply making everything as easy as possible.
There's more than enough brightly lit malls in the metaverse, where everything is just a funnel to catch your attention or your money. But there's also infinite space for simplicity and the digital equivalent of "roughing it", of applying the concepts of appropriate technology to online interaction. Viva la Web0. :-)
The website currently transfers 16.59KB and has the following:
- dark mode support
- social media display support
- basic, anonymous analytics using simpleanalytics
It's hosted for free using GitHub pages and a push to master immediately makes the changes available on the internet. Additionally GitHub manages all of the SSL certs so I don't even have to think about it. Compared to the SPAs I (and most WebDevs) write for work nowadays it was actually a breath of fresh air to take away so much!
 website: https://trentprynn.com/
 source: https://github.com/trentprynn/trentprynn.com
For most things, it's the way it should be.
Less really IS more
I deeply miss websites that -reliably- loaded faster than you could blink. The megatons of cruft that come with even the simplest page, the dozens of remote code callouts that show up in NoScript (and the mere fact that it's been prudent for years to run NoScript...), the slogging wait for pageloads... It's just a vast disappointment. And these awful load times and performance are on CAD-level laptops.
I left the industry 15+ years ago, seeing the writing on the wall and hoping I was wrong. Writing code had become like walking in quicksand: unable to trust the stacks of garbage underneath to behave as documented, with abstractions becoming increasingly purposeless. I'd hoped it'd get better, but as far as I can tell it was a good move, when something like this is merely quaint.
The hardware industry has produced astonishing improvements every decade, and the software industry just continues to squander those improvements at an even faster rate, producing an ever-degrading experience.
<center style="font-size: 18px">...
is it now web1?
this is the thing everyone adds because someone said it was the right way.
it's the "assumption" that no user ever wants to _zoom out_ on mobile sites.
it is wrong, and nobody questions it
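A guess at the tag being described here: the viewport meta tag that many templates ship with zoom disabled. A zoom-friendly version simply drops the restrictions:

```html
<!-- the commonly copy-pasted version: locks out pinch-zoom on mobile -->
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no">

<!-- zoom-friendly alternative: sizes to the device without disabling zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Modern mobile browsers increasingly ignore `user-scalable=no` for accessibility reasons, but leaving it out in the first place is the simpler fix.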
I do think we should be conservative about complexity in websites, but advocating for removing interactivity and styling (js/css) seems too idealistic and simply contrarian. So I definitely think this is a good conversation starter, but not a serious goal.
They're just super-comments nobody can reply directly to.
Is it good enough for some people? Absolutely. Does it even remotely meet the preferences of the average web user? No.
This must be the 100th iteration of the "motherf'ing website" manifesto I've seen. They all use default browser styling and system fonts. It's easy to marvel at how fast the site loads when the page doesn't even include so much as a JPG.
Is there something special about this page specifically e.g. not being hosted on a normal web server?
I bet if we allow this to happen, most of Web3's hosting will belong to a few, and they will finally have made us pay for using the web.
Of course, you also have things like gatsby/gridsome which are supposedly also "static site generators" but also ship ridiculous amounts of js to do ridiculous things like "hydrate" the client-side to a full spa.
Pandoc is probably my favorite tool for that job:
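A minimal invocation, as a sketch (the file names are placeholders, and this assumes pandoc is installed):

```shell
# turn a Markdown source into a single standalone HTML page
pandoc --standalone --metadata title="Home" index.md -o index.html
```

`--standalone` wraps the output in a full HTML document rather than emitting a fragment, which is what you want for a page you'll serve directly.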
It's delightfully old school, and writing bare, ugly HTML works okay for me; I can actually concentrate on the content instead of learning how to use and maintain a CMS.
It is my main homepage and very portable if I ever need to change hosts for example.
It's kinda like 2003 again for me, when I had a 100MHz AMD K5 box with 64MB RAM and a 1GB IDE HDD serving my homepage over my parents' ADSL connection.
Too bad bare html looks a bit bad on mobile.
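For what it's worth, the mobile problem can usually be fixed with two small additions; this is a sketch, and the values are just reasonable defaults, not anything prescribed:

```html
<!-- size the page to the device instead of a desktop-width default -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* keep line lengths readable on any screen */
  body { max-width: 40em; margin: 0 auto; padding: 0 1em; }
</style>
```

Still no frameworks, no build step, and the page stays a couple of hundred bytes heavier at most.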
That snippet is what it injects to opt you out of Google Analytics.
I do not see that snippet in the page source.
Whenever I get around to making a personal site again, it will probably be compiled to a static site, even if I end up using a static site generating tool to allow for some templating.
Get off my lawn.
A poem can be elegant and beautiful.
A blog post yearning for web0? That's a bit tougher to pull off when you compare it to modern web designs. That feels more like an imposter.
Also, are HTML comments technically allowed before the DOCTYPE? (this is the kind of tomfoolery that escapes unit testing on browsers)
Happy New Year!
"It's so cool it just works"?! OP sounds shocked that you don't need 50MB+ of scripts and CSS to serve a webpage.
edit: also, can we talk about unneeded bloat? Those 5kb CSS files add up; how many tons of coal are burned every year because of all of those CSS files being downloaded over the internet?
(Betting 60% yes, 40% no)
"When you speak, it is silent. When you are silent, it speaks" - Zen Koan
they're already including a favicon, might as well swap that for a CSS file if you're worried about GET requests or something
if you can set up a web server i am sure you could figure out enough to qualify for web 1.0 level complexity
if simplicity is the concern why not just host a text file? browsers format them nicely
Crypto bagholders need new marks.
Your first sentence goes my way, but then... I don't know.