I made my first web0 website today. It's so cool it just works (elliott.computer)
258 points by cookingoils 22 days ago | 253 comments

I made my first "professional" web0 page 25 years ago.

I made twenty of them, in fact. Each of the 20 webpages belonged to a Computer Science professor in the CS department I was in. I got paid a grand total of $800 to do that. Of that, I paid $200 in rent to my landlord for my room and utilities. My monthly groceries were always under $100. The remaining $500 I diligently sent home to my poor mom & dad back in India.

Of the 20 professors, one (or two?) has died. A few have retired. Some have quit academia for industry. The rest have replaced my web0 pages with the latest and greatest web2, web2++ pages.

But one stubborn professor continues to hang on to the web0 page I made for him!!! When I made that page, he was an assistant professor. He became associate professor, then full professor, then Dean! Here is the web0 page I made for him 25 years ago - https://www.unf.edu/~wkloster/

Mmm, those raised table borders. Timeless!

That's so beautiful I wanna cry! :D 25 years ago is exactly when I started hanging out on the internet, and that's how it all looked back then. That professor is awesome for keeping it. The most recent update seems to be from 2012.

What an awesome comment. The site brings back memories of the style of the times. But why are his publications in flames?

Placing animated gif flames next to newly added content was standard procedure back then. These flames do look more like a printer or photocopier on fire, but still, they belong. :)

An online shop for computer parts in the Netherlands left their old site online as a memento of the good old days: http://www.pcwebstore.nl/defaultoud.asp

The domain was registered on 1997-05-28. In that same year the young entrepreneurs won a Business Challenge by the Rabobank for their innovative business model.

My first website was from about the same time. If I had realized its significance, I would have kept it. You're lucky to have kept yours.

Mine was a Nintendo 64 fan site where I reviewed N64 and eventually Playstation 1 games. Those were the days.

haha that moment when something is so old it becomes cool again

Aaaaand it's gone! 404 for me.

Hm. What happened? Did someone realize that there was traffic on that page and take it down?

It's funny that web3 is being sold as the decentralized web when that's what the internet architecture was literally designed for -- to survive an apocalyptic nuclear war. However, big players used their ability to purchase infrastructure and unforeseen flaws in the protocols to centralize much of the activity on the web. Lol if you think money can't do that to any decentralized architecture.

One thing that would at least help: we got the web fractured from a p2p two-way street into a broadcast medium because ISPs were able to use the exhaustion of the IPv4 address space to limit stable IPs to business-class connections. I can't claim we can bring that back, but wide deployment of IPv6 would at least demolish the argument against consumer static IPs and open up the network as a fair, flat graph again.

I 100% agree. Imagine if self-hosting was taught in schools. Would Facebook and GMail be this big?

I'm dreaming of an alternate universe where you read your friends' microblogs through RSS, with the client and timeline algorithm of your choice. Everyone has their own mail server, with proper open-source spam filters. Open source is a public service, receiving donations from governments. We were so close to this.
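The RSS half of that universe really is close; the protocol is simple enough that a toy reader fits in a screenful. A minimal sketch in Python (stdlib only; the hardcoded sample feed stands in for a friend's site, which a real client would fetch over HTTP):

```python
import xml.etree.ElementTree as ET

# A tiny RSS 2.0 feed, inlined for illustration.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Alice's microblog</title>
    <item><title>Hello world</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (title, link) pairs for every item in an RSS feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_feed(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

A "timeline algorithm of your choice" is then just however you sort and merge the lists pulled from each friend's feed.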

No regular person would ever do any of the things in your alternate universe. Even most tech people wouldn't, because it's a massive pain in the ass.

I don't know why you think the world was ever close to this.

The obvious question is why should these things be a pain in the ass.

If you want everything to be decentralized it will always be a pain in the ass and have way more friction compared to a centralized alternative even if it's pretty easy in absolute terms.

The problem is psychological, not technical.

You could say the same thing about laundry.

While the US has a plethora of laundromats and buildings get dedicated laundry rooms, in Europe (at least), even the tiniest apartments in apartment buildings have their own washing machine.

While there is some overhead (if it breaks, you have to find someone to fix it; you need to clean that filter...), people have happily accepted the decentralization since the benefits outweigh the overhead.

Same can be done with any other tech if we focus on streamlining it.

In this case "decentralization" means drastically less overhead in terms of simply taking your dirty clothes to the other room instead of dealing with the hassle of traveling somewhere with it and potentially paying money per load and/or dealing with other tenants that might be using all the machines.

People in the US don't use communal laundry because they want to, they use it because they (most likely) have to because they are in a large urban center or can't afford the appliance.

This is basically the reverse of the internet centralization/decentralization example, where centralized services on the internet are much easier to use than their decentralized counterparts.

The fact you compare this to laundry means you don't really understand the problem. Adding any sort of friction to an onboarding flow severely reduces conversion.

The tech and its benefits don't mean anything if there is too much friction before the user realizes the benefits, and it doesn't take much friction to dissuade people.

See these tweets from a founder who built a social app startup and sold it to Facebook for 8 figures:

- https://twitter.com/nikitabier/status/1273437328866832384

- https://twitter.com/nikitabier/status/1339278829571854336

- https://twitter.com/nikitabier/status/1324354709201711104

Consumers won't adopt a decentralized solution because there is way too much friction. The only reason people use crypto is because there is a monetary incentive, and even then it's pretty damn fringe.

This comment explains it beautifully: https://news.ycombinator.com/item?id=29709446.

Uhm, but that's exactly what I am saying. We need to make the onboarding flow simpler for people to buy into decentralized solutions. There are technical hurdles in achieving that, but they are not insurmountable: it's just that nobody is working on them.

I don't see why any of that should be impossible; it's just that most decentralized stuff is free software that was never focused on the setup flow and UX.

Does it have to be that hard though?

From a user point of view, RSS readers are easy to come by.

From a server point of view, I think a raspberry pi build that scripted most of the setup, is an achievable goal. Not an easy one, but doable.

Sadly, even if you made it super easy, most folks would stick with the big names. Still, it could be nice to start a little community of self hosted sites...

> From a server point of view, I think a raspberry pi build that scripted most of the setup, is an achievable goal. Not an easy one, but doable.

Yes! I'm thinking something like standardizing Docker deployments with nginx-proxy[0] (which takes care of automatic Let's Encrypt certificates, it Just Works™[1]). If everyone shipped a docker-compose.yml tailored to nginx-proxy this would make it very easy for anyone to deploy stuff on their home server.

Then you'd need a standardized interactive install script that asks for things like hostname, email details, whatever else needed in .env (or whatever config). Perhaps a good ol' Makefile?

  make setup      # interactive install
  make up         # run docker-compose up -d
  make down       # run docker-compose down
  make uninstall  # uninstall, plus clean up volumes and images
People would just need to install Docker (which is trivial), learn how to git clone a repo, run those commands, then go to the URL and finish the setup (if relevant).
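As a rough sketch of what such a standardized compose file might look like (the service name and image are placeholders; VIRTUAL_HOST and the LETSENCRYPT_* variables are, to my understanding, what nginx-proxy and its Let's Encrypt companion container key off):

```yaml
version: "3"
services:
  app:
    image: example/some-self-hosted-app   # placeholder image
    restart: unless-stopped
    environment:
      # nginx-proxy routes requests for this hostname to the container
      - VIRTUAL_HOST=app.example.com
      # the acme companion provisions a certificate for it
      - LETSENCRYPT_HOST=app.example.com
      - LETSENCRYPT_EMAIL=you@example.com

networks:
  default:
    external: true
    name: nginx-proxy   # shared network the proxy container is attached to
```

The interactive `make setup` would then just be filling in the hostname and email lines above.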

[0] https://github.com/nginx-proxy/nginx-proxy

[1] This is how I deploy all my Docker stuff, it usually takes from 1-5 minutes to modify docker-compose.yml to fit

in most cases, the answer is “it’s a pain in the ass because of the huge stack of poorly written software full of security holes you need to open a TCP port and send HTTP”

I do all those things now, and so do many other people. There is a parallel universe just like this. It’s here now, and has a large and happy population. Not a pain in the ass, but a pleasure.

> It’s here now, and has a large and happy population.

Happy maybe, but large no.

Of course there's issues, that's why it didn't happen!

But solutions do exist: open-source focusing on UI, standardisation, free static IPs, more open operating systems, etc

You're completely misunderstanding what I'm saying. The people/product problems are far larger than the technical problems.

Why would people ever want to manage all of those things? A large part of the value proposition of Gmail and Facebook is that they are easy to use.

Also, this: https://twitter.com/nikitabier/status/1369098642171162625

The people problems are indeed larger than the technical ones, but I think that would have been a good thing in the "self hosting" world the parent commenter desired. People willing to put in the proper energy are likely those worth listening to. The "easy to use" nature you pointed out has brought everyone and their dog to the thunderdome, babbling their nonsense daily.

I have no use for Facebook, and Gmail is not easy to use because it doesn’t do what I want. I need real email.

That last part is only true if the fundamental feature set meets some minimum requirement for whatever they're trying to do. All these companies competing on polish have confused the issue (perhaps because meeting that minimum is often quite easy).

I entirely agree. But OSS maintainers could have product vision too. We just need the right incentives, like better business models.

It didn't happen because the masses of people absolutely do not want to do it. That will never change. This is why decentralized things will rarely do anything other than fail.

Consumers don't want to do it. They're tired. They want to sleep. They want to watch TV or Netflix or TikTok or YouTube or listen to Spotify. They want to turn off their brains in most cases. Maybe spend time with their kids. Or go to the grocery store, or football practice, or holiday shopping, or browse Amazon or eBay, or fret about bills, or prepare their taxes, or fix the sink, or mow the lawn, or play with the dog.

They want someone else to be responsible for the thing. They want a central entity to blame, or complain to, and to pay to take it all off their plate. They just want to send Netflix $14 / month and click a button and never have to think about running their own systems. That's what they want. That's what they will always want, so long as the price is even slightly reasonable.

No amount of wishing by techies is going to change that reality.

Nothing you mentioned alters the fundamental problem: consumers don't want to do it, they're never going to want to do it. They're also not going to suddenly want to do all of their computing by bash. They're not all going to run out and teach themselves JavaScript. They don't want to run their own email servers. They don't want to run their own media servers. They don't want to take total control of their privacy and run their own decentralized social software. Sure, there are exceptions to the rule, and that's all there will ever be, slivers of exceptions.

They want to be able to walk away from Facebook or Instagram or YouTube for three weeks or three months and not worry about whether the server ate itself, or needs updates, or a new power supply, or has a critical security vulnerability - those things all become the consumer's responsibility. If they use Facebook they can shake their fist at Zuckerberg on the rare occasion the service is down; he's to blame for X Y Z; and it costs them nothing whether they use it once per day or once per year (yeah but privacy, they're the product; right, well, step into the average consumer's mindset and out of your mindset).

The majority of consumers could change their own oil too. They don't want to because it's a big hassle to them. They don't even want to sew their own clothes, imagine. Bring back decentralized clothing.

I'm endlessly baffled by the inability of most techies to put themselves into the shoes of the average consumer and just get it, get their point of view, grasp their mindset. It's extraordinarily easy to do. Just try it sometime, with practice it gets easy, I promise. No matter how many decades go by, none of this changes, and yet the techies persist with the decentralized fantasies. One day human nature will change, just you wait!

Geocities. Tripod.

Geocities was the centralised easy alternative to hosting your own website, that was what made it popular.

Geocities was entirely static though. You could back up your FTP folder and move hosts overnight if you wanted to.

Plus most people had an identical copy locally, effectively working as a backup.

We were never close to this. You have to accept we need nurses, hairdressers, and farmers, and these people can't be expected to do the work required to maintain a critical email server on top of everything else they have to do.

Look, I hear that in the US some people even have several jobs. I suppose by the end of your second or third shift of the day, the last thing you want to do is a log4j-patch-type upgrade... doubly so when it's Christmas and the kids are noisy.

Does that not make it obvious that the problem is in the distribution of software? I run my own server and I don't want to care if any of the installed apps I run depend on log4j somewhere in the chain.

I so far trust my Ubuntu LTS package archive to give me sufficient security updates for stuff I install from main, and this is the reason I avoid non-main software, but even Ubuntu is moving away from this model to snaps and such.

Yes, but why Ubuntu LTS? Gmail is even better at it: you don't even need to learn what a package is, and you can focus on breaking your back 16 hours a day to pay your credit card debts.

You overestimate the time available to people, and that's why Google can shove its ads down our throats in exchange for a huge time gain for everyone. Fix that first, then tell people to update their mail daemon.

That's quite a difference: Gmail does not provide strictly security updates, but might change the entire UI/UX on a whim (or get cancelled: Google has infamously done that a lot).

I am not saying the LTS approach is sufficient either, but that we could work on improving it so it's trivial for anyone to do (people using laundromats probably have no idea what a "filter" is in a washer either; that's not a barrier to getting one, though).

You are dreaming about a "trust everybody" universe, which ceased to exist a long time ago. This is still possible in an isolated bubble, but our society is closer to "trust nobody" at this moment, so regardless of the technology used, your dream is just a dream. We were close to it only for a brief period of time, when the bubble was small and the society within it relatively homogeneous.

I’m sure they would be. You ship your org chart, so large organizations that move quickly will ship bulky applications. It’s very hard to make those applications smaller without slowing development considerably. Moreover, it’s not like there was a dearth of system administration experience at Google around the time GMail was conceived.

I read the GP's "big" as popular, not bloated.

Ah, I hadn’t considered this possibility. Thanks for pointing it out.

The spam problem is still pretty tricky, but this would be a good step. Owning the means of communication gives a lot of power to natural persons over corporate persons. Removing dealing with NAT and complications like that makes things a lot simpler even if it's not the be all and end all.

The web is fairly decentralized though. Google could go down and the web would still operate. Amazon could go down, and the web would still operate.

Yes, there are a few centralized components still, such as DNS, but even that is heavily cached and distributed in nature.

Blockchain could aid in replacing BGP and DNS, allowing for a better web. But web3 is just as centralized as our current web, if not more so. For example, Abracadabra Money gets current crypto prices from a "price oracle," which is a consortium of 16 parties who state what they feel the price should be. I wouldn't call that "decentralized."

If I’m understanding the key feature of blockchain-based stuff, it’s that it’s not just decentralized but distributed, right? Content is replicated across a whole lot of computers, rather than just sitting on one, and any modifications made along the way are a matter of public record, and recorded in a way that’s hard to tamper with.

Like a number of other people here, I primarily associate blockchain/crypto stuff with tech-bro huckster types, but the underlying tech does seem to have some interesting properties. Anybody remember Publius? https://en.m.wikipedia.org/wiki/Publius_(publishing_system)

This is NOT because of blockchain. This is because of IPFS.

These are two technologies that are being conflated because in theory crypto can be used to pay for IPFS.

They are conflated, but the correlation isn't just that one can be used to pay for the other. IPFS is increasingly becoming the "content layer" for blockchains, since it's not efficient to store content (like images) in (most) blockchains.

Having a kind of literally global GUID that can point to arbitrary decentralized content is fantastic. While the credit for it absolutely goes to IPFS rather than to cryptocurrencies, it does happen to be perfectly suited for any kind of application you may try to build on a cryptocurrency blockchain.
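The core of that idea, content addressing, fits in a few lines. A sketch in Python (real IPFS CIDs use multihash/multibase encodings rather than bare SHA-256 hex, so this only illustrates the principle):

```python
import hashlib

def content_id(data: bytes) -> str:
    """Derive the address from the content itself: same bytes, same ID,
    no matter which node happens to serve them."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected_id: str) -> bool:
    """Any node can check that what it received matches what was asked for."""
    return content_id(data) == expected_id

page = b"<html>my decentralized page</html>"
cid = content_id(page)

assert verify(page, cid)
assert not verify(b"tampered content", cid)
```

That self-verifying property is why the pointer can live anywhere, including on a blockchain, while the bytes live somewhere cheaper.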

AWS and Cloudflare outages have taken down a significant portion of the web. No, the web isn't 100% centralized, but what percentage is centralized to a small handful of players?

That doesn't mean the web isn't decentralized, that means people have gravitated towards a single vendor for reasons I won't get into.

But whatever vendor you choose is a centralized vendor.

Unless you're using an IP address for your website, your domain name is registered through a centralized registrar. The DNS architecture/protocol itself is centralized, as it is controlled by ICANN.

> your domain name is registered through a centralized registrar. The DNS architecture/protocol itself is centralized as it is controlled by ICANN.

There are tons of registrars and DNS is even more decentralized — it’s not like .ru has the same operators as .us or .aws.

The key question is how many parties can limit you. The DNS roots are stable and widely cached, so you don’t have bottlenecks based on those. Once you own a record in your hierarchy of choice, you can do whatever you want with it without needing to involve the roots in any way.
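That delegation structure is worth spelling out, since it's what keeps the roots from being a day-to-day bottleneck. A toy model in Python (the server names and records are made up; the real protocol involves NS/A records, caching, and TTLs):

```python
# The root only knows who runs each TLD; each zone owner
# controls everything below its delegation point.
ROOT = {"com": "com-tld-servers", "org": "org-tld-servers"}

ZONES = {
    "com-tld-servers": {"example.com": "owner-nameserver"},
    # The domain owner's own server: change records here freely,
    # without ever involving the root again.
    "owner-nameserver": {"www.example.com": "203.0.113.7"},
}

def resolve(name: str) -> str:
    tld = name.rsplit(".", 1)[-1]
    server = ROOT[tld]                 # one (heavily cached) root lookup
    while True:
        zone = ZONES[server]
        for suffix, answer in zone.items():
            if name == suffix or name.endswith("." + suffix):
                if answer in ZONES:    # a delegation, not a final record
                    server = answer
                    break
                return answer          # the final address record

print(resolve("www.example.com"))
```

Once `owner-nameserver` holds the delegation, everything under example.com is the owner's to change; the root's only power is over the delegation itself.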

What happens, however, is that people voluntarily share infrastructure because it’s cheaper and easier for them to let Google operate their email, AWS their servers, etc. That’s different in a very key way: if you don’t like how it’s going, you can easily leave at any time. The centralization is weak — lose that competitive edge and the exodus begins.

The same thing has already happened in the blockchain world: theoretically you can be your own bank, IdP, etc. but in practice most people prefer to let other people manage those functions for them. Control of the blockchains, oracles, etc. similarly has a relatively small number of parties involved and it would be harder to change them than updating DNS records.

> There are tons of registrars

But those registrars only exist at the discretion of a single centralized authority (ICANN)

It doesn’t matter which registrar you use; ICANN can always take away your domain name and give it to someone else. No laws apply to protect your domain; it is all subject to ICANN and their decision. They have taken away tens of thousands of registered domain names.

Alternatively there is no centralized authority that can take away a decentralized domain name such as ENS.

ICANN controls the root name servers but it’s an international organization with bylaws and governance involving a number of parties around the world. They can’t force a registrar to transfer your domain without following a formal process, and misuse of what power they have would have significant consequences since they depend on cooperation.

ENS is based on Ethereum, which is controlled by a smaller number of people and has already had one case where they rewrote history when it was financially advantageous. There is no reason to think that in a hypothetical future where ENS came to matter to anyone the same pressures wouldn’t apply because the critical flaw for ENS is the same one preventing popular use of blockchains in general: there’s no way to recover from mistakes or abuse. This is not an idle concern: domain theft happens regularly and there’s a process for recovering your domain:


If ENS became popular there’s no chance that the same problem wouldn’t occur, and the value of a domain and risk to the public means that people would not accept the answer that the attacker won for all time. Something would happen - whether that’s the Ethereum developers rewriting history again, a global block or override list, or something else.

As a simple thought experiment, imagine that the Chinese government got the keys for the Taiwanese government in a hack, or someone stole the keys for Amazon. Do you think that the ENS community’s response would be to say there’s nothing to be done and Amazon should rebrand, or that anyone would continue to use ENS if they did?

> ICANN controls the root name servers but it’s an international organization with bylaws and governance...

All true, but doesn’t make ICANN not centralized.

>ENS is based on Ethereum…and has already had one case where they rewrote history when it was financially advantageous.

There is a difference between rewritten and forked. Nothing was rewritten, both chains exist. Similarly there is no centralized authority that prevents an individual or group from forking the blockchain, nor any central authority forcing users to the new fork.

Alternatively, you won’t have much success attempting to “fork” your own .com TLD because of conflict with DNS. Realistically even ENS potentially faces this risk with .eth conflicting with DNS at some point.

But as it relates to ENS on the EVM, if you are suggesting Amazon could in any way force the takeover of Amazon.eth from another owner on the blockchain, it is not possible; not even the ENS DAO has that power. This may be a result of your misunderstanding that Ethereum was previously “rewritten”.

You either have no clue how DNS works or are entirely out of your depth wrt the definition of de-/centralization.

ICANN has an entire dispute resolution setup for businesses to use to take away domain names from users.

Local laws do not apply; there is no independent or impartial court; it is entirely centralized decision-making by ICANN whether they will take away your domain name and give it to someone else. It is exactly like when Twitter recently took away the @metaverse handle from an active user and gave it to Facebook. At least in Twitter's case, when faced with user backlash and public support, they reversed their decision and took @metaverse back from Facebook and returned it to the user. In practice ICANN is even more centralized than Twitter, because ICANN isn’t accountable to its users.

It’s clear you don’t know how ICANN works, and are entirely out of your depth wrt the definition of de/centralized. You likely also have a conflict of interest.

You really have no idea how DNS works then.

ICANN can do that for .com domains, because that's their TLD. They cannot do it for the TLDs they've already sold/aren't responsible for.

You can even purchase your own TLD from ICANN if you've got the money and create whatever domain names you want.

Maybe stop talking about stuff you've no clue about, otherwise you'll forever be a Dunning-Kruger person.

There's nothing stopping you from hosting a functional website from an IP address.

And there is nothing stopping businesses from running their websites from IP addresses, and yet ICANN developed rules that allow them to take away domain names and give them to businesses.

It is not at all unlike Twitter taking away the handle @metaverse from a user and giving it to Facebook, it was only the result of user backlash/public support Twitter reversed its decision.

There’s no connection between what I said and what you’re saying.

What’s your point? My point is that the internet is decentralized but many websites use centralized services as there’s some value in doing so.

The third stage of internet arguments: arguing definitions

architecture != infrastructure != supplier

If AWS went down, some critical infrastructure would definitely be out of commission.

So true. IP addresses should have been like phone numbers. The public should have known what they were and how to use them for two-way communications. I tell non-technical people about this all the time and get blank stares. It didn't occur to me that IPv6 could fix it. I sure hope you're right about that -- what a glorious day it would be if everyone gained control of their computers' networking abilities.

IP addresses shouldn’t be portable, that would be a disaster for routing table sizes. It’s a bit like saying your street address should be portable.

On the contrary, addresses and phone numbers should be like domain names. It sure would be great if I could just update my mailing/physical address once when I move.

When I said "phone number," I wasn't implying any portability. I was actually thinking of a land-line phone number.

The point I was making is that it's assumed you can get a static phone number for your house, but most of the public don't know how to get a static IP address for their house (and it's often not even possible to do, depending on ISP).

Well, PO boxes are a rough equivalent to domains: you've got to explicitly set one up at some cost.

As long as you only ever give out your PO box address (domain) instead of your real address (IP), you are good.

The ‘web’ is still just as decentralized as before. Provided you keep a list of hostnames/ips you want to connect to, and a continent is wiped off the map, it’ll neatly route the traffic through the remaining nodes.

Anyone that actively wants to can fairly easily obtain a stable IPv4 IP, or even IPv6, it’s not that it isn’t available, it’s that nobody uses it.

I’m sure my home connection has a stable IPv6 address, but I honestly wouldn’t be able to notice from the name whether it had changed or not, too many friggin characters.
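For what it's worth, eyeballing is unnecessary: whether the address "changed" usually comes down to whether the /64 prefix changed, which Python's stdlib ipaddress module can check (the addresses below are from the documentation range, not real ones):

```python
import ipaddress

# The long form and the compressed form name the same address.
addr = ipaddress.ip_address("2001:0db8:0000:0000:0000:0000:0000:0001")
print(addr.compressed)  # 2001:db8::1

def same_prefix(a: str, b: str, prefixlen: int = 64) -> bool:
    """True if both addresses fall in the same /prefixlen network."""
    net_a = ipaddress.ip_network(f"{a}/{prefixlen}", strict=False)
    net_b = ipaddress.ip_network(f"{b}/{prefixlen}", strict=False)
    return net_a == net_b

# Same /64, different host halves: the prefix did not change.
print(same_prefix("2001:db8:aaaa:1::1", "2001:db8:aaaa:1::2"))
# Different /64: your ISP handed you a new prefix.
print(same_prefix("2001:db8:aaaa:1::1", "2001:db8:bbbb:1::1"))
```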

That's not what web3 refers to. The new decentralization means:

1) backend running as code on a blockchain spread across the nodes instead of a single provider

2) user accounts are nothing more than a wallet address and completely controlled by the end user

3) transactions and ownership of tokens between addresses are controlled by users without any middleman control, approval, or censorship.

Web3 still runs on all the same internet architecture as the rest of the web. It's still TCP/IP underneath, and browsers with JS providing frontends.

Yes. Watch that become centralized. If it goes mainstream, it will happen.

IPv6 deployment removing the not-giving-consumers-static-IPs argument is a pipe dream. Many ISPs that deploy IPv6 don't guarantee you'll get the same prefix, and in many cases you don't whenever your CPE loses connection, even for just an instant. There is zero economic incentive for them to do that, so why would they?

Yeah, I agree with the social aspect of it, but social aspects can be made to bend and break; a physical limitation cannot.

They didn't do anything to the architecture. The fact that most people spend most of their time on a few websites doesn't centralize the architecture at all, or even the web, it only centralizes attention.

I'm not claiming they did something to the architecture, and it definitely doesn't solve the problem you identified. More that this flaw -- the protocol's address space not keeping pace with userbase growth -- made it very difficult to have the kind of vibrant consumer-to-consumer architectures that could have potentially competed. Nowadays, DDoS and things like that are probably additional barriers. However, I do think that IPv6 at least removes a fundamental limiter, even if that alone isn't enough.

DNS/root servers/ICANN are fundamental architecture of the internet that are centralized.

People either forget, or weren't around, when domain names were thought of and treated much like NFTs are today. There was a small group of early adopters, but for the most part big business considered websites/domain names something akin to digital Monopoly money, just a plaything with no relation to the real world.

In the early days even the law and courts didn’t know how to treat claims over domain names. Now ICANN dictates DNS rules worldwide and streamlines the process for businesses to take domains according to ICANN rules.

There are many examples of new decentralized architecture and protocols being built for web3 and the decentralized web, like:

IPFS: a p2p alternative to HTTP

ENS: a decentralized domain name protocol as an alternative to DNS

DNS fulfills a direct consumer need -- to more easily access a website. I'm hard pressed to say what a claim on a JPEG fulfills. Sounds like it makes the world harder to navigate for consumers, like a tollbooth on a public park.

I didn’t say DNS didn’t fulfill a need, it just does so in a centralized way that is now antiquated by newer technology that removes the need for ICANN to act as the centralized authority.

Also, an NFT is a token standard. Sure, it can be used as a non-fungible record of ownership, but your framing, as if all NFTs are is a “claim on a JPEG”, shows either a lack of understanding or a common Luddite attitude toward the technology. My NFTs work perfectly fine as domain names, far from a JPEG.

It’s interesting too because web3 repurposes existing internet and compute resources to build a more sophisticated network, similar to how phone networks were repurposed back in the early internet for connectivity.

Sorry, but this is a rant.

I am deeply saddened by the current state of affairs on the web today. In JavaScript right now there are so many layers upon layers upon layers of DSLs and mannerisms on top of questionable language features which 'must be used' because the former language features stopped being cool a few days ago. The whole ecosystem is turning into a meme of itself.

In a few months someone will reinvent another part of the big clusterfuck again and publish it in a shiny new website and we'll have another 3 month old Javascript framework being touted as the next earth-shattering thing that will revolutionize all things. And again. And again.

We have nice things. We have nice abstractions already. We should be working on toning down and optimizing the good stuff, the good bits that get stuff done, not ramping up on new shiny buttons for each and every edge case out there just because we need to stay on the loop (whatever this loop is).

The tech industry is completely dependent on throwing away anything that is finally half-stable, so they can convince VCs to shovel out more cash to rebuild everything in a new language, framework, system, protocol, app, etc.

If you look at what we actually do with the web, it's the same thing we were doing 20 years ago. Click on links, look at pictures of cats, chat with your friends on message boards and instant messenger, send e-mail, buy the occasional thing online, read the news, play games. If you changed none of the software, the result would be the same, but we would have saved a couple trillion dollars (or, alternately, not caused people to fork out a couple trillion dollars).

The only functional change today is much more streaming video, and instead of writing blogs, everyone's recording podcasts (which we had back then as "streaming audio" using IceCast).

I don't understand this view.

JS is the most accessible and distributable code someone can write. It is accompanied for free by a portable, OS-independent UI library with application-tier capabilities that requires no user friction to download.

OF COURSE this construct comes with a ton of bullshit and noob reinventions.

But you, as a software engineer, should have the skillset to pick some strong set of libraries and stand behind it. For me, I use React, React-Router-Dom v5, MobX, TypeScript, and roll my own for everything else. It works well, my code from 2017 that runs without updates is easy to port if I want to because it uses the same underlying libraries, and the browser platform is fairly stable overall.

How does it upset you that you can also do some things in a lower friction way capitalizing on new language features? The old ones didn't go anywhere. The old libraries didn't stop working. Nothing fundamentally changed, some idiots just bolted on some new stuff and some geniuses carefully architected some other new stuff. Why is velocity something to rail against in and of itself? It impacts you exactly how much you let it.

The amount of reinvention in the JS space is mind boggling. While this works to drive the whole thing forward it would be nice to see more LTS support rather than yet another templating library.

Not that most other languages are inherently better, but some have a way of mitigating the problem slightly, e.g. Python's extensive standard library.

I am a backend engineer by trade, and I have been trying to learn me a little React because I decided I should learn new tech (not the stuff I use on a daily basis, which is Python) on side projects. So the backend is being done in Rust, and frontend is using React.

I had this moment where I thought "well, I need TypeScript, because I want nicely typed interfaces and contracts, and I need React". Turns out it's not as easy as that. To cobble everything together you need a proper template for webpack. And there's Babel, which also needs to be caressed in the right React-y way. You can't just "hop in". You have to use cookiecutter-styled tools to bootstrap, because the whole thing got so complex that even a Hello World project needs a mind-boggling number of tools, all orchestrated and plugged into each other in just the right way, or it all blows up in your face. Also, there is more than one cookiecutter-styled tool, and you have to choose the right one. "But don't fret, you can always run eject to see the whole machinery inside."

It's saddening and tiring.

> Turns out it's not easy like that. To cobble everything together you need a proper template for webpack. and there's Babel and that also needs to be caressed in the right React-y way. You can't just "hop in".

I would tell the exact same story if I were to decide to use Python as a backend, just with different technologies. Maybe you just tried to do too much too quickly. I can build a Python script, just like you can probably build a JS script; it doesn't mean that I can jump into Django head first like it was nothing. You can't just "hop in", you need to learn each step, and that's fine. The number of times I thought it was as simple as a "pip install"... I lost quite a bit of time in various CTFs while trying to use some Python scripts. You can also stop at any step, and that's fine too. I know that HN is good at "THIS IS THE ONLY RIGHT WAY", and even this thread is filled with people arguing that "web0" is the "right way", but you know what, the right way is the one that works for you, even if it means that some page will take more time to load, or use more data transfer, or miss some interactivity, or have a less optimal UX, or could be done faster another way, or any other downside.

You _can_ just hop in with Flask. That's kind of the main selling point.

You didn't need TypeScript. That was your first mistake. If you had chosen Create-react-app (the official boilerplate) - that's all you need. Hello world right out of the box. Nothing blows up in your face. Nothing mind-boggling. I've never had to ever run eject. Nothing saddening, nothing tiring. In fact- just the opposite. Happiness, joy, energizing React productivity.

You can literally pass a flag to that command to get all the same stuff in Typescript. Don’t yuck someone else’s yum.

Create-React-App has a TS option (and, IIRC, it's not that hard to use TS on a CRA app that wasn't created with that option, but it's been a while since I did that.)

Manually setting up React (with or without TS) isn't all that hard, either.

Same experience here, I wouldn't be able to get off the ground without those templates. Plus updating can be a nightmare, if you regularly update and the build breaks then it's easy to find out what caused it, however a few months down the line be prepared to spend significant time figuring out what broke again and how to fix it.

> You can't just "hop in". You have to use cookiecutter-styled tools to bootstrap because the whole thing just got so complex so that even a Hello World styled project needs a mindboggling amount of tools all orchestrated and plugged into each other in the right way or it all blows up in your face.

Manually setting up React isn't all that hard, but it's extra friction you don't need when learning to code React (and probably 999 out of 1000 times using it in anger, either.)

So, yeah, using CRA or another template is generally a good idea.

> Also, there are more than one cookiecutter-styled tools and you have to choose the right one

No, you really don't.

Why do you need types to learn React? You really need a compiler to tell you when you have made a mistake that bad?

Your first mistake is trying to add typing to a dynamic language that works just fine without types. You won’t really learn the underlying language that well that way, and you therefore won’t have a good understanding of it.

TypeScript makes sense in large projects, with many developers where you need to catch a class of problems that compile time type checking can catch. It doesn’t offer a lot or add much in the way of actually learning the fundamentals of frontend development and is a distraction when viewed through that lens.

Are you sure you weren’t actually avoiding the scary unknowns of actual learning and doing a bunch of busy work to bring familiarity into an otherwise unfamiliar space? I catch myself doing that a lot, especially if I need to learn a completely new language to learn something. I wasted a lot of time trying to make OpenCV work with native JavaScript when I should have just learned enough C++ to get by.

I used to think like this, but TypeScript has saved me massive amounts of time. If you ever refactor your code, the compiler helps you trace all the dependencies, so you don't end up with subtle bugs or broken code. In some ways, TypeScript deepens your understanding of JavaScript: for example, it forces you to think explicitly about the difference between undefined and null, helping you avoid a whole class of JavaScript pitfalls. You can also look at a function signature and know exactly what inputs and outputs it requires, rather than having to memorize or search docs. Even for smaller projects, I start with TypeScript now.
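The undefined/null distinction is easy to show in a few lines. A minimal sketch with strict null checks enabled (the User shape and displayName helper here are made up for illustration):

```typescript
// With strictNullChecks, the compiler distinguishes a value that is
// explicitly empty (null) from one that may simply be absent (undefined).
interface User {
  name: string;
  nickname: string | null; // known to have no nickname
  middleName?: string;     // may not be provided at all
}

// The signature alone documents the inputs and output: no doc-searching needed.
function displayName(user: User): string {
  if (user.nickname !== null) {
    return user.nickname; // narrowed to string here
  }
  // middleName might be undefined, so it must be handled before use
  return user.middleName !== undefined
    ? `${user.name} ${user.middleName}`
    : user.name;
}
```

Skip either check and the compiler refuses to build, which is exactly the class of runtime pitfall it catches for you.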

While driving it forward, it also fragments it: look up a tutorial for any given thing you might want to learn to do and it's done six different ways. Trying to integrate code written at different times can be an exercise in frustration.

JavaScript framework churn (on the frontend) ended 6 years ago. The only thing keeping this meme going is commenters on HN.

What framework did you end with? What state management approach? What styling strategy? I think the churn stops whenever you stop adopting new things to replace functionally equivalent things, but new things still come along and get hyped.

Library: React
State management (if needed): Redux via Redux Toolkit; maybe Redux Toolkit Query too if necessary
Styling strategy: SCSS or Styled Components (if I have to)
Framework: Remix [0]

This is honestly a very easy set of questions to answer today.

[0] remix.run

Software engineers that replace things with functionally equivalent things don't get a lot of actual work done and typically aren't employed for very long either. So, the churn had to stop long ago from inside the average high performing company (and professional engineer's mindset). Only when things are really bad (as they were 6 years ago) is it worth rewriting everything. We've done that and it's not happening again. You can follow my work here: react.school

So because there's no churn and everything is standardized I assume you can tell me which state management library is the correct choice for a React app without causing any controversy?

Yes, that's easy.

React-Query[0]: You don't need to manage state, let your server manage state just like you do in typical non-js heavy apps.

[0]: https://react-query.tanstack.com
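For readers who haven't used it: react-query's core trick is caching server responses by key so your components never own the state themselves. A rough TypeScript sketch of that caching idea, not the library's actual implementation (QueryCache and its staleness rule are invented here purely for illustration):

```typescript
// Server data is cached by key and only refetched once it goes stale.
// Components just ask the cache; the server remains the source of truth.
type Fetcher<T> = () => Promise<T>;

interface Entry<T> {
  data: T;
  fetchedAt: number;
}

class QueryCache {
  private entries = new Map<string, Entry<unknown>>();

  constructor(private staleTimeMs: number) {}

  async query<T>(key: string, fetcher: Fetcher<T>): Promise<T> {
    const cached = this.entries.get(key);
    if (cached && Date.now() - cached.fetchedAt < this.staleTimeMs) {
      return cached.data as T; // fresh enough: serve from cache
    }
    const data = await fetcher(); // stale or missing: hit the server
    this.entries.set(key, { data, fetchedAt: Date.now() });
    return data;
  }
}
```

In the real library a component calls useQuery with a key and fetcher, and re-renders when the data arrives; this is only the state-management idea with React stripped away.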

Sure, book a free call with me here: react.school/call

Why not share it in a single 10-word comment here, for the benefit of all, if it's so uncontroversial.

React, Redux, styled-components.

As someone who's always been somewhat mediocre at JS and HTML/CSS programming but finds himself needing to build something new every few years, I'm constantly amazed at how much easier it's become to build things that I would have wanted to build before, but didn't have the experience or time for.

I recently built a dashboard/reporting site for Neo4j that's completely config driven (delivered from the backend), which can refetch the config (with a gmail style "Settings have changed - Refresh" notification) and have only changed parts of the site update. It has tables with resizeable columns, filtering, search, ordering, virtualization, server side fetch, etc. It has progress charts, bar charts, counts, markdown rendering, json schema rendering. It has inputs that autocomplete, with single or multi-select. It's fully responsive, and works perfectly on desktops, tablets and phones.

If I had tried to build this three years ago, I wouldn't have been able to do a fraction of these things. So, rant away, but the frontend community is amazing and deserves way more credit than it gets.

Just don't switch to those new shiny useless/bad frameworks?

Or do you mean that you have to suffer like when you visit websites, like cnn.com: https://i.imgur.com/VAZMUx8.png

Don't mistake being effusive with ranting.

  Web 1.0: HTML
  Web 2.0: HTML + Javascript
  Web 3.0: HTML + Javascript + Ponzi.
source: https://twitter.com/jeremiahg/status/1473768877377130505

There are real things happening in web 3.0, they are just being obscured by layers and layers of bullshit, scams and other fuckery. Once all that has crashed and everyone has lost all their money there will still be real and interesting tech remaining.

Be very sceptical, but don't ignore the whole thing or you might end up missing the next real wave of innovation in tech (which is what I thought HN was about).

I think if you want to dispel the skepticism describing those real things would go a long way.

I'm working on a decentralized storage solution for personal data. To be fair, it's not actually being used anywhere meaningful but I think it has real value (data ownership / privacy, encryption, protection from client code) and we have a proof-of-concept website that you could test it out on.

I've yet to see those "real things" described and would be interested to see it summed up so I can.

Identity that solves Zooko's Triangle, Censorship Resistant Storage, P2P/Decentralization (separate from blockchain), alternative funding models for all kinds of SaaS and Open Source projects.

Look it up yourself, or don't.

Ah, yes, I remember this from web0:

    <meta name="twitter:card" content="summary_large_image">
    <meta name="twitter:title" content="My first web0 website!">
    <meta name="twitter:description" content="">
    <meta name="twitter:image" content="https://elliott.computer/pages/web0/social.jpg">
Trying to make web3 go away is just as pointless as trying to make web2 go away. It's here, it's going to stick around. Yes, there's a lot of hype, it probably won't look the same in 10 years, and yes there's a lot of charlatans, but I think ignoring its importance is myopic.

> Trying to make web3 go away is just as pointless as trying to make web2 go away. It's here, it's going to stick around.

Web3 is not here lol, it’s literally nowhere. It’s a few Ponzi schemes grafted on top of some MySQL stores like OpenSea and Coinbase. This does not a revolution make.

You’re forgetting that it also has hordes of naysayers clamoring to be the loudest to say “it will never work”. So, at least in that limited sense, it very much feels revolutionary.

Sure but you’ll have folks doing that whether an idea is brilliant or utterly idiotic. That in and of itself doesn’t really mean anything other than the idea is polarizing.

> Web3, also known as Web 3.0, is an idea for a new iteration of the World Wide Web

Cool, cool...

> that incorporates decentralization based on blockchains

k bye

> Web 3.0

In my mind, Web 3.0 was synonymous with the Semantic Web.

So true. It's funny that the Web 3 folks just coopted that phrase from the Semantic Web. And the Semantic Web was/is an actual useful building block on top of the current web whereas blockchain based Web 3 is a side step using entirely different protocols and paradigms.

Some of the arguments here fill me with good hope that the semantic web is going to be the next revolution.

After all, it very much feels like a revolution with the amount of people saying it'll never work.

Right, we're somewhere up around Angular or React version numbers of web.X at this point

Same here. After web 2.0, next one was supposed to be about the semantic web.

Yeah, the Web is supposed to be decentralized from Web 1.0, right? Or is this wrong?

It's supposed to be, and it is. Every domain points to a different node in the decentralized web. Just because some nodes are massively popular doesn't mean the technology is not decentralized. Popular nodes will always exist in every decentralized vision of the internet.

Yeah, so... why on Earth do people keep saying that it is centralized? I understand that for some there is nothing beyond Facebook, but that has to do with Facebook apps on the smartphones, restrictions or limitations at the smartphone-level, ISP-level, and so forth. Regardless, it is decentralized, so what is this decentralization based on blockchain bullshit? Do we need it? What problem does it solve? I hope that the problem it solves (if any) has not been created beforehand.

> Yeah, so... why on Earth do people keep saying that it is centralized?

Cryptocurrency culture fosters dishonesty: billions of dollars have been poured in by people who think it’s an investment they’ll get a large return from. The only way they profit is from other people putting real money in, which means everyone involved has to be a marketer.

That’s why crappy generative artwork will get talked up ad nauseam, people will wax rhapsodic about logging in using a third-party service, etc.: the people saying that know they will lose money if the rest of us don’t agree that their random hashes are worth more than they put in.

I would argue it's demonstrably wrong to say that the web isn't decentralized, however the thing about web3 is that it means a dozen different things to different people. For some it's all about a vision of the web where everything is gated behind cryptocurrency payments, for some it's a world where websites, apps, and games rely on blockchain data instead of self-hosted DBs, for others it's a world where most software vendors have acquiesced to a business model where all digital products are minted as NFTs. There are many other visions, but in my view, all these ideas have glaring problems that make them fundamentally unworkable.

> Trying to make web3 go away is just as pointless as trying to make web2 go away. It's here, it's going to stick around. Yes, there's a lot of hype, it probably won't look the same in 10 years, and yes there's a lot of charlatans, but I think ignoring its importance is myopic.

Web 1 and 2 were almost immediately popular because they had real things you could do better than the alternatives. The blockchain financial systems being marketed as “web3” aren’t here in any meaningful sense: if you aren’t interested in cryptocurrency speculation, there’s very little reason to care about it (this is staggeringly unlike the earlier web), and a micro-transaction system for rich people with more centralized infrastructure is a big reversal — especially since exactly none of it works in any way without using the real web.

"web3" is not here, and it's not going to stick around. piling it up with web2 is beyond absurd.

A year from now there will be a new buzzword to try to take money away from the get rich quick crowd.

Let's hear what the father of the web had to say about web 3.0.

> People keep asking what Web 3.0 is. I think maybe when you've got an overlay of scalable vector graphics – everything rippling and folding and looking misty – on Web 2.0 and access to a semantic Web integrated across a huge space of data, you'll have access to an unbelievable data resource… — Tim Berners-Lee, 2006

That's funny, he didn't mention blockchains once.

People bring up this semantic web thing in Web 3.0 discussions all the time. I still don't know what he was talking about. I don't think it happened, though?

Semantic Web 3.0 and blockchain web3 stuff both suffer from at least one shared problem* -- the internet is a living, evolving system. Web 2.0 was labeled in retrospect to describe a general but clearly visible change in the way websites were built. It's much easier to predict something that has already happened and which everyone can see.

The only prediction I think that we can make about the internet is that it will defy any predictions that we make about it.

*Not to say this is the only problem either had -- for example, web3 also suffers from a wide range of credibility issues, IMO.

> It's here

Can you post a link to a place where I can interact with some web3 content directly then?

Not an article about what web3 will be, but something I can play with? Back in the early web 2.0 this is pretty easy, just point to twitter, myspace, stumble upon etc.

I'm genuinely curious what web3 really even looks like since all the pages that talk about it mostly sound like marketing pitches without a single real thing I can interact with.

OpenSea is probably the biggest thing, it's the largest NFT marketplace and it uses web3 extensively.

Starting to figure out Clarity.so, which is a project management tool kind of like Notion but with some web3 features. Haven't done too much with it yet, but an organization I'm working with is using it.

Metamask is a browser wallet that also serves as your login and identity for these sites.

If you want to learn some code and concepts in smart contracts in a really friendly way (while providing a lot of good information), you can go through the tutorials at https://cryptozombies.io/, which works like FreeCodeCamp.

Smart contract code has some similarities, but there are also a few things you need to do differently because of the nature of the blockchain. Random number generation is one: your code is public, so people could potentially game it unless you do it a certain way. Function visibility is also very important, because it might allow other apps to call your contract, and certain visibility types might cost gas (i.e. tokens) when you don't need them to. A smart contract is immutable once published, too, so you need to be very careful to get it right, or you will have no choice but to point people to a different contract (and even then the old contract could still do some damage). And string comparison is done by hashing each of the strings and comparing the hashes to see if they're equal. Just a few examples of many.
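On the string-comparison point: Solidity has no built-in string equality, so contracts typically compare keccak256 hashes of the two strings. A TypeScript analogue of that hash-and-compare pattern, with Node's sha256 standing in for keccak256 purely for illustration (the helper names are made up):

```typescript
import { createHash } from "crypto";

// In Solidity you'd write:
//   keccak256(abi.encodePacked(a)) == keccak256(abi.encodePacked(b))
// The shape of the pattern is the same: hash both strings, compare digests.
function hashOf(s: string): string {
  return createHash("sha256").update(s).digest("hex");
}

function stringsEqual(a: string, b: string): boolean {
  return hashOf(a) === hashOf(b);
}
```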

So it actually feels like I'm learning something different, and not just "okay, what do I need to call/integrate to add authentication on this new platform" for the tenth time, or "Oh, the coordinates for drawing to the screen start from the top left instead of the bottom left for this tool, good to know."

I mean it's not as different as learning something like Prolog, but still fairly different.

> OpenSea is probably the biggest thing, it's the largest NFT marketplace and it uses web3 extensively.

Is this website hosted or somehow powered by web3? Does it store its data in a blockchain or a traditional MySQL/postgres/etc db? Are its user accounts nothing but wallet addresses and entirely managed and controlled by the wallet owners? Is the computation, access control etc done using smart contracts and resilient to dropping their servers?

These are honest questions I really don't know the answer to, but based on other posts in this thread this would all be required to qualify as a proper web3 example.

Powered for sure, but I don't know if absolutely everything is stored via web3, like their html or whatever. I know the NFTs and metadata are[1]. Purchases, transfers, bids, offers, etc. are handled using smart contracts. User accounts are just wallet addresses, yes. I don't know how it manages access control, but besides what I assume is some sort of admin access it doesn't really need it.

If OpenSea doesn't qualify as web3, I don't know what would.

[1]: https://filecoin.io/blog/posts/opensea-decentralizes-and-per...

There was that website posted a few days back on non-fungible Olive Gardens and you could make unlimited breadstick tokens. If that’s not peak web3, I have no idea what on earth is.

Web3 is not relevant today and there's little indication it ever will be.

thanks for viewing the source ; )

Thanks for taking the inevitable HN nitpicking politely :)

when the folks with the laser eyes won't even look at the existing systems which can be used to extend the web in a standards-compliant way, then we have every right to be dismissive of their position (holding bags).

They have every right to come to the table at the W3C. The federated social web has existed since 2008. It is the real web3.

"The folks with laser eyes" are affectionately known as bitcoin maxis, and they have more interest in web0 than in web3.

the bros with the ape avatars then? you know who I'm talking about.

Also <meta name="viewport" ...> didn't exist.

I don't remember if self-closing tags like <br /> were possible in the HTML 2 days.

I would say Web 0 was Gopher and Usenet. This page is definitely Web 1.

But with that said, I do find it interesting how many Web 3 things are just more complicated ways to solve problems already solved in the 80s and 90s.

I remember when I first heard "Web 2.0", at a conference by Tim O'Reilly. I wasn't a fan, but it became a convenient way to describe a new web driven by user-generated and dynamic content, with the help of this new thing called Ajax. And mirror effects, lots and lots of mirror effects on graphics, and 12-pointed stars.

Web 3 is not an evolution of the web but rather a side show. A side show that will definitely have some useful things come of it but it lacks the spirit of Web 2.

>I would say Web 0 was Gopher and Usenet.

Let's not muddy the waters even more. "Web" refers to the Worldwide Web created by Tim Berners-Lee. Gopher and Usenet don't count.

By that logic then (which may be entirely fair and I don't necessarily disagree!) Web 3 is not the web either since blockchain is not a web protocol but rather a different type of P2P protocol also built on TCP/IP.

But also, to my sort-of point. Web 1 was (among other things like hypertext and some dynamic content via PHP/ASP/Perl/C via CGI-bin/etc) static plain HTML pages. This linked page is most certainly not "Web 0".

>By that logic then (which may be entirely fair and I don't necessarily disagree!) Web 3 is not the web either since blockchain is not a web protocol but rather a different type of P2P protocol also built on TCP/IP.

Yes. "Web 3" despite the name, is not the web, it's an application on the internet. The web is also an application on the internet, separate from Web3 (and Gopher and FTP and everything else.)

>But also, to my sort-of point. Web 1 was static plain HTML pages. This linked page is most certainly not "Web 0".

I guess we'll have to agree to disagree on this, since it's just a semantic argument. To me, Web 0 was static plain HTML. Web 1 was possibly the introduction of CSS and JS. Web 2 was AJAX and dynamic websites.

And my hot take on Web 3, not that it matters, is that it's properly multilingual computation on the web. Specifically being able to run applications compiled to Webassembly, and containers for legacy native applications.

As someone who actually lived through the Web to Web 2.0 transition... no.

Web 2.0 was not coined by historians to describe something after the fact. It was a term that was being used by the people who were inventing it. Much like Web 3 in that regard.

CSS and JS both weren't widely supported cross-browser until well into the initial web rollout. Before Ajax we had "Dynamic HTML" and also some pretty innovative things with "long polling."

And contrary to popular belief, "Web 1" was not as dark ages as people think. Forms were part of the HTML spec literally a decade before Web 2. We had plenty of dynamic content. It just required a post to a server instead of being able to be refreshed dynamically.

PHP was part of the Web 1 legacy; it launched in 1995 and provided dynamic content. The first eCommerce sites predate "Web 2.0" by over a decade. I should know, I launched one of them. And before that you could make CGI programs in C. And before SQL databases became widely adopted for web applications, we had these things called "flat files" we could write to.

Even the very early web specs had POST as a method/verb. It is literally in the HTTP 1.0 RFC from 1996.

What differentiated Web 2 specifically was user generated content and dynamic refreshing of the content. And I was actually there, when the phrase was coined. It's not some distant memory for me or something I read about, I was one of the Web 2 early pioneers.

Unless you were on ARPANET you did not use "Web 0"... "Web 0" is not a thing I have ever seen used in my entire 28 year career until this Hacker News post. You can't just rewrite history... there are people alive today who actually lived it.

You can disagree that Usenet is a predecessor to the WWW but you can't redefine Web 1 as Web 0. To anyone who lived it, that is absurd (and slightly disrespectful to our legacy).

You've suddenly become weirdly hostile and defensive. No shit, you were there? Yeah, so was I. So were a lot of people here. You're not the wizened greybeard telling the wet behind the ears noob how it was back in the day, so kindly remove that chip from your shoulder. This will be my last reply to you in this thread.

I'm not "rewriting history," and I never claimed "Web 0" was a term anyone actually used at any point in time. I was simply disagreeing with your definition of Web 0 upthread, which was itself disagreeing with OP's definition. I've never seen "web 0" used outside of this specific post either. But, like yourself, I was simply stating what I thought a more accurate taxonomy using that term would be. It was speculation. A thought-exercise.

Besides, as one of the great old-timers of the web you should prefer zero-indexing anyway.

Good day.

Which is it? Am I hostile (aggressive) or defensive? ;)

It doesn't matter, in either case. I apologize. My frustration is not directed at you but rather at the poster of this HN post, who is claiming plain HTML is "Web 0". It sounded at first (and second) read like you were defending their definition of Web 0. I misread.

When you said "We agree to disagree" I thought you were taking the side of the poster of the "article". I agree Usenet is not "Web 0"; "Web 0" isn't a real thing. I was just offering a counterpoint that what came before "Web 1" was not "plain HTML."

Anyway, my apologies for escalating. It was not my intention. I was just venting my frustration over the past year at people redefining terms with no regard for history.

Anyway, have a pleasant day.

It's both pleasing and a little disappointing to see this being "marketed" under the name "web0" --- especially on a "nonstandard" TLD. All websites used to be simple handwritten hyperlinked pages. I guess what's old is new again?

Of course, Big Tech would want you to use its browser (not browser_s_) and indulge in the insane complexity of "the web stack" which guarantees its monopoly, but it doesn't have to be that way for everyone. Maybe once more people realise the latter, the web can become mostly-browser-neutral again.

> ... especially on a "nonstandard" TLD.

I guess that's a dig but I'm not sure why. It's resolvable via the ICANN root and is no more "nonstandard" than any of its siblings.

I've found that I tend to ignore any sites which aren't on the traditional net/com/org or the well-known ccTLDs when they show up in search results, perhaps because the majority of them seem to be used for hosting vapid SEO spam. It's an almost subconscious aversion trained by years of browsing experience: when assessing the trustworthiness of a site, a "weird TLD" is a negative weight.

I can understand how you might arrive at such an intuition but I'm not sure how well it serves you, particularly when you apply it outside of the search results that have formed it. There's over 150 million names under .com, people are going to go elsewhere.

CanIuse should have a page that lists all of the browser features that are supported by all browsers and in the same way. Of course it's not as simple as that, but in short, what has a green square across the current versions of all browsers? Green-green, not half-stepping green.

What would that presentation be useful for? They certainly have the data, but I have trouble thinking of a situation where I would want that kind of list.

Then it wouldn't be for you. For me, it could indicate the amount of web page knowledge necessary to minimize maintenance overhead.

I feel like just a basic stylesheet improves things quite a lot (this isn't using CSS at all). But otherwise I agree that a lot more website should be like this.

The site uses an inline CSS statement to set the font size.

I like the attempt, though. I could go off on an old-man tangent about "kids these days don't even know what a fieldset is!" or whatever, but I very much appreciate the call to do things more simply.

an aside, but .computer has been a valid top-level domain since 2013-2014:


these are fully standard.

We should invest even heavier into web∅, which is when you don’t even have a website. Can’t get lighter than that.

Maybe all web∅ pages are the empty string, so they're all equal to one another?

Empty string or null string?

I've never quite understood the gripe people have with modern web development. React took off in ~2015 and the only thing that has really changed about it is how you do state management (flux -> redux -> hooks).

Plenty of frontends still rely on jQuery which can easily be sprinkled into a frontend, and the API hasn't really changed in nearly 20 years.

What is the author advocating that we adopt this html-only approach for? Just personal blogs? Is it really a problem on the internet that personal blogs have way too much Javascript and CSS and therefore we need to throw out javascript?

I would love to see someone try to build something even remotely complex or valuable without using Javascript and CSS. I would expect that if Banking portals, EHR's, CRM's (name any other service even moderately complex) were stitched together with a series of html documents and forms, customers wouldn't be as excited about that.

> I would love to see someone try to build something even remotely complex

I think they are advocating we stop trying to constantly make things that are complex, or can one day handle potential complexity, and try to make things more simple from the get-go.

Sure, not everything can be simple html, but anecdotally, I can see the value in advocating more simplicity (though not at a raw html level). So much of what I need to work on and maintain is over-architected "just in case".

I think that's fair, but it also becomes a truism, since no one sets out to over-architect things. My biggest annoyance with this series of posts that show up every few days on HN is just how disconnected they are from the actual challenges of software that people use on a daily basis.

The number of support tickets that I've seen in the past from users who:

Weren't able to see a button dead center on the page without adding a bit of animation

Didn't realize there was a form error above without an auto-scroll.

Help text that needed to be contextual so as to not overwhelm the user.

Dropdowns that are too unwieldy to navigate without a typeahead.

etc etc

All these require adding more javascript, more css, and more complexity to solve, but it would be ludicrous to say that 50% of users should just not use the service because we need to keep the code clean.

> I would love to see someone try to build something even remotely complex or valuable

Why though? Just make it simple and functional. No need to get complex.

> I would expect that if Banking portals, EHR's, CRM's (name any other service even moderately complex) were stitched together with a series of html documents and forms, customers wouldn't be as excited about that.

Why should customer excitement force us to create over-complicated solutions?

Because most software exists to provide solutions to customers, not to satisfy the developers.

If you're going through a 40-page mortgage application, and an accidental page refresh loses everything you've filled in, that's not ideal.

If you're trying to catch an Uber, and the status isn't updated in real time, that's not ideal.

If you're a customer support agent, and incoming messages require a page refresh to see, that's not ideal.

That is a great answer: because we're trying to solve users' problems.

So we need to make things complex enough to solve a user's problem, and no more. I think the difficulty we run into is that last bit.

I wrote my first grad student website in raw HTML. In Vi, on a Unix mainframe, over SSH.

It was fun, but I don't blame developers for trying to make things more composable and reusable. I'm big on tools that let you build plain static HTML pages, but rendered from templated components.

>In Vi, on a Unix mainframe, over SSH.

I did mine APoAC in a blizzard. Lots of dropped packets.

> ...a Unix mainframe...


The university had (has?) some kind of central Unix system for students to mess around with, share files, and host their webpages. This was around 2014.

It's amazing to me how many "web developers" don't actually seem to know how to write pure html/css anymore by hand. They rely on these gigantic frameworks to do literally everything.

Like a lot of Gen X/Gen Y oldsters, I wrote my first web page in Notepad in 1995 or so and uploaded it to my ISP hosting, after undergoing a "bootcamp" that consisted of reading W3C specs, Netscape.com tutorials, going on Usenet, and viewing source on a bunch of other people's pages, sitting all night at my grandfather's Pentium desktop (because we dialed up to get online) and listening to Nine Inch Nails on headphones.

Someone in here points out that complexity produces friction that will drive consumers away. Good. Not all technology has to be a product you can pimp to a VC in an elevator. I love the idea of Web0, of building a new web that requires a bare minimum of engagement on the part of users rather than simply making everything as easy as possible.

There's more than enough brightly lit malls in the metaverse, where everything is just a funnel to catch your attention or your money. But there's also infinite space for simplicity and the digital equivalent of "roughing it", of applying the concepts of appropriate technology to online interaction. Viva la Web0. :-)

Funnily enough I recently did the same thing for my personal website[1][2]. Previously I had used create-react-app to build what was essentially some basic static HTML with styling and I took a step back and realized how insane it was that I had an entire build + deploy pipeline to show some basic information.

The website is currently 16.59KB transferred and it has the following:

- dark mode support

- social media display support

- basic, anonymous analytics using simpleanalytics

It's hosted for free using GitHub pages and a push to master immediately makes the changes available on the internet. Additionally GitHub manages all of the SSL certs so I don't even have to think about it. Compared to the SPAs I (and most WebDevs) write for work nowadays it was actually a breath of fresh air to take away so much!

[1] website: https://trentprynn.com/ [2] source: https://github.com/trentprynn/trentprynn.com

Totally cool to see this at the top of HN!

For most things, it's the way it should be.

Less really IS more

I deeply miss websites that -reliably- loaded faster than you could blink. The megatons of cruft that come with even the simplest page, the dozens of remote code callouts that show up in NoScript (and the mere fact that it's been prudent for years to run NoScript...), the slogging wait for pageloads... it's all just a vast disappointment. And these awful load times and performance are on CAD-level laptops.

I left the industry 15+ years ago, seeing the writing on the wall and hoping I was wrong. Writing code had become like walking in quicksand: unable to trust the stacks of garbage underneath to behave as documented, and abstractions becoming increasingly purposeless. I'd hoped it'd get better, but as far as I can tell it was a good move, when something like this is merely quaint.

The hardware industry has produced astonishing improvements every decade, and the software industry just continues to squander those improvements at an even faster rate, producing an ever-degrading experience.

Really sad.

It's not web0 without <blink> and <marquee>!

just to be a bit picky - it technically has some CSS albeit in a style tag:

<center style="font-size: 18px">...

is it now web1?

The viewport meta tag is also a post-web0, almost-web2 forced addition. I say forced in the sense that it's required for proper rendering on mobile, even for CSS-free sites.
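For anyone who hasn't seen it, the tag under discussion is a single standard line in the document's head (nothing site-specific here):

```html
<!-- tells mobile browsers to lay the page out at the device's width
     instead of a zoomed-out virtual viewport (typically ~980px wide) -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without it, phones render the page as if on a desktop-width screen and shrink it to fit, which is why text looks tiny on mobile.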

>proper rendering in mobile

this is the thing everyone adds because someone said it was the right way.

it is the "assumption" that everyone does not want their user to _zoom out_ on mobile sites.

it is wrong, and nobody questions it

okay i'll remove if it will make you happy : )

If you want your bigger font back, change the DOCTYPE to HTML4 and use a good ol' <font> tag instead!

    <font size="+2">foobar</font>

good call. thank you

I think this website is a sales pitch for their patreon, which itself doesn't seem to be doing anything meaningful to advocate for web0. A funding goal is to "publish a guide on tapping the raw power of HTML". I mean, just google how to write HTML.

I do think we should be conservative about complexity in websites, but advocating for removing interactivity and styling (js/css) seems too idealistic and simply contrarian. So I definitely think this is a good conversation starter, but not a serious goal.

I guess this is the webdev equivalent of meta commentary in the media where somebody doesn't actually talk about anything other than what their reaction to it means.

Most blog posts are the same

They're just super-comments nobody can reply directly to

Bare HTML may as well be the internet equivalent of a .txt file.

Is it good enough for some people? Absolutely. Does it even remotely meet the preferences of the average web user? No.

This must be the 100th iteration of the "motherf'ing website" manifesto I've seen. They all use default browser styling and system fonts. It's easy to marvel at how fast the site loads when the page doesn't even include so much as a JPG.

The thing being called web3 doesn't seem mutually exclusive to what is being called web0 in this post. Might be a sign that web3 isn't a great name for sites integrated with blockchain content. I know, I know, you can't unsqueeze the toothpaste tube.

To the author: you would have fun checking out IPFS. It's a decentralized way to host your site. It's still in its infancy, and slow...but your site will be plenty fast on IPFS. Check out fleek.co for an easy way to start hosting on IPFS.

I don't understand, how is this different to me just serving a plain HTML page with the latest Nginx Docker image and claiming we're back to Web 0?

Is there something special about this page specifically e.g. not being hosted on a normal web server?

yes, did you feel the energy?

Web3 is a trojan horse. It's about exchanging money for cryptocurrencies to pay those hosting the Web3.

I bet if we allow this to happen, most of Web3's hosting will belong to a few, and they will finally have made us pay for using the web.

I don't think there's anything wrong with more complex tooling, as long as the deliverable is simple.... I use a static site generator (11ty in my case, but there's hundreds just like it) and, even though my node_modules is 81M, it produces clean html free of unnecessary js or css. Just like a modern digital artist might use photoshop/illustrator to create a static jpeg.

Of course, you also have things like gatsby/gridsome which are supposedly also "static site generators" but also ship ridiculous amounts of js to do ridiculous things like "hydrate" the client-side to a full spa.

I wish browsers would render text/markdown. It's easy to write, theme, and host. I know it's not the best text format out there but it's reasonable to write and read for most cases.

It's not hard to generate a static collection of HTML from Markdown documents.

Pandoc is probably my favorite tool for that job:


I'm making an addon for Firefox (not sure if it will work on chrome) that renders markdown and gemtext right now actually. It's not on AMO yet, but you can get it from https://github.com/easrng/txtpage

I recently also wrote my own page in bare html, in nano, on my shell account at the server, which I get as a member service from my hobby club.

It's delightfully old school, and writing bare ugly html works okay for me; I can actually concentrate on the content instead of learning how to use and maintain a CMS.

It is my main homepage and very portable if I ever need to change hosts, for example. It's kinda like 2003 again for me, when I had a 100MHz AMD K5 box with 64MB RAM and a 1GB IDE HDD serving my homepage over my parents' ADSL connection.

Too bad bare html looks a bit bad on mobile.

Just slap a viewport tag in the head of that bad boy and call it a day :)

I remember when web 2.0 became a thing it was more of a generic description of web design style and the advent of tech like ajax. Web3 so far seems like astroturfed nonsense.

Inspected the page. What does this stuff do?

><script type="text/javascript">window["_gaUserPrefs"] = { ioo : function() { return true; } }</script>

Are you running the "gaoptout" extension? ( this thing: https://chrome.google.com/webstore/detail/google-analytics-o...)

That snippet is what it injects to opt you out of Google Analytics.

I do not see that snippet in the page source.

Oh, I'm running Privacy Badger, so perhaps that injected it.

It appears to be opt-out for Google Analytics.

Source: https://github.com/ampproject/amphtml/issues/21163#issuecomm...

I don’t see that, do you have some extension that might add that?

I think it is Privacy Badger, actually.

A happy medium can still be had by including all the typesetting niceties in the same file, or maybe one extra included file. Still amazingly fast.

Hey, I guess that's what the 3 w's in front of any website are for: web 3.0! Web 2.0 is two w's, web 1.0 is one, and web0 has none?

I'm studying up on web3 but I also kinda miss web0, or at least a simple static site with some basic CSS and no javascript. More sites should go back to that.

Whenever I get around to making a personal site again, it will probably be compiled to a static site, even if I end up using a static site generating tool to allow for some templating.

Strange. I remember people using the shift key back in the web0 days to capitalize the first letter of a sentence.

Get off my lawn.

I would offer that the design of this page is elegant, but not "beautiful", and the difference is not necessarily subtle.

A poem can be elegant and beautiful.

A blog post yearning for web0? That's a bit tougher to pull off when you compare it to modern web designs. That feels more like an imposter.

You're wasting characters, /> is only a requirement of XHTML. You can just use <br> and it is valid.

Also, are HTML comments technically allowed before the DOCTYPE? (this is the kind of tomfoolery that escapes unit testing on browsers)

I was expecting plain text. Don't be shy. Browsers will serve up basically anything.

Plain text?? I was expecting binary.


It can also serve XML with an optional XML stylesheet (XSLT, I believe). Browsers have a lot of support for many different standards.
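For reference, the hookup is a single processing instruction at the top of the XML file; `style.xsl` is a hypothetical stylesheet name, and browsers support XSLT 1.0 for this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- points the browser at a stylesheet to transform/style this document -->
<?xml-stylesheet type="text/xsl" href="style.xsl"?>
<page>
  <title>hello</title>
</page>
```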

To all those web0/HTML heads out there. Here's a song for you: https://html.energy/pages/imagine/

Happy New Year!

Taking another look at "web0" is actually what the Jamstack is doing: do static rendering and only load JS where / when necessary.

If everyone started at Web0, it would be a lot easier to understand (and question) how some of the things are so complex in web development today.

What is web0? Looks like I am out of the loop here.

Hand-coding a website using only rudimentary HTML tags (eg, <p> or <br>) and nothing else, resulting in a charmingly-ugly yet functional self-published site. Roughly speaking, like much of the internet was in the 90's.

Get off my lawn.

"It's so cool it just works"?! OP sounds shocked that you don't need 50MB+ of scripts and CSS to serve a webpage.

And yet it has stuff that wouldn't be recognized by early browsers, like https, or the viewport meta tag (useful for mobile)

Wait. Uppercase wasn't used for the personal pronoun "I" or to begin sentences in web0?

Not cool.

whats wrong with css

CSS and styling are immoral. Styling is used to manipulate readers. Content should stand on its own, anything that looks nice should be immediately met with skepticism. What are they trying to hide with their pretty background colors and hover styles?

edit: also, can we talk about unneeded bloat? Those 5kb css files add up, how many tons of coal are burned every year because of all of those CSS files being downloaded over the internet?

FYI, I can't tell whether you're being sarcastic

(Betting 60% yes, 40% no)

I like the way you think here. Less is more. Using less allows the rest to do more.

"When you speak, it is silent. When you are silent, it speaks" - Zen Koan

Is this trolling?

Typically it means adding another file.

sure but they're already using inline css, what's the problem with using a style tag?

they're already including a favicon, might as well swap that for a css file if you're worried about get requests or something

You could copy and paste the CSS in every page using style tags.

Server Side Includes
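For anyone who hasn't run into them: SSI lets the server stitch a shared fragment into each page before it's sent, so the browser still receives one plain HTML document. A minimal sketch, assuming Apache with `Options +Includes` enabled; the file names are hypothetical:

```html
<!-- in every .shtml page; the server replaces this directive
     with the contents of /includes/style.html before serving -->
<!--#include virtual="/includes/style.html" -->
```

No client-side machinery involved, which is why it keeps the web0 spirit while still avoiding copy-pasted boilerplate.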

it requires expert knowledge

People learned how to do basic CSS off of web tutorials decades ago, and few if any of them were experts. It isn't that complicated.

layout, sure, but font-size and background are pretty straightforward

if you can set up a web server i am sure you could figure out enough to qualify for web 1.0 level complexity

if simplicity is the concern why not just host a text file? browsers format them nicely

That's a great point. I didn't get involved with the web until around 1998-99 so I'm not sure, but did the web ever not support plain old .txt?

This isn't a Gopher site.

I made my first professional web0 page 22 years ago, in 1999. It was amazing.

Nobody's going to get rich from shilling existing tech.

Crypto bagholders need new marks.

Back in the day html was <CENTER>UPPERCASE</CENTER>

IPFS is still super slow, but can work well for raw HTML sites like this. For example http://vitalik.eth.link is hosted on IPFS. It's a simple text blog and is pretty fast.

If I had a dollar every time one of these "javascript bad, raw html good" websites/rants gets posted here, I'd have enough money to buy the entirety of YC just so I could ban such posts.

needs more blink tag

An HTTPS URL makes this more like web1.5.

A site with no content and no features. That's cool.

Yes, functionality would require skills, and skills require opinionated tools. Opinions require taking a stance, and that's the last thing anyone here wants to do.

I was trying to be snarky, implying that it's easy to make a site with no tech when that site has no features and no content. I am quite curious what you're trying to say, because I caught the sarcasm but none of your meaning.

Your first sentence goes my way, but then... I don't know.

The prevailing view here is that JavaScript is a nightmare to develop with when in fact, there is actually a bit of a golden age going on right now. It's never been more stable and fun to build apps on the web. Maybe I'll write a post about it.
