
How the Blog Broke the Web - tempodox
https://stackingthebricks.com/how-blogs-broke-the-web/
======
Nadya
Am I allowed to plug my own site? The old web is still around - you just need
to dig through more new web to find it. I run my site through a few Gulp tasks
to minify files and add integrity checks for my CSS - but otherwise it is all
hand-written HTML. Every page begins with copy/pasting another page and
changing out the content by hand.
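The integrity checks mentioned above are presumably subresource-integrity (SRI) hashes: a base64-encoded digest of the file that the browser verifies on load. A minimal sketch of computing one, assuming sha384 (the CSS bytes here are just a stand-in, not Nadya's actual stylesheet):

```python
import base64
import hashlib

def sri_hash(data: bytes, algo: str = "sha384") -> str:
    """Return an SRI value like 'sha384-...' for a file's bytes."""
    digest = hashlib.new(algo, data).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

# Stand-in stylesheet contents; a Gulp task would read the minified file.
css = b"body { background: #c0c0c0; font-family: 'Times New Roman', serif; }"
print(f'<link rel="stylesheet" href="style.css" integrity="{sri_hash(css)}">')
```

The browser refuses to apply the stylesheet if the fetched bytes no longer match the hash, which is what makes the check worthwhile even on a hand-written site.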

I've noticed in the last few years that articles about the "early 90's/00's"
web have increased. A desire for the older, more personal, mostly-text web
with a few cheesy frame-animated .gifs. A web that felt more person-to-person
instead of corporation-to-corporation, or plastic Barbie doll to plastic Ken
doll, full of fake personas.

This is why I support Neocities [0] and host my site with them. You can find
all sorts of interesting sites if you browse [1] for a while. My personal
favorite is browsing all the layers of [2].

[0] [https://neocities.org/](https://neocities.org/)

[1] [https://neocities.org/browse](https://neocities.org/browse)

[2] [Warning: Auto-playing Music]
[https://mebious.neocities.org/Layer/Wierd.html](https://mebious.neocities.org/Layer/Wierd.html)

~~~
bananaboy
Does neocities survive solely from donations? That's pretty impressive if so.

~~~
kyledrake
Yes, and supporter accounts.
[https://neocities.org/donate](https://neocities.org/donate)

Nobody is more impressed than I am. It hasn't stopped being amazing to me that
it works.

~~~
Nadya
Thank you so much for your work on Neocities - and sorry for sending the HN
horde your way!

I personally think the support model is all about scaling properly - or simply
not scaling past a sustainable point. As long as the number of patrons scales
with the number of "freeloaders", it is a sustainable model. The issue is if
you begin to hockey-stick and don't restrict your scaling (because you "want
to be larger" or you "hope people will donate as we hockey-stick upwards").
Once that happens, donation-run sites tend to collapse under their own weight -
which, for recent examples, is what has happened to so many pomf.se clones.
Becoming too popular can be a death sentence.

As someone running a support-driven site, can you offer any insight into that,
or say whether you agree or disagree with the idea? It would be interesting to
hear your thoughts on the matter.

------
_bxg1
I mourn whenever a consumer technology becomes streamlined, homogenized, and
commoditized.

There's a pattern - it happened with smartphones too - where a new technology
appears, and at the beginning people _engage_ with it directly. They _use_ it,
they _do things with_ it. And in that type of relationship there's the
opportunity to have _fun_ with it. But then as the new medium grows - as it
becomes more economically lucrative and scales to an exponentially bigger
audience - it's made to be more efficient instead. Eventually users no longer
_use_, they _consume_.

Think about how weird and wonderful the first iPhone apps were, with their
kitschy skeuomorphism and novel uses of sensors. The accelerometer-based
lightsaber. The sound boards. The odd and interesting domain-specific utility
apps. Even Instagram had personality back in 2010. And then people would
jailbreak their iPhones, so they could add widgets and toolbars and ridiculous
neon color themes. The smartphone started out as a magical anything-object
that people looked at as its own entity, just as web pages started out as
magical worldwide bulletin-boards that people would put together and decorate
for their own sake.

These days the smartphone is boring. It intentionally falls to the background,
serving as little more than a stylish content-pipe feeding your every craving.
It is minimalist and inoffensive and ever-present. Even apps are supposedly on
their way out. The very act of opening an app and _doing_ something is seen as
a piece of friction that keeps people from consuming content effortlessly.
Google Assistant and Siri would rather predict what you want and serve it to
you before you even ask for it.

Play is how the human brain stays alive, and it can't happen without back-and-
forth interaction, tactility, and user agency.

~~~
_bxg1
Interestingly, something of a reverse effect is happening in the game
development space. I think the driver of the above pattern is mainly economic;
as a technology becomes more and more commercialized, it gets less personal
and more streamlined. This is aided by the technology itself improving to
allow it to be more efficient. With games, though, the barrier to entry was
always (since the 90s, at least) so high that _only_ corporations could
participate. So in recent years, the technological strides that have been made
have actually allowed the space to become _less_ commercialized, and therefore
_more_ personal and diversified.

~~~
sarcasmic
Consequently, game development these days is more limited by art assets than
by programming or game systems design. The difficulty and complexity of
producing good-looking, stylistically consistent, original 3D assets has led
to sprite-based games, mock pixel art "8-bit" games, and games where form
follows function.

However, game-engine companies now offer asset stores. To OP's point, it's
not hard to see a future where some successful indie games are made by
stitching together assets and mechanics sourced entirely from others. Would
that cheapen the result, however grand, despite the talent and effort put into
those components? Perhaps.

Technology and popularity move the goalposts: they increase the amount and/or
change the nature of the effort required to attain acclaim.

~~~
earenndil
To a point. I think, however, that as the technology improves, the trend
toward realism will overpower. Consider: as assets get more and more complex,
the uniqueness of an individual asset will actually _decrease_. I can easily
point out the difference between a tree in The Witness and one in Mario 64,
because they're both low-poly games that do their own thing _instead_ of
emulating the real world. If, instead, you ask me to differentiate between a
tree in Skyrim and one in The Last of Us, well, they both just look like
trees. At this stage, what matters is the visual aesthetic you're able to
create with lighting and with combinations of different assets. Whether you
happened to make an individual asset yourself will matter less and less as
time goes on.

~~~
_bxg1
Why would people abandon unique styles just because it gets easier to
implement a realistic one? An artist's goal isn't just to make something that
doesn't look like crap; it's to make something that expresses a certain
feeling. Stylization isn't just a band-aid for a lack of realism.

~~~
earenndil
I'm not saying that they'll abandon it, I'm saying that as publicly available
art trends towards realism, being an 'asset flip' won't be such a bad thing
anymore.

------
jedberg
I feel like this story leaves out a very important factor -- Slashdot.
Slashdot was huge long before Movable Type. It was so huge, in fact, that when
my team built a web server in 1998, the first question from the boss was, "Can
this handle having one of our posts on Slashdot?"

I'd say if anything Slashdot was the one that drove people to make
chronological posts. Because if you wanted to be on Slashdot, you had to post
something new, and the best way people knew if you had something new was if
you had a date on it.

~~~
benatkin
Wikipedia has a page on the history of blogging, and indeed Slashdot is there:
[https://en.wikipedia.org/wiki/History_of_blogging](https://en.wikipedia.org/wiki/History_of_blogging)
That doesn't mean it inspired people to convert their personal websites to
blogs, though. People did see chronological posts, but that doesn't mean the
idea of converting their own websites to that format was put into their heads.
There's a big difference between Slashdot and a personal website.

~~~
ObsoleteNerd
Only one data point, but I made my first blog inspired by Slashdot. I wrote a
CGI script that read flat text-file posts to build the front page; then, when
you clicked a link, another script wrapped the chosen post's text file in a
basic HTML template (e.g. added a header and footer, that's it).

Making that (and clones customised for friends) got me into web development
and led to a 20-year career.

------
claudiulodro
If anything, media companies and content marketing broke the web, not
individuals creating online journals or hobby content.

~~~
AznHisoka
I wish there were more _real_ personal blogs. Heck, sometimes I wish for the
days of Xanga and LiveJournal. Reading one from the past really feels like
you're inside someone's deepest thoughts. People were unafraid to really be
vulnerable when writing those things.

Now we have lame blogs owned by corporations like Techcrunch, or filled with
vacation pics and food porn.

~~~
classichasclass
Given the mob mentality that's settled into modern social media/the
blogosphere, it's actually amazing there are still as many personal bloggers
as there are. One wrong word and ...

~~~
freehunter
As someone who runs a fairly popular "personal" (not corporate, opinion-based)
blog, I can say it is an ongoing fear. We have to try to make sure the things
we say are as uncontroversial as possible. But in a way, this is a good thing.

My blog is about my city, and there are plenty of things we "can" talk about
but probably shouldn't. For example, I don't think the business that's in one
of the cornerstone buildings downtown is the right business for the
neighborhood. But I'd never say it. It'd be needlessly confrontational, my
opinion isn't going to get that business to move, they're paying their rent so
they have every right to stay there, and all it would do is piss off people
who disagree with me.

On the other hand, we generate minor controversies all the time. People who
live outside the city always want more parking, while we're pretty strongly
against setting minimum numbers of parking spaces for new buildings. We
support new higher-density living in the city, while many neighbors are
strongly opposed to it. As such, we're the target of smear campaigns, and any
relevant social media post is often swamped by people who disagree with us.
We've also been factually wrong in the past, and some people do their best to
remind us of it as often as possible.

The trick is to ignore it. Haters gonna hate. Don't be unnecessarily
confrontational, and just roll on without acknowledging your public critics.
They're just trying to piggyback off your success.

It's not really a "one wrong word" situation, it's "one word you really knew
you shouldn't have said, but against all better judgement you did anyway"
followed by "engaging your critics in arguments", then topped off with "caving
and admitting defeat because you knew you were wrong all along".

I can't think of a single collapse of a public figure because of one wrong
word that wasn't extremely egregious and then handled extremely poorly.

~~~
glenra
> _We've also been factually wrong in the past, and some people do their best
> to remind us of it as often as possible_

THAT dynamic is toxic, and we need to somehow evolve a social rule against it.
Something like Godwin's Law, back when that worked.

 _Everybody_ has at some point said something factually incorrect and
_everybody_ has at some point said something that sounds wrong or mean when
taken out of context today.

If forever bringing up "_Remember, this is the person who said (worst thing
ever)!_" is a valid move, it's an attack on identity continuity. The existing
defenses against that attack seem to be:

(a) completely abandoning identity continuity (eg: 4chan)

(b) trying to appear perfectly accurate and inoffensive at all times by
retroactively deleting any posts that might make one look inconsistent or mean
and hoping nobody notices or keeps an archive.

There really ought to be more options than those. For instance, one could
adopt a statute-of-limitations approach: _Anything said more than two years
ago is off-limits as an attack on that person/blog/institution today_.

Once the right rule is documented, those who break it are demonstrating they
are too dumb or ill-informed to engage with what is being said _now_ and thus
have lost the argument.

~~~
currymj
There’s an evolving social norm against this, I think. Increasingly, I’ve
noticed the people who trawl through years-old posts looking for controversy
are seen as deranged.

~~~
foobarchu
That only applies when the target was relatively anonymous to begin with.
It's frowned upon to crawl back in time looking for ammunition against
someone, but if that ammunition is something that went viral and is part of
public consciousness, then it's fair game.

One example is Brendan Eich. Because he chose to donate to a Prop 8 fund and
it got out, his opinion on all subjects is automatically null and void for
many people.

Another case is that of reddit user 'Unidan', who became a pariah overnight
because he was a dick to someone online, so he abandoned the entire identity.
If any of his alternate identities were somehow linked back, they would likely
become useless as well. I'd be willing to bet his professional career suffered
greatly because of that rant, too.

There's also Joy Reid, who has been in hot water lately because people found
old tweets and blog posts that were offensive. In that case, people actively
went back to look for things she said in the past.

None of these would have been possible pre-internet.

------
sarcasmic
Eh, static site generators, CMSes, and bloghosts didn't break the web. Sure,
they made it easier for people to churn out content, but it was authentic
material they cared enough to write about. Whether they posted under their
real name, or under a screenname, they built little fiefdoms of content with
their personal time, and made it available for anyone to read, without an
account, and without any obligation of feedback.

What changed was when people began putting their content into siloes protected
by a login wall, and platforms strongly defined by visible indicators of
popularity, which didn't really happen until Facebook and Twitter.

Even in the Livejournal and Myspace days, a lot of profiles were public but
quasi-pseudonymous, requiring some effort to actually find. It was Facebook
that mandatorily juxtaposed one's real name with one's real words, which
quickly led to predictable outcomes: people being doxxed, harassed, and turned
down for employment. Within a few years, most people set their profiles to
private in an effort to protect themselves from snooping employers, colleges,
exes, and trolls, keeping most of what people write and share walled off
behind a login and a friend-approval gate.

Twitter was billed as "microblogging", where one could publish short snippets
more frequently than on a long-form blog, but its bizarre interface, unclear
direction, and feature competition with Facebook caused it to evolve many of
the same mechanisms and signals of popularity as Facebook. Facebook's status
updates were a direct assault on Twitter, so Twitter eventually morphed
'favorites' into 'likes'. With that, it became blindingly obvious that most
people's content wasn't even being read.

All of this social transition took place in the shadow of the
commercialization of the web, where websites were no longer just billboards
for businesses, but platforms where one could conduct commerce, consume
professional content, and be subject to behavioral analytics that fed back
into ads. With the abundance of commercial content, consumption went up and
amateur production went down.

~~~
dnomad
Yeah, the article is stupid nostalgia. What killed the web is very clear: a
select cadre of companies conspired and worked very hard to turn the web into
an advertising platform. The developments here are obvious:

1. Google Ads monetized linking and gave birth to the SEO "industry."

2. Facebook mandated real names.

3. Apple and Google completely closed off their mobile platforms, requiring
pre-approval for all applications.

The last piece of this is that the stewards of the web, the W3C, have
completely abandoned their charge and sold the web out to these corporations.
The W3C has allowed the evolution of the web to be completely captured by
Google and other major corporations. Rather than making the web simpler and
more accessible we've seen the W3C bless standard after standard that make the
web significantly more complicated. Today nobody but extremely wealthy
corporations can afford to develop a browser. HTTP/2 means nobody but major
corporations can write web servers these days. Abandoning a well-structured
web (XML, semantic technologies) for the current HTML5/JS soup means that the
knowledge published on the web is wholly inaccessible... unless it gets
exposed via proprietary, one-off JSON-soup APIs.

The complete corporate capture of the web wasn't driven by "blogging", nor was
it in any way a democratic process. It was a deliberate and carefully
engineered process that created a web and a computing platform (mobile phones)
that are absolutely and completely under the control of corporations, more so
than any previous platform.

At this point there are really only two ways forward: (1) abandon the web and
start over, or (2) governments step up and rein in corporations.

------
Alex3917
When the least expensive computer you could get was almost $5,000 and Internet
access cost $5 per hour (both inflation adjusted), web users were mostly
limited to academics and children of the 1%.

As the price of Internet access decreased so did the average socioeconomic
status of users, and so building tools that made content easy to create and
easy to consume suddenly became profitable. There simply is no alternate
timeline where the web could have stayed the way it was, except perhaps one
with a very abrupt ending to Moore's Law.

~~~
firemancoder
While your overall point is somewhat correct, your numbers are way off.

I was "surfing the net" in 1994 with a computer I built for around $700
($1,204.59 in today's dollars) and was using an ISP account that was $25 per
month ($43.02 today).

I was not an academic or anywhere near the 1% at the time. But I agree with
the point you're trying to make, as the net did become more accessible to
poorer people. Computers and the internet also became easier to use, which
contributed greatly to adoption as well.
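The inflation adjustments in this thread are just CPI ratios. A sketch using approximate CPI-U annual averages - the index values below are assumptions for illustration, not figures from the thread, so the results land near but not exactly on the dollar amounts quoted:

```python
# Approximate CPI-U annual averages (assumed figures for illustration).
CPI = {1994: 148.2, 2018: 251.1}

def adjust(amount: float, from_year: int, to_year: int) -> float:
    """Scale a dollar amount by the ratio of CPI index values."""
    return amount * CPI[to_year] / CPI[from_year]

print(round(adjust(700, 1994, 2018), 2))  # near the ~$1,200 quoted above
print(round(adjust(25, 1994, 2018), 2))   # near the ~$43 quoted above
```

Different calculators pick different index months, which is why published "in today's dollars" figures rarely agree to the cent.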

~~~
brett40324
Yep, same here. Local dial-up services were born in '94 in every American mid-
sized town, and half my buddies (at 10 years old) were all online between '94
and '96. None of our families were wealthy; our parents were basically lower-
to-middle-class blue-collar workers.

------
mjb
This article seems to be making two arguments. One is that popular blog
software was limiting, and its popularity discouraged people from trying other
forms of content. It's a reasonable take, but it's not clear to me that's
actually what happened. The other argument is that the web got worse as it
became accessible to people who didn't have the knowledge, skill, or time to
write their own HTML. I can't help but strongly disagree with that. Yes, the
web lost a certain kind of character, but it gained billions of people and
many other kinds of character. There's more content, better content, and more
accessibility now than ever before. That's great news, and the value of it
vastly outweighs the loss of the handcrafted HTML aesthetic.

Preservation and archiving are good things to be doing, and discovering the
signal inside all the noise of today's internet can be challenging. Still,
inviting everybody is a huge step forward.

~~~
arthev
>better content

 _Maybe_ better content in terms of the quality of the best content available,
but, as you admit yourself by mentioning the signal-to-noise ratio... the
average quality is abysmal.

~~~
kqr
Just from personal experience: when I switched from writing my HTML by hand
(even with auto-expansion in web-mode.el) to generating pages with Org, I
simply had a lot more time available to increase quality.

As an example, my article on Kernighan's writing[0] would simply not have
happened if I was still writing HTML. It was a spur-of-the-moment thing which
took very little effort thanks to the nice Org amenities.

[0]: [https://two-wrongs.com/technical-writing-learning-from-kernighan.html](https://two-wrongs.com/technical-writing-learning-from-kernighan.html)

------
ruricolist
Personally, I think Wikipedia broke the web; it absorbed a world of
individually maintained, individually slanted resources into a mass, then
slowly drained away that personality in favor of a neutral POV, one edit at a
time.

~~~
fullshark
I don't think they should be singled out necessarily, it's just one example of
the general phenomenon of the web/content becoming centralized that we're all
seemingly sad about.

------
mercer
I think what happened is the same old story: early adopters/creatives did
stuff, said stuff became accessible to the masses, and as a result the overall
quality decreased. I guess 'Eternal September' mostly covers it?

Digg was great until it became popular. Those who lamented the decline of Digg
moved to Reddit. Then a similar thing happened, despite the innovation of
subreddits, and some like myself mostly moved to HN.

Couchsurfing was great until it became overrun with people who didn't
represent the 'spirit' of CS. Those who lamented the decline of CS moved to
BeWelcome or whatever else there is.

I'm pretty sure the same patterns emerge in every human endeavour in existence
(musical genres come to mind as very similar, as does the constant protestant
splintering, starting with Evangelicals I suppose, followed by Pentecostals).

As an 'early adopter' in tech as well as a few other areas of life, I'm not
sure how to respond. Nostalgia is kinda fun. I loved reading this article. But
personally I try to remind myself that the cool part, the part that resonates
with me, is being an early adopter. I don't feel that it benefits me to start
gate-keeping or yearning for the past or whatever. Perhaps a better approach
is to consider it a victory: the masses arrived, our passion is validated,
what's next?

------
saas_co_de
I think it is misguided to think that the web is "broken."

I am sure there are more people online doing more interesting things and
producing more interesting content now than back in 1996.

It is just that, now that the internet is used by billions of people, the
signal-to-noise ratio is much lower.

Back when the internet was a few million people globally, it was a very select
group. That same group of people using the internet for cool, interesting
stuff has grown dramatically, but it is still probably only a few million
people globally. The difference is that now there are a couple billion other
people using the internet.

There is nothing wrong with billions of people using the internet for things
that interest them (mostly drama, porn, pyramid scams, and cat videos,
apparently). It doesn't detract from my ability to grok some avant-garde
research paper that I would otherwise never have access to, run programs that
pull on terabytes of data published in public databases around the world,
access virtually every film and television show from anywhere in the world,
and any newspaper, and collaborate with other weird people all over the world
who are doing this stuff.

It is too bad that Google stopped actually being a search engine and became an
ad trap, but that just means going back to how you found stuff before Google:
in chat (IRC then, but lots of places now) and in forums.

On the plus side, Google Translate has transformed access to content. Pre-2000,
the internet was basically English-only. There is now so much more diversity,
and Google Translate makes it possible to access content from a far larger
group of people than was possible in the good ol' days.

------
startupdiscuss
Consider this alternative theory: the change ("breaking") you see, the
preference for a timeline, is actually caused by the advertising model.

To advertise, you need people to come back and check the site frequently.

In order to do that, you have to produce new content all the time, and the
reverse-chronological timeline is the best way to surface that new content.

(Once you have too much fresh content, you have to come up with a way to sort
even that.)

------
DoreenMichele
Those quirky blogs are still out there. I have several myself.

They are harder to find, I think, and they are easier to ignore when there are
slicker, more commercial things competing. Maybe the real thing that changed
is that all of the "hobbyists" with homepages back then were unusually well
educated and specifically knowledgeable about the internet, and that's not
true anymore. So, instead of the web being written almost exclusively by
people with substantial college education (or who were simply very well read
and self-educated) and very similar nerdy interests, parts of it are now
written by people with other backgrounds and interests.

------
epx
One thing I disliked about blogs, which made me migrate back to an old-school
website, was the low bar to writing on a blog; I ended up writing a lot of
impulsive rants. Publishing on your own site makes you think twice and work
out your opinions before writing. Another blog problem was the "planet": your
blog was syndicated by a planet, and suddenly people complained that you were
writing articles not in the planet's default language and subjects. I had no
fewer than 4 blogs to please the planets until I got fed up.

------
codingdave
And some further commentary and context around this:
[https://kottke.org/18/07/did-blogs-ruin-the-web-or-did-the-web-ruin-blogs](https://kottke.org/18/07/did-blogs-ruin-the-web-or-did-the-web-ruin-blogs)

------
kickscondor
Here's my critique of the current web:
[https://www.kickscondor.com/2018/07/02/things-we-left-in-the-old-web/](https://www.kickscondor.com/2018/07/02/things-we-left-in-the-old-web/)

Chronos is definitely an issue. But I think the bigger issues are: a move away
from custom design, the lack of a "home page" feeling, and the hostility
toward self-promotion. I know people talk about decentralization a lot - but
it doesn't intrinsically solve these other issues.

------
corodra
"Back in my day we only had static html and we were happy!"

All I kept hearing in my head while reading this article.

------
hjek
> The backgrounds were grey. The font, Times New Roman. Links could be any
> color as long as it was medium blue. The cool kids didn’t have parallax
> scrolling… but they did have horizontal rule GIFs.

That's exactly what the web looks like in graphical Links [0], with the grey
background and all, only difference being that the GIFs aren't animated.

[0]: [http://links.twibright.com/](http://links.twibright.com/)

------
squiggy22
The blog also did something else: it gave non-technical users a voice. Do we
celebrate that, or mourn for a web full of people just like us?

In the same way that Facebook killed blogging - once a better mousetrap for
publishing comes along, it finds an audience, albeit the great technically
unwashed.

------
lkrubner
This seems to be wrong:

" _By late 2000, there were still only 1,285 according to Eatonweb. Same
disclosures apply on those numbers, of course, but seriously…_ "

Both Blogger and LiveJournal had launched in 1999. They both had many
thousands of blogs by mid-2000.

------
nhoven
Ah, nostalgia!

Yes, chronological blog feeds were hot back then, but now that FB/Instagram
have taken over the role of lifestreaming platform, we're starting to see some
blogs move away from the constraints of "chronostreaming", and more towards a
collection of essays.

Search traffic also tends to have a power-law distribution, so the majority of
your incoming traffic will be focused on a few popular pages, meaning that in
the long term there's really no need for a sequential stream of posts. One of
the main benefits of a blog is its ability to earn search traffic for years,
so why limit yourself by dating your own content?

------
zokier
> Every design decision you make represents roughly equal work because, heck,
> you’ve gotta do it by hand either way.

This seems extremely fallacious to me. Surely different decisions would have
vastly different workloads? I would imagine the impact to be even greater for
hand-crafted stuff, because you aren't automating any of the workload. I know
that is just one sentence in the whole post, but it also seems like one of the
core arguments.

------
octosphere
There's a small resurgence of those old tacky websites on Glitch
[https://glitch.com](https://glitch.com)

And some more context by Anil Dash about the old web / Geocities web dying out
[https://anildash.com/2012/12/13/the_web_we_lost/](https://anildash.com/2012/12/13/the_web_we_lost/)

------
kyledrake
Shameless plug for a weird thing I made:
[https://elementcss.neocities.org/](https://elementcss.neocities.org/)

The idea is to make it a lot easier to build sites with just simple HTML
elements, the way I used to do it before CSS. Generally designed for text-
oriented sites.

------
sbjs
I feel like this is a major point in this article, if not _the_ major point:

"Homepage production became suddenly a question of economics: Go with the
system’s default format: zero work. Customizing the system to your format: way
more work than pure HTML ever was"

I noticed that same exact trend even with WordPress or Bootstrap pages. I
refuse to get into WordPress as a career. It's just copying and pasting! But
if you write a React site from scratch, you really get full control over how
everything looks and works.

That's what I'm banking on, that people still want custom websites that don't
look generic. Those are the kind of gigs I want to get: make something
completely custom and unique, and make it _beautiful_ and still _functional_.
I already have one gig doing this and it's great, I'm in love!

~~~
smacktoward
_> I refuse to get into WordPress as a career. It's just copying and pasting!_

No, it really is not. There's a lot of things about WordPress to not like, but
working with it beyond a certain level _absolutely_ involves writing code, and
if you know how to write code you can "get full control over how everything
looks and works" just as much as you can writing your own custom CMS.

~~~
antonvs
True. There's copying, pasting, and monkey-patching!

------
tomtimtall
The web isn’t broken.

~~~
jasonvorhe
Thanks for that. Reading these comments makes you feel as though we're in a
weird parallel universe where the internet disappeared somehow.

~~~
krapp
>Reading these comments makes you feel as though we're in a weird parallel
universe where the internet disappeared somehow.

I've seen people claim that the big social media silos have "centralized" and
"taken control of" the web so often that I'm starting to wonder if it's just
hyperbole or if people really do think the rest of the web somehow ceased to
exist.

------
thanatropism
Org-mode can publish as tidy single-file HTML pages. I should get a neocities
account and keep a non-blog.

------
ZainRiz
tldr: At first making web pages and writing blogs was hard and tedious. So
very few people did it and we had awesome things like "the big red button that
doesn't do anything"

Now, making blogs is easy, so millions of people do it and make generic
websites. The fact that their quality is much better is irrelevant; the
websites look more generic!

------
ddingus
Comment to mark this excellent discussion for later.

------
Donzo
The web is not broke. People just prefer stories, situated within time, to
"homepages," which are like reading resumes.

------
Kenji
_No two homepages were alike. There was certainly no such thing as a Content
Management System. But it didn’t mean that the homepage content wasn’t
managed._

You know, you can do that too even with a modern website and a clean blog. I
have a website where I wrote everything myself, a complete front- and backend.
Blog engine with hierarchical blog comments, all 100% custom. You just gotta
put in the work to do this.

