
The Return of the 90s Web - mxbck
https://mxb.dev/blog/the-return-of-the-90s-web/
======
waltbosz
What I miss most from the early days of the Internet is the content. It was
all created with love.

My theory is that the high barrier to entry of online publishing kept all but
the most determined people from creating content. As a result, the little
content that was out there was usually good.

With today's monetized blogs, it is often content for content's sake. People
don't try, or they write about topics they're not really interested in, just
to have a new post. Or often the writing is simply bad.

Maybe today's problem isn't the blogs, but the SEO that puts the crap blogs at
the top of the search results. Or maybe I'm misremembering and the old content
was crap too, or maybe my standards are higher than they were in my teenage
years.

~~~
kickscondor
People are still creating great stuff along these lines - you just won't find
it through Google or Facebook or most of Reddit. Complex, interesting
hypertext creations and web sites are still everywhere. But try typing
"interesting hypertext" into Google or Facebook and see where it gets you. You
can't search for something that's off the beaten track.

This is where directories come back in. Check some of these out:

* [https://marijnflorence.neocities.org/linkroll/](https://marijnflorence.neocities.org/linkroll/)

* [https://neonaut.neocities.org/directory/](https://neonaut.neocities.org/directory/)

* [https://webring.xxiivv.com/](https://webring.xxiivv.com/) (which led me to this gem: [https://dreamwiki.sixey.es/](https://dreamwiki.sixey.es/))

Competing with Google in search has become an insurmountable task. Personal
directories attack from the opposite direction (human curation, no algorithm)
in a way that actually puts Google far behind. It's kind of exciting and
unexpected.

~~~
nextaccountic
What we really need is a new Google, built on open principles (decentralized /
peer to peer, fully free software, backed by a nonprofit), and focused on
indexing the long tail of insightful content that is neglected by Google
because it lacks SEO, popularity, links, and other metrics that Google finds
interesting but that we don't necessarily care about.

~~~
StillBored
For a long time I assumed that google indexed pretty much everything, and it
was only a question of providing a specific enough set of search terms to
drag up older content.

But what you hint at might be closer to the truth these days. They are running
a reverse wayback machine, in that anything not changed in the last year gets
removed. If you click into the advanced search, its "updated within" filter
has a maximum timeframe of a year.

In fact it seems the date range example doesn't even work:
[https://developers.google.com/custom-search/docs/structured_...](https://developers.google.com/custom-search/docs/structured_data#page_dates)

If I fiddle with it, it returns a result, but I see a hit from just a few
days ago at the top...

~~~
koreth1
> They are running a reverse wayback machine in that anything not changed in
> the last year gets removed.

Sometimes I _wish_ that were true! Try Googling for, say, PostgreSQL
documentation and the top result will often be for a 10-year-old version of
the software.

~~~
pphysch
Nitpick: that's kind of a non-example, because the official Postgres docs let
you swap versions more or less seamlessly. IME I click the top result, then
click the correct version, for my Postgres SE queries. 2 clicks, 0
scrolling.

~~~
rpdillon
You're right. I tend to click back, then revise my search to include the
version, and then click the result, so Google gets the message that when I
search for Postgres docs, I want the most recent version. I have no idea if
this actually works, but I heard Google uses bounces to determine relevancy,
so I thought it was worth a shot.

~~~
aarong11
Might as well piss in the wind. The number of back-links from different
sources probably has a much larger effect.

------
JohnBooty
I can't _wait_ for server-side rendering to take its place in the sun again.

There are many use cases for which a client-side framework like React is
essential.

But I feel the vast majority of use cases on the web would be better off with
server-side rendering.

And...

There are issues of ethics here.

You are kidding yourself to an extent when you say you are building a
"client-side web app." It is essentially an application targeted at Google's
application platform, Chromium. Sure, React (or whatever) runs on FF and
Safari too. For now. Maybe not always. They are already second-class citizens
on the web. They will probably be second-class citizens of _your_ client-side
app too, unless your team has the resources to devote equal time to
non-Chromium browsers. Unless you work in a large shop, you probably don't.

Server-side rendering is not always the right choice, but I also do see it as
a hedge against Google's, well, hegemony.

~~~
Polylactic_acid
The less stuff on the server the better, imo. Whenever I can get away with it,
I use a static site generator, or Vue.js with a JSON file containing all the
data for the site. Being able to just drop a static set of files into a
webserver, without any risk of security issues in my code, is great. I also
hate the tools for backend rendering: if you need any kind of interactivity,
it's so much easier to have just built it all in Vue/React, with no downsides
other than not running in someone's CLI web browser.

~~~
timwis
> Being able to just drop a static set of files in to a webserver without any
> risk of security issues in my code is great.

What security risks are removed by using a client side app instead of a server
side one?

~~~
Polylactic_acid
No database, no code running on your system other than nginx which I trust a
whole lot more than myself.

------
purerandomness
I recently watched the "Helvetica" documentary that was posted here a few days
ago [0], where they briefly mention "Grunge Typography" [1], a seemingly
dead-end branch of typography that, for some strange reason, became pretty
popular for a short period of time.

After some years, however, a consensus formed amongst designers that what
they'd created was a pile of illegible garbage, and they realized there was no
way forward other than to completely dismiss that branch, go back to the
roots, and evolve again from a few steps back.

I feel the same kind of consensus is slowly forming around ideas like SPAs,
client-side rendering and things like CSS-in-JS.

We saw the same happen with NoSQL and many other ideas before that.

We recently deployed an entire SaaS only using server-side rendering and htmx
[2] to give it an SPA-like feel and immediate interactivity where needed. It
was a pleasure to develop, it's snappy and we could actually rely on the
Browser doing the heavy lifting for things like history, middle click, and not
break stuff. I personally highly recommend it and see myself using this
approach in many upcoming projects.

[0] [https://www.hustwit.com/helvetica/](https://www.hustwit.com/helvetica/)

[1] [https://www.theawl.com/2012/08/the-rise-and-fall-of-grunge-t...](https://www.theawl.com/2012/08/the-rise-and-fall-of-grunge-typography/)

[2] [https://htmx.org/](https://htmx.org/) (formerly "Intercooler")
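The htmx approach described above comes down to the server answering requests with ready-made HTML fragments rather than JSON, so the browser just swaps markup into the page. Here's a minimal stdlib-Python sketch of such a fragment endpoint; the route, markup, and names are hypothetical, not the actual SaaS code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_row(item_id: int) -> str:
    # Server-side rendering: the fragment is built as plain HTML.
    # htmx would swap this string directly into the target element.
    return f'<tr id="row-{item_id}"><td>Item {item_id}</td></tr>'

class FragmentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A page element like
        #   <button hx-get="/rows/42" hx-target="#rows">Load</button>
        # triggers this request; we answer with a bare HTML fragment,
        # no JSON serialization or client-side templating involved.
        item_id = int(self.path.rsplit("/", 1)[-1])
        body = render_row(item_id).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), FragmentHandler).serve_forever()
```

Because the browser keeps doing normal navigation between such responses, history, middle-click, and the rest of the browser behaviour come along for free.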

~~~
dewey
I have fond memories of creating images with Grunge fonts in some pirated copy
of Photoshop and then positioning them with HTML tables and Dreamweaver.

~~~
binarytox1n
This is how I learned web development. Don't forget photoshopping the
glossiest buttons possible. Not sure if that fad was before or after the
grunge.

~~~
sidpatil
Gloss was a trend in the mid-2000s, if I remember correctly—around the same
time that AJAX started becoming really popular.

~~~
brandonhorst
Web 2.0. Such a bright outlook in those days :)

------
simias
I'm not a web developer but my girlfriend needed a website to show her
photography work so I decided to make it for her.

It's the simplest thing in the world, basically just three columns with photo
thumbnails and the only javascript is some simple image viewer to display the
images full screen when you click the thumbnails.

It's really, really basic, but I was impressed with the feedback I received:
many people remarked on how slick and fast it was. So I went looking at
professional photographers' websites, and what a huge mess most of them are.
Incredibly heavy frameworks for very basic functionality, splash screens to
hide the loading times, etc... It's the Electron-app syndrome: it's simpler to
do it that way, so who cares if it's like 5 orders of magnitude less efficient
than it should be? Just download more RAM and bandwidth.

Mine is a bunch of m4 macros used to preprocess static HTML files, plus a
shell script that generates the thumbnails with ImageMagick. I wonder if I
could launch the new fad in the webdev community. What, you _still_ use React?
That's _so_ 2019. Try m4 instead, it's web-scale!

~~~
abrookewood
Can you give more details? I don't think I've ever heard of M4 before.

~~~
hedora
You probably already have m4 on your development machine:

[https://en.m.wikipedia.org/wiki/M4_(computer_language)](https://en.m.wikipedia.org/wiki/M4_\(computer_language\))

~~~
bartread
OK, I'd never heard of this before and now I'm questioning why we have all
these JS templating frameworks like Mustache, Pug[1], and the rest.

[1] I actually use this one and it's just fine - great, in fact - but seeing
M4 does make me feel like a lot of people may have spent a lot of time
reinventing a wheel.

~~~
simias
I actually looked into some of these frameworks (in particular mustache but
also a few others) before deciding to use m4. They were all either too complex
or too limited (sometimes both!) for my use case.

M4 is old and clunky but it gets the job done without having to install half a
trillion Node.js dependencies. I also know that my script will still work just
fine 5 years from now (or even 50 years in all likelihood).

That being said, don't trash your favourite JS framework right away, while m4
is perfectly fine for simple tasks I don't even want to imagine the layers of
opaque and recursive macros you'd end up having to maintain for any moderately
complex project. It's like shell scripts, it's very convenient but it doesn't
really scale.

~~~
bartread
> while m4 is perfectly fine for simple tasks I don't even want to imagine the
> layers of opaque and recursive macros you'd end up having to maintain for
> any moderately complex project

I'm not doing anything _too_ clever with Pug, but there are certainly a bunch
of things that it makes quite easy that would otherwise be awkward or complex.

Lots of things work really nicely as well: includes, sections, configuration,
and it certainly cuts down on typing. I'm absolutely _not_ contemplating a
switch from Pug to M4.

I just find it interesting that there's this thing that's been hanging around
for decades that would do at least a partial job and is still decent for
simpler use cases.

------
pgm8705
I'm glad this is the case. I've been a Rails developer for close to 10 years
now, but 3 or 4 years back I got sucked into the React world. I bought right
in and my company quickly adopted the "React on Rails" pattern. Looking back,
it was one of the worst professional decisions I've made in my career. Now
we're back to server side rendering and StimulusJS on the front-end when
needed. Productivity is way up, and developer happiness is way up. With new
tools like [https://docs.stimulusreflex.com](https://docs.stimulusreflex.com)
and
[https://cableready.stimulusreflex.com](https://cableready.stimulusreflex.com)
I'm very excited about what can be accomplished with minimal JS.

(Note: I still think React is an awesome library! I'm sure there are devs that
are super productive with it too. It just wasn't the best fit for me and my
company)

~~~
wlll
A company I contract to for backend and server stuff made the jump from static
HTML to client-side rendering with React. They did it because the consulting
company they went to recommended it "because it was the future", and I am
sure in no small part because that was what the consulting company specialised
in.

It was the worst decision they have ever made. The site they ended up with was
incredibly slow, and given the relatively few pages on the site, you never
really make up for that initial load in time saved later.

It's also incredibly hard to write well, requires a special third party
service to show _anything_ in Google and is incredibly hard to manage.

They don't realise this of course, and are now attempting to solve the
management and initial load issues by splitting the app up into three distinct
apps. It won't help.

------
kickscondor
I really like the turbolinks approach - you simply write HTML and then include
the script in your head tags. However, I'm still hooked on Markdown. So I am
still prerendering HTML - and then doing the routing with Hyperapp. (See
[https://href.cool/Tapes/Africa](https://href.cool/Tapes/Africa) for an
example - you get a prerendered static HTML page, but it uses JavaScript from
there to render the other pages.)

The ultimate approach is Beaker Browser though. You can actually just write
your whole site in Markdown (/index.md, /posts/batman-review.md, /posts/covid-
resources.md) and then write a nice wrapper for them at /.ui/ui.html. This
means you can edit posts with the built-in editor - and people can 'view
source' to see your original Markdown! It's like going beyond the 90s on an
alternate timeline.

(A sample of this is this wiki:
hyper://1c6d8c9e2bca71b63f5219d668b0886e4ee2814a818ad1ea179632f419ed29c4/. Hit
the 'Editor' button to see the Markdown source.)

~~~
pmlnr
I kinda went down the same path to generate my site. I had a static generator
which worked like most static generators do, as a standalone program, and
moved to having a Python script (and its .venv) in the root folder of my
content, which takes the Markdown and converts it to HTML.
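As a sketch of what such a build step does, here's a stdlib-only toy converter handling just ATX headings and paragraphs — the real script presumably uses a proper Markdown library, so treat this as illustration only:

```python
import re

def md_to_html(text: str) -> str:
    # Split the source into blank-line-separated blocks; each block
    # becomes either a heading (if it starts with 1-6 '#') or a <p>.
    html = []
    for block in text.strip().split("\n\n"):
        m = re.match(r"(#{1,6}) (.+)", block)
        if m:
            level = len(m.group(1))
            html.append(f"<h{level}>{m.group(2)}</h{level}>")
        else:
            html.append(f"<p>{block}</p>")
    return "\n".join(html)
```

Running the script over every .md file in the content folder and writing the result next to it would give the static-generator behaviour described above.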

------
dredmorbius
Schopenhauer's 19th century essay "On Authorship" has been a personal fave
since discovering it last year:

 _Writing for money and reservation of copyright are, at bottom, the ruin of
literature. No one writes anything that is worth writing, unless he writes
entirely for the sake of his subject. What an inestimable boon it would be, if
in every branch of literature there were only a few books, but those
excellent! This can never happen, as long as money is to be made by writing.
It seems as though the money lay under a curse; for every author degenerates
as soon as he begins to put pen to paper in any way for the sake of gain. The
best works of the greatest men all come from the time when they had to write
for nothing or for very little...._

[https://www.gutenberg.org/files/10714/10714-h/10714-h.htm#li...](https://www.gutenberg.org/files/10714/10714-h/10714-h.htm#link2H_4_0003)

Brain Pickings articulates my reasons well, though really, just read the
source:

[https://www.brainpickings.org/2014/01/13/schopenhauer-on-aut...](https://www.brainpickings.org/2014/01/13/schopenhauer-on-authorship/)

~~~
jasondclinton
Yes, the joke is that unemployment benefits are the real Endowment for the
Arts. One wonders if we would have a healthier society if those who are
capable of creating great art could do so by being freed up from financial
constraints.

~~~
pinkfoot
We would also play better golf. Still keen to subsidise it?

~~~
jasondclinton
"The Player of Games" by Iain M. Banks takes place in a far-future society in
which abundance is so high that many people do pass the time playing games, or
whatever else enriches their lives. The question is: since we long since
passed the point at which most people could work one day a week and maintain
the level of abundance that existed in the 1960s, is that the society we
want to live in?

------
secondcoming
My iPad is 10 years old and there are websites that bring it to its knees,
especially mobile.twitter.com links. I don't click them anymore, it's too
frustrating. Maybe web devs should be given low-end machines to work on so
they can experience what their non-desktop users experience. The whole 'mobile
web' distinction really shouldn't need to exist, my iPad isn't a mobile phone
from 2005.

~~~
coder543
If your iPad is 10 years old, then it’s an original iPad. The original iPad
received its last update 8 years ago. I really enjoyed my launch day original
iPad — it was incredible, just because it was such a new experience, but the
hardware was severely underpowered even for the time. Laptops from 2010 are
probably much better equipped to browse the web today than that 2010 iPad.

It just seems very idealistic to expect most websites to cater to a browser
that hasn’t received an update in 8 years running on a processor that has a
fraction of the power of any Raspberry Pi 2 or above.

Modern iPads are served the “real” web by default, further emphasizing why you
would benefit from upgrading at least once a decade.

~~~
echlebek
A 1 GHz Cortex-A8 is a ton of computing power. The proof of this is iOS
itself. When you bring the GPU into the mix, it's obviously capable of a lot
as an operating system. There are no problems displaying high-resolution
static content and video content on an iPad. But around the same time, it
became fashionable to start rendering static content on the fly, maybe
inserting some not-so-static content around it to justify this choice.

Add in advertising. And a browser that you might have trouble controlling
script execution on.

Add in tracking. Remember that story about how eBay fingerprints your browser
in part by mapping the open websockets you have? That costs power and cycles.
And the cookies have been piling up since the 90s.

Speaking of storage, it's now much more common to use localStorage. Now anyone
with a website on the internet can store appreciable amounts of stuff on your
computer if you visit their website. And they can read and write that storage
as much as they want while you're on their site, without regard for your
computer's performance.

And all of this is just considered normal and regular. This isn't even getting
towards abusive behaviour like running a crypto miner in a browser or
something. This is just web applications continually expanding their resource
entitlements.

The web is a great honking tire fire. Many articles and books have been
written about this, many of them are summarily dismissed by web developers as
ivory tower nonsense. But the trajectory of system requirements for displaying
mainly-text documents is growing at an unsustainable pace, and there is
eventually going to be some kind of reckoning.

I have a $3000, 1-year-old laptop, and sometimes Slack gets so slow I have to
kill the browser process and start over again. The issue is not hardware.

~~~
miffy900
>I have a $3000 1 year old laptop, and sometimes slack gets so slow I have to
kill the browser process and start over again. The issue is not hardware.

Keep in mind, the 1st-gen iPad was single-core and only had 256MB of RAM, the
same RAM capacity as the then-current iPod touch. Compare it to any PC with a
single-core Intel CPU and the experience would be largely the same, except of
course the Intel CPU would require 10x more energy to run.

And yes, it did have a really great CPU for its time. Its core CPU
architecture remained largely unchanged for two more generations. The iPad 2
added an extra core (A5), and with the iPad 3rd gen, Apple gave it more GPU
power in the SoC (A5X).

~~~
giantrobot
The issue is that when the GP's iPad was new, it didn't have many problems
browsing the web. I know mine did not. Outside of Flash-using sites or extreme
JavaScript beasts the original iPad was a pretty capable web browsing machine.

The web has regressed when an old iPad (or PC) can no longer reliably view it.
It's not like words got harder to display in the intervening decade.

A blog post, news article, or a tweet shouldn't require a quad core CPU and
gigabytes of RAM to be readable.

~~~
coder543
> The issue is that when the GP's iPad was new, it didn't have many problems
> browsing the web. I know mine did not.

Those are some rose-colored glasses. The first page (TechnoBuffalo) loaded in
this video takes 10 to 15 seconds:
[https://youtu.be/caTUPKJ5Zfo](https://youtu.be/caTUPKJ5Zfo)

Based on what was said, that page had already been loaded once, and it still
took that long.

Loading a Google Search took only about two seconds, because of how extremely
minimal and optimized the search page was back then.

Loading The NY Times took about 8 to 12 seconds, depending on where you draw
the line.

So, at the time, maybe we were used to webpages taking a while to load on
mobile devices, and it seemed very reasonable. I’m sure there were some
extremely simple websites that loaded quickly, but The NY Times was one that
Apple promoted heavily at the time (apparently) as demonstrating what a good
experience the iPad browser was.

Nowadays, we hold the web and our devices to much higher performance
standards.

It’s not an apples to apples comparison because the content is entirely
different, but the context of this thread is that modern websites are
significantly harder to load and render.

Even though content is supposedly much more resource intensive today, my 2018
iPad Pro loaded the technobuffalo home page initially in less than 3 seconds,
and subsequent page clicks are even faster.

This iPad loads google search results even faster than the original iPad.

This iPad loads the New York Times in under 3 seconds, with subsequent page
clicks taking the same or less time.

Based on the numbers, a contemporaneous 2010 iPad was 3x to 5x slower at
browsing the 2010 web than my 2018 iPad is when it comes to browsing the 2020
web, and that’s a roughly two year old iPad design, so it should be even more
at a disadvantage. My iPad is also rendering more than 5x as many pixels while
doing that.

In conclusion, the original iPad was severely underpowered.

> A blog post, news article, or a tweet shouldn't require a quad core CPU and
> gigabytes of RAM to be readable.

Conceptually, I agree, but every image and video we use for content in
websites now is significantly higher resolution and quality than they were
back then. If you just want to read text, then you’re correct.

~~~
giantrobot
Page load times are a trailing indicator for web performance, they're not
really the core issue I'm talking about. The original iPad could render even
large HTML documents at a usable speed, including styles and inline images.
Once a page was loaded and rendered, scrolling, tapping links, and
interacting with forms were all usably fast. Even while the content was
loading you could interact with the page.

The video you linked _showed_ this capability. That TechnoBuffalo page was a
pathological case for the iPad rendering and it was still interactive fairly
quickly even if all of the resources weren't finished loading. I had the
original iPad and browsing worked just fine on it. Even when pages took a long
(multiple seconds) time to load they were scrollable and interactive. I could
_read_ the content as everything loaded.

The issue today is there's no page to start rendering in the bloated
JavaScript way of the world. The HTML received from the server is just a
reference to load all the JavaScript which needs to parse and execute _then_
fetches and renders the content. A relatively low powered device, like the
original iPad, needs to do vastly more work to display even just text content
than a static HTML document.

It's not shocking that your modern iPad renders pages faster than the model
released a decade prior. Not only does it have far more power and memory but
the network (both last mile and far end) is faster. It's also got an extra
decade of development on WebKit. The web is more bloated but the modern iPad
has ramped up its power to compensate.

Look at Reddit versus old.reddit.com. The "modern" Reddit page has poor
interactivity even on my current iPad. The old.reddit.com site, which is
similar in complexity to 2010's Reddit, renders damn near instantly and has no
interactivity issues.

> Conceptually, I agree, but every image and video we use for content in
> websites now is significantly higher resolution and quality than they were
> back then. If you just want to read text, then you’re correct.

Using huge images and video for "content" is part of the problem. Images have
been used on the web since Mosaic, older devices can handle inline images just
fine. It's the auto playing video ads and tens of megabytes of JavaScript
executing to display a dozen paragraphs of text that's problematic.

~~~
coder543
> It's not shocking that your modern iPad renders pages faster than the model
> released a decade prior. Not only does it have far more power and memory but
> the network (both last mile and far end) is faster. It's also got an extra
> decade of development on WebKit. The web is more bloated but the modern iPad
> has ramped up its power to compensate.

The main point was that the 2010 iPad was being given the most favorable
conditions, and it still lost horribly, because even compared to
contemporaneous devices, it was very underpowered, unlike current iPads:

- your claim is that websites are substantially heavier now (which I agree
with), putting the 2018 iPad at a disadvantage

- the 2010 iPad was browsing early-2010 websites in that video, so we've had
10 years of bloatification since then

- the 2018 iPad Pro is browsing 2020 websites, websites built years after it
was released, so surely more "bloated" than they were in 2018

- being 2010 websites, they were probably much simpler to render

- the 2010 iPad's screen had 5x fewer pixels to contend with

Nowhere was I saying that the 2010 iPad was super slow to render 2020 webpages
in all their bloaty goodness. _That_ would be an _obvious_ conclusion. If the
2010 iPad's performance was so good at the time, but only became slower as the
web became much more bloated, why was it still so much slower at browsing 2010
websites than my 2018 iPad is at browsing 2020 websites?

The 2010 iPad was actually slow from the beginning, as the video proves. Since
it was slow back then, it shouldn't be surprising that it's slower and more
painful now that websites want to support higher resolution experiences by
default. Yes, they could put effort into giving old devices a lower res
experience, but why? That old browser is one giant security vulnerability at
this point, and no one should be browsing any websites they don't control on
that thing.

Even with all those advantages being in the 2010 iPad's court, it was still 3x
to 5x slower than a 2018 iPad browsing 2020 websites at 5x the resolution.
This is not even a 2020 iPad Pro -- this is a 2018 iPad Pro. Imagine how much
worse a 2008 iPad would have been at browsing 2010 websites, if it had
existed.

You say that it's "not shocking" that they ramped up the power so it can
browse better, but the point is that we're loading _substantially_ heavier
websites today _significantly_ faster.

How is that possible? Because the 2010 iPad was _severely underpowered_. If it
had been running on a chip that was equivalent to laptop processors of the era
(as my iPad Pro's chip is), then it would likely have loaded the 2010 websites
about as quickly as my iPad is loading 2020 websites.

> Once a page was loaded and rendered the scrolling, tapping links, and
> interacting with forms was all usable fast. Even while the content was
> loading you could interact with the page.

Yes, it's very impressive how much interactivity Apple was able to give the
2010 iPad with its really terrible processor, once the loading finished.

That interactivity isn't because the chip was any good. I remember very
clearly that it was because you were basically scaling a 1024x768 PNG while
you zoomed in and out. Once you let go, the iPad would take a second to re-
render the page at the new zoom level, but you were stuck staring at a blurry
image for a second after zooming in. The GPU was really good at scaling a
small image up and down. The CPU was not so good at rendering websites.

It was also very easy at the time to scroll past the end of the pre-rendered
image buffer, and you would just stare at a checkerboard while you waited on
the iPad to catch up and render the missing content. iOS actually drastically
limited the scrolling speed in Safari for many years to make it harder for you
to get to the checkerboard, but it was still easy enough.

> Look at Reddit versus old.reddit.com. The "modern" Reddit page has poor
> interactivity even on my current iPad. The old.reddit.com site, which is
> similar in complexity to 2010's Reddit, renders damn near instantly and has
> no interactivity issues.

New Reddit is one of the worst websites on the entire internet right now, if
not actually the worst popular website in existence. I really don't understand
how that hasn't been scrapped at this point. It's not representative of modern
web experiences, except possibly in your mind. YouTube, The NY Times,
Facebook, Amazon... these are all modern web experiences that work great on
anything approaching reasonable hardware.

------
stickfigure
This seems to be one developer's wishful thinking, without any evidence
presented to back up the assertion. Pointing out "hey, here's a couple
websites that do server side rendering" does not a trend make.

We're ripping out webflow, if anecdata counts for anything (it doesn't).
Webflow occupies the barren middle ground of "too complicated for marketing
people, too simple for technical people". I find it much easier to write html
than to figure out how to get their UI to make the html I want.

~~~
petepete
GOV.UK is a good example of a mainstream site that's built in the traditional
manner. It's actually a collection of hundreds (thousands) of separate
services, the vast majority of which are rendered on the server and use JS
only where necessary.

As there's no advertising or images on most pages they tend to be incredibly
fast too.

------
yagodragon
I really hope personal blogging becomes popular again! Speaking of which, I
still haven't found a really good alternative to the "horrible" WordPress for
blogging. It has:

- Integrated API, RSS

- Tons of plugins

- Accessibility, translations

- Easy and powerful editor (Gutenberg)

- Comment sections and forms w/ complete ownership and moderation

- Easy data imports from multiple platforms

- Users and roles

- 100% open source w/ GPL. You own your data

- Extremely easy and cheap to host and move around

I love modern tooling and git based workflows for all my project but my
"static" 11ty/Gatsby.js blog doesn't provide all these features out of the
box. Instead of writing, you end up reimplementing basic cms features.

~~~
movedx
Have you considered something like Ghost? ghost.org

~~~
yagodragon
Ghost looks great and very polished! For me though, it provides much less out
of the box (no built-in comments, etc). It's also harder to self-host and
manage, and the hosted service costs $30/month, which is extremely pricey for
an indie blog. The third-party ecosystem for WordPress (plugins/themes) is
huge; I can easily extend my blog into an e-commerce store, an online
community, etc.

~~~
input_sh
> It's also harder to self host and manage

Eh, I disagree. The only difference is that it uses Node instead of PHP. You
still hook it up to MySQL/MariaDB/SQLite and you're good to go.

Plus I consider its Members feature
([https://ghost.org/members/](https://ghost.org/members/)) to be a game-
changer, though it's powered by Stripe, so you'll have to be in one of 39
countries supported by it to make it work without extensively hacking your
theme.

~~~
yagodragon
That's a big difference. I can't use cheap shared hosting, and managing a VPS
(OS updates, database backups, etc.) is too much for a small blog. Node.js is
fine, but it turns out that after all these years, PHP is still the king of
CRUD/blog/CMS-type applications, which are 99% of the websites out there.

As for the members feature, WordPress has tons of "membership" plugins. Still,
Ghost's landing page for members[0] does a great job of explaining why I need
this. Excellent design and copywriting.

[0] [https://ghost.org/members/](https://ghost.org/members/)

~~~
johnonolan
Hey! John from Ghost here :) here's a direct comparison of setting up self-
hosted WP vs Ghost, just as a point of reference:
[https://youtu.be/rMKNgV1gTHg](https://youtu.be/rMKNgV1gTHg)

------
pjmlp
It never went away. For those of us old-fashioned devs on Java and .NET
stacks, it has been our bread and butter for the last 20 years of web stacks:
SSR with some JavaScript on top.

I guess what is happening is that newer generations are re-discovering that it
actually makes sense to generate static content once, instead of redoing it on
every client device across the globe.

------
sbussard
Social networks are failing us and we want independence and community. The web
used to be that, then it turned into a gated gossip community.

------
dhosek
Funny that he started with the claim that the dancing baby gif wasn't coming
back. Turns out, it's already back.
[https://twitter.com/JArmstrongArty/status/122590192989894656...](https://twitter.com/JArmstrongArty/status/1225901929898946561)

------
badsectoracula
> Frontpage and Dreamweaver were big in the 90s because of their “What You See
> Is What You Get” interface. People could set up a website without any coding
> skills, just by dragging boxes and typing text in them.
>
> Of course they soon found that there was still source code underneath, you
> just didn’t see it. And most of the time, that source code was a big heap of
> auto-generated garbage - it ultimately failed to keep up with the
> requirements of the modern web.

If you do not see the source code, it doesn't matter whether it is garbage or
whether it follows any "modern web requirements" - all that matters is whether
it does what you expect it to do. Besides, it is a bit hypocritical nowadays
to complain about the code underneath a WYSIWYG tool when many web developers
use transpilers that target CSS and JavaScript, and pretty much all sites rely
on dynamically generated and altered HTML that doesn't let you make any more
sense of the final output than something like Frontpage or Dreamweaver would
generate.

Sadly, the closest thing I could find nowadays to a WYSIWYG site editor is
Publii[0]. It suffers greatly from the "developer has a huge screen, so they
assume everyone has a huge screen" syndrome, and I really dislike pretty much
all of the themes available for it (everything is too oversized). And it is an
Electron app - because of course it is - despite not needing to be one (it
doesn't offer full WYSIWYG functionality, only for the article editor, which
isn't any more advanced than Windows 95's WordPad, and it relies on an
external browser to show you the final site). But it does the job (I tried it
on a new attempt at a dev blog of mine[1]), even if I dislike how oversized
everything is.

[0] [https://getpublii.com/](https://getpublii.com/)

[1] [http://runtimeterror.com/devlog/](http://runtimeterror.com/devlog/)

~~~
CliffStoll
I first hardcoded kleinbottle.com with handwritten HTML; over 20+ years, I've
gone through several tools. One by one they've evaporated - Home Page,
FrontPage, GoLive, Dreamweaver 5. I'm now hobbling along with BlueGriffon.

------
bradgessler
I’d love to use a search engine that simply didn’t index websites with
moderate or excessive amounts of JavaScript, images, and video.

You wouldn’t need AMP because it would load quickly, ads would be minimal, and
the text content would probably be forced to be higher quality because it
would have to stand on its own.

Does such a thing exist?

~~~
hedora
Even better: blacklist all sites with ads. I'm guessing Google will never
implement that.

~~~
Summershard
How would websites create revenue then? People are very unwilling to pay for
online articles.

~~~
system2
Maybe websites should stop spamming junk for SEO and stop relying on clickbait
to create revenue? I am very happy to use ad blockers and have very little
sympathy.

------
codr7
I've been playing around with this for several years now; building more or
less elaborate frameworks for server side rendering and dividing the interface
into separate pages.

I blame Seaside [0] for corrupting me. Never used it to build anything, but
once the idea of building the user interface on the server was in my head
there was no way back.

Though I have to admit I still find JSON really convenient for submissions,
compared to using form fields for everything, as it allows massaging the data
on the way.

Besides that I've found the approach to be a total success. Pages load
instantly, bookmarks and back buttons work as expected and most of the
application stays on the server.

[0] [http://www.seaside.st/](http://www.seaside.st/)

------
Animats
Webflow is touted as "the new Dreamweaver". Of course, it's "software as a
service", about 3x as expensive as basic web hosting.

~~~
dvfjsdhgfv
Can you self-host Webflow-generated content?

~~~
Animats
You have to pay $192/year before you can export what you created.

~~~
system2
That's very expensive.

------
INTPenis
Another point to this: I equate the modern fediverse with all the old message
boards.

Message boards are still around, but they used to be an integral part of web
culture. They essentially took over from dial-up BBSes.

But now community boards have moved to cloud services like Discord. The self-
hosted boards are still around in the shape of federated ActivityPub
instances.

It makes a lot more sense than hosting an isolated island of PunBB or
vBulletin.

I just hope more communities host their own small localized ActivityPub
instance, using AP relays to create a vibrant fediverse.

------
yedava
For a lot of internal corporate web applications, server-side rendering is
what makes the most sense. These applications are always used from browsers on
a company-provided laptop. You don't need to worry about multiple frontends
and "web scale".

Back in the day, internal apps were shitty to use and had slow service-layer
code. But once the data got to the view layer, at least the pages rendered
fast. Now, with the proliferation of SPAs, we have shitty user experience,
slow backends, and slow UI renders.

------
usrusr
Preloading on button-down - a nice detail optimization. It's possible that
you'll have to abort that request, but that will be the rare exception. It's
my favorite thing I learned today.
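The trick can be sketched in a few lines - the helper name and wiring below
are illustrative, not the article's actual code. The idea: a mousedown fires
roughly 100ms before the corresponding click, so starting the fetch then
gives the next page a head start.

```javascript
// Start fetching a page on mousedown, before the click lands.
function makePreloader(fetchFn) {
  const seen = new Set();
  return function preload(href) {
    if (seen.has(href)) return false; // only request each URL once
    seen.add(href);
    fetchFn(href);                    // warm the HTTP cache; ignore the response
    return true;
  };
}

// Wire it to every link when running in a browser.
if (typeof document !== 'undefined') {
  const preload = makePreloader((href) =>
    fetch(href, { credentials: 'same-origin' }));
  document.addEventListener('mousedown', (e) => {
    const a = e.target instanceof Element && e.target.closest('a[href]');
    if (a) preload(a.href);
  });
}
```

If the user presses down and then drags away, the request was wasted (or has
to be aborted), but as the comment notes, that's the rare exception.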

~~~
noisem4ker
See previous discussion:
[https://news.ycombinator.com/item?id=23203658](https://news.ycombinator.com/item?id=23203658)

------
malwarebytess
Wow, Hey.com has none of the lag that modern websites have. Is the email
client as responsive?

~~~
tradesmanhelix
It's created by Basecamp, which makes Turbolinks [0], Turbolinks iOS [1], and
Turbolinks Android (deprecated) [2], so there's a good chance the snappiness
you're seeing is due to those awesome tools! (In fact, this GitHub issue
comment [3] seems to confirm that HEY uses Turbolinks.)

On a related note, really wish things had gone this direction. Instead, it
feels like React, Angular, etc. re-created the backend in the frontend, so now
for most apps you essentially have two backends plus one frontend to maintain.
I think as soon as our frontends started requiring controllers and routes to
work we should have been like, "Hey, wait a minute..." But I guess design
folks tend to know JS, so I can see how that combo won.

My $0.02 from 5+ years in the industry is that you probably don't need React,
but Godspeed if you think you do - just maintain extreme discipline or you'll
end up with spaghetti code (esp. if contractors are involved) faster than you
can blink.

Long-term, I hope things move back to sanity. React and company work OK for
some problems, but 95% of websites/webapps can probably get by with either
straight HTML/JS/CSS or server-side rendering/templating.

[0]
[https://github.com/turbolinks/turbolinks](https://github.com/turbolinks/turbolinks)

[1] [https://github.com/turbolinks/turbolinks-
ios](https://github.com/turbolinks/turbolinks-ios)

[2] [https://github.com/turbolinks/turbolinks-
android](https://github.com/turbolinks/turbolinks-android)

[3] [https://github.com/turbolinks/turbolinks-
android/issues/111#...](https://github.com/turbolinks/turbolinks-
android/issues/111#issuecomment-631617956)

Edit: Formatting

------
revskill
Actually, it depends on the kind of website you're building.

Some years ago, I made an app with Rails and Turbolinks and some "tricks" to
make Ajax work smoothly. The first version was built in 6 months; then I
rebuilt the second version with React in 3 weeks!

The pain is in refactoring and adding new features, as well as speed of
development.

That's the day I discovered React. The way I develop a Rails app now is to
just make the API JSON response match the mocked JSON on the React frontend -
nothing more to think about or trick!

There's a reason (or many reasons) I and many others chose React (or similar
libraries/frameworks) to get the job done.

------
rudolph9
One idea I’ve been thinking about building is sort of a hybrid SSR, where you
use server-sent events to continually render more of the page based on the
user's interactions (the most obvious case being an infinite scroller).

Of course, I have yet to investigate how modern browsers render a never-ending
index.html, how I would send the client events and correlate them with the
existing open original request, and how this would scale to multiple hosts
behind a load balancer (but maybe that’s getting ahead of myself, haha).
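The server half of this idea is simpler than it sounds, because server-sent
events are just plain text frames on a long-lived response. A rough sketch
with illustrative names (this says nothing about the multi-host scaling
question, only the wire format):

```javascript
// Build one SSE frame: an "event:" line, "data:" lines, then a blank line.
function sseFrame(eventName, html) {
  const dataLines = html.split('\n').map((l) => `data: ${l}`).join('\n');
  return `event: ${eventName}\n${dataLines}\n\n`;
}

// Server-side render of the next "page" of an infinite scroller.
function renderNextPage(pageNum, items) {
  const lis = items.map((it) => `  <li>${it}</li>`).join('\n');
  return `<ul data-page="${pageNum}">\n${lis}\n</ul>`;
}

// In a request handler you would hold the response open with
// Content-Type: text/event-stream and, on each reported scroll,
// res.write(sseFrame('more', renderNextPage(n, items))).
```

On the client, `new EventSource('/stream')` plus an `addEventListener('more',
...)` that appends `e.data` into the DOM would complete the loop.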

~~~
bricej13
That's sorta the idea behind server-rendered Blazor. Components are rendered
on the server and passed over to the client via WebSockets.

Unsurprisingly, input lag and scalability can be major obstacles, but I expect
it to make significant headway in enterprise apps.

------
ChrisMarshallNY
One of the things I say (when I think some poor bastard has no choice but to
listen), is _“The September That Never Ended was the best thing to happen to
the Web.”_

That was what changed it from a niche community of geeks to a true “everyman’s
town square.”

It was also the worst thing to happen to it. It heralded a tsunami of money
and God-awful content.

Money tends to be a damoclean butterknife. It will spur tremendous growth and
energy, and also reduce the moral structure to a septic tank.

I’m glad the BLINK tag is dead.

------
mafuyu
Along these lines, does anyone have any recommendations for a service or
framework to create a simple personal site? I just want a basic blog, project
pages, and photo gallery. I use Jekyll at the moment to statically generate
pages to host on github.io, but frankly, I don't have much webdev experience
and I'd rather just have something work out of the box at this point.

~~~
juliend2
I would say: start by writing pure HTML, then progressively work in PHP to
include header.php/footer.php on every page. Then create helper functions for
the few things you don't want to repeat.

You'll be surprised by how good it feels to know your website from A to Z.

This is what I do for my own site:
[https://www.juliendesrosiers.com/](https://www.juliendesrosiers.com/) .
Almost every URL ends with an old-school .php for that very reason, lol. But I
like the fact that every time I want to modify something, I simply SSH into my
Dreamhost account and use vim to edit posts and pages. But in your case it
could be [https://www.panic.com/coda/](https://www.panic.com/coda/) .

And if starting from nothing seems scary, you can always start with a basic
ThemeForest theme, or a framework like Bootstrap. Having the HTML/CSS already
done goes a long way.

~~~
simonbarker87
This is exactly what I did a few weeks ago after getting fed up with WordPress
for my personal site. Static HTML pages and some PHP for the header and the
blog homepage. It means I need to write in plain HTML, but that’s fine; the
only JS is Prism syntax highlighting.

------
ryanmarsh
If we’re going back to the 90’s let this be your reminder we still don’t have
something for web dev as easy to use as VB6.

~~~
toast0
We no longer have anything for the desktop as easy to use as VB6 (nor for
mobile).

~~~
gmfawcett
Plus ça change. :) I was just reading an article ("APL since 1978", [1]),
which recounts the complaints of mainframe programmers when microcomputers
were introduced in the mid-80s, because of how much harder it became to design
application interfaces:

> Worse, the technical skill required to write applications suddenly increased
> dramatically. You needed to know about networks, how to deal with the poor
> reliability of shared files on a LAN — or how to construct client/server
> components to achieve the security, performance, and reliability that you
> needed. User interfaces, which were so simple on the mainframe, became an
> absolute nightmare as an endless procession of new GUI frameworks and UI
> “standards” appeared and then faded away.

[1]
[https://dl.acm.org/doi/10.1145/3386319](https://dl.acm.org/doi/10.1145/3386319)

------
peterwwillis
> at some point our kids might think frosted hair tips are totally cool

you shut your face sir! my hair was the BOMB when I was 14!!!

------
mymythisisthis
I want a good webpage: clearly written, like a Navy manual; using well-made
animated GIFs alongside flat pictures; with a few videos thrown into the mix,
short and concise, no longer than 3 minutes; and short audio clips to learn to
properly pronounce something.

------
SomeoneFromCA
They need to bring back Gopher. It has a good chance to become very hipster
today.

~~~
ralls_ebfe
Gopher still exists. There is even a new protocol, which is similar to gopher:
gopher://gemini.circumlunar.space

~~~
SomeoneFromCA
I have actually tried a gopher HN mirror. And it was an interesting
experience.

------
mike503
I haven’t even adopted the more modern JAMstack-type techniques yet, and now
this guy is saying those are old?! I still consider those closer to the holy
grail: clients scale indefinitely; servers you still have to pay for.

------
donmb
What I can confirm is that old-fashioned HTML tags such as <fieldset> or
<iframe> are coming back and are underrated. So much can be done with just
plain HTML.

------
dndvr
I did get a chuckle out of a page extolling the virtues of 'html over the
wire' asking me if I was

'interested in things like front-end dev and the JAMStack'

------
t0ughcritic
Won’t work: now there is one search engine, the first page is all ads, and all
pages are over-optimized. Generally, no one will see any new sites or blogs.

------
qwerty456127
This is so adorable. I wish more people and companies were
using/designing/serving websites the way it was meant to be done.

------
uncletammy
I've been counting down the days until it's okay again to use <tables> . I
think I have a lot more counting to do ...

~~~
robertoandred
It's never been wrong to use <table> for its intended purpose.

------
pointillistic
Why do people miss the simple solution? Off with the ads, on with
micro-payments. You like the content, you pay for it.

------
Havoc
I was totally expecting the content of this post to be "return to html leading
to JAMstack"

------
armandososa
This is a legitimate question apropos of OP‘s web ring: Who qualifies as a
nerd of the 90s? Is it nerds born in the 90s or people who were nerds during
the 90s?

------
j4yav
The indieweb is so great. I ultimately turned webmentions back off, though,
because I was worried about GDPR liability, and I wondered whether people who
liked my tweet actually intended to have their name and like appear on my
website.

My site is otherwise marked up for the indieweb, I've got indieauth working,
and it's a cool community.

------
emersonrsantos
Serverless is the new CGI

------
platz
I'm getting an ERR_HTTP2_PROTOCOL_ERROR in Chrome?

------
imagetic
Good.

------
agumonkey
swings and roundabouts

------
buboard
Please don't call it server-side rendering. The HTTP server was always meant
to respond with hypertext content. Serving JSON is the oddity, not the other
way around.

~~~
zzo38computer
For document-oriented stuff, you can serve HTML or plain text. For data-
oriented stuff, I think it is not so wrong to serve JSON, CSV, RDF, an SQLite
database, or whatever format it is, although such things should not be hidden
from the user (especially if scripts are disabled). This is independent of
whether or not there is also an HTML version of the data (if so, the HTML
could be generated either ahead of time or on demand; which is best may depend
on the situation).
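One way to serve both views from the same route is plain HTTP content
negotiation: browsers ask for HTML via the Accept header, everything else gets
the raw data. A minimal sketch (the handler and data shapes are made up for
illustration):

```javascript
// Serve the same data as HTML or JSON depending on the Accept header,
// so the data-oriented version is never hidden from the user.
function negotiate(acceptHeader, data) {
  if ((acceptHeader || '').includes('text/html')) {
    const rows = data.map((d) => `<li>${d.name}: ${d.value}</li>`).join('');
    return { type: 'text/html', body: `<ul>${rows}</ul>` };
  }
  return { type: 'application/json', body: JSON.stringify(data) };
}

// In a request handler: const { type, body } = negotiate(req.headers.accept, data);
// res.writeHead(200, { 'Content-Type': type }); res.end(body);
```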

------
DoubleGlazing
Maybe I'm missing something - the article isn't very clear. One of the reasons
we have front-end apps pulling from an API is that it allows for
interoperability.

The same API that serves data to the web browser can serve that data to mobile
apps and to third parties as well.

The idea of bringing HTML rendering back on to the server just doesn't seem
useful to me.

~~~
Karrot_Kream
That's true if you're making an application, sure, but on the web I view all
sorts of sites, and most of them are just styled text containers, maybe with
an optional way to mutate things. HN, Reddit, news, recipes, forums, galleries
- these are things I regularly consume that don't need APIs to return content.
For applications, like Facebook, it's understandable, but if you don't spend
all of your time on social media, then you're probably not really using an
application.

~~~
dndvr
You must use old.reddit.com

~~~
Karrot_Kream
On mobile I just use a client that requests data from the API and renders it
for me. On desktop, yes, for now, but I'm in the process of writing views that
load faster and don't require a browser.

