Hacker News
How the Blog Broke the Web (stackingthebricks.com)
230 points by tempodox 5 months ago | 128 comments



Am I allowed to plug my own site? The old web is still around; you just need to dig through more new web to find it. I run my site through a few Gulp tasks to minify files and generate security-integrity checks for my CSS, but otherwise it is all hand-written HTML. Every page begins with copy/pasting another page and changing out the content by hand.
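The "security-integrity checks" here are presumably Subresource Integrity (SRI) hashes; as a rough sketch of the underlying step (not the commenter's actual Gulp task, and the stylesheet content is invented), it's just hashing the asset and base64-encoding the digest:

```python
import base64
import hashlib

def sri_hash(data: bytes, algo: str = "sha384") -> str:
    """Build an SRI value suitable for a <link integrity="..."> attribute."""
    digest = hashlib.new(algo, data).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"

# Hypothetical stylesheet content, for illustration only.
css = b"body { margin: 0; }"
print(f'<link rel="stylesheet" href="site.css" integrity="{sri_hash(css)}">')
```

Browsers recompute the same digest on download and refuse the file if it differs, which is what makes the check worthwhile even on a hand-written site.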

I've noticed that in the last few years, articles about the "early '90s/'00s" web have increased: a desire for the older, more personal web of mostly text and a few cheesy, few-framed animated .gifs. A web that felt more person-to-person instead of corporation-to-corporation, or plastic Barbie doll to plastic Ken doll, full of fake personas.

This is why I support Neocities [0] and host my site with them. You can find all sorts of interesting sites if you browse [1] for a while. My personal favorite is browsing all the layers of [2].

[0] https://neocities.org/

[1] https://neocities.org/browse

[2] [Warning: Auto-playing Music] https://mebious.neocities.org/Layer/Wierd.html


> [Warning: Auto-playing Music] https://mebious.neocities.org/Layer/Wierd.html

While it looks like Neocities doesn't suffer from the reverse-chronological bias mentioned in the article, the page you linked to is post-millennial enough to not work at all without JavaScript.

But even in the 90s some would argue [0] that the web was getting too fancy:

> Enough's enough. The World-Wide Web promises a new way to let people communicate. But too many Web designers are being bewitched by "multimedia" - they load their sites with gigantic graphics, embedded sound clips and animation.

[0]: http://world.std.com/~adamg/manifesto.html


I'd argue 'a new way to let people communicate' is exactly in line with broadening the forms of communication to things other than just pieces of text.

There's plenty 'wrong' with the web as it has evolved, but I don't think it's a bad thing to make it easier to use a variety of ways to communicate. Text is just the one that is easiest to implement.


There are podcasts and YouTube channels for that.


YouTube channels... You mean like videos on a website?


I still maintain my own blog “by hand” with VI, minimal CSS and a handful of CGI scripts written in Perl because I’m too lazy to learn the modern way to do it. Does that count?


This is cool in a sort of "stick it to the man" kinda way, but simple sites don't need to be weird. My whole site is just HTML and CSS. It renders normally on every platform. I host everything myself (purchased fonts, images, etc).

I don't have to copy and paste HTML each time, because I wrote a static site compiler from scratch in Ruby. I don't even use any gems, it's that easy. For an example article, check out: https://zachaysan.com/zero
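The commenter's compiler is Ruby; as a hedged illustration of how small such a tool can be (the file layout and template here are invented, not theirs), the same idea fits in a few lines of standard-library Python:

```python
import pathlib
import string

# Hypothetical layout: posts/*.txt, first line is the title, rest is the body.
TEMPLATE = string.Template(
    "<!doctype html><html><head><title>$title</title></head>"
    "<body><h1>$title</h1><p>$body</p></body></html>"
)

def compile_site(src: pathlib.Path, out: pathlib.Path) -> None:
    """Turn each plain-text post into a standalone HTML page."""
    out.mkdir(parents=True, exist_ok=True)
    for post in sorted(src.glob("*.txt")):
        title, _, body = post.read_text().partition("\n")
        page = TEMPLATE.substitute(title=title, body=body.strip())
        (out / f"{post.stem}.html").write_text(page)
```

Run it once per edit and upload the output; there's nothing dynamic to maintain on the server.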

The web doesn't need to be this horrible bloated thing we've all made it into.


While I agree with you, I don't think we should encourage everyone to build their own static site compiler. However, there's a sweet spot, and I think it is Netlify (or any other comparable host) plus the static site generator of your choice; for me it's Middleman.

Anyway, you're right, the tools to build a blog and publish content you truly own are absolutely there.


Oh, I totally agree that most people shouldn't be writing their own static site compilers. That being said, it's so much less work than an expert may think, and that alone makes it worth mentioning. The web doesn't need to be that complicated. Sometimes stitching together some HTML isn't the end of the world, and it's kinda enlightening to have something that is 100% you.


I'm a big Neocities fan. I was on the original GeoCities team (my first real job). I love how Neocities uses IPFS, and I love their liberal content policy; a lot of hosts have content policies as restrictive as the app stores'.


Thanks for neocities.org link!! I created a site. It was fun and refreshing.


It's like 1998 all over again. Thanks for posting this.


I think you just /.'d neocities.


I was getting "502 Bad Gateway" for 20 minutes when I wrote that.


Does neocities survive solely from donations? That's pretty impressive if so.


Yes, and supporter accounts. https://neocities.org/donate

Nobody is more impressed than I am. It hasn't stopped being amazing to me that it works.


Thank you so much for your work on Neocities - and sorry for sending the HN horde your way!

I personally think the support model is all about scaling properly, or simply not scaling past a sustainable point. As long as the number of patrons scales with the number of "freeloaders", it is a sustainable model. The issue is if you begin to hockey-stick and don't restrict your scaling (because you "want to be larger" or you "hope people will donate as we hockey-stick upwards"). Once that happens, donation-run sites tend to collapse under themselves, which, for recent examples, is what has happened to so many pomf.se clones. Becoming too popular can be a death sentence.

As a support-driven site, can you provide any insight into that? Or offer whether you agree/disagree with the idea? Would be interesting to hear your thoughts on the matter.


That's fantastic! Congrats! You should do an interview over at indie hackers :)


It's definitely impressive. In addition to accepting donations, they also offer a paid plan that offers more space, bandwidth, and features:

https://neocities.org/supporter


> Am I allowed to plug my own site?

Crack on, no need to be shy.


Okay:

https://philosopher.life/

Slow AF, but it usually works. If you feel it's just a Timecube or TempleOS, then I'm sorry for wasting your time.


It took a fair old while to spin up. The design is idiosyncratic but consistent.

Thinking about your habits will always be useful to you. Be the architect of this living art and your artful living. Go forth, sir, and be that which creates itself. Build your existential lifetool, reconstruct yourself, and ride your liferaft down the river of happiness

It is about time I reconstructed myself and incorporated a decent existential lifetool into my river journey.


I think it's really cool, and I'll rummage about in the near future. I do find the font a bit difficult to read, and perhaps that might be something to focus on improving (assuming you value the content of your site more than aesthetics).


The font is a weird problem. It does have an aesthetic component, but this is actually a choice towards functionality for my own context. I agree it can be unpleasant (I despise it on mobile, for example).

I work on my wiki on a 42" screen, and I've found it very useful to see large bodies of monospace text all at once on my screen (it's part of my workflow). I regularly have 50,000 words worth of tiddlers open at any given time, and I've found the font is useful in development. I'm going to accept the tradeoff for now.

If it ever becomes something which isn't intended for me, but instead primarily for others, I will have to change it.


Hey there, love the idiosyncratic design of your site. I'm interested in reading through this, but the about section left me a little confused.

Also, is computational existentialism a term of your own? If so, could you define it for me?


Yes, it's my own term. I'm still developing what it means to me. You can find my evolving definition here:

https://philosopher.life/#Applied%20Computational%20Existent...

Currently, it reads:

"Applied Computational Existential is the scientific method loaded with a profound respect for metaphysics applied to my fundamental existential problems. The goal is to balance my self-dialectic with empirical bottom-up reasoning and rationalist top-down modeling from a transcendental realist's position in epistemology and ontology. This is an assay in teleology based on the assumption that the physical world and my mind are almost entirely reducible to computational models while humbly accepting that the final telos of reality exists in metaphysics."


TempleOS is actually a very impressive piece of tech.


I said that poorly. I meant that it might be dismissed as merely the work of a crazy person. I agree with you; I think TempleOS is fascinating, and I'm blown away that one person built it.


I mourn whenever a consumer technology becomes streamlined, homogenized, and commoditized.

There's a pattern - it happened with smartphones too - where a new technology appears, and at the beginning people engage with it directly. They use it, they do things with it. And in that type of relationship there's the opportunity to have fun with it. But then as the new medium grows - as it becomes more economically lucrative and scales to an exponentially bigger audience - it's made to be more efficient instead. Eventually users no longer use, they consume.

Think about how weird and wonderful the first iPhone apps were, with their kitschy skeuomorphism and novel uses of sensors. The accelerometer-based lightsaber. The sound boards. The odd and interesting domain-specific utility apps. Even Instagram had personality back in 2010. And then people would jailbreak their iPhones, so they could add widgets and toolbars and ridiculous neon color themes. The smartphone started out as a magical anything-object that people looked at as its own entity, just as web pages started out as magical worldwide bulletin-boards that people would put together and decorate for their own sake.

These days the smartphone is boring. It intentionally falls to the background, serving as little more than a stylish content-pipe feeding your every craving. It is minimalist and inoffensive and ever-present. Even apps are supposedly on their way out. The very action of opening an app and doing something is seen as a piece of friction, keeping people from consuming content as effortlessly. Google Assistant and Siri would rather predict what you want and serve it to you before you even ask for it.

Play is how the human brain stays alive, and it can't happen without back-and-forth interaction, tactility, and user agency.


Interestingly, something of a reverse effect is happening in the game development space. I think the driver of the above pattern is mainly economic; as a technology becomes more and more commercialized, it gets less personal and more streamlined. This is aided by the technology itself improving to allow it to be more efficient. With games, though, the barrier to entry was always (since the 90s, at least) so high that only corporations could participate. So in recent years, the technological strides that have been made have actually allowed the space to become less commercialized, and therefore more personal and diversified.


Consequently, game development these days is more limited by art assets than by programming or game systems design. The difficulty and complexity of producing good-looking, stylistically consistent, original 3D assets has led to sprite-based games, mock pixel art "8-bit" games, and games where form follows function.

However, game-engine companies now offer asset stores. To OP's point, it's not hard to see a future where some successful indie games are made by stitching together assets and mechanics sourced entirely from others. Would it cheapen the result, even if the talent and effort put into those components were nonetheless grand? Perhaps.

Technology and popularity move the goalposts: they increase the amount and/or change the nature of effort required to attain acclaim.


To a point. I think, however, that as the technology improves, the trend towards realism will overpower. Consider: as assets get more and more complex, the uniqueness of an individual asset will actually decrease. I can easily point out the difference between a tree in The Witness and one in Mario 64, because they're both low-poly games that do their own thing _instead_ of emulating the real world. If, instead, you ask me to differentiate between a tree in Skyrim and one in The Last of Us, well, they both just look like trees. At this stage, what matters is the visual aesthetic you're able to create with lighting and with the combination of different assets. Whether you happened to make an individual asset yourself will matter increasingly less as time goes on.


Why would people abandon unique styles just because it gets easier to implement a realistic one? An artist's goal isn't just to make something that doesn't look like crap; it's to make something that expresses a certain feeling. Stylism isn't just a band-aid for lack of realism.


I'm not saying that they'll abandon it, I'm saying that as publicly available art trends towards realism, being an 'asset flip' won't be such a bad thing anymore.


I agree that art is becoming the bottleneck, but I feel the need to point out that pixel-art as a style is not chosen because it's easy. Quite the opposite; it takes more artistic talent to create something abstract that's still expressive than to create something detailed.

Asset stores are definitely used to fill gaps in artistic talent/man-hours, though. In fact there's a term for games made out of nothing but hastily slapped-together assets: "asset flips".


I can speak only for myself, but it feels like the "rules" were still more lax in the 90s. So much of my time was spent toying around with the original Half-Life: making sprays, importing custom models, playing half-finished maps, exploring whatever bizarre mods were trending on Fileplanet. And everyone else was doing the same thing! There might be tons of zany games out there today, but that comfortable feeling of loose, collective exploration is missing. (Well, maybe the Minecraft kids have similar experiences.) Even the user-generated content is overly polished and unsurprising.

I miss existing in that collective weird space.


You just miss your childhood. Those things still exist if you dig a bit.

Furthermore, it could have been just your group. Most people I know from back in the 90’s were just playing games.


I'd say HL was the beginning of the end of "quirky" mods.

Making content for that game was so much more time consuming than grandaddy Quake 1.


I can only imagine; glad I caught the tail end of it!


> These days the smartphone is boring.

I'd like to respectfully disagree. There are many people out there intensely modifying and changing their Android devices; the XDA forums[1] are a notable example. Smartphones and their apps have gotten so much more complex and capable since their inception that someone could pretty much manage all the basics of their online life on one.

> It intentionally falls to the background...It is minimalist and inoffensive and ever-present.

I wish :) There is a real problem of smartphone addiction, where the phone and its apps try their best to entice you into spending as MUCH time as possible; definitely not in the background or minimalist.

> Even apps are supposedly on their way out.

Given the ever-growing user base and size of the Apple and Google app stores, I don't think that's true; people may just be looking to launch or access their functionality a tad faster (if even that).

Also, the general user adoption sounds a lot like the general trend described here: https://meaningness.com/geeks-mops-sociopaths, although smartphones aren't necessarily a subculture (Android and iPhone rooting/modding communities might qualify, however).

[1] = https://www.xda-developers.com/


There's something that was magical about jailbreaking iOS, because of the way Objective-C allowed you to modify the system by swizzling methods. This can't really be compared to people modifying their Android devices; it's just not the same, because it's such a pain to compile Android and flash it every time you want to change something.

Jailbreaking is dead, and Swift does not allow swizzling. I agree, smartphones are boring, it's too locked down. Not many people had the skill to write quality apps in Objective-C, so now we're stuck with apps written using React Native. This makes smartphones more complex, but the quality is worse.


> it's just not the same because it's such a pain to compile Android and flash it every time you want to change something.

One does not have to recompile and flash Android just to change something. Bytecode can be redirected at class load time and native binaries can be individually "replaced" by mounting over them. This is how Xposed and Magisk work, respectively, BTW.


It's the circle of life.

You could try and manufacture all those tasty molecules yourself, or you could outsource the job to mitochondria, and meanwhile you could go off in pursuit of loftier goals, such as how to figure out multicellular life, and so on.

Same with everything else, including personal websites.


I'd say this is better explained by a shift in the distribution of users than by anything changing intrinsically about the device.

It seems quite possible to me that the number of people jailbreaking their iPhones and developing weird apps is actually greater than in the early days; it's just that everyone has a smartphone now, so the percentage of people doing that has shot way down.


I'd also like to plug this project: https://mavo.io/

Its goal is to bring back the idea of hacking together HTML pages by hand, but with the benefits of modern dynamic content. It's cool and inspiring.


I think everybody likes to play with new things, but it's quite natural for the novelty to wear off after a while.


Honestly, I see the iPhone as the start of the "scale" moment.

There had been smartphones before the iPhone, for a long time.

Damn it, the iPhone was not even launched as a smartphone. It was launched as basically an iPod that could make calls.


Have you seen the article about the birth of radio?


No, but I have heard anecdotes about ham-radio operators having a similar experience with that technology.


It was linked not long ago on HN. It tells how radio caught on wildly as an unregulated hobby. People just used the new discovery for fun, to talk to each other randomly. Only later, when the army saw a use for it, did it become what we know today: allocated frequencies, mostly business. I had no idea people enjoyed radio first.


> I had no idea people enjoyed radio first.

Even disregarding what you're referring to, do you remember walkie-talkies?


I've very rarely seen them outside of job purposes. I remember a fad as a kid, but I wouldn't compare making your own radio in the early 1900s with buying a plastic toy in the '80s.


I feel like this story leaves out a very important factor: Slashdot. Slashdot was huge long before Movable Type. It was so huge, in fact, that when my team built a web server in 1998, the first question from the boss was, "can this handle having one of our posts on Slashdot?".

I'd say if anything Slashdot was the one that drove people to make chronological posts. Because if you wanted to be on Slashdot, you had to post something new, and the best way people knew if you had something new was if you had a date on it.


Wikipedia has a page on the history of blogging, and indeed Slashdot is there: https://en.wikipedia.org/wiki/History_of_blogging That doesn't mean that it inspired people to convert their personal websites to blogs, though. While people did see chronological posts, that doesn't mean the idea of converting their websites to that format was put into their heads. There's a big difference between Slashdot and a personal website.


Only one data point, but I made my first blog inspired by Slashdot. I wrote a CGI script that read flat text-file posts to make the front page; then, when you clicked a link, another script wrapped the chosen post's text file in a basic HTML template (e.g. added a header and footer, that's it).

Making that (and clones customised for friends) got me into web development, which ended up as a 20-year career.


If anything media companies and content marketing broke the web, not individuals creating online journals or hobby content.


I wish there were more real personal blogs. Heck, sometimes I wish for the days of Xanga and LiveJournal. Reading one from the past really feels like you're inside someone's deepest thoughts. People were unafraid to really be vulnerable when writing those things.

Now we have lame blogs owned by corporations like TechCrunch, or blogs filled with vacation pics and food porn.


I wonder if it has something to do with trackbacks.

I thought it was a little silly when Hacker News began hiding points, but I think it has a lot to do with why Hacker News hasn't turned into a game about scoring points instead of people sincerely trying to express some analysis or opinion.

Getting to the end of a blog post, and wanting to see what other people have to say, then only seeing links to other people's blogs, or garbage comments like "really amazing post! come look at my blog" is disheartening. It even makes you feel unwelcome commenting on the actual content, as though a decadent interest in ideas interferes with a bunch of people trying to do their jobs and put food on the table.


Given the mob mentality that's settled into modern social media/the blogosphere, it's actually amazing there are still as many personal bloggers as there are. One wrong word and ...


This is my biggest concern. In the past, I was never afraid to be public about my RL identity online. But now I realistically have to worry that I'll make a bad joke that's taken the wrong way, my username will be connected to my real name, my real name will be connected to my employer, someone will ask my employer what they think, and I'll potentially be out of a job less than 24 hours later.

People have seemingly figured out that confronting someone's employer, even about a topic unrelated to their work, is a good way to punish someone for a misdeed. That was never really something most people had to worry about until recently.

I like to think that A. I'm a decent person and B. my employer would not drop me over an Internet argument, but really, anyone who communicates online needs to be afraid of this risk.


Unless you're a figurehead at some company, any company that would fire you over something like that most likely isn't worth working for to begin with. Of course, that doesn't offer anything in terms of support or recourse if it should happen, but those employers open themselves up to a particularly nasty form of denial-of-service attack, and they should probably be made hip to that.

Otherwise one day some group will have fun with them until they have no employees left.


Ok, but in the meantime you’re still unemployed, stressed about health insurance and debts and feeding your family, and you get to explain to every potential employer why you were fired.


Yes, it absolutely sucks. But that's one of the things where unions might come in handy.


Tell that to the Marriott employee who got fired for liking a tweet.


I think that case is rather different. They liked the tweet with Marriott's account, not their personal after-hours account. Ridiculous or not, if your job is running Marriott's Twitter, part of the job is liking the right tweets.


Totally ridiculous and absolutely despicable. So, if you get fired for liking a tweet claim wrongful termination and sue the bastards.


When you run a company's social media account, liking and not liking the right things is part of the job description, and probably out of bounds for wrongful termination.


As someone who runs a fairly popular "personal" (not corporate, opinion-based) blog, it is an ongoing fear. We have to try to make sure the things we say are as uncontroversial as possible. But in a way, this is a good thing.

My blog is about my city, and there are plenty of things we "can" talk about but probably shouldn't. For example, I don't think the business that's in one of the cornerstone buildings downtown is the right business for the neighborhood. But I'd never say it: it'd be needlessly confrontational, my opinion isn't going to get that business to move, they're paying their rent so they have every right to stay there, and all it would do is piss off people who disagree with me.

On the other hand, we generate minor controversies all the time. People who live outside of the city always want more parking, we're pretty strongly against setting minimum numbers of parking spaces when putting in new buildings. We support new higher-density living in the city, many neighbors are strongly opposed to it. As such, we're the target of smear campaigns and any relevant social media post is often swamped with the people who disagree with us. We've also been factually wrong in the past, and some people do their best to remind us of it as often as possible.

The trick is to ignore it. Haters gonna hate. Don't be unnecessarily confrontational, and just roll on without acknowledging your public critics. They're just trying to piggyback off your success.

It's not really a "one wrong word" situation, it's "one word you really knew you shouldn't have said, but against all better judgement you did anyway" followed by "engaging your critics in arguments", then topped off with "caving and admitting defeat because you knew you were wrong all along".

I can't think of a single collapse of a public figure because of one wrong word that wasn't extremely egregious and then handled extremely poorly.


> We've also been factually wrong in the past, and some people do their best to remind us of it as often as possible

THAT dynamic is toxic, and we need to somehow evolve a social rule against it. Something like Godwin's Law, back when that worked.

Everybody has at some point said something factually incorrect and everybody has at some point said something that sounds wrong or mean when taken out of context today.

If forever bringing up "Remember, this is the person who said (worst thing ever)!" is a valid move, it's an attack on identity continuity. The existing defenses against that attack seem to be:

(a) completely abandoning identity continuity (eg: 4chan)

(b) trying to appear perfectly accurate and inoffensive at all times by retroactively deleting any posts that might make one look inconsistent or mean and hoping nobody notices or keeps an archive.

There really ought to be more options than those. For instance, one could adopt a statute-of-limitations approach: Anything said more than two years ago is off-limits as an attack on that person/blog/institution today.

Once the right rule is documented, those who break it are demonstrating they are too dumb or ill-informed to engage with what is being said now and thus have lost the argument.


There’s an evolving social norm against this, I think. Increasingly, I’ve noticed the people who trawl through years-old posts looking for controversy are seen as deranged.


That only applies when the target was relatively anonymous to begin with. It's frowned upon to crawl back in time to look for ammunition against someone, but if that ammunition is something that went viral and is part of public consciousness, then you're good.

One example is Brendan Eich. Because he chose to donate to a Prop 8 fund and it got out, his opinion on all subjects is automatically null and void for many people.

Another case is that of Reddit user 'Unidan', who became an immediate pariah overnight because he was a dick to someone online, so he abandoned the entire identity. If any of his alternate identities were somehow linked back, they would likely become useless as well. I'd be willing to bet his professional career suffered greatly because of that rant, too.

There's also Joy Reid, who has been in hot water lately because people found old tweets and blog posts that were offensive. In that case, people actively went back to look for things she said in the past.

None of these would have been possible pre-internet.


It's more "one word you should say but know probably isn't worth the trouble" followed by "feeding the trolls", then topped off with "giving up because you can't summon the energy to stand up to a never-ending flood of ignorant assholes".

It is best to just ignore them.


Could you pretty please email me a link to your blog and let me know what kind of traffic it gets?


Anything you do in a public space as large as the internet requires a thick skin. Blogging is (and HN is too) no exception.


If it were just having to have a thick skin, I'd fully agree with you; there's always going to be someone to disagree, and some will disagree floridly. But these days there can be real consequences to your social circles and in notable cases even your employment, and opinions on what is socially acceptable shift in a hurry. What you wrote a few years ago and is still visible on your blog (and may or may not be a view you still hold) can still hurt you. And you can't claim on your blog you didn't have enough space to express a nuanced opinion.


> What you wrote a few years ago and is still visible on your blog (and may or may not be a view you still hold) can still hurt you.

This is why the web was better when everyone was operating under a pseudonym.


I operated under a much thinner one at the time. Saw the present situation arriving, somewhat, and abandoned that.


I've been physically threatened because of stuff I wrote online, have had a stalker for a couple of years and other pleasantness. Even so, for me that's worth the risk, I can see how for someone more vulnerable that sort of risk would be totally unacceptable.

You only need to look at the kind of stuff the griefers on various other websites get up to to see the extent of the menu of real-world consequences. Those things are not jokes.

So I totally see your 'real consequences' argument, and it goes way beyond just your employment, in some cases it is your life that is in the balance.

But there will always be idiots. Just like there are people using drugs and then driving vehicles, and I have to share the road with them. That won't change my attitude towards driving, and neither will the existence of assholes diminish my activities online. But I totally sympathize with those people for whom lines have been crossed that cause them to censor themselves or even to tune out completely. And I'm sure I too have such lines (in fact, I can easily think of several), and if those were crossed I too would bow out.


I agree, but I have a question. Was the child comment by 'firemancoder downvoted to oblivion in order to prove the comment correct?


I have no clue what you are on about.


Turn on "showdead". I'm not able to respond directly to that response to your comment, so I responded to your comment instead. The question is a bit moot anyway. 'firemancoder is hellbanned for some reason. It just seemed that this particular comment of his was especially true. b^)


Ah I see. Thank you for clearing that up.


It's not hurt feelings on the part of the blog writer you have to worry about, it's the readers. If you post something super controversial like supporting the president, you could lose your job and any volunteer/community positions or more. There is a pitchfork mob out there just waiting.


The reason this is so difficult to undo is because callout posts possess attributes that allow them to bubble to the top of social networks: a clueless subject who can be positioned as entitled, a controversy that garners a strong emotional reaction, and a reveal from a foil with a bit of additional insight. It feeds our need for drama, vengeance, and putting someone in their place, even if the mechanisms of doing so are much the same as bullying. With the Internet having a deep memory, and screenshots and reposts being photocopies, the risks of speaking off the cuff have gone up significantly.


I had a real, personal blog, with plenty of loyal readers. In mid-2012 it just stopped being a happening thing. I no longer had the reliable interaction from repeat visitors.

That, not coincidentally, is the same time that Facebook became the dominant way for many people to share and consume links on the web.


> Now we have lame blogs owned by corporations like Techcrunch, or filled with vacation pics and food porn.

... And written by "influencers".


Here's mine:

https://philosopher.life/

It loads slowly, but you're downloading the entire wiki. I'm as transparent as I can be with my thoughts. It's as personal as it gets. If you feel it's just a Timecube or TempleOS, then I'm sorry for wasting your time.


There are more real blogs today than there were in the '90s, you just stopped visiting them.


thanks, just reminded me to delete my LJ


Eh, static site generators, CMSes, and bloghosts didn't break the web. Sure, they made it easier for people to churn out content, but it was authentic material they cared enough to write about. Whether they posted under their real name, or under a screenname, they built little fiefdoms of content with their personal time, and made it available for anyone to read, without an account, and without any obligation of feedback.

What changed was when people began putting their content into siloes protected by a login wall, and platforms strongly defined by visible indicators of popularity, which didn't really happen until Facebook and Twitter.

Even in the Livejournal and Myspace days, a lot of profiles were public, but quasi-pseudonymous, requiring some effort to actually find. It was Facebook that mandatorily juxtaposed one's real name with one's real words, which quickly led to predictable outcomes: people being doxxed, harassed, turned down for employment. Within a few years, most people set their profiles to private in an effort to protect themselves from snooping employers, colleges, exes, and trolls, keeping most of what people write and share walled off behind a login and a friend approval gate.

Twitter was billed as "microblogging", where one could publish short snippets with more frequency vs. a long-form blog, but its bizarre interface, unclear direction, and feature competition with Facebook caused it to evolve many of the same mechanisms and signals of popularity as Facebook. Facebook's status updates were a direct assault on Twitter, so Twitter eventually morphed 'favorites' into 'likes'. With it, it was blindingly obvious that most people's content wasn't even being read.

All of this social transition took place in the shadow of the commercialization of the web, where websites were no longer just billboards for businesses, but platforms where one could conduct commerce, consume professional content, and be subject to behavioral analytics that fed back into ads. With the abundance of commercial content, consumption went up and amateur production went down.


Yeah, the article is stupid nostalgia. What killed the web is very clear: a select cadre of companies conspired and worked very hard to turn the web into an advertising platform. The developments here are obvious:

1. Google Ads monetized linking and gave birth to the SEO "industry."

2. Facebook mandated real names.

3. Apple and Google completely closed off their mobile platforms requiring pre-approval for all applications.

The last piece of this is that the stewards of the web, the W3C, have completely abandoned their charge and sold the web out to these corporations. The W3C has allowed the evolution of the web to be completely captured by Google and other major corporations. Rather than making the web simpler and more accessible, we've seen the W3C bless standard after standard that makes the web significantly more complicated. Today nobody but extremely wealthy corporations can afford to develop a browser. HTTP/2 means nobody but major corporations can write web servers these days. Abandoning a well-structured web (XML, semantic technologies) for the current html5-js-soup means that the knowledge published on the web is wholly inaccessible... unless it gets exposed via proprietary, one-off json-soup APIs.

The complete corporate capture of the web wasn't driven by "blogging" nor was it in any way a democratic process. It was a deliberate and carefully engineered process that created a web and a computing platform (mobile phones) that is absolutely and completely under the control of corporations, moreso than any previous platform.

At this point there are really only two ways forward: (1) abandon the web and start over, or (2) governments step up and rein in corporations.


When the least expensive computer you could get was almost $5,000 and Internet access cost $5 per hour (both inflation adjusted), web users were mostly limited to academics and children of the 1%.

As the price of Internet access decreased so did the average socioeconomic status of users, and so building tools that made content easy to create and easy to consume suddenly became profitable. There simply is no alternate timeline where the web could have stayed the way it was, except perhaps one with a very abrupt ending to Moore's Law.


While your overall point is somewhat correct, your numbers are way off.

I was "surfing the net" in 1994 with a computer I built for around $700 ($1,204.59 today's price) and was using an ISP account that was $25 per month ($43.02 today).

I was not an academic or anywhere near the 1% at the time. But I agree with the point you're trying to make, as it did become more accessible for people at a poorer level. Also computers and the internet both became easier to use which contributed greatly to adoption as well.


Yep, same here. Local dial up services were born in 94 in every American mid sized town, and half my buddies (at 10 years old) were all online between 94 and 96. None of our families were wealthy, with basically lower to middle class blue collar parents.


This article seems to be making two arguments. One is that popular blog software was limiting, and its popularity discourages people from trying other forms of content. It's a reasonable take, but it's not clear to me that's actually what happened. The other argument is that the web got worse as it became accessible to people who didn't have the knowledge, skill, or time to write their own HTML. I can't help but strongly disagree with that. Yes, the web lost a certain kind of character, but it gained billions of people and many other kinds of character. There's more content, better content, and more accessibility now than ever before. That's great news, and the value of it vastly outweighs the loss of the handcrafted HTML aesthetic.

Preservation and archiving are good things to be doing, and discovering the signal inside all the noise of today's internet can be challenging. Still, inviting everybody is a huge step forward.


>better content

Maybe better content in terms of the quality of the best content available, but, as you admit yourself by mentioning the noise-to-signal ratio... The average quality is abysmal.


Just from personal experience: when I switched from writing my HTML by hand (even with auto-expansion in web-mode.el) to generating pages with Org, I simply had a lot more time available to increase quality.

As an example, my article on Kernighan's writing[0] would simply not have happened if I was still writing HTML. It was a spur-of-the-moment thing which took very little effort thanks to the nice Org amenities.

[0]: https://two-wrongs.com/technical-writing-learning-from-kerni...


The average quality is always abysmal. Most of the content on that old web was also abysmal, it's just that no one cared at the time.


I think what happened is the same old story: early adopters/creatives did stuff, said stuff became accessible to the masses, and as a result the overall quality decreased. I guess 'Eternal September' mostly covers it?

Digg was great until it became popular. Those who lamented the decline of Digg moved to Reddit. Then a similar thing happened, despite the innovation of subreddits, and some like myself mostly moved to HN.

Couchsurfing was great until it became overrun with people who didn't represent the 'spirit' of CS. Those who lamented the decline of CS moved to BeWelcome or whatever else there is.

I'm pretty sure the same patterns emerge in every human endeavour in existence (musical genres come to mind as very similar, as does the constant protestant splintering, starting with Evangelicals I suppose, followed by Pentecostals).

As an 'early adopter' in tech as well as a few other areas of life, I'm not sure how to respond. Nostalgia is kinda fun. I loved reading this article. But personally I try to remind myself that the cool part, the part that resonates with me, is being an early adopter. I don't feel that it benefits me to start gate-keeping or yearning for the past or whatever. Perhaps a better approach is to consider it a victory: the masses arrived, our passion is validated, what's next?


Personally, I think Wikipedia broke the web; it absorbed a world of individually maintained, individually slanted resources into a mass, then slowly drained away that personality in favor of a neutral POV, one edit at a time.


I don't think they should be singled out necessarily, it's just one example of the general phenomenon of the web/content becoming centralized that we're all seemingly sad about.


I think it is misguided to think that the web is "broken."

I am sure there are more people online doing more interesting things and producing more interesting content now than back in 1996.

It is just that now that the internet is used by billions of people, the signal-to-noise ratio is much lower.

Back when the internet was a few million people globally it was a very select group. That same group of people using the internet for cool interesting stuff has grown dramatically but it is still probably only a few million people globally. The difference is that now there are a couple billion other people using the internet.

There is nothing wrong with billions of people using the internet for things that interest them (mostly drama, porn, pyramid scams, and cat videos, apparently). It doesn't detract from my ability to grok some avant garde research paper that I would otherwise never have access to, run programs that do research that pulls on terabytes of data published in public databases around the world, have access to virtually every film and television show from anywhere in the world, any newspaper, etc, and collaborate with other weird people all over the world who are doing this stuff.

It is too bad that Google stopped actually being a search engine and became an ad trap, but that just means going back to how you found stuff before Google: on chat (IRC then, but lots of places now) and in forums.

On the plus side, Google Translate has transformed access to content. Back in pre-2000, the internet was basically English only. There is now so much more diversity, and Google Translate makes it possible to access content from a far larger group of people than was possible in the good ol' days.


Consider this alternative theory: the change ("breaking") you see, the preference for a timeline, is actually caused by the advertising model.

To advertise, you have to have people come back to the site frequently and check the site frequently.

In order to do that, you have to produce new content all the time, and in order to surface that new content, the reverse timeline is the best way.

(Once you have too much fresh content, you have to come up with a way to sort even that).


Those quirky blogs are still out there. I have several myself.

They are harder to find, I think, and they are easier to ignore when there are slicker, more commercial things competing. I think maybe the real thing that changed is that all of the "hobbyists" with homepages at that time were unusually well educated and specifically knowledgeable about the internet, and that's not true anymore. So, instead of the web being written almost exclusively by people with substantial college education (or who were simply very well read and self-educated) and very similar nerdy interests, parts of it are now written by people with other backgrounds and interests.


One thing I disliked in blogs, which made me migrate back to an old-school website, was the low bar for writing on a blog, and I ended up writing a lot of impulsive rants. Publishing on your own site makes you think twice and work through your opinions before writing. Another blog problem was the "planet": your blog was syndicated by a planet, and suddenly people complained that you were writing articles outside the planet's default language and subjects. I had no less than 4 blogs to please the planets until I got fed up.


And some further commentary and context around this: https://kottke.org/18/07/did-blogs-ruin-the-web-or-did-the-w...


Here's my critique of the current web: https://www.kickscondor.com/2018/07/02/things-we-left-in-the...

Chronos is definitely an issue. But I think the bigger issues are: a move away from custom design, the lack of a "home page" feeling, and the hostility toward self-promotion. I know people talk about decentralization a lot - but it doesn't intrinsically solve these other issues.


"Back in my day we only had static html and we were happy!"

All I kept hearing in my head while reading this article.


> The backgrounds were grey. The font, Times New Roman. Links could be any color as long as it was medium blue. The cool kids didn’t have parallax scrolling… but they did have horizontal rule GIFs.

That's exactly what the web looks like in graphical Links [0], with the grey background and all, only difference being that the GIFs aren't animated.

[0]: http://links.twibright.com/


The blog also did something else.. give non technical users a voice. Do we celebrate that or mourn for a web full of people just like us?

In the same way that Facebook killed blogging: once a better mousetrap for publishing comes along, it finds an audience, albeit among the great technically unwashed.


This seems to be wrong:

"By late 2000, there were still only 1,285 according to Eatonweb. Same disclosures apply on those numbers, of course, but seriously…"

Both Blogger and LiveJournal had launched in 1999. They both had many thousands of blogs by mid-2000.


Ah, nostalgia!

Yes, chronological blog feeds were hot back then, but now that FB/Instagram have taken over the role of lifestreaming platform, we're starting to see some blogs move away from the constraints of "chronostreaming", and more towards a collection of essays.

Search traffic also tends to have a power law distribution, so the majority of your incoming traffic will be focused on a few popular pages, meaning that in the long term there's really no need for a sequential stream of posts. One of the main benefits of a blog is its ability to earn search traffic for years, so why limit yourself by dating your own content?


There's a small resurgence of those old tacky websites on Glitch https://glitch.com

And some more context by Anil Dash about the old web / Geocities web dying out https://anildash.com/2012/12/13/the_web_we_lost/


> Every design decision you make represents roughly equal work because, heck, you’ve gotta do it by hand either way.

This seems extremely fallacious to me. Surely different decisions would have vastly different workloads? I would imagine the impact to be even greater for hand-crafted stuff, because you aren't automating any of the workload. I know that is just one sentence in the whole post, but it also seems like one of the core arguments.


Shameless plug for a weird thing I made: https://elementcss.neocities.org/

The idea is to make it a lot easier to build sites with just simple HTML elements, the way I used to do it before CSS. Generally designed for text-oriented sites.


I feel like this is a major point in this article, if not the major point:

"Homepage production became suddenly a question of economics: Go with the system’s default format: zero work. Customizing the system to your format: way more work than pure HTML ever was"

I noticed that same exact trend even with WordPress or Bootstrap pages. I refuse to get into WordPress as a career. It's just copying and pasting! But if you write a React site from scratch, you really get full control over how everything looks and works.

That's what I'm banking on, that people still want custom websites that don't look generic. Those are the kind of gigs I want to get: make something completely custom and unique, and make it beautiful and still functional. I already have one gig doing this and it's great, I'm in love!


> I refuse to get into WordPress as a career. It's just copying and pasting!

No, it really is not. There's a lot of things about WordPress to not like, but working with it beyond a certain level absolutely involves writing code, and if you know how to write code you can "get full control over how everything looks and works" just as much as you can writing your own custom CMS.


True. There's copying, pasting, and monkey-patching!


Static site generators like Pelican/Hugo/Metalsmith/Hexo/etc. are probably better replacements for WordPress than an SPA. SPAs aren't ideal for content based sites, unless server-rendered. (SPAs still appear to affect ranking, even if the pages get indexed.)

Static sites are easier for the next programmer to modify too (changes often don't require programming), and they can be hosted at no cost at places like Netlify.


Or you could just write a plain site with simple markup, the way things used to be and still can be. Though a colleague of mine was critical because "it's just default styles. Why wouldn't you do more with it?"


The web isn’t broken.


Thanks for that. Reading these comments makes you feel as though we're in a weird parallel universe where the internet disappeared somehow.


>Reading these comments make you feel as though we're in a weird parallel universe where the internet disappeared somehow.

I've seen people claim that the big social media silos have "centralized" and "taken control of" the web so often that I'm starting to wonder if it's just hyperbole or if people really do think the rest of the web somehow ceased to exist.


Org-mode can publish as tidy single-file HTML pages. I should get a neocities account and keep a non-blog.


tldr: At first making web pages and writing blogs was hard and tedious. So very few people did it and we had awesome things like "the big red button that doesn't do anything"

Now, making blogs is easy so millions of people do it and make generic websites. The fact that their quality is much better is irrelevant, the websites look more generic!


Comment to mark this excellent discussion for later


The web is not broke. People just prefer stories, situated within time, to "homepages," which are like reading resumes.


> No two homepages were alike. There was certainly no such thing as a Content Management System. But it didn’t mean that the homepage content wasn’t managed.

You know, you can do that too even with a modern website and a clean blog. I have a website where I wrote everything myself, a complete front- and backend. Blog engine with hierarchical blog comments, all 100% custom. You just gotta put in the work to do this.



