The real responsive design challenge is RSS (begriffs.com)
221 points by begriffs on May 29, 2016 | 108 comments

Hey, a post about RSS content quirks! It makes me feel young again!

On a more serious note - RSS is the Great Web Leveller. It spites your fancy CSS hacks, it's disgusted by your insane javascript, and it will piss all over your "mobile-optimized" crap. No semantic markup == no party; because markup is for robots, and RSS parsers are very stubborn robots that can see through web-hipster bullshit like Superman through walls.

The only real sin of RSS (beyond the holy wars and format bikeshedding and committee madness and and and...) is that it's too honest a format. It's a format for stuff that matters, for content that deserves to be read; it's too pure to survive in a world of content silos, stalking analytics and inaccessible material-designs. Its innocence doomed it in a very poetic way.

It saddens me how publishers can find inventive and creative ways to ruin even RSS.

Medium, in particular, will only post "Continue reading on medium.com [link]" to its RSS feeds.

We refuse to pay for content, and we refuse to view ads. I keep hearing "Then come up with a different business model, just look at X" where X is simply burning VC, and then we all complain when X goes away or X starts to show ads or charge. Models like Patreon don't scale. If you can figure out a way to monetize without ads or paywalls, I'd love to hear it.

What if I don't want to make money from my content? I just want to share it with the world, let them read it (or not read it) however they want, share it, copy it, comment on it, link to it.

That was the dream of the early internet, and with the centralised nature of Facebook, Google Groups and the like, it has faded.

It's easy enough for me as somebody technical. But what about my non-technical friends? They move to Medium, Facebook, Blogger, etc, as a way of hosting their content because it's easier.

WordPress is probably the remaining champion of the easy-to-use open web, and all we do is bitch about how slow and insecure it is.

> What if I don't want to make money from my content?

That is perfectly fine, and there is nothing standing in your way of doing that - more power to you! There is a contingent who do not want to pay for, or view ads on, content from publishers who would like to make money from their content (because it is their day job).

Well, when are the publishers' demands reasonable? A lot of online media is less about adding value, and more about doing whatever it takes to get your attention.

Not to mention opinion writers or critics: They already have the privilege of pushing their opinions on a large audience. Why should we pay them? For their eloquence? Sure it's a lot of work, being persuasive in writing, but is it the audience who should pay for being persuaded?

I think that the collective action problem of paying for the stuff we want can be solved, will be solved - even is solved, in increasingly many ways. But there are a lot of things we're used to that we don't really want in proportion to what we used to pay for them.

I agree, but I was referring to Medium's business model (because of the OP's comment) - get people to write your content for free, and then make money off it. It's frustrating that they block distribution of your content to increase their own revenues. It's a completely different use case to BBC, the Guardian, NYT, etc, who pay people good salaries to write their stories.

Then people who are concerned about that shouldn't use Medium. It's not as if there aren't a million other free blogging platforms out there.

Well, someone needs to pay for hosting, the performance improvements, and security audits and patches. There ain't no such thing as a free lunch...

That's the thing, right? Most of us have plenty of bandwidth and storage space and everything we need to host a site sitting in our living room. It's still too complicated for most people, though. It should already be as easy as Facebook...

Then just host it yourself. But Medium has to make a profit.

Then people who get paid full-time will produce content better than yours and we will read them instead of you.

Then why are you reading that free comment and not a comment you're paying for?

Why do you think Patreon-esque models don't scale? I don't have info one way or the other but I'm curious why you're so strongly of the opinion that they don't.

We did an assessment a while back, looking for options, and I can't share that. But this essentially sums it up.


Sure, it's a good paycheck for one, acceptable for two if you're in the top 10 or 20 on Patreon, but nobody is coming back to your site if you have 1 or 2 contributors. Now, put together a team to compete with current media, and it simply doesn't add up. This could change, but I was expecting bigger numbers by now.

edit: I just want to clarify, we didn't go with a Patreon model. My expectation was that the amounts donated via Patreon would be bigger by now.

I'd argue that Patreon was never intended to support content/media companies. It was designed to allow individuals to create content and "sell" it directly to the people consuming it in a way that they'll actually pay for it. In a way, it was designed to allow people to avoid working for/supporting media companies.

It looks like the "quality" of the patrons & what kind of "product" you produce matter.

The last has 2 entries with similar numbers of patrons (Aaron Mahnke, podcast, with 1726, and Jessica Nigri, cosplayer, with 1634) and vastly different income (8k vs 23k).

Well, if something can't make money in an honest way (and no one is willing to support it out of the goodness of their heart) - maybe it doesn't deserve to exist?

I'd say the Internet would be 100x better if those sites that complain about "business models" all went bust.

The New York Times? Wired? Guardian? WSJ? ProPublica?

Yes. Yes. Yes. WSJ really can't figure out a business model for itself? ProPublica - you understand the conflict of interest present here?

I reduced the op's statement to 'I don't care about content provided by professional writers' because each and every one of them (or the institutions employing them) is, or has been, trying to figure out a business model.

And the WSJ is as far as I know not one of those offering a full RSS feed for free, so that'd probably fall under "not honest" in their definition.

One thing I notice that needs to be pointed out: "professional writers" != "good writers". "Professional" is sometimes treated as a synonym for "good" / "serious", but IMO it shouldn't be, because it isn't. Professional writers are those who write for a living. Today's market does not promote good writing. So yeah, I don't care about content provided by professional writers, because more often than not, it's not content worth reading.

Newspapers actually do have a shot at figuring this out. Some let you read a few articles for free, and ask for a small payment for more, some experiment with publishing platforms for per-article content via micro-payments (e.g., https://blendle.com), others simply make a selection of articles and news available for free to act as an advertisement for their paid subscriptions.

If you think of "simply burning VC" as "wealthy benefactors subsidising content for everyone" then it looks like a very appealing model.

Crikey's business model comprises a daily email and a monthly bill. They've managed to grow steadily for over a decade, and they complain more about defamation lawsuits than about ad blockers. The important news would get covered if every city the size of Melbourne had one of those.

The real problem might be that it's advertisers who need lots of reporters; the public doesn't, because there is only so much news that's worth paying to read. The only answer would be to avoid trying too hard to scale.

Note: dreamsofdragons's contribution was brought to you by Arby's.

Sorry for the snark, but a significant portion of what I read online is not ad supported, it's just people driven by a need to share ideas.

It strikes me as ironic when someone uses uncompensated content to criticize uncompensated content.

I've been using a self-hosted version of Full-Text RSS [1] to turn excerpt-only RSS feeds into full-text RSS feeds (which I use in conjunction with Calibre [2] to generate a weekly Kindle book containing all of the latest articles to read on my Kindle). Full-Text RSS works great and the developer is very responsive.

1. http://fivefilters.org/content-only/

2. https://calibre-ebook.com/

Instapaper's very cheap account does almost this. I click my ReadLater Instapaper bookmarklet and it sends an ebook with the latest 50 pages to my Kindle email. Not perfect, since it sends some stories multiple times, but it works fine.

I absolutely love this. I wish all my "read later" bookmarks were turned into an ebook at the end of the week.

Not an eBook, but: in Tiny Tiny RSS, I mark everything I want to read later as "Published", which puts it in a new RSS feed. Via IFTTT, that gets moved to Pocket. I've got a Kobo eBook Reader, which has Pocket integrated. Voila.

I used to use Yahoo Pipes to fix this exact annoyance. Pull down the RSS feed, find the "read more" link, then pull down the full article and output a new RSS feed with that. Since Pipes went away I use a python script on shared hosting to do the same. It's more powerful but to say it is nowhere near as user-friendly would be an understatement.
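For illustration, here's a minimal stdlib-only sketch of that Pipes-style trick. To be clear, this is my own hypothetical version, not the commenter's actual script: the function name, the callable-fetcher design, and the `<article>` extraction heuristic are all assumptions.

```python
# Hypothetical sketch: expand an excerpt-only RSS feed into a full-text one.
import re
import xml.etree.ElementTree as ET

def expand_feed(feed_xml, fetch_article):
    """Replace each item's <description> with the full article body.

    fetch_article is any callable mapping an article URL to its HTML;
    in real use it would wrap urllib.request.urlopen.
    """
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        link = item.findtext("link")
        desc = item.find("description")
        if link is None or desc is None:
            continue
        html = fetch_article(link)
        # Naive extraction: keep whatever sits inside the first <article> tag.
        match = re.search(r"<article>(.*?)</article>", html, re.S)
        if match:
            desc.text = match.group(1).strip()
    return ET.tostring(root, encoding="unicode")
```

In practice the extraction step is the hard part (real pages rarely cooperate), which is exactly the niche that tools like Full-Text RSS fill.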

> Its innocence doomed it in a very poetic way.

And yet, virtually every single site on the internet has an RSS or Atom feed. I think I've come across a whopping 5 that don't, and those are mostly tech guys who made their own blog system and didn't bother with the RSS feed because, as everybody knows, RSS is dead.

Most of those are auto-generated feeds by the usual-suspect CMSs. In most cases, nobody is actually looking at what those feeds look like, with terrible results. Outside the niche of news providers, nobody cares.

Besides, does Facebook have RSS feeds?

Does Twitter have RSS feeds?

That's basically half the web right there, these days.

> Besides, does Facebook have RSS feeds?

Yes, for notifications: https://www.facebook.com/help/212445198787494/

Other feeds used to be available too but I can't seem to get them to work.

> Does Twitter have RSS feeds?

It used to, but they disappeared as Twitter was locking down its APIs.

I am still upset about Twitter removing the RSS feeds. It made it so easy to programmatically parse Twitter data. Their API is a sorry replacement.

I explicitly try to use Facebook without visiting Facebook, and the notifications RSS feed is the only feed that still works (they retired all the others). So without cranking out your own OAuth-based app, or scraping the HTML, it's all you've got. To make matters worse, the feed itself seems to be delayed by some random interval, making it useless for timely alerts.

> In most cases, nobody is actually looking at what those feeds look like, with terrible results.

The results look fine to me. It's pretty rare to find one that's completely mangled. For the most part, even embedded videos work. The news providers are the bigger pain in the ass, since they like to truncate content and insert ads.

> Besides, does Facebook have RSS feeds?

Facebook doesn't, which isn't a huge loss. Nothing of consequence gets posted to Facebook, at least not for me. I could see where that would be a pain point for some people.

> Does Twitter have RSS feeds?

There are several services that provide twitter feeds. I use twitrss.me.

Twitter used to have RSS feeds, and then they took them away as they made their content more siloed.

Both once had RSS feeds, and discontinued them. The RSS feeds didn't have ads.

Bing's RSS feed for news results is broken; there is a bare ampersand in the namespace declaration for search results' RSS feeds rather than an &amp; entity.
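That failure mode is easy to reproduce: strict XML parsers reject a bare `&`, which is why a feed containing one is simply broken for conforming readers. A small sketch (the URL is made up):

```python
# A bare "&" makes XML ill-formed; escaping it to &amp; fixes the feed.
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

bad = '<rss><link>https://example.com/search?q=a&b</link></rss>'
try:
    ET.fromstring(bad)
    well_formed = True
except ET.ParseError:
    well_formed = False  # the bare ampersand is rejected

good = '<rss><link>%s</link></rss>' % escape("https://example.com/search?q=a&b")
ET.fromstring(good)  # parses cleanly once "&" becomes "&amp;"
```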

Facebook also has RSS feeds for your 'Facebook Notes'.

I wish that was true. I maintain a site for a manufacturers representative company. We want to include news from the manufacturers but few of them provide a feed. For a while I ran a "feed" created by scraping their news pages. Of course we got burned when they changed the page format.

The Sinfest RSS feed is technically still there, but it's no longer updated, which kind of defeats the purpose.

About that, which is better, RSS or Atom?

Atom, in my opinion; it cleaned up some problems with RSS.


Atom is the only choice, these days. Being a newer spec, it's cleaner and stricter on what should go where. There was some debate back then on compatibility grounds but that problem has long disappeared and any decent feed parser will understand Atom just fine.
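For reference, the required skeleton is small: RFC 4287 mandates `id`, `title`, and `updated` on both the feed and each entry. A minimal, made-up example, parsed with the Python stdlib to show the namespacing Atom is strict about:

```python
# Minimal Atom feed: the feed and each entry need id, title, and updated.
import xml.etree.ElementTree as ET

ATOM = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Blog</title>
  <id>urn:example:feed</id>
  <updated>2016-05-29T00:00:00Z</updated>
  <entry>
    <title>Hello, Atom</title>
    <id>urn:example:entry-1</id>
    <updated>2016-05-29T00:00:00Z</updated>
    <link href="https://example.com/hello"/>
  </entry>
</feed>"""

# fromstring needs bytes here because of the encoding declaration.
feed = ET.fromstring(ATOM.encode("utf-8"))
NS = "{http://www.w3.org/2005/Atom}"
entry_titles = [e.findtext(NS + "title") for e in feed.iter(NS + "entry")]
```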

> "web-hipster bullshit"

Why so much negativity lately regarding new trends in the web?

One reason might be that, for anyone with historical perspective, a situation where our pipes can download hundreds of MB per second and still our browsers crawl like turtles is just shocking. Modern browsers are the most compatible ever and come with all sorts of bells and whistles baked in, and still people are using JS frameworks over JS frameworks over JS frameworks.

Another reason might be that most of this website-fat is actively anti-user. It breaks the back button. It breaks bookmarks. It spies on you. It makes it impossible to interoperate/mash stuff, and it makes it incredibly hard to automate things (do you know that in the '90s you used to be able to download entire websites, so you could comfortably read them offline?).

Another reason is probably that a lot of "mainstream web plumbers" (like yours truly) are now hitting late middle-age and feeling it hard. We used to be able to view-source and copypaste our way to image-rollover glory; we were breathing markup and feeds... now it's all tooling and frameworks and stacks, even http is going binary, and it feels like the loss of this innocence is not actually gaining anything for anyone except marketing people.

Now get off my lawn.

> and still people are using JS frameworks over JS frameworks over JS frameworks

Even as someone who is ostensibly a front-end hipster dev (long live React), this is something I'm in total agreement with.

Every single web developer should take 10 minutes and go have a poke around WebPlatform[0], and see what these amazing networked virtual machines we call browsers can do natively these days. And they can do it far faster than anything we can write ourselves. Of course, browser support is an issue, which is why polyfilling as needed should still be the name of the game, but it genuinely blows my mind how often I've seen devs add a brittle dependency to the code-base for something browsers can do themselves, better, faster, and supported back to some seriously old browsers!

[0] https://www.webplatform.org/

Thanks for the link to WebPlatform. It's a great resource for someone like me who only occasionally touches the front-end.

I guess it's very telling that the terminology these days is not site, but app...

I think it's telling that some people don't seem to grasp that the terminology is app because they are applications. The web is no longer just HTML; these are full-fledged applications running in the browser.

Yes, and most of them shouldn't be.

That's not the problem. The problem is that the frameworks developers use to make their content interactive and manage state are overkill. It's like needing to display today's date on the page and downloading an entire date library to do it when you can just do "Date();"

They are two distinct problems, but they are both problems.

Problem 1: A text-sharing platform was coerced into being a sandboxed pseudo-distributed virtual machine.

Problem 2: The way this virtual machine is built and re-built for each "app" is terrible (wasteful, overkill, overbearing).

You might think 1) is a fait accompli, but it continues to impact and create issues downstream in so many areas (security, privacy, usability etc etc) because the platform was simply not built for what we're making it do.

But both of those problems aren't problems, they are solutions to problems.

When I make a web app, it's specifically because I want it to run in a sandboxed pseudo-distributed virtual machine that installs and runs in less than a second, and is removed automatically after some time has passed.

My users don't want to download and install a binary onto their PC, they don't want to have to manage any dependencies I may rely on, they don't want to have to configure anything, they just want to use my app ASAP in a safe manner and then go on with their lives.

If i gave you 2 options, which would you choose?

1. Download and execute this binary blob on your PC.

2. Run this javascript in your browser.

Option 1 provides little protection. I could access GPS, camera, microphone, disk, etc... without so much as a single prompt. And if the PC's permissions are messed up, I can even access other programs' data and manipulate it.

Option 2 provides permission-locked options to access a lot of those, in a language which was designed to be sandboxed, in a browser with exactly the same amount of permissions as Option 1 (so even if all the permission-locking measures failed or were bypassed, the app is still bound by the same permissions and restrictions as option 1 would be). It installs itself in seconds, it runs fast enough for most use cases, and is cross-platform in a way that nothing else ever has been able to even come close to before.

Please, let's not try to justify post-facto what is an utter accident of history.

Javascript wasn't "designed to be sandboxed"; it was designed (in something like a week) to move crap around an HTML page. It then grew and mutated into the frankenvm we all "enjoy" today; which is why all these "security features" that were tacked on (like XMLHttpRequest, same-origin, "secure cookies" etc etc) are constantly failing and require more and more hacks to keep up appearances.

> My users don't want to download and install a binary onto their PC

Your users were absolutely fine with Flash, and if Macromedia had not gone to the dogs after the Adobe acquisition, they would likely still be. They were fine with ActiveX objects (in fact, they loved them!). Your users don't know the difference between markup and binary blobs, they just don't care. By shoving scripts down their throat "because the browser starts faster than a java applet", we simply abuse their good faith. Maybe you could have claimed "javascript is more secure" back in 2001, when 99.99% of scripts out there were perfectly readable with view-source; in the age of minifying, obfuscating and compiling, that's simply not true anymore. Javascript is now effectively the same as a binary blob, the only difference being that Google, Microsoft and Mozilla do some of the bootstrapping for you. Hell, even HTTP/2 is now binary!

Please, don't try to claim a moral high ground that was lost 10 years ago. We've decided to abuse the browser-donkey, fine; but let's not delude ourselves that there was some grand design or logic behind it all.

> Option 1 provides little protection. I could access GPS, camera, microphone, disk, etc

Yeah, because the browser today cannot access GPS, camera, microphone, disk... oh wait. In Microsoft land, the browser can even bork your entire system!

These problems are problems because we're using the wrong tools to find solutions to a different class of problems altogether. In so doing, we've perverted what was a wonderful document-oriented system. The browser was not born to run apps, but we've decided it's good enough to do it anyway, logic be damned. This comes to mind: https://www.youtube.com/watch?v=aXQ2lO3ieBA

Saying javascript was designed in a week is incredibly disingenuous. Literally nobody is using the javascript that was designed in a week, in fact the language that was designed in a week was called mocha. Javascript (in name or in spirit) wasn't even included in the browser until at least a year (and several revisions) after it was written in "a week".

>all these "security features" that were tacked on (like xmlhttprequest, same-origin, "secure cookies" etc etc) are constantly failing and require more and more hacks to keep up appearances.

I don't see those security features, or anything like them, in any other platform, and that's my point. Regardless of how we got here, the browser is the platform that I feel MOST comfortable executing random code from the internet on, because of those restrictions. If I'm going to execute code, the next best alternative is pretty poor. Obviously they aren't perfect, but they are more things that need to fail in order for something bad to happen.

Also, since when is an evolution of a product considered "more and more hacks"?

Is anything that wasn't included in Linux 1.0 a hack?

> Javascript is now effectively the same as a binary blob

I 100% agree that modern JS payloads are basically binary blobs at this point; that's why I compared them to binary blobs... I'm saying that the language (more, the "standard library" that JS running in the browser has access to) is meant to be run by everyone in a secure and safe manner. That is something that just about no other language/platform is designed to do today. Yes, there are ways you could make something like Python, for instance, safe enough to allow random websites to execute code on your PC, but that is going to be much riskier because the language is not meant to be run that way (not to mention that it won't be within a few orders of magnitude as "battle tested" as javascript is).

>Yeah, because the browser today cannot access GPS, camera, microphone, disk... oh wait.

You can't just quote the parts of the sentence you like and ignore the rest. The very next words were "without a single prompt". The browser has those features and more permission-locked (and, more recently, enabled only for TLS-secured pages). Just about all desktop platforms don't even have the concept of "access to webcam" permissions; most mobile platforms are pretty on-par in this aspect, but even they don't have restrictions like requiring the user to confirm if the app wants to store more than about 10MB of data on disk in a permanent manner (which all major browsers do).

>In Microsoft land, the browser can even bork your entire system!

And I'm in no way advocating for that... Most of the web community and all browser vendors are moving away from dangerous plugins/extensions like those that were used in the '90s and early 2000s. Bringing up Microsoft-specific extensions like ActiveX (or even Flash!) in a discussion about web development in 2016 is just building yourself a strawman to kick down. That's no more web development than running a program in x86 real mode is current desktop application programming.

>In so doing, we've perverted what was a wonderful document-oriented system.

That wonderful document-oriented system still works just fine.

I get that you don't like the web, but arguing that it was a mistake is completely missing the point. The language and the platform have evolved since their inception, and these evolutions aren't mistakes; they are changes made by all major browser vendors as well as multiple standards bodies in an effort to improve it. You may not like that it's changing, but it's no mistake. Javascript is designed to be a sandboxed, quick programming language regardless of what its first incarnation was meant to be, and the web is designed to be an application platform regardless of how much you don't want it to be one.

> I don't see those security features or anything like them in any other platform

... except the ones that were built exactly for the purpose of being sandboxed VMs: java applets, flash, mobile apps etc. The browser might have more prompts now simply because Microsoft showed everyone how not to do it with ActiveX, but it was not designed that way.

> just about all desktop platforms don't even have the concept of "access to webcam"

That's staggering ignorance, sorry. Why do you think you have groups like "cdrom" in Linux?

No, sorry, you can try with other arguments (standards, design in the open, whatever), but security really has nothing to do with why JS got where it got or why we use it today.

> That's not web development

No, that's exactly "web development" as you understand it (i.e. building "apps"), just on previous iterations. What you do today in JS was done yesterday with plugins; but nobody could agree on a single architecture (too many salespeople influencing the process), so the community took it upon itself to standardise on the minimum common denominator (JS) no matter how bad it was -- it was the only thing browser vendors could agree on (and even then, barely). JS managed to jump on the "standard" bandwagon that HTML+CSS spearheaded, and found itself as The Blessed VM. And here we are.

> That wonderful document-oriented system still works just fine.

No, it goddamn does not, hence the rant. Loads of mainstream websites simply do not work without JS anymore, even when they are simply supposed to show me some text. I have to trust umpteen third-party spynalytics websites just to read a couple of football scores.

The document-oriented system is dead and buried. We are left with the JS frankenvm.

> I get that you don't like the web

Read my top comment again -- I love the web, and that's why this involution is hurting me.

> these evolutions aren't mistakes

No, in most cases they are simply accidents of history like XMLHttpRequest.

> the web is designed to be an application platform

No, the web has mutated into an application platform. Big difference.

In a way, it's another wonderful metaphor for human nature: the evolution of web technology has been as messy, inconsistent and accidental as most human history. There is no rising sun, there is no grand plan: evolution is random and all political or religious systems are equally broken... we simply agree on the minimum set of norms necessary to avoid stabbing each other in the face, and then we call it a masterpiece of divine design that we obviously wanted all along.

Screw this, it's 2AM and tomorrow I have to work to pay a mortgage. Have fun, you silly kids.

>Screw this, it's 2AM and tomorrow I have to work to pay a mortgage. Have fun, you silly kids.

Sorry, my comment ended up being more aggressive than I wanted, and I apologize. But I really do want to keep having a bit of a discussion without it dissolving into insults. Clearly we both think we are right, and I'm curious where the disconnect is.

>... except the ones that were built exactly for the purpose of being sandboxed VMs: java applets, flash, mobile apps etc.

But most of those are gone now (or at least gone from "popular" usage, and for good reason), and while mobile apps are still around and aren't going anywhere, they aren't nearly as locked down as something like a web app is. When comparing mobile apps and web apps, I'm not saying one is better than the other, just that they are different.

>The browser might have more prompts now simply because Microsoft showed everyone how not to do it with ActiveX, but it was not designed that way.

Then current browsers were designed with more prompts... Maybe not from version 0, but their current versions are designed to have them.

>That's staggering ignorance, sorry. Why do you think you have groups like "cdrom" in Linux?

Now I'm not nearly as knowledgeable about Linux as I should be, but I thought those groups all had to do with file access. I've installed applications and had them access GPS hardware, cameras, microphones, and more without a single prompt on multiple Linux platforms (as well as Windows). Is there something I'm missing there, or were my systems just not configured correctly?

>No, that's exactly "web development" as you understand it (i.e. building "apps"), just on previous iterations.

But again, that's like comparing current desktop/server application development with x86 real-mode applications. Yeah, that was how it was done in the past, but it's not how it's done now. I agree with the rest of that statement though; JS was pretty awful, but it's grown significantly and I wouldn't call current-day JS bad by any means. It definitely has its warts (more than many other languages), but it is extremely capable and it's becoming a language I reach for more often than not. Once people successfully started writing non-trivial server-side code in JS instead of alternatives, that argument kind of goes out the window IMO.

>No, in most cases they are simply accidents of history like XMLHttpRequest.

I wouldn't call XMLHttpRequest an accident. It was designed to allow script access to HTTP requests... and then it was adopted by other browser vendors because it was great functionality. There were mistakes in web development and javascript (like automatic semicolon insertion making its way into the standard), but XMLHttpRequest wasn't one of them.

>No, the web has mutated in an application platform. Big difference.

Regardless of the terminology that you want to use, it is an application platform, and a very capable and secure one. It's far from perfect and it is still rapidly changing, but arguing about how awful it was in the past isn't going to make anyone's lives better. Like the linked article is saying, let's try to work on making sure that things get better in the future. Personally, I really like the direction everything is heading (with only a few exceptions).

> mobile apps [...] aren't nearly as locked down as something like a web app is.

Not sure why you brought up mobile, but anyway... mobile OSs have very fine-grained policies and sandboxes that can lock apps down as tight as any website. The problem is that most of them get waived away by lazy developers and unsuspecting users, and walled gardens' owners have an interest in not bothering developers too much (for fear of losing them). It was like this for browsers as well, when they were an IE monoculture; and it will likely go back there if Mozilla ever croaks under pressure from Google.

> Then current browsers were designed with more prompts

Yeah, but current browsers are part of the problem: they've decided to be runtimes, so they will act as runtimes. People who want document browsers have been left by the wayside. That is my point in a nutshell.

> I thought those groups all had to do with file access.

Of course: standard Unix philosophy is that everything is a file, including your devices; so they were secured as such.

> Is there something i'm missing there, or were my systems just not configured correctly?

Linux desktops can be locked down to various degrees; there are so many different ways to enforce policies on all sorts of things (from traditional permissions to ACLs to SELinux to PolicyKit and whatnot). What distributions decide to allow out of the box is due to their target markets. Try doing the same on OpenBSD, to see what life is when a system is completely locked down by default: you'll have to explicitly authorize many, many things...

> Yeah, that was how it was done in the past, but it's not how it's done now.

The point is, the aim of the effort is the same; but since people couldn't agree on the best way to do it, it was done in the most politically acceptable way, piggybacking on a system that was not designed for it but that nobody could object to. It's a bit like QWERTY: a keyboard layout nobody ever liked, but that somehow became The One Way To Type. JS was retrofitted to replace plugins and be the "web runtime" for everyone. In a way, that was the objective that Firefox pursued almost from the start, although it came to life in a slightly different way from their specific implementation.

I'm not saying JS is completely bad, but that it was clearly designed for "quick and dirty" scripts and it shows. There is very little philosophy, very few stylistic choices in the core language beyond "it has to be OO" when OO meant Java and Smalltalk. It's to "mainstream" OO languages what PHP is to C: a quick hack to use a familiar syntax in a context where the original cannot go. Sure, it's evolved and we can use it to do all sorts of things (which is not at all a recent development, btw; Netscape had a server-side product almost from the beginning), but its roots are undeniably weak. The fact that we need rocket-science VMs on state-of-the-art processors to move a few boxes with it (slowly) IMHO is proof enough.

> I wouldn't call XMLHttpRequest an accident.

It pretty much is. Microsoft's vision for it was a mechanism to fetch XML and mash it with XSL, which is what the web was supposed to become at one point. It became a way for the runtime to interact with any networked resource, for all sorts of purposes (loading other code, APIs, authentication, tracking, data etc etc) without having to reload the full layout.

But again, it did not come from "the web runtime camp" and it wasn't even a JS thing (it was a COM object): it was a cool feature added by a browser vendor with a completely different view of what the web should be, but it was retrofitted because it sorta worked. They had to tack on same-origin limitations because originally it was ripe for abuse, and tbh even now it looks a bit silly when compared with any other mainstream language's http-request implementation. Random evolution at its finest.

I say this as someone who, when "ajax" started to become a buzzword, was like "oh yeah I've been pushing that concept for ages". I just think we've gone too far.

> Personally, I really like the direction everything is heading (with only a few exceptions).

Well then, to each his own :) personally, I'd very much prefer to start again with a clear separation between "content browser" and "application runtime". You can have your runtime, but let me read content in peace without having to execute anything. Which is why I've always liked RSS: I fetch the content and then I'm free to read it as I damn well like. Besides, a runtime rebuilt from scratch could do cool things like supporting multiple languages by design, an extended stdlib, proper sockets, etc etc, removing the need for the current "selection of features by marketing popularity" process that produces framework over framework over framework.

I would join your club

I would like to subscribe to your newsletter.

Do you have a RSS feed with your new posts?

Because continued development of the Web is an obvious, user-hostile money-grab. If you only care about getting your software to users, you have a million choices for your application runtime. If you want to monetize every user by selling their privacy or headspace, you have only one: i.e., "pirating" Facebook is a logical impossibility. Communicative efficacy on the Web has not improved since the early 1990s--Javascript and its ilk have virtually no positive effect on the usefulness of Wikipedia, for example. We're converting what was intended to be a transport for communication into a framework for writing portable software.

The thesis here is that there should be a digital way to share hypertext articles and other statically-described communication without paying the costs (privacy, rendering efficiency, security bugs, transport performance and uncacheability, accessibility) associated with a cross-platform sandboxed multimedia software runtime.

Weren't computers built to do something completely different from what we are currently using them for? I don't get the negativity about new ways people try to use things we have. If we were always following guidelines and rules, we would be stuck in the past.

I don't want to be rude, but I really don't understand your message.

Computers have always been, and will always be, tools for automating away work. The specific nature of the work the user needs automated is in a constant state of flux. One particular kind of work that we've wanted computers to do for us since the 20th century is the sharing of written and visual information.

Ostensibly, this is what the Web was written for, and though it wasn't perfect, it could have evolved into a simple, highly efficient tool for this purpose. Instead, it can be seen evolving into a sandboxed application runtime, a workload with vastly different properties.

Good implementations of software for these two workloads would share little structure outside of that required for inevitable hardware access. Evolving the Web into an application runtime is possible, and well underway, but doing so still leaves the niche of software for an unfettered global commons of knowledge unfilled.

By using the Web we're building for the latter purpose, we are met with drawbacks which include (as I mentioned):

- lack of privacy

- poor rendering efficiency

- security bugs and malware

- poor transport performance

- uncacheability

- poor accessibility

We can attempt to address each of these piecemeal, and with sustained effort we're likely to make good progress on most of them, in due time. But a lot of these are really hard problems for an application runtime to solve.

Conversely, these problems are a lot easier (where they exist at all) on a software platform designed from the ground up for hypertext-based, archivable communication, rather than as an application runtime.

This is possibly the absolute best comment I've read in months. It puts into words why I dislike JavaScript 'web apps' so much: the Web was this really neat tool for sharing the written word, and it's turned into a virtual machine runtime. I don't mind the VM, but it's a hack, and I miss the tool for sharing information.

And isn't it funny how many of those problems were reasonably well addressed (not all, not perfectly, but mostly better than current state-of-art) by a software platform designed from the ground up to be a sandboxed application runtime back in the 1990's... Java something something.

Similarly, if we were never negative about things that sucked from our perspective, we would be stuck in a world where Internet Explorer was the only browser and everyone was using "Windows 2016". Negative feedback (preferably with suggestions for improvement) is just as important as imaginative new ways of using the web.

Nobody is complaining about people playing around and inventing new stuff. That's the hacking spirit, that's the essence of progress. I like browser experiments just like everyone else, even if half of them lock my computer up.

No, we're complaining about those inventing new ways to make people's lives much worse in exchange for the off chance that someone leaves them more money.

It's not "being new" that is the criticised activity here.

Because most of the developments are a net loss for the user.^1 The most important benefit of an ad blocker is not that it blocks ads; it's that the web gets a lot faster. I basically stopped surfing on my tablet, because after 2 seconds some ad network finally comes around to deliver an ad and the entire site jumps around. I lose my place in what I was just reading, and from time to time I actually click an ad by accident (I wanted to scroll when the site suddenly jumped around; if I want to play stupid reaction games I start Mario, not surf to the NYT), and then I have to click the back button and have a few seconds to curse all web designers while I wait for the same slow ad network to deliver the same ad.

It is the same with analytics, JS for 'nice' scrolling in galleries, and CSS. Analytics at best does not get in my way; when I notice it, it's because it has slowed my experience to a crawl (another nice opportunity to reflect on the depravity of web designers). JS for galleries or blinking effects or 'improved' navigation has the same characteristic: either I don't notice it or it annoys me. (A nice thing about NoScript is that it breaks nag walls by accident, so I get all the advantages of an ad blocker without having to feel guilty for trying to break the revenue model of websites.) As for CSS: browser defaults are pretty good, and in general browsers are very good at displaying easily readable text. So at best CSS looks slightly better than the original; however, I just had to switch off GNOME dark mode because some sites define only the foreground of a text input field and not the background, which ends up as black text on a dark gray background.

^1 Don't get me wrong, I understand how that happens. I am currently working on my new blog and wanted an entirely static site, without JS. I almost managed to get there, but I do not have a good static replacement for MathJax, so now I am wondering if I should deliver MathJax myself (and the site jumps from a few kB to more than a MB) or use the hosting provided by MathJax, with all the downsides above.

Install Firefox Mobile with uBlock on your tablet. You'll discover a whole new Web.


Because it seems more and more of the trends are designed to target me for advertising and analytics while making the user experience worse.

I look at one website and uBlock says it blocked 12 domains the page tried to connect to and allowed 13 domains. What the heck? I go to audible.com and it flashes after 2 seconds and redraws the page. Never mind the banner at the top of the page that says to download the app.

All the while, I still have a data cap of 10 gigabytes on my cell. Simple pages are downloading 2-megabyte frameworks, custom fonts, autoplaying video, and groups of megabyte images. Whole programs in the 90's fit in the space of one of these images (and they said Smalltalk programs were too big). We aren't all browsing at wifi spots at some coffee shop. We have to pay for this overpriced bandwidth.

I actually hope it gets worse and worse. I'm looking forward to what comes next after it all collapses. Perhaps we will get a medium that matches the app vision people are programming to.

// I did upvote you; no sense in downvotes for a simple question

I paid close to $2k for a computer with 16GB of memory and can only have the same number of tabs open as I did in 2009 with 4GB of RAM. My computer still slows to a crawl on certain pages.

These are the literal costs (for me) to be able to use the web nowadays for minimal usability and functionality 'increases'.

For reference, the multiprotocol instant messenger client Meebo was a successful competitor to Trillian in 2005. It was a web application that used less memory and was more responsive than a desktop application. I used it on a computer with 512MB of memory just fine.

Also, browsing the web on my phone is unnecessarily difficult and battery consuming.

Because even once respected outlets seem to be hiring people who have spent their entire childhoods on SA and 4chan, whose only real skill is gaining attention by negative social manipulation.

Clickbait is more than sensationalized headlines these days. The entire media, regardless of place on the political spectrum - and much non-news media - seems to have embraced what was once the domain of talk radio hosts.

I can voice my perspective. Essentially: bike-shedding, bedazzlement, and bloat.

The honesty of your comment made my morning.

I really wish the web could be more like that. Too many websites these days consider themselves applications when really they're just showing simple documents which should conform to the user's viewing preferences.

RSS is odd. Unloved by many, killed by Google, but there's still no better way of getting notified that a rarely (or perhaps not so rarely if you have endless time to read "internet stuff") updated site has something new to read. So I have an Android app which checks once per day and if there's anything new that day (I can go days or weeks without an update) I then pass it on to my Pocket account. Sometimes it goes another step further and I send it on to my Kindle.

Perhaps there's a better way of handling this. I can't read it directly on my Kindle because the browser there - optimistically described as "experimental" - is shocking. Pocket is great because it does a good job of producing a page which consists of just the text without the usual horrific web fluff (although sometimes it gets it wrong and the graphics go missing).

It seems a shame that, when most of what I'm interested in started life as someone else essentially entering text into a document, there's no way of obtaining it in that form but instead it has to be manipulated into something sensible. I don't want an "experience"; I just want to read what you've typed.

Would it help if I gave you my email address?

Regarding reading news on the Kindle: I've noticed that the browser becomes much more responsive (read: usable) if one disables JavaScript altogether. Sure, some websites will break, but most will work well enough to be able to read articles.

Disabling JavaScript is also my trick for actually being able to browse the internet on a phone these days. In Firefox for Android, you can install a plugin to toggle it on/off for when you need it.

A bit sad that you have to do this to get at decent experience, but what can you do...

I think your last line is tongue-in-cheek, but it's actually my preferred approach.

> still no better way of getting notified that a rarely (or perhaps not so rarely if you have endless time to read "internet stuff") updated site has something new to read.

Maybe it's because I became a heavy web user after RSS was already on its way out (of the mainstream), but my favorite way to be notified is an email newsletter. I don't care very much about the frequency because I save everything to Instapaper anyway and read it weeks later.

> I don't want an "experience"; I just want to read what you've typed.

Exactly. That's why I prefer command-line browsers (and like reader mode): I just want to see words, not distraction.

It would be fantastic if there was a standard format like you suggested; it would also make the web a whole lot more accessible.

"Marc Andreessen" suggested something like that at the bottom of this: http://www.zerobugsandprogramfaster.net/essays/2.html (search for "separate content from presentation")

I thought the whole point of css was to separate content from presentation. But it only seems to be used for reskinning sites and - occasionally - for producing a "printer view". I suppose I could pretend to be a printer. But can I not instead just pretend to want to read the text?

> It would be fantastic if there was a standard format like you suggested

I thought that's what HTML was supposed to be …

Somehow it turned into a portable runtime instead.

For the last three days, we've had a Teletype Model 14 tape printer following the Reuters RSS feed [1] at a steampunk convention in San Jose, printing hundreds of feet of 8mm paper tape. Trying to condense RSS down to all-upper-case Baudot tape printing is harsh. All markup is deleted. All links are deleted. Most characters outside letters and numbers become "?".

For the Reuters feeds, this works out fine. The content is text, not markup. There are few or no HTML tags. The Reuters feeds are headlines and a sentence or two. The Associated Press also has RSS feeds, and it's very similar. The Voice of America's feeds are much wordier; they often have the whole article.
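The condensing step described above (delete markup and links, uppercase, map anything the printer can't reproduce to "?") can be sketched in a few lines of Python. The printable character set below is an approximation; real ITA2/Baudot encoding involves letter/figure shift handling that this sketch ignores.

```python
import html
import re

# Approximation of what a 5-level Baudot (ITA2) tape printer can
# reproduce; real ITA2 figure-shift handling is more involved.
PRINTABLE = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:?'-/()\r\n")

def baudot_condense(item_html: str) -> str:
    """Condense one RSS item's HTML down to Baudot-printable text."""
    # Delete links outright, anchor text included.
    text = re.sub(r"<a\b[^>]*>.*?</a>", "", item_html,
                  flags=re.IGNORECASE | re.DOTALL)
    # Delete all remaining markup.
    text = re.sub(r"<[^>]+>", "", text)
    # Resolve entities, then uppercase for the tape.
    text = html.unescape(text).upper()
    # Anything outside the printable set becomes "?".
    return "".join(c if c in PRINTABLE else "?" for c in text)

print(baudot_condense('<p>Hello &amp; goodbye <a href="x">link</a></p>'))
```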

Space News has an RSS feed.[2] The Senate Democrats have an RSS feed covering what's happening on the Senate floor.[3] (The GOP discontinued their feed.[4]) The House Energy and Commerce Committee has a feed with markup in embedded JSON.[5] Not sure what's going on there. Even The Hollywood Reporter has an RSS feed.[6]

So for real news, RSS is in good shape. RSS seems to be doing fine for sources that have something important to say.

[1] http://feeds.reuters.com/reuters/topNews
[2] http://spacenews.com/feed/
[3] https://democrats.senate.gov/feed/
[4] http://www.gop.gov/static/index.php
[5] https://energycommerce.house.gov/rss.xml?GroupTypeID=1
[6] http://feeds.feedburner.com/thr/news

I develop and maintain an open source RSS reader. In my experience it's not so bad. I strip CSS and javascript from feeds, and most of them are displayed fine anyway. I don't think I've ever found a feed that needed javascript to load content, it seems even SPAs include plain entry content in their feeds, thankfully. I've never found a feed that became unreadable after stripping styling either.

I agree it's interesting to look at your content when loaded in an RSS reader. IMHO most feeds are actually more readable when loaded in a clean uncluttered RSS reader than in the original webpage. If the content is good, the reading experience should not be harmed by focusing just on its text and images and removing extra styling.
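For anyone curious what "stripping CSS and JavaScript from feeds" amounts to, here is a minimal sketch using only Python's stdlib HTML parser: it drops `<script>`/`<style>` elements and `style`/`on*` attributes, keeping the rest of the entry markup. This is an illustration, not the approach any particular reader uses, and a real reader should rely on a vetted sanitizer library.

```python
import html
from html.parser import HTMLParser

class FeedSanitizer(HTMLParser):
    """Drop <script>/<style> elements and style/on* attributes from
    feed entry HTML. A minimal sketch, not a production sanitizer."""
    DROP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.out = []
        self.skip = 0  # >0 while inside a dropped element

    def handle_starttag(self, tag, attrs):
        if tag in self.DROP:
            self.skip += 1
            return
        if self.skip:
            return
        kept = [(k, v) for k, v in attrs
                if k != "style" and not k.startswith("on")]
        attr_s = "".join(f' {k}="{v}"' for k, v in kept)
        self.out.append(f"<{tag}{attr_s}>")

    def handle_endtag(self, tag):
        if tag in self.DROP:
            self.skip = max(0, self.skip - 1)
            return
        if not self.skip:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip:
            self.out.append(html.escape(data))  # re-escape text nodes

def sanitize(entry_html: str) -> str:
    p = FeedSanitizer()
    p.feed(entry_html)
    return "".join(p.out)

print(sanitize('<p style="color:red">Hi<script>evil()</script> there</p>'))
```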

Shameless plug: the RSS reader I maintain is https://www.feedbunch.com , comments are welcome.

I also think it is not so bad. It is the responsibility of the reader to style the content, for example to auto-size images via css so they don't get too big. If the reader does that, the feed content will in 99.99% of all cases be at least as readable as with the original styling.

Feedbunch looks great, all I want from an RSS reader is to show me a list of my feeds, and to show me the unread items in each feed, which it seems to do beautifully! Unfortunately, I dislike the fact that clicking on a list item just shows "loading..." forever :(

Firefox 46.0.1 on Ubuntu with uBlock, if it helps.

It certainly shouldn't do that. Unfortunately I cannot reproduce it; it works fine here.

Loading entries (among other things) is done with web workers. Are you blocking the execution of web workers or dynamic loading of remote javascripts, by any chance?

I shouldn't be. What's worse, it works fine now. Yesterday I didn't even get the tutorial, which makes me think that some JS wouldn't load. I think I'll switch to it as my default reader, thank you!

EDIT: Ah, the demo worked but my signed-up account does not.

EDIT 2: It works erratically.

When Google Reader was in its heyday, I remember thinking the futuristic promise of the web from the early 90s had finally arrived. I so miss it. So much effort is put into unique "platforms" nowadays — I get why Reader (and by extension RSS) can't survive in such an environment where exclusive attention of our eyeballs is monetized — but I do sometimes wish I would wake up to an announcement that RSS is a priority for big companies once again. One can dream.

As the developer of an RSS parser, I spend a lot of time hitting View Source, surprisingly often on pages that appear empty in Chrome.

In a more general sense than RSS, I also have to install extensions to format JSON. Considering how much browsers are targeting developers these days, might they consider rendering JSON, XML, etc. in some standard way that is useful to developers (as an option, at least)? I am talking about syntax highlighting as well as some basic interactive features like expanding/collapsing.

If there are no stylesheet processing instructions, Chrome will render the XML very nicely by default, in my experience.

> As the developer of an RSS parser

As far as I know Firefox does this for XML; haven't tried it lately, so I don't know if it behaves similarly for JSON (I know Chrome doesn't).

Firefox nightly and developer edition show JSON loaded directly in a tab (i.e. not in a subframe) in a nice JSON viewer, and have for a while now. Release Firefox does not, for various reasons (including the fact that the current JSON viewer implementation actually changes the DOM of the page involved, which is technically an HTML spec violation).

I think the nastiest thing to parse in RSS feeds these days is code highlighting. To a surprising degree, people who should know better use blogging software that chops up code into tables, divs, and spans, and styles them with CSS that is not included in the feed. You have to either reconstruct the underlying plain-text code as best you can, or try to recognize and support a zoo of different highlighting libraries.
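The "reconstruct the plain text as best you can" approach can be sketched like this: keep only text nodes, and skip the line-number column that table-based highlighters emit. The `linenos` class name here is an assumption based on Pygments-style output; other highlighting libraries use different markup, which is exactly the zoo the comment describes.

```python
from html.parser import HTMLParser

class CodeText(HTMLParser):
    """Recover plain-text code from highlighted HTML by keeping only
    text nodes and skipping the line-number column. Assumes a
    Pygments-style 'linenos' class; other libraries differ."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.depth = 0       # current element nesting depth
        self.skip_depth = 0  # depth at which skipping started

    def handle_starttag(self, tag, attrs):
        self.depth += 1
        classes = dict(attrs).get("class", "") or ""
        if self.skip_depth == 0 and "linenos" in classes:
            self.skip_depth = self.depth

    def handle_endtag(self, tag):
        if self.skip_depth == self.depth:
            self.skip_depth = 0
        self.depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.parts.append(data)

def plain_code(highlighted: str) -> str:
    p = CodeText()
    p.feed(highlighted)
    return "".join(p.parts)
```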

...oops. Good point, I never thought of that. I'll include a style in my own blog, thanks!

Yahoo! Weather's RSS feed has some handy additional data. Useful if you only have a US postal code. Includes things like lat/long coords, separate elements with weather forecasts and current conditions, sunrise/sunset times... Pretty handy bits of data just for requesting an RSS feed.


Bonus: the @code attribute can be substituted into the URL for an image to visually identify the weather condition referred to by the code:


Just replace the "26" part of "26.gif" with another value.
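Pulling those extras out of the feed takes only the stdlib XML parser. The sample document and the `yweather`/`geo` namespace URIs below are assumptions modeled on how the Yahoo! Weather feed looked at the time, not a live fetch of the real service.

```python
import xml.etree.ElementTree as ET

# Sample modeled on the Yahoo! Weather feed of the era; namespace URIs
# and element names are assumptions, not a live feed.
SAMPLE = """<rss version="2.0"
     xmlns:yweather="http://xml.weather.yahoo.com/ns/rss/1.0"
     xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#">
  <channel><item>
    <geo:lat>37.77</geo:lat><geo:long>-122.42</geo:long>
    <yweather:condition text="Cloudy" code="26" temp="62"
                        date="Sun, 29 May 2016"/>
  </item></channel>
</rss>"""

NS = {"yweather": "http://xml.weather.yahoo.com/ns/rss/1.0",
      "geo": "http://www.w3.org/2003/01/geo/wgs84_pos#"}

root = ET.fromstring(SAMPLE)
cond = root.find(".//yweather:condition", NS)
lat = root.find(".//geo:lat", NS).text

# The @code attribute is the value you'd substitute into the image URL.
print(cond.get("text"), cond.get("code"), cond.get("temp"), lat)
```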

It depends on who your audience is, right? If most of the people who read your site come from social media on their phones instead of reading via RSS, optimizing for RSS readability is an activity with rapidly diminishing returns.

But how did your article get onto social media?

Was it one of the 20 people who read your RSS feed and linked to it on twitter and elsewhere? Been there, done that, got the Hackernews Karma.

But if you want to optimise for casual readers vs people who want to actively subscribe to your content...

I'm not sure what this has to do with responsive design, but this is a cool list of some things to be aware of :)

Making your content reasonably usable without any CSS is (or at least should be) the epitome of responsive design.
