The only real sin of RSS (beyond the holy wars and format bikeshedding and committee madness and and and...) is that it's too honest a format. It's a format for stuff that matters, for content that deserves to be read; it's too pure to survive in a world of content silos, stalking analytics and inaccessible material-designs. Its innocence doomed it in a very poetic way.
Medium, in particular, will only post "Continue reading on medium.com [link]" to its RSS feeds.
That was the dream of the early internet, and with the centralised nature of Facebook and Google Groups, it has faded.
It's easy enough for me as somebody technical. But what about my non-technical friends? They move to Medium, Facebook, Blogger, etc, as a way of hosting their content because it's easier.
WordPress is probably the remaining champion of the easy-to-use open web, and all we do is bitch about how slow and insecure it is.
That is perfectly fine and nothing stands in your way of doing that; more power to you! There is a contingent who do not want to pay for, or view ads on, content from publishers who would like to make money from their content (because it is their day job).
Not to mention opinion writers or critics: They already have the privilege of pushing their opinions on a large audience. Why should we pay them? For their eloquence? Sure it's a lot of work, being persuasive in writing, but is it the audience who should pay for being persuaded?
I think that the collective action problem of paying for the stuff we want can be solved, will be solved - even is solved, in increasingly many ways. But there are a lot of things we're used to that we don't really want in proportion to what we used to pay for them.
Sure, it's a good paycheck for one person, acceptable for two if you're in Patreon's top 10 or 20, but nobody is coming back to your site if you have 1 or 2 contributors. Now, put together a team to compete with current media, and it simply doesn't add up. This could change, but I was expecting bigger numbers by now.
edit: I just want to clarify, we didn't go with a Patreon model. My expectation was that the amounts donated to Patreon creators would be bigger by now.
The last has 2 entries with a similar number of patrons (Aaron Mahnke, podcast, with 1726 and Jessica Nigri, cosplayer, with 1634) and vastly different income ($8k vs $23k).
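A quick back-of-the-envelope check (a sketch using only the figures quoted above) shows just how different the per-patron averages are between those two entries:

```python
# Rough per-patron monthly income for the two entries quoted above.
entries = {
    "Aaron Mahnke (podcast)": (1726, 8_000),    # (patrons, monthly $)
    "Jessica Nigri (cosplay)": (1634, 23_000),
}

for name, (patrons, income) in entries.items():
    print(f"{name}: ~${income / patrons:.2f} per patron per month")
```

Roughly $4.60 vs $14 per patron, so patron count alone tells you little about income.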
I'd say the Internet would be 100x better if those sites that complain about "business models" all went bust.
And the WSJ is as far as I know not one of those offering a full RSS feed for free, so that'd probably fall under "not honest" in their definition.
The real problem might be that it's advertisers who need lots of reporters; the public doesn't, because there is only so much news that's worth paying to read. The only answer would be to avoid trying too hard to scale.
It strikes me as ironic when someone uses uncompensated content to criticize uncompensated content.
And yet, virtually every single site on the internet has an RSS or Atom feed. I think I've come across a whopping 5 that don't, and those are mostly tech guys who made their own blog system and didn't bother with the RSS feed because, as everybody knows, RSS is dead.
Besides, does Facebook have RSS feeds?
Does Twitter have RSS feeds?
That's basically half the web right there, these days.
Yes, for notifications: https://www.facebook.com/help/212445198787494/
Other feeds used to be available too but I can't seem to get them to work.
> Does Twitter have RSS feeds?
It used to, but it disappeared as they locked down their APIs.
The results look fine to me. It's pretty rare to find one that's completely mangled. For the most part, even embedded videos work. The news providers are the bigger pain in the ass, since they like to truncate content and insert ads.
> Besides, does Facebook have RSS feeds?
Facebook doesn't, which isn't a huge loss. Nothing of consequence gets posted to Facebook, at least not for me. I could see where that would be a pain point for some people.
There are several services that provide twitter feeds. I use twitrss.me.
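For what it's worth, consuming any such feed takes only the standard library. A minimal sketch (the sample feed below is made up, and the exact URL scheme twitrss.me uses isn't shown here):

```python
# Minimal sketch: pull item titles and links out of an RSS 2.0 document
# using only the standard library. A service like twitrss.me serves
# tweets as ordinary RSS, so the same parsing applies to its output.
import xml.etree.ElementTree as ET

def feed_items(rss_xml):
    """Yield (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        yield item.findtext("title"), item.findtext("link")

# Made-up sample feed for illustration.
sample = """<rss version="2.0"><channel>
  <title>Example feed</title>
  <item><title>First post</title><link>http://example.com/1</link></item>
  <item><title>Second post</title><link>http://example.com/2</link></item>
</channel></rss>"""

for title, link in feed_items(sample):
    print(title, "->", link)
```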
Why so much negativity lately regarding new trends in the web?
Another reason might be that most of this website-fat is actively anti-user. It breaks the back button. It breaks bookmarks. It spies on you. It makes it impossible to interoperate/mash stuff, and it makes it incredibly hard to automate things (do you know that in the '90s you used to be able to download entire websites, so you could comfortably read them offline?).
Another reason is probably that a lot of "mainstream web plumbers" (like yours truly) are now hitting late middle-age and feeling it hard. We used to be able to view-source and copypaste our way to image-rollover glory; we were breathing markup and feeds... now it's all tooling and frameworks and stacks, even http is going binary, and it feels like the loss of this innocence is not actually gaining anything for anyone except marketing people.
Now get off my lawn.
Even as someone who is ostensibly a front-end hipster dev (long live React), this is something I'm in total agreement with.
Every single web developer should take 10 minutes, go have a poke around WebPlatform, and see what these amazing networked virtual machines we call browsers can do natively these days. They can do it far faster than anything we can write ourselves. Of course, browser support is an issue, which is why polyfilling as needed should still be the name of the game. But it genuinely blows my mind how often I've seen devs add a brittle dependency to the code-base for something browsers can do themselves: better, faster, and supported back to some seriously old browsers!
Problem 1: A text-sharing platform was coerced into being a sandboxed pseudo-distributed virtual machine.
Problem 2: The way this virtual machine is built and re-built for each "app" is terrible (wasteful, overkill, overbearing).
You might think 1) is a fait accompli, but it continues to impact and create issues downstream in so many areas (security, privacy, usability etc etc) because the platform was simply not built for what we're making it do.
When I make a web app, it's specifically because i want it to run in a sandboxed pseudo-distributed virtual machine that installs and runs in less than a second, and is removed automatically after some time has passed.
My users don't want to download and install a binary onto their PC, they don't want to have to manage any dependencies I may rely on, they don't want to have to configure anything, they just want to use my app ASAP in a safe manner and then go on with their lives.
If I gave you 2 options, which would you choose?
1. Download and execute this binary blob on your PC.
2. Open the same app as a web page in your browser.
Option 1 provides little protection. I could access GPS, camera, microphone, disk, etc... without so much as a single prompt. And if the PC's permissions are messed up, I can even access other programs' data and even manipulate them.
Option 2 provides permission-locked options to access a lot of those, in a language which was designed to be sandboxed, in a browser with exactly the same amount of permissions as Option 1 (so even if all the permission-locking measures failed or were bypassed, the app is still bound by the same permissions and restrictions as option 1 would be). It installs itself in seconds, it runs fast enough for most use cases, and is cross-platform in a way that nothing else ever has been able to even come close to before.
> My users don't want to download and install a binary onto their PC
Please, don't try to claim a moral high ground that was lost 10 years ago. We've decided to abuse the browser-donkey, fine; but let's not delude ourselves that there was some grand design or logic behind it all.
> Option 1 provides little protection. I could access GPS, camera, microphone, disk, etc
Yeah, because the browser today cannot access GPS, camera, microphone, disk... oh wait. In Microsoft land, the browser can even bork your entire system!
These problems are problems because we're using the wrong tools to find solutions to a different class of problems altogether. In so doing, we've perverted what was a wonderful document-oriented system. The browser was not born to run apps, but we've decided it's good enough to do it anyway, logic be damned. This comes to mind: https://www.youtube.com/watch?v=aXQ2lO3ieBA
>all these "security features" that were tacked on (like xmlhttprequest, same-origin, "secure cookies" etc etc) are constantly failing and require more and more hacks to keep up appearances.
I don't see those security features or anything like them in any other platform, and that's my point. Regardless of how we got here, the browser is the platform I feel MOST comfortable executing random code from the internet on, because of those restrictions. If I'm going to execute code, the next best alternative is pretty poor. Obviously they aren't perfect, but they are more things that need to fail in order for something bad to happen.
Also, since when is an evolution of a product considered "more and more hacks"?
Is anything that wasn't included in Linux 1.0 a hack?
>Yeah, because the browser today cannot access GPS, camera, microphone, disk... oh wait.
You can't just quote the parts of the sentence you like and ignore the rest. The very next words were "without a single prompt". The browser has those features and more, permission-locked (and more recently enabled only for TLS-secured pages). Just about all desktop platforms don't even have the concept of "access to webcam" permissions. Most mobile platforms are pretty on-par in this aspect, but even they don't have restrictions like requiring the user to confirm if the app wants to store more than about 10 MB of data on disk in a permanent manner (which all major browsers do).
>In Microsoft land, the browser can even bork your entire system!
And I'm in no way advocating for that... Most of the web community and all browser vendors are moving away from dangerous plugins/extensions like those that were used in the '90s and early 2000s. Bringing up Microsoft-specific extensions like ActiveX (or even Flash!) in a discussion about web development in 2016 is just building yourself a strawman to kick down. That's not web development, any more than running a program in x86 real mode is current desktop application programming.
>In so doing, we've perverted what was a wonderful document-oriented system.
That wonderful document-oriented system still works just fine.
... except the ones that were built exactly for the purpose of being sandboxed VMs: java applets, flash, mobile apps etc. The browser might have more prompts now simply because Microsoft showed everyone how not to do it with ActiveX, but it was not designed that way.
> just about all desktop platforms don't even have the concept of "access to webcam"
That's staggering ignorance, sorry. Why do you think you have groups like "cdrom" in Linux?
No, sorry, you can try with other arguments (standards, design in the open, whatever), but security really has nothing to do with why JS got where it got or why we use it today.
> That's not web development
No, that's exactly "web development" as you understand it (i.e. building "apps"), just on previous iterations. What you do today in JS was done yesterday with plugins; but nobody could agree on a single architecture (too many salespeople influencing the process), so the community took it upon itself to standardise on the minimum common denominator (JS) no matter how bad it was -- it was the only thing browser vendors could agree on (and even then, barely). JS managed to jump on the "standard" bandwagon that HTML+CSS spearheaded, and found itself as The Blessed VM. And here we are.
> That wonderful document-oriented system still works just fine.
No, it goddamn does not, hence the rant. Loads of mainstream websites simply do not work without JS anymore, even when they are simply supposed to show me some text. I have to trust umpteen third-party spynalytics websites just to read a couple of football scores.
The document-oriented system is dead and buried. We are left with the JS frankenvm.
> I get that you don't like the web
Read my top comment again -- I love the web, and that's why this involution is hurting me.
> these evolutions aren't mistakes
No, in most cases they are simply accidents of history like XMLHttpRequest.
> the web is designed to be an application platform
No, the web has mutated into an application platform. Big difference.
In a way, it's another wonderful metaphor for human nature: the evolution of web technology has been as messy, inconsistent and accidental as most human history. There is no rising sun, there is no grand plan: evolution is random and all political or religious systems are equally broken... we simply agree on the minimum set of norms necessary to avoid stabbing each other in the face, and then we call it a masterpiece of divine design that we obviously wanted all along.
Screw this, it's 2AM and tomorrow I have to work to pay a mortgage. Have fun, you silly kids.
Sorry, my comment ended up being more aggressive than I wanted, and I apologize. But I really do want to keep having a bit of a discussion without it dissolving into insults. Clearly we both think we are right, and I'm curious where the disconnect is.
>... except the ones that were built exactly for the purpose of being sandboxed VMs: java applets, flash, mobile apps etc.
But most of those are gone now (or at least gone from "popular" usage, and for good reason), and while mobile apps are still around and aren't going anywhere, they aren't nearly as locked down as something like a web app is. When comparing mobile apps and web apps, I'm not saying one is better than the other, just that they are different.
>The browser might have more prompts now simply because Microsoft showed everyone how not to do it with ActiveX, but it was not designed that way.
Then current browsers were designed with more prompts... Maybe not from version 0, but their current versions are designed to have them.
>That's staggering ignorance, sorry. Why do you think you have groups like "cdrom" in Linux?
Now I'm not nearly as knowledgeable about Linux as I should be, but I thought those groups all had to do with file access. I've installed applications and had them access GPS hardware, cameras, microphones, and more without a single prompt on multiple Linux platforms (as well as Windows). Is there something I'm missing there, or were my systems just not configured correctly?
>No, that's exactly "web development" as you understand it (i.e. building "apps"), just on previous iterations.
But again, that's like comparing current desktop/server application development with x86 real-mode applications. Yeah, that was how it was done in the past, but it's not how it's done now. I agree with the rest of that statement though: JS was pretty awful, but it's grown significantly and I wouldn't call current-day JS bad by any means. It definitely has its warts (more than many other languages), but it is extremely capable and it's becoming a language I reach for more often than not. Once people successfully started writing non-trivial server-side code in JS instead of alternatives, that argument kind of goes out the window IMO.
>No, in most cases they are simply accidents of history like XMLHttpRequest.
>No, the web has mutated in an application platform. Big difference.
Regardless of the terminology that you want to use, it is an application platform, and a very capable and secure one. It's far from perfect and it is still rapidly changing, but arguing about how awful it was in the past isn't going to make anyone's lives better. Like the linked article is saying, let's try to work on making sure that things get better in the future. Personally, I really like the direction everything is heading (with only a few exceptions).
Not sure why you brought up mobile, but anyway... mobile OSs have very fine-grained policies and sandboxes that can lock apps down as tight as any website. The problem is that most of them get waived away by lazy developers and unsuspecting users, and walled gardens' owners have an interest in not bothering developers too much (for fear of losing them). It was like this for browsers as well, when they were an IE monoculture; and it will likely go back there if Mozilla ever croaks under pressure from Google.
> Then current browsers were designed with more prompts
Yeah, but current browsers are part of the problem: they've decided to be runtimes, so they will act as runtimes. People who want document browsers have been left by the wayside. That is my point in a nutshell.
> I thought those groups all had to do with file access.
Of course: standard Unix philosophy is that everything is a file, including your devices; so they were secured as such.
> Is there something i'm missing there, or were my systems just not configured correctly?
Linux desktops can be locked down to various degrees; there are so many different ways to enforce policies on all sorts of things (from traditional permissions to ACLs to SELinux to PolicyKit and whatnot). What distributions decide to allow out of the box is due to their target markets. Try doing the same on OpenBSD, to see what life is when a system is completely locked down by default: you'll have to explicitly authorize many, many things...
> Yeah, that was how it was done in the past, but it's not how it's done now.
The point is, the aim of the effort is the same; but since people couldn't agree on the best way to do it, it was done on the most politically acceptable way, piggybacking on a system that was not designed for it but that nobody could object to.
It's a bit like QWERTY: a keyboard layout nobody ever liked, but that somehow became The One Way To Type. JS was retrofitted to replace plugins and be the "web runtime" for everyone. In a way, that was the objective that Firefox pursued almost from the start, although it came to life in a slightly different way from their specific implementation.
I'm not saying JS is completely bad, but that it was clearly designed for "quick and dirty" scripts, and it shows. There is very little philosophy, very few stylistic choices in the core language beyond "it has to be OO", when OO meant Java and Smalltalk. It's to "mainstream" OO languages what PHP is to C: a quick hack to use a familiar syntax in a context where the original cannot go.
Sure, it's evolved and we can use it to do all sorts of things (which is not at all a recent development, btw; Netscape had a server-side product almost from the beginning), but its roots are undeniably weak. The fact that we need rocket-science VMs on state-of-the-art processors to move a few boxes with it (slowly) IMHO is proof enough.
>I wouldn't call XMLHttpRequest an accident.
It pretty much is. Microsoft's vision for it was a mechanism to fetch XML and mash it with XSL, which is what the web was supposed to become at one point.
It became a way for the runtime to interact with any networked resource, for all sorts of purposes (loading other code, APIs, authentication, tracking, data etc etc) without having to reload the full layout.
But again, it did not come from "the web runtime camp" and it wasn't even a JS thing (it was a COM object): it was a cool feature added by a browser vendor with a completely different view of what the web should be,
but it was retrofitted because it sorta worked. They had to tack-on same-origin limitations because originally it was ripe for abuse, and tbh even now it looks a bit silly when compared with any other mainstream language http-request implementation. Random evolution at its finest.
I say this as someone who, when "ajax" started to become a buzzword, was like "oh yeah I've been pushing that concept for ages". I just think we've gone too far.
> Personally, I really like the direction everything is heading (with only a few exceptions).
Well then, to each his own :) personally, I'd very much prefer to start again with a clear separation between "content browser" and "application runtime".
You can have your runtime, but let me read content in peace without having to execute anything. Which is why I've always liked RSS: I fetch the content and then I'm free to read it as I damn well like.
Besides, a runtime rebuilt from scratch could do cool things like supporting multiple languages by design, an extended stdlib, proper sockets, etc etc, removing the need for the current "selection of features by marketing popularity" process that produces framework over framework over framework.
The thesis here is that there should be a digital way to share hypertext articles and other statically-described communication without paying the costs (privacy, rendering efficiency, security bugs, transport performance and uncacheability, accessibility) associated with a cross-platform sandboxed multimedia software runtime.
Computers have always been, and will always be, tools for automating away work. The specific nature of the work the user needs automated is in a constant state of flux. One particular kind of work that we've wanted computers to do for us since the 20th century is the sharing of written and visual information.
Ostensibly, this is what the Web was written for, and though it wasn't perfect, it could have evolved into a simple, highly efficient tool for this purpose. Instead, it can be seen evolving into a sandboxed application runtime, a workload with vastly different properties.
Good implementations of software for these two workloads would share little structure outside of that required for inevitable hardware access. Evolving the Web into an application runtime is possible, and well underway, but doing so still leaves the niche of software for an unfettered global commons of knowledge unfilled.
By using the Web we're building for the latter purpose, we are met with drawbacks which include (as I mentioned):
- lack of privacy
- rendering efficiency
- security bugs and malware
- transport performance
We can attempt to address each of these piecemeal, and with sustained effort we're likely to make good progress on most of them, in due time. But a lot of these are really hard problems for an application runtime to solve.
Conversely, these problems are a lot easier (where they exist at all) on a software platform designed from the ground up for hypertext-based, archivable communication, rather than as an application runtime.
No, we're complaining about those inventing new ways to make people's lives much worse in exchange for the off chance that someone leaves them more money.
It is the same with analytics, JS for 'nice' scrolling galleries, and CSS. Analytics at best does not get in my way; when I notice it, it is because it slows my experience to a crawl (another nice opportunity to reflect on the depravity of web designers). JS for galleries or blinking effects or 'improved' navigation has the same characteristic: either I don't notice it or it annoys me. (Nice thing about NoScript: it breaks nag walls by accident, so I get all the advantages of an ad blocker without having to feel guilty for trying to break the revenue model of websites.) As for CSS, the defaults of the browsers are pretty good, and in general browsers are very good at displaying easily readable text. So at best CSS looks slightly better than the default; however, I just had to switch off Gnome dark mode because some sites only define the foreground of a text input field, not the background, which ends up as black text on a dark gray background.
^1 Don't get me wrong, I understand how that happens. I am currently working on my new blog and wanted an entirely static site, without JS. I almost managed to get there, but I do not have a good static replacement for MathJax, so now I am wondering if I should deliver MathJax myself (and the site jumps from a few kB to more than a MB) or if I should use the hosting by MathJax, with all the downsides above.
Because it seems more and more of the trends are designed to target me for advertising and analytics while making the user experience worse.
I look at one website and uBlock says it blocked 12 domains the page tried to connect to and allowed 13 domains. What the heck? I go to audible.com and it flashes after 2 seconds and redraws the page. Never mind the banner at the top of the page that says to download the app.
All the while, I still have a data cap of 10 gigabytes on my cell. Simple pages are downloading 2-megabyte frameworks, custom fonts, autoplaying video, and groups of megabyte images. Whole programs in the '90s fit in the space of one of these images (and they said Smalltalk programs were too big). We aren't all browsing at wifi spots in some coffee shop. We have to pay for this overpriced bandwidth.
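The arithmetic behind that complaint is easy to sketch (the per-page sizes below are assumed averages, not measurements):

```python
# How far a 10 GB monthly cap goes at different average page weights.
CAP_MB = 10 * 1024  # 10 GB expressed in MB

for page_mb in (0.1, 1, 3, 5):  # assumed average page weight in MB
    loads = CAP_MB / page_mb
    print(f"{page_mb:>4} MB/page -> {loads:,.0f} page loads per month")
```

At ~3 MB per page you get only a few thousand page loads a month out of the cap, versus a hundred thousand at '90s-era page weights.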
I actually hope it gets worse and worse. I'm looking forward to what comes next after it all collapses. Perhaps we will get a medium that matches the app vision people are programming to.
// I did up vote you, no sense in down votes for a simple question
These are the literal costs (for me) to be able to use the web nowadays for minimal usability and functionality 'increases'.
For reference, the multiprotocol instant messenger client Meebo was a successful competitor to Trillian in 2005. It was a web application that used less memory and was more responsive than a desktop application. I used it on a computer with 512 MB of memory just fine.
Also, browsing the web on my phone is unnecessarily difficult and battery consuming.
Clickbait is more than sensationalized headlines these days. The entire media, regardless of place on the political spectrum - and much non-news media - seems to have embraced what was once the domain of talk radio hosts.
Perhaps there's a better way of handling this. I can't read it directly on my Kindle because the browser there - optimistically described as "experimental" - is shocking. Pocket is great because it does a good job of producing a page which consists of just the typing without the usual horrific web fluff (although sometimes it gets it wrong and the graphics go missing).
It seems a shame that, when most of what I'm interested in started life as someone else essentially entering text into a document, there's no way of obtaining it in that form but instead it has to be manipulated into something sensible. I don't want an "experience"; I just want to read what you've typed.
Would it help if I gave you my email address?
A bit sad that you have to do this to get a decent experience, but what can you do...
> still no better way of getting notified that a rarely (or perhaps not so rarely if you have endless time to read "internet stuff") updated site has something new to read.
Maybe it's because I became a heavy web user after RSS was already on its way out (of the mainstream), but my favorite way to be notified is an email newsletter. I don't care very much about the frequency because I save everything to Instapaper anyway and read it weeks later.
Exactly. That's why I prefer command-line browsers (and like reader mode): I just want to see words, not distraction.
I thought that's what HTML was supposed to be …
Somehow it turned into a portable runtime instead.
For the Reuters feeds, this works out fine. The content is text, not markup. There are few or no HTML tags. The Reuters feeds are headlines and a sentence or two. The Associated Press also has RSS feeds, and it's very similar. The Voice of America's feeds are much wordier; they often have the whole article.
Space News has an RSS feed. The Senate Democrats have an RSS feed covering what's happening on the Senate floor. (The GOP discontinued their feed.) The House Energy and Commerce Committee has a feed with markup in embedded JSON. Not sure what's going on there. Even The Hollywood Reporter has an RSS feed.
So for real news, RSS is in good shape. RSS seems to be doing fine for sources that have something important to say.
I agree it's interesting to look at your content when loaded in an RSS reader. IMHO most feeds are actually more readable when loaded in a clean uncluttered RSS reader than in the original webpage. If the content is good, the reading experience should not be harmed by focusing just on its text and images and removing extra styling.
Shameless plug: the RSS reader I maintain is https://www.feedbunch.com , comments are welcome.
Firefox 46.0.1 on Ubuntu with uBlock, if it helps.
EDIT: Ah, the demo worked but my signed-up account does not.
EDIT 2: It works erratically.
In a more general sense than RSS, I also have to install extensions to format JSON. Considering how much browsers are targeting developers these days, might they consider rendering JSON, XML, etc in some standard way that is useful to developers (as an option at least). I am talking about syntax highlighting as well as some basic interactive features like expanding/collapsing.
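What those "format JSON" extensions do is essentially a one-liner in any stdlib; a minimal sketch (the sample payload is made up):

```python
# Pretty-print a compact JSON payload the way a "format JSON"
# browser extension would, using only the standard library.
import json

raw = '{"name":"feed","items":[{"id":1},{"id":2}]}'
pretty = json.dumps(json.loads(raw), indent=2)
print(pretty)
```

Syntax highlighting and expand/collapse need more than this, of course, but the basic re-rendering is trivial, which is why it's frustrating it isn't built in.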
As the developer of an RSS parser
Bonus: the @code attribute can be substituted into the URL for an image, and visually identify the weather condition referred to by the code:
Just replace the "26" part of "26.gif" with another value.
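The substitution described above is a simple template fill; a sketch (the base URL is a placeholder, not the real service endpoint):

```python
# Substitute the @code attribute into the weather-icon URL.
# The host below is a placeholder; the real endpoint isn't shown here.
BASE = "http://example.com/weather/{code}.gif"

def icon_url(code):
    """Build the icon URL for a given weather condition code."""
    return BASE.format(code=code)

print(icon_url(26))  # the "26.gif" case mentioned above
```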
Was it one of the 20 people who read your RSS feed and linked to it on Twitter and elsewhere? Been there, done that, got the Hacker News karma.
But if you want to optimise for casual readers vs people who want to actively subscribe to your content...