"A social network is disempowering because you put a lot of energy into it, all your personal data out there, and tell it who your friends are. You can only use that information inside the silo of that particular social network."
and something we tried to do as well: http://yansh.github.io/articles/moana/
The issue is not storing the data, just like "email" doesn't store data. It's just a protocol and a data format. This makes it interoperable across whichever industry player wants to spring up and compete for your service.
Don't like gmail? Go to ProtonMail. And you don't lose the ability to interact with people who do use Gmail.
Don't like Facebook? Go to Ello. But now you've lost your entire network.
The decentralized web will be built on data formats and protocols that allow you to take your data with you and force companies to compete with the quality of their service, not the size of their network.
If I move from Gmail to Outlook, I can still email my friends, but I can't go back and search our old conversations unless I export everything from Gmail to Outlook. If I switch from Facebook to Instagram, I can still message my friends, but I can't see all the pictures I posted on Facebook unless I export them from Facebook to Instagram. And even then, a lot of Facebook's features have no equivalent on Instagram, so there's data I can't export at all.
History creates lock-in, and history is really hard to move around.
That's the importance of open protocols.
Who remembers RSS feeds? I miss those days. We should have some superior spin on that but for social networks.
Imagine Facebook, but where your history/data were your own property. A friend would find you on Facebook, or Ello, or AnythingAnyWhereBook ... add your feed, and done. Then it would be up to the social networks to make a superior product for you to enjoy your friends' data and interactions.
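To make it concrete, a minimal sketch of what "add your feed" could mean, using nothing but plain RSS and the Python standard library (the feed URL and service names are hypothetical):

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical feed URL; any service hosting your friend's data would do
    FEED = "https://anywherebook.example/users/alice/feed.xml"

    with urllib.request.urlopen(FEED) as resp:
        tree = ET.parse(resp)

    # Standard RSS 2.0 layout: channel -> item -> title/link
    for item in tree.findall("./channel/item"):
        print(item.findtext("title"), "-", item.findtext("link"))

Any client from any vendor can render that, which is exactly what strips the network of its lock-in.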
Right now it's about the monopolisation of our data.
Zuckerberg. If you're reading. Your current model is bullshit.
I think a big missed opportunity for Google on social was the failure to cultivate Google Reader into an open social network (it already had social features), and build each aspect of it on an RSS/OPML-like structure.
Anyway, to the point, try using RSS with facebook in the above context.
I hate being locked in - hence I gave up on FB. It's trash to me until it solves this issue.
For me they're different things. RSS is for things that I want available until I get around to reading them or until I manually mark them as read, this is precisely what I don't want out of facebook.
Intriguingly, given how many RSS feeds are out there from WordPress blogs, about the only way I've found to get news on WordPress releases is via their RSS feed.
I think it's the ability to communicate that's harder to move around, and that's why there's such Facebook lock-in: your social network, and the ability to communicate with them in a public/group fashion.
And the legalities of that.
FB did a major legal smackdown on someone trying to do that.
It doesn't need to be. For example, I moved my email between IMAP servers multiple times simply by dragging & dropping a swath of messages in Outlook. That Outlook (of all programs!) is the only tool that I know of that makes this easy is not a problem with the decentralized nature of email but simply with mediocre client implementations.
Not sure what you mean.
It works just as easily with Mutt. Open the IMAP folder, select all, save to [another IMAP folder or a local folder, as you wish].
Works reliably even with very large folders; you just have to wait a bit longer until it's finished.
Recently we moved all our company mail with IMAPsync, an automated command-line tool. Awesome.
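For the DIY-inclined, the same kind of migration is a few lines of Python with the standard imaplib module; a rough sketch, with placeholder hosts and credentials:

    import imaplib

    # Placeholder servers/accounts for illustration only
    src = imaplib.IMAP4_SSL("imap.oldprovider.example")
    src.login("user@oldprovider.example", "secret1")
    dst = imaplib.IMAP4_SSL("imap.newprovider.example")
    dst.login("user@newprovider.example", "secret2")

    src.select("INBOX", readonly=True)
    typ, data = src.search(None, "ALL")
    for num in data[0].split():
        typ, msg_data = src.fetch(num, "(RFC822)")
        # APPEND stores the raw message verbatim on the destination server
        dst.append("INBOX", None, None, msg_data[0][1])

    src.logout()
    dst.logout()

(A real migration would also copy flags and dates and walk every folder, which is exactly the drudgery imapsync automates.)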
I think there's a way to get around that, if the protocol defines standards for verbs.
Of course, it's difficult and can grow wild, yet very much possible.
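Roughly the ActivityStreams idea. A hypothetical sketch of what a shared verb vocabulary buys you (all names here are made up):

    # If services agree on a small standard verb set, each client can render
    # the events it understands and gracefully skip the rest
    activity = {
        "actor": "https://ello.example/users/alice",
        "verb": "share",  # drawn from the agreed-upon vocabulary
        "object": {"type": "photo", "url": "https://ello.example/photos/42"},
        "published": "2017-04-04T12:00:00Z",
    }

    KNOWN_VERBS = {"post", "share", "like", "follow"}
    if activity["verb"] in KNOWN_VERBS:
        print("render:", activity["verb"], activity["object"]["url"])
    else:
        print("unknown verb, ignore")  # "growing wild" degrades gracefully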
Here's the rules for putting the letters after names: https://www.debretts.com/expertise/forms-of-address/letters-...
The Joint Forms of Address page gives you some clues about the order of words before the name: https://www.debretts.com/expertise/forms-of-address/joint-fo...
Note that most people don't know about this stuff, and don't really care about it.
Mr Berners-Lee surely won't feel offended because the person addressing him isn't a [UK] royalist?
Most of her subjects do not complain. Some of them think we should have done away with such notions of reigning over other people already.
FWIW outside celebrity circles it is often considered quite gauche to insist on such pompous titles.
If you insist I'd be happy with calling him Professor Berners-Lee (even assuming he doesn't still have an active official professorship). If you want to talk about proper titles then is Sir Berners-Lee also rude when it should, by the right of HM EIIR be "Professor Sir Berners-Lee" [when spoken]?
If you're worried about recognition, I think Mr Berners-Lee is not lacking in that department; he was already party to the Queen's inner circle as a holder of the Order of Merit, which appears to be a far greater royal honour than a "mere" KBE. He has my utmost respect, if you doubted it.
Increasingly, and to the point of arguable totality, such titles are bestowed by popularity, committee, and HM Government. Damned shame, since it would mean what it should were it not given to celebrity riff-raff for 'services to sport'.
I digress. You say it's fine as a non-'UK royalist' to use an improper title; I say I bet most of the world doesn't use even Mr, and I'd do my utmost to pay proper respect to local custom and any honours.
For example, I'm not Catholic, but I would of course refer to Pope Francis; not 'Mr Francis', or anything involving his birth (as opposed to regnal) name.
Pope is a high office, and Francis is his chosen name. It would be rude of you to call him by his birth name; it would be rude of him to insist you call him "Your Holiness", although you probably should if you're visiting the Vatican.
You are clearly free to call him whatever you want, but I would advise you to be honest enough to realize that you're projecting your feelings about royalists onto him, and not calling him what he has chosen to be called.
If a person is an MD, a professor, and such, then that's totally acceptable. If the designation is "the Queen says you have to call me sir now", I don't really see what business it is of the Queen's, or why it's more respectful to recognise a person because "the Queen smiled at them" than to recognise their actual work/effort, as fellowship of a learned society does.
It's an anachronistic system of nepotism; bleurgh. What's not to like? /s
(edit: Today I learned that Kazuo Ishiguro is British. The statement below is completely wrong. I've managed to get his nationality wrong while enjoying his books for 20 years!)
So in the specific case of Kazuo Ishiguro, the sir does not apply. For Brits with family-first names, see the other answers.
If my friend can sign a message using the same key on both services - and I already trust that key is them - I can be almost certain it is them. Regardless of the service.
This is one of the first use cases for keybase.io - verifying identities across a few main social sites. (E: For clarification, Keybase makes it easier to find these proofs. It isn't necessary as part of the proofs.)
For example, you can find me on Reddit or Twitter and know it is me. You can also see my website URL and know that I own it - and a bitcoin address where you know I will receive the money. This is because I've verified that I own these accounts using PGP - and I've done the same for HN in my user profile. See: https://keybase.io/nadya
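The core mechanism is simple enough to sketch. A toy version using Ed25519 signatures via the PyNaCl library - this illustrates the concept, not Keybase's actual proof format:

    from nacl.signing import SigningKey
    from nacl.exceptions import BadSignatureError

    # One keypair, used everywhere; friends already trust verify_key
    signing_key = SigningKey.generate()
    verify_key = signing_key.verify_key

    # Post the same signed statement on every service you want to link
    proof = signing_key.sign(b"I am alice on twitter, reddit and HN")

    # Anyone holding verify_key can check it, whichever silo hosts it
    try:
        verify_key.verify(proof)
        print("same person, regardless of the service")
    except BadSignatureError:
        print("forged proof")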
It's only 'solved' if you don't care about usability. PGP quickly becomes a nightmare when you consider the day-to-day things 'normal' computer users will go through (private key stolen, password to private key forgotten, phishing attacks to sign replacement keys for friends, etc).
The difficulty with decentralising is that these are the biggest challenges for centralised systems like Facebook; they're problems which are very nice to have ("oh no, we have a billion users!").
The biggest challenge for decentralised systems is getting momentum behind an agreed-upon protocol.
Email was born when there were few people who needed to agree, and their major concern was being able to send/receive messages. As the Internet expanded, the momentum behind email ensured that it could ride that wave as well.
These days the Internet, and hence the number of people who need to agree on the protocols, is huge. Think about, for example, HTTP2; how many organisations and individuals were involved in its definition and publication? How many more have been involved in implementing it for various browsers, servers, crawlers, libraries, etc? How many more have been installing updates, editing config files, etc. to use it?
Another change is the concerns of the stakeholders; if all we cared about were sending/receiving messages, then we could use something like telnet, HTTP, email, etc. Instead, the concerns are more fuzzy (e.g. "presence", "likes", etc.) and more political (silos vs federation vs p2p, expectations of privacy, encryption, archiving, searchability, etc.). Reaching agreement on such things is very difficult, as everyone wants different things.
The results either:
- Have buy-in from too-few people (e.g. diaspora, pump.io, gnusocial)
- Are so general as to be inappropriate for most particular tasks (e.g. RDF; an interesting example considering that RSS is a stripped-down subset of RDF, and is/was widely adopted; also compare to ATOM, which took that subset, threw away the ties to RDF, and became more interoperable!)
- Only offer tiny featuresets (e.g. the various microformats)
Email is a mess of kludges and frankly could benefit from a ground-up re-implementation. In fact many have tried. But as with creating a decentralised social network, it doesn't matter if you have an arguably better implementation if you don't generate the snowball effect that attracts users to your "better" platform.
RSS 0.9 was RDF, as was 1.0. 0.91–0.92 (and the 0.93/0.94 drafts) and 2.0 had nothing to do with RDF.
These days however it's not the people who decide, it's organizations.
>Think about, for example, HTTP2; how many organisations and individuals were involved in its definition and publication?
Many may be involved, but very few can decide whether to use it or kill it. Namely, Google and Facebook would be enough. Descending down to individuals, it's about 3-5 from Google and one from Facebook.
It was like 'AI' in the 80's, but with XML.
It's like a decentralized version of Twitter, with federation. You might need to find an instance that's not swamped with signups, though.
I really like the idea that Eben Moglen seems to pitch in every one of his recent talks about FreedomBox:
A decentralized network of cheap "plug computers" that run Facebook-like functionality on the users' own hardware. And yet, while FreedomBox is a great project and does many things right, it omits the one crucial piece: the decentralized social network.
Let's hope Solid or something similar takes over the social networking world eventually and we can get rid of all the megacorporations hoarding our data.
I run a GNU Social site. It works well. It's a bit slow and the themes are out-of-date, but if it had more of a following I'm sure we could turn it around.
Gmail treats anything beyond the big corporate email providers as a second-class participant - and so do the other big providers.
I've used one of the many blacklist check websites to test my domain and only ended up with two false positives which I've rectified through their web forms.
It was mostly smooth sailing from there.
Note that this is using a dedicated server as a host. A while ago I tried the same thing from my home ADSL (with a static IP) and that was a pain, since those "home" addresses are generally blacklisted by default: people tend to assume that emails from such addresses are sent by compromised computers.
Nowadays I know many ISPs even block port 25 by default so that's not even an option anymore.
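If you're not sure whether your connection is affected, a quick probe tells you; a sketch (mx.example.net is a placeholder, substitute any real MX host):

    import smtplib
    import socket

    # Try to open an SMTP session on port 25; ISP blocks usually just time out
    try:
        smtplib.SMTP("mx.example.net", 25, timeout=10).quit()
        print("outbound port 25 is open")
    except (socket.timeout, OSError):
        print("outbound port 25 appears to be blocked")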
This may not apply if you are a low volume sender.
But if I were running email for any new organisation I wouldn't have that luxury, of course. I'd have to pay for a Google/Outlook or other major-provider account, or spend a lot of time "reversing" the reputation of the IP/IP block my dedicated servers had. And as none of the major providers offer a simple check like an RBL, it's a frustrating guessing game where your test mails to your own Gmail (etc.) account go through, but mail to new recipients is silently dumped to the spam folder.
I don't mind greylisting, mail bouncing or blacklisting. It's the magic "intelligent" filtering which is annoying, because the only indication of error is that you don't get a reply, because your recipient hasn't seen your mail.
In a way it's worse that it's rare: you never know if you've fixed the issue, or whether some arcane combination of your sending configuration and email content (an email not in English?! Probably spam!) will mean a new recipient never sees your mails.
We've been running on the same domain+IP for over a decade. Hotmail blocks us even when we're whitelisted by the original sender, responding to an email, and have corresponded before. We've never sent even a multi-person email, never mind spam; it's always a response to an enquiry, very low volume.
We're blocked, apparently, because Hotmail has an intermediate-level warning on a sender domain on another IP address managed by the same ISP. We were un-blocked before, but they now say they won't allow our email through; we have SPF but can't get DKIM working with our shared status. The only silver lining is that they actually responded inside a fortnight to tell us that they're going to continue blocking our [at most] once-a-week emails to customers who happen to have hotmail/live/outlook addresses ...
To work around it I made a live.com address to send through for customers who use MS email. ~4 days later they started blocking it from remote access ... something like not enough direct website logins for the amount of IMAP lookups (a single client checking every hour or so during work hours). A few log-ins seemed to fix it.
We can pay to have the spam service MS use whitelist us though, apparently.
You have to accept and work with changing expectations.
If you are unable to relocate to more reputable hosting, and your messages are legitimate, solicited, authorised, well-formed transactional email (no promotional mail shots), I suggest you switch to delivering through a squeaky-clean provider like Postmark.
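Delivering through such a relay is just authenticated SMTP submission. A minimal sketch in Python; the host and credentials are placeholders, your provider's docs will have the real ones:

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "billing@example.com"
    msg["To"] = "customer@example.net"
    msg["Subject"] = "Your receipt"
    msg.set_content("Thanks for your order.")

    # Port 587 with STARTTLS is the usual submission setup
    with smtplib.SMTP("smtp.relay.example", 587) as s:
        s.starttls()
        s.login("relay-username", "relay-password")
        s.send_message(msg)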
They could just obey their own customers' white-listing, or override blacklisting for domains that act appropriately.
So you really think that having a domain of 10+ years standing that doesn't send unsolicited emails at all is "not behaving like a first class citizen"?
If someone in your ISP's /26 sends spam from a different IP using a different domain, that totally means you're a massive spammer who's ruining the internet??
I could of course change ISP, but which of the 15+ year old ISPs will have a spammer somewhere in their IP block? There's no way to tell.
So the only option left is to pay to be able to send email to MS addresses. And you're fine with that? It's all our fault for replying to emails?
Your expectations don't match reality and you seem to prefer complaining to change.
I have basically no sympathy at all, I'm afraid. It is your fault, not for "replying to emails" but for failing to update infrastructure as the context changes. This is your problem to fix, and it appears you've made poor choices in that direction based on a misplaced sense of entitlement.
But how is such an effect not an issue? It makes it hard to "elevate" a box at home into a proper mail server, and makes it hard for new generations to host their own mail (until an IPv6-only world arrives, and probably even then, all new users will need to recycle old IPs).
As for DMARC, I'm still not entirely convinced it's a great solution to SMTP/spam issues. DKIM is already annoying enough if you occasionally want to send mail through/from a different SMTP server - like Gmail's, if you don't run your own webmail.
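For anyone following along, these are roughly the DNS records involved; example.com, the selector, and the policy values are illustrative, not a recommendation:

    example.com.                  TXT  "v=spf1 mx a -all"
    sel1._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=<public key here>"
    _dmarc.example.com.           TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"

SPF says which hosts may send for the domain, DKIM publishes the signing key, and DMARC tells receivers what to do when the other two fail.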
These days I'll always take a paid service over a "free" one where I'm the product.
Ultimately I'd love to host everything myself. But, at present, I don't have the technical knowledge to trust myself to secure everything.
On top of that, on an even more emotional level, the principle of myself being the product doesn't sit right with me.
Indie web is about people owning their data and having control over it. It's about being able to build what you want for yourself. It's a direction, a set of ideas and principles that some people enjoy.
Otherwise, content falls into a specific set of mediums (text, images, video, sound), all of which are assumed on a modern platform and aren't worth reconsidering. Design and presentation, however, has infinite possibility.
Most of the possibilities get in the way of the content.
And yet he decided to be "pragmatic" on web DRM, instead of taking the same idealistic approach because it was the right thing to do.
What's the product in a social network?
It's not purposefully evil or inherently wrong, just an inevitable consequence of the business models involved in free-to-use social networks.
I'm trying to explore whether a paid social network could be a workable model with https://postbelt.com
Please check it out :-)
EDITED: 20th century, not 19th.
For example, Google and other search engines would not work without the principle of least power, which a lot of people, including Alan Kay, somehow don't understand. That is, if the web language were a VM rather than HTML, there would be no Google.
It would also not have been possible for the web to make the jump from desktops to cell phones, now the #1 client. You know the handler in iOS and Android that makes <select> boxes usable? That's an example of the principle of least power.
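To illustrate: because HTML is declarative data rather than a program, a crawler can pull the links out of a page without executing anything. A toy version, assuming only the Python standard library:

    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect href targets without running any code from the page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(v for k, v in attrs if k == "href")

    p = LinkExtractor()
    p.feed('<p>See <a href="/spec">the spec</a> and <a href="/faq">FAQ</a>.</p>')
    print(p.links)  # ['/spec', '/faq']

If pages were opaque VM bytecode, "index the web" would mean "run the web", and the crawler couldn't even know when a page was finished rendering.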
I recommend reading his book "Weaving the Web"  if you want to learn more about the story behind the web.
I'm very glad that TBL is getting this recognition. He is a genius and also has a very generous personality.
People in the programming community seem to talk about Torvalds or Stallman a lot, perhaps because of their loud styles, but I don't see that much about TBL.
Ditto in the CS community. "HyperText" used to be a big research area but I guess TBL solved it and people don't talk about it anymore.
Kay's criticism of the Web is very well justified and (like most of his high-level criticisms) typically misunderstood. He doesn't criticize it as a repository of hyperlinked documents. He criticizes it as a platform for application delivery, which it became. The modern web with all its scripts is a VM -- and a badly designed one at that.
The hypertext repository was so compelling that everyone installed software to access it. Then that universally available software was so compelling that people found ways to run increasingly complex applications on it. And that's how we naturally found ourselves where we are today.
I don't see any reason to think that engineering a more perfect solution all at once would have worked better than this natural progression.
He thinks that you can just design something nice from whole cloth and people will use it. That's why his designs aren't deployed.
I've been looking at projects like OOVM and going back in history to Self and SmallTalk, and there's a reason that those things aren't deployed. Don't get me wrong -- they're certainly influential and valuable.
But he's basically confusing research and engineering, as if engineering wasn't even a thing, and you can just come up with stuff and have people use it because it's good. You need a balance of both to "change the world", and TBL has certainly done that.
Another analogy I use is complaining about the human body. Like "who designed this thing where the trachea and esophagus are so close together?!? What an idiot!!!" Or "why are all these people mentally ill and otherwise non-functional members of society? Who designed this crap?"
The point is that it couldn't have been any different. It wasn't designed; it was evolved.
Okay, so what's wrong with discussing the limitations of the human body and the ways to improve it then?
Yes, the web evolved instead of being designed (however much that distinction makes sense), but it arrived at a shi^H^H^H suboptimal result. And it arrived there through the deliberate design decisions of people - who, unfortunately, were designing a different system in the first place.
It's like English. I love English, but it's a bloody mess that we're all stuck with now - except that changing a computer system is comparatively easy to changing the direction of a language.
Because that's exactly what they did at Xerox PARC, many times over.
>That's why his designs aren't deployed.
>But he's basically confusing research and engineering, as if engineering wasn't even a thing, and you can just come up with stuff and have people use it because it's good.
Kay has many talks about the difference between invention and innovation (which are much better terms than the ones you're using). In fact, his analysis of this difference is probably the most insightful and thought-provoking technology talk I have ever seen:
Of course, this subject makes a lot of developers highly uncomfortable; hence a lot of shallow, ignorant, knee-jerk dismissals. "Everything is incremental." "Everything is the only way it could be." "This is fine." And so on. Thing is, Kay worked at Xerox and Apple. He has read a myriad of books and research papers on computing, which he constantly references in his talks and writings. He worked and continues to work with some of the most forward-thinking people in the field of computing. In the late eighties he foresaw most of the current computing trends - which is verifiable via YouTube. Even without any context his talks display a considerable depth of thought. In short: unlike some people, he actually knows what he is talking about.
>The point is that it couldn't have been any different. It wasn't designed; it was evolved.
And that is why someone who designed it just received a Turing award. Makes perfect sense.
Regarding your other comment here.
>If the web is a genius for hypertext, but not for app delivery, then he should have just said so. That is not a very hard sentiment to express. "The Web was done by Amateurs" doesn't capture it.
He has several decades worth of talks and writing. If you haven't bothered to familiarize yourself with at least some of them to understand what he means it's your own fault.
If the web is a genius for hypertext, but not for app delivery, then he should have just said so. That is not a very hard sentiment to express. "The Web was done by Amateurs" doesn't capture it.
But I don't even think that's true. If the web were really that bad as an application delivery platform, someone should have supplanted it by now. Alan Kay or someone else should go design their own awesome VM for application delivery. I guarantee you it will fail, for reasons fundamental to its design, while TBL's platform succeeded for reasons fundamental to its design.
Leaving out those things wasn't an accident or ignorance as Alan Kay claims. There were a lot of very conscious design decisions involved in the web -- again see "Weaving the Web".
He may regret that the web has evolved into walled gardens, but what could he possibly have done about it? There's no way to prevent that decades in advance, at least not without strangling it at birth.
But your argument is perilously close to begging the question that the success of the web is a good thing. Now, I happen to think that the web is a net good (no pun intended), because (among other things) it helped break Microsoft's hegemony and continues to force OS vendors to provide and support a standard universal computing platform (albeit a crippled one). But it's also arguable that the success of the web set personal computing back by a few decades, while also exacerbating a bunch of other problems like wealth inequality and reduced privacy/sovereignty, because it turned the Internet into a big modem.
I’ll take a look at Weaving the Web; thank you for the recommendation. To you, I commend Jaron Lanier’s Who Owns the Future.
He rails against the walled gardens while at the same time putting things into the standard, like EME, that power them.
The W3C has to negotiate with industry. But, unfortunately for us that care about the open web, its position is weak.
-- Alan Kay
That quote is pretty unfortunate. I guess nobody's perfect.
3. Comment elsewhere in this thread about how we have been forced into turning a document system into a VM.
I think the problem would be that the computation would get cut off at different points on every machine, leading to an unstable ecosystem. Remember the browser has to run on devices with at least an order of magnitude difference in resources, probably 2 orders now.
Generally, you want to guarantee that your style computations terminate. Now it appears that CSS doesn't actually provide that guarantee, since it's Turing complete :) But I guess it's close enough in practice.
Thanks for the recommendation. I'll search for it and check it out. The WWW is very important to my life.
Stallman did Emacs, in whose invention AFAIK he took part, and then he helped implement some other innovative projects, though I don't really know his CS career (I mostly know him as the face of GNU and the FSF).
IDK, but having created a popular project should not be equated with a big innovation in the field.
So you seriously think that "new" is better than "well done"?
I don't think it makes sense to compare those 2 things and value one higher than the other.
Of what use are innovations, if you can't use them in a "popular project"?
No. But Linux is not more well-made than, say, the kernel of any modern BSD, or that of illumos, etc. Git is not technically superior to Mercurial et al. Torvalds' success is certainly a big, admirable one, but it's a different kind of success than that of Berners-Lee.
Also, while it's the most popular kernel, it's not as if we wouldn't have anything we have today if it didn't exist; it's in essence an ordinary kernel that came out at the right time. It's those who made the distros, reverse-engineered the drivers, and ported/packaged thousands of programs who made Linux a big thing.
WWW, on the other hand, is an invention. It's something that did not exist, and it transformed the world like nothing else.
That was my point ;)
Otherwise we agree ...
Claude Shannon never won it.
On the other hand, I think I use Claude's work in the same sense that I use Kirchhoff's laws everyday.
More than a significant fraction of Turing awards have been won by theoreticians.
Go with the Turing Award if you want to be remembered.
> Forbes magazine updates a complete global list of known U.S. dollar billionaires every year. John D. Rockefeller became the world's first confirmed U.S. dollar billionaire in 1916
If you want to play, note that I did not say USD anywhere in my comment. Not to mention that Forbes was founded in 1917, and so it clearly has no data collected to "confirm" an earlier billionaire. We can go back to the 14th century if you like (https://www.wikiwand.com/en/Musa_I_of_Mali):
> During his reign Mali may have been the largest producer of gold in the world at a point of exceptional demand. One of the richest people in history, he is known to have been enormously wealthy; reported as being inconceivably rich by contemporaries, "There’s really no way to put an accurate number on his wealth" (Davidson 2015).
But thanks, I actually never knew that about USD billionaires.
Anyways, it was just an example.
(But I guess developing them into a useful product is itself an invention. Maybe there's no bright line between discovery and invention.)
I remember there were even internet search engines at the time. People wrote automated scripts to look for public FTP ports and compiled the results. The internet only had a few thousand addresses at the time, so it wasn't an arduous process. I think one of these databases was called Archie.
The Web is the best example I can think of for why we still need net neutrality.
> The Web is the best example I can think of for why we still need net neutrality.
I think the Web is the best example of why we need privacy protection, as it is a form of communication.
I personally pay for a VPN. And I hope my right to use independently implemented (independent from Google/Facebook/ISPs) privacy preservation technology and de-anonymization-resistant technology is not lost, and instead encouraged by the legal system.
In my view, so-called 'net neutrality' is simply a deceptive name for subjective favoritism given away (as political/economic favors) to some web-monetization players over others.
E.g. ISPs cannot favor/disfavor content providers, and yet a search engine company can favor/disfavor/rank content providers.
So ISPs are labeled 'evil' while search and social networking service providers are the 'do-gooders'.
PCX, BMP (1985?) or TIFF (1986?) maybe? PBM and PGM I think are slightly earlier but I can't readily find dates for them.
Highlight of my life right there, having my butt warmed by the residual heat of the inventor of the www's butt.
Ahem, in any case, congrats to him! Definitely well deserved.
Ah ha! I did suspect he was not acting alone - who did the other bits and bobs?
It's certainly a paradigm shift similar to what Gutenberg accomplished.
I haven't heard Kay mentioned as an early proponent of hypertext, I'd be interested to hear more if you can find it.
His list of questions for kids is great, especially the one on 'what happens when I click a link' 
I do think he probably should have started a browser company or something though. The nice thing is that his beautiful dream for the web is (slowly) coming true.
I love looking at that NeXT box he used as the first web server: https://en.wikipedia.org/wiki/CERN_httpd
> This project is experimental and of course comes without any warranty whatsoever. However, it could start a revolution in information access.
The applications of hypermedia were clearly in the ether in that era, but that document certainly pulls a lot of the threads together.
"HyperCard was created by Bill Atkinson following an lysergic acid diethylamide (LSD) trip."
That's like the exact opposite of what Turing proved, with the halting problem and all.
In any case, a well-deserved award!
Glad the man gets credit for his work.
He seems like a generally good guy with some real forward thinking, and I, like many, like him for the ideas he has. I'm happy for him.
Personally I think that's a mark of a great pioneer who is still using his capabilities for the best interests of us all.
I'm afraid he died last year.
Note that TBL never expected HTTP and HTML to replace all other protocols, they were just intended as a hypertext system which could connect to all the existing systems like Gopher, NNTP and so on, thereby increasing the usefulness of all the systems.
"conceived of the web in 1989 at the European Organization for Nuclear Research (CERN) as a way to allow scientists around the world to share information with each other on the internet. He introduced a naming scheme (URIs), a communications protocol (HTTP), and a language for creating webpages (HTML). His open-source approach to coding the first browser and server is often credited with helping catalyzing the web’s rapid growth"
So no, not hypertext, not SGML, not MIME: URI, HTML (an application of SGML) and HTTP.
If you're criticising him for building on top of what others did, well, no man is an island. He did improve on what was already there. Why not read the details for yourself?
In Tim's words:
"I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web."
The still-used command line browser Lynx was first made for Gopher in fact, before being adapted for HTTP, which it still does.
I think the main reason to allow this is that otherwise the only effect is that the same thing will happen in a non-standard fashion, with apps and other binaries being the way to access protected content.
That said, I'm not happy with it either, and I wish the web would remain DRM-free; the only reason it has the adoption it does is because it was open from the beginning.
Finally, the way to vote as a consumer is with your feet: simply refuse to access DRM-protected content and it will go away all by itself.
Content owners have the right - in principle, legally backed - to distribute their content in any way they see fit, and content consumers have the option to refuse that content if the format it is presented in is unacceptable to them. This is a consensual producer-consumer relationship.
If you feel 'bullied' into consuming content with DRM, the problem lies with you; you do not have an automatic right to content in a particular format. If you do not agree with that, the solutions are to be found in the political realm, not in the technical realm.
You are giving content "owners" (and I use that term loosely, because legally they are copyright holders, not owners; copyright is not property) many more rights than are actually granted by copyright law.
DRM is used in ways that massively exceed the authority granted to creators by copyright. They abuse DRM, and you claim the rest of society must just "take it" because "that's the market".
Well, there are no "market forces" when it comes to content, because the government grants a person a monopoly over a copyrighted work, and this exhausts many market forces. Sure, you can claim there's a general market for "movies", but due to the nature of the product these are not the interchangeable widgets that normally constitute a market.
>If you feel 'bullied' into consuming content with DRM, the problem lies with you
The "being bullied" point is a weak argument, I will give you that, but what about platform or time shifters? Should the MPAA be able to disable my Blu-ray player with a code on all new Blu-ray discs? Should the MPAA be able to tell me which HDMI cable I must buy and which monitor on my PC is acceptable?
Should the MPAA be able to force me to use Chrome instead of Midori?
If I buy content today that plays in Firefox, but next year Google revokes the Widevine license Firefox uses, is it acceptable that the content I paid for is now unusable by me?
Yes, that will limit your choice, but so what? If enough people do it, the penny will drop.
That's my solution to this whole DRM issue, and I've yet to find a 'must-see' thing that was DRM-encumbered.
Copyright law gives rights-holders all kinds of options, and digital tools have given them options to make it harder to do some things that are not per-se violations of the law but that are also not explicitly granted as rights.
And that's a huge difference from a legal perspective.
People wouldn't care much about your opinions or mine; some would even call them dumb, just as you call others' opinions dumb. But when a respectable person like TBL says something, people tend to take his words seriously.
This is why people were worried about what TBL did. Very few would have cared if you or I had done such a thing.
Personally I stopped eating meat, and even that imposes quite a cost in additional constraints on your life, because society is designed a certain way and doing anything different requires extra effort.
Yes, boycotting something you don't want is a good move, but if you apply it to everything you want to protest against, you will no longer be part of society.
So you can't rely on that alone. It can't be the only answer. We need to take a stand and refuse these bad moves, especially before they happen.
A figure like TBL openly supporting something like DRM sends a message, and it's not a good one.
Copyright law is here to stay, rights holders will try to use technology to be able to squeeze every last $ out of their legally backed position and consumers have the collective powers to give those rights holders the finger.
The fact that consumers as a group don't care enough is the main problem, see also: privacy and many other items like this.
It is not. Only poisoned food that kills you quickly is. There is plenty of legal food that harms your body or gives you diseases; we just call it junk food, alcohol, and other names because it is morally accepted.
You accept DRM as a fact of life. We don't, because it isn't. It's just the next move pulled by the majors to try to lock in the consumer. It brings zero benefit to society. No culture-sharing restriction ever did, and all the stats in the world show that what they claim to protect against is a scam: the majors are making more money than ever.
> The fact that consumers as a group don't care enough is the main problem
It can be said about any problem in society. Health, education, whatever. Regulations are not the solution, having people caring is.
Still, we do have regulations.
You are essentially trying to ignore the fact that the Berne convention exists and that it (and not ideals) govern the position of rights holders. TBL is a pragmatist, first and foremost, that is why we have the WWW, not because he is an idealist who has forsaken his idealism.
As such, his benefits to society are such that few people can claim to have changed the course of history to such an extent, if you wish to argue that DRM is not a 'fact of life' then you are out of touch with life.
Major rights holders 'making more money than ever' is not automatic; they require our cooperation and consent.
And that consent and cooperation are ours to withdraw.
Any other changes are political and likely an uphill battle.
When Firefox came around (as Phoenix), it didn't endorse ActiveX, making a lot of sites unusable. It chose fair standards.
What would have happened if they had decided that they were too small, and that Microsoft was so big that ActiveX would be inevitable anyway?
Note that eventually the DRM won't affect me. I'm tech-savvy enough, so I will always find ways to get around it. But I think it harms society as a whole, and that you should say no.
I had an interview offer from Google some years ago. I refused politely. Google is inevitable; I still have a few Gmail addresses and use the search engine. And it pays well. And their projects are cool. But I said no because I believe I should not be part of it.
I'm doing my part.
You don't get a democracy if you are not doing your part, even if it means you will lose. Strategic voting just promotes immobility.
I'm all for pragmatism, but it has to be used in conjunction with bigger goals.
Now, you don't have to be perfect - I'm certainly not - but I still think that Mr Berners-Lee's decision sent the wrong message.
See if that changes your mind.
> Moreover, a case could be made that EME will make it easier for content distributors to experiment with—and perhaps eventually switch to—DRM-free distribution.
I can't see how the author made this leap.
> It doesn't matter if browsers implement "W3C EME" or "non-W3C EME" if the technology and its capabilities are identical.
It matters as a matter of principle. It would have sent a message. Maybe this would have made the W3C irrelevant, but if it did, so be it, at least they would have gone without compromising.
I see you already discussed with sametmax about this, so I won't go further.
As long as there is a market, they will come.
The web is too big of a pie to let it go.
But even if they decided to go, it wouldn't be a bad thing. Proprietary things on proprietary platforms, and fewer people trying to destroy the open platform. I'm all for that.