Take a look at this way to do it: http://panzi.github.io/SocialSharePrivacy/
As a website owner I try to lead by example by not including any 3rd-party JS (or any JS at all, for that matter), specifically avoiding trackers from Google or Facebook.
I do use facebook.* in one of my tutorials about reducing privacy exposure, but of course this applies to any of the ubiquitous 3rd-party servers out there (Disqus qualifies in my opinion, as does Gravatar, etc.).
This approach breaks pages only minimally, with nice benefits in return: it reduces the ability of ubiquitous 3rd parties to profile browsing history, and pages load faster.
I'd recommend this over their old two-click social share. Your link is a fork of the old Heise tool; it looks dated on mobile.
Ugh. I really need to think about whether I want to be part of the problem.
Note that a major difference is that you apparently have to go into the forum page to leave a comment, you can't do it from the page you're discussing.
Discourse = set up at least a 2 GB server, then figure out how to integrate it with your site.
On that level they aren't really comparable.
I've removed all ad network code from my blog (troyhunt.com)
This included a screenshot of DoubleClick still being blocked on Troy Hunt's blog.
1. May we retrieve common libraries from third party CDNs? Doing so helps support this site by saving on our bandwidth costs, but may expose information about you to those third parties.
2. This site allows commenting through Disqus. We have no control over what Disqus does with your data, and so your information may be exposed to Disqus and any third parties they communicate with. Would you like to enable comments?
3. (Similar for tracking, if I decide to do something other than log parsing.)
Default 'no' to all, and I still need to find a way to ask the questions in a way that doesn't disrupt simply viewing a blog post that someone linked. Perhaps if someone returns, I'll prompt then.
Does anyone have thoughts on whether this sounds sane?
That way people with µMatrix or similar blockers can use the control tools they have instead of needing to do something site-specific.
Also, such decisions can't be remembered if cookies/localstorage are disabled. So prompting over and over again could also be annoying.
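One way around the re-prompting problem is to feature-test storage and fall back gracefully when it's disabled. A minimal sketch (the function names and the injected `storage` parameter are my own, for illustration):

```javascript
// Sketch: persist a consent choice only when storage actually works.
// `storage` is injected so the same code works with localStorage,
// sessionStorage, or a dummy object when storage is disabled.
function rememberConsent(storage, key, value) {
  try {
    storage.setItem(key, value);
    // Some browsers expose setItem but throw or silently drop writes
    // (e.g. private mode); read it back to be sure the write stuck.
    return storage.getItem(key) === value;
  } catch (e) {
    return false; // storage disabled: caller should avoid re-prompting this session
  }
}

function recallConsent(storage, key) {
  try {
    return storage.getItem(key); // null if the visitor never answered
  } catch (e) {
    return null;
  }
}
```

If `rememberConsent` returns false, the choice can at least be kept in a plain in-page variable so the visitor isn't re-prompted during the current visit.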
> 1. May we retrieve common libraries from third party CDNs? Doing so helps support this site by saving on our bandwidth costs, but may expose information about you to those third parties.
In an ideal world browsers would never send a cache-refresh request for resources tagged with SRI because the hashes would guarantee that the content is 100% stable. Alas, it has non-trivial privacy implications, so they don't do that.
Maybe it could be implemented as a privacy addon with a whitelist of CDN domains, but then sites would still have to adopt SRI for that addon to do its work. Or maybe an addon that injects cache-control: immutable for CDN responses could work too, but that's limited to HTTPS.
1. All local. Unless you don't want that, and a 100-200 KB JS file is too much of a strain on your server bandwidth. Or are you serving 15 MB of JS files?
2. Screw Disqus. Screw Facebook Comments. Start thinking about your visitors; as someone said in another related thread, you are responsible for the tracking of your visitors by 3rd-party sites. Use local comments, or turn them off if you don't care what others are saying. Don't save any information about commenters except what they enter in the boxes. One-way hash the IPs if you need to compare them for spam reasons.
3. If you need your ego stroked when you see you had xx visitors on your site, go ahead, use Google Analytics and screw us all. We're gonna block it anyway.
That's a bit harsh - wanting to know whether your blog post got 0 visitors yesterday or 2,000 isn't just about ego; it helps you understand the value of your posts (and whether you should bother). Knowing how many people visited isn't the same as bragging about it.
3. I've just about talked myself into going with log-based analytics here. I find GA's omnipresence too worrisome to contribute to it, even with consent.
Thanks for that link, that's essentially the policy in my head before I started thinking about things like comment support. It's way better written than I would've come up with.
Can't say I would disagree. A lot of folks these days just seem to append an HN/Reddit link to posts that get discussions on those sites. There's the blog as an expression of author's personality, and then there's the discussion space as an area with a life of its own.
2. I'm thinking of putting a "Click to load comments" box in place of Disqus on my blog so nothing gets loaded unless the user clicks. Seems better than bothering the user up-front.
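A click-to-load gate like that is only a few lines: nothing third-party is requested until the visitor asks for it. A sketch, assuming the standard Disqus embed.js URL pattern; "example-shortname" is a placeholder for the site's Disqus shortname:

```javascript
// Sketch: defer the Disqus embed until the visitor clicks a button.
// No request leaves for disqus.com until loadDisqus() runs.
function loadDisqus(doc, shortname) {
  const s = doc.createElement('script');
  s.src = 'https://' + shortname + '.disqus.com/embed.js';
  s.async = true;
  (doc.head || doc.body).appendChild(s);
  return s.src;
}

// Wiring (browser only):
//   document.getElementById('load-comments').addEventListener('click', () =>
//     loadDisqus(document, 'example-shortname'));
```

The placeholder button can sit exactly where the comment thread would render, so the layout doesn't shift when someone opts in.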
3. I use Google Analytics - I figure it's common enough that if people don't like that, they'll already have it blocked, so there isn't really any additional tracking they won't want (unless the twitter timeline widget is tracking; which it might be, but I suspect I'll remove it soon anyway).
2. I like that idea. I also kind of like the idea of just not using comments - when I used disqus years ago it was mostly spam - but I think I want to try again and see if it's worth it.
3. Also a good point, but that only accounts for those people who are aware of the tracking as a point of concern. Given that the blog will be technical with a personal bent and vice-versa, one or two of my ten readers may not be aware of tracking as a thing :)
On this front, though, I'm probably just going to start with log-analytics tools. It's really the only way to get a fully accurate picture across visitors (server-side logging can't be blocked, but GA and even self-hosted data gathering can), and I don't really care too much about the additional info that analytics can provide.
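For the log-parsing route, a page-view count per path falls out of a short script over the access log. A sketch assuming the common Apache/nginx combined log format (the regex and function name are my own):

```javascript
// Sketch: tally page views per path from combined-format access logs.
// Matches the request line, e.g. "GET /blog/post HTTP/1.1", stripping
// query strings and ignoring non-GET/HEAD requests.
function countPaths(logLines) {
  const counts = {};
  const req = /"(?:GET|HEAD) ([^ ?"]+)[^"]*"/;
  for (const line of logLines) {
    const m = req.exec(line);
    if (m) counts[m[1]] = (counts[m[1]] || 0) + 1;
  }
  return counts;
}
```

A real setup would also filter out asset requests and known bot user agents, but even this much answers "did anyone read the post yesterday?".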
Given the option, I would probably also just parse logs - I don't think Analytics is adding much on top of that; I just don't have that option using GH Pages. The reason I moved from AppEngine to GitHub was to stop messing with the code for my blog in an attempt to make me write more posts instead! =D
1. Serving from github still shares the tracking information. It can be argued that github is better than cloudflare/facebook, however bear in mind github has politically motivated staff. Long cache is a great idea. Alternatively cut out unnecessary js.
2. Nice idea, it does hamper the ease of use of your blog though - I would never click to view, though I did read some that were visible when I finished the article.
3. Do you find the information from this useful? In a way that isn't trivially parsable from server logs? I ask because we are reviewing the quality of our user analytics, and our ga seems rather pointless atm.
2. Yeah, it's not ideal. In this case, it looks like Disqus are gonna fix stuff though (they've commented on my post; there's a link right at the top of the article now).
3. I don't have access to the server logs as I'm running on GitHub Pages, so something like Analytics is all I have. I do find it useful (given no server logs), it's nice to see the traffic to my blog; there's no point posting if nobody is reading! :-)
3. That is very interesting. Now knowing your stack (pages + disqus + adverts), I see one side of the 'problem' is that bloggers don't have much choice in terms of revenue, so the infrastructure is paid for with user data. The other side is likely the complexity, incompatibility, and time-wasting of home-rolled solutions.
The really nice part of a CDN deployed blog is handling the traffic spikes though.
In fact I would say CloudFlare are better than both GitHub and Facebook, and I am only wary of them because of their position of power and the potential they have (ie. they are a victim of their own success). Both Facebook and GitHub have shown themselves to make political decisions at the expense of their users.
To privacy-conscious users: CloudFlare is the man-in-the-middle for more and more of the Internet, potentially tracking at Google-like levels.
CloudFlare may: ... Add script to your pages to, for example, add services, Apps, or perform additional performance tracking. (Unfortunately this is opt-out rather than opt-in.)
To Tor users: CloudFlare implements a captcha to protect servers from malicious traffic; the implementation has caused tremendous annoyance in the past and the company may have been slow to address this problem.
https://news.ycombinator.com/item?id=7977780 (example complaint, 3 years ago)
https://news.ycombinator.com/item?id=11388560 (9 months ago, from cloudflare)
https://news.ycombinator.com/item?id=11404770 (the tor project response)
https://news.ycombinator.com/item?id=12122268 (6 months ago, additional discussion of tor vs. captcha)
To DDoS victims: CloudFlare protects several DDoS vendors while gaining business protecting DDoS victims, citing free speech.
To CloudFlare customers: CloudFlare has a "target on its back" and has faltered against DDoS in the past, causing outages for all of its customers. AFAIK: It's been a while.
To CloudFlare freeloaders like me: CloudFlare doesn't have much incentive to protect its free-tier users from DDoS.
Related: Akamai stopped helping DDoS'd pro-bono client Brian Krebs. https://news.ycombinator.com/item?id=12561928
In my book CloudFlare easily ranks ahead of has-been "do no evil" Google irrevocably merging its entire history on me ex post facto. https://news.ycombinator.com/item?id=12760003
One of my early design decisions is to be as lightweight and fast as possible. This means no oauth, no ads, and only core features that you would expect to find in a comment system.
My suggestion would be to make the design more appealing, it looks a little bland now.
And also promote the privacy oriented mission of the service a lot more. Currently there is no mention of privacy/tracking, you only mentioned no ads.
And https is a must in 2017.
Just a few questions:
* When do you plan to launch?
* What is the backend built with?
Good luck man.
I'm soft launching with beta users right now.
Python, Pyramid, SQLAlchemy (which supports PostgreSQL, MySQL, and SQLite3), uWSGI, Nginx, Ubuntu
I don't mind including scripts on my page from huge orgs that have a lot to lose by doing bad things, but there aren't that many companies that fall into this category (Disqus did, but possibly shouldn't ;))
I'm planning on building a business around this service, and reputation will matter.
I don't plan to sell out because I eat my own dog food. I built Remarkbox for a personal itch, an itch I feel other people may also have.
For some things, this is true. However, you can be sure an entity like Microsoft or Google isn't going to accept a few thousand quid to inject ads or affiliate links into customers' websites. For a one-man band that's struggling to turn a profit, though, it's less certain. There are a lot of people trying to make a quick buck online, and most people have a price.
There are definitely some great things out there being built by small teams that I might miss out on, but that's how it is unless we can tightly control what third party scripts can do on our pages. Sometimes a service will be "so good" that I'll do it anyway, but it's always a trade-off. I don't think most people are as anal about this as me though!
It is possible for someone to say "hugs" at the end of their discourse and still be a liar and a cheat and a terribly bad actor.
No idea, of course, about any of these people - but don't let cost-free, content-free expressions alter your (bullshit/fraud) detector.
See the comment on the OP's blog from "disqus here".
I'm @madbyk on Twitter and you can also Google my full name to catch my other lies and bad acting on some of my recorded talks.
I did no such thing. In fact, I specifically admit to having "No idea ... about any of these people".
Skepticism is good :)
^ That sounds legit to me... I believe this was the primary reason why Facebook made an SDK and Like button in the first place...for data mining. Pretty clever.
This is the consequence of building on a platform like FB: you exchange your visitors' browsing-habit data for access, and FB expands its graph of IP<>website connections to improve its ad targeting. And with Disqus it won't be as obvious, because the publisher might not be aware that it leads to an FB connection.
So regardless of whether it was unintentional, this is a relevant story about the trade-offs of using platforms.
I ended up researching WAY too many comment systems, and eventually settled on Reddit. Not ideal, but better than all the alternatives.
Blog commenting is pretty broken right now, I guess due to the dominance of social networks. I wanted to write my own blog comment service in rage but thought better of it.
Disqus seems pretty sloppy. I was surprised to learn that they were an early YC company.
It's not very active, but it's only been alive for one week. It's an experiment, as I mentioned.
It could be improved by showing the comments inline using Reddit's API, which seems pretty good (although I haven't used it). And I could probably automate the submission too.
But I'm trying to keep it simple for now, until there's evidence that a lot of people want to comment!
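For what it's worth, Reddit exposes a thread as JSON (append .json to the permalink), so inlining comments is mostly a fetch plus a walk of the response. A sketch of the extraction step; the function name is mine, but the [postListing, commentListing] shape and the "t1" comment kind are Reddit's documented response format:

```javascript
// Sketch: pull top-level comment bodies out of a Reddit thread's .json
// response. The response is a two-element array: [postListing, commentListing];
// each comment sits at child.data.body, with kind "t1" (kind "more" marks
// collapsed "load more comments" stubs, which have no body).
function topLevelComments(threadJson) {
  const children = threadJson[1].data.children;
  return children
    .filter((c) => c.kind === 't1')
    .map((c) => c.data.body);
}
```

Feeding this the parsed body of `https://www.reddit.com/comments/<id>.json` would give the top-level thread; replies nest under each comment's data.replies in the same listing shape.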
It feels to me like the typical Facebook approach: do what they want to do or a little bit more, monitor the blowback and walk it back as little as possible only if required to keep everyone happy.
https://www.heise.de/extras/socialshareprivacy/ -> http://panzi.github.io/SocialSharePrivacy/
Facebook continues to use every tool at their disposal to protect their expansion of the privacy invasion of their product.
monitor the blowback and walk it back as little as possible
In this specific case it was indeed only possible to walk it all the way back.
Ads should be loaded into <iframe sandbox referrerpolicy="no-referrer">
It would still give them some information (affiliate ID and user IP) but no cookies or tracking of user interaction with the page itself.
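Spelled out, that sandboxed ad frame would look something like this (the src is a placeholder; note that leaving allow-same-origin out of sandbox gives the frame a unique origin, so it can't read the page's cookies either):

```html
<!-- Hypothetical ad slot: no referrer leaked, no cookie access.
     The sandbox attribute disables everything by default; add back
     only the capabilities the ad actually needs (here: scripts). -->
<iframe sandbox="allow-scripts"
        referrerpolicy="no-referrer"
        src="https://ads.example.com/slot.html">
</iframe>
```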
Sort of a meta-tracker. But maybe I'm too paranoid.
Most likely they're getting paid for this tracking.