GDPR Version of USA Today Is 500KB Instead of 5.2MB (twitter.com)
466 points by brundolf on May 26, 2018 | hide | past | web | favorite | 205 comments

The day I added ad blockers (Adblock Plus, Ghostery, and uBlock Origin) was the day my CPU fans stopped spinning almost constantly, my battery life improved, and web pages loaded faster and were more responsive.

My experience of the web is about a thousand times better than it used to be.

I will remove all these when the ad companies and websites start to behave themselves, which will be never.

If you liked the effect of installing an ad blocker, give disabling JavaScript a try. (You can selectively whitelist a small number of websites.) I did this a couple of years back and the difference is astounding.

I'd be willing to bet your "small number" of whitelisted websites is actually a lot more than you think. Turning off JavaScript breaks a ton of sites, IME.

Same thing with cookies.

On a side note, web developers have gotten really lazy about feature checking. In the IE days they'd at least have a banner saying their page wasn't going to work because it required IE or Netscape. Browsing with cookies turned off nowadays, I hit one or two sites a day that won't even display, and then often get stuck in an infinite redirect loop.

Even whitelisting sites to use cookies is a PITA. Outlook for Office 365, for example, requires whitelisting cookies from 3 or 4 domains, and due to the way it redirects, I had to dig around in Chrome's page inspector to even find out what those domains are. And of course the page itself gives no indication of why it's not loading; it just flashes between empty pages forever.

> I'd be willing to bet your "small number" of whitelisted websites is actually a lot more than you think. Turning off JavaScript breaks a ton of sites, IME.

My experience is the opposite.

I used NoScript for almost 8 years, and then switched to uMatrix one year ago.

I don't need to enable JavaScript on most sites. And if I do need it, I enable it for only some parts of the page. Most websites are much faster this way.

I've found the Cookie AutoDelete browser add-on works really well for dealing with cookies.

Firefox https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...

Chrome https://chrome.google.com/webstore/detail/cookie-autodelete/...

Disabling JavaScript actually works great on news sites as far as I remember.

It used to be that disabling JS rendered web pages illegible. But good CSS changed that. So now you can read sites perfectly fine without JS. You can't interact, but you can read.

EDIT: Wow, I just tried disabling JS on nytimes.com and it's INCREDIBLE. Really, try it. In Chrome I used the pattern [*.]nytimes.com, which matches the site and all of its subdomains.

A good compromise is uMatrix, which by default only allows first-party JavaScript and cookies, and makes it easy to quickly enable JavaScript/XHR/cookies when necessary using a point-and-click matrix:


Isn't uMatrix fully included in uBlock if you simply enable Advanced Mode?

No. You get a 'lite' version of uMatrix in uBlock; you gain significantly more control with uMatrix.

Is uMatrix something you use instead of uBlock, or in addition to it? Also, is there a guide on how to use it somewhere?

I use both, but sometimes it's a bit of a pain in the ass with captchas and embedded iframes, etc. There's a lot of trial and error figuring out which scripts are needed. But it's great to know most of the crap, including cookie/consent banners, is blocked.

Does this not blow up or block most logins for apps?

The first time I log into a website I often need to do a few rounds of "allow this script; refresh; allow this script; refresh" to get captchas/CDNs working. But then I save the settings for that site and don't think about it again. It can be a bit of a pain sometimes, but I find it interesting having to acknowledge where different sites pull resources from.

To answer my own question, it's a bit redundant but I need both.

uMatrix handles network stuff (blocking css, js, media, etc).

uBlock also handles network stuff, but the difference is that uBlock can also block specific elements on the page, e.g., divs by id or class.
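For illustration, here's roughly what that difference looks like in uBlock's static filter syntax (the hostnames and selectors below are made up):

```
! Network filter: block all requests to a third-party host
! (per-host blocking of this kind is also what uMatrix's matrix does)
||tracker.example.com^

! Cosmetic filters: hide specific elements on a page (uBlock only)
example.com###sidebar-ad
example.com##.promo-banner
```

The first rule works at the network level; the last two only hide elements by id or class after the page has loaded.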

The Stylus extension is great for that. Firefox's style editor lets you test CSS edits in real time. Then copy those CSS rules into Stylus and the browser will remember them. I use it to turn off all CSS animation too.
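As a sketch of that last point: a global userstyle along these lines, applied through Stylus, is usually enough to kill CSS animation everywhere (the `!important` flags are needed to override site styles):

```css
/* Disable CSS animations and transitions on every element */
*, *::before, *::after {
  animation: none !important;
  transition: none !important;
}
```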

If you block all JS and CSS with umatrix, you can still read articles by pressing the "reader view" button in Firefox.



uMatrix can do the latter too by now.

uMatrix has the same author as uBlock Origin; here is the repo: https://github.com/gorhill/uMatrix

The wiki page has many useful FAQs, guides, and docs. Specifically, there is a link to a decent guide:


edit: using both is redundant

If you want something a little simpler, try ScriptSafe[0]. It still lets you whitelist JS you want to trust, or enable it temporarily or for a set time. I find it rather clearer than uMatrix.

In fairness to gorhill it's a couple of years since I last looked at uMatrix so it may be much improved.

[0] https://github.com/andryou/scriptsafe

I had been doing that for a long time, but with the use of JavaScript frameworks and SPAs for pretty much any website, including the least appropriate applications like blogs, I kind of had to capitulate in the name of convenience.

That's a great idea, I'll give that a go!

If that's too extreme, try just disabling images.

Is there a list somewhere of what websites actually work without JavaScript enabled?

I have had JavaScript disabled for about four years now, and it was among the best quality-of-life decisions I ever made in terms of web browsing.

I don't think there is a list, but I'd say it heavily depends on your browsing behaviour. In my case the vast majority of web sites that I access randomly (mostly through some feed or aggregator) are text-based and actually more pleasant to use without JS: faster, no jumping around of content, no obnoxious subscribe modals, almost zero ads, etc. If a site doesn't work without JS, I usually just close the tab. I hold the view that if something is important enough for me to care about, it will reach me in some form or other eventually.

But if you are mostly after multimedia content, checking out tech demos, or similar, you are going to have a bad time.

You can use the "Quick JS switcher" plugin to easily enable JS for a page.



I'm quite lenient in whitelisting, because for me personally disabling JS is mostly about performance. If I'm interested enough in a web site's offering and see the utility of it requiring JS, I'm also willing to wait a little. But I'm not patient by default.

Sadly it's a shrinking list and as sites replace static content with things like React it's shrinking even faster.

Good: HN works well and handles the downgrade gracefully (ex: upvoting becomes a POST rather than AJAX).
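As an illustrative sketch of that kind of graceful degradation (this is not HN's actual markup): the page ships a plain link that works on its own, and a script upgrades it into a background request when JS is available:

```html
<!-- Without JS, the browser simply follows the link -->
<a class="votelink" href="/vote?id=123&amp;how=up">upvote</a>

<script>
  // With JS, intercept the click and send the vote in the background
  document.querySelectorAll('.votelink').forEach(function (link) {
    link.addEventListener('click', function (e) {
      e.preventDefault();
      fetch(link.href); // fire-and-forget background request
    });
  });
</script>
```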

Bad: The original Reddit site works OK, but the new React-based one fails miserably: it shows a loading screen that never changes.

The new Reddit React site hijacks right-click so you can't even open your responses/user page in a separate tab. Fail. This is how Slashdot.org lost their user base. Who in their right mind would use React for a content-based site?

But reddit seems to know they suck at websites: you can still access https://old.reddit.com/r/<whatever> and even https://old.reddit.com/r/<whatever>/.compact for the old site and the old old site.

> Who in their right mind would use React for a content-based site?

People who are also writing content-based apps and want to write once but still obtain native speeds on mobile? The lack of right-click functionality is just bad design, not React's fault.

> obtain native speeds on mobile

With browsers and heavy JavaScript frameworks? That's an oxymoron right there.

The truth is, people/organizations doing this simply put other concerns above user experience and friendliness.

"Native speeds" in this case meaning "needlessly slow". But at least it's consistently slow across devices!

I frequently use the old reddit design (as in old old) because the new and new new design both suck a lot on mobile in terms of speed.

There is nothing native about a website which actually lags while I scroll, when the previous version could do it just fine in desktop mode.

> still obtain native speeds on mobile

The irony is, Reddit on mobile is god-awful slow. It takes a good 5-10 seconds of displaying a spinner just to show me a kilobyte of text (self-post or linked comment). I no longer click Reddit links on mobile for this reason.

Reddit has got to have talented people working for it. But whatever they're doing that results in a 10-second spinner to display pure text is "MongoDB is web-scale" levels of comedy.

Are you talking about React Native? It sounds good on paper, but it's just as fast or slow as a browser because it is a browser, and it's far from creating a native app experience on either iOS or Android because it tries to do animation on a JavaScript-style single-threaded execution model, among other things. What content-based apps do you have in mind? The Reddit app nobody is using because it doesn't add value over the web site, and which Reddit keeps nagging about in the most annoying way possible (thereby signaling users to never, ever install it)?

React is fine for browser-based line-of-business (and I guess also collaboration) apps; for content, not so much, because you have to jump through hoops to parse content into React's vdom [1].

[1]: https://stackoverflow.com/questions/50420248/how-to-parse-a-...

> React Native [...] because it is a browser

React Native's primary selling point (and reason for existing) is that it uses native components instead of wrapping a webview. This is the most obvious, relevant, and hyped detail of React Native. So your statement that "React Native is a browser", implying the same performance penalties as visiting a website or using Cordova/PhoneGap, is false.

When old reddit goes away, there will still be this:


I think there needs to be a movement to bring the WWW back to basic, readable, low-JS, animation-free, content-based sites.

React is actually capable of server-side rendering; in fact, one of the Reddit announcements specified that React's SSR was a key factor in choosing it. I'm curious why they're not using it. The new site is still in beta, though.

Probably because React having this capability was enough to silence the opposition, without actually committing to using said capability.

(Also, they probably save money on electricity by externalizing more and more processing to users.)

The new Reddit interface is horrible without JavaScript. One can get the good one back at old.reddit.com.

The new reddit interface is horrible period. Even with JavaScript on.

I reckon that the number of websites I regularly visit is small enough that whitelisting them over a few days will get most of them. HN, Reddit (for the articles), a few work sites, a couple of news sites. Everything else I'll just see how it pans out. It will be an interesting experiment.

I use Reddit and HN without JavaScript. Most sites work just fine or with minor annoyance, which the websites in many cases could fix if they wanted.

> I will remove all these when the ad companies and websites start to behave themselves, which will be never.

Which will be when access portals (currently search engines, and app stores for apps) heavily correlate resource usage with ranking / commission / etc.

Currently, the biggest access portal(s) are invested in ads, so not any time soon. But when every kb drops your pagerank one point, and every cpu cycle costs an extra penny of App Store commission, that is when the entire market will suddenly follow.

It’s not soon but it’s not never. It’s possible.

I think this is sensible, because sites in the same industry should have the same energy consumption. For example a video streaming site will always use more CPU than craigslist, but that’s okay because a video streaming site is competing with other video streaming sites.

But I think the industry's solution to this has always been to just wait for the inevitably faster computers that come out each year, so that the lag lessens just by the passage of time. But that becomes a cat-and-mouse game, because as CPUs and network speeds get faster and storage gets cheaper, developers can get away with larger sites and slower code.

Case in point: nowadays you can barely browse the web with an iPhone 4, but when it first came out people weren't waiting 30 seconds for a page load on it. Even look at the average size of an app in the App Store now. I think Facebook was 40MB when it was first released; now it's 300MB. It's like you have to keep getting newer and newer hardware just to keep up.


You surely aren't using it anymore, are you? They are now owned by an ad company.

I haven't signed up for an account or given them any data (and specifically made sure that I don't share usage or diagnostic data with them).

They are a German company very much covered by GDPR, so I think it's relatively safe to use the tool this way. I'd be happy to hear otherwise though.

With Privacy Badger, uBlock Origin, DuckDuckGo Privacy Essentials, and Firefox's Tracking Protection available, there's little reason to keep using Ghostery (or Adblock Plus, FWIW).

I also recommend Decentraleyes, Smart Referer, First Party Isolation and Facebook Container. Plus of course HTTPS Everywhere.

It's hard for me to imagine productive browsing without most of those addons now.

Good lord, what's that, 10 plugins?! Sure, we nerds could do that, but god help normal people! :(

That's what the Web is like these days, unfortunately. Browsing with JS enabled on my Nokia N900 with 256MB RAM is mostly impossible, even though it should be much more than enough to display most of the content. We nerds with our heavily configured browsers often overlook that, but regular folks have to live with it.

Yesterday Ghostery sent out a "Happy GDPR Day" email, which also leaked their users' email addresses in the CC field. I'd say they're off to a rocky start.



I've not given them any data, and not signed up for an account, so I wasn't affected by this breach.

It's a German company with a very 'interesting' history: https://twitter.com/notrevenant/status/1000085227421782017?s...

They used to be owned by Evidon Inc., later renamed to Ghostery Inc., which was a classic ad company (sold data gathered from the opt-in GhostRank feature to help other ad companies figure out how to not have their ads blocked).

Nowadays, the Ghostery extension is owned by Cliqz GmbH, which is still an ad company, but one specializing in privacy-conscious ads (mainly ad personalization based on evaluating browser history locally). They have a privacy policy with no holes as far as I can tell, all of their client-side code is open-sourced, and they are even minority-owned by Mozilla. So Mozilla can at least check what they're doing, and would probably give up ownership should Cliqz infringe on privacy (even if you think Mozilla itself is the devil, they would still likely do that for PR reasons).

So, I do think it is fine to use Ghostery nowadays. I still don't quite understand why it's so popular when there are tons of other tools for the same purpose (for example Disconnect, Privacy Badger, Firefox's built-in Tracking Protection), but yeah.

> I will remove all these when the ad companies and websites start to behave themselves, which will be never.

Maybe, maybe not.

The problem with straight-up blocking of all ads is obviously that it cuts off funding for content creators whose work I enjoy.

What I really want is blocking of the ridiculous crap you find in ads, while still allowing reasonable publishers to show ads and monetize.

That's why I'm hopeful for the "Coalition for Better Ads". It's essentially trying to define that standard across multiple parties.

Add a Pi-hole into the mix and everything gets even better.

Websites can be "documents" or they can be "apps". It's great that we now have the option to build the latter, but far too many things that should be simple documents (news sites, I'm looking at you) are trying to be apps, and it's virtually always a worse user experience, setting aside the cost in load time, memory, and battery usage.

I'm saying this as a JavaScript developer.

The market has spoken.

No seriously, if it sucks so much, why haven't users responded by going somewhere else, thus incentivizing sites to behave well?

Or: the market does not (always) work. This too is an option.

Agreed, and my comment was partially a response to the anti-regulation demographic here on HN.

But, it's still something worth probing into: if it really does make the user experience worse -- and I agree it does -- why haven't people punished those sites by going elsewhere?

My best guess is that it's like bad customer service: it bugs people, but it's not really the differentiating factor when choosing a product/service provider.

Don't be so sure people aren't going elsewhere. Besides craigslist, there's also the not-Internet option, which I think many of us on here forget, possibly largely because it's so seemingly difficult to measure (in our skewed experience).

I've met many young people who still default to pen-and-paper for things like note-taking. It's tempting to dismiss them as luddites, but I can't exactly blame them, given how much I know about just how user-hostile modern computer software and hardware can be.

A properly working market requires informed decisions. Internet privacy invasions are not something that people are well-informed about at a practical level. So GDPR is actually helping to restore the market.

Sure, but people are saying that it makes the UX really obviously terrible too.

I agree, but the providers are financially incentivized to make the UX worse via all this stuff, which reduces the number of providers offering a non-terrible UX.

Having become ad-revenue maximizers that take their audience for granted, these sites don't optimize for serving the audience well. And since their model is to provide content in bulk, it keeps users served "well enough" to keep coming back, unlike less-funded alternatives which might provide a better UX but can't offer the bulk content that monied sites can.

Fundamentally, people visit sites and endure bad UX if the content is there to draw them. The utility cost of bad UX often isn't the dominating factor.

> the providers are financially incentivized to make the UX worse via all this stuff

Yes, but again, that's only true because apparently the UX isn't actually bad enough to push away a significant chunk of their userbase.

> Fundamentally, people visit sites and endure bad UX if the content is there to draw them. The utility cost of bad UX often isn't the dominating factor.

Agreed. But you could also interpret this as, "people don't mind this type of UX that much".

They do and they have. A ton of old media companies are going bankrupt.

USA Today definitely would not have survived another 5 years with 25 second page loading times. Or at least, such an old website wouldn't have generated enough revenue to survive long term (and USA Today would put their content on something else like Apple News, Youtube, etc)

They have. Growth is not through the WWW. Instead, it's through phone apps.

The news experience through websites is so bad that I don't read news on the WWW anymore.

> The market has spoken.

That is one threadbare cop-out.

When the choice is to go without completely, few will do so (and those that do will often get marketing blitzed as "luddites").

People who say this, do you realize that if the market were left to its own devices cars would still be deathtraps without seatbelts?

The market isn’t magic, and the market we actually have is far from free.

why do you assume it's a market in the first place?

This is what project AMP is all about: to make websites lightweight again, using radical but effective means. You can feel the difference in desktop Chrome using the AMP Browser Extension: https://chrome.google.com/webstore/detail/amp-browser-extens...

> This is what project AMP is all about: to make websites lightweight again, using radical but effective means.

It's also completely unnecessary. Those radical means that lead to lightweight websites are plain HTML and CSS with as little JavaScript as necessary. I don't see the point of adding all that bloat, just to have it removed by a third party like Google.

I'm honestly starting to reach the opinion we should all be VPNing our web traffic through Europe so we can pick up more of the benefits.

Orrrrr... we could actually send a real message that this type of behavior is completely unacceptable by raising awareness in our respective countries of origin to get GDPR-like regulations codified in law, so this type of abuse becomes the exception rather than the norm.

Otherwise, it'll turn into a case where companies farm out their web design and digital marketing to "marketing havens", just like they already do creative restructuring and accounting for tax-evasion purposes.

>implying Congress will do anything other than continue to line the pockets of rich cronies

The federal government in the US is profoundly dysfunctional to the point where this isn't worth the time or money. However, it may be worth the effort on the state level.

> The federal government in the US is profoundly dysfunctional to the point where this isn't worth the time or money

If everyone who says that voted in the primaries and mid-term elections it wouldn’t be true. The party of dysfunction is in charge now but only because so many people choose not to vote.

There's a host of reasons why the dysfunction goes all the way down to the low efficacy of voting, but HN is notoriously unable to handle these kinds of discussions without several 500-post demon threads.

For example, people tend to get stuck in tribal thinking where one party or the other is to blame when the seeds of the problems go all the way back to the founding of the country. It's hard to shake people out of it so they can have a productive discussion even when everyone's trying to be fair and honest.

In their defense, last vote was on a working day and not all employers let their employees skip work to vote.

It's weird that voting does not take place on a non-working day in the US (like Sunday, for example).

Does the US not have some kind of law that guarantees employees the opportunity to take at least a few hours off work for voting?


Some employers tried making it look fancy to let employees take the day off to go vote, so that maybe some other employers would pick up the habit in exchange for some bragging rights, but it didn't go very far.

I'd say, why not both? You can advocate and also demonstrate that you walk the walk.

Is anyone concerned that we are starting to replicate real-world boundaries on the internet?

The internet was supposed to free us from the limitations of the real world. The world of the internet was supposed to be one where you could fluidly switch between your perceptions of self, become a new person whenever you felt like it, leave your past behind. This was supposed to be a new world where people see themselves differently.

Now we have created countries on the internet. We've transferred our real-world identities onto the internet. The masses were rushed onto the internet before they were ready, before they got the concept of what the internet means psychologically. Now we have vast bureaucracies ruling the internet; so depressing. Depressing to see people on HN saying "Good" to every piece of GDPR news. Sad to see the internet age squashed by bureaucracies right when it was getting started.

The story of mankind, I'm afraid. A few people find a neat little spot where they can hang out, and for a while things are great, everything is pretty cool. Then, sooner or later, the group gets Too Big. Not so cool stuff starts happening, and where formerly you had a small anarchic group that worked with mutual understanding and (unspoken) agreements, now you have a need for Laws and Enforcement and Bureaucracy.

Every beautiful place on Earth will sooner or later be exploited for mass tourism and marketing. If you ever find such a place, keep it a secret, to keep it intact.

It’s a known pattern of subcultures. See Geeks, MOPs, and sociopaths in subculture evolution:


We haven't "created countries on the Internet". The Internet exists in the real world, and it exists in countries. Those countries have laws, and they always applied. Governments can take time to adopt to new technologies, but this was always going to happen.

And GDPR is an awesome thing, it's an example of governments doing the right thing.

> We haven't "created countries on the Internet".

We have, though. Your world on the internet looks different based on citizenship (IP address), and people have to get a visa (VPN) to go to that country on the internet.

People hated it when China did this, but somehow the EU doing this is commendable. Now people are Europeans, Americans, Indians, etc. on the internet. I just want to be a human being.

> Your world on the internet looks different based on citizenship (IP address)

But this started long before government intervention ("personalization", "regional content"), and it is not the fault of the EU that tech companies pursued behaviors that were against the spirit of the laws. There are two ways for tech to respond to the multigovernment problem: conform to all of the laws throughout the world, or "create countries on the internet". As is evident now, most companies chose the latter.

I didn't hate it when China did it just because they implemented national regulation. The deciding matter is what is regulated and how.

> Now people are Europeans, Americans, Indians, etc. on the internet. I just want to be a human being.

Then let's put people in power that uphold international law and treaties, and empower international bodies.

I feel like you need to explain how China's Great Firewall (I assume that's what you're referencing) is the same as the GDPR?

One prevents people in China from accessing stuff the government doesn't like, the other prevents companies from treating your data like shit.

> Your world on the internet looks different based on citizenship

Yes, my world on the Internet looks a hell of a lot better than the Internet as seen from the US.

If you follow the money, they are similar. Both make it harder for the incumbent, US based monopolies to compete with home-grown services.

Also, they both massively break the “end-to-end” principle, since your geographic location (and therefore ISP) has a big impact on what data gets served.

(The differences between the two are arguably more important, but they are obvious, and have been discussed to death, so I won’t bother pointing them out)

How is it harder for a US-based company to comply with GDPR than an EU-based company?

There is no breakage of the end-to-end principle either. The EU can only legislate in its own region, so it can't enforce GDPR outside the EU. A company that is GDPR-compliant for every user doesn't have to check the IP at all. Any differences stem only from a company's will to keep abusing users' data in the rest of the world.

> Yes, my world on the Internet looks a hell of a lot better than the Internet as seen from the US.

The internet in the rest of the world is fine. At least our government doesn't record our porn-watching activities via unknown third parties.

Huh? Maybe I've misunderstood you, but porn laws in EU are quite a bit more liberal compared to US. And authorities are also subject to GDPR.

He's probably referring to the UK, specifically.

Only via the NSA :)

Generating a page based on geolocation didn't start with GDPR. I've been fighting (often US) sites making "choices" for me based on geolocation for a decade.

> people have to get a visa (VPN) to go to that country on the internet

That has always been the case for people in "third-world" countries. Hulu, YouTube ("the publisher of this content has not made it available in your country"), and even stuff as tangible as Amazon.

Now you're just catching up with what the internet has been like, basically forever, for everyone outside the US and Western Europe.

It comes down to one thing: Money.

The older internet was not a network of for-profit users trying to rent-seek on every digital interface & every page view. Standards were created by academics, internet committees, and concerned & involved users, trying to figure out what would work best as a resilient distributed system.

Now what goes on online is defined by the rent-seekers, the data pimps, and the predators. Their actions have real-world, legal consequences which ruin it for everybody, and their size & clout overwhelms the decisions of average people.

There's a really good piece [1] by David Chapman on how subcultures grow, from creators to fanatics to mops to sociopaths, which ultimately results in their death. Your comment made me realize that the Web is currently well in the "sociopath" phase. I wonder whether any of his suggested remedies can be applied to save the Web from death.

[1] https://meaningness.com/geeks-mops-sociopaths

I don't follow your idea of the internet. The internet is a faster, cheaper, immaterial postal system. Of course there are legal frameworks and national borders in it.

I feel freer now companies are limited with what they can do with my data.

I hope that preventing companies from exploiting data will lead to more interesting business models than "provide distracting trinket and harvest data".

I don't want to take part in a grand age of the Internet which basically amounts to people harvesting information about me so they can do anything from trying to sell me shit to attempting to corrupt the democratic process.

Aren't you basically admitting that you are

1. Distracted by 'trinkets'

2. Exposing your personal information in exchange for trinkets

3. Prone to buying 'shit' on the internet

4. Naive enough to fall for fake news

5. Convinced the government needs to step in and take charge to create "more interesting business models"?

It's true for everyone to a greater or lesser extent. That's why ads and data abuse are the dominant business model on the Internet.

This comment breaks the HN guideline which asks:

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize."

That's a form of crossing into incivility, and it leads to much worse (as seen below). Could you please not post like that here?


wlll on May 26, 2018 [flagged]

No, you fucking moron.

Please don't violate the site rules like this, regardless of how wrong and/or provocative another comment is. It just makes the thread even worse, and obviously we have to ban accounts that do it repeatedly.


Sorry Dan, it was a comment born of a lack of time, frustration and lack of judgement. I'll try to respond better in the future.


It's been that way since day one.

The FBI can clobber your .com domain if you infringe US law, even if both you and your server are located in places where it would be perfectly legal.

> World of internet was supposed to be the one where you can fluidly switch between your preceptions of self, become a new person whenever you felt like, leave your past behind. This was supposed to be a new world where people see themselves differently.

Since when? The Internet was designed to be a robust way for Americans to communicate in the event of a nuclear attack.

Apparently that's a myth, see e.g. https://en.wikipedia.org/wiki/ARPANET#Debate_on_design_goals

I've also read about it in the book "Where Wizards Stay Up Late: The Origins of the Internet". In the prologue it mentions:

“Rumors had persisted for years that the ARPANET had been built to protect national security in the face of a nuclear attack. It was a myth that had gone unchallenged long enough to become widely accepted as fact.”

“Lately, the mainstream press had picked up the grim myth of a nuclear survival scenario and had presented it as an established truth. When Time magazine committed the error, Taylor wrote a letter to the editor, but the magazine didn’t print it. The effort to set the record straight was like chasing the wind; [Bob] Taylor was beginning to feel like a crank.”

Some other sources seem to confirm this, e.g. http://www.alphr.com/features/369490/top-ten-internet-histor...

I see that you are one of the privileged, who didn't have to constantly deal with "this is not available in your country" messages.

But how will all the startups who need to sell your data stay in business? /s

who NEED???? Are you kidding??? Even without TARGETED ads, the world is still running, you know?? And if a startup's business model is to make money by selling its users' data without telling them and without their consent.. well... let it die!!! And the world will be a better place...

That was the joke.

People need to not downvote. The in-joke is the bane of communication. Let it be explicitly spelled out for those not in the know.

But now we have half a dozen comments, including the original sarcastic one, that add nothing, when everyone could have inferred all of this from the grandparent "we should pick up some benefits from routing through europe" comment. None of the replies, nor yours, adds anything substantial, nor is very funny or interesting.

That's why the downvotes have come -- to clean this up, so people can scan past the grey stuff that probably adds nothing.

It's not an inside joke, just sarcasm.

It kind of makes me want to throw away the Internet and start over from scratch.

The net, nah. The web, perhaps.

Maybe time to take a new look at gopher and newsgroups?

I miss newsgroups so much. But different times... I don't think it would work again; we need something new.

This is what the AMP Project is about.

It is cheaper to block the ads and tracking scripts.

Lately, I've found using a VPS and setting up my own VPN is a bit cheaper (and safer? assuming you don't want anonymity).

Vultr has a $2.50 a month box with a few European locations. You only get 500GB of transfer per month at that size, but you can test the waters and resize as per your needs. They even have an install script for OpenVPN if you don't want to go through that headache.

The only real downside is they limit you to 2 boxes at that size per account. Clearly just a loss leader for them (but it got me using their service).
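For anyone who'd rather skip the install script and see what it's actually producing, a minimal OpenVPN server config is only a dozen or so lines. A sketch, using standard OpenVPN directives; the certificate and key file names are placeholders you'd generate yourself (e.g. with easy-rsa):

```text
# Minimal OpenVPN server.conf sketch -- routes all client traffic
# through the VPS. File names below are placeholders.
port 1194
proto udp
dev tun
ca ca.crt
cert server.crt
key server.key
dh dh.pem
server 10.8.0.0 255.255.255.0             # VPN subnet handed to clients
push "redirect-gateway def1 bypass-dhcp"  # send all client traffic via the VPN
push "dhcp-option DNS 1.1.1.1"            # avoid leaking your local resolver
keepalive 10 120
persist-key
persist-tun
```

The two `push` lines are what turn a plain tunnel into a "route everything through the VPS" setup.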

Alternatively, LowEndSpirit.com will do you the same transfer for about $5/year. It's NAT ipv4 but that's fine for VPN (you get some ports forwarded) and has native ipv6.

> It is cheaper to block the ads and tracking scripts.

The combined effort needed to produce and maintain huge lists of rules to hide ads everywhere is probably way above what's needed to set up a VPN.

Does the site degrade gracefully? What's the easiest way to train non technical users how to step through and configure each site they use?

While technical people have the ability to implement solutions, I suspect many are short on time. I think a VPN, despite its cost, is a better solution for many (provided websites don't restore the JS/bloated functionality).

> Does the site degrade gracefully?

Surprisingly many do.

With that said, a fair number also require some tweaking and it is that tweaking that non-technical users will be unlikely to understand nor be patient enough to work through in order to get a particular site to work.

Or just stop visiting crap sites. Sites that do that are a signal that the content is crap.

On the other hand I, being in the EU, am VPNing my web traffic through the USA so I can access websites that block access to EU citizens or provide a degraded experience (such as the USA Today website). I also get less of those annoying banners asking me to consent to getting cookies or the processing of my personal data when I only want to read an article (these types of banners are in fact a known attack vector).

I would be very happy if everyone followed this example. Think of this as "modern" from now on. Return to the old, simple web that I know we all loved. Keep the lessons in usability and design learned in the last decade or two.

Now we only need to find somebody to pay for that simple web....

I think there's somewhat of a false dilemma there.

There are plenty of sites that I frequent that have tastefully located, static banner ads served directly by the site operator rather than through an ad network. This has the dual advantage of allowing the operator some editorial control, and of producing ads that tend to be much higher quality (to me, anyway), because they actually tell me about new products and services I might be interested in, related to the content I'm reading, rather than being a giant pile of "one weird trick" ads and targeted advertisers' sad attempts to sell me things I've looked at before, and therefore either already bought or already decided not to buy.

In essence, it would be a return to the kind of advertising that print media uses. Which, incidentally, tends to command a higher price than all this junk, anyway.

I've also done paid subscriptions to various journalistic websites in the past. I've since stopped, generally because their reading experience is generally so awful that I've retreated back to print media for such things. So I suppose you could count me as one of those people who would be willing to directly pay for a simple web. (It's not like I prefer to kill trees. I just value readability is all.)

I don't think it is a false dilemma. The problem is about the difference between the magnitude of revenue generated by the proposed methods. Both for publishers AND people interested in advertising products.

Tasteful banners are nice, but tracking the performance of such ads is very hard. The ad business is really cut-throat. When you visit a site, there literally are multiple ad networks / advertisers bidding for a spot in front of you, with real money, based on the site, your location and your interests. A huge network of autonomous trader bots that trade real money billions of times per day for a spot in front of a human's eyeballs. If an algorithm decides that you have a higher than normal probability of converting to a paying customer, it (an AI or expert system) decides on the spot to pay a "premium" to be in front of your eyeballs. They do this while running A/B tests with different ad variations to figure out what converts better, for which segment of the population, etc.

Modern ads are not only kind of targeted. They are INSANELY targeted. There have been instances of people targeting individual persons through ads based on geolocation and interests.

A publisher cannot be expected to run all that bidding code on their own servers. Those networks are actually a network of networks. If you are buying ads today, you want analytics immediately to see how you are doing, how much you are spending so that you can improve and adapt before sinking more money on it.

The profits generated by such methods are a couple orders of magnitude higher than a simple untrackable banner on a site placed by a site owner which gets swapped out whenever...

It's a heck of a lot cheaper than it was in decades past, and things like torrents and ipfs can help eliminate high-cost choke points.

That would be us. :)

For all that cannot access both versions, keep in mind that the EU one has a vastly simpler UI; it almost looks like just a blog archive, while the US one is a more feature-full, typical newspaper homepage. Not saying that either one is better, just that it's not an apples-to-apples comparison.

> not an apples-to-apples comparison.

Hm, what if you’re a reader looking for journalistic text articles, is the same content available on both?

I'm not a regular USA today reader, but from a brief look it seems like in the EU version only the most recent ~30 articles are "reachable" by clicks from the home page.

p.s. the EU version is on its own subdomain; maybe you can reach it from the US too: https://eu.usatoday.com/

Nope, it redirects back to the regular one for US users.

I’ve rarely seen a website load so fast on mobile...

Whilst all the articles seem to be accessible if you know the exact URL, the homepage only shows the latest stories and there are no links to different news categories.

Also, the articles seem to be missing links and the videos are not available.

If all you want is the text of the articles and you only want the latest stories, the EU experience is probably great. But if you want more, you're probably going to be disappointed.

I use uBlock to explicitly make some newspapers "article only", that is, remove all UI elements that are not the article. No ads, other article links, or navigation. Vastly improved reading experience.
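For anyone wanting to try the same thing: uBlock Origin's "My filters" pane accepts cosmetic filters of the form `domain##CSS-selector`. The domain and selectors below are made up for illustration; use the element picker (the eyedropper in uBlock's popup) to find the real ones on your paper of choice:

```text
! Hypothetical "article only" rules for one news site
news.example.com##header
news.example.com##nav
news.example.com##.sidebar
news.example.com##.related-stories
news.example.com##.newsletter-signup
```

Lines starting with `!` are comments; everything after `##` is an ordinary CSS selector for the element to hide.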

Firefox has Reader Mode (little book icon to the right of the URL bar, appears on text-heavy sites) which should do this automatically.

I believe Safari has this, too. Might find an extension for other browsers.

This tweet shows the difference with screenshots:


Rather than linking to another insidious "social media platform," would it be possible to simply describe what is there?

Many of us have sworn off Twitter, FB, Google, youtube, etc. because we don't care for the tracking, nor do we care for regulations. Unfortunately, we miss out on many things when people insist we participate on those sites by merely linking to them as part of a discussion.

I get this argument when you're talking about a non-public Facebook post, but you don't need a Twitter account to view content posted on Twitter.

They're redirecting to https://text.npr.org/ (which already existed)

Not simple at all, as in both instances we are looking at multiple images that compare different versions with various measurements attached.

So we've recently heard in the news that cryptocurrency mining supposedly takes 0.5% of all global power.

Now compare that to the extra electricity required for everybody undergoing the horrible experience of 80 poorly written trackers and ads constantly overloading each tab in a web browser.

This reminds me of the American budget. Cut the funding for Planned Parenthood and the Arts but we need a hundred new tanks!

It’s similar to tons of paper wasted for flyers no one asked for. The people spamming this stuff are never really held accountable for their environmental footprint.

I have often thought at what point do you start rendering web pages remotely and simply send back the (ie 'png') image to the client. A single PNG image of an entire web page is definitely under 5.2MB.

Personally, I can't wait for a shift in business practices. Ads have ruined my confidence in privacy. Most sites share your data with over a dozen different ad tracking vendors. I've seen twice that for a specific class of site, e.g. thechive.com.

> I have often thought at what point do you start rendering web pages remotely and simply send back the (ie 'png') image to the client.

Please never do that, unless you want your visitors to have the worst accessibility experience.

Yeah, but aren't startups and other profit-driven entities more interested in catering to the mainstream than to a small set of users with specific needs? If companies will gleefully ignore Linux (and BSDs), browsers other than the ones that ship with a major OS, privacy concerns, free speech principles or shipping to remote areas, I can't really expect them to care much for marginalized groups such as people with accessibility issues.

> Yeah, but aren't startups and other profit-driven entities more interested in catering to the mainstream than to a small set of users with specific needs?

How are "people on a smartphone" or "people with bad vision" small sets of users? Also, rendering the whole page in a PNG file takes more work than merely serving the page as is, which is the second reason nobody does it (the first one being "why would someone want such a shitty browsing experience?").

What about something like a PDF?

We already have HTML/CSS; we don’t need a non-responsive format like the PDF.

> at what point do you start rendering web pages remotely and simply send back the (ie 'png') image to the client

That point is never. Never do that.

Opera Mini used to work this way more than a decade ago; it was pretty nifty, as it let moderately powerful feature phones access the 'full' (non-client-JavaScript) web efficiently. Such a thing is no longer needed though; I'd hope a large amount of that 5.2MB is reusable assets, making subsequent requests a few hundred kilobytes.

Opera Mini is still a thing, mostly in developing countries.

Back in 2000 people used to do this sometimes (but with JPEGs). Exports from MS Publisher seemed to be the worst culprits. They used image maps for navigation (https://www.w3schools.com/tags/tag_map.asp).

edit: Clearly this is a terrible idea; just to clear up any misattribution of intent.

What is wrong with sending plain HTML?

Do you enjoy clicking links? Selecting text? Any interaction at all?

Let alone scrolling.

Even before the GDPR, text-only news sites seemed to be making a comeback. On bad cellular connections, these two are great:



These are also worth including:



I have a very strange feeling that GDPR will end up resetting the internet (in a good way). It's already astonishing how we're getting unsubscribed from every possible mailing list we didn't want to be on in the first place, and it'll become even more awesome when all the websites will take notice, how little they are allowed to do in terms of piling crap on top of their web presence. I'm rather excited about the future right now.

Would be really cool to have a list of before and after (size drop), due to GDPR in popular websites.

They went from a load time of more than 45 seconds to 3 seconds, from 124 (!) JavaScript files to 0, and from a total of more than 500 requests to 34.


When I see a site imploring me to disable my adblocker I think - nope, too late, you ad people screwed up this thing forever, look for a new business model. This one is dead. Or at least, it's dead for me. I am ok with having reasonable number of ads. I am not ok with blowing up site 10x and making my laptop max out CPU and my mobile devices get stuck trying to render all that garbage.

How do you think free sites pay for producing all that valuable content? I find it astonishing that most people seem to think having access to news for free is almost a right. If you had taken their paid subscriptions they'd have served you a less ad-riddled version.

Ads are perfectly capable of being served without javascript and without cookies and without tracking users across the internet.

Serving ads requires exactly:

1) The HTML <img> tag.

2) Server-side rendering (at least enough to insert a URL into the src="" attribute of the <img> tag).

3) A webserver that responds to the URLs inserted in #2 above by supplying an image.

That's it. And, in fact, in the beginning of ads on the internet, the above was how all ads were served. But sites didn't care for it much, because they had to do extra work to serve ads. And advertisers very much did not like it, because they had to trust that their partner sites were truthful in their reporting of ad impressions (for the pay-by-impression model).

And having the advertiser be the one running "the server" of #3 above (which would allow them to monitor impression counts) means that their partner sites need to be kept up to date with the latest set of active URLs, lest some <img> tags show the broken link icon. Also a huge hassle.
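As a sketch of just how little is involved, here is the whole pipeline in a few lines of Python. The inventory, image paths, and page template are all hypothetical; the point is only that the server picks the ad and emits a plain <img> tag, with no JavaScript and no cookies anywhere:

```python
import random

# Hypothetical inventory: campaign id -> banner image hosted by the
# publisher itself (piece #3: any webserver that can serve an image).
AD_INVENTORY = {
    "acme-widgets": "/ads/acme-widgets.png",
    "corner-bakery": "/ads/corner-bakery.png",
}

def render_page(article_html: str) -> str:
    """Piece #2: server-side rendering that drops a URL into an
    <img> src attribute (piece #1). Nothing else is required."""
    ad_id, img_src = random.choice(sorted(AD_INVENTORY.items()))
    banner = f'<img src="{img_src}" alt="ad: {ad_id}" width="468" height="60">'
    return f"<html><body>{banner}\n{article_html}</body></html>"

page = render_page("<p>Article text goes here.</p>")
print(page)
```

Impression counting is then just a line in the image server's access log, which is exactly the reporting advertisers had to take on trust, as noted above.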

The JS ad frameworks came about because the ad networks realized if they could lower the bar to gaining "ads" on a site, they could get more sites running ads (the push went something like "now use 'ad world 2.0', now just a single <script> tag in your website, no other work on your part").

And along the way the ad networks realized that the companies purchasing the impressions were willing to pay more for impressions that might be more "significant", and "relevant ads" were invented. Of course, the unstated, behind-the-scenes part of "relevant" was that in order to determine "relevance" we now have to track and monitor the end users' activities all across the internet, so that if we see J. Smith searching for cat food on Amazon, we can now start serving him ads on Facebook for cat food, and they will be "relevant" because we know he is interested in cat food for some reason.

So sites don't have to go "ad free" to also be "track your users' activities everywhere" free. They just have to return to the original model where the ad was the internet equivalent of a highway billboard or a poster on the side of a bus-stop shelter. Untargeted, just there, maybe it is seen, maybe it is not.

For that matter "untargeted" in this case just means "not targeted using unrelated viewer data." There's a significant amount of targeting that can take place by targeting based on the content and site. If anything, I would argue that the obsessive ad-network, viewer-based targeting online often has tremendous targeting failures as compared to traditional content-based targeting.

I started distinctly noticing this over the last year or two in thinking about how much more relevant I found advertisements in print magazines and journals as online. In literary journals, I'll find advertisements from publishers about their new releases in related topics. In scientific journals, I'll find conferences, new lab equipment and products for related fields, and so on. In design journals, I'll find advertisements for furniture, fabric, and so on. I get one local art magazine mostly for the advertisements, which are primarily new exhibition announcements. I actually somewhat enjoy seeing these sorts of advertisements: they're clearly marked as ads, but they're also actually useful. If, say, New England Biosciences puts out a kit with new features, their advertising in Science can let me know about an option I might otherwise not have heard about, for example.

Yet even before I started very strongly blocking as many ads and tracking online as possible, the targeting was horrible by comparison. There were the saturation-advertising systems, which would come up with wonderful decisions like "this person just bought a new mattress; that must mean they buy mattresses often, so let's show those advertisements" or "this word was used somewhere in the website, let's show stuff tangentially related to that word," or "everyone is interested in ONE WEIRD TRICK." Most of the time, these advertisements had nothing to do with me, or what I was reading, and they were obnoxious and unhelpful. How this became the norm online, just because on some rare occasions, the algorithms might work well, as opposed to the very well-targeted advertisements in print publications, is quite confusing.

You can run ads without 5MB of tracking scripts. The options are not "invade everyone's privacy to maximize profit" or "become a charity", after all. Most of the history of advertising functioned without probing the virtual rectum of your customers.

I wish that all sites provided alternative mini-versions such as this.

This is great. (1) How do I pretend to be from the EU? (2) The EU should make a law requiring HTML versions of pages with no JS.

The evolution of the Internet in the last two decades has been such a tremendous disappointment. It really peaked in the early 2000s (when I got my first DSL connection but before JS became a thing).

> The EU should make a law requiring HTML versions of pages with no JS.

I know this is well-intentioned, but so is most authoritarianism. We don't want to go down that road. Your personal preferences don't get to become law.

Not so, it's about basic accessibility, so blind readers can use your site. The EU could and probably should dig into that at some point too, and the rest of the world will (hopefully) benefit. (I added hopefully because, unfortunately, a number of websites have engaged in dick moves; namely implementing GDPR protections for EU users only or blocking EU users entirely.)

> The EU should make a law requiring HTML versions of pages with no JS.

Having politicians and lawyers micromanage technology decisions for the entire Internet? What could possibly go wrong.

Couldn’t get any worse? The Internet is basically unusable today.

You're writing that comment on a site that uses Javascript:


It seems pretty usable to me. I haven't found the Javascript intrusive here.

Sounds like an exaggeration since I'm using it to post on HN right now.

How long have you used the Internet with that opinion?

I’ve been using the Internet since the mid 1990s. It peaked around 1998-2002. It took me a while to realize what crap it had become. These days, I stick to native apps for Maps, FB, mail, etc., plus websites that look like they’re out of the 1990s (e.g. HN, Westlaw, Google). To the extent I read the news, I get it from subscriptions on my iPad. (Mostly I just wait for my wife to fill me in based on podcasts she listens to.)

Every now and then I’ll wander into the regular Internet for something (e.g. HN links some article). Usually it’s a jarring and deeply unpleasant experience.

Consuming written content on the Internet got materially worse (I might dispute how much worse, but it's a material difference).

Consuming any other kind of content on the Internet has gotten so much better it's hard for me to imagine, in the late 1990s, predicting how good it is. I had a laptop hanging 1 hop off a default-free peering core router and never would have expected this network to replace all of radio and television.

Similarly, creating any kind of content is drastically better than it has been at any other point in the history of the Internet. For 90% of what I need a word processor or spreadsheet for, free Google Apps is more convenient than MS Office on my laptop; I'll use them simply to avoid launching the app. I'm still in native apps to draw diagrams, but I bet I won't be in 5-10 years.

Finally, with respect to written content, I think you take the good with the bad, too. Specifically: in the 1990s, every news source in the world wasn't crudded up with 100MB of Javascript grinding your machine to a halt, but there was much less news on the Internet. Add the Kindle Reader back into the mix and I think we're out ahead even on this axis.

And don't forget: Flash is dead.

On balance, I think we're much better off than we used to be.

Text is really important to me. Video is nice, but I kinda was okay with cable. If I could go back to cable in exchange for JS-less everything else, I’d do it. And really, I guess I should clarify I mean the “web,” not the Internet. Obviously having 100 Mbps on your phone is better than 256 kbps into your house. And I know the apps I use (including Apple TV apps) leverage those improvements. But I don’t think video is an appropriate medium for the web, because it’s usually an impoverished way to communicate information. For example, I see lots of product reviews these days are posted as videos. I hit the back button as soon as I see that.

As to content creation, I guess I don’t see the point. If you told me in 1998 that two decades from now I’d be using a word processor that wasn’t any more capable than Word 97, simply more resource intensive, I would’ve thought that to be a pretty bleak prognostication.

What if someone told you it would be free?

If you’d told me that “crap for cheap” would’ve migrated from processed food and Chinese imports to the software industry, I would’ve thought that to be pretty bleak. “In the future, you’ll be using the software equivalent of a Twinkie.”

I bought a copy of BeOS back in the day, and avidly followed the newsletters: https://www.haiku-os.org/legacy-docs/benewsletter/Issue2-47..... An honest exchange of currency for useful software; plain HTML content. I miss that tremendously.

Having a fully synced writing environment that “just works” on all of my various devices & places I write is pretty impressive to me. I’d pay a fair bit of cash for it. But it’s free! It’s easy to be cynical about that, but the licenses I paid for in the 90s got me nowhere near that.

We're still trying to do that with the Haiku project, though we don't have time to write nearly as comprehensive engineering notes as BeOS did. Every now and again, for major feature merges, we get a writeup (e.g. https://www.haiku-os.org/blog/axeld/2015-07-17_introducing_l...).

The quality of most content is so poor it’s almost physically painful though. Clickbait is pervasive, divisive trolling is ubiquitous and it infects everything. Look at how hard this site has to work day in, day out, just to keep the rot at bay.

There’s an ocean of content, but most of what’s actually born online is an aggressive waste of time, designed to maximize engagement and nothing else. There are 10 minute+ YouTube videos on topics that could be better encapsulated in a paragraph of the written word, but of course it’s easier to monetize the video. The pervasive online business model is industrial-scale invasion of privacy, news media is teetering, blogging is suffering, forums are mostly dead.

A majority of the population visits a handful of sites like Facebook and Google, the latter of which wants to subvert email. Brandolini’s law rules, and has been turbocharged with the power of a thousand retweets and a million echo chambers. White supremacists and all other manner of kooky fucks finally have the global audience they’ve always dreamed of. Twitter and other platforms have fueled an outrage culture with the soul of a jackal and the attention span of a gnat.

I do like streaming content, but it seems like a bad trade off.

Oh yeah, and we have the security disaster that is the IoT and the resulting botnets.

It's hard for me to take seriously the idea that we're in the midst of a security disaster today; I got started professionally in the mid-1990s, during an era when virtually every computer system on the entire Internet was riddled with stack overflows. It was so easy to exploit memory corruption in 1995 that you could sometimes write a code execution exploit blind.

> How do I pretend to be from the EU?

1. Move here and get a job. I have no idea what the visa system is like for your country, but if you work in tech, that's a good start.

2. Get a VPN

Do you genuinely believe anyone should be regulating the use of modern technologies on the web like that, aside from stuff related to accessibility?

No I’m being glib. I’m not much of a believer in regulation. If the people want the Internet to be crap it should be crap.

Their EU site only gives access to a limited number of articles. Shame.

Tried to open the twitter link. It asked me to agree to their terms to continue to use the service. Answered no. Now the link just says "Oops, something went wrong. Please try again later." (the link is https://twitter.com/i/flow/consent_violation_flow)

Gotta love the GDPR, a nice way to get rid of all the "social media" crap and hundreds of ad trackers that have infested every corner of the internet.

Archived copy will hopefully work better:


The decline button "works" for me, but just takes you to a page where you can delete your Twitter account.

Looks like this is for all Gannett-owned newspapers, including my local paper :D

And that's why people use ad blockers.

What is more scary to me is 2 different versions of the same website for 2 different locations!

Happens all the time (besides the obvious i18n), especially with stuff that has different regulations per country, like gambling.

Yep. They also block countries until they can be sure they can abide by their regulations.

And then we have many news websites that display a different version with different content to every user.

That's why I love Hacker News - everyone sees the same thing.

google.com, google.fr, etc. :)

Nope, those are actually the same site now (they used to be different). If you want to change your language, you have to do it from settings (although google.fr will give you a button to change your language to French).
