Humans lived in small groups, then villages pretty much until yesterday, easily for most of our species' history (99%+ of ~200k years). People knew everything about each other. And then gossiped to make sure nothing went unnoticed.
Contrary to the article, we probably still reveal less than we used to. People would bathe in semi-public places. That's not common outside of vacation spots any more.
Yes, advertisers bank on our nature. Gossip blogs bank on it. But they didn't make us that way.
OK, so the nature of information collection changed but we don't feel it. Some people can rationally appreciate it but not casual Internet users. And most nightmare scenarios are still hypothetical.
Imagine being the subject of public hate because you expressed an unpopular view when you were young.
Many people, even here, are fine with that. On Twitter it's practically a part of regular programming.
Secondly, we do not live in a small community; we live in an incredibly interconnected world. Democracy does not function well when people lose the capacity to moderate, to some degree, access to information about themselves, their thoughts and their communications, because other parties may use that information against the individual. It stunts individual expression, and that is pretty fundamental to living in a modern society.
It's incredibly naive to undervalue the need for and importance of privacy. Nightmare scenarios are not hypothetical to a lot of people; they are playing out right now, each and every day.
I'm not sure, but I think people want privacy more now because they are more concerned about the consequences of faceless people ruining their lives for reasons they don't know or understand than about actual local privacy.
I live in a UK village, and I want privacy because I am not an American and I don't expect to be subject to US law, which from my POV is a very disturbing and frightening thing. OK, we Brits need to deal with our UK/US relationship at a political level, but even so: I am happy for most of my life to be known to those around me, but I sure as hell don't want some Yank spook having anything on me whatsoever. I assume Americans don't want British spooks having info on them either.
Overall though, I do think it's new, uncontrolled consequences people are scared of.
Generally, the core problem that the more thoughtful people who are concerned about privacy point to is how certain actors (governments, corporations, doxing mobs) are being empowered at the expense of individuals. If that is one's concern, it might make sense to get to the root of the problem and work towards limiting the power of these actors directly.
Power imbalance is something that social animals designed to live in small groups would seem to care about deeply.
Rather than being able/forced to hide all of my unpopular opinions, I would prefer to live in a society that didn't hate me for having different ideas.
Privacy is important, but often the need for it is rooted in dysfunction elsewhere.
Imagine all the lost opportunities, all the worry, all the steps some people take to conceal the truth. It can't go on forever, and it will become increasingly expensive to keep the privacy you had in the past, simply because technology makes transparency cheap and ubiquitous.
Designing a society that relies on the secrecy of certain information is a recipe for disaster. Passwords, credit cards, etc. It won't be long before we simply can't keep any of these secrets, and we will have to switch to a better identification system.
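One direction for such an identification system is to stop relying on long-lived static secrets at all. As a minimal illustrative sketch (not a production design), here is a Lamport-style one-time password scheme using only a hash function: the verifier stores a value that cannot be replayed, and each token the client reveals becomes useless the moment it is used. The function names (`enroll`, `client_token`, `verify`) are my own, chosen for clarity.

```python
import hashlib

def h(data: bytes) -> bytes:
    """One application of the hash function."""
    return hashlib.sha256(data).digest()

def enroll(seed: bytes, n: int) -> bytes:
    """Client-side setup: hash the seed n times and hand only the
    final value to the server. The seed never leaves the client,
    and the stored value alone cannot be used to impersonate anyone."""
    v = seed
    for _ in range(n):
        v = h(v)
    return v

def client_token(seed: bytes, n: int) -> bytes:
    """Produce the next one-time token: the (n-1)-th hash of the seed."""
    v = seed
    for _ in range(n - 1):
        v = h(v)
    return v

def verify(stored: bytes, token: bytes) -> tuple[bool, bytes]:
    """A token is valid iff hashing it once yields the stored value.
    On success the server replaces its stored value with the token,
    so a captured token cannot be replayed."""
    if h(token) == stored:
        return True, token
    return False, stored
```

The point of the sketch is the asymmetry: a breach of the server's database, or an eavesdropper on the wire, yields nothing that authenticates a future session — unlike a password or a credit card number, whose entire security rests on it staying secret forever.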
I've had enough of worrying about what people might think if they encounter the truth. I don't want to lie anymore. I don't want to keep and remember secrets anymore. I don't want to watch my every step and hide behind 7 proxies when I surf the web.
We're due for a paradigm shift toward transparency, and the earlier the better. Privacy and secrecy only lead to deception and inefficiency.
An important result of those discussions at the time was the Fair Information Practices, which came out of two U.S. government studies on privacy during the 1970s.
These principles include things that are quite similar to what this article proposes, including notice (of what's being collected), choice (about whether it should be collected), and access (to know what others know about you).
The Fair Information Practices formed the basis for European data protection legislation, which has now been implemented in some form everywhere in Europe as a result of the EU Data Protection Directive and other legal instruments. (Of course the Europeans reformulated it and did not directly enact the original U.S. Fair Information Practices into law.) An interesting consequence of that is that most Europeans, at least in theory, have quite extensive rights against information collection that violates these rules (at least by the private sector).
Many Europeans have been able to exercise these rights in practice to challenge data collection and retention by private companies, to see what the companies know about them, or to demand that companies delete information about them. Some of those examples have been mentioned here on Hacker News; the one that I found the most interesting was when Malte Spitz got his cell phone location records from Deutsche Telekom by exercising his right of access under German data protection law.
Anyway, I think these rights are quite similar to what this article is proposing, so I wanted to point out that there is a long history of similar proposals, and that the idea that technology was taking away people's practical right to control over data about them is something that's been a concern for some decades.
By the way, the United States never enacted a comprehensive data protection law, despite being where the Fair Information Practices were first cooked up. They were never given the force of law in a general way, as they were in Europe; here in the U.S. companies can, in general, collect and use data in ways that would be considered "unfair" elsewhere. The main constraint in the U.S. is that companies can't lie in their privacy policies, but there are few substantive restrictions on the private use and disclosure of data outside of particular regulated sectors (like credit reporting with the FCRA, health care with HIPAA, and education with FERPA). There is extremely strong industry opposition to a generally-applicable data protection law here.
Some sore points about data protection where it did get implemented into law:
① European data protection law is leading to some weird and counterintuitive results, recently including the Google v. AEPD/González case where Google was ordered to remove links to old disparaging (but accurate) information about individuals when users search for their names, based on the idea that Google was "processing" personal data about those individuals in an inappropriate way.
② Data protection often has major loopholes for government collection of information. (Government agencies, including police and spy agencies, very often are subject to privacy and data protection laws, but the application of those laws often just means that those agencies are supposed to deliberate about whether they think what they are doing is OK; if so, they can carry on.)
③ As this article and this discussion seem to suggest, notice and consent have become more difficult where companies expect to use large amounts of personal data routinely. The amount of consenting that users would be asked to do and the frequency with which they are asked to do it could become quite annoying and also decrease the likelihood that users will take the time to understand what they are being asked to consent to. (We can see this to some extent with the cookie notices on European web sites, asking users to consent to being tracked by cookies. Contrary to the mainstream view of web developers, I think cookie tracking is a serious privacy risk that users should still worry about in 2014 and that addressing this risk is pretty important. But we can see that the warnings haven't necessarily made most users better-informed or more cautious about cookie tracking, and many users are probably kind of annoyed that every site they use is warning them about cookies.)
As a result of the last point, I heard a Microsoft executive in a speech say that he thought notice and consent were now obsolete and ought to be rethought. (This statement isn't super-shocking to Americans, who might not even have heard about Fair Information Practices in the first place, but it could have been something of a scandal if he had said it in Europe.)
The executive gave the example of the number of different entities that are receiving user information when a user interacts with a major web site, and the number of different privacy policies that would be applicable to these interactions. He suggested that few users would even read the policy of the site that they're trying to visit, let alone the policies of third parties (that might receive user data as a result of embeds or as a result of business partnerships).
I thought that preventing and discouraging some of those data flows was actually a goal of privacy protection. In fact, a lot of privacy software, including software recently developed by my colleagues, is actively trying to stop them, based on the idea that users don't know about them and that they aren't in the user's interest.