

Thoughts on Privacy: The right way forward - kashifzaidi1
http://kashif.io/thoughts-on-privacy

======
schoen
Many of these discussions happened in the 1970s as people first became aware
that large databases (then often called "databanks") were being built to store
lots of personal information, and that information from one database could be
combined with information from another via a database join. That raised the
specter that information originally collected for one purpose could come to be
used for a very different purpose.
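That linkage mechanism can be sketched in a few lines. This is a hypothetical illustration (the table names, the SSN key, and the values are invented): two record sets collected for unrelated purposes, joined on a shared identifier to produce a combined profile that neither was collected to provide.

```python
import sqlite3

# Two hypothetical "databanks", each collected for a different purpose.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE payroll  (ssn TEXT, employer TEXT, salary INTEGER);
    CREATE TABLE pharmacy (ssn TEXT, prescription TEXT);
    INSERT INTO payroll  VALUES ('123-45-6789', 'Acme Corp', 52000);
    INSERT INTO pharmacy VALUES ('123-45-6789', 'lithium');
""")

# A join on the shared identifier merges the two records into a
# profile that neither database was originally built to provide.
rows = con.execute("""
    SELECT p.employer, p.salary, rx.prescription
    FROM payroll p JOIN pharmacy rx ON p.ssn = rx.ssn
""").fetchall()
print(rows)  # [('Acme Corp', 52000, 'lithium')]
```

The point of the 1970s worry is visible in the output: a salary record and a prescription record, each innocuous alone, become sensitive once keyed together.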

An important result of those discussions at the time was the Fair Information
Practices

[https://en.wikipedia.org/wiki/FTC_Fair_Information_Practice#...](https://en.wikipedia.org/wiki/FTC_Fair_Information_Practice#History_and_development)

which came out of two U.S. government studies on privacy during the 1970s.

These principles include things that are quite similar to what this article
proposes, including notice (of what's being collected), choice (about whether
it should be collected), and access (to know what others know about you).

The Fair Information Practices formed the basis for European data protection
legislation, which has now been implemented in some form everywhere in Europe
as a result of the EU Data Protection Directive and other legal instruments.
(Of course the Europeans reformulated it and did not directly enact the
original U.S. Fair Information Practices into law.) An interesting consequence
of that is that most Europeans, at least in theory, have quite extensive
rights against information collection that violates these rules (at least by
the private sector).

Many Europeans have been able to exercise these rights in practice to
challenge data collection and retention by private companies, to see what the
companies know about them, or to demand that companies delete information
about them. Some of those examples have been mentioned here on Hacker News;
the one that I found the most interesting was when Malte Spitz got his cell
phone location records from Deutsche Telekom by exercising his right of access
under German data protection law.

[http://zeit.de/digital/datenschutz/2011-03/data-protection-m...](http://zeit.de/digital/datenschutz/2011-03/data-protection-malte-spitz)

Anyway, I think these rights are quite similar to what this article is
proposing, so I wanted to point out that there is a long history of similar
proposals, and that the idea that technology was taking away people's
practical right to control over data about them is something that's been a
concern for some decades.

By the way, the United States never enacted a comprehensive data protection
law, despite being where the Fair Information Practices were first cooked up.
They were never given the force of law in a general way, as they were in
Europe; here in the U.S. companies can, in general, collect and use data in
ways that would be considered "unfair" elsewhere. The main consideration in
the U.S. is that the companies can't lie in their privacy policies, but there
are few substantive restrictions on the private use and disclosure of data,
outside of particular regulated sectors (like credit reporting with the FCRA, health
care with HIPAA, and education with FERPA). There is extremely strong industry
opposition to a generally-applicable data protection law here.

Some sore points about data protection where it did get implemented into law:

① European data protection law is leading to some weird and counterintuitive
results, recently including the Google v. AEPD/González case where Google was
ordered to remove links to old disparaging (but accurate) information about
individuals when users search for their names, based on the idea that Google
was "processing" personal data about those individuals in an inappropriate
way.

② Data protection often has major loopholes for government collection of
information. (Government agencies, including police and spy agencies, very
often _are_ subject to privacy and data protection laws, but the application
of those laws often means just that those agencies are supposed to deliberate
about whether they think what they are doing is OK; if so, they can carry on.)

③ As this article and this discussion seem to suggest, notice and consent have
become more difficult where companies expect to use large amounts of personal
data routinely. The amount of consenting that users would be asked to do and
the frequency with which they are asked to do it could become quite annoying
and also decrease the likelihood that users will take the time to understand
what they are being asked to consent to. (We can see this to some extent with
the cookie notices on European web sites, asking users to consent to being
tracked by cookies. Contrary to the mainstream view of web developers, I think
cookie tracking is a serious privacy risk that users should still worry about
in 2014 and that addressing this risk is pretty important. But we can see that
the warnings haven't necessarily made most users better-informed or more
cautious about cookie tracking, and many users are probably kind of annoyed
that every site they use is warning them about cookies.)

As a result of the last point, I heard a Microsoft executive in a speech say
that he thought notice and consent were now _obsolete_ and ought to be
rethought. (This statement isn't super-shocking to Americans, who might not
even have heard about Fair Information Practices in the first place, but it
could have been something of a scandal if he had said it in Europe.)

The executive gave the example of the number of different entities that are
receiving user information when a user interacts with a major web site, and
the number of different privacy policies that would be applicable to these
interactions. He suggested that few users would even read the policy of the
site that they're trying to visit, let alone the policies of third parties
(that might receive user data as a result of embeds or as a result of business
partnerships).

I thought that preventing and discouraging some of those data flows was
actually a _goal_ of privacy protection. In fact, a lot of privacy software,
including software recently developed by my colleagues, is actively trying to
stop them, based on the idea that users don't know about them and that they
aren't in the user's interest.

[https://www.eff.org/privacybadger](https://www.eff.org/privacybadger)

------
spindritf
What if they don't care? Not because it's "designed" or "rigged" but because
privacy is not something they value?

Humans lived in small groups, then villages pretty much until yesterday,
easily for most of our species' history (99%+ of ~200k years). People knew
everything about each other. And then gossiped to make sure nothing went
unnoticed.

Contrary to the article, we probably still reveal less than we used to. People
would bathe in semi-public places. That's not common outside of vacation spots
any more.

Yes, advertisers bank on our nature. Gossip blogs bank on it. But they didn't
make us that way.

OK, so the nature of information collection changed but we don't feel it. Some
people can rationally appreciate it but not casual Internet users. And most
nightmare scenarios are still hypothetical.

 _Imagine being the subject of public hate because you expressed an unpopular
view when you were young._

Many people, even here, are fine with that. On Twitter it's practically a part
of regular programming.

~~~
pknight
I think it's a flaw to bring village life in pre-modern times into a discussion
of the meaning of privacy. For one, in that context what information one
shares is apparent, and the impact of the information is more readily
understood. This is not the case when large volumes of data are shared with
unknowable powerful parties and used in unmonitored ways, with effects that one
cannot predict or control.

Secondly, we do not live in a small community; we have an incredibly
interconnected world. Democracy does not function well when people lose the
capacity to moderate, to some degree, access to information about themselves,
their thoughts and communications, and when other parties may use that
information against the individual. It stunts individual expression, and that
is pretty fundamental to living in a modern society.

It's incredibly naive to undervalue the need for and importance of privacy.
Nightmare scenarios are not hypothetical to a lot of people; they are playing
out right now, each and every day.

------
lsh123
I believe Windows Vista tried to do exactly that: annoy the user with
security-related screens all the time. This didn't go well because people just
clicked "yes" without reading, and at the same time everyone was absolutely
frustrated with these crappy alerts.

~~~
kashifzaidi1
Yes, good point. That would be traumatising to handle. All I am saying is that
it's a debate we need to have, and it cannot be left in the dark corners of the
modern web. This is just a proposal which imho covers all the bases; what we
choose to do with it may change according to local geographical conditions and
mindsets.

~~~
lsh123
There is no need to have a debate - the current internet infrastructure does
not allow privacy. Even Tor and other tools don't let you stay under the radar.
And it is not really clear if there are any better ways. At least I haven't
seen any research or proposals that would provide users with better privacy
online.

------
paul
"Imagine being the subject of public hate because you expressed an unpopular
view when you were young."

Rather than being able/forced to hide all of my unpopular opinions, I would
prefer to live in a society that didn't hate me for having different ideas.

Privacy is important, but often the need is rooted in dysfunction elsewhere.

~~~
kashifzaidi1
Yes, I second that to the letter. If only we lived in that perfect society.

~~~
paul
Things do get better. Being gay used to be a major crime (and thus had to be
hidden). Now gay marriage is becoming legal in much of the western world. I'm
optimistic that we can continue building a more open and tolerant society.

------
miguelrochefort
Privacy is simply not sustainable.

Imagine all the lost opportunities, all the worry, all the steps some people
take to conceal the truth. It can't go on forever, and it will become
increasingly expensive to keep the privacy you had in the past, simply because
technology makes transparency cheap and ubiquitous.

Designing a society that relies on the secrecy of certain information is a
recipe for disaster. Passwords, credit cards, etc. It won't be long before we
simply can't keep any of these secrets, and we will have to switch to a better
identification system.

I've had enough of worrying about what people might think if they encounter the
truth. I don't want to lie anymore. I don't want to keep and remember secrets
anymore. I don't want to watch my every step and hide behind 7 proxies when I
surf the web.

We're due for a paradigm change toward transparency, and the earlier the
better. Privacy and secrecy only lead to deception and inefficiency.

------
ape4
But how can it be done for every transaction? A lot of the time the service
wants to keep info about you to be used later. So if you say "no" 99 times but
"yes" once ... they have you.

~~~
kashifzaidi1
Just to reiterate, I am saying we should change the mindset with which we
handle privacy. They can have you in a hundred different ways without you
noticing. All I am saying is that we (as users) should make them care enough to
be on our side and not collect info without telling us, in the best UX
possible.

