
The Potemkinism of Privacy Pragmatism - willvarfar
http://www.slate.com/articles/technology/future_tense/2014/09/data_use_regulation_the_libertarian_push_behind_a_new_take_on_privacy.single.html
======
vezzy-fnord
There need to be restrictions on both, conceptually. An actual implementation is
yet to be proposed. Relying purely on data-usage reform is a pipe dream, just
as is the ideal of an intelligence agency that works entirely within the
framework of the law. It didn't work for the Church Committee, and the whole point
of an intelligence agency is that it will do borderline or outright illegal
acts (black bag operations, disinformation, psyops, etc.) to further state
interests. Almost inevitably, this includes domestic surveillance, which the
USA has been conducting en masse since at least 1945.

------
qwerta
Europe (the Czech Republic) already has several such limitations, and they work
well. It is illegal to cross-reference certain databases, and so on.

But the limitations usually apply only to commercial companies; state agencies
have several exemptions. What I really miss are limitations on government
usage of data. We already have:

* national medical registry (with all prescription purchases),

* public transport ID cards,

* bank history and financial records,

* cell phone location history (tower-based, ±1 mile),

* phone calls registry...

And 2% of the population has unrestricted access to all of this. In theory you
need court permission, but in practice it is just a few keystrokes away...

------
bowlofpetunias
The foundation of good privacy regulation is the definition of ownership. All
data about you belongs to you, no matter who collected it, why, or how.

If you don't own your data, you don't have any privacy.

Yes, that is very hard to regulate properly: it leaves a lot of fair-use grey
areas, and with the rapid pace of technology it leads to hastily created
regulation like the "cookie law" or the right to be forgotten. (Although in
both cases the industry deliberately made a circus of the implementation, and
the latter in particular is now backfiring hard on Google.)

But the bottom line is, if the law doesn't recognize your fundamental
ownership of your data, you have no privacy and you never will, no matter what
other forms of legislation and self-regulation get implemented in the areas of
collecting and using that data.

The US is one of the few Western democracies that doesn't recognize this
fundamental aspect of privacy, and in the information age, that makes any form
of privacy regulation a complete joke.

Step one should be to put the individual in control of their data at any time.
If people are not informed about what happens to their data _if and when it
happens_ and are not in a position to stop it, they've already lost their
privacy.

Privacy should not be a faux "right" like copyright, something you can just
sign away for someone else to exploit. It's a fundamental civil right, and
without it we have no freedom.

------
anon4
Every time I read that, I'm reminded of Sherlock Holmes. Holmes can look at a
person and tell you everything about them just by deducing from observable
facts and cross-referencing what he knows. He can do that because he is
exceptionally smart and knowledgeable, neither of which is illegal; you can't
forbid people from being smart and knowing things. To me, having a machine do
it for you is a natural extension of your mind, so I'm very much against
outlawing a method of doing something that would not be illegal if you could
do it by being exceptionally smart.

Still, I really don't want corporations to have that kind of power over
people. The difference between Sherlock Holmes deducing that I usually ride a
bicycle to work and a computer system deducing the same thing is that the
computer system then proceeds, fully automatically, to spam me with offers for
bike safety gear and indiscriminately shares that inference with third parties.

So perhaps the answer is not to outlaw data collection outright, but to find a
model in which someone is personally responsible for your data and will
safeguard it and make sure it isn't misused.

Imagine a databank institution that collects and stores your data and then
allows only very limited access to it by certain parties. You sign up with a
databank and pay a monthly fee for your account. Then whoever wants to collect
data from you uploads it to the databank's servers, never stores it locally,
and you have full access to it. Each party that collects data on you can
access the data it has stored on the databank's servers, but not another
party's data without your permission, and you can always revoke sharing rights
or simply disallow someone from storing data about you, effectively turning
off data collection for them.

It might also be a good idea to disallow parties that upload data to a
databank from computing locally with that data. That is, if someone wants to
run an inference over the data they've collected, they upload a program to the
databank, the databank runs it on its own servers, subject to access controls
that you can modify, and they get back only the results of that computation.
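The databank model sketched above can be illustrated with a toy class. This is only a minimal sketch of the idea, not a real system: `DataBank`, `grant`, `revoke`, `upload`, and `run_inference` are all hypothetical names, and real access control, billing, and sandboxing of party-supplied programs are omitted.

```python
class DataBank:
    """Toy model: parties store records about a subject, read back only
    their own records, and compute server-side so raw data never leaves."""

    def __init__(self):
        self._records = {}      # party -> list of records it uploaded
        self._allowed = set()   # parties the subject currently permits

    def grant(self, party):
        """Subject allows a party to collect and use data."""
        self._allowed.add(party)

    def revoke(self, party):
        """Subject turns off data collection for a party."""
        self._allowed.discard(party)

    def upload(self, party, record):
        """Party stores a record; fails once the subject has revoked it."""
        if party not in self._allowed:
            raise PermissionError(f"{party} may not store data")
        self._records.setdefault(party, []).append(record)

    def run_inference(self, party, program):
        """Run a party-supplied program on that party's own records only;
        just the result leaves the databank, never the raw data."""
        if party not in self._allowed:
            raise PermissionError(f"{party} may not compute on data")
        return program(self._records.get(party, []))


bank = DataBank()
bank.grant("target")
bank.upload("target", {"item": "bike helmet"})
bank.upload("target", {"item": "bike lock"})

# The party can count its own records but never sees another party's data.
count = bank.run_inference("target", lambda recs: len(recs))
print(count)  # 2

bank.revoke("target")  # collection is now off; further uploads would fail
```

The key design point is the last method: inference happens on the databank's side, so a collector only ever receives derived results under access rules the subject can change at any time.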

In this model you're the one paying for the safety of your data, and the
people keeping it make money not by giving it away, but by keeping it safe.
And you can't hold people's personal data without being a registered and
regulated databank.

So next time you sign up for a Target card, you give them your databank
account number and have full control over your own data.

~~~
djulius
"Having a machine to do it for you, to me, is a natural extension of your
mind"

To you, just to you, really.

