
Why privacy is important, and having “nothing to hide” is irrelevant - synesso
http://robindoherty.com/2016/01/06/nothing-to-hide.html
======
tobbyb
I think the tech crowd is in denial about their role in surveillance.

We expect professionals to behave ethically. Doctors, and companies working on
genetics and cloning for instance, are expected to behave ethically, have
constraints placed on their work, and face consequences when they behave
unethically.

Yet we have millions of software engineers working on building a surveillance
society with no sense of ethics, constraints or consequences.

What we have instead are anachronistic discussions on things like privacy that
seem oddly disconnected from 300 years of accumulated wisdom on surveillance,
privacy, free speech and liberty, as if to pretend the obvious is not obvious
and delay the need for ethical behavior and introspection. And this from a
group of people who have routinely postured extreme zeal for freedom and
liberty since the early 90's and produced one Snowden.

That's a pretty bad record by any standards, and indicates the urgent need for
self reflection, industry bodies, standards, whistle blower protection and for
a wider discussion to insert context, ethics and history into the debate.

The point about privacy is not you, no one cares what you are doing so an
individual perspective here has zero value, but building the infrastructure
and ability to track what everyone in a society is doing, and preempt any
threat to entrenched interests and status quo. An individual may not need or
value privacy but a healthy society definitely needs it.

~~~
karmacondon
Not everyone agrees with you that the tech sector is contributing to the
building of a surveillance society or police state. There are a lot of people
who have carefully considered the issue and come to the conclusion that
facebook knowing what posts you liked or ad networks knowing which pages your
IP address has visited is not a Bad Thing. It's clear that you don't agree and
all debate is welcome, but I caution you not to trip in your rush to claim the
moral high ground.

I don't think there's any need to rehash the debate here. Simply, I and many
others do not believe that any western government is going to use information
gathered by tech companies to preempt threats to entrenched interests and the
status quo. I've seen the same arguments made here for years, and none of it
is convincing.

It's admirable that you are so certain in your beliefs. If you don't like what
the tech sector is doing, please by all means continue to advocate. Shout it
from the mountain tops, go to work for the EFF. But don't discount people that
legitimately disagree with you as being irresponsible. At least some of us
have made the effort to understand your point of view. The least you could do
is to try to understand ours.

~~~
SomeStupidPoint
> Simply, I and many others do not believe that any western government is
> going to use information gathered by tech companies to preempt threats to
> entrenched interests and the status quo.

It's simply hard to take your stance as one made in good faith.

The US government has a long history of using its national police, the FBI, to
infiltrate and subvert domestic political movements that the powers that be
found unpleasant -- including using their police powers against modern groups
such as the Occupy movement.

Further, we know that the US government has used records held by tech
companies to create massive cross-referenced databases of people, including
domestic activities. The recent leaks about surveillance programs have made
that abundantly clear.

Your position is literally that an organization with a history of doing this
kind of activity won't use the technology we already know the government
possesses to keep doing the same thing.

So I think there is a need for you to rehash the debate here, because it's not
clear how you sincerely hold that position.

Because rather than a rational view, what you describe sounds like irrational
denial.

~~~
karmacondon
You're saying, "We should not trust the government because they did things
that I didn't agree with in the past". This seems like an unfair standard to
hold any person or group of people to. I would be unhappy if people said "I
think that karmacondon has made mistakes in the past, so he shouldn't be
trusted to do his job ever again."

I understand what you're saying, and I think I get where you're coming from.
But like the GGP post, you're begging the question and assuming that your
beliefs are so correct that anyone who disagrees with them must be insincere.

I don't think there's anything inherently wrong with the government monitoring
potentially criminal groups or building databases. That's what we pay them to
do. If they get out of hand then we, the people, will deal with it.

~~~
jacquesm
> I would be unhappy if people said "I think that karmacondon has made
> mistakes in the past, so he shouldn't be trusted to do his job ever again."

It's not about mistakes. Mistakes are - usually - a sign that someone needed
to learn. They do not as a rule include wanton intent.

And if a person were to make too many mistakes then they probably should not
be trusted.

> I understand what you're saying, and I think I get where you're coming from.
> But like the GGP post, you're begging the question and assuming that your
> beliefs are so correct that anyone who disagrees with them must be
> insincere.

No, that's the opposite. You have beliefs that you state are so correct that
they stand on their own, in spite of a bunch of historical evidence to the
contrary, starting roughly at the time we invented writing and going all the
way into the present. That's a pretty gullible position.

> I don't think there's anything inherently wrong with the government
> monitoring potentially criminal groups or building databases. That's what we
> pay them to do. If they get out of hand then we, the people, will deal with
> it.

Potentially criminal groups: everybody.

You're apparently one of the people where the 'fear' button has been pressed,
don't let your fear get the better of you.

Btw, I note that you write all these 'reasonable disagreement' things from the
position of an anonymous coward which makes me think that maybe you do realize
the value of your privacy after all.

~~~
karmacondon
We both made the mistake of discussing the US government like it's a single
entity. We're talking about hundreds or thousands of individuals spanning
multiple generations. I'm not going to worry about government metadata
collection because of something that happened during the Eisenhower
administration. Each person and group of people should be evaluated based on
their own behavior and merits, not the reputation of the organization that
they are affiliated with.

It looks to me like the US agencies, and the Five Eyes in general, are capable
people who are just doing their jobs. They aren't bothering me and I'm not
bothering them. The past actions of the US government or hypothetical
scenarios based on historical examples just aren't very convincing. _Anything_
could happen. But I'm not going to concern myself with it until I see some
evidence.

~~~
zAy0LfpBZLC8mAC
> Anything could happen. But I'm not going to concern myself with it until I
> see some evidence.

So, you are drinking battery acid until you see evidence that it's not good
for you?

Or do you maybe take the evidence of other people's experience into account?

If so, how about you take into account the evidence of hundreds of societies
that have dealt with massive surveillance (where "massive" still was "almost
none" in comparison to today's and tomorrow's technical possibilities) and
with oppression (those two empirically tend to go hand in hand).

If those are your sincere beliefs, I really would recommend you pick up a few
books about recent German history. How Hitler came to power, how the state
functioned once he was in power, how people tried to get rid of him but
failed, and what it took to finally remove him. And then continue with the
history of the GDR, how surveillance by the Stasi influenced everyday life,
how people tried to reform the political system but failed, and what it took
to finally reunite Germany.

The history of other countries might teach you similar things, but Germany is
a good example because it is culturally a rather "western country", so it's
easier to recognize similarities.

------
raminf
The issue with the 'nothing to hide' argument is that it puts the burden of
proof and determination of whether something is 'hide-worthy' on the target of
the inquiry.

If you subscribe to the 'presumed innocent' premise of the law
([https://en.wikipedia.org/wiki/Presumption_of_innocence](https://en.wikipedia.org/wiki/Presumption_of_innocence))
then the burden of proof is on the inquisitor.

Either you believe in presumed innocence or you don't. Pick one.

~~~
mverwijs
My response to the 'nothing to hide' argument is: "Why are you so scared of
things that are hidden?"

That puts _them_ on the defensive, since now they have to justify their fear
of the unknown. And then we're onto the real topic: fear.

~~~
Casseres
Someone who pushes the "nothing to hide" argument probably wouldn't understand
the implication of your response. I suggest a different approach: say "Okay,
in that case what prescriptions are you taking, how much money do you have in
your bank account, and how often do you have sex? After all, you have nothing
to hide, right?"

That also puts them on the defensive side of the argument.

~~~
m1sta_
Are those really sensitive questions for most people?

~~~
Casseres
For some people, no. I tried to cover several different bases, but you get the
idea: ask sensitive personal questions. Maybe the money one should be about
credit card debt.

------
x5n1
Privacy is less important than the ability to trust that your government won't
do blatantly illegal things like putting innocent people behind bars or
stealing their money or property when it knows they are innocent. The biggest
problem with America is that the government can not be trusted to follow the
rules, their own rules.

I don't live in the US, but the stuff the government, whether local, state, or
federal, gets away with is very scary to me. What scares me even more is how
the United States encroaches on everyone else's legal systems. That's the
underlying problem: governments that are at times actually out to get people,
without much cause and breaking all sorts of rules, are what's scary.

The type of soft totalitarianism that exists and passes as commonplace is
very scary. And that's really the people you should be scared of, and that's
who you really want to protect your information from. Your run of the mill
government that's actually trying to do a good job and not break its own
rules, that sort of government like my government, scares me a lot less.
Despite the fact that they encroach on my privacy. I know heads are going to
roll if it comes out that they do things that are blatantly wrong or abusive
with the information that they are collecting.

Not so in the US. They always have a half-ass lie that still somehow passes
muster.

~~~
opo
>...The biggest problem with America is that the government can not be trusted
to follow the rules, their own rules.

Isn't that the problem with all governments? Like they say, "Eternal vigilance
is the price of liberty."

~~~
robmcm
Governments are also made up of individuals.

------
kmonad
Whenever I read one of these articles I am wondering why the examples of WHY
mass surveillance affects ME as average Joe negatively have to be so weak /
contrived:

For example, imagine someone convinced by the argument "nothing to hide
nothing to fear". Would this example convince them that in fact they do have
to fear something? "You might think twice about contacting or meeting people
(exercising your freedom of association) who you think might become “persons
of interest” to the state". I do not think so, after all, average Joe does not
know such people.

The solution, in my experience when talking to sceptical people not convinced
of the risks, is talking about money. Imagine someone with the kind of
knowledge we are talking about with mass surveillance. And imagine this person
could inform your insurance companies. Do you still think that you have
nothing to hide? One then must only show that data is never "safe" and could
always be "leaked" to make a very simple, everyday example of why it is not in
my (average Joe's) interest to be continuously monitored.

~~~
karmacondon
Still not convincing, at least not to me and many of the people that I know.
I'm pretty sure my insurance company has already calculated my premiums based
on my demographics and claim history. I may not have told them "I have more
than five drinks almost every weekend", but they know. They've been making
inferences from Big Data since back when it was just "data". And practically,
I don't think most insurance companies have an infrastructure set up for
handling snitching, especially not the smaller ones.

I'm really still waiting to hear a convincing argument as to why I have
something to hide, ideally something practical as opposed to hypothetical or
philosophical.

~~~
zAy0LfpBZLC8mAC
Imagine some type of insurance that typically costs 100 USD/month in premiums
per person and where the insurance company typically pays out on average 90
USD/month for damages per person. Now, imagine that one insurance company has
figured out a way to detect a certain group of people who will only have
damages of on average 50 USD/month, a group which makes up about 5% of the potential
customers, and detecting which of their (potential) customers is in that group
costs on average 0.10 USD/month per customer. Do you agree that they will
start offering lower premiums for that group of people, because that will
attract new customers, and thus increase income?

And what if one company does this - what will other companies do? Will they
keep the same price for that group as for everyone else? Then that group will
leave for the company that's cheaper for them, right? Leaving the other
companies with the higher-risk customers, right? So, they will have to pay out
more for damages, right? Now, will they just go bankrupt? Or will they
increase premiums to cover the costs?
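
The arithmetic above can be sketched as a toy model. All figures are the hypothetical numbers from this comment (100 USD premium, 90 USD average payout, a 5% low-risk group averaging 50 USD, 0.10 USD detection cost), not real market data:

```python
# Toy model of the adverse-selection spiral described above.
PREMIUM = 100.0          # USD/month charged to everyone today
AVG_PAYOUT = 90.0        # USD/month paid out on average per person
LOW_RISK_SHARE = 0.05    # fraction of customers in the low-risk group
LOW_RISK_PAYOUT = 50.0   # USD/month average damages for that group
DETECTION_COST = 0.10    # USD/month per customer to detect the group

# Step 1: the discriminating insurer's margin on a low-risk customer
# at the old premium. It can undercut the market price for this group
# substantially and still profit.
margin_low_risk = PREMIUM - LOW_RISK_PAYOUT - DETECTION_COST

# Step 2: once the low-risk 5% leave a competitor, that competitor's
# remaining pool is riskier. Solve 0.05*50 + 0.95*x = 90 for x:
high_risk_payout = (AVG_PAYOUT - LOW_RISK_SHARE * LOW_RISK_PAYOUT) / (1 - LOW_RISK_SHARE)

# Step 3: the competitor must raise premiums above the new average
# payout (about 92.11 USD/month) or operate at a thinner margin.
print(f"low-risk margin at the old premium: {margin_low_risk:.2f} USD/month")
print(f"average payout after the low-risk group leaves: {high_risk_payout:.2f} USD/month")
```

The point of the sketch is only that the numbers force the dynamic: as long as detection is cheaper than the payout gap, discriminating is profitable, and every competitor is pushed to follow or absorb the riskier pool.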

No one says that insurance companies aren't making inferences from data. It's
just that the more data there is available to them and the more powerful
computers and algorithms get, the better they will be able to model risks. And
individual companies won't be able to ignore that, even if they want to. And
it's the exact opposite of what insurance is intended to do: It's intended to
distribute risk. The more exact insurance companies are able to model risks,
the more insurance will become unaffordable to those who need it, and the
cheaper it will become for those who don't need it.

~~~
greeneggs
The insurance industry is fairly heavily regulated (in the US). They can only
charge different rates based on specific factors. This means that your
entirely imaginary (and therefore entirely unconvincing) scenario will remain
entirely imaginary (and unconvincing).

Personally, I would prefer if car insurers could price discriminate more based
on data. I think this would lower rates for me personally, both in the short
term because I try to cultivate safe driving habits, and in the long term
because it would create an incentive for everybody to try to drive more
safely.

~~~
zAy0LfpBZLC8mAC
> This means that your entirely imaginary (and therefore entirely
> unconvincing) scenario will remain entirely imaginary (and unconvincing).

Well, that very much depends on how you look at it. It might be imaginary in
so far as regulation in the US possibly prevents those consequences, which is
great. One might also see it as evidence that collection of personal data has
risks--and that regulation might be one way to deal with those risks, at least
in some cases. After all, this kind of regulation is in effect a prohibition
on collecting certain kinds of personal data, even if the collection in itself
is permissible, as companies won't collect data when they can't use it for
anything anyway.

> Personally, I would prefer if car insurers could price discriminate more
> based on data. I think this would lower rates for me personally, both in the
> short term because I try to cultivate safe driving habits, and in the long
> term because it would create an incentive for everybody to try to drive more
> safely.

Are you sure that it would? Remember that for the insurer, it doesn't matter
whether they calculate your risk (and thus your premium) correctly, what
matters for them is that they aren't worse at calculating risks than the
competition (i.e., the competition can't outcompete them on price or cause
them to be left with a non-representative sample of their risk model), and
that on average, their criteria match reality (i.e., they don't take on risks
that are actually larger than they can pay for). Even if you in fact do drive
more safely than the average driver (as in: at the end of your life, you will
have had fewer/less severe accidents), it might happen that their predictive
models group you in a different category, because characteristics of your
driving behaviour that they use to categorize you are correlated with high-
risk drivers. If insurers don't know how to (economically) measure why your
driving behaviour is safe, it doesn't matter whether it actually is.

Also, the incentive can actually be a problem, exactly because risk models
employed by insurers tend not to be an exact representation of reality. If you
have an incentive structure that does not align with reality, the incentive
can end up promoting harmful behaviour. For example, one obvious proxy for
safe driving habits could be lack of sudden deceleration. It's easy to
measure, and generally, if you pay attention to traffic and drive with
foresight, you usually will not need to brake suddenly as much as a reckless
driver. So, it's probably true both that incentivising people not to brake
suddenly would, as one consequence, have people driving with more foresight,
which should reduce accidents, and that people who don't brake suddenly
generally are a lower risk for the insurer than those who do. However, this
proxy cannot distinguish whether you brake suddenly because you didn't pay
attention--or because someone else didn't pay attention and surprised you. In
the latter case, though, the thing to do to avoid an accident might be to
brake as hard as you can. But that will be seen by your insurance as risky
driving behaviour (which it most of the time is) that comes with a higher
premium, so you have created an incentive for the driver to let an avoidable
accident happen. Note that the driver in question won't think about this for
an hour before deciding what to do, it's a gut reaction that might well be
influenced by having internalized "braking hard costs money".

And also: What if you actually are a really good driver but you enjoy braking
hard? What if you brake hard just for the fun of it, in situations where it's
completely harmless? Is it fair if you have to pay higher premiums for that?
Such incentives that work via proxy measurements of the actual risks tend to
force adherence to a standard of behaviour. I find the idea frightening that
insurance companies might get to dictate "safe behaviour", where the specific
behaviour is not actually necessary for safety; it just happens to be easily
distinguishable from risky behaviour, so behaving differently costs you money,
simply because it's difficult to figure out that your behaviour is not
actually risky.

------
bitL
It pains me to see that power-hungry people used most techies as useful idiots
to implement their own goals - how many politicians with dirty hands are now
getting on the boards of prime tech companies? They finally understood what
technology can offer them. It seems like the end game is about who is going to
control everything - those types can finally see the time when technology is
sufficiently advanced to control every aspect of our lives. It seems like
technology would enable a special caste above the law, with power unlike
anyone before them. Instead of using technology to improve the living
conditions of all, establishing a new, previously unseen, more democratic and
free society, we seem to be hell-bent on preserving all the nasty traits of
previous societies and even doubling down on them through almost complete
control. It seems like some dystopian sci-fi novel is happening now :-(

~~~
marcosdumay
I'm always suspicious of that specific dystopia.

It's not that they aren't trying, or that we are not letting them. The problem
is that technology is very "stubborn", "subordinating" only to those it
chooses to, and those are always almost impossible to predict. No elite was
ever successful at that kind of coup.

------
blitzprog
"This affects all of us. We must care." is not an effective way of convincing
someone.

I personally do not care about privacy. I see no reason why I should.

It's just my opinion. I know other people do but please don't generalize.

~~~
enraged_camel
>>I personally do not care about privacy. I see no reason why I should.

When late 19th century Germany started recording census data, a clerk made the
suggestion that they should also record each person's religion. No one
objected. What could be the harm, right? They were already collecting age,
gender, occupation, etc. so they might as well collect one more thing.

Half a century later, the Nazis were able to use those same historical census
records to identify whose grandparents were Jewish, and therefore who must be
Jewish, which greatly aided in rounding up those people.

So imagine: a piece of information commonly believed to be harmless to reveal
about oneself became the primary method that facilitated one of the greatest
atrocities in human history.

The lesson here is that tomorrow's government may turn out to be very
different than today's. Information you willingly reveal about yourself today,
or don't mind others (such as the government) finding out about you, may be
used against you, your children or their children.

That is why privacy is supremely important.

~~~
vixen99
Yes! Everybody should read
[https://www.schneier.com/blog/archives/2006/05/the_value_of_...](https://www.schneier.com/blog/archives/2006/05/the_value_of_pr.html)
who quotes Cardinal Richelieu "If one would give me six lines written by the
hand of the most honest man, I would find something in them to have him
hanged."

As Schneier puts it: two proverbs sum it up: "Who watches the watchers?" and
"Absolute power corrupts absolutely."

------
bcg1
A tragicomic irony of course is that most "nothing to hide" advocates demand
secrecy and legal cover for their own actions. This sort of doublethink leads
to things like a "Freedom of Information Act" in the US that provides a legal
framework for concealing information that should otherwise be out in the open.

If you apply the "nothing to hide" principle to states' own actions, I can
think of two possible conclusions:

1) It is not true that if you have done nothing wrong, you have nothing to
hide (i.e., there is a legitimate right to privacy)

2) State actors have something to hide, so they must be doing something wrong.

------
Laaw
I have two unrelated thoughts.

"Chilling effect" has always been a profound term for me, because I imagine
the "cold" (numbness really) sensation a human body often senses when
something truly awful (disembowelment/dismemberment) occurs. The body's way of
protecting itself is to go "cold", and in many ways that's exactly the effect
taking place here, as well.

There's also an undeniable part of this conversation that rarely gets
addressed simultaneously, and I'd like to see it sussed out more in concert;
what about the folks who are doing Evil in these private channels? It's
unacceptable to me that TOR gets used for child pornography, and it's
unacceptable to me that my government finds out I'm gay before I come out to
my family.

I don't want to provide those who would do Evil any safety or quarter. I also
want to give people a powerful shield to protect themselves against judgement
and persecution from the public and sometimes the law.

We should talk about achieving _both_ of these goals, but we generally don't.

~~~
ohthehugemanate
Does it bother you that national highways are used by kidnappers, and civic
electricity is used by rapists?

It's infrastructure, so it's all inherently neutral. SSL is used by banks,
protesters, and criminals alike. You can't weaken it for one group without
weakening it for everyone. It's also global: you can't backdoor an IRC client
only for marijuana users in the US, for instance.

So if you get to surveil pedophiles in the US, it means that Saudi Arabia gets
to surveil homosexuals. We're on the same infrastructure.

Also, it's important to recognize that illegal behavior is a critical part of
democratic change. If SSL could discriminate based on intent to break a law,
so we could arrest all lawbreakers, people campaigning for marijuana
legalization would all be in jail, and the law would not be changing. So would
people in
the 60s campaigning for civil rights, and every homosexual in the country.
There is always a grey area period of time in which people break a law because
they don't believe in it. That period of civil disobedience is how laws end up
getting changed. Even (especially) morality laws against things like
sexuality, drugs, or alcohol. It's important to a living democracy that the
police are not a perfect force.

~~~
Laaw
I'm not content simply throwing my hands in the air as everyone else seems to
be. We should talk about other options.

There _might_ be a way to stop pedophiles and kidnappers, and rapists that we
haven't thought of simply because we're not willing to talk about how we could
do it.

Highways have police. Where are the digital police? I'm not sure I prefer such
a thing, but why don't we even discuss it?

------
jkonowitch
This issue always boils down to the LOTR argument for me: the surveillance
power is too great, and no individual or group can or should be trusted with
it, regardless of its actual current or potential future benefits.

The crux of the debate, then, is: where do we draw the line between safe and
unsafe amounts of power?

~~~
enraged_camel
It's not the amount of power that makes it unsafe, but rather its nature. In
order for it to be safe:

1\. It must be granted through democratic means.

2\. It must be under strict oversight by an independently elected or appointed
group that's free from both private conflict of interest and popular pressure.

3\. There must be reliable mechanisms to quickly and efficiently strip said
power away from the authority if they are determined to have used it
irresponsibly.

~~~
fweespeech
#2 doesn't happen with US democracy now at any level. It also inherently
contradicts #1.

------
exodust
Very good, but it's funny how on the "why IPVanish" page he links to, the
first reason given for using a VPN is to watch Netflix from any location! Oh
the horror of limited localised Netflix content. We must protect ourselves.
(Really it is awful, I use a VPN for that purpose too). But the point is, it
doesn't seem popular to hide metadata from ISPs with VPNs. Will it ever be
popular? I'm not so sure. For good or bad, I'm suggesting most people don't
care that their IPs are recorded. Email content is not seen, nor what I type
into this comment form.

Also, when I send an email to my friend "laserpants@something.com", sure the
data captures the send-to email address. But the data doesn't know who
laserpants actually is, nor does the email content get saved. I'm not saying
laserpants can't be found if the law decides to investigate, but I doubt it's
a matter of pressing a button to bring up the real name of laserpants.
Especially if laserpants uses different email addresses and a shared internet.

~~~
tdyen
I think Snowden showed that it is as easy as pressing a button which is why he
went and blurted it out. The NSA were also working to make it easier to track
people.

My take on it is that privacy is dead, or nearly so, and we have to have good
legal protections for who can use what data and when. The privacy arms race
will mostly be won by big government, with lots of resources and enough
willing/foolish patriots (depending on your point of view).

~~~
cm2187
I agree. And the logical next step is that there will inevitably be abuses,
that some of the victims of these abuses will ultimately access power, and
that we will have proper privacy protection after that, like we do for regular
mail or for the physical home.

It is just a shame that we have to go through the whole cycle when we already
know how it will play out. But in a way, the more extensive the surveillance
is, the quicker this cycle will happen. And right now we are in a pretty bad
place already. So let's be optimistic!

------
rplnt
This is an interesting post on the topic from reddit:

[https://www.reddit.com/r/changemyview/comments/1fv4r6/i_beli...](https://www.reddit.com/r/changemyview/comments/1fv4r6/i_believe_the_government_should_be_allowed_to/cd89cqr)

~~~
uptown
The thread under that comment just serves to reinforce the OP's point.

------
blfr
This is correct, but appeals to freedom of speech/expression and association
won't work. People don't believe in these any more, if they ever did. They're
paid lip service at most.

Both professional and amateur politicians are taking notice. "We support
freedom of expression, however" there's always something more important:
safety, inclusivity, respect, diversity, civility...

We have dozens of examples, in the west, with hate speech laws, discrimination
laws, mobs organizing online over some tweet, employers firing people because
of offhand remarks, even opensource projects being assaulted over unrelated
comments made by contributors.

------
hellofunk
The definition of "privacy" differs significantly among generations and
cultures. Compare the Dutch to the Americans, or anyone over 35 with anyone in
their teens or early 20s. There is a smaller overlap in what they all consider
"private" than you'd at first think.

------
hasukimchi
I really like the observation that "I have nothing to hide, so I don't care
about privacy" is equivalent to "I have nothing to say, so I don't care about
freedom of speech".

------
mattlutze
I forget why now, but I was reading this article[0] by Moxie Marlinspike, from
2013, just a few days ago.

Interesting how the same argument can take so long to take hold, and how long
some truths need to be told before they gain the traction needed to make a
change.

0: [http://www.wired.com/2013/06/why-i-have-nothing-to-hide-
is-t...](http://www.wired.com/2013/06/why-i-have-nothing-to-hide-is-the-wrong-
way-to-think-about-surveillance/)

------
logicallee
meh, I want to have my cake and eat it too. (I'm not making fun of someone,
this is actually how I feel, and there is some tension between requirements.)
I don't want any surveillance whatsoever, I want to just be able to do
whatever I want, jeez. To live freely. I shouldn't even think about being
watched.

At the same time, take something like the Dell database that was just stolen,
and criminals starting to commit crimes with it. Then I want courts to be
able to flip a switch and say, you know what, if you're brazenly stealing a
private company's database and calling its customers trying to defraud them,
at some point there is some probable cause to make you stop doing that or
figure out who you are. You're not just going to stay anonymous behind a skype
number while you're defrauding people halfway across the world.

Also I don't want some bitcoin asshole to pay off an old soviet general and
get a nuclear bomb, just because they think it would be a fun troll to blow up
a major city, trololo.

These aren't theoretical concerns - ransomware, kidnapping, all these yucky
things that civilized societies don't have, all happen absent rule of law.

There's a reason there wasn't a period in the Constitution (specifically the
fourth amendment) after the words "The right of the people to be secure in
their persons, houses, papers, and effects shall not be violated." (Extra
points for what _is_ there.)

Even absent an anonymous Internet, way back in the eighteenth century, there
were limits on privacy. Think of it like an operating system - a good kernel
isn't reading my memory contents and slowing me down, but if I start
performing illegal operations I might very well get shut down :)

It's not an easy line to find. Also, I don't want tens of thousands of people
employed doing this crap. It's a minimal thing we need to live safely and
sanely, not some fun snooping. Frankly I don't see why humans even need to be
involved, until crimes start getting committed and the courts are trying to
figure out why or where.

~~~
zAy0LfpBZLC8mAC
> Then I want courts to be able to flip a switch

But you also don't want anyone else to flip the switch, or the switch being
flipped for any other purpose. That might just be impossible. So you might be
better off without the switch.

> You're not just going to stay anonymous behind a skype number while you're
> defrauding people halfway across the world.

Also, for this problem, as for many others, there are many possible solutions
that don't involve surveillance.

> Also I don't want some bitcoin asshole to pay off an old soviet general and
> get a nuclear bomb, just because they think it would be a fun troll to blow
> up a major city, trololo.

So, you would prefer them to use USD cash instead, then?

> These aren't theoretical concerns - ransomware, kidnapping, all these yucky
> things that civilized societies don't have, all happen absent rule of law.

Except they very much do happen in "civilized societies". And sometimes with
the help of the powers of authorities.

> It's not an easy line to find.

No. But it's quite easy to see that the direction we are heading is completely
at the wrong end of the spectrum.

> a good kernel isn't reading my memory contents and slowing me down, but if I
> start performing illegal operations I might very well get shut down :)

Which is very much the opposite of mass surveillance.

> Frankly I don't see why humans even need to be involved,

Humans who see the potential of the collected data will get involved. People
who want to abuse power don't usually wait until someone gives them permission
to.

------
Pharaoh2
Personally, I really think that in this age of knowledge and communication,
encryption is a Second Amendment matter: the right to bear encryption should
fall under it. Hell, the US even classifies encryption as a munition. We
should be using the same argument for encryption that we use for the right to
own guns and form militias.

I wonder whether encryption will be recognized as a right under the Second
Amendment by the courts, if it ever comes to that.

~~~
ploxiln
meh - the second amendment isn't very popular these days, and outside the US
(well of course it doesn't apply but additionally) people really don't seem to
appreciate it

------
blaze33
If you are interested in this topic, I highly recommend you read _Tradition of
Freedom_ by Bernanos (original title in French: _La France contre les robots_
).

Written in 1944, it has a specific passage where he argues against this "but
I have nothing to hide!" argument (only criminals benefit from hiding,
right?). He talks about how a simple citizen who has never had trouble with
the law should stay perfectly free to conceal his identity whenever he likes,
for whatever reason, and laments how this very idea had already died.

The extract is available online [1] in French; the Google translation [2] is
not that good.

[1] [http://www.books.fr/quand-bernanos-predisait-une-societe-
sou...](http://www.books.fr/quand-bernanos-predisait-une-societe-sous-
surveillance/) [2]
[https://translate.google.com/translate?hl=en&sl=auto&tl=en&u...](https://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fwww.books.fr%2Fquand-
bernanos-predisait-une-societe-sous-surveillance%2F)

------
ozim
Thing is, everyone has things to hide. I think there should be an extensive
list of what you have to hide and the reasons why.

I would start with two:

-salary: thieves, kidnappers (do you really trust that everyone in law
enforcement is trustworthy? they can cooperate with thieves). Also check your
contract; you might even get fired for leaking your salary, since it can be
your duty to keep it secret, and explaining that some hacker did it does not
matter.

-contacts: maybe someone will be stalking your friend and might get access to
them because you were hacked. You also should not share your friends' emails
and numbers without their consent. They might at least be angry at you for
sharing their contact info.

Maybe you guys have better or more ideas.

But "nothing to hide" only applies to a hermit on top of a mountain. If
someone thinks otherwise, he is an as*hole, because he does not think about
the people around him.

------
SFjulie1
Wishing for privacy on the internet is like wishing for no turmoil while
shagging during a massive religious event of paranoid, gun-toting puritans.

If privacy is such a problem for some, it is not a technological problem, it
is a political problem. If so, the people concerned should make their
revolution in the appropriate place, the real world, and let the internet
stay a public medium.

PS: I noticed another fun thing: there are blacklisted keywords on HN, like
F-words. Isn't censorship on a medium more concerning than privacy? And
funnily enough, all the "lite" censorship nowadays is first about sex and
gross words. Are sex and slang that dangerous?

~~~
zAy0LfpBZLC8mAC
> Wishing for privacy on the internet is like wishing for no turmoil while
> shagging during a massive religious event of paranoid, gun-toting puritans.

Why do you think that?

> If privacy is such a problem for some it is not a technological problem, it
> is a political problem.

Why do you think that?

> If so, the people concerned should make their revolution in the appropriate
> place, the real world, and let the internet stay a public medium.

The internet is the real world.

Also, who wants to make the internet a non-public medium?

~~~
SFjulie1
Because the main manufacturers of equipment are also from paranoid puritan
countries? (Huawei, Cisco, Juniper, Alcatel, Nortel, Ericsson...) And because
that makes MITM easy for agencies, since governments heavily subsidise
telecoms ($$ for a backdoor)?

And since the internet is globally like a very fast Gutenberg press, it has
the same property as printed paper: privacy is not a problem of the medium,
but of the institutions/organizations trying to control it. It is controlled
by law or force. Law/force comes from and is backed by government, so if you
don't have privacy, then complain to your government about its actions (or
lack thereof).

To sum up: privacy is not an internet (media) problem. It is a problem
between citizens and their governments (use of media).

Had you opened a good history book you would know this. And in the same books
you would discover that the concern for privacy already existed before the
internet: the more authoritarian governments are, the more they tend to love
secrecy for themselves, hate public debate for everyone else, and want to
control the media.

Government control of media does not always align with the public's best
interests. DMCA, patents, IP laws, "censorship for the protection of the
kids"? Does that ring a bell? Google wants you to read only sites that have
nice ads, so they filter out some information as non-relevant.

Want to overthrow your government in the shadows? Commit a crime? Sext your
girlfriend? I don't care about these usages and consider them accessory. They
divert people from the real problem: accessing the information carried by the
media.

By the way, they make very nice tinfoil hats nowadays. There are successful
crowdfundings for them.

Oh, and the internet is the real world? Ha ha ha.

As much as "War and Peace" is the real world.

~~~
zAy0LfpBZLC8mAC
OK, I guess I'm starting to get where you are coming from. I read your
original post pretty much as saying that people who wish for privacy on the
internet are wrong.

I still disagree with you, though. Yes, of course, technology alone cannot
solve the political problems. But I very much think that technology can play a
role in implementing and securing the political will. If you think of a
democratic voting process as a piece of technology, for example, that by
itself won't make a dictator go away. However, if there is a general consensus
that democracy is the way to go, the specific implementation of the voting
process very much makes a difference as to whether rogue actors can subvert
this political will or not. If you make it so that votes cannot be bought and
that the general public can watch the election from start to finish, that
makes a democracy robust against a minority of people who try to rig the vote,
for example.

I think information technology has a similar role to play in securing privacy.
You can build systems that are far more robust against surveillance than
facebook, for example. That doesn't help if the police comes after you for not
using facebook--but it might prevent Mark Zuckerberg from figuring out how to
manipulate the masses to vote for him as the next president ... or whatever he
considers his interest ;-)

Also, I think the internet is actually a pretty good basis for that. In
contrast to previous networks, at its basis, it doesn't distinguish between
clients and servers, and it is a mostly transparent network with all the
intelligence at the edge, so it's technically relatively easy to build your
own protocols and applications without approval from network operators.

As for the internet being the real world: Yes, I get where you are coming
from, but I think the distinction in this way is still counterproductive, as
it supports a narrative that is used by the other side that implies that
somehow the internet is special and that therefore human rights don't apply,
and that special (and usually more restrictive) laws are necessary, as if laws
that forbid fraud, say, for some reason didn't already apply if the fraud was
committed via the internet.

~~~
SFjulie1
I am coming from the sububurbs of Paris. Internet is not the problem.

Internet is expensive, exclusive, not efficient, not securable. It is the
wrong tool for privacy. I have taken part in building it.

Ideas main vector is words and language and their meaning not the paper on
which it is written. Education change the world. Because with or without
internet, words get exchanged by humans. All e-learning attempt without
someone to mentor have failed.

Getting in touch with people is working best where they actually live (at my
experience). Internet is a very updated map, not a territory, and it has blind
spots.

And democracy is not about voting. It is not a system, it is a property that
should ideally apply to a system.

A monarchy, a dictatorship, a republic can be democratic, the same way that
any geometrical figure can be concave or convex. It is just a property that
"the people"'s interest caged (oops born by luck randomly citizen of a place)
in a nation are being "fairly" re-presentend.

Native americans not being representented by their government on their
traditional land (or palestinians in israel, corsican & muslims in france,
scotts in UK, ouigour in china, flamish/wallon in belgian) is it fair?

Well, I don't know. Fairness at my opinion is a non achievable goal but a
never ending unknown path.

Internet has a lot of answers, but very few interesting questions being asked.

------
sreejithr
Don't mix _misuse of private information_ and _privacy_. All the "chilling
effects" listed here are implications of misuse of private information. On
the other hand, in a world where we have weapons that can single-handedly
wipe out whole civilizations, we need to make sure they don't get used by
some lunatic.

From that angle, a tight security apparatus (which may include surveillance),
is necessary for our survival. I think we should think more about how we can
prevent the "misuse" of private information than preventing surveillance
completely.

~~~
zAy0LfpBZLC8mAC
Well, yeah, how can we prevent misuse of private information other than by
keeping it out of the hands of those who might abuse it?

It's simple: You have to create a perfect test that can tell good people from
bad people, and then build a police force that only hires good people.

The fact is that there are people in this world who are willing to hurt other
people for their own advantage. You cannot fix that by giving huge amounts of
power to some more or less random selection of the general population. There
is no silver bullet, and trying to construct one nonetheless tends to end
badly.

------
enginn
"Our digital lives are an accurate reflection of our actual lives"

Which of course presumes we have a digital life, and which has repeatedly
been shown not to be the case. It is also not accurate.

Take data warehousing companies that profile home IP addresses and hoover up
any digital breadcrumbs people leave behind, like user agent strings, length
of time spent on a page, and any previous cookies stored locally on the
machine: an enormous store of value for anyone who decides to purchase such
information, except for the fact that it has no value.

The 'info' exists without any context, and could even be poisoned by a small
portion of users who decide to stuff the system full of disinformation to
control market share or lobby for certain products.

Also, IPv4 addresses (now more than ever) can be attributed to several
hundred people, because ISPs share addresses and subnets among multiple
customers.

This is not to say that everything's fine and our digital doppel is just a
fuzzy haze of nonsense. But it does suggest that privacy advocates are apt to
overestimate how accurate such information is, and that the people who buy it
are finding this out too, and have probably decided to pay more at other
collection points to get a finer-grained doppel of a person.

I say let them spend more; I will cry tears of joy when that money turns out
to be ill spent too and still doesn't accurately portray a person digitally.

------
ck2
The problem is people who have never been on the wrong side of an encounter
with law enforcement have no clue about why all these "super tools and powers"
given to police, TSA, FBI etc. are so incredibly dangerous.

They just watch the news and get fear-mongered into thinking, oh, we'd better
arm the hell out of our "protectors".

Usually the first experience with abuse by police is enough to wake people up
but if they are white and middle-class, such an encounter may take years to
happen.

------
miguelrochefort
Why don't privacy proponents go all in and just ask to get rid of the
Internet? Surely, privacy is easier to maintain when communication is
inefficient.

Perhaps that's because privacy is actually an archaic and backward idea that
keeps all of our problems alive. I can't think of a less progressive (more
conservative) idea than privacy.

The next most important revolution in human history will be our transition to
a completely transparent society.

~~~
m1sta_
How do we get to a completely transparent society without massive collateral
damage? I understand the rationale for the goal but the transition is so very
difficult. I wish more thought was being put into it.

~~~
miguelrochefort
My only concern at the moment is to get people to realize that privacy is not
good. Only when people understand that our goal is transparency can we come up
with a strategy to make the switch.

Considering that we need to get to a transparent society, contributions to
the privacy movement only ensure that the transition will be even more
difficult and violent.

~~~
maxerickson
Have you written anywhere about how you arrived at such a radical position?

(I mean radical in the traditional sense, not as a slur)

~~~
miguelrochefort
I don't see how it's radical. It's the only logical conclusion.

Many people have written about this. I haven't read a single convincing
argument in favor of privacy as anything other than a defensive measure.

~~~
maxerickson
So no? I don't care about the labeling. I want to better understand the
reasoning. Or at least, what pieces of information make it so obvious.

~~~
m1sta_
I'm of a similar (although not identical) opinion to miguelrochefort. The
short answer is... if you imagine a utopia, do you imagine lots of secrets or
lots of openness and acceptance?

The reality is that I came to this conclusion through a long process but I'm
pretty tired. The process definitely considered whether the 'private' version
of the world was even possible. I don't think it is. Surveillance will happen.
Better we accept it and keep an eye on how it's used than pretend like we can
prevent it in the long term. Even if you were able to discourage the
ubiquitous 'high tech hackers' and 'big data' forms of surveillance (which I
don't think you'll be able to do), bribes and drones will continue to be used
for the powerful to get what they want.

~~~
maxerickson
How do bathroom doors fit into this?

In your utopia, if you ask me a question, do I have to answer honestly? That
people would always want to answer honestly is not a satisfying answer.

~~~
m1sta_
Bathroom doors are great. Let's not confound surveillance of interactions and
privacy in the extreme sense.

Bathroom doors allow you to interact with yourself with _some_ level of
privacy. People know you're in there. They know for how long. They know if
there are extreme sounds or scents. If you misuse the plumbing or other
bathroom features there is clear evidence of this. In rare circumstances, the
bathroom door can be kicked down while someone is using the bathroom.

The most common result of such information is "are you ok darling?".

Bathroom behaviours are interesting because they provide a real case study on
the impact of acceptance on privacy. On a first date, where you're not
presenting the real version of yourself, you don't want the other party on the
date to know anything about the events while you're in the bathroom. Over
time, if the relationship progresses, the secrecy around these events changes.
The reason for this secrecy changing, I believe, is trust/acceptance. Over
time you know that if the other person learns more about the bathroom events,
it will not change how they perceive you. If you're in such a relationship,
it also doesn't mean you won't use a bathroom door, and it doesn't mean you
don't want your partner to use one.

Hopefully that analogy makes sense. Most privacy advocates have a list of
'secrets' they fear will be used against them. I think we're more likely to
see a world where this fear is addressed through improved respect of
differences, compared to a world where suddenly surveillance is effectively
impossible.

~~~
jacquesm
Excellent comment, but I lost you at:

> Most privacy advocates have a list of 'secrets' they fear will be used
> against them.

What makes you say that?

~~~
m1sta_
Why do I think privacy advocates have a list of 'secrets' they fear will be
used against them? Because most of them are intelligent people who understand
how society works. Society today is full of lies, secrets, and hypocrisy. If
someone knows your secrets but you don't know theirs, you're probably
vulnerable.

I don't think privacy advocates, and persons such as myself, differ in opinion
on the problem today. The difference of opinion is on where we're trying to
get to and how we get there.

Tbh, I don't know yet how to get to the end state I'd like to see.

~~~
jacquesm
Ok, so you are making an assumption there. I don't think your assumption
holds; it is entirely possible to be a privacy advocate and at the same time
not have any crucial secrets worth keeping.

Privacy is a good thing; whether you have secrets worth keeping or not is
immaterial, because privacy doesn't have anything to do with secrecy. The two
are often mixed up, but if you look a bit longer you'll see that they are in
fact orthogonal concepts, only very loosely related.

Taking your bathroom example: there is obviously nothing secret about what is
going on in that bathroom, you can infer most of it from your own experience.
And yet, we do seem to feel the need for privacy.

Another example would be a diary. Diaries are intensely personal and our
etiquette around them is that if you happen to come across someone's diary
that you do not open it to read it. It is considered a private document, even
if it will not contain any secrets it may contain thoughts that the writer
does not want to divulge to the world at large.

So privacy does not require any secrets at all to be a very important thing to
many people, including privacy advocates.

~~~
m1sta_
We're into semantics here. Privacy provides a way to hide detail, to prevent
confirmation, and to allow people to deny things. For simplicity's sake I
consider these, or more broadly anything protected by privacy, to be
'secrets'.

I don't think the word 'crucial' that you added is useful here.

~~~
jacquesm
Such simplicity is lossy and in this case the loss is crucial, hence that word
and that's why sometimes (not always, I'll give you that) semantics matter.

Especially when not seeing that distinction might cause one to state something
that is either not true or that inadvertently allows re-framing the discussion
in ways that hamper progress.

------
rubberstamp
People of the U.S. lost it because they allowed this to happen by "not caring
about it". And the whole privatize-everything approach is bollocks. See what
"health care" and "college education" now cost, and most importantly what
"the cops" have become. Health care and education have to be free if the
people in a society are to remain intelligent. Even by the economics alone,
providing good health care for free would cost much less than it currently
costs. The police severely lack the accountability they badly need. Read the
original article linked here. Think about it.

------
mirimir
OK, so now David Chaum is proposing PrivaTegrity.[0] It's "meant to be both
more secure than existing online anonymity systems like Tor or I2P and also
more efficient". But it includes a "carefully controlled backdoor that allows
anyone doing something 'generally recognized as evil' to have their anonymity
and privacy stripped altogether". Just exactly how the bloody hell can a
backdoored design be styled as more secure than Tor and I2P?

There's no fool like an old fool, as they say. Sad :(

[0] [http://www.wired.com/2016/01/david-chaum-father-of-online-
an...](http://www.wired.com/2016/01/david-chaum-father-of-online-anonymity-
plan-to-end-the-crypto-wars/)

------
Mendenhall
Many would trade their privacy for what they think is safety.

~~~
baddox
And they're more than welcome to trade their own privacy, although I think
it's unwise. The trouble is when they trade _my_ privacy.

------
edpichler
This read changed my mind. Even though I have nothing to hide, I need to care
about my privacy.

------
CurtMonash
Bravo!

I've been making the chilling effects argument for several years:

[http://www.dbms2.com/2013/07/08/privacy-data-use-chilling-
ef...](http://www.dbms2.com/2013/07/08/privacy-data-use-chilling-effects/)

[http://www.dbms2.com/2013/07/29/very-chilling-
effects/](http://www.dbms2.com/2013/07/29/very-chilling-effects/)

[http://www.dbms2.com/2015/06/14/chilling-effects-
revisited/](http://www.dbms2.com/2015/06/14/chilling-effects-revisited/)

This article makes it with reasonable, appropriate breadth.

------
danielam
The reasons given are largely consequentialist. There are deeper
philosophical reasons why privacy is important in this context. The essential
reason concerns the proper relationship between the individual and the state.
Surveillance and intrusion violate that proper relationship and establish an
improper one between the two. In other words, to justify the relation and
thus the intrusion, one must hold concepts of state and individual that are
anti-individualistic and place the state above the individual. The
undesirable effects follow. To borrow Koneczny's terminology, a surveillance
state is a move away from Latin civilization, perhaps towards Byzantine
civilization.

------
secfirstmd
(_Apologies for the blatant plug_)

On this subject, if anyone is interested, we just launched a free, open
source, Android mobile app to help people manage the complex issues of digital
and physical security. It's got simple lessons on everything from sending a
secure mail to dealing with a kidnap.

[https://play.google.com/store/apps/details?id=org.secfirst.u...](https://play.google.com/store/apps/details?id=org.secfirst.umbrella)

------
bobby_9x
In a recent study I read, millennials overwhelmingly didn't think that
freedom of speech should be upheld, so why would privacy be?

Many people have already gotten in trouble or been fired over private
conversations. Most people didn't care about the privacy implications,
because of personal feelings.

I feel like our civilization has gone backwards: online mobs determine guilt
and the ends justify the means.

------
chevas
It's the government that has something to hide, and it's them I don't trust,
which is why privacy is important. I wouldn't want the government searching
my home on a whim, nor my digital content, because they're the ones with the
track record of planting evidence, seizing assets, and perpetuating
falsehoods for their agendas.

------
bitL
Is there any polymorphic encryption toolkit that could enable a custom
encryption method tailored to your own use? Kinda like security by obscurity:
encrypting your own important files by deviating from the established
standard encryption schemes in order to overwhelm the reverse-engineering
capabilities of Mallory/Trudy?
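(A toy sketch of what such layering might look like, in Python with only the
standard library. The names `keystream`, `stream_xor`, and `cascade` are
made up for illustration, and SHA-256 in counter mode is a stand-in, not a
vetted cipher; rolling your own like this is exactly the "security by
obscurity" the question describes, and no substitute for standard, audited
encryption.)

```python
import hashlib
from itertools import count

def keystream(key: bytes, nonce: bytes):
    # Toy stream cipher: SHA-256 over (key, nonce, counter), block by block.
    for i in count():
        yield from hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()

def stream_xor(data: bytes, key: bytes, nonce: bytes) -> bytes:
    # XOR the data against the keystream for this key/nonce pair.
    ks = keystream(key, nonce)
    return bytes(b ^ next(ks) for b in data)

def cascade(data: bytes, keys, nonce: bytes) -> bytes:
    # One XOR-stream layer per key; XOR layers commute, so running the
    # same cascade again with the same keys and nonce decrypts.
    for k in keys:
        data = stream_xor(data, k, nonce)
    return data

secret = b"important file contents"
keys = [b"layer-one-key", b"layer-two-key"]
ct = cascade(secret, keys, b"nonce-0")
pt = cascade(ct, keys, b"nonce-0")  # round-trips back to the plaintext
```

The point is that an attacker reverse-engineering this would have to recover
the whole (obscure) construction, not just a key; the obvious catch is that
obscurity is all it has.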

------
erikb
There are so many unproven ideas and theses in the very first
(argumentative) sentence alone.

Maybe privacy is a right granted by someone. But do I need to fight for all
rights? Can I not trade some for others? E.g. I don't want to live in a
completely free market, because in a completely free market thieves and
bullies always win. In our real-life markets they at least need a lawyer's
license first, which stops some of them from succeeding.

Does it really underpin freedom of expression? Is complete freedom of
expression something that is worth fighting for, something people want? Look
at something where expression is nearly completely free: Clothes. Most people
tend to wear what other people wear.

Free society. What is that? Why do I need it?

Democratic society. Doesn't the current global development show that
democracy is failing us? Every system comes to an end, and democracy is
certainly past the top of the hill.

I stopped reading after that sentence. If a blog post doesn't even think about
the nuances of what they are talking about, there won't be much content
anyways.

Just a little side note: I've been to China three times now. Many people
consider China very unfree, very undemocratic. But I see people there having
more hope, more optimism and more opportunity to develop their dreams than we
have in the west. And the mobile internet is growing bigger and faster there
than anywhere else, despite nearly everything running via one
government-observed mobile app: WeChat. In some regards I wonder whether it
is really "despite" government control, or "because" of it.

------
amelius
Don't trust someone who says "I have nothing to hide". And certainly don't
trust them with your secrets!

------
elrodeo
There are very few topics where I just cannot get the point of the discussion
among smart people, but this is one of them.

Look at the real world NOW, 15 years into all the surveillance. You can
still set off bombs and kill people in the middle of a European capital
without any encryption at all. Is this the kind of surveillance you are
afraid of?

If you want to hide something, there are infinitely many ways to do it. No
surveillance can (or ever will) read one-time-pad encrypted communication. So
you have (and always will have) the freedom and capability to hide; what's
your problem?
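(To make the one-time-pad point concrete, here's a minimal sketch in Python;
the name `otp_xor` is mine, and this is illustrative, not production crypto.
With a truly random pad as long as the message, used exactly once, the
ciphertext is information-theoretically unreadable to any observer.)

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with the pad; applying it twice decrypts.
    # The pad must be truly random, at least as long as the data, and NEVER
    # reused; otherwise the information-theoretic guarantee is gone.
    if len(key) < len(data):
        raise ValueError("pad must cover the whole message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
pad = secrets.token_bytes(len(message))  # fresh random pad per message
ciphertext = otp_xor(message, pad)
recovered = otp_xor(ciphertext, pad)     # same XOR with the same pad
```

The catch, of course, is key distribution: both parties need the same pad,
exchanged securely in advance.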

Arguments like "well then show me your bank account" are just plain stupid:
I have no interest in sharing this information with my work colleagues, my
neighbours or my friends, because it would have implications in some social
respects (it's not about security!). But this information is only sensitive
in the context of a personality. I'd neither have a problem showing it to a
random stranger, nor would I be interested in this information coming from a
random stranger.

If somebody uses my information in an unethical way, the problem is not
surveillance, but that such use is possible at all.

Exposing my personal data to a government during an investigation could also
protect me by verifying my alibi. We have nothing to hide, right?

The comparison with free speech is ridiculous. Free speech is the opposite
of hiding and doesn't imply breaking the law. Hiding implies playing by rules
other than those commonly established. Free speech is important because
eventually I might have something to say. But no one would ever agree that he
or she will eventually have something to hide (without turning criminal).

So I'm still missing the point...

~~~
robmcm
Yes, because you assume the people collecting your data are ethical and
secure, and will always share your opinions and beliefs.

Say your bank account information was stolen and someone used it to blackmail
you because as you said it would have, "implications in some social aspects".

Say the country you live in converted to a religion you are not part of or
want to be any part of. Imagine if they had a record of your beliefs and used
it as a handy tool in mass genocide.

Say someone working for the government or a start up was jealous of something
you had, and used their access to take your information in order to discredit
you.

Say your medical information was sold off by a fitness startup that went
bust, and bought by insurance companies to bump up your premium.

~~~
elrodeo
If we assume that, by default, people are unethical, governments are corrupt,
and most people in power are criminal, then you're f*cked anyway. With or
without mass surveillance.

~~~
zAy0LfpBZLC8mAC
We don't assume anything by default, we just make an empirical observation of
how people behave, and then act accordingly. The empirical observation is that
there are people who are unethical, corrupt, what have you. And also, the
empirical observation is that there exist certain group dynamics that make
certain societal developments very hard to reverse. That is why it seems like
a very good idea to avoid putting too much power into a single person's hands
(in case it turns out to be one of the bad apples, or in case the power is
delegated to a role rather than a specific person, in case one of the bad
apples ever gets into that role), and to try and avoid the kinds of
developments that tend to end badly.

Nobody says that _all_ people are unethical, or _all_ governments are corrupt,
or that _all_ people in power are criminal. But rather, that being part of a
government or having power does not prevent people from being unethical or
corrupt. Bad people are generally a minority, but they do exist. That is one
reason why we have government and police and military in the first place. But
there is nothing that necessarily prevents bad people from becoming part of
government, police, and military. That is why it is important to limit the
power of those institutions. To limit the damage that bad people inside them
can do. And also to limit the appeal to bad people wanting to become part of
them. That's essentially the whole point of democracy and the separation of
powers, BTW. It's a security mechanism that protects you from bad people in
power - not because all people in power are bad, but because occasionally bad
people manage to get into powerful positions, and that tends to end badly.

------
bitL
"I have nothing to hide, and what I am keeping private is encrypted by a
quantum- and $5-wrench-method-resistant encryption."

------
yuhong
My goal is more modest: to fix the problems with posting under real names, if
possible (and not by using real-name policies).

------
ternbot
Truth prevails; it is not mass surveillance but mass data manipulation we must
be concerned about now.

------
zero1954
What an irony that the link is missing the "s".

