
Amnesty International: Encryption Is a Human Rights Issue - DiabloD3
https://www.eff.org/deeplinks/2016/03/amnesty-international-encryption-human-rights-issue
======
alextgordon
The average person isn't even _aware_ that VLC is an illegal circumvention
tool in the United States under the DMCA. I fail to see how restrictions on
encryption software would have a different outcome. Encryption is built into
so many products, it's part of daily life and most people don't even know
they're using it.

~~~
ptha
I wasn't aware of this:
[https://en.wikipedia.org/wiki/VLC_media_player#Legality](https://en.wikipedia.org/wiki/VLC_media_player#Legality)
Harks back to people not being aware that making copies of CDs for private
(fair) use is illegal:
[https://en.wikipedia.org/wiki/Ripping#Legality](https://en.wikipedia.org/wiki/Ripping#Legality)

------
userbinator
_Encryption is therefore also an enabler of the rights to freedom of
expression, information and opinion, and also has an impact on the rights to
freedom of peaceful assembly, association and other human rights_

I haven't been following the whole encryption debate in detail recently, but
it seems there's one point which hasn't been discussed much: What about
encryption which works _against_ users' freedom, things like DRM, "trusted"
boot and the whole idea of mandatory forced trust, etc.?

On the one hand, I'm against government surveillance. On the other hand, I'm
also against DRM and user-hostile locked-down systems. Government
surveillance, breaking DRM, jailbreaking, rooting, etc. all rely on cracks,
the "imperfect" nature of security in some form or another. That's why I think
this is a particularly perplexing issue, and bluntly saying "encryption is
good and we should have more of it" is not seeing the whole perspective.

Relevant story: [http://www.gnu.org/philosophy/right-to-
read.en.html](http://www.gnu.org/philosophy/right-to-read.en.html)

~~~
EthanHeilman
>What about encryption which works against users' freedom, things like DRM,
"trusted" boot and the whole idea of mandatory forced trust, etc.?

Isn't this a "freedom from/freedom to" distinction?[0]

People should be free to use encryption. People should be free to use software
to break DRM. People should be free to create, use and sell hardware which
limits the behavior of software. Using hardware to hide keys from software, for
instance, is a common DRM use case, but it has applications in security and
privacy as well (SGX or Apple's Secure Enclave, for example). People should be
free not to use DRM as well.
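The key-isolation pattern described above can be sketched in a few lines. This is a toy software model, not any real SGX or Secure Enclave API: the `KeyVault` class and its method names are illustrative assumptions. The point is only the shape of the interface: callers receive an opaque handle and request operations, while the raw key bytes stay inside the vault (in real hardware, inside the chip).

```python
import hashlib
import hmac
import os

class KeyVault:
    """Toy model of a hardware key store. Callers never see key bytes,
    only an opaque handle they can use to request operations."""

    def __init__(self):
        self._keys = {}  # handle -> secret key bytes, hidden from callers

    def generate_key(self) -> str:
        handle = os.urandom(8).hex()
        self._keys[handle] = os.urandom(32)
        return handle  # only the handle leaves the vault

    def sign(self, handle: str, message: bytes) -> bytes:
        # The operation runs "inside" the vault; the key is used, not returned.
        return hmac.new(self._keys[handle], message, hashlib.sha256).digest()

    def verify(self, handle: str, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(handle, message), tag)

vault = KeyVault()
h = vault.generate_key()
tag = vault.sign(h, b"attest this payload")
assert vault.verify(h, b"attest this payload", tag)
assert not vault.verify(h, b"tampered payload", tag)
```

Whether the same interface enhances or degrades the user's agency depends entirely on who holds the handle: a DRM scheme and a personal password manager can both sit behind it.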

The question, it seems to me, is whether technology should be used to enhance
or degrade a user's agency and security. The technology, including hardware
which limits software behavior, can be used in both directions.

[0]: [http://plato.stanford.edu/entries/liberty-positive-
negative/](http://plato.stanford.edu/entries/liberty-positive-negative/)

------
TheSpiceIsLife
What is left to say about this? Does anyone have any new insights?

I want encryption so I can perform online transactions safely. Or at least
safe enough that the financial services I use online can ensure the movement
of my money.

If I want encryption to communicate online, then it all comes down to trust.
Who do you trust? Certainly not any provider of encrypted messaging services,
since we can't audit their code, can't guarantee there isn't a TLA (Three
Letter Acronym) plant working there, can't guarantee it's not a TLA front,
can't guarantee the service provider isn't subject to an NSL, and so on.

I'm not sure I'm convinced this is a human rights issue. How would we enforce
the right? Rights that aren't enforceable aren't much use. Negative Rights[2]
and all that.

The UN's 'Universal Declaration of Human Rights'[1] is a nice document, but
its existence hasn't made those rights real. Adding another article, or
interpreting a right to encryption into an existing article, does not, and
cannot, ensure that right.

What am I trying to say? Probably something like: if there are people who are
inclined to, and have the power to, infringe your _right_ to privacy, you're
probably pretty screwed.

1\. [http://www.un.org/en/universal-declaration-human-
rights/](http://www.un.org/en/universal-declaration-human-rights/)

2\.
[https://en.wikipedia.org/wiki/Negative_and_positive_rights](https://en.wikipedia.org/wiki/Negative_and_positive_rights)
(... rights are ranked by degree of importance, and violations of lesser ones
are accepted in the course of preventing violations of greater ones ...)

~~~
jensen123
> What is left to say about this? Does anyone have any new insights?

There is one thing that I'm really, really wondering about. Are large cities
like London and New York basically screwed?

Let me explain: One large city with 10-20 million people is obviously far more
vulnerable to terrorism than 10 smaller cities of 1-2 million each. Obviously
Amnesty is correct that encryption is good for freedom of speech, human rights
etc. Also, encryption is necessary for business/banking in a modern world. But
let's not kid ourselves, terrorists can also hide behind encryption.

Terrorism is nothing new, and there will probably be more of it in the future.
I'm not a law expert, but my impression is that the countries in Europe with
the worst laws for privacy are the UK and France. Probably because that is
where you find the largest cities: London and Paris. These people are
scared/worried. I can see no other reason why they would enact laws like
these.

So in the future, when there are more terrorism attacks, knee-jerk politicians
(and their voters!) will probably want even more laws restricting encryption.
This will make these places even worse for both human rights and business.

The industrial revolution basically created mega-cities. Is modern computer
technology making them impractical?

~~~
stegosaurus
The thing is that even these cities aren't that vulnerable.

Imagine you're a terrorist.

Where are you going to attack for maximum impact? Piccadilly Circus? Heathrow?
Gatwick?

Or a random suburb out at the end of a line somewhere?

~~~
jensen123
I see your point. Much of this vulnerability is psychological, though. If you
look at how many people have died from terrorism in the past, it's not many,
at least compared to things like cancer, heart disease or traffic accidents.
Still, I remember how most Americans were more than happy to give up their
freedoms and human rights after September 11, in order to fight terrorism.
Never mind that those freedoms and human rights were probably much of the
reason why the US became great (compare it to say Argentina or Russia).
Ironically, I think in order to preserve freedom during periods of terrorism,
we would have to do away with democracy, since most voters are such morons.

------
studentrob
Cool. I do think we should stay on offense about this.

The DOJ has said they won't let up seeking legislative and courtroom power to
demand warranted access to encrypted data.

The problem with this is that as a society, we are not all focused on the
right ways to keep each other safe. Of course, we as technologists know that
no such law is enforceable given the existence of free and open source
software. But the rest of society doesn't get that, and some portion of them
are simply taking our word for it. Given another terrorist attack in which
encrypted communications are somehow shown to be a factor, some of the public
could swing the other way.

I believe it is our civic duty as technologists to educate each other about
this issue in a respectful manner. We must not assume there is some nefarious
government position. That hurts our ability to convince those people who do
trust the government. And those are precisely the folks we want to convince.
We're on the winning side now, and we should ride the wave as long as it is
carrying us towards greater public understanding of encryption and technology.
We'll be safer on balance and have a better IT industry if we do so.

------
gremlinsinc
What about voter rights? Why is nobody fighting the Fiasco in AZ?

------
ars
When everything becomes a human right, you cheapen the entire concept. So many
times I read in the news "this latest issue is a human right, and that thing,
etc, etc".

Let's keep human rights to the basics.

If your goal is privacy as a human right, then make that the human right, and
stop there.

Not the tool you use to accomplish it! Next a computer will become a human
right, and then electricity, and a keyboard. Where does it end?

Encryption is a tool, not a goal in and of itself.

~~~
mapleoin
Yes, and that's how Amnesty references it, as an _enabler_ of _other_ human
rights:

 _Encryption is therefore also an enabler of the rights to freedom of
expression, information and opinion, and also has an impact on the rights to
freedom of peaceful assembly, association and other human rights._

