
No encryption was harmed in the making of this intercept - Khaine
https://risky.biz/bannedmath/
======
Zak
> _Do we believe that law enforcement bodies should have the authority to
> monitor the communications of people suspected of serious criminal
> offences?_

I actually believe they should not. They should have the authority to
_attempt_ to do so with proper oversight, but that in no way implies that they
will necessarily succeed.

I do not, however, believe that they should have the authority to compel others
to actively aid them. They should not, as the author suggests, have the
authority to compel Apple or Google to install malware on users' phones. If
the police can do it themselves, with proper judicial oversight, or convince a
software maker that spying on Osama bin Satan is worth compromising their
principles, so be it.

The idea that the ends justify the means tends to lead to some dark places and
I submit that we should not start down those paths.

~~~
Khaine
We've already gone down those paths. Societal expectations and legal precedent
have been building toward this since the 1880s, when registers of telegram
metadata were kept. I don't understand why everyone is so naive with respect
to the internet.

The Government's main argument is "Why should the internet be different from
the 'real world'?" Telephone companies have had to comply with these kinds of
government requests since telegrams were around. No-one I have heard from has
offered a serious argument against this.

Saying no-one should have this power is great and all, but it's waay too late.
These powers to get access to this information with a warrant have been around
in some form or another since before WW1.

~~~
AJ007
Tracking telegrams is certainly a far cry from what happens today. Metadata
tracks users' physical movements. Depending on web activity, it can even come
close to reconstructing what they are thinking.

Who draws the line of what laws a commercial communication platform must
follow and under which jurisdictions? There is no line because complying with
the laws of one country your users are in will break the law of another.

I believe the problem is the centralization of communication. There is no
reason that these platforms can not be distributed networks running open
source software. Any government, which does or does not respect basic human
rights, would be free to attempt to intercept the communication and anyone
else would be free to try to prevent it.

~~~
Khaine
The Australian Government has asked tech companies to support them in getting
legal access to devices; specifically, the Prime Minister called for the
ability to get plaintext versions of WhatsApp, Telegram and iMessage messages. These
are arguably the telegrams of today.

Yes, given the centralised nature of these services it is a concern that
foreign governments' demands could be unreasonable. However, the nature of
operating in a country is to comply with their laws and regulations. What the
Australian Government has asked for is that these communication methods be
treated equally, and for tech companies to get on board with that.

If Google and Apple don't want to support the operations of authoritarian
governments then perhaps they should actually take a stand and not operate in
those countries, and not have their products made in those countries.

I value my privacy and my freedom, but honestly I think that as an Australian
the Australian Government does have the right to be able to access my devices
during an investigation, provided there are appropriate checks and balances
involved. Just like if I had a safe in my house, they have the right to get it
opened with a search warrant. If I refuse to open the safe, they have methods
and tools that allow opening it. I fail to see what makes iPhones any
different from a safe conceptually.

I understand that technically it may be very difficult or even impossible to
provide a secure mechanism for that support to the Government, and that is
where my greatest concern about these laws comes from.

~~~
jancsika
> I value my privacy and my freedom, but honestly I think that as an
> Australian the Australian Government does have the right to be able to
> access my devices during an investigation, provided there are appropriate
> checks and balances involved. Just like if I had a safe in my house, they
> have the right to get it opened with a search warrant. If I refuse to open
> the safe, they have methods and tools that allow opening it. I fail to see
> what makes iPhones any different from a safe conceptually.

Warning: you are relying on analogies to pre-digital technology to make sense
of digital technology.

I'll try a lightning round in ten minutes:

1\. A physical safe generally must be _physically_ opened by officers coming
to your location and opening it. Typically a warrant is served to the person
who owns the safe, who then reads it, and can often watch the officers
physically enter their property and possibly even watch them open the physical
safe using the combination they give to the officers. After the officers leave
the person can have the lock changed with a fair amount of certainty that the
officers didn't change the physical makeup of the safe in such a way that they
can remotely access the contents of the physical safe in the future.

2\. Under certain circumstances officers could access the safe without the
person present. But, generally, they have to physically enter the person's
dwelling to do that. Neighbors may see them enter because officers are
conspicuous physical objects that travel well below the speed of light.

3\. Officers can compel people to open the safes under certain circumstances.
But some safes are expensive precisely because their design makes it difficult to
open them without the key. I'm unaware of previous situations where law
enforcement made a public awareness campaign about the difficulty of opening
expensive safes under the rare circumstances that such safes held vital
evidence to time-critical investigations. I'd love to read about such a
campaign of "Our Safes Going Dark" if you have a reference to such a campaign.

4\. Law enforcement cannot outsource the physical opening of safes to an
Italian safe-opening company that installs proprietary software at the station
that allows the officers to do remote and clandestine opening of many types of
physical safes. (Where _remote_ means opening _and_ viewing the contents of
the safe without having to dispatch a physical officer to a physical
location.)

5\. Law enforcement cannot leverage the same type of Italian safe-opening
techniques in an automated system that checks the metadata coming across the
internet for certain selectors and then automatically and secretly opens
certain safes, storing pictures of contents of those safes for later perusal
or even changing the substance of the safe so that the safe itself will report
certain data back to law enforcement just in case it turns out this safe
belongs to someone who turns out to be a criminal.

6\. Nation states could not indiscriminately collect data about the sum total
of all the places you traveled and people you interacted with on your way to
and from the safe, store that data about you, and then redefine the word
_collection_ to mean what happens when someone reads one or more of the pieces
of data that were collected.

 _Bonus_ 7\. If nation state #1 wants to embarrass nation state #2 by releasing
some of state 1's secrets, those secrets cannot easily be leveraged to make
digital ransomware appear on a large number of safes of nation state #3.

Essentially, everything I can think of in ten minutes that is important about
digital security/privacy/surveillance is lost when you use analogies to
physical pre-digital objects.

------
adekok
FTA:

(a cop could) _ask them to devise a way to retrieve the requested data from
that device. Like, say, pushing a signed update to the target handset that
will be tied to that device’s UDID (Unique Device Identifier). That way
there’s no chance the coppers can intercept that update and re-use it on
whomever they want_

I'm amazed at the naivety of this approach. Sure, it looks good on paper:
"Cops can't use wiretap for user X against user Y".

But using _technical means_ to prevent _illegal behavior by police_ seems to
be entirely the wrong approach. How about instead not giving the police the
means to break the law?

Or, why stop with device-specific checks? Why not just force the APP to do a
UDID lookup to a police-controlled server, where it can control who gets
wiretapped? If the companies don't comply, throw the executive in jail for
"contempt of court"! After all, we're out to get the bad guys!

Apple and others are right to fight this nonsense. They _should_ take every
possible step to ensure that they themselves can't help the police side-step
the law. It not only protects their users, it protects themselves, too.

~~~
luke-stanley
Remotely installing apps works fine with Android. What stops Google from being
compelled to deploy an app for a user or millions of them? Probably not much,
and the user probably has no control over it, right?

------
adekok
What about being forced to hand over passwords?

[https://9to5mac.com/2017/06/01/fifth-amendement-passcodes-passwords-law/](https://9to5mac.com/2017/06/01/fifth-amendement-passcodes-passwords-law/)

 _The Miami Herald reports that a child abuse suspect was jailed for six
months for contempt of court after failing to reveal the correct passcode to
his iPhone_

Why is he being forced to incriminate himself?

1) the information on the phone _will_ change the charges against him and/or
his sentence. It would seem that a fifth amendment defence is appropriate and
legal.

2) the information on the phone will _not_ change the charges against him
and/or his sentence, in which case the information isn't needed.

This whole debate, for me, can be summarized in one question: _Do you want
other people to be able to watch everything you do?_ If so, all bets are off.
If not, the government shouldn't have the right to watch _suspects_, or to
force people to incriminate themselves.

~~~
dTal
Sounds like the contempt was due to deliberately handing over the wrong
passcode and playing innocent, rather than a principled refusal to self-
incriminate. Being held in contempt for such shenanigans is entirely
appropriate. Of course if it really was a mistake that's hugely problematic.

------
nattmat
And once it is publicly known that Apple, Samsung and Google compromise their
operating systems, the adept criminals will flash their own operating system
onto their phones.

Apple and Samsung would need to provide a worldwide infrastructure for
handling requests from law enforcement all around the world (including in the
less liberal regimes). How secure can such a process really be? It doesn't
matter that the coppers cannot reuse the exploits; they will just request and
receive a new one instantly.

Criminals would also have access to these services, since we know police
corruption is a thing.

------
RubyPinch
> Let me put this bluntly: If [weakening of encryption] is what the government
> winds up suggesting, then by all means hand me a bullhorn [...]

Well, how else would you interpret the statement that Australian law
supersedes math?

\- - -

That aside, "We do not share customer data with 3rd parties, as it is not
possible due to end-to-end encryption, except when we send things in
plaintext, because of our moral obligation to the Australian government" is
going to be a greaaaat addition to privacy policies for Australian companies.
Going to go real well for international business.

------
Canada
Just as spying through eavesdropping on plaintext was defeated through
technical means, pushing malicious updates will also be defeated.

Tech companies will have to declare that their update mechanisms contain no
means to apply updates without user consent, and when updates are sent they
will publish the hash. Security researchers will independently verify such
claims by reverse engineering.
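As an illustrative sketch of the hash-publication idea (the function names and
the 8 KiB chunk size are my own choices, not anything from the thread),
checking a received update against a vendor-published digest is only a few
lines of Python:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 8 KiB chunks so large update files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def update_matches_published_hash(update_path: str, published_hash: str) -> bool:
    """Compare a received update against the vendor's published digest.

    A mismatch suggests the update you received differs from the one everyone
    else got, i.e. it may be targeted or tampered with.
    """
    return sha256_of(update_path) == published_hash.lower()
```

The point of publishing the digest is that a targeted update, served to one
user only, would produce a hash nobody else can reproduce.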

Then whoever some government wants to spy on cannot be tricked into installing
a malicious update, as suggested by this blog post.

Governments are just going to have to accept the fact that their ability to
intercept private communication and recover private documents will be reduced
to zero in the near future, as it should be.

------
ivanbakel
A very good point. We've been obsessing over the security of the pipes - and
rightly so - with the tacit assumption that our access to them is our own. The
recent phenomenon of phones being held at the US border comes to mind when
thinking about how much you really own your technological property. Maybe it's time to make
free hardware the next major project.

------
peteretep
Excellent article. Nothing irritates me more than hyperbole about politicians
and law enforcement. Already we have people saying "Macron wants to ban
encryption".

All that said, LE hasn't helped itself at all with over-zealous and blunt-
edged approaches so far -- cf FBI cracking iPhones, the NSA dragnet with
little oversight.

I don't mind the government being able to read my mail. I just want them to
need a warrant first, that a judge has signed off on, and that's particular to
me.

~~~
rocqua
This article promotes the idea of companies pushing malicious updates to
specific users - malicious in the sense of circumventing E2E encryption
without notifying the user, and even doing so retroactively.

Suppose, then, that we get deterministic compilation and an open-source client
for E2E encryption. It would then be possible (or rather, much easier) to
detect such malicious updates. Should such protection be made illegal?
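A minimal sketch of what that detection could look like, assuming a
deterministic build command (all names here are hypothetical): rebuild the
open-source client from source and compare digests with the binary the vendor
actually shipped to you.

```python
import hashlib
import subprocess
from pathlib import Path


def file_digest(path: Path) -> str:
    """SHA-256 hex digest of a build artifact."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def build_is_reproducible(source_dir: Path, distributed_binary: Path,
                          build_cmd: list, output_name: str) -> bool:
    """Rebuild from source and compare with the shipped binary.

    With deterministic compilation, equal digests mean the shipped binary
    corresponds exactly to the published source. A mismatch means the vendor
    shipped something not in the source - e.g. a targeted, malicious update.
    """
    # Run the project's deterministic build inside the source tree.
    subprocess.run(build_cmd, cwd=source_dir, check=True)
    return file_digest(source_dir / output_name) == file_digest(distributed_binary)
```

Anyone with the source can run this check independently, which is what makes
per-user malicious updates detectable in principle.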

I agree that there is no principled objection to access to communications
provided there is a warrant. However, if this requires weakening security far
beyond that point, it is no longer obvious that the trade-off is worth it.

------
75dvtwin
" _Australia passed a metadata retention law that came into effect in April
this year. It requires telecommunications companies and ISPs (i.e. carriage
service providers, or CSPs) to keep a record of things like the IPs assigned
to internet users (useful for matching against seized logs) as well as details
around phone, SMS and email use.

The PROBLEM is, people have moved towards offshore-based services that are not
required, under Australian law, to keep such metadata. Think of iMessage,
WhatsApp, Signal, Wickr and Telegram._ ...."

Wiretapping at its best. I have not done a historical analysis, but it seems
like much of the legislation reducing personal freedom starts in Australia
these days (at least among English-speaking countries), closely followed by
the UK.

~~~
fit2rule
Australia has long been the petri-dish for the new globalist elite to test
their social engineering tools. It is, after all, one of the most successful
white societies to have survived the empire.

>Now look, I’m not advocating for these laws. I’m not. What I am trying to do
is move the goalposts for this discussion.

To me the goalposts have to be moved back to the public sphere, and what we
should be doing is addressing the devolution happening in the computer world
that results in users having absolutely no clue how to use their computers
beyond opening up a single app and pouring all content into it. There is a
fine balance between usability and uselessness, and it's the user who is the
most difficult component in this line.

------
kerkeslager
The alternatives proposed aren't any better. If we are to believe that law
enforcement has Trojans that can bypass encryption, then we must believe that
they aren't reporting the security flaws which allow the installation of these
Trojans. These security flaws are not limited to law enforcement--terrorists
can use them just as effectively as law enforcement can. So what the argument
boils down to is: we shouldn't worry about law enforcement undermining
security because law enforcement has already fundamentally undermined
security? I don't buy it.

------
jasode
I think Patrick Gray's essay actually adds more confusion. His effort to
clarify the confusion by differentiating the layers between "metadata" vs
"encryption" is _technically correct_, but the extra detail ends up obscuring
what governments want: _cleartext_

Yes, I agree with Patrick that framing the arguments on "math of encryption"
is misguided. Yes, it leads to thinking that legislators are stupid, similar
to trying to pass a bill redefining the value of pi.[1]

Let's be clear: the lawmakers actually don't care if the most powerful
supercomputers at NSA/GCHQ can't crack WhatsApp's latest 2048-bit-quantum-
elliptical-encryption used in conjunction with Apple's iOS tamperproof enclave
chip.

All that math doesn't matter. It's a mistake to think that government is
stupid because "math is irreversible". That smugness makes people lose sight
of the ultimate goal: the cleartext.

If getting that cleartext means new laws that require all Android/Apple phones
include a software keyboard interceptor (the onscreen keyboard) to log
keystrokes and send the cleartext to a government server, then so be it. Such
a keystroke logger would then make the "irreversible math" a moot point.

Tldr: Don't frame the policy in terms of _"encryption"_. Instead, focus on
the _"cleartext"_.

 _> The first problem actually has very little to do with end-to-end
encryption and a lot more to do with access to messaging metadata._

This is true, but it's misleading because we know that _governments eventually
want more than just the metadata._

 _> Now it’s all very well and good for WhatsApp to argue that it doesn’t have
the technical means to do so, which is a response that has lead to all sorts
of tangles in Brazil’s courts, but the Australian law will simply say “we
don’t care. Get them.”._

This line dances close to what we really should be discussing (the cleartext).
If you find yourself grumbling _"the government wants to weaken encryption!"_,
it means you're thinking like a computer scientist. On the other hand, if you
imagine _"what if there were a new algorithm for stronger encryption, but it
didn't matter because all the cleartext was available for monitoring"_, it
means you're thinking like the government.

 _> Do we believe that law enforcement bodies should have the authority to
monitor the communications of people suspected of serious criminal offences?_

This question is another way of framing it which is closer to the aims of the
government. They want to monitor the cleartext.

[1]
[https://en.wikipedia.org/wiki/Indiana_Pi_Bill](https://en.wikipedia.org/wiki/Indiana_Pi_Bill)

~~~
resf
Anybody who cares about the privacy of their communication can still use an
open source stack. So yes it works as a tool for oppression and for catching
extremely inept criminals, but it doesn't do much else for national security.

~~~
CJefferson
When using an open source stack to do illegal encryption leads to XX years in
prison, not many people will use it.

~~~
ams6110
There's something in cryptology called "plausible deniability" that addresses
this.

~~~
PeterisP
Plausible deniability only works if any doubts are actually interpreted in
your favor (which, as history shows, isn't guaranteed in practice even in
legal systems where it should be in theory), and it's easily possible to make
laws that turn most practical options into crimes.

Yes, it's plausible that the non-approved software stack on your phone isn't
doing any illegal encryption, but that fails if having a non-approved OS on
your phone is a crime by itself.

Yes, it's plausible that the TrueCrypt volume you have doesn't contain
anything bad, but that fails if mere possession of TrueCrypt tools is a crime
by itself.

Yes, it's plausible that the encrypted traffic sent to/from your phone didn't
contain anything bad, but that fails if having any encrypted traffic not going
through state-approved MITM https is a crime by itself.

Etc ad infinitum. Don't underestimate the coercive power of gov't if they
actually want to restrict something. Technical means can protect you only if
you physically live outside of their reach.

~~~
grumio
They may as well ban all files of random data of unknown origin. If a file
passes the diehard tests with flying colors, confiscate it and arrest all
known possessors of it.
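The "diehard tests" are a real battery of statistical randomness tests. As a
rough single-statistic stand-in (not the actual battery - the function below
is just my own illustration), Shannon entropy already separates the cases:
encrypted or random data measures near 8 bits per byte, while ordinary text or
structured files score much lower.

```python
import math
from collections import Counter


def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte.

    Uniformly random (or well-encrypted) data approaches 8.0;
    plain text and most structured file formats score much lower.
    """
    if not data:
        return 0.0
    counts = Counter(data)  # frequency of each byte value
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

Which is exactly the absurdity grumio is pointing at: "looks maximally random"
is indistinguishable from "is encrypted", so a ban on one is a ban on both.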

~~~
PeterisP
And they may actually do just that eventually.

The point I'm trying to make is that in an oppressive regime the _only_ thing
that actually provides plausible deniability is having and using the exact
same hardware/software as everyone else uses; a rooted phone with an
open-source OS, unusual chat apps or cryptography tools won't give you any
plausible deniability but will actually make everything even more risky for you.

