
Open Letter From UK Security Researchers - austengary
http://bristolcrypto.blogspot.co.uk/2013/09/open-letter-from-uk-security-researchers.html
======
andrewljohnson
These guys join a chorus of academic voices calling for the same. We've seen
similar statements from Professors Rogaway and Green in the past few weeks on
HN.

I think we need a legal solution, but if the tech community refuses to
participate in corrupting the technology, it puts the spooks in a worse
position. I'd sign a pledge saying I will never work on any apparatus
(software, data centers, anything at all) that are used for mass surveillance.
That's pretty easy for me to do though... the money government throws at some
companies and people for this work is more tempting than the One Ring.

[http://www.cs.ucdavis.edu/~rogaway/politics/surveillance.pdf](http://www.cs.ucdavis.edu/~rogaway/politics/surveillance.pdf)

[http://bits.blogs.nytimes.com/2013/09/10/government-announces-steps-to-restore-confidence-on-encryption-standards/?_r=0](http://bits.blogs.nytimes.com/2013/09/10/government-announces-steps-to-restore-confidence-on-encryption-standards/?_r=0)

More like this couldn't hurt.

~~~
jacquesm
> I'd sign a pledge saying I will never work on any apparatus (software, data
> centers, anything at all) that are used for mass surveillance.

Anybody that ever worked on Hadoop could not sign such a pledge.

Anybody that ever worked on Linux could not sign such a pledge.

Anybody that ever worked on solr could never sign such a pledge.

And so on. You get the idea: it is not possible to contribute to open source
projects that have applications in large-scale data storage, data mining,
general operating-system work and so on without, as a side effect, making the
apparatus of mass surveillance possible.

The only safeguards that will really work are very strict legal ones,
transparency and accountability. As long as the last two are not present the
first is meaningless (which is the situation we are currently in).

~~~
andrewljohnson
I think you know what I mean, but I'm sure the pledge could be worded to avoid
this sort of conundrum.

If I make something that gets co-opted for use in surveillance, fine. But if,
for example, I accept a grant from the NSA or GCHQ to improve Hadoop in some
specific way, I've crossed a line.

I'm not going to say anyone who ever worked on designing a computer is
complicit, but I don't think it's unreasonable to avoid working for companies
that supply tons of computers for surveillance. Like I said, it's a tough sell
to people who have families to support, but I don't think it's crazy to quit
working for certain companies based on the revenue they get from the NSA and
related agencies.

There was a time in my life that I would have contracted for the NSA or Booz
Allen, I'm fairly sure. Given the revelations, and also given my current
financial circumstances, I would not today.

------
viraptor
I'm still surprised the Data Protection Act hasn't been mentioned in the UK in
relation to those issues. Basically it says in Schedule 1, Part 1 (The
principles), point 8:

> Personal data shall not be transferred to a country or territory outside the
> European Economic Area unless that country or territory ensures an adequate
> level of protection for the rights and freedoms of data subjects in relation
> to the processing of personal data.

Which people usually assume includes the US. Well... I guess you could argue
otherwise now. It has been demonstrated that even encrypted data is not
protected while being sent to the other side. Can we start referring to the
DPA every time some UK personal data is stored in AWS or similar hosting?

~~~
richardwhiuk
The DPA doesn't apply because there is an exemption for National Security.

~~~
viraptor
I was under the impression that it's about the national security of the UK,
not just any country (in the section about exemptions, subsection (1) lists
national security).

> (2) Subject to subsection (4), a certificate signed by a Minister of the
> Crown certifying that exemption from all or any of the provisions mentioned
> in subsection (1) is or at any time was required for the purpose there
> mentioned in respect of any personal data shall be conclusive evidence of
> that fact.

Also, such a certificate is not secret and can be challenged. IANAL, so
correct me if I got this wrong.

------
jgrahamc
Interesting to see this because at least one of these guys used to work for
GCHQ.

~~~
ihsw
There's some connection to the recent NIST/NSA row, especially since the same
trading of researchers occurs between those two.

------
devx
Can we remake the Internet by encrypting every packet with ECC? Something like
DJB's CurveCP [1] to replace TCP, but with some user-friendly improvements
(seeing proper names and links, not just strings of random characters). DJB
said the performance overhead to do that is only 15 percent [2], which seems
well worth it to me.

I think that to be really protected against the NSA in the future we'll need
something like that, along with starting to demand open source firmware from
all hardware vendors. In the meantime we can use ECDHE to encrypt all
sessions.

[1] [http://curvecp.org/](http://curvecp.org/)

[2]
[http://www.youtube.com/watch?v=K8EGA834Nok&feature=youtu.be&...](http://www.youtube.com/watch?v=K8EGA834Nok&feature=youtu.be&t=40m31s)
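The key-agreement idea behind ECDHE can be sketched with classic finite-field Diffie-Hellman. This is a toy illustration only: the prime, generator, and all parameters below are demonstration values, not Curve25519 or anything production-grade, and real sessions would use a vetted library.

```python
# Toy finite-field Diffie-Hellman key agreement, illustrating the idea
# behind ephemeral key exchange (the "E" in ECDHE). The prime and
# generator here are insecure demonstration values, NOT real parameters.
import secrets

P = 0xFFFFFFFB  # largest prime below 2**32; far too small for real use
G = 5           # generator for the toy group

def keypair():
    """Generate an ephemeral private/public pair: pub = G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_secret(my_priv, their_pub):
    """Both sides compute the same value: their_pub^my_priv mod P."""
    return pow(their_pub, my_priv, P)

# Each side generates fresh keys per session, so a long-term key
# compromised later reveals nothing about past traffic (forward secrecy).
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_secret(a_priv, b_pub) == shared_secret(b_priv, a_pub)
```

The per-session ("ephemeral") keys are what give forward secrecy; ECDHE applies the same exchange over an elliptic-curve group instead of a prime field.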

~~~
daxelrod
Be careful with what ECC algorithm you use.

Bruce Schneier recommends against ECC in [1]:

> Prefer conventional discrete-log-based systems over elliptic-curve systems;
> the latter have constants that the NSA influences when they can.

and clarifies in [2]:

> I no longer trust the constants. I believe the NSA has manipulated them
> through their relationships with industry.

[1]: [http://www.theguardian.com/world/2013/sep/05/nsa-how-to-remain-secure-surveillance](http://www.theguardian.com/world/2013/sep/05/nsa-how-to-remain-secure-surveillance)

[2]: [https://www.schneier.com/blog/archives/2013/09/the_nsa_is_brea.html#c1675929](https://www.schneier.com/blog/archives/2013/09/the_nsa_is_brea.html#c1675929)

~~~
tptacek
First, Schneier has a weird track record with ECC. I think he may be alone
among "well-known" cryptographers in his distrust for ECC, which goes back
over a decade.

Second, the CurveCP "constants" aren't NIST derived; they're Bernstein's
Curve25519, which is derived transparently from first-principles math.

Third, there are standardized NIST curves (over binary fields) that are _also_
derived transparently from first-principles.

Fourth, the curves that _aren't_ totally transparent are still derived from a
SHA hash of a random string, per the method in IEEE 1363 (which in 1363's
context makes perfect sense, since you can't really generate "fresh" curves
for applications from first principles without everyone ending up with the
same curves). The backdoor scenarios here are... convoluted.

The more you learn about the situation with NIST ECC, the less likely an overt
backdoor seems. Maybe academia has missed something big, and all of ECC is
broken; if that's the case, I think you should kiss conventional IFP and DLP
crypto goodbye too.
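The "derived from a SHA hash of a random string" idea can be sketched in a few lines. This is a conceptual illustration only, not the actual IEEE 1363 / X9.62 procedure: the seed, modulus, and derivation below are hypothetical, but they show why such a constant is auditable: anyone can rerun the hash and compare.

```python
# Conceptual sketch of "verifiably random" parameter derivation: a curve
# constant is taken from a hash of a published seed, so anyone can recheck
# that the designer did not hand-pick it. This is NOT the real IEEE 1363 /
# X9.62 algorithm; the seed and modulus here are hypothetical examples.
import hashlib

P = 2**255 - 19               # a well-known prime, used here only as a modulus
SEED = b"example-public-seed"  # hypothetical published seed

def derive_constant(seed: bytes, p: int) -> int:
    """Map a public seed to a field element via SHA-256."""
    digest = hashlib.sha256(seed).digest()
    return int.from_bytes(digest, "big") % p

# Anyone who distrusts the constant reruns the derivation and compares.
b = derive_constant(SEED, P)
assert b == derive_constant(SEED, P)  # deterministic, hence auditable
```

The residual worry tptacek alludes to is that the *seed itself* is unexplained, so a backdoor would require the designer to have searched seeds for a weak curve, which is the convoluted scenario.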

------
3327
The sad truth is that they will never fully and truthfully come clean, due to
"National Security". At this point the NSA views this as an information leak
and is trying to plug it; they are in damage-control mode. Unfortunately, now
that the word is out, "potential enemies" will be looking for these holes in
places they did not look before, just because of the hints that they might
exist, and probably do.

------
jb17
Really? Only the professors deserve their own line?

~~~
arethuza
I agree that looks rather silly, but it's also worth noting that "Professor"
in the UK means something rather different from what it does in the US. In a
UK academic department, only the top researchers have the status of Professor;
other full-time teaching and research staff are Lecturers, Senior Lecturers or
Readers.

