
Two months after FBI debacle, Tor Project still can’t get an answer from CMU - pavornyoh
http://arstechnica.com/security/2016/01/going-forward-the-tor-project-wants-to-be-less-reliant-on-us-govt-funding/
======
rtpg
>... a few weeks earlier had canceled a security conference presentation on a
low-cost way to deanonymize Tor users. The Tor officials went on to warn that
an intelligence agency from a global adversary also might have been able to
capitalize on the vulnerability.

This is kind of worrying. I hope the Tor Project has information on the attack
and is looking into ways to mitigate it. But if it's due to the nature of the
protocol, then maybe it's time to look for a successor (we aren't using WEP
anymore, right?)

As to the CMU stuff... Tokyo University has a pledge ensuring that basically
no military research is done on campus, which I find pretty laudable.

I wonder if there's a similarly worded pledge for this sort of thing. But at
the same time, universities can do a lot of good security research that can,
in the end, strengthen the systems we use.

The "$1 million to target these specific people" sounds dirty, but "$1 million
to do research on the vulnerabilities of Tor"... well that sounds like
research to me. Pretty tricky.

~~~
lwf
> Tokyo University has this pledge to make sure basically no military research
> is done on campus, which I feel to be pretty laudable.

So, you move it off-campus. See e.g. the MIT Lincoln Lab,
[https://www.ll.mit.edu/](https://www.ll.mit.edu/)

~~~
Thriptic
I'm at MIT proper and a good portion of our team's medical device work is DOD
funded. While we are primarily designing devices to be used in civilian
hospitals, our diagnostic devices could also potentially be used to optimize
battlefield care for soldiers, which I personally think is great.

I think a wholesale ban on military research is pretty silly; the ethical
implications of projects should be considered on a case by case basis by the
university.

~~~
ethbro
_> the ethical implications of projects should be considered on a case by case
basis by the university_

How did that work during Vietnam, when the DoD dangled bags of money in front
of universities?

~~~
Thriptic
I can't speak to that as I wasn't alive then and have not researched the
topic. Care to elaborate?

~~~
ethbro
As someone who wasn't alive then either, it's a rather well documented period
of history, albeit mostly in dead tree form. Karnow is probably the classic
([http://www.amazon.com/gp/aw/d/0140265473/](http://www.amazon.com/gp/aw/d/0140265473/)).

To use a more modern analogy that exists on the internet, the 2010 US military
research / development / testing budget looks like it was around USD$80b.

Or in other terms, roughly equal to the total of all research spending by all
other branches of the US government.
([http://www.aaas.org/sites/default/files/RDGDP_1.jpg](http://www.aaas.org/sites/default/files/RDGDP_1.jpg))

Now you're a university professor / dean / president. Times are hard (they
always are, you're in academia). There's a huge pie sitting right next to the
one you've been fighting over, and all you have to do is work on certain
technologies that may or may not have lethal consequences.

I wouldn't take the bet on many people saying "No thanks, I'll be happy giving
up grant money for moral reasons."

------
sandworm101
The intelligence community used to value Tor. Remember where it came from. Now
they don't, presumably because the primary intelligence target has shifted
from fixed actors like nation states and large businesses to the general
public. Now those nation states and businesses are 'intelligence partners' in
the fight against the 'lone wolves' hiding within the masses. Perhaps then it
is in Tor's interests to restart some rivalry between nation states.

~~~
rdtsc
NSA is schizophrenic in that regard. Remember that one of the things it does,
besides looking in everyone's underwear drawers, is advise the US govt
(3-letter agencies, military) on what crypto to use. In other words, it tells
Uncle Sam how to lock his underwear drawer so other agencies don't peek in
there.

It is always interesting to see what they say there. If they know, for
example, that one crypto technique or implementation is vulnerable, will they
still recommend it for storing TS classified material? Will they recommend it
for the US military or diplomatic service? If they don't, it might leave that
open to attack, and they are not doing their job. If they do say "don't use
this combination of AES, prime numbers, or OpenSSL implementations", that also
gives something away.

I wonder if the people who make these recommendations even talk to the people
who discover, exploit, and actively penetrate systems. Because everything is
very compartmentalized, they actually might not be able to.

That is probably why they are very interested (as we have seen) in subverting
or weakening algorithms and implementations so that they are the only ones who
hold a key (Dual_EC_DRBG), or the only ones with the computational capacity to
exploit them (DES).

~~~
AlyssaRowan
NSA themselves have used Dual_EC_DRBG (which can be distinguished from a PRF
even if you don't have the 'backdoor key': it's not just backdoored and slow,
it's _bad_ - and they know that). GCHQ behaves even worse and is at this
point almost entirely out of control.

In either case, I feel the information assurance and signals intelligence arms
should never have been the same agency: the roles are entirely at odds with
each other. They do not seem to have even their own governments' equities
properly balanced, nor have their recommendations always been given in good
faith. So be cautious drawing any conclusions from their advice.

Unfortunately, that is not the sort of 'reform' that either government is
interested in, particularly my own. It's quite depressing, really.

~~~
rdtsc
> NSA themselves have used Dual_EC_DRBG

That actually makes sense because of the way it was backdoored. What they did
there is the gold standard of subverting a crypto algorithm: go through a
standards body, backdoor it using a public/private key pair, hold the private
key, and encourage others to use the system as much as possible (which
includes showing the world that they themselves use it).

The NSA has had dreams of key escrow forever. Since the 90s that dream has
drifted further and further from reality, but they never completely gave it
up. Dual_EC_DRBG was effectively becoming the key escrow they wanted for every
system that used it: they kept the private key and thus had high assurance
that nobody else could use their backdoor.

Whoever was in charge of that operation was probably patting themselves on the
back every morning after waking up.
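
The trick described above can be sketched with a toy analogue. This is NOT the
real Dual_EC_DRBG (which works over NIST elliptic-curve points P and Q with a
secret d such that P = d*Q, and truncates its output); here modular
exponentiation stands in for the curve arithmetic, purely to show how a
designer who knows the secret relationship between the two public constants
can recover the generator's internal state from a single output:

```python
import secrets

# Toy analogue of the Dual_EC_DRBG backdoor. All parameters are made up:
# the group is integers mod a prime, and g = h^d plays the role of P = d*Q.
p = 2**127 - 1                      # prime modulus for the toy group
d = secrets.randbelow(p - 2) + 2    # the designer's secret backdoor exponent
h = 5                               # public constant (the "Q" analogue)
g = pow(h, d, p)                    # public constant (the "P" analogue)

def drbg_step(state):
    """One step of the toy DRBG: returns (output, next_state)."""
    r = pow(g, state, p)
    output = pow(h, r, p)           # the bits handed to the user
    next_state = pow(g, r, p)       # kept secret inside the generator
    return output, next_state

s0 = secrets.randbelow(p)           # honest user's seed
out1, s1 = drbg_step(s0)
out2, _ = drbg_step(s1)

# The designer sees only out1 = h^r, but knows d, so:
#   out1^d = h^(r*d) = (h^d)^r = g^r = the next internal state.
recovered = pow(out1, d, p)
assert recovered == s1

# Knowing the state, the designer can predict every future output.
predicted, _ = drbg_step(recovered)
assert predicted == out2
```

In the real algorithm the output is truncated, so the attacker must also
brute-force a modest number of candidate points per output, but the principle
is the same: one observed output plus the secret constant yields the next
state, and therefore everything the generator will ever emit.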

------
hackuser
I worry about Tor's security:

1) For security, most systems rely on their obscurity and on the fact that the
assets they protect probably aren't worth much investment by the attackers.
Tor can't rely on either of those circumstances: It's prominent and breaking
into it is a one-stop solution to attacking many valuable targets.

2) Many organizations with large amounts of resources, from state intelligence
agencies to law enforcement to security vendors to ISPs, would like to find
inexpensive ways to break Tor's security.

3) True security is very difficult and expensive. For Tor, this is taken to an
extreme by #2. Does the Tor Project have the resources to implement bug-free
software (e.g., the kind that flies passenger planes)? Certainly not. Can
they find and fix bugs as quickly as the attackers described above find and
exploit them? Certainly not. I'm not criticizing them; they just don't have
the resources.

4) Even assuming the underlying concept of onion routing is secure, there are
still plenty of targets for attack: the implementation and all the other code
Tor relies on (e.g., almost all of Firefox for the Tor Browser, encryption
algorithms, your OS, etc.). Attacking a Tor user doesn't seem impossible.

Based only on the theorizing above, and not knowing Tor's actual
implementation, I fear we're lucky if Tor is still expensive to attack. Of
course, any smart attacker with an exploit will publicly complain about how
hard Tor is to hack.

~~~
baby
If you look at Tor's design, it's pretty clear that it cannot be considered
secure.

Each time you use Tor, your packets go through a path of three different
servers (relays). If the attacker owns both ends of the path, it's game over.
How many relays are out there? How many are owned by the NSA or other
governments?

It's pretty obvious that this system just cannot work if a meaningful share of
relays is owned by the attacker.
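
The two-ends risk described above can be quantified with a back-of-the-envelope
simulation. This is a deliberately simplified sketch: it assumes relays are
chosen uniformly at random (real Tor weights selection by bandwidth and pins a
long-lived guard) and that owning both the entry and exit relay suffices to
deanonymize a circuit, so treat the numbers as intuition only:

```python
import random

def compromise_rate(total_relays, hostile_relays, circuits=100_000, seed=42):
    """Fraction of random 3-hop circuits whose guard AND exit are hostile."""
    rng = random.Random(seed)
    hostile = set(range(hostile_relays))   # label the first k relays hostile
    compromised = 0
    for _ in range(circuits):
        guard, _middle, exit_ = rng.sample(range(total_relays), 3)
        if guard in hostile and exit_ in hostile:
            compromised += 1
    return compromised / circuits

# With 10% of ~7000 relays hostile, roughly f^2 ~= 1% of circuits
# end up with a hostile relay at both ends.
print(compromise_rate(7000, 700))
```

With a fraction f of relays hostile, roughly f^2 of circuits have both ends
owned; bandwidth weighting and guard pinning change the numbers, but not the
basic shape of the risk.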

------
snsr
This continues to reflect very poorly on CMU and CERT.

~~~
greggarious
Yes, but if they're under some Kafkaesque gag order, there's not much they can
do, right?

~~~
zymhan
I still find it hard to ever trust an institution that wouldn't raise a huge
stink about the ethical implications of this. They don't exist to serve
"national security interests", that's what the NSA is for.

~~~
munin
> They don't exist to serve "national security interests"

Yes they do. From their website:

"The Software Engineering Institute (SEI) is a not-for-profit Federally Funded
Research and Development Center (FFRDC) at Carnegie Mellon University,
specifically established by the U.S. Department of Defense (DoD) to focus on
software and cybersecurity."

~~~
p4wnc6
I interpreted the parent comment as saying that _CMU_ doesn't exist to serve
national security interests, whether or not there is an entity, like the FFRDC
SEI, that does exist for related reasons.

On one hand, leading academic institutions are commonly understood to have a
responsibility to preserve free speech (especially speech that is critical of
military or government action), remain as a neutral education and research
body decoupled from any specific political or military agendas, and help lead
in social progress towards greater overall ethical standards in education,
research, and scholarship.

On the other hand, many universities loan out the credential and status of
affiliation with them as a recruitment tactic, helping the DoD build a
diversified set of military research organizations. From a superficial vantage
point (the one taken by many of the younger engineers duped into working for
below-market pay at such places), these _look like_ run-of-the-mill
software/science/engineering jobs while carrying all sorts of ethical gray
areas. The end result is rampant conflicts of interest, questionable
management practices, and many other problems.

I don't think it's as simple as just pointing out that SEI is an FFRDC and
moving on. The fact that universities in general continue to perpetuate this
problem -- academia-military pseudo-credible research facility affiliation and
status-mongering -- _that_ is the bigger issue.

------
enginn
From what I've gathered, Tor is pretty robust at least on paper, and when
explained in an academic way it almost convinces me that the apparatus does
what it's supposed to, except for the part where it catastrophically fails
when put into practice. Failure modes include:

1.) Custom Firefox 'Browser Bundles' that don't auto-update, ensuring latent
vulnerabilities are left unaddressed

2.) Trusted 'third parties' running exit nodes, who we hope and pray are doing
their job correctly

3.) Weird, conspicuous-looking domains on the wire that do nothing more than
alert the neighborhood that somebody's using Tor (unless everyone's using it,
you stand out like a sore thumb)

4.) Sybil attacks in the form of people-with-more-money-than-you polluting the
network

5.) ???

6.) Any number of other issues (many since patched) that still work if the Tor
user is uneducated about how Tor works (traffic analysis / correlation attacks
/ zero-knowledge-proof attacks, etc.)

------
hiq
> Personally, I use it maybe 10, 20 percent of the time. I know that there are
> people out there that are using it a lot of the time. But for me as much as
> I might hate Flash, there are times that I need to watch something on
> YouTube.

YouTube has been working for me using Tor Browser for months, if not years.

~~~
codingdave
That was an analogy, not a bug report.

~~~
ChristianBundy
That's an example, not an analogy. It just so happens that the example is
completely false.

------
vaadu
In what way is this an FBI debacle? The F-35 is a debacle, as is the TSA. But
the program the FBI was running to deanonymize Tor users? Not even close.

------
edgarallanbro
It's two months after the FBI debacle and people still don't know the
difference between CMU and CERT.

~~~
p4wnc6
I don't think universities should get a free pass on whatever their affiliated
FFRDCs might do. If the university wants to be disassociated from an unethical
action, do so by severing the tie between the university and the FFRDC, and
stop lending credibility and credential to the FFRDC via the university's
reputation. Otherwise, accept the fair guilt by association that will follow.

------
archimedespi
I'm waiting for someone to build an implementation of Tor in a proof-
verifiable language.

That would be pretty cool, since anyone could prove source correctness
automatically.

~~~
dguido
That would help with things like the memory safety of the daemons you run, but
that hasn't been the problem when Tor has failed its users.

Tor has failed its users because the idea of running a public Tor cloud with
volunteer entry, onion, and exit nodes is ludicrous. It means that the entire
network is under surveillance all the time, the exact opposite of what you
want. There has been widespread confirmation that the data you transfer via
the public Tor cloud is being passively surveilled at the endpoints and
actively modified when you, for example, download software. This makes it
incredibly dangerous to use, likely more dangerous than just using the regular
internet.

There are many other problems (like the fact that .onion sites are a dirty
hack and likely have many undiscovered weaknesses like the ones CMU found) but
nearly all of them are either deployment or architectural issues, not code
security issues.

~~~
archimedespi
Yes, I agree. When I wrote the parent comment I was thinking more about
implementation detail correctness: memory safety, protocol implementation
correctness, etc.

Like you said, Tor has architectural issues. Tor would be fine if it were low-
profile, but it's not, and that's a major part of why the architecture is
breaking down - it doesn't scale well with increasing users/publicity/nation-
state-interest.

