
Why the Tor attack matters - pmh
http://blog.cryptographyengineering.com/2015/11/why-tor-attack-matters.html
======
wsxcde
I don't really buy the comparison that what CERT did is similar to a
university-sponsored DDoS. I think a better parallel is the Dan Egerstad case.
He ran Tor exit nodes and analyzed all the plaintext traffic leaving them. He
ended up collecting a ton of sensitive usernames and passwords. He
tried to contact some of these people by e-mail but they ignored him. So he
posted a bunch of these passwords on his blog. He was promptly arrested (and
eventually released). At that time the security community was outraged that an
obviously well-intentioned researcher was being harassed by the police for
doing his job. The response is a lot different now for reasons I don't really
understand.

I do wish both sides would acknowledge this is a tricky issue. On the one
hand, if I run a Tor exit node or relay, it is my node and it seems like I'm
allowed to do with it as I please. At the same time, it also seems obviously
unethical (maybe illegal?) to be harvesting passwords off an exit node or to
dole out vigilante justice to Tor users I don't like.

One other thing to keep in mind here is that SEI is a DoD funded center. It
may be nominally affiliated with CMU, but all their money comes either from
the DoD or external grants awarded to the researchers at SEI. So CMU the
private research university and SEI the DoD-funded research center have very
different obligations to the public. It's important not to conflate the two.

The big question is this: what are our responsibilities as security
researchers, especially when we're working on "live" software systems? Green
seems to be suggesting some form of a review board which pre-approves
experiments on live targets. Maybe this is what we need, but be careful what
you wish for: the bad guys don't have review boards.

~~~
ohmygodel
> I don't really buy the comparison that what CERT did is similar to a
> university-sponsored DDoS. I think a better parallel is the Dan Egerstad
> case.

Here's why it's worse: they inserted a plaintext encoding into the response
from the onion-address lookup relay, and so anybody observing the user (e.g.
the ISP) could detect what onion address the user was connecting to. This
applies after the fact to recorded traffic as well. Thus the researchers had
no control over who got deanonymized, to whom they were deanonymized, and when
they were deanonymized.
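To make that concrete, here's a toy sketch in Python (the cell names are hypothetical stand-ins, not Tor's actual wire protocol): once the signal is encoded in plaintext on the wire, decoding it needs no key, so anyone on the path can recover it, live or from recorded traffic.

```python
# Toy model of an in-band plaintext tag (hypothetical cell names, not Tor's
# actual protocol). The point: decoding needs no key, so ANY on-path observer
# can recover the tag, live or from a packet capture made after the fact.

def encode_tag(onion_addr: str) -> list[str]:
    """Attacker's relay encodes each bit of the address as a choice
    between two cell types visible in plaintext on the wire."""
    bits = "".join(f"{byte:08b}" for byte in onion_addr.encode())
    return ["RELAY_EARLY" if b == "1" else "RELAY" for b in bits]

def decode_tag(cells: list[str]) -> str:
    """A passive observer (an ISP, or anyone replaying a capture)
    runs the same decoding without any cooperation from the attacker."""
    bits = "".join("1" if c == "RELAY_EARLY" else "0" for c in cells)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode()

cells = encode_tag("exampleaddress.onion")  # what crosses the wire
print(decode_tag(cells))  # the observer recovers "exampleaddress.onion"
```

That's the sense in which the attackers had no control over who did the deanonymizing: the tag works for whoever happens to be watching.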

> I do wish both sides would acknowledge this is a tricky issue. On the one
> hand, if I run a Tor exit node or relay, it is my node and it seems like I'm
> allowed to do with it as I please.

You actually are not allowed to do with your relay as you please. At least in
the US, the legal theory protecting relay operators (i.e. safe harbor) also
makes it illegal to observe user traffic content except in certain cases (e.g.
to improve network performance).

> One other thing to keep in mind here is that SEI is a DoD funded center.

This doesn't seem very relevant. All researchers have an obligation to
consider and mitigate possible harms that occur during their research (source:
I work in a military research laboratory). These researchers clearly did not
fulfill that obligation, and I'm sure their institution is reviewing or has
reviewed their procedures to make sure it doesn't happen again.

~~~
wsxcde
Let me try to understand your position a little better.

Are you saying the problem here is simply that the effects of the attack were
observable by others? If this were not the case, you'd have been fine with it?

And since you seem to be arguing that researchers shouldn't examine user
traffic, do you also think that what Egerstad did was wrong? Do you agree
with his arrest?

And one more thing sort of related to this. What's your opinion on research
like Arvind's Netflix deanonymization attack? Do you think the work that
research involved was also unethical?

> All researchers have an obligation to consider and mitigate possible harms
> that occur during their research

This is nice idealism and I'm totally in support of it. But I can't help
thinking this is pie in the sky, especially when organizations like the DoD
are involved.

------
eropple
_> But there's also a view that computer security research can't really hurt
people, so there's no real reason for any sort of ethical oversight machinery
in the first place._

Worse: there's a view that people who get owned "deserved it." Our industry,
and its academic attachments, have a really strange vindictive streak towards
those whom it should be looking out for. (Which is not to say that those people
should be looking out for people swapping child porn--but what about the
thousands and thousands of people who were _not_?)

------
guelo
It would have been more ethical if the university had not blocked the
"researchers" from disclosing the vulnerability at Black Hat. (Though even
then they were not following responsible disclosure practices). The fact that
Tor had to guess what the vulnerability was and the "researchers" still have
not released their paper is unethical and probably illegal.

~~~
carlosdp
I'll grant you it might be unethical, but I don't see how it could possibly
be illegal.

~~~
mehrdada
CFAA

------
Absentinsomniac
Seems like more research needs to go into preventing traffic confirmation
attacks: [https://blog.torproject.org/blog/tor-security-advisory-
relay...](https://blog.torproject.org/blog/tor-security-advisory-relay-early-
traffic-confirmation-attack/)

"A traffic confirmation attack is possible when the attacker controls or
observes the relays on both ends of a Tor circuit and then compares traffic
timing, volume, or other characteristics to conclude that the two relays are
indeed on the same circuit. If the first relay in the circuit (called the
"entry guard") knows the IP address of the user, and the last relay in the
circuit knows the resource or destination she is accessing, then together they
can deanonymize her."

Interesting technical problem. They patched it, obviously, but similar
attacks are still possible; the post itself said that more research needed to
be done. The specific method they used to send and receive signals from one
side to the other doesn't work anymore, but statistical methods presumably
still do. Sort of like this:

[https://mice.cs.columbia.edu/getTechreport.php?techreportID=...](https://mice.cs.columbia.edu/getTechreport.php?techreportID=556&format=pdf)

Seems like a very difficult problem to solve.

------
dogma1138
What is so surprising here? The DoD is the largest funder of research grants
in the US. Pretty much every university is doing research for a US agency,
from cyber security to lasers for missile defense. I find it very hard to
believe that this is the first time a university has conducted computer
security research on live targets.

~~~
zaroth
Whether or not it's _surprising_ is perhaps the least interesting point for
discussion. Universities have a responsibility to conduct human research
ethically and I hope we hear a lot more about how this research in particular
was conducted. This could have endangered lives depending on how it was done,
and I'm quite sure the ends don't justify the means unless it was specifically
done in a way which protected the anonymity of untargeted users.

~~~
Umn44
>Universities have a responsibility to conduct human research ethically

which means little given the laws of nature; all that matters is what people
end up doing, and measuring that statistically. If, statistically speaking,
most people aren't ethical, then that's what we'll get. This whole idea that
people are in control of their actions, or have any freedom whatsoever given
what we know about the laws of nature, has to go.

~~~
throwaway2048
If people lack free will, then the people judging/punishing them also lack
free will, and the entire premise of your argument is an absurdity.

~~~
borkabrak
If free will didn't exist, it would be necessary to create it.

------
mirimir
From their website, I get that CMU/SEI/CERT works with both DHS and DoD.[0]
Although I don't see anything specific about the FBI, it's not too much of a
stretch. As DHS has grown and evolved since 9/11, distinctions between police
and military have weakened. A decade ago, CERT would have been carefully
shielded through parallel construction.

In my opinion, this is a wakeup call for the Tor Project. The attack would
have been obvious if they'd been tracking the requisite circuit parameters.
Ironically enough, it strikes me that the Tor network needs something like
CERT for detecting attacks.

[0] [https://www.cert.org/about/](https://www.cert.org/about/)

------
LukaAl
The article raises the issue for computer security, but computer science is
used in many other fields where it could have ethical implications. Self-
driving cars are top of mind, but surely other applications have issues too.
So I agree with his point, and it should be extended.

------
DickingAround
I think we have to assume that if a government can hack it, they will try.
Perhaps it's sad that a university will help them, but it also has to be
assumed that they're going to be trying it in some way.

~~~
nullc
> I think we have to assume that if a government can hack it, they will try.
> Perhaps it's sad that a university will help them, but it also has to be
> assumed that they're going to be trying it in some way.

Sure. And we can also-- for the purpose of thinking about risks-- assume that
if a government can torture people, they will.

This doesn't make it right, and it doesn't mean that people should sit idly
by. Nor does the fact that people oppose and discourage such actions mean that
systems can be left vulnerable to these attacks.

Opposing unethical and abusive behavior is not mutually exclusive with
building systems which are robust even against unethical attackers. Human
wellbeing is maximized when we do _both_.

------
zatkin
I'm willing to bet that the NSA has started hooking into the Tor network and
adding its own nodes, which monitor the traffic. Unless it's somehow not
possible to snoop on the data.

~~~
SturgeonsLaw
William Binney claimed in a recent reddit AMA that the NSA is monitoring
packet routes throughout the tor network in a program called Treasuremap.

[https://www.reddit.com/r/IAmA/comments/3sf8xx/im_bill_binney...](https://www.reddit.com/r/IAmA/comments/3sf8xx/im_bill_binney_former_nsa_tech_director_worked/cwwr7y9)

------
AMEDICALRe
The response by Patio11 regarding how this was acceptable penetration testing
was beyond stupid.

Just because you are a university researcher does not mean you can take money
and then attack some random company and say LOL JK, just doing "Research".
Universities have enormous computing power and resources available via
various means to do research. Just because I have access to a thousand-node
cluster does not mean I can randomly launch a DDoS attack against some
company and then claim "Research". This is equivalent to those YouTube videos
where at the end they justify assault and other egregious behaviour by
claiming "Social experiment" or "Prank".

~~~
hawkice
This is perhaps the most unnecessarily rude comment to be at the top of a
hacker news thread in some time. Let's all remember that disagreeing with
someone doesn't mean being glib or mean.

------
revelation
I still remember the researchers working with Facebook on some social
science project, and the discussion about that guy tweeting about airplane
security, so the response from HN on this case baffles me somewhat.

------
PhantomGremlin
None of this should be much of a surprise.

There has always been the possibility of bad actors being involved with Tor.
In addition, the Tor software is complicated enough that there undoubtedly
_will_ be bugs in it.

This is "you bet your life" serious. Both the architecture and the
implementation of the software must be _perfect_ for the anonymity to hold.
It's pretty easy for one bug to mean "game over".

People using Tor just don't have a chance when it comes to dealing with the
NSA, FSB, GCHQ or any similar state actors. Even allowing for inevitable
government bureaucracy and incompetence, the disparity in resources can just
be staggering. A big agency can easily, easily afford to devote 100 full time
people to one high value target. Those are not odds I'd like to bet against.

In the bigger picture, the NSA doesn't give a rat's ass about either Silk Road
or about child pornography (at least I hope they don't). Which is why an
"academic institution" was enlisted to help out the FBI with this.

But if I were a dissident or protester in Turkey, Syria, Russia, or any of a
large number of authoritarian countries, I certainly wouldn't use Tor. Not if
my life and the lives of my family were at risk.

