
The Guardian walks back claims of direct NSA access to servers of tech companies - skwirl
http://www.mediaite.com/online/fulsome-prism-blues-the-guardian-offers-2nd-worst-clarification-ever-on-nsa-story/
======
tptacek
A lie gets halfway around the world before the truth gets its pants on. This
story made it 1/3rd up the front page; stories based on the most extreme
possible interpretation of what seems to be Greenwald's own extreme
interpretation of a slide deck covered the entire front page for days.
Remember when "direct access" meant Palantir was a third party doing the
actual collection? Just look at their name! Remember when all of Facebook and
Google and Yahoo's denials looked "suspiciously similar"?

Someone on Twitter (sorry) said that Google and Facebook "looked like angels"
compared to Verizon. That sounds about right to me, too. But what incentive do
they have to do that when their reward is conspiracy-theoretic nonsense about
how NSA has their TLS keys and 3rd party contractors are used to keep them
from "lying" when they say NSA has no direct access?

~~~
discostrings
>A lie gets halfway around the world before the truth gets its pants on. This
story made it 1/3rd up the front page; stories based on the most extreme
possible interpretation of what seems to be Greenwald's own extreme
interpretation of a slide deck covered the entire front page for days.

There's definitely some extreme speculation going on--both from those trying
to maximize and from those trying to minimize this issue. It will take some
time to get to the truth. This article is a perfect example of an extreme
attempt to minimize this issue.

This article seeks to minimize the Guardian's original story by saying the
Guardian is "walking back" their claims. That doesn't seem to be the case.
This article cites a paragraph near the end of a minor story published days
later as the passage where they "walk back" their original story. The intent
of the paragraph seems to be to illustrate that both the "direct access" claim
from the NSA and the "no direct access" claims from tech companies can both be
true. The original article doesn't seem to be changed.

Beyond that, the Guardian never claimed the NSA had "direct access". They
claimed that the NSA slides stated the NSA had direct access. The Guardian has
not stated they read too much into "direct access" in the slides, and the
original article is pretty clear that "direct access" is simply the NSA claim
in the slides, not the Guardian's verdict.

There is another remaining issue: the original article claims access to "live
communications", which has yet to be supported by a slide, but it would pretty
much rule out the SFTP-only possibility that some people seem to be accepting
as fact at this point. Maybe there is direct access to live information from
Skype and Apple, but Google insisted on SFTP? We still have a lot to find out.

It could be that the Guardian did exaggerate. But it is far too early to
conclude that with so many questions remaining. Not all the companies have
described their systems.

One thing is certain: the Guardian does not seem to be walking back their
claim.

~~~
tptacek
How is running a story that defines "direct access" as a "dropbox system"
where "legally requested data could be copied from their own server out to an
NSA-owned system" possibly not a walk-back, given Greenwald's original story?

~~~
discostrings
The original story is a report about the "direct access" claim from the NSA.
People seem to have interpreted the article as claiming as fact that the NSA
had a root password to all the companies' servers, but that's not what the
Guardian reported. They simply reported an NSA claim.

This separate article covers the story that the original article broke. This
paragraph in the article gives an attempt to reconcile the competing claims
from the NSA and the companies. What makes you say this attempt should
invalidate the original story?

~~~
leoc
The Guardian's original story [http://www.guardian.co.uk/world/2013/jun/06/us-
tech-giants-n...](http://www.guardian.co.uk/world/2013/jun/06/us-tech-giants-
nsa-data) really does seem to be specifically claiming NSA root access or the
equivalent:

> When the FAA was first enacted, defenders of the statute argued that a
> significant check on abuse would be the NSA's inability to obtain electronic
> communications without the consent of the telecom and internet companies
> that control the data. But the Prism program renders that consent
> unnecessary, as it allows the agency to directly and unilaterally seize the
> communications off the companies' servers.

As soon as people suggested that "collection directly from the servers"
actually meant a FISA workflow-automation system involving an API and maybe
dropbox servers, Glenn Greenwald indignantly denied, or maybe didn't
understand, the possibility that the companies' statements could actually be
compatible with the PRISM document
[https://twitter.com/ggreenwald/status/343421926057861121](https://twitter.com/ggreenwald/status/343421926057861121)
[https://twitter.com/ggreenwald/status/343422182589870081](https://twitter.com/ggreenwald/status/343422182589870081)
[https://twitter.com/ggreenwald/status/343423399609131008](https://twitter.com/ggreenwald/status/343423399609131008)
[https://twitter.com/ggreenwald/status/343423727066824705](https://twitter.com/ggreenwald/status/343423727066824705)
. Meanwhile both the Washington Post and the Guardian started backing down
from the NSA-has-root idea. The paragraph WaPo added to its original story

> It is possible that the conflict between the PRISM slides and the company
> spokesmen is the result of imprecision on the part of the NSA author. In
> another classified report obtained by The Post, the arrangement is described
> as allowing “collection managers [to send] content tasking instructions
> directly to equipment installed at company-controlled locations,” rather
> than directly to company servers.

plus the later story it printed [http://www.washingtonpost.com/world/national-
security/us-com...](http://www.washingtonpost.com/world/national-security/us-
company-officials-internet-surveillance-does-not-indiscriminately-mine-
data/2013/06/08/5b3bb234-d07d-11e2-9f1a-1a7cdee20287_story.html) both help to
make clear that the Post did intend its original PRISM story to be understood
as NSA-has-root.

~~~
discostrings
I agree that whether or not the NSA can pull records from these companies
without the companies meaningfully reviewing each request is a very important
detail.

"But the Prism program renders that consent unnecessary, as it allows the
agency to directly and unilaterally seize the communications off the
companies' servers" is a strong statement. I agree that it is probably not
compatible with the details Google has divulged about its "SFTP and manually
by human only" process. But that is only one of the many companies.

I understand the "slides or GTFO" attitude that I'm seeing in these claims
that the original story is inaccurate, but I think it's a bit arrogant and
premature. A journalist who has seen the entire slide deck continues to tell
us that the nature of what the whole presentation reveals is more invasive
than a digital lockbox with workflow management software where humans
meaningfully verify, evaluate, and approve requests. He could have
misinterpreted the slides, but I doubt he would stick to the report so
steadfastly once all these objections arose if he were not pretty confident he
understood the claims in the Prism presentation.

We shouldn't accept that the NSA can grab a user profile without explicit,
individual legal approval from the company as fact yet--there's a lot more we
will hopefully learn. And how true this is could vary from company to company.
But it's silly to ignore that a credible voice who has seen the presentation
is telling us something.

~~~
leoc
I would be slow to assume that anything is known for certain in this kind of
"spook biz". I also don't assume that everything interesting has been released
on the slides already. (For example, there's interesting reporting in the
first Guardian story about what happened to FISA 702 request rates since PRISM
was introduced which includes quotations from the slides, but hasn't got much
attention seemingly because the relevant slide or slides have not been
reproduced yet.) However, there are a couple of things that make me fairly
confident Greenwald is (or was? I'm not sure if he is still standing by his
claim) wrong about this.

One is that AFAIK the Guardian and the WaPo both have access to all the same
materials Greenwald has, and they have both been backing away from the NSA-
has-root claim for some time. But an even bigger factor is how Greenwald
defended his claim. If he'd said "there's still-unreleased material which
proves me right, hold tight" that would be one thing. But instead he quoted
the "collection directly from the servers" text and linked the new slide it
came from, implying that the quotation unambiguously ruled out the drop-
box/API interpretation and supported the NSA-has-root interpretation. But in
fact "collection directly from the servers" is not at all unambiguous between
the two interpretations. And even worse, the You Should Use Both slide, which
Greenwald produced as his trump card, provides context which clearly
undermines the NSA-has-root interpretation! In that slide it's clear that
"collection directly from the servers" is being contrasted with upstream
collection of IP data from the telcos. The fact that Greenwald evidently
didn't pick up on this himself is pretty clear evidence that his understanding
of the presentation is imperfect, whether because it's being distorted by his
desire for a bigger and more damning scoop or just impeded by a lack of
technical savvy.

------
WestCoastJustin
Steve Gibson suggests [1] that _direct access_ means, direct access to their
_internet pipes_ , effectively tapping or mirroring their content for
analysis. He says that "direct access" has more meaning to non-techies than
saying they "are listening at the upstream router". Very interesting podcast
too. The $20 million per collection point is to build the secure secret rooms
[2] at their [google, facebook, apple, microsoft, etc] telecom providers, split
the fibers, and buy all the gear they need to do it. He even suggests this is
legal: since the taps sit at the peering level rather than inside any of their
[google, facebook, apple, microsoft, etc] data centers, it is the open
internet, and anyone is allowed to peer into the traffic.

[1] [http://twit.tv/show/security-now/408](http://twit.tv/show/security-
now/408)

[2]
[http://en.wikipedia.org/wiki/Room_641A](http://en.wikipedia.org/wiki/Room_641A)

~~~
slg
I am not a network expert, but won't most of that data still be encrypted
with SSL? If so, the tap would be almost useless, since sites like Facebook
and Google have made the transition to defaulting to SSL.

~~~
aryastark
That may be true. However, the giant elephant in the room is the Utah data
center.

This is a data center designed to, supposedly, store data on the scale of a
yottabyte. I only say "a" yottabyte, because to assume even slightly greater
than that is sheer lunacy.

That is freaking massive. If you stored all terrorist cells and all terrorist
activity in the history of terrorism, you would not even touch a fraction of a
percent utilization. We're talking raindrops in the ocean.

There is no way the NSA is merely watching the bad guys here. The data center
is a few orders of magnitude too large for such a task.

I would assume right now they are merely recording all data, in hopes that one
day they will have technology to quickly crack encryption. However, even
without knowing what is said (the content), the metadata of connections gives
plenty of information on what people are doing.
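The scale claim is easy to sanity-check with arithmetic. A rough back-of-the-envelope sketch (the yottabyte figure is itself press speculation, and the population number is an approximation, not a leaked detail):

```python
# Back-of-the-envelope check on the yottabyte claim above.
# The capacity figure is press speculation, not a known spec.

YOTTABYTE = 10**24            # bytes in one yottabyte
WORLD_POPULATION = 7 * 10**9  # rough 2013 world population

bytes_per_person = YOTTABYTE / WORLD_POPULATION
terabytes_per_person = bytes_per_person / 10**12

print(f"{terabytes_per_person:,.0f} TB per person")
```

Even split across every person on Earth, a yottabyte leaves roughly 143 TB per person, which is why the comment argues the capacity is orders of magnitude beyond any plausible list of actual targets.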

~~~
tptacek
That Utah data center is literally the Internet's Area 51.

~~~
jongraehl
"literally" :)

and yes, agreed - analogous on so many levels - a publicly admitted place
where secret government things happen, which can be invoked to give an aura of
reality to conspiracy theories true and false alike.

------
anologwintermut
So the Guardian appears to have done some very, very shoddy reporting. But
isn't this all a distraction?

FTP or direct access, it makes no practical difference if the employee on the
other end rubber-stamps the requests when they get them. The real questions
are how much data the NSA can get and what procedures exist to prevent the
targeting of US persons.

What checks does e.g. Google actually run on the requests? If the procedure is
to email FISA@google.com, after which some Google employee rubber-stamps it
and sticks the data in an SFTP server, the NSA effectively has unfettered
access.

It's clear that you don't need a warrant for targeting a foreign person, so
the employee can't check that the request came from the FISC. Even if they
did, the FISC seems willing to rubber-stamp things itself. So aside from maybe
checking whether the account is typically accessed from a US IP, what's Google
going to do? I pick on Google here specifically only because they have a
reputation for trying to automate everything, including a lot of customer
support, and I suspect that if they don't have any discretion in these cases,
they may well have automated it.

Of course, maybe they didn't, maybe there are rigorous checks both at the NSA
and at the receiving companies. But we don't know and we need to.

~~~
skwirl
Google has directly contradicted what you are saying about rubber stamping
requests. See
[http://googleblog.blogspot.com/2013/06/what.html](http://googleblog.blogspot.com/2013/06/what.html)
and [http://googleblog.blogspot.com/2013/06/asking-us-
government-...](http://googleblog.blogspot.com/2013/06/asking-us-government-
to-allow-google-to.html).

For example: "Our legal team reviews each and every request, and frequently
pushes back when requests are overly broad or don’t follow the correct
process."

~~~
emn13
They may be lying. If there's a gag order, they may be required to lie. Or
they may consider "review" not necessarily to mean human review. Or the person
writing that may simply not know. We don't know what's going on at Google, and
with no transparency, it's hard to have a lot of trust, particularly since
there are many examples of government overreach in the past.

Sure, Google and Facebook and the rest should push back, but it's clear that
if the government really wants to abuse its power, they won't stop it. Given
the current witch-hunt on whistleblowers, it's clear that those in power do
not appreciate being questioned. Trust is fine, but verification is better.

~~~
moultano
What would you consider transparency, if your response to that is "they might
be lying"?

~~~
emn13
Actual transparency, not just words? That could be free access to their
internal architecture (not exactly likely) or, more plausibly, external,
independent auditors.

However, I think you're focusing on the wrong party here - I think it's a lot
more reasonable to request this of the government than of companies. There's
no reason not to require the government to publish the general structure of
what they're doing in great detail, and to let others make up their own minds
about whether it's overstepping its bounds.

In short: I want independent parties to have free access and be allowed to
verify what's going on.

------
room271
The precise mechanism isn't important. The key question is: is there mass
surveillance going on without warrants? And the answer appears to be yes. No
one has disputed, for example, the Verizon leaks that started all this.

~~~
brown9-2
Technically the Verizon "leak" was a leak of a court order, so it has some
legal merit. We can debate if it's fair or in spirit with the Constitution and
4th Amendment to issue such a wide-reaching court order, but it's not warrant-
less.

~~~
joeguilmette
Technically, it is a court order, not a warrant. So it is warrantless.

------
ig1
Except that theory doesn't explain the public statements either.

Google has specifically denied having a drop box facility:

"We cannot say this more clearly—the government does not have access to Google
servers—not directly, or via a back door, or a so-called drop box."

[https://plus.google.com/+google/posts/TMh6gUVrwMq](https://plus.google.com/+google/posts/TMh6gUVrwMq)

Among other features WashPo specifically describes live interception which
would require more sophisticated integration than a mere drop box facility:

"Google’s offerings include Gmail, voice and video chat, Google Drive files,
photo libraries, and live surveillance of search terms."

[http://www.washingtonpost.com/investigations/us-
intelligence...](http://www.washingtonpost.com/investigations/us-intelligence-
mining-data-from-nine-us-internet-companies-in-broad-secret-
program/2013/06/06/3a0c0da8-cebf-11e2-8845-d970ccb04497_story_3.html)

Not to mention NYT's independent source:

"In one recent instance, the National Security Agency sent an agent to a tech
company’s headquarters to monitor a suspect in a cyberattack, a lawyer
representing the company said. The agent installed government-developed
software on the company’s server and remained at the site for several weeks to
download data to an agency laptop.

In other instances, the lawyer said, the agency seeks real-time transmission
of data, which companies send digitally."

[http://www.nytimes.com/2013/06/08/technology/tech-
companies-...](http://www.nytimes.com/2013/06/08/technology/tech-companies-
bristling-concede-to-government-surveillance-
efforts.html?pagewanted=2&_r=0&hp)

~~~
discostrings
It's important to remember that there are many companies involved here.
PalTalk's direct access could be a begrudgingly set up teletype of live chat
transcripts while Apple could have provided a VPN connection and a root
password. We don't yet know what access was sufficient to be a part of
the "direct access" Prism program.

It will be interesting to see what degree of commitment each of these
companies showed to the privacy of its users. It appears that however Twitter
was complying with requests, it wasn't as convenient for the NSA as Prism
access...

------
pvnick
The Guardian "backwalk" actually _VALIDATES_ our worst fears, albeit with
technical nuances to ensure legality. It's really just about the only viable
way to implement this absent an SSL encryption crack (I would guess they're
dumping encrypted communications from Room 641A waiting for an encryption
weakness to be discovered, that way they'll have _everything_ ). As Snowden
said, you could wiretap the president if you have a personal email.

Here's how it works (this is my opinion as a web developer, not verified
details from the Snowden leak):

1) Fancy user interface developed by Booz Allen Hamilton. Enter email address
(good choice for a unique identifier, used as unique key in many databases).

2) Backend uses curl to send a request to an NSA-certified web API on each
service shown on the slide. This serves as a legally-binding FISA request
either regarding a foreign agent - no court order required - or a domestic
agent - secret FISA court order required (see
[http://www.npr.org/2013/06/13/191226106/fisa-court-
appears-t...](http://www.npr.org/2013/06/13/191226106/fisa-court-appears-to-
be-rubberstamp-for-government-requests)) and assumed to be fulfilled by the
API.

3) Kick off the NSA equivalent of a gearman worker that checks the contents
of each company's "dropbox"-like service for updates.

4) Services (Google, Facebook, etc) automatically grant request without
question as it is a legally binding FISA order. This saves them a ton of money
and, hey, it's legal! They have some custom code that allows them to look up a
user by their email address - almost guaranteed to be indexed in their
database - join it to relevant data sets, and dump it to the "dropbox"-like
system.

5) Fancy frontend shows progress bar, while skinny backend compresses
retrieved data into zip file for easy download.

This is the most efficient, cost-effective way to do this without venturing
into science fiction, i.e., storing a mirror of all the data, which would be
stupid on NSA's part. It still verifies our worst fears and answers the
question as to how such a program can cost "only" $20 million per year as
reported by the slides.

The trick is in the legal framework, not the technical details. This is why
the FISA courts are secret.
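The dropbox-polling workflow guessed at in steps 2-4 can be mocked up in a few lines. This is a toy simulation of the commenter's hypothesis only; every provider, function, and field name below is invented, and nothing here reflects verified details of any real system:

```python
# Toy simulation of the hypothesized steps 2-4: submit a request keyed
# on an email address, the provider deposits data, and a worker polls
# the "dropbox". All names are invented for illustration.

dropboxes = {"provider_a": {}}  # stand-in for per-company SFTP dropboxes

def submit_request(provider, email):
    """Step 2: file a request, keyed on the email address."""
    dropboxes[provider][email] = {"status": "pending"}

def provider_fulfills(provider, email, records):
    """Step 4 (company side): look up the user by email and dump data."""
    dropboxes[provider][email] = {"status": "done", "records": records}

def poll(provider, email):
    """Step 3: the worker checks the dropbox for completed requests."""
    entry = dropboxes[provider].get(email)
    return entry["records"] if entry and entry["status"] == "done" else None

submit_request("provider_a", "target@example.com")
assert poll("provider_a", "target@example.com") is None  # not yet fulfilled
provider_fulfills("provider_a", "target@example.com", ["msg1", "msg2"])
print(poll("provider_a", "target@example.com"))  # ['msg1', 'msg2']
```

The point of the sketch is that nothing in this flow requires root access to the company's servers: the company controls what lands in the dropbox, which is exactly the distinction the thread is arguing about.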

~~~
tptacek
_Here's how it works (this is my professional opinion as a web developer_

    
    
        +++ATH0
        NO CARRIER

~~~
pvnick
Yes, I am a web developer who goes to work every day in a professional
environment.

~~~
tptacek
In your professional experience as a web developer, maybe you could square
this assertion:

 _Services (Google, Facebook, etc) automatically grant request without
question as it is a legally binding FISA order_

... with the court order the NYT linked to from FAS, where Yahoo is seen to go
several rounds with the FISC after having received a lawful directive from NSA
to initiate surveillance?

From exactly what evidence do you argue that Google (or any other Internet
company) automatically approves all FISA requests?

~~~
pvnick
Snarky tone unnecessary. To which court order do you refer?

Edit: Also, your tweet quoting me sarcastically was also unnecessary. I'm
trying to be helpful and you're being rude.

~~~
leoc
This is apparently the article.
[http://www.nytimes.com/2013/06/14/technology/secret-court-
ru...](http://www.nytimes.com/2013/06/14/technology/secret-court-ruling-put-
tech-companies-in-data-bind.html)

~~~
pvnick
Thank you :)

Relevant part of the article:

>In a secret court in Washington, Yahoo’s top lawyers made their case. The
government had sought help in spying on certain foreign users, without a
warrant, and Yahoo had refused, saying the broad requests were
unconstitutional.

>The judges disagreed. That left Yahoo two choices: Hand over the data or
break the law.

>So Yahoo became part of the National Security Agency’s secret Internet
surveillance program, Prism, according to leaked N.S.A. documents, as did
seven other Internet companies.

From the court order:

"After a careful calibration of this balance and consideration of the myriad
of legal issues presented, we affirm the lower court's determinations that the
directives at issue are lawful and that compliance with them is obligatory."

Seems to me that tptacek is supporting my hypothesis.

------
andrewcooke
which guardian article is being discussed? because if you look at
[http://www.guardian.co.uk/world/2013/jun/06/us-tech-
giants-n...](http://www.guardian.co.uk/world/2013/jun/06/us-tech-giants-nsa-
data) they are absolutely correct in what they report. everything is qualified
with "the documents say...", etc.

~~~
skwirl
Are they? The Guardian has selectively released three or four slides out of
41. You can split hairs and try to come up with some contrived reading of the
article that could be argued as accurate, but I think in reality everyone
understood the article to be claiming that the NSA had unfettered access to
the servers of tech companies.

~~~
andrewcooke
they're reporters, not clairvoyants. their job is to tell you what the docs
say. they told you, and they were right to tell you, because the information
looked very interesting. you jumped to conclusions. and so they now have to
apologize because you weren't as careful as they were?

they did their job as well as they could. they had what appeared to be
interesting data, but instead of just saying, "wow, XXX" they were very
careful to not go beyond what was said.

i jumped to conclusions too. but then i realised i was probably wrong. it
happens. but i don't then blame the reporters who did their job reasonably
well. _i made the mistake, not them._

------
danso
I think that Greenwald should've been more skeptical, regardless of how
technically savvy he may or may not be. Look at the slide in question...in any
other organization, that slide would be interpreted as a slide aimed at the
newbies/idiots in a company...it pretty much literally says, using brightly
colored bubbles: "Hey people, remember that we have _two_ systems for
collecting data, so please use _both of them_. We even gave one of them an
easy-to-remember acronym"

Isn't it possible that a slide written for newbies (within NSA, or those who
work with the NSA) might have also been written by someone who is not an all-
star in technical communication?

~~~
Steko
The Bob Cesca item this mediaite post is building on makes the case that
Greenwald saw what he wanted to see:

 _I’m going to put it all out there and let the chips fall where they may: I’m
increasingly convinced that Glenn Greenwald’s reporting on the NSA story is
tainted by his well-known agenda, leading him to make broad claims for the
purposes of inciting outrage._

[http://thedailybanter.com/2013/06/greenwald-sticks-with-
his-...](http://thedailybanter.com/2013/06/greenwald-sticks-with-his-story-in-
spite-of-growing-questions/)

See also:

[http://www.rants.org/2013/06/11/epic_botch_of_prism_story/](http://www.rants.org/2013/06/11/epic_botch_of_prism_story/)

~~~
brown9-2
Greenwald's twitter feed of the last few days confirms this (in my mind) -
many tweets praising those who are praising Snowden, comparing him to
Ellsberg, etc, about how a survey shows X% of Americans think Snowden is a
hero. It's not really what I would have expected of a journalist at a major
newspaper.

------
jessaustin
How would access to an FTP server that serves all the data one wants not be
considered "direct"? I really doubt NSA analysts need shell access to do their
dirty work...

~~~
skwirl
The point is that the companies decide what production data to deposit on
these servers after their legal teams have reviewed FISA requests. According
to the tech companies, these requests are individually reviewed, target
individuals, and are narrow in scope. And they push back when requests are
overly broad.

This detail is a major one. This would mean the NSA cannot simply log on to
Facebook and query for whatever they want. It means the system is simply a way
for companies to comply with FISA requests, something that they were already
required to do.

So: even though Twitter doesn't use PRISM, there is really no difference
between what the NSA can access on Twitter and what they can access on
Facebook. Twitter just complies through some other manner.

~~~
holloway
I think that's a bit generous. They look over the request, but if the result
set is too large it couldn't reasonably be vetted by lawyers. That is, they
would have a look at it, but we don't know how rigorous they are or could be.

~~~
skwirl
Yeah, I think the main point is that this is simply a way of responding to
FISA requests, which are completely independent of PRISM. In other words,
there is no new news here. The debate over FISA itself is certainly quite
valid.

Google has claimed these requests are infrequent and narrowly focused, and
they have requested permission from the government to publish some statistics.
I hope they get it.

------
DanielBMarkham
Just to recap, PRISM looks like a web front-end to a shitload of _other_
systems. This is why a relative noob like Snowden was working on it -- it was
web work. It's also why he had such a broad overview of what was going on. He
knew the direction and capabilities, if not the details. He was probably just
hooking up APIs somewhere.

That's just guesswork, of course. I think it's easy for us (and the Guardian)
to infer a lot of detail where none exists. I don't think the Guardian has
anything to apologize for. It'd be great if we all got a better technical view
of these systems. But asking to receive it third-hand through a leaker and a
non-technical reporter is probably a bit much.

~~~
mpyne
The Guardian could at least "apologize" for using "scare quotes" in the
process of walking back their claim.

That's underhanded for _any_ media publication that aspires to the idea of
"journalism".

In addition, they could at least have mentioned that other theories emerged
as to what the slides they presented might actually mean, so that people
would be aware that there were other valid conclusions that could possibly be
drawn, especially by those with tech and government experience.

~~~
cadlin
The Guardian isn't walking back anything. The submitted article just takes
another Guardian article about a different aspect of the story (article title:
"NSA scandal: Microsoft and Twitter join calls to disclose data requests") and
proceeds to call it a walk-back, which it is not.

~~~
tptacek
No. Rather than issue a direct correction to their initial story which so
falsely represented the facts on the ground that all of the US's largest
internet companies simultaneously issued categorical denials, the Guardian ran
another article that redefined the term "direct access", knowing (as they had
to have) that their own original interpretation of the term, in black and
white in Greenwald's original reporting, had been repeated as fact by numerous
major media outlets.

~~~
daywalker
If the tech companies have been slandered unfairly, why don't they sue the
Guardian?

~~~
marshray
Because The Guardian has 19,998 more documents that they don't know the
contents of.

~~~
discostrings
If they have acted properly, why should that stop them?

It's hard to lie effectively when you don't know how much of the truth is
known.

~~~
marshray
Because nobody, probably not even at the NSA, is confident that they _actually
know_ the full extent of what's going on.

~~~
discostrings
So true. That's one of the most dangerous aspects of conducting this type of
thing in the dark.

------
CurtMonash
Simple explanation of the original error -- the person who put together the
slide deck wasn't exactly a dev on the technology itself.

It does call into doubt the credibility of Snowden's OPINIONS about the scope
of NSA technology and procedures.

The technology error -- or more precisely the lack of rapid correction --
isn't one of Greenwald's finer moments, but in general he's done such an
awesome job on this issue that I give him a pass on that blunder.

~~~
CurtMonash
And by the way -- $20 million budget always meant PRISM was only a small part
of the whole. Not realizing that was actually a dumber error on Greenwald's
part than him taking "direct access" at face value.

------
lostlogin
I can't help but think about the following point when looking at this. Do
people really care whether access was direct or indirect if their own personal
records have all been captured and/or seen/analyzed? It sure wouldn't matter
to me how they pried their way in - it would matter to me that they had done
it, though.

------
gasull
This is the best explanation I've read about how PRISM might work:
[http://uncrunched.com/2013/06/11/connecting-the-prism-
dots-m...](http://uncrunched.com/2013/06/11/connecting-the-prism-dots-my-new-
theory/)

------
junto
It would seem relatively obvious to me that, were I interested in partaking
in a bit of terrorism, I would be relatively sure that using Skype, Facebook
or Gmail was a stupid idea. I think I would have deeply suspected it was a
stupid idea before PRISM, and even more so since Snowdengate.

The idea that this technology is being used for anything other than mass
control is bullshit. I don't want to sound like a member of the tinfoil-hat
brigade, but sadly, I just don't see terrorists using any of the resources
offered by the US corporations named in PRISM. I do see a very dangerous
threat to democracy.

------
skwirl
An interesting article was just published with the headline "Greenwald gives
away the game on his PRISM claim." He is continuing to walk back his original
claims of direct access.

[http://littlegreenfootballs.com/article/42126_Greenwald_give...](http://littlegreenfootballs.com/article/42126_Greenwald_gives_away_the_game_on_his_PRISM_claims)

------
pradocchia
I don't know, but it appears the message brigade is out in force now. Our
initial shock is gone, and we are eager for more narrative. Now is the
opportunity to cast doubt, to reframe, to discredit.

------
Aqueous
[https://news.ycombinator.com/item?id=5844303](https://news.ycombinator.com/item?id=5844303)

------
detcader
According to Greenwald, The Guardian is "not remotely" walking back any
initial claims.

[https://twitter.com/ggreenwald/status/345315199257047040](https://twitter.com/ggreenwald/status/345315199257047040)

onekade: Confused about this. The Guardian "correction" doesn't walk back its
initial claim at all. mediaite.com/online/fulsome…

ggreenwald: @onekade Not remotely - they're desperate to discredit all the
spying, but it's not going to work. Documents are too powerful

------
mpyne
So in other words, the Guardian is saying that I'm right (again)?
[https://news.ycombinator.com/item?id=5859493](https://news.ycombinator.com/item?id=5859493)

~~~
jonlucc
I think it's possible that Steve Gibson is right on this week's Security Now
podcast. He suggests that perhaps the NSA has tapped into the ISP just
upstream of the 9 companies listed. Then, they copy all of the data coming
into and out of the fiber going to the Google datacenters. This fits with the
name Prism, and is corroborated by the 2006 revelation about the NSA-
controlled room in the AT&T building in San Francisco.

Edit: this does rely on the NSA having access to routers, not servers, so it
still isn't exactly what they said

~~~
tonfa
With a 20M budget?

~~~
holloway
Prism might only be part of it; there are obviously ways of breaking up
projects in budgets. It could just feed the data into another system.

------
interstitial
Anyone else think they are playing "Rope a Dope"?

