US doesn't know what Snowden took, sources say (nbcnews.com)
104 points by danso on Aug 21, 2013 | 48 comments


> Another [official] said that the NSA has a poor audit capability, which is frustrating efforts to complete a damage assessment.

Wow. The NSA spent untold billions building advanced tech to snoop on others, but never bothered to set up proper internal controls for its own systems?

Of course, this also raises the question: how can they continue to insist that there are reasonable controls in place to prevent abuse when they can't even determine what Snowden accessed, after he collected thousands of documents over a span of years?


They do put some effort into internal controls, e.g. SELinux came from the NSA originally. But apparently not enough!


Someone has to be able to sudo on the box and administer the internal controls. It's pretty hard to audit that person.


Wrong. Read about audit tools in RHEL. They are entirely designed to cope with this scenario.


Since "root" in an SELinux configuration has just that one job, it is significantly easier to limit access to that privilege level to very specific cases which you can two-man or externally record ala the script command.

Assuming, that is, the system was set up that way in the first place. Which is such a PITA that I totally believe they just skipped it.
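
For what "externally record" can look like in practice, here's a rough sketch (mine, not anything the NSA or Red Hat ships) of wrapping a privileged shell so every byte of the session is copied to a log, in the spirit of the script command. The log path and shell are made up.

    #!/usr/bin/env python3
    # Sketch of externally recording a privileged session, script-style.
    # The log path and shell are illustrative only.
    import os
    import pty

    LOGFILE = "/var/log/privileged-sessions/session.log"  # hypothetical path

    def record(fd):
        # Called for every chunk the session writes; append it to the log
        # before passing it through to the operator's terminal.
        data = os.read(fd, 1024)
        with open(LOGFILE, "ab") as log:
            log.write(data)
        return data

    if __name__ == "__main__":
        # pty.spawn relays the shell through `record`, so the operator sees
        # a normal interactive session while everything is also logged.
        pty.spawn(["/bin/bash", "-i"], master_read=record)

In a real two-man setup the log would of course go somewhere the operator can't touch; this only shows the recording half.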


That makes root sound like a gaping security hole. Almost all Windows installations disable the Administrator account and hand out permissions only where necessary.


It's fundamentally nearly impossible to audit (or even control) what sysadmins do. You can try, but at the end of the day even the most secure setup really comes down to trust, and it works remarkably well. The NSA must have had thousands of sysadmins over the years, and we've got one that leaked things in a significant way.

The system they use is not really broken, it's just embarrassing to them when it's proven imperfect. The best fix would be to simply avoid doing things that are likely to piss off ethical people like Snowden.


This is nonsense. The whole point of the audit tools that we (Red Hat) ship is to audit the sysadmins. This includes: auditing off machine to secure sites under control of others; having the machine hard power-off if the audit log cannot be written/sent; extremely fine-grained auditing of every command used, file read/written, etc.
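
For anyone who hasn't seen it, the low-level piece of this is just auditd/ausearch. A minimal sketch (not Red Hat's layered tooling; the uid and the rule are examples) that pulls every command a given admin has run:

    #!/usr/bin/env python3
    # Sketch: list every command a given admin ran, as recorded by Linux
    # auditd.  Assumes auditd is running with an execve rule, e.g.:
    #   auditctl -a always,exit -F arch=b64 -S execve -F auid=1000 -k admin-cmds
    import subprocess
    import sys

    def commands_for_user(auid):
        # ausearch ships with the audit package: -m EXECVE limits the query
        # to command executions, -ua matches the user/login uid, -i makes
        # the output human readable.
        result = subprocess.run(
            ["ausearch", "-m", "EXECVE", "-ua", auid, "-i"],
            capture_output=True, text=True, check=False,
        )
        return result.stdout

    if __name__ == "__main__":
        print(commands_for_user(sys.argv[1] if len(sys.argv) > 1 else "1000"))

The off-machine shipping and power-off-on-audit-failure pieces are configuration layered on top of that; this sketch only covers the query side.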


What does Red Hat do to prevent someone from copying backup tapes? Or authorized users from copying data to other servers that aren't audited? Or from databases and repositories via their own protocols?

It's rather unimpressive if all you can tell a customer is "your sysadmin had authorized access to all data, so any of it could have been copied".


It's the responsibility of a business's owners/executives/officers to physically secure their building and technology. Red Hat and other groups (like Schlage, ADT, and police departments) make tools/services you can use to do this, but it's your responsibility to make sure all of your agents are doing what you want them to do.

It's rather embarrassing to have your IT supplier tell you "This other person whom you trusted with access to all your data may have copied some or all of it -- you should trust your employees more granularly."


Which sysadmin has access to grant other sysadmins access? What do you do about them?


If it's your system, then only you have access to grant access to others. If you hire a sysadmin to build you a system, you make sure you trust that person, and make sure that person is the only one with access to grant access. If that person lies to you and starts handing out that access, you replace her/him. And if you can't find anyone trustworthy, then you're forced to do this job for yourself.


Copying the backup tapes to what? Auditing is implemented along with physical measures like ensuring USB ports are glued up and people are searched going into and out of the secure data center (which is air-gapped from the internet). You can't even carry a mobile phone -- there are detectors for the signals they give off.

(USB ports / CDs being what failed in the Bradley Manning case)


Physical security is a joke everywhere. I've worked in 50+ datacenters and I'd bet anyone anywhere that I could sneak a 64GB micro SD card out of any datacenter in the world. No one is doing cavity searches, not even NSA. Getting a "decommissioned" or "test" server out full of data would be trivial in 99.99% of organizations.

Theoretically it's trivial to audit sysadmins. In practice it's virtually impossible. Let me know when Red Hat wants to bet money on it.


I'm pretty sure I recall hearing the NSA never lets anything leave their facilities intact when decommissioning or otherwise disposing of equipment. I've heard a story of brand new, very expensive servers (big Sun machines, I think) sitting on a loading dock, still on pallets, having the project cancelled and being destroyed unused rather than risk data leaving the building.


I'm sure they have great policies. I'm also sure they're regularly not followed or easily bypassed by certain people. Who exactly destroys the servers, for example?


Red Hat doesn't sell physical security. Let me know when ADT wants to bet money on it.


Red Hat doesn't offer a real world solution to the problem of auditing sysadmins. That's entirely my point.


Oh, the logging and auditing are OK.

The real issue is who actually cares about the information collected.

I've seen this with "limited administrators" who were audited by senior people.

But there's always a point beyond which the trust is "absolute".


> and we've got one that leaked things in a significant way.

One that leaked to the public (blew a whistle). This in no way precludes previous leakers from leaking to

a) Foreign Governments b) Businesses c) for Extortion purposes d) Personal use.


This is simply incorrect. It's quite possible to audit what system admins do by logging changes to a system and auditing the logs. I've personally run an audit of logs in a worldwide cloud email service (you have heard of them) to find out which of my fellow admins made changes to a senior admin's access.

And you can definitely control what access system admins have to your system. Not every person who joins workstations to a domain needs to have full domain permissions.

This kind of sloppy authorization and system state control is inexcusable.
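
The query itself is not exotic. A toy sketch of that kind of review, assuming an append-only change log in JSON-lines form (the file name and field names are made up, not from any real service):

    import json

    # Toy review: which admins changed a particular account's access?
    # Log format and field names are illustrative only.
    def changes_to_account(log_path, target):
        with open(log_path) as log:
            for line in log:
                event = json.loads(line)
                if event.get("target") == target and event.get("action") == "access_change":
                    yield event["timestamp"], event["actor"], event.get("details")

    for when, who, what in changes_to_account("admin_changes.jsonl", "senior-admin"):
        print(when, who, what)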


You only think this because you haven't had your system tested in a real way. Hire a really good security firm and I assure you they will easily circumvent your auditing with no more access than your most trusted sysadmin.


I'm pretty sure Microsoft did.


I wouldn't bet on it. Most security audits are designed to tell clients (and their clients) what they want to hear.


I genuinely wonder what other sysadmins working at/with the NSA think about Snowden.

His actions have probably made their jobs more difficult, but they also work with these same systems, have a sense of their scope, and know the extent to which internal controls are (apparently) more procedural than technical. I wonder how many secretly feel vindicated for some concern they have felt or expressed in the past.


We have one for whom we know there was a leak. There may be more who sold information on the black market.


From my experience at a very large bank: no one can do anything significant (such as connect to a database) without leaving an audit trail there. It is very much possible to audit, and it is being done routinely in places, banking systems for instance.


It's not impossible. The low tech proof that it's possible is to just put a camera on them and their screen. There are better ways, obviously.


A camera would not be even close to good enough. It's trivial to type commands in a window that is not visible on your screen, for example. Keyboard logging? Not even close to good enough since you could easily break commands up or write a script that does what you want in an obfuscated way.


If you can read something, you can copy it.


Sure, but why isn't all data encrypted at rest? And access to the keys heavily logged? There are ways to allow sysadmins access to backup, and repair file systems, and yet still not have any access to the (plaintext) data stored on those systems. Sure, eventually the key server (whether it's Kerberos style or X509 certificate style) is going to be something where a single trusted individual can do great harm (if by no other means than by checking in some obfuscated code that introduces a bug that releases keys, and then gets this past the code review process). But you can limit the damage that can be done by "thousands of sysadmins".
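
To make the shape of that concrete, here's a toy sketch of the pattern: data encrypted at rest, with every key release going through one audited choke point. It uses the third-party `cryptography` package; the key-server class, dataset names, and log are stand-ins, not any real product.

    import logging
    from cryptography.fernet import Fernet  # pip install cryptography

    audit = logging.getLogger("keyserver.audit")
    logging.basicConfig(level=logging.INFO)

    class ToyKeyServer:
        """Holds the only copy of each data key; logs every release."""
        def __init__(self):
            self._keys = {}

        def create_key(self, dataset):
            self._keys[dataset] = Fernet.generate_key()

        def release_key(self, dataset, requester, reason):
            # In a real deployment this is where ACL checks, two-man rules,
            # and tamper-evident remote logging would live.
            audit.info("key release: dataset=%s requester=%s reason=%s",
                       dataset, requester, reason)
            return self._keys[dataset]

    ks = ToyKeyServer()
    ks.create_key("archive")

    # A sysadmin can back up or repair the ciphertext freely...
    ciphertext = Fernet(ks.release_key("archive", "ingest-svc", "write")).encrypt(b"records")

    # ...but getting at plaintext always forces a logged request to the key server.
    plaintext = Fernet(ks.release_key("archive", "analyst42", "case 1234")).decrypt(ciphertext)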

It sounds like there are cloud services providers that are doing a better job at this today than the NSA. And we're supposed to trust them with their audit controls?

P.S. My undergraduate thesis in 1990 designed a symmetric key system that would allow data at rest to be encrypted, such that access could be controlled (and logged) via single key server system in which you invested all of your trust. This is not rocket science...


To be honest, we don't know that. All we have here are some offhand remarks by supposedly unnamed officials.


If they don't audit what people have access to, who knows what those people are doing with it. That's even scarier than the NSA recording everything, that their employees and contractors can do whatever they want with the info.


Remember, there is a difference between not auditing access to their internal fileservers that they keep their PowerPoints on, and not auditing access to their wiretap data.

Now, a big concern is that even if they do audit access to the wiretap data, there are still too many people who have "legitimate" access to it, and it is still hard to prevent a rogue sysadmin or programmer from bypassing those controls. Merely having all of that data makes it a high value target for attack.


> Remember, there is a difference between not auditing access to their internal fileservers that they keep their PowerPoints on, and not auditing access to their wiretap data.

This will clearly be NSA's response -- whether it's true or not -- but Snowden didn't just take random PowerPoints and internal training docs. If I'm not mistaken, he also took copies of FISA court documents and other highly classified materials that were never intended to be shared among NSA staff.

I truly believe the lack of audits for these materials has destroyed NSA's credibility across the board:

1. We know that NSA hires/contracts incredibly smart and technically talented individuals who are experts at breaking into systems and avoiding detection.

2. The only way for NSA to provide reasonable controls in this environment is to create a culture of monitoring and accountability, and design all their systems from the ground up with auditing and security in mind.

3. But apparently they didn't do #2 (or never figured out how to enforce this for sysadmins), because Snowden repeatedly accessed restricted and highly classified material without an audit trail.

I don't see how they can credibly admit a Snowden sized failure but still ask us to trust them with our personal data.


Those PowerPoints are classified! Why would you think it's ok not to audit access?


But who audits those who set up audit systems? Presumably he was one of the "system administrators" from what I understand.

I guess they haven't solved the problem technically yet as they have instituted "no lone zones" policy. They'll just have everyone work with an accountabil-a-buddy.


If it was one of the standard Windows file servers, and not an SMB/CIFS appliance, the audit controls are very detailed and fine-grained from Windows 7 / Server 2008 up; earlier versions' audit controls were much coarser. You need to specify which events you want logged, and then you would need some central sink to collect all the Event Log events from all the servers and review those of interest. I have never worked in an area with that level of paranoia.
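
As a rough illustration of the "central sink" part (not what the NSA runs, obviously), here's a sketch that pulls file-access audit events, event ID 4663 on Server 2008 and later, from each file server's Security log using the third-party pywin32 package. The server names are placeholders.

    import win32evtlog  # pip install pywin32

    FILE_SERVERS = ["fs01", "fs02"]   # placeholder host names
    FILE_ACCESS_AUDIT = 4663          # "An attempt was made to access an object"

    def collect(server):
        handle = win32evtlog.OpenEventLog(server, "Security")
        flags = (win32evtlog.EVENTLOG_BACKWARDS_READ |
                 win32evtlog.EVENTLOG_SEQUENTIAL_READ)
        while True:
            events = win32evtlog.ReadEventLog(handle, flags, 0)
            if not events:
                break
            for ev in events:
                # The low word of EventID is the event number we care about.
                if ev.EventID & 0xFFFF == FILE_ACCESS_AUDIT:
                    # StringInserts carries the account, object name, access mask, etc.
                    yield server, ev.TimeGenerated, ev.StringInserts

    if __name__ == "__main__":
        for server in FILE_SERVERS:
            for record in collect(server):
                print(record)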

However, if NSA contractors are held at a minimum to the same Federal Desktop Core Configuration requirements [0], I would imagine the client and/or server configurations (if he was required to use Windows) are known, or the NSA contractors are not in compliance.

What they do with that information is another story. That said, any admin could change local policy settings, reboot the computer, and reconnect it to the domain later. I got clever with such things to disable security policies like those when troubleshooting lab computers. We shall never know how smart/dumb Snowden was and how smart/dumb they were.

[0] http://usgcb.nist.gov/usgcb/microsoft_content.html


I think they know Snowden accessed a lot of stuff, but they don't know what subset of it he took.


If Snowden could take gobs of stuff without audit, so could everyone.


So, they don't know if anyone else has taken anything either? Snowden could in fact merely be the one they know about. And on the face of it, Snowden has risked everything to let the world know what's going on, an act of human patriotism, whereas others unknown may well just sell the data to actual enemies.

Maybe Snowden's error is doing the right thing as opposed to making a few Renminbi.


It baffles me how poor the PR damage control from the NSA/government is. All they had to say was something along the lines of "yeah, some of it is true, but we can't comment any further", same as with the WikiLeaks leaks. Say "some", and you can't be sure anymore what's true and what isn't from that source; end of story, only speculation.


If they didn't know, they wouldn't have destroyed the Guardian's hard disks; they would have taken them away to check out the data.


I thought the Guardian destroyed their own hard drives, to prevent them from being taken?



That's not what the Guardian says happened. "But once it was obvious that they would be going to law I preferred to destroy our copy rather than hand it back to them or allow the courts to freeze our reporting." http://www.theguardian.com/world/2013/aug/20/nsa-snowden-fil... Discussion https://news.ycombinator.com/item?id=6245419


>By using a “thin client” computer he remotely accessed the NSA data from his base in Hawaii.

Oh no, not a "thin client"!!!


I think Hollywood would do it more like this:

"Set the Command and control server to engineer a virus to backdoor the system return code with a thin client."

"... oh no, not a thin client!"



