> Another [official] said that the NSA has a poor audit capability, which is frustrating efforts to complete a damage assessment.
Wow. The NSA spent untold billions building advanced tech to snoop others but never bothered to set up proper internal controls for their own systems?
Of course this also raises the question: how can they continue to insist that there are reasonable controls in place to prevent abuse when they can't even determine what Snowden accessed, after he collected thousands of documents over a span of years?
Since "root" in an SELinux configuration has just that one job, it is significantly easier to limit access to that privilege level to very specific cases which you can two-man or externally record ala the script command.
Assuming, that is, the system was set up that way in the first place. Which is such a PITA that I totally believe they just skipped it.
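The recording half isn't even the hard part. A minimal sketch of what `script`-style capture of a privileged shell boils down to (Python, Linux assumed; the log path is hypothetical, and a real setup would ship it to an append-only sink the admin can't touch):

```python
# Minimal sketch only: spawn a login shell inside a pty and append everything
# that crosses the terminal to a log, similar in spirit to `script`.
# LOG is a hypothetical local path; in practice it would be a remote, append-only sink.
import os
import pty
import time

LOG = "/var/log/priv-session.log"

def record(fd):
    """Called by pty.spawn for every chunk the shell writes to the terminal."""
    data = os.read(fd, 1024)
    with open(LOG, "ab") as f:
        f.write(data)
    return data

if __name__ == "__main__":
    with open(LOG, "ab") as f:
        f.write(("--- session start %s ---\n" % time.ctime()).encode())
    pty.spawn(["/bin/bash", "--login"], record)
```

The hard part is everything around it: making the sink tamper-proof and making sure privileged work only ever happens through the recorded path.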
That makes root sound like a gaping security hole. Almost all Windows installations disable the Administrator account and hand out permissions only where necessary.
It's fundamentally nearly impossible to audit (or even control) what sysadmins do. You can try, but at the end of the day the most secure approach really is based on trust, and it works remarkably well. The NSA must have thousands of sysadmins (at least over the years), and we've got one who leaked things in a significant way.
The system they use is not really broken, it's just embarrassing to them when it's proven imperfect. The best fix would be to simply avoid doing things that are likely to piss off ethical people like Snowden.
This is nonsense. The whole point of the audit tools that we (Red Hat) ship is to audit the sysadmins. This includes: auditing off machine to secure sites under control of others; having the machine hard power-off if the audit log cannot be written/sent; extremely fine-grained auditing of every command used, file read/written, etc.
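To give a flavor of the "audit off machine" piece (a sketch only, not our actual tooling): follow the audit log and push every record to a remote collector. This assumes the standard /var/log/audit/audit.log location and a hypothetical collector host:

```python
# Sketch: tail the auditd log and forward each record off-host as it is written.
# The collector address is hypothetical; a production setup would also
# authenticate/encrypt the stream and fail closed if delivery stops.
import socket
import time

AUDIT_LOG = "/var/log/audit/audit.log"      # standard auditd location
COLLECTOR = ("collector.example", 6514)     # hypothetical remote sink

def follow(path):
    """Yield new lines appended to the file, like `tail -f`."""
    with open(path, "r", errors="replace") as f:
        f.seek(0, 2)                        # start at end of file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line

def main():
    with socket.create_connection(COLLECTOR) as sock:
        for record in follow(AUDIT_LOG):
            sock.sendall(record.encode())

if __name__ == "__main__":
    main()
```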
What does Red Hat do to prevent someone from copying backup tapes? Or authorized users from copying data to other servers that aren't audited? Or from databases and repositories via their own protocols?
It's rather unimpressive to claim that you can tell a customer that "your sysadmin had authorized access to all data, so any of it could have been copied".
It's the responsibility of a business's owners/executives/officers to physically secure their building and technology. Red Hat and other groups (like Schlage, ADT, and police departments) make tools/services you can use to do this, but it's your responsibility to make sure all of your agents are doing what you want them to do.
It's rather embarrassing to have your IT supplier tell you "This other person who you trusted with access to all your data may have copied some or all of it -- you should trust your employees more granularly."
If it's your system, then only you have access to grant access to others. If you hire a sysadmin to build you a system, you make sure you trust that person, and make sure that person is the only one with access to grant access. If that person lies to you and starts handing out that access, you replace her/him. And if you can't find anyone trustworthy, then you're forced to do this job for yourself.
Copying the backup tapes to what? Auditing is implemented along with physical measures like ensuring USB ports are glued up and people are searched going into and out of the secure data center (which is air-gapped from the internet). You can't even carry a mobile phone -- there are detectors for the signals they give off.
(USB ports / CDs being what failed in the Bradley Manning case)
Physical security is a joke everywhere. I've worked in 50+ datacenters and I'd bet anyone anywhere that I could sneak a 64GB micro SD card out of any datacenter in the world. No one is doing cavity searches, not even NSA. Getting a "decommissioned" or "test" server out full of data would be trivial in 99.99% of organizations.
Theoretically it's trivial to audit sysadmins. In practice it's virtually impossible. Let me know when Red Hat wants to bet money on it.
I'm pretty sure I recall hearing the NSA never lets anything leave their facilities intact when decommissioning or otherwise disposing of equipment.
I've heard a story of brand new, very expensive servers (big Sun machines, I think) sitting in a loading dock, still on pallets, having the project cancelled and being destroyed unused rather than risk data leaving the building.
I'm sure they have great policies. I'm also sure they're regularly not followed or easily bypassed by certain people. Who exactly destroys the servers, for example?
This is simply incorrect. It's quite possible to audit what system admins do by logging changes to a system and auditing the logs. I've personally run an audit of logs in a worldwide cloud email service (you have heard of them) to find out which of my fellow admins made changes to a senior admin's access.
And you can definitely control what access system admins have to your system. Not every person who joins workstations to a domain needs to have full domain permissions.
This kind of sloppy authorization and system state control is inexcusable.
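To be concrete about the "auditing the logs" step, it can be as simple as this -- a minimal sketch, assuming the admin-activity log has already been exported as JSON lines with hypothetical timestamp/actor/action/target fields:

```python
# Sketch: find every admin action that touched a particular account.
# The field names are assumptions about whatever export format is in use.
import json

def who_changed(log_path, target_account):
    """Return every (timestamp, actor, action) that touched target_account."""
    hits = []
    with open(log_path) as f:
        for line in f:
            event = json.loads(line)
            if event.get("target") == target_account:
                hits.append((event["timestamp"], event["actor"], event["action"]))
    return hits

if __name__ == "__main__":
    for ts, actor, action in who_changed("admin_audit.jsonl", "senior-admin@example.com"):
        print(ts, actor, action)
```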
You only think this because you haven't had your system tested in a real way. Hire a really good security firm and I assure you they will easily circumvent your auditing with no more access than your most trusted sysadmin.
I genuinely wonder what other sysadmins working at/with the NSA think about Snowden.
His actions have probably made their jobs more difficult, but they also work with these same systems, have a sense of their scope, and know the extent to which internal controls are (apparently) more procedural than technical. I wonder how many secretly feel vindicated for some concern they have felt or expressed in the past.
From my experience at a very large bank: no one can do anything significant (such as connect to a database) without leaving an audit trail. It is very much possible to audit, and it is being done routinely in places like banking systems.
A camera would not be even close to good enough. It's trivial to type commands in a window that is not visible on your screen, for example. Keyboard logging? Not even close to good enough since you could easily break commands up or write a script that does what you want in an obfuscated way.
Sure, but why isn't all data encrypted at rest? And access to the keys heavily logged? There are ways to allow sysadmins to back up and repair file systems, and yet still not have any access to the (plaintext) data stored on those systems. Sure, eventually the key server (whether it's Kerberos style or X509 certificate style) is going to be something where a single trusted individual can do great harm (if by no other means than by checking in some obfuscated code that introduces a bug that releases keys, and then gets this past the code review process). But you can limit the damage that can be done by "thousands of sysadmins".
It sounds like there are cloud services providers that are doing a better job at this today than the NSA. And we're supposed to trust them with their audit controls?
P.S. My undergraduate thesis in 1990 designed a symmetric key system that would allow data at rest to be encrypted, such that access could be controlled (and logged) via single key server system in which you invested all of your trust. This is not rocket science...
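The shape of the idea, as a toy sketch rather than the thesis design: envelope-style encryption where the only way to turn stored ciphertext back into plaintext is a logged round-trip to the key server. This assumes the `cryptography` package's Fernet and a hypothetical in-process LoggingKeyServer standing in for the trusted key service:

```python
# Toy sketch of "data at rest readable only via a logged key server".
# LoggingKeyServer is hypothetical; a real deployment would be a hardened,
# separately administered service (Kerberos- or X.509-style, as above).
from cryptography.fernet import Fernet

class LoggingKeyServer:
    """Hands out per-dataset keys and records who asked for what."""
    def __init__(self):
        self._keys = {}
        self.access_log = []

    def data_key(self, dataset_id, requester):
        self.access_log.append((dataset_id, requester))   # every access leaves a trail
        if dataset_id not in self._keys:
            self._keys[dataset_id] = Fernet.generate_key()
        return self._keys[dataset_id]

def store(ks, dataset_id, requester, plaintext: bytes) -> bytes:
    """Encrypt before writing; this ciphertext is all the sysadmin ever handles."""
    return Fernet(ks.data_key(dataset_id, requester)).encrypt(plaintext)

def read(ks, dataset_id, requester, ciphertext: bytes) -> bytes:
    """Decryption forces another logged trip to the key server."""
    return Fernet(ks.data_key(dataset_id, requester)).decrypt(ciphertext)

if __name__ == "__main__":
    ks = LoggingKeyServer()
    blob = store(ks, "dataset-1", "backup-admin", b"sensitive report")
    print(read(ks, "dataset-1", "analyst-42", blob))
    print(ks.access_log)   # [('dataset-1', 'backup-admin'), ('dataset-1', 'analyst-42')]
```

The point isn't the crypto; it's that copied tapes or "decommissioned" disks yield only ciphertext, and every path back to plaintext is a logged event at the one place you chose to invest your trust.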
If they don't audit what people have access to, who knows what those people are doing with it. That's even scarier than the NSA recording everything, that their employees and contractors can do whatever they want with the info.
Remember, there is a difference between not auditing access to their internal fileservers that they keep their PowerPoints on, and not auditing access to their wiretap data.
Now, a big concern is that even if they do audit access to the wiretap data, there are still too many people who have "legitimate" access to it, and it is still hard to prevent a rogue sysadmin or programmer from bypassing those controls. Merely having all of that data makes it a high value target for attack.
> Remember, there is a difference between not auditing access to their internal fileservers that they keep their PowerPoints on, and not auditing access to their wiretap data.
This will clearly be NSA's response -- whether it's true or not -- but Snowden didn't just take random PowerPoints and internal training docs. If I'm not mistaken, he also took copies of FISA court documents and other highly classified materials that were never intended to be shared among NSA staff.
I truly believe the lack of audits for these materials has destroyed NSA's credibility across the board:
1. We know that NSA hires/contracts incredibly smart and technically talented individuals who are experts at breaking into systems and avoiding detection.
2. The only way for NSA to provide reasonable controls in this environment is to create a culture of monitoring and accountability, and design all their systems from the ground up with auditing and security in mind.
3. But apparently they didn't do #2 (or never figured out how to enforce this for sysadmins), because Snowden repeatedly accessed restricted and highly classified material without an audit trail.
I don't see how they can credibly admit a Snowden-sized failure but still ask us to trust them with our personal data.
But who audits those who set up the audit systems? From what I understand, he was one of those "system administrators" himself.
I guess they haven't solved the problem technically yet, as they have instituted a "no lone zones" policy. They'll just have everyone work with an accountabil-a-buddy.
If it was one of the standard Windows file servers, and not an SMB/CIFS appliance, the audit controls are very detailed and fine-grained from 7/2008 up; previous audit controls were less so. You need to configure which events you want logged, and then you would need some central sink to collect all the Event Log events on all the servers and review those of interest. I have never worked in an area with that level of paranoia.
However, if NSA contractors are put under the same Federal Desktop Core Configuration requirements [0] at a minimum, I would imagine the client and/or server (if he was required to use Windows) are known, or the NSA contractors are not in compliance.
What they do with that information is another story. That said, any admin could change local policy settings, reboot the computer, and reconnect it to the domain later. I got clever with such things to disable security policies like those when troubleshooting lab computers. We shall never know how smart or dumb Snowden was, or how smart they were.
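As for the "central sink and review those of interest" step above, the reviewing side is not exotic -- a minimal sketch, assuming the Security logs have already been forwarded and exported as JSON lines (4663 is the object-access event ID; the field names here are simplified assumptions about the export format):

```python
# Toy sketch: tally file-access (Event ID 4663) events per user and object
# from an exported dump of centrally collected Windows Security logs.
import json
from collections import Counter

def file_access_counts(export_path):
    counts = Counter()
    with open(export_path) as f:
        for line in f:
            event = json.loads(line)
            if event.get("EventID") == 4663:
                counts[(event.get("SubjectUserName"), event.get("ObjectName"))] += 1
    return counts

if __name__ == "__main__":
    for (user, obj), n in file_access_counts("security-events.jsonl").most_common(20):
        print(n, user, obj)
```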
So, they don't know if anyone else has taken anything either? Snowden could in fact merely be the one they know about. And on the face of it, Snowden has risked everything to let the world know what's going on, an act of human patriotism, whereas others unknown may well just sell the data to actual enemies.
Maybe Snowden's error is doing the right thing as opposed to making a few Renminbi.
It baffles me how poor the PR damage control from the NSA/government is. All they had to say was something along the lines of "yeah, some of it is true, but we can't comment any further", same with the WikiLeaks leaks. Say "some", and you can't be sure anymore what's true and what isn't from that source, and that's the end of the story -- only speculation.