As a system administrator, he needed access to many of the agency's computer systems -- and he needed access to everything on those machines.
He certainly did NOT need this access to administer systems. It still boggles my mind that the sensitive data on those systems was not encrypted so administrators could not read it.
In any well-run IT organization, the DBAs cannot read the credit card numbers stored in the database and the systems administrators cannot read user passwords. But they can still administer the database and the servers.
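(As a minimal sketch of what that looks like in practice, assuming the application rather than the database host holds the key; the Fernet recipe and scrypt parameters here are illustrative choices, not anything from the article:)

    # Sketch: the app encrypts card numbers before they hit the database,
    # so the DBA only ever sees ciphertext; passwords are hashed, so
    # nobody can read them back. Key handling is simplified here: in
    # practice the key would come from a KMS/HSM, never the DB host.
    import hashlib, os
    from cryptography.fernet import Fernet  # pip install cryptography

    app_key = Fernet.generate_key()  # illustrative; really fetched from a KMS
    f = Fernet(app_key)

    def store_card(pan: str) -> bytes:
        """Value actually written to the DB column: opaque to the DBA."""
        return f.encrypt(pan.encode())

    def store_password(password: str) -> tuple[bytes, bytes]:
        """Salted scrypt hash: verifiable, but not reversible by an admin."""
        salt = os.urandom(16)
        return salt, hashlib.scrypt(password.encode(), salt=salt,
                                    n=2**14, r=8, p=1)

    row = store_card("4111111111111111")
    print(row)             # ciphertext
    print(f.decrypt(row))  # only code holding the key can do this

The DBA can still back up, replicate, and tune the table; the column contents are just opaque bytes to him.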
This is actually easier said than done. One of the major concerns is that you don't want some dude with a security clearance taking work home with him. So a lot of effort goes into things like SELinux being used to control which programs can access which files.
Here you have a fundamental conflict. A root user can reconfigure the controls to allow access (an ability that is absolutely necessary from a practical perspective), and so you end up having to trust the sysadmins.
Or you could set up other systems that would read the files and decrypt them only within special programs. Now you have two problems... Moreover, the sysadmin might be able to pull passphrases as they are typed, keys as they are uploaded, etc. It is not that easy to manage.
In the end a determined sysadmin can get the information and there isn't really a way you can lock him or her out fully.
A sufficiently careful one could probably go undetected for some time, especially if he was quite aware of what was checked and could work around those checks initially.
DBAs and system administrators can usually use their privileges to install tools or change settings in such a way that they gain access to the raw data. Whether it goes undetected or not is another matter.
In the end someone or some process has access, and a sysadmin or DBA could pretend to be that someone or that process.
Once you have physical access to a box it is game over; the only thing that might still trip you up is an auditing system that is not part of that box.
You shouldn't make it easy either, but if someone configures a system they are in a position to bypass the security on that system. It all ends with trust.
You can set things up so there's no way for any single person to bypass logs or other audit controls. A trivial way is to have all logins go via an administration host which logs separately, and never give admin rights on that machine to people with admin on the protected systems.
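(A hedged sketch of how that administration host might record sessions on a machine its users can't touch: an SSH ForceCommand-style wrapper in Python. The name loghost.internal and the port are made-up placeholders; the point is that admins of the protected systems have no rights on the box receiving the logs:)

    # Hypothetical ForceCommand wrapper for the admin host: every session
    # is reported to a separate log host before the requested command runs.
    import logging.handlers
    import os
    import pwd

    log = logging.getLogger("bastion")
    log.addHandler(logging.handlers.SysLogHandler(
        address=("loghost.internal", 514)))  # illustrative log host
    log.setLevel(logging.INFO)

    user = pwd.getpwuid(os.getuid()).pw_name
    cmd = os.environ.get("SSH_ORIGINAL_COMMAND")  # set by sshd's ForceCommand
    log.info("bastion login user=%s cmd=%r", user, cmd or "interactive")

    # Hand off only after the audit record has left the box.
    if cmd:
        os.execvp("/bin/sh", ["/bin/sh", "-c", cmd])
    else:
        os.execvp("/bin/bash", ["/bin/bash", "-l"])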
Yep. That's the way to do it. But a surprising number of companies have things set up so that a single individual can get at whatever data he or she wants without any logs or with logs that can be tampered with. Typical reason in small companies: there is only one person with the required skills.
What is surprising is that even the NSA did not follow these practices. After all, if there is a place where such a system would make sense, it would be there, and they certainly could not make the excuse that they're small.
It boggles my mind as well. I worked for a credit scoring company, and I must say they took their security really seriously (much more so than the NSA, it looks like).
They created separate positions, with deliberately and carefully separated permissions and often conflicting duties, so that one party watched over the other and ensured no overreaching, and no single person being able to access data unmonitored (as rdl exemplified). In all the time I worked there as a contractor, I never saw production data.
Looking around the Internet, this seems to be standard practice:
"The first is compartmentalization. Trust doesn't have to be all or nothing; it makes sense to give relevant workers only the access, capabilities and information they need to accomplish their assigned tasks. In the military, even if they have the requisite clearance, people are only told what they "need to know." The same policy occurs naturally in companies."
"Make sure a single person can't compromise an entire system. NSA Director General Keith Alexander has said he is doing this inside the agency by instituting what is called two-person control: There will always be two people performing system-administration tasks on highly classified computers."
Given that NSA internal controls seem to be less than commercial best practice in high security settings, and yet people (probably correctly) assume NSA is superior to known science in many areas, there are three reasonable scenarios:
1) NSA actually is fairly competent overall, and feigns incompetence as a way to let people bypass legal controls when "useful".
2) NSA is actually incompetent overall, and people like tptacek who say they're wildly beyond the open world are wrong. They obviously have a huge budget and protection (legal or illegal) to do whatever they want, but a lot of what someone would want to do is something even I could pull off given the NSA's level of resources.
3) NSA has really fucked up priorities, and for some reason (limited resources, cynical political calculations, expediency, etc.) put all of their effort into offense and none into defense, not even from their #1 threat (insiders). Maybe they also put too much faith in the vetting process and external shell and too little in protections within the chewy center. This was pretty common in the commercial world 10-15 years ago (when "use a firewall and AV filters" was the leading security advice). If NSA actually went backward from the 1980s, when they strongly believed in internal controls, to cross over and be inferior to the commercial world, we're kind of screwed.
These systems are on 'known good networks' staffed by people who have some of the most extensive vetting you can get.
These are the folks who invented (or funded the development of) RBAC and the like. SELinux came out of NSA, for example. The tech exists, but it's a hassle to work with. If you've got fully-cleared, all-known-good actors, why bother?
It's not like it's on the internet, or accessible in some fashion.
"Maybe they also put too much faith in the vetting process and external shell and less on protections within the chewy center."
They put a lot of effort into screening staff, but they're also the most attractive target for well-funded adversaries (Russia, China, etc.).
The only stuff I've ever seen come out of NSA that was actually used by anyone in government and wasn't crap has been hardware. SELinux, etc. are a sideshow. Their OS protections as deployed elsewhere in government are primarily COTS or were developed by external contractors (HBSS, etc.), and are actually shittier in a lot of ways than best commercial practice.
> More surprising than Snowden's ability to get away with taking the information he downloaded is that there haven't been dozens more like him.
Is an assumption on Schneier's part. For all we know there were dozens or more. They may not have released the data or they may have sold it to some foreign agency instead.
Interesting angle. Makes you wonder if any politicians' data was part of Snowden's haul. If so, that opens up another can of worms.
If they didn't have segregated access (and it does not appear that they did) then that would have surely been too juicy to pass up. 20,000 documents is a lot.
This might also go some way to explaining the panic: if they don't know what he's got, and he's had access to data like that, then some people must really not be sleeping well right now.
They may still become whistleblowers. Whistleblowers are only whistleblowers once they blow the whistle; before then, they're thieves.
Some would like them to remain thieves even after they've done their thing, and I'm sure that some thieves eventually consider becoming whistleblowers. But in the interval between taking the data and exposing it, there is no real difference you could outwardly discern between the two cases.
Part of Assange's philosophy is that secretive unjust organizations will operate less efficiently and become less effective as they lock down their internal flow of information in response to leaks. Looks like things are going according to plan.
Sun 31 Dec 2006: The non linear effects of leaks on unjust systems of governance
You may want to read The Road to Hanoi or Conspiracy as Governance; an obscure motivational document, almost useless in light of its decontextualization and perhaps even then. But if you read this latter document while thinking about how different structures of power are differentially affected by leaks (the defection of the inner to the outer) its motivations may become clearer.
The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie. This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption.
Hence in a world where leaking is easy, secretive or unjust systems are nonlinearly hit relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance.
Only revealed injustice can be answered; for man to do anything intelligent he has to know what's actually going on.
Hmm... a corollary might be that startups (insecure by their very nature) have a huge edge over established institutions, precisely because they don't have to deal with security problems.
So, as they grow bigger, they become more inefficient and customers more unhappy (see: Paypal). I hope there are ways to mitigate this.
If we take Schneier's example of the bank president and the ATM, you could easily argue that is a flawed philosophy. Every day I go to the ATM, money comes out. Like clockwork, very efficient.
Because running ATMs has been a bank's principal job for 40 or 50 years, and it is run-the-bank stuff. If you look at internet banking (and how out of date and clumsy most of those systems are) you will see this effect. Change-the-bank stuff is what "suffers" from this.
Retail banks' ATM and overnight payments systems are often 25-year-old pieces of software and hardware.
Multi-party controls and generally complex technical controls add minimal overhead relative to benefit in large organizations working on static systems; they're horrible in small companies working on rapidly changing products. (Imagine a tiny startup where you have a 4 week change control process for every modification...)
> Think of an employee as operating within a sphere of trust -- a set of assets and functions he or she has access to. Organizations act in their best interest by making that sphere as small as possible.
The idea is that if someone turns out to be untrustworthy, he or she can only do so much damage. This is where the NSA failed with Snowden.
And if you read even a little between the lines, this is where the NSA failed us as well.
Snowden's sphere of access was very large. He broke org procedure for the good of all Americans and net users on the planet. What is implied is that existing analysts in like positions have similar access, and can break org procedures for nefarious purposes (indeed, as they already have countless times according to reporting), and no amount of official gov't reassurances on "checks and balances" or "proper procedures" can wipe this fact away.
I wonder how many roles at the NSA have inflated spheres of trust, and about all the little and big ways their operators break org procedures all the time...and especially for what reasons...
Finding a balance between paralyzingly complicated processes and having to trust single employees with all of your secrets is not easy. I think that's great, since it helps discourage organizations from doing the kind of things that morally outrage their employees.
Pretty much every company has confidential information. When a company administrator sells the internal address book to a recruiter or when a salesperson makes a copy of the prospects database before joining a competitor - that's leaking too.
It would be interesting to consider a world that is truly without secrets, even secrets kept only through obscurity. It's possible we might get there. I think the way companies and so forth do their work could end up being adjusted to cope.
I recommend Stephen Baxter and Arthur C. Clarke's "The Light of Other Days". As far as I remember it focuses more on the social than the economic impact of absolute transparency, but I still strongly recommend this excellent book.
Two points seemed to contradict each other. The first was cutting classified sysadmin positions by 90%. The second was requiring doubling up for work on classified systems.
Does this mean that the sysadmins will each be responsible for 20x as much information or work (a 90% cut leaves each remaining admin with 10x the load, and doubling up doubles the person-hours per task)? With such a workload, how can the doubling up be effective?
I guess my rule is that when confronted with a crisis, organizations will usually react in such a way as to preserve or even exacerbate all pre-existing problems.
> A public or private organization's best defense against whistle-blowers is to refrain from doing things it doesn't want to read about on the front page of the newspaper.
I feel like this hasn't been said enough in the debates surrounding the surveillance leaks.
Every nation needs secrets, but not the way it is currently handled. There should be some kind of balance, like a committee that decides in every single case whether a given piece of information should be classified or not. If that procedure were consistently applied, there might not be a PRISM program. Leakers should get a chance to contact this committee and be exempted from punishment. I hope all of you know what I mean; I know my English is not the best.
That's a lot of power granted to some committee. Who verifies that it's working well, and its members aren't actually selecting what to publish for their own benefit? What guarantees would a leaker have that the committee will protect her/him?
In a sense, Snowden was whistle-blowing on himself! Whenever you have lower-level employees given such tremendous power without auditing and checks, the system is open to abuse. He has shown that a conspiracy of one can engage in very significant and very bad actions. What if he did such things and told no one?
I'm baffled. On the one hand, the NSA wants their employees to "uphold and defend the constitution". On the other hand, they want their employees to keep secrets.
(... It is possible for these two things to not be in conflict with each other...)
They want their employees to honor the commitments they've made. At the same time, they're lying to Congress and the FISC.
Dear NSA: Let me help you un-fuck your business: Stop requiring your employees to uphold and defend the constitution. Stop hiring people who believe that the NSA should be held accountable to the citizens and/or to the congress.
(Perhaps you should start by creating a cult of personality around a strong, charismatic leader. And for the signing-in ceremony, instead of swearing to "defend the constitution against all enemies, foreign and domestic", maybe they could just wipe their asses with a copy of the Bill of Rights.)
If you do this, you will never have another Edward Snowden again. I guarantee it.
Apparently they employ pre-brainwashed employees (Mormons).
Edit: okay, that was offensive to Mormons, but they do apparently employ a lot of them. If one group is overrepresented, so are its views (which might diverge from the rest of the population).
How about being an ethical organization that people would feel morally compelled to support versus harm? How about not creating secrets around ills committed but actually being candid about such issues when eventually discovered and reforming openly? It seems organizational psychology is partly to blame.
I would argue that politicians ARE an exception. They are the most public figures, and they do enjoy a unique status. They should have no professional privacy whatsoever; their personal privacy can be on par with celebrities etc.