> Apache Accumulo is a sorted, distributed key/value store based on Google's BigTable design. It is a system built on top of Apache Hadoop, Apache ZooKeeper, and Apache Thrift. Written in Java, Accumulo has cell-level access labels and server-side programming mechanisms.
> Accumulo was created in 2008 by the National Security Agency and contributed to the Apache Software Foundation as an incubator project in September 2011.
According to the PRISM slides, the program costs $20 million per year. The article doesn't mention that figure, although it should, because the discrepancy is telling. If the author believes this program would cost at minimum $187 million per year to implement, then that $20 million claim is problematic.
Either the $20 million claim is wrong, in which case all the information on the slides is suspect; or it is correct, and the scope of PRISM is much smaller than is widely believed (and than this article's author believes); or the author properly understands the scope and is simply in error in his calculation.
> Either the $20 million claim is wrong, and then all the information on the slides is suspect, or it is correct, and the scope of PRISM is much smaller than is widely believed and is believed by the author of this article.
Multiple replies below have already questioned this either-or choice you present. From what I know about government agency presentations to higher-level authorities who set budgets, the likely claim on the slide is that the marginal cost of PRISM-as-such in an environment in which NSA already has other programs and the facilities to run them is just an insubstantial $20 million. And on the more extravagant assumptions of the submitted article, that might very well be a true claim for a PRISM program that gathers and analyzes quite a lot of data. That's especially likely if NSA has low-cost in-house software development capabilities, as it surely does.
One of the fundamental problems with these discussions is that conjecture like this gets thrown into the mix when we're talking about things we've actually seen evidence for.
For a very long time we've said things like "this is probably happening." That is in no way a novel idea. What is novel, and why these discussions are happening so frequently now, is that we have evidence that a $20M/year program is actually happening.
So when we're talking about things we have evidence for, let's please avoid throwing in conjecture.
If you assume that the servers only retain data for one month, then the server costs are cut by a factor of 12 and you end up with €168M / 12 = €14M (roughly $18M), for a total cost of about $22M.
Additionally, the post assumes that all the data is stored, and that is a lot of cat videos. With decent preprocessing you can probably cut the data rate by a rather large factor (I would assume at least 100, since you do not need to store warez or the NYT homepage). Then, to do the opposite estimate, assume the system is CPU-bound: you need hardware to process 120 GB/s. With roughly $10M you can buy a few thousand machines, and your PRISM software then needs to handle something like ~50 MB/s per machine (which may or may not be a reasonable data rate, depending on the sophistication of the algorithms and how much can be discarded very easily).
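The back-of-envelope numbers above are easy to sanity-check. A minimal sketch, assuming a $4,000 per-machine price (my assumption; the comment only says $10M buys "a few thousand machines"):

```python
# CPU-bound sizing sketch. All figures are the comment's ballpark
# numbers except the per-machine price, which is an assumption.
total_rate_gb_s = 120          # aggregate intake to process, GB/s
hw_budget = 10_000_000         # hardware budget, dollars
price_per_machine = 4_000      # assumed commodity-server price, dollars

machines = hw_budget // price_per_machine             # how many boxes the budget buys
per_machine_mb_s = total_rate_gb_s * 1000 / machines  # MB/s each box must handle

print(f"{machines} machines, ~{per_machine_mb_s:.0f} MB/s per machine")
# → 2500 machines, ~48 MB/s per machine
```

That lands right around the ~50 MB/s per-machine figure quoted above, so the estimate is at least internally consistent.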
> This is a worst case scenario that does not include potential discounts due to renting such a high volume of hardware and traffic or acquiring the aforementioned hardware (which incurs a higher initial investment but lower recurring costs).
"worst case scenario" is emphasized in the article.
The author counts the storage on a yearly basis (enough servers to store a year of data). If you set an expiration date for the records (say, four years), then after that period you can spend less on hardware, since space is freed as old records expire. You then only need to pay for the traffic difference (the extra capacity needed because traffic will have grown over those four years).
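The retention idea can be sketched in a few lines. The 100 PB/year ingest volume and 20% yearly traffic growth below are made-up illustrative numbers, not figures from the article:

```python
# Steady-state storage under a retention window, a minimal sketch.
def storage_needed(year, retention_years, ingest_pb=100, growth=0.2):
    """Total PB that must be online after `year` years, keeping
    only the last `retention_years` of records. Ingest grows by
    `growth` per year (both parameters are assumptions)."""
    yearly = [ingest_pb * (1 + growth) ** y for y in range(year)]
    return sum(yearly[-retention_years:])   # older records have expired

# Without expiry, storage grows without bound; with a 4-year window
# you only ever hold the last four (growing) years of data.
for y in (4, 5, 8):
    print(y, round(storage_needed(y, 4), 1))
```

After year four, annual hardware spend covers only the growth delta between the newest year ingested and the oldest year expired, rather than a full extra year of capacity.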
As the storage boxes in the article also have a nice CPU, the collected data can be indexed and then compressed, saving a lot of space.
Given that the Internet grows exponentially year over year, while the cost to store a bit of information drops at a similar rate, I doubt there is any money to be saved by deleting old data. The savings in system complexity are likely to handily outweigh the additional cost of storage, not to mention that it isn't worth one iota of frustration and bad reviews if data an analyst wants is not available.
Well, eliminating redundant or spurious data further down the stack can make these kinds of big-data operations very cheap. It's the difference between parsing unstructured logs and reading a flat file. No resources are wasted if the data is proactively maintained and curated.
$500k is not that outlandish for a system like this. Cameron Purdy, who developed Tangosol (now Oracle) Coherence, said that he regularly turned down $500k job offers. Bear in mind that their idea of "top notch" is not what gets bandied around here as "rock star". They're not programming Rails. They're talking about people who have had their shit together since day one, were top of their class even after they got into the best universities, and have applied themselves their whole lives to the theory and practice of building big systems. I've built some biggish systems, but I still spent a large amount of my university years, and my youth generally, studying beer, skirts, and house music. It makes a difference; I'm a long, long way from what they're talking about.
Here's how I would think about it if I were building this. The total hardware costs are $168M, and the total personnel costs are $4M. Say I pay $500k instead of $200k, and in doing so I get Jeffrey Dean instead of someone like me (I suspect I might have to pay more than $500k for Jeffrey Dean, but bear with me). My personnel costs have more than doubled, but the efficiency of the system might be 5x or 10x better, because I'm merely quite good at my job and he's a total legend. That efficiency scales the total hardware cost, which dwarfs the personnel cost. I'd say $500k starts to look pretty cheap at that scale.
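The tradeoff above reduces to one line of arithmetic. A sketch, assuming a 20-engineer team (my assumption, chosen because 20 × $200k reproduces the $4M personnel figure) and taking the hypothetical 5x efficiency gain at face value:

```python
# Personnel vs. hardware tradeoff from the comment's numbers.
# Team size and the 5x efficiency factor are illustrative assumptions.
hardware = 168_000_000       # article's yearly hardware estimate, dollars
team = 20                    # assumed headcount

baseline = hardware + team * 200_000        # ordinary engineers, $4M payroll
star     = hardware / 5 + team * 500_000    # 5x more efficient system, $10M payroll

print(f"baseline ${baseline/1e6:.1f}M vs star team ${star/1e6:.1f}M")
# → baseline $172.0M vs star team $43.6M
```

Even if the efficiency gain were only 2x, the hardware savings ($84M) would still dwarf the extra $6M in payroll, which is the comment's point.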
It's the cost of renting a developer with a security clearance from BAH. Snowden would likely be in the same salary band as a "supporting developer". He was taking home $122K, so his fully loaded cost to the government was likely $250-300K.