So when big organizations like the Washington Post and the Guardian US win it, that's a strong statement. They could just as easily have been given the National or Investigative Reporting awards.
(also, unlike the other prizes, there is no cash prize for the Public Service award)
The WaPo has won it before, including for Watergate and the Walter Reed investigation: http://www.pulitzer.org/bycat/Public-Service
Turns out they did. I'm very pleased to see that. Congratulations to the winners!
There is a perception among some people that the selection process is biased away from controversial figures. Last year's Person of the Year was Pope Francis; some people thought it should have been (e.g.) Edward Snowden. He was the runner-up.
It's more than a perception; they've tacitly admitted it. From Wikipedia:
> As a result of the public backlash it received from the United States for naming the Khomeini as Man of the Year in 1979, Time has shied away from using figures that are controversial in the United States due to commercial reasons. Time's Person of the Year 2001, immediately following the September 11, 2001 attacks, was New York City mayor Rudolph Giuliani, although the stated rules of selection, the individual or group of individuals who have had the biggest effect on the year's news, made Osama bin Laden a more likely choice. The issue that declared Giuliani the Person of the Year included an article that mentioned Time's earlier decision to elect the Ayatollah Khomeini and the 1999 rejection of Hitler as "Person of the Century". The article seemed to imply that Osama bin Laden was a stronger candidate than Giuliani, as Adolf Hitler was a stronger candidate than Albert Einstein.
And if you attribute the existence of nuclear weapons to Einstein (a legacy he'd have been pretty unhappy about), that changed the world much more lastingly than Hitler did. Hell, it almost destroyed the world entirely. How's that for a legacy?
Meanwhile it was the year of "Collateral Murder" and the diplomatic cable leaks; it was arguably the year Wikileaks went from unknown to notorious.
Assange was at least in part resisting a possible extradition to the U.S.
Irrespective of whether you feel Assange was trying to avoid a hatchet job and rendition to the US, justice or something in between, 2011 was a bad year for him. I don't think he'd disagree.
The danger here is that far too many will again just laugh it off, telling themselves they can't do anything about it, or worse, that they simply don't care enough to expend the effort.
Also, you can't have it both ways when comparing the effects of the two: you can't say Catholics care passionately about the Pope but surveilled citizens are too jaded to care. How many Catholics are indifferent to the Pope yet care passionately about governments surveilling their citizens?
How much more?
Anyone know what kind of digital tools they used?
Anyone know of other digital tools journalists/the press use to investigate/uncover content?
Jeremy Singer-Vine, formerly of the Wall Street Journal, was named a Pulitzer finalist for National Reporting this year (http://www.pulitzer.org/finalists/2014?):
> John Emshwiller and Jeremy Singer-Vine of The Wall Street Journal - For their reports and searchable database on the nation’s often overlooked factories and research centers that once produced nuclear weapons and now pose contamination risks.
HNers may remember some of Jeremy's stuff recently making the front page, including:
Reverse Engineering xkcd's Frequency: https://news.ycombinator.com/item?id=7290868
Without doubting your point here, do you think access to data is driving the underlying ability to leverage these tools? Or is it just people have now started to look harder at the problems with an analytic toolkit in mind?
But I think we (as a society) are just beginning to make use of data, in terms of analysis and general computational thinking. To go back to the domain of journalism...Aron Pilhofer, who heads the interactive news team at the New York Times, said that in "one day...we can teach you the skills that if mastered would allow you to do 80 percent of all the computer-assisted reporting that has ever been done" (http://knight.stanford.edu/life-fellow/2012/times-editor-say...)...I think this is still the case.
As an example, you don't have to go back much further than last year's Public Service Pulitzer...probably my favorite winner in modern times: http://www.pulitzer.org/citation/2013-Public-Service
The reporters took a sensational story (off-duty cops being caught on YouTube egregiously breaking the speed limit) and opted to do an empirical analysis. But of course, what they were trying to find -- cops who broke the law by speeding -- was inherently non-existent in terms of public records (because it is the cops who determine whether the law is broken). So instead, the reporters requested toll booth records, which logged each cop car as it passed through. Dividing distance by time, they demonstrated the abuse so convincingly that Florida police departments pretty much rolled over and immediately repented.
Besides whatever database they used to hold the data, the analysis here is literally elementary level. This is not to say that the reporters had it easy (they still had to do all the footwork, interviews, confrontations, and fact-checking, among other things), but it just goes to show you how many important stories are out there, in every jurisdiction, that only need someone who cares enough to do some counting and arithmetic. I kind of love the Public Service awards because of how they recognize these relatively non-sexy, but incredibly important stories done by determined and clever journalists.
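To give a sense of just how elementary the math is: the core of that analysis is nothing more than speed = distance / time applied to pairs of toll-gate timestamps. Here's a minimal sketch; the gate positions, timestamps, and function names are all invented for illustration, not the Sun Sentinel's actual schema or tooling.

```python
# Hypothetical sketch of the toll-record analysis: given two transponder
# readings for the same car, average speed = distance / elapsed time.
# Gate mile markers and the sample timestamps below are assumed values.
from datetime import datetime

# gate_id -> position in miles along the highway (illustrative)
GATE_MILE_MARKER = {"A": 0.0, "B": 12.6}

def avg_speed_mph(gate1, time1, gate2, time2):
    """Average speed implied by two toll-gate timestamps, in mph."""
    miles = abs(GATE_MILE_MARKER[gate2] - GATE_MILE_MARKER[gate1])
    hours = abs((time2 - time1).total_seconds()) / 3600.0
    return miles / hours

# A car passing gate A at 08:00:00 and gate B (12.6 miles away) at
# 08:08:24 covered 12.6 miles in 8.4 minutes -> an average of 90 mph.
t1 = datetime(2012, 2, 1, 8, 0, 0)
t2 = datetime(2012, 2, 1, 8, 8, 24)
print(round(avg_speed_mph("A", t1, "B", t2)))  # 90
```

The only subtlety is that this yields an average speed between gates, which is a conservative lower bound on the car's peak speed -- which is exactly what made the findings so hard to dispute.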